WO2024067527A1 - Hip joint angle detection system and method (一种髋关节角度检测系统和方法) - Google Patents

Hip joint angle detection system and method (一种髋关节角度检测系统和方法)

Info

Publication number
WO2024067527A1
WO2024067527A1 (PCT/CN2023/121265; CN2023121265W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
hip joint
preset
screening
region
Prior art date
Application number
PCT/CN2023/121265
Other languages
English (en)
French (fr)
Inventor
李传东
刘安逸
万祁
Original Assignee
武汉联影医疗科技有限公司 (Wuhan United Imaging Healthcare Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 武汉联影医疗科技有限公司 (Wuhan United Imaging Healthcare Co., Ltd.)
Publication of WO2024067527A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning, using classification, e.g. of video objects
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G06T7/136 Segmentation; Edge detection involving thresholding
    • G06T7/60 Analysis of geometric attributes
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10132 Ultrasound image
    • G06T2207/20 Special algorithmic details
    • G06T2207/20081 Training; Learning
    • G06T2207/20084 Artificial neural networks [ANN]
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30008 Bone

Definitions

  • the present specification relates to the field of hip joint typing, and in particular to a hip joint angle detection system and method.
  • Neonatal hip typing can prevent or treat hip dysplasia in infants through early screening and intervention. This helps avoid serious hip problems later on, such as hip dislocation and hip degeneration. Hip typing is very important for newborns, as it can help doctors detect hip dysplasia in infants in a timely manner and take appropriate preventive or therapeutic measures.
  • Some embodiments of the present specification provide a hip joint angle detection system and method to better detect the hip joint of a newborn.
  • One or more embodiments of the present specification provide a hip joint angle detection system, the system comprising a processor, the processor being configured to perform the following operations: acquiring an ultrasound image of an object to be detected; extracting an iliac region contour corresponding to the hip joint of the object to be detected from the ultrasound image; acquiring a centroid position of the iliac region in the iliac region contour, and acquiring a hip joint image based on the centroid position; and determining the hip joint angle of the object to be detected based on the hip joint image.
  • One or more embodiments of the present specification provide a method for detecting a hip joint angle, the method comprising: acquiring an ultrasonic image of an object to be detected; extracting an iliac region contour corresponding to the hip joint of the object to be detected from the ultrasonic image; acquiring a centroid position of an iliac region in the iliac region contour, and acquiring a hip joint image based on the centroid position; and determining the hip joint angle of the object to be detected based on the hip joint image.
  • One or more embodiments of the present specification provide a computer-readable storage medium storing computer instructions. When a computer reads the computer instructions in the storage medium, the computer executes the above-mentioned hip joint angle detection method.
  • One or more embodiments of the present specification provide a hip joint classification method, the method comprising: acquiring an original ultrasound image of an object to be tested; the original ultrasound image is an ultrasound image corresponding to the hip joint of the object to be tested; extracting an iliac region image corresponding to the hip joint of the object to be tested from the original ultrasound image; inputting the iliac region image into a preset active contour model for hip joint classification, and obtaining the hip joint type of the object to be tested.
  • One or more embodiments of the present specification provide a hip joint classification device, which includes a processor, and the processor is used to execute the hip joint classification method as described above.
  • FIG. 1 is a schematic diagram of an exemplary application scenario of a hip joint classification system according to some embodiments of the present specification.
  • FIG. 2 is an exemplary flow chart of a hip joint angle detection method according to some embodiments of the present specification.
  • FIG. 3 is an exemplary flow chart of determining the contour of the iliac region according to some embodiments of the present specification.
  • FIG. 4 is an exemplary flow chart of a method for determining a hip joint angle according to some embodiments of the present specification.
  • FIG. 5 is an exemplary flow chart of a hip joint classification method according to some embodiments of the present specification.
  • FIG. 6 is a schematic diagram of an exemplary process of determining an iliac region image according to other embodiments of the present specification.
  • FIG. 7 is an exemplary flow chart of filtering and threshold segmentation of an ultrasound image according to some embodiments of the present specification.
  • FIG. 8 is a schematic diagram of an exemplary process of obtaining a candidate connected domain according to some embodiments of the present specification.
  • FIG. 9 is an exemplary flow chart of obtaining a target connected domain according to some embodiments of the present specification.
  • FIG. 10 is a schematic diagram of an exemplary process of determining an iliac region image according to some embodiments of the present specification.
  • FIG. 11 is a schematic diagram of an exemplary process of a hip joint classification method according to some embodiments of the present specification.
  • FIG. 12 is an exemplary flow chart of determining three anatomical points according to some embodiments of the present specification.
  • FIG. 13 is an exemplary flow chart of a method for determining flange points according to some embodiments of the present specification.
  • FIG. 14 is a schematic diagram of the structure of a hip joint classification system according to some embodiments of the present specification.
  • FIG. 15 is a schematic diagram of the process structure of a hip joint classification method according to some embodiments of the present specification.
  • FIG. 16 is an ultrasound image of an object to be tested according to some embodiments of the present specification.
  • FIG. 17 is a binary segmentation image corresponding to an ultrasound image according to some embodiments of the present specification.
  • FIG. 18 is a target connected domain image corresponding to an ultrasound image according to some embodiments of the present specification.
  • FIG. 19 is a schematic diagram of the structure of a connected domain screening process according to some embodiments of the present specification.
  • FIG. 20 is a diagram showing an outline of the iliac region in an ultrasound image according to some embodiments of the present specification.
  • FIG. 21 is a hip joint image corresponding to the ilium region contour according to some embodiments of the present specification.
  • FIG. 22 is a hip joint image including only a hip joint outline according to some embodiments of the present specification.
  • FIG. 23 is a hip joint classification effect diagram corresponding to an ultrasound image according to some embodiments of the present specification.
  • FIG. 24 is a diagram showing the internal structure of a processing device according to some embodiments of the present specification.
  • The terms "system", "device", and "unit" are used to distinguish different components, elements, parts, portions or assemblies at different levels; these words can be replaced by other expressions that serve the same purpose.
  • Ultrasound detection combined with the Graf method is usually used to measure the α angle and β angle of the newborn hip joint, and the development type of the newborn hip joint (for example, normal development or a type of abnormal development) is judged based on the measured α angle and β angle.
  • The manual measurement method usually requires the doctor to manually select five anatomical points of the hip joint on the ultrasound image of the neonatal hip joint, and the bone vertex angle (α angle) and cartilage vertex angle (β angle) of the hip joint are then calculated automatically from the five anatomical points.
  • The five anatomical points are: the upper edge of the transition between the rectus femoris head and the iliac periosteum, the lower edge of that transition, the turning point of the bone edge, the lowest point of the lower edge of the iliac branch, and the midpoint of the labrum.
  • The measurement results are therefore relatively subjective, the accuracy depends heavily on the doctor's experience, and the frequent manual operations reduce work efficiency.
  • Automatic measurement methods include automatic measurement methods based on hip joint classification and automatic measurement methods based on deep learning.
  • The automatic measurement method based on hip joint classification includes a hip joint classification method based on a region-based active contour model. Its processing flow is as follows: after preliminary processing of the neonatal hip joint ultrasound image, the region-based active contour model is used for image segmentation to obtain the hip joint tissue contour, and the bone vertex angle and cartilage vertex angle are then obtained by linear fitting. Because this method inputs the only preliminarily preprocessed ultrasound image directly into the active contour model, it significantly affects speed and accuracy.
  • Another automatic measurement method based on hip joint classification proceeds as follows: after mean filtering the input ultrasound image, the region of interest is obtained manually, image enhancement and binarization are applied, and linear fitting is then used to obtain the bone vertex angle and cartilage vertex angle. This is a semi-automatic measurement method that relies on manual acquisition of the region of interest, which increases the workload.
  • The automatic measurement method based on deep learning obtains the positions of the target key points of the neonatal hip joint and the target measurement values, and feeds this information into a deep learning network for training to obtain a network model, so that an ultrasound image can be input and the target positions output.
  • However, this method relies on a large amount of ultrasound image data, and issues such as patient privacy often limit the amount of data available, making the method difficult to implement or unable to achieve obvious results on a small amount of data.
  • some embodiments of this specification propose a hip joint angle detection system and method, which can improve the processing efficiency of hip joint classification and ensure the objectivity and accuracy of the classification results.
  • FIG1 is a schematic diagram of an exemplary application scenario of a hip joint angle detection system according to some embodiments of the present specification.
  • The hip joint angle detection system 100 may include an ultrasonic imaging device 110, a processing device 120, a network 130, a storage device 140, and a terminal 150. In some embodiments, the hip joint angle detection system 100 can be used for hip joint detection and hip joint classification.
  • the ultrasonic imaging device 110 can be used to obtain ultrasonic imaging data of a target area on the object to be tested.
  • the ultrasonic imaging device can use the physical properties of ultrasonic waves and the difference in acoustic properties of the target area on the object to obtain ultrasonic imaging data of the target area on the object, and the ultrasonic imaging data can be displayed and/or recorded in the form of a waveform, a curve or an image.
  • the ultrasonic imaging device may include one or more ultrasonic probes for transmitting ultrasonic waves to the target area (for example, an object or its organs and tissues located on a treatment bed).
  • After passing through organs and tissues with different acoustic impedances and attenuation characteristics, the ultrasonic waves produce different reflections and attenuations, thereby forming echoes that can be received by the one or more ultrasonic probes.
  • the ultrasonic imaging device can process (for example, amplify, convert) and/or display the received echoes to generate ultrasonic imaging data.
  • the ultrasonic imaging device may include a B-ultrasound device, a color Doppler ultrasound device, a cardiac color ultrasound device, a three-dimensional color ultrasound device, etc. or any combination thereof.
  • the ultrasonic imaging device 110 may send the ultrasonic imaging data to the processing device 120, the storage device 140 and/or the terminal 150 through the network 130 for further processing.
  • the ultrasonic imaging data acquired by the ultrasonic imaging device may be data in a non-image form, and the non-image form of the data may be sent to the processing device 120 for generating an ultrasonic image.
  • the ultrasonic imaging data acquired by the ultrasonic imaging device may be data in an image form, and the image form of the data may be sent to the terminal 150 for display.
  • the ultrasonic imaging data may be stored in the storage device 140.
  • the processing device 120 can process data and/or information obtained from the ultrasonic imaging device 110, the storage device 140 and/or the terminal 150.
  • the processing device 120 can process the ultrasonic imaging data obtained from the imaging device in the ultrasonic imaging device 110 and generate an ultrasonic image of the target area.
  • the ultrasonic image can be sent to the terminal 150 and displayed on one or more display devices in the terminal 150.
  • the processing device 120 can be a single server or a server group.
  • the server group can be centralized or distributed.
  • the processing device 120 can be local or remote.
  • the processing device 120 can access information and/or data stored in the ultrasonic imaging device 110, the storage device 140 and/or the terminal 150 via the network 130.
  • the processing device 120 can be directly connected to the ultrasonic imaging device 110, the storage device 140 and/or the terminal 150 to access the information and/or data stored thereon.
  • the processing device 120 can be integrated in the ultrasonic imaging device 110.
  • the processing device 120 can be implemented on a cloud platform.
  • the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an on-premises cloud, a multi-cloud, the like, or any combination thereof.
  • the processing device 120 may be a single processing device that communicates with the ultrasound imaging device and processes data received from the ultrasound imaging device.
  • the network 130 may include any suitable network that can facilitate information and/or data exchange of the hip angle detection system 100.
  • One or more components of the hip angle detection system 100 (e.g., the ultrasound imaging device 110, the processing device 120, the storage device 140, or the terminal 150) may exchange information and/or data with each other via the network 130.
  • the processing device 120 may obtain ultrasound imaging data from the ultrasound imaging device 110 through the network 130.
  • the processing device 120 may obtain user instructions from the terminal 150 through the network 130, and the instructions may be used to instruct the ultrasound imaging device 110 to perform imaging.
  • the network 130 may include one or more network access points.
  • the network 130 may include wired and/or wireless network access points, such as base stations and/or Internet access points, through which one or more components of the hip angle detection system 100 may be connected to the network 130 to exchange data and/or information.
  • the storage device 140 may store data and/or instructions. In some embodiments, the storage device 140 may store data obtained from the terminal 150 and/or the processing device 120. In some embodiments, the storage device 140 may store data and/or instructions that the processing device 120 may execute or use to execute the exemplary methods described in this specification. In some embodiments, the storage device 140 may be implemented on a cloud platform.
  • the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an on-premises cloud, a multi-cloud, etc., or any combination thereof.
  • the storage device 140 can be connected to the network 130 to communicate with one or more components of the hip angle detection system 100 (e.g., the processing device 120, the terminal 150, etc.). One or more components of the hip angle detection system 100 can access data or instructions stored in the storage device 140 via the network 130. In some embodiments, the storage device 140 can be directly connected to or communicate with one or more components of the hip angle detection system 100 (e.g., the processing device 120, the terminal 150, etc.). In some embodiments, the storage device 140 can be part of the processing device 120.
  • the terminal 150 may include a mobile device 150-1, a tablet computer 150-2, a laptop computer 150-3, etc., or any combination thereof. In some embodiments, the terminal 150 may remotely operate the ultrasound imaging device 110. In some embodiments, the terminal 150 may operate the ultrasound imaging device 110 via a wireless connection. In some embodiments, the terminal 150 may receive information and/or instructions input by a user, and send the received information and/or instructions to the ultrasound imaging device 110 or the processing device 120 via the network 130. In some embodiments, the terminal 150 may receive data and/or information from the processing device 120. In some embodiments, the terminal 150 may be part of the processing device 120. In some embodiments, the terminal 150 may be omitted.
  • FIG. 2 is an exemplary flow chart of a hip joint angle detection method according to some embodiments of the present specification.
  • the process 200 can be executed by a processing device (e.g., the processing device 120) or a hip joint angle detection system (e.g., the hip joint angle detection system 100).
  • the process 200 can be stored in a storage device (e.g., a built-in storage unit of the processing device or an external storage device) in the form of a program or instruction, and when the program or instruction is executed, the process 200 can be implemented.
  • the process 200 can include the following operations.
  • Step 210: Acquire an ultrasonic image of the object to be tested.
  • The object to be tested may include a patient or other medical experimental subject (e.g., a test phantom).
  • The object to be tested may also be a part of a patient (e.g., a newborn) or other medical experimental subject, including an organ and/or tissue, such as the ilium.
  • Ultrasonic images refer to images obtained after detecting an object to be tested by an ultrasonic device.
  • Ultrasonic signals are usually generated by high-frequency sound waves emitted by an ultrasonic probe, which are reflected back when they encounter different types of tissues or organs inside the object to be tested. After these reflected sound waves are received by the ultrasonic probe, they are amplified, filtered, digitized, and finally converted into an image; the converted image is called an ultrasonic image. Elsewhere in this specification, the acquired ultrasonic image is also referred to as the original ultrasound image.
  • Step 220: Extract the ilium region contour corresponding to the hip joint of the object to be tested from the ultrasound image.
  • the ilium is one of the components of the hip bone, forming the posterior and superior part of the hip bone, and is divided into two parts: the ilium body and the ilium wing.
  • the ilium region refers to the area located on the side of the human trunk above the hip bone, and its range is roughly equivalent to the area between the waist and the hip.
  • the ilium is one of the largest flat bones in the human body.
  • the iliac crest on the ilium is one of the main starting points of the femoral muscles.
  • the iliac acetabulum and the femoral head form the hip joint.
  • the hip joint is one of the largest joints in the human body, connecting the hip bone and the femur. It is a ball-and-socket joint consisting of the femoral head, acetabulum, and labrum.
  • the hip joint plays a very important role in human movement, supporting the body weight and allowing walking, running, and other activities. It also plays a key role in body posture and balance control.
  • The ilium region contour refers to an image containing the contour of the ilium region, which can be used to locate and identify the ilium and its related structures.
  • the ilium region contour may include contours of structures such as the ilium.
  • the processing device may also use target detection, key point detection, image feature recognition and other methods to obtain the ilium region contour.
  • The processing device may also filter the ultrasound image, perform threshold segmentation on the filtered ultrasound image, and finally obtain the ilium region contour based on the result of the threshold segmentation. For a detailed description, please refer to the relevant description of FIG. 3.
  • Step 230: Obtain the centroid position of the iliac region in the iliac region contour, and obtain a hip joint image based on the centroid position.
  • the centroid refers to the average position of all mass points inside an object.
  • the centroid position of the iliac region refers to the average position of all element points in the area corresponding to the contour of the iliac region in the ultrasound image.
  • the processing device can calculate the centroid position of the iliac region in various ways, such as a method based on mathematical calculation, a method based on deep learning, etc. This embodiment does not limit the specific method of obtaining the centroid position.
  • the processing device can obtain the centroid position of the iliac region in the following manner.
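One simple possibility, shown here as an illustrative sketch rather than the patent's exact formula, is to take the centroid of a binary iliac-region mask as the mean coordinate of its foreground pixels (the function name and the mask representation are assumptions):

```python
import numpy as np

def region_centroid(mask: np.ndarray) -> tuple:
    """Centroid (row, col) of a binary region.

    Illustrative sketch: the centroid is taken as the average position of
    all foreground (nonzero) pixels, i.e. the first-order image moments
    divided by the zeroth moment.
    """
    rows, cols = np.nonzero(mask)
    if rows.size == 0:
        raise ValueError("empty region")
    return rows.mean(), cols.mean()
```

For a centered 3x3 square inside a 5x5 mask, this returns (2.0, 2.0), the geometric center of the square.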
  • the processing device can extract an iliac region image from the ultrasound image based on the centroid position of the iliac region in the iliac region contour, and obtain a hip joint image based on the iliac region image.
  • the hip joint image refers to an image of the hip joint structure of the object to be measured segmented from the iliac region image. Therefore, in some embodiments, the hip joint image is also called a hip joint segmentation image.
  • the processing device can obtain a hip joint image based on the centroid position of the ilium region through a preset active contour model. For example, the processing device can expand the ilium region in the ilium region contour according to a preset size to obtain an ilium region image. And, the ilium region image is processed using a preset active contour model to obtain the hip joint image. For more information on obtaining the ilium region image, please refer to the description in step 502 below.
  • The iliac region image refers to the portion of the ultrasound image whose main structural region is the iliac region.
  • Expansion refers to expanding from the centroid position to the surroundings (e.g., up, down, left, and right). It can be understood that because the centroid position lies within the ilium region, expanding outward from the centroid position can obtain a more accurate and complete image of the ilium region, thereby minimizing other background areas in the image.
  • the preset size refers to a preset extension size/length.
  • the preset size may include a preset length and a preset width, and the preset length and preset width may be expressed by pixel distance, for example, 100 pixel distance, 200 pixel distance, etc.
  • the preset size is related to the size of the ultrasound image.
  • the preset size may be a certain proportion of the ultrasound image.
  • For example, the preset size may be a length of 500 pixels and a width of 100 pixels.
  • the preset size may also be the size of the ultrasound image minus a certain size. For another example, assuming that more longitudinal image information is needed for segmentation of the hip joint, an image with 1/2 the width and 4/5 the height of the original image may be obtained.
  • the centroid position is expanded according to a preset size, and the preset size is related to the ultrasound image, which can ensure that the hip joint can be completely included in the iliac region image obtained during the expansion.
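The expansion step described above can be sketched as a window of the preset size centered on the centroid and clamped to the image bounds (the function name, the centering convention, and the clamping behavior are illustrative assumptions, not fixed by the text):

```python
import numpy as np

def crop_around_centroid(image: np.ndarray, centroid: tuple,
                         preset_h: int, preset_w: int) -> np.ndarray:
    """Extract a preset-size window around the centroid.

    The window is centered on the centroid and clamped to the image
    bounds so the expanded iliac region image never exceeds the
    original ultrasound image.
    """
    cy, cx = int(round(centroid[0])), int(round(centroid[1]))
    h, w = image.shape[:2]
    top = max(0, cy - preset_h // 2)
    left = max(0, cx - preset_w // 2)
    bottom = min(h, top + preset_h)
    right = min(w, left + preset_w)
    return image[top:bottom, left:right]
```

Clamping matters when the centroid lies near an image border: the window slides inward instead of indexing outside the array.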
  • the preset active contour model can be an image segmentation method based on energy minimization, which can adaptively adjust the shape and position in the image, thereby achieving effective extraction of the hip joint.
  • the processing device may input the iliac region image into a preset active contour model for processing, and the active contour model may output the hip joint image.
  • the processing device may also process the iliac region image in other ways, for example, obtaining a hip joint image by image segmentation, key point detection, etc.
  • the preset active contour model (or simply referred to as the active contour model) can also be a machine learning model obtained by training based on the ilium region sample image and the gold standard hip joint image corresponding to the ilium region sample image.
  • the training method can be various model training methods, such as gradient descent method, etc., and the present embodiment does not limit the specific model training method.
  • The preset active contour model can also be obtained by training on ilium region sample images and the gold standards corresponding to those sample images. Therefore, in some embodiments, the processing device can also directly input the ilium region image into the active contour model to obtain a hip joint segmentation image. The following description takes inputting the ilium region image into the active contour model for processing as an example.
  • the active contour model can segment the hip joint in the iliac region image in an iterative manner, which may include a variety of segmentation methods, such as: Snake segmentation model, level set segmentation, etc.
  • the level set segmentation method is adopted in the embodiments of this specification.
  • The level set is a numerical method for tracking contours and surface motion. It does not operate on the contour directly, but represents the contour as the zero level set of a higher-dimensional function; this higher-dimensional function is called the level set function.
  • The level set function is then evolved, and the moving contour is obtained by extracting its zero level set from the output.
  • The iliac region curve is denoted C, and C divides the image into two parts, inside and outside the curve. The region-based (Chan-Vese) energy functional to be minimized is:

    F(c1, c2, C) = μ·Length(C) + ν·Area(inside(C)) + λ1·∫inside(C) |u0(x, y) − c1|² dx dy + λ2·∫outside(C) |u0(x, y) − c2|² dx dy

    where μ, ν, λ1, λ2 are all weights; the constant c1 is the mean of the pixels inside curve C; the constant c2 is the mean of the pixels outside curve C; Length(C) is the length of curve C; Area(inside(C)) is the internal area of curve C; inside(C) and outside(C) denote the inside and the outside of curve C; and u0(x, y) is the pixel of the input image u0.
  • In the level set formulation with level set function φ, the regularized Heaviside function Hε(φ(x, y)) and Delta function δε(φ(x, y)) are:

    Hε(z) = (1/2)·(1 + (2/π)·arctan(z/ε)),  δε(z) = Hε′(z) = ε / (π·(ε² + z²))

  • c1 and c2 can be obtained through φ, that is:

    c1 = ∫ u0·Hε(φ) dx dy / ∫ Hε(φ) dx dy,  c2 = ∫ u0·(1 − Hε(φ)) dx dy / ∫ (1 − Hε(φ)) dx dy
  • The level set algorithm mainly seeks the minimum of the energy function through iteration so as to evolve the curve, and exits the loop when the set number of iterations is reached.
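To make the per-iteration computation of c1 and c2 concrete, here is a minimal NumPy sketch using the regularized Heaviside function mentioned above (the choice ε = 1 and the sign convention φ > 0 inside C are illustrative assumptions; the full curve-evolution update is omitted):

```python
import numpy as np

def heaviside(phi: np.ndarray, eps: float = 1.0) -> np.ndarray:
    # Regularized Heaviside H_eps used by the Chan-Vese level-set model.
    return 0.5 * (1.0 + (2.0 / np.pi) * np.arctan(phi / eps))

def region_means(u0: np.ndarray, phi: np.ndarray, eps: float = 1.0):
    """Mean intensities c1 (inside C, phi > 0) and c2 (outside C).

    These are the closed-form minimizers of the Chan-Vese energy with
    respect to c1 and c2 for a fixed level set function phi.
    """
    h = heaviside(phi, eps)
    c1 = (u0 * h).sum() / h.sum()
    c2 = (u0 * (1.0 - h)).sum() / (1.0 - h).sum()
    return c1, c2
```

In a full level-set loop, c1 and c2 would be recomputed this way after each update of φ until the set number of iterations is reached.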
  • Step 240 Determine the hip joint angle of the subject based on the hip joint image.
  • the hip joint angle is an indicator for evaluating the angle between the femoral head and neck and the femoral shaft axis.
  • In some embodiments, the hip joint angle includes the α angle and the β angle, which can be used to classify the hip joint of a newborn.
  • The α angle is also called the bone vertex angle, and the β angle is also called the cartilage vertex angle.
  • The processing device may determine the α angle and the β angle by marking the bone top line and the cartilage top line in the hip joint image, and then measuring or determining the slopes of the bone top line and the cartilage top line.
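As a sketch of that final measurement step, the angle between two lines follows from their slopes; the line fitting itself (e.g. with np.polyfit) is omitted, and the slope values below are illustrative stand-ins for the fitted baseline and bone-top or cartilage-top lines:

```python
import numpy as np

def line_angle_deg(slope_base: float, slope_top: float) -> float:
    """Unsigned angle in degrees between two lines given by their slopes.

    Each slope is converted to an inclination with arctan; the angle
    between the lines is the absolute difference of the inclinations.
    """
    return abs(np.degrees(np.arctan(slope_top) - np.arctan(slope_base)))
```

For a horizontal baseline (slope 0) and a line of slope 1, this yields 45 degrees.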
  • In this way, the hip joint image of the object to be tested can be accurately obtained, and the hip joint angle of the object to be tested can then be determined based on the hip joint image. According to a preset hip joint classification template (hip joint classification standard), the hip joint type of the object to be tested can be accurately determined.
  • The hip joint segmentation method based on the centroid position can accurately segment the hip joint area and shorten the calculation time, which improves the segmentation speed and segmentation accuracy and thus improves the accuracy and processing efficiency of hip joint classification.
  • Fig. 3 is an exemplary flow chart of determining the contour of the ilium region according to some embodiments of the present specification.
  • the process 300 may be executed by a processing device (e.g., the processing device 120).
  • Step 310 Filter the ultrasound image.
  • Filtering is a signal processing technique that can be used to change the frequency characteristics of a signal or reduce noise. Filtering can be divided into different types such as low-pass filtering, high-pass filtering, band-pass filtering and band-stop filtering.
  • the processing device may filter the ultrasound image using a variety of filtering methods, for example, mean filtering, Gaussian filtering, median filtering, edge-preserving filtering, etc.
  • Step 320 Perform threshold segmentation on the filtered ultrasonic image to obtain a binary image containing multiple contour areas.
  • a binary image is an image that contains only two colors, for example, black and white. Each pixel in a binary image is one of these two colors, with black representing 0 or a low intensity value and white representing 1 or a high intensity value.
  • the multiple contour regions included may be contour structures of tissues or organs in the ultrasound image obtained by segmentation, and are generally regions where a group of pixels in the ultrasound image are connected in the image, for example, connected domains.
  • Threshold segmentation is a binary processing method based on the grayscale value of an image. Threshold segmentation can be used to divide tissue structures or regions of different grayscales in ultrasound images into two parts: foreground (signal) and background (noise). Its basic principle is to set a suitable threshold, set the pixels in the ultrasound image that are smaller than the threshold as the background, and the pixels greater than or equal to the threshold as the foreground.
  • the processing device may process the filtered ultrasound image by a preset threshold segmentation algorithm to obtain a binary image.
  • the preset threshold segmentation algorithm may include maximum entropy threshold segmentation, Otsu threshold segmentation, adaptive threshold segmentation, and fixed threshold segmentation.
  • the processing device can use a median filtering method to filter the ultrasound image, and use a maximum entropy threshold segmentation method to segment the filtered ultrasound image to obtain a better processing result.
  • Median filtering and maximum entropy threshold segmentation have high adaptability, and the combination of the two can make the iliac region segmentation effect better.
  • the processing device can also use other combinations, for example, using median filtering and Otsu threshold segmentation, which is not limited in this embodiment.
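One of the combinations mentioned above, median filtering followed by Otsu threshold segmentation, might be sketched in plain NumPy as follows; the 3x3 filter window is an illustrative choice, not a size specified in this text.

```python
import numpy as np

def median_filter3(img):
    """3x3 median filter via edge-padded shifted stacks."""
    p = np.pad(img, 1, mode="edge")
    stack = [p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
             for dy in range(3) for dx in range(3)]
    return np.median(np.stack(stack), axis=0)

def otsu_threshold(img):
    """Otsu's method: maximize between-class variance over candidate thresholds."""
    hist, _ = np.histogram(img.ravel(), bins=256, range=(0, 256))
    p = hist / hist.sum()
    levels = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (levels[:t] * p[:t]).sum() / w0  # background mean
        m1 = (levels[t:] * p[t:]).sum() / w1  # foreground mean
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def binarize(img):
    """Filter, then threshold: foreground = 1, background = 0."""
    smoothed = median_filter3(img.astype(float))
    t = otsu_threshold(smoothed)
    return (smoothed >= t).astype(np.uint8)
```

The median filter suppresses isolated speckle before thresholding, which is why the combination segments the iliac region more cleanly than thresholding alone.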
  • the process of filtering and threshold segmentation of the ultrasound image is also referred to as binarization processing.
  • binarization processing please refer to the relevant description of FIG. 6 .
  • Step 330 Determine the contour of the ilium region based on the binary image.
  • the processing device may process the binary image by a preset screening method to determine the contour of the ilium region.
  • the preset screening method refers to a preset strategy/method for screening out the ilium region contour from the binary image.
  • the preset screening method may include two rounds of screening based on the preset screening strategy, wherein the first round of screening can be used to obtain the candidate connected domain from the binary image and remove the interfering contours, and the second round of screening can be used to obtain the ilium region contour based on the results of the first round of screening.
  • the processing device can obtain the candidate connected domain from the binary image through the preset screening strategy; and determine the ilium region contour based on the candidate connected domain.
  • the preset screening strategy refers to an operation plan/method set to achieve a goal under a preset specific goal and conditions.
  • the preset screening strategy includes screening using at least one of three screening conditions: area screening, centroid screening, and length screening.
  • it can be screening by centroid, it can be screening by area, or it can be screening by centroid and area.
  • screening is performed using all three screening conditions: area screening, centroid screening, and length screening.
  • a connected domain is a region where a group of pixels are connected in an image.
  • a candidate connected domain is a connected domain obtained from a binary image through a preset screening strategy.
  • Area screening refers to retaining connected domains whose area is greater than a certain threshold as candidate contours.
  • the threshold is related to the image area. For example, the larger the image area, the larger the threshold.
  • Centroid screening refers to retaining connected domains whose centroids lie within a central portion of the image. For example, retaining connected domains whose centroids fall within the middle half of the image area.
  • Length screening refers to retaining connected domains whose length exceeds a certain threshold.
  • the threshold for length screening is related to the width of the image. For example, the wider the image, the larger the threshold.
  • the processing device can perform dimension reduction of the preset screening strategy to obtain the screening strategy after dimension reduction; and, use the screening strategy after dimension reduction to obtain the candidate connected domains from the binary image.
  • Dimension reduction of screening conditions refers to discarding one or more of the screening conditions.
  • the order of dimension reduction of screening conditions is length screening, area screening, and centroid screening.
  • the preset screening strategy can be to first use the above three conditions simultaneously to screen connected domains. When the number of screened candidate connected domains is 0 (less than 1), the screening conditions are removed one at a time in the dimension-reduction order, starting with the length screening condition.
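The first-round screening with dimension reduction might be sketched as below; the area/length thresholds and the "middle half of the image" centroid window are illustrative assumptions, and connected domains are labeled with a simple 4-connectivity BFS.

```python
import numpy as np
from collections import deque

def label_components(binary):
    """4-connectivity labeling via BFS; returns a list of (y, x) pixel arrays."""
    h, w = binary.shape
    seen = np.zeros_like(binary, dtype=bool)
    comps = []
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] and not seen[sy, sx]:
                q, pix = deque([(sy, sx)]), []
                seen[sy, sx] = True
                while q:
                    y, x = q.popleft()
                    pix.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                comps.append(np.array(pix))
    return comps

def screen_components(binary, area_th, len_th):
    """Screen with all three conditions, relaxing them in the
    dimension-reduction order length -> area -> centroid when empty."""
    h, w = binary.shape
    comps = label_components(binary)

    def area_ok(c):
        return len(c) >= area_th

    def length_ok(c):
        return (c[:, 1].max() - c[:, 1].min() + 1) >= len_th

    def centroid_ok(c):
        cy, cx = c.mean(axis=0)  # centroid in the middle half of the image
        return h / 4 <= cy <= 3 * h / 4 and w / 4 <= cx <= 3 * w / 4

    for conds in ([area_ok, centroid_ok, length_ok],   # all three conditions
                  [area_ok, centroid_ok],              # drop length screening
                  [centroid_ok]):                      # then drop area screening
        cands = [c for c in comps if all(f(c) for f in conds)]
        if cands:
            return cands
    return []
```

A large, centrally located blob survives the full three-condition round, while small interfering contours near the image border are rejected.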
  • the second round of screening includes calculating, for each of the candidate connected domains, a discrimination score corresponding to the candidate connected domain; taking the candidate connected domain with the highest discrimination score as the target connected domain, and extracting the ilium region contour from the ultrasound image based on the target connected domain.
  • the target connected domain refers to the connected domain corresponding to the ilium region in the ultrasound image.
  • the discrimination score refers to the value obtained by calculating the connected domain through a preset discrimination algorithm.
  • the preset discrimination algorithm may be a discrimination algorithm related to the distances of the connected domain, and its calculation formula may be as shown in formula (7).
  • S is the discrimination score corresponding to the connected domain
  • N is the number of pixels in the connected domain
  • i is a pixel
  • I is the average pixel intensity of each pixel in the corresponding area of the connected domain in the ultrasound image
  • R is the aspect ratio of the connected domain
  • θ is the main axis direction of the connected domain.
  • the central moments of the connected domain need to be used.
  • the calculation formula of the central moments of the connected domain is shown in formula (8).
  • m pq refers to the moment of the image
  • p+q represents the order; for example, the second-order moments include m20, m02, m11, etc.
  • x and y are pixel coordinates within the connected domain, and (x̄, ȳ) is the centroid coordinate
  • Ixy is the pixel intensity of the pixel point in the corresponding area of the connected domain in the ultrasound image. Based on this, the calculation formula of the aspect ratio R of the connected domain is shown in formula (9).
  • m 20 , m 02 , and m 11 are the second-order moments of the image.
  • the calculation method for the principal axis direction θ of the connected domain may include the Radon transform, calculation using the second-order central moments, or fitting the connected domain contour point set using the least squares method to obtain a straight line and then calculating the principal axis direction.
  • the preset discrimination algorithm may also include other algorithms, such as an area judgment algorithm, a rectangular frame fitting algorithm, a circle fitting algorithm, a pixel density algorithm, etc.
  • the specific algorithm is not limited in this specification.
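Since formulas (7)-(9) are not reproduced in this text, the sketch below shows only one plausible NumPy reading of the second-order central moments, the aspect ratio R derived from them, and the principal axis direction θ; the exact normalization used by the patent is an assumption.

```python
import numpy as np

def shape_features(pixels):
    """Aspect ratio and principal-axis direction from second-order central
    moments; `pixels` is an (N, 2) array of (y, x) connected-domain coordinates."""
    y = pixels[:, 0].astype(float)
    x = pixels[:, 1].astype(float)
    ybar, xbar = y.mean(), x.mean()          # centroid (x̄, ȳ)
    mu20 = ((x - xbar) ** 2).mean()          # second-order central moments
    mu02 = ((y - ybar) ** 2).mean()
    mu11 = ((x - xbar) * (y - ybar)).mean()
    # Principal-axis direction theta from the central moments.
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
    # Eigenvalues of the covariance matrix give major/minor axis spreads.
    common = np.sqrt(4 * mu11 ** 2 + (mu20 - mu02) ** 2)
    lam1 = (mu20 + mu02 + common) / 2
    lam2 = (mu20 + mu02 - common) / 2
    aspect_ratio = np.sqrt(lam1 / max(lam2, 1e-12))
    return aspect_ratio, theta
```

An elongated horizontal strip yields θ near 0 and a large aspect ratio, which is the kind of feature the discrimination score uses to favor the long, thin ilium contour.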
  • the processing device can intercept a partial area corresponding to the target connected domain from the ultrasound image as the iliac region contour corresponding to the hip joint of the subject to be tested.
  • Fig. 4 is an exemplary flow chart of a method for determining a hip joint angle according to some embodiments of the present specification.
  • process 400 may be executed by a processing device (e.g., processing device 120).
  • Step 410 based on the hip joint image, determine the flange point, end point and labrum midpoint of the ilium region.
  • the flange point is usually located at the center of the femoral head and is the approximate position of the center of rotation of the hip joint. In some embodiments, the flange point is also referred to as the bone edge turning point.
  • the processing device may segment the hip joint image into a first image and a second image according to a preset baseline.
  • the preset baseline is a straight line formed based on the ordinate of the ilium mass center in the ilium region image.
  • the preset baseline can be a straight line passing through the ordinate of the ilium mass center and in the same direction as the length of the image.
  • the first image refers to the image of the upper half after the hip joint image is segmented along a preset baseline.
  • the second image refers to the image of the lower half after the hip joint image is segmented along a preset baseline.
  • the processing device may determine the upper edge line of the ilium contour in the second image, and extract points on the upper edge line to form a bone vertex point set.
  • the upper edge line refers to the upper boundary of the ilium contour. After extracting the points (eg, pixel points) on the upper edge line, a bone vertex point set can be obtained.
  • the processing device may obtain the slope of the bone top line based on the bone top line point set.
  • the processing device can use the least square method to fit the bone vertex points in the bone vertex point set to obtain the slope of the bone vertex.
  • the slope of the bone vertex can also be obtained by other methods, which are not limited in this embodiment.
  • the processing device may use the slope of the bone top line to determine the flange point from the bone top line point set.
  • the processing device can use the slope of the bone top line, combined with the bone top line point set, to obtain n straight lines (n is the number of bone top line points in the bone top line point set), and sort by the y coordinates of the bone top line points to find the straight line such that all points in the bone top line point set are at the lower left of that straight line.
  • the coordinate point in the bone top line point set corresponding to that straight line is the flange point.
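The flange-point selection just described can be sketched as follows, interpreting "all points at the lower left of a certain straight line" as minimizing the intercept y - k*x in image coordinates (y increasing downward); that interpretation is an assumption about the exact rule.

```python
import numpy as np

def flange_point(points):
    """Pick the flange point from a bone-top-line point set.

    Fit the bone-top-line slope k by least squares, then among the n
    slope-k lines through each point, choose the one with all other
    points on or below it (larger y in image coordinates).
    """
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    k, _ = np.polyfit(x, y, 1)          # least-squares slope of the bone top line
    intercepts = y - k * x              # intercept of the slope-k line through each point
    return pts[np.argmin(intercepts)]   # supporting line: all points lie on/below it
```

In the toy example below, the point that sits above the fitted trend is selected as the turning point.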
  • the endpoint is usually the lowest point of the inferior edge of the iliac ramus.
  • the processing device may take the bone vertex point with the maximum y coordinate in the bone vertex point set as the end point.
  • the midpoint of the labrum is usually a point on the acetabulum.
  • the processing device may segment a labrum region from the first image; and use the centroid of the labrum region as the labrum midpoint.
  • the segmentation of the labrum region can be performed in various ways, such as using the segmentation algorithm described in other embodiments of this specification, etc., which is not limited in this specification.
  • the centroid acquisition method can also refer to the description elsewhere in this specification.
  • Step 420 connecting the flange point and the end point to determine the bone top line, connecting the flange point and the midpoint of the labrum to obtain the cartilage top line.
  • Step 430 Determine the hip joint angle of the subject based on the slopes of the bone top line and the cartilage top line.
  • the method for calculating the bone apex angle α in the hip joint angle based on the slope of the bone top line can be as shown in formula (10).
  • the method for calculating the cartilage apex angle β in the hip joint angle based on the slope of the cartilage top line can be as shown in formula (11).
  • k1 is the slope of the bone top line
  • k2 is the slope of the cartilage top line
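Formulas (10)-(11) are not reproduced in this text. Assuming the Graf baseline is horizontal, one natural reading is that each apex angle equals the inclination of the corresponding top line, i.e. arctan of the absolute slope converted to degrees:

```python
import math

def apex_angle_from_slope(k):
    """Convert a top-line slope k (k1 for bone, k2 for cartilage) to an
    apex angle in degrees, assuming a horizontal baseline."""
    return math.degrees(math.atan(abs(k)))
```

For example, a top line with slope 1 corresponds to a 45-degree apex angle under this reading.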
  • FIG. 5 is an exemplary flow chart of a hip joint classification method according to some embodiments of this specification. The method can be applied to the processing device in FIG. 1 and may include the following steps:
  • Step 501 Acquire an original ultrasound image of the object to be tested.
  • the original ultrasound image is an ultrasound image corresponding to the hip joint of the object to be tested, such as a coronal ultrasound image corresponding to the hip joint.
  • the original ultrasound image can be a color image or a grayscale image, which is not limited in this embodiment.
  • the processing device can obtain an ultrasonic image corresponding to the hip joint of the object to be measured collected by the ultrasonic device, or can obtain an ultrasonic image corresponding to the hip joint of the object to be measured from a server.
  • the processing device can also obtain ultrasonic data corresponding to the hip joint of the object to be measured collected by the ultrasonic device, and reconstruct the ultrasonic image corresponding to the hip joint of the object to be measured based on the original ultrasonic data corresponding to the hip joint of the object to be measured.
  • the processing device may also obtain a local or whole-body ultrasound image of the object to be tested, and intercept an ultrasound image corresponding to the hip joint from the local or whole-body ultrasound image. It should be noted that the method for obtaining the ultrasound image of the hip joint of the object to be tested is not specifically limited in the embodiments of this specification. For example, the processing device may also obtain the ultrasound image of the object to be tested by reading from a storage device or a database.
  • Step 502 extracting an iliac region image corresponding to the hip joint of the subject to be tested from the original ultrasound image.
  • the processing device can input the ultrasound image corresponding to the hip joint of the object to be tested into a preset image segmentation model, and output an iliac region image corresponding to the hip joint of the object to be tested.
  • the preset image segmentation model can be a threshold-based image segmentation model, a region-based image segmentation model, an edge-based image segmentation model, an energy functional-based image segmentation model (such as an active contour model), an image segmentation model based on deep learning/neural network, or an image segmentation model based on machine learning, etc.
  • the type and implementation principle of the preset image segmentation model are not specifically limited, as long as the preset image segmentation model can segment the iliac region image from the ultrasound image of the hip joint.
  • the processing device may also use basic image processing operations to perform image analysis on the original ultrasound image, so as to extract the iliac area image corresponding to the hip joint of the subject to be tested from the original ultrasound image.
  • the image processing operation includes but is not limited to image filtering, image smoothing, image geometric transformation, image morphological processing, etc.
  • extracting the iliac region image corresponding to the hip joint of the object to be tested from the ultrasound image includes: filtering the original ultrasound image; performing threshold segmentation on the filtered original ultrasound image to obtain a binary image containing multiple contour areas; determining the iliac region contour based on the binary image; and determining the iliac region image based on the iliac region contour.
  • determining the iliac region image based on the iliac region contour includes: obtaining the centroid position of the iliac region in the iliac region contour; expanding according to a preset size based on the centroid position of the iliac region in the iliac region contour to obtain the iliac region image; wherein the preset size is related to the size of the original ultrasound image.
  • FIGS. 2 to 4 A more detailed description can be found in the descriptions of FIGS. 2 to 4 , which will not be repeated here.
  • Step 503 Input the ilium region image into a preset active contour model to perform hip joint classification, and obtain the hip joint type of the subject to be tested.
  • the processing device can input the iliac region image of the object to be tested into a preset active contour model to obtain a hip joint segmentation image, and then the hip joint segmentation image can be input into a preset classification algorithm to output the hip joint type of the object to be tested, wherein the preset active contour model can be trained based on the iliac region image sample and its corresponding hip joint label.
  • the preset classification algorithm can also be embedded in the preset active contour model, that is, the preset active contour model can directly output the hip joint type of the object to be tested.
  • the processing device can determine the bone apex angle and the cartilage apex angle from the hip joint segmentation image, and then determine the hip joint type corresponding to the bone apex angle and the cartilage apex angle from the preset correspondence relationship of hip joint types; wherein the preset correspondence relationship of hip joint types includes the correspondence between different bone apex angles, different cartilage apex angles, and different hip joint types.
  • the ilium region image is input into a preset active contour model for hip classification to obtain the hip joint type of the subject to be tested, including: using the preset active contour model to process the ilium region image to obtain a hip joint image; based on the hip joint image, determining the flange point, end point and labrum midpoint of the ilium region; connecting the flange point and the end point to determine the bone top line, and connecting the flange point and the labrum midpoint to determine the cartilage top line; based on the slopes of the bone top line and the cartilage top line, determining the hip joint angle of the subject to be tested; based on the hip joint angle of the subject to be tested and the hip joint classification standard, determining the hip joint type of the subject to be tested.
  • the processing device may determine the hip joint classification result of the subject to be tested by searching the hip joint classification standard.
  • the hip joint classification standard may be as shown in Table 1.
  • different bone apex angles α and different cartilage apex angles β correspond to different hip joint types.
  • the hip joint type of the subject to be tested corresponding to the bone apex angle and the cartilage apex angle can be determined.
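Table 1 is not reproduced in this text. For illustration only, the commonly cited Graf thresholds for the bone apex angle α can be sketched as a simple lookup; the full standard also uses the cartilage apex angle β and patient age to subdivide the types, which this sketch omits.

```python
def graf_type(alpha):
    """Map the bone apex angle alpha (degrees) to a coarse Graf hip type.

    Thresholds follow the commonly cited Graf classification and are
    illustrative, not the patent's Table 1:
    type I: alpha >= 60; type II: 43 <= alpha < 60; type III/IV: alpha < 43.
    """
    if alpha >= 60:
        return "I"
    if alpha >= 43:
        return "II"
    return "III/IV"
```

Given a measured α of 65 degrees this lookup reports a mature (type I) hip, while 40 degrees falls into the decentered III/IV group.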
  • the processing device obtains the original ultrasonic image corresponding to the hip joint of the object to be tested, then extracts the ilium region image corresponding to the hip joint of the object to be tested from the original ultrasonic image, and inputs the ilium region image into a preset active contour model for hip joint classification to obtain the hip joint type of the object to be tested; that is, the hip joint classification method provided in the embodiment of this specification is to obtain the ilium region image by segmenting the ilium region from the complete ultrasonic image, and then input the ilium region image into the active contour model for image analysis.
  • the active contour model does not need to perform image analysis on the entire hip joint ultrasonic image, but only needs to perform image analysis on the partial image where the key area in the entire hip joint ultrasonic image is located, which can greatly improve the processing rate of the active contour model; at the same time, when the technical solution disclosed in this specification is used for hip joint recognition, since the ilium region is extracted from the entire hip joint ultrasonic image in advance, that is, the region of interest of the hip joint is determined in advance from the entire hip joint ultrasonic image, therefore, when it is subsequently input into the active contour model for image processing, the accuracy of hip joint segmentation can be greatly improved, the precision of image processing can be improved, and the efficiency of image processing can be greatly improved.
  • FIG. 6 is a schematic diagram of an exemplary process of determining an iliac region image according to other embodiments of the present specification. This embodiment relates to an optional implementation process of extracting an iliac region image corresponding to the hip joint of a subject to be tested from an original ultrasound image by a processing device. Based on the above embodiment, as shown in FIG. 6, the above step 502 may include:
  • Step 601 Perform binarization on the original ultrasonic image to obtain a binary image corresponding to the original ultrasonic image.
  • the processing device may use a simple and fast threshold segmentation to perform binarization processing on the original ultrasound image to obtain a binary image corresponding to the original ultrasound image; wherein the threshold segmentation may include maximum entropy threshold segmentation, Otsu threshold segmentation, adaptive threshold segmentation, and fixed threshold segmentation, etc.
  • the maximum entropy threshold segmentation is used to perform binarization processing on the original ultrasound image.
  • the binary image obtained by performing the binarization process can roughly segment the ilium region. In other places of this specification, the segmented binary image is also referred to as an ilium region image.
  • the maximum entropy threshold segmentation method uses the image grayscale probability information to obtain the image binary segmentation threshold, and then realizes image segmentation. Assuming that the image segmentation threshold is t, the pixels with image grayscale less than t constitute the background area B, and the pixels greater than or equal to t constitute the target area T.
  • the probability distribution of each grayscale level is:
  • PB (i) represents the probability distribution of each pixel in the background area
  • PT (i) represents the probability distribution of each pixel in the target area
  • Pi is the probability of pixel value i
  • L is the grayscale level of the image
  • the information entropy corresponding to the background and foreground can be expressed as:
  • H(B) is the information entropy corresponding to the background
  • H(T) is the information entropy corresponding to the foreground.
  • Traverse the pixel value t from 0 to 255 as the candidate segmentation threshold, count the sum of the information entropy under each threshold, and use the t at which the maximum value is obtained as the segmentation threshold.
  • the two parts of the background and foreground can maintain the maximum amount of information; at this point, the original ultrasound image can be binarized based on the segmentation threshold t to obtain the maximum entropy segmentation image, that is, the binary image corresponding to the original ultrasound image.
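The maximum entropy traversal just described maps directly to code. The sketch below uses a 256-bin histogram (assuming 8-bit gray levels) and natural logarithms; the log base only rescales the entropy sum H(B) + H(T) and does not change which t is selected.

```python
import numpy as np

def max_entropy_threshold(img):
    """Traverse t, sum background and target entropies, keep the argmax."""
    hist, _ = np.histogram(img.ravel(), bins=256, range=(0, 256))
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, 256):
        pb, pt = p[:t].sum(), p[t:].sum()
        if pb == 0 or pt == 0:
            continue
        pB = p[:t] / pb          # P_B(i): distribution within the background area B
        pT = p[t:] / pt          # P_T(i): distribution within the target area T
        hB = -np.sum(pB[pB > 0] * np.log(pB[pB > 0]))  # H(B)
        hT = -np.sum(pT[pT > 0] * np.log(pT[pT > 0]))  # H(T)
        if hB + hT > best_h:
            best_h, best_t = hB + hT, t
    return best_t
```

On a bimodal image the selected threshold falls between the two gray-level clusters, so binarizing at t separates background B from target T while both sides retain maximal information.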
  • Step 602 Perform connected domain analysis on the binary image to determine the target connected domain.
  • the ilium region can be roughly segmented, but there may still be some interfering contours. Therefore, a connected domain analysis can be performed on the binary image corresponding to the original ultrasound image to screen out the target connected domain corresponding to the ilium region.
  • the target connected domain is the connected domain corresponding to the ilium region.
  • a first preset screening rule of a connected domain corresponding to the iliac region can be determined by analyzing the features of the iliac region, wherein the first preset screening rule can be a screening rule related to at least one of the area of the iliac region, the centroid of the iliac region, the length of the iliac region, etc.
  • the processing device can perform a connected domain analysis on the binary image based on the first preset screening rule to determine the target connected domain.
  • the processing device can perform a connected domain analysis on the binary image to determine all connected domains in the binary image, and then, based on the first preset screening rule, determine a candidate connected domain that meets the first preset screening rule from all connected domains, and finally, determine the target connected domain corresponding to the iliac region based on the candidate connected domain.
  • when there is one candidate connected domain, it can be used as the target connected domain; when there are multiple candidate connected domains, any one of them can be used as the target connected domain; when multiple candidate connected domains have an intersection, those intersecting candidate connected domains can also be merged and the merged connected domain used as the target connected domain.
  • the processing device can also filter out the target connected domain corresponding to the iliac region from the multiple candidate connected domains according to the second preset filtering rule.
  • the second preset filtering rule can be a filtering rule related to at least one of the area of the connected domain, the aspect ratio of the connected domain, and the average pixel intensity in the connected domain.
  • Step 603 extracting an iliac region image corresponding to the hip joint of the subject to be tested from the original ultrasound image according to the target connected domain.
  • the processing device may intercept a partial area image corresponding to the target connected domain from the original ultrasound image as an iliac region image corresponding to the hip joint of the subject to be tested.
  • the target connected domain may not be a rectangular region.
  • a rectangular region corresponding to the target connected domain may be determined based on the target connected domain, and then a partial area image corresponding to the rectangular region is intercepted from the original ultrasound image as an iliac region image corresponding to the hip joint of the subject to be tested.
  • the processing device may determine a rectangular area tangent to the target connected domain based on the target connected domain, wherein the rectangular area encloses the target connected domain; alternatively, the processing device may first determine the centroid position of the target connected domain and then, based on the centroid position and a preset size (including a preset length and a preset width), determine a rectangular area centered at the centroid position and having the preset size.
  • this embodiment does not specifically limit the method of determining a rectangular area based on the target connected domain.
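Cropping a preset-size rectangle centered at the target connected domain's centroid might look like the sketch below; clamping the window to the image bounds is an implementation choice not specified in this text.

```python
import numpy as np

def crop_around_centroid(image, pixels, preset_h, preset_w):
    """Crop a preset-size rectangle centered at the connected domain's
    centroid, clamped so the window stays inside the image.

    `pixels` is an (N, 2) array of (y, x) coordinates of the target
    connected domain; the preset size would in practice be tied to the
    size of the original ultrasound image.
    """
    h, w = image.shape[:2]
    cy = int(round(pixels[:, 0].mean()))  # centroid row
    cx = int(round(pixels[:, 1].mean()))  # centroid column
    top = min(max(cy - preset_h // 2, 0), max(h - preset_h, 0))
    left = min(max(cx - preset_w // 2, 0), max(w - preset_w, 0))
    return image[top:top + preset_h, left:left + preset_w]
```

Even when the centroid lies near the image border, the clamped window keeps the full preset size, so downstream models always receive a fixed-size iliac region image.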
  • the processing device obtains a binary image corresponding to the original ultrasound image by binarizing the original ultrasound image; then, a connected domain analysis is performed on the binary image to determine the target connected domain corresponding to the iliac region; finally, based on the target connected domain corresponding to the iliac region, an iliac region image corresponding to the hip joint of the subject to be tested is extracted from the original ultrasound image; that is, this embodiment provides a method for obtaining an iliac region image, which provides a feasibility basis for obtaining the iliac region image.
  • the processing device when the processing device executes the above step 601, before binarizing the original ultrasound image, it can also filter the original ultrasound image to smooth the image, which can suppress the speckle noise in the original ultrasound image to a certain extent, improve the accuracy of the subsequent image segmentation process, and reduce the interference in the connected domain screening process.
  • Figure 7 is an exemplary flow chart of filtering and threshold segmentation of ultrasound images according to some embodiments of the present specification.
  • the above step 601 may include:
  • Step 701 Use a preset filtering algorithm to filter the original ultrasonic image to obtain a filtered ultrasonic image.
  • the preset filtering algorithm may include mean filtering, Gaussian filtering, median filtering, or anisotropic diffusion filtering, etc.
  • the type of the preset filtering algorithm is not specifically limited in the embodiments of this specification.
  • Step 702 Use a preset threshold segmentation algorithm to perform binary segmentation processing on the filtered ultrasonic image to obtain a binary image corresponding to the original ultrasonic image.
  • the preset threshold segmentation algorithm may include maximum entropy threshold segmentation, Otsu threshold segmentation, adaptive threshold segmentation, and fixed threshold segmentation, etc. This embodiment does not limit the specific type of the preset threshold segmentation algorithm.
  • the processing device may use a median filtering algorithm to filter the original ultrasonic image to obtain a filtered ultrasonic image, and then use a maximum entropy threshold segmentation algorithm to perform binary segmentation on the filtered ultrasonic image to obtain a binary image corresponding to the original ultrasonic image.
  • the processing device before the processing device performs binarization processing on the original ultrasonic image, it first uses a preset filtering algorithm to filter the original ultrasonic image to obtain a filtered ultrasonic image; then, it uses a preset threshold segmentation algorithm to perform binary segmentation processing on the filtered ultrasonic image to obtain a binary image corresponding to the original ultrasonic image; this can reduce the noise in the original ultrasonic image and achieve a smoothing effect, thereby improving the accuracy of subsequent image segmentation and connected domain screening and avoiding noise interference.
  • FIG8 is a schematic diagram of an exemplary process of obtaining a candidate connected domain according to some embodiments of the present specification.
  • This embodiment relates to an optional implementation process in which a processing device performs connected domain analysis on a binary image to determine a target connected domain.
  • the above step 602 may include:
  • Step 801: Extract a candidate connected domain that meets a first preset condition from the binary image.
  • The first preset condition may include conditions in multiple dimensions, for example at least two of the following: the area of the connected domain reaches a preset area threshold; the centroid of the connected domain is located at a preset position; and the length of the connected domain reaches a preset length threshold.
  • the first preset condition includes at least two screening conditions, and when a candidate connected domain satisfying the first preset condition is extracted from a binary image, the candidate connected domain should satisfy the at least two screening conditions simultaneously.
  • The processing device can perform a connected domain analysis on the binary image to determine all connected domains in the binary image, and then, for each connected domain, determine whether the connected domain simultaneously satisfies all conditions in the first preset condition. If it does, the connected domain is determined as a candidate connected domain; if any one of the conditions in the first preset condition is not met, the connected domain does not meet the screening criteria.
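A minimal sketch of the screening in step 801, assuming the three example dimensions above (area, centroid position, length); the 4-connected labeling routine and the threshold parameters are illustrative assumptions, not taken from the specification.

```python
from collections import deque
import numpy as np

def label_components(binary):
    """4-connected component labeling via BFS; returns a label map and count."""
    labels = np.zeros(binary.shape, dtype=int)
    h, w = binary.shape
    current = 0
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] and labels[sy, sx] == 0:
                current += 1
                labels[sy, sx] = current
                q = deque([(sy, sx)])
                while q:
                    y, x = q.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] and labels[ny, nx] == 0:
                            labels[ny, nx] = current
                            q.append((ny, nx))
    return labels, current

def screen_candidates(binary, min_area, centroid_band, min_length):
    """Keep components that simultaneously satisfy all three example conditions:
    area >= min_area, centroid x inside centroid_band, height >= min_length."""
    labels, n = label_components(binary)
    candidates = []
    for lab in range(1, n + 1):
        ys, xs = np.nonzero(labels == lab)
        area = len(ys)
        cx = xs.mean()
        length = ys.max() - ys.min() + 1
        if area >= min_area and centroid_band[0] <= cx <= centroid_band[1] and length >= min_length:
            candidates.append(lab)
    return labels, candidates
```

A component must pass every active condition at once, matching the "satisfy all conditions simultaneously" rule above.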
  • Step 802: When a candidate connected domain that meets the first preset condition is extracted, determine a target connected domain according to the candidate connected domain.
  • the target connected domain corresponding to the iliac region can be further determined based on the candidate connected domain.
  • When the number of candidate connected domains is one, the candidate connected domain can be used as the target connected domain, or a deformed connected domain derived from it (such as a rectangular area tangent to the candidate connected domain) can be used as the target connected domain. When there are multiple candidate connected domains, a target connected domain that meets a second preset condition can be determined from among them, or the target connected domain can be determined based on multiple candidate connected domains, for example by fusing multiple candidate connected domains into the target connected domain.
  • a solution is provided for determining a target connected domain that satisfies a second preset condition from candidate connected domains, wherein the second preset condition may be a condition with the highest identification score. That is, a discrimination analysis is performed on each candidate connected domain to obtain the identification score corresponding to each candidate connected domain, and the candidate connected domain with the highest identification score is used as the target connected domain.
  • FIG. 9 is an exemplary schematic flowchart of obtaining a target connected domain according to some embodiments of the present specification. As shown in FIG. 9 , the processing device determines a target connected domain that satisfies a second preset condition from candidate connected domains, which may include:
  • Step 901: For each candidate connected domain, determine a candidate image corresponding to the candidate connected domain from the original ultrasound image.
  • Step 902: Calculate the identification score corresponding to the candidate image according to the candidate image and a preset identification algorithm.
  • For more information about step 901 and step 902, please refer to the relevant description of FIG. 3.
  • Step 903: Use the candidate connected domain corresponding to the candidate image with the highest identification score as the target connected domain.
  • the target connected domain corresponding to the iliac region can be determined from multiple candidate connected domains.
  • Step 803: When no candidate connected domain satisfying the first preset condition is extracted, perform dimensionality reduction processing on the first preset condition to obtain an updated condition, use the updated condition as the first preset condition, and re-execute the step of extracting a candidate connected domain satisfying the first preset condition from the binary image, until a candidate connected domain satisfying the first preset condition is extracted from the binary image or the dimension of the first preset condition is reduced to one.
  • If no candidate connected domain is found, the screening conditions can be reduced and the connected domain screening can be performed again. In some embodiments, when the first preset condition is subjected to dimensionality reduction processing, that is, when the screening conditions in the first preset condition are reduced, the screening conditions with low importance can be removed preferentially according to the importance of each screening condition.
  • For example, suppose the first preset condition includes an area condition that the area of the connected domain reaches a preset area threshold, a centroid condition that the center of mass of the connected domain is located at a preset position, and a length condition that the length of the connected domain reaches a preset length threshold. In this case, the length condition can be removed first, that is, the area condition and the centroid condition are taken as the first preset condition, and the connected domains are screened again. If there is still no candidate connected domain that meets both the area condition and the centroid condition, the area condition can then be removed, the centroid condition alone is taken as the first preset condition, and the connected domains are screened again.
  • If necessary, the step of binarizing the original ultrasound image can be re-executed to re-determine the binary image, and the connected domain screening operation can then be performed on the new binary image according to the above screening process, until a candidate connected domain that meets the first preset condition exists.
  • a target connected domain that meets the second preset condition may be determined from the candidate connected domains.
  • the specific implementation method may refer to the description of the relevant contents of step 802 above, which will not be described in detail here.
  • In summary, the processing device first extracts candidate connected domains that satisfy a first preset condition comprising conditions in multiple dimensions from the binary image. When a candidate connected domain satisfying the first preset condition is extracted, a target connected domain satisfying a second preset condition is determined from the candidate connected domains. When no such candidate connected domain is extracted, dimensionality reduction processing is performed on the first preset condition to obtain an updated condition, the updated condition is used as the first preset condition, and the extraction step is re-executed until a candidate connected domain satisfying the first preset condition is extracted from the binary image or the dimension of the first preset condition is reduced to one. In this way, the accuracy of iliac region segmentation can be improved.
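The retry loop of step 803 can be sketched as below. The removal order (length first, then area, keeping the centroid condition last) follows the example in this section; the condition representation as named predicates is an assumption.

```python
def extract_with_fallback(components, conditions):
    """conditions: list of (name, predicate) pairs, ordered from most to
    least important, e.g. [centroid, area, length] so that the length
    condition is dropped first and the centroid condition survives longest.
    Retry with fewer conditions until candidates are found or only one
    condition remains (the one-dimensional stopping case of step 803)."""
    active = list(conditions)
    while True:
        candidates = [c for c in components
                      if all(pred(c) for _, pred in active)]
        if candidates or len(active) == 1:
            return candidates, [name for name, _ in active]
        active.pop()  # remove the least important remaining condition
```

Here `components` could be any per-component summaries (area, centroid, length) produced by the connected domain analysis.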
  • FIG10 is a schematic diagram of an exemplary process of determining an iliac region image according to some embodiments of this specification.
  • This embodiment relates to an optional implementation process in which a processing device extracts an iliac region image corresponding to a hip joint of a subject to be tested from an original ultrasound image according to a target connected domain.
  • the above step 603 may include:
  • Step 1001: Determine the centroid position of the target connected domain based on the original ultrasound image.
  • the processing device can determine the image area corresponding to the target connected domain from the original ultrasound image based on the position of the target connected domain in the binary image, analyze the image area corresponding to the target connected domain in the original ultrasound image, and determine the center of mass position corresponding to the target connected domain.
  • Step 1002: Extract an iliac region image corresponding to the hip joint of the subject to be tested from the original ultrasound image based on the centroid position of the target connected domain and a preset size.
  • the preset size is related to the size of the original ultrasound image.
  • the processing device can determine the proportional relationship between the image size of the iliac region and the image size of the ultrasound image by analyzing the image size of the ultrasound image and the image size of the iliac region in the ultrasound image.
  • For example, the image width of the iliac region can be 1/2 of the image width of the ultrasound image, and the image height of the iliac region can be 4/5 of the image height of the ultrasound image.
  • the processing device can determine the preset size corresponding to the iliac region image according to the proportional relationship and the image size of the original ultrasound image of the object to be tested. Furthermore, based on the centroid position of the target connected domain and the preset size, an image region of the preset size centered at the centroid position of the target connected domain in the original ultrasound image can be determined, and the image region can be cut out from the original ultrasound image to obtain the iliac region image corresponding to the hip joint of the object to be tested.
  • The processing device determines the centroid position of the target connected domain based on the original ultrasound image, and extracts the iliac region image corresponding to the hip joint of the object to be measured from the original ultrasound image based on the centroid position of the target connected domain combined with a preset size, where the preset size is related to the size of the original ultrasound image. In this way, an accurate iliac region image can be obtained, thereby improving the accuracy of iliac region image segmentation.
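A minimal sketch of step 1002 under stated assumptions: the crop is centered at the centroid, and its size uses ratios in the spirit of the 1/2 and 4/5 examples above; the clamping at image borders is an added assumption not spelled out in the specification.

```python
import numpy as np

def crop_roi(image, centroid, width_ratio=0.5, height_ratio=0.8):
    """Crop a region centered at `centroid` (row, col) whose size is
    proportional to the full image; the ratios are illustrative."""
    h, w = image.shape[:2]
    rh, rw = int(h * height_ratio), int(w * width_ratio)
    cy, cx = centroid
    # Shift the window inward so the crop never leaves the image.
    y0 = int(np.clip(cy - rh // 2, 0, max(h - rh, 0)))
    x0 = int(np.clip(cx - rw // 2, 0, max(w - rw, 0)))
    return image[y0:y0 + rh, x0:x0 + rw]
```

The returned sub-image would serve as the iliac region image passed to the active contour model.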
  • FIG11 is an exemplary flow chart of a hip joint classification method according to some embodiments of this specification.
  • This embodiment relates to an optional implementation process in which a processing device inputs an iliac region image into a preset active contour model to perform hip joint classification and obtain the hip joint type of the object to be tested.
  • the above step 503 includes:
  • Step 1101: Input the ilium region image into a preset active contour model to obtain a hip joint segmentation image.
  • Step 1102: Based on the hip joint segmentation image, determine the bone top line and the cartilage top line.
  • The bone top line is the line between the turning point of the bone edge and the lowest point of the lower edge of the iliac branch, and the cartilage top line is the line between the turning point of the bone edge and the midpoint of the labrum.
  • the processing device can determine three anatomical points, namely, the turning point of the bone edge, the lowest point of the lower edge of the iliac branch, and the midpoint of the labrum, based on the hip joint segmentation image, and then generate a bone top line based on the turning point of the bone edge and the lowest point of the lower edge of the iliac branch, and generate a cartilage top line based on the turning point of the bone edge and the midpoint of the labrum.
  • FIG12 is an exemplary flow chart of determining three anatomical points according to some embodiments of the present specification. As shown in FIG12 , the method of determining the bone edge turning point, the lowest point of the lower edge of the iliac ramus, and the midpoint of the labrum may include:
  • Step 1201: Segment the hip joint segmentation image into a first image and a second image according to a preset baseline.
  • the preset baseline is a straight line formed by the ordinate of the centroid of the ilium in the ilium region image, and the first image is a partial hip joint image including the labrum.
  • the centroid of the ilium in the ilium region image is the centroid of the target connected domain determined in step 1001.
  • The horizontal straight line defined by the ordinate of the centroid position of the target connected domain is the preset baseline, that is, the baseline used for determining the hip joint bone apex angle and cartilage apex angle.
  • the hip joint segmentation image can be segmented into a first image corresponding to the upper half of the hip joint and a second image corresponding to the lower half of the hip joint.
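Since the preset baseline is the horizontal line through the ilium centroid's ordinate, step 1201 amounts to slicing the segmentation image at that row; a minimal sketch (row indexing assumed):

```python
import numpy as np

def split_at_baseline(seg_image, baseline_y):
    """Split a hip joint segmentation image at the horizontal preset baseline.
    The first (upper) image contains the labrum; the second (lower) image
    contains the lower edge of the iliac branch."""
    first = seg_image[:baseline_y]
    second = seg_image[baseline_y:]
    return first, second
```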
  • Step 1202: Determine the flange point of the upper edge line from the second image, and generate the bone top line based on the flange point and the end point of the upper edge line.
  • The flange point of the upper edge line is the above-mentioned turning point of the bone edge, and the end point of the upper edge line is the above-mentioned lowest point of the lower edge of the iliac branch.
  • Step 1203: Determine the labrum midpoint from the first image, and generate the cartilage top line based on the flange point and the labrum midpoint.
  • the processing device can perform morphological processing and logical screening on the first image to segment the image of the area where the labrum is located, determine the center of mass of the area where the labrum is located based on the image, and determine the center of mass as the midpoint of the labrum.
  • a cartilage top line is generated based on the flange point determined in step 1202 and the labrum midpoint.
  • Step 1103: Determine the bone apex angle according to the slope of the bone apex line, and determine the cartilage apex angle according to the slope of the cartilage apex line.
  • the bone vertex angle is the angle formed by the preset baseline and the bone vertex line in the fourth quadrant
  • the cartilage vertex angle is the angle formed by the preset baseline and the cartilage vertex line in the first quadrant.
  • The slope of the bone top line can be calculated from the coordinates of its endpoints, and the slope of the cartilage top line can be calculated in the same way. The method of calculating the bone apex angle and the cartilage apex angle according to these slopes can be referred to the description of the previous embodiment and will not be repeated here.
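Because the preset baseline is horizontal, each apex angle reduces to the acute angle between the corresponding line and the horizontal; the first/fourth-quadrant conventions above only fix which side of the baseline the angle is measured on. A sketch under that assumption:

```python
import math

def line_slope(p1, p2):
    """Slope of the line through two (x, y) points; assumes a non-vertical line."""
    return (p2[1] - p1[1]) / (p2[0] - p1[0])

def angle_with_horizontal(slope):
    """Acute angle in degrees between a line of the given slope and the
    horizontal preset baseline."""
    return math.degrees(math.atan(abs(slope)))
```

For example, the bone apex angle could be obtained by feeding the slope of the bone top line (computed from the flange point and the end point) into `angle_with_horizontal`.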
  • Step 1104: Determine the hip joint type of the subject to be tested according to the bone apex angle, the cartilage apex angle, and a preset hip joint classification template.
  • the preset hip joint classification template can be found in Table 1.
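Table 1 is not reproduced in this excerpt. Purely as an assumed illustration, a classification lookup in the spirit of the widely used Graf method might look like the following; the thresholds are the commonly cited Graf cut-offs and are not necessarily those of Table 1.

```python
def graf_type(alpha, beta):
    """Illustrative hip joint classification template (Graf-style cut-offs,
    assumed for demonstration; the actual Table 1 may differ)."""
    if alpha >= 60:
        return "Type I"
    if 50 <= alpha < 60:
        return "Type IIa/IIb"
    if 43 <= alpha < 50:
        # The beta angle distinguishes a decentering joint from Type IIc.
        return "Type D" if beta > 77 else "Type IIc"
    return "Type III/IV"
```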
  • The processing device obtains a hip joint segmentation image by inputting the ilium region image into a preset active contour model. Based on the hip joint segmentation image, the bone apex line and the cartilage apex line are determined; the bone apex angle is determined according to the slope of the bone apex line, and the cartilage apex angle is determined according to the slope of the cartilage apex line. The hip joint type of the object to be tested is then determined according to the bone apex angle, the cartilage apex angle, and a preset hip joint classification template. Using the method in this embodiment, the region of interest of the hip joint, i.e., the ilium region image, is used as the input image for hip joint segmentation, which can shorten the calculation time and improve the segmentation rate and segmentation accuracy, thereby improving the accuracy and processing efficiency of hip joint classification.
  • FIG13 is a schematic diagram of an exemplary flow chart of a method for determining flange points according to some embodiments of this specification. This embodiment relates to an optional implementation process of a processing device determining flange points of an upper edge line from a second image. Based on the above embodiment, as shown in FIG13, the above step 1202 includes:
  • Step 1301: Determine the upper edge line in the second image, and extract points on the upper edge line to form a bone top line point set.
  • the upper edge line in the second image is the upper boundary line corresponding to the lower edge of the iliac branch.
  • multiple points between the first point and the last point on the upper edge line are extracted according to a preset interval to form a bone top line point set.
  • Step 1302: Generate a straight line corresponding to each point according to each point in the bone top line point set and a preset slope.
  • The preset slope is the slope of the line segment obtained by fitting the points in the bone top line point set with the least squares method. That is, the bone top line point set is fitted with the least squares method to obtain a fitted straight line and its slope, and the slope of that straight line is used as the preset slope.
  • the preset slope is used in combination with the bone top line point set to obtain straight lines corresponding to each point in the bone top line point set.
  • Step 1303: Select a target straight line from the straight lines corresponding to the points, and determine the point corresponding to the target straight line as the flange point.
  • each point in the bone top line point set is below the target straight line.
  • The processing device can sort the vertical intercepts of the straight lines corresponding to the points to find a target straight line such that all points in the bone top line point set lie to the lower left of it. The coordinate point in the bone top line point set corresponding to the target straight line is then the flange point, and the point with the maximum vertical coordinate in the bone top line point set is regarded as the end point of the upper edge line. The flange point and the end point are then connected to obtain the bone top line, and the slope of the bone top line is calculated.
  • The processing device determines the upper edge line in the second image and extracts the points on the upper edge line to form a bone top line point set; generates a straight line corresponding to each point according to the points in the bone top line point set and the preset slope; and then selects a target straight line from the straight lines corresponding to the points and determines the point corresponding to the target straight line as the flange point. Each point in the bone top line point set is below the target straight line, and the preset slope is the slope of the line segment obtained by fitting the points in the bone top line point set with the least squares method. That is, this embodiment provides an implementation for automatically determining the flange point, which improves the feasibility and operability of the processing device in doing so, improves the processing efficiency of determining the flange point, and improves the accuracy of the flange point.
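Steps 1301–1303 can be sketched as follows, assuming mathematical y-up coordinates; in image coordinates, where y grows downward, `np.argmax` would become `np.argmin`. All names are illustrative.

```python
import numpy as np

def find_flange_point(points):
    """points: sequence of (x, y) samples on the upper edge line.
    Fit the preset slope by least squares, draw a line of that slope through
    each point, and pick the line that every point lies on or below; the
    sample defining that line is the flange point."""
    pts = np.asarray(points, dtype=float)
    xs, ys = pts[:, 0], pts[:, 1]
    slope = np.polyfit(xs, ys, 1)[0]   # least-squares fitted (preset) slope
    intercepts = ys - slope * xs       # intercept of each point's line
    idx = int(np.argmax(intercepts))   # the line with every point below it
    return tuple(pts[idx]), slope
```

Connecting the flange point to the point with the maximum vertical coordinate in the point set (the end point) would then yield the bone top line.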
  • an ultrasonic hip joint automatic classification system is also provided, and its system structure diagram is shown in FIG14, which is mainly divided into 7 parts.
  • the ultrasonic transducer is mainly responsible for transmitting and receiving signals
  • the data receiving module is mainly responsible for receiving and amplifying the electrical signals, performing analog-to-digital conversion, and compressing the data and sending it to the beamforming module.
  • the beamforming module and the image processing module are mainly responsible for analyzing and interpolating the echo signals to form B signals
  • the data storage module is responsible for storing the acquired B signals in the form of images.
  • the stored original ultrasound image is sent to the image algorithm module to automatically obtain the hip joint classification, and finally the classification result is displayed in the image display module.
  • the image algorithm module is specifically used to execute the steps of the hip joint typing method in any of the above embodiments to realize automatic typing of the hip joint.
  • Its implementation method can be shown in Figure 15, and the entire algorithm process is divided into five parts: input, iliac region segmentation, hip joint segmentation, hip joint typing and output.
  • The hip joint typing method in the embodiment of this specification is a fully automatic measurement method. The doctor selects an appropriate hip joint section of the object to be measured as the input image in a frozen state, and the final output results are the α angle, β angle, and hip joint type of the hip joint of the object to be measured.
  • Its specific implementation steps may include:
  • the original ultrasonic image is filtered using a median filtering algorithm to obtain a filtered ultrasonic image; then, the filtered ultrasonic image is binary segmented using a maximum entropy threshold segmentation algorithm to obtain a binary image corresponding to the original ultrasonic image, as shown in FIG17 .
  • the specific implementation process may include:
  • 1) From the binary image (i.e., the maximum entropy segmentation image in FIG. 19), extract candidate connected domains that satisfy the first preset condition.
  • the first preset condition is subjected to dimensionality reduction processing to obtain an updated condition, and the updated condition is used as the first preset condition, and the above step 1) is re-executed until a candidate connected domain that satisfies the first preset condition is extracted from the binary image or the dimension of the first preset condition is one-dimensional.
  • the order of dimensionality reduction processing is to remove the length condition first and then the area condition.
  • For each candidate connected domain, determine the candidate image corresponding to it from the original ultrasound image, input each candidate image into a preset discriminator, and calculate the identification score corresponding to each candidate image; then, take the candidate connected domain corresponding to the candidate image with the highest identification score as the target connected domain.
  • the centroid position of the target connected domain is determined, such as the centroid position marked in FIG18 ; and based on the centroid position of the target connected domain combined with a preset size, an iliac region image corresponding to the hip joint of the subject to be tested is extracted from the original ultrasound image, as shown in FIG20 ; wherein the preset size is related to the size of the original ultrasound image.
  • the ilium region image is input into a preset active contour model to obtain a hip joint segmentation image, as shown in FIG21 ; then, the hip joint contour is filtered out from the hip joint segmentation image to obtain a hip joint segmentation image including only the hip joint contour, as shown in FIG22 .
  • the specific implementation process may include:
  • the hip joint segmentation image is segmented into a first image and a second image; wherein the preset baseline is a straight line formed based on the vertical coordinate of the center of mass of the ilium in the ilium region image, and the first image is a partial hip joint image including the labrum.
  • For the second image: determine the upper edge line in the second image, and extract the points on the upper edge line to form a bone top line point set; generate a straight line corresponding to each point according to the points in the bone top line point set and the preset slope; then select a target straight line from the straight lines corresponding to the points, and determine the point corresponding to the target straight line as the flange point; regard the point with the maximum vertical coordinate in the bone top line point set as the end point; finally, connect the flange point and the end point to generate the bone top line.
  • the bone apex angle is determined based on the slope of the bone apex line
  • the cartilage apex angle is determined based on the slope of the cartilage apex line.
  • Alpha represents the α angle
  • Beta represents the β angle
  • HIP Type represents the hip joint type.
  • In the hip joint classification method in the embodiment of the present specification, the ilium segmentation area is first obtained and its center of mass is extracted; the region of interest of the hip joint is obtained based on the centroid position, and the region of interest is then input into the active contour model for further detailed segmentation to obtain the hip joint segmentation result. Compared with inputting the entire ultrasound image into the active contour model, this process can improve the processing speed and accuracy of the active contour model algorithm, thereby improving the accuracy of hip joint angle measurement as well as the speed and accuracy of hip joint segmentation.
  • the hip joint classification method provided in this embodiment only needs to input the original ultrasound image of the object to be tested to obtain the hip joint classification result, which greatly improves the doctor's diagnostic efficiency and saves the patient's waiting time.
  • the fully automatic method can make the classification result more objective and reduce the misdiagnosis caused by the lack of experience of some doctors.
  • Unlike deep learning-based methods, the method provided in the embodiment of this specification does not need to rely on a large amount of data for training; only a small number of ultrasound images are needed for algorithm verification, which greatly saves algorithm costs.
  • the ultrasonic hip joint automatic classification system provided in this embodiment includes the entire process from receiving signals from the ultrasonic transducer, converting them into B-mode images through beamforming, to data storage, image algorithm processing and displaying results, which improves the integrity and reliability of the system.
  • The steps in the flowcharts involved in the above embodiments may include multiple sub-steps or stages. These sub-steps or stages are not necessarily executed at the same time and may be executed at different moments; their execution order is not necessarily sequential, and they may be executed in turn or alternately with other steps, or with at least part of the sub-steps or stages of other steps.
  • FIG. 24 is a diagram showing the internal structure of a processing device according to some embodiments of the present specification.
  • a computer device (the computer device may also be referred to as a processing device, for example, a processing device 120) is provided.
  • the computer device may be an ultrasound device, a server connected to the ultrasound device in communication, or a terminal device connected to the ultrasound device in communication. Of course, it may also be a terminal device connected to the server in communication, etc.
  • the computer device is a terminal device, its internal structure diagram may be as shown in FIG. 24.
  • the computer device includes a processor, a memory, a communication interface, a display unit, and an input device connected via a system bus.
  • the communication interface, the display unit, and the input device may be connected to the system bus via an I/O (input/output) interface.
  • the processor of the computer device is used to provide computing and control capabilities.
  • the memory of the computer device includes a non-volatile storage medium and an internal memory.
  • the non-volatile storage medium stores an operating system and a computer program.
  • the internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium.
  • the communication interface of the computer device is used to communicate with an external terminal in a wired or wireless manner, and the wireless manner may be implemented via WIFI, a mobile cellular network, NFC (near field communication) or other technologies.
  • the display screen of the computer device can be a liquid crystal display screen or an electronic ink display screen
  • the input device of the computer device can be a touch layer covering the display screen, or a key, trackball or touchpad provided on the computer device housing, or an external keyboard, touchpad or mouse, etc.
  • FIG. 24 is only a block diagram of a partial structure related to the solution of this specification, and does not constitute a limitation on the computer device to which the solution of this specification is applied.
  • the specific computer device may include more or fewer components than shown in the figure, or combine certain components, or have different component arrangements.
  • the system and its modules shown in FIG. 24 can be implemented in various ways.
  • the system and its modules can be implemented by hardware, software, or a combination of software and hardware.
  • the hardware part can be implemented using dedicated logic; the software part can be stored in a memory and executed by an appropriate instruction execution system, such as a microprocessor or dedicated design hardware.
  • For example, the software part may be provided as processor control code, for example on a carrier medium such as a disk, CD or DVD-ROM, in a programmable memory such as a read-only memory (firmware), or on a data carrier such as an optical or electronic signal carrier.
  • The system and its modules of the present specification can be implemented not only by hardware circuits such as very-large-scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field-programmable gate arrays and programmable logic devices, but also by software executed by various types of processors, or by a combination of the above hardware circuits and software (for example, firmware).
  • each module can share a storage module, or each module can have its own storage module. Such variations are all within the scope of protection of this specification.
  • In some embodiments, numbers describing quantities of components and attributes are used. It should be understood that such numbers used in the description of the embodiments are modified in some examples by the terms "about", "approximately" or "substantially". Unless otherwise specified, "about", "approximately" or "substantially" indicates that the number is allowed to vary by ±20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximate values, which may change according to the required features of individual embodiments. In some embodiments, the numerical parameters should take into account the specified significant digits and adopt the general method of retaining digits. Although the numerical ranges and parameters used to confirm the breadth of their scope in some embodiments of this specification are approximate values, in specific embodiments such numerical values are set as accurately as is feasible.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Multimedia (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Geometry (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)
  • Image Processing (AREA)

Abstract

本说明书实施例提供一种髋关节角度检测系统和方法,其中,该系统包括处理器,处理器用于执行以下操作:获取待测对象的超声图像;从超声图像中提取待测对象的髋关节对应的髂骨区域轮廓;获取髂骨区域轮廓中髂骨区域的质心位置,并基于质心位置获取髋关节图像;基于髋关节图像,确定待测对象的髋关节角度。

Description

一种髋关节角度检测系统和方法
交叉引用
本说明书要求2022年09月27日提交的中国申请号202211180701.9的优先权,其全部内容通过引用并入本文。
技术领域
本说明书涉及髋关节分型领域,特别涉及一种髋关节角度检测系统和方法。
背景技术
新生儿髋关节分型可以通过早期筛查和干预预防或治疗婴儿髋关节发育不良。这有助于避免后续出现严重的髋关节问题,如髋关节脱位、髋关节退行性变等。对于新生儿来说,进行髋关节分型非常重要,其可以帮助医生及时发现婴儿髋关节发育不良,并采取相应的预防或治疗措施。
本说明书一些实施例提供了一种髋关节角度检测系统和方法,以更好地对新生儿髋关节进行检测。
发明内容
本说明书一个或多个实施例提供一种髋关节角度检测系统,所述系统包括处理器,所述处理器用于执行以下操作:获取待测对象的超声图像;从所述超声图像中提取所述待测对象的髋关节对应的髂骨区域轮廓;获取所述髂骨区域轮廓中髂骨区域的质心位置,并基于所述质心位置获取髋关节图像;基于所述髋关节图像,确定所述待测对象的髋关节角度。
本说明书一个或多个实施例提供一种髋关节角度检测方法,所述方法包括:获取待测对象的超声图像;从所述超声图像中提取所述待测对象的髋关节对应的髂骨区域轮廓;获取所述髂骨区域轮廓中髂骨区域的质心位置,并基于所述质心位置获取髋关节图像;基于所述髋关节图像,确定所述待测对象的髋关节角度。
本说明书一个或多个实施例提供一种计算机可读存储介质,所述存储介质存储计算机指令,当计算机读取存储介质中的计算机指令后,计算机执行上述髋关节角度检测方法。
本说明书一个或多个实施例提供一种髋关节分型方法,所述方法包括:获取待测对象的原始超声图像;所述原始超声图像为所述待测对象的髋关节对应的超声图像;从所述原始超声图像中提取所述待测对象的髋关节对应的髂骨区域图像;将所述髂骨区域图像输入至预设的活动轮廓模型中进行髋关节分型,得到所述待测对象的髋关节类型。
本说明书一个或多个实施例提供一种髋关节分型装置,所述装置包括处理器,所述处理器用于执行如上述的髋关节分型方法。
附图说明
本说明书将以示例性实施例的方式进一步说明,这些示例性实施例将通过附图进行详细描述。这些实施例并非限制性的,在这些实施例中,相同的编号表示相同的结构,其中:
图1是根据本说明书一些实施例所示的髋关节分型系统的示例性应用场景示意图;
图2是根据本说明书一些实施例所示的髋关节角度检测方法的示例性流程图;
图3是根据本说明书一些实施例所示的确定髂骨区域轮廓的示例性流程图;
图4是根据本说明书一些实施例所示的确定髋关节角度的方法的示例性流程图;
图5是根据本说明书一些实施例所示的髋关节分型方法的示例性流程图;
图6为根据本说明书另一些实施例所示的确定髂骨区域图像的示例性流程示意图;
图7是根据本说明书一些实施例所示的对超声图像进行滤波和阈值分割的示例性流程图;
图8是根据本说明书一些实施例所示的获取候选连通域的示例性流程示意图;
图9是根据本说明书一些实施例所示的获取目标连通域的示例性流程图;
图10是根据本说明书一些实施例所示的确定髂骨区域图像的示例性流程示意图;
图11是根据本说明书一些实施例所示的髋关节分型方法的示例性流程示意图;
图12是根据本说明书一些实施例所示的确定三个解剖点的示例性流程图;
图13是根据本说明书一些实施例所示的确定凸缘点方法的示例性流程示意图;
图14是根据本说明书一些实施例所示的髋关节分型系统的结构示意图;
图15是根据本说明书一些实施例所示的髋关节分型方法的流程结构示意图;
图16是根据本说明书一些实施例所示的待测对象的超声图像;
图17是根据本说明书一些实施例所示的超声图像对应的二值分割图像;
图18是根据本说明书一些实施例所示的超声图像对应的目标连通域图像;
图19是根据本说明书一些实施例所示的连通域筛选的流程结构示意图;
图20是根据本说明书一些实施例所示的超声图像中的髂骨区域轮廓;
图21是根据本说明书一些实施例所示的髂骨区域轮廓对应的髋关节图像;
图22是根据本说明书一些实施例所示的仅包括髋关节轮廓的髋关节图像;
图23是根据本说明书一些实施例所示的超声图像对应的髋关节分型效果图;
图24是根据本说明书一些实施例所示的处理设备的内部结构图。
具体实施方式
为了更清楚地说明本说明书实施例的技术方案,下面将对实施例描述中所需要使用的附图作简单的介绍。显而易见地,下面描述中的附图仅仅是本说明书的一些示例或实施例,对于本领域的普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图将本说明书应用于其它类似情景。除非从语言环境中显而易见或另做说明,图中相同标号代表相同结构或操作。
应当理解,本文使用的“系统”、“装置”、“单元”和/或“模块”是用于区分不同级别的不同组件、元件、部件、部分或装配的一种方法。然而,如果其他词语可实现相同的目的,则可通过其他表达来替换所述词语。
如本说明书和权利要求书中所示,除非上下文明确提示例外情形,“一”、“一个”、“一种”和/或“该”等词并非特指单数,也可包括复数。一般说来,术语“包括”与“包含”仅提示包括已明确标识的步骤和元素,而这些步骤和元素不构成一个排它性的罗列,方法或者设备也可能包含其它的步骤或元素。
本说明书中使用了流程图用来说明根据本说明书的实施例的系统所执行的操作。应当理解的是,前面或后面操作不一定按照顺序来精确地执行。相反,可以按照倒序或同时处理各个步骤。同时,也可以将其他操作添加到这些过程中,或从这些过程移除某一步或数步操作。
目前,针对新生儿常见的髋关节疾病,通常采用超声检测结合Graf方法测量新生儿髋关节的α角和β角,并基于测量得到的α角和β角来判断新生儿髋关节的发育类型,例如,是否为正常发育,异常发育类型等。
相关技术中,新生儿髋关节的角度测量的方法有传统的手动测量方法,也有一些自动测量方法。
手动测量方法通常需要医生基于新生儿髋关节的超声图像,手动选取髋关节部位的五个解剖点,并利用五个解剖点自动计算髋关节的骨顶角(α角)和软骨顶角(β角),其中,五个解剖点分别为股直肌反折头与髂骨骨膜移行处上缘、股直肌反折头与髂骨骨膜移行处下缘、骨缘转折点、髂骨支下缘最低点和盂唇中点;对于手动测量方法,测量结果较为主观,准确性高度依赖于医生的经验,且医生操作频繁,工作效率低。
自动测量方法包括基于髋关节分型的自动测量方法和基于深度学习的自动测量方法。基于髋关节分型的自动测量方法包括基于区域的活动轮廓模型的髋关节分型方法,其处理过程为:对新生儿的髋关节超声图像进行初步处理后,运用基于区域的活动轮廓模型进行图像分割,获取髋关节组织轮廓,进而利用直线拟合的方式获取骨顶角与软骨顶角;该方法是直接使用初步预处理的超声图像,输入至活动轮廓模型,对于速度和精度都有较大的影响。另一种基于髋关节分型的自动测量方法,其处理过程为:对输入的超声图像进行均值滤波后,手动获取感兴趣区域,再结合图像增强以及二值化策略,进而结合直线拟合的方式获取骨顶角与软骨顶角;该方法属于一种半自动测量方法,依赖于手动获取感兴趣区域,增加了工作量。
基于深度学习的自动测量方法,是通过获取新生儿髋关节目标关键点的位置以及目标测量值,并将上述信息送入深度学习网络中进行训练获取网络模型,实现输入超声图像,输出目标位置的功能。但该方法依赖于庞大的超声图像数据,但往往涉及到病患隐私以及数据量较少等问题,难以实施或依据少量数据难以有较为明显的效果。
针对传统新生儿髋关节的角度测量方法所存在的技术问题,本说明书一些实施例提出了一种髋关节角度检测系统和方法,可以提高髋关节分型的处理效率,而且还能保证分型结果的客观性以及准确性。
图1是根据本说明书一些实施例所示的髋关节角度检测系统的示例性应用场景示意图。
如图1所示,髋关节角度检测系统100可以包括超声成像设备110、处理设备120、网络130、存储设备140和终端150。在一些实施例中,髋关节角度检测系统100能够用于进行髋关节角度检测和髋关节分型。
超声成像设备110能够用于获取待测对象上目标区域的超声成像数据。在一些实施例中,超声成像设备可以利用超声波的物理特性和对象上目标区域在声学性质上的差异获取对象上目标区域的超声成像数据,所述超声成像数据可以以波形、曲线或图像的形式显示和/或记录与所述对象上目标区域相关的特征。仅作为示例,所述超声成像设备可以包括一个或多个超声探头,用于向所述目标区域(例如,位于治疗床上的对象或其器官、组织)发射超声波。超声波在经过具有不同声阻抗和不同衰减特性的器官与组织后产生不同的反射与衰减,从而形成可以被所述一个或多个超声探头接收的回声。超声成像设备可以对收到的回声进行处理(例如,放大、转换)和/或显示,生成超声成像数据。在一些实施例中,超声成像设备可以包括B超设备、彩色多普勒超声设备、心脏彩超设备、三维彩超设备等或其任意组合。
在一些实施例中,超声成像设备110可以通过网络130将超声成像数据发送到处理设备120、存储设备140和/或终端150以进行进一步处理。例如,超声成像设备获取的超声成像数据可以是非图像形式的数据,所述非图像形式的数据可以被发送到处理设备120,用于生成超声图像。再例如,超声成像设备获取的超声成像数据可以是图像形式的数据,所述图像形式的数据可以被发送到终端150以进行显示。又例如,所述超声成像数据可以被存储在存储设备140中。
处理设备120可以处理从超声成像设备110、存储设备140和/或终端150获得的数据和/或信息。例如,处理设备120可以处理从超声成像设备110中的成像设备获取的超声成像数据,并生成目标区域的超声图像。在一些实施例中,所述超声图像可以被发送到终端150并显示在终端150中的一个或以上显示设备上。在一些实施例中,处理设备120可以是单个服务器或服务器组。服务器组可以是集中的,也可以是分布式的。在一些实施例中,处理设备120可以是本地的或远程的。例如,处理设备120可以经由网络130访问存储在超声成像设备110、存储设备140和/或终端150中的信息和/或数据。又例如,处理设备120可以直接连接到超声成像设备110、存储设备140和/或终端150,以访问其上存储的信息和/或数据。再例如,处理设备120可以集成在超声成像设备110中。在一些实施例中,处理设备120可以在云平台上实现。仅作为示例,所述云平台可以包括私有云、公共云、混合云、社区云、分布云、内部云、多云等或其任意组合。
在一些实施例中,处理设备120可以是与超声成像设备通信并处理从超声成像设备接收的数据的单个处理设备。
网络130可以包括可以促进髋关节角度检测系统100的信息和/或数据交换的任何合适的网络。在一些实施例中,髋关节角度检测系统100的一个或以上组件(例如,超声成像设备110、处理设备120、存储设备140或终端150)可以通过网络130与髋关节角度检测系统100的其他组件连接和/或通信。例如,处理设备120可以通过网络130从超声成像设备110获取超声成像数据。又例如,处理设备120可以通过网络130从终端150获取用户指令,所述指令可以用于指示超声成像设备110进行成像。在一些实施例中,网络130可以包括一个或以上网络接入点。例如,网络130可以包括有线和/或无线网络接入点,例如基站和/或互联网接入点,髋关节角度检测系统100的一个或以上组件可以通过它们连接到网络130以交换数据和/或信息。
存储设备140可以存储数据和/或指令。在一些实施例中,存储设备140可以存储从终端150和/或处理设备120获得的数据。在一些实施例中,存储设备140可以存储处理设备120可以执行或用于执行本说明书中描述的示例性方法的数据和/或指令。在一些实施例中,所述存储设备140可以在云平台上实现。仅作为示例,所述云平台可以包括私有云、公共云、混合云、社区云、分布云、内部云、多云等或其任意组合。
在一些实施例中,存储设备140可以连接到网络130以与髋关节角度检测系统100的一个或以上组件(例如,处理设备120、终端150等)通信。髋关节角度检测系统100的一个或以上组件可以经由网络130访问存储设备140中存储的数据或指令。在一些实施例中,存储设备140可以直接连接到髋关节角度检测系统100的一个或以上组件(例如,处理设备120、终端150等)或与之通信。在一些实施例中,存储设备140可以是处理设备120的一部分。
终端150可以包括移动设备150-1、平板电脑150-2、膝上型计算机150-3等,或其任意组合。在一些实施例中,终端150可以远程操作超声成像设备110。在一些实施例中,终端150可以经由无线连接操作超声成像设备110。在一些实施例中,终端150可以接收由用户输入的信息和/或指令,并且经由网络130将所接收的信息和/或指令发送到超声成像设备110或处理设备120。在一些实施例中,终端150可以从处理设备120接收数据和/或信息。在一些实施例中,终端150可以是处理设备120的一部分。在一些实施例中,可以省略终端150。
图2是根据本说明书一些实施例所示的髋关节角度检测方法的示例性流程图。在一些实施例中,流程200可以由处理设备(例如,处理设备120)或髋关节角度检测系统(例如,髋关节角度检测系统100)执行。例如,流程200可以以程序或指令的形式存储在存储装置(如处理设备的自带存储单元或外接存储设备)中,所述程序或指令在被执行时,可以实现流程200。流程200可以包括以下操作。
步骤210,获取待测对象的超声图像。
待测对象可以包括患者或者其他医学实验对象(例如,试验模体等)。待测对象还可以是患者(例如,新生儿)或其他医学实验对象的一部分,包括器官和/或组织,例如,髂骨等。
超声图像是指通过超声设备对待测对象进行检测后获得的图像。超声信号通常由超声探头发射的高频声波产生,这些声波在待测对象内部遇到不同类型组织或器官时会反射回来。这些反射声波被超声探头接收后,经过放大、滤波、数字化等处理,最终被转换成图像,即超声图像。在本说明书的其他一些地方,获取的超声图像也被称为原始超声图像。
步骤220,从所述超声图像中提取所述待测对象的髋关节对应的髂骨区域轮廓。
髂骨是髋骨的组成部分之一,构成髋骨的后上部,分为髂骨体和髂骨翼两部分。髂骨区域是指位于人体躯干侧面、髋骨以上的区域,其范围大致相当于腰部和胯部之间的区域。髂骨是人体最大的平板骨之一,髂骨上的髂嵴是股骨肌群的主要起始点之一,髂骨臼与股骨头组成了髋关节。
髋关节是人体最大的关节之一,连接了髋骨和股骨。它是一个球窝(ball-and-socket)型关节,由股骨头、髋臼以及盂唇等组成。髋关节在人体运动中具有非常重要的作用,支持身体重量并进行行走、跑步和其他活动。同时,它也对身体姿态和平衡控制起着关键的作用。
髂骨区域轮廓是指能够用于定位和识别髂骨及其相关结构的具有髂骨轮廓区域的图像。例如,在髂骨区域轮廓中可以包括髂骨等结构的轮廓。
在一些实施例中,处理设备还可以使用目标检测、关键点检测、识别图像特征等方法获取髂骨区域轮廓。在一些实施例中,处理设备还可以对超声图像进行滤波处理,并对滤波处理后的超声图像进行阈值分割,最后基于阈值分割的结果获取髂骨区域轮廓。其详细的说明可参见图3的相关描述。
步骤230,获取所述髂骨区域轮廓中髂骨区域的质心位置,并基于所述质心位置获取髋关节图像。
质心是指一个物体内部所有质点的平均位置。髂骨区域的质心位置则是指在超声图像中髂骨区域轮廓对应区域内的所有元素点的平均位置。
在一些实施例中,处理设备可以通过各种方式计算获得髂骨区域的质心位置,例如,基于数学计算的方法、基于深度学习的方法等,本实施例对获取质心位置的具体方式不做限定。示例性地,处理设备可以通过以下方式获得髂骨区域的质心位置。
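上述"区域内所有元素点的平均位置"这一质心定义,可以用如下Python片段示意(函数名与接口均为说明用途的假设,并非本说明书限定的实现):

```python
import numpy as np

def region_centroid(mask: np.ndarray):
    """计算二值掩膜中前景区域的质心:前景像素行、列坐标的平均值。"""
    ys, xs = np.nonzero(mask)          # 前景像素的行坐标与列坐标
    if ys.size == 0:
        raise ValueError("掩膜中不存在前景像素")
    return float(ys.mean()), float(xs.mean())
```

例如,对一个10×10掩膜中位于第2~4行、第4~7列的矩形前景区域,质心为(3.0, 5.5)。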
处理设备可以基于髂骨区域轮廓中髂骨区域的质心位置,从超声图像中提取得到髂骨区域图像,并基于髂骨区域图像获得髋关节图像。髋关节图像是指从髂骨区域图像中分割出来的待测对象的髋关节结构图像。因此,在一些实施例中,髋关节图像也称为髋关节分割图像。
在一些实施例中,处理设备可以基于髂骨区域的质心位置,通过预设的活动轮廓模型获取髋关节图像。例如,处理设备能够基于髂骨区域轮廓中髂骨区域的质心位置按照预设尺寸进行扩展,获取髂骨区域图像。以及,利用预设的活动轮廓模型对所述髂骨区域图像进行处理,获取所述髋关节图像。关于获取髂骨区域图像的更多说明可参见后文步骤502中的描述。
髂骨区域图像是指从超声图像中提取的结构主要区域为髂骨的图像部分。
扩展是指将从质心位置向四周(例如,上下左右四个方向)扩张。可以理解,质心位置是位于髂骨区域的区域范围内,从质心位置向四周扩展可以获得更加准确、完整地髂骨区域图像,达到尽可能地减少图中其他背景区域的目的。
预设尺寸是指预先设定的扩展大小/长度。预设尺寸可以包括预设长度和预设宽度,预设长度和预设宽度可以用像素距离表示,例如,100像素距离、200像素距离等。在一些实施例中,预设尺寸与所述超声图像的尺寸相关。例如,预设尺寸可以是超声图像的一定比例,比如,超声图像为1000*200像素大小,则预设尺寸可以是500和100。又例如,预设尺寸还可以是超声图像的尺寸减去一定大小的尺寸。再例如,假设对髋关节进行分割时需要的是更多纵向的图像信息,则可以获取原图1/2宽度的、4/5高度的图像。
在本实施例中,将质心位置按照预设尺寸进行扩展,预设尺寸相关于超声图像,可以保证在扩展时获取得到的髂骨区域图像中能够完整将髋关节包括在内。
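以质心位置为中心、按与原图尺寸相关的预设尺寸截取髂骨区域图像的过程,可以用如下草图示意(其中1/2宽度、4/5高度沿用上文示例,为说明用途的假设取值):

```python
import numpy as np

def crop_roi(image: np.ndarray, centroid, width_ratio=0.5, height_ratio=0.8):
    """以质心为中心、按与原图尺寸成比例的预设尺寸截取子图像,越界时贴边裁剪。"""
    h, w = image.shape[:2]
    roi_h, roi_w = int(h * height_ratio), int(w * width_ratio)
    cy, cx = int(round(centroid[0])), int(round(centroid[1]))
    top = min(max(cy - roi_h // 2, 0), h - roi_h)    # 将截取窗口限制在图像范围内
    left = min(max(cx - roi_w // 2, 0), w - roi_w)
    return image[top:top + roi_h, left:left + roi_w]
```

例如,对1000×200像素的超声图像,按上述比例截取得到的髂骨区域图像大小为500×160像素。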
预设的活动轮廓模型可以是基于能量最小化的图像分割方法。其可以在图像中自适应地调整形状和位置,从而实现对髋关节的有效提取。
在一些实施例中,处理设备可以将髂骨区域图像输入至预设的活动轮廓模型进行处理,由活动轮廓模型输出髋关节图像。
在一些实施例中,处理设备也可以通过其他方式对髂骨区域图像进行处理,例如,通过图像分割、关键点检测等方式获得髋关节图像。
在一些实施例中,预设的活动轮廓模型(或简称为活动轮廓模型)还可以为基于髂骨区域样本图像和该髂骨区域样本图像对应的金标准髋关节图像训练得到的机器学习模型。训练方式可以采用各种模型训练方法,例如,梯度下降法等,本实施例对具体的模型训练方法不作限定。在一些实施例中,该预设的活动轮廓模型也可以基于髂骨区域样本图像和髂骨区域样本图像对应的金标准训练获得。因此,在一些实施例中,处理设备也可以直接将髂骨区域图像输入至活动轮廓模型,获取髋关节分割图像。示例性地,下文以髂骨区域图像输入至活动轮廓模型进行处理为例。
活动轮廓模型可以通过迭代的方式对髂骨区域图像中的髋关节进行分割,其可以包含多种分割方式,如:Snake分割模型、水平集分割等。示例性地,本说明书实施例中采用水平集分割方式。水平集是跟踪轮廓和表面运动的一种数字化方法,它不直接对轮廓进行操作,而是将轮廓转换成一个高维函数的零水平集。这个高维函数叫做水平集函数。然后对该水平集函数进行微分,通过从输出中提取零水平集来得到运动的轮廓。假设输入图像为u0,将髂骨区域曲线称为C,通过C将图像划分为曲线内和曲线外两部分,关于曲线C的能量函数表示为:
F(C)=\mu\cdot \mathrm{Length}(C)+v\cdot \mathrm{Area}(\mathrm{inside}(C))+\lambda_1\int_{\mathrm{inside}(C)}|u_0(x,y)-c_1|^2\,dxdy+\lambda_2\int_{\mathrm{outside}(C)}|u_0(x,y)-c_2|^2\,dxdy \quad (1)
其中,μ、v、λ1、λ2均为权值,常数c1为曲线C的内部像素点的均值,常数c2为曲线C的外部像素点的均值,Length(C)为曲线C的长度,Area(inside(C))为曲线C的内部面积,inside(C)表示曲线C的内部,outside(C)表示曲线C的外部,u0(x,y)为输入图像u0的像素点。水平集算法通过迭代的方式最小化能量函数来获取髂骨区域,进而实现分割。
将公式(1)转换成为关于水平集函数φ的能量函数,可表示为:
F(c_1,c_2,\varphi)=\mu\int_{\Omega}\delta_{\varepsilon}(\varphi(x,y))|\nabla\varphi(x,y)|\,dxdy+v\int_{\Omega}H_{\varepsilon}(\varphi(x,y))\,dxdy+\lambda_1\int_{\Omega}|u_0(x,y)-c_1|^2 H_{\varepsilon}(\varphi(x,y))\,dxdy+\lambda_2\int_{\Omega}|u_0(x,y)-c_2|^2\bigl(1-H_{\varepsilon}(\varphi(x,y))\bigr)\,dxdy \quad (2)
其中,Heaviside函数Hε(φ(x,y))和Delta函数δε(φ(x,y))分别为:
H_{\varepsilon}(z)=\frac{1}{2}\left(1+\frac{2}{\pi}\arctan\frac{z}{\varepsilon}\right) \quad (3)
\delta_{\varepsilon}(z)=\frac{1}{\pi}\cdot\frac{\varepsilon}{\varepsilon^2+z^2} \quad (4)
其中,z为φ(x,y),c1和c2可以通过φ求出,即:
c_1(\varphi)=\frac{\int_{\Omega}u_0(x,y)H_{\varepsilon}(\varphi(x,y))\,dxdy}{\int_{\Omega}H_{\varepsilon}(\varphi(x,y))\,dxdy},\qquad c_2(\varphi)=\frac{\int_{\Omega}u_0(x,y)\bigl(1-H_{\varepsilon}(\varphi(x,y))\bigr)\,dxdy}{\int_{\Omega}\bigl(1-H_{\varepsilon}(\varphi(x,y))\bigr)\,dxdy} \quad (5)
固定c1和c2保持不变,使得F相对于φ最小,通过变分法得到欧拉-拉格朗日公式:
\frac{\partial\varphi}{\partial t}=\delta_{\varepsilon}(\varphi)\left[\mu\,\mathrm{div}\!\left(\frac{\nabla\varphi}{|\nabla\varphi|}\right)-v-\lambda_1(u_0-c_1)^2+\lambda_2(u_0-c_2)^2\right] \quad (6)
由此可见,水平集算法主要包含以下几个步骤:(1)初始化\varphi^{0}=\varphi_{0},n=0;(2)根据公式(5)计算均值c1和c2;(3)根据公式(6)的偏微分方程求解\varphi^{n+1};(4)利用\varphi^{n+1}重新初始化φ;(5)检查是否收敛,是则停止,否则重复步骤(2)。
水平集算法过程主要是通过迭代求取能量函数最小值,以达到曲线演化的目的,直至达到设定的迭代次数,则跳出循环。
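若忽略长度项与面积项的正则化,上述"更新区域均值—演化曲线"的迭代思想可以简化为如下Python草图(仅用于说明能量最小化的基本思路,并非完整的水平集实现,初始化方式为假设):

```python
import numpy as np

def simple_two_phase_segmentation(u0: np.ndarray, n_iter: int = 20):
    """公式(1)能量函数的简化迭代草图(忽略长度与面积正则项)。

    每次迭代先按当前曲线内外区域更新均值 c1、c2(对应公式(5)的离散形式),
    再将每个像素划入使 |u0 - c|^2 更小的一侧,使能量单调下降。
    """
    inside = u0 >= u0.mean()                     # 以全图均值初始化曲线内部区域
    for _ in range(n_iter):
        c1 = u0[inside].mean() if inside.any() else 0.0
        c2 = u0[~inside].mean() if (~inside).any() else 0.0
        new_inside = (u0 - c1) ** 2 < (u0 - c2) ** 2
        if np.array_equal(new_inside, inside):   # 划分不再变化即视为收敛
            break
        inside = new_inside
    return inside, c1, c2
```

对一幅暗背景上含亮目标的合成图像,该迭代会收敛到目标区域,c1、c2分别趋于目标与背景的灰度均值。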
步骤240,基于所述髋关节图像,确定所述待测对象的髋关节角度。
髋关节角度是用于评估股骨头颈部与股骨干轴线之间夹角的指标。髋关节角度包括α角和β角,其可以用于对新生儿的髋关节分型。在一些实施例中,α角也称为骨顶角,β角也称为软骨顶角。
在一些实施例中,处理设备可以在髋关节图像中标记出骨顶线和软骨顶线,然后通过测量或者确定骨顶线、软骨顶线的斜率的方式确定α角和β角。
关于确定髋关节角度的更多说明可以参见图4的相关描述。
在本说明书一些实施例中,通过对超声图像进行髂骨区域提取,再基于髂骨区域的质心位置进行图像分割,可以准确地获得待测对象的髋关节图像。进而基于髋关节图像,确定待测对象的髋关节角度。根据预设的髋关节分型模板(髋关节分型标准),可以准确地确定待测对象的髋关节类型。基于质心位置的髋关节分割方法,可以准确地分割出髋关节区域并且能够缩短计算时间,提高了分割速率及分割精度,进而达到提高髋关节分型的准确性和处理效率的目的。
图3是根据本说明书一些实施例所示的确定髂骨区域轮廓的示例性流程图。在一些实施例中,流程300可以由处理设备(例如,处理设备120)执行。
步骤310,对所述超声图像进行滤波处理。
滤波是一种信号处理技术,能够用于改变信号的频率特征或减少噪声。滤波可以分为低通滤波、高通滤波、带通滤波和带阻滤波等不同类型。
在一些实施例中,处理设备可以通过多种滤波方式对超声图像进行滤波处理,例如,可以通过均值滤波、高斯滤波、中值滤波、边缘保留滤波等。
步骤320,对滤波处理后的超声图像进行阈值分割,获取包含多个轮廓区域的二值图像。
二值图像是指一种只包含两种颜色的图像,例如,黑色和白色。二值图像中的每个像素为这两种颜色之一,黑色表示0或偏低的强度值,而白色表示1或偏高的强度值。
包含的多个轮廓区域可以是分割得到的超声图像中组织或器官的轮廓结构。通常为超声图像中一组像素点在图像中连通的区域,例如,连通域。
阈值分割是一种基于图像灰度值的二值化处理方法。阈值分割可以用于将超声图像中不同灰度的组织结构或区域分为两个部分:前景(信号)和背景(噪声)。其基本原理是通过设定一个合适的阈值,将超声图像中小于该阈值的像素点设置为背景,大于该阈值的像素点设置为前景。
在一些实施例中,处理设备可以通过预设的阈值分割算法对滤波处理后的超声图像进行处理,获得二值图像。预设的阈值分割算法可以包括最大熵阈值分割、大津法阈值分割、自适应阈值分割以及固定阈值分割等。
在一些实施例中,处理设备可以采用中值滤波的方法对超声图像进行滤波处理,以及采用最大熵阈值分割的方法对滤波后的超声图像进行分割,以获得更优的处理结果。中值滤波与最大熵阈值分割具有较高的适配性,两者组合使用可以使得髂骨区域分割效果更佳。当然,处理设备也可以采用其他的组合方式,例如,采用中值滤波和大津法阈值分割,本实施例对此不作限定。
在本说明书的其他一些地方,对超声图像进行滤波处理以及阈值分割的过程也称为二值化处理,更多的描述可以参见图6的相关说明。
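中值滤波结合阈值分割的二值化流程可以用如下草图示意(其中3×3窗口与固定阈值均为说明用途的假设取值;实际实施例中阈值可由最大熵方法自动确定):

```python
import numpy as np

def median_filter3(img: np.ndarray) -> np.ndarray:
    """3x3 中值滤波的朴素实现(边缘采用复制填充),用于抑制斑点噪声。"""
    padded = np.pad(img, 1, mode="edge")
    windows = np.stack([padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
                        for dy in range(3) for dx in range(3)])
    return np.median(windows, axis=0)

def binarize(img: np.ndarray, t: float) -> np.ndarray:
    """按阈值 t 二值分割:小于 t 的像素为背景(0),否则为前景(1)。"""
    return (img >= t).astype(np.uint8)
```

例如,孤立的单像素斑点噪声在3×3中值滤波后会被完全抑制,再经阈值分割即可得到较干净的二值图像。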
步骤330,基于所述二值图像,确定所述髂骨区域轮廓。
在一些实施例中,处理设备可以通过预设筛选方法对所述二值图像进行处理,确定髂骨区域轮廓。
预设筛选方法是指预先设定的用于从二值图像中筛选出髂骨区域轮廓的策略/方式。在一些实施例中,预设筛选方法可以包括基于预设筛选策略进行两轮筛选,其中,第一轮筛选可以用于从二值图像中获取候选连通域,去除掉干扰轮廓,第二轮筛选可以基于第一轮筛选的结果获取髂骨区域轮廓。具体地,处理设备能够通过预设筛选策略从所述二值图像中获取候选连通域;以及基于所述候选连通域确定所述髂骨区域轮廓。
预设筛选策略是指在预先设定的特定目标和条件下,为实现目标而设置的操作方案/方法。在一些实施例中,所述预设筛选策略包括使用面积筛选、质心筛选以及长度筛选三个维度的筛选条件中的至少一个进行筛选。例如,可以是使用质心筛选,也可以是使用面积筛选,还可以是使用质心筛选和面积筛选。优选地,在一些实施例中,使用面积筛选、质心筛选和长度筛选三个维度的筛选条件进行筛选。
连通域是指一组像素点在图像中连通的区域。候选连通域则是指通过预设筛选策略从二值图像中获取得到的连通域。
面积筛选是指通过面积筛选出连通域面积大于某个阈值的作为候选轮廓。其中,该阈值与图像面积大小有关,例如,图像面积越大,该阈值越大。
质心筛选是指筛选出质心处在中间某一部分图像区域的连通域。例如,筛选出质心处在中间二分之一图像区域的连通域。
长度筛选是指筛选出长度长于某个阈值的连通域。长度筛选的阈值与图像宽度有关,例如,图像越宽,该阈值越大。
在一些实施例中,当所述候选连通域的数量小于1时,处理设备能够对所述预设筛选策略进行筛选条件降维,得到降维后的筛选策略;以及,使用所述降维后的筛选策略从所述二值图像中获取候选连通域。筛选条件降维是指舍弃掉其中的一个或多个筛选条件。在一些实施例中,筛选条件降维的顺序依次为长度筛选、面积筛选和质心筛选。例如,预设筛选策略可以是首先同时使用上述三种条件进行连通域筛选,当筛选出的候选连通域的数量为0(小于1)时,去掉其中的一个筛选条件,按照顺序去掉长度筛选条件。使用降维后的筛选策略再次进行筛选,并判断筛选出的连通域的数量是否小于1,若仍然为0,则再次去掉一个筛选条件,例如,去掉面积筛选条件。之后,再次进行筛选,直至候选轮廓数量大于等于1。若最终的候选连通域的数量为1时,可以直接将该候选连通域输出作为髂骨区域。若最终的候选轮廓数量大于1时,进行二轮筛选。
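上述"三条件筛选+条件降维重试"的过程可以用如下草图示意(其中regions为预先提取的连通域属性列表,各阈值取值,如面积取图像面积的1%、长度取图像宽度的10%、质心位于中间1/2区域,均为说明用途的假设):

```python
def screen_regions(regions, img_shape):
    """按面积、质心、长度三个条件筛选连通域;若结果为空,
    则按长度→面积的顺序依次舍弃条件重试(即筛选条件降维)。"""
    h, w = img_shape
    conds = {
        "area": lambda r: r["area"] >= 0.01 * h * w,
        "centroid": lambda r: w / 4 <= r["centroid"][1] <= 3 * w / 4,
        "length": lambda r: r["length"] >= 0.1 * w,
    }
    # 降维顺序:先去掉长度条件,再去掉面积条件,最后仅保留质心条件
    for active in (("area", "centroid", "length"), ("area", "centroid"), ("centroid",)):
        kept = [r for r in regions if all(conds[c](r) for c in active)]
        if kept:
            return kept, active
    return [], ("centroid",)
```

返回值中同时给出实际生效的条件组合,便于判断筛选经历了几轮降维。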
在一些实施例中,第二轮筛选包括针对每个所述候选连通域,计算所述候选连通域对应的鉴别分数;将鉴别分数最高的候选连通域作为目标连通域,并基于所述目标连通域,从所述超声图像中提取得到所述髂骨区域轮廓。目标连通域是指超声图像中的髂骨所在区域对应的连通域。
鉴别分数是指通过预设鉴别算法对连通域进行计算后得到的数值。
在一些实施例中,预设鉴别算法可以是与连通域的距离相关的鉴别算法,该预设鉴别算法的计算公式可以如公式(7)所示。
其中,S为连通域对应的鉴别分数,N为连通域中像素点的数量,i为像素点,I为超声图像中连通域对应区域内各像素点的平均像素强度,R为连通域的长宽比,θ为连通域的主轴方向。
在计算连通域的长宽比R时,需要用到连通域的中心矩,连通域的中心矩的计算公式如公式(8)所示。
m_{pq}=\sum_{x}\sum_{y}(x-\bar{x})^{p}(y-\bar{y})^{q}I_{xy} \quad (8)
其中,m_{pq}是指图像的矩,p+q代表阶数,如二阶矩包括m20、m02、m11等,x、y均是来自连通域中的坐标,\bar{x}、\bar{y}是重心坐标,Ixy为超声图像中连通域对应区域内像素点的像素强度。基于此,连通域的长宽比R的计算公式如公式(9)所示。
R=\sqrt{\frac{m_{20}+m_{02}+\sqrt{(m_{20}-m_{02})^{2}+4m_{11}^{2}}}{m_{20}+m_{02}-\sqrt{(m_{20}-m_{02})^{2}+4m_{11}^{2}}}} \quad (9)
其中,m20、m02、m11均为图像的二阶矩。
另外,对于连通域的主轴方向θ,其计算方式可以包括拉东变换、利用二阶中心矩计算、或者使用最小二乘法拟合连通域轮廓点集获取直线进而计算主轴方向等。
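利用二阶中心矩计算连通域主轴方向θ与长宽比R的方式,可以用如下草图示意(其中长宽比采用协方差矩阵特征值之比的平方根这一常用形式,θ采用标准公式θ=0.5·arctan2(2m11, m20-m02),具体实施可与说明书公式(9)的形式有所不同):

```python
import numpy as np

def region_orientation_and_ratio(mask: np.ndarray):
    """基于二阶中心矩计算连通域的主轴方向θ(弧度)与长宽比R的示意实现。"""
    ys, xs = np.nonzero(mask)
    yc, xc = ys.mean(), xs.mean()              # 重心坐标
    m20 = ((xs - xc) ** 2).mean()              # 二阶中心矩
    m02 = ((ys - yc) ** 2).mean()
    m11 = ((xs - xc) * (ys - yc)).mean()
    theta = 0.5 * np.arctan2(2 * m11, m20 - m02)
    common = np.sqrt(4 * m11 ** 2 + (m20 - m02) ** 2)
    lam1 = (m20 + m02 + common) / 2            # 主轴方向的方差
    lam2 = (m20 + m02 - common) / 2            # 次轴方向的方差
    ratio = np.sqrt(lam1 / lam2) if lam2 > 0 else np.inf
    return theta, ratio
```

对一条水平方向的细长连通域,θ趋近于0,R明显大于1,符合髂骨区域"狭长且近似水平"的先验特征。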
需要说明的是,上述例子仅出于示例的目的,预设鉴别算法还可以包括其他算法,例如,面积判断算法、矩形框拟合算法、圆形拟合算法、像素密度算法等,本说明书具体算法不作限定。
在一些实施例中,处理设备能够从超声图像中截取目标连通域所对应的部分区域作为待测对象的髋关节对应的髂骨区域轮廓。
关于获取髂骨区域轮廓的更多说明可以参见图6的相关描述。
图4是根据本说明书一些实施例所示的确定髋关节角度的方法的示例性流程图。在一些实施例中,流程400可以由处理设备(例如,处理设备120)执行。
步骤410,基于所述髋关节图像,确定髂骨区域的凸缘点、终点和盂唇中点。
凸缘点通常位于股骨头的中心位置,是髋关节旋转中心的近似位置。在一些实施例中,凸缘点也称为骨缘转折点。
在一些实施例中,处理设备可以根据预设基线,将所述髋关节图像分割为第一图像和第二图像。
预设基线为基于所述髂骨区域图像中髂骨质心的纵坐标所形成的直线。例如,预设基线可以是经过髂骨质心纵坐标、且与图像长度方向平行的直线。
第一图像是指沿着预设基线将髋关节图像分割后的上半部分的图像。
第二图像是指沿着预设基线将髋关节图像分割后的下半部分的图像。
处理设备可以确定所述第二图像中髂骨轮廓的上边线,并提取所述上边线上的点构成骨顶线点集。
上边线是指髂骨轮廓的上边界。上边线上的点(例如,像素点)提取后可以得到骨顶线点集。
处理设备可以基于所述骨顶线点集,获取骨顶线的斜率。
在一些实施例中,处理设备可以通过最小二乘法直线拟合骨顶线点集中的骨顶线点,得到骨顶线的斜率。骨顶线的斜率也可以通过其他方法获得,本实施例对此不作限定。
处理设备可以用所述骨顶线的斜率从所述骨顶线点集中确定所述凸缘点。
在一些实施例中,处理设备可以利用骨顶线的斜率,结合骨顶线点集,得到n条直线(n为骨顶线点集中骨顶线点的个数)。对每个骨顶线点的y坐标进行排序,确保骨顶线点集上的所有点全部在某一条直线的左下方。则该条直线对应的骨顶线点集中的坐标点即为凸缘点。
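上述"利用骨顶线斜率从点集中确定凸缘点"的做法可以用如下草图示意(其中关于图像坐标系y轴向下、"所有点位于直线下方即该直线截距最小"的取向为说明用途的假设):

```python
import numpy as np

def find_flange_point(points):
    """从骨顶线点集中确定凸缘点的示意实现。

    先用最小二乘拟合得到骨顶线斜率 k,再对每个点作斜率为 k 的直线
    y = k*x + b_i(b_i = y_i - k*x_i);截距 b 最小的那条直线,
    使其余所有点均位于其下方(图像坐标系中 y 向下),对应点即凸缘点。
    """
    pts = np.asarray(points, dtype=float)      # 每行为 (x, y)
    k = np.polyfit(pts[:, 0], pts[:, 1], 1)[0]  # 最小二乘拟合直线的斜率
    intercepts = pts[:, 1] - k * pts[:, 0]      # 每个点对应直线的截距
    return tuple(pts[np.argmin(intercepts)])
```

例如,对点集[(0,5),(1,4),(2,6),(3,7)],拟合斜率约为0.8,截距最小的直线经过点(1,4),该点即被选为凸缘点。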
终点通常为髂骨支下缘最低点。
在一些实施例中,处理设备可以将骨顶线点集中y坐标最大值的骨顶线点作为终点。
盂唇中点通常为髋臼上的一个点。
在一些实施例中,处理设备可以从所述第一图像中分割出盂唇区域;并将所述盂唇区域的质心作为所述盂唇中点。
盂唇区域的分割可以通过各种方式进行,如采用本说明书其他实施例所描述的分割算法等,本说明书对此不作限定。质心获取方式同样可以参见本说明书其他地方的描述。
步骤420,连接所述凸缘点和所述终点确定骨顶线,连接所述凸缘点和所述盂唇中点得到软骨顶线。
步骤430,基于所述骨顶线和所述软骨顶线的斜率,确定所述待测对象的髋关节角度。
在一些实施例中,根据骨顶线的斜率计算髋关节角度中的骨顶角α的方式可以如公式(10)所示;根据软骨顶线的斜率计算髋关节角度中的软骨顶角β的方式可以如公式(11)所示。
\alpha=\frac{180}{\pi}\arctan\left|k_1\right| \quad (10)
\beta=\frac{180}{\pi}\arctan\left|k_2\right| \quad (11)
其中,k1为骨顶线的斜率,k2为软骨顶线的斜率。
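由骨顶线与软骨顶线的斜率计算α角与β角的一种示意实现如下(此处假设预设基线为水平方向、夹角取绝对值后换算为角度,公式(10)、(11)的具体形式以说明书为准):

```python
import math

def roof_angles(k1: float, k2: float):
    """由骨顶线斜率 k1 与软骨顶线斜率 k2 计算骨顶角α与软骨顶角β(单位:度)。"""
    alpha = math.degrees(math.atan(abs(k1)))   # 骨顶线与水平基线的夹角
    beta = math.degrees(math.atan(abs(k2)))    # 软骨顶线与水平基线的夹角
    return alpha, beta
```

例如,斜率为1对应45°,斜率为√3约对应60°,可据此结合分型标准判断髋关节类型。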
关于确定待测对象的髋关节角度的更多说明可以参见后文的相关描述。
基于相同的发明构思,本说明书一些实施例还公开了一种髋关节分型方法。如图5所示,图5是根据本说明书一些实施例所示的髋关节分型方法的示例性流程图。该方法可以应用于图1中的处理设备,其可以包括以下步骤:
步骤501,获取待测对象的原始超声图像。
在一些实施例中,原始超声图像为待测对象的髋关节对应的超声图像,如,髋关节对应的冠状位超声图像,该原始超声图像可以为彩色图像,也可以为灰度图像,本实施例对此不作限定。
在一些实施例中,处理设备可以从超声设备中获取超声设备所采集的待测对象的髋关节对应的超声图像,也可以从服务器中获取待测对象的髋关节对应的超声图像,当然,处理设备还可以获取超声设备所采集的待测对象的髋关节对应的超声数据,并基于待测对象的髋关节对应的原始超声数据,重建得到待测对象的髋关节对应的超声图像。
在一些实施例中,处理设备还可以获取待测对象的局部或者全身的超声图像,并从该局部或者全身的超声图像中截取髋关节对应的超声图像。需要说明的是,本说明书实施例中对获取待测对象的髋关节超声图像的方式并不做具体限定,例如,处理设备还可以通过从存储设备、数据库读取的方式获取待测对象的超声图像。
步骤502,从原始超声图像中提取待测对象的髋关节对应的髂骨区域图像。
在一些实施例中,处理设备可以将待测对象的髋关节对应的超声图像输入至预设的图像分割模型中,输出得到待测对象的髋关节对应的髂骨区域图像。该预设的图像分割模型可以是基于阈值的图像分割模型、基于区域的图像分割模型、基于边缘的图像分割模型、基于能量泛函的图像分割模型(如活动轮廓模型)、基于深度学习/神经网络的图像分割模型、或者基于机器学习的图像分割模型等。本实施例中对预设的图像分割模型的类型和实现原理并不做具体限定,只要该预设的图像分割模型能够从髋关节的超声图像中分割出髂骨区域图像即可。
在一些实施例中,处理设备也可以采用基本的图像处理操作,对原始超声图像进行图像分析,从而能够从原始超声图像中提取出待测对象的髋关节对应的髂骨区域图像,该图像处理操作包括但不限于图像滤波、图像平滑、图像的几何变换、图像形态学处理等。
在一些实施例中,所述从所述超声图像中提取所述待测对象的髋关节对应的髂骨区域图像,包括:对所述原始超声图像进行滤波处理;对滤波处理后的原始超声图像进行阈值分割,获取包含多个轮廓区域的二值图像;基于所述二值图像,确定髂骨区域轮廓;基于所述髂骨区域轮廓,确定所述髂骨区域图像。
在一些实施例中,所述基于所述髂骨区域轮廓,确定所述髂骨区域图像,包括:获取所述髂骨区域轮廓中髂骨区域的质心位置;基于所述髂骨区域轮廓中髂骨区域的质心位置按照预设尺寸进行扩展,获取所述髂骨区域图像;其中,所述预设尺寸与所述原始超声图像的尺寸相关。
更具体的描述可在图2至图4的说明中找到,此处不再赘述。
步骤503,将髂骨区域图像输入至预设的活动轮廓模型中进行髋关节分型,得到待测对象的髋关节类型。
在一些实施例中,处理设备可以将待测对象的髂骨区域图像输入至预设的活动轮廓模型中,得到髋关节分割图像,接着,可以将该髋关节分割图像输入至预设分型算法中,输出得到待测对象的髋关节类型,其中,该预设的活动轮廓模型可以为基于髂骨区域图像样本和其对应的髋关节标签所训练得到的。需要说明的是,该预设分型算法也可以嵌入到该预设的活动轮廓模型中,也就是说,该预设的活动轮廓模型可以直接输出待测对象的髋关节类型。
另外,对于该预设分型算法,在一些实施例中,处理设备在得到髋关节分割图像之后,可以从该髋关节分割图像中,确定出骨顶角和软骨顶角,接着,从髋关节类型的预设对应关系中,确定出与该骨顶角和该软骨顶角对应的髋关节类型;其中,髋关节类型的预设对应关系中包括不同的骨顶角、不同的软骨顶角以及不同的髋关节类型之间的对应关系。
在一些实施例中,将所述髂骨区域图像输入至预设的活动轮廓模型中进行髋关节分型,得到所述待测对象的髋关节类型,包括:利用所述预设的活动轮廓模型对所述髂骨区域图像进行处理,获取髋关节图像;基于所述髋关节图像,确定髂骨区域的凸缘点、终点和盂唇中点;连接所述凸缘点和所述终点确定骨顶线,连接所述凸缘点和所述盂唇中点确定软骨顶线;基于所述骨顶线和所述软骨顶线的斜率,确定所述待测对象的髋关节角度;基于所述待测对象的髋关节角度和髋关节分型标准,确定所述待测对象的髋关节类型。
在一些实施例中,处理设备可通过查找髋关节分型标准的方式确定待测对象的髋关节分型结果。
示例性地,髋关节分型标准可以如表1所示。
表1髋关节分型标准
如表1所示,不同的骨顶角α和不同的软骨顶角β,对应不同的髋关节类型,通过查询该预设的髋关节分型模板,可以确定出与骨顶角和软骨顶角对应的待测对象的髋关节类型。
更详细的说明可参见图2中的相关描述,此处不再赘述。
上述髋关节分型方法中,处理设备通过获取待测对象的髋关节对应的原始超声图像,接着,从该原始超声图像中提取待测对象的髋关节对应的髂骨区域图像,并将该髂骨区域图像输入至预设的活动轮廓模型中进行髋关节分型,得到待测对象的髋关节类型;也就是说,本说明书实施例中提供的髋关节分型方法,是通过从完整的超声图像中分割出髂骨所在区域,得到髂骨区域图像,进而将该髂骨区域图像输入至活动轮廓模型中进行图像分析,活动轮廓模型无需再对整个髋关节超声图像进行图像分析,而仅需对整个髋关节超声图像中的关键区域所在的部分图像进行图像分析即可,能够大大提高活动轮廓模型的处理速率;同时,采用本说明书所披露的技术方案进行髋关节识别时,由于预先从整个髋关节超声图像中提取出了髂骨所在区域,也就是预先从整个髋关节超声图像中确定出了髋关节的感兴趣区域,因此,在后续输入至活动轮廓模型进行图像处理时,能够大大提高髋关节分割的准确性,提高图像处理的精度,进而能够大大提高图像处理的效率。
图6为根据本说明书另一些实施例所示的确定髂骨区域图像的示例性流程示意图。本实施例涉及的是处理设备从原始超声图像中提取待测对象的髋关节对应的髂骨区域图像的一种可选的实现过程,在上述实施例的基础上,如图6所示,上述步骤502可以包括:
步骤601,对原始超声图像进行二值化处理,得到原始超声图像对应的二值图像。
在一些实施例中,处理设备可以采用简单、速度快的阈值分割,对原始超声图像进行二值化处理,得到原始超声图像对应的二值图像;其中,阈值分割可以包括最大熵阈值分割、大津法阈值分割、自适应阈值分割以及固定阈值分割等,示例性地,采用最大熵阈值分割对原始超声图像进行二值化处理,所得到的二值图像能够大致将髂骨区域分割出来。在本说明书的其他一些地方,分割得到的二值图像也被称为髂骨区域图像。
其中,最大熵阈值分割法利用图像灰度概率信息,获取图像二值化分割阈值,进而实现图像分割。假设图像的分割阈值为t,图像灰度小于t的像素点构成背景区域B,大于等于t的像素点构成目标区域T,各个灰度级的概率分布为:
P_B(i)=\frac{P_i}{P_t}\ (0\le i<t),\qquad P_T(i)=\frac{P_i}{1-P_t}\ (t\le i<L)
其中,PB(i)表示背景区域内各像素点的概率分布,PT(i)表示目标区域内各像素点的概率分布,P_t=\sum_{i=0}^{t-1}P_i表示小于分割阈值t的像素点的概率,Pi为像素值为i的概率,L为图像灰度级数,背景和前景对应的信息熵可表示为:
H(B)=-\sum_{i=0}^{t-1}P_B(i)\ln P_B(i),\qquad H(T)=-\sum_{i=t}^{L-1}P_T(i)\ln P_T(i)
其中,H(B)为背景对应的信息熵,H(T)为前景对应的信息熵。图像信息熵的总和为H(t)=H(T)+H(B)。遍历0-255像素值t作为分割阈值,统计每个阈值下的信息熵总和,使用取得最大值时的t作为分割阈值,此时背景和前景两部分能够保持最大的信息量;至此便可基于该分割阈值t对原始超声图像进行二值化分割,得到最大熵分割图像,即原始超声图像对应的二值图像。
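上述最大熵阈值分割的遍历过程可以用如下Python草图示意(灰度级数L取256,背景取像素值小于t的部分,与上文区域划分一致;该实现为说明用途的草图):

```python
import numpy as np

def max_entropy_threshold(img: np.ndarray) -> int:
    """遍历分割阈值t,取背景与前景信息熵之和 H(t)=H(B)+H(T) 最大时的t。"""
    hist = np.bincount(img.ravel().astype(np.int64), minlength=256).astype(float)
    p = hist / hist.sum()                      # 各灰度级的概率 Pi
    cum = np.cumsum(p)
    best_t, best_h = 0, -np.inf
    for t in range(1, 256):
        pt = cum[t - 1]                        # 灰度小于t的像素概率 Pt
        if pt <= 0.0 or pt >= 1.0:
            continue                           # 某一侧为空时熵无定义,跳过
        pb = p[:t] / pt                        # 背景区域概率分布 PB(i)
        pf = p[t:] / (1.0 - pt)                # 目标区域概率分布 PT(i)
        hb = -np.sum(pb[pb > 0] * np.log(pb[pb > 0]))
        hf = -np.sum(pf[pf > 0] * np.log(pf[pf > 0]))
        if hb + hf > best_h:
            best_h, best_t = hb + hf, t
    return best_t
```

对双峰分布明显的图像,求得的阈值位于两峰之间,按 img >= t 即可分离出前景区域。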
步骤602,对二值图像进行连通域分析,确定目标连通域。
在对原始超声图像进行二值化分割之后,能够大致将髂骨区域分割出来,但仍可能存在一些干扰轮廓,因此,可以对原始超声图像对应的二值图像进行连通域分析,以筛选出髂骨区域所对应的目标连通域。换句话说,该目标连通域为髂骨所在区域对应的连通域。
在一些实施例中,可以通过对髂骨区域的特征分析,确定出与该髂骨区域对应的连通域的第一预设筛选规则,其中,该第一预设筛选规则可以是与髂骨区域的面积、髂骨区域的质心、髂骨区域的长度等中的至少一个相关的筛选规则。处理设备可以基于该第一预设筛选规则,对二值图像进行连通域分析,确定目标连通域。在一些实施例中,处理设备可以对该二值图像进行连通域分析,确定出该二值图像中的所有连通域,接着,基于该第一预设筛选规则,从所有连通域中确定出满足该第一预设筛选规则的候选连通域,最后,根据该候选连通域,确定与髂骨区域对应的目标连通域。
在一些实施例中,在该候选连通域包括一个的情况下,可以将该候选连通域作为该目标连通域;在该候选连通域包括多个的情况下,可以将该多个候选连通域中的任一个作为该目标连通域;在多个候选连通域存在交集的情况下,也可以将该多个存在交集的候选连通域进行合并处理,并将合并后的连通域作为该目标连通域。在一些实施例中,处理设备还可以按照第二预设筛选规则,从多个候选连通域中筛选出与髂骨区域对应的目标连通域。在一些实施例中,第二预设筛选规则可以是与连通域的面积、连通域的长宽比、连通域内的平均像素强度等至少一个相关的筛选规则。
步骤603,根据目标连通域,从原始超声图像中提取待测对象的髋关节对应的髂骨区域图像。
在一些实施例中,处理设备可以从原始超声图像中截取该目标连通域所对应的部分区域图像,作为待测对象的髋关节对应的髂骨区域图像。在一种实现方式中,该目标连通域可能并非一个矩形区域,这种情况下,可以根据该目标连通域,确定出与该目标连通域对应的矩形区域,接着,再从原始超声图像中截取该矩形区域所对应的部分区域图像,作为待测对象的髋关节对应的髂骨区域图像。
在一些实施例中,处理设备可以基于该目标连通域,确定出与该目标连通域相切的矩形区域,其中,该矩形区域将该目标连通域包围在内;处理设备还可以先确定出该目标连通域的质心位置,接着,基于该质心位置和预设尺寸,该预设尺寸包括预设长度和预设宽度,确定出以该质心位置为中心、以预设尺寸为大小的矩形区域。当然,还可以采用其他方式,本实施例对根据目标连通域确定矩形区域的方式并不做具体限定。
本实施例中,处理设备通过对原始超声图像进行二值化处理,得到原始超声图像对应的二值图像;接着,对该二值图像进行连通域分析,确定髂骨区域对应的目标连通域;最后,根据髂骨区域对应的目标连通域,从原始超声图像中提取待测对象的髋关节对应的髂骨区域图像;即本实施例中提供了一种髂骨区域图像的获取方式,为髂骨区域图像的获取提供了可实施性依据。
在本说明书的一个可选的实施例中,处理设备在执行上述步骤601时,在对原始超声图像进行二值化处理之前,还可以先对原始超声图像进行滤波处理,以平滑图像,能够在一定程度上抑制原始超声图像中的斑点噪声,提升后续的图像分割过程的精度,以及减少连通域筛选过程中的干扰。在上述实施例的基础上,如图7所示,图7是根据本说明书一些实施例所示的对超声图像进行滤波和阈值分割的示例性流程图。在一些实施例中,上述步骤601可以包括:
步骤701,采用预设滤波算法,对原始超声图像进行滤波处理,得到滤波处理后的超声图像。
在一些实施例中,该预设滤波算法可以包括均值滤波、高斯滤波、中值滤波或者各向异性扩散滤波等,本说明书实施例中对预设滤波算法的类型不做具体限定。
步骤702,采用预设阈值分割算法,对滤波处理后的超声图像进行二值分割处理,得到原始超声图像对应的二值图像。
在一些实施例中,该预设阈值分割算法可以包括最大熵阈值分割、大津法阈值分割、自适应阈值分割以及固定阈值分割等,本实施例对预设阈值分割算法的具体类型不作限定。
示例性地,处理设备可以采用中值滤波算法,对原始超声图像进行滤波处理,得到滤波处理后的超声图像,接着,采用最大熵阈值分割算法,对滤波处理后的超声图像进行二值分割处理,得到原始超声图像对应的二值图像。
本实施例中,处理设备在对原始超声图像进行二值化处理之前,先采用预设滤波算法,对原始超声图像进行滤波处理,得到滤波处理后的超声图像;接着,再采用预设阈值分割算法,对滤波处理后的超声图像进行二值分割处理,得到原始超声图像对应的二值图像;能够减少原始超声图像中的噪声,达到平滑处理的效果,进而,能够提高后续图像分割和连通域筛选的精度,避免噪声干扰。
图8是根据本说明书一些实施例所示的获取候选连通域的示例性流程示意图。本实施例涉及的是处理设备对二值图像进行连通域分析,确定目标连通域的一种可选的实现过程,在上述实施例的基础上,如图8所示,上述步骤602可以包括:
步骤801,从二值图像中提取满足第一预设条件的候选连通域。
其中,第一预设条件可以包括多个维度的条件,多个维度的条件可以包括连通域的面积达到预设面积阈值的条件、连通域的质心位于预设位置的条件、连通域的长度达到预设长度阈值的条件中的至少两个。
也就是说,该第一预设条件包括至少两个筛选条件,在从二值图像中提取满足该第一预设条件的候选连通域时,该候选连通域应同时满足该至少两个筛选条件。
在一些实施例中,处理设备可以对该二值图像进行连通域分析,确定出该二值图像中的所有连通域,接着,针对每个连通域,判断该连通域是否同时满足该第一预设条件中的各个条件,若连通域同时满足该第一预设条件中的各个条件,则将该连通域确定为候选连通域,若存在第一预设条件中的任一个条件不满足,则说明该连通域不符合筛选条件。
步骤802,在提取到满足第一预设条件的候选连通域的情况下,根据候选连通域确定目标连通域。
也就是说,在存在满足第一预设条件的候选连通域的情况下,可以进一步根据候选连通域再确定出髂骨区域对应的目标连通域。
在一些实施例中,在候选连通域的数量为一个的情况下,可以将候选连通域作为目标连通域,或者,将候选连通域的变形连通域(如与候选连通域相切的矩形区域)作为目标连通域;在候选连通域的数量为多个的情况下,可以从候选连通域中确定满足第二预设条件的目标连通域,或者,根据多个候选连通域,确定目标连通域;例如:通过对多个候选连通域进行融合处理,来得到目标连通域。
在一个可选的实现方式中,针对从候选连通域中确定满足第二预设条件的目标连通域的方案,其中,该第二预设条件可以为鉴别分数最高的条件,也就是说,对各个候选连通域进行鉴别分析,得到各个候选连通域对应的鉴别分数,将鉴别分数最高的候选连通域作为该目标连通域。
图9是根据本说明书一些实施例所示的获取目标连通域的示例性流程图,如图9所示,处理设备从候选连通域中确定满足第二预设条件的目标连通域,可以包括:
步骤901,针对各候选连通域,从原始超声图像中确定候选连通域对应的候选图像。
步骤902,根据该候选图像和预设鉴别算法,计算该候选图像对应的鉴别分数。
关于步骤901和步骤902的更多说明可以参见图3的相关描述。
步骤903,将鉴别分数最高的候选图像对应的候选连通域作为目标连通域。
至此,通过二次筛选,即可从多个候选连通域中确定出与髂骨区域对应的目标连通域。
步骤803,在未提取到满足第一预设条件的候选连通域的情况下,对第一预设条件进行降维处理,得到更新后的条件,并将更新后的条件作为第一预设条件,重新执行从二值图像中提取满足第一预设条件的候选连通域的步骤,直到从二值图像中提取到满足第一预设条件的候选连通域或第一预设条件的维度为一维为止。
也就是说,在通过多个筛选条件进行连通域筛选时,如果不存在满足多个筛选条件的候选连通域的情况下,可以减少筛选条件,重新进行连通域筛选;在一些实施例中,在对第一预设条件进行降维处理,降维处理也就是减少该第一预设条件中的筛选条件时,可以按照各个筛选条件的重要程度,优先去除重要程度低的筛选条件。
示例性地,在该第一预设条件包括连通域的面积达到预设面积阈值的面积条件、连通域的质心位于预设位置的质心条件、以及连通域的长度达到预设长度阈值的长度条件的情况下,若不存在满足该第一预设条件的候选连通域的情况下,可以优先去除长度条件,即将面积条件和质心条件作为第一预设条件,重新进行连通域的筛选;若还是不存在同时满足面积条件和质心条件的候选连通域,则此时可以去除面积条件,将质心条件作为第一预设条件,并重新进行连通域的筛选,若此时依旧不存在满足质心条件的候选连通域,则可以返回重新执行对原始超声图像进行二值化处理的步骤,重新确定二值图像,并基于新的二值图像,按照上述筛选过程执行连通域的筛选操作,直至存在满足第一预设条件的候选连通域为止。
接着,在确定出候选连通域之后,可以从候选连通域中确定满足第二预设条件的目标连通域,其具体实现方式可以参照上述步骤802的相关内容描述,在此不再赘述。
本实施例中,处理设备先从二值图像中提取满足包括多个维度条件的第一预设条件的候选连通域;在提取到满足第一预设条件的候选连通域的情况下,从候选连通域中确定满足第二预设条件的目标连通域;在未提取到满足第一预设条件的候选连通域的情况下,对第一预设条件进行降维处理,得到更新后的条件,并将更新后的条件作为第一预设条件,重新执行从二值图像中提取满足第一预设条件的候选连通域的步骤,直到从二值图像中提取到满足第一预设条件的候选连通域或第一预设条件的维度为一维为止;采用该方式,能够提高髂骨区域分割的准确性。
图10是根据本说明书一些实施例所示的确定髂骨区域图像的示例性流程示意图。本实施例涉及的是处理设备根据目标连通域,从原始超声图像中提取待测对象的髋关节对应的髂骨区域图像的一种可选的实现过程,在上述实施例的基础上,如图10所示,上述步骤603可以包括:
步骤1001,基于原始超声图像,确定目标连通域的质心位置。
在一些实施例中,处理设备可以根据目标连通域在二值图像中的位置,从原始超声图像中确定出该目标连通域对应的图像区域,对该原始超声图像中该目标连通域对应的图像区域进行分析,确定出该目标连通域对应的质心位置。
步骤1002,基于目标连通域的质心位置结合预设尺寸,从原始超声图像中提取待测对象的髋关节对应的髂骨区域图像。
其中,该预设尺寸与原始超声图像的尺寸相关。在一些实施例中,处理设备可以通过对超声图像和超声图像中的髂骨区域的图像大小分析,确定出髂骨区域的图像大小与超声图像的图像大小之间的比例关系,例如:髂骨区域的图像宽度可以为超声图像的图像宽度的1/2,髂骨区域的图像高度可以为超声图像的图像宽度的4/5。
基于此,处理设备根据该比例关系和待测对象的原始超声图像的图像尺寸,即可确定出髂骨区域图像对应的预设尺寸。进而,基于目标连通域的质心位置和该预设尺寸,即可确定出原始超声图像中以该目标连通域的质心位置为中心的预设尺寸大小的图像区域,将该图像区域从原始超声图像中截取出来,即可得到待测对象的髋关节对应的髂骨区域图像。
本实施例中,处理设备基于原始超声图像,确定目标连通域的质心位置,并基于目标连通域的质心位置结合预设尺寸,从原始超声图像中提取待测对象的髋关节对应的髂骨区域图像;其中,该预设尺寸与原始超声图像的尺寸相关;采用该方式,能够得到准确的髂骨区域图像,提高髂骨区域图像分割的准确性。
图11是根据本说明书一些实施例所示的髋关节分型方法的示例性流程示意图。本实施例涉及的是处理设备将髂骨区域图像输入至预设的活动轮廓模型中进行髋关节分型,得到待测对象的髋关节类型的一种可选的实现过程,在上述实施例的基础上,如图11所示,上述步骤503包括:
步骤1101,将髂骨区域图像输入至预设的活动轮廓模型中,得到髋关节分割图像。
步骤1102,基于髋关节分割图像,确定骨顶线和软骨顶线。
其中,骨顶线为骨缘转折点与髂骨支下缘最低点之间的连线,软骨顶线为骨缘转折点与盂唇中点之间的连线。
在一些实施例中,处理设备可以基于髋关节分割图像,确定出骨缘转折点、髂骨支下缘最低点、以及盂唇中点等三个解剖点,接着,基于骨缘转折点与髂骨支下缘最低点,生成骨顶线,以及基于骨缘转折点与盂唇中点,生成软骨顶线。
在一种实现方式中,图12是根据本说明书一些实施例所示的确定三个解剖点的示例性流程图。如图12所示,确定骨缘转折点、髂骨支下缘最低点、以及盂唇中点,三个解剖点的方式可以包括:
步骤1201,根据预设基线,将髋关节分割图像分割为第一图像和第二图像。
其中,预设基线为基于髂骨区域图像中髂骨的质心的纵坐标所形成的直线,第一图像为包括盂唇的部分髋关节图像。该髂骨区域图像中髂骨的质心即为上述步骤1001中所确定出的目标连通域的质心位置,基于该目标连通域的质心位置的纵坐标所形成的直线即为预设基线,也就是确定髋关节骨顶角和软骨顶角的基线。
基于该预设基线,可以将髋关节分割图像分割为髋关节上半部分对应的第一图像和髋关节下半部分对应的第二图像。
步骤1202,从第二图像中确定上边线的凸缘点,并基于凸缘点和上边线的终点生成骨顶线。
其中,上边线的凸缘点即为上述骨缘转折点,上边线的终点即为上述髂骨支下缘最低点。
步骤1203,从第一图像中确定盂唇中点,并基于凸缘点和盂唇中点,生成软骨顶线。
在一些实施例中,处理设备可以对第一图像进行形态学处理和逻辑筛选,分割出盂唇所在区域的图像,并基于该图像确定出盂唇所在区域的质心,并将该质心确定为盂唇中点。
在确定出盂唇中点之后,基于步骤1202中所确定出的凸缘点和该盂唇中点,生成软骨顶线。
步骤1103,根据骨顶线的斜率确定骨顶角,以及根据软骨顶线的斜率确定软骨顶角。
其中,骨顶角为预设基线与骨顶线所形成的处于第四象限的夹角,软骨顶角为预设基线与软骨顶线所形成的处于第一象限的夹角。
在确定出骨顶线之后,通过坐标计算可以计算得到骨顶线的斜率,同样地,在确定出软骨顶线之后,可以计算得到软骨顶线的斜率。接着,根据骨顶线和软骨顶线的斜率计算骨顶角和软骨顶角的方式可以参见前文实施例的描述,此处不再赘述。
步骤1104,根据骨顶角、软骨顶角、以及预设的髋关节分型模板,确定待测对象的髋关节类型。
其中,预设的髋关节分型模板可以参见表1。
本实施中,处理设备通过将髂骨区域图像输入至预设的活动轮廓模型中,得到髋关节分割图像;接着,基于髋关节分割图像,确定骨顶线和软骨顶线;并根据骨顶线的斜率确定骨顶角,以及根据软骨顶线的斜率确定软骨顶角;进而,根据骨顶角、软骨顶角、以及预设的髋关节分型模板,确定待测对象的髋关节类型;采用本实施例中的方法,将髋关节的感兴趣区域,即髂骨区域图像作为髋关节分割的输入图像,能够缩短计算时间,提高分割速率及分割精度,进而能够提高髋关节分型的准确性和处理效率。
图13是根据本说明书一些实施例所示的确定凸缘点方法的示例性流程示意图。本实施例涉及的是处理设备从第二图像中确定上边线的凸缘点的一种可选的实现过程,在上述实施例的基础上,如图13所示,上述步骤1202包括:
步骤1301,确定第二图像中的上边线,并提取上边线上的点构成骨顶线点集。
其中,该第二图像中的上边线为髂骨支下缘对应的上边界线。
在一些实施例中,按照预设间隔,提取上边线上的第一个点到最后一个点之间的多个点构成骨顶线点集。
步骤1302,根据骨顶线点集中的各个点以及预设斜率,生成各个点分别对应的直线。
其中,预设斜率为基于骨顶线点集中的各个点结合最小二乘法拟合得到的线段的斜率。也就是说,利用最小二乘法直线拟合骨顶线点集,得到拟合后的直线以及该直线的斜率,将该直线的斜率作为预设斜率。
基于此,利用该预设斜率,结合骨顶线点集,得到骨顶线点集中的各个点分别对应的直线。
步骤1303,从各个点分别对应的直线中选择目标直线,并将目标直线对应的点确定为凸缘点。
其中,骨顶线点集中的各个点均在目标直线的下方。
在一些实施例中,处理设备可以对每个点的直线的纵坐标进行排序,确保骨顶线点集上的所有点全部都在某一条目标直线的左下方,那么,该目标直线对应的骨顶线点集中的坐标点即为凸缘点,将骨顶线点集中纵坐标最大值的点视作为上边线的终点,接着,连接凸缘点与终点即可得到骨顶线,并计算得到骨顶线的斜率。
本实施例中,处理设备通过确定第二图像中的上边线,并提取上边线上的点构成骨顶线点集;并根据骨顶线点集中的各个点以及预设斜率,生成各个点分别对应的直线;接着,从各个点分别对应的直线中选择目标直线,并将目标直线对应的点确定为凸缘点;其中,骨顶线点集中的各个点均在目标直线的下方,该预设斜率为基于骨顶线点集中的各个点结合最小二乘法拟合得到的线段的斜率;即本实施例中提供了一种确定凸缘点的实现方式,提高了处理设备自动确定凸缘点的可实施性和可操作性,且该方式能够提高处理设备确定凸缘点的处理效率以及提高凸缘点的准确性。
在一些实施例中,还提供一种超声髋关节自动分型系统,其系统结构图如图14所示,主要分成7个部分。其中超声换能器主要是负责对信号进行发射与接收,数据接收模块主要是对电信号进行接收、放大以及模数转换,并对数据进行压缩送入波束合成模块中,波束合成模块与图像处理模块主要是对回波信号进行解析与插值形成B信号,数据存储模块则是将获取到的B信号以图像形式存储下来,将存储的原始超声图像送入图像算法模块中自动获取髋关节分型,并最终在图像显示模块中显示分型结果。
其中,图像算法模块具体用于执行上述任一实施例中的髋关节分型方法的步骤,实现髋关节的自动分型。其实现方法可以如图15所示,整个算法流程分为五个部分:输入、髂骨区域分割、髋关节分割、髋关节分型以及输出。本说明书实施例中的髋关节分型方法为全自动测量方法,医生在冻结情况下选择待测对象的合适的髋关节切面作为输入图像,最终输出的结果为待测对象的髋关节的α角、β角以及髋关节分型。其具体的实现步骤可以包括:
(1)获取待测对象的髋关节对应的原始超声图像,如图16所示。
(2)采用中值滤波算法,对原始超声图像进行滤波处理,得到滤波处理后的超声图像;接着,采用最大熵阈值分割算法,对滤波处理后的超声图像进行二值分割处理,得到原始超声图像对应的二值图像,如图17所示。
(3)对二值图像进行连通域分析,确定髂骨区域所在的目标连通域,如图18所示。参考图19,其具体实现过程可以包括:
a.从二值图像(即图19中的最大熵分割图像)中提取满足第一预设条件的候选连通域;其中,第一预设条件包括连通域的面积达到预设面积阈值的面积条件、连通域的质心位于预设位置的质心条件、连通域的长度达到预设长度阈值的长度条件。
b.在未提取到满足第一预设条件的候选连通域(即剩余轮廓数量为0)的情况下,对第一预设条件进行降维处理,得到更新后的条件,并将更新后的条件作为第一预设条件,并重新执行上述步骤1),直到从二值图像中提取到满足第一预设条件的候选连通域或第一预设条件的维度为一维为止。其中,降维处理的顺序为先去除长度条件,再去除面积条件。
c.在提取到满足第一预设条件的候选连通域的情况下,若候选连通域的数量为一个,则将候选连通域作为目标连通域。
d.若候选连通域的数量为多个,则针对各候选连通域,从原始超声图像中确定各候选连通域对应的候选图像,并将各候选图像输入至预设鉴别器中,计算各候选图像对应的鉴别分数;进而,将鉴别分数最高的候选图像对应的候选连通域作为目标连通域。
(4)基于原始超声图像,确定目标连通域的质心位置,如图18中标记的质心位置;并基于目标连通域的质心位置结合预设尺寸,从原始超声图像中提取待测对象的髋关节对应的髂骨区域图像,如图20所示;其中,该预设尺寸与原始超声图像的尺寸相关。
(5)将髂骨区域图像输入至预设的活动轮廓模型中,得到髋关节分割图像,如图21所示;接着,从该髋关节分割图像中筛选出髋关节轮廓,得到仅包括髋关节轮廓的髋关节分割图像,如图22所示。
(6)基于该仅包括髋关节轮廓的髋关节分割图像,确定预设基线、骨顶线和软骨顶线。其具体实现过程可以包括:
a.根据预设基线,将髋关节分割图像分割为第一图像和第二图像;其中,预设基线为基于髂骨区域图像中髂骨的质心的纵坐标所形成的直线,第一图像为包括盂唇的部分髋关节图像。
b.针对第二图像,确定第二图像中的上边线,并提取上边线上的点构成骨顶线点集;并根据骨顶线点集中的各个点以及预设斜率,生成各个点分别对应的直线;接着,从各个点分别对应的直线中选择目标直线,并将目标直线对应的点确定为凸缘点;以及将骨顶线点集中的纵坐标最大值的点视作为终点;最后,连接凸缘点和终点,生成骨顶线。
c.从第一图像中分割出盂唇所在区域,并确定该区域的质心,将该质心确定为盂唇中点,进而,连接凸缘点和盂唇中点,生成软骨顶线。
(7)根据骨顶线的斜率确定骨顶角,以及根据软骨顶线的斜率确定软骨顶角。
(8)根据骨顶角、软骨顶角、以及预设的髋关节分型模板,确定待测对象的髋关节类型。
如图23所示,其中,Alpha表示α角,Beta表示β角,HIP Type表示髋关节类型。
采用本说明书实施例中的髋关节分型方法,首先通过获取髂骨分割区域,并提取髂骨分割区域的质心,以质心位置为基准,获取髋关节的感兴趣区域,再将髋关节的感兴趣区域输入至活动轮廓模型中进行进一步细致的分割,得到髋关节分割结果;相较于将整张超声图像输入到活动轮廓模型而言,该过程能够提升活动轮廓模型算法的处理速度以及处理精度,进而提升髋关节角度测量的精度,以及提升髋关节分割的速度与精度。
另外,本实施例中提供的髋关节分型方法,仅需输入待测对象的原始超声图像即可得到髋关节分型结果,大大提高了医生诊断的工作效率,节省了病人等待时间,此外全自动方法能够使得分型结果更为客观,减少部分医生经验不足导致的误诊。而且,本说明书实施例所提供的方法,相比于深度学习方法而言,不需要依赖于大量数据进行训练,仅需依赖于少量的超声图像进行算法验证即可,大大节省算法成本。
再者,本实施例中提供的超声髋关节自动分型系统,包括从超声换能器接收信号经过波束形成转换成B模式图像到数据存储、图像算法处理以及显示结果的整个流程,提高了系统的完整性和可靠性。
应该理解的是,虽然如上所述的各实施例所涉及的流程图中的各个步骤按照箭头的指示依次显示,但是这些步骤并不是必然按照箭头指示的顺序依次执行。除非本文中有明确的说明,这些步骤的执行并没有严格的顺序限制,这些步骤可以以其它的顺序执行。而且,如上所述的各实施例所涉及的流程图中的至少一部分步骤可以包括多个步骤或者多个阶段,这些步骤或者阶段并不必然是在同一时刻执行完成,而是可以在不同的时刻执行,这些步骤或者阶段的执行顺序也不必然是依次进行,而是可以与其它步骤或者其它步骤中的步骤或者阶段的至少一部分轮流或者交替地执行。
图24是根据本说明书一些实施例所示的处理设备的内部结构图。
在一些实施例中,提供了一种计算机设备(该计算机设备也可以被称为处理设备,例如,处理设备120),该计算机设备可以为超声设备,也可以为与超声设备通信连接的服务器,还可以为与超声设备通信连接的终端设备,当然,还可以为与服务器通信连接的终端设备等,在该计算机设备为终端设备的情况下,其内部结构图可以如图24所示。该计算机设备包括通过系统总线连接的处理器、存储器、通信接口、显示单元和输入装置。通信接口、显示单元和输入装置可通过I/O(输入/输出)接口与系统总线连接。其中,该计算机设备的处理器用于提供计算和控制能力。该计算机设备的存储器包括非易失性存储介质、内存储器。该非易失性存储介质存储有操作系统和计算机程序。该内存储器为非易失性存储介质中的操作系统和计算机程序的运行提供环境。该计算机设备的通信接口用于与外部的终端进行有线或无线方式的通信,无线方式可通过WIFI、移动蜂窝网络、NFC(近场通信)或其他技术实现。该计算机程序被处理器执行时以实现一种髋关节分型方法。该计算机设备的显示屏可以是液晶显示屏或者电子墨水显示屏,该计算机设备的输入装置可以是显示屏上覆盖的触摸层,也可以是计算机设备外壳上设置的按键、轨迹球或触控板,还可以是外接的键盘、触控板或鼠标等。
本领域技术人员可以理解,图24中示出的结构,仅仅是与本说明书方案相关的部分结构的框图,并不构成对本说明书方案所应用于其上的计算机设备的限定,具体的计算机设备可以包括比图中所示更多或更少的部件,或者组合某些部件,或者具有不同的部件布置。应当理解,图24所示的系统及其模块可以利用各种方式来实现。例如,在一些实施例中,系统及其模块可以通过硬件、软件或者软件和硬件的结合来实现。其中,硬件部分可以利用专用逻辑来实现;软件部分则可以存储在存储器中,由适当的指令执行系统,例如微处理器或者专用设计硬件来执行。本领域技术人员可以理解上述的方法和系统可以使用计算机可执行指令和/或包含在处理器控制代码中来实现,例如在诸如磁盘、CD或DVD-ROM的载体介质、诸如只读存储器(固件)的可编程的存储器或者诸如光学或电子信号载体的数据载体上提供了这样的代码。本说明书的系统及其模块不仅可以有诸如超大规模集成电路或门阵列、诸如逻辑芯片、晶体管等的半导体、或者诸如现场可编程门阵列、可编程逻辑设备等的可编程硬件设备的硬件电路实现,也可以用例如由各种类型的处理器所执行的软件实现,还可以由上述硬件电路和软件的结合(例如,固件)来实现。
需要注意的是,以上对于髋关节分型系统及其模块的描述,仅为描述方便,并不能把本说明书限制在所举实施例范围之内。可以理解,对于本领域的技术人员来说,在了解该系统的原理后,可能在不背离这一原理的情况下,对各个模块进行任意组合,或者构成子系统与其他模块连接。例如,各个模块可以共用一个存储模块,各个模块也可以分别具有各自的存储模块。诸如此类的变形,均在本说明书的保护范围之内。
上文已对基本概念做了描述,显然,对于本领域技术人员来说,上述详细披露仅仅作为示例,而并不构成对本说明书的限定。虽然此处并没有明确说明,本领域技术人员可能会对本说明书进行各种修改、改进和修正。该类修改、改进和修正在本说明书中被建议,所以该类修改、改进、修正仍属于本说明书示范实施例的精神和范围。
同时,本说明书使用了特定词语来描述本说明书的实施例。如“一个实施例”、“一实施例”、和/或“一些实施例”意指与本说明书至少一个实施例相关的某一特征、结构或特点。因此,应强调并注意的是,本说明书中在不同位置两次或多次提及的“一实施例”或“一个实施例”或“一个替代性实施例”并不一定是指同一实施例。此外,本说明书的一个或多个实施例中的某些特征、结构或特点可以进行适当的组合。
此外,除非权利要求中明确说明,本说明书所述处理元素和序列的顺序、数字字母的使用、或其他名称的使用,并非用于限定本说明书流程和方法的顺序。尽管上述披露中通过各种示例讨论了一些目前认为有用的发明实施例,但应当理解的是,该类细节仅起到说明的目的,附加的权利要求并不仅限于披露的实施例,相反,权利要求旨在覆盖所有符合本说明书实施例实质和范围的修正和等价组合。例如,虽然以上所描述的系统组件可以通过硬件设备实现,但是也可以只通过软件的解决方案得以实现,如在现有的服务器或移动设备上安装所描述的系统。
同理,应当注意的是,为了简化本说明书披露的表述,从而帮助对一个或多个发明实施例的理解,前文对本说明书实施例的描述中,有时会将多种特征归并至一个实施例、附图或对其的描述中。但是,这种披露方法并不意味着本说明书对象所需要的特征比权利要求中提及的特征多。实际上,实施例的特征要少于上述披露的单个实施例的全部特征。
一些实施例中使用了描述成分、属性数量的数字,应当理解的是,此类用于实施例描述的数字,在一些示例中使用了修饰词“大约”、“近似”或“大体上”来修饰。除非另外说明,“大约”、“近似”或“大体上”表明所述数字允许有±20%的变化。相应地,在一些实施例中,说明书和权利要求中使用的数值参数均为近似值,该近似值根据个别实施例所需特点可以发生改变。在一些实施例中,数值参数应考虑规定的有效数位并采用一般位数保留的方法。尽管本说明书一些实施例中用于确认其范围广度的数值域和参数为近似值,在具体实施例中,此类数值的设定在可行范围内尽可能精确。
针对本说明书引用的每个专利、专利申请、专利申请公开物和其他材料,如文章、书籍、说明书、出版物、文档等,特此将其全部内容并入本说明书作为参考。与本说明书内容不一致或产生冲突的申请历史文件除外,对本说明书权利要求最广范围有限制的文件(当前或之后附加于本说明书中的)也除外。需要说明的是,如果本说明书附属材料中的描述、定义、和/或术语的使用与本说明书所述内容有不一致或冲突的地方,以本说明书的描述、定义和/或术语的使用为准。
最后,应当理解的是,本说明书中所述实施例仅用以说明本说明书实施例的原则。其他的变形也可能属于本说明书的范围。因此,作为示例而非限制,本说明书实施例的替代配置可视为与本说明书的教导一致。相应地,本说明书的实施例不仅限于本说明书明确介绍和描述的实施例。

Claims (33)

  1. 一种髋关节角度检测系统,所述系统包括处理器,所述处理器用于执行以下操作:
    获取待测对象的超声图像;
    从所述超声图像中提取所述待测对象的髋关节对应的髂骨区域轮廓;
    获取所述髂骨区域轮廓中髂骨区域的质心位置,并基于所述质心位置获取髋关节图像;
    基于所述髋关节图像,确定所述待测对象的髋关节角度。
  2. 根据权利要求1所述的系统,所述从所述超声图像中提取所述待测对象的髋关节对应的髂骨区域轮廓,包括:
    对所述超声图像进行滤波处理;
    对滤波处理后的超声图像进行阈值分割,获取包含多个轮廓区域的二值图像;
    基于所述二值图像,确定所述髂骨区域轮廓。
  3. 根据权利要求2所述的系统,所述滤波处理的方法包括中值滤波,所述阈值分割的方法包括最大熵阈值分割。
  4. 根据权利要求2所述的系统,所述基于所述二值图像,确定所述髂骨区域轮廓,包括:
    通过预设筛选方法对所述二值图像进行处理,确定所述髂骨区域轮廓。
  5. 根据权利要求4所述的系统,所述通过预设筛选方法对所述二值图像进行处理,包括:
    通过预设筛选策略从所述二值图像中获取候选连通域;其中,所述预设筛选策略包括使用面积筛选、质心筛选以及长度筛选三个维度的筛选条件中的至少一个进行筛选;
    基于所述候选连通域确定所述髂骨区域轮廓。
  6. 根据权利要求5所述的系统,当所述候选连通域的数量小于1时,所述处理器还用于:
    对所述预设筛选策略进行筛选条件降维,得到降维后的筛选策略;以及,
    使用所述降维后的筛选策略从所述二值图像中获取候选连通域。
  7. 根据权利要求6所述的系统,所述筛选条件降维的顺序依次为长度筛选、面积筛选和质心筛选。
  8. 根据权利要求5所述的系统,当所述候选连通域的数量大于1时,所述处理器还用于:
    针对每个所述候选连通域,计算所述候选连通域对应的鉴别分数;
    将鉴别分数最高的候选连通域作为目标连通域;
    基于所述目标连通域,从所述超声图像中提取得到所述髂骨区域轮廓。
  9. 根据权利要求1所述的系统,所述基于所述质心位置获取髋关节图像,包括:
    基于所述髂骨区域轮廓中髂骨区域的质心位置按照预设尺寸进行扩展,获取髂骨区域图像;其中,所述预设尺寸与所述超声图像的尺寸相关;
    利用预设的活动轮廓模型对所述髂骨区域图像进行处理,获取所述髋关节图像。
  10. 根据权利要求9所述的系统,所述基于所述髋关节图像,确定所述待测对象的髋关节角度,包括:
    基于所述髋关节图像,确定髂骨区域的凸缘点、终点和盂唇中点;
    连接所述凸缘点和所述终点确定骨顶线,连接所述凸缘点和所述盂唇中点确定软骨顶线;
    基于所述骨顶线和所述软骨顶线的斜率,确定所述待测对象的髋关节角度。
  11. 根据权利要求10所述的系统,基于所述髋关节图像,确定凸缘点,包括:
    根据预设基线,将所述髋关节图像分割为第一图像和第二图像;所述预设基线为基于所述髂骨区域图像中髂骨质心的纵坐标所形成的直线;
    确定所述第二图像中髂骨轮廓的上边线,并提取所述上边线上的点构成骨顶线点集;
    基于所述骨顶线点集,获取骨顶线的斜率;
    利用所述骨顶线的斜率从所述骨顶线点集中确定所述凸缘点。
  12. 根据权利要求11所述的系统,基于所述髋关节图像,确定终点,包括:
    将所述骨顶线点集中的纵坐标最大值对应的点作为所述终点。
  13. 根据权利要求11所述的系统,基于所述髋关节图像,确定盂唇中点,包括:
    从所述第一图像中分割出盂唇区域;
    将所述盂唇区域的质心作为所述盂唇中点。
  14. 一种髋关节角度检测方法,所述方法包括:
    获取待测对象的超声图像;
    从所述超声图像中提取所述待测对象的髋关节对应的髂骨区域轮廓;
    获取所述髂骨区域轮廓中髂骨区域的质心位置,并基于所述质心位置获取髋关节图像;
    基于所述髋关节图像,确定所述待测对象的髋关节角度。
  15. 根据权利要求14所述的方法,所述从所述超声图像中提取所述待测对象的髋关节对应的髂骨区域轮廓,包括:
    对所述超声图像进行滤波处理;
    对滤波处理后的超声图像进行阈值分割,获取包含多个轮廓区域的二值图像;
    基于所述二值图像,确定所述髂骨区域轮廓。
  16. 根据权利要求15所述的方法,所述滤波处理的方法包括中值滤波,所述阈值分割的方法包括最大熵阈值分割。
  17. 根据权利要求15所述的方法,所述基于所述二值图像,确定所述髂骨区域轮廓,包括:
    通过预设筛选方法对所述二值图像进行处理,确定所述髂骨区域轮廓。
  18. 根据权利要求17所述的方法,所述通过预设筛选方法对所述二值图像进行处理,包括:
    通过预设筛选策略从所述二值图像中获取候选连通域;其中,所述预设筛选策略包括使用面积筛选、质心筛选以及长度筛选三个维度的筛选条件中的至少一个进行筛选;
    基于所述候选连通域确定所述髂骨区域轮廓。
  19. 根据权利要求18所述的方法,当所述候选连通域的数量小于1时,所述方法还包括:
    对所述预设筛选策略进行筛选条件降维,得到降维后的筛选策略;以及,
    使用所述降维后的筛选策略从所述二值图像中获取候选连通域。
  20. 根据权利要求19所述的方法,所述筛选条件降维的顺序依次为长度筛选、面积筛选和质心筛选。
  21. 根据权利要求18所述的方法,当所述候选连通域的数量大于1时,所述方法还包括:
    针对每个所述候选连通域,计算所述候选连通域对应的鉴别分数;
    将鉴别分数最高的候选连通域作为目标连通域;
    基于所述目标连通域,从所述超声图像中提取得到所述髂骨区域轮廓。
  22. 根据权利要求14所述的方法,所述基于所述质心位置获取髋关节图像,包括:
    基于所述髂骨区域轮廓中髂骨区域的质心位置按照预设尺寸进行扩展,获取髂骨区域图像;其中,所述预设尺寸与所述超声图像的尺寸相关;
    利用预设的活动轮廓模型对所述髂骨区域图像进行处理,获取所述髋关节图像。
  23. 根据权利要求22所述的方法,所述基于所述髋关节图像,确定所述待测对象的髋关节角度,包括:
    基于所述髋关节图像,确定髂骨区域的凸缘点、终点和盂唇中点;
    连接所述凸缘点和所述终点确定骨顶线,连接所述凸缘点和所述盂唇中点确定软骨顶线;
    基于所述骨顶线和所述软骨顶线的斜率,确定所述待测对象的髋关节角度。
  24. The method according to claim 23, wherein determining the flange point based on the hip joint image comprises:
    dividing the hip joint image into a first image and a second image according to a preset baseline, the preset baseline being a straight line formed based on the ordinate of the iliac centroid in the iliac region image;
    determining an upper edge of the iliac contour in the second image, and extracting points on the upper edge to form a bone roof line point set;
    obtaining a slope of the bone roof line based on the bone roof line point set; and
    determining the flange point from the bone roof line point set using the slope of the bone roof line.
  25. The method according to claim 24, wherein determining the end point based on the hip joint image comprises:
    taking the point corresponding to the maximum ordinate in the bone roof line point set as the end point.
  26. The method according to claim 24, wherein determining the labrum midpoint based on the hip joint image comprises:
    segmenting a labrum region from the first image; and
    taking the centroid of the labrum region as the labrum midpoint.
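Claims 24-25 build the bone roof line point set from the upper contour edge and take the end point as the maximum-ordinate point. How the slope "determines" the flange point is not spelled out; the sketch below uses one plausible criterion (the point farthest above the least-squares line), which is an assumption, not the patent's rule:

```python
import numpy as np

def upper_edge_points(binary):
    """For each column with foreground, take the topmost pixel: a simple
    reading of 'the upper edge of the iliac contour'."""
    pts = []
    for x in range(binary.shape[1]):
        ys = np.nonzero(binary[:, x])[0]
        if ys.size:
            pts.append((x, int(ys[0])))
    return pts

def pick_points(points):
    """End point per claim 25: the maximum-ordinate point of the set.
    Flange point: here, the point farthest above the least-squares line
    fitted to the set (illustrative criterion only)."""
    xs = np.array([p[0] for p in points], dtype=float)
    ys = np.array([p[1] for p in points], dtype=float)
    k = np.polyfit(xs, ys, 1)[0]            # slope of the bone roof line
    b = ys.mean() - k * xs.mean()
    residual = ys - (k * xs + b)            # negative = above the line (image coords)
    flange = points[int(np.argmin(residual))]
    end_point = points[int(np.argmax(ys))]
    return flange, end_point
```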
  27. A hip joint classification method, comprising:
    acquiring an original ultrasound image of a subject, the original ultrasound image being an ultrasound image corresponding to the hip joint of the subject;
    extracting, from the original ultrasound image, an iliac region image corresponding to the hip joint of the subject; and
    inputting the iliac region image into a preset active contour model for hip joint classification to obtain a hip joint type of the subject.
  28. The method according to claim 27, wherein extracting, from the original ultrasound image, the iliac region image corresponding to the hip joint of the subject comprises:
    filtering the original ultrasound image;
    performing threshold segmentation on the filtered original ultrasound image to obtain a binary image containing a plurality of contour regions;
    determining an iliac region contour based on the binary image; and
    determining the iliac region image based on the iliac region contour.
  29. The method according to claim 28, wherein determining the iliac region contour based on the binary image comprises:
    processing the binary image by a preset screening method to determine the iliac region contour.
  30. The method according to claim 29, wherein processing the binary image by the preset screening method comprises:
    obtaining candidate connected domains from the binary image by a preset screening strategy, wherein the preset screening strategy comprises screening with at least one of screening conditions in three dimensions: area screening, centroid screening, and length screening; and
    determining the iliac region contour based on the candidate connected domains.
  31. The method according to claim 28, wherein determining the iliac region image based on the iliac region contour comprises:
    obtaining a centroid position of the iliac region in the iliac region contour; and
    expanding by a preset size from the centroid position of the iliac region in the iliac region contour to obtain the iliac region image, wherein the preset size is related to the size of the original ultrasound image.
  32. The method according to claim 27, wherein inputting the iliac region image into the preset active contour model for hip joint classification to obtain the hip joint type of the subject comprises:
    processing the iliac region image with the preset active contour model to obtain a hip joint image;
    determining a flange point, an end point, and a labrum midpoint of the iliac region based on the hip joint image;
    connecting the flange point and the end point to determine a bone roof line, and connecting the flange point and the labrum midpoint to determine a cartilage roof line;
    determining a hip joint angle of the subject based on slopes of the bone roof line and the cartilage roof line; and
    determining the hip joint type of the subject based on the hip joint angle of the subject and a hip joint classification standard.
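Claim 32's final step maps the measured angle to a type via "a hip joint classification standard". A sketch using the commonly cited Graf cutoffs; these values come from the Graf standard, not from the patent text:

```python
def graf_type(alpha, beta):
    """Illustrative Graf-style mapping from the alpha and beta angles
    (in degrees) to a hip joint type."""
    if alpha >= 60:
        return "I"        # mature hip
    if alpha >= 50:
        return "II"       # physiologically immature / delayed ossification
    if alpha >= 43:
        return "IIc" if beta <= 77 else "D"  # critical range / decentering
    return "III/IV"       # dislocated
```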
  33. A hip joint classification apparatus, comprising a processor configured to perform the hip joint classification method according to any one of claims 27 to 32.
PCT/CN2023/121265 2022-09-27 2023-09-25 Hip joint angle detection system and method WO2024067527A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211180701.9 2022-09-27
CN202211180701.9A CN115527065A (zh) 2022-09-27 2022-09-27 Hip joint classification method, apparatus and storage medium

Publications (1)

Publication Number Publication Date
WO2024067527A1 true WO2024067527A1 (zh) 2024-04-04

Family

ID=84699080

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/121265 WO2024067527A1 (zh) 2022-09-27 2023-09-25 Hip joint angle detection system and method

Country Status (2)

Country Link
CN (1) CN115527065A (zh)
WO (1) WO2024067527A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115527065A (zh) * 2022-09-27 2022-12-27 Wuhan United Imaging Healthcare Co., Ltd. Hip joint classification method, apparatus and storage medium
CN117274272B (zh) * 2023-09-08 2024-04-30 Qingdao Municipal Hospital Optimization method for coronary angiography image segmentation based on deep learning
CN117455925B (zh) * 2023-12-26 2024-05-17 Hangzhou Jianpei Technology Co., Ltd. Method and device for segmenting multiple chest organs and ribs

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107146214A (zh) * 2016-03-01 2017-09-08 Xiamen University Method for computer-aided automatic diagnosis of hip joint development status in children
CN108537838A (zh) * 2018-03-13 2018-09-14 Beijing Institute of Technology Method for detecting the bony acetabular angle of the hip joint
CN109919943A (zh) * 2019-04-16 2019-06-21 Guangdong Women and Children Hospital Automatic infant hip joint angle detection method, system and computing device
US20210209399A1 (en) * 2020-01-08 2021-07-08 Radu Dondera Bounding box generation for object detection
CN115527065A (zh) * 2022-09-27 2022-12-27 Wuhan United Imaging Healthcare Co., Ltd. Hip joint classification method, apparatus and storage medium


Also Published As

Publication number Publication date
CN115527065A (zh) 2022-12-27

Similar Documents

Publication Publication Date Title
WO2024067527A1 (zh) Hip joint angle detection system and method
Loizou A review of ultrasound common carotid artery image and video segmentation techniques
Golemati et al. Using the Hough transform to segment ultrasound images of longitudinal and transverse sections of the carotid artery
JP6467041B2 (ja) Ultrasonic diagnostic apparatus and image processing method
Loizou et al. Snakes based segmentation of the common carotid artery intima media
Lu et al. Automated fetal head detection and measurement in ultrasound images by iterative randomized Hough transform
Menchón-Lara et al. Automatic detection of the intima-media thickness in ultrasound images of the common carotid artery using neural networks
Gil et al. Statistical strategy for anisotropic adventitia modelling in IVUS
US20110196236A1 (en) System and method of automated gestational age assessment of fetus
US10405834B2 (en) Surface modeling of a segmented echogenic structure for detection and measurement of anatomical anomalies
JP2012512672A (ja) Method and system for automatic detection of lesions in medical images
Rossi et al. Automatic recognition of the common carotid artery in longitudinal ultrasound B-mode scans
Liu et al. Cardiac magnetic resonance image segmentation based on convolutional neural network
Antunes et al. Phase symmetry approach applied to children heart chambers segmentation: a comparative study
Yu et al. Fetal abdominal contour extraction and measurement in ultrasound images
Gao et al. Segmentation of ultrasonic breast tumors based on homogeneous patch
TWI574671B (zh) Breast image analysis method and electronic device thereof
CN108670301B (zh) Method for locating spinal transverse processes based on ultrasound images
Yu et al. Fetal ultrasound image segmentation system and its use in fetal weight estimation
Martins et al. A new active contours approach for finger extensor tendon segmentation in ultrasound images using prior knowledge and phase symmetry
Elwazir et al. Fully automated mitral inflow doppler analysis using deep learning
Lazrag et al. Combination of the Level‐Set Methods with the Contourlet Transform for the Segmentation of the IVUS Images
Wang et al. A multiresolution framework for ultrasound image segmentation by combinative active contours
Song et al. Fully Automatic Ultrasound Fetal Heart Image Detection and Segmentation based on Texture Analysis
Velmurugan et al. A review on systemic approach of the ultra sound image to detect renal calculi using different analysis techniques

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23870756

Country of ref document: EP

Kind code of ref document: A1