CN210277194U - Image diagnosis system - Google Patents


Info

Publication number
CN210277194U
CN210277194U (application CN201821524684.5U)
Authority
CN
China
Prior art keywords
image
bed
height
actual
diagnostic system
Prior art date
Legal status
Active
Application number
CN201821524684.5U
Other languages
Chinese (zh)
Inventor
胡仁芳
Current Assignee
Shanghai United Imaging Healthcare Co Ltd
Original Assignee
Shanghai United Imaging Healthcare Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co Ltd filed Critical Shanghai United Imaging Healthcare Co Ltd
Priority to CN201821524684.5U (publication CN210277194U)
Priority to US16/236,585 (publication US11182927B2)
Application granted
Publication of CN210277194U
Priority to US17/455,933 (publication US11727598B2)


Landscapes

  • Apparatus For Radiation Diagnosis (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)

Abstract

The utility model provides an image diagnosis system. The image diagnosis apparatus includes a scanning device, a bed, and an image reconstruction device; the system further includes a photographing device and a workstation, where the workstation includes a bed control unit, a photographing control unit, a position calibration calculation unit, and a bed positioning calculation unit. The system determines the correspondence between the actual position of the bed and the image position according to a plurality of actual positions and the corresponding position-image positions. The image diagnosis apparatus can therefore ensure that the part of the subject to be scanned is detected, while remaining simple and convenient to operate.

Description

Image diagnosis system
Technical Field
The utility model relates to the field of image diagnosis systems, and in particular to an image diagnosis system with a calibration function.
Background
Imaging diagnostic systems such as CT, PET-CT, MR, and PET-MR are increasingly used in the examination of various diseases because they can rapidly acquire clear images of the internal condition of a subject. Taking a CT machine as an example, a CT machine generally has a scanning device and a bed. The scanning device contains components that emit and receive radiation; these components emit radiation and convert the received radiation signals into electrical signals used to generate an image. The bed carries the subject. Generally, after the subject lies on the bed in a prone or supine posture, the bed must move so that the part of the subject to be scanned reaches a region suitable for examination. There are several reasons for providing a movable bed. The subject usually does not need a whole-body examination, and to obtain a good imaging effect the scanning device is designed as a narrow ring that is difficult to enter, while subjects, who are often ill and have limited mobility, find it difficult to enter the detection area on their own. On the other hand, a movable bed also helps ensure that the part of the subject to be scanned is actually detected.
Since the bed is movable, merely positioning the subject relative to the bed is often insufficient to ensure that the part to be scanned is detected, and various positioning methods have therefore been devised. For example, some imaging devices are equipped with laser emitters. Before the examination, a line is drawn on the part of the subject to be scanned, and the light spot formed by the laser emitter is aligned with this drawn line to ensure that the part is detected. However, this method requires drawing the line in advance and aligning it manually before each examination, which is inefficient.
Therefore, it is necessary to provide an image diagnosis system which can ensure that a part to be scanned of a subject is detected and which is easy to operate.
SUMMARY OF THE UTILITY MODEL
An object of the utility model is to provide an image diagnosis system that can ensure that the part of the subject to be scanned is detected and that is simple and convenient to operate.
To solve at least some of the above technical problems, the utility model provides an image diagnosis system, including: an image diagnosis apparatus, comprising: a scanning device having a circular-bore scanning cavity, for scanning a patient and obtaining scan data; a bed movable along the axial direction of the scanning cavity, for carrying the patient into the scanning cavity; and an image reconstruction device for reconstructing a medical image from the scan data; a photographing device installed at a specific position, for photographing a position image of the bed; and a workstation, comprising: a bed control unit connected with the image diagnosis apparatus, which sends a bed moving signal to the apparatus and moves the bed to a plurality of actual positions; a photographing control unit connected with the photographing device, which controls the photographing device to photograph position images of the bed at the actual positions; a position calibration calculation unit for obtaining the correspondence between the actual position of the bed and the position image according to the plurality of actual positions and the corresponding position images, the correspondence being used to determine the actual position of the bed from images captured by the photographing device; and a bed positioning calculation unit for determining the actual position of the bed from the position image captured by the photographing device.
In at least one embodiment of the present invention, the workstation further includes: an input unit connected with the bed control unit, for inputting commands that control the movement of the bed; and a display unit for displaying the position images captured by the photographing device.
In at least one embodiment of the present invention, the photographing device is suspended above the bed, and when the bed is located at any position within its movable range, the photographing device can photograph all or part of the bed.
In at least one embodiment of the present invention, the bed control module moving the bed to a plurality of actual positions includes: sending to the image diagnosis apparatus an initial moving signal that moves the bed to an actual initial position; and sending to the image diagnosis apparatus, multiple times, a stepping movement signal that moves the bed by a preset step length to a plurality of actual stepping positions. The photographing control module controls the photographing device to photograph a stepping image of the bed after each movement.
In at least one embodiment of the present invention, when obtaining the correspondence between the actual position of the bed and the position image, the calculation unit: obtains the image position of the bed in each position image; and determines the correspondence between the actual position and the image position of the bed according to the plurality of actual positions and the corresponding image positions.
In at least one embodiment of the present invention, the bed has at least one identification feature thereon, and the position image includes at least one of the identification features of the bed; the calculation unit takes the position of the identification feature in the position image as the image position.
In at least one embodiment of the present invention, the system further comprises an image recognition module; the image recognition module obtains the position of the identification feature in the position image by automatic image recognition.
In at least one embodiment of the present invention, the input unit is further configured to receive input information of a user regarding a position of the identification feature in the position image.
In at least one embodiment of the present invention, the position calibration calculation unit is configured to: determine a correspondence table between the actual position and the image position of the bed according to the actual positions and the image positions; and either interpolate the correspondence table to obtain the correspondence between the actual position and the image position, or use the correspondence table directly as that correspondence.
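The correspondence table and its interpolation can be sketched as follows. This is a minimal illustration assuming a one-dimensional bed axis, millimetre units for actual positions, pixel units for image positions, and simple linear interpolation between calibrated points; the function names and data are hypothetical, not taken from the patent.

```python
# Hypothetical sketch of the correspondence table described above:
# calibrated (actual position, image position) pairs are stored as a
# table, and intermediate image positions are linearly interpolated
# back to actual bed positions.

def build_table(actual_positions, image_positions):
    """Pair each calibrated actual position (mm) with its measured
    image position (pixels), sorted by image position."""
    return sorted(zip(image_positions, actual_positions))

def image_to_actual(table, image_pos):
    """Linearly interpolate an actual bed position from an image
    position using the calibrated correspondence table."""
    for (x0, y0), (x1, y1) in zip(table, table[1:]):
        if x0 <= image_pos <= x1:
            t = (image_pos - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    raise ValueError("image position outside calibrated range")
```

For example, with calibrated actual positions of 0, 50, and 100 mm photographed at image positions 400, 300, and 200 pixels, an image position of 250 pixels interpolates to an actual position of 75 mm.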
In at least one embodiment of the present invention, the bed control module sends a bed moving signal to the image diagnosis apparatus and moves the bed to a preset first actual height and a preset second actual height; the photographing control module controls the photographing device to photograph a first height image of the bed at the first actual height and a second height image at the second actual height; the calculation unit obtains the height correspondence between the actual height of the bed and the height image position according to the first actual height, the second actual height, the first height image position, and the second height image position.
In at least one embodiment of the present invention, the upper surface of the bed has two identification features; the bed control module moves the bed to the preset first actual height with the first identification feature located at the photographing centre of the photographing device, and then moves the bed to the preset second actual height with the first identification feature again located at the photographing centre; the calculation unit takes the position of the second identification feature in the first height image as the first height image position, and the position of the second identification feature in the second height image as the second height image position.
In at least one embodiment of the present invention, when the image diagnosis apparatus is operated, the calculation unit corrects the correspondence between the actual position of the bed and the image position according to the current height of the bed and the height correspondence.
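The two-height calibration described above amounts to fitting a map between the bed's actual height and the marker's position in the height image. A minimal sketch, assuming a linear model fitted from the two calibration measurements (the patent does not fix the model; names and values are hypothetical):

```python
# Hypothetical sketch of the height correspondence: two calibration
# measurements (height, marker image position) define a linear map
# from the marker's image position back to the actual bed height.

def height_correspondence(h1, img1, h2, img2):
    """Return a function mapping the marker's image position (pixels)
    to the actual bed height (mm), fitted from the two measurements
    (h1, img1) and (h2, img2)."""
    slope = (h2 - h1) / (img2 - img1)
    return lambda img: h1 + slope * (img - img1)
```

With a marker photographed at pixel 120 when the bed is at height 500 mm and at pixel 180 at height 700 mm, a marker reading of 150 pixels maps to a height of 600 mm.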
By obtaining a plurality of images of the bed at a plurality of actual positions, the image diagnosis system provided by the utility model determines the correspondence between the actual position of the bed and the image position according to the plurality of actual positions and the corresponding position-image positions. The image diagnosis system can therefore ensure that the part of the subject to be scanned is detected, while remaining simple to operate.
Drawings
The above and other features, properties and advantages of the present invention will become more apparent from the following description of the embodiments with reference to the accompanying drawings, in which:
fig. 1 is a schematic structural diagram of an image diagnosis device of an image diagnosis system according to an embodiment of the present invention;
fig. 2 is a schematic system structure diagram of an image diagnosis system according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of an image diagnosis system according to an embodiment of the present invention in a first state;
fig. 4 is a schematic structural diagram of an image diagnosis system in a second state according to an embodiment of the present invention;
fig. 5 is a flowchart illustrating a calibration method of the image diagnosis system according to an embodiment of the present invention;
fig. 6 is a schematic top view of an image diagnosis device of the image diagnosis system according to an embodiment of the present invention;
fig. 7 is a schematic flow chart illustrating a part of optional steps of a calibration method of an image diagnosis system according to another embodiment of the present invention;
fig. 8 is a schematic diagram of a method for establishing a calibration relationship at different bed heights according to an embodiment of the present invention.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description are only examples or embodiments of the application; on the basis of these drawings, a person skilled in the art can apply the application to other similar scenarios without inventive effort. Unless otherwise apparent from the context or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
As used in this application and the appended claims, the terms "a," "an," and "the" do not denote the singular only and may include the plural unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; these steps and elements do not form an exclusive list, and a method or apparatus may also include other steps or elements.
An embodiment of the image diagnosis system of the present invention is described below with reference to figs. 1 to 5. Referring first to fig. 1, the basic structure of the image diagnosis apparatus in some embodiments is described by taking a CT machine as an example. In the current embodiment, the CT machine 200 includes a bed 21, a scanning device 22, and an image reconstruction device. The scanning device 22 has a circular-bore scanning cavity for scanning the patient and obtaining scan data. The bed 21 carries the subject. The bed 21 is movable in the axial direction of the scanning cavity and can thus carry the patient into the scanning cavity so that the part of the subject to be scanned is moved to a position suitable for examination, i.e., position 23 in fig. 1. Once the part to be scanned has been moved into place, the CT machine 200 can examine it. For example, the CT machine 200 has a radiation source 221 and a detector 222. The radiation source 221 emits radiation toward the part to be scanned; the radiation penetrates that part of the subject and is received by the detector 222. The image reconstruction device reconstructs a medical image of the part to be scanned from the resulting scan data. The image diagnosis apparatus 200 may alternatively be a PET-CT, MR, or PET-MR system.
Referring to fig. 2, in some embodiments the image diagnosis system includes, in addition to the image diagnosis apparatus 200, a photographing device 11, a bed control module 12a, a photographing control module 12b, a position calibration calculation unit 12c, and a bed positioning calculation unit 12e. The bed control module 12a, the photographing control module 12b, the position calibration calculation unit 12c, and the bed positioning calculation unit 12e are connected to each other and can exchange data. In the present embodiment, these structures collectively form a table 12. The table 12 also has a bus 12d, and the bed control module 12a, the photographing control module 12b, the position calibration calculation unit 12c, and the bed positioning calculation unit 12e are connected via the bus 12d. For convenience of description, the table 12 and the photographing device 11 are hereinafter also collectively referred to as a calibration system 100.
The photographing device 11 is installed at a specific position and can image all areas that the bed 21 of the image diagnosis apparatus 200 can reach. The bed control module 12a is connected to the image diagnosis apparatus 200 and may transmit a bed moving signal to it, thereby moving the bed 21 to a plurality of actual positions. The photographing control module 12b is connected to the photographing device 11 and can control it to capture position images of the bed 21 at the respective actual positions. The position calibration calculation unit 12c obtains the plurality of actual positions and the corresponding position images of the bed 21 and derives from them the correspondence between the actual position of the bed and the position image. When the image diagnosis apparatus 200 scans a subject, the bed positioning calculation unit 12e uses this correspondence to determine the actual position of the bed from the image captured by the photographing device.
The position calibration calculation unit 12c may obtain the plurality of actual positions of the bed 21 and the corresponding position images in various ways. In some embodiments, after controlling the photographing device 11 to capture the position images of the bed 21 at the respective actual positions, the photographing control module 12b obtains these images from the photographing device 11 and transmits them to the position calibration calculation unit 12c via the bus 12d. When the bed control module 12a sends a bed moving signal to the image diagnosis apparatus 200, the moving signal is also transmitted to the position calibration calculation unit 12c, which can determine the actual position of the bed from these signals.
Referring to figs. 3 and 4, in other embodiments the calibration system 100 of the image diagnosis apparatus 200 includes the photographing device 11 and the table 12. The table 12 may be any device capable of realizing the functions of the bed control module 12a, the photographing control module 12b, and the position calibration calculation unit 12c. For example, in the present embodiment the table 12 is a personal computer; in other embodiments it may be a workstation, a server, or a smart mobile device. The table 12 may be connected to the image diagnosis apparatus 200 and the photographing device 11 directly, as shown in figs. 3 and 4, or indirectly, for example through a network connection.
The photographing device 11 is, for example, a camera or a video camera; it is installed at a specific position and can capture position images of the bed 21. For example, the photographing device can image all areas that the bed 21 can reach. In fig. 3, the image diagnosis apparatus 200 is in the first state, in which the bed 21 is at the position closest to the scanning device 22; the broken line 111 shows the range that the photographing device 11 can image. The photographing device 11 can image all unobstructed areas of the bed 21. In fig. 4, the image diagnosis apparatus 200 is in the second state, in which the bed 21 is at the position farthest from the scanning device 22, and the photographing device 11 can image the entire area of the bed 21.
In one embodiment, the table 12 controls the image diagnosis apparatus 200 and the photographing device 11 to perform the following steps:
the table 12 moves the bed 21 of the image diagnostic apparatus 200 to a plurality of actual positions by giving a movement command or the like to the image diagnostic apparatus 200, where the position of the bed 21 is an actual position rather than a position in the picture, and thus the position of the bed 21 will be hereinafter referred to as an actual position. Images of the bed 21 at a plurality of actual positions are also taken while the bed 21 is moved, and for convenience, these images will be referred to as position images hereinafter.
The table 12 obtains a correspondence between the actual position of the bed 21 and the position image based on the plurality of actual positions of the bed 21 and the corresponding position images.
After obtaining the correspondence, the table 12 applies the correspondence to a subsequent process of "determining the actual position of the bed 21 from the image captured by the imaging device 11".
It should be noted that the bed 21 of the image diagnosis apparatus 200 may be moved to a plurality of actual positions in various ways. For example, the bed 21 may first be moved to a certain position that serves as the initial position, and then moved multiple times by a predetermined step length to a plurality of actual stepping positions.
On the other hand, the correspondence between the actual position of the bed 21 and the position image may also be obtained from the plurality of actual positions and position images in various ways. In some embodiments, obtaining this correspondence further includes the following steps: first, the initial image position of the bed in the initial image is obtained from the initial image; then the stepping image position of the bed in each stepping image is obtained; finally, the correspondence between the actual position of the bed and the image position is determined from the actual initial position, the initial image position, the actual stepping positions, and the stepping image positions. These steps may be performed by the table 12.
Next, referring to fig. 5, the steps performed by the table 12 to control the image diagnosis apparatus 200 and the photographing device 11 in another embodiment are described. In this embodiment, the table 12 controls the image diagnosis apparatus 200 and the photographing device 11 to perform the following steps:
in step 301, the bed control module 12a moves the bed 21 of the diagnostic imaging apparatus 200 to the actual initial position by, for example, issuing a movement command to the diagnostic imaging apparatus 200. As shown in fig. 3, in the present embodiment, this actual initial position is a position closest to the scanning device 22 within the movable range of the bed 21. In other embodiments, the actual initial position may be other positions, such as the position farthest from the scanning device 22 within the movable range.
Although this step is described first, there may be other steps preceding this step. For example, referring to FIG. 5, this step is preceded by a "start" step. In the starting step, the calibration system 100 and the image diagnosis apparatus 200 complete self-test and the like.
In step 302, the photographing control module 12b controls the photographing device 11 to capture an initial image of the bed 21 at the actual initial position. After the photographing is completed, the position calibration calculation unit 12c can obtain the initial image from the photographing device 11.
In step 303, the bed control module 12a moves the bed 21 multiple times by a preset step length to a plurality of actual stepping positions, and after each movement of the bed 21 is completed, the photographing control module 12b controls the photographing device 11 to capture a stepping image of the bed 21. This step may be performed as a loop. With continued reference to fig. 5, in some embodiments step 303 may include the following steps:
in step 331, the bed 21 is moved for the first time by the preset step length to a first actual stepping position;
in step 332, a first stepping image of the bed 21 is captured after the first movement;
in step 333, the bed 21 is moved for the second time by the preset step length to a second actual stepping position;
in step 334, a second stepping image of the bed 21 is captured after the second movement;
this is repeated until the bed 21 has been moved for the Nth time and the Nth stepping image has been captured, after which the process stops. Referring to fig. 4, the process may stop when the number of movements N reaches a predetermined number or when the bed 21 reaches the end of its moving range.
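The move-and-capture loop of steps 331-334 can be sketched as below, with the bed controller and camera stood in by hypothetical callables (`move_bed`, `capture`); only the loop structure follows the patent, the interfaces are assumptions.

```python
# Sketch of the stepping loop: move the bed by the preset step length,
# photograph after each move, and record (actual position, image) pairs
# including the initial image at position 0.

def run_step_calibration(move_bed, capture, step_mm, n_steps):
    """Move the bed n_steps times by step_mm, capturing an image after
    each move; returns a list of (actual_position_mm, image) pairs."""
    records = [(0.0, capture())]      # initial image at the actual initial position
    for i in range(1, n_steps + 1):
        move_bed(step_mm)             # one movement by the preset step length
        records.append((i * step_mm, capture()))
    return records
```

Each recorded pair then feeds the correspondence determination of step 306.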
In step 304, the position calibration calculation unit 12c obtains the initial image position of the bed 21 in the initial image. Although fig. 5 shows this step being performed after step 303 is completed, it may in fact be performed at other times. For example, in some embodiments, the table 12 generates the initial image position of the bed 21 in the initial image in real time after step 302.
In step 305, the position calibration calculation unit 12c obtains the stepping image position of the bed 21 in each stepping image. Similarly, step 305 need not be performed after step 304 is completed and may be performed at other times. In some embodiments, each time a stepping image is acquired, the table 12 generates, in real time, the stepping image position of the bed 21 in the just-acquired image.
In step 306, the position calibration calculation unit 12c determines the correspondence between the actual position of the bed 21 and the image position from the actual initial position, the initial image position, the actual stepping positions, and the stepping image positions. Because both the actual initial position and the actual stepping positions are positions to which the bed 21 was driven by signals sent to the image diagnosis apparatus 200, the position calibration calculation unit 12c knows the actual initial position and each actual stepping position of the bed 21. In some embodiments, the position calibration calculation unit 12c sets the actual initial position to zero and takes the product of the preset step length and the number of movements as the current actual stepping position. In the preceding steps, the position calibration calculation unit 12c has also obtained the initial image position and the stepping image positions. From this information, the table 12 can determine the correspondence between the actual position of the bed 21 over its entire movable range and the position in the image.
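One possible realization of step 306, assuming the bed axis maps approximately linearly into the image (the patent leaves the model open), is an ordinary least-squares fit of actual position against image position; the names are hypothetical:

```python
# Hypothetical least-squares fit relating measured image positions
# (pixels) to known actual bed positions (mm): actual = a * image + b.

def fit_linear(image_positions, actual_positions):
    """Fit actual = a * image + b by ordinary least squares."""
    n = len(image_positions)
    mx = sum(image_positions) / n
    my = sum(actual_positions) / n
    sxx = sum((x - mx) ** 2 for x in image_positions)
    sxy = sum((x - mx) * (y - my)
              for x, y in zip(image_positions, actual_positions))
    a = sxy / sxx
    b = my - a * mx
    return a, b
```

Feeding in the initial and stepping positions (in mm) with their measured image positions (in pixels) yields coefficients (a, b) that can then serve as the calibrated correspondence.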
The calibration method provided in this embodiment determines the correspondence between the actual position of the bed and the image position by capturing an initial image of the bed at the actual initial position and stepping images at a plurality of actual stepping positions, and combining them with the actual initial position and the actual stepping positions. The calibration process is therefore simple, convenient, and quick, and has high accuracy. The calibration system and calibration method of the image diagnosis apparatus provided by this embodiment can thus ensure that the part of the subject to be scanned is detected by the scanning device.
In addition, since the foregoing embodiment calibrates the image diagnosis apparatus and obtains the correspondence between the actual position of the bed and the image position, when the examiner wishes to examine a specific part of the subject, the examiner only needs to frame-select that part on the picture taken by the photographing device after the subject is in position on the bed. From this input, combined with the correspondence obtained during calibration, the table 12 can determine the corresponding actual position of the bed and then control the image diagnosis apparatus 200 to examine the selected part. The calibration system of this embodiment, or an image diagnosis apparatus to which the calibration method of this embodiment is applied, can therefore improve the efficiency of the examination process.
In some embodiments, after the above calibration method is completed, medical imaging may be performed in the following manner.
A position image is obtained from the imaging device 11. The position image is an image of the bed 21 on which the patient is placed, which is captured by the imaging device 11.
A target scanning area is obtained, which includes the region of the patient to be scanned. The scanning area may be obtained by reading a framing operation that a doctor performs on the position image.
After the position image and the target scanning area are obtained, and because the correspondence between the actual position of the bed 21 and the position image was obtained during calibration, the bed positioning calculation unit 12e can map the target scanning area in the position image back, via this correspondence, to the corresponding actual position of the bed, and control the bed 21 to move to the target position accordingly, thereby scanning the target area and obtaining the medical image.
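The back-mapping from a framed target area to a bed move can be sketched as follows, assuming the correspondence has been reduced to a linear map actual = a × image + b (an assumption; the patent does not prescribe the form) and that the frame is given by its bounds along the bed axis; all names are hypothetical.

```python
# Hypothetical sketch: convert the centre of the doctor's framed region
# (in image pixels along the bed axis) to an actual bed position (mm)
# via an assumed linear correspondence, then command the bed to move there.

def frame_to_bed_target(frame, a, b):
    """frame = (x_min, x_max) of the selected region along the bed
    axis, in image pixels; returns the actual bed position in mm."""
    centre_px = (frame[0] + frame[1]) / 2.0
    return a * centre_px + b

def move_to_scan(frame, a, b, move_bed_to):
    """Map the frame to an actual position and issue the bed move."""
    target_mm = frame_to_bed_target(frame, a, b)
    move_bed_to(target_mm)
    return target_mm
```

For instance, with a = 0.5 mm/pixel and b = -50 mm, framing the region between pixels 100 and 300 commands the bed to 50 mm.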
It should be noted that, although the calibration method of the calibration system of the image diagnostic apparatus mentioned in the foregoing embodiments is described as being implemented on the calibration system of the image diagnostic apparatus shown in fig. 3 and 4, this method may also be implemented on other calibration systems.
Furthermore, although one embodiment of the image diagnosis system and calibration method of the present invention is described above, in other embodiments the image diagnosis system and calibration method may include more details than the embodiment described, and at least some of those details may vary. For example, the specific value of the preset step length can be set according to actual conditions. When high accuracy is desired for the correspondence between the actual position of the bed and the image position, the preset step length may be set to 5 mm or 10 mm. Conversely, if a faster calibration is desired, for example during a routine calibration at each power-on to verify whether the currently applied correspondence is still valid, the preset step length may be set to a larger value such as 15 mm or 20 mm. At least some of these variations are described below through several embodiments.
As described in the previous embodiment, the specific form of the initial position may be various. Accordingly, in the step of moving the bed 21 to the plurality of actual stepping positions a plurality of times with the preset stepping length, the direction and manner of moving the bed 21 may also be various. For example, in some embodiments, the bed 21 can move in two dimensions in the horizontal plane, and it is necessary to move the bed 21 in two directions a plurality of times with the same or different preset step lengths. In some embodiments, referring to fig. 3, the initial position is a position closest to the scanning device 22 within the movable range of the bed 21 and the bed 21 performs only one-dimensional movement in the horizontal plane. At this time, in the step of moving the bed 21 to a plurality of actual step positions a plurality of times by a preset step length, the bed 21 is moved from the actual initial position in a direction away from the scanning device 22. In other embodiments, the actual initial position is the position farthest from the scanning device 22 within the movable range of the bed 21. In this embodiment, in the step of moving the bed 21 to a plurality of actual step positions a plurality of times by a preset step length, the bed 21 is moved in the direction of the scanning device 22.
Besides the specific form of the initial position, the manner of determining the initial position may also vary. For example, in some embodiments, the position at which the bed 21 is closest to the scanning device 22 after moving is taken as the actual initial position. Referring to fig. 3, in some embodiments, the image diagnosis apparatus further includes an alignment module 23, and the actual initial position is the position determined by the alignment module. In other words, the step of "moving the bed to the actual initial position" may further include the following steps:
In step A, the bed control module 12a moves the bed 21 to a preset position. The preset position may be a position close to the position to be determined by the alignment module; referring to fig. 3, the preset position is the position at which the bed 21 is closest to the scanning device 22.
In step B, the alignment module 23 is activated. In the present embodiment, the alignment module 23 is a laser transmitter provided on the scanning device 22 and capable of emitting laser light to the bed 21.
In step C, the bed control module 12a moves the bed 21, according to the alignment result of the alignment module 23, to the actual initial position determined by the alignment module 23. Referring to figs. 3 and 6, in the present embodiment, the bed 21 has an identification feature 211 thereon. After the alignment module 23 is activated, the laser transmitter forms a light spot on the bed 21. The bed 21 is moved manually or automatically to align the spot with the identification feature 211. When the spot is aligned with the identification feature 211, the bed 21 is considered to be at the actual initial position.
With continued reference to figs. 3 and 4, in some embodiments, the photographing device 11 may use the same shooting parameters when capturing the initial image and the step images. Such an arrangement makes it unnecessary to account for the influence of the shooting parameters in the step of determining the correspondence between the actual position of the bed 21 over its entire movable range and the image position.
In some embodiments, the initial image and the step image may be captured by a camera 11 suspended above the image diagnostic apparatus 200. The arrangement is such that the photographing device 11 can photograph all or part of the bed 21 when the bed 21 is located at any position within its movable range. Further, in order to enable the photographing device 11 to keep the photographing parameters unchanged when photographing the initial image and the step image, the photographing device 11 may be made to always photograph the entire area reachable by the bed 21. In other words, the initial image and the step image are made to include all the areas that the bed 21 can reach.
The method of determining the initial image position in the initial image and the step image position in the step image may vary. In some embodiments, the bed 21 has one or more identification features specifically made thereon. Referring to fig. 6, in some embodiments, the upper surface of the bed 21 has two identification features, namely the cross marks 211 and 212. In other embodiments, the position of the outline or a corner of the bed 21 is taken as the initial image position in the initial image and the step image position in the step image. In yet other embodiments, the identification feature is a transverse line perpendicular to the length direction of the bed 21. Of course, the above examples are merely illustrative, and the plurality of identification features may be the same as or different from one another. For example, in some embodiments, the upper surface of the bed 21 has two identification features: one is a cross mark and the other is a transverse line perpendicular to the length direction of the bed 21. In summary, the identification feature may be an image marking feature, such as a cross mark or a bar mark, or a shape feature, such as a specific shape of the edge of the bed or the head rest of the bed.
On the other hand, the size of the identification features may also vary. In the foregoing examples, the cross marks or transverse lines are small compared to the size of the bed 21. In some other embodiments, the size of the identification feature is similar or identical to the size of the bed 21.
In order to increase the probability that the cross marks are accurately recognized, the bed 21 and the two cross marks 211, 212 may be given strongly contrasting colors. For example, the upper surface of the bed 21 may be black and the two cross marks 211 and 212 white (of course, items such as bedspreads covering the bed 21 should be removed). On the other hand, in order to have at least one cross mark in the image when the bed 21 is at any position, the two cross marks 211, 212 may be provided at the two ends of the bed 21, respectively. It should be noted that "both ends" here refers to the end regions of the bed 21, not to its two end lines. As shown in fig. 6, cross marks 211, 212 near the ends of the bed 21 should also be understood as "located at the ends of the bed 21".
In the current embodiment, each of the initial image and the step images includes at least one of the identification features. In some of the initial and step images, however, only one mark may be visible: for example, when the bed 21 is in the position shown in fig. 3, the cross mark 211 may be blocked by the scanning device 22 and only the cross mark 212 appears in the image, while in the state shown in figs. 4 and 6 both cross marks 211, 212 appear. Since the relative positions of the cross marks 211, 212 and the bed 21 are fixed, the position of the identification feature in the initial image can be taken as the initial image position, and the position of the identification feature in the step image as the step image position.
The manner of obtaining the positions of the identification features in the initial image and in the step images may vary. Referring to figs. 3 and 4, in some embodiments, the workstation 12 also has a display unit 12g and an input unit 12h. In the present embodiment, the display unit 12g and the input unit 12h are connected to the bus 12d via an I/O interface. The input unit 12h is indirectly connected to the bed control unit that controls the movement of the bed 21 and can be used to input commands controlling that movement. The display unit 12g can display the position images captured by the photographing device 11.
Alternatively, in some embodiments, after the workstation 12 obtains the initial image and the step images from the photographing device 11, these images can be displayed on the display unit. The operator can identify the position of the identification feature in these images on the display unit and input it by clicking, touching, or the like with an input unit such as a mouse, touch screen, or keyboard. The workstation 12 can then determine the position of the identification feature in the initial image and in each step image from the information entered by the user.
The display unit and the input unit may also have more functions. In some embodiments, the functions of the display unit and the input unit include:
Before the bed control module 12a moves the bed 21 to the plurality of actual positions, movement prompt information is displayed on the display unit so that the user knows that the movement of the bed 21 is about to start. A movement instruction is then received via the input unit, and the user thereby controls the image diagnosis apparatus to move the bed 21 to the plurality of actual positions. In other words, in the step of moving the bed to a plurality of actual positions, the image diagnosis apparatus moves the bed 21 to the plurality of actual positions according to the movement instruction;
After the initial image is obtained, the initial image and position prompt information may be displayed on the display unit so that the user knows that the position of the identification feature in the initial image should be input at this time. When the user makes an input, the position of the identification feature in the initial image is received via the input unit. Similarly, after the step image is obtained, the step image and position prompt information may be displayed on the display unit, and the position of the identification feature in the step image may be received via the input unit when the user makes an input. In addition, after the correspondence between the actual position of the bed and the image position is obtained, a prompt of successful calibration can be displayed on the display unit.
Referring to fig. 2, in some embodiments, the image diagnostic system 100 further includes an image recognition module 12f. The image recognition module may be specialized hardware. When the position calibration calculation unit 12c obtains the initial image and the step images from the photographing device 11, the image recognition module 12f determines the positions of the identification features in the initial image and in the step images. For example, the image recognition module 12f may hold a template of the identification feature and software capable of running a template matching algorithm. After obtaining the initial image and the step images, the image recognition module 12f runs the template matching algorithm on them with the template of the identification feature and automatically identifies the position of the identification feature in the initial image and in each step image. Of course, the template matching algorithm is only one optional method of "automatically identifying the positions of the identification features in the initial image and in the step images"; in other examples, the positions may be identified in other ways. For example, in some embodiments, this automatic recognition is accomplished using a trained neural network.
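As a concrete illustration of the template matching approach described above, the following is a minimal pure-NumPy sketch of normalized cross-correlation matching. The cross-mark image and template are hypothetical; a production system would more likely use an optimized library routine.

```python
import numpy as np

def match_template(image, template):
    """Brute-force normalized cross-correlation template matching.

    Returns the (row, col) of the top-left corner of the best match
    and the correlation score (1.0 = perfect match).
    """
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            w = w - w.mean()
            denom = np.sqrt((w ** 2).sum()) * t_norm
            score = (w * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score
```

Given a template cropped around a cross mark, the returned position plus half the template size gives the mark's pixel coordinate in the initial or step image.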
Besides the purely manual and purely automatic methods of obtaining the location of the identification feature in the above examples, in some embodiments the manual and automatic methods may be combined. Specifically, a position near the first identification feature, for example a white bar mark (the first white bar for short), is input manually; a center ROI of size 2win_x × 2win_y is then determined automatically from that input, where ROI denotes the region of interest. For example, the precise position of the center of the white bar is obtained automatically by the gradient-maximum method, and the ROI image of the white bar is then cropped out as a template. For the second identification feature at the default bed height (the second white bar for short), the ROI is moved automatically to the vicinity of the second white bar according to its distance from the first white bar, and the exact location is then found.
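The gradient-maximum step mentioned above can be sketched on a one-dimensional intensity profile, for example a row of pixels crossing the white bar. This is an illustrative interpretation, assuming the bar center is taken as the midpoint between the strongest rising and falling edges; the profile values are made up.

```python
import numpy as np

def bar_center(profile):
    """Locate the center of a bright bar in a 1-D intensity profile.

    The rising and falling edges are taken at the gradient extrema;
    the bar center is their midpoint.
    """
    grad = np.gradient(profile.astype(float))
    rise = int(np.argmax(grad))   # strongest dark-to-bright transition
    fall = int(np.argmin(grad))   # strongest bright-to-dark transition
    return (rise + fall) / 2.0
```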
The specific manner of determining the correspondence between the actual position of the table and the image position according to the actual initial position, the initial image position, the actual stepping position, and the stepping image position may be various. For example, in some embodiments, a table of actual positions and image positions of the table 21 is determined based on the actual initial position, initial image position, actual step position, and step image position. The table includes the correspondence between the actual position of the bed 21 and the image position at a plurality of discrete points. Then, a curve of the correspondence relationship between the actual position of the bed 21 and the image position is formed.
The specific method by which the curve of the correspondence between the actual position of the bed 21 and the image position is formed may vary. In some embodiments, the correspondence between each actual position of the bed and the image position is read directly from the correspondence table between the actual position of the bed 21 and the image position. In other embodiments, the correspondence between each actual position of the bed and the image position is obtained by interpolation, such as spline interpolation. Other methods can also be used to obtain the correspondence between each actual position of the bed and the image position. For example, a curve of the correspondence between the actual position of the bed 21 and the image position can be formed by simply connecting the plurality of discrete points with straight-line segments.
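As a sketch of the simplest variant, connecting the discrete calibration points with straight lines, the table values below are hypothetical:

```python
import numpy as np

# Hypothetical calibration table: actual bed positions (mm) and the
# corresponding image positions (pixels) recorded at each stepping position.
actual_mm = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
image_px = np.array([820.0, 798.5, 777.0, 755.5, 734.0])

def image_to_actual(px):
    """Map an image position to an actual bed position by piecewise-linear
    interpolation (discrete points connected by straight-line segments)."""
    # np.interp requires ascending x values, so sort by image position.
    order = np.argsort(image_px)
    return float(np.interp(px, image_px[order], actual_mm[order]))
```

A spline interpolation would replace `np.interp` with a smooth fit through the same points; the table lookup variant would instead return the nearest recorded entry.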
In the foregoing examples, the calibration process for the image diagnosis apparatus is completed once the correspondence between the actual position of the bed and the image position has been determined. In some embodiments, however, the calibration method of the image diagnosis apparatus includes further steps. Referring to fig. 7, further steps of the calibration system and calibration method of the image diagnostic apparatus are described for some embodiments. In these embodiments, steps 301 to 306 may be the same as in any of the previous embodiments and are therefore not described again. In the step of moving the bed 21 to a plurality of actual stepping positions a plurality of times with a preset stepping length, the bed 21 is moved in the horizontal direction.
In step 307, the bed control module 12a moves the bed 21 to a preset first actual height. This step may be implemented by the workstation transmitting an instruction to the image diagnosis apparatus 200. The first actual height here may be the height the bed 21 already had during the earlier horizontal movements; in that case, leaving the height of the bed 21 unchanged in this step may also be understood as the bed 21 "moving" to the preset first actual height.
In step 308, the imaging control module 12b controls the photographing device 11 to capture a first height image of the bed 21 at the first actual height. Similarly, if the height of the bed 21 was not changed in the previous step, a corresponding initial image or step image may also be used as the first height image; since the initial image and the step images were all obtained by the photographing device 11, using one of them should likewise be understood as "capturing" the first height image of the bed 21 at the first actual height.
In step 309, the bed control module 12a moves the bed vertically to a preset second actual height. In this step, the position of the bed 21 in the horizontal direction can be maintained.
In step 310, the photographing control module 12b controls the photographing device 11 to capture a second height image of the bed 21 at the second actual height, which differs from the first actual height. In theory, the first height and the second height need only be different; generally, however, to obtain a good result, the first height and the second height may be set with a large height difference. For example, they may be set to the highest and lowest heights that the bed 21 can reach.
In step 311, the position calibration calculation unit 12c obtains the first height image position of the bed 21 in the first height image. The specific implementation of this step may be the same as or different from the methods of obtaining the initial image position and the step image positions.
In step 312, the position calibration calculation unit 12c obtains the second height image position of the bed 21 in the second height image. Similarly, the specific implementation of this step may be the same as or different from the methods of obtaining the initial image position and the step image positions.
In step 313, the position calibration calculation unit 12c obtains the height correspondence between the actual height of the bed and the height image position according to the first actual height, the second actual height, the first height image position, and the second height image position.
It should be noted that although steps 307 to 313 carry larger sequence numbers than steps 301 to 306, this does not mean that steps 307 to 313 must be performed after steps 301 to 306 are completed, nor that they must be performed in that exact order. For example, in one embodiment, steps 307 to 310 may be performed before step 306, before step 305, or even before step 304.
On the other hand, although the bed 21 is moved to only two heights for taking the height pictures in the above-described steps, in other embodiments, the bed may be moved to more heights, more height pictures may be taken, and the height positions obtained from the height pictures may be used in obtaining the height correspondence relationship between the actual height of the bed and the height image position.
As with the initial image position and the step image positions, the specific method of obtaining the first height image position of the bed 21 in the first height image and the second height image position of the bed 21 in the second height image may vary. For example, the positions of the identification features on the upper surface of the bed may be taken as the first height image position and the second height image position.
Referring to fig. 4, in some embodiments, the upper surface of the bed has two identification features. When the first height image is captured, the bed 21 is first moved until the first identification feature 211 is located at the shooting center of the photographing device 11 (the position indicated by the broken line 112). Similarly, when the second height image is captured, the bed 21 is moved until the first identification feature 211 is again located at the shooting center (as indicated by the broken line 112). Of course, if the bed 21 keeps its horizontal position while moving from the first height to the second height, the first identification feature 211 is naturally located at the shooting center when the second height image is captured. This arrangement allows a single position datum of one identification feature to represent the first height image position or the second height image position, and the smaller data volume makes subsequent processing faster.
The height correspondence between the actual height of the bed and the height image position is thus obtained from the first actual height, the second actual height, the first height image position, and the second height image position. The correspondence between the actual position of the bed 21 and the image position at different heights can be corrected by this height correspondence. In some embodiments, during operation of the image diagnosis apparatus 200, the correspondence between the actual position of the bed and the image position is corrected according to the height of the bed 21 and the height correspondence, ensuring that the actual position of the bed 21 and the image position remain in correspondence at any height.
An optional specific method for correcting the correspondence between the actual position of the bed and the image position according to the height of the bed and the height correspondence is described below. After the correspondence between the actual position of the bed and the image position is obtained, the first identification feature is kept at the shooting center (i.e., the image center), and the position u (hereinafter, the object distance) of the second identification feature is recorded at different bed heights.
From the lens imaging formula,
$$\frac{1}{u} + \frac{1}{v} = \frac{1}{f} \qquad (1)$$
Where u, v, f are the object distance, image distance, and focal length, respectively. Assuming that the lens of the photographing device 11 is parallel to the bed plane, the relationship between the object distance and the bed height h is
$$u_h = H - h \qquad (2)$$
In the formula (2), H is the distance from the lens plane to the floor. Assuming that the focal length is constant, the correspondence of the imaging system can be calculated by the following equation (3)
$$m(h) = \frac{v}{u_h} = \frac{f}{u_h - f} = \frac{f}{H - h - f} \qquad (3)$$
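The lens imaging formula and equations (2) and (3) can be checked numerically; the camera-to-floor distance H, bed height h, and focal length f below are assumed values, not parameters from the source.

```python
def magnification(H, h, f):
    """Optical magnification m = v/u for a bed at height h.

    Uses the object distance u = H - h and the thin-lens formula
    1/u + 1/v = 1/f, so that m = v/u = f / (H - h - f).
    """
    u = H - h
    v = u * f / (u - f)  # image distance from the thin-lens formula
    return v / u
```

Raising the bed (larger h) shortens the object distance, so the magnification grows, which is why the bed appears larger in the height image when it is closer to the camera.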
An actual coordinate system for the actual position of the bed and an image coordinate system for the image position are established. In these coordinate systems, the position of the second identification feature in the actual coordinate system is (z, h) and its position in the image coordinate system is U(z, h). Then, using the above formula (3), the correspondence between the pixel coordinate U(z, h) of the second identification feature and its actual position (z, h) can be established:
$$z - Z_{law} = \frac{k\,(H - h - f)}{f}\left(U(z,h) - U_{law}\right) \qquad (4)$$
In formula (4), k represents the actual size of each pixel in the picture coordinate system. Parameters such as the focal length f and the pixel size k can be obtained from the camera supplier or by measurement. On this basis, formula (4) can be rearranged as:
$$U(z,h) = U_{law} + \frac{f}{k\,(H - h - f)}\left(z - Z_{law}\right) \qquad (5)$$
On this basis, using the measured values of the image-coordinate position U(z, h) of the second identification feature and its position (z, h) in the actual coordinate system, the five parameters H, Z_law, U_law, f, and k can be obtained by least-squares fitting or similar methods.
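A minimal pure-NumPy sketch of such a fit follows. Note that in formula (5) the parameters enter only through the combinations D = H - f and c = f/k, so one illustrative strategy (an assumption of this sketch, not a method prescribed by the source) is to sweep D over a grid and solve the remaining parameters by linear least squares at each step.

```python
import numpy as np

def fit_calibration(z, h, U, D_grid):
    """Fit U = U_law + c * (z - Z_law) / (D - h), with D = H - f, c = f/k.

    For each trial D the model is linear in (U_law, c, -c*Z_law), so we
    sweep D over a grid and solve a linear least-squares problem each time,
    keeping the D with the smallest residual.
    """
    best = None
    for D in D_grid:
        # Columns: constant, z/(D-h), 1/(D-h)  ->  U_law, c, b = -c*Z_law
        A = np.column_stack([np.ones_like(z), z / (D - h), 1.0 / (D - h)])
        coef, _resid, _rank, _sv = np.linalg.lstsq(A, U, rcond=None)
        err = float(np.sum((U - A @ coef) ** 2))
        if best is None or err < best[0]:
            best = (err, D, coef)
    err, D, (U_law, c, b) = best
    return dict(D=D, U_law=U_law, c=c, Z_law=-b / c, err=err)
```

Since f and k appear only through their ratio, the fit recovers that ratio rather than each value separately; this does not affect the position correction.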
After the above parameters are obtained, when the image diagnosis apparatus 200 is running, the actual position z can be computed from the current actual bed height h and a position U(z, h) in the image (input, for example, by frame selection or clicking). A correction table between all pixel coordinates U(z, h) and the bed-entering distance L(z, h), referenced to the actual position Z_#2 of the second identification feature, can then be obtained:
$$L(z,h) = z - Z_{\#2} = Z_{law} + \frac{k\,(H - h - f)}{f}\left(U(z,h) - U_{law}\right) - Z_{\#2} \qquad (6)$$
Of course, the above specific method for correcting the correspondence between the actual position of the bed and the image position according to the bed height and the height correspondence is only an optional scheme. In other embodiments, this specific method may also vary. For example, the correction tables at other heights can be derived from the precisely calibrated correction table between pixels and bed codes at one height obtained in the previous step. Assuming the precisely calibrated height is h_0, then according to formula (5), the relationship between the pixel coordinates U(z, h) at different heights, when the second identification feature is at the same actual position z, is:
$$U(z,h) = U_{law} + \frac{H - h_0 - f}{H - h - f}\left(U(z,h_0) - U_{law}\right) \qquad (7)$$
Using the above formula, the correction table calibrated at bed height h_0 can be used to correct the pixel coordinates at other bed heights for the same bed position, thereby establishing the correction relationship at different bed heights.
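The height-scaling relation just described, mapping a pixel coordinate calibrated at bed height h0 to another bed height h, can be sketched as follows; the parameter names follow the formulas above, and the numeric values used for illustration are assumptions.

```python
def correct_pixel(U_h0, h, h0, H, f, U_law):
    """Map a pixel coordinate calibrated at bed height h0 to the
    corresponding pixel coordinate at bed height h, for the same
    actual bed position: the offset from U_law is scaled by
    (H - h0 - f) / (H - h - f).
    """
    return U_law + (H - h0 - f) / (H - h - f) * (U_h0 - U_law)
```

When h equals h0 the scale factor is 1 and the pixel is unchanged; raising the bed above the calibrated height moves the pixel farther from the image center, consistent with the larger magnification.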
A method of establishing a calibration relationship at different bed heights in another embodiment will be described with reference to fig. 8. In this embodiment, the steps that are the same as in the previous embodiments are not described again. When the bed 21 is moved to different heights, all bed heights are traversed in preset steps, and a height image of the bed 21 is captured at each bed height. In fig. 8 only three heights of the bed 21 are shown, designated 21a, 21b, and 21c for distinction. The difference in size of the three beds 21a, 21b, and 21c does not represent a change in the actual size of the bed 21 during lifting; it shows how the size of the bed 21 in the height image changes with bed height due to perspective (nearer objects appear larger).
After height images of the bed 21 at a plurality of heights are obtained, since the actual height of the bed 21 is known when each height image is taken, the position of the identification feature in each height image can also be obtained. Two correspondence curves between the bed height and the image position, shown as the broken lines A and B in fig. 8, can thus be obtained. Of course, if the bed 21 has more identification features, more correspondence curves can be obtained.
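Each correspondence curve relates the bed height to an image position. As a sketch, one such curve can be fitted with a straight line; the heights and pixel values below are hypothetical.

```python
import numpy as np

# Hypothetical data for one bed position (default bed code): the marker's
# horizontal pixel coordinate recorded at several bed heights.
heights = np.array([400.0, 600.0, 800.0])
pixels = np.array([702.0, 708.0, 714.0])

# Linear fit: pixel = a * height + b
a, b = np.polyfit(heights, pixels, 1)

def pixel_at_height(h):
    """Predict the marker's pixel coordinate at bed height h."""
    return a * h + b
```

Inverting this linear equation for a measured pixel gives the correction used at a given bed height.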
After the correspondence curves are obtained, a plurality of slopes along the bed length may be obtained by shape-preserving spline interpolation, and for each horizontal pixel corresponding to a default bed code, a linear equation relating that pixel to the bed height is formed, as shown by the multiple dot-dash lines C in fig. 8. The linear equations of these dot-dash lines are the correction relationships at the different bed heights. When the actual position corresponding to a certain image position at a certain bed height is needed, the bed height and the image position can be substituted into the corresponding linear equation to obtain the actual position. Although the present invention has been described with reference to preferred embodiments, they are not intended to limit the present invention, and those skilled in the art can make various changes and modifications without departing from its spirit and scope. Therefore, any modifications or equivalent changes made to the above embodiments in accordance with the technical spirit of the present invention, without departing from the content of its technical solution, fall within the scope of protection defined by the claims of the present invention.

Claims (9)

1. An image diagnostic system, comprising:
an image diagnosis apparatus, comprising:
the scanning device is provided with a round hole-shaped scanning cavity and is used for scanning a patient and obtaining scanning data;
the bed table can move along the axial direction of the scanning cavity and is used for bearing a patient to enter the scanning cavity;
the image reconstruction device is used for reconstructing to obtain a medical image according to the scanning data;
a photographing device, provided at a position from which images of all areas that the bed can reach can be captured, for capturing a position image of the bed;
a workstation, the workstation comprising:
the bed control unit is connected with the image diagnosis equipment, sends a bed moving signal to the image diagnosis equipment and moves the bed to a plurality of actual positions;
a shooting control unit connected with the shooting device and used for controlling the shooting device to shoot the position image of the bed at the actual position;
the position calibration calculation unit is used for obtaining the corresponding relation between the actual position of the bed and the position image according to the plurality of actual positions of the bed and the corresponding position images; the corresponding relation is used for determining the actual position of the bed according to the image shot by the shooting device;
and the bed positioning calculation unit is used for determining the actual position of the bed according to the position image shot by the shooting device.
2. The image diagnostic system of claim 1, wherein the workstation further comprises:
the input unit is connected with the bed control unit and is used for inputting a command for controlling the movement of the bed;
and the display unit is used for displaying the position image shot by the shooting device.
3. The image diagnostic system of claim 1, wherein the camera is suspended above the bed, and the camera can capture all or part of the bed when the bed is at any position within its movable range.
4. The image diagnostic system of claim 1, wherein the bed has at least one identifying feature thereon, and the position image includes at least one of the identifying features of the bed;
the calculation unit is configured to take a position of the identification feature in the position image as the image position.
5. The image diagnostic system of claim 4, further comprising an image recognition module;
the image recognition module obtains the positions of the recognition features in the position image in an automatic image recognition mode.
6. The image diagnostic system of claim 4, wherein the input unit is further configured to receive user input regarding the position of the identifying feature in the position image.
7. The image diagnostic system of claim 1, wherein: the bed control unit is used for sending a bed moving signal to the image diagnosis equipment and moving the bed to a preset first actual height and a preset second actual height;
the shooting control unit is used for controlling the shooting device to shoot a first height image of the bed at the first actual height and a second height image of the bed at the second actual height;
the calculation unit is used for obtaining the height corresponding relation between the actual height of the bed and the height image position according to the first actual height, the second actual height, the first height image position and the second height image position.
8. The image diagnostic system of claim 7, wherein the upper surface of the bed has two identifying features;
the bed control unit is used for moving the bed to a preset first actual height and enabling the first identification feature to be located in the shooting center of the shooting device, and moving the bed to a preset second actual height and enabling the first identification feature to be located in the shooting center;
the calculation unit is configured to use a position of a second recognition feature in the first height image as the first height image position, and use a position of the second recognition feature in the second height image as the second height image position.
9. The image diagnostic system according to claim 7, wherein the calculation unit is configured to correct the correspondence between the actual position of the bed and the image position based on the height of the bed and the height correspondence when the image diagnostic apparatus is in operation.
CN201821524684.5U 2018-09-18 2018-09-18 Image diagnosis system Active CN210277194U (en)

Publications (1)

Publication Number Publication Date
CN210277194U true CN210277194U (en) 2020-04-10

Family

ID=70058555

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201821524684.5U Active CN210277194U (en) 2018-09-18 2018-09-18 Image diagnosis system

Country Status (1)

Country Link
CN (1) CN210277194U (en)

Similar Documents

Publication Publication Date Title
CN109171789B (en) Calibration method and calibration system for image diagnosis equipment
CN107789001B (en) Positioning method and system for imaging scanning
US11629955B2 (en) Dual-resolution 3D scanner and method of using
CN106859675B (en) Method and system for scanner automation for X-ray tubes with 3D camera
EP3364214B1 (en) Method of automatically positioning an x-ray source of an x-ray system and an x-ray system
US11276166B2 (en) Systems and methods for patient structure estimation during medical imaging
JP4484462B2 (en) Method and apparatus for positioning a patient in a medical diagnostic or therapeutic device
CN103181775B (en) For detecting the method and system of patient body's cursor position
CN103767722B (en) The localization method that CT or PET-CT system and carrying out scans
CN106388851A (en) Arranging position control method and device
US10742962B2 (en) Method and system for capturing images for wound assessment with moisture detection
US11600021B2 (en) System and method for calibration between coordinate systems of 3D camera and medical imaging apparatus and application thereof
KR20160076487A (en) Imaging arrangement and method for positioning a patient in an imaging modality
US10830850B2 (en) Optical camera for patient position monitoring
CN111067531A (en) Wound measuring method and device and storage medium
US10779793B1 (en) X-ray detector pose estimation in medical imaging
US20190130598A1 (en) Medical apparatus
WO2023272372A1 (en) Method for recognizing posture of human body parts to be detected based on photogrammetry
JP2015198824A (en) Medical image diagnostic apparatus
CN113440156A (en) Mobile CT intelligent scanning positioning system, positioning method and storage medium
CN111528895A (en) CT visual positioning system and positioning method
US20180168535A1 (en) X-ray image capturing apparatus and method of controlling the same
CN210277194U (en) Image diagnosis system
KR102313801B1 (en) Apparatus and method for guiding correct posture of medical image system
CN109118480B (en) Adjusting method and device

Legal Events

Date Code Title Description
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 201807 Shanghai City, north of the city of Jiading District Road No. 2258

Patentee after: Shanghai Lianying Medical Technology Co., Ltd

Address before: 201807 Shanghai City, north of the city of Jiading District Road No. 2258

Patentee before: SHANGHAI UNITED IMAGING HEALTHCARE Co.,Ltd.