EP2864807B1 - System and method for 3d ultrasound volume measurements - Google Patents
- Publication number
- EP2864807B1 (application EP13753208.1A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- cursor
- point
- display
- user
- ultrasound
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5223—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for extracting a diagnostic or physiological parameter from medical diagnostic data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
- G01S15/8906—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
- G01S15/8993—Three dimensional imaging systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/7405—Details of notification to user or communication with user or patient ; user input means using sound
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/7455—Details of notification to user or communication with user or patient ; user input means characterised by tactile indication, e.g. vibration or electrical stimulation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/13—Tomography
- A61B8/14—Echo-tomography
- A61B8/145—Echo-tomography characterised by scanning multiple planes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4483—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/466—Displaying means of special interest adapted to display 3D data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/467—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means
- A61B8/469—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means for selection of a region of interest
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/483—Diagnostic techniques involving the acquisition of a 3D volume of data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/54—Control of the diagnostic device
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52053—Display arrangements
- G01S7/52057—Cathode ray tube displays
- G01S7/52073—Production of cursor lines, markers or indicia by electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/012—Dimensioning, tolerancing
Definitions
- the present invention relates to an ultrasound imaging system and method for determining a distance between a first point and a second point in a three-dimensional ultrasound image of a volume, for example an anatomical site of a patient.
- the present invention further relates to a computer program for implementing such method.
- In three-dimensional ultrasound imaging, or volume imaging, the acquisition of a three-dimensional image is accomplished by conducting many two-dimensional scans that slice through the volume of interest. Hence, a multitude of two-dimensional images is acquired that lie next to one another. By proper image processing, a three-dimensional image of the volume of interest can be built out of the multitude of two-dimensional images. The three-dimensional information acquired from the multitude of two-dimensional images is displayed in proper form on a display for the user of the ultrasound system.
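The slice-stacking described above can be sketched as follows. This is an illustrative reconstruction, not code from the patent; the function name `build_volume` and the list-of-lists representation are hypothetical choices.

```python
# Sketch: assembling a 3D volume from a stack of parallel 2D ultrasound
# slices. Each slice is a 2D grid of echo intensities; stacking the
# slices along a third axis yields the volume.

def build_volume(slices):
    """Stack parallel 2D scans (rows x cols) into a volume indexed as
    volume[z][y][x], where z is the slice index."""
    rows = len(slices[0])
    cols = len(slices[0][0])
    # All slices must share the same in-plane dimensions.
    for s in slices:
        assert len(s) == rows and all(len(r) == cols for r in s)
    return [[list(row) for row in s] for s in slices]

# Two 2x2 slices -> a 2x2x2 volume.
vol = build_volume([[[1, 2], [3, 4]], [[5, 6], [7, 8]]])
```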
- the reference US 2011/0066031 A1 discloses embodiments for providing an ultrasound system for performing a three-dimensional measurement and comprising an ultrasound data acquisition unit configured to transmit ultrasound signals to a target object and receive ultrasound echo signals reflected from the target object to acquire ultrasound data. Further, it comprises a user interface configured to receive input data from a user and a processor configured to form a three-dimensional ultrasound image based on volume data derived from the ultrasound data, establish two or more points on the 3D-ultrasound image based on the input data, generate connection data among the established two or more points on the 3D-ultrasound image, and measure distances among the established two or more points based on the input data and the connection data.
- document JP 3 325224 B shows ultrasonic diagnostic imaging equipment for conducting observation and inspection.
- an ultrasound imaging system for providing a three-dimensional image of a volume and for determining a distance between a first point and a second point in the three-dimensional image.
- the ultrasound imaging system comprises a transducer array configured to provide an ultrasound receive signal, a controlling unit configured to receive the ultrasound receive signal and to provide display data representing the three-dimensional image, wherein the controlling unit is further configured to determine a distance between the first point and the second point identified in the three-dimensional image, a display configured to receive the display data and to display the three-dimensional image of the volume and a cursor for identifying the first point and the second point, and an input device configured to provide input data to the controlling unit, wherein the input data includes a movement of the cursor, and wherein the ultrasound imaging system is configured to enable a user to perform a first movement of the cursor parallel to a plane displayed to the user when viewing the three-dimensional image displayed on the display based on input data, to identify a first coordinate and a second coordinate of at least one of the first and second points, and to perform a second movement of the cursor, without changing a perspective of the view displayed on the display, perpendicularly to the plane and into the three-dimensional image, based on input data, to identify a third coordinate of the respective point.
- a method for determining a distance between a first point and a second point in a three-dimensional ultrasound image of a volume comprises the steps of displaying the three-dimensional ultrasound image on a display together with a cursor for identifying the first point and the second point, moving the cursor parallel to a plane displayed to a user when viewing the three-dimensional image provided on the display based on input data to identify a first coordinate and a second coordinate of at least one of the first and second points, moving the cursor, without changing a perspective of the view displayed on the display, perpendicularly to the plane displayed on the display and into the three-dimensional image, based on input data to identify a third coordinate of the respective point, providing, while moving the cursor perpendicularly to the plane that is displayed, an indication on the display when the cursor touches a structure displayed within the volume so as to inform the user that the cursor has touched the structure and so that the user may place the cursor properly on the structure in the direction perpendicular to the plane, and determining the distance between the first point and the second point.
- a computer program comprising program code means for causing a computer to carry out the steps of such method when said computer program is carried out on the computer.
- the basic idea of the invention is to overcome the "fore-shortening effect" by providing the user with the possibility to place measurement cursors directly into the volume to touch the structures that are to be measured in the three-dimensional image.
- the problem that the user may only place the cursor in a plane shown on the display can be overcome. Further, there is no need for the user to rotate the three-dimensional volume extensively to find a proper view in which it is possible to locate the cursor at a proper position touching a structure in the three-dimensional volume. Instead, the user may position the cursor in a plane of the three-dimensional volume shown on the display first and then may "dive" the cursor into the volume until it touches the structure.
- a user is provided with a cursor end-point depth control, for example for the z-dimension, in addition to the trackball for placing the end-point in the dimensions of the screen, for example the x and y dimensions.
- the user uses the endpoint depth control to move the cursor down into the volume.
- the ultrasound imaging system calculates the true three-dimensional distance between the points. Then, the ultrasound imaging system may display the distance as a "length" measurement.
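The "true three-dimensional distance" is the Euclidean distance over all three coordinates. A minimal sketch, not code from the patent; the helper name `distance_3d` is a hypothetical choice:

```python
import math

def distance_3d(p1, p2):
    """Euclidean distance between two cursor end-points (x, y, z)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))

# End-points placed via the in-plane movement (x, y) and the depth
# control (z); a 3-4-5 triangle in the y-z plane gives length 5.
length = distance_3d((10.0, 0.0, 0.0), (10.0, 3.0, 4.0))  # 5.0
```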
- the ultrasound imaging system is configured to enable the second movement after the first movement has been completed.
- a user may first move a cursor within the plane shown on the display. If a proper position has been reached, the user may then fix this position, whereby the first and second coordinates of the respective first or second point are determined. Subsequently, the user can move the cursor perpendicular to the plane to place the cursor properly touching the structure to be measured. By this, the third coordinate can be determined. As the first and second coordinates may remain fixed during this second movement, alignment and orientation within the three-dimensional image is facilitated.
- the ultrasound imaging system is configured to enable the second movement and the first movement simultaneously. By this, the positioning of the cursor may be accelerated although an advanced navigation within the three-dimensional image is required.
- the ultrasound imaging system is configured to conduct the second movement automatically. This may be provided in case the first and second movements are conducted sequentially. However, this may also be provided in case the second movement is conducted simultaneously with the first movement.
- the automatic second movement may be conducted in a way that a collision detection takes place that is able to determine a first collision between the cursor and a structure within the volume, starting from the plane in which the first movement is conducted.
- the ultrasound imaging system automatically moves the cursor down into the volume in the third dimension and detects the first point of collision between the cursor and the structure.
- the user may be enabled to activate the automatic second movement via the input device, for example by hitting a corresponding button.
- the user may be enabled to manually correct the location of the point of collision.
- the user may be enabled to leave the automatic second movement activated while conducting the first movement, i.e. altering the first and second coordinates.
- the corresponding third coordinate would then be determined continuously.
- the third coordinate may be shown to the user.
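The automatic second movement with collision detection can be sketched as a simple depth march: starting at the display plane, step into the volume and stop at the first voxel whose intensity suggests a structure. This is an illustrative sketch, not the patent's implementation; the function name `auto_depth` and the intensity threshold are hypothetical.

```python
def auto_depth(volume, x, y, threshold=0.5):
    """Starting from the display plane (z = 0), step the cursor into
    the volume along the third dimension and return the first z at
    which the voxel intensity reaches the threshold, i.e. the first
    collision with a structure; None if no structure is met."""
    for z in range(len(volume)):
        if volume[z][y][x] >= threshold:
            return z
    return None

# Toy volume (volume[z][y][x]) where a structure begins at depth z = 2
# for the in-plane position (x=1, y=0):
vol = [[[0.0, 0.0]], [[0.0, 0.1]], [[0.0, 0.9]], [[0.0, 1.0]]]
z_hit = auto_depth(vol, x=1, y=0)  # -> 2
```

Re-running the search whenever the in-plane (x, y) position changes would give the continuously updated third coordinate described above.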
- the ultrasound imaging system is further configured to provide an indication if the cursor collides with a structure within the volume.
- an indication which may be of any suitable type, such as a visual indication, an audio indication or a tactually sensible indication.
- the user now may locate the cursor in the plane shown on the display during the first movement. Then, the second movement may be conducted without any need to change a perspective of the view shown on the display.
- the second movement can be conducted although it is not visible on the display, since the second movement is purely perpendicular to the shown plane. This further facilitates making inputs to the ultrasound imaging system and making measurements during the observation of a volume.
- the indication is a visual indication displayed on the display.
- a user moving the cursor via the input device and watching the display can easily recognize the visual indication which may itself be displayed on the display.
- no further means are necessary to provide a visual indication.
- the visual indication is a change in the appearance of the cursor or a tag showing up on the display.
- a change of the appearance of the cursor makes recognizing that the cursor collides with a structure more obvious. As the cursor will, of course, be observed by a user properly locating the cursor on the screen, a change in its appearance is instantly recognized.
- a tag may show up on the display.
- the tag may be any symbol or phrase suitable to indicate a collision between the cursor and the structure.
- the tag may be an exclamation mark or the phrase "structure touched" showing up in a part of the display.
- the visual indication may also be a light, in particular a coloured light, lighting up when the cursor collides with the structure.
- the change in the appearance of the cursor causes the cursor to light-up or to disappear.
- a hidden-line mechanism may cause the cursor to disappear into the structure, providing the user with a visual indication that the structure is being touched by the cursor.
- the cursor might also light-up when it is in the structure.
- the visual indication is a change of the appearance of the structure within the volume.
- a change of the appearance of the structure may provide an even more apparent visual indication of the cursor touching the structure.
- a change of the appearance of the structure may be implemented, for example, as a change of the colour of the tissue and the structure, respectively.
- the brightness of the structure may change additionally or alternatively to a change of the colour.
- the structure may also switch into a state of pulsation as soon as the cursor collides with the structure. Such pulsation can, for example, be implemented by a dynamic change of the colour and/or brightness of the structure.
- the ultrasound imaging system further comprises a speaker, and wherein the indication is an audio indication provided via the speaker. Additionally or alternatively to the visual indication, an audio indication can be provided to the user. By this, even when not inspecting the display, the user can conduct the second movement. When the cursor touches the structure, a noise or tone provides an indication that the cursor is properly located.
- the indication is a tactually sensible indication provided via the input device.
- a tactual indication of the cursor touching the structure may be provided.
- the input device may provide a rumble movement when the cursor collides with the structure in the volume. Again, by this, even when not inspecting the display, the user can conduct the second movement. The user will receive a quick and immediate indication as soon as the cursor touches the structure.
- the ultrasound system is further configured to enable inputting a measurement path between the first point and the second point, and wherein the distance is determined along the measurement path.
- the ultrasound system is configured to input the measurement path by identifying at least one further point within the volume and/or by selecting a geometric form to connect the first point and the second point.
- for example, user-defined measurement paths as defined by connecting dots can be applied. Also, geometric standard forms such as ellipses, parts of circles and splines of second or even higher order can be used.
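A measurement path defined by connecting dots reduces to summing segment lengths along the polyline. A minimal sketch, not code from the patent; the name `path_length` is hypothetical:

```python
import math

def path_length(points):
    """Length of a user-defined measurement path given as a sequence
    of connected dots (x, y, z); the distance is summed segment by
    segment rather than taken as the straight line between the two
    end-points."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

# A dog-leg path between the same two end-points is longer than the
# straight line:
straight = math.dist((0, 0, 0), (6, 0, 0))             # 6.0
bent = path_length([(0, 0, 0), (3, 4, 0), (6, 0, 0)])  # 5 + 5 = 10.0
```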
- the system further comprises a beam former configured to control the transducer array to scan the volume along a multitude of scanning lines, and further configured to receive the ultrasound receive signal and to provide an image signal, a signal processor configured to receive the image signal and to provide image data, an image processor configured to receive the image data from the signal processor and to provide display data.
- Fig. 1 shows a schematic illustration of an ultrasound system 10 according to an embodiment, in particular a medical ultrasound three-dimensional imaging system.
- the ultrasound system 10 is applied to inspect a volume of an anatomical site, in particular an anatomical site of a patient 12.
- the ultrasound system 10 comprises an ultrasound probe 14 having at least one transducer array having a multitude of transducer elements for transmitting and/or receiving ultrasound waves.
- the transducer elements each can transmit ultrasound waves in form of at least one transmit impulse of a specific pulse duration, in particular a plurality of subsequent transmit pulses.
- the transducer elements can for example be arranged in a one-dimensional row, for example for providing a two-dimensional image that can be moved or swiveled around an axis mechanically. Further, the transducer elements may be arranged in a two-dimensional array, in particular for providing a multi-planar or three-dimensional image.
- the multitude of two-dimensional images, each along a specific acoustic line or scanning line, in particular scanning receive line, may be obtained in three different ways.
- the user might achieve the multitude of images via manual scanning.
- the ultrasound probe may comprise position-sensing devices that can keep track of a location and orientation of the scan lines or scan planes. However, this is currently not contemplated.
- the transducer may be automatically mechanically scanned within the ultrasound probe. This may be the case if a one-dimensional transducer array is used.
- a phased two-dimensional array of transducers is located within the ultrasound probe and the ultrasound beams are electronically scanned.
- the ultrasound probe may be hand-held by the user of the system, for example medical staff or a doctor.
- the ultrasound probe 14 is applied to the body of the patient 12 so that an image of an anatomical site in the patient 12 is provided.
- the ultrasound system 10 has a controlling unit 16 that controls the provision of a three-dimensional image via the ultrasound system 10.
- the controlling unit 16 controls not only the acquisition of data via the transducer array of the ultrasound probe 14 but also signal and image processing that form the three-dimensional images out of the echoes of the ultrasound beams received by the transducer array of the ultrasound probe 14.
- the ultrasound system 10 further comprises a display 18 for displaying the three-dimensional images to the user.
- an input device 20 is provided that may comprise keys or a keyboard 22 and further inputting devices, for example a track ball 24.
- the input device 20 might be connected to the display 18 or directly to the controlling unit 16.
- Fig. 2 shows a schematic block diagram of the ultrasound system 10.
- the ultrasound system 10 comprises an ultrasound probe (PR) 14, the controlling unit (CU) 16, the display (DI) 18 and the input device (ID) 20.
- the probe 14 comprises a phased two-dimensional transducer array 26.
- the controlling unit (CU) 16 may comprise a central processing unit 28 that may include analog and/or digital electronic circuits, a processor, microprocessor or the like to coordinate the whole image acquisition and provision.
- the central processing unit 28 does not need to be a separate entity or unit within the ultrasound system 10. It can be a part of the controlling unit 16 and generally be hardware or software implemented. The current distinction is made for illustrative purposes only.
- the central processing unit 28 as part of the controlling unit 16 may control a beam former and, by this, what images of the volume 40 are taken and how these images are taken.
- the beam former 30 generates the voltages that drive the transducer array 26 and determines pulse repetition frequencies; it may scan, focus and apodize the transmitted beam and the reception or receive beam(s), and may further amplify, filter and digitize the echo voltage stream returned by the transducer array 26.
- the central processing unit 28 of the controlling unit 16 may determine general scanning strategies. Such general strategies may include a desired volume acquisition rate, a lateral extent of the volume, an elevation extent of the volume, maximum and minimum line densities, and scanning line times.
- the beam former 30 further receives the ultrasound signals from the transducer array 26 and forwards them as image signals.
- the ultrasound system 10 comprises a signal processor 34 that receives the image signals.
- the signal processor 34 is generally provided for analogue-to-digital-converting, digital filtering, for example, band pass filtering, as well as the detection and compression, for example a dynamic range reduction, of the received ultrasound echoes or image signals.
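The detection and compression stages named above can be sketched as follows. This is a crude, hypothetical stand-in, not the patent's signal processor: real systems use proper band-pass filters and Hilbert-transform envelope detection, whereas here rectification and a logarithmic mapping merely illustrate the dynamic range reduction.

```python
import math

def detect_envelope(rf_line):
    """Crude envelope detection: rectify the digitized RF samples."""
    return [abs(s) for s in rf_line]

def log_compress(envelope, dynamic_range_db=60.0):
    """Map the envelope into [0, 1] on a logarithmic (dB) scale,
    clipping everything below the dynamic range to 0."""
    peak = max(envelope) or 1.0
    out = []
    for e in envelope:
        db = 20.0 * math.log10(max(e, 1e-12) / peak)  # <= 0 dB
        out.append(max(0.0, 1.0 + db / dynamic_range_db))
    return out

# The brightest echo maps to 1.0; echoes 60 dB down map to 0.
line = log_compress(detect_envelope([0.0, -0.5, 1.0, 0.001]))
```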
- the signal processor forwards image data.
- the ultrasound system 10 comprises an image processor 36 that converts image data received from the signal processor 34 into display data finally shown on the display 18.
- the image processor 36 receives the image data, preprocesses it and may store it in an image memory. The image data is then further post-processed to provide images most convenient to the user via the display 18.
- the image processor 36 may form the three-dimensional images out of a multitude of two-dimensional images acquired.
- a user interface is generally depicted with reference numeral 38 and comprises the display 18 and the input device 20. It may also comprise further input devices, for example, a trackball, a mouse or further buttons which may even be provided on the ultrasound probe 14 itself. Further, the central processing unit 28 receives all data input by a user via the input device 20 and controls the output to the user via the display 18 and the image processor 36. Hence, the central processing unit 28 may also control the whole user interface 38.
- a particular example for a three-dimensional ultrasound system which may apply the current invention is the CX50 CompactXtreme Ultrasound system sold by the applicant, in particular together with an X7-2t TEE transducer of the applicant or another transducer using the xMATRIX technology of the applicant.
- matrix transducer systems as found on Philips iE33 systems or mechanical 3D/4D transducer technology as found, for example, on the Philips iU22 and HD15 systems may apply the current invention.
- Fig. 3 shows an example of a volume 40 relative to the ultrasound probe 14.
- the exemplary volume 40 depicted in this example is of a sector type, due to the transducer array of the ultrasound probe 14 being arranged as a phased two-dimensional electronically scanned array.
- the size of the volume 40 may be expressed by an elevation angle 42 and a lateral angle 44.
- a depth 46 of the volume 40 may be expressed by a so-called line time in seconds per line, that is, the scanning time spent to scan a specific scanning line.
- the two-dimensional transducer array of the ultrasound probe 14 is operated by the beam former 30 in a way that the volume 40 is scanned along a multitude of scan lines sequentially.
- a single transmit beam might illuminate a multitude, for example four, receive scanning lines along which signals are acquired in parallel. If so, such sets of receive lines are then electronically scanned across the volume 40 sequentially.
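The arithmetic behind the volume acquisition rate follows directly from the line count, the line time and the number of receive lines formed per transmit beam. A sketch with hypothetical numbers (the 64 × 64 line grid and 200 µs line time are illustrative, not from the patent):

```python
def volume_rate(n_lines, line_time_s, parallel_rx=1):
    """Volumes per second when n_lines receive lines are acquired at
    line_time_s seconds per transmit event, with parallel_rx receive
    lines formed per transmit beam."""
    transmits = n_lines / parallel_rx
    return 1.0 / (transmits * line_time_s)

# 64 x 64 = 4096 receive lines at 200 us per transmit:
single = volume_rate(4096, 200e-6)                # ~1.22 volumes/s
quad = volume_rate(4096, 200e-6, parallel_rx=4)   # ~4.88 volumes/s
```

Illuminating four receive lines per transmit, as described above, quarters the number of transmit events and hence quadruples the volume rate.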
- Figs. 4a and 4b show a schematic representation of screen shots of an image 50.
- the image 50 shows a structure 52 within the volume 40 that has been scanned. Further, it is shown how an in-plane measurement of a distance is conducted according to the prior art.
- the figures provide regular screen shots of three-dimensional images 50 of a volume 40 that may be provided on state-of-the-art ultrasound imaging systems.
- a structure 52 is displayed as it was processed out of the data acquired by the transducer array 26 and processed via the signal processor 34 and the image processor 36.
- the structure 52 may be any part of an anatomical site of a patient, such as a vessel, a heart or, as depicted in the following figures, different ripples in a corrugated curved surface.
- a distance 58 between the points 54 and 56 equals an actual distance between the two points on the structure 52 that the user had marked when viewing the image 50 in Fig. 4a .
- a linear measurement path 60 between the first point 54 and the second point 56 will result in the actual distance between the two points 54, 56 being determined as the distance 58 shown to a user.
- Figs. 5a and 5b show the case in which the two points 54 and 56 do not lie within a same viewing plane as shown in the image 50 in Fig. 5a. If a user marks the two points 54 and 56 in the image 50 in Fig. 5a, the distance between the two points 54, 56 determined by the ultrasound imaging system 10 will be shorter than the actual distance between the two points 54, 56 on the structure 52. This means that the user has not marked the points that he or she would have liked to mark when viewing the image 50 in Fig. 5a.
- Fig. 5b shows the structure of Fig. 5a rotated by 90°.
- a plane 62 corresponds to the plane shown to the user when viewing the image 50 in Fig. 5a .
- the first point 54 lies within the plane 62 as the corresponding part of the structure 52 also lies within the plane 62.
- a second point 56 the user has selected when viewing the image 50 in Fig. 5a does not correspond to a true second point 64 on the structure 52 that the user would have liked to select when viewing the image 50 in Fig. 5a.
- the distance determined between the two points 54 and 56 will be shorter than the actual distance.
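The fore-shortening effect can be made concrete with numbers: the in-plane measurement only sees the projection of a point onto the viewing plane, so whenever the points differ in depth, the measured distance falls short of the true one. An illustrative sketch (the coordinates are hypothetical):

```python
import math

def in_plane_distance(p1, p2):
    """Distance between the projections of two points onto the
    viewing plane (x, y), ignoring the depth coordinate z."""
    return math.dist(p1[:2], p2[:2])

p1 = (0.0, 0.0, 0.0)   # first point, lying in the viewing plane
p2 = (3.0, 0.0, 4.0)   # true second point, 4 units deep in the volume

foreshortened = in_plane_distance(p1, p2)  # 3.0 (what Fig. 5a yields)
true_length = math.dist(p1, p2)            # 5.0 (what should be shown)
```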
- Figs. 6a and 6b show an in-plane measurement of a distance 58 between the first point 54 and the second point 56 according to an embodiment.
- the user is shown an image 50 of the volume 40 as depicted in Fig. 6a .
- the image 50 is within the plane 62.
- Within the plane 62, the user may identify a first coordinate 66, e.g. in the X-dimension, and a second coordinate 68, e.g. in the Y-dimension. This may be conducted by moving a cursor 53 through the image and, hence, through the plane 62.
- The user may move the cursor 53 over the structure 52 onto a point of the structure 52 that should form one of the endpoints of the distance 58 to be measured.
- The user would move the cursor in the plane 62 and in the view shown in Fig. 6a to, for example, the location of the first point 54. Then, the user may confirm the location of the cursor 53 by hitting a corresponding button or the like. Now, without changing the view shown in Fig. 6a , the user may be given an in-depth control to place the cursor 53 properly in the third dimension 70. As a movement of the cursor 53 in the third dimension 70 will not be recognizable for the user when viewing the image 50 as shown in Fig. 6a , a visual indicator 72 is provided on the display 18 and in the image 50 to inform the user that the cursor 53 has touched the structure 52. The exclamation mark shown as the visual indicator 72 in Fig. 6a is of merely exemplary nature. Other symbols or phrases may be used that are only visible if the cursor actually touches the structure 52. Alternatively or additionally, it may be provided that the cursor 53, according to a hidden-line mechanism, disappears when it enters the structure 52. Further, it may be provided that the cursor lights up and/or that the structure 52 lights up when the cursor and the structure 52 collide.
- Moving the cursor 53 within the plane 62 is called a "first movement". Moving the cursor 53 perpendicularly to the plane 62 is called a "second movement". Although the first movement and the second movement have been described as being conducted subsequently, it has to be emphasized that this is only one possible embodiment of conducting the first and second movements. It may also be provided that the first and second movements are conducted simultaneously.
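The first and second movements can be modelled as offsets along an orthonormal basis of the viewing plane: the first movement changes only the two in-plane coordinates, the second movement changes only the offset along the plane normal. A sketch with an axis-aligned plane for simplicity (the function and basis choice are illustrative assumptions, not the patent's implementation):

```python
def cursor_position(origin, u_axis, v_axis, normal, in_plane, depth):
    """Compose the 3D cursor position from the first movement
    (in-plane coordinates `in_plane` = (u, v)) and the second
    movement (`depth` along the plane normal)."""
    u, v = in_plane
    return tuple(
        o + u * ua + v * va + depth * n
        for o, ua, va, n in zip(origin, u_axis, v_axis, normal)
    )

# Viewing plane spanned by the x- and y-axes; the third dimension
# is the z-axis pointing into the volume.
origin = (0.0, 0.0, 0.0)
u_axis, v_axis = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)
normal = (0.0, 0.0, 1.0)

# First movement only: the cursor stays in the plane (z = 0).
p = cursor_position(origin, u_axis, v_axis, normal, (12.0, 7.0), 0.0)
# Second movement: dive 5 units into the volume; x and y stay fixed.
q = cursor_position(origin, u_axis, v_axis, normal, (12.0, 7.0), 5.0)
print(p, q)
```

Because the two movements act on disjoint components, they can equally be applied one after the other or simultaneously, matching both embodiments described above.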
- Figs. 7a and 7b show how this "in-depth control" of the cursor 53 may avoid the fore-shortening effect.
- The cursor 53 may be of any suitable form. It may have the form of an arrow, a crosshair or the like to properly identify the parts of the structure 52 that the user may want to select.
- The user may now select the first point 54 in the view shown in Fig. 7a by first moving the cursor 53 in the plane 62 to determine the first and second coordinates 66, 68 and then performing a second movement of the cursor 53 in the third dimension 70 into the depth of the volume 40 until the cursor 53 and the structure 52 collide, as indicated by the visual indicator 72.
- the second point 56 may then be selected the same way.
- a distance 58 between the two points 54, 56 - wherein the second point 56 is not lying within the plane 62 - can be properly determined.
- the first point 54 and the second point 56 can be set touching the structure 52. Further, this all can be done without changing the view as depicted in Fig. 7a . Then, the distance 58 between the first point 54 and the second point 56 can be properly determined.
- the display 18 or any other part of the ultrasound imaging system 10 may comprise a speaker 73 that may be configured to make a noise or tone in case the cursor 53 collides with the structure 52 to provide an audio indicator 74.
- A tactually sensible indicator 76 may be provided, for example by including a rumble mechanism 75 in the input device 20. By this, the user may feel when the cursor collides with a structure in the volume when using the input device 20 to move the cursor 53 around the volume.
- the ultrasound imaging system 10 may be configured so as to provide the user with a possibility to measure the distance between the first point 54 and the second point 56 not only as the shortest distance along a straight line connecting both points 54, 56, but also along any other measurement path 78.
- the user may set up further points 79 in the volume by conducting the first movement and the second movement as explained above or may apply standard geometrical forms, for example an ellipse, to connect the first point 54 and the second point 56.
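A measurement path 78 defined by further points 79 can be evaluated as the sum of the straight segments between consecutive points. A brief sketch under that assumption (the coordinates are hypothetical):

```python
import math

def segment_length(p, q):
    """Euclidean length of one straight segment between 3D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def path_length(points):
    """Length along a user-defined measurement path: the first and
    last entries are the two endpoints, any points in between are
    further points set by the user."""
    return sum(segment_length(p, q) for p, q in zip(points, points[1:]))

# Endpoint, one intermediate point, endpoint.
path = [(0.0, 0.0, 0.0), (3.0, 4.0, 0.0), (3.0, 4.0, 12.0)]
print(path_length(path))  # 5.0 + 12.0 = 17.0
```

The path length (17.0) is always at least the straight point-to-point distance (here 13.0), which is the degenerate case of a path with no further points.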
- Fig. 8 shows a schematic block diagram of an embodiment of a method 80.
- the method starts at a step 82.
- In a step S1, a three-dimensional ultrasound image 50 is shown on a display 18 together with the cursor 53 for identifying the first point 54 and the second point 56.
- Such an image may be one of the images as shown in Figs. 6a and 7a .
- In a step S2, the cursor is moved parallel to a plane provided on the display 18 based on input data by a user to identify a first coordinate 66 and a second coordinate 68 of at least one of the first and second points 54, 56.
- In a step S3, the cursor 53 is moved perpendicularly to the plane 62 provided on the display based on input data by the user to identify a third coordinate 70 of the respective point.
- In a step S4, it is checked whether the cursor 53 collides with the structure 52. If not, no changes to the display occur and the method runs in a loop as indicated by line 86. If so, an indication is provided on the display that the cursor 53 collides with the structure 52. In addition to the visual indication 72, an audio indication 74 or a tactual indication 76 may be provided as explained above. The respective indication is given in a step S5. Now, the third coordinate 70 may be set. In case only one point has been defined so far, the method returns to before step S2, as indicated by arrow 87, to also define the coordinates of the respective second point.
- In a step S6, the distance 58 between the two points 54 and 56 is determined. The method then ends in a step 90.
- Steps S2 and S3 do not necessarily have to be conducted subsequently. It may also be possible that the first movement within the plane and the second movement perpendicularly to the plane may be conducted in parallel directly after step S1 as indicated by arrow 88 in dashed lines. The user may then simultaneously move the cursor 53 to define all three coordinates 66, 68, 70 at the same time.
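The flow of steps S1 to S6 can be sketched as a small loop; the volume, the thresholded collision test and the scripted depth inputs are stand-ins for illustration, not the patent's implementation:

```python
import math

def collides(volume, pos, threshold=0.5):
    """S4 (stand-in): the cursor collides with the structure when the
    voxel under the cursor exceeds an intensity threshold."""
    x, y, z = pos
    return volume[z][y][x] >= threshold

def place_point(volume, in_plane, depths):
    """S2: the in-plane coordinates are fixed first; S3/S4: the cursor
    then moves into the depth until a collision is found (S5)."""
    x, y = in_plane
    for z in depths:                      # loop 86: no collision yet
        if collides(volume, (x, y, z)):
            return (x, y, z)              # S5: indication given, point set
    raise ValueError("cursor never touched the structure")

# Toy 4x4x4 volume, indexed [z][y][x]: a bright surface at depth z = 2.
volume = [[[1.0 if z == 2 else 0.0 for x in range(4)]
           for y in range(4)] for z in range(4)]

first_point = place_point(volume, (0, 0), range(4))
second_point = place_point(volume, (3, 3), range(4))
dist = math.sqrt(sum((a - b) ** 2
                     for a, b in zip(first_point, second_point)))  # S6
print(first_point, second_point)
```

Both endpoints land on the structure's surface, so the distance computed in S6 is the true three-dimensional distance rather than a fore-shortened one.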
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Public Health (AREA)
- Medical Informatics (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Pathology (AREA)
- Surgery (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Biophysics (AREA)
- Veterinary Medicine (AREA)
- Radiology & Medical Imaging (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Acoustics & Sound (AREA)
- Computer Networks & Wireless Communication (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physiology (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Gynecology & Obstetrics (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Ultrasonic Diagnosis Equipment (AREA)
Description
- The present invention relates to an ultrasound imaging system and method for determining a distance between a first point and a second point in a three-dimensional ultrasound image of a volume, for example an anatomical site of a patient. The present invention further relates to a computer program for implementing such method.
- In three-dimensional ultrasound imaging, or volume imaging, the acquisition of a three-dimensional image is accomplished by conducting many two-dimensional scans that slice through the volume of interest. Hence, a multitude of two-dimensional images is acquired that lie next to another. By proper image processing, a three-dimensional image of the volume of interest can be built out of the multitude of two-dimensional images. The three-dimensional information acquired from the multitude of two-dimensional images is displayed in proper form on a display for the user of the ultrasound system.
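The reconstruction of the volume from a stack of adjacent two-dimensional scans can be pictured as stacking equally sized 2D images along a third axis (plain nested lists here; a real system would also resample for the slice geometry):

```python
def stack_slices(slices):
    """Build a 3D volume (indexed [slice][row][col]) from a list of
    equally sized 2D images lying next to one another."""
    rows, cols = len(slices[0]), len(slices[0][0])
    for s in slices:
        if len(s) != rows or any(len(r) != cols for r in s):
            raise ValueError("all 2D scans must share one size")
    # Copy the rows so later edits to one slice do not alias another.
    return [[row[:] for row in s] for s in slices]

# Three 2x2 "scans" become a 3x2x2 volume.
scans = [[[i, i], [i, i]] for i in range(3)]
vol = stack_slices(scans)
print(len(vol), len(vol[0]), len(vol[0][0]))  # 3 2 2
```

The resulting three-index array is what the display stage renders and what the measurement cursor later navigates in all three dimensions.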
- Further, in three-dimensional ultrasound imaging, there is often a need to make measurements of anatomical structures within the inspected volume. For the convenience of users, a measurement capability is available on three-dimensional ultrasound imaging systems, where the user can conduct a measurement directly on the rendered image of the three-dimensional volume containing those anatomic structures. This so-called "on-glass" measurement method is very easy and convenient for users. However, this technique is susceptible to a so-called "fore-shortening effect". If the structures being measured are not in the same plane as the plane of the projected image of the three-dimensional volume, the distance measured between the structures as seen on the screen will be less than the true distance between the structures in the actual three-dimensional space.
- Therefore, ultrasound systems and methods of performing measurement on three-dimensional ultrasound images have been contemplated. The reference
US 2011/0066031 A1 discloses embodiments for providing an ultrasound system for performing a three-dimensional measurement and comprising an ultrasound data acquisition unit configured to transmit ultrasound signals to a target object and receive ultrasound echo signals reflected from the target object to acquire ultrasound data. Further, it comprises a user interface configured to receive input data from a user and a processor configured to form a three-dimensional ultrasound image based on volume data derived from the ultrasound data, establish two or more points on the 3D-ultrasound image based on the input data, generate connection data among the established two or more points on the 3D-ultrasound image, and measure distances among the established two or more points based on the input data and the connection data. - Further, document
JP 3 325224 B. - There is a need to further improve such three-dimensional ultrasound systems.
- It is an object of the present invention to provide an improved ultrasound system and method. It is a further object of the present invention to provide a computer program for implementing such method.
- In a first aspect of the present invention an ultrasound imaging system for providing a three-dimensional image of a volume and for determining a distance between a first point and a second point in the three-dimensional image is presented. The ultrasound imaging system comprises a transducer array configured to provide an ultrasound receive signal, a controlling unit configured to receive the ultrasound receive signal and to provide display data representing the three-dimensional image, wherein the controlling unit is further configured to determine a distance between the first point and the second point identified in the three-dimensional image, a display configured to receive the display data and to display the three-dimensional image of the volume and a cursor for identifying the first point and the second point, and an input device configured to provide input data to the controlling unit, wherein the input data includes a movement of the cursor, and wherein the ultrasound imaging system is configured to enable a user to perform a first movement of the cursor parallel to a plane displayed to the user when viewing the three-dimensional image displayed on the display based on input data, to identify a first coordinate and a second coordinate of at least one of the first and second points, and without changing a perspective of the view displayed on the display, enable the user to perform a second movement of the cursor perpendicular to the plane that is displayed and into the three dimensional image, based on input data, to identify a third coordinate of the respective point, and wherein, during the second movement of the cursor perpendicular to the plane displayed on the display, the ultrasound imaging system is further configured to provide an indication on the display when the cursor touches a structure within the volume so as to inform the user that the cursor has touched the structure and so that the user may place the cursor properly on the structure in 
the direction perpendicular to the plane displayed on the display.
- In a further aspect of the present invention a method for determining a distance between a first point and a second point in a three-dimensional ultrasound image of a volume is presented. The method comprises the steps of displaying the three-dimensional ultrasound image on a display together with a cursor for identifying the first point and the second point, moving the cursor parallel to a plane displayed to a user when viewing the three-dimensional image provided on the display based on input data to identify a first coordinate and a second coordinate of at least one of the first and second points, moving the cursor, without changing a perspective of the view displayed on the display, perpendicularly to the plane displayed on the display and into the three-dimensional image, based on input data to identify a third coordinate of the respective point, providing, while moving the cursor perpendicularly to the plane that is displayed, an indication on the display when the cursor touches a structure displayed within the volume so as to inform the user that the cursor has touched the structure and so that the user may place the cursor properly on the structure in the direction perpendicular to the plane displayed on the display, and when the user has placed the cursor on the structure, determining the distance between the first point and the second point.
- In a further aspect of the present invention a computer program is presented comprising program code means for causing a computer to carry out the steps of such method when said computer program is carried out on the computer.
- The basic idea of the invention is to overcome the "fore-shortening effect" by providing the user with the possibility to place measurement cursors directly into the volume to touch the structures that are to be measured in the three-dimensional image.
- By this, the problem that the user may only place the cursor in a plane shown on the display can be overcome. Further, there is no need for the user to rotate the three-dimensional volume extensively to find a proper view in which it is possible to locate the cursor at a proper position touching a structure in the three-dimensional volume. Instead, the user may position the cursor in a plane of the three-dimensional volume shown on the display first and then may "dive" the cursor into the volume until it touches the structure.
- A user is provided with a cursor end-point depth control, for example for the z-dimension, in addition to the trackball for placing the end-point in the dimensions of the screen, for example the x and y dimensions. After placing the first point or the second point in the plane, e.g. with a trackball, over the structure to be measured, the user then uses the endpoint depth control to move the cursor down into the volume. After both points are placed in the same manner, the ultrasound imaging system calculates the true three-dimensional distance between the points. Then, the ultrasound imaging system may display the distance as a "length" measurement.
- Due to this, it can be ensured that the first and second points are touching the structure and are not floating somewhere within the volume. Hence, the "fore-shortening effect" cannot occur.
- Preferred embodiments of the invention are defined in the dependent claims. It shall be understood that the claimed method has similar and/or identical preferred embodiments as the claimed device and as defined in the dependent claims.
- In one example, the ultrasound imaging system is configured to enable the second movement after the first movement has been completed. By this, a user may first move the cursor within the plane shown on the display. If a proper position has been reached, the user may then fix this position and, hence, the first and second coordinates of the respective first or second point are determined. Subsequently, the user can move the cursor perpendicularly to the plane to place the cursor properly touching the structure to be measured. By this, the third coordinate can be determined. As the first and second coordinates may remain fixed during this second movement, alignment and orientation within the three-dimensional image is facilitated.
- In a further example, the ultrasound imaging system is configured to enable the second movement and the first movement simultaneously. By this, the positioning of the cursor may be accelerated, although more advanced navigation within the three-dimensional image is required.
- In a further example, the ultrasound imaging system is configured to conduct the second movement automatically. This may be provided in case the first and second movements are conducted subsequently. However, this may also be provided in case the second movement is conducted simultaneously with the first movement. The automatic second movement may be conducted in a way that a collision detection takes place that is able to determine a first collision between the cursor and a structure within the volume, starting from the plane in which the first movement is conducted. In other words, the ultrasound imaging system automatically moves the cursor down into the volume in the third dimension and detects the first point of collision between the cursor and the structure. In case the second movement is conducted subsequently to the first movement, the user may be enabled to activate the automatic second movement via the input device, for example by hitting a corresponding button. Further, the user may be enabled to manually correct the location of the point of collision. In case the first and the second movement are conducted simultaneously, the user may be enabled to leave the automatic second movement activated while conducting the first movement, i.e. altering the first and second coordinates. The corresponding third coordinate would then be determined continuously. The third coordinate may be shown to the user. By this, there may be provided the advantage that the user is able to trace a surface of the structure while conducting the first movement. This may facilitate getting an impression of the three-dimensional shape of the structure shown on the display.
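The automatic second movement amounts to searching, from the viewing plane inwards, for the first depth at which the cursor and the structure collide; when left active during the first movement, the continuously determined third coordinate traces the surface profile. A sketch with a thresholded toy volume (an illustrative stand-in, not the patent's implementation):

```python
def first_collision_depth(volume, x, y, threshold=0.5):
    """Automatic second movement: starting from the viewing plane
    (z = 0), move into the volume and return the depth of the first
    collision between cursor and structure (None if never found)."""
    for z in range(len(volume)):
        if volume[z][y][x] >= threshold:
            return z
    return None

# Toy 4x4x4 volume with a tilted surface: the depth of the structure
# increases with x, so sweeping in x follows the surface profile.
volume = [[[1.0 if z >= x else 0.0 for x in range(4)]
           for y in range(4)] for z in range(4)]

# While the user alters the in-plane coordinates (x here), the third
# coordinate is determined continuously:
profile = [first_collision_depth(volume, x, 0) for x in range(4)]
print(profile)
```

The resulting depth profile rises with x, illustrating how an activated automatic second movement lets the cursor trace the structure's surface during the first movement.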
- In a further example, the ultrasound imaging system is further configured to provide an indication if the cursor collides with a structure within the volume. By this, locating the first and second points within the volume is even further facilitated. The indication may be of any suitable type, such as a visual indication, an audio indication or a tactually sensible indication. The user may locate the cursor in the plane shown on the display during the first movement. Then, the second movement may be conducted without any need to change a perspective of the view shown on the display. As an indication is given in case the cursor touches or collides with a structure, for example an anatomical structure within the volume, the second movement can be conducted although it is not visible on the display, since the second movement is merely perpendicular to the shown plane. This even more facilitates making inputs to the ultrasound imaging system and making measurements during the observation of a volume.
- In a further example, the indication is a visual indication displayed on the display. By this, a user moving the cursor via the input device and watching the display can easily recognize the visual indication, which may itself be displayed on the display. As a display is already present in an ultrasound imaging system, no further means are necessary to provide a visual indication.
- In a further example, the visual indication is a change in the appearance of the cursor or a tag showing up on the display. A change of the appearance of the cursor may make recognizing that the cursor collides with a structure more obvious. As the cursor will, of course, be observed by a user properly locating the cursor on the screen, a change in its appearance is instantly recognized. As an alternative, a tag may show up on the display. The tag may be any symbol or phrase suitable to indicate a collision between the cursor and the structure. For example, the tag may be an exclamation mark or the phrase "structure touched" showing up in a part of the display. Last, the visual indication may also be a light, in particular a coloured light, lighting up when the cursor collides with the structure.
- In a further example, the change in the appearance of the cursor causes the cursor to light-up or to disappear. In particular, when the cursor reaches the structure, a hidden-line mechanism may cause the cursor to disappear into the structure, providing the user with a visual indication that the structure is being touched by the cursor. As an alternative, the cursor might also light-up when it is in the structure. By this, well recognizable options for a change of the appearance of the cursor are provided.
- In a further example, the visual indication is a change of the appearance of the structure within the volume. As the structure is usually significantly larger than the cursor, a change of the appearance of the structure may provide an even more apparent visual indication of the cursor touching the structure. A change of the appearance of the structure may be implemented, for example, as a change of the colour of the tissue and the structure, respectively. In further examples, the brightness of the structure may change additionally or alternatively to a change of the colour. Furthermore, the structure may also switch into a state of pulsation as soon as the cursor collides with the structure. Such pulsation can, for example, be implemented by a dynamic change of the colour and/or brightness of the structure.
- In a further example, the ultrasound imaging system further comprises a speaker, and wherein the indication is an audio indication provided via the speaker. Additionally or alternatively to the visual indication, an audio indication can be provided to the user. By this, even when not inspecting the display, the user can conduct the second movement. When the cursor touches the structure, a noise or tone provides an indication that the cursor is properly located.
- In a further example, the indication is a tactually sensible indication provided via the input device. Additionally or alternatively to each of the visual and audio indications, a tactual indication of the cursor touching the structure may be provided. For example, the input device may provide a rumble movement when the cursor collides with the structure in the volume. Again, by this, even when not inspecting the display, the user can conduct the second movement. The user will receive a quick and immediate indication as soon as the cursor touches the structure.
- In a further example, the ultrasound system is further configured to enable inputting a measurement path between the first point and the second point, and wherein the distance is determined along the measurement path. Hence, in addition to simple point-to-point measurement, other length measurements along different measurement paths can also be accomplished.
- In a further example, the ultrasound system is configured to input the measurement path by identifying at least one further point within the volume and/or by selecting a geometric form to connect the first point and the second point. For example, user defined measurement paths as defined by connecting dots can be applied. Also geometrical standard forms such as ellipses, parts of circles and splines of second or even higher order can be used.
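A standard geometric form can be measured by sampling it densely and summing the chord lengths. As an example, a half-ellipse connecting the first and second points (the parametrisation, centred between the endpoints, and the helper are illustrative assumptions):

```python
import math

def half_ellipse_length(p_start, p_end, height, samples=10000):
    """Approximate the length of a half-ellipse in a plane, connecting
    two scalar positions on one axis with semi-minor axis `height`."""
    a = abs(p_end - p_start) / 2.0       # semi-major axis
    pts = [(a * math.cos(t), height * math.sin(t))
           for t in (math.pi * i / samples for i in range(samples + 1))]
    return sum(math.dist(p, q) for p, q in zip(pts, pts[1:]))

# Degenerate checks: height 0 collapses to the straight segment;
# height equal to the semi-major axis gives a semicircle (pi * r).
assert abs(half_ellipse_length(0.0, 10.0, 0.0) - 10.0) < 1e-6
assert abs(half_ellipse_length(0.0, 10.0, 5.0) - math.pi * 5.0) < 1e-4
print("half-ellipse length checks passed")
```

The same chord-summing approach extends to parts of circles and splines: only the sampled parametrisation changes, the length computation stays identical.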
- In a further example, the system further comprises a beam former configured to control the transducer array to scan the volume along a multitude of scanning lines, and further configured to receive the ultrasound receive signal and to provide an image signal, a signal processor configured to receive the image signal and to provide image data, an image processor configured to receive the image data from the signal processor and to provide display data. By this, a proper signal processing and control scheme to capture and display three-dimensional images of the volume can be provided.
- These and other aspects of the invention will be apparent from and elucidated with reference to the embodiment(s) described hereinafter. In the following drawings
- Fig. 1 shows a schematic illustration of an ultrasound imaging system according to an embodiment;
- Fig. 2 shows a schematic block diagram of an ultrasound system according to a refinement of the ultrasound system in Fig. 1 ;
- Fig. 3 shows a schematic representation of an exemplary volume in relation to an ultrasonic probe;
- Fig. 4a and Fig. 4b show an in-plane measurement of a distance between two points according to the prior art;
- Fig. 5a and Fig. 5b show an out-of-plane measurement of a distance between two points and the occurrence of the "fore-shortening effect";
- Fig. 6a and Fig. 6b show an in-plane measurement of a distance between two points according to an embodiment;
- Fig. 7a and Fig. 7b show an out-of-plane measurement of a distance between two points according to an embodiment avoiding the "fore-shortening effect"; and
- Fig. 8 shows a schematic flow diagram of a method according to an embodiment. -
Fig. 1 shows a schematic illustration of an ultrasound system 10 according to an embodiment, in particular a medical ultrasound three-dimensional imaging system. The ultrasound system 10 is applied to inspect a volume of an anatomical site, in particular an anatomical site of a patient 12. The ultrasound system 10 comprises an ultrasound probe 14 having at least one transducer array having a multitude of transducer elements for transmitting and/or receiving ultrasound waves. In one example, the transducer elements each can transmit ultrasound waves in the form of at least one transmit impulse of a specific pulse duration, in particular a plurality of subsequent transmit pulses. The transducer elements can for example be arranged in a one-dimensional row, for example for providing a two-dimensional image that can be moved or swiveled around an axis mechanically. Further, the transducer elements may be arranged in a two-dimensional array, in particular for providing a multi-planar or three-dimensional image. - In general, the multitude of two-dimensional images, each along a specific acoustic line or scanning line, in particular scanning receive line, may be obtained in three different ways. First, the user might achieve the multitude of images via manual scanning. In this case, the ultrasound probe may comprise position-sensing devices that can keep track of the location and orientation of the scan lines or scan planes. However, this is currently not contemplated. Second, the transducer may be automatically mechanically scanned within the ultrasound probe. This may be the case if a one-dimensional transducer array is used. Third, and preferably, a phased two-dimensional array of transducers is located within the ultrasound probe and the ultrasound beams are electronically scanned. The ultrasound probe may be hand-held by the user of the system, for example medical staff or a doctor. The
ultrasound probe 14 is applied to the body of the patient 12 so that an image of an anatomical site in the patient 12 is provided. - Further, the
ultrasound system 10 has a controlling unit 16 that controls the provision of a three-dimensional image via the ultrasound system 10. As will be explained in further detail below, the controlling unit 16 controls not only the acquisition of data via the transducer array of the ultrasound probe 14 but also signal and image processing that form the three-dimensional images out of the echoes of the ultrasound beams received by the transducer array of the ultrasound probe 14. - The
ultrasound system 10 further comprises a display 18 for displaying the three-dimensional images to the user. Further, an input device 20 is provided that may comprise keys or a keyboard 22 and further inputting devices, for example a track ball 24. The input device 20 might be connected to the display 18 or directly to the controlling unit 16. -
Fig. 2 shows a schematic block diagram of the ultrasound system 10. As already laid out above, the ultrasound system 10 comprises an ultrasound probe (PR) 14, the controlling unit (CU) 16, the display (DI) 18 and the input device (ID) 20. As further laid out above, the probe 14 comprises a phased two-dimensional transducer array 26. In general, the controlling unit (CU) 16 may comprise a central processing unit 28 that may include analog and/or digital electronic circuits, a processor, microprocessor or the like to coordinate the whole image acquisition and provision. However, it has to be understood that the central processing unit 28 does not need to be a separate entity or unit within the ultrasound system 10. It can be a part of the controlling unit 16 and generally be hardware or software implemented. The current distinction is made for illustrative purposes only. - The
central processing unit 28 as part of the controlling unit 16 may control a beam former 30 and, by this, what images of the volume 40 are taken and how these images are taken. The beam former 30 generates the voltages that drive the transducer array 26, determines pulse repetition frequencies, and it may scan, focus and apodize the transmitted beam and the reception or receive beam(s) and may further amplify, filter and digitize the echo voltage stream returned by the transducer array 26. Further, the central processing unit 28 of the controlling unit 16 may determine general scanning strategies. Such general strategies may include a desired volume acquisition rate, a lateral extent of the volume, an elevation extent of the volume, maximum and minimum line densities, scanning line times and the line density. - The beam former 30 further receives the ultrasound signals from the
transducer array 26 and forwards them as image signals. - Further, the
ultrasound system 10 comprises a signal processor 34 that receives the image signals. The signal processor 34 is generally provided for analogue-to-digital-converting, digital filtering, for example, band pass filtering, as well as the detection and compression, for example a dynamic range reduction, of the received ultrasound echoes or image signals. The signal processor forwards image data. - Further, the
ultrasound system 10 comprises an image processor 36 that converts image data received from the signal processor 34 into display data finally shown on the display 18. In particular, the image processor 36 receives the image data, preprocesses the image data and may store it in an image memory. This image data is then further post-processed to provide images most convenient to the user via the display 18. In the current case, in particular, the image processor 36 may form the three-dimensional images out of a multitude of two-dimensional images acquired. - A user interface is generally depicted with
reference numeral 38 and comprises the display 18 and the input device 20. It may also comprise further input devices, for example, a trackball, a mouse or further buttons which may even be provided on the ultrasound probe 14 itself. Further, the central processing unit 28 receives all data input by a user via the input device 20 and controls the output to the user via the display 18 and the image processor 36. Hence, the central processing unit 28 may also control the whole user interface 38. - A particular example for a three-dimensional ultrasound system which may apply the current invention is the CX50 CompactXtreme Ultrasound system sold by the applicant, in particular together with a X7-2t TEE transducer of the applicant or another transducer using the xMATRIX technology of the applicant. In general, matrix transducer systems as found on Philips iE33 systems or mechanical 3D/4D transducer technology as found, for example, on the Philips iU22 and HD15 systems may apply the current invention.
-
Fig. 3 shows an example of a volume 40 relative to the ultrasound probe 14. The exemplary volume 40 depicted in this example is of a sector type, because the transducer array of the ultrasound probe 14 is arranged as a phased, electronically scanned two-dimensional array. Hence, the size of the volume 40 may be expressed by an elevation angle 42 and a lateral angle 44. A depth 46 of the volume 40 may be expressed by a so-called line time in seconds per line, that is, the scanning time spent to scan a specific scanning line. During image acquisition, the two-dimensional transducer array of the ultrasound probe 14 is operated by the beam former 30 in such a way that the volume 40 is scanned along a multitude of scan lines sequentially. However, in multi-line receive processing, a single transmit beam might illuminate a multitude, for example four, of receive scanning lines along which signals are acquired in parallel. If so, such sets of receive lines are then electronically scanned across the volume 40 sequentially. -
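A sample of such a sector volume is thus addressed by the elevation angle 42, the lateral angle 44 and the depth 46. A sketch of converting one sample to Cartesian coordinates, under an assumed, purely illustrative angle convention (x lateral, y elevational, z along the beam axis into the body) that is not prescribed by this document:

```python
import math

def sector_to_cartesian(elevation_deg, lateral_deg, depth_mm):
    """Convert a sector-scan sample (elevation angle 42, lateral angle 44,
    depth 46 along the beam) to Cartesian coordinates in mm.
    The axis convention is a hypothetical choice for illustration."""
    el = math.radians(elevation_deg)
    la = math.radians(lateral_deg)
    x = depth_mm * math.sin(la) * math.cos(el)
    y = depth_mm * math.sin(el)
    z = depth_mm * math.cos(la) * math.cos(el)
    return (x, y, z)
```

A sample on the central beam axis (both angles zero) maps straight down the z axis.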
Figs. 4a and 4b show schematic representations of screen shots of an image 50. The image 50 shows a structure 52 within the volume 40 that has been scanned. Further, it is shown how an in-plane measurement of a distance is conducted according to the prior art. The figures provide regular screen shots of three-dimensional images 50 of a volume 40 as may be provided by state-of-the-art ultrasound imaging systems. - In the three-dimensional image 50, a
structure 52 is displayed as it was processed out of the data acquired by the transducer array 26 and processed via the signal processor 34 and the image processor 36. The structure 52 may, for example, be any part of an anatomical site of a patient, such as a vessel, a heart or, as depicted in the following figures, different ripples in a corrugated curved surface. - In case a user would like to measure a distance when viewing the
image 50 in Fig. 4a, the user might select a first point 54 and a second point 56. As is derivable from Fig. 4b, a distance 58 between the points 54, 56 lies on the structure 52 that the user had marked when viewing the image 50 in Fig. 4a. Hence, a linear measurement path 60 between the first point 54 and the second point 56 will result in the actual distance between the two points 54, 56 being the distance 58 shown to the user. - However,
Figs. 5a and 5b show the case in which the two points 54, 56 do not both lie within the plane shown to the user when viewing the image 50 in Fig. 5a. If a user marks the two points 54, 56 when viewing the image 50 in Fig. 5a, the distance between the two points 54, 56 determined by the ultrasound imaging system 10 will be shorter than the actual distance between the two points on the structure 52. This means that the user has not marked the points that the user would have liked to mark when viewing the image 50 in Fig. 5a. - This is clearly derivable from
Fig. 5b. Fig. 5b shows the structure of Fig. 5a rotated by 90°. A plane 62 corresponds to the plane shown to the user when viewing the image 50 in Fig. 5a. As is derivable from Fig. 5b, the first point 54 lies within the plane 62, as the corresponding part of the structure 52 also lies within the plane 62. However, as the structure 52 extends through the volume 40, the second point 56 the user has selected when viewing the image 50 in Fig. 5a does not correspond to the true second point 64 on the structure 52 that the user would have liked to select when viewing the image 50 in Fig. 5a. Hence, as only a distance along the measurement path 60 and within the plane 62 is determined, the distance determined between the two points 54, 56 by the ultrasound imaging system 10 falls short of the actual distance. This is called the "fore-shortening effect". -
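The fore-shortening effect is purely geometric: projecting the second point into the display plane drops its depth component, so the reported distance can only shrink. A small numeric illustration with made-up coordinates (millimetres, display plane at z = 0):

```python
import math

def distance(p, q):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

first_point     = (0.0, 0.0, 0.0)    # lies in the display plane
in_plane_second = (40.0, 0.0, 0.0)   # what a plain in-plane click selects
true_second     = (40.0, 0.0, 30.0)  # the true point on the structure, 30 mm deep

print(distance(first_point, in_plane_second))  # 40.0 (fore-shortened)
print(distance(first_point, true_second))      # 50.0 (actual distance)
```

Here the in-plane measurement under-reports the true distance by 10 mm.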
Figs. 6a and 6b show an in-plane measurement of a distance 58 between the first point 54 and the second point 56 according to an embodiment. The user is shown an image 50 of the volume 40 as depicted in Fig. 6a. The image 50 is within the plane 62. Within the plane 62, a first coordinate 66 (e.g. the X-dimension) and a second coordinate 68 (e.g. the Y-dimension) of each of the points 54, 56 may be identified by moving a cursor 53 through the image and, hence, through the plane 62. The user may move the cursor 53 over the structure 52 to a point of the structure 52 that should form one of the endpoints of a distance 58 to be measured. Thus, the user would move the cursor, in the plane 62 and in the view as shown in Fig. 6a, to, for example, the location of the first point 54. Then, the user may confirm the location of the cursor 53 by hitting a corresponding button or the like. Now, without changing the view as shown in Fig. 6a, the user may be given an in-depth control to place the cursor 53 properly in the third dimension 70. As a movement of the cursor 53 in the third dimension 70 will not be recognizable to a user viewing the image 50 as shown in Fig. 6a, a visual indicator 72 is provided on the display 18 and in the image 50 to inform the user that the cursor 53 has touched the structure 52. The exclamation mark shown as the visual indicator 72 in Fig. 6a is of merely exemplary nature. Other symbols or phrases may be used that may only be visible if the cursor actually touches the structure 52. Alternatively or additionally, it may be provided that the cursor 53, according to a hidden-line mechanism, disappears when it enters the structure 52. Further, it may be provided that the cursor lights up and/or that the structure 52 lights up when the cursor and the structure 52 collide. - Moving the
cursor 53 within the plane 62 is called a "first movement". Moving the cursor 53 perpendicularly to the plane 62 is called a "second movement". Although the first movement and the second movement have been described as being conducted subsequently, it has to be emphasized that this is only one possible embodiment of conducting the first and second movements. It may also be provided that the first and second movements are conducted simultaneously. -
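The split into the two movements can be sketched as a single cursor-update routine; the mode names and the tuple layout here are illustrative, not taken from this document:

```python
def move_cursor(cursor, delta, mode):
    """Apply an input-device delta to the cursor position (x, y, z).
    'in_plane' is the first movement: it changes only the first and
    second coordinates, staying parallel to the displayed plane 62.
    'depth' is the second movement: it changes only the third
    coordinate 70, perpendicular to the displayed plane."""
    x, y, z = cursor
    if mode == "in_plane":
        return (x + delta[0], y + delta[1], z)
    return (x, y, z + delta[0])
```

Running the two modes one after the other corresponds to the sequential embodiment; interleaving them corresponds to the simultaneous one.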
Figs. 7a and 7b show how this "in-depth control" of the cursor 53 may avoid the fore-shortening effect. -
Fig. 7a and 7b , thecursor 53 may be of any form suitable. It may have the form of an arrow, a cross hair or else to properly identify the parts of thestructure 52 that the user may want to select. - As explained above, the user may now select in the view as shown in
Fig. 7a, the first point 54 by first moving the cursor 53 in the plane 62 to determine the first and second coordinates 66, 68, and by then moving the cursor 53 in the third dimension 70 into the depth of the volume 40 until the cursor 53 and the structure 52 collide, as indicated by the visual indicator 72. The second point 56 may then be selected in the same way. - As is derivable from a view rotated by 90° and shown in
Fig. 7b, by this, a distance 58 between the two points 54, 56 - wherein the second point 56 is not lying within the plane 62 - can be properly determined. The first point 54 and the second point 56 can both be set touching the structure 52. Further, all of this can be done without changing the view as depicted in Fig. 7a. Then, the distance 58 between the first point 54 and the second point 56 can be properly determined. - In addition to the
visual indicator 72, an audio indicator or a tactually sensible indicator may also be provided. Referring back to Fig. 1, the display 18 or any other part of the ultrasound imaging system 10 may comprise a speaker 73 that may be configured to emit a noise or tone when the cursor 53 collides with the structure 52, so as to provide an audio indicator 74. - Further, a tactually
sensible indicator 76 may be provided, for example by including a rumble mechanism 75 in the input device 20. By this, the user may feel when the cursor collides with the structure 52 when using the input device 20 to move the cursor 53 around the volume. - Further, the
ultrasound imaging system 10 may be configured so as to provide the user with the possibility to measure the distance between the first point 54 and the second point 56 not only as the shortest distance along a straight line connecting both points 54, 56, but also along another measurement path 78. To define this alternative measurement path 78, the user may set further points 79 in the volume by conducting the first movement and the second movement as explained above, or may apply standard geometrical forms, for example an ellipse, to connect the first point 54 and the second point 56. -
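Once such further points are set, the length along a polyline measurement path is simply the sum of the lengths of its straight segments. A minimal sketch (the function name is hypothetical):

```python
import math

def path_length(points):
    """Total length of a measurement path given as an ordered list of
    3D points: the first point, any intermediate points, the second point.
    Each consecutive pair contributes one straight segment."""
    return sum(
        math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
        for p, q in zip(points, points[1:])
    )
```

With no intermediate points the result reduces to the straight-line distance between the two endpoints.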
Fig. 8 shows a schematic block diagram of an embodiment of a method 80. The method starts at a step 82. In the first step S1, a three-dimensional ultrasound image 50 is shown on a display 18 together with the cursor 53 for identifying the first point 54 and the second point 56. Such an image may be one of the images as shown in Figs. 6a and 7a. - Then, in step S2, the cursor is moved parallel to a plane provided on the
display 18, based on input data by a user, to identify a first coordinate 66 and a second coordinate 68 of at least one of the first and second points 54, 56. - After the
first coordinate 66 and the second coordinate 68 have been defined, in a step S3 the cursor 53 is moved perpendicularly to the plane 62 provided on the display, based on input data by the user, to identify a third coordinate 70 of the respective point. - While this second movement in step S3 is conducted, in a step S4 it is checked whether the
cursor 53 collides with the structure 52. If not, no amendments to the display occur and the method runs in a loop, as indicated by line 86. If so, an indication is provided on the display that the cursor 53 collides with the structure 52. In addition to the visual indication 72, an audio indication 74 or a tactual indication 76 may be provided, as explained above. The respective indication is given in a step S5. Now, the third coordinate 70 may be set. In case only one point has been defined so far, the method returns to before step S2, as indicated by arrow 87, to also define the coordinates of the respective second point. - After both points have been defined, in a step S6, the
distance 58 between the two points 54, 56 is determined. The method ends in a step 90. - However, the steps S2 and S3 do not necessarily have to be conducted subsequently. It may also be possible that the first movement within the plane and the second movement perpendicularly to the plane are conducted in parallel directly after step S1, as indicated by
arrow 88 in dashed lines. The user may then simultaneously move the cursor 53 to define all three coordinates 66, 68, 70.
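Steps S2 to S5 of the method 80 can be sketched as a single point-placement loop over the depth direction. Representing the scanned volume as a voxel array and using an intensity threshold as the collision test are illustrative assumptions, not details given in this document:

```python
def place_point(volume, xy, depth_range, threshold=128):
    """Fix (x, y) by the first, in-plane movement, then sweep the third
    coordinate as in step S3. While no collision occurs the loop simply
    continues (the loop of step S4); on collision the indication would
    fire and the depth is set (step S5). volume is indexed [z][y][x]
    with 0-255 intensities; the threshold is an assumed value."""
    x, y = xy
    for z in depth_range:
        if volume[z][y][x] >= threshold:  # cursor touches the structure
            return (x, y, z)
    return None  # the structure was never touched along this line
```

With both endpoints placed this way, step S6 reduces to taking the Euclidean (or path) distance between the two returned coordinate triples.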
Claims (14)
- An ultrasound imaging system (10) for providing a three-dimensional image (50) of a volume (40) and for determining a distance (58) between a first point (54) and a second point (56) in the three-dimensional image (50), the ultrasound imaging system (10) comprising:a transducer array (26) configured to provide an ultrasound receive signal,a controlling unit (16) configured to receive the ultrasound receive signal and to provide display data representing the three-dimensional image, wherein the controlling unit (16) is further configured to determine a distance (58) between the first point (54) and the second point (56) identified in the three-dimensional image,a display (18) configured to receive the display data and to display the three-dimensional image (50) of the volume (40) and a cursor (53) for identifying the first point (54) and the second point (56), andan input device (20) configured to provide input data to the controlling unit (16), wherein the input data includes a movement of the cursor (53),wherein the ultrasound imaging system (10) is configured to enable a user to perform a first movement of the cursor (53) parallel to a plane (62) displayed to the user when viewing the three-dimensional image (50) displayed on the display (18) based on input data, to identify a first coordinate (66) and a second coordinate (68) of at least one of the first and second points (56); and characterised in that the ultrasound imaging system is further configured to:without changing a perspective of the view displayed on the display (18), enable the user to perform a second movement of the cursor (53) perpendicular to the plane (62) that is displayed and into the three dimensional image, based on input data, to identify a third coordinate (70) of the respective point, andwherein, during the second movement of the cursor (53) perpendicular to the plane (62) displayed on the display, the ultrasound imaging system is further configured to provide an indication (72) on the 
display when the cursor (53) touches a structure (52) within the volume (40) so as to inform the user that the cursor (53) has touched the structure (52) and so that the user may place the cursor properly on the structure (52) in the direction perpendicular to the plane displayed on the display.
- The system of claim 1, wherein the ultrasound imaging system (10) is configured to enable the second movement after the first movement has been completed.
- The system of claim 1, wherein the ultrasound imaging system (10) is configured to enable the second movement and the first movement simultaneously.
- The system of claim 1, wherein the indication is a visual indication (72) displayed on the display (18).
- The system of claim 4, wherein the visual indication (72) is a change in the appearance of the cursor (53) or a tag showing up on the display (18).
- The system of claim 5, wherein the change in the appearance of the cursor (53) causes the cursor (53) to light-up or to disappear.
- The system of claim 1, wherein the visual indication (72) is a change of the appearance of the structure (52) within the volume (40).
- The system of claim 1, wherein the ultrasound imaging system (10) further comprises a speaker (73), and wherein the indication (72) further comprises an audio indication (74) provided via the speaker (73).
- The system of claim 1, wherein the indication (72) further comprises tactually sensible indication (76) provided via the input device (20).
- The system of claim 1, wherein the ultrasound imaging system (10) is further configured to enable inputting a measurement path (60) between the first point (54) and the second point (56), and wherein the distance (58) is determined along the measurement path (60).
- The system of claim 10, wherein the ultrasound system is configured to input the measurement path by identifying at least one further point (79) within the volume (40) and/or by selecting a geometric form to connect the first point (54) and the second point (56).
- The system according to claim 1, further comprising:a beam former (30) configured to control the transducer array (26) to scan the volume (40) along a multitude of scanning lines (59), and further configured to receive the ultrasound receive signal and to provide an image signal,a signal processor (34) configured to receive the image signal and to provide image data,an image processor (36) configured to receive the image data from the signal processor (34) and to provide display data.
- A method (80) for determining a distance (58) between a first point (54) and a second point (56) in a three-dimensional ultrasound image (50) of a volume (40), the method comprising the following steps:displaying (S1) the three-dimensional ultrasound image on a display (18) together with a cursor (53) for identifying the first point (54) and the second point (56),moving (S2) the cursor (53) parallel to a plane (62) displayed to a user when viewing the three-dimensional image (50) provided on the display based on input data to identify a first coordinate (66) and a second coordinate (68) of at least one of the first and second points (56),moving (S3) the cursor (53), without changing a perspective of the view displayed on the display (18), perpendicularly to the plane (62) displayed on the display and into the three dimensional image, based on input data to identify a third coordinate (70) of the respective point,providing (S4), while moving the cursor (53) perpendicularly to the plane (62) that is displayed, an indication on the display when the cursor (53) touches a structure (52) displayed within the volume (40) so as to inform the user that the cursor (53) has touched the structure (52) and so that the user may place the cursor properly on the structure (52) in the direction perpendicular to the plane displayed on the display, andwhen the user has placed the cursor on the structure, determining (S6) the distance (58) between the first point (54) and the second point (56).
- Computer program comprising program code means for causing a computer to carry out the steps of the method (80) as claimed in claim 13 when said computer program is carried out on the computer.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261663652P | 2012-06-25 | 2012-06-25 | |
PCT/IB2013/054962 WO2014001954A1 (en) | 2012-06-25 | 2013-06-17 | System and method for 3d ultrasound volume measurements |
Publications (2)
Publication Number | Publication Date |
---|---|
EP2864807A1 EP2864807A1 (en) | 2015-04-29 |
EP2864807B1 true EP2864807B1 (en) | 2021-05-26 |
Family
ID=49035618
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP13753208.1A Active EP2864807B1 (en) | 2012-06-25 | 2013-06-17 | System and method for 3d ultrasound volume measurements |
Country Status (7)
Country | Link |
---|---|
US (1) | US10335120B2 (en) |
EP (1) | EP2864807B1 (en) |
JP (1) | JP6114822B2 (en) |
CN (1) | CN104412123B (en) |
BR (1) | BR112014032020B1 (en) |
RU (1) | RU2620865C2 (en) |
WO (1) | WO2014001954A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180052589A1 (en) * | 2016-08-16 | 2018-02-22 | Hewlett Packard Enterprise Development Lp | User interface with tag in focus |
US10355120B2 (en) * | 2017-01-18 | 2019-07-16 | QROMIS, Inc. | Gallium nitride epitaxial structures for power devices |
EP3378405A1 (en) * | 2017-03-20 | 2018-09-26 | Koninklijke Philips N.V. | Volume rendered ultrasound imaging |
EP3561656A1 (en) * | 2018-04-23 | 2019-10-30 | Koninklijke Philips N.V. | Precise positioning of a marker on a display |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5920319A (en) * | 1994-10-27 | 1999-07-06 | Wake Forest University | Automatic analysis in virtual endoscopy |
JP3015728B2 (en) * | 1996-05-21 | 2000-03-06 | アロカ株式会社 | Ultrasound diagnostic equipment |
JPH10201755A (en) * | 1997-01-24 | 1998-08-04 | Hitachi Medical Corp | Method for measuring three-dimensional size in pseudo-three-dimensional image and its system |
JP3325224B2 (en) * | 1998-04-15 | 2002-09-17 | オリンパス光学工業株式会社 | Ultrasound image diagnostic equipment |
US6048314A (en) | 1998-09-18 | 2000-04-11 | Hewlett-Packard Company | Automated measurement and analysis of patient anatomy based on image recognition |
JP4558904B2 (en) * | 2000-08-17 | 2010-10-06 | アロカ株式会社 | Image processing apparatus and storage medium |
CN101002107B (en) * | 2004-03-01 | 2010-06-23 | 阳光溪流女子学院健康科学中心 | System and method for ECG-triggered retrospective color flow ultrasound imaging |
EP1797456A1 (en) * | 2004-09-30 | 2007-06-20 | Koninklijke Philips Electronics N.V. | Microbeamforming transducer architecture |
JP2006314518A (en) * | 2005-05-12 | 2006-11-24 | Toshiba Corp | Ultrasonic diagnostic unit |
JP4740695B2 (en) * | 2005-08-26 | 2011-08-03 | 株式会社日立メディコ | Ultrasonic diagnostic equipment |
JP2011530366A (en) * | 2008-08-12 | 2011-12-22 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Ultrasound imaging |
US20100106431A1 (en) * | 2008-10-29 | 2010-04-29 | Hitachi, Ltd. | Apparatus and method for ultrasonic testing |
KR101121301B1 (en) | 2009-09-16 | 2012-03-22 | 삼성메디슨 주식회사 | Ultrasound system and method of performing 3-dimensional measurement |
US8819591B2 (en) * | 2009-10-30 | 2014-08-26 | Accuray Incorporated | Treatment planning in a virtual environment |
KR101175426B1 (en) | 2010-01-26 | 2012-08-20 | 삼성메디슨 주식회사 | Ultrasound system and method for providing three-dimensional ultrasound image |
JP5535725B2 (en) * | 2010-03-31 | 2014-07-02 | 富士フイルム株式会社 | Endoscope observation support system, endoscope observation support device, operation method thereof, and program |
JP2011215692A (en) * | 2010-03-31 | 2011-10-27 | Hokkaido Univ | Three-dimensional three-degree-of-freedom rotation parameter processor |
US9047394B2 (en) * | 2010-10-22 | 2015-06-02 | Samsung Medison Co., Ltd. | 3D ultrasound system for intuitive displaying to check abnormality of object and method for operating 3D ultrasound system |
-
2013
- 2013-06-17 CN CN201380033938.7A patent/CN104412123B/en active Active
- 2013-06-17 WO PCT/IB2013/054962 patent/WO2014001954A1/en active Application Filing
- 2013-06-17 US US14/406,760 patent/US10335120B2/en active Active
- 2013-06-17 RU RU2015102094A patent/RU2620865C2/en active
- 2013-06-17 EP EP13753208.1A patent/EP2864807B1/en active Active
- 2013-06-17 BR BR112014032020-9A patent/BR112014032020B1/en not_active IP Right Cessation
- 2013-06-17 JP JP2015517899A patent/JP6114822B2/en active Active
Non-Patent Citations (1)
Title |
---|
None * |
Also Published As
Publication number | Publication date |
---|---|
BR112014032020A8 (en) | 2021-03-16 |
BR112014032020B1 (en) | 2023-02-23 |
RU2620865C2 (en) | 2017-05-30 |
JP2015519990A (en) | 2015-07-16 |
EP2864807A1 (en) | 2015-04-29 |
US10335120B2 (en) | 2019-07-02 |
RU2015102094A (en) | 2016-08-20 |
US20150157297A1 (en) | 2015-06-11 |
CN104412123B (en) | 2017-05-17 |
BR112014032020A2 (en) | 2017-06-27 |
CN104412123A (en) | 2015-03-11 |
WO2014001954A1 (en) | 2014-01-03 |
JP6114822B2 (en) | 2017-04-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9084556B2 (en) | Apparatus for indicating locus of an ultrasonic probe, ultrasonic diagnostic apparatus | |
KR101495528B1 (en) | Ultrasound system and method for providing direction information of a target object | |
US20170090571A1 (en) | System and method for displaying and interacting with ultrasound images via a touchscreen | |
US20170086785A1 (en) | System and method for providing tactile feedback via a probe of a medical imaging system | |
US11793483B2 (en) | Target probe placement for lung ultrasound | |
CN105518482B (en) | Ultrasonic imaging instrument visualization | |
US20110201935A1 (en) | 3-d ultrasound imaging | |
US10456106B2 (en) | Ultrasonic diagnostic apparatus, medical image processing apparatus, and medical image processing method | |
US9517049B2 (en) | Ultrasonic probe, position display apparatus and ultrasonic diagnostic apparatus | |
JP6063454B2 (en) | Ultrasonic diagnostic apparatus and locus display method | |
EP2302414A2 (en) | Ultrasound system and method of performing measurement on three-dimensional ultrasound image | |
KR20150089836A (en) | Method and ultrasound apparatus for displaying a ultrasound image corresponding to a region of interest | |
EP2864807B1 (en) | System and method for 3d ultrasound volume measurements | |
JP6019363B2 (en) | Medical image measurement method and medical image diagnostic apparatus | |
CN109923432A (en) | Utilize the system and method for the feedback and tracking intervention instrument about tracking reliability | |
CN111265247B (en) | Ultrasound imaging system and method for measuring volumetric flow rate | |
US20210196237A1 (en) | Methods and apparatuses for modifying the location of an ultrasound imaging plane | |
KR20130124750A (en) | Ultrasound diagnostic apparatus and control method for the same | |
JP2012143356A (en) | Ultrasonic diagnostic equipment and program | |
US20210338204A1 (en) | Ultrasound system and methods for smart shear wave elastography | |
US20190183453A1 (en) | Ultrasound imaging system and method for obtaining head progression measurements | |
EP3826542A1 (en) | Ultrasound system and method for guided shear wave elastography of anisotropic tissue | |
US9877701B2 (en) | Methods and systems for automatic setting of color flow steering angle | |
KR102615722B1 (en) | Ultrasound scanner and method of guiding aim | |
US20210093298A1 (en) | Methods and apparatuses for providing feedback for positioning an ultrasound device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20150126 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20171219 |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: KONINKLIJKE PHILIPS N.V. |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R079 Ref document number: 602013077657 Country of ref document: DE Free format text: PREVIOUS MAIN CLASS: G01S0015890000 Ipc: G01S0007520000 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: A61B 8/00 20060101ALI20201125BHEP Ipc: G01S 15/89 20060101ALI20201125BHEP Ipc: A61B 8/08 20060101ALI20201125BHEP Ipc: G16H 50/30 20180101ALI20201125BHEP Ipc: G01S 7/52 20060101AFI20201125BHEP Ipc: G06T 19/00 20110101ALI20201125BHEP |
|
INTG | Intention to grant announced |
Effective date: 20201217 |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: ELANGOVAN, VINODKUMAR Inventor name: SNYDER, RICHARD, ALLEN |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 1396766 Country of ref document: AT Kind code of ref document: T Effective date: 20210615 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602013077657 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R084 Ref document number: 602013077657 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: 746 Effective date: 20210701 |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG9D |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1396766 Country of ref document: AT Kind code of ref document: T Effective date: 20210526 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210826 Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20210526 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210926 Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210827 Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210826 Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210927 Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 |
Ref document number: 602013077657 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20210630 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 |
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210617 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210630 |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210617 |
Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210630 |
|
26N | No opposition filed |
Effective date: 20220301 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210926 |
Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210630 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: IT Payment date: 20220622 Year of fee payment: 10 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: FR Payment date: 20220623 Year of fee payment: 10 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20130617 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20230627 Year of fee payment: 11 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20230620 Year of fee payment: 11 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: FR Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20230630 |