US9542012B2 - System and method for processing a pointer movement - Google Patents

System and method for processing a pointer movement

Info

Publication number
US9542012B2
US9542012B2
Authority
US
United States
Prior art keywords
contour
pointer
pointer movement
image
pointing device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US14/380,735
Other languages
English (en)
Other versions
US20150035753A1 (en)
Inventor
Daniel Bystrov
Rafael Wiemker
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips NV filed Critical Koninklijke Philips NV
Priority to US14/380,735
Assigned to KONINKLIJKE PHILIPS N.V. Assignors: BYSTROV, DANIEL; WIEMKER, RAFAEL
Publication of US20150035753A1
Application granted
Publication of US9542012B2

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices displaced or positioned by the user, with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03543 Mice or pucks
    • G06F3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F3/0383 Signal control means within the pointing device
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects

Definitions

  • the invention relates to a system and method for real-time processing of a pointer movement provided by a pointing device.
  • the invention further relates to a workstation and an imaging apparatus comprising said system, and a computer program product comprising instructions for causing a processor system to perform said method.
  • a pointing device, such as a computer mouse, touch screen, stylus, etc., enables a user to interact with apparatuses such as computers, tablets, etc.
  • Said interaction may involve the use of a graphical user interface, wherein a position or movement of the pointing device is represented onscreen by a pointer or cursor being positioned and/or moved accordingly. In this manner, an intuitive way of interacting with the device is provided.
  • Systems and methods are known that enable users to operate a pointing device to perform processing of an image using the pointer shown onscreen. For example, the user may draw a line or curve in the image by moving the pointing device accordingly.
  • a problem of the above tool is that snapping the mouse pointer to a particular pixel in the image to establish a position in the image is inconvenient for a user.
  • a first aspect of the invention provides a system for real-time processing of a pointer movement provided by a pointing device, the system comprising:
  • a workstation and an imaging apparatus comprising the system set forth.
  • a method for real-time processing of a pointer movement provided by a pointing device comprising:
  • a computer program product comprising instructions for causing a processor system to perform the method set forth.
  • the above measures provide real-time processing of a pointer movement provided by a pointing device.
  • real-time refers to the pointer movement being processed while the user operates the pointing device, i.e., at so-termed interactive rates.
  • the system comprises an output such as a display or display output which enables display of a pointer at a first position in an image, e.g., by overlaying the pointer over the image.
  • the pointer is a graphical indicator such as an arrow or crosshair, also commonly referred to as a cursor. As a result, the pointer is visible to the user when viewing the image onscreen.
  • the first position of the pointer may reflect a physical position of the pointing device or a physical position of the user with respect to the pointing device at a first instance in time, e.g., a position of a finger of the user relative to a surface of a touchscreen.
  • the system further comprises a user input that obtains pointer movement data from the pointing device when the user operates the pointing device.
  • the pointer movement data is indicative of a movement of the pointer from the first position to a second position in the image, and thus allows the system to establish the second position.
  • the pointer movement may reflect a physical movement of the pointing device or of the user with respect to the pointing device, e.g., a movement of the finger of the user on the touch-screen.
  • the image comprises a contour.
  • the contour is formed by elements of the image such as pixels or voxels.
  • the contour may be an edge, i.e., a clear boundary between two objects or an object and its background.
  • the contour may also be formed by pixels or voxels having the same or similar value, e.g., a line in a texture area or along an image gradient, e.g., being conceptually similar to a height contour line in a topographical map.
  • the contour may be an open contour, i.e., having a start and an end which are not connected, as well as a closed contour, i.e., being a contour that connects with itself.
  • the system further comprises a processor that reduces the pointer movement in a direction that is perpendicular to the orientation or direction of the contour in the image.
  • a part, i.e. a component, of the pointer movement which is directed towards and/or away from the contour is reduced.
  • a dampened pointer movement is obtained, which is used to establish a third position of the pointer.
  • the third position therefore corresponds with the pointer's dampened movement from the first position.
  • the third position typically differs from the second position.
  • the above measures have the effect that a movement of the pointer, as provided by the user, is reduced in a direction towards and/or away from a contour. Hence, the pointer movement is dampened based on an orientation of the contour. Therefore, whereas the pointer movement would otherwise result in a second position that is nearer to and/or further away from the contour, the pointer movement is now dampened such that a third position is established which is less near to and/or less far away from the contour.
  • the invention is partially based on the recognition that it is generally difficult for a user to follow a contour in the image, or follow the contour at a given distance, using a pointing device.
  • Following the contour and following the contour at a distance, i.e., parallel to the contour, are henceforth commonly referred to as following the contour.
  • Such following of the contour may be highly relevant, e.g., when drawing a line around an object, pointing out features along the contour, visually inspecting the contour by viewing a zoomed-in view based on the position of the pointer, etc.
  • the user may therefore frequently move the pointer to a second position that is further away from, or nearer to, the contour.
  • a third position is established that better follows the contour than the second position.
  • the user is not restricted to following the contour, as he may still move towards or away from the contour, albeit with more effort, e.g., requiring more or longer movement of the pointing device.
  • less concentration is needed to accurately follow the contour.
  • the user may more conveniently establish a third position that follows the contour in the image.
  • the output is arranged for displaying the pointer at the third position in the image.
  • the user is therefore shown a dampened movement of the pointer.
  • the user perceives the pointer as stabilized, i.e., more stable, when it follows the contour in the image.
  • the user perceives the sensitivity of the pointer to be higher when it follows the contour than when it deviates from the contour. Therefore, the user may more conveniently follow the contour with the pointer.
  • the processor is arranged for processing the image based on the first position and the third position.
  • the image is processed in accordance with the dampened pointer movement.
  • Image processing which is based on the pointer movement therefore benefits from the dampening of the pointer movement.
  • processing the image comprises drawing a line in the image between the first position and the third position.
  • a line is thus obtained that is drawn between the first position and the third position. Therefore, the line better follows the contour as compared to a line drawn between the first position and the second position.
  • the user can more accurately draw a line that follows the contour.
  • the user can draw a line that is less wiggly than would be possible without the dampened pointer movement.
  • the processor is arranged for establishing the third position by maximizing a similarity feature between image data at the first position and further image data located along the direction orthogonal to the contour from the second position.
  • the third position is established at a point towards or away from the contour where further image data most resembles the image data at the first position.
  • the image data is compared with further image data at said positions, and one of the positions is selected at which the further image data maximizes the similarity feature.
  • Said measures provide an implicit following of a contour, since, when the image data at the first position shows a part of an image gradient, a third position is established which shows a similar part of an image gradient. Similarly, when the image data at the first position shows a part of an edge at a given distance, a third position is established which shows a similar part of an edge at a similar distance.
  • image data refers to pixels or voxels at a given position, possibly including pixels and voxels near said position, e.g., within the neighborhood of said position.
  • a contour may be followed without a need for explicitly detecting the contour in the image.
  • errors associated with wrongly detecting a contour in the image are avoided.
  • the processor is arranged for determining the contour in the image within the neighborhood of the first position.
  • the processor explicitly establishes the contour in the image, e.g., by detecting an object edge within the image or receiving contour data.
  • the processor determines the contour within the neighborhood of the first position. The contour is therefore located at a limited distance from the first position of the pointer.
  • the processor is arranged for dampening the pointer movement based on whether the pointer movement in the direction orthogonal to the contour is towards or away from the contour.
  • the dampening is thus different, i.e., the pointer movement is differently reduced, in the direction towards and the direction away from the contour.
  • when said pointer movement is away from the contour, the processor is arranged for reducing said pointer movement more than when said pointer movement is towards the contour. It is therefore easier to deviate from following the contour in a direction towards the contour than in a direction away from it. Moving the pointer towards the contour typically improves the following of the contour; moving the pointer away from the contour typically worsens it.
  • the user may easily move the pointer along and towards the contour, but not further away from the contour.
  • the processor is arranged for dampening the pointer movement by decomposing a vector indicative of the pointer movement in a direction parallel to the contour and in the direction orthogonal to the contour for reducing a size of a component of the vector along the direction orthogonal to the contour.
  • Vector decomposition is well suited for establishing the part of the pointer movement along the direction orthogonal to the contour.
  • the processor is arranged for (i) determining a gradient based on the contour, (ii) determining an orthogonal gradient based on the gradient, the orthogonal gradient being orthogonal to the gradient, and (iii) decomposing the vector along the gradient and the orthogonal gradient for reducing a component of the vector along the gradient.
  • a gradient is well suited for obtaining the direction orthogonal to the contour, as the gradient typically points in a direction that is orthogonal to the contour direction.
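To make this decomposition concrete, it can be sketched in a few lines of Python; this is an illustrative sketch only (the function name and the 2-D tuple representation are assumptions, not part of the patent):

```python
import math

def dampen_movement(v, g, f=0.5):
    """Decompose the pointer movement v along the gradient g (orthogonal to
    the contour) and the orthogonal gradient g0 (parallel to the contour),
    then reduce only the component along g by the dampening factor f."""
    gx, gy = g
    norm = math.hypot(gx, gy)
    if norm == 0.0:                       # no contour nearby: leave movement as-is
        return v
    gx, gy = gx / norm, gy / norm         # unit gradient
    g0 = (-gy, gx)                        # orthogonal gradient, i.e. contour direction
    a = v[0] * gx + v[1] * gy             # component towards/away from the contour
    b = v[0] * g0[0] + v[1] * g0[1]       # component along the contour
    a *= f                                # dampen only the orthogonal part
    return (a * gx + b * g0[0], a * gy + b * g0[1])

# a movement of (3, 4) with the gradient pointing along +y:
# only the y-part (towards the contour) is halved
print(dampen_movement((3.0, 4.0), (0.0, 1.0)))  # -> (3.0, 2.0)
```

With f = 1 the movement passes through unchanged; with f = 0 the pointer would be hard-constrained to move parallel to the contour.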
  • the output is arranged for displaying a user interface for enabling the user to provide an adjustment of said dampening, and the processor is arranged for performing said dampening based on the adjustment.
  • the image is a medical image having associated anatomical data
  • the processor is arranged for dampening the pointer movement further, based on the anatomical data.
  • Anatomical data is thus used to further control the dampening of the pointer movement.
  • the anatomical data may indicate, e.g., a nearby contour, a direction of the contour, whether dampening is needed, how strong the dampening is, etc.
  • the processor is arranged for determining the gradient by (i) detecting a plurality of edges within the neighborhood of the first position, (ii) calculating a distance map based on the plurality of edges and the first position, and (iii) obtaining the gradient at the first position based on the distance map.
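These three steps might be sketched as follows; the brute-force distance map and the central-difference derivative (used here in place of, e.g., a Sobel operator) are illustrative assumptions, not the patent's prescribed implementation:

```python
import math

def distance_map(edges, width, height):
    """Brute-force distance map: for every pixel, the Euclidean distance
    to the nearest edge pixel in `edges` (a list of (x, y) coordinates)."""
    return [[min(math.hypot(x - ex, y - ey) for ex, ey in edges)
             for x in range(width)]
            for y in range(height)]

def gradient_at(dmap, x, y):
    """Central-difference gradient of the distance map at (x, y)."""
    gx = (dmap[y][x + 1] - dmap[y][x - 1]) / 2.0
    gy = (dmap[y + 1][x] - dmap[y - 1][x]) / 2.0
    return gx, gy

# a vertical edge at x == 0: the gradient of the distance map at (3, 2)
# points straight away from the edge, i.e. orthogonal to the contour
edges = [(0, y) for y in range(5)]
dmap = distance_map(edges, 6, 5)
print(gradient_at(dmap, 3, 2))  # -> (1.0, 0.0)
```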
  • a person skilled in the art will appreciate that the method may be applied to multi-dimensional image data, e.g. two-dimensional (2-D), three-dimensional (3-D) or four-dimensional (4-D) images.
  • a dimension of the multi-dimensional image data may relate to time.
  • a three-dimensional image may comprise a time domain series of two-dimensional images.
  • the image may be a medical image, acquired by various acquisition modalities such as, but not limited to, standard X-ray Imaging, Computed Tomography (CT), Magnetic Resonance Imaging (MRI), Ultrasound (US), Positron Emission Tomography (PET), Single Photon Emission Computed Tomography (SPECT), and Nuclear Medicine (NM).
  • the image may also be of any other type, e.g., a cartographic or seismic image which the user wishes to process.
  • FIG. 1 shows a system according to the present invention, and a display
  • FIG. 2 shows a method according to the present invention
  • FIG. 3 shows a computer program product according to the present invention
  • FIG. 4 shows a pointer near a contour in an image, with a pointer movement from a first position to a second position in the image being schematically indicated;
  • FIG. 5 shows a decomposition of the pointer movement in a direction along the contour and in a direction orthogonal to the contour
  • FIG. 6 shows the decomposition based on a gradient of the contour
  • FIG. 7 shows a dampened pointer movement and a third position of the pointer based on the dampened pointer movement
  • FIG. 8 shows the pointer being displayed based on the dampened pointer movement
  • FIG. 9 shows a contour drawn based on the dampened pointer movement
  • FIG. 10 shows again the pointer near the contour in the image
  • FIG. 11 shows an instance of dampening based on maximizing a similarity measure.
  • FIG. 1 shows a system 100 for real-time processing of a pointer movement provided by a pointing device 140 .
  • the system 100 comprises an output 130 for displaying a pointer 162 at a first position in an image 152 .
  • the output 130 is shown to be connected to a display 150 .
  • the display 150 may, but does not have to be part of the system 100 .
  • the output 130 is shown to provide display data 132 to the display 150 for said displaying of the pointer 162 in the image 152 .
  • the system 100 further comprises a user input 110 for obtaining pointer movement data 112 from a pointing device 140 operable by a user.
  • the pointer movement data 112 is indicative of a pointer movement of the pointer 162 from the first position to a second position in the image 152 .
  • the pointing device 140 is shown to be a computer mouse. However, it will be appreciated that any other type of pointing device 140 , such as a touch screen, gaze tracker, stylus, etc., may be advantageously used.
  • the system 100 comprises a processor 120 .
  • the processor 120 is shown to be connected to the user input 110 for receiving the pointer movement data 112 from the user input 110 .
  • the processor 120 is shown to be connected to the output 130 for providing output data 122 to the output 130 .
  • the output data 122 may, but does not have to, be equal to the display data 132 .
  • the output data 122 is at least indicative of a position of the pointer.
  • the processor 120 is arranged for (i) dampening the pointer movement by reducing the pointer movement along a direction orthogonal to a contour 160 in the image, and (ii) establishing a third position of the pointer 162 based on said dampened pointer movement.
  • FIG. 2 shows a method 200 for real-time processing of a pointer movement provided by a pointing device.
  • the method 200 comprises, in a first step titled “DISPLAY POINTER AT FIRST POSITION”, displaying 210 a pointer at a first position in an image, the image comprising a contour.
  • the method 200 further comprises, in a second step titled “OBTAINING POINTER MOVEMENT DATA”, obtaining 220 pointer movement data from a pointing device operable by a user, the pointer movement data being indicative of a pointer movement of the pointer from the first position to a second position in the image.
  • the method 200 further comprises, in a third step titled “DAMPENING THE POINTER MOVEMENT BASED ON THE CONTOUR”, dampening 230 the pointer movement by reducing the pointer movement along a direction orthogonal to the contour.
  • the method 200 further comprises, in a fourth step titled “ESTABLISHING THIRD POSITION BASED ON DAMPENED MOVEMENT”, establishing 240 a third position of the pointer based on said dampened pointer movement.
  • the method 200 may correspond to an operation of the system 100 , and will be further explained in reference to the system 100 . It will be appreciated, however, that the method may be performed in separation from said system.
  • FIG. 3 shows a computer program product 260 comprising instructions for causing a processor system to perform the method according to the present invention.
  • the computer program product 260 may be comprised on a computer readable medium 250 , for example in the form of a series of machine readable physical marks and/or a series of elements having different electrical, e.g., magnetic, or optical properties or values.
  • FIG. 4 and further figures illustrate an operation of the system 100 based on, for the sake of explanation, a zoomed-in view of the image 152 .
  • a pointer 162 is displayed in the image 152 .
  • the pointer 162 is shown as an arrow but may equally be a crosshair or any other suitable graphical representation.
  • the pointer 162 is shown to be positioned at a first position p 1 , i.e., the tip of the arrow is located at the first position p 1 .
  • FIG. 4 further shows a second position p 2 , which corresponds to a pointer movement v of the pointer 162 away from the first position p 1 .
  • the pointer movement v is schematically illustrated by means of a dashed line in FIG. 4 .
  • the pointer movement v may constitute, or be interpreted as, a vector v.
  • the image 152 further shows a contour 160 .
  • the contour 160 may be an object edge, i.e., an edge between an object and a background. However, the contour 160 may equally be a texture edge or line along an image gradient.
  • the contour 160 may be determined, by the system, to be within the neighborhood 166 of the first position p 1 . Determining the contour 160 may comprise performing edge detection within the neighborhood 166 or the entire image 152 . For that purpose, various techniques from the field of image processing may be advantageously used. Alternatively or additionally, determining the contour 160 may comprise obtaining contour data indicative of the contour 160 . The contour data may be obtained from a previously performed contour detection. Alternatively, the contour 160 may be defined by contour data within the system, e.g., in case the contour 160 is a vector-based contour. Here, the contour 160 may be determined directly from the contour data, e.g., from coordinates of the start, intermediate and end points. The contour 160 may be a closest contour to the first position p 1 , or a closest contour having a certain strength.
  • the neighborhood 166 is shown to be a rectangular neighborhood. However, any other suitable shape may equally be used. Moreover, the neighborhood 166 may be an explicitly defined neighborhood, e.g., defined by a width and a height, or may be an implicitly defined neighborhood, e.g., being the result of the edge detection or of a search algorithm being limited to a certain distance from the first position p 1 in the image 152 . In the latter case, edge detection may be performed on the entire image 152 , with an edge selection algorithm then selecting a contour 160 nearest to the first position p 1 based on the edge detection.
  • FIG. 5 shows a result of the pointer movement v being decomposed in a direction 170 that follows the contour 160 , i.e., runs in parallel with the contour 160 , and a direction 172 that is orthogonal to the contour 160 .
  • said directions 170 , 172 are shown as dashed lines having a length that corresponds to the pointer movement constituting a vector v, a decomposition of the vector v yielding vector components in the aforementioned directions, and the length of the arrows then being indicative of a size of the vector components. Consequently, the sum of the individual vectors represented by said vector components yields the pointer movement v.
  • FIG. 5 shows the pointer movement v predominantly following the contour 160 and, to a lesser extent, moving away from the contour 160 , as shown in FIG. 5 by the line 170 being more than twice as long as the line 172 .
  • FIG. 6 corresponds to FIG. 5 , with the exception that the lines 170 and 172 are now labeled with the results of the vector decomposition.
  • the vector decomposition may be as follows.
  • a gradient g is determined at the first position p 1 .
  • the gradient is a vector that points in the direction of the largest change in the image.
  • the direction may be that of the largest pixel change, i.e., the largest change in luminance and/or chrominance value of spatially adjacent pixels.
  • a gradient at a position on the contour 160 will typically be directed away from the contour 160 in a direction orthogonal to the contour.
  • the gradient g at the first position p 1 will typically, by virtue of p 1 being located near the contour 160 , point also in the direction orthogonal to the contour. Based on the gradient g, an orthogonal gradient g 0 is determined.
  • FIG. 6 shows the resulting vectors a·g and b·g 0 , which, when summed together, correspond to the vector v, i.e., the pointer movement v.
  • a gradient field may be calculated for the neighborhood 166 .
  • Calculating the gradient field may comprise detecting a plurality of edges within the neighborhood 166 , e.g., by performing edge detection within said neighborhood 166 .
  • a distance map may be calculated based on the plurality of edges and the first position p 1 , with the distance map being indicative of the distance between each of the plurality of edges and the first position p 1 .
  • the gradient g may be obtained by calculating said gradient at the first position p 1 in the distance map. It is noted that various techniques from the fields of image processing and linear algebra may be advantageously used for calculating the gradient based on the distance map.
  • the calculation of the gradient g may comprise the use of discrete differentiation operators such as Sobel operators, as known from the aforementioned field of image processing.
  • the distance map may be pre-computed, avoiding the need to detect edges and to calculate, based thereon, the distance map while the user moves the pointer in the image.
  • FIG. 7 shows a result of dampening of the pointer movement by reducing the pointer movement along the direction orthogonal to the contour 160 .
  • the vector component a of the vector v is dampened by a dampening factor f
  • the vector component b of the vector v is not dampened.
  • the dampening factor f is smaller than one, resulting in the pointer movement along the direction orthogonal to the contour 160 being reduced.
  • the dampening factor f is chosen to be approximately 0.5.
  • FIG. 7 shows a third position p 3 , which is established based on said dampened pointer movement v d .
  • the third position p 3 corresponds to the dampened pointer movement v d of the pointer 162 away from the first position p 1 , leading to the third position p 3 .
  • the third position p 3 may be established by adding the vector v d to the first position p 1 .
  • the third position p 3 is located nearer to the contour 160 than the second position p 2 .
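Assuming the components a (along the unit gradient g) and b (along the unit orthogonal gradient g 0 ) have already been obtained by the decomposition described above, the step from the first to the third position can be sketched as follows (names are illustrative; f = 0.5 matches the factor chosen above):

```python
def third_position(p1, a, b, g, g0, f=0.5):
    """Establish p3 by adding the dampened movement v_d = f*a*g + b*g0
    to the first position p1; only the component a along the gradient
    (orthogonal to the contour) is dampened."""
    vd = (f * a * g[0] + b * g0[0],
          f * a * g[1] + b * g0[1])
    return (p1[0] + vd[0], p1[1] + vd[1])

# p1 at (10, 10); movement decomposed into a = 4 (towards the contour)
# and b = 3 (along it); gradient along +y, contour direction along +x
print(third_position((10.0, 10.0), 4.0, 3.0, (0.0, 1.0), (1.0, 0.0)))
# -> (13.0, 12.0)
```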
  • FIG. 7 further shows a user interface 180 for enabling the user to adjust the dampening.
  • the user interface 180 may, for example, enable the user to set the dampening factor f by sliding a slider on a numerical scale. As a result, the system 100 may perform the dampening in accordance with said selection of the user, i.e., with the selected dampening factor f.
  • the user interface 180 may provide a selection menu enabling the user to select, e.g., a ‘weak’, ‘medium’ or ‘strong’ dampening, with the system 100 establishing dampening parameters, e.g., the dampening factor f, that correspond to said selection.
  • FIG. 8 shows a result of the pointer 164 being displayed at the third position p 3 in the image 154 .
  • the pointer 162 at the previous position p 1 is shown having a dashed outline to indicate that the pointer 162 used to be located there.
  • the pointer 164 at the third position p 3 corresponds to an update of the position of the pointer 162 , with the pointer 164 thus being displayed only at the third position p 3 .
  • the system 100 may be arranged for continuously receiving pointer movement data from the pointing device, e.g., in millisecond-based intervals, and then continuously updating the position of the pointer 164 in the image 154 based on a dampened pointer movement being calculated for each of the pointer movement data. Therefore, the movement of the pointer 164 as shown to the user corresponds to the dampened pointer movement.
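Such continuous updating might be sketched as an event loop in which every incoming movement is dampened before being applied; the callback used here is a toy stand-in (halving an assumed orthogonal y-component) for the contour-based dampening described above:

```python
def run_pointer_updates(p0, events, dampen):
    """For each incoming pointer movement event, compute a dampened
    movement and add it to the current position, collecting the
    positions that would actually be displayed to the user."""
    positions = [p0]
    p = p0
    for v in events:
        vd = dampen(v)
        p = (p[0] + vd[0], p[1] + vd[1])
        positions.append(p)
    return positions

def halve_y(v):
    # toy dampening: halve the y-component (assumed orthogonal to the contour)
    return (v[0], 0.5 * v[1])

print(run_pointer_updates((0.0, 0.0), [(2.0, 2.0), (2.0, -2.0)], halve_y))
# -> [(0.0, 0.0), (2.0, 1.0), (4.0, 0.0)]
```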
  • the image may be processed based on the first position p 1 and the third position p 3 .
  • an image segmentation may be performed with the first position p 1 and the third position p 3 providing an initialization of the image segmentation.
  • FIG. 9 shows an example in which the processing of the image 156 comprises drawing a contour 168 in the image 156 between the first position p 1 and the third position p 3 .
  • the system 100 may be arranged for continuously receiving pointer movement data from the pointing device, e.g., in millisecond-based intervals, and then drawing the contour 168 in the image 156 based on a dampened pointer movement being calculated for each of the pointer movement data.
  • the position of the pointer 164 may be updated together with the image processing being performed based on the third position p 3 .
  • only the image processing may be performed, i.e., without displaying the pointer 164 at the third position p 3 .
  • This may be desirable, e.g., when the dampening is only needed for the image processing, or when it is desirable not to display the pointer during the image processing.
  • the pointer may not be shown, since the end of the contour, as it is being drawn, may already effectively serve as a pointer.
  • FIGS. 10 and 11 illustrate an operation of the system 100 in which the contour 160 is not explicitly determined by the system 100 .
  • the processor 120 may be arranged for establishing the third position p 3 by maximizing a similarity feature between image data 190 at the first position p 1 and further image data 192 , 194 located along the direction orthogonal 172 to the contour 160 , starting at the second position p 2 .
  • To calculate the similarity feature, various techniques from the field of image processing may be advantageously used.
  • a basic example is the calculation of a Sum of Absolute Differences (SAD) between pixels in a block 190 centered on the first position p 1 and corresponding pixels in blocks 192 , 194 along the direction orthogonal 172 to the contour 160 , from the second position p 2 .
  • a first SAD may be calculated between the block 190 at the first position p 1 and the block at the second position p 2 ; further SADs may be calculated between the block 190 and multiple blocks further along said direction.
  • FIG. 11 only shows the block at a position p 3 which maximizes the similarity feature, i.e., yields the lowest SAD.
  • said position is established as the third position p 3 .
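A minimal sketch of this block-matching step (function and parameter names are hypothetical; NumPy is assumed):

```python
import numpy as np

def best_position_by_sad(image, p1, candidates, half=4):
    """Among candidate positions along the direction orthogonal to the
    contour, return the one whose surrounding (2*half+1)-sized block is
    most similar to the block around p1, i.e. has the lowest Sum of
    Absolute Differences (SAD)."""
    def block(p):
        r, c = p
        return image[r - half:r + half + 1,
                     c - half:c + half + 1].astype(float)
    ref = block(p1)
    sads = [np.abs(ref - block(p)).sum() for p in candidates]
    return candidates[int(np.argmin(sads))]
```

The candidate with the lowest SAD maximizes the similarity feature and would be established as the third position p 3.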
  • the similarity feature may emphasize edges or contours in the image data.
  • the similarity feature may incorporate a distance-dependent penalty to avoid deviating too much from the second position p 2 .
  • the maximizing of the similarity feature may be constrained to a given distance from or neighborhood of the second position p 2 .
  • the distance-dependent penalty or the limitation to a given distance or neighborhood may constitute a dampening factor, i.e., determine a magnitude of the dampening, and thus have a similar functionality as the earlier mentioned dampening factor f as referred to in the description of FIG. 7 .
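One hedged way to realize the distance-dependent penalty (the penalty weight `lam` plays the role of the dampening factor; all names are hypothetical):

```python
import numpy as np

def best_position_with_penalty(sads, positions, p2, lam):
    """Combine each candidate's SAD with a penalty that grows with its
    distance from the second position p2; a larger `lam` means stronger
    dampening, i.e. the established position stays closer to p2."""
    p2 = np.asarray(p2, dtype=float)
    scores = [sad + lam * np.linalg.norm(np.asarray(p) - p2)
              for sad, p in zip(sads, positions)]
    return positions[int(np.argmin(scores))]
```

With lam = 0 this reduces to plain SAD minimization, while a large lam effectively constrains the third position to the neighborhood of p 2.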
  • the dampening may be based on a constant dampening factor or a user-selectable dampening factor.
  • the dampening may be adaptive. For example, the dampening of the pointer movement may be based on whether the pointer movement in the direction orthogonal to the contour is towards or away from the contour. Therefore, when the pointer movement is away from the contour, the dampening may be stronger, i.e., said pointer movement may be more reduced than when said pointer movement is towards the contour. This may allow the user to follow a contour directly and more easily, i.e., move the pointer on top of the contour.
  • Said dampening may also be reversed, i.e., the dampening may be stronger when the pointer movement is towards the contour. This may allow the user to stay clear of contours more easily.
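A sketch of such asymmetric dampening (the sign convention of the contour normal and the default factors are assumptions for illustration):

```python
import numpy as np

def adaptive_dampening(delta, normal, f_toward=0.8, f_away=0.2):
    """Dampen the orthogonal movement component asymmetrically.
    `normal` is assumed to point from the pointer towards the contour,
    so a positive orthogonal component means movement towards the
    contour. With f_away < f_toward, movement away from the contour is
    reduced more strongly, so the pointer tends to stay on the contour;
    swapping the two factors reverses the behavior."""
    n = np.asarray(normal, dtype=float)
    n /= np.linalg.norm(n)
    d = np.asarray(delta, dtype=float)
    orth = np.dot(d, n)                # signed orthogonal component
    f = f_toward if orth > 0 else f_away
    tangential = d - orth * n          # component along the contour
    return tangential + f * orth * n   # asymmetrically dampened movement
```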
  • the dampening may be based on anatomical data associated with the medical image.
  • the anatomical data may determine the dampening, e.g., the dampening factor f, based on an anatomical context of the first position p 1 . For example, in an area where contours are known to constitute contours of organs, the dampening may be stronger than in an area where contours are known to constitute texture edges of the organs.
  • the user may freely move the pointer within an organ, with the dampening only being provided around organ boundaries.
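As a hedged sketch, the anatomical context could be supplied as a label map with per-label dampening factors (the labels and factor values below are purely illustrative):

```python
def dampening_from_labels(label_map, p1, factors, default=1.0):
    """Return the dampening factor for the first position p1 from an
    anatomical label map; positions with no label get no dampening
    (factor 1.0, i.e. the orthogonal movement passes unchanged)."""
    return factors.get(label_map.get(p1), default)
```

For example, factors of {"organ_boundary": 0.2, "organ_texture": 1.0} would dampen strongly near organ boundaries while leaving movement inside the organ free.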
  • the dampening may be based on general metadata associated with an image, i.e., data that is neither anatomical data nor a medical image. For example, knowledge about the shape of an object in the image may be used to dampen the pointer movement while the user draws a line, i.e., a straight line or a curve, around the object.
  • the invention may be advantageously used in the medical domain, e.g., for interactive segmentation of organs at risk or target volumes in radiotherapy treatment planning, tumor and organ delineation in oncology diagnosis, bone delineation in X-ray images, etc.
  • the invention may also be used in non-medical image processing, such as manual photo editing using an image processing software package.
  • the invention also applies to computer programs, particularly computer programs on or in a carrier, adapted to put the invention into practice.
  • the program may be in the form of a source code, an object code, a code intermediate source and object code such as in a partially compiled form, or in any other form suitable for use in the implementation of the method according to the invention.
  • a program may have many different architectural designs.
  • a program code implementing the functionality of the method or system according to the invention may be sub-divided into one or more sub-routines. Many different ways of distributing the functionality among these sub-routines will be apparent to the skilled person.
  • the sub-routines may be stored together in one executable file to form a self-contained program.
  • Such an executable file may comprise computer-executable instructions, for example, processor instructions and/or interpreter instructions (e.g. Java interpreter instructions).
  • one or more or all of the sub-routines may be stored in at least one external library file and linked with a main program either statically or dynamically, e.g. at run-time.
  • the main program contains at least one call to at least one of the sub-routines.
  • the sub-routines may also comprise function calls to each other.
  • An embodiment relating to a computer program product comprises computer-executable instructions corresponding to each processing step of at least one of the methods set forth herein. These instructions may be sub-divided into sub-routines and/or stored in one or more files that may be linked statically or dynamically.
  • Another embodiment relating to a computer program product comprises computer-executable instructions corresponding to each means of at least one of the systems and/or products set forth herein. These instructions may be sub-divided into sub-routines and/or stored in one or more files that may be linked statically or dynamically.
  • the carrier of a computer program may be any entity or device capable of carrying the program.
  • the carrier may include a storage medium, such as a ROM, for example, a CD ROM or a semiconductor ROM, or a magnetic recording medium, for example, a hard disk.
  • the carrier may be a transmissible carrier such as an electric or optical signal, which may be conveyed via electric or optical cable or by radio or other means.
  • the carrier may be constituted by such a cable or other device or means.
  • the carrier may be an integrated circuit in which the program is embedded, the integrated circuit being adapted to perform, or to be used in the performance of, the relevant method.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Position Input By Displaying (AREA)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/380,735 US9542012B2 (en) 2012-02-24 2013-01-28 System and method for processing a pointer movement

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201261602756P 2012-02-24 2012-02-24
PCT/IB2013/050712 WO2013124751A1 (en) 2012-02-24 2013-01-28 System and method for processing a pointer movement
US14/380,735 US9542012B2 (en) 2012-02-24 2013-01-28 System and method for processing a pointer movement

Publications (2)

Publication Number Publication Date
US20150035753A1 US20150035753A1 (en) 2015-02-05
US9542012B2 true US9542012B2 (en) 2017-01-10

Family

ID=47901247

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/380,735 Active US9542012B2 (en) 2012-02-24 2013-01-28 System and method for processing a pointer movement

Country Status (6)

Country Link
US (1) US9542012B2 (en)
EP (1) EP2817695B1 (en)
JP (1) JP6133906B2 (ja)
CN (1) CN104145237B (zh)
BR (1) BR112014020960B1 (pt)
WO (1) WO2013124751A1 (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013124751A1 (en) 2012-02-24 2013-08-29 Koninklijke Philips N.V. System and method for processing a pointer movement
US9189860B2 (en) * 2013-04-22 2015-11-17 General Electric Company Real-time, interactive image analysis
US9652125B2 (en) 2015-06-18 2017-05-16 Apple Inc. Device, method, and graphical user interface for navigating media content
US9990113B2 (en) 2015-09-08 2018-06-05 Apple Inc. Devices, methods, and graphical user interfaces for moving a current focus using a touch-sensitive remote control
US9928029B2 (en) 2015-09-08 2018-03-27 Apple Inc. Device, method, and graphical user interface for providing audiovisual feedback
US10423293B2 (en) * 2015-11-25 2019-09-24 International Business Machines Corporation Controlling cursor motion
US11922006B2 (en) 2018-06-03 2024-03-05 Apple Inc. Media control for screensavers on an electronic device
US11182073B2 (en) 2018-11-28 2021-11-23 International Business Machines Corporation Selection on user interface based on cursor gestures

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0714057A1 (en) 1994-11-23 1996-05-29 Computervision Corporation Method and apparatus for displaying a cursor along a two dimensional representation of a computer generated three dimensional surface
WO2005048193A1 (en) 2003-11-13 2005-05-26 Koninklijke Philips Electronics N.V. Three-dimensional segmentation using deformable surfaces
US20080139895A1 (en) 2006-10-13 2008-06-12 Siemens Medical Solutions Usa, Inc. System and Method for Selection of Points of Interest During Quantitative Analysis Using a Touch Screen Display
WO2010038172A1 (en) 2008-10-01 2010-04-08 Koninklijke Philips Electronics N.V. Selection of snapshots of a medical image sequence
US8013837B1 (en) * 2005-10-11 2011-09-06 James Ernest Schroeder Process and apparatus for providing a one-dimensional computer input interface allowing movement in one or two directions to conduct pointer operations usually performed with a mouse and character input usually performed with a keyboard
EP2385451A1 (en) 2010-05-07 2011-11-09 Samsung Electronics Co., Ltd. Method for providing gui using pointer having visual effect showing that pointer is moved by gravity and electronic apparatus thereof
US20150035753A1 (en) 2012-02-24 2015-02-05 Koninklijke Philips N.V. System and method for processing a pointer movement

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02171821A (ja) * 1988-12-23 1990-07-03 Shimadzu Corp Coordinate input device
JPH05143238A (ja) * 1991-11-25 1993-06-11 Fujitsu Ltd Pointing cursor movement control device
JPH10295679A (ja) * 1997-04-23 1998-11-10 Fuji Photo Film Co Ltd Method and apparatus for recognizing an irradiation field in a radiation image
JP2004181240A (ja) * 2002-12-03 2004-07-02 Koninkl Philips Electronics Nv System and method for generating a boundary of an object imaged by ultrasound imaging
US7383517B2 * 2004-04-21 2008-06-03 Microsoft Corporation System and method for acquiring a target with intelligent pointer movement
KR20140102762A (ko) * 2007-10-05 2014-08-22 GVBB Holdings S.A.R.L. Pointer control device
JP5380079B2 (ja) * 2009-01-06 2014-01-08 Toshiba Corp Ultrasound therapy support apparatus and ultrasound therapy support program
JP2011048423A (ja) * 2009-08-25 2011-03-10 Sony Corp Information processing method, information processing apparatus, and program

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Barrett, W.A. et al. "Interactive Live-Wire Boundary Extraction", Medical Image Analysis, 1(4):331-341, 1997.
Elliott, P.J. et al. "Interactive image segmentation for radiation treatment planning". IBM Systems Journal, vol. 31, No. 4(1992).
Mortensen, E.N. et al. "Interactive segmentation with Intelligent Scissors". Graphical Models and Image Processing, vol. 60, Issue 5, Sep. 1998, pp. 349-384.

Also Published As

Publication number Publication date
EP2817695B1 (en) 2018-03-14
JP2015509628A (ja) 2015-03-30
US20150035753A1 (en) 2015-02-05
BR112014020960A2 (pt) 2017-07-04
CN104145237A (zh) 2014-11-12
WO2013124751A1 (en) 2013-08-29
BR112014020960B1 (pt) 2022-04-19
EP2817695A1 (en) 2014-12-31
JP6133906B2 (ja) 2017-05-24
CN104145237B (zh) 2017-10-13

Similar Documents

Publication Publication Date Title
US9542012B2 (en) System and method for processing a pointer movement
EP3210163B1 (en) Gaze-tracking driven region of interest segmentation
US9980692B2 (en) System and method for interactive annotation of an image using marker placement command with algorithm determining match degrees
US20140143716A1 (en) System and method for processing a medical image
US9678644B2 (en) Displaying a plurality of registered images
EP1999717B1 (en) Systems and methods for interactive definition of regions and volumes of interest
US10297089B2 (en) Visualizing volumetric image of anatomical structure
US11099724B2 (en) Context sensitive magnifying glass
US20160041733A1 (en) Enabling a user to study image data
US10282917B2 (en) Interactive mesh editing
EP2786345B1 (en) Image processing apparatus.
US20130332868A1 (en) Facilitating user-interactive navigation of medical image data

Legal Events

Date Code Title Description
AS Assignment

Owner name: KONINKLIJKE PHILIPS N.V., NETHERLANDS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BYSTROV, DANIEL;WIEMKER, RAFAEL;REEL/FRAME:033598/0613

Effective date: 20130215

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4