WO2004029786A1 - Control of robotic manipulation - Google Patents

Control of robotic manipulation

Info

Publication number
WO2004029786A1
WO2004029786A1 (PCT/GB2003/004077, GB0304077W)
Authority
WO
WIPO (PCT)
Prior art keywords
eye
motion
user
fixation point
manipulator
Prior art date
Application number
PCT/GB2003/004077
Other languages
French (fr)
Other versions
WO2004029786A8 (en)
Inventor
Guang Zhong Yang
Ara Darzi
Original Assignee
Imperial College Innovations Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Imperial College Innovations Limited
Priority to EP03748296A priority Critical patent/EP1550025A1/en
Priority to AU2003267604A priority patent/AU2003267604A1/en
Priority to US10/529,023 priority patent/US20060100642A1/en
Publication of WO2004029786A1 publication Critical patent/WO2004029786A1/en
Publication of WO2004029786A8 publication Critical patent/WO2004029786A8/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70Manipulators specially adapted for use in surgery
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00681Aspects not otherwise provided for
    • A61B2017/00694Aspects not otherwise provided for with means correcting for movement of or for synchronisation with the body
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361Image-producing devices, e.g. surgical cameras


Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Robotics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Manipulator (AREA)

Abstract

In a remote controlled robotic manipulator (20) a motion sensor (26) senses motion of a region of an object to be manipulated. A controller (50) locks motion of the robotic manipulator (20) relative to the region of the object and also selects the region of the object to be sensed. As a result the frame of reference of the manipulator is locked to the relevant region of the object to be manipulated, improving ease of control and manipulation.

Description

Control of Robotic Manipulation
The invention relates to control of robotic manipulation; in particular motion compensation in robotic manipulation. The invention further relates to the use of stereo images.
Robotic manipulation is known in a range of fields. Typical systems include a robotic manipulator such as a robotic arm which is remote controlled by a user. For example the robotic arm may be configured to mirror the actions of the human hand. In that case a human controller may have sensors monitoring actions of the controller's hand. Those sensors provide signals allowing the robotic arm to be controlled in the same manner. Robotic manipulation is useful in a range of applications, for example in confined or miniaturised/microscopic applications.
One known application of robotic manipulation is in medical procedures such as surgery. In robotic surgery a robotic arm carries a medical instrument. A camera is mounted on or close to the arm and the arm is controlled remotely by a medical practitioner who can view the operation via the camera. As a result keyhole surgery and microsurgery can be achieved with great precision. A problem found particularly in medical procedures, but also in other applications, arises when it is required to operate on a moving object or moving surface such as a beating heart. One known solution in medical procedures is to hold the relevant surface stationary. In the case of heart surgery it is known to stop the heart altogether and rely on other life support means while the operation is taking place. Alternatively the surface can be stabilised by using additional members to hold it stationary. Both techniques are complex and difficult, and they increase the stress on the patient. One proposed solution is set out in US5971976, in which a position controller is also included. The medical instrument is mounted on a robotic arm and remotely controlled by a surgeon. The surface of the heart to be operated on is mechanically stabilised and the stabiliser also includes inertial or other position/movement sensors to detect any residual movement of the surface. A motion controller controls the robotic arm or instrument to track the residual movement of the surface such that the distance between them remains constant and the surgeon effectively operates on a stationary surface. A problem with this system is that the arm and instrument are motion locked to a specific point or zone on the heart defined by the mechanical stabiliser, but there is no way of locking them to other areas. As a result, if the surgeon needs to operate on another region of the surface then the residual motion will no longer be compensated, and can indeed be amplified if the arm is tracking another region of the surface, bearing in mind the complex surface movement of the heart.
The invention is set out in the appended claims. Because the motion sensor can sense motion of a range of points, the controller can determine the part of the object to be tracked. Eye tracking relative to a stereo image allows the depth of a fixation point to be determined.
Embodiments of the invention will now be described, by way of example, with reference to the drawings of which:
Fig. 1 is a schematic view of a known robotic manipulator;
Fig. 2 shows the components of an eye tracking system;
Fig. 3 shows a robotic manipulator according to the invention;
Fig. 4 shows a schematic view of a stereo image display; and
Fig. 5 shows the use of stereo images in depth determination.
Referring to Fig. 1, a typical arrangement for performing robotic surgery is shown, designated generally 10. A robotic manipulator 20 includes an articulated arm 22 carrying a medical instrument 24 as well as the cameras 26. The arm is mounted on a controller 28. A surgical station designated generally 40 includes binocular vision eye pieces 42, through which the surgeon can view a stereo image generated by cameras 26, and control gauntlets 44. The surgeon inserts his hands into the control gauntlets and controls a remote analogue of the robotic manipulator 20 based on the visual feedback from eyepiece 42. Interface between the robotic manipulator 20 and surgical station 40 is via an appropriate computer processor 50 which can be of any appropriate type, for example a PC or laptop. The processor 50 conveys the images from camera 26 to the surgical station 40 and returns control signals from the robotic arm analogue controlled by the surgeon via gauntlets 44. As a result a fully fed back surgical system is provided. Such a system is available under the trademark Da Vinci Surgical Systems from Intuitive Surgical, Inc of Sunnyvale, California, USA, or Zeus Robotic Surgical Systems from Computer Motion, Inc of Goleta, California, USA. In use the surgical instrument operates on the patient and the only incision required is sufficient to allow camera vision and movement of the instrument itself, as a result of which minimal stress to the patient is introduced. Furthermore, using appropriate magnification/reduction techniques, microsurgery can very easily take place.
As discussed above, it is known to add motion compensation to a system such as this whereby motion sensors on the surface send a movement signal which is tracked by the robotic arm such that the surface and arm are stationary relative to one another. In overview, the present invention further incorporates an eye tracking capability at the surgical station 40 identifying which part of the surface the surgeon is fixating on and ensuring that the robotic arm tracks that particular point, the motion of which may vary relative to other points because of the complex motion of the heart's surface. As a result the invention achieves dynamic reference frame locking.
Referring to Fig. 2, an appropriate eye tracking arrangement is shown schematically. The user 60 views an image 62 on a display 63. An eye-tracking device 70 includes one or more light projectors 71 and a light detector 72. In practice the light projectors may be infra-red (IR) LEDs and the detector may be an IR camera. The LEDs project light 73 onto the eye of the user 60 and the angle of gaze of the eye can be derived using known techniques by detecting the light 74 reflected onto the camera. Any appropriate eye tracking system may in practice be used, for example an ASL model 504 remote eye-tracking system (Applied Science Laboratories, MA, USA). This embodiment may be particularly applicable when a single camera is provided on the articulated arm 22 of a robotic manipulator and thus a single image is presented to the user. The gaze of the user is used to determine the fixation point of the user on the image 62. It will be appreciated that a calibration stage may be incorporated on initialisation of any eye-tracking system to accommodate differences between users' eyes or vision. The nature of any such calibration stage will be well known to the skilled reader.
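As an illustration of how such a gaze mapping might be realised in software, the following minimal Python sketch fits a polynomial mapping from pupil-to-glint vectors (as measured by the IR camera) to positions on the image 62. The function names, the choice of a second-order polynomial and the least-squares fit are assumptions made for illustration, not part of the described system.

```python
import numpy as np

def fit_gaze_mapping(pupil_glint_vectors, screen_points):
    """Fit a second-order polynomial mapping from pupil-to-glint vectors
    (from the IR camera) to screen coordinates, using samples gathered
    while the user fixates known points (illustrative assumption)."""
    vx, vy = np.asarray(pupil_glint_vectors, dtype=float).T
    # Design matrix of polynomial terms: 1, vx, vy, vx*vy, vx^2, vy^2
    X = np.column_stack([np.ones_like(vx), vx, vy, vx * vy, vx**2, vy**2])
    coeffs, *_ = np.linalg.lstsq(X, np.asarray(screen_points, dtype=float), rcond=None)
    return coeffs  # shape (6, 2): one column per screen coordinate

def gaze_point(coeffs, pupil_glint_vector):
    """Map a single pupil-to-glint vector to an (x, y) point on the image 62."""
    vx, vy = pupil_glint_vector
    terms = np.array([1.0, vx, vy, vx * vy, vx**2, vy**2])
    return terms @ coeffs
```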
Referring now to Fig. 3, the robotic arm and tracking system are shown in more detail.
An object 80 is operated on by a robotic manipulator designated generally 82. The manipulator 82 includes three robotic arms 84, 86, 88 articulated in any appropriate manner and carrying appropriate operating instruments. Arm 84 and arm 86 each support a camera 90a, 90b, displaced from one another sufficiently to provide stereo imaging according to known techniques. Since the relative positions of the three arms are known, the position of the cameras in 3D space is also known.
In use the system allows motion compensation to be directed to the point on which the surgeon is fixating (i.e. the point he is looking at, at a given moment). Identifying the fixation point can be achieved using known techniques which will generally be built into an appropriate eye-tracking device, for example the product discussed above. In the preferred embodiment the cameras are used to detect the motion of the fixation point and send the information back to the processor for control of the motion of the robotic arm.
In particular, once the fixation point position is identified at any one moment on the image viewed by the human operator, and given that the positions of the stereo cameras 90a and 90b are known, the position of the point on the object 80 can be identified. Alternatively, by determining the respective direction of gaze of each eye, this can be replicated at the stereo camera to focus on the relevant point. The motion of that point is then determined by stereo vision. In particular, referring to Fig. 5 it will be seen that the position of a point can be determined by measuring the disparity in the view taken by each camera 90a, 90b. For example, for a relatively distant object 100 on a plane 102 the cameras take respective images A1, B1 defining a distance X1. A more distant object 104 creates images A2, B2 in which the distance between the objects as shown in the respective images is X2. There is an inverse relationship between the distance and the depth of the point. As a result the relative position of the point to the camera can be determined. In particular, the computer 50 calculates the position in the image plane of the co-ordinates in the real world (so-called "world coordinates"). This may be done as follows:
A 3D point M = [x, y, z]^T is projected to a 2D image point m = [x, y]^T through a 3x4 projection matrix P, such that s m = P M, where s is a non-zero scale factor and m = [x, y, 1]^T and M = [x, y, z, 1]^T are expressed in homogeneous coordinates. In binocular stereo systems, each physical point M in 3D space is projected to m1 and m2 in the two image planes, i.e.:

    s1 m1 = P1 M
    s2 m2 = P2 M                               (1)

If we assume that the world coordinate system is associated with the first camera, we have:

    P1 = [A | 0]
    P2 = [A'R | A't]                           (2)

where R and t represent the 3x3 rotation matrix and the 3x1 translation vector defining the rigid displacement between the two cameras. The matrices A and A' are the 3x3 intrinsic parameter matrices of the two cameras. In general, when the two cameras have the same parameter settings, with square pixels (aspect ratio = 1) and the angle θ between the two image coordinate axes being π/2, we have:

        | f   0   u0 |
    A = | 0   f   v0 |                         (3)
        | 0   0   1  |

where (u0, v0) are the coordinates of the image principal point, i.e. the point where points located at infinity in world coordinates are projected. More generally, matrix A can have the form:

        | fu  0   u0 |
    A = | 0   fv  v0 |                         (4)
        | 0   0   1  |

where fu and fv correspond to the focal distance in pixels along the axes of the image. All parameters of A can be computed through classical calibration methods (e.g. as described in the book by O. Faugeras, "Three-Dimensional Computer Vision: a Geometric Viewpoint", MIT Press, Cambridge, MA, 1993).
Known techniques for determining the depth are for example as follows. Firstly, the apparatus is calibrated for a given user. The user looks at predetermined points on a displayed image and the eye tracking device tracks the eye(s) of the user as they look at each predetermined point. This sets the user's gaze within a reference frame (generally two-dimensional if one image is displayed and three-dimensional if stereo images are displayed). In use, the user's gaze on the image(s) is tracked and thus the gaze of the user within this reference frame is determined. The robotic arms 84, 86 then move the cameras 90a, 90b to focus on the determined fixation point.
For instance, consider Figure 2 again which shows a user 60, an image 62 on a display 63 and an eye tracking device 70. In use, the tracking device 70 is first calibrated for the user. This involves the computer 50 displaying on the display a number of pre-determined calibration points, indicated by 92. A user is instructed to focus on each of these in turn (for instance, the computer 50 may cause each calibration point to be displayed in turn). As the user stares at a calibration point, the eye tracking device 70 tracks the gaze of the user. The computer then correlates the position of the calibration point with the position of the user's eye. Once all the calibration points have been displayed to a user and the corresponding eye position recorded, the system has been calibrated to the user.
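A minimal sketch of this calibration loop might look as follows. The display and tracker objects are hypothetical stand-ins for the display 63 and the eye-tracking device 70, and the fitting step reuses the gaze-mapping sketch given earlier; none of these interfaces are defined by the source.

```python
def calibrate(display, tracker, calibration_points):
    """Show each pre-determined calibration point 92 in turn and record the
    eye-tracker measurement while the user stares at it (hypothetical APIs)."""
    samples, targets = [], []
    for point in calibration_points:        # e.g. a grid of screen positions
        display.show_marker(point)          # hypothetical display interface
        samples.append(tracker.read())      # hypothetical tracker measurement
        targets.append(point)
    display.clear()
    # Fit the measurement-to-screen mapping as sketched earlier.
    return fit_gaze_mapping(samples, targets)
```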
Subsequently a user's gaze can be correlated to the part of the image being looked at by the user. For each eye, the coordinates [xl, yl] and [xr, yr] are known from each eye tracker, from which [x, y, z]^T can be calculated from Equations (1)-(4).
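Given the projection matrices P1 and P2 of equation (2), the 3D fixation point can be recovered from the two gaze coordinates by linear triangulation. The following sketch shows one standard way of doing so (a least-squares solution via SVD); it is illustrative only and not a method prescribed by the description.

```python
import numpy as np

def triangulate(P1, P2, m_left, m_right):
    """Recover the 3D point [x, y, z] from its left and right image
    coordinates [xl, yl] and [xr, yr] via linear triangulation."""
    xl, yl = m_left
    xr, yr = m_right
    # Each image point contributes two linear constraints on the
    # homogeneous world point M.
    A = np.vstack([xl * P1[2] - P1[0],
                   yl * P1[2] - P1[1],
                   xr * P2[2] - P2[0],
                   yr * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)
    M = Vt[-1]                 # null-space (least-squares) solution
    return M[:3] / M[3]        # de-homogenise to [x, y, z]
```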
By carrying out this step across time the motion of the point fixated on by the human operator can be tracked and the camera and arm moved by any appropriate means to maintain a constant distance from the fixation point. This can either be done by monitoring the absolute position of the two points and keeping it constant or by some form of feedback control such as using PID control. Once again the relevant techniques will be well known to the skilled person.
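As an illustration of the feedback-control alternative, a minimal PID loop that drives the instrument-to-fixation-point distance towards a constant setpoint might be sketched as follows; the class name, gains and setpoint are purely illustrative assumptions.

```python
class DistancePID:
    """Minimal PID loop keeping the instrument at a constant distance from
    the tracked fixation point; gains and setpoint are illustrative only."""

    def __init__(self, kp=1.0, ki=0.1, kd=0.05, setpoint=0.05):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint        # desired instrument-to-point distance (m)
        self.integral = 0.0
        self.prev_error = None

    def update(self, measured_distance, dt):
        """Return a velocity command: positive moves the arm towards the
        surface, negative retracts it."""
        error = measured_distance - self.setpoint
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```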
It will be further recognised that the cameras can be focussed or directed towards the fixation point determined by eye-tracking, simply by providing appropriate direction means on or in relation to the robotic arm. As a result the tracked point can be moved to centre screen if desired. In the preferred embodiment the surgical station provides a stereo image via binocular eyepiece 42 to the surgeon, where the required offset left and right images are provided by the respective cameras mounted on the robotic arm.
According to a further aspect of the invention enhanced eye tracking in relation to stereo images is provided. Referring to Fig. 4, a further embodiment of the invention is shown. The system requires left and right images slightly offset to provide, when appropriately combined, a stereo image, as is well known to the skilled reader. Images of a subject being viewed are displayed on displays 200a, 200b. These displays are typically LCD displays. A user views the images on the displays 200a, 200b through individual eye pieces 202a, 202b via intermediate optics including mirrors 204a, b, c and any appropriate lenses (any other appropriate optics can of course be used).
Eye tracking devices are provided for each individual eye piece. The eye-tracking device includes light projectors 206 and light detectors 208a, 208b. In a preferred implementation, the light projectors are IR LEDs and the light detector comprises an IR camera for each eye. An IR filter may be provided in front of the IR camera. The images (indicated in Figure 4 by the numerals 210a, 210b) captured by the light detectors 208a, 208b show the position of the pupils of each eye of the user and also the Purkinje reflections of the light sources 206.
The angle of gaze of the eye can be derived using known techniques by detecting the reflected light.
In a preferred, known implementation, Purkinje images are formed by light reflected from surfaces in the eye. The first reflection takes place at the anterior surface of the cornea while the fourth occurs at the posterior surface of the lens of the eye. Both the first and fourth Purkinje images lie in approximately the same plane in the pupil of the eye and, since eye rotation alters the angle of the IR beam from the IR projectors 206 with respect to the optical axis of the eye, and eye translations move both images by the same amount, eye movement can be obtained from the spatial position and distance between the two Purkinje reflections. This technique is commonly known as the Dual-Purkinje Image (DPI) technique. DPI also allows for the calculation of a user's accommodation of focus, i.e. how far away the user is looking. Another eye tracking technique subtracts the Purkinje reflections from the nasal side of the pupil and the temporal side of the pupil and uses the difference to determine the eye position signal. Any appropriate eye tracking system may in practice be used, for example an ASL model 504 remote eye-tracking system (Applied Science Laboratories, MA, USA).
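A highly simplified sketch of the signal decomposition behind the DPI idea is given below: the common displacement of the two reflections is taken as the translation component, and their separation vector, scaled by an illustrative calibration gain, as the rotation (gaze) signal. The function and its gain are assumptions for illustration, not the processing performed by any particular commercial tracker.

```python
import numpy as np

def dpi_eye_signals(p1_reflection, p4_reflection, gain=1.0):
    """Dual-Purkinje-style decomposition: eye translation moves the first and
    fourth Purkinje images together, whereas eye rotation changes their
    separation, so the separation vector serves as the rotation estimate."""
    p1 = np.asarray(p1_reflection, dtype=float)   # first Purkinje image (x, y)
    p4 = np.asarray(p4_reflection, dtype=float)   # fourth Purkinje image (x, y)
    translation = (p1 + p4) / 2.0                 # common displacement component
    rotation_signal = gain * (p1 - p4)            # separation scaled by a gain
    return rotation_signal, translation
```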
By tracking the individual motion of each eye and identifying the fixation point F on the left and right images 200a, 200b, not only can the position of the fixation point in the X-Y plane (the plane of the images) be identified, but also the depth into the image, in the Z direction.
Once the eye position signal is determined, the computer 50 uses this signal to determine where, in the reference field, the user is looking and calculates the corresponding position on the subject being viewed. Once this position is determined, the computer signals the robotic manipulator 82 to move the arms 84 and/or 86 which support the cameras 90a and 90b to focus on the part of the subject determined from the eye-tracking device, allowing the motion sensor to track movement of that part and hence lock the frame of reference to it.
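Putting the pieces together, one iteration of this reference-frame-locking loop might be sketched as follows. Every interface (the tracker, computer and manipulator objects and their methods) is a hypothetical stand-in, and the triangulation and PID helpers are those sketched earlier.

```python
def frame_lock_loop(tracker_left, tracker_right, computer, manipulator, pid, dt=0.02):
    """One iteration of dynamic reference-frame locking: gaze -> 3D fixation
    point -> re-aim the cameras/arms and hold a constant distance.
    All object interfaces here are hypothetical stand-ins."""
    # 1. Where on each displayed image is the user looking?
    m_left = computer.gaze_on_image(tracker_left.read())
    m_right = computer.gaze_on_image(tracker_right.read())
    # 2. Corresponding 3D point on the subject, by stereo triangulation with
    #    the projection matrices of the cameras 90a, 90b.
    fixation = triangulate(computer.P1, computer.P2, m_left, m_right)
    # 3. Aim the cameras at the fixation point and servo the arm so the
    #    instrument keeps a constant distance from the (moving) point.
    manipulator.aim_cameras_at(fixation)
    correction = pid.update(manipulator.distance_to(fixation), dt)
    manipulator.advance_along_view_axis(correction)
```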
Although the invention has been described with reference to eye tracking devices that use reflected light, other forms of eye tracking may be used, e.g. measuring the electric potential of the skin around the eye(s) or applying a special contact lens and tracking its position.
It will be appreciated that the embodiments above and elements thereof can be combined or interchanged as appropriate. Although specific discussion is made of the application of the invention to surgery, it will be recognised that the invention can be equally applied in many other areas where robotic manipulation or stereo imaging is required. Although stereo vision is described, monocular vision can also be applied. Also other appropriate means of motion sensing can be adopted, for instance by casting structured light onto the object and observing changes as the object moves, or by using laser range finding. These examples are not intended to be limiting.

Claims

Claims
1. A remote controlled robotic manipulator for manipulating a moving object comprising a motion sensor for sensing motion of a region of an object to be manipulated, and a controller for locking motion of the robotic manipulator relative to the region of the object based on the sensed motion, wherein the controller further controls for which region of the object the motion sensor senses motion.
2. A manipulator as claimed in claim 1 in which the motion sensor is controllable by a human user.
3. A manipulator as claimed in claim 2 in which the motion sensor is controllable by tracking the visual fixation point of the user.
4. A manipulator as claimed in claim 3 in which the user views a remote representation of the object.
5. A method of identifying a visual fixation point of a user observing a stereo image formed by visually superposing mono images comprising the steps of presenting one mono image to each user eye to form the stereo image and tracking the fixation point of each eye.
6. A method as claimed in claim 5 in which the three dimensional position of the visual fixation point is determined.
7. An apparatus for identifying a fixation point in a stereo image comprising first and second displays for displaying mono images, a stereo image presentation module for visually super-posing the mono images to form the stereo image and an eye tracker for tracking the fixation point of each eye.
PCT/GB2003/004077 2002-09-25 2003-09-25 Control of robotic manipulation WO2004029786A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP03748296A EP1550025A1 (en) 2002-09-25 2003-09-25 Control of robotic manipulation
AU2003267604A AU2003267604A1 (en) 2002-09-25 2003-09-25 Control of robotic manipulation
US10/529,023 US20060100642A1 (en) 2002-09-25 2003-09-25 Control of robotic manipulation

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB0222265.1 2002-09-25
GBGB0222265.1A GB0222265D0 (en) 2002-09-25 2002-09-25 Control of robotic manipulation

Publications (2)

Publication Number Publication Date
WO2004029786A1 true WO2004029786A1 (en) 2004-04-08
WO2004029786A8 WO2004029786A8 (en) 2004-06-03

Family

ID=9944753

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2003/004077 WO2004029786A1 (en) 2002-09-25 2003-09-25 Control of robotic manipulation

Country Status (5)

Country Link
US (1) US20060100642A1 (en)
EP (1) EP1550025A1 (en)
AU (1) AU2003267604A1 (en)
GB (1) GB0222265D0 (en)
WO (1) WO2004029786A1 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6721356B1 (en) 2000-01-03 2004-04-13 Advanced Micro Devices, Inc. Method and apparatus for buffering data samples in a software based ADSL modem
US7907166B2 (en) 2005-12-30 2011-03-15 Intuitive Surgical Operations, Inc. Stereo telestration for robotic surgery
ITMI20100579A1 (en) * 2010-04-07 2011-10-08 Sofar Spa ROBOTIC SURGERY SYSTEM WITH PERFECT CONTROL.
US8830224B2 (en) 2008-12-31 2014-09-09 Intuitive Surgical Operations, Inc. Efficient 3-D telestration for local robotic proctoring
EP2781200A1 (en) * 2005-09-30 2014-09-24 Restoration Robotics, Inc. Automated systems and methods for harvesting and implanting follicular units
AU2012227252B2 (en) * 2011-09-21 2014-09-25 Digital Surgicals Pte, Ltd. Surgical Stereo Vision Systems And Methods For Microsurgery
ITMI20130702A1 (en) * 2013-04-30 2014-10-31 Sofar Spa ROBOTIC SURGERY SYSTEM WITH PERFECT CONTROL
US8971597B2 (en) 2005-05-16 2015-03-03 Intuitive Surgical Operations, Inc. Efficient vision and kinematic data fusion for robotic surgical instruments and other applications
US9155592B2 (en) 2009-06-16 2015-10-13 Intuitive Surgical Operations, Inc. Virtual measurement tool for minimally invasive surgery
US9492240B2 (en) 2009-06-16 2016-11-15 Intuitive Surgical Operations, Inc. Virtual measurement tool for minimally invasive surgery
US9766441B2 (en) 2011-09-22 2017-09-19 Digital Surgicals Pte. Ltd. Surgical stereo vision systems and methods for microsurgery
AU2014231345B2 (en) * 2013-03-15 2019-01-17 Synaptive Medical Inc. Intelligent positioning system and methods therefore
WO2019080358A1 (en) * 2017-10-28 2019-05-02 深圳市前海安测信息技术有限公司 Robot for surgical navigation using 3d images and control method thereof

Families Citing this family (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8888688B2 (en) 2000-04-03 2014-11-18 Intuitive Surgical Operations, Inc. Connector device for a controllable instrument
US8517923B2 (en) 2000-04-03 2013-08-27 Intuitive Surgical Operations, Inc. Apparatus and methods for facilitating treatment of tissue via improved delivery of energy based and non-energy based modalities
US6610007B2 (en) 2000-04-03 2003-08-26 Neoguide Systems, Inc. Steerable segmented endoscope and method of insertion
US6468203B2 (en) 2000-04-03 2002-10-22 Neoguide Systems, Inc. Steerable endoscope and improved method of insertion
EP1531749A2 (en) 2002-08-13 2005-05-25 Microbotics Corporation Microsurgical robot system
US7626569B2 (en) * 2004-10-25 2009-12-01 Graphics Properties Holdings, Inc. Movable audio/video communication interface system
US8079950B2 (en) 2005-09-29 2011-12-20 Intuitive Surgical Operations, Inc. Autofocus and/or autoscaling in telesurgery
US20070106147A1 (en) * 2005-11-01 2007-05-10 Altmann Andres C Controlling direction of ultrasound imaging catheter
FR2917598B1 (en) * 2007-06-19 2010-04-02 Medtech MULTI-APPLICATIVE ROBOTIC PLATFORM FOR NEUROSURGERY AND METHOD OF RECALING
EP2108328B2 (en) * 2008-04-09 2020-08-26 Brainlab AG Image-based control method for medicinal devices
NO332220B1 (en) * 2008-07-02 2012-07-30 Prezioso Linjebygg As Apparatus for surgical zone surgery
KR100998182B1 (en) * 2008-08-21 2010-12-03 (주)미래컴퍼니 3D display system of surgical robot and control method thereof
US8698898B2 (en) * 2008-12-11 2014-04-15 Lucasfilm Entertainment Company Ltd. Controlling robotic motion of camera
DE102009010263B4 (en) * 2009-02-24 2011-01-20 Reiner Kunz Method for navigating an endoscopic instrument during technical endoscopy and associated device
FR2963693B1 (en) 2010-08-04 2013-05-03 Medtech PROCESS FOR AUTOMATED ACQUISITION AND ASSISTED ANATOMICAL SURFACES
EP2774380B1 (en) 2011-11-02 2019-05-22 Intuitive Surgical Operations, Inc. Method and system for stereo gaze tracking
FR2983059B1 (en) 2011-11-30 2014-11-28 Medtech ROBOTIC-ASSISTED METHOD OF POSITIONING A SURGICAL INSTRUMENT IN RELATION TO THE BODY OF A PATIENT AND DEVICE FOR CARRYING OUT SAID METHOD
JP6251962B2 (en) * 2012-03-01 2017-12-27 日産自動車株式会社 Camera apparatus and image processing method
JP6251963B2 (en) 2012-03-01 2017-12-27 日産自動車株式会社 Camera apparatus and image processing method
US20140024889A1 (en) * 2012-07-17 2014-01-23 Wilkes University Gaze Contingent Control System for a Robotic Laparoscope Holder
US9265458B2 (en) 2012-12-04 2016-02-23 Sync-Think, Inc. Application of smooth pursuit cognitive testing paradigms to clinical drug development
US9993273B2 (en) 2013-01-16 2018-06-12 Mako Surgical Corp. Bone plate and tracking device using a bone plate for attaching to a patient's anatomy
US9566120B2 (en) * 2013-01-16 2017-02-14 Stryker Corporation Navigation systems and methods for indicating and reducing line-of-sight errors
US9380976B2 (en) 2013-03-11 2016-07-05 Sync-Think, Inc. Optical neuroinformatics
WO2015023513A1 (en) * 2013-08-14 2015-02-19 Intuitive Surgical Operations, Inc. Endoscope control system
WO2015046081A1 (en) * 2013-09-24 2015-04-02 ソニー・オリンパスメディカルソリューションズ株式会社 Medical robot arm device, medical robot arm control system, medical robot arm control method, and program
JP6644699B2 (en) 2014-03-19 2020-02-12 インテュイティブ サージカル オペレーションズ, インコーポレイテッド Medical devices, systems and methods using gaze tracking
JP6689203B2 (en) * 2014-03-19 2020-04-28 インテュイティブ サージカル オペレーションズ, インコーポレイテッド Medical system integrating eye tracking for stereo viewer
US10179407B2 (en) * 2014-11-16 2019-01-15 Robologics Ltd. Dynamic multi-sensor and multi-robot interface system
WO2016181696A1 (en) * 2015-05-14 2016-11-17 ソニー・オリンパスメディカルソリューションズ株式会社 Microscope device for surgical use, and microscope system for surgical use
US10537395B2 (en) 2016-05-26 2020-01-21 MAKO Surgical Group Navigation tracker with kinematic connector assembly
US11612446B2 (en) * 2016-06-03 2023-03-28 Covidien Lp Systems, methods, and computer-readable program products for controlling a robotically delivered manipulator
CA3023266A1 (en) * 2016-06-03 2017-12-07 Covidien Lp Systems, methods, and computer-readable storage media for controlling aspects of a robotic surgical device and viewer adaptive stereoscopic display
EP3975909B1 (en) * 2019-05-29 2024-01-10 Intuitive Surgical Operations, Inc. Operating mode control systems and methods for a computer-assisted surgical system
US12114955B2 (en) * 2019-07-16 2024-10-15 Asensus Surgical Us, Inc. Dynamic scaling of surgical manipulator motion based on surgeon stress parameters
WO2021133186A1 (en) * 2019-12-23 2021-07-01 федеральное государственное автономное образовательное учреждение высшего образования "Московский физико-технический институт (национальный исследовательский университет)" Method for controlling robotic manipulator
JP2024514432A (en) * 2021-03-29 2024-04-02 アルコン インコーポレイティド Stereoscopic Imaging Platform with Continuous Autofocus Mode - Patent application

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1995001757A1 (en) * 1993-07-07 1995-01-19 Cornelius Borst Robotic system for close inspection and remote treatment of moving parts
JPH11155152A (en) * 1997-11-21 1999-06-08 Canon Inc Method and system for three-dimensional shape information input, and image input device thereof
EP1056049A2 (en) * 1999-05-27 2000-11-29 United Bristol Healthcare NHS Trust Method and apparatus for displaying volumetric data
US6368332B1 (en) * 1999-03-08 2002-04-09 Septimiu Edmund Salcudean Motion tracking platform for relative motion cancellation for surgery

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3462604A (en) * 1967-08-23 1969-08-19 Honeywell Inc Control apparatus sensitive to eye movement
US4028725A (en) * 1976-04-21 1977-06-07 Grumman Aerospace Corporation High-resolution vision system
US4348186A (en) * 1979-12-17 1982-09-07 The United States Of America As Represented By The Secretary Of The Navy Pilot helmet mounted CIG display with eye coupled area of interest
US5626595A (en) * 1992-02-14 1997-05-06 Automated Medical Instruments, Inc. Automated surgical instrument
WO1994026167A1 (en) * 1993-05-14 1994-11-24 Sri International Remote center positioner
DE69417824T4 (en) * 1993-08-26 2000-06-29 Matsushita Electric Industrial Co., Ltd. Stereoscopic scanner
CA2126142A1 (en) * 1994-06-17 1995-12-18 David Alexander Kahn Visual communications apparatus
US6463361B1 (en) * 1994-09-22 2002-10-08 Computer Motion, Inc. Speech interface for an automated endoscopic system
US5971976A (en) * 1996-02-20 1999-10-26 Computer Motion, Inc. Motion minimization and compensation system for use in surgical procedures
US5726916A (en) * 1996-06-27 1998-03-10 The United States Of America As Represented By The Secretary Of The Army Method and apparatus for determining ocular gaze point of regard and fixation duration
US6847336B1 (en) * 1996-10-02 2005-01-25 Jerome H. Lemelson Selectively controllable heads-up display system
US6351273B1 (en) * 1997-04-30 2002-02-26 Jerome H. Lemelson System and methods for controlling automatic scrolling of information on a display or screen
US6027216A (en) * 1997-10-21 2000-02-22 The Johns University School Of Medicine Eye fixation monitor and tracker
GB9813041D0 (en) * 1998-06-16 1998-08-12 Scient Generics Ltd Eye tracking technique
US6468265B1 (en) * 1998-11-20 2002-10-22 Intuitive Surgical, Inc. Performing cardiac surgery without cardioplegia
JP3608448B2 (en) * 1999-08-31 2005-01-12 株式会社日立製作所 Treatment device
US6554444B2 (en) * 2000-03-13 2003-04-29 Kansai Technology Licensing Organization Co., Ltd. Gazing point illuminating device
IL138831A (en) * 2000-10-03 2007-07-24 Rafael Advanced Defense Sys Gaze-actuated information system
US20030060808A1 (en) * 2000-10-04 2003-03-27 Wilk Peter J. Telemedical method and system
US6478425B2 (en) * 2000-12-29 2002-11-12 Koninlijke Phillip Electronics N. V. System and method for automatically adjusting a lens power through gaze tracking
GB0119859D0 (en) * 2001-08-15 2001-10-10 Qinetiq Ltd Eye tracking system
US6919907B2 (en) * 2002-06-20 2005-07-19 International Business Machines Corporation Anticipatory image capture for stereoscopic remote viewing with foveal priority
WO2004036378A2 (en) * 2002-10-15 2004-04-29 Mcintyre David J System and method for simulating visual defects

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1995001757A1 (en) * 1993-07-07 1995-01-19 Cornelius Borst Robotic system for close inspection and remote treatment of moving parts
JPH11155152A (en) * 1997-11-21 1999-06-08 Canon Inc Method and system for three-dimensional shape information input, and image input device thereof
US6611283B1 (en) * 1997-11-21 2003-08-26 Canon Kabushiki Kaisha Method and apparatus for inputting three-dimensional shape information
US6368332B1 (en) * 1999-03-08 2002-04-09 Septimiu Edmund Salcudean Motion tracking platform for relative motion cancellation for surgery
EP1056049A2 (en) * 1999-05-27 2000-11-29 United Bristol Healthcare NHS Trust Method and apparatus for displaying volumetric data

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PATENT ABSTRACTS OF JAPAN vol. 1999, no. 11 30 September 1999 (1999-09-30) *

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6721356B1 (en) 2000-01-03 2004-04-13 Advanced Micro Devices, Inc. Method and apparatus for buffering data samples in a software based ADSL modem
US8971597B2 (en) 2005-05-16 2015-03-03 Intuitive Surgical Operations, Inc. Efficient vision and kinematic data fusion for robotic surgical instruments and other applications
EP2781200A1 (en) * 2005-09-30 2014-09-24 Restoration Robotics, Inc. Automated systems and methods for harvesting and implanting follicular units
US7907166B2 (en) 2005-12-30 2011-03-15 Intuitive Surgical Operations, Inc. Stereo telestration for robotic surgery
US9402690B2 (en) 2008-12-31 2016-08-02 Intuitive Surgical Operations, Inc. Efficient 3-D telestration for local and remote robotic proctoring
US8830224B2 (en) 2008-12-31 2014-09-09 Intuitive Surgical Operations, Inc. Efficient 3-D telestration for local robotic proctoring
US9492240B2 (en) 2009-06-16 2016-11-15 Intuitive Surgical Operations, Inc. Virtual measurement tool for minimally invasive surgery
US9155592B2 (en) 2009-06-16 2015-10-13 Intuitive Surgical Operations, Inc. Virtual measurement tool for minimally invasive surgery
CN102958464A (en) * 2010-04-07 2013-03-06 索发有限公司 Robotized surgery system with improved control
US11857278B2 (en) 2010-04-07 2024-01-02 Asensus Surgical Italia, S.R.L. Roboticized surgery system with improved control
RU2569699C2 (en) * 2010-04-07 2015-11-27 Софар Спа Advanced controlled robotic surgical system
US10251713B2 (en) 2010-04-07 2019-04-09 Transenterix Italia S.R.L. Robotized surgery system with improved control
US9360934B2 (en) 2010-04-07 2016-06-07 Transenterix Italia S.R.L. Robotized surgery system with improved control
WO2011125007A1 (en) * 2010-04-07 2011-10-13 Sofar Spa Robotized surgery system with improved control
ITMI20100579A1 (en) * 2010-04-07 2011-10-08 Sofar Spa ROBOTIC SURGERY SYSTEM WITH PERFECT CONTROL.
US11224489B2 (en) 2010-04-07 2022-01-18 Asensus Surgical Italia, S.R.L. Robotized surgery system with improved control
EP3395251A1 (en) * 2010-04-07 2018-10-31 TransEnterix Italia S.r.l. Robotized surgery system with improved control
AU2012227252B2 (en) * 2011-09-21 2014-09-25 Digital Surgicals Pte, Ltd. Surgical Stereo Vision Systems And Methods For Microsurgery
US9330477B2 (en) 2011-09-22 2016-05-03 Digital Surgicals Pte. Ltd. Surgical stereo vision systems and methods for microsurgery
US9766441B2 (en) 2011-09-22 2017-09-19 Digital Surgicals Pte. Ltd. Surgical stereo vision systems and methods for microsurgery
AU2014231345B2 (en) * 2013-03-15 2019-01-17 Synaptive Medical Inc. Intelligent positioning system and methods therefore
US11103279B2 (en) 2013-03-15 2021-08-31 Synaptive Medical Inc. Intelligent positioning system and methods therefor
ITMI20130702A1 (en) * 2013-04-30 2014-10-31 Sofar Spa ROBOTIC SURGERY SYSTEM WITH PERFECT CONTROL
WO2019080358A1 (en) * 2017-10-28 2019-05-02 深圳市前海安测信息技术有限公司 Robot for surgical navigation using 3d images and control method thereof

Also Published As

Publication number Publication date
AU2003267604A1 (en) 2004-04-19
WO2004029786A8 (en) 2004-06-03
EP1550025A1 (en) 2005-07-06
GB0222265D0 (en) 2002-10-30
US20060100642A1 (en) 2006-05-11

Similar Documents

Publication Publication Date Title
WO2004029786A1 (en) Control of robotic manipulation
US11438572B2 (en) Medical devices, systems and methods using eye gaze tracking for stereo viewer
US20220303435A1 (en) Stereoscopic visualization camera and integrated robotics platform
Zhu et al. Novel eye gaze tracking techniques under natural head movement
Rolland et al. Optical versus video see-through head-mounted displays in medical visualization
AU2024219905A1 (en) Stereoscopic visualization camera and integrated robotics platform
Breedveld et al. Observation in laparoscopic surgery: overview of impeding effects and supporting aids
US12099200B2 (en) Head wearable virtual image module for superimposing virtual image on real-time image
US20220272272A1 (en) System and method for autofocusing of a camera assembly of a surgical robotic system
Dera et al. Low-latency video tracking of horizontal, vertical, and torsional eye movements as a basis for 3dof realtime motion control of a head-mounted camera
Piszczek et al. The importance of monitoring vergence eye movements for solutions using virtual technologies
WO2023069745A1 (en) Controlling a repositionable structure system based on a geometric relationship between an operator and a computer-assisted device
Plooy et al. Judging size, distance, and depth with an active telepresence system

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WR Later publication of a revised version of an international search report
WWE Wipo information: entry into national phase

Ref document number: 2003748296

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2003748296

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2006100642

Country of ref document: US

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 10529023

Country of ref document: US

WWP Wipo information: published in national office

Ref document number: 10529023

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP