WO2022172227A1 - Measurement of position and orientation of an object - Google Patents

Measurement of position and orientation of an object

Info

Publication number
WO2022172227A1
WO2022172227A1 (PCT/IB2022/051266)
Authority
WO
WIPO (PCT)
Prior art keywords
patterns
orientation
light emitting
emitting device
projection
Prior art date
Application number
PCT/IB2022/051266
Other languages
French (fr)
Inventor
Arbind GUPTA
Original Assignee
Gupta Arbind
Priority date
Filing date
Publication date
Application filed by Gupta Arbind
Publication of WO2022172227A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/002Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/26Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

This invention relates to an apparatus and method for determining the position and orientation of an object in three-dimensional space, wherein the apparatus comprises a first light emitting device for emitting a first set of two patterns, the patterns being at an angle to each other, the first light emitting device attachable to the object; at-least one surface positioned in the direction of the emitted first set of two patterns to intercept the first set of two patterns; a camera to capture an image of the projection of the first set of two patterns on the at-least one surface; and a processor operably coupled to the camera to receive the image and determine the position and orientation of the object based on the position and orientation of the first set of two patterns in the image.

Description

Measurement of position and orientation of an object
TECHNICAL FIELD
[001] The invention relates to position and orientation and more specifically to determination of position and orientation of an object, and providing guidance to achieve a desired position and orientation.
BACKGROUND
[002] Ultrasound imaging is usually performed by manual placement of the probe onto the anatomy of the subject. For example, echocardiography is done manually by applying the probe to the chest of the subject, and the axis of the probe is manipulated while looking at the monitor screen to obtain the best possible view. The quality of the image depends on the probe's position and orientation in 3D space.
[003] This process, being manual, is dependent on the expertise of the individual performing the imaging. Moreover, this expertise is acquired by the individual performing the imaging over a period of time.
[004] Attempts have been made to determine the position and orientation using various sensors and to provide feedback to the sonographer for a desired location and orientation.
[005] However, such solutions have limitations in terms of accuracy of measurement, the size of the sensors used for the purpose, or cost. The accuracy of such systems is dependent on the accuracy of the sensors used, such as magnetometers, gyroscopes and accelerometers.
[006] Moreover, such sensors sense the position and orientation in a relative manner. As a result, the cumulative value of position and orientation, measured over a series of movements, also leads to accumulation of the associated measurement error.
[007] Since the probe has six degrees of freedom (three for translation along the X, Y and Z axes, and three for rotation about those axes), an audio feedback may often be confusing. Hence, a visual feedback showing both the desired position and orientation of the probe and its current position and orientation will make it much easier for the sonographer to quickly converge to the desired position and orientation.
OBJECTS
[008] The object of the invention is to provide an apparatus and method for determining the position and orientation of an object in three-dimensional space and provide guidance to achieve a desired position and orientation.
[009] The object of the invention is achieved by an apparatus and method for determining the position of an object in three-dimensional space. According to an embodiment, the apparatus comprises a first light emitting device for emitting a first set of two patterns, the first set of two patterns being at an angle to each other, the first light emitting device attachable to the object; at-least one surface positioned in the direction of the emitted first set of two patterns to intercept the first set of two patterns; a camera to capture an image of the projection of the first set of two patterns on the at-least one surface; and a processor operably coupled to the camera to receive the image and determine the position and orientation of the object based on the position and orientation of the first set of two patterns in the image.
[0010] According to another embodiment, the at-least one surface comprises a first surface and a second surface at an angle to each other, wherein the first surface is positioned in the direction of projection of one of the first set of two patterns and the second surface is positioned in the direction of projection of the other of the first set of two patterns.
[0011] According to yet another embodiment, the first set of two patterns comprises a set of two line segments.
[0012] According to yet another embodiment, the apparatus further comprises a second light emitting device for emitting a second set of two patterns, the second set of two patterns having the same shape and size but distinct in appearance from the first set of two patterns, the second set of two patterns being at an angle to each other, the second light emitting device is positioned such that the second set of two patterns are projected on the at-least one surface.
[0013] According to yet another embodiment, the second light emitting device is operably coupled to the processor and the processor is configured to control the second light emitting device for projecting the second set of two patterns on the at-least one surface, indicating the desired position and orientation of the object. The user can manipulate the position and orientation of the object, thereby changing the position and orientation of the first set of two patterns on the at-least one surface, to bring it to the desired position.
[0014] According to yet another embodiment, an alignment of the first set of two patterns and the second set of two patterns on the at-least one surface provides an indication of the object being at the desired position and orientation.
[0015] According to yet another embodiment, the apparatus further comprises a switch operably coupled to the processor, for controlling the first light emitting device such that the first set of two patterns are within the field of view of the camera, based on the last known position of the first set of two patterns.
[0016] According to yet another embodiment, the processor is configured to provide an audio feedback based on a distance between the first set of two patterns and the second set of two patterns.
[0017] According to yet another embodiment, the processor is configured to alter the pitch and amplitude of the audio feedback based on the distance between the first set of two patterns and the second set of two patterns.
[0018] According to yet another embodiment, the method comprises: providing a first light emitting device for emitting a first set of two patterns, the first set of two patterns being at an angle to each other, the first light emitting device attachable to the object; positioning at-least one surface in the direction of the emitted first set of two patterns to intercept the first set of two patterns; capturing an image of the projection of the first set of two patterns on the at-least one surface by a camera; and determining the position and orientation of the object based on the position and orientation of the first set of two patterns in the image.
[0019] The first light emitting device can be attached onto the object, and the position and orientation of the object can be determined using the projection of the first set of two patterns captured in the image. The first set of two patterns can comprise any shape and geometry, for example a line segment, a set of points, an L-shaped line, a rectangle, and the like. The first set of two patterns and the second set of two patterns are visibly distinct. For example, they may be of two different distinguishable colors, or may comprise different styles such as a dashed line, dotted line, solid line, flashing line, etc.
[0020] For example, the apparatus may be attached to an ECG probe of an ECG system, where the first set of two patterns provides the current position and orientation of the ECG probe and the second set of two patterns provides the desired position and orientation of the probe. This gives the operator a visual feedback for aligning the current position to the more desirable position and orientation of the probe, which helps in acquiring better quality ultrasound images. The desired position can be pre-defined, may come from an external source, or may be computed by other means, and is used by the processor in controlling the second light emitting device. The second set of two patterns emitted by the second light emitting device provides the indication of the desired position. The user of the ECG probe may adjust the ECG probe such that the projections of the first set of two patterns align or superimpose with the second set of two patterns. This provides the advantage of computing the position and orientation of the ECG probe and providing visual guidance to achieve its desired position and orientation.
BRIEF DESCRIPTION OF FIGURES
[0021] Embodiments herein are illustrated in the accompanying drawings, throughout which reference letters or symbols indicate corresponding parts in the various figures. The embodiments herein will be better understood from the following description with reference to the drawings, in which:
[0022] Fig 1 shows an exemplary illustration of a chest of a subject where an ultrasound probe can be placed for acquiring ultrasound images, wherein the position of placing the ultrasound probes is designated as A, B, C and D.
[0023] Fig 2a illustrates a block diagram of the apparatus for determining the position and orientation of an object and providing guidance to achieve a desired position and orientation of the object according to an embodiment herein: i) the object whose position and orientation is to be determined; ii) two surfaces at an angle, on which a pattern can be projected (surfaces Sx and Sy); iii) light emitting device P1, attached to the object; iv) light emitting device P2, fixed to the room or to surfaces Sx and Sy; and v) a camera for acquiring images of patterns projected on the two surfaces Sx and Sy.
[0024] Fig 2b illustrates projection of the first set of two patterns (line segments) Lx^T and Ly^T on the two surfaces Sx and Sy by light emitting device P1, and of the second set of two patterns (line segments) Lx and Ly on the two surfaces Sx and Sy by light emitting device P2. The first set of two patterns indicates the current position and orientation of the object and the second set of two patterns indicates a desired position and orientation of the probe.
[0025] Fig 3 illustrates a flow chart of an exemplary method for determining the position and orientation of an object according to an embodiment herein.
[0026] Fig 4 shows the arrangement of a preferred embodiment.
[0027] Fig 5 shows a visualization of (i) the probe coordinate system (PCS) Xp, Yp, Zp with origin at Op, attached to the object, and its virtual surfaces Sx^P, Sy^P; and (ii) the table coordinate system (TCS), attached to the room, Xt, Yt, Zt with origin at Ot, and its surfaces Sx^T, Sy^T.
[0028] Fig 6a shows an alternate arrangement of the two surfaces, where one of the surfaces, Sx, is at the top.
[0029] Fig 6b shows an alternate arrangement where the two surfaces Sx and Sy are not at a right angle to each other, nor are they vertical (perpendicular to the floor).
[0030] Fig 7a shows an alternate arrangement where the camera-cum-projection system is not attached to the two surfaces Sx and Sy and is placed at a different location.
[0031] Fig 7b shows an alternate arrangement where the camera-cum-projection system is fixed at an angle to the two surfaces Sx and Sy.
DETAILED DESCRIPTION
[0032] Fig 2b shows the various components of the said apparatus and methods for use with an object, which are described below. Fig 4 shows a preferred embodiment of the apparatus and methods, described in detail for better understanding. In this embodiment, the first and second sets of two patterns each comprise a pair of line segments, and the at-least one surface comprises two surfaces Sx and Sy on which projections from light emitting devices P1 and P2 will be projected. Further, the first set of two patterns and the second set of two patterns are distinguishable from each other by their color. It is also sufficient to have one large surface on which patterns from light emitting devices P1 and P2 can be projected. The apparatus and methods comprise:
a. two surfaces Sx and Sy, which are fixed at an angle to the room and make an angle to each other. Light patterns can be projected on these two surfaces by light emitting devices P1 and P2.
b. the first light emitting device P1, attached to the object whose position and orientation need to be measured accurately. It emits a first set of two patterns Lx^T and Ly^T onto the surfaces Sx and Sy. The first set of two patterns are emitted such that they are at an angle to each other. The position and orientation of the first set of two patterns Lx^T and Ly^T projected onto the surfaces Sx and Sy provide an indication of the position and orientation of the object. To simplify the calculations for computing the position and orientation of the object, the surfaces Sx and Sy are at right angles to each other and to the floor, as shown in Fig 4.
c. a camera that can be fixed to the room or to surface Sx and/or Sy, and that can take an image of the two surfaces Sx and Sy.
d. a processor, operably coupled to the camera, that computes the position and orientation of the object based on the position and orientation of the first set of two patterns Lx^T and Ly^T in the image.
e. a second light emitting device P2, operably coupled to the processor, for emitting a second set of two patterns Lx and Ly that are projected onto the surfaces Sx and Sy respectively. The second set of two patterns emitted by P2 are visually distinct from the first set of two patterns emitted by P1. P2 is controlled by the processing unit to project the second set of two patterns to indicate the desired position and orientation of the object. The user can attain the desired position and orientation of the object by moving and/or rotating the object so that the first set of two patterns Lx^T and Ly^T projected by P1 aligns with the second set of two patterns Lx and Ly projected by P2.
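As an illustration only (the patent does not prescribe any software representation), the components a to e above could be held as plain data. A minimal Python sketch, assuming each pattern is a line segment stored as two 3D endpoints; all class and field names are hypothetical:

```python
# Illustrative sketch only, not from the patent: one possible data layout for
# components a-e, assuming each pattern is a line segment given by two 3D
# endpoints. All names and fields are assumptions.
from dataclasses import dataclass
import numpy as np

@dataclass
class Emitter:
    segment_x: np.ndarray  # (2, 3) endpoints of the pattern aimed at surface Sx
    segment_y: np.ndarray  # (2, 3) endpoints of the pattern aimed at surface Sy
    color: str             # e.g. "red" for P1, "blue" for P2 (preferred embodiment)

@dataclass
class Apparatus:
    surface_sx_normal: np.ndarray  # unit normal of Sx in the TCS
    surface_sy_normal: np.ndarray  # unit normal of Sy in the TCS
    p1: Emitter                    # attached to the object; shows current pose
    p2: Emitter                    # fixed to the room; shows desired pose
```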
[0033] Before the present subject matter is described in further detail, it is to be understood that the subject matter is not limited to the particular embodiments described, which may of course vary. It shall become abundantly clear after reading this specification that the subject matter may, without departing from the spirit and scope of the subject matter, also be practiced in embodiments other than those exemplified.
[0034] For the purpose of describing the invention, Fig 2 shows an embodiment of the said apparatus wherein it is used to measure the absolute position and orientation of an ultrasound probe and provide a visual feedback to the sonographer for a more desirable position and orientation of the probe.
[0035] Other embodiments are shown, for example, in Fig 6a and Fig 6b, where the projection surfaces Sx and Sy are at a right angle neither to each other nor to the floor. Fig 7a and Fig 7b show yet another embodiment of the said apparatus, where the camera and light emitting device P2 are not aligned with the light emitting device P1 of the apparatus.
[0036] In the preferred embodiment shown in Fig 4, the patterns projected by light emitting devices P1 and P2 are shown in red and blue respectively. It is understood that the colors are only one example of making the two patterns emitted by P1 and P2 distinct from each other; there are many other ways in which they can be made distinct.
[0037] Also, in the preferred embodiment shown in Fig 4, the patterns emitted by light emitting devices P1 and P2 are shown as line segments. It is clearly understood that they can be any other pattern, such as a rectangle, two lines forming an L shape, a set of points, etc.
[0038] In another embodiment, the surfaces Sx and Sy can be light sensing surfaces that sense the line segments projected by light emitting devices P1 and P2.
[0039] It shall also become clear that the drawings may not be to scale. In some examples, the method may include additional blocks or may be practiced in an order different from the order of the blocks discussed in this specification. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting. It must be noted that, as used herein, the singular forms "a", "an", and "the" include plural referents unless the context clearly dictates otherwise. The present subject matter provides a solution to a number of problems, including but not limited to measuring the position and orientation of an ultrasound probe and providing visual feedback to the sonographer for aligning the probe to a more desirable orientation and position.
[0040] Reference is now made to Fig 3, which describes the steps required, according to one embodiment of the apparatus and methods, for measuring the position and orientation of an object and providing visual feedback on the desired position and orientation of the object.
[0041] In block 310, the first light emitting device P1 is attached to the object, projecting two patterns Lx^T and Ly^T on surfaces Sx and Sy respectively, as shown in Fig 2a and Fig 2b. Moving and/or rotating the object also causes the first set of two patterns Lx^T and Ly^T to move and/or tilt.
[0042] In block 320, a camera is mounted to the surfaces Sx and Sy (Fig 2a), or to the floor or ceiling of the room (Fig 7a and Fig 7b). The camera acquires images of the surfaces Sx and Sy and sends them to a processor.
[0043] In block 330, the processor detects the position and orientation of the first set of two patterns Lx^T and Ly^T, shown in Fig 2b, from the given image. It then computes the position and orientation of the object based on the position and orientation of the first set of two patterns Lx^T and Ly^T in the image.
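As one possible realization of this detection step (the patent does not prescribe an algorithm), the pattern can be isolated by its color and a line fitted to its pixels. A minimal sketch assuming OpenCV is available; the function name and HSV thresholds are assumptions:

```python
# Minimal sketch of block 330's detection, assuming the first set of two
# patterns is projected in a single known color (red in the preferred
# embodiment). Thresholds would need calibration for a real setup.
import cv2
import numpy as np

def detect_segment(image_bgr, lower_hsv, upper_hsv):
    """Return (point, direction) of the colored line segment, or None."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower_hsv, upper_hsv)  # isolate the pattern color
    pts = cv2.findNonZero(mask)
    if pts is None:
        return None
    # Fit a straight line to all pattern pixels; fitLine returns a unit
    # direction (vx, vy) and a point (x0, y0) on the line.
    vx, vy, x0, y0 = cv2.fitLine(pts, cv2.DIST_L2, 0, 0.01, 0.01).ravel()
    return (float(x0), float(y0)), (float(vx), float(vy))

# Example call for a red pattern (illustrative thresholds):
# seg = detect_segment(frame, np.array([0, 120, 80]), np.array([10, 255, 255]))
```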
[0044] It is now required to calculate the transformation T (3 translation values Tx, Ty and Tz, and 3 rotation values Rx, Ry and Rz) of the object that transforms the patterns Lx^T, Ly^T from a default initial position and orientation of the object to its current position and orientation. This is equivalent to finding the transformation between a probe coordinate system PCS (at the default position and orientation of the object) and the table coordinate system TCS (which is fixed). Reference is made to Fig 5, illustrating (i) the PCS defined by the axes Xp, Yp, Zp with origin at Op; and (ii) the TCS defined by the axes Xt, Yt, Zt with origin at Ot. The surfaces Sx^T, Sy^T in the TCS are represented by the virtual surfaces Sx^P, Sy^P in the PCS, as shown in Fig 5.
[0045] A point P^P in the PCS may be represented as a homogeneous vector P^P = [x y z 1]^T, where (x, y, z) are the coordinates of the point in the PCS. Hence, the coordinates of the point P^P expressed in the TCS are P^T = T * P^P. Consider the virtual line segment Lx^P with reference to the PCS, as shown in Fig 5, and the corresponding line segment Lx^T in the TCS. Applying the transformation T to the endpoints of the line segments Lx^P and Ly^P and to Op (their coordinates in the default position and orientation of the probe, measured in the PCS) gives the virtual images of these points in the TCS, subject to the following constraints: the transformed points are collinear with the corresponding observed projected segments (giving six collinearity equations), line L'x is coplanar with Lx^T, and line L'y is coplanar with Ly^T.
[0046] Solving the six equations gives the six transformation parameters: three translation values Tx, Ty and Tz, and three rotation values Rx, Ry and Rz.
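The patent solves these equations directly; as a non-authoritative alternative, the same six parameters can also be recovered numerically. A minimal sketch assuming SciPy is available, where default_pts holds the pattern endpoints in the default pose (PCS) and observed_lines holds the corresponding observed lines on Sx and Sy in the TCS; all function and variable names are illustrative:

```python
# Hedged numerical sketch of [0044]-[0046]: recover (Tx, Ty, Tz, Rx, Ry, Rz)
# by treating the collinearity constraints as least-squares residuals. This
# replaces the patent's direct solve with a generic optimizer.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def apply_T(params, pts):
    """Map (N, 3) points from the PCS into the TCS: rotate, then translate."""
    R = Rotation.from_euler("xyz", params[3:]).as_matrix()
    return pts @ R.T + params[:3]

def point_to_line_distance(p, a, d):
    """Distance from point p to the line through a with unit direction d."""
    v = p - a
    return np.linalg.norm(v - np.dot(v, d) * d)

def residuals(params, default_pts, observed_lines):
    # Each transformed endpoint must lie on its observed projected segment
    # (the collinearity conditions of paragraph [0045]).
    moved = apply_T(params, default_pts)
    return [point_to_line_distance(p, a, d)
            for p, (a, d) in zip(moved, observed_lines)]

# sol = least_squares(residuals, x0=np.zeros(6),
#                     args=(default_pts, observed_lines))
# Tx, Ty, Tz, Rx, Ry, Rz = sol.x
```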
[0047] The calculation of the transformation T can also be done if the first set of two patterns is projected at two different locations, with different orientations, on just one surface. The calculation is not given here for the sake of brevity and simplicity.
[0048] In block 340, the desired position of the object, as a transformation TD on the PCS, is provided to the system either manually or by an external source. In an embodiment where the apparatus is attached to an ultrasound probe, the desired position of the ultrasound probe can be derived from the acquired ultrasound image by detecting the various chambers of the heart and computing the desired position based on the size and shape of the detected chambers. In another embodiment, the desired position and orientation can be defined based on the location where the probe is placed (as shown in Fig 1). In yet another embodiment, it can be provided by an external system.
[0049] Let T^P2 be the transformation of the light emitting device P2 with respect to the TCS. Hence, a point P^T in the TCS can be transformed into the coordinate system of light emitting device P2 as P^P2 = T^P2 * P^T, or P^T = [T^P2]^-1 * P^P2, where P^P2 is the coordinate of the point in the coordinate system of light emitting device P2. Therefore, the position of the first set of two patterns Lx^T and Ly^T, after applying the desired transformation TD and transforming into the coordinate system of P2, is Lx^P2 = T^P2 * L'x, with L'x = TD * Lx, where Lx is the coordinates of the line segments in the default position and orientation of the light emitting device P1 (Ly^P2 is calculated similarly). Here, Lx^P2 and Ly^P2 are the coordinates of the desired position of the second set of two patterns in the local coordinate system of the light emitting device P2. Block 350 in Fig 3 represents the computation of Lx^P2 and Ly^P2 for projection by the light emitting device P2 onto the two surfaces Sx and Sy respectively.
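A short sketch of this coordinate bookkeeping using 4x4 homogeneous matrices, assuming T_P2 (TCS to P2 frame) and T_D (desired pose) are already available as matrices; the names are illustrative, not from the patent:

```python
# Hedged sketch of block 350: express the desired pattern in P2's local frame
# via L'x = TD * Lx and Lx^P2 = T^P2 * L'x, using homogeneous coordinates.
import numpy as np

def to_homogeneous(pts):
    """(N, 3) points -> (N, 4) homogeneous points."""
    return np.hstack([pts, np.ones((len(pts), 1))])

def desired_pattern_in_p2(T_P2, T_D, Lx_default):
    """Endpoints of the desired second pattern in P2's coordinate system."""
    Lx_h = to_homogeneous(Lx_default)   # default pattern endpoints
    moved = (T_P2 @ T_D @ Lx_h.T).T     # apply TD, then map into P2's frame
    return moved[:, :3]

# Identity check: with T_P2 = T_D = np.eye(4), the pattern is unchanged:
# desired_pattern_in_p2(np.eye(4), np.eye(4),
#                       np.array([[0., 0., 1.], [0., 1., 1.]]))
```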
[0050] In block 360 of Fig 3, the processing unit takes the desired position of the second set of two patterns Lx^P2 and Ly^P2 and projects them onto the two surfaces Sx and Sy using the second light emitting device P2. This provides a visual feedback to the operator for adjusting the probe.
[0051] Block 370 in Fig 3 represents the adjustment of the object by the user so that the current position of the object, indicated by the first set of two patterns emitted by the first light emitting device P1, matches the desired position and orientation of the probe, indicated by the second set of two patterns emitted by the second light emitting device P2. This can be complemented with voice feedback to the sonographer. Moreover, the difference between the current and desired positions can also be mapped to the pitch and amplitude of an audio sound.
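A minimal sketch of one such audio mapping; the patent only states that pitch and amplitude vary with the distance, so the ranges and the linear law below are assumptions:

```python
# Hedged sketch for block 370 / claims 9-10: map the on-surface distance
# between the first and second sets of patterns to a tone. Frequency and
# volume ranges and the linear mapping are assumptions, not from the patent.
def audio_feedback(distance_px, max_dist_px=400.0,
                   f_far=220.0, f_near=880.0):
    """Return (frequency_hz, volume): higher and louder as the patterns align."""
    x = min(max(distance_px / max_dist_px, 0.0), 1.0)  # normalized distance
    frequency = f_far + (1.0 - x) * (f_near - f_far)   # pitch rises near target
    volume = 1.0 - 0.7 * x                             # quieter when far away
    return frequency, volume

# audio_feedback(0.0)   -> (880.0, 1.0)  aligned: high, loud tone
# audio_feedback(400.0) -> (220.0, 0.3)  far away: low, quiet tone
```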
[0052] If necessary, these steps can be repeated for a more refined positioning of the probe.

Claims

I/We claim:
1. An apparatus for determining position and orientation of an object in three-dimensional space, the apparatus comprising: a first light emitting device for emitting a first set of two patterns, the first set of two patterns being at an angle to each other, the first light emitting device attachable to the object; at-least one surface positioned in the direction of the emitted first set of two patterns to intercept the first set of two patterns; a camera to capture an image of the projection of the first set of two patterns on the at-least one surface; and a processor operably coupled to the camera to receive the image and determine the position and orientation of the object based on the position and orientation of the first set of two patterns in the image.
2. The apparatus according to claim 1, wherein the at-least one surface comprises a first surface and a second surface at an angle to each other, wherein the first surface is positioned in the direction of projection of one of the first set of two patterns and the second surface is positioned in the direction of projection of the other pattern of the first set of two patterns.
3. The apparatus according to claim 1, wherein the first set of two patterns comprises two line segments.
4. The apparatus according to claim 1, further comprising a second light emitting device for emitting a second set of two patterns, the second set of two patterns being distinct from the first set of two patterns, the second set of two patterns being at an angle to each other, the second set of two patterns providing an indication of a desired position on being intercepted by the at-least one surface.
5. The apparatus according to claim 4, wherein the second light emitting device is operably coupled to the processor and the processor is configured to control the second light emitting device.
6. The apparatus according to claim 4, wherein the second light emitting device is positioned such that the second set of two patterns are projected on the at-least one surface.
7. The apparatus according to claim 4, wherein an alignment of the first set of two patterns and the second set of two patterns on the at- least one surface provides an indication of the object being at the desired position and orientation.
8. The apparatus according to claim 1, further comprising a switch operably coupled to the processor, for controlling the first light emitting device such that the first set of two patterns are within the field of view of the camera, based on the last known position of the first set of two patterns.
9. The apparatus according to claim 4, wherein the processor is configured to provide an audio feedback based on the distance between the first set of two patterns and the second set of two patterns.
10. The apparatus according to claim 9, wherein the processor is configured to alter the volume and/or pitch of the audio feedback based on distance between the first set of two patterns and the second set of two patterns.
11. A method of determining the position and orientation of an object in three-dimensional space, the method comprising: providing a first light emitting device for emitting a first set of two patterns, the first set of two patterns being at an angle to each other, the first light emitting device attachable to the object; positioning at-least one surface in the direction of the emitted first set of two patterns to intercept the first set of two patterns; capturing an image of the projection of the first set of two patterns on the at-least one surface by a camera; and determining the position and orientation of the object based on the position and orientation of the first set of two patterns in the image.
12. The method according to claim 11, wherein the at-least one surface comprises a first surface and a second surface at an angle to each other, wherein the first surface is positioned in the direction of projection of one of the first set of two patterns and the second surface is positioned in the direction of projection of the other of the first set of two patterns.
13. The method according to claim 11, wherein the first set of two patterns comprises a set of two line segments.
14. The method according to claim 11, further comprising of a second light emitting device for emitting a second set of two patterns, the second set of two patterns being distinct from the first set of two patterns, the second set of two patterns being at an angle to each other, the second set of two patterns providing an indication of a desired position on being intercepted by the at-least one surface.
15. The method according to claim 14, further comprising controlling the second light emitting device to emit the second set of two patterns.
16. The method according to claim 14, wherein the second light emitting device is positioned such that the second set of two patterns are projected on the at-least one surface.
17. The method according to claim 14, wherein an alignment of the first set of two patterns and the second set of two patterns on the at-least one surface provides an indication of the object being at the desired position and orientation.
18. The method according to claim 11, further comprising controlling the first light emitting device such that the first set of two patterns are within the field of view of the camera based on the last known position of the first set of two patterns.
19. The method according to claim 14, further comprising providing an audio feedback based on a distance between the first set of two patterns and the second set of two patterns.
20. The method according to claim 19, further comprising altering the pitch and / or volume of the audio feedback based on the distance between the first set of two patterns and the second set of two patterns.
PCT/IB2022/051266 2021-02-13 2022-02-13 Measurement of position and orientation of an object WO2022172227A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN202141006151 2021-02-13
IN202141006151 2021-02-13

Publications (1)

Publication Number Publication Date
WO2022172227A1

Family

ID=82837361

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2022/051266 WO2022172227A1 (en) 2021-02-13 2022-02-13 Measurement of position and orientation of an object

Country Status (1)

Country Link
WO (1) WO2022172227A1 (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150204662A1 (en) * 2014-01-17 2015-07-23 Canon Kabushiki Kaisha Three-dimensional-shape measurement apparatus, three-dimensional-shape measurement method, and non-transitory computer-readable storage medium
US20190130589A1 (en) * 2017-11-01 2019-05-02 Omron Corporation Three-dimensional measurement apparatus, three-dimensional measurement method and program

Legal Events

Code | Description
121 | Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 22752440; Country of ref document: EP; Kind code of ref document: A1)
NENP | Non-entry into the national phase (Ref country code: DE)
122 | Ep: pct application non-entry in european phase (Ref document number: 22752440; Country of ref document: EP; Kind code of ref document: A1)