NL2011858C2 - Stereo-imaging sensor position localization method and system. - Google Patents


Info

Publication number
NL2011858C2
Authority
NL
Netherlands
Prior art keywords
sensors
coordinate system
stereo
images
stereo image
Prior art date
Application number
NL2011858A
Other languages
Dutch (nl)
Inventor
Alistair Neil Vardy
Original Assignee
Univ Delft Tech
Priority date
Filing date
Publication date
Application filed by Univ Delft Tech filed Critical Univ Delft Tech
Priority to NL2011858A priority Critical patent/NL2011858C2/en
Priority to PCT/NL2014/050811 priority patent/WO2015080583A1/en
Application granted granted Critical
Publication of NL2011858C2 publication Critical patent/NL2011858C2/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/10: Segmentation; Edge detection
    • G06T7/168: Segmentation; Edge detection involving transform domain methods
    • G06T7/50: Depth or shape recovery
    • G06T7/55: Depth or shape recovery from multiple images
    • G06T7/593: Depth or shape recovery from multiple images from stereo images
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10016: Video; Image sequence
    • G06T2207/10021: Stereoscopic video; Stereoscopic image sequence
    • G06T2207/20: Special algorithmic details
    • G06T2207/20048: Transform domain processing
    • G06T2207/20061: Hough transform
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30196: Human being; Person
    • G06T2207/30204: Marker

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

The disclosure relates to a method and system wherein sensors on a cap or directly disposed on a head can be localized using a stereo camera. By capturing a plurality of stereo images, the positions of the sensors can be determined with respect to each other. At least a first stereo image having a first set of sensors and a second stereo image having a second set of sensors are captured at a first and a second position, respectively, of the stereo camera relative to the cap by a relative rotation of the stereo camera around the cap. The first set of sensors and the second set of sensors may have one or more sensors in common. The relative rotation of the stereo camera around the cap can be obtained by at least one of a rotation of the stereo camera around a non-rotating cap and a rotation of the cap with respect to a fixed stereo camera.

Description

Stereo-imaging sensor position localization method and system
FIELD OF THE INVENTION
The invention relates to a method and system for locating sensors, e.g. electroencephalographic (EEG) electrodes. In particular, the invention relates to determining three-dimensional positions of sensors disposed on a head of a subject, e.g. on a cap to be worn by a subject.
BACKGROUND
Electroencephalography (EEG) is a noninvasive method to measure the electrical activity of the brain. EEG enables monitoring a person's brain activity over time and localizing sources of activity. This requires a precise localization of the responsible sources, which can be reconstructed from EEG data when the sensor positions are known. A common method for determining the sensor positions involves using a digitization pen. This method is cumbersome since it is time-consuming, and current systems are expensive.
More advanced methods for localizing sensors have been disclosed.
For example, US 2005/0058336 discloses a method and apparatus for measuring the location of sensors arranged on a cap worn on the head of a person. A plurality of cameras is arranged in a stationary array on a frame about the cap. A model of the head is developed and locations of the sensors are established by triangulation. A disadvantage of this method is the high cost of the system, which requires the use of many expensive cameras and a very precise frame for mounting the cameras. The frame is used to have accurate knowledge of the direction of each of the cameras. Furthermore, the localization can only be done once a shape of the head of the person is known. This may be cumbersome because the sensors are generally not flat objects.
The article "EEG-MRI Co-registration and Sensor Labeling Using a 3D Laser Scanner" by Koessler et al., Annals of Biomedical Engineering, Vol. 39, No. 3, March 2011, discloses the use of a handheld 3D laser scanner for obtaining locations of EEG sensors. In this method, the cameras of the scanner monitor a line from the laser beam and the sensors are detected as bumps in the line. The laser scanner is a very expensive apparatus.
There exists a need in the art for a less expensive yet accurate method and system for localizing sensors disposed on a cap.
SUMMARY
To that end, in one aspect of the disclosure, a method is presented for locating a plurality of sensors arranged on the head of a subject using a stereo camera. The sensors may e.g. be disposed on a cap. The cap is on a head of a subject (e.g. a human being). In the remainder of the text it will be assumed that the sensors are disposed on a cap, but this is not a requirement; they may, for example, also be individually attached to the head.
The sensors are optically distinct from the head or cap background for the stereo camera, i.e. optically distinct in view of the detection capability of the stereo camera. For example, optical differences between the sensors and the cap may comprise one or more of brightness, colour, hue, luminance, reflectivity, absorption and material parameters, such as electrical conductivity, thermal conductivity etc. In one embodiment, the sensors are electrodes.
In a first step, at least a first stereo image having a first set of sensors and a second stereo image having a second set of sensors are captured at a first and a second position, respectively, of the stereo camera relative to the cap by a relative rotation of the stereo camera around the cap. The first set of sensors and the second set of sensors may have one or more sensors in common. The relative rotation of the stereo camera around the cap can be obtained by at least one of a rotation of the stereo camera around a non-rotating cap and a rotation of the cap with respect to a fixed stereo camera.
In one embodiment, the phase of capturing stereo images is first completed before the stereo images are processed in order to determine the three-dimensional positions of the sensors. In another embodiment, the processing is started as soon as sufficient stereo images have been captured.
Whereas the first stereo image may be the first captured stereo image, it should be appreciated that the first captured stereo image may be any image in the sequence of stereo images. The second stereo image preferably is a preceding or subsequent stereo image in the sequence of stereo images, but may also be a stereo image further away in the sequence, preferably such that the images can be processed linearly. The first stereo image and second stereo image may contain at least one common sensor.
The processing involves analysing two images of the first stereo image and two images of the second stereo image to determine, respectively, three-dimensional positions of the first set of sensors in a first coordinate system and the second set of sensors in a second coordinate system. The first coordinate system and the second coordinate system are coordinate systems defined by the stereo camera in respectively the first position and the second position. The three-dimensional positions of the sensors in the first coordinate system and the second coordinate system can be obtained by triangulation in a manner known as such.
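By way of illustration only (not part of the claimed subject-matter), the triangulation step for a rectified stereo pair can be sketched as follows; the function name and parameters are illustrative assumptions, not taken from the disclosure:

```python
import numpy as np

def triangulate(xl, xr, y, f, b):
    """Triangulate a 3-D point from a rectified stereo pair.

    xl, xr : horizontal pixel coordinate of the same sensor in the left
             and right image, relative to each principal point
    y      : vertical pixel coordinate (equal in both images after
             rectification)
    f      : focal length in pixels
    b      : baseline, i.e. the distance between the two cameras
    Returns (X, Y, Z) in the coordinate system of the left camera.
    """
    d = xl - xr          # disparity; positive for points in front of the rig
    Z = f * b / d        # depth from similar triangles
    return np.array([xl * Z / f, y * Z / f, Z])
```

For example, with f = 800 px and b = 0.06 m, image coordinates (160, 80) in the left image and (64, 80) in the right image yield a point at approximately (0.1, 0.05, 0.5) m. The depth resolution of such a short-baseline rig degrades quadratically with distance, so the camera is best held close to the cap.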
The processing further involves expressing the three-dimensional positions of the first set of sensors and the three-dimensional positions of the second set of sensors in a common coordinate system. As such, the three-dimensional positions of the first set and second set of sensors are known in the common coordinate system. This may e.g. be achieved by a transformation from the first coordinate system to the second coordinate system or vice versa, or by transforming both the first and the second coordinate system to a new coordinate system.
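Such a transformation between two camera coordinate systems can, for instance, be estimated from the sensors visible in both stereo images as a least-squares rigid transform (the well-known Kabsch construction). The sketch below is illustrative only and assumes at least three non-collinear common sensors:

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rigid transform (R, t) such that Q ~= R @ P + t.

    P, Q : (3, N) arrays holding the same N sensors expressed in two
           different camera coordinate systems (N >= 3, non-collinear).
    """
    cp = P.mean(axis=1, keepdims=True)
    cq = Q.mean(axis=1, keepdims=True)
    H = (P - cp) @ (Q - cq).T                     # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                            # proper rotation, det = +1
    t = cq - R @ cp
    return R, t
```

The sign correction `D` guards against a reflection being returned when the common sensors are nearly coplanar.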
Another aspect of the disclosure is a system for locating a plurality of sensors arranged on a cap or individually attached to the head of a subject. The system may be part of a setup for electroencephalography (EEG). Again, the cap may be on a head of a subject and the sensors are optically distinct from the cap. The system comprises a stereo camera configured for relative rotation around the cap and processing means. The processing means are configured, e.g. by having circuitry arranged or programmed for that purpose, for performing, in operation of the system, one or more steps of the method as disclosed herein.
Other aspects of the disclosure comprise a computer program comprising software code portions configured for, when executed on a computer system, performing one or more of the steps, and a non-transitory computer-readable medium or media storing the computer program. A still further aspect of the disclosure relates to the use of a stereo camera for locating a plurality of sensors arranged on a head, e.g. on a cap. The sensors are optically distinct from the cap for the stereo camera. The stereo camera captures one or more stereo images by relative rotation of the stereo camera around the cap. The stereo camera preferably comprises only two cameras. The two cameras are mechanically connected, preferably rigidly, e.g. using a rigid connector such as a bar, and are provided at a distance from each other of less than 20 cm, preferably less than 10 cm, e.g. 8, 6 or 4 cm.
The applicant has found that sensors on a head can be localized using a stereo camera. By capturing a plurality of stereo images, the positions of the sensors can be determined with respect to each other. The technical requirements for the cameras are moderate, such that the method and system are less expensive than prior art methods and systems. In fact, in one experiment, the localization was done using two simple webcams.
It should be noted that the same or similar results can be obtained using stereo images in the non-visual domain. Any property may be used as long as it can be obtained and digitized in a manner similar to how a camera records light and variations thereof, and as long as the sensors have sufficient resolution.
The method could additionally be used to determine the shape of the head. This can be used to make an estimate of the conductivity model that is used for EEG source reconstruction. For source reconstruction, areas in the brain that are active during a task are sought. The mathematical calculations require knowledge about how electrical signals propagate through the various tissues of the head (the brain, bone, and skin). Typically, an MRI is used for this, which is a very expensive method. A rough estimate could be made from just the shape of the head, and the locations of the sensors could be used to determine this shape.
In one embodiment, the first stereo image and the second stereo image are captured such that one or more sensors belong to both the first set of sensors and the second set of sensors. Preferably, the stereo images are captured such that more than one sensor belongs to both the first and the second set, e.g. two, three or four sensors. The three-dimensional positions of the sensors are obtained by triangulation for the two images of the first stereo image in the first coordinate system, and the three-dimensional positions of the sensors are obtained by triangulation for the two images of the second stereo image in the second coordinate system. A first transformation function is determined for transforming the second coordinate system to the first coordinate system on the basis of the sensors belonging to both the first set of sensors and the second set of sensors. The transformation may comprise a Rigid Transform. It should be noted that after the transform, another transform may be performed to improve the accuracy of the three-dimensional localization. An iterative closest point algorithm may be used to complete the task. The three-dimensional positions of the second set of sensors are expressed as three-dimensional positions in the first coordinate system as the common coordinate system by applying the first transformation. In this manner, the three-dimensional positions of the sensors are all known in the first coordinate system.
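The iterative closest point refinement may be sketched as follows. This is a simplified, brute-force version with illustrative names: each point is repeatedly matched to its nearest neighbour and the rigid transform is re-estimated, which can tighten an alignment when correspondences are uncertain:

```python
import numpy as np

def best_rigid(P, Q):
    # Kabsch: least-squares rotation/translation with Q ~= R @ P + t
    cp, cq = P.mean(1, keepdims=True), Q.mean(1, keepdims=True)
    U, _, Vt = np.linalg.svd((P - cp) @ (Q - cq).T)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, cq - R @ cp

def icp(P, Q, iters=20):
    """Refine the alignment of point set P (3, N) onto Q (3, M) when the
    point-to-point correspondences are not known in advance."""
    R_tot, t_tot = np.eye(3), np.zeros((3, 1))
    for _ in range(iters):
        moved = R_tot @ P + t_tot
        # nearest-neighbour correspondences (brute force, small N)
        d2 = ((moved[:, :, None] - Q[:, None, :]) ** 2).sum(axis=0)
        nn = d2.argmin(axis=1)
        # re-estimate and accumulate the rigid transform
        R, t = best_rigid(moved, Q[:, nn])
        R_tot, t_tot = R @ R_tot, R @ t_tot + t
    return R_tot, t_tot
```

This converges when the initial misalignment is small relative to the spacing between sensors, which is the regime after the first Rigid Transform described above.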
In another embodiment, further stereo images are captured. A third stereo image having a third set of sensors may be captured such that one or more sensors belong to both the third set of sensors and a common set of sensors containing sensors from the first set of sensors and the second set of sensors. Again, the three-dimensional positions of the sensors can be obtained by triangulation in two images of the third stereo image in a third coordinate system. The third coordinate system may be a coordinate system of the stereo camera in a third position of the stereo camera at which the third stereo image is obtained. A second transformation is determined for transforming the third coordinate system to the common coordinate system on the basis of the sensors belonging to both the third set of sensors and the common set of sensors. As a result, the three-dimensional positions of the third set of sensors may also be expressed in the first coordinate system as the common coordinate system by applying the second transformation.
After a certain number of stereo images have been captured at different positions around the cap, a transformation from a camera coordinate system to the common coordinate system may not be possible because a mathematical singularity is encountered. The common coordinate system may then be rotated. To that end, an n-th coordinate system for an n-th position of the stereo camera may be taken as a new common coordinate system. Three-dimensional positions of an (n+1)-th set of sensors obtained for an (n+1)-th position of the stereo camera can be obtained in the new common coordinate system. Positions already determined in the former common coordinate system may also be expressed in the new common coordinate system.
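Switching to a new common coordinate system amounts to composing and inverting the rigid transforms already determined. A minimal sketch (illustrative names) of the two operations needed to re-express previously determined positions in the new frame:

```python
import numpy as np

def compose(T1, T2):
    """Composition of rigid transforms: apply (R2, t2) first, then (R1, t1)."""
    (R1, t1), (R2, t2) = T1, T2
    return R1 @ R2, R1 @ t2 + t1

def invert(T):
    """Inverse of a rigid transform p -> R @ p + t."""
    R, t = T
    return R.T, -R.T @ t
```

If (R, t) maps the n-th camera coordinate system to the former common coordinate system, then positions already expressed in the former common system are moved into the new one by applying invert((R, t)) to them.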
In order to improve the accuracy of the determined positions for the sensors, in a further embodiment, the obtained three-dimensional positions for one or more of the sensors expressed in the common coordinate system may be averaged. The averaging step may be performed either after a certain number of captures of stereo images or after capturing all stereo images.
The positions of the sensors may, in a further embodiment, be transferred to a head coordinate system. This may e.g. be used for EEG procedures or to approximate the shape of the head of the subject. To that end, fiducial points of the head of the subject may be registered when capturing the stereo images such that the fiducial points are registered simultaneously with the sensors in at least some of the stereo images. The fiducial points may comprise the well-known nasion and preauricular points. The head coordinate system may be determined using the fiducial points registered in the stereo images. The three-dimensional positions of the sensors in the common coordinate system may then be expressed as three-dimensional positions of the sensors in the head coordinate system after determining a transformation, e.g. a Rigid Transform. Optionally, the sensors may then be labelled using a standard labelling scheme, e.g. the 10/20 scheme, for the cap.
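A head coordinate system can, for example, be built as an orthonormal basis from the three fiducials. The convention below (origin midway between the preauricular points, x pointing towards the nasion, z up) is only one of several conventions in use, and the names are illustrative:

```python
import numpy as np

def head_frame(nasion, lpa, rpa):
    """Build a head coordinate system from the three fiducial points.

    nasion, lpa, rpa : (3,) arrays in the common coordinate system
    (lpa/rpa: left/right preauricular point).
    Returns (R, origin) such that p_head = R @ (p_common - origin).
    """
    origin = (lpa + rpa) / 2.0
    x = nasion - origin
    x = x / np.linalg.norm(x)          # x axis: towards the nasion
    y = lpa - origin
    y = y - x * (x @ y)                # make y orthogonal to x
    y = y / np.linalg.norm(y)          # y axis: towards the left ear
    z = np.cross(x, y)                 # z axis: up
    R = np.stack([x, y, z])            # rows are the head-frame axes
    return R, origin

def to_head(p, R, origin):
    """Express a common-frame point p in the head coordinate system."""
    return R @ (p - origin)
```

With this convention the nasion ends up on the positive x axis and the left preauricular point on the positive y axis, after which a standard labelling scheme such as 10/20 can be applied.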
In one embodiment, sensors may be detected in the two images of each stereo image by segmenting images of the first stereo image and second stereo image using the optical distinction of the sensors from the cap, e.g. the luminance, in pixels associated with the cap and pixels associated with the sensors. Detection (e.g. identification) may be performed by modelling the image data using Gaussian Mixture Modelling. Assuming the sensors have a common shape, such as circular or elliptic, a Hough Transform is performed for detecting the sensors in the images.
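The Gaussian Mixture Modelling step may be illustrated with a simple two-component EM fit on pixel luminance, separating (here assumed brighter) sensor pixels from cap pixels; the subsequent circle detection could then be done with a Hough transform, e.g. OpenCV's HoughCircles. The sketch and its names are illustrative, not the disclosed implementation:

```python
import numpy as np

def gmm2_posterior(lum, iters=50):
    """Fit a two-component 1-D Gaussian mixture to pixel luminance with EM.

    lum : flat array of luminance values.
    Returns, per pixel, the posterior probability of the brighter
    component (assumption: sensors are brighter than the cap).
    """
    lum = np.asarray(lum, dtype=float)
    mu = np.array([lum.min(), lum.max()])          # initial means
    sd = np.array([lum.std(), lum.std()]) + 1e-6   # initial std devs
    w = np.array([0.5, 0.5])                       # mixture weights
    for _ in range(iters):
        # E-step: responsibilities (normalisation constants cancel)
        pdf = w * np.exp(-0.5 * ((lum[:, None] - mu) / sd) ** 2) / sd
        r = pdf / (pdf.sum(axis=1, keepdims=True) + 1e-300)
        # M-step: update weights, means, std devs
        n = r.sum(axis=0)
        w = n / n.sum()
        mu = (r * lum[:, None]).sum(axis=0) / n
        sd = np.sqrt((r * (lum[:, None] - mu) ** 2).sum(axis=0) / n) + 1e-6
    return r[:, 1]
```

The returned posterior can be thresholded at 0.5 to obtain a binary sensor mask on which the shape detection operates.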
In one embodiment, corresponding sensors may be matched in the two images of a stereo image using sparse stereo matching. In sparse stereo matching, disparities for determining distances are only calculated for feature points, e.g. the centre of the detected ellipse or circle. Sparse stereo matching saves resources, such as computation time. A further approach that may be used to reduce computational complexity is the rectification of each two images of the stereo image.
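Sparse stereo matching of the detected sensor centres can be sketched as follows: after rectification, corresponding centres lie on (nearly) the same image row, so candidates only need to be compared within a small row tolerance. All names and tolerances are illustrative:

```python
def match_sparse(left_pts, right_pts, row_tol=2.0, max_disp=100.0):
    """Match detected sensor centres between two rectified images.

    left_pts, right_pts : lists of (x, y) feature points, e.g. the
    centres of the detected circles or ellipses.
    Returns a list of (left_index, right_index, disparity) tuples.
    """
    matches = []
    for i, (xl, yl) in enumerate(left_pts):
        best = None
        for j, (xr, yr) in enumerate(right_pts):
            d = xl - xr                      # disparity candidate
            if abs(yl - yr) <= row_tol and 0 < d <= max_disp:
                # prefer the candidate on the closest row
                if best is None or abs(yl - yr) < abs(yl - best[1]):
                    best = (j, yr, d)
        if best is not None:
            matches.append((i, best[0], best[2]))
    return matches
```

Only the resulting per-feature disparities need to be triangulated, which is what makes the sparse approach cheap compared with dense stereo.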
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system." Functions described in this disclosure may be implemented as an algorithm executed by a microprocessor of a computer. Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied, e.g., stored, thereon.
Any combination of one or more computer readable media may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wired, optical fiber, cable, RF, etc., or any suitable combination of the foregoing. Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java(TM), Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor, in particular a microprocessor or central processing unit (CPU), of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer, other programmable data processing apparatus, or other devices create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
It is noted that the invention relates to all possible combinations of features recited in the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
Aspects of the invention will be explained in greater detail by reference to exemplary embodiments shown in the drawings, in which:
Fig. 1 is a schematic two-dimensional illustration of a system according to an embodiment of the disclosure;
Fig. 2 is a schematic illustration of a processing system for use in the system of Fig. 1;
Fig. 3 is a flow chart depicting steps according to an embodiment of a disclosed method;
Figs. 4A and 4B are a flow chart and a few schematic representations of another embodiment of the method;
Fig. 5 is a flow chart depicting more detailed steps according to an embodiment of the disclosed method; and
Fig. 6 shows experimental left and right images of a stereo image and obtained locations for the sensors expressed in a common coordinate system.
DETAILED DESCRIPTION OF THE DRAWINGS
Fig. 1 is a schematic planar illustration of a system 10 configured for locating a plurality of sensors 11 arranged on a cap C (see Fig. 6) worn on a head H of a person using a stereo camera 12. The head H has some known fiducial points, indicated as the preauricular points Al, A2 and the nasion N.
Sensors 11 may be electroencephalographic (EEG) electrodes. The electrodes 11 are typically provided in a standardized layout, wherein each of the electrodes is assigned an identity, as can be observed from Fig. 1.
As shown in Fig. 1, the sensors 11 are optically distinct from the head of the subject, in this case optically distinct from the cap C background, for the stereo camera 12, i.e. optically distinct in view of the detection capability of the stereo camera 12. For example, optical differences between the sensors and the cap may comprise one or more of brightness, colour, hue, luminance, reflectivity, absorption and material parameters, such as electrical conductivity, thermal conductivity etc.
Stereo camera 12 comprises two cameras connected by a rigid connector 13, such as a rigid bar. The mutual distance between the cameras of stereo camera 12 is less than 20 cm, preferably less than 10 cm, e.g. 8, 6 or 4 cm.
The stereo camera 12 is configured to rotate R around the head H of the person to obtain various stereo images. Alternatively, or in addition, the person rotates around an axis such that a relative rotation R of the stereo camera 12 around the head H is obtained. It should be noted that the rotation is not required to follow a perfect rotational path.
Stereo camera 12 is shown in a first position and a second position, wherein stereo camera 12 is configured to obtain a stereo image and provide the pixel data to a processing system 14. It should be appreciated that some or all of the processing may also be performed in the stereo camera 12. The coordinate system CS of the stereo camera 12 moves with the rotation R of the stereo camera 12. In the first position, the coordinate system is designated CS1 and in the second position, the coordinate system is designated CS2. Stereo camera 12 may comprise two connected webcams.
Fig. 2 is a block diagram illustrating an exemplary data processing system that may be used as the processing device 14.
Data processing system 14 may include at least one processor 21 coupled to memory elements 22 through a system bus 23. As such, the data processing system 14 may store program code within memory elements 22. Further, processor 21 may execute the program code accessed from memory elements 22 via system bus 23. In one aspect, data processing system 14 may be implemented as a computer that is suitable for storing and/or executing program code. It should be appreciated, however, that data processing system 14 may be implemented in the form of any system including a processor and memory that is capable of performing the functions described within this disclosure.
Memory elements 22 may include one or more physical memory devices such as, for example, local memory 24 and one or more bulk storage devices 25. Local memory may refer to random access memory or other non-persistent memory device(s) generally used during actual execution of the program code. A bulk storage device may be implemented as a hard drive or other persistent data storage device. The data processing system 14 may also include one or more cache memories (not shown) that provide temporary storage of at least some program code in order to reduce the number of times program code must be retrieved from bulk storage device 25 during execution.
Input/output (I/O) devices depicted as input device 26 and output device 27 optionally can be coupled to the data processing system 14. Examples of input device may include, but are not limited to, for example, a keyboard, a pointing device such as a mouse, or the like. Examples of output device may include, but are not limited to, for example, a monitor or display, speakers, or the like. Input device and/or output device may be coupled to data processing system either directly or through intervening I/O controllers. A network adapter 28 may also be coupled to data processing system 14 to enable it to become coupled to other systems, computer systems, remote network devices, and/or remote storage devices through intervening private or public networks. The network adapter 28 may comprise a data receiver for receiving data that is transmitted by said systems, devices and/or networks to the data processing system 14 and a data transmitter for transmitting data to said systems, devices and/or networks. Modems, cable modems, and Ethernet cards are examples of different types of network adapters that may be used with data processing system 14.
As pictured in Fig. 2, memory elements 22 may store an application 29. It should be appreciated that data processing system 14 may further execute an operating system (not shown) that can facilitate execution of the application. Applications, being implemented in the form of executable program code, can be executed by data processing system 14, e.g., by processor 21. Responsive to executing the application 29, the data processing system 14 may be configured to perform one or more operations to be described herein in further detail.
Fig. 3 is a flow chart 30 depicting steps according to an embodiment of a disclosed method for locating a plurality of sensors 11 arranged on a cap C using a stereo camera 12.
In a first step 31, at least a first stereo image having a first set of sensors and a second stereo image having a second set of sensors are captured at a first and a second position, respectively, of the stereo camera 12 relative to the cap by a relative rotation of the stereo camera around the cap. The first set of sensors and the second set of sensors may have one or more sensors in common. For example, the first and second set have three or four sensors in common.
In one embodiment, the phase of capturing stereo images is first completed before the stereo images are processed in the data processing system 14 in order to determine the three-dimensional positions of the sensors 11. In another embodiment, the processing is started as soon as sufficient stereo images have been captured.
Whereas the first stereo image may be the first captured stereo image, it should be appreciated that it may also be any other image in the sequence of stereo images. The second stereo image preferably is a preceding or subsequent stereo image in the sequence of stereo images, but may also be a stereo image further away in the sequence. The first stereo image and second stereo image may contain at least one common sensor.
The processing in data processing system 14 involves analysing two images of the first stereo image and two images of the second stereo image to determine, respectively, three-dimensional positions of the first set of sensors 11 in the first coordinate system CS1 and the second set of sensors 11 in the second coordinate system CS2. This is step 32 in Fig. 3. The first coordinate system CS1 and the second coordinate system CS2 are coordinate systems CS defined by the stereo camera 12 in respectively the first position and the second position as shown in Fig. 1. The three-dimensional positions of the sensors 11 in the first coordinate system CS1 and the second coordinate system CS2 can be obtained by triangulation in a manner known as such.
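The triangulation referred to above can be sketched with the standard linear (DLT) method, given the projection matrices obtained from camera calibration. This is a minimal illustration; the matrices and point values below are made up and are not taken from the patent:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point from two views.

    P1, P2: 3x4 camera projection matrices of the two cameras.
    x1, x2: 2D pixel coordinates of the same sensor in each image.
    Returns the 3D point in the coordinate system shared by P1 and P2.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                    # null-space vector of A
    return X[:3] / X[3]           # de-homogenise

# Two toy cameras: identity, and a camera offset 1 unit along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.5, 0.2, 4.0])
x1 = P1 @ np.append(X_true, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1.0); x2 = x2[:2] / x2[2]
print(triangulate(P1, P2, x1, x2))  # close to [0.5, 0.2, 4.0]
```

With noisy detections the SVD solution is a least-squares compromise between the two rays, which is why accurate calibration and rectification matter in the earlier steps.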
Optionally, the processing further involves expressing the three-dimensional positions of the first set of sensors 11 and the three-dimensional positions of the second set of sensors 11 in a common coordinate system CCS. This is shown in Fig. 3 as step 33. As such, the three-dimensional positions of the first set and second set of sensors are known in the common coordinate system. This may e.g. be achieved by a transformation from the first coordinate system CS1 to the second coordinate system CS2 or vice versa, or by a transformation of both the first and the second coordinate system to another coordinate system.
The sensors 11 are thereby localized on a cap using stereo imaging. By capturing a plurality of stereo images, the positions of the sensors 11 can be determined with respect to each other. For example, between 150 and 2500 stereo images may be taken, e.g. 200, 300, 400, 500, 600, 1000, 1200, 1500, 1700, 1800 or 2000 stereo images, depending e.g. on the distance between the cameras and the sensors. The technical requirements for the cameras are moderate, such that the method and system are less expensive than prior art methods and systems. In fact, in one experiment, the localization was done using two simple webcams.
Figs. 4A and 4B are a flow chart 40 and a few schematic representations for further clarifying the method.
The first stereo image and the second stereo image are captured in step 41 such that one or more sensors 11 belong to both the first set of sensors (sensors S1-S4) and the second set of sensors (sensors S3-S6). The three-dimensional positions of the sensors 11 are obtained by triangulation for the two images of the first stereo image in the first coordinate system CS1, and the three-dimensional positions of the sensors are obtained by triangulation for the two images of the second stereo image in the second coordinate system CS2, as very schematically shown in Fig. 4B. A first transformation function RT1 is determined for transforming the second coordinate system to the first coordinate system on the basis of the sensors belonging to both the first set of sensors and the second set of sensors. The transformation may comprise a Rigid Transform. The three-dimensional positions of the second set of sensors (S3-S6) are expressed as three-dimensional positions in the first coordinate system CS1 as the common coordinate system CCS by applying the first transformation RT1. In this manner, the three-dimensional positions of the sensors S1-S6 are all known in the first coordinate system. The mapping is assisted using the common points S3 and S4 in both coordinate systems CS1 and CS2.
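A Rigid Transform between two camera coordinate systems can be estimated from the sensors common to both sets. The sketch below uses the Kabsch/Procrustes SVD solution; the patent specifies a Rigid Transform but not a particular estimator, so this choice of estimator and the toy point values are assumptions:

```python
import numpy as np

def rigid_transform(A, B):
    """Best-fit rotation R and translation t with R @ A[i] + t ≈ B[i].

    A, B: (N, 3) arrays of matched sensor positions expressed in two
    coordinate systems (the sensors visible in both stereo images).
    """
    cA, cB = A.mean(axis=0), B.mean(axis=0)
    H = (A - cA).T @ (B - cB)                # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cB - R @ cA
    return R, t

# Toy check: rotate/translate four points (e.g. S3-S6) and recover RT1.
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 3))
angle = 0.3
Rz = np.array([[np.cos(angle), -np.sin(angle), 0.0],
               [np.sin(angle),  np.cos(angle), 0.0],
               [0.0, 0.0, 1.0]])
t_true = np.array([0.1, -0.2, 0.05])
B = A @ Rz.T + t_true
R, t = rigid_transform(A, B)
print(np.allclose(R, Rz), np.allclose(t, t_true))
```

At least three non-collinear common sensors are needed for the rotation to be fully determined, which is consistent with the overlap of three or four sensors mentioned earlier.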
Further stereo images are captured. A third stereo image having a third set of sensors (S5-S8) may be captured such that one or more sensors (S5, S6) belong to both the third set of sensors and a common set of sensors containing sensors from the first set of sensors and the second set of sensors. Again, the three-dimensional positions of the sensors (S5-S8) can be obtained by triangulation in the two images of the third stereo image in a third coordinate system CS3. The third coordinate system may be a coordinate system of the stereo camera 12 in a third position of the stereo camera 12 (not shown in Fig. 1) at which the third stereo image is obtained. A second transformation RT2 is determined for transforming the third coordinate system to the common coordinate system on the basis of the sensors belonging to both the third set of sensors and the common set of sensors (sensors S5 and S6 in this example). As a result, the three-dimensional positions of the third set of sensors can also be expressed in the first coordinate system as the common coordinate system by applying the second transformation RT2.
It should be appreciated that, while in Fig. 4B the coordinate systems are three-dimensional coordinate systems with perpendicular axes, other coordinate systems CS may be envisaged, such as polar coordinate systems and spherical coordinate systems.
Fig. 5 provides a more detailed flowchart 50 of a method for localizing sensors 11.
In general, the procedure starts with the first stereo image. As the stereo camera 12 moves, a series of images is captured. To determine the sensor positions, the data processing system 14 processes the series of images one by one. For each stereo image, the data processing system 14 searches for new sensors that were not found in previous stereo images. The new sensors 11 are added to a list of sensors and their positions. All sensors 11 may, optionally, be expressed in a common coordinate system CCS.
The first stereo image may be used to determine the common coordinate system CCS. The first stereo image may be the initial image for the procedure. In the first stereo image, a certain number of sensors 11 can be identified and their positions determined. For the first stereo image, the sensors 11 are already expressed in the CCS. This step initializes the set of discovered sensors and their positions.
As the stereo camera is rotated R, subsequent stereo images are captured. The steps below describe what is done when the data processing system 14 has processed previous stereo image n and moves to current stereo image n+1.
Each stereo image has a local coordinate system CSn defined by the stereo camera 12. These are 'local' coordinate systems.
In a first step 51, the cameras of the stereo camera 12 are calibrated.
The calibration step 51 may comprise two stages. An intrinsic calibration is performed to determine the deformations of the image due to imperfections in the lenses of the cameras. As each lens is different, the intrinsic properties of the lenses may be different. A calibration frame can be used to determine the distortions. This may be a black and white checkerboard of fixed dimensions. The image produced by the cameras is not a perfect checkerboard. However, the images of the reference frame can be undistorted as the dimensions are known. Using multiple calibration images, the intrinsic parameters can be found and used in the data processing system 14 to compensate for the distortions.
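What the intrinsic parameters describe can be illustrated with a simple pinhole model plus radial distortion: calibration estimates the focal lengths, principal point, and distortion coefficients so that their effect can be undone. The parameter values below are made-up, webcam-like numbers, not calibration results from the patent:

```python
import numpy as np

def project_with_distortion(X, fx, fy, cx, cy, k1, k2):
    """Project a 3D camera-frame point to a distorted pixel.

    fx, fy: focal lengths in pixels; cx, cy: principal point;
    k1, k2: radial distortion coefficients (the quantities an
    intrinsic calibration estimates).
    """
    x, y = X[0] / X[2], X[1] / X[2]          # normalised image coords
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2     # radial distortion factor
    return np.array([fx * x * scale + cx, fy * y * scale + cy])

# Illustrative (made-up) parameters for a webcam-class lens.
u = project_with_distortion(np.array([0.1, -0.05, 1.0]),
                            fx=600.0, fy=600.0, cx=320.0, cy=240.0,
                            k1=-0.2, k2=0.05)
print(u)  # slightly pulled toward the principal point vs. [380, 210]
```

Undistortion inverts this mapping (typically iteratively); once the coefficients are known, the data processing system can work with ideal pinhole images.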
The second part of the camera calibration is the extrinsic calibration of the cameras. This determines the spatial relation between the two cameras. Using a calibration frame, e.g. similar to the one used for the intrinsic calibration, the extrinsic parameters can be found.
In a next step 52, stereo images are obtained from the cap C having the sensors disposed thereon by relative rotation of the stereo camera 12 around the cap C.
In step 53, the images of each of the stereo images are rectified in accordance with the calibration. For stereo matching, the images can be rectified so that, for a point in one image, the corresponding point in the other image lies on a known epipolar line in that image space.
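The epipolar constraint behind rectification can be sketched as follows: the epipolar line in the right image for a left-image point x is l' = F x, and for rectified cameras with a purely horizontal baseline the fundamental matrix takes the canonical form used below. The matrices and point are illustrative, not the patent's calibration:

```python
import numpy as np

def epipolar_line(F, x):
    """Epipolar line coefficients (a, b, c) with a*u + b*v + c = 0
    in the other image, for pixel x = (u, v) in this image."""
    return F @ np.append(x, 1.0)

# Canonical fundamental matrix for rectified cameras whose baseline
# is a pure horizontal translation: epipolar lines become horizontal.
F_rect = np.array([[0.0, 0.0, 0.0],
                   [0.0, 0.0, -1.0],
                   [0.0, 1.0, 0.0]])
l = epipolar_line(F_rect, np.array([100.0, 40.0]))
print(l)  # [0, -1, 40]: the horizontal line v = 40
```

After rectification the search for a corresponding sensor detection thus reduces to a scan along a single image row, which simplifies the matching in step 55.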
In a next step 54, the sensors 11 in both images are identified. The sensors 11 are assumed to have a certain shape, e.g. circular or elliptic. The identification is performed using Gaussian mixture modelling to segment the sensors 11 from the cap background. To this end, optical differences between the sensors 11 and the background can be exploited, e.g. the luminance. Each image may be segmented into two or more segments. Each segment is then modelled as a Gaussian distribution and, using Expectation Maximization (EM), the likelihood of the observed pixel intensities is maximized.
This results in two masks, one describing the pixels in the image that have a high probability of being of high intensity and the other describing the pixels of low intensity. Since the sensors have a higher light intensity than the cap, these can be segmented appropriately. A Hough transform may then be applied to fit an ellipse to the set of pixels that the Gaussian mixture modelling algorithm identifies as corresponding to a sensor.
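The segmentation idea can be illustrated with a minimal 1-D two-component Gaussian mixture fitted with EM on pixel intensities. The synthetic intensities below stand in for dark cap pixels and bright sensor pixels; they are not the patent's data, and a real implementation would run on image luminance values:

```python
import numpy as np

def em_gmm_1d(x, iters=50):
    """Fit a two-component 1-D Gaussian mixture with EM.

    Returns the component means, standard deviations, and a boolean
    mask of pixels assigned to the 'bright' (high-mean) component.
    """
    mu = np.array([x.min(), x.max()], dtype=float)  # init low/high
    sigma = np.array([x.std(), x.std()]) + 1e-6
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each pixel
        pdf = (pi / (sigma * np.sqrt(2 * np.pi)) *
               np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2))
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and spreads
        n = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / n
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n) + 1e-6
        pi = n / len(x)
    return mu, sigma, r[:, 1] > 0.5

# Synthetic intensities: dark cap background plus bright sensors.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(40, 8, 900), rng.normal(200, 10, 100)])
mu, sigma, mask = em_gmm_1d(x)
print(np.round(mu), mask.sum())  # means near [40, 200], roughly 100 bright pixels
```

The bright-pixel mask corresponds to the high-intensity mask described above; ellipse fitting (e.g. via a Hough transform) then turns connected bright regions into sensor detections.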
In step 55, the sensors 11 on the cap C are localized in the coordinate system CS of the stereo camera.
First, the sensors 11, detected in each of the images of a stereo image, are matched using the K-nearest neighbour (KNN) algorithm. Taking a training set of entities as points, and taking a single entity as a query point in a Euclidean (multi-dimensional) space, the distances are calculated between the query point and all the training points. Taking the K nearest points, the most similar points can be obtained and a match can be determined. Using the matched results for the sensors 11, the position can be triangulated to obtain the three-dimensional position of the sensors 11 in the coordinate system CS of the stereo camera 12.
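The nearest-neighbour matching step can be sketched with K=1 plus a mutual-nearest check; the mutual check is an added safeguard against ambiguous matches, not a step stated in the patent, and the detection coordinates below are illustrative:

```python
import numpy as np

def match_nearest(left_pts, right_pts):
    """Match each left-image detection to its nearest right-image
    detection (K=1 nearest neighbour in Euclidean distance).

    Returns index pairs (i, j); a pair is kept only if the points
    are mutual nearest neighbours.
    """
    d = np.linalg.norm(left_pts[:, None, :] - right_pts[None, :, :], axis=2)
    j_for_i = d.argmin(axis=1)   # nearest right point for each left point
    i_for_j = d.argmin(axis=0)   # nearest left point for each right point
    return [(i, j) for i, j in enumerate(j_for_i) if i_for_j[j] == i]

# In rectified images, detections differ mostly by a horizontal disparity.
left = np.array([[100.0, 40.0], [150.0, 80.0], [210.0, 45.0]])
right = left - np.array([12.0, 0.0])       # 12-pixel disparity
print(match_nearest(left, right))  # [(0, 0), (1, 1), (2, 2)]
```

Each matched pair of detections then feeds the triangulation that yields the sensor's 3D position in the camera coordinate system.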
In step 56, the localized sensors 11 for a certain stereo image are registered to those already available in the common set of sensors. This common set contains the 3D positions of the sensors localized in all previous stereo images.
As for stereo image n, sensors 11 have also been detected for stereo image n+1 (in step 54) and the positions of the sensors 11 have been determined with respect to a local coordinate system CSn+1 (in step 55). In this stereo image n+1, a certain number of sensors have been identified. As only a small rotation step R is performed between consecutive stereo images, some of the sensors 11 are close to the same sensors as the ones detected in stereo image n. The rotation of the previous step is used by the algorithm as information on where to find sensors common to the coordinate systems CS. The rotation between two stereo images may be such that at least one common sensor can be identified and therefore depends e.g. on the distance between the various sensors.
Using this information, a transformation RT can be determined, using the iterative closest point (ICP) method, that transforms the coordinate system CSn+1 into the coordinate system CSn of the previous stereo image n. This transformation is a Rigid Transform. A rigid transformation can be used to describe the translation and rotation between coordinate systems CSn and CSn+1, but also between the common coordinate system CCS and a local coordinate system, such as CSn or CSn+1. Translation and rotation together can describe all changes from one coordinate system CS to another.
In particular, in stereo image n+1, a set of sensors 11 and their positions relative to the coordinate system CSn+1 of the stereo camera 12 are determined. Some of these sensors are new, some have already been identified. Some sensors from the current coordinate system CSn+1 will match those already discovered in previous iterations, as the total of discovered sensors 11 and their positions in the common coordinate system CCS are kept in a list. By using the registered sensors 11 of the previous stereo image n and the Rigid Transform RT between the current and the previous coordinate system, a set of matching sensors is obtained. This set is then used in the ICP method to obtain the Rigid Transform between the current coordinate system and the common coordinate system CCS. With this Rigid Transform, the positions of all discovered sensors in the current coordinate system CSn+1 can be expressed as positions in the CCS.
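The ICP registration can be sketched as alternating two sub-steps: pair each source point with its nearest destination point, then solve the best rigid transform (Kabsch) for that pairing and apply it. The solver details, iteration count, and toy data below are illustrative assumptions, not the patent's exact implementation:

```python
import numpy as np

def best_rigid(A, B):
    """Kabsch: R, t minimising ||R @ A[i] + t - B[i]||."""
    cA, cB = A.mean(axis=0), B.mean(axis=0)
    U, _, Vt = np.linalg.svd((A - cA).T @ (B - cB))
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # avoid reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cB - R @ cA

def icp(src, dst, iters=20):
    """Iterative closest point: nearest-point pairing followed by a
    rigid-transform solve, repeated. Returns accumulated R, t."""
    R_tot, t_tot = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(iters):
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        pairs = d.argmin(axis=1)             # closest dst point per src point
        R, t = best_rigid(cur, dst[pairs])
        cur = cur @ R.T + t
        R_tot, t_tot = R @ R_tot, R @ t_tot + t   # compose transforms
    return R_tot, t_tot

# Small-rotation case, as between consecutive stereo images.
rng = np.random.default_rng(2)
src = rng.normal(size=(8, 3))
a = 0.05                                     # roughly 3 degrees
Rz = np.array([[np.cos(a), -np.sin(a), 0.0],
               [np.sin(a),  np.cos(a), 0.0],
               [0.0, 0.0, 1.0]])
t_true = np.array([0.02, 0.01, 0.0])
dst = src @ Rz.T + t_true
R, t = icp(src, dst)
print(np.allclose(R, Rz, atol=1e-5))
```

ICP only converges when the initial pairing is mostly correct, which is exactly why the small rotation between consecutive stereo images (and the rebasing of the common frame described next) matters.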
The sensors 11 that have not previously been identified are added to the set of sensors listed for the common coordinate system CCS. A mathematical singularity in the Rigid Transformation may be encountered after processing a number of stereo images by the data processing system 14. This means that the algorithm may not be able to relate the positions of the sensors 11 in the current coordinate system CS to those in the common coordinate system CCS because the change is too large. To solve this, the common coordinate system CCS is changed so that the current coordinate system CSn+1 (and subsequent coordinate systems for subsequent positions of the stereo camera 12) are 'closer' to the new common coordinate system CCS' and the transformation RT can be made using the ICP algorithm. This shift of the global frame may be performed whenever necessary.
In order to improve the accuracy of the determined three-dimensional location of the sensors 11, the determined locations of the sensors 11 may be averaged using all determined locations in the set of locations for the common coordinate system. Averaging may occur for every i-th stereo image or after capturing all stereo images. The optional averaging step is shown as step 57 in Fig. 5.
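The optional averaging step amounts to taking the mean of each sensor's accumulated positions in the common coordinate system. A minimal sketch, with hypothetical labels and values:

```python
import numpy as np

def average_positions(observations):
    """observations: dict mapping a sensor label to the list of its
    3D positions accumulated over the processed stereo images.
    Returns one averaged 3D position per sensor."""
    return {label: np.mean(pts, axis=0) for label, pts in observations.items()}

# Hypothetical accumulated observations for two sensors.
obs = {"S1": [np.array([1.0, 2.0, 3.0]), np.array([1.2, 1.8, 3.0])],
       "S2": [np.array([0.0, 0.0, 1.0])]}
avg = average_positions(obs)
print(avg["S1"])  # [1.1, 1.9, 3.0]
```

Averaging over many sightings suppresses per-image triangulation noise, at the cost of keeping all accumulated positions in memory until the average is taken.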
After the localization of the sensors 11 is finished, the obtained three-dimensional positions of the sensors 11 may be transformed once again, e.g. to a head coordinate system HCS. To that end, the markers N, A1 and A2 (see Fig. 1) on the head H are detected to determine the HCS. The HCS is then determined and a Rigid Transform is determined to transform the sensor positions in the CCS to sensor positions in the HCS. Appropriate labels, as depicted in Fig. 1, may be assigned to the sensors using the standard layout of the cap C and the KNN algorithm.
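A head coordinate system can be constructed from the three fiducial markers. The axis convention below (origin midway between the two ear markers, x toward the right ear, y toward the nasion, z up) is one common choice and an assumption; the patent names the markers N, A1 and A2 but does not fix the convention, and the marker coordinates are toy values:

```python
import numpy as np

def head_frame(nasion, left_ear, right_ear):
    """Build an orthonormal head coordinate system from fiducials.

    Returns R, t such that p_head = R @ (p_common - t) for a point
    p_common expressed in the common coordinate system CCS.
    """
    origin = (left_ear + right_ear) / 2.0
    x = right_ear - origin
    x /= np.linalg.norm(x)
    y = nasion - origin
    y -= x * (y @ x)                 # make y orthogonal to x
    y /= np.linalg.norm(y)
    z = np.cross(x, y)               # completes a right-handed frame
    R = np.vstack([x, y, z])         # rows are the head axes
    return R, origin

# Fiducial markers expressed in the CCS (toy values, metres).
N = np.array([0.0, 0.09, 0.02])      # nasion
A1 = np.array([-0.07, 0.0, 0.0])     # left pre-auricular marker
A2 = np.array([0.07, 0.0, 0.0])      # right pre-auricular marker
R, t = head_frame(N, A1, A2)
sensor_ccs = np.array([0.0, 0.09, 0.02])     # a sensor at the nasion
print(R @ (sensor_ccs - t))  # lies on the +y head axis
```

Once R and t are known, every sensor position in the CCS is mapped into the head frame by the same rigid transform, which makes the sensor layout comparable across recording sessions.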
Fig. 6 shows experimental left and right images of a stereo image (top-left image and top-right image, respectively) and a collection of sensors identified over a series of processed images (bottom-left image).
It is noted that the method has been described in terms of steps to be performed, but it is not to be construed that the steps must be performed in the exact order described and/or one after another. One skilled in the art may envisage changing the order of the steps and/or performing steps in parallel to achieve equivalent technical results.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present invention has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention. The embodiment was chosen and described in order to best explain the principles of the invention and the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.
Various embodiments of the invention may be implemented as a program product for use with a computer system or a processor, where the program(s) of the program product define functions of the embodiments (including the methods described herein). In one embodiment, the program(s) can be contained on a variety of non-transitory computer-readable storage media (generally referred to as "storage"), where, as used herein, the expression "non-transitory computer readable storage media" comprises all computer-readable media, with the sole exception being a transitory, propagating signal. In another embodiment, the program(s) can be contained on a variety of transitory computer-readable storage media. Illustrative computer-readable storage media include, but are not limited to: (i) non-writable storage media (e.g., read only memory devices within a computer such as CD-ROM disks readable by a CD-ROM drive, ROM chips or any type of solid-state non-volatile semiconductor memory) on which information is permanently stored; and (ii) writable storage media (e.g., flash memory, floppy disks within a diskette drive or harddisk drive or any type of solid-state random-access semiconductor memory) on which alterable information is stored.

Claims (15)

1. A method for locating, with the aid of a stereo camera, a plurality of sensors arranged on a head of a subject, e.g. on a cap worn on the head, wherein the sensors are optically distinct for the stereo camera relative to the head or the cap, the method comprising the steps of: capturing at least a first stereo image with a first set of sensors and a second stereo image with a second set of sensors by a relative rotation of the stereo camera around the cap; analysing two images of the first stereo image and two images of the second stereo image to determine three-dimensional positions of the first set of sensors in a first coordinate system and of the second set of sensors in a second coordinate system; expressing the three-dimensional positions of the first set of sensors and the three-dimensional positions of the second set of sensors in a common coordinate system.

2. The method according to claim 1, further comprising the steps of: capturing the first stereo image and the second stereo image such that one or more sensors belong to both the first set of sensors and the second set of sensors; obtaining the three-dimensional positions of the sensors by triangulation for the two images of the first stereo image in the first coordinate system and the three-dimensional positions of the sensors by triangulation for the two images of the second stereo image in the second coordinate system; determining a first transformation for transforming the second coordinate system to the first coordinate system on the basis of the sensors belonging to both the first set of sensors and the second set of sensors; expressing the three-dimensional positions of the second set of sensors as three-dimensional positions in the first coordinate system as the common coordinate system by applying the first transformation.

3. The method according to claim 2, further comprising the steps of: capturing a third stereo image with a third set of sensors such that one or more sensors belong to both the third set of sensors and a common set of sensors, wherein the common set contains sensors from the first set of sensors and the second set of sensors; obtaining three-dimensional positions of the sensors by triangulation in two images of the third stereo image in a third coordinate system; determining a second transformation for transforming the third coordinate system to the common coordinate system on the basis of the sensors belonging to both the third set of sensors and the common set of sensors; expressing the three-dimensional positions of the third set of sensors in the first coordinate system as the common coordinate system by applying the second transformation.

4. The method according to one or more of the preceding claims, comprising the steps of: using an n-th coordinate system for an n-th position of the stereo camera as a new common coordinate system; expressing three-dimensional positions of an (n+1)-th set of sensors obtained in an (n+1)-th position of the stereo camera in the new common coordinate system.

5. The method according to one or more of the preceding claims, comprising the step of averaging the obtained three-dimensional positions for one or more of the sensors expressed in the common coordinate system.

6. The method according to one or more of the preceding claims, further comprising the steps of: registering fiducial points on the head while capturing the stereo images; determining a coordinate system for the head using the fiducial points registered with the stereo images; expressing the three-dimensional positions of the sensors in the common coordinate system as three-dimensional positions of the sensors in the coordinate system for the head; and optionally, labelling the sensors using a standard labelling scheme for the cap.

7. The method according to one or more of the preceding claims, further comprising the step of detecting sensors in the two images of each stereo image by segmenting images of the first stereo image and the second stereo image using the optical distinction of the sensors relative to the cap and applying a Hough transform for detecting the sensors in the images, wherein the sensors preferably have a circular or elliptical shape.

8. The method according to one or more of the preceding claims, further comprising the step of matching corresponding sensors in the two images of the first stereo image and matching corresponding sensors in the two images of the second stereo image, wherein the matching step comprises sparse stereo matching.

9. The method according to claim 8, further comprising the step of aligning the two images of the first stereo image and aligning the two images of the second stereo image prior to performing the matching step.

10. A system for locating a plurality of sensors arranged on a head of a subject, e.g. on a cap worn on the head, wherein the sensors are optically distinct for a stereo camera relative to the head or the cap, the system comprising: a stereo camera arranged for relative rotation around the cap; processing means arranged for: capturing at least a first stereo image with a first set of sensors and a second stereo image with a second set of sensors by a relative rotation of the stereo camera around the cap; analysing two images of the first stereo image and two images of the second stereo image to determine three-dimensional positions of the first set of sensors in a first coordinate system and of the second set of sensors in a second coordinate system; expressing the three-dimensional positions of the first set of sensors and the three-dimensional positions of the second set of sensors in a common coordinate system.

11. The system according to claim 10, wherein the processing means are further arranged to perform the steps of one or more of claims 2-9.

12. A computer program comprising software code portions arranged to, when executed by a computer system, perform the steps according to one or more of claims 1-9.

13. A non-transitory computer-readable medium or media on which the computer program according to claim 12 is stored.

14. Use of a stereo camera, preferably using only two mechanically connected cameras, for locating a plurality of sensors arranged on the head of a subject, e.g. on a cap worn on the head, wherein the sensors are optically distinct for the stereo camera from the head or the cap, by relative rotation of the stereo camera around the cap.

15. Use of the stereo camera according to claim 14, further comprising the use of the stereo camera for performing one or more of the steps of claims 1-9.
NL2011858A 2013-11-28 2013-11-28 Stereo-imaging sensor position localization method and system. NL2011858C2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
NL2011858A NL2011858C2 (en) 2013-11-28 2013-11-28 Stereo-imaging sensor position localization method and system.
PCT/NL2014/050811 WO2015080583A1 (en) 2013-11-28 2014-11-28 Stereo-imaging sensor position localization method and system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
NL2011858A NL2011858C2 (en) 2013-11-28 2013-11-28 Stereo-imaging sensor position localization method and system.
NL2011858 2013-11-28

Publications (1)

Publication Number Publication Date
NL2011858C2 true NL2011858C2 (en) 2015-06-01

Family

ID=50114490

Family Applications (1)

Application Number Title Priority Date Filing Date
NL2011858A NL2011858C2 (en) 2013-11-28 2013-11-28 Stereo-imaging sensor position localization method and system.

Country Status (2)

Country Link
NL (1) NL2011858C2 (en)
WO (1) WO2015080583A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115670659A (en) * 2022-11-04 2023-02-03 上海微创医疗机器人(集团)股份有限公司 Tissue registration system, method and device

Citations (2)

Publication number Priority date Publication date Assignee Title
EP1422495A1 (en) * 2001-07-30 2004-05-26 Topcon Corporation Surface shape measurement apparatus, surface shape measurement method, surface state graphic apparatus
EP2551633A1 (en) * 2010-03-25 2013-01-30 Kabushiki Kaisha Toshiba Three dimensional distance measuring device and method

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
US7190826B2 (en) 2003-09-16 2007-03-13 Electrical Geodesics, Inc. Measuring the location of objects arranged on a surface, using multi-camera photogrammetry
EP2561810A1 (en) * 2011-08-24 2013-02-27 Université Libre de Bruxelles Method of locating eeg and meg sensors on a head


Non-Patent Citations (4)

Title
ABBOT, A. L., "Merging multiple stereo surface maps through camera self-calibration", Proceedings of IEEE SOUTHEASTCON, Williamsburg, 7-10 April 1991, IEEE, New York, pp. 356-360, XP010045119, ISBN 978-0-7803-0033-0, DOI: 10.1109/SECON.1991.147772 *
SARKIS, M. et al., "Sparse stereo matching using belief propagation", 15th IEEE International Conference on Image Processing (ICIP 2008), San Diego, 12-15 October 2008, IEEE, Piscataway, pp. 1780-1783, XP031374368, ISBN 978-1-4244-1765-0 *
NAIR, P. S. et al., "Hough transform based ellipse detection algorithm", Pattern Recognition Letters, vol. 17, no. 7, 10 June 1996, pp. 777-784, XP004007562, ISSN 0167-8655, DOI: 10.1016/0167-8655(96)00014-1 *
BAYSAL, U. et al., "Single Camera Photogrammetry System for EEG Electrode Identification and Localization", Annals of Biomedical Engineering, vol. 38, no. 4, 9 February 2010, pp. 1539-1547, XP019786081, ISSN 1573-9686 *

Also Published As

Publication number Publication date
WO2015080583A1 (en) 2015-06-04

Similar Documents

Publication Publication Date Title
US20250241559A1 (en) Medical camera assembly comprising range camera and thermographic camera
JP5467404B2 (en) 3D imaging system
Pintaric et al. Affordable infrared-optical pose-tracking for virtual and augmented reality
JP6360260B2 (en) Optical tracking method and system based on passive markers
JP2019194616A (en) Position detection method, device and equipment based upon image, and storage medium
CN111626125A (en) Face temperature detection method, system and device and computer equipment
RU2015141083A (en) SEGMENTATION OF LARGE OBJECTS FROM SEVERAL THREE-DIMENSIONAL VIEWS
BRPI0919448B1 (en) method for tracking a follicular unit and system for tracking a follicular unit.
CN111768486A (en) Method and system for 3D reconstruction of monocular camera based on rotating refractor
CN114004891B (en) A distribution network line inspection method based on target tracking and related devices
Cardone et al. Automated warping procedure for facial thermal imaging based on features identification in the visible domain
CN112200838A (en) Projectile trajectory tracking method, device, equipment and storage medium
Patruno et al. Optical encoder neural network: a CNN-based optical encoder for robot localization
CN109345632B (en) Method for acquiring image, related device and readable storage medium
Safavian et al. Endoscopic measurement of the size of gastrointestinal polyps using an electromagnetic tracking system and computer vision-based algorithm
CN117896626A (en) Method, device, equipment and storage medium for detecting motion trajectory with multiple cameras
Baysal et al. Single camera photogrammetry system for EEG electrode identification and localization
NL2011858C2 (en) Stereo-imaging sensor position localization method and system.
BR102014025597A2 (en) method and reference system of an imagined object
Bae et al. Fast and scalable 3D cyber-physical modeling for high-precision mobile augmented reality systems
US20210383147A1 (en) Methods and systems for translating fiducial points in multispectral imagery
US10390798B2 (en) Computer-aided tracking and motion analysis with ultrasound for measuring joint kinematics
CN108564626B (en) Method and apparatus for determining relative attitude angles between cameras mounted on acquisition entities
Neves et al. A calibration algorithm for multi-camera visual surveillance systems based on single-view metrology
Yoshida et al. 3D measurement of a moving target using multiple slits with a random-dot pattern

Legal Events

Date Code Title Description
MM Lapsed because of non-payment of the annual fee

Effective date: 20161201