GB2485359A - A Simulator Including A Controller And A Means For 3-Dimensional Position Recognition Using Correlation - Google Patents

A Simulator Including A Controller And A Means For 3-Dimensional Position Recognition Using Correlation

Info

Publication number
GB2485359A
GB2485359A GB1018974.4A GB201018974A
Authority
GB
United Kingdom
Prior art keywords
image
controller
training
mach
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1018974.4A
Other versions
GB201018974D0 (en)
GB2485359B (en)
Inventor
Jan Telensky
Prajay Kamat
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
J T CONSULTANCY Ltd
Original Assignee
J T CONSULTANCY Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by J T CONSULTANCY Ltd filed Critical J T CONSULTANCY Ltd
Priority to GB1018974.4A priority Critical patent/GB2485359B/en
Publication of GB201018974D0 publication Critical patent/GB201018974D0/en
Priority to IN883DEN2012 priority patent/IN2012DN00883A/en
Priority to CN201180003615.4A priority patent/CN102667858B/en
Priority to GB1119413.1A priority patent/GB2485471B/en
Priority to CN201180003614.XA priority patent/CN102652328B/en
Priority to IN884DEN2012 priority patent/IN2012DN00884A/en
Priority to RU2012105332/12A priority patent/RU2600906C2/en
Priority to PCT/GB2011/052187 priority patent/WO2012063068A1/en
Priority to GB1119412.3A priority patent/GB2486527B/en
Priority to PCT/GB2011/052188 priority patent/WO2012063069A1/en
Priority to RU2012105335A priority patent/RU2608350C2/en
Publication of GB2485359A publication Critical patent/GB2485359A/en
Application granted granted Critical
Publication of GB2485359B publication Critical patent/GB2485359B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/12Edge-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10048Infrared image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning

Abstract

The invention relates to a training simulator for use in teaching plumbing skills. The simulator includes a controller comprising a gyroscope, magnetometer and accelerometer. The controller is formed of two rotatably connected portions and includes a bending sensor, to measure the relative angle between the two portions, and a pressure sensor. The simulator includes a method and apparatus for 3-dimensional position recognition of the controller. A camera and an infrared camera are provided for capturing images of the controller. The infrared image is processed using an active contour model to produce a training image from an image from the camera. An OT-MACH filter is constructed from the training image and correlated with images from the camera to measure the position of the controller.

Description

A SIMULATOR INCLUDING A CONTROLLER AND A METHOD AND APPARATUS FOR 3-DIMENSIONAL POSITION RECOGNITION

This invention relates to a simulator, including a controller and a method and apparatus for 3-dimensional position recognition thereof. More specifically, but not exclusively, this invention relates to a simulator for the training of a tradesman.
Traditionally, tradesmen, such as plumbers, have learned their trade on the job as an apprentice. An apprentice learns the various requisite skills by attempting to replicate their master's work. Apprenticeships provide a focussed, personal training experience. However, this form of training is not scalable, as the master's time is dissipated over many pupils.
Furthermore, at least in the early stages of training, the apprentice will make mistakes, which will cost the master and deter him/her from hiring apprentices in the future.
Vocational training courses were developed to give pupils the initial experience needed to start work as a tradesman, in an attempt to reduce the initial, costly, period of the apprenticeship. However, these courses are subject to relatively high fees due to the number of mistakes the pupils make in the early stages.
According to a first aspect of the invention, there is provided apparatus, for measuring a position of an object, comprising a first camera, for producing an infrared image of the object; a second camera, for producing a calibration image of the object; a processor for producing a training image, configured to extract a vector, corresponding to an edge of the object, from the infrared image using an active contour model, and to apply the vector to the calibration image to produce the training image; an OT-MACH filter, constructed from the training image; and the processor being configured to correlate the image from the second camera to the OT-MACH filter to determine the position of the object in x-y coordinates by comparing the amplitude of the maximum peak in the OT-MACH correlation plane to a detection threshold.
By applying an active contour model to the image from the infrared camera, which is in turn used to create the training image for the OT-MACH filter, the apparatus may accurately measure the position of the object with only one infrared camera. In the prior art, multiple infrared cameras are used to measure the position of the object. The present invention therefore reduces the cost of 3-dimensional position recognition, by alleviating the need for multiple, expensive, infrared cameras. The active contour model works synergistically with the infrared camera, as the contrast between the object and a human being allows an accurate vector corresponding to an edge of the object to be produced. The apparatus therefore works well in cluttered environments, such as a living-room.
The present invention can therefore use a standard camera, e.g. a colour camera such as a VGA camera, whose images from the visible spectrum may be used by the apparatus.
Preferably, a plurality of rotated training images are produced, the rotated training images constructed by rotating the training image, and the OT-MACH filter being constructed from the rotated training images.
By rotating the training images, the apparatus may maintain a track on the object, that is, may accurately and consistently measure the position of the object, for a longer period without updating the OT-MACH filter with a new training image. The rotated training images increase the tolerance of the apparatus to changes in orientation, scale and position.
Preferably, the training image, or rotated training images, are produced periodically to update the OT-MACH filter. The OT-MACH filter may be updated between every 0.5 and 1.5 seconds, or more preferably, every second.
By updating the OT-MACH filter with a new training image, and using a set of rotated training images, the OT-MACH filter is 'retrained'. This increases the accuracy of the OT-MACH filter for a subsequent image from the second camera.
The object may be a controller, the controller having a first portion and second portion, the portions being rotatably connected, a magnetometer, a gyroscope, an accelerometer, and a bending sensor, the bending sensor configured to measure a relative angle between the first and second portion.
The magnetometer, gyroscope and accelerometer provide accurate measurements of the orientation of the controller. Furthermore, the bending sensor provides data relating to the relative angle between the first and second portion, which may be used in a simulator to recreate the scenario of bending a pipe. The bending sensor may be a friction plate.
The controller may further comprise a motor for resisting change in the relative angle between the portions. The apparatus may therefore output feedback to the user by resisting change in the relative angle between the first and second portion. Therefore, in the scenario of bending a pipe, the apparatus may simulate the pipe's resistance to bending.
The controller may further comprise a pressure sensor. The pressure sensor may be used to measure the amount of pressure applied by a user to the controller. This may be used in a simulator, for example, to recreate using a blowtorch.
Optionally, the first or second portion includes a groove, the pressure sensor being positioned in the groove.
According to a second aspect of the invention, there is provided a method, for measuring a position of an object, the method comprising the steps of: acquiring an infrared image including the object from an infrared camera; acquiring an image, including the object, and a calibration image, including the object, from a camera; producing a vector, corresponding to an edge of the object, from the infrared image, using an active contour model; producing a training image, by extracting the object from the calibration image using the vector; constructing an OT-MACH filter using the training image; and measuring the position of the object in the image, by correlating the image with the OT-MACH filter.
The method may further comprise the step of producing a plurality of rotated training images, the rotated training images being constructed by rotating the training image, and constructing the OT-MACH filter from the rotated training images.
The training image, or plurality of rotated training images may periodically update the OT-MACH filter. The OT-MACH filter may be updated every 0.5 to 1.5 seconds, or more preferably, every second.
A computer program, embodied on a computer-readable medium, may be configured to execute the method according to the second aspect of the invention.
According to a third aspect of the invention, there is provided a controller comprising a first portion and second portion, the portions being rotatably connected, a magnetometer, a gyroscope, an accelerometer, and a bending sensor, wherein the bending sensor is configured to measure a relative angle between the first and second portion.
The bending sensor may be a friction plate.
The controller may further comprise a motor for resisting change in the relative angle between the portions.
The controller may further comprise a pressure sensor. Preferably, the first or second portion includes a groove, the pressure sensor being positioned in the groove.
Embodiments of the invention will now be described, by way of example, and with reference to the drawings, in which:
Figure 1 illustrates a simulator of an embodiment of the present invention, including a computer, head mounted display, controller and camera unit;
Figure 2 illustrates a flow diagram illustrating a method of measuring the x-axis and y-axis co-ordinates of the controller of an embodiment of the present invention;
Figure 3 illustrates an OT-MACH filter of the embodiment of Figure 2;
Figure 4 illustrates a method of measuring the z-axis co-ordinate of the controller of the embodiment of Figure 2;
Figure 5 illustrates a controller of an embodiment of the present invention, showing a first portion and a second portion in a parallel position;
Figure 6 illustrates the controller of Figure 5, showing a relative angle between the first and second portion;
Figure 7 illustrates the controller of Figure 5, showing the first and second portion in a perpendicular position; and
Figure 8 illustrates the hardware of the controller of Figure 5.
Figure 1 illustrates an overview of a simulator 1 of the present invention. The simulator 1 includes a controller 100, a computer 200, a camera unit 300 and a head mounted display 400. For the purposes of this description, the computer 200 is configured to run a computer program which simulates a training scenario, such as using a blowtorch or bending a pipe.
The computer 200 receives data from the controller 100 and the camera unit 300. The controller 100 includes various sensors to measure spatial properties, such as acceleration and orientation, and to measure user input. The controller 100 outputs the data from the sensors to the computer 200. The camera unit 300 includes a first camera 310 and a second, infrared, camera 320, for image acquisition. The camera unit 300 outputs the image data to the computer 200.
The computer 200 is configured to process the data from the controller 100 and camera unit 300 as input variables in the computer program. The controller 100 provides spatial data, such as acceleration and orientation, and user inputs, and the camera unit 300 provides images which may be processed for 3-dimensional position recognition of the controller 100.
The computer program, which may simulate a training scenario, can therefore give the user an immersive and accurate simulation of a real-life skill, such as using a blowtorch or bending a pipe. Each component of the simulator 1 is described in more detail below.
* 3-Dimensional Position Recognition

In normal use, the simulator 1 is set up in a room, with the camera unit 300 facing the controller 100. Generally, the camera unit 300 will be positioned against a wall, and face the controller 100 in the centre of the room. The controller 100 is held by a user.
The computer 200 is configured to calculate the position of the controller 100 in three dimensions. For the purpose of this description, the three dimensions are denoted along the Cartesian x, y and z axes, wherein the z-axis is in the direction from the camera unit 300 to the controller 100 (that is, the axis parallel to the floor). The x-axis and y-axis are both orthogonal to the z-axis and to each other. The computer 200 is configured to calculate the x-axis and y-axis co-ordinates of the controller via a first method, and calculate the z-axis co-ordinate via a second method.
The first method, that is, the method of calculating the x-axis and y-axis co-ordinates of the controller 100, will now be described with reference to Figures 2 to 3. The method is performed on the computer 200, using the image data from the camera unit 300. The camera unit 300 acquires a calibration image via the first camera 310, and acquires an infrared image via the second, infrared, camera 320. As mentioned above, the camera unit 300 faces the controller 100, which is held by the user. Therefore the calibration image and the infrared image include the controller 100 and the user.
An overview of the first method is illustrated in Figure 2. As a preliminary step, background subtraction of the infrared image, via temporal differencing, differentiates the controller 100 and the user from the constant background. This produces a processed infrared image, including only the controller and the user, suitable for the subsequent steps.
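By way of illustration, the temporal differencing step might be sketched as follows in Python with NumPy. The patent gives no code, so the function name and the fixed threshold value are assumptions:

```python
import numpy as np

def temporal_difference(prev_frame, curr_frame, threshold=25):
    """Background subtraction by temporal differencing: pixels that change
    between consecutive infrared frames (the controller and the user) are
    kept; the constant background is zeroed out. Frames are 2-D uint8 arrays."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    moving = diff > threshold                       # changed pixels only
    return np.where(moving, curr_frame, 0).astype(np.uint8)
```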
An active contour model is applied to the processed infrared image to produce an accurate vector contouring the edge of the controller 100. The controller 100 is readily distinguishable from the user in the processed infrared image due to the use of an IR-reflectant coating on the controller 100.
The active contour model works on the principle of energy minimization to ascertain the vector of the controller's 100 edge in the processed infrared image. The energy of each vector point is calculated based on its neighbouring pixels. A Difference of Gaussian (DoG) filtered image is computed for emphasizing the edges of the controller 100. This energy minimization is an iterative, continuous process, repeated until the vector of the edge of the controller 100 is accurately computed. The energy function computed and iterated for each vector point is described in the equation below, where i runs from 1 to n, n being the number of points on the vector, and E_vector is the calculated energy of the vector point:

$$E_{vector} = \alpha E_{int}(v_i) + \beta E_{ext}(v_i)$$

The computer 200 includes a configuration file, for modifying the number of iterations required to accurately compute the vector of the edge of the controller 100.
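The patent gives no code for this step; below is a minimal Python sketch of one greedy energy-minimization pass under the two-term energy above. The coefficient defaults, the neighbourhood window, and the internal/external term names are illustrative assumptions, not the patent's implementation:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dog_image(image, sigma1=1.0, sigma2=2.0):
    """Difference of Gaussian image, emphasising the controller's edges."""
    img = image.astype(np.float64)
    return gaussian_filter(img, sigma1) - gaussian_filter(img, sigma2)

def snake_iteration(points, dog, alpha=0.5, beta=0.5, win=1):
    """One pass over the n contour points v_i, moving each point to the
    neighbouring pixel that minimises alpha*E_int + beta*E_ext."""
    n = len(points)
    new_points = list(points)
    for i in range(n):
        py, px = new_points[(i - 1) % n]            # previous contour point
        best, best_energy = new_points[i], np.inf
        y0, x0 = new_points[i]
        for dy in range(-win, win + 1):             # search the neighbourhood
            for dx in range(-win, win + 1):
                y, x = y0 + dy, x0 + dx
                if not (0 <= y < dog.shape[0] and 0 <= x < dog.shape[1]):
                    continue
                e_int = (y - py) ** 2 + (x - px) ** 2   # continuity term
                e_ext = -dog[y, x]                      # strong edge = low energy
                energy = alpha * e_int + beta * e_ext
                if energy < best_energy:
                    best_energy, best = energy, (y, x)
        new_points[i] = best
    return new_points
```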
Once the vector of the edge of the controller 100 has been computed, the vector is then applied to the calibration image from the first camera, to extract the controller 100. The extracted controller 100 is then applied to the centre of a blank background, which forms a training image suitable for an OT-MACH filter.
In this embodiment, the training image is further processed to produce a plurality of rotated training images for the OT-MACH filter. For example, the training image is rotated by two-degree increments between -6 degrees and +6 degrees, thus obtaining 7 rotated training images. The rotated training images are multiplexed and input to the OT-MACH filter.
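For example, the 2-degree rotational multiplexing described above could be implemented with OpenCV as below; this is a sketch, with the defaults chosen to reproduce the -6 to +6 degree, 7-image set:

```python
import cv2

def rotated_training_set(training_image, step_deg=2, max_deg=6):
    """Rotate the training image in step_deg increments from -max_deg to
    +max_deg degrees; the defaults yield 7 rotated training images."""
    h, w = training_image.shape[:2]
    centre = (w / 2.0, h / 2.0)
    rotated = []
    for angle in range(-max_deg, max_deg + 1, step_deg):
        m = cv2.getRotationMatrix2D(centre, angle, 1.0)
        rotated.append(cv2.warpAffine(training_image, m, (w, h)))
    return rotated
```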
The operation of the OT-MACH filter will now be described in more detail, with reference to Figure 3. The OT-MACH (Optimal Trade-off Maximum Average Correlation Height) filter is computed on the computer 200 using the FFTW ("Fastest Fourier Transform in the West") library. The FFTW library is a C subroutine library for computing discrete Fourier transforms in one or more dimensions. The FFTW library is interfaced with Intel's (RTM) OpenCV library for computer vision, making the OT-MACH filter efficient with respect to processing time and frequency.
As shown in the left-hand column of Figure 3, the OT-MACH filter receives the set of rotated training images t_i, i = 1 to N, where N is the number of rotated training images. Each rotated training image is Fourier transformed, FT(t_i). The output of the FFTW is not a shifted FFT.
Shifting of the zero component of the FFT to the centre of the spectrum is performed using the following function, designed in C: cvFFTWShW(). The function has the effect of swapping the upper-left quadrant with the lower-right quadrant, and swapping the upper-right quadrant with the lower-left quadrant.
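This quadrant swap is the standard FFT-shift operation; a NumPy equivalent of the C function's effect is sketched below for even image dimensions (np.fft.fftshift does the same job):

```python
import numpy as np

def fft_quadrant_shift(spectrum):
    """Move the zero-frequency component to the centre by swapping the
    upper-left quadrant with the lower-right, and the upper-right with
    the lower-left (assumes even dimensions)."""
    rows, cols = spectrum.shape
    r2, c2 = rows // 2, cols // 2
    shifted = np.empty_like(spectrum)
    shifted[:r2, :c2] = spectrum[r2:, c2:]   # lower-right -> upper-left
    shifted[r2:, c2:] = spectrum[:r2, :c2]   # upper-left  -> lower-right
    shifted[:r2, c2:] = spectrum[r2:, :c2]   # lower-left  -> upper-right
    shifted[r2:, :c2] = spectrum[:r2, c2:]   # upper-right -> lower-left
    return shifted
```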
The OT-MACH filter is expressed in the equation below, where m is an average of the rotated training image vectors x_1 to x_N in the frequency domain, C is a diagonal power spectral density matrix of any chosen noise model, D is a diagonal average power spectral density of the rotated training images, and S_x denotes the similarity matrix of the rotated training image set. These parameters are derivable from the training image. Alpha, beta and gamma are non-negative optimal trade-off parameters, which allow the OT-MACH filter to be tailored for external conditions, such as light levels; α, β and γ can be modified in the configuration file.

$$h = \frac{m^{*}}{\alpha C + \beta D + \gamma S_{x}}$$

The computer 200 receives a stream of images from the first camera. As shown in the right-hand column of Figure 3, a set of sub-images S_k, k = 1 to N, where N is the number of sub-images, is derived from one image from the stream. Each sub-image is Fourier transformed, FT(S_k). The Fourier-transformed sub-images are correlated with the OT-MACH filter, in the frequency domain, via the function below.
$$\mathrm{conj}(FT(h)) \cdot FT(S_k)$$
Each sub-image is then classified as in-class or out-of-class by comparing the amplitude of the maximum peak in the correlation plane to a detection threshold. The detection threshold is given in the equation below:

$$\mathrm{Threshold} = L \cdot \mathrm{CentrePeak}\left(\mathrm{conj}(FT(h)) \cdot FT(t)\right)$$

A correlation plot is produced for each in-class sub-image. The position of the controller 100 in the x and y directions corresponds to the highest value in the correlation plot.
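To make the preceding steps concrete, here is a hedged NumPy sketch of filter construction, frequency-domain correlation and thresholding. The white-noise choice for C, the trade-off defaults, and the scalar L are assumptions; the patent leaves these to the configuration file, and the sub-image is assumed to match the training image size:

```python
import numpy as np

def build_otmach(training_images, alpha=0.1, beta=0.1, gamma=0.8):
    """FT(h) = m / (alpha*C + beta*D + gamma*S), built from the Fourier
    transforms of the (rotated) training images."""
    X = np.stack([np.fft.fft2(t.astype(np.float64)) for t in training_images])
    m = X.mean(axis=0)                           # mean training spectrum
    C = np.ones(m.shape)                         # white-noise PSD model (assumed)
    D = (np.abs(X) ** 2).mean(axis=0)            # average power spectral density
    S = (np.abs(X - m) ** 2).mean(axis=0)        # similarity term
    return m / (alpha * C + beta * D + gamma * S + 1e-12)

def classify_and_locate(sub_image, H, training_image, L=0.5):
    """Correlate conj(FT(h)) * FT(S_k) in the frequency domain, compare the
    correlation peak to the threshold derived from the training image, and
    return the (row, col) of the peak for an in-class sub-image, else None."""
    plane = np.real(np.fft.ifft2(np.conj(H) * np.fft.fft2(sub_image.astype(np.float64))))
    train = np.real(np.fft.ifft2(np.conj(H) * np.fft.fft2(training_image.astype(np.float64))))
    threshold = L * train.max()
    if plane.max() < threshold:
        return None                              # out-of-class sub-image
    return np.unravel_index(np.argmax(plane), plane.shape)
```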
The OT-MACH filter is applied to every m-th image from the first camera to generate a correlation plot and to determine the position of the controller 100. The parameter m may be modified in the configuration file. The OT-MACH filter may be updated, that is, a new set of rotated training images obtained and applied to the OT-MACH filter, either in real-time or at a frequency determined by a parameter in the configuration file.
The second method, that is, for calculating the z-axis co-ordinate of the controller 100, will now be described in more detail, with reference to Figure 4. The z-axis co-ordinate is the distance from the centroid of the first and second cameras to the controller 100.
A half angle of view of the first camera, θ_1, and of the second camera, θ_2, is calculated using the following expression, where D_{1,2} is the first or second camera's field stop and f_{1,2} is the first or second camera's focal length:

$$\theta_{1,2} = \tan^{-1}\left(\frac{D_{1,2}}{2 f_{1,2}}\right)$$

With reference to Figure 4, the z-axis co-ordinate can be determined from the following expression, where α_{1,2} can be measured using the half angle of view and the x-axis and y-axis position of the controller 100 calculated using the first method.
$$z = \frac{\tan(\theta_1 - \alpha_1)\,\tan(\theta_2 - \alpha_2)\,\Delta x}{\tan(\theta_1 - \alpha_1) + \tan(\theta_2 - \alpha_2)}$$

Here Δx denotes the separation between the first and second cameras. Alternatively, if the first and second camera are calibrated, the intrinsic and extrinsic camera parameters can be found using OpenCV functions.

The skilled reader will understand that the rotational multiplexing, that is, the rotation of the training image to produce a plurality of rotated training images, is a non-essential feature of the present invention. Rather, the OT-MACH filter may be constructed from the training image alone. The skilled reader will understand that constructing the OT-MACH filter from the plurality of rotated training images is preferable, as it provides a degree of tolerance to the OT-MACH filter between filter updates, such that the accuracy of position recognition is increased and the computer is less likely to lose tracking of the controller 100.
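Returning to the triangulation expression reconstructed above, a minimal numerical sketch follows; theta comes from the field stop and focal length, and dx, the camera separation, is an assumed parameter of the reconstruction:

```python
import math

def half_angle(field_stop, focal_length):
    """theta = arctan(D / (2f)), the camera's half angle of view."""
    return math.atan(field_stop / (2.0 * focal_length))

def z_coordinate(theta1, alpha1, theta2, alpha2, dx):
    """Triangulated z distance from the cameras to the controller, from the
    per-camera angles alpha_1,2 and the camera separation dx."""
    t1 = math.tan(theta1 - alpha1)
    t2 = math.tan(theta2 - alpha2)
    return (t1 * t2 * dx) / (t1 + t2)
```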
The skilled reader will also understand that updating the OT-MACH filter, that is, producing a new set of rotated training images or a new training image, is a non-essential feature.
Rather, the OT-MACH filter can be constructed from a first set of training images and not updated. Of course, the skilled reader will understand that updating the OT-MACH filter is highly preferable, as it provides for more accurate position recognition of the controller 100.
Furthermore, it is a non-essential feature for the OT-MACH filter to be updated once every 25 images from the stream of images from the first camera (that is, for a common camera, once every second where the camera captures 25 frames per second). The skilled reader will understand that the frequency of updating the OT-MACH filter may be changed, by modification of the configuration file.
* The Controller

The controller 100 will now be described in more detail, with reference to Figures 5 to 8. The controller 100 includes a housing formed of a first portion 110 and a second portion 120.
The first portion 110 and second portion 120 are rotatably connected at one end. The first portion 110 and second portion 120 are configured to rotate between a parallel position, as shown in Figure 5 where the relative angle is zero, and a perpendicular position, as shown in Figure 7 where the relative angle is 90 degrees.
The first portion 110 includes a closing button 127, disposed between the first portion 110 and second portion 120. The closing button 127 is configured to depress as the relative angle between the first portion 110 and second portion 120 approaches zero (that is, approaches the parallel position).
The controller 100 includes a number of buttons thereon, including smaller general purpose buttons 113a-c, a larger general purpose button 111 and a thumb operated joystick 114.
The buttons allow the user to input basic commands to the computer program, such as menu navigation. In this embodiment, the first portion 110 includes a plurality of LEDs (not shown) to display status and diagnostic information to the user.
The second portion 120 includes a plurality of grooves 123a-d, for receiving the user's fingers. The grooves 123a-d allow the user to comfortably hold the controller 100.
Furthermore, the second portion 120 includes a plurality of pressure sensors 125a-d, positioned within the grooves 123a-d. The pressure sensors 125a-d are configured to measure the pressure exerted thereon, by varying their resistance in proportion to the pressure. The pressure sensors 125a-d may be activated only when the closing button is depressed, and include a rubber casing to absorb shock.
The controller 100 also includes a bending sensor, for measuring a relative angle between the first portion 110 and second portion 120. In this embodiment, the bending sensor is a friction plate. The bending sensor outputs data that may be used by the computer program to simulate a pipe bending scenario.
In this embodiment, the bending sensor includes a braking motor, for resisting change in the relative angle between the first portion 110 and the second portion 120. This allows the simulator to replicate the resistance to bending, for example, when the user is bending a pipe. The controller 100 also includes vibration generator motors, which may be activated to provide a physical notification to the user.
Figure 8 is a block diagram illustrating the hardware inside the housing of the controller 100.
The controller 100 includes a microcontroller SOC 150 (including a plurality of modules described below), a battery 161, such as a Lithium-ion cell, a battery management module 162, and voltage regulators 163.
The battery management module 162 includes a battery charger, adapted to receive an AC input. The charger includes dynamic power path management (DPPM) that powers the controller 100 while simultaneously and independently charging the battery 161. The battery management module further includes protection and fuel gauge circuits.
The voltage regulators 163 distribute power to the modules on the microcontroller SOC 150, the sensors, and other active components detailed below.
The microcontroller SOC 150 includes a CPU 151, program memory 152 and execution memory 153, connected via a system bus. The microcontroller SOC 150 further includes GPIO 171, PWM 172, ADC 173, DAC 174, UART 175, Audio DAC Output 176, I2C 177, and USB 178 modules, connected via a peripheral bus.
The GPIO module 171 is a digital IO module, configured to receive data from the smaller and larger general purpose buttons 113a-c, 111, and the joystick 114. The GPIO module 171 is also configured to control the LEDs to provide status and diagnostic information to the user.
The controller 100 includes an accelerometer 180, gyroscope 181 and magnetometer 182, providing nine degrees of freedom tracking. The three sensors 180, 181, 182 are embodied on a circuit board. The circuit board is designed to filter noise from the sensor 180, 181, 182 readings to provide Euler angles or quaternions to output as data relating to the orientation of the controller 100. The three sensors 180, 181, 182 are connected to the microcontroller SOC 150 via the I2C module 177, which configures, initializes and calibrates the sensors 180, 181, 182.
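By way of illustration, converting the board's quaternion output to Euler angles uses standard math; the sketch below assumes a unit quaternion, as the board's internal filtering is not specified in the patent:

```python
import math

def quaternion_to_euler(w, x, y, z):
    """Convert a unit quaternion (w, x, y, z) to roll/pitch/yaw in radians,
    the two orientation representations the sensor board can output."""
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    pitch = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))  # clamp for safety
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw
```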
The pressure sensors 125a-d are connected to the microcontroller SOC 150 via a programmable gain amplifier 190 and the ADC module 173. The ADC module 173 and programmable gain amplifier 190 also connect a Hall effect sensor 191 and an electric field imaging sensor 192 to the microcontroller SOC 150. The electric field imaging sensor 192 is used for non-contact sensing of objects, by generating a low-frequency sine-wave field. The electric field imaging sensor 192 detects proximal objects by changes in the sine-wave field. Similarly, the Hall effect sensor 191 measures the proximal magnetic field.
The ADC module 173 is configured to receive the data from the programmable gain amplifier 190, convert it to a digital signal and pass it on to the CPU 151 for computation.
The microcontroller SOC 150 further includes motor driving circuitry 193, for driving motors such as the vibration generating motor, or the dynamic braking motor. The motor driving circuitry 193 is modulated by the PWM module 172, which may be configured to operate without CPU 151 intervention.
The microcontroller SOC 150 also includes a USB module 178, for connection with an external USB device 194, and a UART module 175, for interfacing with a wireless communications module 195, e.g. a Bluetooth (RTM) dongle, for communication with the computer 200. The wireless communications module 195 is a transceiver for sending the data collected from the sensors and input devices to the computer 200, and for receiving feedback data, for example, to drive the dynamic braking motor.
The microcontroller SOC 150 also includes an Audio DAC output module 176, for controlling a speaker 196 on the controller 100.
The skilled reader will understand that the pressure sensor is a non-essential feature. The pressure sensor is preferable, as it allows a further user input to the simulator 1, such that the user may engage in certain training scenarios, such as using a blow-torch.
The skilled reader will also understand that it is non-essential for the controller 100 to rotate between the parallel and perpendicular position. Rather, the controller 100 may rotate between any two relative angles, smaller or greater than 90 degrees.
In the above embodiment, the controller 100 uses a friction plate to measure the relative angle between the first and second portion. The skilled reader will understand that the friction plate is just one way of measuring the relative angle, and further examples may be used. Furthermore, the dynamic braking motor is just one example of a means to resist change in the relative angle between the first and second portion. For example, friction can be achieved by positioning friction plates and applying pressure between them.

The skilled reader will also understand that the simulator 1 is not limited to the plumbing scenarios detailed above. Rather, the simulator 1 may be used for various forms of virtual reality situations, such as other training, recreational or industrial situations. In particular, the 3-dimensional position recognition method outlined above will have uses in other situations, for example the manufacturing industry.
In the above embodiment, the computer 200 includes a computer program. The skilled reader will understand that the computer program may be embodied on a computer readable medium, such as a compact disc or USB flash drive, or may be downloadable through the internet. The computer program may also be stored on a server in a remote location, and the user's personal computer may send and receive data to the server through a network connection.
The skilled reader will also understand that the head mounted display is a non-essential feature. Rather, the computer 200 may output graphics to a computer monitor, projector, TV, HDTV, 3DTV, or the like. The head mounted display is a preferable feature, as it provides an immersive experience for the user, and may also provide data relating to the user's head orientation, which can then in turn be used by the simulator 1.
The skilled person will understand that any combination of features is possible without departing from the spirit or scope of the present invention, as claimed.

Claims (24)

CLAIMS

1. Apparatus, for measuring a position of an object, comprising a first camera, for producing an infrared image of the object; a second camera, for producing a calibration image of the object; a first computational module, for producing a training image, configured to extract a vector, corresponding to an edge of the object, from the infrared image using an active contour model, and to apply the vector to the calibration image to produce the training image; an OT-MACH filter, constructed from the training image; and a second computational module, configured to correlate the image from the second camera to the OT-MACH filter to determine the position of the object in x-y coordinates by comparing the amplitude of the maximum peak in the OT-MACH correlation plane to a detection threshold.
2. Apparatus as claimed in Claim 1, wherein a plurality of rotated training images are produced, the rotated training images constructed by rotating the training image, and the OT-MACH filter being constructed from the rotated training images.
3. Apparatus as claimed in Claim 1, wherein the training image is produced periodically to update the OT-MACH filter.
4. Apparatus as claimed in Claim 2, wherein the rotated training images are produced periodically to update the OT-MACH filter.
5. Apparatus as claimed in either Claim 3 or Claim 4, wherein the OT-MACH filter is updated between every 0.5 and 1.5 seconds.
6. Apparatus as claimed in any preceding claim, wherein the object is a controller, the controller having a first portion and second portion, the portions being rotatably connected, a magnetometer, a gyroscope, an accelerometer, and a bending sensor, the bending sensor configured to measure a relative angle between the first and second portion.
7. Apparatus as claimed in Claim 6, wherein the bending sensor is a friction plate.
8. Apparatus as claimed in any one of Claims 6 to 7, wherein the controller further comprises a motor for resisting change in the relative angle between the portions.
9. Apparatus as claimed in any one of Claims 6 to 8, wherein the controller further comprises a pressure sensor.
10. Apparatus as claimed in any one of Claims 6 to 9, wherein the first or second portion includes a groove, the pressure sensor being positioned in the groove.
11. A method, for measuring a position of an object, the method comprising the steps of: acquiring an infrared image including the object from an infrared camera; acquiring an image, including the object, and a calibration image, including the object, from a camera; producing a vector, corresponding to an edge of the object, from the infrared image, using an active contour model; producing a training image, by extracting the object from the calibration image using the vector; constructing an OT-MACH filter using the training image; and measuring the position of the object in the image, by correlating the image with the OT-MACH filter.
12. The method as claimed in Claim 11, further comprising the step of producing a plurality of rotated training images, the rotated training images constructed by rotating the training image, and constructing the OT-MACH filter from the rotated training images.
13. The method as claimed in Claim 11, wherein the training image is produced periodically to update the OT-MACH filter.
14. The method as claimed in Claim 12, wherein the rotated training images are produced periodically to update the OT-MACH filter.
15. The method as claimed in any one of Claims 11 to 14, wherein the OT-MACH filter is updated between every 0.5 and 1.5 seconds.
16. A computer program, embodied on a computer readable medium, configured to execute the method according to any one of Claims 11 to 15.
17. A controller comprising a first portion and second portion, the portions being rotatably connected, a magnetometer, a gyroscope, an accelerometer, and a bending sensor, wherein the bending sensor is configured to measure a relative angle between the first and second portion.
18. A controller as claimed in Claim 17, wherein the bending sensor is a friction plate.
19. A controller as claimed in any one of Claims 17 to 18, further comprising a motor for resisting change in the relative angle between the portions.
20. A controller as claimed in any one of Claims 17 to 19, further comprising a pressure sensor.
21. A controller as claimed in Claim 20, wherein the first or second portion includes a groove, the pressure sensor being positioned in the groove.
22. Apparatus substantially as herein described with reference to and as shown in any one of the accompanying drawings.
23. A method substantially as herein described with reference to and as shown in any one of the accompanying drawings.
24. A controller substantially as herein described with reference to and as shown in any one of the accompanying drawings.

Amendments to the claims have been filed as follows

CLAIMS

1. Apparatus, for determining co-ordinates of an object in two dimensions, comprising a first camera, for producing an infrared image of the object; a second camera, for producing a calibration image of the object and a stream of images of the object; a first computational module, for producing a training image, configured to extract a vector, corresponding to an edge of the object, from the infrared image using an active contour model, and to apply the vector to the calibration image to produce the training image; an OT-MACH filter, constructed from the training image; and a second computational module, configured to correlate at least one image of the stream of images from the second camera to the OT-MACH filter for determining the x-y co-ordinates of the object by comparing the amplitude of the maximum peak in the OT-MACH correlation plane to a detection threshold.
2. Apparatus as claimed in Claim 1, wherein a plurality of rotated training images are produced, the rotated training images constructed by rotating the training image, and the OT-MACH filter being constructed from the rotated training images.
3. Apparatus as claimed in Claim 1, wherein the training image is produced periodically to update the OT-MACH filter.
4. Apparatus as claimed in Claim 2, wherein the rotated training images are produced periodically to update the OT-MACH filter.
5. Apparatus as claimed in either Claim 3 or Claim 4, wherein the OT-MACH filter is updated between every 0.5 and 1.5 seconds.
6. A method, for determining the co-ordinates of an object in two dimensions, the method comprising the steps of: acquiring an infrared image including the object from an infrared camera; acquiring a stream of images, including the object, and a calibration image, including the object, from a camera; producing a vector, corresponding to an edge of the object, from the infrared image, using an active contour model; producing a training image, by extracting the object from the calibration image using the vector; constructing an OT-MACH filter using the training image; and determining the x-y co-ordinates of the object in a correlation plot, by correlating at least one image from the stream of images to the OT-MACH filter.
7. The method as claimed in Claim 6, further comprising the step of producing a plurality of rotated training images, the rotated training images constructed by rotating the training image, and constructing the OT-MACH filter from the rotated training images.
8. The method as claimed in Claim 6, wherein the training image is produced periodically to update the OT-MACH filter.
9. The method as claimed in Claim 7, wherein the rotated training images are produced periodically to update the OT-MACH filter.
10. The method as claimed in any one of Claims 6 to 9, wherein the OT-MACH filter is updated between every 0.5 and 1.5 seconds.
11. A computer program, embodied on a computer readable medium, configured to execute the method according to any one of Claims 6 to 10.
12. Apparatus substantially as herein described with reference to and as shown in any one of the accompanying drawings.
13. A method substantially as herein described with reference to and as shown in any one of the accompanying drawings.
GB1018974.4A 2010-11-10 2010-11-10 A simulator including a method and apparatus for determining the co-ordinates of an object in two dimensions Active GB2485359B (en)

Priority Applications (11)

Application Number Priority Date Filing Date Title
GB1018974.4A GB2485359B (en) 2010-11-10 2010-11-10 A simulator including a method and apparatus for determining the co-ordinates of an object in two dimensions
RU2012105332/12A RU2600906C2 (en) 2010-11-10 2011-11-10 Modelling installation including control device
GB1119412.3A GB2486527B (en) 2010-11-10 2011-11-10 A simulator including a controller
GB1119413.1A GB2485471B (en) 2010-11-10 2011-11-10 A simulator including a method and apparatus for determining the co-ordinates of an object in two dimensions
CN201180003614.XA CN102652328B (en) 2010-11-10 2011-11-10 Simulator including controller
IN884DEN2012 IN2012DN00884A (en) 2010-11-10 2011-11-10
IN883DEN2012 IN2012DN00883A (en) 2010-11-10 2011-11-10
PCT/GB2011/052187 WO2012063068A1 (en) 2010-11-10 2011-11-10 A simulator including a method and apparatus for determining the co-ordinates of an object in two dimensions
CN201180003615.4A CN102667858B (en) 2010-11-10 2011-11-10 Including for determining the simulator of the method and apparatus of the two-dimensional coordinate of object
PCT/GB2011/052188 WO2012063069A1 (en) 2010-11-10 2011-11-10 A simulator including a controller
RU2012105335A RU2608350C2 (en) 2010-11-10 2011-11-10 Simulating installation involving method and device for determining coordinates of object in two dimensions

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1018974.4A GB2485359B (en) 2010-11-10 2010-11-10 A simulator including a method and apparatus for determining the co-ordinates of an object in two dimensions

Publications (3)

Publication Number Publication Date
GB201018974D0 GB201018974D0 (en) 2010-12-22
GB2485359A (en) 2012-05-16
GB2485359B GB2485359B (en) 2012-10-31

Family

ID=43414641

Family Applications (2)

Application Number Title Priority Date Filing Date
GB1018974.4A Active GB2485359B (en) 2010-11-10 2010-11-10 A simulator including a method and apparatus for determining the co-ordinates of an object in two dimensions
GB1119413.1A Active GB2485471B (en) 2010-11-10 2011-11-10 A simulator including a method and apparatus for determining the co-ordinates of an object in two dimensions

Family Applications After (1)

Application Number Title Priority Date Filing Date
GB1119413.1A Active GB2485471B (en) 2010-11-10 2011-11-10 A simulator including a method and apparatus for determining the co-ordinates of an object in two dimensions

Country Status (5)

Country Link
CN (1) CN102667858B (en)
GB (2) GB2485359B (en)
IN (1) IN2012DN00884A (en)
RU (1) RU2608350C2 (en)
WO (1) WO2012063068A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0465784A2 (en) * 1990-08-10 1992-01-15 Kaman Aerospace Corporation Process for automatically detecting and locating a target from a plurality of two dimensional images
US5870486A (en) * 1991-12-11 1999-02-09 Texas Instruments Incorporated Method of inferring sensor attitude through multi-feature tracking
EP1061748A2 (en) * 1999-06-10 2000-12-20 University of Washington Video object segmentation using active contour modelling with global relaxation
US20020118861A1 (en) * 2001-02-15 2002-08-29 Norman Jouppi Head tracking and color video acquisition via near infrared luminance keying
US20050018925A1 (en) * 2003-05-29 2005-01-27 Vijayakumar Bhagavatula Reduced complexity correlation filters
WO2005082249A2 (en) * 2004-02-26 2005-09-09 K.U. Leuven Research & Development Time-dependent three-dimensional musculo-skeletal modeling based on dynamic surface measurements
US20060115160A1 (en) * 2004-11-26 2006-06-01 Samsung Electronics Co., Ltd. Method and apparatus for detecting corner
US20080259187A1 (en) * 2007-04-18 2008-10-23 Fujifilm Corporation System for and method of image processing and computer program for causing computer to execute the method

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6529614B1 (en) * 1998-08-05 2003-03-04 California Institute Of Technology Advanced miniature processing handware for ATR applications
US6757422B1 (en) * 1998-11-12 2004-06-29 Canon Kabushiki Kaisha Viewpoint position detection apparatus and method, and stereoscopic image display system
US7399969B2 (en) * 2003-01-21 2008-07-15 Suren Systems, Ltd. PIR motion sensor
US7693331B2 (en) * 2006-08-30 2010-04-06 Mitsubishi Electric Research Laboratories, Inc. Object segmentation using visible and infrared images
CN101383004A (en) * 2007-09-06 2009-03-11 上海遥薇实业有限公司 Passenger target detecting method combining infrared and visible light images

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0465784A2 (en) * 1990-08-10 1992-01-15 Kaman Aerospace Corporation Process for automatically detecting and locating a target from a plurality of two dimensional images
US5870486A (en) * 1991-12-11 1999-02-09 Texas Instruments Incorporated Method of inferring sensor attitude through multi-feature tracking
EP1061748A2 (en) * 1999-06-10 2000-12-20 University of Washington Video object segmentation using active contour modelling with global relaxation
US20020118861A1 (en) * 2001-02-15 2002-08-29 Norman Jouppi Head tracking and color video acquisition via near infrared luminance keying
US20050018925A1 (en) * 2003-05-29 2005-01-27 Vijayakumar Bhagavatula Reduced complexity correlation filters
WO2005082249A2 (en) * 2004-02-26 2005-09-09 K.U. Leuven Research & Development Time-dependent three-dimensional musculo-skeletal modeling based on dynamic surface measurements
US20060115160A1 (en) * 2004-11-26 2006-06-01 Samsung Electronics Co., Ltd. Method and apparatus for detecting corner
US20080259187A1 (en) * 2007-04-18 2008-10-23 Fujifilm Corporation System for and method of image processing and computer program for causing computer to execute the method

Also Published As

Publication number Publication date
RU2012105335A (en) 2014-12-20
CN102667858B (en) 2017-03-15
GB201119413D0 (en) 2011-12-21
GB201018974D0 (en) 2010-12-22
IN2012DN00884A (en) 2015-07-10
WO2012063068A1 (en) 2012-05-18
GB2485471B (en) 2016-10-12
CN102667858A (en) 2012-09-12
GB2485471A (en) 2012-05-16
RU2608350C2 (en) 2017-01-18
GB2485359B (en) 2012-10-31

Similar Documents

Publication Publication Date Title
DiFilippo et al. Characterization of different Microsoft Kinect sensor models
US9785249B1 (en) Systems and methods for tracking motion and gesture of heads and eyes
CN109446892B (en) Human eye attention positioning method and system based on deep neural network
CN112925223B (en) Unmanned aerial vehicle three-dimensional tracking virtual test simulation system based on visual sensing network
US8441438B2 (en) 3D pointing device and method for compensating movement thereof
Jia et al. 3D image reconstruction and human body tracking using stereo vision and Kinect technology
US11232590B2 (en) Information processing apparatus, information processing method, and program
Asadzadeh et al. Low-cost interactive device for virtual reality
KR20130067856A (en) Apparatus and method for performing virtual musical instrument on the basis of finger-motion
Deldjoo et al. A low-cost infrared-optical head tracking solution for virtual 3d audio environment using the nintendo wii-remote
Fornasier et al. Vinseval: Evaluation framework for unified testing of consistency and robustness of visual-inertial navigation system algorithms
GB2485359A (en) A Simulator Including A Controller And A Means For 3-Dimensional Position Recognition Using Correlation
GB2485428A (en) A Controller for a Plumbing Simulator
CN115294213A (en) Calibration tower, camera calibration method and device, electronic equipment and storage medium
KR20200065945A (en) Method for stop signal generation that combines LiDAR data with object recognition data from camera
Xu et al. A flexible 3D point reconstruction with homologous laser point array and monocular vision
CN115359422A (en) High-altitude parabolic image generation method, device and system
Zhang et al. Universal range data acquisition for educational laboratories using Microsoft Kinect
Yang et al. Perceptual issues of a passive haptics feedback based MR system
CN106931879B (en) Binocular error measurement method, device and system
GB2486527A (en) A Controller for a Plumbing Simulator
KR20140089220A (en) Apparatus and method for control military strategy
Liu et al. A virtual simulation and driver evaluation platform for smart wheelchairs
TWI534659B (en) 3d pointing device and method for compensating movement thereof
CN115802021A (en) Volume video generation method and device, electronic equipment and storage medium