WO1996017258A2 - Dispositif optique de detection de position - Google Patents

Dispositif optique de detection de position

Info

Publication number
WO1996017258A2
WO1996017258A2 PCT/GB1995/002799 GB9502799W WO9617258A2 WO 1996017258 A2 WO1996017258 A2 WO 1996017258A2 GB 9502799 W GB9502799 W GB 9502799W WO 9617258 A2 WO9617258 A2 WO 9617258A2
Authority
WO
WIPO (PCT)
Prior art keywords
light
reflector
accordance
beams
retro
Prior art date
Application number
PCT/GB1995/002799
Other languages
English (en)
Other versions
WO1996017258A3 (fr)
Inventor
Milan Momcilo Popovich
Original Assignee
Novus Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from GB9424277A external-priority patent/GB9424277D0/en
Application filed by Novus Limited filed Critical Novus Limited
Priority to AU39878/95A priority Critical patent/AU3987895A/en
Publication of WO1996017258A2 publication Critical patent/WO1996017258A2/fr
Publication of WO1996017258A3 publication Critical patent/WO1996017258A3/fr

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87 - Combinations of systems using electromagnetic waves other than radio waves
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/26 - Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 - Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16 - Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 - Constructional features, e.g. arrangements of optical elements
    • G01S7/4811 - Constructional features, e.g. arrangements of optical elements common to transmitter and receiver

Definitions

  • This invention relates to optical position sensing systems generally and more particularly, but not exclusively, to an optical position sensing system operating in the infra-red spectrum to determine the position, or the position and attitude, of a target object in three dimensional space.
  • a new display concept in the field of interactive television video image software applications uses 'knowledge' of the head position of a viewer disposed in front of a video display screen to provide an enhanced field of view and a pseudo three-dimensional effect, i.e. by two-dimensional screen image perspective, similar to viewing a hologram.
  • This requires some means of automatically determining the position and thereby tracking movement of the head in three dimensional Cartesian co-ordinates at a frequency compatible with the picture update rate of the video display system.
  • the head position could be determined in other spatial coordinate systems, such as spherical polar coordinates, for example.
  • Hitherto position sensing means have relied on the provision of separate transmitting and receiving means with one such means being fixed at a base station and the other being attached to a moving target object.
  • an optical triangulation tracking system comprising means for creating at least two beams of light, each being divergent, mounted relative to one another in a predetermined spatial and angular relationship such that the beams intersect one another thereby illuminating a volume of space, at least one reflector relatively moveable within said volume of space, a light detection device adapted to receive light from each of the two beams which has been reflected back along its path of transmission by the reflector and an image processing system connected to each of said detection devices, the arrangement being such that relative movement between said light beam creation means and the reflector will be monitored by the image processing system which determines from received images by triangulation spatial coordinates of the reflector thereby to determine the relative position within said volume of space of the reflector.
  • the reflector is a retroreflector.
  • the image processing system may be adapted additionally to monitor angular movements of the reflector.
  • Two light detection means may be provided with each being masked by a corresponding slitted aperture plate with the slits of the aperture plates being relatively mutually perpendicular in orientation.
  • the detection device may comprise a plurality of light detection means each associated with a respective one of said divergent beams and each adapted to receive light reflected back along its path of transmission by the retro-reflector.
  • the system may further comprise a beam splitting means associated with each light detection means through which passes both a corresponding one of the divergent beams and light reflected by the reflector therefrom, the arrangement providing that the path of reflected light exiting the beam splitting means and the paths of light of the divergent beams entering said beam splitting means are angularly displaced.
  • the divergent beams may pass directly through the corresponding beam splitting means and the path of light reflected by the reflector may be angularly displaced by these beam splitting means.
  • the light detection means may each comprise a two dimensional detector array, a CCD video camera, a Position Sensing Detector and/or any type of detector which can detect the position of a light spot in two dimensions.
  • the divergent beams of light may each be paired coincident light beams with a first beam of each said pair having a first wavelength and a uniform angular intensity profile and the second beam of that pair having a second wavelength and a non-uniform symmetrical intensity profile, and the light detection means or device may be adapted to measure the intensity of light incident thereon and the image processing system may be adapted to determine the trajectory of incident light from the intensities received from said paired uniform and non-uniform light beams.
  • Filter means such as diffusing screens, holographic light shaping diffusers, apodising filters and diffractive optics may be utilised to provide both said uniform and said non-uniform intensity profiles.
  • the light detection means may be provided by a single detection device optically multiplexed so as to function as a plurality of individual detection means.
  • the light reflected by the reflector may be channelled to the detector by means of a light guide consisting of reflective surfaces.
  • the system may further comprise a reflecting device having a plurality of spaced apart retro-reflectors the relative positions of which are known by the image processing system, the arrangement enabling both position and orientation of the reflecting device within the illuminated space to be determined.
  • the image processing system may be adapted to identify predetermined types of motion of the or each retro-reflector. The amplitude of the light of each divergent beam may be modulated.
  • Polarising devices may be used to reject background illumination and the beam splitting means may be a half-silvered mirror.
  • the reflector may be a retroreflector in the form of a high gain screen with a narrow gain profile around the angle of incidence and/or may be in the form of one or more retroreflective prisms.
  • the divergent light beams may be laser light of one or more narrow spectral band widths of wavelength and/or infrared radiation from a light-emitting diode.
  • the means for creating light may be adapted to emit sequentially a plurality of light beams each having a different wavelength, a corresponding plurality of retro-reflectors may be provided each being adapted by filter or other means to retro-reflect only light of an associated one of said wavelengths, and the light detection means may be adapted to receive light from the plurality of retro-reflectors and by being multiplexed with the sequencing of the light emission may be able to discriminate between light reflected from each of the retro-reflectors.
  • the second support means may comprise a body housing a retro-reflector and a closure or shutter member associated with said body, with the closure or shutter member being operative to mask or un-mask the retro-reflector.
  • Figure 1 illustrates diagrammatically the operation of a first embodiment of a head or target tracking system in accordance with the invention
  • Figure 2 illustrates diagrammatically the operation of a second embodiment of a head or target tracking system generally similar to that shown in Figure 1 wherein like components or items are denoted by like numerals;
  • Figure 3 illustrates diagrammatically the operation of a third embodiment generally similar in layout to that shown in Figure 1 wherein similar components or items are denoted by similar numerals;
  • Figure 3a illustrates the polar intensity profile of one light source used in the embodiment shown in Figure 3;
  • Figure 3b illustrates the polar intensity profile of another light source used in the embodiment shown in Figure 3;
  • Figure 4 illustrates a fourth embodiment of the invention comprising optical heads similar in function to those of said first embodiment shown in Figure 1 and like parts or items are denoted by like numerals;
  • Figure 5 illustrates a reflector pattern for use with a fifth embodiment of the invention
  • Figure 6 illustrates diagrammatically a sixth embodiment of the invention wherein like parts or items to those of the embodiments of Figures 1 to 3 are denoted by like numerals;
  • Figure 7 illustrates a seventh embodiment of the invention wherein like parts or items to those of the embodiments of Figures 1 to 3 are denoted by like numerals;
  • Figure 8 illustrates an eighth embodiment of the invention similar to said aforedescribed third embodiment shown in Figure 3 wherein like parts are denoted by like numerals;
  • Figure 9 illustrates diagrammatically generally in sectional view a number of retroreflector devices for use with the embodiments of the invention, and
  • Figure 10 illustrates diagrammatically a sensor layout for use with at least one of the described embodiments of the invention.
  • a first embodiment of the present invention, in the form of a head tracking system shown in Figure 1, is based on the use of optical triangulation to determine the spatial position of a viewer. It comprises two identical optical receiver/transmitter heads 1,2 which would typically be mounted at either side of a video display screen (not shown), a small retroreflector or retroreflective target 3 worn by the viewer and an electronic image processor means 4.
  • the optical heads 1,2 are fixed in space at positions relative to a datum point and one another known to the processor means 4.
  • Each optical head 1,2 comprises along a common optical axis thereof a light source 5, having a substantial portion of its output in the near infra-red region of the optical spectrum, together with suitable projection optics which in this embodiment include a concave mirror 6, a condenser lens or lens array 7 and a projection lens or lens array 8 to provide a diverging beam of field angle alpha with a smoothly varying intensity profile.
  • the light sources in this embodiment are rendered invisible to the viewer by virtue of the near infra-red wavelengths used. However, it will be appreciated that other wavelengths of light may be used in conjunction with or in place of infra-red wavelengths.
  • the spectral output and intensity of light output are controlled by means of optical filters 9,10.
  • the beam passes through a beam divider or beamsplitter 11 which divides the beam intensity in a fixed ratio.
  • the type of beam divider shown in Figure 1 is a glass plate 11 coated with a multi-layer partially transmissive coating, inclined at 45 degrees to the optical axis. Alternatively, a beam-splitting cube could be used.
  • Light direct from light source 5 not passing through the beam divider 11 is intercepted by a light trap 12. The beam then passes through a window 13.
  • Optical receiver/transmitter head 1 radiates diverging beam 14 through its window 13 and similarly optical head 2 radiates diverging beam 15. The relative orientation of optical heads 1,2 is such that their respective optical axes intersect, so that beams 14 and 15 overlap to create an illuminated volume of space 16 in front of the video display screen in which the viewer may move at will.
  • the retroreflective target 3 worn by the viewer is typically mounted at the centre of a pair of spectacles or attached directly to the viewer's head in the region of the eye mid-point.
  • the retroreflective target 3 may be either a high gain screen, a single retroreflective prism or an array of small prisms.
  • retroreflective refers to any reflective device and/or material from which a substantial portion of incident electromagnetic radiation is reflected back along its incident path, i.e. through 180 degrees.
  • a single element prism, which returns light rays back along the same path, would be used in situations where the superior optical performance provided by a single element prism is required. In some cases it may be desirable to use such a prism for aesthetic reasons.
  • the retroreflector 3 is shown in Figure 1 to be at the centre of the volume 16 corresponding to the point of intersection of the optical axes of the two beams 14,15 (i.e. of the two optical heads 1,2).
  • Light rays 17,18 intercepted by and incident on retroreflector 3 are returned along the same path (in the case of a high gain screen the effect is similar but the mechanism is that rays not scattered in the incident ray direction experience negligible gain).
  • Light ray 17 returning to the beam divider 11 of optical head 1 is split and a portion 19 of the returning light is reflected by beam divider 11 toward an imaging sensor 20 equipped with a lens or lens array 21 whose field angle matches that of the projected beam 14.
  • An infra-red filter 22 is used to select only light of the same wavelength as the light source 5.
  • a portion 23 of the ray 18 is received by imaging sensor 20 of optical head 2.
  • Each light ray portion 19,23 results in a corresponding sensor image within the respective imaging sensor 20.
  • the following description relates to one such sensor image by describing components of one sensor 20 and is equally applicable to the other sensor image and sensor.
  • the sensor image is formed on a two-dimensional array of photodetectors whose responses are thresholded such that a binary image consisting of a single spot moving against a black background is formed.
  • a raster scanning sensor typically comprising a CCD video camera could be used.
  • the spot intensity is characterised in terms of a number of grey levels - typically up to 256.
  • alternatively, Position Sensing Detectors (PSDs) may be used.
  • the position of the target retroreflector 3 in three dimensional space is obtained.
  • the determination of the spot position in this embodiment is carried out by electronic circuitry 24,25 associated respectively with optical heads 1,2, each of which automatically locates the peak intensity in the matrix of the associated sensor 20 by searching for the maximum intensity along each line of data in a frame as it is read out of that sensor.
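  • Purely as an illustration of this line-by-line peak search, a short Python sketch is given below; the patent specifies no code, so the threshold value, the function name and the use of NumPy are assumptions for illustration only.

```python
import numpy as np

def locate_spot(frame, threshold=200):
    """Return (x, y) of the peak-intensity pixel in one sensor frame.

    Mimics the search described above: background pixels below the
    threshold are zeroed (leaving a single bright spot on a black
    background), then each line is scanned for its maximum as it is
    'read out' and the overall peak is kept.
    """
    spot = np.where(frame >= threshold, frame, 0)
    if not spot.any():
        return None                          # no target visible in this frame
    best_row, best_col, best_val = 0, 0, -1
    for row_index, row in enumerate(spot):   # scan each line of the frame
        col = int(np.argmax(row))
        if row[col] > best_val:
            best_row, best_col, best_val = row_index, col, int(row[col])
    return best_col, best_row                # (x, y) in pixel coordinates
```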
  • processing of data from both sensors 20 is carried out simultaneously at frame refresh rate.
  • alternatively, the sensor outputs may be multiplexed, in which case co-ordinates computed by electronic circuits 24,25 would be delivered every other frame, for example.
  • the computed x,y co-ordinates from each sensor 20 are transmitted by respective circuits 24,25 to a processor 26 where they are used to determine the co-ordinates x,y,z of the target retroreflector 3 in three dimensional space within volume 16.
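  • The triangulation arithmetic itself is not set out in the patent. The sketch below shows one conventional way the x,y spot coordinates from the two sensors 20 could be converted into a ray from each optical head and the two rays intersected to give the x,y,z position of retroreflector 3 within volume 16; the pinhole-camera model, the focal-length parameter and all names are illustrative assumptions.

```python
import numpy as np

def pixel_to_ray(spot_xy, focal_px, head_position, head_rotation):
    """Turn a spot position on one sensor into a ray in world coordinates.

    spot_xy       -- (x, y) offsets of the spot from the sensor centre, in pixels
    focal_px      -- focal length of lens 21 expressed in pixels (assumed calibrated)
    head_position -- 3-vector giving the fixed location of the optical head
    head_rotation -- 3x3 matrix orienting the head's optical axis in space
    """
    direction = np.array([spot_xy[0], spot_xy[1], focal_px], dtype=float)
    direction /= np.linalg.norm(direction)
    return head_position, head_rotation @ direction

def triangulate(origin_a, dir_a, origin_b, dir_b):
    """Return the midpoint of the shortest segment joining two skew rays."""
    w0 = origin_a - origin_b
    a, b, c = dir_a @ dir_a, dir_a @ dir_b, dir_b @ dir_b
    d, e = dir_a @ w0, dir_b @ w0
    denom = a * c - b * b        # approaches zero if the rays are parallel
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return 0.5 * ((origin_a + s * dir_a) + (origin_b + t * dir_b))
```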
  • although the described electronic image processor means 4 comprises three distinct circuit elements 24,25,26, these could be substituted by unitary or other means, either hardwired or software controlled. Additional sensors may be used to improve the accuracy of the computation. Instead of sensors with two-dimensional detector arrays it is also possible to use linear arrays, which will provide greater accuracy. However, at least three such sensors would be required, i.e. three optical heads would be required.
  • optical heads 1,2 are ceiling mounted or otherwise mounted overhead with the retroreflector 3 being attached to the top of a helmet worn by the viewer, for example.
  • the aforedescribed head tracking system could be used to determine head velocity and acceleration and to recognise different types of head movements that could be used in interactive games, for example nodding and shaking of the head.
  • optical components of the aforedescribed head tracking system are conventional in nature.
  • the invention could be implemented with a variety of optical technologies including Fresnel optics, holographic optical elements and micro-optics.
  • the most expensive components of the aforedescribed embodiment are the sensors 20. Therefore, it is desirable to produce a head tracking system requiring only a single sensor. This could be done by intercepting light rays 19,23 (seen in Figure 1) by corresponding light guides based on mirrors, relay optics or fibre optics, etc all leading to a single sensor (20). Operation of light sources 5 and the single sensor could be multiplexed to determine sequentially the vectors for retroreflector 3.
  • two optical transmitter heads 100,101 each comprise a light source 5, lens or lens array 8 and beamsplitter 11 resulting in two diverging beams of light 14,15 converging on illuminated volume of space 16 in which retroreflector 3 can be moved.
  • the transmitter heads 100,101 share a single optical receiver head 103 which comprises a single detector or imaging sensor 20 and two planar mirrors 104,105.
  • Mirror 104 is disposed so as to reflect light incident from beamsplitter 11 of optical transmitting head 100 into sensor 20 and similarly mirror 105 reflects light received from transmitter head 101 into the sensor.
  • mirrors 104 and 105 are mutually perpendicular as shown in Figure 2. Alternatively, they may be relatively disposed at a more convenient or optimum angle as required.
  • an optical combiner in the form of a prism for example, could be used in place of mirrors 104,105.
  • Electronic circuit and computation means 106 are associated with the detector or sensor 20. Self-evidently for the operation of this embodiment of the invention it is necessary for means 106 to be able to differentiate output from sensor 20 resulting from optical beam 14 from that resulting from beam 15.
  • light beam 14 and the associated optical transmitting head 100 provides a first optical channel A and beam 15 and optical head 101 provide a second optical channel B. Differentiation of sensor output may be effected by synchronising light source pulsing frequencies of optical channels A and B with that of read-out electronics 107.
  • the read-out electronics 107 permits the processing of output signals from sensor 20 including amplification, filtering, analogue to digital conversion, noise removal, multiplexing and any other required signal conditioning enabling the output of voltages or other digital signal means that can be used by other components of the computation means 106.
  • a spot co-ordinate determination tracking algorithm in component or segment 108 of computation means 106 receives multiplexed input from read-out electronics 107 to enable means 106 to determine by application of the algorithm position vectors relating to retroreflector 3 corresponding to optical channels A and B respectively and thereby to output the spatial position of the retroreflector 3.
  • the tracking algorithm is able to determine the trajectory of a target in the associated optical channel A or B illuminating a detector element in the sensor 20.
  • each optical channel has two light sources 5A and 5B.
  • Light from sources 5A and 5B of optical head 200 is transmitted as coincident diverging beams 14A and 14B respectively.
  • light from optical head 201 is in the form of coincident diverging beams 15A and 15B.
  • These beams 14A, 14B,15A and 15B illuminate a volume of space 16 in which retroreflector 3 is able to be moved.
  • the angular intensity distribution profile of light from sources 5A is shown in the diagram of Figure 3b. It will be apparent that the light sources 5A have a uniform angular intensity profile and correspond to the type of light sources 5 which may be used in the aforementioned first and second embodiments, whilst light sources 5B have the non-uniform symmetrical profile shown in Figure 3a.
  • filter or other means may be utilised.
  • the filter or other means may be diffractive or holographic optical elements which accept incoming light and redistribute it over some pre-specified polar diagram.
  • each optical head 200,201 is provided with respective receiving sensors 220.
  • Reflected light 19A and 19B corresponding to beams 14A and 14B respectively are incident on sensors 220 of optical head 200.
  • reflected light 23A and 23B correspond to beams 15A and 15B respectively and are incident on sensors 220 of optical head 201.
  • the sensors 220 measure the intensity of incident light 19A, 19B and 23A, 23B respectively. The received intensity will be a function of the angular bearing with respect to each light source 5A,5B and the distance of the retroreflector 3 from each light source.
  • Range-angle ambiguities are avoided in this embodiment by using light sources 5A,5B having different wavelengths and sensors 220 comprising single element detectors 220A,220B each able to measure light intensity of the wavelength of source 5A or 5B respectively.
  • the range dependence of the reflected signal can thus be eliminated by ratioing the signals from the paired detectors 220A,220B.
  • the sources 5A and 5B could be characterised by modulating the amplitudes of the beams at different temporal frequencies. The signals obtained after filtering by the processing electronics would then be ratioed to eliminate range dependence.
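  • A minimal sketch of the ratioing step follows, assuming a pre-calibrated angular gain curve for the shaped source 5B relative to the uniform source 5A; because the shaped profile is symmetrical the ratio only gives the magnitude of the off-axis angle, and the calibration arrays and function name are illustrative rather than taken from the patent.

```python
import numpy as np

def bearing_from_intensities(i_uniform, i_shaped, profile_angles, profile_gain):
    """Estimate the angular bearing of the retroreflector from one optical head.

    Both returns fall off with range in the same way, so the ratio
    i_shaped / i_uniform depends only on bearing.  profile_angles and
    profile_gain describe the (assumed, pre-calibrated) gain of source 5B
    relative to source 5A, sampled with profile_gain in increasing order
    and monotonic over the half-field being searched.
    """
    ratio = i_shaped / i_uniform
    return float(np.interp(ratio, profile_gain, profile_angles))
```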
  • An improvement to this third embodiment is to provide a further third optical channel in order that an unambiguous determination of the co-ordinates of retroreflector 3 can be obtained.
  • by measuring intensity rather than position, a single element detector can be used in place of the more complex detection arrangements required by the first two embodiments. This may offer cost advantages.
  • nodding or shaking of the head may be detected by using a movement tracking algorithm within associated computation means which is able to discriminate between different movement patterns of reflected light.
  • specific means may be provided for determining such movement as hereinafter described.
  • a fourth embodiment of the invention is adapted to detect the nodding or shaking motion of the head wearing a retroreflector in the form of a spot or disc 303.
  • the optical transmitter heads 300,301 of this fourth embodiment each have a single detector element 320 toward which respectively reflected light beams 319,323 are reflected by corresponding beamsplitter 11.
  • a mask 308 is disposed between detector 320 and beamsplitter 11, similarly in optical head 301 a mask 309 is provided.
  • Detail (a) shows that mask 308 is provided with a vertical slot or aperture 310 and detail (b) shows that mask 309 is provided with a horizontal slot or aperture 311, i.e. apertures 310 and 311 are mutually perpendicular.
  • in detail (a) the resultant image of disc 303 is shown as spot 312 and similarly in detail (b) the image is shown as spot 313.
  • a retroreflector device 403 shown in detail in Figure 5a comprises two mutually perpendicular typically respectively vertically and horizontally disposed striped elements 404,405 wherein the black bars are of low reflectivity and the white bars are of high reflectivity.
  • Element 404 has widely spaced bars and element 405 has closely spaced bars. Many different configurations are possible. The essential requirement is to have widely spaced and closely spaced bar patterns arranged orthogonally.
  • the retroreflector 403 is used in conjunction with optical transmitter/receiver heads in accordance with said first to third embodiments of the invention.
  • a single element detector 420 is represented in Figure 5b on which is superimposed an outline image 403' of the retroreflector 403 moving horizontally in the direction of the arrow. Detector 420 will occupy sequentially positions 420, 420' and 420'' relative to the image 403'.
  • a signal from the detector 420 will have a relatively high frequency waveform such as that shown as 421 which indicates that the retroreflector 403 is being shaken from side to side. From Figure 5c it will be apparent that nodding of the retroreflector up and down will result in a relatively low frequency output waveform 422.
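  • One way the associated computation means might separate these two waveforms is sketched below: the dominant frequency of the detector-420 output is compared with a split frequency, high frequencies being read as shaking and low frequencies as nodding. The split value, the quiet threshold and the function name are illustrative assumptions only.

```python
import numpy as np

def classify_head_motion(signal, sample_rate_hz, split_hz=4.0):
    """Label a detector-420 time series as 'shake', 'nod' or 'still'.

    The striped target of Figure 5 turns side-to-side motion into a
    high-frequency modulation (waveform 421) and nodding into a
    low-frequency one (waveform 422), so the dominant frequency of the
    detector output separates the two gestures.
    """
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal)))
    if spectrum.max() < 1e-6:                      # essentially no modulation
        return "still"
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    dominant = freqs[int(np.argmax(spectrum))]
    return "shake" if dominant >= split_hz else "nod"
```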
  • two optical heads 500,501 each have light sources 505 one of which is shown in detail (a).
  • Each light source 505 is able to generate light at four different wavelengths A,B,C,D, e.g. four light emitting elements may be used.
  • such illumination may be in sequential pulsed form as shown by transmitted signal 510 where pulses corresponding to the different wavelengths and beams A-D follow one after another.
  • this sixth embodiment allows for the tracking of movement of four target retroreflectors 503A,503B,503C,503D each adapted to reflect only light of wavelengths A,B,C,D respectively.
  • This is accomplished, as shown in detail (b) for retroreflector 503B, by each retroreflector 503 being overlaid by a transmissive filter (in this case 511B) which absorbs all light except that of the corresponding wavelength.
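  • A short sketch of how the sensor read-out could be demultiplexed against the pulse sequence of transmitted signal 510 is given below; it assumes the frame read-out is synchronised with the A-D pulse order and reuses a spot-finding routine such as the one sketched earlier. The function names are hypothetical.

```python
from itertools import cycle

def demultiplex_targets(frames, locate_spot, wavelength_order="ABCD"):
    """Split a stream of sensor frames into one spot track per retroreflector.

    Assumes frame k was captured while the source of wavelength
    wavelength_order[k % len(wavelength_order)] was pulsed, so each frame
    can only contain the return from the matching retroreflector 503.
    """
    tracks = {wavelength: [] for wavelength in wavelength_order}
    for frame, wavelength in zip(frames, cycle(wavelength_order)):
        tracks[wavelength].append(locate_spot(frame))
    return tracks
```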
  • the seventh embodiment of the invention tracks multiple retroreflective targets 603A,603B,603C,603D by virtue of each target having a distinct pattern as shown in Figure 7.
  • These targets each have a unique spatial array of retroreflective dots known to associated computation means 600 from a digitised bitmap or look-up table/template 601.
  • the computation means can, from signals received from sensors 20, generate a video frame 602 on which will appear 'spots' corresponding to each dot of targets 603 within space 16. From this frame it is possible to calculate the spatial position of each dot and, in conjunction with template 601, determine the actual position of the individual targets.
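  • The patent does not prescribe how the detected dots are matched against template 601; the sketch below uses one simple possibility, a sorted inter-dot distance signature that is unchanged by translation and rotation of the target. The signature choice and all names are assumptions for illustration.

```python
import numpy as np

def identify_target(detected_dots, templates):
    """Match a cluster of detected dot positions to one of the stored templates.

    detected_dots -- (N, 2) array of spot positions from one cluster in frame 602
    templates     -- dict of target name -> (N, 2) dot positions from table 601
    """
    def signature(points):
        points = np.asarray(points, dtype=float)
        diffs = points[:, None, :] - points[None, :, :]
        return np.sort(np.linalg.norm(diffs, axis=-1).ravel())

    probe = signature(detected_dots)
    scores = {name: float(np.linalg.norm(signature(dots) - probe))
              for name, dots in templates.items()
              if len(dots) == len(detected_dots)}
    return min(scores, key=scores.get) if scores else None
```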
  • two optical transmitting/receiving heads 700,701 each illuminate a flat surface 716 with respective diverging beams of light 14,15.
  • Light sources 5 within optical heads 700,701 result in a non-uniform light pattern, such as for example in the aforedescribed manner of the third embodiment of the invention.
  • the juxtaposition of light beams 14,15 on flat surface 716 provides that the position of a retroreflector disc or puck 703 on the surface 716 can be determined merely by analysing the intensities of reflected light 19,23 and comparing that with an intensity look-up table 710 held within computation means associated with this embodiment.
  • a contour map is shown in which those contours radiating from corner 717 correspond to beam 14 and those radiating from corner 718 correspond to beam 15.
  • An application of this eighth embodiment would be in a board game having some 100 to 200 individual patches each identifiable by a position on the look-up table 710.
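  • A minimal sketch of the look-up step follows, assuming table 710 stores one expected pair of reflected intensities per board patch (for example recorded during a calibration pass); the puck is placed on the patch whose stored pair lies closest to the measured pair. The table layout and all names are assumptions.

```python
import numpy as np

def locate_puck(intensity_14, intensity_15, lookup_table):
    """Return the board patch whose stored intensity pair best matches a reading.

    lookup_table -- dict of patch_id -> (expected_intensity_14, expected_intensity_15),
                    standing in for table 710.
    """
    reading = np.array([intensity_14, intensity_15], dtype=float)
    distances = {patch: float(np.linalg.norm(reading - np.asarray(pair, dtype=float)))
                 for patch, pair in lookup_table.items()}
    return min(distances, key=distances.get)
```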
  • That shown in Figure 9a comprises a hand-held plastics moulding 800 having a cavity 801 which can be enclosed by a resiliently flexible closure member 802.
  • a retroreflector 3 is attached to a rear wall of the cavity 801 and at rest the member 802 is in the upper position shown. By applying light pressure to button 803 the closure member 802 can be moved to the closed position shown in dotted outline thereby masking the retroreflector 3.
  • a retroreflector 3 is housed within a cavity 810 and masked by a closure member 811 pivotal about pin 812.
  • The more complex example shown in Figure 9c is adapted for use with an embodiment of the invention able to discriminate between the position of different targets by illuminating and/or detecting different wavelengths of light.
  • This example has a hollow body 820 housing an elongate retroreflector 3.
  • Buttons 821,822,823 are operative to open shutter mechanisms disposed respectively behind filters 824,825,826. These filters may correspond to each of three wavelengths of illumination or alternatively they may correspond with each of three wavelengths of detection. In the latter case the illustrated example will be illuminated with 'white' light and only reflect that 'colour', i.e. wavelengths associated with the filter whose shutter is open.
  • Figure 10 illustrates a sensor layout 920 utilising linear detector arrays 900, 901 which, for example, receives light beam 919 reflected from a retroreflector (not shown).
  • a lens 921 focuses the spot of beam 919 on a beamsplitter cube 902 from which beams 919A and 919B are reflected.
  • Lenses 903,904 focus respectively beams 919A,919B to provide elongate images respectively 905,906 of the spot of beam 919.
  • each array 900,901 comprises a plurality of sensors each of which is connected to read-out electronics shown respectively as 907,908. It will be appreciated that one-dimensional PSDs could be used as detector arrays 900,901.
  • the target(s) have been moveable in a static volume of light created by stationary illumination means. It should be understood that in alternative embodiments the target(s) may be static and the illumination means may form part of apparatus mounted to a moveable object, typically the head of a user. The appended claims are intended to extend to this variation. Also, although previously described embodiments are designed to determine precise spatial position relative to a fixed datum, embodiments are envisaged in which either or both of the position and orientation of the illumination means and target(s) is or are determined relative to one another and/or relative to an arbitrary datum such as a given start position of the moveable target(s) or illumination means.
  • the reflector has been a retroreflector, but this is not considered to be an essential feature of the invention.
  • a conventional diffuse reflector may be acceptable.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

An optical triangulation tracking system comprises means for creating at least two beams of light (14, 15), both divergent, mounted relative to one another in a spatial and angular relationship such that the beams intersect so as to illuminate a volume of space (16); at least one reflector (3) relatively moveable within said volume of space; a light detector (20) adapted to receive light from the two beams reflected back along its path of transmission by the reflector; and an image processing system (24, 25) connected to each of said detectors, the arrangement being such that relative movements between said beam creation means and the reflector are monitored by the image processing system, which determines by triangulation from the received images the spatial coordinates of the reflector and hence its relative position within said volume of space.
PCT/GB1995/002799 1994-12-01 1995-11-30 Dispositif optique de detection de position WO1996017258A2 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU39878/95A AU3987895A (en) 1994-12-01 1995-11-30 Optical position sensing system

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
GB9424277A GB9424277D0 (en) 1994-12-01 1994-12-01 Optical position system
GB9424277.3 1994-12-01
GBGB9500943.7A GB9500943D0 (en) 1994-12-01 1995-01-18 Optical position sensing system
GB9500943.7 1995-01-18

Publications (2)

Publication Number Publication Date
WO1996017258A2 true WO1996017258A2 (fr) 1996-06-06
WO1996017258A3 WO1996017258A3 (fr) 1997-02-13

Family

ID=26306075

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB1995/002799 WO1996017258A2 (fr) 1994-12-01 1995-11-30 Dispositif optique de detection de position

Country Status (3)

Country Link
AU (1) AU3987895A (fr)
GB (1) GB9500943D0 (fr)
WO (1) WO1996017258A2 (fr)

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000072039A1 (fr) * 1999-05-19 2000-11-30 National Research Council Of Canada Detection de mouvement optique d'irm
WO2003038468A2 (fr) * 2001-10-30 2003-05-08 Conrad Technologies, Inc. Detection optique de position de sources de rayonnement multiples dans un corps pouvant etre deplace
WO2007136745A3 (fr) * 2006-05-19 2008-01-17 Univ Hawaii Système de suivi de mouvement pour imagerie adaptative en temps réel et spectroscopie
WO2012093323A1 (fr) * 2011-01-04 2012-07-12 Koninklijke Philips Electronics N.V. Système de détection de présence et système d'éclairage.
US8284988B2 (en) 2009-05-13 2012-10-09 Applied Vision Corporation System and method for dimensioning objects using stereoscopic imaging
US8508591B2 (en) 2010-02-05 2013-08-13 Applied Vision Corporation System and method for estimating the height of an object using tomosynthesis-like techniques
US8606401B2 (en) 2005-12-02 2013-12-10 Irobot Corporation Autonomous coverage robot navigation system
US8656550B2 (en) 2002-01-03 2014-02-25 Irobot Corporation Autonomous floor-cleaning robot
US8781159B2 (en) 2009-05-13 2014-07-15 Applied Vision Corporation System and method for dimensioning objects using stereoscopic imaging
TWI451064B (zh) * 2011-04-25 2014-09-01 Univ Nat Formosa Laser displacement measuring device and method combined with image measuring instrument
US8838274B2 (en) 2001-06-12 2014-09-16 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US9606209B2 (en) 2011-08-26 2017-03-28 Kineticor, Inc. Methods, systems, and devices for intra-scan motion correction
US9607377B2 (en) 2013-01-24 2017-03-28 Kineticor, Inc. Systems, devices, and methods for tracking moving targets
US9717461B2 (en) 2013-01-24 2017-08-01 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US9734589B2 (en) 2014-07-23 2017-08-15 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US9769387B1 (en) 2013-11-05 2017-09-19 Trace Live Network Inc. Action camera system for unmanned aerial vehicle
US9782141B2 (en) 2013-02-01 2017-10-10 Kineticor, Inc. Motion tracking system for real time adaptive motion compensation in biomedical imaging
US9943247B2 (en) 2015-07-28 2018-04-17 The University Of Hawai'i Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan
US9949608B2 (en) 2002-09-13 2018-04-24 Irobot Corporation Navigational control system for a robotic device
US9955841B2 (en) 2006-05-19 2018-05-01 Irobot Corporation Removing debris from cleaning robots
US10004462B2 (en) 2014-03-24 2018-06-26 Kineticor, Inc. Systems, methods, and devices for removing prospective motion correction from medical imaging scans
US10070764B2 (en) 2007-05-09 2018-09-11 Irobot Corporation Compact autonomous coverage robot
US10314449B2 (en) 2010-02-16 2019-06-11 Irobot Corporation Vacuum brush
US10327708B2 (en) 2013-01-24 2019-06-25 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US10470629B2 (en) 2005-02-18 2019-11-12 Irobot Corporation Autonomous surface cleaning robot for dry cleaning
WO2020081927A1 (fr) * 2018-10-18 2020-04-23 Cyberoptics Corporation Capteur tridimensionnel à canaux placés à l'opposé les uns des autres
US10716515B2 (en) 2015-11-23 2020-07-21 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8412377B2 (en) 2000-01-24 2013-04-02 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US8788092B2 (en) 2000-01-24 2014-07-22 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US6956348B2 (en) 2004-01-28 2005-10-18 Irobot Corporation Debris sensor for cleaning apparatus
US6690134B1 (en) 2001-01-24 2004-02-10 Irobot Corporation Method and system for robot localization and confinement
US8396592B2 (en) 2001-06-12 2013-03-12 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US9128486B2 (en) 2002-01-24 2015-09-08 Irobot Corporation Navigational control system for a robotic device
US8386081B2 (en) 2002-09-13 2013-02-26 Irobot Corporation Navigational control system for a robotic device
US7332890B2 (en) 2004-01-21 2008-02-19 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US7720554B2 (en) 2004-03-29 2010-05-18 Evolution Robotics, Inc. Methods and apparatus for position estimation using reflected light sources
KR101142564B1 (ko) 2004-06-24 2012-05-24 아이로보트 코퍼레이션 자동 로봇 장치용의 원격 제어 스케줄러 및 방법
US7706917B1 (en) 2004-07-07 2010-04-27 Irobot Corporation Celestial navigation system for an autonomous robot
US8972052B2 (en) 2004-07-07 2015-03-03 Irobot Corporation Celestial navigation system for an autonomous vehicle
US8392021B2 (en) 2005-02-18 2013-03-05 Irobot Corporation Autonomous surface cleaning robot for wet cleaning
WO2006089307A2 (fr) 2005-02-18 2006-08-24 Irobot Corporation Robot autonome pour nettoyage humide et sec
US8930023B2 (en) 2009-11-06 2015-01-06 Irobot Corporation Localization by learning of wave-signal distributions
ES2413862T3 (es) 2005-12-02 2013-07-17 Irobot Corporation Robot modular
EP2251757B1 (fr) 2005-12-02 2011-11-23 iRobot Corporation Mobilité de robot de couverture
EP2533120B1 (fr) 2005-12-02 2019-01-16 iRobot Corporation Système de robot
US8417383B2 (en) 2006-05-31 2013-04-09 Irobot Corporation Detecting robot stasis
EP3655793A1 (fr) * 2017-07-19 2020-05-27 Signify Holding B.V. Système et procédé pour fournir des informations spatiales d'un objet à un dispositif

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4709580A (en) * 1986-02-26 1987-12-01 Bd Systems, Inc. Retroflective attitude determining system
FR2601443A1 (fr) * 1986-07-10 1988-01-15 Centre Nat Etd Spatiales Capteur de position et son application a la telemetrie, notamment pour la robotique spatiale
US4964722A (en) * 1988-08-29 1990-10-23 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Remote object configuration/orientation determination
EP0405423A2 (fr) * 1989-06-30 1991-01-02 Fraunhofer-Gesellschaft Zur Förderung Der Angewandten Forschung E.V. Méthode et dispositif pour déterminer la position d'un objet
EP0554978A2 (fr) * 1992-01-22 1993-08-11 Acushnet Company Système de surveillance pour mesurer les caractéristiques de vol d'un objet de sport mouvant

Cited By (58)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000072039A1 (fr) * 1999-05-19 2000-11-30 National Research Council Of Canada Detection de mouvement optique d'irm
US9622635B2 (en) 2001-01-24 2017-04-18 Irobot Corporation Autonomous floor-cleaning robot
US8838274B2 (en) 2001-06-12 2014-09-16 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
WO2003038468A2 (fr) * 2001-10-30 2003-05-08 Conrad Technologies, Inc. Detection optique de position de sources de rayonnement multiples dans un corps pouvant etre deplace
WO2003038468A3 (fr) * 2001-10-30 2003-11-27 Conrad Technologies Inc Detection optique de position de sources de rayonnement multiples dans un corps pouvant etre deplace
US8656550B2 (en) 2002-01-03 2014-02-25 Irobot Corporation Autonomous floor-cleaning robot
US8671507B2 (en) 2002-01-03 2014-03-18 Irobot Corporation Autonomous floor-cleaning robot
US9949608B2 (en) 2002-09-13 2018-04-24 Irobot Corporation Navigational control system for a robotic device
US10470629B2 (en) 2005-02-18 2019-11-12 Irobot Corporation Autonomous surface cleaning robot for dry cleaning
US8606401B2 (en) 2005-12-02 2013-12-10 Irobot Corporation Autonomous coverage robot navigation system
US10869611B2 (en) 2006-05-19 2020-12-22 The Queen's Medical Center Motion tracking system for real time adaptive imaging and spectroscopy
US8571293B2 (en) 2006-05-19 2013-10-29 The Queen's Medical Center Motion tracking system for real time adaptive imaging and spectroscopy
US8374411B2 (en) 2006-05-19 2013-02-12 The Queen's Medical Center Motion tracking system for real time adaptive imaging and spectroscopy
US9867549B2 (en) 2006-05-19 2018-01-16 The Queen's Medical Center Motion tracking system for real time adaptive imaging and spectroscopy
US8121361B2 (en) 2006-05-19 2012-02-21 The Queen's Medical Center Motion tracking system for real time adaptive imaging and spectroscopy
WO2007136745A3 (fr) * 2006-05-19 2008-01-17 Univ Hawaii Système de suivi de mouvement pour imagerie adaptative en temps réel et spectroscopie
US9076212B2 (en) 2006-05-19 2015-07-07 The Queen's Medical Center Motion tracking system for real time adaptive imaging and spectroscopy
US9138175B2 (en) 2006-05-19 2015-09-22 The Queen's Medical Center Motion tracking system for real time adaptive imaging and spectroscopy
US9955841B2 (en) 2006-05-19 2018-05-01 Irobot Corporation Removing debris from cleaning robots
US10244915B2 (en) 2006-05-19 2019-04-02 Irobot Corporation Coverage robots and associated cleaning bins
US11072250B2 (en) 2007-05-09 2021-07-27 Irobot Corporation Autonomous coverage robot sensing
US10070764B2 (en) 2007-05-09 2018-09-11 Irobot Corporation Compact autonomous coverage robot
US10299652B2 (en) 2007-05-09 2019-05-28 Irobot Corporation Autonomous coverage robot
US11498438B2 (en) 2007-05-09 2022-11-15 Irobot Corporation Autonomous coverage robot
US8781159B2 (en) 2009-05-13 2014-07-15 Applied Vision Corporation System and method for dimensioning objects using stereoscopic imaging
US8284988B2 (en) 2009-05-13 2012-10-09 Applied Vision Corporation System and method for dimensioning objects using stereoscopic imaging
US8508591B2 (en) 2010-02-05 2013-08-13 Applied Vision Corporation System and method for estimating the height of an object using tomosynthesis-like techniques
US11058271B2 (en) 2010-02-16 2021-07-13 Irobot Corporation Vacuum brush
US10314449B2 (en) 2010-02-16 2019-06-11 Irobot Corporation Vacuum brush
WO2012093323A1 (fr) * 2011-01-04 2012-07-12 Koninklijke Philips Electronics N.V. Système de détection de présence et système d'éclairage.
TWI451064B (zh) * 2011-04-25 2014-09-01 Univ Nat Formosa Laser displacement measuring device and method combined with image measuring instrument
US9606209B2 (en) 2011-08-26 2017-03-28 Kineticor, Inc. Methods, systems, and devices for intra-scan motion correction
US10663553B2 (en) 2011-08-26 2020-05-26 Kineticor, Inc. Methods, systems, and devices for intra-scan motion correction
US9717461B2 (en) 2013-01-24 2017-08-01 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US9779502B1 (en) 2013-01-24 2017-10-03 Kineticor, Inc. Systems, devices, and methods for tracking moving targets
US10327708B2 (en) 2013-01-24 2019-06-25 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US10339654B2 (en) 2013-01-24 2019-07-02 Kineticor, Inc. Systems, devices, and methods for tracking moving targets
US9607377B2 (en) 2013-01-24 2017-03-28 Kineticor, Inc. Systems, devices, and methods for tracking moving targets
US9782141B2 (en) 2013-02-01 2017-10-10 Kineticor, Inc. Motion tracking system for real time adaptive motion compensation in biomedical imaging
US10653381B2 (en) 2013-02-01 2020-05-19 Kineticor, Inc. Motion tracking system for real time adaptive motion compensation in biomedical imaging
US9769387B1 (en) 2013-11-05 2017-09-19 Trace Live Network Inc. Action camera system for unmanned aerial vehicle
US10187580B1 (en) 2013-11-05 2019-01-22 Dragonfly Innovations Inc. Action camera system for unmanned aerial vehicle
US10004462B2 (en) 2014-03-24 2018-06-26 Kineticor, Inc. Systems, methods, and devices for removing prospective motion correction from medical imaging scans
US9734589B2 (en) 2014-07-23 2017-08-15 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US10438349B2 (en) 2014-07-23 2019-10-08 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US11100636B2 (en) 2014-07-23 2021-08-24 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US10660541B2 (en) 2015-07-28 2020-05-26 The University Of Hawai'i Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan
US9943247B2 (en) 2015-07-28 2018-04-17 The University Of Hawai'i Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan
US10716515B2 (en) 2015-11-23 2020-07-21 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
WO2020081925A1 (fr) * 2018-10-18 2020-04-23 Cyberoptics Corporation Capteur tridimensionnel à canaux opposés
US11029146B2 (en) 2018-10-18 2021-06-08 Cyberoptics Corporation Three-dimensional sensor with counterposed channels
CN112912688A (zh) * 2018-10-18 2021-06-04 赛博光学公司 具有对列通道的三维传感器
CN112888913A (zh) * 2018-10-18 2021-06-01 赛博光学公司 具有对列通道的三维传感器
US10883823B2 (en) 2018-10-18 2021-01-05 Cyberoptics Corporation Three-dimensional sensor with counterposed channels
JP2022505166A (ja) * 2018-10-18 2022-01-14 サイバーオプティクス コーポレーション 対向配置チャネルを有する三次元センサ
WO2020081927A1 (fr) * 2018-10-18 2020-04-23 Cyberoptics Corporation Capteur tridimensionnel à canaux placés à l'opposé les uns des autres
US11604062B2 (en) 2018-10-18 2023-03-14 Cyberoptics Corporation Three-dimensional sensor with counterposed channels
CN112912688B (zh) * 2018-10-18 2023-11-03 赛博光学公司 具有对列通道的三维传感器

Also Published As

Publication number Publication date
GB9500943D0 (en) 1995-03-08
WO1996017258A3 (fr) 1997-02-13
AU3987895A (en) 1996-06-19

Similar Documents

Publication Publication Date Title
WO1996017258A2 (fr) Dispositif optique de detection de position
US11868522B2 (en) Method for ascertaining a viewing direction of an eye
US10721459B2 (en) Scanning projectors and image capture modules for 3D mapping
US5861940A (en) Eye detection system for providing eye gaze tracking
US20210311171A1 (en) Improved 3d sensing
US11067692B2 (en) Detector for determining a position of at least one object
US7123351B1 (en) Method and apparatus for measuring distances using light
KR101762525B1 (ko) 다수의 이미터들을 이용한 깊이 주사를 위한 장치 및 방법
US6867753B2 (en) Virtual image registration in augmented display field
CA2700603C (fr) Systeme de poursuite optique a grand champ de vision
EP2581034B1 (fr) Éclairage de suivi des yeux
WO2019009963A1 (fr) Système optique compact à dispositifs de balayage mems pour une génération d'images et un suivi d'objet
EP0327072A2 (fr) Dispositif et méthode de mesure d'alignement d'axes pour des systèmes électro-optiques
US20210025978A1 (en) Coordinate measuring device having automatic target object recognition
US4501961A (en) Vision illumination system for range finder
WO2005050130A1 (fr) Detecteur de proximite
US7221437B1 (en) Method and apparatus for measuring distances using light
RU2543680C2 (ru) Оптический отражатель с полуотражающими пластинами для устройства отслеживания положения шлема и шлем, содержащий такое устройство
US5760932A (en) Target for laser leveling systems
CN113885694A (zh) 用激光多普勒干涉测量法进行眼睛跟踪
US20030164841A1 (en) System and method for passive three-dimensional data acquisition
CN108303708B (zh) 三维重建系统及方法、移动设备、护眼方法、ar设备
JP4114637B2 (ja) 位置計測システム
KR102595778B1 (ko) 후방 산란된 레이저 스페클 패턴들의 광학 흐름을 추적함으로써 움직임의 6개의 자유도들을 검출하기 위한 시스템
GB2301968A (en) Helmet position measurement system

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AL AM AT AU BB BG BR BY CA CH CN CZ DE DK EE ES FI GB GE HU IS JP KE KG KP KR KZ LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK TJ TM TT UA UG US UZ VN

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): KE LS MW SD SZ UG AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN ML MR NE SN TD TG

CFP Corrected version of a pamphlet front page
CR1 Correction of entry in section i

Free format text: PAT.BUL.26/96 THE CODE IDENTIFYING THE KIND OF DOCUMENT TO WHICH THE PAMPHLET RELATES SHOULD READ "A1" INSTEAD OF "A2"

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
AK Designated states

Kind code of ref document: A3

Designated state(s): AL AM AT AU BB BG BR BY CA CH CN CZ DE DK EE ES FI GB GE HU IS JP KE KG KP KR KZ LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK TJ TM TT UA UG US UZ VN

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): KE LS MW SD SZ UG AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN ML MR NE SN TD TG

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: CA