WO2002075350A1 - Method and device for determining an angular position of a reflector - Google Patents

Publication number
WO2002075350A1
Authority
WO
WIPO (PCT)
Application number
PCT/SE2002/000544
Other languages
French (fr)
Inventor
Kalevi HYYPPÄ
Original Assignee
Danaher Motion Särö AB
Priority to SE0100954-7 priority Critical
Priority to SE0100954A priority patent/SE523318C2/en
Application filed by Danaher Motion Särö AB filed Critical Danaher Motion Särö AB
Publication of WO2002075350A1 publication Critical patent/WO2002075350A1/en

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0242Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/12Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87Combinations of systems using electromagnetic waves other than radio waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S3/00Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S3/78Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
    • G01S3/782Systems for determining direction or deviation from predetermined direction
    • G01S3/783Systems for determining direction or deviation from predetermined direction using amplitude comparison of signals derived from static detectors or detector systems
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2201/00Application
    • G05D2201/02Control of position of land vehicles
    • G05D2201/0216Vehicle for transporting goods in a warehouse, factory or similar

Abstract

A method for determining an angular position of a reflector from a vehicle reference position in relation to an angular reference position including the step of arranging vertically extended reflectors in a working area. From the vehicle reference position vertical image slices of at least one horizontal segment of the working area are obtained, each image slice comprising a plurality of pixels, and intensity values of pixels in each vertical image slice are added into a set of column sums, one position in the set forming an angular reference point. Then the angular position of a reflector is determined as the position of a peak column sum or a subset of adjacent peak column sums in the said set in relation to the angular reference point. The invention also includes a method for determining the distance between a reflector and a vehicle reference position using reflectors with a predetermined vertical extension. The number of consecutive pixels having intensity values exceeding a predetermined intensity value is calculated, and based on said number of consecutive pixels the distance is calculated. A device for determining the angular position comprises an image sensing means. An adder is provided for adding intensity values of pixels in vertical image slices into a set of column sums, and computing means are provided for determining the angular position of a reflector as the position of a peak column sum or a subset of adjacent peak column sums in the said set in relation to the angular reference point.

Description

METHOD AND DEVICE FOR DETERMINING AN ANGULAR POSITION OF A REFLECTOR

TECHNICAL FIELD

The invention relates to a method and a device for determining an angular position of a reflector in relation to an angular reference position including the step of arranging vertically extending reflectors in a working area. Automated guided vehicles are used in many industrial settings, for example in the form of trucks for transport of goods in factories and warehouses.

PRIOR ART

According to a commonly used system, magnetic globs or similar devices are laid out along the transport paths of the trucks. Because of the high initial costs of such systems and the difficulty of later modifying the routes to be followed by the trucks, new systems based on light reflectors have been developed.

In some prior art systems reflectors are used with identification, that is, the vehicles can determine, on the basis of the reflected signal, which unique reflector the signal is coming from. Such systems can be fast and effective, but the unique reflectors are relatively expensive. There are also limitations as regards the distance at which the signal can be registered.

A navigation system with completely anonymous reflectors in the form of reflector strips is disclosed in US-A-4811228. The reflectors lack identity but are well defined with respect to their position. The position of each reflector is stored onboard a vehicle together with relevant coordinates for the transport area. A light source onboard the vehicle sends out a concentrated laser beam that sweeps over the transport area. Reflections from the reflectors and other objects are registered in the vehicle and give bearings to a possible reflector.

The initial steps that are taken for defining an initial position of the vehicle are further developed in WO99/21026. Also in this case, a beam is transmitted from the vehicle over a search sector and reflected signals are received onboard the vehicle. The method disclosed in WO99/21026 also includes steps for continuously determining the distances between reflectors and a reference point on the vehicle.

Use of a laser or other light source for producing a beam that is swept over a working area limits the design possibilities of the vehicle, because the light source must not be covered in any direction by objects onboard the vehicle. The laser device used in prior art systems normally requires a complicated system of mirrors to sweep the beam around. A drawback of such mirror systems is the large number of moving parts involved.

SUMMARY OF THE INVENTION

It is an object of the present invention to overcome the problems and drawbacks referred to above. According to the invention there is provided a method and a device for determining an angular position of a reflector in relation to an angular reference position. Generally, the invention can be used for determining the position of a vehicle. More specifically, the invention can be used in a system for navigating an automated guided vehicle.

According to the invention an image of a working area is obtained. A simple and fast algorithm is applied to the image to substantially limit the amount of data necessary for the evaluation of the image. Intensity values of pixels in vertical image slices are added into a set of column sums. A large sum indicates the presence of a reflector in an angular position corresponding to a specific slice or a set of adjacent slices. A very precise value of the angular direction to the reflector can be determined by calculating the gravity point of the adjacent slices relating to a reflector. By using a gravity point calculation it is possible to achieve an accuracy exceeding the width of a pixel. By using reflectors having a predetermined vertical extension it is possible also to calculate the distance from a reference position to the reflector. A method for calculating the distance includes the steps of calculating the number of consecutive pixels in a column, or vertical slice, having intensity values exceeding a predetermined threshold value. The number of such pixels corresponds, through a non-linear relation, to the vertical extension of the reflector. The sums of intensity values from each vertical slice are preferably stored in a first vector. The number of elements in the first vector corresponds to the number of vertical image slices that can be obtained from an image sensor such as a CCD device. In a similar way the number of pixels having an intensity value exceeding a predetermined value are gathered in a second vector. Further calculations for determining the angular position of and the distance to a reflector are then performed on the basis of the content of said first and said second vector.
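The gravity-point calculation described above is an intensity-weighted mean (centroid) of the column sums over the peak. A minimal Python sketch, assuming the column sums are held in a plain list (the sample peak values are invented for illustration; this is not the patent's hardware implementation):

```python
def gravity_point(column_sums, start, end):
    """Sub-pixel peak position as the intensity-weighted mean (centroid)
    of the column sums over the peak interval [start, end]."""
    total = sum(column_sums[start:end + 1])
    weighted = sum(pp * column_sums[pp] for pp in range(start, end + 1))
    return weighted / total

# Synthetic symmetric peak centred on column 258:
cs = [0] * 300
for pp in range(253, 264):
    cs[pp] = 100 - 10 * abs(pp - 258)

print(gravity_point(cs, 253, 263))  # 258.0
```

Because the centroid weighs every column in the peak, its result is a fractional pixel position, which is how the accuracy can exceed the width of one pixel.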

According to the invention it is possible to use a conventional CCD device for obtaining an image of the working area where reflectors are mounted. Depending on the angular working area of the CCD device it may be appropriate to mount a CCD device in each corner of a vehicle to cover a complete 360° circle around the vehicle.

Preferably, so-called retro-reflectors are used. Such reflectors reflect a large amount of incident light back towards the source of light. Therefore, the light source is preferably mounted adjacent to the CCD device.

Further features and advantages of the invention will be more apparent from the following detailed description and the accompanying claims and drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

A further understanding of the features and advantages of the present invention may be realised with reference to the description below and to the accompanying drawings, in which

Fig. 1 is a top view of an automated guided vehicle in accordance with one embodiment of the present invention,
Fig. 2 is a side elevational view of the vehicle in Fig. 1,
Fig. 3 is a schematic diagram showing one embodiment of a column adding section of the invention,
Fig. 4 is a diagram showing the result of the adding in Fig. 3 in a segment of the obtained image,
Fig. 5 is a schematic diagram showing one embodiment of a threshold and adding section of the invention,
Fig. 6 is a diagram showing the result of the adding in Fig. 5 in a segment of the obtained image, and
Fig. 7 is a schematic diagram showing one embodiment of hardware in accordance with the invention.

DETAILED DESCRIPTION

In the embodiment shown in Fig. 1 an automated guided vehicle (AGV) 10 comprises a driver unit 11 and a forklift 12. The driver unit 11 includes the electronic and hydraulic means necessary to operate the forklift 12. The driver unit 11 is rectangular and in each corner there is provided an image sensing means 13, such as a CCD camera. The view angle of the cameras 13 is approximately 90° as indicated by dotted lines. Each one of the cameras 13 is connected to an image processing means 14, which will be further described in connection with Figs. 3-6. The image processing means 14 is operatively connected to a computing means 15, such as a computer.

The AGV 10 is designed to operate in a working area. On walls 16 surrounding the working area, and on other suitable objects within the working area, there are provided a plurality of reflectors 17. The reflectors 17 are preferably so-called retro-reflectors, that is, they reflect light efficiently in the direction of a light source. The reflectors preferably have less extension horizontally than vertically, as shown in Fig. 2. The direction to a reflector 17 is defined in relation to a reference direction D of the vehicle. As shown in Fig. 1 an angle α is defined between the reference direction D and the direction to a reflector 17.

As shown in Fig. 2 the cameras 13 are mounted at a low vertical position. If the AGV 10 carries a load 18, as indicated by dot-and-dash lines, a low position of the cameras will allow a free line of sight below the load. It is also possible to arrange the cameras 13 at another vertical position, and also more closely together, so as to achieve a more complete sight angle.

Each reflector 17 has a well defined vertical extension or height, which can be used for determining the distance between the reflector and the AGV 10, as will be further described below. Each of the cameras 13 produces a video signal, which is an input to the circuitry shown in Fig. 3 and Fig. 5. The video signal is converted in an analog-digital converter 19 to produce a digital signal for further processing. It should be noted that digital cameras could also be used; in such cases the converter 19 can be left out. If a PAL video signal is used, the video signal comprises 625 lines and each image is updated 25 times per second. In an NTSC video signal there are 525 lines, which are updated 30 times per second. To obtain vertical slices of the image obtained by the video cameras 13 it is possible to use a frame grabber to produce an image containing 512x512 pixels. One line of the video signal lasts for 52 μs. 512 measuring points are needed over each line, and each measuring point on a line corresponds to a pixel.

The analog-digital converter 19 is operatively connected to an adder 20, which in turn is operatively connected to a first memory means 21. The intensity value in each position of the 512x512 pixel matrix is added to the corresponding intensity values of other lines in the image. Thus, a vector is formed, each element of the vector holding the sum of the intensity values of a vertical slice of the image. The full image information is condensed into a one-dimensional vector. Those positions in the one-dimensional vector that correspond to a vertical slice of the image of a reflector will have a higher column sum value CS than other vertical slices. Thus, a high column sum CS indicates the presence of a reflector in a direction corresponding to the position of the element in the vector.
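The column-summing step can be sketched in Python as follows; this is an illustrative software model of the adder's behaviour (the test image is invented), not the hardware circuitry of Fig. 3:

```python
def column_sums(image):
    """Condense a full image into a one-dimensional vector: one sum of
    pixel intensities per vertical slice (column). A large sum suggests
    a reflector in the corresponding angular direction."""
    width = len(image[0])
    sums = [0] * width
    for row in image:
        for col, intensity in enumerate(row):
            sums[col] += intensity
    return sums

# Four-line test image with a bright vertical stripe in column 2:
img = [[10, 10, 200, 10, 10]] * 4
print(column_sums(img))  # [40, 40, 800, 40, 40]
```

The reduction from a 512x512 image to a 512-element vector is what makes the subsequent evaluation simple and fast.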

Fig. 4 is a diagram showing the column sum CS of a segment of the image from pixel position (PP) 250 to pixel position 270. In approximately 10 pixel positions, from 253 to 263, there is a peak in the column sum. The peak indicates the presence of a reflector in an angular position corresponding to these pixel positions PP. The width of the peak illustrates that the width of a reflector exceeds the width of one pixel; this is of course also related to the distance between the reflector and the camera 13. By calculating the gravity point of the peak as shown in Fig. 4 it is possible to determine the angle to the reflector at a resolution exceeding the pixel resolution. Forming the reflectors in the shape of a parallel trapezoid with horizontal bases, as shown in Fig. 2, and thus reducing the aliasing effect inherent in discrete pixel cameras, such as CCD-equipped cameras, may further increase the resolution. The pixel position, or rather the gravity point position, of the peak as shown in Fig. 4 corresponds to an angular position of a reflector in relation to the reference direction D of the vehicle. For cameras having an angle of sight not adjoining the reference direction D an angular offset O can be used. The circuitry of Fig. 3 will produce a first vector. On the basis of said first vector the computing means 15 is able to calculate with high accuracy the angle between the reference direction D of the vehicle and the reflector.

The circuitry of Fig. 5 will also produce a vector, similar to the vector produced by the circuitry of Fig. 3. The video signal is converted in the analog-digital converter 19 and all further calculations are based on digital values of the image intensity in each pixel. A threshold value corresponding to an assumed intensity value from a reflector is stored in a threshold memory 22. The digital intensity value of each pixel is compared in a comparing means 25 to the value stored in the threshold memory 22.
If the incoming intensity value exceeds the threshold value the content of that vector element is incremented by one in a second adder 23. The new added value is stored in a second memory means 24. The circuitry of Fig. 5 thus processes the intensity values of a video signal to produce a second vector. Each element of the vector holds the calculated number of pixels in a vertical column having an intensity value exceeding the threshold value stored in the threshold memory 22.
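The comparator-and-counter path can be sketched in Python in the same style; a behavioural model of the Fig. 5 circuitry, with an assumed 8-bit threshold value (the real threshold is whatever is stored in the threshold memory 22):

```python
def bright_pixel_counts(image, threshold=128):
    """Second vector: for each column, the number of pixels whose
    intensity exceeds the threshold (comparator 25 feeding adder 23).
    threshold=128 is an assumed 8-bit example value."""
    counts = [0] * len(image[0])
    for row in image:
        for col, intensity in enumerate(row):
            if intensity > threshold:
                counts[col] += 1
    return counts

# Three-line test image; only column 1 has pixels above the threshold:
img = [[10, 200, 10], [10, 210, 10], [10, 90, 10]]
print(bright_pixel_counts(img, 128))  # [0, 2, 0]
```

Each count approximates the apparent height of a reflector in pixels, which is what the distance calculation below relies on.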

The diagram in Fig. 6 shows the content of a segment of the vector produced by the circuitry in Fig. 5. The segment of pixel positions PP from approximately 253 to approximately 263 indicates the presence of a reflector in the corresponding angular position. The diagram shows that number of pixels having intensity values exceeding the threshold value is 70. The number of such pixels is indicative of the distance to the reflector, because the number is indicative of the height of the reflector as seen from the camera 13. The relationship between the number of pixels in the vector and the distance between the reflector and the camera can be expressed as in the equation below.

ph = 2k · arctan(h / (2x))

where ph = height of reflector in number of pixels, h = geometric height of reflector in meters, x = distance between reflector and camera, and k = an optical constant.

In one example the optical constant k was determined to be 1180. The distance was calculated with an acceptable accuracy up to approximately 25 m when using a 0.75 m reflector. It should be possible to measure larger distances by improving the light conditions during measurement.

Fig. 7 shows a more detailed diagram of the hardware that can be used in connection with the invention. The video signal from the cameras 13 is used as an input to the analog-digital converter 19 as described above. The video signal is also fed to a sync separator 26 for line and field synchronising signals. In the separator 26 both the line synchronising pulses and the field synchronising pulses are extracted from the video signal. The synchronising signals are used in different elements of the hardware for controlling the image processing. The video signal is also used to generate a clamping signal used by the analog-digital converter 19. A synchronising means 27, operatively connected to the separator 26, is used to synchronise the cameras and other parts of the hardware. The converter 19 converts the analog signal from the camera 13 to an eight bit digital word. At the start of each line the black level of the video signal is locked with a clamping signal.
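Assuming the distance relation ph = 2k·arctan(h/(2x)) (a reconstruction of the garbled equation above, consistent with the variable list that follows it), the distance x follows by inversion: x = h / (2·tan(ph/(2k))). A Python sketch using the example values k = 1180 and h = 0.75 m from the text:

```python
import math

K = 1180   # optical constant from the example in the text
H = 0.75   # reflector height in metres

def pixel_height(x, h=H, k=K):
    """Forward model: apparent reflector height in pixels at distance x."""
    return 2.0 * k * math.atan(h / (2.0 * x))

def distance(ph, h=H, k=K):
    """Inverse: distance in metres from a measured pixel height ph."""
    return h / (2.0 * math.tan(ph / (2.0 * k)))
```

With these values a 0.75 m reflector at the 25 m limit subtends roughly 35 pixels, and the 70-pixel count in the Fig. 6 example corresponds to roughly half that distance, consistent with the near-inverse-proportional relation at small angles.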

The digital signal is fed to a comparing and adding means 28. In the comparing and adding means 28 the digital value of the intensity is compared to a threshold value. If the digital intensity value is larger, that is, the measured pixel is lighter than the threshold, an accumulated threshold sum is incremented by one in that vector position. The digital signal is also fed to an adding means 29 that adds up all the digital intensity values related to one vertical slice of the image.

The counted number of light pixels is fed to a first FIFO memory 30 through a first latch 31. A FIFO (First In First Out) memory does not require an address bus; instead, reading and writing are done by a pulse at a clock input. A FIFO memory is an appropriate type of memory in time-critical applications such as the present one.

In a corresponding way the result of the addition of intensity values in the adding means 29 is fed to a second FIFO memory 32 through a second latch 33. The first and second latches are used to retain the information so as to allow the next unit to read the information.

An output of the first FIFO memory 30 is connected to a third latch 34 and in a similar manner an output of the second FIFO memory 32 is connected to a fourth latch 35. The third and fourth latches are also used for a reset during the first line of the image, where no accumulated value should be added.

Several elements of the hardware require a timer control signal. In the embodiment shown in Fig. 7 a finite state machine circuitry 36 is used. The timing can be divided into three sections. A first section relates to the actual line being processed, a second section relates to the position on the line, and a third section controls the generation of control signals. It should be noted that the video signal also contains redundant image information. The relevant information starts a few lines from the top of the image, and the information on each line starts at some distance from the synchronising signal.

Two specific timing situations occur. During the first line of the image no information should be read from the first FIFO memory 30 and the second FIFO memory 32. In these cases incoming digital image data is added to zero and the result of the addition is stored in a FIFO memory pixel by pixel. A second problem is to handle the last line of the image. The complete vector is then fed into the first and second two port memory means 21, 24.

The computing means 15 comprises basically two elements. A first element is a microprocessor 37, which in a practical embodiment is a Motorola 68332 chip. The microprocessor 37 is operatively connected to the first memory means 21 and the second memory means 24 for processing the content of said memories. The microprocessor 37 is also operatively connected to the image processing means 14 to receive interrupt signals and status information, for instance from the finite state machine 36.

The first memory means 21 holds a first vector, each element thereof holding the sum of the intensities in each column. The second memory means 24 holds a second vector, each element thereof holding the number of pixels in each column that are brighter than a threshold value. On the basis of said vectors the microprocessor 37 is able to calculate relevant angular positions and distances to the reflectors.
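The microprocessor's task can be sketched end to end from the two vectors; a hypothetical Python outline under the same assumptions as above (the reconstructed distance relation and invented camera parameters — pixels_per_degree and offset_deg are illustrative, not values from the text):

```python
import math

def locate_reflector(sums, counts, pixels_per_degree=5.7, offset_deg=0.0,
                     h=0.75, k=1180):
    """From the column-sum vector and the bright-pixel-count vector,
    return (angle_deg, distance_m) for the strongest peak.
    pixels_per_degree and offset_deg are illustrative camera parameters."""
    # 1. Find the peak column and its contiguous non-zero neighbourhood.
    peak = max(range(len(sums)), key=lambda i: sums[i])
    lo = hi = peak
    while lo > 0 and sums[lo - 1] > 0:
        lo -= 1
    while hi < len(sums) - 1 and sums[hi + 1] > 0:
        hi += 1
    # 2. Sub-pixel angle from the gravity point (centroid) of the peak.
    total = sum(sums[lo:hi + 1])
    centroid = sum(i * sums[i] for i in range(lo, hi + 1)) / total
    angle = centroid / pixels_per_degree + offset_deg
    # 3. Distance from the bright-pixel count at the peak column.
    ph = counts[peak]
    dist = h / (2.0 * math.tan(ph / (2.0 * k)))
    return angle, dist

# Small synthetic example: a three-column peak around column 131.
sums = [0] * 512
sums[130], sums[131], sums[132] = 90, 120, 90
counts = [0] * 512
counts[131] = 118
angle, dist = locate_reflector(sums, counts)  # angle ≈ 23.0°, dist ≈ 7.5 m
```

The offset parameter plays the role of the angular offset O mentioned above for cameras whose field of view does not adjoin the reference direction D.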

An external control register 38 is operatively connected to the microprocessor 37. The external control register 38 generates control signals to different elements of the image processing means 14. One control signal may include the threshold value used in the comparing and adding means 28.

The detection of the retro-reflective reflectors 17 is facilitated if a light source is arranged in the vicinity of each of the cameras 13. It is desirable not to disturb the environment, and therefore the wavelength of the light source can be within or close to the infrared region, where the human eye has low sensitivity. The light source can be a stroboscope lamp or a plurality of infrared diodes connected in series. Normally the camera is provided with an electronic shutter that is open during a short time interval (e.g. 1/1000 s). The short opening time of the shutter makes it possible to increase the current through the diodes during that short time interval.

The infrared light source is synchronised to the camera by using the field synchronising pulse from the camera. Preferably a separate signal is obtained from the separator 26.

Due to the relatively wide field of view required, most lenses introduce a distortion of the projected image. While the evaluation algorithms described above are still valid, the device should nevertheless be calibrated to compensate for any resulting angular error.

Claims

1. A method for determining an angular position of a reflector from a vehicle reference position in relation to an angular reference position including the step of arranging vertically extended reflectors in a working area, characterised by obtaining from the vehicle reference position vertical image slices of at least one horizontal segment of the working area, each image slice comprising a plurality of pixels, adding intensity values of pixels in each vertical image slice into a set of column sums, one position in the set forming an angular reference point, and determining the angular position of a reflector as the position of a peak column sum or a subset of adjacent peak column sums in the said set in relation to the angular reference point.
2. A method as claimed in claim 1, further including the step of calculating the gravity point of a peak formed by a peak column sum or a subset of adjacent peak column sums and determining the angular position of a reflector as the position of the gravity point.
3. A method as claimed in claim 1, wherein the reflectors have a predetermined vertical extension, further including the steps of calculating the number of consecutive pixels having intensity values exceeding a predetermined intensity value and calculating the distance from the reference position to the reflector in dependence of said number of consecutive pixels.
4. A method as claimed in claim 1, further including the steps of obtaining the vertical image slices by a discrete pixel camera and forming the reflectors as parallel trapezoids to reduce an aliasing effect occurring in such cameras.
5. A method as claimed in claim 4, further including the step of calibrating for the image distortion introduced by the camera lens.
6. A method as claimed in claim 1, wherein an image is obtained by a camera producing an analog video signal, further including the step of extracting line and field synchronising signals and intensity values from the video signal and feeding the line synchronising signal to an image processing means for associating an intensity value to a specific vertical slice in dependence on a time period expired from the last occurrence of the line synchronising signal.
7. A method for determining the distance between a reflector and a vehicle reference position including the step of arranging vertically extending reflectors in a working area, characterised by providing said reflectors with a predetermined vertical extension, obtaining from the vehicle reference position vertical image slices of at least one horizontal segment of the working area, each image slice comprising a plurality of pixels, calculating the number of consecutive pixels having intensity values exceeding a predetermined intensity value, and calculating the distance from the reference position to the reflector in dependence of said number of consecutive pixels.
8. A device for determining an angular position of a vertically extending reflector arranged in a working area from a vehicle reference position in relation to an angular reference position, characterised in that image sensing means are provided to obtain from the vehicle reference position vertical image slices of at least one horizontal segment of the working area, each image slice comprising a plurality of pixels, that an adder is provided for adding intensity values of pixels in vertical image slices into a set of column sums, one position in the set forming an angular reference point, and that computing means are provided for determining the angular position of a reflector as the position of a peak column sum or a subset of adjacent peak column sums in the said set in relation to the angular reference point.
9. A device as claimed in claim 8, wherein said image sensing means is a CCD camera.
10. An automated guided vehicle including a device for determining an angular position of a vertically extending reflector arranged in a working area from a vehicle reference position in relation to an angular reference position, characterised in that image sensing means are provided on the vehicle to obtain from the vehicle reference position vertical image slices of at least one horizontal segment of the working area, each image slice comprising a plurality of pixels, that an adder is provided on the vehicle for adding intensity values of pixels in vertical image slices into a set of column sums, one position in the set forming an angular reference point, and that computing means are provided on the vehicle for determining the angular position of a reflector as the position of a peak column sum or a subset of adjacent peak column sums in the said set in relation to the angular reference point.
PCT/SE2002/000544 2001-03-20 2002-03-20 Method and device for determining an angular position of a reflector WO2002075350A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
SE0100954-7 2001-03-20
SE0100954A SE523318C2 (en) 2001-03-20 2001-03-20 Camera-based distance and angle gauges

Publications (1)

Publication Number Publication Date
WO2002075350A1 true WO2002075350A1 (en) 2002-09-26

Family

ID=20283428

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2002/000544 WO2002075350A1 (en) 2001-03-20 2002-03-20 Method and device for determining an angular position of a reflector

Country Status (2)

Country Link
SE (1) SE523318C2 (en)
WO (1) WO2002075350A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4947094A (en) * 1987-07-23 1990-08-07 Battelle Memorial Institute Optical guidance system for industrial vehicles
US5051906A (en) * 1989-06-07 1991-09-24 Transitions Research Corporation Mobile robot navigation employing retroreflective ceiling features
JP2000161918A (en) * 1998-12-01 2000-06-16 Tsubakimoto Chain Co Method and device for detecting position of moving body

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PATENT ABSTRACTS OF JAPAN vol. 2000, no. 9, 16 June 2000 (2000-06-16) *

Cited By (80)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8412377B2 (en) 2000-01-24 2013-04-02 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US8761935B2 (en) 2000-01-24 2014-06-24 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US8565920B2 (en) 2000-01-24 2013-10-22 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US9446521B2 (en) 2000-01-24 2016-09-20 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US8478442B2 (en) 2000-01-24 2013-07-02 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US8788092B2 (en) 2000-01-24 2014-07-22 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US9144361B2 (en) 2000-04-04 2015-09-29 Irobot Corporation Debris sensor for cleaning apparatus
US8686679B2 (en) 2001-01-24 2014-04-01 Irobot Corporation Robot confinement
US9038233B2 (en) 2001-01-24 2015-05-26 Irobot Corporation Autonomous floor-cleaning robot
US8368339B2 (en) 2001-01-24 2013-02-05 Irobot Corporation Robot confinement
US9622635B2 (en) 2001-01-24 2017-04-18 Irobot Corporation Autonomous floor-cleaning robot
US9582005B2 (en) 2001-01-24 2017-02-28 Irobot Corporation Robot confinement
US8463438B2 (en) 2001-06-12 2013-06-11 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US9104204B2 (en) 2001-06-12 2015-08-11 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US8396592B2 (en) 2001-06-12 2013-03-12 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US8838274B2 (en) 2001-06-12 2014-09-16 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US9128486B2 (en) 2002-01-24 2015-09-08 Irobot Corporation Navigational control system for a robotic device
US8781626B2 (en) 2002-09-13 2014-07-15 Irobot Corporation Navigational control system for a robotic device
US9949608B2 (en) 2002-09-13 2018-04-24 Irobot Corporation Navigational control system for a robotic device
US8515578B2 (en) 2002-09-13 2013-08-20 Irobot Corporation Navigational control system for a robotic device
US8793020B2 (en) 2002-09-13 2014-07-29 Irobot Corporation Navigational control system for a robotic device
WO2005015256A1 (en) * 2003-08-06 2005-02-17 Siemens Aktiengesellschaft Determination of the position of goods in transit by the combination of local, absolute position measuring and relative position measuring
WO2005033628A2 (en) * 2003-09-23 2005-04-14 Snap-On Technologies, Inc. Invisible target illuminators for 3d camera-based alignment systems
WO2005033628A3 (en) * 2003-09-23 2008-01-17 Eric F Bryan Invisible target illuminators for 3d camera-based alignment systems
US8390251B2 (en) 2004-01-21 2013-03-05 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US9215957B2 (en) 2004-01-21 2015-12-22 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US8253368B2 (en) 2004-01-28 2012-08-28 Irobot Corporation Debris sensor for cleaning apparatus
US8598829B2 (en) 2004-01-28 2013-12-03 Irobot Corporation Debris sensor for cleaning apparatus
US8456125B2 (en) 2004-01-28 2013-06-04 Irobot Corporation Debris sensor for cleaning apparatus
US8378613B2 (en) 2004-01-28 2013-02-19 Irobot Corporation Debris sensor for cleaning apparatus
US8780342B2 (en) 2004-03-29 2014-07-15 Irobot Corporation Methods and apparatus for position estimation using reflected light sources
US9360300B2 (en) 2004-03-29 2016-06-07 Irobot Corporation Methods and apparatus for position estimation using reflected light sources
US9008835B2 (en) 2004-06-24 2015-04-14 Irobot Corporation Remote control scheduler and method for autonomous robotic device
US9486924B2 (en) 2004-06-24 2016-11-08 Irobot Corporation Remote control scheduler and method for autonomous robotic device
US9223749B2 (en) 2004-07-07 2015-12-29 Irobot Corporation Celestial navigation system for an autonomous vehicle
US8874264B1 (en) 2004-07-07 2014-10-28 Irobot Corporation Celestial navigation system for an autonomous robot
US7706917B1 (en) 2004-07-07 2010-04-27 Irobot Corporation Celestial navigation system for an autonomous robot
US9229454B1 (en) 2004-07-07 2016-01-05 Irobot Corporation Autonomous mobile robot system
US8972052B2 (en) 2004-07-07 2015-03-03 Irobot Corporation Celestial navigation system for an autonomous vehicle
US9445702B2 (en) 2005-02-18 2016-09-20 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US8774966B2 (en) 2005-02-18 2014-07-08 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US8782848B2 (en) 2005-02-18 2014-07-22 Irobot Corporation Autonomous surface cleaning robot for dry cleaning
US8985127B2 (en) 2005-02-18 2015-03-24 Irobot Corporation Autonomous surface cleaning robot for wet cleaning
US8387193B2 (en) 2005-02-18 2013-03-05 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US8739355B2 (en) 2005-02-18 2014-06-03 Irobot Corporation Autonomous surface cleaning robot for dry cleaning
US8855813B2 (en) 2005-02-18 2014-10-07 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US8392021B2 (en) 2005-02-18 2013-03-05 Irobot Corporation Autonomous surface cleaning robot for wet cleaning
US8670866B2 (en) 2005-02-18 2014-03-11 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US8382906B2 (en) 2005-02-18 2013-02-26 Irobot Corporation Autonomous surface cleaning robot for wet cleaning
US8966707B2 (en) 2005-02-18 2015-03-03 Irobot Corporation Autonomous surface cleaning robot for dry cleaning
US8584305B2 (en) 2005-12-02 2013-11-19 Irobot Corporation Modular robot
US8978196B2 (en) 2005-12-02 2015-03-17 Irobot Corporation Coverage robot mobility
US8954192B2 (en) 2005-12-02 2015-02-10 Irobot Corporation Navigating autonomous coverage robots
US8374721B2 (en) 2005-12-02 2013-02-12 Irobot Corporation Robot system
US9599990B2 (en) 2005-12-02 2017-03-21 Irobot Corporation Robot system
US8761931B2 (en) 2005-12-02 2014-06-24 Irobot Corporation Robot system
US8380350B2 (en) 2005-12-02 2013-02-19 Irobot Corporation Autonomous coverage robot navigation system
US9144360B2 (en) 2005-12-02 2015-09-29 Irobot Corporation Autonomous coverage robot navigation system
US8606401B2 (en) 2005-12-02 2013-12-10 Irobot Corporation Autonomous coverage robot navigation system
US9392920B2 (en) 2005-12-02 2016-07-19 Irobot Corporation Robot system
US8600553B2 (en) 2005-12-02 2013-12-03 Irobot Corporation Coverage robot mobility
US8661605B2 (en) 2005-12-02 2014-03-04 Irobot Corporation Coverage robot mobility
US9149170B2 (en) 2005-12-02 2015-10-06 Irobot Corporation Navigating autonomous coverage robots
US9492048B2 (en) 2006-05-19 2016-11-15 Irobot Corporation Removing debris from cleaning robots
US10244915B2 (en) 2006-05-19 2019-04-02 Irobot Corporation Coverage robots and associated cleaning bins
US8528157B2 (en) 2006-05-19 2013-09-10 Irobot Corporation Coverage robots and associated cleaning bins
US9955841B2 (en) 2006-05-19 2018-05-01 Irobot Corporation Removing debris from cleaning robots
US8418303B2 (en) 2006-05-19 2013-04-16 Irobot Corporation Cleaning robot roller processing
US8572799B2 (en) 2006-05-19 2013-11-05 Irobot Corporation Removing debris from cleaning robots
US8417383B2 (en) 2006-05-31 2013-04-09 Irobot Corporation Detecting robot stasis
US9317038B2 (en) 2006-05-31 2016-04-19 Irobot Corporation Detecting robot stasis
US9480381B2 (en) 2007-05-09 2016-11-01 Irobot Corporation Compact autonomous coverage robot
US8839477B2 (en) 2007-05-09 2014-09-23 Irobot Corporation Compact autonomous coverage robot
US8726454B2 (en) 2007-05-09 2014-05-20 Irobot Corporation Autonomous coverage robot
US8239992B2 (en) 2007-05-09 2012-08-14 Irobot Corporation Compact autonomous coverage robot
US10070764B2 (en) 2007-05-09 2018-09-11 Irobot Corporation Compact autonomous coverage robot
US8438695B2 (en) 2007-05-09 2013-05-14 Irobot Corporation Autonomous coverage robot sensing
US10299652B2 (en) 2007-05-09 2019-05-28 Irobot Corporation Autonomous coverage robot
US8930023B2 (en) 2009-11-06 2015-01-06 Irobot Corporation Localization by learning of wave-signal distributions
US10314449B2 (en) 2010-02-16 2019-06-11 Irobot Corporation Vacuum brush

Also Published As

Publication number Publication date
SE0100954D0 (en) 2001-03-20
SE523318C2 (en) 2004-04-13
SE0100954L (en) 2002-11-08

Similar Documents

Publication Publication Date Title
EP0482604B1 (en) Distance detecting apparatus for a vehicle
JP4391624B2 (en) Object recognition device
JP3756452B2 (en) Infrared image processing apparatus
US9686532B2 (en) System and method of acquiring three-dimensional coordinates using multiple coordinate measurement devices
US4630109A (en) Vehicle tracking system
KR100435650B1 (en) Detection method of road condition in a vehicle equipped with a camera, and method for detecting distance between vehicles in the same vehicle
US7454054B2 (en) Three-dimensional shape input device
US5586063A (en) Optical range and speed detection system
US6828903B2 (en) Method and apparatus for detecting position of object present in a surrounding detection zone of automotive vehicle
JP3868876B2 (en) Obstacle detecting apparatus and method
KR100521119B1 (en) Obstacle detecting apparatus for vehicle
EP1291668B1 (en) Vehicle surroundings display device and image providing system
US20020118874A1 (en) Apparatus and method for taking dimensions of 3D object
EP1057141B1 (en) Road profile detection
EP1005234B1 (en) Three-dimensional scope system for vehicles with a single camera
Lindner et al. Lateral and depth calibration of PMD-distance sensors
JP3995846B2 (en) Object recognition device
KR100257592B1 (en) Lane detection sensor and navigation system employing the same
US5929784A (en) Device for determining distance between vehicles
US5410346A (en) System for monitoring condition outside vehicle using imaged picture by a plurality of television cameras
JP2012253758A (en) Method of calibrating vehicle vision system and vehicle vision system
JP3983573B2 (en) Stereo image characteristics inspection system
US7812969B2 (en) Three-dimensional shape measuring apparatus
EP1508876A2 (en) Image projection method and device
US5910817A (en) Object observing method and device

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 EP: The EPO has been informed by WIPO that EP was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 EP: PCT application non-entry into the European phase
NENP Non-entry into the national phase in:

Ref country code: JP

WWW WIPO information: withdrawn in national office

Country of ref document: JP