WO1987002484A1 - Target and control system for positioning an automatically guided vehicle


Info

Publication number
WO1987002484A1
Authority
WO
WIPO (PCT)
Prior art keywords
images
light source
reflector elements
target member
camera
Prior art date
Application number
PCT/US1986/002151
Other languages
English (en)
Inventor
Harry B. Hammill, III
Frank J. Lukowski, Jr.
Original Assignee
Caterpillar Inc.
Priority date
Filing date
Publication date
Priority claimed from US06/789,280 (US4678329A)
Priority claimed from US06/788,989 (US4684247A)
Application filed by Caterpillar Inc.
Priority to KR870700519A (KR880700512A)
Publication of WO1987002484A1


Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0225 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving docking at a fixed facility, e.g. base station or loading bay
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66F - HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F9/00 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
    • B66F9/06 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
    • B66F9/075 - Constructional features or details
    • B66F9/0755 - Position control; Position detectors
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 - Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 - Systems determining position data of a target
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87 - Combinations of systems using electromagnetic waves other than radio waves
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 - Lidar systems specially adapted for specific applications
    • G01S17/89 - Lidar systems specially adapted for specific applications for mapping or imaging
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0234 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 - Program-control systems
    • G05B2219/30 - Nc systems
    • G05B2219/37 - Measurements
    • G05B2219/37555 - Camera detects orientation, position workpiece, points of workpiece
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 - Program-control systems
    • G05B2219/30 - Nc systems
    • G05B2219/45 - Nc applications
    • G05B2219/45049 - Forklift

Definitions

  • This invention relates to method and apparatus for determining the position and orientation of a target with respect to a sensor.
  • Both the target and the sensing unit may be affixed to movable objects, although generally the target will be stationary, and the sensing unit will be attached to a self-propelled vehicle.
  • the vehicle carries a sensing unit, the output of which controls some function of the vehicle, such as its forward motion, steering control or the vertical location of the forks.
  • the target might be the pallet itself or the frame or rack on which the pallet is supported.
  • Some of the prior art devices employ specialized marks whose dimensions are known; others utilize special light projectors and sense the light reflected from the target for positioning a vehicle, or a component of the vehicle, such as the forks of a lift truck.
  • This invention is directed to a target member for use in a positioning system for providing at least three identifiable images positioned with respect to a sensor carried by a self-propelled vehicle. These three images are located so as to provide an unambiguous frame of reference thereby to allow the determination of all six positional and orientational degrees of freedom from a single observation of the target by the sensor.
  • the target includes at least three reflector elements mounted on a support member.
  • the target and the reflector elements provide a thin, essentially flat surface while at the same time providing to the vehicle mounted sensor the appearance of a target having considerable depth.
  • the mirrors are selected so that the images of a light source carried by the vehicle define a plane and a circle.
  • the plane is not normal to the sensor viewing axis, and the circle does not include the sensor.
  • a light source was chosen as the identifying means so that it could be readily detected by commercially available sensing devices, such as a small television camera.
  • the reflector elements have at least two different radii of curvature.
  • two of the reflectors may be convex and have the same radius of curvature, and the third reflector may be concave.
  • the radius of curvature and diameter of each are selected so that the reflection of the light source may be viewed by a sensor when the vehicle is within a predetermined field of view with respect to the target. It is necessary, of course, to allow the vehicle to approach the target from some distance, and to identify the location of the target from some acceptable viewing angle.
  • the target member may also include retroreflector members which provide a brilliant and rather large reflection whenever the light source is flashed. The positions of the reflections from the retroreflector members are used to determine the area on the sensor's image plane where the desired reflections of the light source from the curved reflector elements are to be found.
  • the target member may also include coded means for identifying the specific target, such as a bar code that may be scanned to confirm that the target within the field of view of the sensor is the one to be engaged by the vehicle.
  • a guidance system may be programmed to maneuver the vehicle into proper position with respect to the target.
  • the identifying means is a light source, and specifically a xenon strobe lamp for providing a short duration, high intensity pulse of light directed away from the front of the vehicle along the axis of the camera and back to the sensor via the mirrors.
  • Ideally, the camera and the light source would be collocated. It is possible to use half-silvered mirrors so that the center of the light source falls upon the axis of the lens of the camera.
  • a practical embodiment of the invention places the light source immediately above the camera. It has been found that this slight offsetting does not appreciably affect the accuracy of the measurements taken.
  • an object of this invention to provide a target member for use in a positioning system that employs at least three reflector elements with each of the reflector elements being so configured to form images of an identifying means in a plane that is oriented other than normal to a line from the identifying means to the plane.
  • Fig. 1 is a perspective view showing a rack capable of supporting a plurality of pallets, and an automatically guided vehicle carrying a light source and a camera for sensing the reflections of the light source in a reflector member attached to a selected pallet;
  • Fig. 2 is a perspective view showing the camera and light source located below and to the right of a line passing perpendicular to and through the center of the target or reflector member;
  • Fig. 3A is a view showing the reflections of the light source in the reflector elements.
  • Fig. 3B is a view showing the reflections from the mirrors on the target as they appear on the image plane of the camera;
  • Fig. 4 is a perspective view showing the camera and light source mounted on the forklift vehicle
  • Fig. 5 illustrates a target or reflector member with attached retroreflector members, spherical reflector elements, and a bar code
  • Fig. 6 is a plan view showing various locations of an automatically guided vehicle, such as a fork lift, with respect to a pallet;
  • Figs. 7A-7D represent the reflections of the light source in the reflector elements at the various locations of the vehicle with respect to the pallet, as shown in Fig. 6;
  • Figs. 8A-8D represent the images of the reflections shown in Figs. 7A-7D as they appear on the image plane of the vehicle carried camera;
  • Figs. 9A-9D represent the electrical signals on the image plane of the sensor at a single location of the vehicle.
  • Fig. 9A shows the signals due to the images of the retroreflectors as a result of the first flash of the light source.
  • Fig. 9B shows the signals resulting from ambient light.
  • Fig. 9C shows the signals when the light source is flashed a second time.
  • Fig. 9D represents the electrical signals that remain after processing;
  • Fig. 10 is a vector diagram illustrating the directional relationship between a first image point P and the final image point P'. The illustrated vector is located at the nodal point of the sensor lens;
  • Fig. 11 is a block diagram of the video processing circuit used to identify and locate images of the retroreflectors and other reflector elements; and
  • Figs. 12A-12C are timing diagrams showing the various signals that occur at various times during the operation of this invention.
  • a storage rack 10 is shown supporting an object such as a pallet 15.
  • the pallet 15 is provided with a target member 20.
  • a wooden pallet 15 is shown, and the target member 20 is illustrated as being a separate component attached to the pallet. It should be understood, however, that any type of pallet structure may be used, and the target member 20 may either be a separate unit or it may be formed integral with the pallet itself.
  • a vehicle 30, such as a forklift truck carries the identification means 35 (Figs. 2 and 4), such as a high intensity light source, and an imaging sensing means 40, which is preferably a miniature TV camera, such as a Sony XC-37 CCD camera.
  • the light source and camera are preferably mounted together as a unit 45, with the light source 35 immediately adjacent and above the camera lens (Fig. 4). In the preferred embodiment, the vertical distance separating the light source and camera lens is approximately one inch.
  • the target member 20 is shown in more detail in Fig. 5, and it includes a generally flat support member 50, three reflector elements 52, 53, and 54, and three retroreflector members 62, 63, and 64.
  • a bar code 71 for uniquely identifying the pallet may also be printed on or attached to the support member.
  • the light source and camera unit 45 is preferably aligned with the direction of travel of the vehicle. It is possible, however, rotatably to mount the camera on the vehicle so that it may scan through a large field of view, both horizontally and vertically. If this were done, however, the camera would be associated with a drive unit and position indicating device so that the proper corrections would be considered when calculating the relative location of the target.
  • the preferred embodiment of the invention employs two convex reflector elements 52 and 54, and one concave reflector element 53.
  • Each of the reflector elements is spherical in shape, and all are horizontally arranged on the support member 50.
  • Both reflector elements 52 and 54 have the same radius of curvature, and the radii of curvature of all of the elements and their diameters are selected to provide a reasonable field of view A (Fig. 6) such that a reflection of the identifying means 35 will be viewable by the camera as long as the vehicle is within the field of view. In the embodiment described, it is preferred to have a minimum field of view of ±10° from the mirror plane normal. Typical mirrors may be approximately 1.5 inches in diameter and have a radius of curvature of 3 inches or greater.
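  The ±10° requirement can be sanity-checked from the mirror dimensions given above. The sketch below uses a simple spherical-mirror model (the surface normal tilts by up to asin((d/2)/R) across the face, and a reflected ray deviates by twice the surface tilt); the function name and the model itself are illustrative assumptions, not the patent's own derivation.

```python
import math

def mirror_half_field_deg(diameter: float, radius_of_curvature: float) -> float:
    """Rough half-angle (degrees) over which a convex spherical mirror can
    return a co-located source's reflection toward the camera: the surface
    normal tilts by up to asin((d/2)/R) across the face, and a reflected
    ray deviates by twice the surface tilt."""
    tilt = math.asin((diameter / 2.0) / radius_of_curvature)
    return math.degrees(2.0 * tilt)

# Dimensions from the text: ~1.5 inch diameter, radius of curvature >= 3 inches.
half_field = mirror_half_field_deg(1.5, 3.0)
assert half_field > 10.0  # comfortably exceeds the +/-10 degree minimum
```

  A larger radius of curvature shrinks this angle, which is why the text couples the radius and diameter choices to the required approach envelope.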
  • Mirrors 52 and 54 may be type 380 convex mirrors, and mirror 53 may be a type 100 concave mirror, both manufactured by ROLYN. It should be emphasized that the identifying means 35, while preferably a brilliant light source, could also be any means that could be detected by sensing means 40. A brilliant xenon flash lamp has been found effective for this purpose.
  • the light/camera unit 45 is shown positioned below and to the right of the center line 60 passing through the target 20, but within the field of view A. Under these conditions, the reflections of the identifying means or light source 35 appear as images P_A, P_O, P_B in mirrors 52, 53 and 54, respectively, as shown in Fig. 3A. Since the mirrors are curved surfaces in the embodiment shown, the images P_A and P_B will appear toward the lower right portion of mirrors 52 and 54, and the image P_O will be toward the upper left in mirror 53.
  • The images P_A', P_O', P_B' of the identifying means would be grouped on the image plane 70 of the camera towards its upper left hand corner, as shown in Fig. 3B. (It will be assumed for the following illustration that the images formed on the image plane are not inverted or reversed.)
  • the absolute location of the reflections, the spacing between the reflections, and the relative position of all the reflections will provide information sufficient to determine from a single observation the location of the vehicle with respect to the pallet and the orientation or rotation of the pallet. As the location of the vehicle changes, the observed position of the identifying means or reflections on the image plane of the camera will also change, as will be explained.
  • the images P_A, P_O and P_B of the identifying means 35 in the reflector elements 52-54 will define a plane 80, and these images will also define a circle 82.
  • the plane 80 is not normal or perpendicular to the center line 60 of the target 20; it is in fact essentially parallel to the upper surface of the pallet.
  • the circle 82 will not include the lens of the camera 40.
  • FIGs. 9A-9D represent the images appearing on the image plane of the camera 40 during one sequence of operations necessary to gather positional information.
  • the preferred method of this invention provides for flashing the light source 35 and recording the positions of the reflections from the retroreflector members 62, 63, and 64.
  • the circuit for accomplishing this is shown in Fig. 11.
  • These reflections are identified in Fig. 9A as reflections 162, 163, and 164, respectively.
  • These reflections are easily identified because they each occupy a plurality of pixels on the image plane 70 of the camera since they are physically large components of the target 20 and since the retroreflectors return a large percentage of the light emitted by the identifying means 35 back toward the source. For this reason, the effective sensitivity of the camera is reduced at this stage of the operation so that only the reflections of the retroreflectors are likely to be found at the image plane. Also, because of the high intensity of the reflected light, there may be some blooming of the image.
  • the positions of each of the retroreflector images is recorded in memory means.
  • Microprocessor means 310 performs a calculation by reference to the positions of the retroreflector images 162-164, and an area 200 is defined in which the reflections from the reflector elements 52-54 are likely to be found.
  • This defined area 200 may be located anywhere on the image plane of the camera and will vary in area in proportion to the separation of the vehicle from the target. In other words, the closer the vehicle is to the target, the more widely separated will be the images, and the defined area will consequently be larger.
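  As a sketch of how such a search window might be derived from the three retroreflector image centroids (the function name and the margin rule are illustrative assumptions; the patent does not give the exact calculation):

```python
def defined_area(retro_centroids, margin=0.5):
    """Bounding box around the retroreflector image centroids, expanded by
    `margin` (a fraction of the box size) on every side. The mirror
    reflections are then searched only inside this box; since the image
    spacing grows as the vehicle nears the target, the box grows too."""
    xs = [c[0] for c in retro_centroids]
    ys = [c[1] for c in retro_centroids]
    w, h = max(xs) - min(xs), max(ys) - min(ys)
    return (min(xs) - margin * w, min(ys) - margin * h,
            max(xs) + margin * w, max(ys) + margin * h)
```

  Any rule with this shape reproduces the behavior the text describes: a box that floats anywhere on the image plane and scales with target separation.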
  • Other images are those resulting from ambient light and may include such reflections 165 as overhead lights, specular reflections from metal objects within the field of view, and other light sources.
  • the next step is to reduce the effective sensitivity of the camera and again flash the light source 35.
  • the image plane will contain the images of the retroreflectors 162-164, the ambient reflections 165, and also reflections P_A, P_O and P_B of the light source in each of the reflector elements 52-54. All of the images within the defined area 200 are recorded. Any images from the retroreflectors that bloom into the defined area are removed from memory, and the images recorded in Fig. 9B are effectively subtracted from those in Fig. 9C; what remains are the images P_A', P_O' and P_B', reflections of the light source in the reflector elements 52-54, as shown in Fig. 9D. The center of each of these images is calculated and the video signals from the camera image plane are evaluated in accordance with the procedure later defined.
  • As shown in Fig. 5, the position of each retroreflector should be known so that an area in which the reflector elements are positioned can be defined. Also, the positions of the retroreflector elements further define a second area 210 in which the image 170 of the bar code 71 may be found, and at some appropriate time during the analysis of the image, the bar code may be read to confirm that the proper target is being approached.
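  The two-flash sequence described above (record ambient, flash, subtract) can be sketched with arrays standing in for the image plane; the function name, the array representation, and the threshold handling are assumptions for illustration:

```python
import numpy as np

def light_source_images(flash_frame, ambient_frame, area, threshold):
    """Boolean mask of pixels inside `area` that exceed `threshold` in the
    flashed frame but not in the ambient-only frame -- the electronic
    analogue of subtracting the Fig. 9B images from the Fig. 9C images,
    leaving only reflections of the light source."""
    x0, y0, x1, y1 = area
    lit = flash_frame[y0:y1, x0:x1] > threshold
    ambient = ambient_frame[y0:y1, x0:x1] > threshold
    return lit & ~ambient
```

  With a bright pixel present in both frames (ambient) and one present only in the flashed frame (a mirror reflection), only the latter survives the subtraction.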
  • the reflector elements 52-54 do not all have to be spherical in shape. All that is necessary is that the images of the identifying means carried by the vehicle be viewable by the camera. This means that one or more of the reflector elements could be a retroreflector. Using spherical mirrors, however, reduces the cost of the target and also provides relatively smaller images, images whose position can therefore be determined with a high degree of accuracy.
  • the mirrors are evenly spaced and are horizontally aligned, and the center line of the target, that is, the center mirror, is the desired final position of the light/camera unit 45. It should be recognized, however, that any orientation of the mirrors and any position of the camera unit with respect to the target would be acceptable and would not depart from this invention. All that the control circuit would need is information regarding the final desired position of each of the reflections on the image plane. By convention, and for purposes of explanation, it will be assumed that the desired final position will be on the center line with the images equally spaced on the image plane and that the images are in a horizontal line. The area between the retroreflectors is provided with a dark background in order to minimize random noise and false data.
  • Figs. 6, 7A-7D and 8A-8D it is assumed that the light/camera unit 45 is in the same horizontal plane as the target, and the vehicle 30 is positioned to the right of the center line 60 in location 1.
  • the reflections of the light source are shown in Fig. 7A
  • the images of the reflections on the image plane 70 of the camera 40 are shown in Fig. 8A and would be located in the left center of the image plane.
  • the images are close together and unequally spaced.
  • the location of the images on the image plane and their relative spacing are all important to the calculations for determining the vehicle's relative location with respect to the target.
  • both the light source 35 and the camera 40 are connected to electronic control and processing circuits 300.
  • a microprocessor system 310 which provides control signals to control the flow of data to the remainder of the circuit.
  • a timing logic circuit shown generally at 320, responds to instructions from the microprocessor system 310 to control the light source or flash 35, the video information from the camera 40, and the way that information is processed and stored in the remainder of the circuit.
  • Video RAM 330 provides the means for recording the images on the image plane of the video camera 40.
  • a multiplexer 340 operating under control of the microprocessor 310 and timing logic circuit 320, transfers video information into the video RAM 330 through a serial-to-parallel converter 350, and out of the video RAM 330 through a parallel-to-serial converter 360.
  • a digital-to-analog (D/A) converter 370 responds to a digital signal from the microprocessor to establish a threshold level for the output of the video camera unit, and that threshold level determines what video information passes from the camera 40 through a comparator circuit 380 into a selection circuit 390, which circuit includes a first AND gate 392, a second AND gate 394, and an exclusive OR gate 396.
  • each interval between vertical blanking pulses 302 represents one-half frame.
  • the interval designated 401 includes all of the odd numbered lines of one frame or screen, while the interval 402 represents all of the even numbered lines.
  • the intervals 401 and 402 together comprise one complete frame.
  • a power up signal is provided by the microprocessor system on line 316 to the timing logic 320 when the system is first turned on to synchronize the microprocessor system with the timing logic circuit.
  • the timing logic circuit sends a reset pulse on line 321 to the video camera unit to initialize this device.
  • the camera 40 provides a video output signal to the comparator circuit 380, and part of this output is a pulse 302 representing the vertical blanking interval. During each vertical blanking interval, the timing logic circuit provides a vertical sync pulse back to the microprocessor on line 324.
  • the microprocessor system 310 controls the sequencing of operation of the entire system.
  • the microprocessor establishes an initial threshold level for the camera by sending a digital value on the microprocessor bus 315 to the D/A converter 370.
  • This threshold level shown in Fig. 12A, as the dashed line 372, limits those signals that may pass through the comparator circuit 380.
  • This initial level is set high enough that all reflections except those from the retroreflectors within the field of view of the camera will be ignored and will not be passed on to the selection circuit 390.
  • the microprocessor 310 sends a flash enable signal 311 on line 312 to timing logic circuit 320.
  • This signal extends through the vertical blanking interval 302.
  • the microprocessor generates a strobe signal on line 313, and the timing logic circuit 320 in response thereto sends a flash trigger pulse 322 on line 323 at the beginning of interval 401 to the light source or strobe 35.
  • the light source 35 is a high intensity xenon strobe which floods the area in front of the camera with a short duration pulse of high intensity light.
  • the threshold level 372 during intervals 401 and 402 is set high enough that only the video signals exceeding the predetermined threshold value, such as those representing the reflections 162-164 returned by the retroreflectors 62-64, will be allowed to pass through the comparator 380. Although the duration of the flash may be measured in microseconds, the light energy of the reflections therefrom will be retained on the camera image plane for one complete frame.
  • the video camera unit 40 provides an output on line 326 from the internal camera clock, basically a 3.58 MHz series of pulses, to the timing logic circuit 320. These clock pulses are converted by the timing logic circuit into pixel clock pulses on line 327, with each pixel clock pulse representing a single pixel as it appears on the camera image plane.
  • As shown in the timing diagrams, pixel clock pulses are provided to the serial-to-parallel converter 350 and the parallel-to-serial converter 360 for two intervals after a start pulse.
  • The camera, a Sony XC-37 CCD camera, has an image plane providing an array of 384 x 491 pixels. Half of those pixels will be interrogated during the first interval and the other half during the second interval. Thus, each pixel on the camera image plane is separately and uniquely identified, and it can be determined whether or not the output from each pixel exceeds the predetermined threshold level.
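  One way to picture the per-pixel addressing is to map (field, line-within-field, pixel-clock count) to absolute coordinates in the 384 x 491 array. The interlace convention below (field 0 on even rows) is an assumption for illustration; the text only states that the two fields cover the odd and even lines:

```python
def pixel_coordinates(field: int, line_in_field: int, pixel_count: int):
    """Absolute (row, col) of a pixel identified by its interlaced field
    (0 or 1), its line within that field, and the pixel-clock count within
    the line, for a 384-wide by 491-high image plane."""
    row = 2 * line_in_field + field
    col = pixel_count
    if not (0 <= col < 384 and 0 <= row < 491):
        raise ValueError("pixel address outside the 384 x 491 array")
    return row, col
```

  Whatever convention the hardware actually uses, the point is the same: counting pixel clocks within each field uniquely identifies every pixel on the image plane.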
  • the output of the comparator 380 is applied on line 381 to selection circuit 390, and to both the AND gate 392 and exclusive OR gate 396. During intervals 401 and 402, the exclusive OR enable line 318 is low and therefore the video information from the comparator 380 will be passed directly on line 391 into the serial-to-parallel converter 350.
  • Since the target 20 typically includes three retroreflectors in the preferred embodiment, it is expected that only three intense returns, or images 162, 163 and 164, will exceed the threshold 372, and therefore these images will be processed through the serial-to-parallel converter 350 and sent on the video RAM data bus 355, through the multiplexer 340 under control of signals provided by timing circuit 320 on bus 328, and into the video RAM 330 via bus 335 where they will be stored or recorded in electronic form.
  • Because the images of the retroreflectors are so large, it is possible to speed up the process by scaling the pixel clock and limiting the video data stored in the RAM 330 by storing every fourth pixel of every fourth line of each frame and still detect the presence of the retroreflectors.
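  The "every fourth pixel of every fourth line" coarse scan is a simple subsampling. The sketch below assumes the 384 x 491 sensor mentioned earlier; the blob size is made up but matches the text's premise that a retroreflector image spans many pixels:

```python
import numpy as np

def coarse_scan(frame):
    """Keep every fourth pixel of every fourth line of a frame. A
    retroreflector image spans many pixels, so this 16x data reduction
    still cannot miss it."""
    return frame[::4, ::4]

frame = np.zeros((491, 384), dtype=np.uint8)   # rows x cols of the sensor
frame[100:110, 200:212] = 255                  # a bright blob ~10 x 12 pixels
assert coarse_scan(frame).max() == 255         # the blob survives subsampling
```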
  • the threshold level is set by the microprocessor to that shown at 373 so that low level images 165 and 166, which exceed the threshold level during intervals 403 and 404, may pass through the comparator 380.
  • the selection circuit 390 remains inactive at this time, so those images will be recorded in the video RAM 330 in accordance with the process described above.
  • Fig. 12C illustrates the next step in the process.
  • the microprocessor 310 increases the threshold level slightly to that shown at 374. This will permit most of the images due to ambient light to pass through the comparator 380, but will eliminate some that might have been marginal, such as image 166. This will also tend to eliminate noise from the camera itself and slight changes in size of the image due to camera movement.
  • the flash is once again enabled and it is triggered a second time at the beginning of the next interval.
  • the selection circuit 390 provides the means for comparing the images due to ambient light temporarily stored in the recording means or memory 330 with the images resulting from both ambient light and from the light source, and for thereafter recording only those images due to reflections of the light source in the video RAM 330.
  • Since the location of the retroreflector images is known and was determined during intervals 401 and 402, the location of the reflections of the light source in the reflector elements with respect to each other and with respect to the image plane of the camera can now be accurately determined by analyzing the data stored in RAM 330, and the location of the vehicle in relation to the target calculated.
  • One of the target points, the center one, P_O, is chosen as a target reference point or origin.
  • the other two points, P_A and P_B, have 3-dimensional vector offsets from P_O. These offset vectors are identified as a and b in the target coordinate system.
  • TARGET POINTS AS VIEWED BY SENSOR. Consider an axis system fixed with respect to the camera.
  • the specific axis system chosen is the right-handed system shown in Fig. 2, with x representing horizontal, positive to the right; y representing horizontal, positive along forward lens axis; and z representing vertical, positive up.
  • P_O will be at some vector location, R.
  • the image points will correspond to three points which, from the viewpoint of the sensor, are at locations R, R + α, and R + β.
  • the vectors R, α and β, and the rotation matrix M, are initially unknown.
  • DIRECTION VECTORS u, v, w.
  • the direction of any single source or target point can be established with a camera system, but not distance.
  • the direction can be defined by a vector from image point to lens center (thin lens) or second nodal point (thick lens). See Fig. 10.
  • the direction vectors corresponding to P_O, P_A, and P_B are called u, v, w, respectively. These vectors could be chosen as unit vectors (generally convenient for analysis). A more convenient and natural normalization is to scale these vectors so that the lens-axis or y-component equals the focal length. The x and z components are then simply the horizontal (ξ) and vertical (η) components of location in the focal plane (with due regard for signs and "image reversal"). BASELINE VECTOR EQUATIONS. In terms of the known direction vectors u, v, w, the basic vector equations become λ_O u = R (1), λ_A v = R + α (2), and λ_B w = R + β (3).
  • λ_O, λ_A, and λ_B are (unknown) scalars proportional to distance.
  • Equations (1) through (3) are underdetermined. There are 12 unknowns (three components each for the R, α, and β vectors, plus the three scalar λ's), and nine scalar equations. The "missing" three equations come from scalar invariance relationships.
  • SCALAR INVARIANCE RELATIONSHIPS
  • THREE SCALAR EQUATIONS
  • Although δ and ε are unknown as vectors, partial information comes from the scalar invariants of (rigid-body) rotation. Specifically, since δ and ε are rigid-body rotations of the offset vectors a and b, their lengths and mutual dot product are preserved, which yields three scalar equations.
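The three invariance equations are not reproduced in this extract; since rigid-body rotation preserves lengths and dot products (δ = Ma, ε = Mb), a reconstruction consistent with the numbering in the next bullet is:

```latex
% Reconstruction (not verbatim from the patent) of the scalar invariance relationships:
\boldsymbol{\delta}\cdot\boldsymbol{\delta} = \mathbf{a}\cdot\mathbf{a} \tag{4}
\boldsymbol{\varepsilon}\cdot\boldsymbol{\varepsilon} = \mathbf{b}\cdot\mathbf{b} \tag{5}
\boldsymbol{\delta}\cdot\boldsymbol{\varepsilon} = \mathbf{a}\cdot\mathbf{b} \tag{6}
```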
  • Equations (1) through (6) collectively form a fully determined baseline equation set: 12 nonlinear (scalar) equations in 12 (scalar) unknowns, if evaluation of the unknown rotation matrix M is temporarily regarded as a "later step."
  • The algorithmic steps are based upon mathematical analysis that successively reduces the dimensionality of the problem.
  • The ultimate step becomes that of solving a single nonlinear equation in one unknown, after which the reduction steps are retraced to give explicit values of the other variables.
  • At that point, R, δ, and ε are known.
  • The D and E coefficients are combinations of previously known constants.
  • Equations (12) and (13) are near the end of the reduction process. They could be solved simultaneously, using (for example) a 2-dimensional version of Newton successive approximation. That approach is not preferred, largely because of practical problems in determining two distinct solutions (a requirement previously referred to), and because convergence would probably be slower than for alternatives discussed below.
  • Equations (12) and (13) could also be combined to give a purely 1-dimensional equation in β: a fourth-order polynomial equation of the form of Equation (14).
  • The practical disadvantage of the Equation (14) approach is the complexity, lengthy software code, and computation time involved in evaluating the P coefficients prior to solving for roots.
  • Equation (13) can be solved for α as an explicit function of β, giving Equation (15).
  • Equation (12) can then be written as a function only of β, giving Equation (16).
  • A solution of the system is a value of β for which Equation (16) holds; the corresponding value of α then follows from Equation (15).
  • For the class of problems and target-point geometries that occur in applications of the type described here, there are normally two distinct real roots to Equation (16) and two complex roots. The physical solutions correspond to the two real roots. Of the two physical solutions, one is "usually" identifiable as not valid. For some of the retroreflector target-point geometries (vs. mirror configurations), and in the presence of mosaic quantization and/or other sources of error, resolution of the correct versus incorrect solution is not necessarily reliable. This is strictly a data-error problem, not an algorithmic problem. Empirical studies indicate that this type of problem does not occur for the mirror configuration and realistic distance/angle combinations.
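The function F of Equation (16) is not reproduced in this extract, so the sketch below is illustrative only: it shows the kind of 1-dimensional root search described here, using a stand-in quartic that, like the text's F, has exactly two distinct real roots and two complex roots. The stand-in function and interval are assumptions, not the patent's actual expressions.

```python
def find_real_roots(f, lo, hi, samples=1000, tol=1e-12):
    """Locate real roots of f on [lo, hi] by scanning for sign changes
    and refining each bracketing interval with bisection."""
    roots = []
    step = (hi - lo) / samples
    a = lo
    fa = f(a)
    for i in range(1, samples + 1):
        b = lo + i * step
        fb = f(b)
        if fa == 0.0:
            roots.append(a)            # exact hit on a sample point
        elif fa * fb < 0.0:
            x0, x1 = a, b              # sign change: bisect the bracket
            while x1 - x0 > tol:
                mid = 0.5 * (x0 + x1)
                if f(x0) * f(mid) <= 0.0:
                    x1 = mid
                else:
                    x0 = mid
            roots.append(0.5 * (x0 + x1))
        a, fa = b, fb
    return roots

# Stand-in for F(beta): (beta - 1)(beta - 3)(beta^2 + 1) has two real roots
# (beta = 1 and beta = 3) and a complex-conjugate pair, as in the text.
F = lambda beta: (beta - 1.0) * (beta - 3.0) * (beta**2 + 1.0)

roots = find_real_roots(F, 0.0, 5.0)
```

Of the two real roots found this way, the physically valid β would then be selected by the validity tests described above.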
  • A potential problem exists with Equation (15): for some value of β (say, β_1) a zero denominator occurs. Theoretically (i.e., in the absence of roundoff errors), this case must imply that the numerator is also zero and that a definite (finite) limiting value exists.
  • Equation (15) is therefore replaced with the pair of Equations (17a) and (17b).
  • Equation (17b) is the limiting form of (17a) as β approaches β_1.
  • This formulation is equivalent to dividing out a (β − β_1) factor from a pure polynomial form.
  • The pair of Equations (15) and (16) yields two roots, i.e., two (α, β) pairs. Procedures for establishing both pairs and selecting the physically valid values are required.
  • The γ parameter is a required intermediate variable. It is obtained from Equation (9'), rewritten in explicit form.
  • It could also be evaluated from Equation (10') or, provided the relevant denominator is nonzero, from Equation (11'), since α and β have, in principle, been evaluated so as to make these three equations compatible.
  • M is a 3 × 3 matrix, hence has nine elements. These nine elements are not independent, however.
  • The constraint that M be a rotation matrix implies that only three degrees of freedom exist. These three degrees of freedom can be identified with pitch, roll, and yaw angles, but that identification is neither required nor useful in the steps to solve for M.
  • The matrix P has vector a as its first column, vector b as its second column, etc.
  • The matrix P is nonsingular (provided only that the pallet vectors a and b are not collinear), so an immediate formal solution results.
  • unit(δ) = M unit(a) (29), where unit( ) means normalized to unit length.
  • The result is the same as in Equation (33).
  • Equation (28) is the form used for software evaluation of M. It requires a matrix-times-matrix multiplication, but no explicit matrix inversion.
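Equation (28) is not reproduced in this extract, but the surrounding text (P built from columns a and b, M recoverable once δ and ε are known) suggests the following plain-Python sketch. The third columns a × b and δ × ε are an assumption added here so that P is nonsingular whenever a and b are not collinear, and the explicit 3 × 3 inverse is for illustration only (the text notes the production form avoids explicit inversion).

```python
import math

def cross(u, v):
    """Vector cross product in R^3."""
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

def matvec(M, v):
    return [sum(M[i][k] * v[k] for k in range(3)) for i in range(3)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def columns(c0, c1, c2):
    """Assemble a 3x3 matrix from three column vectors."""
    return [[c0[i], c1[i], c2[i]] for i in range(3)]

def inverse3(A):
    """Adjugate-over-determinant inverse of a 3x3 matrix."""
    (a, b, c), (d, e, f), (g, h, i) = A
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    return [[(e*i - f*h)/det, (c*h - b*i)/det, (b*f - c*e)/det],
            [(f*g - d*i)/det, (a*i - c*g)/det, (c*d - a*f)/det],
            [(d*h - e*g)/det, (b*g - a*h)/det, (a*e - b*d)/det]]

# A known test rotation (here pure yaw about z) standing in for the unknown M.
t = 0.3
M_true = [[math.cos(t), -math.sin(t), 0.0],
          [math.sin(t),  math.cos(t), 0.0],
          [0.0,          0.0,         1.0]]

# Hypothetical, non-collinear pallet offset vectors a and b.
a_vec = [0.5, 0.0, 0.25]
b_vec = [-0.5, 0.0, 0.25]

# Their rotated images delta = M a and epsilon = M b (in practice inferred
# from the solved baseline equations).
delta = matvec(M_true, a_vec)
eps = matvec(M_true, b_vec)

# P has a, b, a x b as columns; Q has the rotated counterparts.
# Then Q = M P, so M is recovered as Q P^{-1}.
P = columns(a_vec, b_vec, cross(a_vec, b_vec))
Q = columns(delta, eps, cross(delta, eps))
M_est = matmul(Q, inverse3(P))
```

Because a rotation maps a × b to (Ma) × (Mb), Q equals M P exactly, and the recovered M_est matches M_true to floating-point precision.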
  • The matrix M can be considered to be the product of three canonic matrices associated with pure pitch, roll, and yaw. The convention chosen for the order of multiplication is M = P R Y: the yaw matrix Y is applied to the vector first, then the roll matrix R, then the pitch matrix P.
  • The canonic matrices are the standard single-axis rotation matrices for pitch, roll, and yaw.
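The canonic matrices themselves are not reproduced in this extract. With the Fig. 2 axes (x right, y forward, z up), a standard reconstruction takes yaw ψ about z, pitch θ about x, and roll φ about y; the precise sign conventions are an assumption here:

```latex
M = P\,R\,Y,\qquad
Y(\psi)=\begin{pmatrix}\cos\psi & -\sin\psi & 0\\ \sin\psi & \cos\psi & 0\\ 0 & 0 & 1\end{pmatrix},\quad
R(\varphi)=\begin{pmatrix}\cos\varphi & 0 & \sin\varphi\\ 0 & 1 & 0\\ -\sin\varphi & 0 & \cos\varphi\end{pmatrix},\quad
P(\theta)=\begin{pmatrix}1 & 0 & 0\\ 0 & \cos\theta & -\sin\theta\\ 0 & \sin\theta & \cos\theta\end{pmatrix}
```

With this ordering, a vector expressed in target coordinates is first yawed, then rolled, then pitched into camera coordinates, matching the verbal convention above.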

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Transportation (AREA)
  • Structural Engineering (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Civil Engineering (AREA)
  • Mechanical Engineering (AREA)
  • Geology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Forklifts And Lifting Vehicles (AREA)

Abstract

A target member (20) mounted on a stationary object, such as a pallet (15), includes at least three reflector elements (52, 53, 54). Identification means (35), such as a high-intensity light source (35), and an image sensor (40) are carried by a second, movable object, such as a lift truck (30). The reflector elements (52, 53, 54) are configured to form images of the identification means (35); these images define a plane (70) having an orientation other than normal with respect to the identification means (35), and they also define a circle (82) which does not include the identification means (35). The target member (20) may take the form of a vertically oriented planar support member (50) on which a pair of convex mirrors (52, 54) and a concave mirror (53) are mounted. The images of the identification means (35) in the mirrors (52, 53, 54) are detected by an image sensor (40), such as a television camera (40), and the directions from each of the images to the camera (40) are used to determine all six degrees of position information of the sensor (40) relative to the target member (20). This information can be used to guide a lift truck (30) and position it with respect to a pallet (15).
PCT/US1986/002151 1985-10-18 1986-10-10 Cible et systeme de commande pour positionner un vehicule guide automatiquement WO1987002484A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR870700519A KR880700512A (ko) 1985-10-18 1986-10-10 자동유도차량 위치설정용 타게트 및 제어시스템

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US06/789,280 US4678329A (en) 1985-10-18 1985-10-18 Automatically guided vehicle control system
US788,989 1985-10-18
US789,280 1985-10-18
US06/788,989 US4684247A (en) 1985-10-18 1985-10-18 Target member for use in a positioning system

Publications (1)

Publication Number Publication Date
WO1987002484A1 true WO1987002484A1 (fr) 1987-04-23

Family

ID=27120859

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1986/002151 WO1987002484A1 (fr) 1985-10-18 1986-10-10 Cible et systeme de commande pour positionner un vehicule guide automatiquement

Country Status (4)

Country Link
EP (1) EP0243493A1 (fr)
KR (1) KR880700512A (fr)
CA (1) CA1277392C (fr)
WO (1) WO1987002484A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE2523801A1 (de) * 1974-05-29 1975-12-11 Villamos Berendezes Es Keszule Steuergeraet fuer die genaue zielansteuerung innerhalb eines fangbereichs bei der zweidimensionalen befoerderung von gegenstaenden in einer bahnebene, z.b. der befoerderung von stapelkraenen in regalanlagen
GB2019809A (en) * 1978-04-28 1979-11-07 Volvo Ab Device for orienting a lifting means for example in relation to a load
EP0035890A1 (fr) * 1980-03-07 1981-09-16 Lear Siegler, Inc. Système et procédé pour commander la position relative de deux objets tels qu'un véhicule de manipulation d'articles et un lieu d'emmagasinage
FR2495797A1 (fr) * 1980-12-09 1982-06-11 Onera (Off Nat Aerospatiale) Systeme de pilotage automatique d'un vehicule terrestre autonome
DE3305277A1 (de) * 1983-02-16 1984-08-16 Ing. Günter Knapp GmbH & Co. KG, Graz Verfahren und vorrichtung zum selbsttaetigen ein- und auslagern von stueckgut


Cited By (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4869635A (en) * 1988-03-31 1989-09-26 Caterpillar Industrial Inc. Apparatus for controllably positioning a lift mast assembly of a work vehicle
EP0363072A1 (fr) * 1988-09-28 1990-04-11 THE GENERAL ELECTRIC COMPANY, p.l.c. Commande de véhicule automatisée
WO1992000913A1 (fr) * 1990-07-11 1992-01-23 Pekka Kolari Procede et dispositif de reglage de la pression hydraulique s'exerçant sur un element de serrage
GB2259823A (en) * 1991-09-17 1993-03-24 Radamec Epo Limited Navigation system
EP0712697A3 (fr) * 1994-11-16 1996-08-14 Consorzio Telerobot Système de commande basé sur la vision pour un chariot élévateur pour le chargement autonome de palettes
US5812395A (en) * 1994-11-16 1998-09-22 Masciangelo; Stefano Vision based forklift control system for autonomous pallet loading
US5805286A (en) * 1995-11-07 1998-09-08 Fraunhofer-Gesellschaft Zur Forderung Der Angewandten Forschung E.V. Process for determination of the position of a vehicle in a plane of travel
US6278917B1 (en) 1997-09-01 2001-08-21 Siemens Aktiengesellschaft Method for docking an autonomous mobile unit with the use of a light beam
WO1999012083A1 (fr) * 1997-09-01 1999-03-11 Siemens Aktiengesellschaft Procede de positionnement pour la mise a quai d'une unite mobile autonome avec utilisation d'un faisceau de guidage
FR2799002A1 (fr) * 1999-09-28 2001-03-30 Thomson Csf Procede d'imagerie laser active
EP1089090A1 (fr) * 1999-09-28 2001-04-04 Thomson-Csf Procédé d'imagerie laser active
FR2804101A1 (fr) * 2000-01-21 2001-07-27 Nippon Yusoki Co Ltd Chariot-elevateur a fourche
US7473884B2 (en) 2005-04-21 2009-01-06 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Orientation determination utilizing a cordless device
US7609249B2 (en) 2005-04-21 2009-10-27 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Position determination utilizing a cordless device
US7812816B2 (en) 2005-04-21 2010-10-12 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Powerless signal generation for use in conjunction with a powerless position determination device
US7737393B2 (en) 2005-04-21 2010-06-15 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Orientation determination utilizing a cordless device
GB2425352B (en) * 2005-04-21 2010-07-14 Agilent Technologies Inc System and method of determining orientation information
US8384663B2 (en) 2005-04-21 2013-02-26 Avago Technologies General Ip (Singapore) Pte. Ltd. Position determination utilizing a cordless device
US8477987B2 (en) 2005-06-22 2013-07-02 Sime Oy Method for repositioning a numerically controlled device
US7796119B2 (en) 2006-04-03 2010-09-14 Avago Technologies General Ip (Singapore) Pte. Ltd. Position determination with reference
BE1018160A3 (nl) * 2008-05-26 2010-06-01 Egemin Nv Automatisch gestuurd voertuig en werkwijze voor het sturen daarbij toegepast.
EP2373558A1 (fr) * 2008-12-05 2011-10-12 Datachassi DC AB Procédé et système permettant de fournir de l'assistance au quai
EP2373558A4 (fr) * 2008-12-05 2012-06-27 Datachassi Dc Ab Procédé et système permettant de fournir de l'assistance au quai
US9025886B2 (en) 2011-10-19 2015-05-05 Crown Equipment Corporation Identifying and selecting objects that may correspond to pallets in an image scene
US8849007B2 (en) 2011-10-19 2014-09-30 Crown Equipment Corporation Identifying, evaluating and selecting possible pallet board lines in an image scene
US8885948B2 (en) 2011-10-19 2014-11-11 Crown Equipment Corporation Identifying and evaluating potential center stringers of a pallet in an image scene
US8934672B2 (en) 2011-10-19 2015-01-13 Crown Equipment Corporation Evaluating features in an image possibly corresponding to an intersection of a pallet stringer and a pallet board
US8938126B2 (en) 2011-10-19 2015-01-20 Crown Equipment Corporation Selecting objects within a vertical range of one another corresponding to pallets in an image scene
US8977032B2 (en) 2011-10-19 2015-03-10 Crown Equipment Corporation Identifying and evaluating multiple rectangles that may correspond to a pallet in an image scene
US8995743B2 (en) 2011-10-19 2015-03-31 Crown Equipment Corporation Identifying and locating possible lines corresponding to pallet structure in an image
US9025827B2 (en) 2011-10-19 2015-05-05 Crown Equipment Corporation Controlling truck forks based on identifying and tracking multiple objects in an image scene
US8718372B2 (en) 2011-10-19 2014-05-06 Crown Equipment Corporation Identifying and evaluating possible horizontal and vertical lines intersecting potential pallet features
US9082195B2 (en) 2011-10-19 2015-07-14 Crown Equipment Corporation Generating a composite score for a possible pallet in an image scene
US9087384B2 (en) 2011-10-19 2015-07-21 Crown Equipment Corporation Identifying, matching and tracking multiple objects in a sequence of images
US20160091899A1 (en) * 2013-05-10 2016-03-31 Dyson Technology Limited Apparatus for guiding an autonomous vehicle towards a docking station
US10175696B2 (en) * 2013-05-10 2019-01-08 Dyson Technology Limited Apparatus for guiding an autonomous vehicle towards a docking station
US9378554B2 (en) 2014-10-09 2016-06-28 Caterpillar Inc. Real-time range map generation
US9449397B2 (en) 2014-10-15 2016-09-20 Caterpillar Inc. Real-time visual odometry system for determining motion of a machine with a range detection unit
US9990535B2 (en) 2016-04-27 2018-06-05 Crown Equipment Corporation Pallet detection using units of physical length
US10611615B2 (en) 2016-07-14 2020-04-07 Toyota Material Handling Manufacturing Sweden Ab Floor conveyor
US10633232B2 (en) 2016-07-14 2020-04-28 Toyota Material Handling Manufacturing Sweden Ab Floor conveyor
US10710853B2 (en) 2016-07-14 2020-07-14 Toyota Material Handling Manufacturing Sweden Ab Floor conveyor
WO2021216255A1 (fr) * 2020-04-24 2021-10-28 Autoguide, LLC Robot pour empiler des éléments
US11459221B2 (en) 2020-04-24 2022-10-04 Autoguide, LLC Robot for stacking elements

Also Published As

Publication number Publication date
EP0243493A1 (fr) 1987-11-04
KR880700512A (ko) 1988-03-15
CA1277392C (fr) 1990-12-04

Similar Documents

Publication Publication Date Title
US4678329A (en) Automatically guided vehicle control system
US4684247A (en) Target member for use in a positioning system
WO1987002484A1 (fr) Cible et systeme de commande pour positionner un vehicule guide automatiquement
US5513276A (en) Apparatus and method for three-dimensional perspective imaging of objects
US5729475A (en) Optical system for accurate monitoring of the position and orientation of an object
US5303034A (en) Robotics targeting system
US9519810B2 (en) Calibration and self-test in automated data reading systems
EP0798567B1 (fr) Système de mesure pour tester la position d'un véhicule et senseur pour cela
US20180164417A1 (en) Method of error correction for 3d imaging device
EP2877959B1 (fr) Systèmes et procédés de mesure d'objet dans un lecteur de données automatisé
US5838428A (en) System and method for high resolution range imaging with split light source and pattern mask
US7310431B2 (en) Optical methods for remotely measuring objects
EP3537380B1 (fr) Étalonnage de coordonnées entre un système de coordonnées bidimensionnelles et un système de coordonnées tridimensionnelles
CN113034612B (zh) 一种标定装置、方法及深度相机
US20220100979A1 (en) Machine vision system and method with on-axis aimer and distance measurement assembly
US6730926B2 (en) Sensing head and apparatus for determining the position and orientation of a target object
GB2259764A (en) A device for measuring the three dimensional shape of an elongate member
US4849643A (en) Optical probe with overlapping detection fields
Araki et al. High speed rangefinder
US5086411A (en) Optical location systems
WO2021231947A1 (fr) Agencement d'imagerie et procédés et systèmes correspondants pour la génération de carte de profondeur
EP0813690B1 (fr) Systeme de detection de cibles
Puskorius et al. Camera calibration methodology based on a linear perspective transformation error model
US6411918B1 (en) Method and apparatus for inputting three-dimensional data
JP2614446B2 (ja) 距離測定装置

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): JP KR

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): DE FR GB IT SE

WWE Wipo information: entry into national phase

Ref document number: 1987900361

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1987900361

Country of ref document: EP

WWW Wipo information: withdrawn in national office

Ref document number: 1987900361

Country of ref document: EP