CA2049773A1 - Optical guidance system for an automated guidance vehicle - Google Patents
Optical guidance system for an automated guidance vehicle
- Publication number
- CA2049773A1
- Authority
- CA
- Canada
- Prior art keywords
- optical
- guideline
- vehicle
- milestones
- guidance system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Landscapes
- Image Analysis (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Traffic Control Systems (AREA)
Abstract
An optical guidance system for an automated guided vehicle comprises an optical guideline network including a main guideline and milestones adapted to be installed above the path of the vehicle, a two-dimensional optical sensor mounted on the vehicle for scanning the optical guideline and associated milestones and producing a video image of them, and an image processing system, also mounted on the vehicle and responsive to the optical sensor, for generating lateral and orientation offset signals from the localization of the optical guideline on the video image.
Description
FIELD OF THE INVENTION
This invention relates to an optical guidance system for an automated guided vehicle (AGV).
BACKGROUND OF THE INVENTION
The underground mining industry faces an increasingly competitive market. A response in terms of production cost reduction and productivity increase is therefore necessary. Automation is perceived as a major step towards higher efficiency. Recent progress in the field of vehicle guidance will lead to practical applications in the very near future. An appropriate guidance system in underground mines should allow:
1) Improved work environment and security by having the equipment's operator located in a control room.
2) Better management and monitoring of production quantity and quality.
3) Control and supervision of multiple vehicles by one operator.
4) Elimination of equipment abuse with a proper control system.
The most current industrial types of AGVs use electrical wires embedded beneath the floor surface as a guide. This type of guidance system is either inflexible or complex. Inflexible guidance systems make the AGV unfit for general purposes, especially for underground applications. Complexity makes them expensive and unreliable. As a result, the use of AGVs has been limited to applications where the environment is controlled. This solution was also not practical for underground applications, since it would require drift floors to be made of reinforced concrete.
Lasers have also been used in combination with retro-reflective targets for guiding unmanned vehicles. However, lasers have a number of limitations (as will be seen later) which make them unsuitable for robust applications, such as underground operations.
STATEMENT OF THE INVENTION
It is the object of the present invention to provide an optical guidance system for an automated guided vehicle which overcomes the above difficulties.
The optical guidance system in accordance with the present invention comprises an optical guideline network including a main guideline and milestones adapted to be installed above the path of the vehicle, a two-dimensional optical sensor mounted on the vehicle for scanning the optical guideline and associated milestones and producing a video image of them, and an image processing system, also mounted on the vehicle and responsive to the optical sensor, for generating lateral and orientation offset signals from the localization of the optical guideline on the video image.
The optical sensor is preferably a video camera mounted on the vehicle. The camera is oriented towards the guideline, inclined in the direction of movement of the vehicle, and produces a video image of the guideline and associated milestones.
The optical guideline is preferably made of a retro-reflective ribbon. A light source is located near the objective of the camera, and the ribbon reflects the light back to its source. This enhances image contrast, since the background (rock, pipes, cables...) absorbs most of the incoming light and reflects the rest in all directions.
The milestones are preferably made of at least one short retro-reflective ribbon mounted parallel to the guideline so as to form a bar code with the guideline, indicating changing situations on the path such as a curvature or a dead-end.
The image processing system comprises a computer, a grey-level image grabber responsive to the computer for transferring the video image into computer memory in binary form, and a number of computer modules that compute the lateral and orientation offsets of the vehicle with respect to the optical guideline and detect and identify passing milestones.
The present invention overcomes the difficulties of the prior art by providing an unmanned self-propelled vehicle having an improved guidance system using a simple and low-cost optical sensor and an optical guideline network fixed on drift ceilings (backs). This guidance system makes it easier and less expensive to set up or change the vehicle's route than with a wire-guided machine. It also provides a more robust system by using an industrial two-dimensional optical sensor instead of a laser scanner. The most used laser scanner is a one-dimensional sensor; it produces one line of scan that generates a surface while in motion. The camera is already a 2D sensor: it scans up to 512 lines at once while the laser scans only one line at a time.
The guidance system of the present invention has the capability to evaluate two types of navigation error: lateral and angular. These errors are used to guide the vehicle along a predetermined guide path. Tests have demonstrated that the angular error is the most important one to compensate in order to properly control a vehicle at high speed.
Moreover, if the optical guideline becomes dusty or partially absent, the camera will still be able to detect it (because of the 512 lines) and continue to assume guidance of the vehicle. In the case of the laser, when the guide is dusty, the line is lost and the vehicle stops.
The camera is preferably inclined from the vertical at an angle in order to detect the guideline well ahead of the vehicle. This allows the system to anticipate incoming curvature in the guideline and to react more rapidly in correcting the vehicle trajectory. More important, for articulated vehicles such as LHDs, it allows the camera to be mounted behind the bucket, thus offering better protection for the sensor. On the other hand, the laser could also be inclined. However, because of its limited range (<20'), it would have to be installed at a much closer distance from the extremities of the vehicle. For a large-size LHD, the laser would have to be located in the bucket, therefore exposing it to severe shocks during the loading operation.
The camera does not have any mobile part, and only the lens is exposed on the vehicle. On the other hand, the laser has a revolving mirror which turns at high speed when the laser is activated. This mobile part can be seen as a source of potential breakdown and costly repairs.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will now be disclosed, by way of example, with reference to a preferred embodiment illustrated in the accompanying drawings in which:
Figures 1a and 1b illustrate a top and a side view, respectively, of a Load-Haul-Dump (LHD) vehicle equipped with optical line sensors scanning an optical guideline;
Figure 2 illustrates a typical underground mine environment with the guideline path and milestone locations;
Figure 3 illustrates a typical guideline and milestone setup;
Figure 4 illustrates a branch point, used to indicate two possible directions;
Figure 5 shows the hardware of the guidance system in accordance with the present invention;
Figure 6 is a diagram illustrating the software structure of the guidance and control systems;
Figure 7 shows how one of the three scan lines (profile slices) is used by the guidance system to determine the offsets of the vehicle; and
Figure 8 is a diagram illustrating the kinematics of the optical system and vehicle.
DETAILED DESCRIPTION OF THE INVENTION
The invention relates to an optical guidance system for automated vehicles. As disclosed above, it comprises an optical line network, an optical line sensor and an image processing module. The invention provides evaluation of navigation offsets relative to the guideline and relevant indications along the oncoming path. The guidance system must be mounted on a vehicle equipped with a motion controller that activates the actuators of the vehicle in response to the navigation offsets and indications.
Referring to Figures 1a, 1b and 2, the vehicle shown is an articulated Load-Haul-Dump (LHD) vehicle 10, commonly used in underground mining environments. However, any type of vehicle operating in an environment with a ceiling could be equipped with the present optical guidance system; the ceiling is necessary to fix the optical line network. Normally, an operator uses the LHD to take ore from a draw point 12, such as an open stope, to a dumping point 14 through a drift 16. With the help of the optical guidance system in accordance with the present invention, the transportation and dumping operations are done automatically. The vehicle is guided by an optical line network including a guideline 18 and milestones 20 which are fixed to the ceiling of the drift and are scanned by a two-dimensional optical line sensor 22 mounted aboard the vehicle. Also mounted aboard the vehicle is an image processing system to be disclosed later.
The optical guideline 18 is preferably made of a retro-reflective ribbon (such as 3M Scotchlite 3970G). This ribbon reflects about a hundred times more light back towards the light source than a white sheet of paper. It is for this reason that the light source is installed in parallel with the field of view of the video camera used to detect the guideline, as will be seen in the description of Figure 5. Using this technique, the ribbon appears much brighter than anything else in the image, allowing the image processing module to easily differentiate the retro-reflective ribbon from other objects on the ceiling. The network also includes milestones made with the same retro-reflective ribbon. Milestones are used to signal a changing situation on the path, such as a curvature or a dead-end.
The width of the guideline ribbon must be optimized: not too large, to minimize cost and installation complexity, but wide enough to be recognized by the optical sensor. Typically, for underground tunnels three meters high, the width of the ribbon is about five centimetres. Milestones are typically made of ribbon stripes two centimetres wide by one meter long, to allow the optical system to detect them even at high speed. The milestones are fixed parallel to the main guideline. The space between each ribbon stripe of a milestone is typically five centimetres. As shown in Figure 3, the milestones are binary bar codes. For example, code 0 may be used to signal the end of a path and code 1 for curvatures. The system could use many stripes per milestone, representing different possible meanings. For branch points leading to draw points or dump points, two guidelines are installed in parallel for typically two meters, separated by 30 cm, before each continues alone into its branch. This allows the guidance system enough time to detect the branch. Figure 4 shows a typical branch point.
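As an illustration, the stripe dimensions above suggest a simple width-based decoding rule. The following sketch is hypothetical (the patent gives no code, and the 4 cm threshold is an assumption): it separates the wide guideline from the narrow milestone stripes seen on one scan line and counts the stripes as the milestone code.

```python
# Hypothetical sketch: classify retro-reflective stripes seen on one scan
# line into the main guideline (about 5 cm wide) and milestone stripes
# (about 2 cm wide), using an assumed 4 cm width threshold.

GUIDELINE_MIN_WIDTH_CM = 4.0  # assumed cut between guideline and stripes

def decode_milestone(stripes):
    """stripes: list of (centre_cm, width_cm) for one scan line.
    Returns (guideline_centre, code), where code counts the milestone
    stripes accompanying the guideline (0 means no milestone)."""
    guideline = [s for s in stripes if s[1] >= GUIDELINE_MIN_WIDTH_CM]
    if not guideline:
        return None, None  # guideline lost
    marks = [s for s in stripes if s[1] < GUIDELINE_MIN_WIDTH_CM]
    return guideline[0][0], len(marks)
```

A guideline alone then reads as code 0; a guideline accompanied by one 2 cm stripe 5 cm away reads as code 1.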
The optical line sensor 22 is a rugged industrial CCD video camera. It scans 512 lines of 512 pixels each, thirty times per second. The camera also has an adjustable focus and iris. The latter feature allows the opening of the camera to be adjusted so that every object in the image is black except the guideline and milestones.
The vehicle is equipped with a camera for each direction, oriented towards the guideline for forward and backward movements. For better performance, the cameras are aligned under the guideline to get its image in the centre of the image plane. To get better control of the vehicle, the cameras must look forward along the displacement to anticipate the coming curvature of the path. Experimentation shows that pointing the cameras to look at the front of the vehicle (rear of the vehicle for backward movement) gives the best results. To do so, the cameras are inclined as much as 50 degrees from the vertical field of view. However, an angle of more than 60 degrees produces a complex image, because the guideline is not perfectly flat.
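The look-ahead gained by tilting the camera can be estimated with simple trigonometry. This illustrative sketch (not from the patent) assumes the guideline sits about two metres above the lens:

```python
import math

# Rough look-ahead estimate: a camera tilted t degrees from the vertical,
# with the guideline h metres above the lens, sees the line about
# h * tan(t) metres ahead of the vehicle.

def look_ahead(height_m, tilt_deg):
    return height_m * math.tan(math.radians(tilt_deg))
```

At the 50-degree tilt mentioned above, with roughly 2 m between lens and ceiling, the camera looks about 2.4 m ahead; at 60 degrees, about 3.5 m, which is one reason steeper angles produce a more complex image of an uneven line.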
Figure 5 shows the hardware part of the guidance system, which includes the video camera 22, a light source 24 in the form of a ring surrounding the camera objective and provided with a plurality of light windows, a power source 26 and a computer 28 containing a frame grabber card. A block diagram of the image processing system, which is part of the computer software, is shown in Figure 6; it consists of the following modules: image grabber 32, video scan 34, video peak detector 36 (hereafter called "pic"), bar code 38, optical line 40, camera model 42, guidepath 44 and milestone 46. The architecture of the software is object oriented, so higher-level modules have more abstract tasks than low-level ones. The modules at the right of the drawing are part of the guidance system while those at the left are part of the control system (required to control the vehicle movement). Generally speaking, the image grabber module 32 stores in memory the image sent by the camera. The higher-level pic module 36 analyses the intensity of the pixels of at least three lines, at the bottom, middle and top of the image (middle slice shown in Figure 7). This module returns the width and position of all the groups of pixels on these three lines that have a higher intensity than a critical threshold value. After that, the optical line module 40 correlates the groups of pixels on each line to identify the guideline and milestone stripes. Then, the guidepath module 44 evaluates the lateral and angular offsets of the vehicle as a function of the position of the guideline in the image and of the geometry of the installation. The milestone module 46 recognizes the bar code represented by the lines of the milestones. The main program, identified by guidance block 48, then sends the offsets and the milestone codes to a navigator module 50, as well as information concerning any abnormal situation, such as the loss of the guideline or inconsistencies in the image. The description of each module is given below.
IMAGE GRABBER
The image grabber instructs the frame grabber hardware to freeze the present image and transfers it to system memory in the form of an array of information, typically 512 x 512 points with an 8-bit resolution.
VIDEO SCAN
This module performs an intensity profile of stripes of image information; Figure 7 illustrates this concept. The position, width and number of scan stripes are parameters for this module and are determined by the pic module. The maximum number of stripes possible is equal to the resolution of the frame grabber, typically 512. At this resolution the stripe will be one pixel wide.
PIC
Analysis of the intensity profile is done within this module. Figure 7 illustrates the different parameters involved. The module compares the intensity profile to a predetermined threshold value. The resulting peaks are then compared to a minimum pic width parameter and are rejected if they do not satisfy this criterion. In certain situations a gap, such as indicated by a dashed line in Figure 7, can be tolerated if this gap is within 25% to 33% of the pic width.
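A minimal sketch of the peak extraction just described, with illustrative parameter values (the patent states the 25% to 33% gap tolerance but not the threshold or minimum width, so those are assumptions):

```python
# Sketch of "pic" extraction: threshold an intensity profile, bridge
# small gaps (here up to about a third of the current peak width), and
# reject peaks narrower than a minimum width.

def find_pics(profile, threshold, min_width, max_gap_ratio=0.33):
    pics, start, gap = [], None, 0
    for i, v in enumerate(profile):
        if v >= threshold:
            if start is None:
                start = i       # new peak begins
            gap = 0             # any pending gap is bridged
        elif start is not None:
            gap += 1
            width = i - gap - start + 1   # width up to last bright pixel
            if gap > max(1, int(width * max_gap_ratio)):
                if width >= min_width:
                    pics.append((start, width))
                start, gap = None, 0
    if start is not None:                  # peak running to the edge
        width = len(profile) - gap - start
        if width >= min_width:
            pics.append((start, width))
    return pics  # list of (start_index, width)
```

Applied to one scan line, this yields the (position, width) groups that the optical line and bar code modules then classify.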
OPTICAL LINE
Determining whether a pic is an optical line is done by comparing the widest line on the image with a maximum value. If the pic satisfies this criterion, then the optical line width is set to that of this line. At least one optical line must be detected. Identification of a second or third line is done by testing whether the other pics are wider than a large fraction of the optical line width. If this condition is satisfied, and the pic's distance from the previous pic is less than a given multiple of the optical line width, then this new line is also considered to be an optical line.
BAR CODE
Other pics narrower than the large fraction of the optical line width, and also situated within a given multiple of the optical line width from the preceding pics, are considered to be bar codes.
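The optical line and bar code rules above can be sketched as follows. The "large fraction" and the distance multiple are assumed values, since the patent does not state them:

```python
# Sketch of the optical-line / bar-code classification.  frac and
# dist_mult are illustrative parameters, not from the patent.

def classify(pics, max_line_width, frac=0.6, dist_mult=8):
    """pics: list of (position, width).  Returns (lines, codes):
    pics classified as optical guidelines or bar-code stripes."""
    if not pics:
        return [], []
    widest = max(pics, key=lambda p: p[1])
    if widest[1] > max_line_width:
        return [], []                # widest blob too wide: reject image
    line_w = widest[1]               # optical line width
    lines, codes = [widest], []
    for p in pics:
        if p is widest:
            continue
        near = abs(p[0] - widest[0]) <= dist_mult * line_w
        if near and p[1] >= frac * line_w:
            lines.append(p)          # second/third guideline (branch)
        elif near:
            codes.append(p)          # narrower stripe: bar-code bit
    return lines, codes
```

With these assumed parameters, a 10-pixel guideline flanked by a 9-pixel stripe reads as a branch, while a 4-pixel stripe reads as a bar-code bit.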
CAMERA MODEL
The camera model module converts the optical lines obtained previously into the vehicle's referential. With u = Lx (xi / rx - 0.5) and v = Ly (yi / ry - 0.5), the following equations demonstrate this:

x = d (λ cos α - v sin α) / (λ sin α + v cos α) + dx
y = -d u / (λ sin α + v cos α) + dy
z = d + dz

wherein:
(x, y, z) a specific point of the optical guide with respect to the vehicle referential;
(xi, yi, 0) a point of the optical guide image in screen coordinates;
(dx, dy, dz) position of the image plane in the vehicle referential;
α angle of elevation of the camera in relation to the ground;
rx, ry resolution of the vision system;
Lx, Ly dimensions of the CCD sensor;
λ focal distance of the lens;
d vertical distance between the optical guide and the lens.
GUIDEPATH
As shown in Figure 8, the vehicle reference point is above the middle of the front axle (rear axle for backward movement), with the X axis pointing forward and the Y axis pointing to the left. The path reference point is located at the nearest point of the path from the vehicle reference point; its X axis points in the direction of the tangent of the path at this reference point, while its Y axis points to the left, perpendicular to the tangent. The lateral offset is the distance along the Y axis from the path reference point to the vehicle reference point; the lateral offset is negative if the vehicle reference point is located to the right of the path, looking in the direction of movement. The angular offset is the angle from the X axis of the path reference point to the X axis of the vehicle reference point, going in the counterclockwise direction.
The drawing at the top of Figure 8 shows the relation between the camera image plane reference point and the vehicle reference point defined in the previous paragraph. This reference point transformation allows lateral and angular offsets observed in the image plane to be converted to offsets at the vehicle reference point.
The guidepath module receives the points translated into the vehicle referential and calculates the lateral and orientation offsets. For calculation of the angular offset, at least two valid stripes are required. More can be used to improve accuracy.
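A sketch of the offset calculation under these conventions, using two guideline points already expressed in the vehicle referential. The sign handling follows the definitions above (lateral offset negative when the vehicle is right of the path, angular offset counterclockwise-positive); the exact formulation used by the patent may differ:

```python
import math

# Sketch: derive lateral and angular offsets from two guideline points
# (x forward, y left in the vehicle referential), p_far ahead of p_near.

def offsets(p_near, p_far):
    dx, dy = p_far[0] - p_near[0], p_far[1] - p_near[1]
    # Path tangent makes angle atan2(dy, dx) in the vehicle frame; the
    # angular offset (path X to vehicle X, counterclockwise) is its negative.
    angular = -math.atan2(dy, dx)
    # Vehicle position expressed from the path point: if the path is to
    # the left (y > 0), the vehicle is right of it, so the offset is negative.
    lateral = -p_near[1]
    return lateral, angular
```

For a straight guideline 0.3 m to the left of the vehicle axis this gives a lateral offset of -0.3 and zero angular offset.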
MILESTONE
The milestone module is responsible for analyzing the bar codes observed by the system. Simple codes can be implemented, for example a 0 code and a 1 code. In certain instances, if more than one optical guide is detected, this can also serve as a valid point for the milestone module.
NAVIGATOR
The navigator is the main module of the control system, which is not part of the present invention. This module compares the information on milestones received from the guidance system with a description (named ROUTE) of the path to follow, in order to take the appropriate actions (like changing speed, stopping or branching). It also sends the lateral and angular offsets to a motion control module 52. Figure 6 shows the software connection between the guidance system and the control system.
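The ROUTE comparison can be sketched as a simple queue of expected milestones; the codes, action names and error handling here are hypothetical, as the patent does not detail the ROUTE format:

```python
# Hypothetical navigator sketch: each ROUTE entry pairs an expected
# milestone code with the action to take when it is passed.

ROUTE = [(1, "slow_for_curve"), (1, "slow_for_curve"), (0, "stop_at_dead_end")]

def next_action(route, seen_code):
    if not route:
        return "error_unexpected_milestone"
    expected, action = route[0]
    if seen_code != expected:
        return "error_route_mismatch"   # abnormal situation: report it
    route.pop(0)                        # consume the expected milestone
    return action
```

A mismatch between the observed code and the ROUTE would be reported as an abnormal situation rather than acted upon.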
MOTION CONTROL
The motion control module 52 commands the steering of the vehicle through actuator controls 54 in order to minimize the lateral and angular offsets.
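A minimal illustration of such a law is a proportional blend of the two offsets, weighting the angular term more heavily, in line with the earlier observation that angular error dominates at speed. The gains and the clamp are assumptions, not from the patent:

```python
# Illustrative steering law: steer against both offsets, angular term
# weighted more.  Positive command steers left; offsets use the sign
# conventions defined in the GUIDEPATH section.

K_LAT, K_ANG = 0.5, 2.0   # assumed gains; tuned per vehicle in practice

def steering_command(lateral, angular):
    # Negative lateral offset (vehicle right of path) or negative angular
    # offset (heading right of the tangent) both call for a left turn.
    cmd = -(K_LAT * lateral + K_ANG * angular)
    return max(-1.0, min(1.0, cmd))   # clamp to actuator range
```

In practice the gains would depend on vehicle speed and articulation geometry; this only shows the direction of the correction.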
Although the invention has been disclosed by way of example with reference to a preferred embodiment, it is to be understood that it is not limited to such embodiment and that other alternatives are also envisaged within the scope of the following claims.
The most current industrial types o~ AGV~s use embedded . .
.. . .
.-.. . ~, , - ~
2 ;2~773 electrical wires beneath the floor surface as a guide.
This type of guidance system is either inflexible or complex. Inflexible guidance sy~tems make the AGV unfit for general purposes especially for underground applications. Complexity makes them expensive and unreliable. As a result, the use of AGV's has been limited to applications where the environment is controlledO Also, this solution was not practical for underground applications since it would require drift floors to be made of rein~orced concxete.
Lasers have also been used in combination with retro-reflective targets for guiding unmanned vehicles.
However, lasers havP a number of limitations ~as will be seen later) which make them unsuitable for robust applications, such as underground operations.
~A~EMENT OF T~B IN~ENTION
It is the object of the present invention to provide an optical guidance for an automated guided vehicle which overcomes the above difficulties.
The optical guidance system in accordance with the present invention comprises an opticai guideline network including a main guideline and milestones adapted to be installed above the path of the vehicle, a two dimensions optical sensor mounted on the vehicle for scanning the optical guideline and associated milestones and producing a video image of the optical guideline and associated milestones, and an image processing system also mounted on the vehicle and responsiv~ to the optical sensor for , - . : . - :
: , , generating lateral and orientation offset signals from the localization of the optical guideline on tha video image.
The optical sensor is preferably a video camera mounted on the vehicle. The camera is oriPnted towards the guideline and inclined in the direction of movement of the vehicle and produces a video image of the guideline and associated milestone.
The optical guideline i5 preferably made of a retro-reflective ribbon. A light source is located near the objective of the camera and the ribbon reflects the light back to its source. This enhances image contrast since the background (rock, pipesl cables...) absorbs most of the incoming light and reflects the rest in all directions.
Tha milestones are preferably made of at least one retro-reflective ribbon of short length mounted parallel to the guideline in a manner as to form a bar code with the guideline for indicating changing situations on the path like a curvature or a dead-end~
The image processing system comprises a computer, a grey-level image grabber responsive to the computer for transferring the video image into computer memory in the binary form, and a number of computer modules allowing to compute the lateral and orientation offsets of the vehicle with respect to the optical guideline and detect the passage of milestones and identify th~m.
The present invention overcom~s the difficulties of the prior art by providing an unmanned self-propelled vehicle , ~ . , .
,. ' . ' ,:
~ ' , 4 ~ 37~3 having an improved guidancs system using a simple and low cost optical sensor and an optical guideline network fixed on drift ceilings (backs)~ This guidance system makes it easier and less expensive to set up ox change the vehicle's route than with a wire guided machine. It also provides a more robust system by using an industrial two dimensions optical sensor instead of a laser scanner. The most used laser scanner i~ a 1 dimension sensor. It produce~ one line of scan that generates a surface while in motion. The camera is already a 2D sensor. It scan up to 512 lines at once while the laser scan only one line at a time.
The guidance system of the present invention has the capability to evaluate two types of navigation error:
lateral and angular. These errors are used to guide the vehicle along a predetermined guide path. Tests have demonstrated that the angular error is the most important one to be compensated in order to properly control a vehicle at high speeds.
Moreover, if the optical guideline becomes dusty or partially absent, the camera will stili be able to detect it: (because of the 512 lines) and continue to assume guidance of the vehicle. In the case of the laser, when the guide is dusty, the line is lost and the vehicle stops.
The camera i5 preferably inclined from the vertical at an angle in order to detect the guideline way ahead of the vehicle. It allows to anticipate the incoming curvature .t in the guideline and to react more rapidly in corxecting the vehicle trajectory. More important, for articulated vehicles such as LHD's, it allows ko mount the camera behind the bucket, thus offering better protection for the sensor. on the other hand, the laser could also be inclined. However, because of its limited range (<20'~, it would have to be installed at a much closer distance from the extremities of the vehicle. For a large size LHD, the laser ~ould have to be located in the bucket, therefore exposing it to severe shocks during the loading operation.
The camera does not have any mobile part and only the lens is exposed on the vehicle. On the other hand, the laser has a revolving mirror which turns at high speed when the laser is activated. This mobile part can be seen as a source o~ potential breakdown and costly repairs.
i3RIBF D138CRIPT:tO~ OF TIII5 DR~WI~T3;~
The invention will now be disclosed, by way of example, with reference to a preferred embodiment illustrated in the accompanying drawings in which:
Figures la and lb illustrate a top and a side view, respectively, of a Load-Haul-Dump (LHD) vehicle equipped with optical line sensors scanning an optical guideline;
Figure ~ illustrates a typical underground mine environmen~ with the guideline path and milestones locations;
~ igure 3 illustrates a typical guideline and milestone setup;
,~ ., . ,:
~, 6 2~ 3 Figure 4 illustrates a branch point, used to indicate two possible directions;
Figure 5 shows the hardware of the guidance system in accordance with the present invention;
5Figure 6 is a diagram illustrating the software structure of the guidancP and control systems;
Figure 7 shows how one o~ the three scan lines (profile ~lice~ is used by the guidance system to determine the offsets of the vehicle; and 10Figure 8 i5 a diagram illustrating the kinematic of the optical system and vehicle.
D~T~I~BD DB8CRIPTION OF ~HE INV~NTIO~
The invention relates to an optical guidance system for automated vehicles. As disclosed above, it comprises an 15optical line network, an optical line sensor and an image processing module. The invention provides navigation offsets evaluation relative to the guideline and relevant indications along the oncoming path. The guidance system must be mounted on a vehicle equipped with a motion 20controller that activate the actuators of the vehicle in response to the navigation offsets and indications.
Referring to Figures la, lb and 2, the vehicle shown is an articulated Load-Haul-Dump (LHD) vehicle 10, commonly used in underground mining environments. However any type 25o~ vehicle operating in an environment with a ceiling could be equipped with the present optical guidance system. The ceiling is necessary to fix the optical line network. Normally, an operator uses the LHD to take ore ~: ' . '' ~
7 ~ 773 from a draw point 12, such as an open stope, to a dumping point 14 through a drift 16. With the help of the optical guidance system in accordance with the present invention, the transportation and dumping operations are done automatically. The vehicle is guided by an optical line network including a guideline 18 and milestones 20 which are fixed to the ceiling of the drift and are scanned by a two dimensions optical line sensor 22 mounted aboard the vehicle. Also mounted aboard the vehicle is an image processing system to be disclosed later.
The optical guideline 18 is preferably made of a retro-reflective ribbon (such as the 3M Scothlite 3970G). This ribbon reflects about an hundred times more than a white sheet of paper back to the direction of the light source.
It is for this reason that tha light source is installed in parallel to the field of ~iew of the video camera used to detect the guideline, as it will be seen in the description of Figure 5. Using this technique, the ribbon appears much brighter than anything else in the image, allowing the image processing module to easily differentiate the retro-reflective ribbon from other objects on the ceiling. The network also includes milestones made with the same retro-reflective ribbon.
Milestones are used to signal a changing situation on the path like a curvature or a dead-end.
The width of the guideline ribbon must be optimized;
not too large to minimize costs and installation complexity but wide enough to be recognized by the optical , :
8 ~9773 sensor. Typically, for underground tunnels of three meters height, the width of the ribbon is about ~ive centimetres. Milestones are typically made of ribbon stripes two centimetres large by one meter long to allow thP optical system to detect them, even at high speed. The milestones are fixed parallel to the main guideline. The space between each ribbon stripe of the milestones is typically five centimetres. As shown in figure 3, the milestones are binary bar codes. For example, code 0 may be used to signal the end of a path and code 1 for curvatures. The system could use many stripes per milestone, representing different possible meanings. For branch points, leading to draw points or dump points, two guidelines are installed in parallel for typically two meters, separated by 30 cm, before going alone in each branch. This allows the guidance system to have enough time to detect the branch. Figure 4 shows a typical branch point.
The optical line sensor 22 is an industrial rugged CCD
video camera. It scans 512 lines of 512 pixels each, thirty times per second. The camera also has an adjustable ~ocus and iris. The last feature allows to adjust the opening of the camera to get every objects in the image black except the guideline and milestones.
The vehicle is equipped with a camera for each direction, oriented towards the guideline for forward and backward movements. For better performanre, the cameras are aligned under the guideline to get it's image in the ..
~, .
:
:. . :
"' ' ~
centre of the image. To get better control of the vehicle, the cameras must look ~orward to the displacement to anticipate the coming curvature of the path.
Experimentation shows that pointing the cameras to look at the front of the vehicle (rear of the vehicle for backward movement) gives the hest results. To do so, the cameras are inclined as much as 50 degrees from the vertical field of view. However, an angle of more than 60 produces a complex image because the yuideline is not perfectly flat.
Figure 5 shows the hardware part of the guidance system, which includes the video camera 22, a light source 24 in the form of a ring surrounding the camera objective and provided with a plurality of light windows, a power source 26 and a computer 28 containing a frame grabber card.
A block diagram of the image processing system, which is part of the computer software, is shown in figure 6; it consists of the following modules: image grabber 32, video scan 34, video peak detector 36 (hereafter called "pic"), bar code 38, optical line 40, camera model 42, guidepath 44 and milestone 46. The architecture of the software is object oriented, so higher level modules have more abstract tasks than lower level ones. The modules at the right of the drawing are part of the guidance system while those at the left are part of the control system (required to control the vehicle movement). Generally speaking, the image grabber module 32 stores in memory the image sent by the camera. The higher level pic module 36 analyses the intensity of the pixels of at least three lines, at the bottom, middle and top of the image (the middle slice is shown in figure 7). This module returns the width and position of all the groups of pixels on these three lines that have a higher intensity than a critical threshold value. After that, the optical line module 40 correlates the groups of pixels on each line to identify the guideline and milestone stripes. Then, the guidepath module 44 evaluates the lateral and angular offsets of the vehicle as a function of the position of the guideline in the image and of the geometry of the installation. The milestone module 46 recognizes the bar code represented by the lines of the milestones. The main program, identified by guidance block 48, then sends the offsets and the milestone codes to a navigator module 50, as well as information concerning any abnormal situation such as the loss of the guideline or inconsistencies in the image. The description of each module is given below.
IMAGE GRABBER
The image grabber instructs the frame grabber hardware to freeze the present image and transfers it to system memory in the form of an array of information typically 512 x 512 points with an 8 bit resolution.
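As a rough sketch of what the image grabber does, the hypothetical code below freezes one frame into memory as a 512 x 512 array of 8-bit grey levels. The real module drives frame-grabber hardware; here a pixel-readout callback stands in for the board, and all names are illustrative.

```python
RES_X, RES_Y = 512, 512  # typical frame-grabber resolution per the text

def grab_frame(read_pixel):
    """Freeze the current camera image into memory as a row-major list of
    rows, one 8-bit grey level (0..255) per pixel.  `read_pixel(x, y)` is a
    stand-in for the frame-grabber board's readout (hypothetical)."""
    return [[read_pixel(x, y) & 0xFF for x in range(RES_X)]
            for y in range(RES_Y)]

# Synthetic diagonal test pattern in place of real camera data.
frame = grab_frame(lambda x, y: (x + y) % 256)
print(len(frame), len(frame[0]))
```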
VIDEO SCAN
This module extracts an intensity profile from stripes of image information; figure 7 illustrates this concept. The position, width and number of scan stripes are parameters for this module and are determined by the pic module. The maximum number of stripes is equal to the resolution of the frame grabber, typically 512; at this resolution each stripe is one pixel wide.
PIC
Analysis of the intensity profile is done within this module. Figure 7 illustrates the different parameters involved. The module compares the intensity profile to a predetermined threshold value. The resulting peaks are then compared to a minimum pic width parameter and are rejected if they do not satisfy this criterion. In certain situations a gap, such as that indicated by a dashed line in figure 7, can be tolerated if this gap is within 25% to 33% of the pic width.
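The thresholding, minimum-width and gap-tolerance rules above can be sketched as follows. This is illustrative code, not the patent's implementation: the 30% gap fraction is one value picked from the 25%-33% range quoted, and the function and parameter names are assumptions.

```python
def find_pics(profile, threshold, min_width, gap_fraction=0.30):
    """Return (position, width) for each group of above-threshold pixels
    in one scan-line intensity profile.  A short sub-threshold gap
    (<= gap_fraction of the pic width so far) is bridged; pics narrower
    than min_width are rejected.  Width spans first to last bright pixel."""
    pics, start, last_hi = [], None, None
    for i, v in enumerate(profile):
        if v >= threshold:
            if start is None:
                start = i
            last_hi = i
        elif start is not None:
            width_so_far = last_hi - start + 1
            if i - last_hi > max(1, int(gap_fraction * width_so_far)):
                if width_so_far >= min_width:
                    pics.append((start, width_so_far))
                start, last_hi = None, None
    if start is not None and last_hi - start + 1 >= min_width:
        pics.append((start, last_hi - start + 1))
    return pics

# Bright run with a tolerable 1-pixel gap, plus a too-narrow blob.
profile = [0]*10 + [200]*6 + [0]*1 + [200]*4 + [0]*20 + [200]*2 + [0]*5
print(find_pics(profile, threshold=128, min_width=4))  # -> [(10, 11)]
```

The 1-pixel gap inside the first run is bridged, giving a single pic of width 11, while the 2-pixel blob fails the minimum-width test.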
OPTICAL LINE
Whether a pic is an optical line is determined by comparing the widest line on the image with a maximum value. If the pic satisfies this criterion, the optical line width is set to that of this line. At least one optical line must be detected. Identification of a second or third line is done by testing whether the other pics are wider than a large fraction of the optical line width. If this condition is satisfied and its distance from the previous pic is less than a given multiple of the optical line width, this new line is also considered to be an optical line.
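The classification rule above can be sketched like this. The "large fraction" and "given multiple" are not quantified in the text, so the 0.7 and 3.0 defaults below are assumptions, as are the names.

```python
def classify_optical_lines(pics, max_line_width, wide_frac=0.7, near_mult=3.0):
    """From (position, width) pics on one scan stripe, pick the optical
    guideline(s).  The widest pic sets the reference line width; another
    pic also counts as a guideline (e.g. at a branch point) if it is at
    least wide_frac of that width and within near_mult line widths."""
    if not pics:
        return []
    widest = max(pics, key=lambda p: p[1])
    if widest[1] > max_line_width:
        return []                       # widest blob too wide to be the guideline
    line_w = widest[1]
    lines = [widest]
    for pic in sorted(pics, key=lambda p: p[0]):
        if pic is widest:
            continue
        near = any(abs(pic[0] - l[0]) <= near_mult * line_w for l in lines)
        if pic[1] >= wide_frac * line_w and near:
            lines.append(pic)           # second parallel guideline
    return sorted(lines)

# A main guideline, a nearby second guideline, and a narrow stray blob.
print(classify_optical_lines([(100, 10), (125, 9), (300, 3)], max_line_width=40))
```

With these assumed values the nearby 9-pixel pic is accepted as a second guideline (the branch-point case) while the far, narrow pic is left for the bar code module.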
BAR CODE
Other pics narrower than the large fraction of the optical line width, and situated within a given multiple of the optical line width from the preceding pics, are considered to be bar codes.
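Continuing the sketch above, the complementary rule for bar-code stripes might look like this (again with assumed 0.7 / 3.0 values for the unquantified fraction and multiple; "preceding pics" is read as the guideline plus any stripe already accepted).

```python
def classify_bar_codes(pics, lines, wide_frac=0.7, near_mult=3.0):
    """Pics too narrow to be a guideline but lying within near_mult line
    widths of the guideline (or of an already-accepted stripe) are taken
    as milestone bar-code stripes.  Illustrative values and names."""
    if not lines:
        return []
    line_w = max(w for _, w in lines)
    codes = []
    for pos, w in sorted(pics):
        if (pos, w) in lines:
            continue                    # already classified as a guideline
        anchors = lines + codes         # chain from the preceding pics
        near = any(abs(pos - ap) <= near_mult * line_w for ap, _ in anchors)
        if w < wide_frac * line_w and near:
            codes.append((pos, w))
    return codes

lines = [(100, 10)]
pics = [(100, 10), (118, 4), (131, 4), (400, 4)]
print(classify_bar_codes(pics, lines))  # -> [(118, 4), (131, 4)]
```

The second stripe at 131 is accepted because it chains off the stripe at 118, while the isolated pic at 400 is rejected.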
CAMERA MODEL
The camera model module converts the optical lines obtained previously into the vehicle's referential. The following equations describe this conversion:
Writing $u = L_x\left(\frac{x_i}{r_x} - 0.5\right)$ and $v = L_y\left(\frac{y_i}{r_y} - 0.5\right)$ for the image point expressed in sensor coordinates:

$$x = \frac{d\,(A\cos\alpha + v\sin\alpha)}{A\sin\alpha - v\cos\alpha} + d_x$$

$$y = \frac{-d\,u}{A\sin\alpha - v\cos\alpha} + d_y$$

$$z = d$$

wherein:

(x, y, z) — a specific point of the optical guide with respect to the vehicle referential;
(xi, yi, 0) — a point of the optical guide image in screen coordinates;
(dx, dy, dz) — position of the image plane in the vehicle referential;
α — angle of elevation of the camera in relation to the ground;
rx, ry — resolution of the vision system;
Lx, Ly — dimensions of the CCD sensor;
A — focal distance of the lens;
d — vertical distance between the optical guide and the lens.
GUIDEPATH
As shown in Figure 8, the vehicle reference point is above the middle of the front axle (rear axle for backward movement), the X axis pointing forward and the Y axis pointing to the left. The path reference point is located at the nearest point of the path from the vehicle reference point; its X axis points in the direction of the tangent of the path at this reference point, while its Y axis points to the left, perpendicular to the tangent. The lateral offset is the distance along the Y axis from the path reference point to the vehicle reference point; the lateral offset is negative if the vehicle reference point is located to the right of the path, looking in the direction of movement. The angular offset is the angle from the X axis of the path reference point to the X axis of the vehicle reference point, measured in the counterclockwise direction.
The drawing at the top of Figure 8 shows the relation between the camera image plane reference point and the vehicle reference point defined in the previous paragraph.
This reference point transformation allows lateral and angular offsets observed in the image plane to be converted to offsets at the vehicle reference point.
The guidepath module receives the points translated into the vehicle referential and calculates the lateral and orientation offsets. Calculation of the angular offset requires at least two valid stripes; more can be used to improve accuracy.
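A minimal sketch of that calculation, assuming a locally straight guideline seen at two points already converted into the vehicle referential (X forward, Y left, as defined above). The function name, the straight-line assumption and the sign handling are illustrative; the real module also uses the geometry of the installation.

```python
import math

def offsets(p1, p2):
    """Lateral and angular offsets of the vehicle relative to the
    guideline, from two guideline points p1=(x1, y1), p2=(x2, y2) in the
    vehicle referential.  Sketch only: assumes a locally straight path."""
    (x1, y1), (x2, y2) = p1, p2
    # Angle from the path tangent to the vehicle X axis, counterclockwise
    # positive, matching the sign convention described in the text.
    angular = -math.atan2(y2 - y1, x2 - x1)
    # y of the guideline at the vehicle reference point (x = 0), by linear
    # interpolation/extrapolation between the two observed points.
    t = -x1 / (x2 - x1)
    y_at_vehicle = y1 + t * (y2 - y1)
    lateral = -y_at_vehicle   # negative when the vehicle sits right of path
    return lateral, angular

lat, ang = offsets((1.0, 0.3), (3.0, 0.1))
print(round(lat, 3), round(ang, 3))
```

Here the guideline passes 0.4 m to the left of the vehicle reference point, so the lateral offset is -0.4 m, and the vehicle heading is rotated slightly counterclockwise from the path tangent.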
MILESTONE
The milestone module is responsible for analyzing the bar codes observed by the system. Simple codes can be implemented, for example a 0 code and a 1 code. In certain instances, if more than one optical guide is detected, this can also serve as a valid point for the milestone module.
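A toy decoder in the spirit of figure 3 might look as follows. The bit ordering and the mapping table are assumptions; the text only gives the two example meanings (code 0 for the end of a path, code 1 for curvatures).

```python
# Assumed mapping, following the two examples given in the text.
MILESTONE_MEANING = {0: "end of path", 1: "curvature ahead"}

def decode_milestone(stripe_bits):
    """Fold a list of per-stripe bits (most significant first, an assumed
    convention) into an integer milestone code plus its meaning."""
    code = 0
    for bit in stripe_bits:
        code = (code << 1) | (1 if bit else 0)
    return code, MILESTONE_MEANING.get(code, "user-defined code")

print(decode_milestone([0, 1]))  # -> (1, 'curvature ahead')
```

Because the system can use many stripes per milestone, the same decoder extends to larger user-defined codes without change.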
NAVIGATOR
The navigator is the main module of the control system, which is not part of the present invention. This module compares the information on milestones received from the guidance system with a description (named ROUTE) of the path to follow, in order to take the appropriate actions (such as changing speed, stopping or branching). It also sends the lateral and angular offsets to a motion control module 52. Figure 8 shows the software connection between the guidance system and the control system.
MOTION CONTROL
The motion control module 52 commands the steering of the vehicle through actuator controls 54 in order to minimize the lateral and angular offsets.
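The patent does not specify the control law, so the sketch below shows only the simplest possibility consistent with the description: a proportional law driving both offsets toward zero, with clamping at the actuator limit. The gains, limit and sign convention are all assumptions.

```python
def steering_command(lateral, angular, k_lat=0.5, k_ang=1.2, limit=0.6):
    """Steering angle (rad) that drives both offsets toward zero.
    Simple proportional law with assumed gains; the sign convention
    (positive command steers left) is also an assumption."""
    cmd = -(k_lat * lateral + k_ang * angular)
    return max(-limit, min(limit, cmd))

# Vehicle 0.2 m to the right of the path (negative lateral offset):
# the command steers back toward the guideline.
print(round(steering_command(-0.2, 0.0), 3))
```

In practice the gains would be tuned to the vehicle's speed and steering dynamics; the clamp stands in for the physical travel of the actuator controls 54.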
Although the invention has been disclosed by way of example with reference to a preferred embodiment, it is to be understood that it is not limited to such embodiment and that other alternatives are also envisaged within the scope of the following claims.
Claims (6)
1. An optical guidance system for an automated guided vehicle including:
a) an optical guideline network including a main guideline and milestones adapted to be installed above the path of the vehicle;
b) a two dimensions optical sensor mounted on the vehicle for scanning the optical guideline and associated milestones and producing a video image of the optical guideline and associated milestones; and c) an image processing system also mounted on the vehicle and responsive to the optical sensor for generating lateral and orientation offset signals from the localization of the optical guideline on the video image.
2. An optical guidance system as defined in claim 1, wherein the optical guideline is made of a retro-reflective ribbon.
3. An optical guidance system as defined in claim 1, wherein the milestones are made of at least one retro-reflective ribbon of short length mounted parallel to the guideline in a manner as to form a bar code with the guideline for indicating predetermined milestones along the guideline.
4. An optical guidance system as defined in claim 1, wherein the optical sensor comprises a video camera mounted on the vehicle, the camera being oriented towards the guideline and inclined in the direction of movement of the vehicle and producing a video image of the guideline and associated milestones.
5. An optical guidance system as defined in claim 4, further comprising a lighting source located near the objective of the camera for image contrast enhancement.
6. An optical guidance system as defined in claim 1, wherein the image processing system comprises a computer, a grey-level image grabber responsive to the computer for transferring the video image into computer memory in binary form, and a number of computer modules for computing the lateral and orientation offsets of the vehicle with respect to the optical guideline, detecting the passage of milestones and identifying them.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA 2049773 CA2049773A1 (en) | 1991-08-23 | 1991-08-23 | Optical guidance system for an automated guidance vehicle |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA 2049773 CA2049773A1 (en) | 1991-08-23 | 1991-08-23 | Optical guidance system for an automated guidance vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
CA2049773A1 true CA2049773A1 (en) | 1993-02-24 |
Family
ID=4148242
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA 2049773 Abandoned CA2049773A1 (en) | 1991-08-23 | 1991-08-23 | Optical guidance system for an automated guidance vehicle |
Country Status (1)
Country | Link |
---|---|
CA (1) | CA2049773A1 (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5999865A (en) * | 1998-01-29 | 1999-12-07 | Inco Limited | Autonomous vehicle guidance system |
US6163745A (en) * | 1997-04-15 | 2000-12-19 | Ainsworth Inc. | Guidance system for automated vehicles, and guidance strip for use therewith |
US6633800B1 (en) | 2001-01-31 | 2003-10-14 | Ainsworth Inc. | Remote control system |
CN103163883A (en) * | 2011-12-15 | 2013-06-19 | 财团法人工业技术研究院 | Automatic transport vehicle guiding system and automatic transport vehicle guiding method |
RU170172U1 (en) * | 2016-11-03 | 2017-04-18 | ООО "Инновации, Технологии, Экология" | AUTOMATED LOGISTIC ROBOT |
CN108147035A (en) * | 2018-03-05 | 2018-06-12 | 菲尼克斯(南京)智能制造技术工程有限公司 | Alignment system and method guide locating device provided and for AGV conveyer systems |
CN113093719A (en) * | 2019-12-19 | 2021-07-09 | 财团法人工业技术研究院 | Automatic guided vehicle positioning system and operation method thereof |
CN114537555A (en) * | 2022-03-14 | 2022-05-27 | 恒达富士电梯有限公司 | Elevator flexible production line in coordination based on multi-robot and multi-AGV in coordination |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6163745A (en) * | 1997-04-15 | 2000-12-19 | Ainsworth Inc. | Guidance system for automated vehicles, and guidance strip for use therewith |
US5999865A (en) * | 1998-01-29 | 1999-12-07 | Inco Limited | Autonomous vehicle guidance system |
US6633800B1 (en) | 2001-01-31 | 2003-10-14 | Ainsworth Inc. | Remote control system |
CN103163883A (en) * | 2011-12-15 | 2013-06-19 | 财团法人工业技术研究院 | Automatic transport vehicle guiding system and automatic transport vehicle guiding method |
US9207676B2 (en) | 2011-12-15 | 2015-12-08 | Industrial Technology Research Institute | System and method for guiding automated guided vehicle |
CN103163883B (en) * | 2011-12-15 | 2016-06-29 | 财团法人工业技术研究院 | Automatic transport vehicle guiding system and automatic transport vehicle guiding method |
RU170172U1 (en) * | 2016-11-03 | 2017-04-18 | ООО "Инновации, Технологии, Экология" | AUTOMATED LOGISTIC ROBOT |
CN108147035A (en) * | 2018-03-05 | 2018-06-12 | 菲尼克斯(南京)智能制造技术工程有限公司 | Alignment system and method guide locating device provided and for AGV conveyer systems |
CN108147035B (en) * | 2018-03-05 | 2024-03-26 | 菲尼克斯(南京)智能制造技术工程有限公司 | Guiding and positioning device and positioning system and method for AGV conveying system |
CN113093719A (en) * | 2019-12-19 | 2021-07-09 | 财团法人工业技术研究院 | Automatic guided vehicle positioning system and operation method thereof |
CN113093719B (en) * | 2019-12-19 | 2024-06-18 | 财团法人工业技术研究院 | Automatic guided vehicle positioning system and operation method thereof |
CN114537555A (en) * | 2022-03-14 | 2022-05-27 | 恒达富士电梯有限公司 | Elevator flexible production line in coordination based on multi-robot and multi-AGV in coordination |
CN114537555B (en) * | 2022-03-14 | 2023-09-01 | 恒达富士电梯有限公司 | Elevator cooperation flexible production line based on cooperation of multiple robots and multiple AGVs |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108363065B (en) | Object detection system | |
EP0271494B1 (en) | Navigation system | |
US10241206B2 (en) | Sensor system for a vehicle for detecting bridges or tunnel entrances | |
US5999865A (en) | Autonomous vehicle guidance system | |
Tsugawa | Vision-based vehicles in Japan: Machine vision systems and driving control systems | |
EP0567013B1 (en) | An inter-vehicle distance detecting device | |
US5617085A (en) | Method and apparatus for monitoring the surroundings of a vehicle and for detecting failure of the monitoring apparatus | |
US7400746B2 (en) | System and method for detecting obstacle | |
EP0697641A2 (en) | Lane image processing system for vehicle | |
US10580124B2 (en) | Inspection device, control method and control apparatus for the same | |
US20080144926A1 (en) | Obstacle detection system and method therefor | |
CN105393138A (en) | Optoelectronic detection device and method for detecting the environment of a motor vehicle in a scanning manner | |
WO2018194721A1 (en) | Method of providing interference reduction and a dynamic region of interest in a lidar system | |
EP0510363A1 (en) | Distance measuring device | |
CN1292878A (en) | Optical sensor system for detecting position of object | |
EP3674830B1 (en) | Server and method for controlling laser irradiation of movement path of robot, and robot that moves based thereon | |
CA2049773A1 (en) | Optical guidance system for an automated guidance vehicle | |
US11079857B2 (en) | Optical detecting device | |
US20230351776A1 (en) | Method and device for determining the orientation of a surface of an object | |
US5313054A (en) | Method and device for determining the orientation of a solid | |
Takeda et al. | Automated vehicle guidance using spotmark | |
CN220323539U (en) | Road side sensing equipment | |
JP2864742B2 (en) | Coke oven working machine fixed position detection method and apparatus | |
KR102506812B1 (en) | Autonomous vehicle | |
Fleischmann et al. | Segmentation of Very Sparse and Noisy Point Clouds |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
EEER | Examination request | ||
FZDE | Dead |