EP2889713B1 - Object tracking and steer maneuvers for materials handling vehicles - Google Patents
- Publication number: EP2889713B1 (application EP15151549.1A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- zone
- steer
- truck
- vehicle
- controller
- Prior art date
- Legal status: Active (status assumed by Google; not a legal conclusion)
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D6/00—Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits
- B62D6/002—Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits computing target steering angles for front or rear wheels
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D15/00—Steering not otherwise provided for
- B62D15/02—Steering position indicators ; Steering position determination; Steering aids
- B62D15/025—Active steering aids, e.g. helping the driver by actively influencing the steering system after environment evaluation
- B62D15/0265—Automatic obstacle avoidance by steering
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D6/00—Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits
- B62D6/001—Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits the torque NOT being among the input parameters
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B66—HOISTING; LIFTING; HAULING
- B66F—HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
- B66F9/00—Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
- B66F9/06—Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
- B66F9/075—Constructional features or details
- B66F9/0755—Position control; Position detectors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B66—HOISTING; LIFTING; HAULING
- B66F—HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
- B66F9/00—Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
- B66F9/06—Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
- B66F9/075—Constructional features or details
- B66F9/07568—Steering arrangements
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B66—HOISTING; LIFTING; HAULING
- B66F—HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
- B66F9/00—Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
- B66F9/06—Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
- B66F9/075—Constructional features or details
- B66F9/07581—Remote controls
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0238—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
- G05D1/024—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0255—Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultra-sonic signals
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
- G08C17/02—Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C2201/00—Transmission systems of control signals via wireless link
- G08C2201/20—Binding and programming of remote control devices
Definitions
- each detection zone may be chosen, for example, based upon factors such as the desired speed of the truck 10 when it is traveling in response to a valid travel request received from the remote control device 70, the required stopping distance, the anticipated load to be transported by the truck 10, whether a certain amount of coast is required for load stability, vehicle reaction time, etc. Moreover, factors such as the range of each desired detection zone may be considered in determining the number of obstacle sensors 76 required. In this regard, such information may be static or dynamic, e.g., based upon operator experience, vehicle load, nature of the load, environmental conditions, etc. It is also contemplated that the controller 103 may generate a warning signal or alarm if an object or a person is detected in a detection zone.
- Values defining the vehicle stop and maximum allowable speed signals may be experimentally determined and stored in a look-up table, or computed in real time based upon a predetermined formula, etc.
- the controller 103 determines the weight of a load on the forks 16 and whether an obstacle has been detected in one of the first, second and third detection zones and, using a lookup table, either effects a stop command or defines a maximum allowable speed for the truck 10, generating a corresponding maximum allowable speed signal for the truck 10.
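- The zone/weight lookup described above can be sketched as follows. The zone names, weight threshold, and speed values are illustrative assumptions, not values from the patent:

```python
STOP = 0.0  # a speed of zero effects a stop command

# (detection zone, load is heavy?) -> maximum allowable speed, e.g. in km/h.
# All entries are hypothetical; the patent leaves values to a look-up table
# or a predetermined formula.
SPEED_TABLE = {
    ("first", False): STOP,   # nearest zone: stop regardless of load
    ("first", True): STOP,
    ("second", False): 2.5,
    ("second", True): 1.5,    # heavier loads need longer stopping distances
    ("third", False): 4.0,
    ("third", True): 3.0,
}

HEAVY_LOAD_KG = 500.0  # assumed threshold for the load-weight input

def max_allowable_speed(zone, load_kg):
    """Return the maximum allowable speed signal for the truck,
    or 0.0 (STOP) to effect a stop command."""
    return SPEED_TABLE[(zone, load_kg >= HEAVY_LOAD_KG)]
```

A real-time formula could replace the table without changing the caller, which is one reason the patent presents both options.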
- first and second obstacle sensors 76A and 76B are spaced apart from one another along a longitudinal axis LA of the truck 10, see Fig. 8.
- the first obstacle sensors 76A are positioned at the front 10A of the truck 10 and are capable of sensing objects located in, for example, the first, second and/or third detection zones. To ensure that objects O located in the non-detect zone DZ, which may be inherent in the first obstacle sensors 76A, are nonetheless sensed, the second obstacle sensors 76B are positioned on the truck 10 a spaced distance behind the first sensors 76A, i.e., in a direction away from the front 10A of the truck 10, as best illustrated in Fig. 8. In this regard, the second sensors 76B function at least to sense objects in the dead zone DZ shown in Fig. 7.
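- One way the two sensor rows might be fused can be sketched with distance-only detections. The function name, coordinate convention, and zone limits below are illustrative assumptions, not details from the patent:

```python
def objects_in_detection_zones(front_hits, rear_hits, dead_zone):
    """Combine detections from the front sensors (76A) and the
    rearward-mounted sensors (76B).

    front_hits, rear_hits: object distances ahead of the truck front, meters.
    dead_zone: (near, far) range that the front sensors cannot see.
    Geometry is an illustrative assumption.
    """
    near, far = dead_zone
    combined = set(front_hits)
    # The second sensors 76B, mounted a spaced distance behind the first,
    # contribute the objects the front sensors miss in the dead zone.
    combined.update(d for d in rear_hits if near <= d <= far)
    return sorted(combined)
```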
- a left steer compensation maneuver is implemented at 158.
- the left steer compensation maneuver at 158 may comprise, for example, implementing a counter steer to adjust the travel direction of the truck 10 to an appropriate heading.
- the left steer compensation maneuver may comprise steering the truck 10 at a selected or otherwise determined angle for a distance that is a percentage of the previously accumulated travel distance.
- the left steer angle utilized for the left steer compensation maneuver may be fixed or variable, and may be the same as, or different from the steer angle utilized to implement the right steer correction at 156.
- a steer correction routine is implemented at 160 that includes computing a steer angle correction to steer the truck 10 to the left according to a second set of parameters.
- a steer left correction implemented at 160 may include steering the truck 10 to the left at a left steer angle.
- the left steer correction maneuver at 160 may be implemented in a manner analogous to that described above at 156, except that the correction is to the right at 156 and to the left at 160.
- a right steer compensation maneuver is implemented at 162.
- the right steer compensation maneuver at 162 may comprise, for example, implementing a counter steer to adjust the travel direction of the truck 10 to an appropriate heading in a manner analogous to that described at 158, except that the steer compensation maneuver at 158 is to the left and the steer compensation maneuver at 162 is to the right.
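- The correction/compensation sequence described in the preceding items can be sketched as a counter-steer applied for a fraction of the previously accumulated travel distance. The default angle and percentage are illustrative assumptions; the patent allows them to be fixed or variable:

```python
def steer_compensation(accumulated_distance, direction,
                       steer_angle_deg=8.0, percentage=0.5):
    """Counter-steer opposite a prior steer correction.

    accumulated_distance: travel distance accumulated during the correction.
    direction: +1 steers right (as at 162), -1 steers left (as at 158).
    Returns (steer angle in degrees, distance over which to hold it).
    The angle and percentage defaults are hypothetical values.
    """
    compensation_distance = percentage * accumulated_distance
    return direction * steer_angle_deg, compensation_distance

# After a right steer correction that accumulated 2.0 m of travel,
# the left steer compensation maneuver counter-steers to the left:
angle, dist = steer_compensation(2.0, direction=-1)
```

As the text notes, the compensation angle need not equal the correction angle; the two parameters are independent here for that reason.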
- a user and/or service representative may be able to customize the parameters of the steer angle correction algorithm.
- a service representative may have access to programming tools to load customized variables, e.g., in the controller 103, for implementing steer correction.
- a truck operator may have controls that allow the operator to input customized parameters into the controller, e.g., via potentiometers, encoders, a software user interface, etc.
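- A set of customizable variables of the sort described above might be represented as a simple structure loaded into the controller 103. All field names and default values here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class SteerCorrectionParams:
    """Illustrative operator/service-customizable steer correction variables.

    These could be set via potentiometers, encoders, or a software user
    interface, as the text suggests; names and defaults are assumptions.
    """
    right_steer_angle_deg: float = 8.0
    left_steer_angle_deg: float = 8.0
    compensation_percentage: float = 0.5  # fraction of accumulated distance
    max_correction_speed: float = 3.0     # e.g. km/h while correcting

# A service representative loads a customized value:
params = SteerCorrectionParams(compensation_percentage=0.6)
```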
- the steer correction described more fully herein will steer the truck 10 away from the structure.
- the truck 10 will eventually travel towards the structure until the steer correction again repositions the truck 10.
- the steer compensation e.g., 158 in Fig. 10 , could be made to deliberately overcompensate, thus maintaining the truck 10 adjacent to the structure.
- First and second objects 272, 274 are illustrated in the environment 200 in Figs. 15A-15C . These objects 272, 274 are detected by the obstacle sensor 76 during operation, and the obstacle sensor 76 sends sensor data to the controller 103 about the objects 272, 274. The controller 103 uses the sensor data to assign the objects 272, 274 to buckets 220 defined within the scanned zone 202 based on the sensor data from the obstacle sensor 76. Once the objects 272, 274 exit the scanned zone 202 and enter the history zone 204, the objects 272, 274 are assigned to the buckets 222 in the history zone 204.
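- The bucket assignment and hand-off from the scanned zone 202 to the history zone 204 can be sketched in a simplified, distance-only form. The class name, the dead-reckoning update, and the zone depths below are assumptions:

```python
class ObjectTracker:
    """Minimal sketch of the scanned-zone / history-zone tracking above.

    Objects are keyed by an id and tracked as a distance ahead of the
    truck; negative distances place them in the history zone alongside
    or behind the truck. Geometry is an illustrative assumption.
    """
    def __init__(self, scan_depth, history_depth):
        self.scan_depth = scan_depth        # scanned zone reach ahead
        self.history_depth = history_depth  # history zone extent behind
        self.tracked = {}                   # object id -> distance ahead

    def update_scan(self, detections):
        # Assign objects detected by the obstacle sensor to scanned-zone
        # buckets (here simplified to a distance per object).
        for obj_id, distance_ahead in detections:
            self.tracked[obj_id] = distance_ahead

    def advance(self, travel_distance):
        # As the truck moves, tracked objects drift into the history zone;
        # drop them once they leave the environment entirely.
        for obj_id in list(self.tracked):
            self.tracked[obj_id] -= travel_distance
            if self.tracked[obj_id] < -self.history_depth:
                del self.tracked[obj_id]

    def in_history_zone(self, obj_id):
        distance = self.tracked.get(obj_id)
        return distance is not None and distance < 0
```

The key property matching the claims is that an object detected once in the scanned zone remains tracked through the unscanned history zone until it exits the environment.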
- the left and right stop zones 300, 302 are located to the front of and immediately to the sides of the truck 10. If an object is detected in either of the stop zones 300, 302, the controller 103 will initiate a brake operation to cause the truck 10 to stop.
- the truck 10 may be traveling in response to receiving a remote wireless travel request, i.e., from a wireless transmitter, as discussed in detail herein.
- the truck 10 may be coasting to a stop or may be driven manually by a rider or a walker who is walking alongside the truck 10.
- the controller 103 implements a steer maneuver to achieve a straight heading of the truck 10 in the axial direction, i.e., parallel to the central axis CA, so as to maintain the edge portion 276A of the selected object 276 on the right hug line 314A.
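- The stop-zone check and hug-line steer maneuver described above can be sketched as follows. The coordinate convention and zone geometry are illustrative assumptions, not the patent's definitions:

```python
def control_action(objects, stop_zone, hug_line_x):
    """Decide between a brake operation and a hug-line steer maneuver.

    objects: list of (x, y) points in truck coordinates, x lateral
             (positive to the right), y distance ahead.
    stop_zone: (x_min, x_max, y_min, y_max) rectangle; geometry assumed.
    hug_line_x: lateral position of the hug line to maintain.
    Returns ("stop", None) or ("steer", lateral_error), where a positive
    error means the tracked edge sits right of the hug line.
    """
    x_min, x_max, y_min, y_max = stop_zone
    for x, y in objects:
        if x_min <= x <= x_max and y_min <= y <= y_max:
            return ("stop", None)  # object in a stop zone: brake to a stop
    # Otherwise steer so the nearest object's edge stays on the hug line.
    nearest = min(objects, key=lambda p: p[1], default=None)
    if nearest is None:
        return ("steer", 0.0)      # nothing tracked: hold a straight heading
    return ("steer", nearest[0] - hug_line_x)
```

A controller loop would feed the lateral error into the steer angle command, producing the straight, axis-parallel heading the text describes.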
Description
- The present invention relates in general to materials handling vehicles, and more particularly, to object tracking and steer correction schemes for materials handling vehicles, such as remotely operated low level order picking trucks.
- Low level order picking trucks are commonly used for picking stock in warehouses and distribution centers. Such order picking trucks typically include load carrying forks and a power unit having a platform upon which an operator may step and ride while controlling the truck. The power unit also has a steerable wheel and corresponding traction and steering control mechanisms, e.g., a movable steering arm that is coupled to the steerable wheel. A control handle attached to the steering arm typically includes the operational controls necessary for driving the truck and operating its load handling features.
- In a typical stock picking operation, an operator fills orders from available stock items that are located in storage areas provided along a plurality of aisles of a warehouse or distribution center. In this regard, the operator drives a low level order picking truck to a first location where item(s) are to be picked. In a pick process, the operator typically steps off the order picking truck, walks over to the appropriate location and retrieves the ordered stock item(s) from their associated storage area(s). The operator then returns to the order picking truck and places the picked stock on a pallet, collection cage or other support structure carried by the truck forks. Upon completing the pick process, the operator advances the order picking truck to the next location where item(s) are to be picked. The above process is repeated until all stock items on the order have been picked.
- It is not uncommon for an operator to repeat the pick process several hundred times per order. Moreover, the operator may be required to pick numerous orders per shift. As such, the operator may be required to spend a considerable amount of time relocating and repositioning the order picking truck, which reduces the time available for the operator to spend picking stock.
- WO 2010/065864 A2 and US 2010/114405 A1 relate to a supplemental control system for a materials handling vehicle comprising one or more sensors capable of defining multiple contactless detection zones, at least towards the front of the forward travel direction of a remotely controlled vehicle. The vehicle responds to the detection of objects within the designated zones based upon predetermined actions, such as slowing down or stopping the vehicle, and/or taking other action, such as performing a steer angle correction. -
WO 2011/002478 A2 relates to a finger-mounted remote control device capable of wirelessly transmitting a travel request signal to a materials handling vehicle. The remotely controlled vehicle may include one or more obstacle sensors operable to define at least one detection zone covering an area at least partially in front of the forward traveling vehicle.
- In accordance with an aspect of the present invention, a materials handling vehicle having detection zone control is provided, the materials handling vehicle comprising: a power unit for driving the vehicle; at least one contactless obstacle sensor on the vehicle that is operable to scan a scanned zone within an environment proximate to the vehicle, characterized in that the environment comprises the scanned zone and a history zone, the history zone not being actively scanned by the at least one obstacle sensor, but objects that are detected by the at least one obstacle sensor in the scanned zone are capable of being tracked as they pass through the history zone during movement of the vehicle; and a controller configured to receive information obtained from the at least one obstacle sensor and to define at least two zones within the environment based on the received information, the at least two zones comprising at least one stop zone and at least one steer away zone, wherein the controller: performs a stop action to bring the vehicle to a stop if an object is detected in the at least one stop zone; performs a steer maneuver to steer the vehicle away from an object detected in the at least one steer away zone; and tracks objects determined to be in the history zone until such objects are no longer in the environment.
- In accordance with a second aspect of the present invention, a multiple detection zone control system for a materials handling vehicle is provided, the system comprising: at least one contactless obstacle sensor on the vehicle that is operable to scan a scanned zone within an environment proximate to the vehicle, characterized in that the environment comprises the scanned zone and a history zone, the history zone not being actively scanned by the at least one obstacle sensor, but objects that are detected by the at least one obstacle sensor in the scanned zone are capable of being tracked as they pass through the history zone during movement of the vehicle; and a controller configured to receive information obtained from the at least one obstacle sensor and to define at least two zones within the environment based on the received information, the at least two zones comprising at least one stop zone and at least one steer away zone, wherein the controller: performs a stop action to bring the vehicle to a stop if an object is detected in the at least one stop zone; performs a steer maneuver to steer the vehicle away from an object detected in the at least one steer away zone; and tracks objects determined to be in the history zone until such objects are no longer in the environment.
- Fig. 1 is an illustration of a materials handling vehicle capable of remote wireless operation according to various embodiments of the present invention;
- Fig. 2 is a schematic diagram of several components of a materials handling vehicle capable of remote wireless operation according to various embodiments of the present invention;
- Fig. 3 is a schematic diagram illustrating detection zones of a materials handling vehicle according to various embodiments of the present invention;
- Fig. 4 is a schematic diagram illustrating an exemplary approach for detecting an object according to various embodiments of the present invention;
- Fig. 5 is a schematic diagram illustrating a plurality of detection zones of a materials handling vehicle according to further embodiments of the present invention;
- Fig. 6 is an illustration of a materials handling vehicle having spaced-apart obstacle detectors according to various embodiments of the present invention;
- Fig. 7 is an illustration of a materials handling vehicle having obstacle detectors according to further embodiments of the present invention;
- Fig. 8 is an illustration of a materials handling vehicle having obstacle detectors according to still further embodiments of the present invention;
- Fig. 9 is a schematic block diagram of a control system of a materials handling vehicle that is coupled to sensors for detecting objects in the travel path of the vehicle according to various embodiments of the present invention;
- Fig. 10 is a flow chart of a method of implementing steer correction according to various embodiments of the present invention;
- Fig. 11 is a schematic illustration of a materials handling vehicle traveling down a narrow warehouse aisle under remote wireless operation, which is automatically implementing a steer correction maneuver according to various embodiments of the present invention;
- Fig. 12 is a graph illustrating an exemplary speed of a materials handling vehicle implementing a steer correction maneuver under remote wireless operation according to various embodiments of the present invention;
- Fig. 13 is a graph illustrating exemplary steer bumper input data to a controller, which illustrates whether an object is sensed in the left or right steer bumper zones, according to various embodiments of the present invention;
- Fig. 14 is a graph illustrating, in degrees, an exemplary steer correction maneuver applied to a materials handling vehicle under remote wireless operation according to various embodiments of the present invention;
- Figs. 15A-15C are schematic illustrations of an exemplary environment used in connection with object tracking in a materials handling vehicle traveling under remote wireless operation according to various embodiments of the present invention;
- Figs. 16A-16C are schematic illustrations of exemplary zones used for implementing steer maneuvers in a materials handling vehicle traveling under remote wireless operation according to various embodiments of the present invention; and
- Figs. 17A-17C are schematic illustrations of a materials handling vehicle traveling down a warehouse aisle under remote wireless operation, which is automatically implementing steer maneuvers according to various embodiments of the present invention.
- In the following detailed description of the illustrated embodiments, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration, and not by way of limitation, specific embodiments in which the invention may be practiced. It is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of various embodiments of the present invention.
- Referring now to the drawings, and particularly to
Fig. 1, a materials handling vehicle, which is illustrated as a low level order picking truck 10, includes in general a load handling assembly 12 that extends from a power unit 14. The load handling assembly 12 includes a pair of forks 16, each fork 16 having a load supporting wheel assembly 18. The load handling assembly 12 may include other load handling features in addition to, or in lieu of, the illustrated arrangement of the forks 16, such as a load backrest, scissors-type elevating forks, outriggers or separate height adjustable forks. Still further, the load handling assembly 12 may include load handling features such as a mast, a load platform, collection cage or other support structure carried by the forks 16 or otherwise provided for handling a load supported and carried by the truck 10. - The illustrated
power unit 14 comprises a step-through operator's station dividing a first end section of the power unit 14 (opposite the forks 16) from a second end section (proximate the forks 16). The step-through operator's station provides a platform upon which an operator may stand to drive the truck 10 and/or to provide a position from which the operator may operate the various included features of the truck 10. -
Presence sensors 58 may be provided to detect the presence of an operator on the truck 10. For example, presence sensors 58 may be located on, above or under the platform floor, or otherwise provided about the operator's station. In the exemplary truck of Fig. 1, the presence sensors 58 are shown in dashed lines, indicating that they are positioned under the platform floor. Under this arrangement, the presence sensors 58 may comprise load sensors, switches, etc. As an alternative, the presence sensors 58 may be implemented above the platform floor, such as by using ultrasonic, capacitive or other suitable sensing technology. The utilization of presence sensors 58 will be described in greater detail herein. - An
antenna 66 extends vertically from the power unit 14 and is provided for receiving control signals from a corresponding wireless remote control device 70. The remote control device 70 may comprise a transmitter that is worn or otherwise maintained by the operator. The remote control device 70 is manually operable by an operator, e.g., by pressing a button or other control, to cause the remote control device 70 to wirelessly transmit at least a first type signal designating a travel request to the truck 10. The travel request is a command that requests the corresponding truck 10 to travel by a predetermined amount, as will be described in greater detail herein. - The
truck 10 also comprises one or more obstacle sensors 76, which are provided about the truck 10, e.g., towards the first end section of the power unit 14 and/or to the sides of the power unit 14. The obstacle sensors 76 include at least one contactless obstacle sensor on the truck 10, and are operable to define at least one detection zone. For example, at least one detection zone may define an area at least partially in front of a forward traveling direction of the truck 10 when the truck 10 is traveling in response to a wirelessly received travel request from the remote control device 70, as will also be described in greater detail herein. - The
obstacle sensors 76 may comprise any suitable proximity detection technology, such as ultrasonic sensors, optical recognition devices, infrared sensors, laser scanner sensors, etc., which are capable of detecting the presence of objects/obstacles or are capable of generating signals that can be analyzed to detect the presence of objects/obstacles within the predefined detection zone(s) of the power unit 14. - In practice, the
truck 10 may be implemented in other formats, styles and features, such as an end control pallet truck that includes a steering tiller arm coupled to a tiller handle for steering the truck. Similarly, although the remote control device 70 is illustrated as a glove-like structure 70, numerous implementations of the remote control device 70 may be provided, including, for example, finger worn, lanyard or sash mounted, etc. Still further, the truck, remote control system and/or components thereof, including the remote control device 70, may comprise any additional and/or alternative features or implementations, examples of which are disclosed in U.S. Provisional Patent Application Serial No. 60/825,688, filed September 14, 2006; U.S. Patent Application Serial No. 11/855,310, filed September 14, 2007; U.S. Patent Application Serial No. 11/855,324, filed September 14, 2007; U.S. Provisional Patent Application Serial No. 61/222,632, filed July 2, 2009; U.S. Patent Application Serial No. 12/631,007, filed December 4, 2009; U.S. Provisional Patent Application Serial No. 61/119,952, filed December 4, 2008; and U.S. Patent No. 7,017,689, issued March 28, 2006, entitled "ELECTRICAL STEERING ASSIST FOR MATERIAL HANDLING VEHICLE". - Referring to
Fig. 2, a block diagram illustrates a control arrangement for integrating remote control commands with the truck 10. The antenna 66 is coupled to a receiver 102 for receiving commands issued by the remote control device 70. The receiver 102 passes the received control signals to a controller 103, which implements the appropriate response to the received commands and may thus also be referred to herein as a master controller. In this regard, the controller 103 is implemented in hardware and may also execute software (including firmware, resident software, micro-code, etc.). Furthermore, embodiments of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon. For example, the truck 10 may include memory that stores the computer program product, which, when implemented by a processor of the controller 103, implements steer correction as described more fully herein. - Thus, the
controller 103 may define, at least in part, a data processing system suitable for storing and/or executing program code and may include at least one processor coupled directly or indirectly to memory elements, e.g., through a system bus or other suitable connection. The memory elements can include local memory employed during actual execution of the program code, memory that is integrated into a microcontroller or application specific integrated circuit (ASIC), a programmable gate array or other reconfigurable processing device, etc. - The response implemented by the
controller 103 in response to wirelessly received commands, e.g., via the wireless transmitter 70 and corresponding antenna 66 and receiver 102, may comprise one or more actions, or inaction, depending upon the logic that is being implemented. Positive actions may comprise controlling, adjusting or otherwise affecting one or more components of the truck 10. The controller 103 may also receive information from other inputs 104, e.g., from sources such as the presence sensors 58, the obstacle sensors 76, switches, load sensors, encoders and other devices/features available to the truck 10, to determine appropriate action in response to the received commands from the remote control device 70. The sensors 58, 76, etc. may be coupled to the controller 103 via the inputs 104 or via a suitable truck network, such as a control area network (CAN) bus 110. - In an exemplary arrangement, the
remote control device 70 is operative to wirelessly transmit a control signal that represents a first type signal such as a travel command to thereceiver 102 on thetruck 10. The travel command is also referred to herein as a "travel signal", "travel request" or "go signal". The travel request is used to initiate a request to thetruck 10 to travel by a predetermined amount, e.g., to cause thetruck 10 to advance or jog in a first direction by a limited travel distance. The first direction may be defined, for example, by movement of thetruck 10 in apower unit 14 first, i.e.,forks 16 to the back, direction. However, other directions of travel may alternatively be defined. Moreover, thetruck 10 may be controlled to travel in a generally straight direction or along a previously determined heading. Correspondingly, the limited travel distance may be specified by an approximate travel distance, travel time or other measure. - Thus, a first type signal received by the
receiver 102 is communicated to thecontroller 103. If thecontroller 103 determines that the travel signal is a valid travel signal and that the current vehicle conditions are appropriate (explained in greater detail below), thecontroller 103 sends a signal to the appropriate control configuration of theparticular truck 10 to advance and then stop thetruck 10. Stopping thetruck 10 may be implemented, for example, by either allowing thetruck 10 to coast to a stop or by initiating a brake operation to cause thetruck 10 to brake to a stop. - As an example, the
controller 103 may be communicably coupled to a traction control system, illustrated as atraction motor controller 106 of thetruck 10. Thetraction motor controller 106 is coupled to atraction motor 107 that drives at least one steeredwheel 108 of thetruck 10. Thecontroller 103 may communicate with thetraction motor controller 106 so as to accelerate, decelerate, adjust and/or otherwise limit the speed of thetruck 10 in response to receiving a travel request from theremote control device 70. Thecontroller 103 may also be communicably coupled to asteer controller 112, which is coupled to asteer motor 114 that steers at least one steeredwheel 108 of thetruck 10. In this regard, thetruck 10 may be controlled by thecontroller 103 to travel an intended path or maintain an intended heading in response to receiving a travel request from theremote control device 70. - As yet another illustrative example, the
controller 103 may be communicably coupled to abrake controller 116 that controlstruck brakes 117 to decelerate, stop or otherwise control the speed of thetruck 10 in response to receiving a travel request from theremote control device 70. Still further, thecontroller 103 may be communicably coupled to other vehicle features, such asmain contactors 118, and/orother outputs 119 associated with thetruck 10, where applicable, to implement desired actions in response to implementing remote travel functionality. - According to various embodiments of the present invention, the
controller 103 may communicate with the receiver 102 and with the traction controller 106 to operate the truck 10 under remote control in response to receiving travel commands from the associated remote control device 70. Moreover, the controller 103 may be configured to perform a first action if the truck 10 is traveling under remote control in response to a travel request and an obstacle is detected in a first one of previously defined detection zone(s). The controller 103 may be further configured to perform a second action different from the first action if the truck 10 is traveling under remote control in response to a travel request and an obstacle is detected in a second one of the detection zones. In this regard, when a travel signal is received by the controller 103 from the remote control device 70, any number of factors may be considered by the controller 103 to determine whether the received travel signal should be acted upon to initiate and/or sustain movement of the truck 10. - Correspondingly, if the
truck 10 is moving in response to a command received by remote wireless control, thecontroller 103 may dynamically alter, control, adjust or otherwise affect the remote control operation, e.g., by stopping thetruck 10, changing the steer angle of thetruck 10, or taking other actions. Thus, the particular vehicle features, the state/condition of one or more vehicle features, vehicle environment, etc., may influence the manner in whichcontroller 103 responds to travel requests from theremote control device 70. - The
controller 103 may refuse to acknowledge a received travel request depending upon predetermined condition(s), e.g., that relate to environmental and/or operational factor(s). For example, the controller 103 may disregard an otherwise valid travel request based upon information obtained from one or more of the sensors. As an illustration, according to various embodiments of the present invention, the controller 103 may optionally consider factors such as whether an operator is on the truck 10 when determining whether to respond to a travel command from the remote control device 70. As noted above, the truck 10 may comprise at least one presence sensor 58 for detecting whether an operator is positioned on the truck 10. In this regard, the controller 103 may be further configured to respond to a travel request to operate the truck 10 under remote control when the presence sensor(s) 58 designate that no operator is on the truck 10. Thus, in this implementation, the truck 10 cannot be operated in response to wireless commands from the transmitter unless the operator is physically off of the truck 10. Similarly, if the object sensors 76 detect that an object, including the operator, is adjacent and/or proximate to the truck 10, the controller 103 may refuse to acknowledge a travel request from the transmitter 70. Thus, in an exemplary implementation, an operator must be located within a limited range of the truck 10, e.g., close enough to the truck 10 to be in wireless communication range (which may be limited to set a maximum distance of the operator from the truck 10). Other arrangements may alternatively be implemented. - Any other number of reasonable conditions, factors, parameters or other considerations may also/alternatively be implemented by the
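The refusal conditions described above amount to a simple gate on incoming travel requests. The following is a minimal sketch, not the patented implementation; the function name and boolean inputs are hypothetical stand-ins for the signals the controller 103 would derive from the presence sensor(s) 58 and the object sensors 76.

```python
def may_accept_travel_request(valid_signal, operator_on_truck, object_adjacent):
    """Sketch of the controller's gate for wirelessly received travel requests.

    An otherwise valid request is refused if the presence sensors report an
    operator on the truck, or if the obstacle sensors report an object
    adjacent/proximate to the truck.
    """
    if not valid_signal:
        return False          # not a valid travel signal at all
    if operator_on_truck:     # presence sensor(s) report a rider on board
        return False
    if object_adjacent:       # object sensors report something too close
        return False
    return True
```

In this sketch the gate is purely combinational; a real controller would also consider the vehicle state factors discussed above.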
controller 103 to interpret and take action in response to received signals from the transmitter. Other exemplary factors are set out in greater detail in U.S. Provisional Patent Application Serial No. 60/825,688, entitled "SYSTEMS AND METHODS OF REMOTELY CONTROLLING A MATERIALS HANDLING VEHICLE;" U.S. Patent Application Serial No. 11/855,310, entitled "SYSTEMS AND METHODS OF REMOTELY CONTROLLING A MATERIALS HANDLING VEHICLE;" U.S. Patent Application Serial No. 11/855,324, entitled "SYSTEMS AND METHODS OF REMOTELY CONTROLLING A MATERIALS HANDLING VEHICLE;" U.S. Provisional Patent Application Serial No. 61/222,632; U.S. Patent Application Serial No. 12/631,007, entitled "MULTIPLE ZONE SENSING FOR MATERIALS HANDLING VEHICLES;" and U.S. Provisional Patent Application Serial No. 61/119,952. - Upon acknowledgement of a travel request, the
controller 103 interacts with thetraction motor controller 106, e.g., directly or indirectly, e.g., via a bus such as theCAN bus 110 if utilized, to advance thetruck 10 by a limited amount. Depending upon the particular implementation, thecontroller 103 may interact with thetraction motor controller 106 and optionally, thesteer controller 112, to advance thetruck 10 by a predetermined distance. Alternatively, thecontroller 103 may interact with thetraction motor controller 106 and optionally, thesteer controller 112, to advance thetruck 10 for a period of time in response to the detection and maintained actuation of a travel control on the remote 70. As yet another illustrative example, thetruck 10 may be configured to jog for as long as a travel control signal is received. Still further, thecontroller 103 may be configured to "time out" and stop the travel of thetruck 10 based upon a predetermined event, such as exceeding a predetermined time period or travel distance regardless of the detection of maintained actuation of a corresponding control on theremote control device 70. - The
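The jog-limiting behaviors just described, travel while a remote control is actuated plus a "time out" on elapsed time or distance, can be sketched as a single predicate. This is an illustrative sketch only; the function name and the limit values are assumptions, not figures from the patent.

```python
def jog_should_continue(control_held, elapsed_s, traveled_m,
                        max_time_s=10.0, max_distance_m=5.0):
    """Sketch of the jog 'time out' rule: travel continues only while the
    remote travel control remains actuated AND neither the predetermined
    time period nor the predetermined travel distance has been exceeded.
    The limit values here are illustrative placeholders.
    """
    if not control_held:          # travel control released on the remote
        return False
    if elapsed_s >= max_time_s:   # predetermined time period exceeded
        return False
    if traveled_m >= max_distance_m:  # predetermined travel distance exceeded
        return False
    return True
```

The predicate would be evaluated periodically; on the first False result the controller would command the truck to stop regardless of continued actuation of the remote control.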
remote control device 70 may also be operative to transmit a second type signal, such as a "stop signal", designating that thetruck 10 should brake and/or otherwise come to rest. The second type signal may also be implied, e.g., after implementing a "travel" command, e.g., after thetruck 10 has traveled a predetermined distance, traveled for a predetermined time, etc., under remote control in response to the travel command. If thecontroller 103 determines that a wirelessly received signal is a stop signal, thecontroller 103 sends a signal to thetraction controller 106, thebrake controller 116 and/or other truck component to bring thetruck 10 to a rest. As an alternative to a stop signal, the second type signal may comprise a "coast signal" or a "controlled deceleration signal" designating that thetruck 10 should coast, eventually slowing to rest. - The time that it takes to bring the
truck 10 to a complete rest may vary, depending for example, upon the intended application, the environmental conditions, the capabilities of theparticular truck 10, the load on thetruck 10 and other similar factors. For example, after completing an appropriate jog movement, it may be desirable to allow thetruck 10 to "coast" some distance before coming to rest so that thetruck 10 stops slowly. This may be achieved by utilizing regenerative braking to slow thetruck 10 to a stop. Alternatively, a braking operation may be applied after a predetermined delay time to allow a predetermined range of additional travel to thetruck 10 after the initiation of the stop operation. It may also be desirable to bring thetruck 10 to a relatively quicker stop, e.g., if an object is detected in the travel path of thetruck 10 or if an immediate stop is desired after a successful jog operation. For example, the controller may apply predetermined torque to the braking operation. Under such conditions, thecontroller 103 may instruct thebrake controller 116 to apply thebrakes 117 to stop thetruck 10. - Referring to
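The stopping choices described above, an immediate brake command when an object is detected in the travel path, a coast/regenerative-braking stop after a normal jog, or braking after a predetermined delay, can be sketched as a small selector. The mode names and decision inputs here are hypothetical, chosen only to mirror the prose.

```python
def select_stop_mode(obstacle_in_path, coast_desired_after_jog):
    """Sketch of choosing how to bring the truck to rest.

    An object in the travel path calls for immediate braking; otherwise the
    truck may coast to rest (e.g., via regenerative braking) or brake after
    a predetermined delay that allows a limited range of additional travel.
    """
    if obstacle_in_path:
        return "brake"              # immediate stop via the brake controller
    if coast_desired_after_jog:
        return "coast"              # regenerative braking, slow to rest
    return "brake_after_delay"      # limited extra travel, then brake
```

Which mode is appropriate would in practice also depend on load, environment, and truck capabilities as noted above.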
Fig. 3 , according to various embodiments of the present invention, one ormore obstacle sensors 76 are configured so as to collectively enable detection of objects/obstacles within multiple "detection zones". In this regard, thecontroller 103 may be configured to alter one or more operational parameters of thetruck 10 in response to detection of an obstacle in one or more of the detection zones as set out in greater detail herein. The control of thetruck 10 utilizing detection zones may be implemented when an operator is riding/driving thetruck 10. One or more detection zones may also be disabled or otherwise ignored by thecontroller 103 when an operator is riding on/driving thetruck 10, e.g., to allow the operator to navigate thetruck 10 in tight spaces. The control of thetruck 10 utilizing detection zones may also be integrated with supplemental remote control as set out and described more fully herein. - Although six
obstacle sensors 76 are shown for purposes of clarity of discussion herein, any number ofobstacle sensors 76 may be utilized. The number ofobstacle sensors 76 will likely vary, depending upon the technology utilized to implement the sensor, the size and/or range of the detection zones, the number of detection zones, and/or other factors. - In the illustrative example, a
first detection zone 78A is located proximate to the power unit 14 of the truck 10. A second detection zone 78B is defined adjacent to the first detection zone 78A and appears to generally circumscribe the first detection zone 78A. A third area is also conceptually defined as all area outside the first and second detection zones 78A, 78B. However, while the second detection zone 78B is illustrated as substantially circumscribing the first detection zone 78A, any other practical arrangement that defines the first and second detection zones 78A, 78B may alternatively be implemented. For example, all or certain portions of the detection zones 78A, 78B may intersect, overlap or be mutually exclusive. - Still further, the detection zones need not surround the
entire truck 10. Rather, the shape of the detection zones may be dependent upon the particular implementation as set out in greater detail herein. For example, if the detection zones are to be used for speed control while the truck 10 is moving without an operator riding thereon, under remote travel control in a power unit first (forks to the rear) orientation, then the detection zones may be oriented at least toward the direction of travel of the truck 10. However, the detection zones can also cover other areas, e.g., adjacent to the sides of the truck 10. - According to various embodiments of the present invention, the
first detection zone 78A may further designate a "stop zone". Correspondingly, thesecond detection zone 78B may further designate a "first speed zone". Under this arrangement, if an object, e.g., some form of obstacle is detected within thefirst detection zone 78A, and the materials handling vehicle, e.g.,truck 10, is traveling under remote control in response to a travel request, then thecontroller 103 may be configured to implement an action such as a "stop action" to bring thetruck 10 to a stop. In this regard, travel of thetruck 10 may continue once the obstacle is clear, or a second, subsequent travel request from theremote control device 70 may be required to restart travel of thetruck 10 once the obstacle is cleared. - If a travel request is received from the
remote control device 70 while the truck is at rest and an object is detected within thefirst detection zone 78A, then thecontroller 103 may refuse the travel request and keep the truck at rest until the obstacle is cleared out of the stop zone. - If an object/obstacle is detected within the
second detection zone 78B, and thematerials handling truck 10 is traveling under remote control in response to a travel request, then thecontroller 103 may be configured to implement a different action. For example, thecontroller 103 may implement a first speed reduction action to reduce the speed of thetruck 10 to a first predetermined speed, such as where thetruck 10 is traveling at a speed greater than the first predetermined speed. - Thus, assume the
truck 10 is traveling in response to implementing a travel request from the remote control device at a speed V2 as established by a set of operating conditions where the obstacle sensors 76 do not detect an obstacle in any detection zone. If the truck is initially at rest, the truck may be accelerated up to speed V2. The detection of an obstacle within the second detection zone 78B (but not the first detection zone 78A) may cause the truck 10, e.g., via the controller 103, to alter at least one operational parameter, e.g., to slow down the truck 10 to a first predetermined speed V1, which is slower than the speed V2. That is, V1 < V2. Once the obstacle is cleared from the second detection zone 78B, the truck 10 may resume its speed V2, or the truck 10 may maintain its speed V1 until the truck stops and the remote control device 70 initiates another travel request. Still further, if the detected object is subsequently detected within the first detection zone 78A, the truck 10 will be stopped as described more fully herein. - Assume as an illustrative example that the
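The two-zone behavior above can be sketched as follows, using the V1/V2 notation from this paragraph. The default speed values are the illustrative 1.5 mph and 2.5 mph figures used herein; the function name is a hypothetical stand-in for controller logic.

```python
def target_speed(zone1_occupied, zone2_occupied, v1=1.5, v2=2.5):
    """Sketch of the two-zone speed rule (speeds in mph, V1 < V2).

    An obstacle in the first (stop) zone brings the truck to a stop; an
    obstacle in the second (first speed) zone limits travel to V1;
    otherwise the truck may travel at V2.
    """
    if zone1_occupied:
        return 0.0   # stop action
    if zone2_occupied:
        return v1    # first speed reduction action
    return v2        # unobstructed remote travel speed
```

Note the precedence: the stop zone dominates even if the second zone is also occupied, matching the behavior described above.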
truck 10 is configured to travel at a speed of approximately 2.5 miles per hour (mph) (4 Kilometers per hour (Km/h)) for a limited, predetermined amount, if thetruck 10 is traveling without an operator onboard and is under remote wireless control in response to a travel request from a correspondingremote control 70, so long as no object is detected in a defined detection zone. If an obstacle is detected in thesecond detection zone 78B, then thecontroller 103 may adjust the speed of thetruck 10 to a speed of approximately 1.5 mph (2.4 Km/h) or some other speed less than 2.5 miles per hour (mph) (4 Kilometers per hour (Km/h)). If an obstacle is detected in thefirst detection zone 78A, then thecontroller 103 stops thetruck 10. - The above example assumes that the
truck 10 is traveling under remote wireless control in response to a valid signal received from thetransmitter 70. In this regard, theobstacle sensors 76 can be used to adjust the operating conditions of theunoccupied truck 10. However, theobstacle sensors 76 and corresponding controller logic may also be operative when thetruck 10 is being driven by an operator, e.g., riding on the platform or other suitable location of thetruck 10. Thus, according to various embodiments of the present invention, thecontroller 103 may stop thetruck 10 or refuse to allow thetruck 10 to move if an object is detected within thestop zone 78A regardless of whether the truck is being driven by an operator or operating automatically in response to receiving a corresponding wirelessly transmitted travel request. Correspondingly, depending upon the specific implementation, speed control/limiting capability of thecontroller 103, e.g., in response to detecting an object in thesecond detection zone 78B but not thefirst detection zone 78A, may be implemented regardless of whether thetruck 10 is traveling in response to receiving a corresponding wirelessly transmitted travel request, or whether an operator is riding on thetruck 10 while driving it. - However, according to various embodiments of the present invention and as noted briefly above, there may be situations where it is desirable to disable one or more of the detection zones when the
truck 10 is being driven by an operator. For example, it may be desirable to override/disable theobstacle sensors 76/controller logic while the operator is driving thetruck 10 regardless of external conditions. As a further example, it may be desirable to override/disable theobstacle sensors 76/controller logic while the operator is driving thetruck 10 to allow the operator to navigate thetruck 10 in tight quarters, e.g., to navigate tight spaces, travel around corners, etc., that might otherwise activate one or more of the detection zones. As such, the activation of the controller logic, e.g., within thecontroller 103 to utilize the detection of objects in the detection zones to help control thetruck 10 while thetruck 10 is occupied by an operator, according to various embodiments of the present invention, may be manually controlled, programmably controlled or otherwise selectively controlled. - Referring to
Fig. 4, according to further embodiments of the present invention, one or more of the obstacle sensors 76 may be implemented by ultrasonic technology or other suitable contactless technology capable of distance measurement and/or position determination. Thus, the distance to an object can be measured, and/or a determination may be made so as to ascertain whether the detected object is within a detection zone relative to the truck 10. As an example, an obstacle sensor 76 may be implemented by an ultrasonic sensor or transducer that provides a "ping" signal, such as a high frequency signal generated by a piezo element. The ultrasonic sensor 76 then rests and listens for a response. In this regard, time of flight information may be determined and utilized to define each zone. Thus, a controller, e.g., the controller 103 or a controller specifically associated with the obstacle sensors 76, may utilize software that looks at time of flight information to determine whether an object is within a detection zone. - According to further embodiments of the present invention,
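The time-of-flight approach just described can be sketched as a small conversion: an echo's round-trip time yields a distance, which is then compared against zone boundaries. The speed-of-sound constant and the zone boundary distances below are illustrative assumptions, not values from the patent.

```python
def zone_from_echo(time_of_flight_s, speed_of_sound_mps=343.0,
                   zone_limits_m=(1.0, 2.0)):
    """Sketch of mapping an ultrasonic echo time of flight to a detection zone.

    Distance is half the round-trip path.  zone_limits_m lists the outer
    boundary of each zone in ascending order (values illustrative).
    Returns the zone number (1 = nearest) or None if the object lies
    beyond all defined zones.
    """
    distance_m = time_of_flight_s * speed_of_sound_mps / 2.0
    for zone_number, limit_m in enumerate(zone_limits_m, start=1):
        if distance_m <= limit_m:
            return zone_number
    return None
```

A controller could call this per received echo and feed the resulting zone number into the speed/stop logic described herein.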
multiple obstacle sensors 76 can work together to obtain object sensing. For example, a first ultrasonic sensor may send out a ping signal. The first ultrasonic sensor and one or more additional ultrasonic sensors may then listen for a response. In this way, thecontroller 103 may use diversity in identifying the existence of an object within one or more of the detection zones. - With reference to
Fig. 5 , an implementation of multiple speed zone control is illustrated according to yet further embodiments of the present invention. As illustrated, three detection zones are provided. If an object such as an obstacle is detected in thefirst detection zone 78A and thetruck 10 is traveling in response to receiving a corresponding wirelessly transmitted travel request by thetransmitter 70, then a first action may be performed, e.g., thetruck 10 may be brought to a stop as described more fully herein. If an object such as an obstacle is detected in thesecond detection zone 78B and thetruck 10 is traveling in response to receiving a corresponding wirelessly transmitted travel request by thetransmitter 70, then a second action may be performed, e.g., the vehicle speed may be limited, reduced, etc. Thus, thesecond detection zone 78B may further designate a first speed zone. For example, the speed of thetruck 10 may be reduced and/or limited to a first relatively slow speed, e.g., approximately 1.5 mph (2.4 Km/h). - If an object such as an obstacle is detected in the
third detection zone 78C and thetruck 10 is traveling in response to receiving a corresponding wirelessly transmitted travel request by thetransmitter 70, then a third action may be performed, e.g., thetruck 10 may be reduced in speed or otherwise limited to a second speed, e.g., approximately 2.5 mph (4 Km/h). Thus, the third detection zone may further designate a second speed zone. If no obstacles are detected in the first, second andthird detection zones truck 10 may be remotely commanded to travel a limited amount, e.g., at a rate that is greater than the rate of speed when an obstacle is in the third detection zone, e.g., a speed of approximately 4 mph (6.2 Km/h). - As
Fig. 5 further illustrates, the detection zones may be defined by different patterns relative to the truck 10. Also, in Fig. 5, a seventh obstacle sensor 76 is utilized; however, any number of sensors may be provided, depending upon the technology utilized and/or the features to be implemented. By way of illustration and not by way of limitation, the seventh obstacle sensor 76 may be approximately centered, such as on the bumper or other suitable location on the truck 10. On an exemplary truck 10, the third zone 78C may extend approximately 6.5 feet (2 meters) forward of the power unit 14 of the truck 10. - According to various embodiments of the present invention, any number of detection zones of any shape may be implemented. For example, depending upon desired truck performance, many small zones may be defined at various coordinates relative to the
truck 10. Similarly, a few large detection zones may be defined based upon desired truck performance. As an illustrative example, a table may be set up in the memory of the controller. If travel speed while operating under remote travel control is an operational parameter of interest, then the table may associate travel speed with the detection zones defined by distance, range, position coordinates or some other measure. If thetruck 10 is traveling in response to receiving a corresponding wirelessly transmitted travel request by thetransmitter 70 and an obstacle sensor detects an object, then the distance to that detected object may be used as a "key" to look up a corresponding travel speed in the table. The travel speed retrieved from the table can be utilized by thecontroller 103 to adjust thetruck 10, e.g., to slow it down, etc. - The areas of each detection zone may be chosen, for example, based upon factors such as the desired speed of the truck when the
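The table lookup described above might be sketched as follows. The (distance, speed) entries and the default speed are illustrative placeholders, not values from the patent; the idea is only that the distance to the detected object serves as the "key" into a speed table stored in controller memory.

```python
def lookup_speed(distance_m, table=((1.0, 0.0), (2.0, 1.5), (3.0, 2.5))):
    """Sketch of the distance-keyed speed table.

    table holds (max_distance_m, speed_mph) pairs in ascending order of
    distance; the first band containing the detected object determines
    the allowed travel speed.  Beyond the last band, the normal remote
    travel speed applies (here an illustrative 4.0 mph).
    """
    for max_distance_m, speed_mph in table:
        if distance_m <= max_distance_m:
            return speed_mph
    return 4.0
```

Because the table is data rather than code, the zone boundaries and speeds could be reprogrammed (e.g., flash programmed) without changing the controller logic, consistent with the configurability discussed herein.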
truck 10 is traveling in response to a valid, received travel request from the remote control device 70, the required stopping distance, the anticipated load to be transported by the truck 10, whether a certain amount of coast is required for load stability, vehicle reaction time, etc. Moreover, factors such as the range of each desired detection zone may be considered to determine the number of obstacle sensors 76 required. In this regard, such information may be static, or dynamic, e.g., based upon operator experience, vehicle load, nature of the load, environmental conditions, etc. It is also contemplated that the controller 103 may generate a warning signal or alarm if an object or a person is detected in a detection zone. - As an illustrative example, in a configuration with multiple detection zones, e.g., three detection zones, as many as seven or more object detectors, e.g., ultrasonic sensors or laser sensors, may be used to provide a range of coverage desired by a corresponding application. In this regard, the detector(s) may be able to look ahead of the direction of travel of the
truck 10 by a sufficient distance to allow the appropriate response, e.g., to slow down. In this regard, at least one sensor may be capable of looking several meters forward in the direction of travel of thetruck 10. - According to various embodiments of the present invention, the multiple detection speed zones allows a relatively greater maximum forward travel speed while operating in response to wirelessly received travel commands. Such an arrangement may prevent unnecessarily early vehicle stops by providing one or more intermediate zones where the
truck 10 slows down before deciding to come to a complete stop. - According to further embodiments of the present invention, the utilization of multiple detection zones allows a system that rewards the corresponding operator for better alignment of the
truck 10 during pick operations. For example, an operator may position the truck 10 so as not to be aligned with a warehouse aisle. In this example, as the truck 10 is jogged forward, the second detection zone 78B may initially detect an obstacle such as a pick bin or warehouse rack. In response to detecting the rack, the truck 10 will slow down. If the rack is sensed in the first detection zone 78A, then the truck 10 will come to rest, even if the truck 10 has not jogged its entire programmed jog distance. Similar unnecessary slowdowns or stops may also occur in congested and/or messy aisles. - According to various embodiments of the present invention, the
truck 10 may shape speed and braking operation parameters based upon the information obtained from theobstacle sensors 76. Moreover, the logic implemented by thetruck 10 in response to the detection zones may be changed or varied depending upon a desired application. As a few illustrative examples, the boundaries of each zone in a multiple zone configuration may be programmably (and/or reprogrammably) entered in the controller, e.g., flash programmed. In view of the defined zones, one or more operational parameters may be associated with each zone. The established operational parameters may define a condition, e.g., maximum allowable travel speed, an action, e.g., brake, coast or otherwise come to a controlled stop, etc. The action may also be an avoidance action. For example, an action may comprise adjusting a steer angle or heading of thetruck 10 as will be described in greater detail herein. - In accordance with a further embodiment of the present invention, one or more obstacle sensors, such as the
obstacle sensors Figs. 6 and8 , may be employed to sense or detect objects within first, second and third detection zones in front of thetruck 10 when thetruck 10 is traveling in response to a travel request wirelessly received from thetransmitter 70. Thecontroller 103 or other sensor processing device may also generate an object-detected signal and optionally, a distance signal in response to sensing/detecting an object in front of thetruck 10. As an illustrative example, afurther input 104 into thecontroller 103 may be a weight signal generated by a load sensor LS, as illustrated inFigs. 7 and8 , which senses the combined weight of theforks 16 and any load on theforks 16. The load sensor LS is shown schematically inFigs. 7 and8 near theforks 16, but may be incorporated into a hydraulic system for effecting lift of theforks 16. By subtracting the weight of the forks 16 (a known constant value) from the combined weight defined by the weight signal, thecontroller 103 determines the weight of the load on the forks. Using sensed load weight and whether an object has been detected in one of the first, second and third detection zones as inputs into a lookup table or appropriate equations, thecontroller 103 generates an appropriate vehicle stop or maximum allowable speed signal. - Values defining the vehicle stop and maximum allowable speed signals may be experimentally determined and stored in a look-up table, computed in real time based upon a predetermined formula, etc. In the illustrated embodiment, the
controller 103 determines the weight of a load on the forks 16 and whether an obstacle has been detected in one of the first, second and third detection zones and, using a lookup table, effects a stop command or defines a maximum allowable speed for the truck 10 and generates a corresponding maximum allowable speed signal for the truck 10. - As an example, if no load is on the
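The two-input lookup just described, load weight plus the detection zone in which an object was sensed, yielding either a stop command or a maximum allowable speed, can be sketched as follows. The weight band and the speed values loosely echo the illustrative figures given herein and are assumptions, not a definitive calibration.

```python
def max_allowable_speed(load_weight_lb, zone):
    """Sketch of the load-weight/detection-zone lookup (speeds in mph).

    zone is 1 (nearest), 2, 3, or None when no object is detected in any
    zone.  An object in the first zone always produces a stop (0.0); for
    the other zones the allowed speed decreases as the load increases and
    as the object gets closer.  Band boundary and speeds are illustrative.
    """
    if zone == 1:
        return 0.0                    # stop signal for the first zone
    light = load_weight_lb <= 1500    # illustrative weight band boundary
    if zone == 2:
        return 2.0 if light else 1.0
    if zone == 3:
        return 3.0 if light else 2.0
    return 4.5 if light else 2.5      # no object detected in any zone
```

In practice the values would be experimentally determined per truck type and braking capability and stored as a table rather than hard-coded branches.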
forks 16 and no object is being detected by theobstacle sensors controller 103 allows thetruck 10 to be operated at any speed up to and including a maximum speed of 4.5 MPH. If no object is being detected in any one of the first, second and third detection zones, the maximum permitted speed of thetruck 10 may be configured for example, to decrease as the load on thetruck 10 increases. As an illustration, for a load weight of 8000 pounds, the maximum allowable speed of thetruck 10 may be 2.5 MPH. It is noted that, in some locations the maximum allowable speed of thetruck 10, if unoccupied by a rider, may be set at a predetermined upper limit, e.g., 3.5 MPH. Hence, the maximum speed of the vehicle, if unoccupied by a rider, may be set, e.g., by thecontroller 103, at this maximum allowable speed. - For any load weight on the
forks 16, if an object is detected in the first detection zone, thecontroller 103 generates a "stop signal," designating that thetruck 10 come to a substantially immediate stop. For any given load weight, the maximum allowable speed of thetruck 10 is progressively greater the further the object is from thetruck 10. Also for any given load weight, the maximum allowable speed of thetruck 10 is less if an object is detected in the second detection zone as compared to when an object is detected in the third detection zone. The maximum allowable vehicle speeds for the second and third detection zones are defined for each load weight so that the speed of thetruck 10 can be reduced in a controlled manner as thetruck 10 continues to move towards the object so that thetruck 10 can eventually be safely brought to a stop prior to the truck reaching the point where the object is located. These speeds may be determined experimentally, based upon formulas or a combination thereof, and can vary based on vehicle type, size and truck braking capabilities. - As an illustrative example, assume that the load weight on the
forks 16 is 1500 pounds and three detection zones are provided, including a first detection zone nearest the truck, followed by a second detection zone and a third detection zone furthest from the truck. If a sensed object is located at a distance within the third detection zone, then the maximum allowable vehicle speed may be set to a speed such as 3 MPH. Hence, if thetruck 10 is traveling at a speed greater than 3 MPH when the object is detected, thecontroller 103 effects a speed reduction so that the vehicle speed is reduced to 3.0 MPH. - If the load weight on the
truck 10 remains equal to 1500 pounds, and if a sensed object is located at a distance from thetruck 10 within the second detection zone, then the maximum allowable vehicle speed may be, for example, 2 MPH. Hence, if thetruck 10 is traveling at a speed greater than 2 MPH when the object is detected in the second detection zone, thecontroller 103 effects a speed reduction so that the vehicle speed is reduced to 2 MPH. - Keeping with the above example, if the load weight on the
truck 10 equals 1,500 pounds and an object is sensed in the first detection zone, then a stop signal may be generated by the controller 103 to effect stopping of the truck 10. - The obstacle sensors may comprise ultrasonic transducers. Ultrasonic transducers are known to experience a phenomenon known as transducer "ring down." Essentially, "ring down" is the tendency of a transducer to continue to vibrate and transmit ultrasonic signals after the control signal that is used for initiating a transmitted signal has ceased. This "ring down" signal decreases in magnitude rather rapidly, but while it remains above the reference level associated with the listening sensor it cannot be reliably distinguished from a true return signal. As a result, a sensor may mistake a true return for a "ring down" signal and thus fail to identify an object in a corresponding detection zone. A common technique to avoid this problem is to blank out all return signals generated by the obstacle sensors for a preselected period of time after initiation of a transmission. The preselected time is determined based on various factors including the type of transducer that is used, but during this preselected time no valid returns can be sensed. If the obstacle sensors are positioned near a front 10A of the
truck 10, see obstacle sensors 76A in Fig. 7, and if the blanking technique is used, this results in a "dead" or "non-detect" zone DZ existing immediately in front of the truck 10. Hence, if an object O is very near the front of the truck 10, e.g., 10 mm or less, and the obstacle sensors 76A are positioned at the front of the truck 10, see Fig. 7, then the object O may not be detected. - In the embodiment illustrated in
Figs. 6 and 8, first and second obstacle sensors 76A, 76B are provided on the truck 10, see Fig. 8. The first obstacle sensors 76A are positioned at the front 10A of the truck 10 and are capable of sensing objects located in, for example, the first, second and/or third detection zones. So as to ensure that objects O located in the non-detect zone DZ, which may be inherent to the first obstacle sensors 76A, are detected, the second obstacle sensors 76B are positioned on the truck 10 a spaced distance behind the first sensors 76A, i.e., in a direction away from the front 10A of the truck 10, as best illustrated in Fig. 8. In this regard, the second sensors 76B function at least to sense objects in the dead zone DZ in Fig. 7. - When a
truck 10 is traveling in response to receiving a corresponding wirelessly transmitted travel request by the transmitter 70, e.g., while no person is riding on the truck 10 as described more fully herein, it is possible for the truck 10 to encounter obstacles that do not require the truck 10 to come to rest. Rather, a steer correction maneuver may be performed such that the truck 10 can continue to jog forward by the appropriate limited amount without requiring operator intervention. - According to embodiments of the present invention, steer correction allows the
truck 10 to automatically steer away from objects that are sensed to be in the general area of the front of the truck 10. This steer correction capability allows, for example, the truck 10, which may be traveling in response to a wirelessly received travel request from the transmitter 70, to stay generally in the center of an aisle in a warehouse environment as the truck 10 travels down the aisle. For example, it is possible that the truck 10 might have some drift in its steer angle because of steer calibration, floor crown, or any number of external factors. However, according to various embodiments of the present invention, a truck 10 traveling in response to receiving a corresponding wirelessly transmitted travel request by the transmitter 70 may implement steer corrections, e.g., to stay away from or otherwise avoid walls and racks, other trucks, persons, boxes and other obstacles, etc., thus freeing the operator from the need to periodically remount the truck 10 and steer the truck 10 manually to the center of the aisle or other desired position and heading. - According to various embodiments of the present invention, the
controller 103 collects data from various sensors, e.g., 76, 76A, 76B, that provide a picture of the landscape/environment in front of the truck 10, as will be discussed more fully herein. The controller 103 then uses data collected from the sensors to determine whether to implement steer correction maneuvers as described more fully herein. In this regard, steer correction may be implemented in addition to, in lieu of and/or in combination with other avoidance techniques described more fully herein. Thus, by way of illustration and not by way of limitation, steer correction may be utilized in combination with multiple speed zones, a stop detection zone, weight dependent speed zones, etc. - As a further example, the object detection components of the
truck 10 may still implement an alarm and/or cause the truck 10 to stop, reduce or otherwise limit the maximum travel speed of the truck 10, etc. Still further, the truck 10 may issue a first alarm if the truck is attempting an automated steer correction maneuver and a second alarm or signal if the truck 10 is reducing speed and/or stopping in response to an object in a corresponding detection zone, if such features are implemented in combination with steer correction. - In this regard, as used herein, the term "steer bumper zone" will be used to distinguish a zone utilized for steer correction from a "detection zone," which is utilized for maximum speed limiting, stopping the
truck 10, etc., as described more fully above. - In an illustrative example, two steer bumper zone inputs are provided to the
controller 103, to distinguish left and right orientations relative to the truck 10. However, depending upon the sensor technology and the manner in which sensor data is made available, one or more inputs to the controller 103 may be required. By way of illustration, and not by way of limitation, the truck 10 may be equipped with one or more sensing device(s) 76, 76A, 76B that collectively provide a first steer bumper zone and a second steer bumper zone, which are proximate to the truck 10. For example, the first steer bumper zone may be positioned to the left and generally towards the front of the forward traveling direction of the truck 10, to the left side of the truck 10, etc. Similarly, a second steer bumper zone may be positioned to the right and generally towards the forward traveling direction of the truck 10, to the right side of the truck 10, etc. In this regard, the first and second steer bumper zones of the truck 10 may be utilized to implement steer correction, which may include steer angle and steer direction components. In this illustrative configuration, the first and second steer bumper zones may be mutually exclusive, or portions of the first and second steer bumper zones may overlap, thus essentially providing a third steer bumper zone designated by the overlapping coverage of the first and second steer bumper zones. - Moreover, the first and second steer bumper zones may overlap substantially with, partially with or not overlap one or more detection zones utilized for other techniques such as speed control, obstacle triggered braking and stopping of the
truck 10, etc. For example, the range of the steer bumper zones may be similar to or different from the range of one or more detection zones if speed limiting control or other features are also implemented along with steer correction as described in greater detail herein. - Moreover, the sensing inputs provided to the
controller 103 may be derived from a variety of similar type sensors or via a mix of different sensor technologies, e.g., ultrasonic sensors and/or laser scanner sensors. In this regard, various sensors and/or sensor technology types, e.g., laser scanning and ultrasonic, may be used in conjunction or cooperation with each other, e.g., to utilize one or more sensor(s) or sensor technologies for one or more zones (detection and/or steer bumper) and to utilize yet another one or more sensor(s) or sensor technologies for one or more different zones (detection and/or bumper). As another example, two or more sensors or sensor technologies can provide redundancy, e.g., as a fail-safe, backup or confirmation set of data. - According to further embodiments of the present invention, the
controller 103 may be configured to process additional data beyond the two steer bumper zone inputs, examples of which may include object detection angle and distance data, etc. Thus, the techniques described herein are not limited to only two steer bumper zones. - Thus, steer correction according to embodiments of the present invention provides an aid to the operator by maintaining the
truck 10 away from walls, racks, other vehicles, or other obstructions as the truck 10 is operated by the remote wireless control device 70. - According to various embodiments of the present invention, a control system in a
truck 10 provides steer correction control. Referring to Fig. 9, a partial schematic view of the control system is illustrated. In the illustrated system, a first ultrasonic sensor 76' is utilized to generate a first detection zone 78', which is also designated herein as a left detection zone. Correspondingly, a second ultrasonic sensor 76" is utilized to generate a second detection zone 78", which is also designated herein as a right detection zone. Moreover, although only two ultrasonic detection zones are illustrated, it should be understood that any number of detection zones may be implemented. Still further, as described more fully herein, the implemented detection zones may overlap or define discrete, mutually exclusive zones. - The output of each
ultrasonic sensor 76', 76" is coupled to an ultrasonic controller 130, which is utilized, where required by the specific ultrasonic technology, to process the output of the ultrasonic sensors 76', 76". The output of the ultrasonic controller 130 is coupled, for example, as an input to the controller 103. The controller 103 may process the outputs of the ultrasonic sensor controller 130 to implement speed control, obstacle avoidance or other features, examples of which are set out in greater detail herein. - Also illustrated is a
sensor 76''', depicted as a scanning laser sensor, to further illustrate exemplary configurations. In this example, the sensor 76''' is utilized to generate a first steer bumper zone 132A, also designated as a left steer bumper zone, and a second steer bumper zone 132B, also designated as a right steer bumper zone. For example, the scanning laser sensor 76''' may sweep a laser beam in an area in front of the truck 10. In this regard, multiple laser systems may be utilized, or one or more laser beams may be swept, e.g., to raster scan one or more areas forward of the truck 10. In this regard, the laser sensor may independently define and scan the left and right steer bumper zones, or the controller 103 may derive the left and right steer bumper zones based upon the raster scan of the laser(s). Still further, alternate scanning patterns may be utilized, so long as the controller 103 can determine whether a detected obstacle is to the left or to the right of the truck 10. - As a few additional examples, although a laser scanner is illustrated for purposes of discussion herein, other sensing technologies may be utilized, examples of which may include ultrasonic sensors, infrared sensors, etc. For example, ultrasonic sensors located to the sides of the
truck 10 may define the left and right steer bumper zones 132A, 132B. - As illustrated, the output of the laser scanner 76''' provides two
inputs 110 into the controller 103. A first signal designates whether an object is detected in the left steer bumper zone. Correspondingly, a second signal designates whether an object is detected in the right steer bumper zone. Depending upon the sensor and sensor processing technologies utilized, the input(s) to the controller 103 designating an object in the steer bumper zones 132A, 132B may take other suitable forms. - According to various embodiments of the present invention, a steer correction algorithm is implemented, e.g., by the
controller 103. Referring to Fig. 10, a steer correction algorithm comprises determining whether a steer bumper zone warning is detected at 152. A steer bumper signal warning at 152 may comprise, for example, detecting the presence of an object within the first and/or second steer bumper zones 132A, 132B, and determining whether the detected object is to the left or to the right of the truck 10, e.g., whether the detected object is in the first steer bumper zone 132A or the second steer bumper zone 132B. For example, with brief reference back to Fig. 9, a laser scanner sensor 76''' may generate two outputs, a first output signal designating whether an object is detected in the first (left) steer bumper zone 132A, and a second signal designating whether an object is detected in the second (right) steer bumper zone 132B. Alternatively, the controller 103 may receive raw laser scanner data and process/distinguish the first and second steer bumper zones 132A, 132B. - If a steer bumper zone warning designates that an object is detected in the left
steer bumper zone 132A, then a steer correction routine is implemented at 156 that includes computing a steer angle correction to steer the truck 10 to the right according to a first set of parameters. By way of illustration and not by way of limitation, a steer right correction implemented at 156 may include steering the truck 10 to the right at a right direction steer angle. In this regard, the right direction steer angle may be fixed or variable. For example, the controller 103 may command the steer controller 112 to ramp up to some desired steer angle, e.g., 8-10 degrees to the right. By ramping up to a fixed steer angle, sudden changes in the angle of the steer wheel(s) will not occur, resulting in smoother performance. The algorithm accumulates the distance traveled at the steer correction angle, which may be a function of how long the appropriate steer bumper input is engaged. - According to various embodiments of the present invention, the steered wheel angular change may be controlled to achieve, for example, a substantially fixed truck angle correction as a function of accumulated travel distance. The travel distance accumulated while performing a steer correction maneuver may be determined based upon any number of parameters. For example, the distance traveled during the steer correction may comprise the distance traveled by the
truck 10 until the detected object is no longer within the associated left bumper detection zone 132A. The accumulated travel distance may also/alternatively comprise, for example, traveling until a time out is encountered, another object is detected in any one of the bumper or detection zones, a predetermined maximum steer angle is exceeded, etc. - Upon exiting a right steer correction at 156, e.g., by maneuvering the
truck 10 so that no object is detected within the left steerbumper detection zone 132A, a left steer compensation maneuver is implemented at 158. The left steer compensation maneuver at 158 may comprise, for example, implementing a counter steer to adjust the travel direction of thetruck 10 to an appropriate heading. For example, the left steer compensation maneuver may comprise steering thetruck 10 at a selected or otherwise determined angle for a distance that is a percentage of the previously accumulated travel distance. The left steer angle utilized for the left steer compensation maneuver may be fixed or variable, and may be the same as, or different from the steer angle utilized to implement the right steer correction at 156. - By way of illustration and not by way of limitation, the distance utilized for the left steer compensation maneuver at 158 may be approximately one quarter to one half of the accumulated travel distance while implementing the right steer correction at 156. Similarly, the left steer angle to implement the left steer compensation maneuver may be approximately one half of the angle utilized to implement the right steer correction at 156. Thus, assume that the right steer angle is 8 degrees and the accumulated steer correction travel distance is 1 meter. In this example, the left steer compensation may be approximately one half of right steer correction, or -4 degrees, and the left steer compensation will occur for a travel distance of approximately ¼ meters to ½ meters.
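The counter-steer compensation arithmetic for the maneuver at 158 can be sketched as follows. This is a hypothetical illustration only: the function name and signature are invented for this sketch, and the half-angle and quarter-to-half-distance ratios come from the example values in this description, not from any actual vehicle control interface.

```python
# Hypothetical sketch of the counter-steer compensation arithmetic: the
# compensation angle is half the correction angle with opposite sign, and the
# compensation distance is a fraction (nominally 1/4 to 1/2) of the distance
# accumulated while the steer correction was active.

def compensation_plan(correction_angle_deg: float,
                      accumulated_distance_m: float,
                      distance_ratio: float = 0.25) -> tuple[float, float]:
    """Return (compensation_angle_deg, compensation_distance_m).

    correction_angle_deg: steer angle used during the correction, e.g. 8.0
        for a steer-right correction (positive = right, negative = left).
    accumulated_distance_m: distance traveled while the correction was active.
    distance_ratio: fraction of the accumulated distance to counter-steer.
    """
    angle = -0.5 * correction_angle_deg          # opposite direction, half angle
    distance = distance_ratio * accumulated_distance_m
    return angle, distance

# Example from the text: an 8 degree right correction accumulated over 1 meter
# yields roughly a -4 degree (left) compensation over 0.25 m to 0.5 m.
angle, dist = compensation_plan(8.0, 1.0)
print(angle, dist)  # -4.0 0.25
```

The same sketch applies symmetrically to the right steer compensation maneuver at 162, with the signs reversed.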
- The particular distance and/or angle associated with the left steer compensation maneuver at 158 may be selected, for example, so as to dampen the "bounce" of the
truck 10 as the truck 10 moves along its course to steer correct away from detected obstacles. As an illustration, if the truck 10 steer corrects at a fixed degree per distance traveled, the controller 103 may be able to determine how much the corresponding truck angle has changed, and therefore adjust the left steer compensation maneuver at 158 to correct back towards the original or other suitable heading. Thus, the truck 10 will avoid "ping ponging" down an aisle and instead converge to a substantially straight heading down the center of the aisle without tedious manual repositioning required by the truck operator. Moreover, the left steer compensation maneuver at 158 may vary depending upon the particular parameters utilized to implement the right steer correction at 156. - Correspondingly, if a steer bumper zone warning designates that an object is detected in the right
steer bumper zone 132B, then a steer correction routine is implemented at 160 that includes computing a steer angle correction to steer the truck 10 to the left according to a second set of parameters. By way of illustration and not by way of limitation, a steer left correction implemented at 160 may include steering the truck 10 to the left at a left steer angle. In this regard, the left steer correction maneuver at 160 may be implemented in a manner analogous to that described above at 156, except that the correction is to the right at 156 and to the left at 160. - Similarly, upon exiting a left steer correction at 160, e.g., by maneuvering the
truck 10 so that no object is detected within the right bumper detection zone 132B, a right steer compensation maneuver is implemented at 162. The right steer compensation maneuver at 162 may comprise, for example, implementing a counter steer to adjust the travel direction of the truck 10 to an appropriate heading in a manner analogous to that described at 158, except that the steer compensation maneuver at 158 is to the left and the steer compensation maneuver at 162 is to the right. - After implementing the steer compensation maneuver at 158 or 162, the truck may return to a substantially straight heading, e.g., 0 degrees, at 164, and the process loops back to the beginning to wait for the detection of another object in either of the
steer bumper zones 132A, 132B. - The algorithm can further be modified to follow various control logic implementations and/or state machines to facilitate various anticipated circumstances. For example, it is possible that a second object will move into either
steer bumper zone 132A, 132B while the truck 10 is implementing a steer correction maneuver. In that regard, the truck 10 may iteratively attempt to steer correct around the second object. As another illustrative example, if object(s) are simultaneously detected in both the left and right steer bumper zones 132A, 132B, the controller 103 may be programmed to maintain the truck 10 at its current heading (e.g., zero degree steer angle), until either one or more of the steer bumper zones 132A, 132B is cleared, or until conditions require the truck 10 to come to a stop. - According to further embodiments of the present invention, a user and/or service representative may be able to customize the response of the steer angle correction algorithm parameters. For example, a service representative may have access to programming tools to load customized variables, e.g., in the
controller 103, for implementing steer correction. As an alternative, a truck operator may have controls that allow the operator to input customized parameters into the controller, e.g., via potentiometers, encoders, a software user interface, etc. - The output of the algorithm illustrated in
Fig. 10 may comprise, for example, an output that defines a steer correction value that may be coupled from the controller 103 to an appropriate control mechanism of the truck 10. For example, the steer correction value may comprise a +/- steer correction value, e.g., corresponding to steer left or steer right, that is coupled to a vehicle control module, e.g., the steer controller 112 as illustrated in Fig. 2, or other suitable controller. Still further, additional parameters that may be editable, e.g., to adjust operational feel, may comprise the steer correction angle, a steer correction angle ramp rate, a bumper detection zone size/range for each steer bumper zone, truck speed while steer correcting, etc. - Referring to
Fig. 11, assume in the illustrative example that the truck 10 is traveling in response to receiving a remote wireless travel request and that, before the truck 10 can travel a predetermined jog distance, the truck 10 travels into a position where a rack leg 172 and a corresponding pallet 174 are in the path of the left steer bumper zone 132A. Keeping with the exemplary algorithm of Fig. 10, the truck 10, e.g., via the controller 103, may implement an obstacle avoidance maneuver by entering a steer correction algorithm to steer the truck to the right. For example, the controller 103 may compute or otherwise look up or retrieve a steer correction angle that is communicated to a steer controller 112 to turn the drive wheel(s) of the truck 10. - The
truck 10 maintains steer correction until an event occurs, such as the disengagement of the object, e.g., when the scanning laser or other implemented sensor technology no longer detects an object in the left steer bumper zone 132. Assume that thetruck 10 accumulated a travel distance of one half of a meter during the steer correction maneuver, which was fixed at 8 degrees. Upon detecting that the left steer bumper zone signal has disengaged, a counter steer compensation is implemented to compensate for the change in heading caused by the steer correction. By way of example the steer compensation may steer thetruck 10 to the left for approximately one quarter meter accumulated travel distance, at 4 degrees. For very narrow aisles, the Left / Right steer bumper zone sensors may provide very frequent inputs /little time between senses compared to relatively wider aisles. - The various steer angle corrections and corresponding counter steer compensations may be determined empirically, or the angles, ramp rates, accumulated distances, etc., may be computed, modeled or otherwise derived.
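The overall cycle in this illustration, steer correcting while a bumper zone is blocked and then counter steering at roughly half the angle for a fraction of the accumulated distance, can be outlined as a simplified sketch. The per-cycle zone flags, the fixed 8 degree angle, and the one-half distance ratio mirror the example above; the function and its interface are hypothetical, not any real controller API.

```python
# Simplified sketch of the steer correction cycle described above.
# Each sample is (left_blocked, right_blocked, distance_step_m); steer angles
# are positive to the right and negative to the left.

CORRECT_DEG = 8.0   # fixed correction angle from the example
COMP_RATIO = 0.5    # compensation distance as a fraction of accumulated travel

def steer_cycle(samples):
    """Return a list of (steer_angle_deg, distance_m) commands."""
    commands = []
    accumulated = 0.0
    active = 0.0  # current correction angle, 0.0 when not correcting
    for left, right, step in samples:
        if left and right:
            active = 0.0  # hold the current heading when both zones are blocked
        elif left:
            active = +CORRECT_DEG   # object on the left: steer right
        elif right:
            active = -CORRECT_DEG   # object on the right: steer left
        elif active:
            # Zone cleared: counter-steer at half the angle (opposite sign)
            # for a fraction of the accumulated distance, then go straight.
            commands.append((-0.5 * active, COMP_RATIO * accumulated))
            commands.append((0.0, 0.0))
            active, accumulated = 0.0, 0.0
            continue
        if active:
            accumulated += step
            commands.append((active, step))
    return commands

# Mirroring the example: left zone blocked for 0.5 m (8 degrees right), then
# cleared, giving a 4 degree left counter-steer over 0.25 m and a return to a
# straight heading.
plan = steer_cycle([(True, False, 0.25), (True, False, 0.25), (False, False, 0.0)])
```

The both-zones-blocked branch reflects the behavior described above of holding the current heading; a real controller would additionally handle time-outs and stop conditions.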
- In the illustrative arrangement, the system will try to maintain the
truck 10 centered in the aisle as the truck 10 advances in response to receiving a corresponding wirelessly transmitted travel request by the transmitter 70. Moreover, bounce, e.g., as measured by the distance from the centerline of a warehouse aisle, is damped. Still further, there may be certain conditions where the truck 10 may still require some operator intervention in order to maneuver around certain objects in the line of travel. - Referring to
Fig. 12, a graph illustrates a speed measurement of the truck 10 during an obstacle avoidance maneuver. The graph in Fig. 13 illustrates a steer correction at the predetermined steer angle to illustrate a total correction applied by the algorithm. And a graph in Fig. 14 illustrates motion of the truck 10 as a function of when steer correction is active and when an object is sensed in the left and/or right bumper detection zones. - According to further embodiments of the present invention, the steer correction algorithm may be configured to hug a wall/rack, versus staying away from a wall and/or rack. For example, adding a small drift to the
truck 10 will allow the truck 10 to maintain a substantially constant distance from the fixed wall/rack, with a small amount of control-related ripple in that distance. - Although the left and right
steer bumper zones 132A, 132B are illustrated at least partially in front of the forward traveling direction of the truck 10, other arrangements may alternatively and/or additionally be implemented. For example, the left and right steer bumper zones could alternatively be positioned towards the sides of the truck 10, e.g., as illustrated by the left and right side steer bumper zones 132C, 132D. Moreover, the truck 10 may utilize a first pair of left and right steer bumper zones towards the forward traveling direction of the truck 10, e.g., the left and right steer bumper zones 132A, 132B, and a second pair of left and right steer bumper zones towards the sides of the truck 10, e.g., the side steer bumper zones 132C, 132D. In this regard, the particular algorithm utilized to implement steer correction may be the same or different for each pair of steer bumper zones. - As an example, side steer
bumper zones 132C, 132D may be used to maintain the truck 10 generally adjacent to a rack, wall or other structure. In this regard, a multi-zone steer bumper may be used, e.g., to establish a hysteresis, e.g., such that the controller 103 maintains a heading by keeping the wall, rack or other structure between a first, outer steer bumper limit and a second, inner steer bumper limit. As yet another illustrative alternative, assume that the truck is to stay just to the right of a rack or other structure, which is to the left of the truck 10. The truck 10 can automatically steer to the left by a small amount so as to steer towards the structure. In this regard, when the left steer bumper zone 132C is breached by the structure, the steer correction described more fully herein will steer away from the structure. However, because the steering is configured to steer just slightly to the left, the truck 10 will eventually travel towards the structure until the steer correction again repositions the truck 10. As yet another illustrative example, the steer compensation, e.g., 158 in Fig. 10, could be made to deliberately overcompensate, thus maintaining the truck 10 adjacent to the structure. - As yet another illustrative example, the steer bumper zones may be comprised of multiple steer bumper sub-zones, where each sub-zone may be associated with different parameters for steer correction, e.g., to allow subtle steer correction for objects sensed further away from the
truck 10 than objects sensed more closely to the truck 10. By way of example, the steer correction may be a lesser amount, e.g., 2 degrees, when an object is detected in the furthest region or sub-zone from the vehicle; an intermediate amount, e.g., 4 degrees, when an object is detected in a middle region; and a greater amount, e.g., 8 degrees, when an object is detected in an inner region of a steer bumper zone. As further alternatives, distance measurement to the detected object may be utilized to dynamically adjust the steer algorithm to make appropriate steer correction maneuvers. - As yet another illustrative example, it may be desirable to apply a first, greater amount of steer correction, e.g., 10 degrees, if certain predefined conditions are met, and to apply a second, lesser amount of steer correction, e.g., 7 degrees, under all other circumstances. For example, assume that an operator is driving the
truck 10 and comes to the end of an aisle or row. The operator then maneuvers the truck 10 by making a 180 degree turn and enters an adjacent aisle. Perhaps the operator over- or under-steers upon entering the adjacent aisle, such that the heading of the truck 10 cannot be straightened down the aisle with the second, lesser amount of steer correction. In this situation, it may be desirable to apply a greater amount of steer correction than is normally used to allow the truck 10 to achieve a straight heading down the aisle. - The conditions that must occur prior to applying the greater amount of steer correction may vary, but in the above example may comprise the following: a first condition may be that a preselected driving speed, such as, for example, 3 MPH, must be reached or exceeded. A second condition may be that a minimum steering angle, such as, for example, 45 degrees, must be met or exceeded. A third condition may be that an operator must be present on the
truck 10 during the occurrences of the first and second conditions. In the above example, if each of these three conditions is met, the controller 103 performs a single instance of the greater amount of steer correction, e.g., 10 degrees, if an object is detected in one of the steer bumper zones after the occurrence of the three conditions. Subsequent steer corrections applied would be the lesser amount, e.g., 7 degrees, until all three conditions are once again met, in which case another single instance of the greater amount of steer correction will be applied by the controller 103. - Referring to
Figs. 15A-15C, a scanned environment 200, also referred to as a landscape, is illustrated. The environment 200 may be derived by the controller 103 based on sensor data obtained by the controller 103 from an obstacle sensor 76, such as a laser scanning device. In this embodiment, a single obstacle sensor 76 is used to provide the sensor data, although additional sensors 76 could be used as desired. In an exemplary embodiment, the obstacle sensor 76 may be located at a distance off the floor upon which the truck 10 is travelling, wherein the obstacle sensor 76 scans in a scanning plane that is oriented at an angle from the sensor 76 downward toward the floor. - The
exemplary environment 200 illustrated in Figs. 15A-15C extends in an axial direction, i.e., parallel to a central axis CA of the truck 10, from a front edge 200A of the environment 200 to a rear edge 200B of the environment 200. The front edge 200A is displaced a predefined distance DF from the front of the truck 10. The distance DF may be any suitable distance and in a preferred embodiment is from about 1 meter to about 5 meters. The rear edge 200B is located at a predetermined location L1 associated with the truck 10. As a few nonlimiting examples, the location L1 may be defined at a load wheel of the truck 10, at a rear end of an estimated position of a typical load carried by the truck 10, or at the tips of the forks 16, as illustrated in Figs. 15A-15C. - The
exemplary environment 200 in the embodiment shown in Figs. 15A-15C extends in a lateral direction, i.e., perpendicular to the central axis CA of the truck 10, from a left edge 200C of the environment 200 to a right edge 200D of the environment 200. The left edge 200C is displaced laterally a predefined distance DL to the left of the central axis CA of the truck 10. The right edge 200D is displaced laterally a predefined distance DR to the right of the central axis CA of the truck 10. The distances DL and DR may comprise any suitable distances and in a preferred embodiment are each from about 2 meters to about 5 meters. It is noted that the distances DL and DR could be measured from the sides of the truck 10 or any other suitable location, rather than from the central axis CA. It is also noted that the edges 200A-200D of the environment 200 could comprise any shape and need not define straight edges. For example, the edges 200A-200D could be curved or could comprise uneven or serrated portions. - The
exemplary environment 200 illustrated in Figs. 15A-15C comprises a scanned zone 202 and a history zone 204. The scanned zone 202 is actively scanned by the obstacle sensor 76 during operation of the truck 10. The history zone 204 is not actively scanned by the obstacle sensor 76, but objects that are detected in the scanned zone 202 are capable of being tracked as they pass through the history zone 204 during movement of the truck 10, as will be described herein. The history zone 204 comprises a first portion 2040A comprising unscanned areas laterally outside of the scanned zone 202 and also comprises a second portion 2040B comprising an area that is located rearwardly from the scanned zone 202, as shown in Figs. 15A-15C. - The scanned
zone 202 extends from the front edge 200A of the environment 200 to a predetermined axial location L2, which location L2 in the embodiment shown is defined close to the front end of the truck 10 but could be defined at other areas. The scanned zone 202 extends in the lateral direction between predetermined lateral locations L3 and L4, which locations L3 and L4 are laterally displaced from respective sides of the truck 10 and are located between the sides of the truck 10 and the left and right edges 200C, 200D of the environment 200, as shown in Figs. 15A-15C. - The
first portion 2040A of the history zone 204 extends laterally outwardly from both sides of the scanned zone 202, i.e., from the respective locations L3 and L4, to the left and right edges 200C, 200D of the environment 200. The second portion 2040B of the history zone 204 extends rearwardly from the scanned zone 202, i.e., from the location L2, to the rear edge 200B of the environment 200. The second portion 2040B of the history zone 204 extends laterally between the left and right edges 200C, 200D of the environment 200. - The scanned
zone 202 and the history zone 204 each comprise corresponding left and right sections. The left section 202A of the scanned zone 202 in the embodiment shown comprises four scan zones 202A1, 202A2, 202A3, 202A4 (collectively referred to hereinafter as scan zones 202A1-4) and the right section 202B of the scanned zone 202 in the embodiment shown comprises four scan zones 202B1, 202B2, 202B3, 202B4 (collectively referred to hereinafter as scan zones 202B1-4). The exemplary scan zones 202A1-4 - 202B1-4 illustrated in Figs. 15A-15C are substantially all the same size and are generally rectangular in shape, with the exception of the scan zones nearest the truck 10, which have angled bottom corner portions. However, it is noted that the scan zones 202A1-4 - 202B1-4 could have any suitable size and shape. Further, while the scan zones nearest the truck 10 in the embodiment shown extend slightly rearwardly from the front of the truck 10, i.e., to the location L2, these scan zones could extend to other locations without departing from the scope of the invention. Also, while each section 202A, 202B of the scanned zone 202 in the embodiment shown comprises four scan zones 202A1-4 - 202B1-4, additional or fewer scan zones may be provided in each section 202A, 202B. - The
obstacle sensor 76 scans the scan zones 202A1-4 - 202B1-4 and sends sensor data to the controller 103 regarding objects detected in the scan zones 202A1-4 - 202B1-4. Included in the sensor data sent by the obstacle sensor 76 is data for each scan zone 202A1-4 - 202B1-4 that is representative of whether an object is detected in the corresponding scan zone 202A1-4 - 202B1-4. Further, if an object is detected in a scan zone 202A1-4 - 202B1-4, the sensor data includes data representative of the distance that the detected object is from a reference coordinate RC associated with the vehicle. The reference coordinate RC may be a predetermined location on the truck 10, such as a bumper, wheel, fork, the obstacle sensor 76, etc., or the reference coordinate RC may be an axis or plane associated with the truck 10. In the embodiment shown, the reference coordinate RC is the central axis CA of the truck 10. - As shown in
Figs. 15A-15C, each scan zone 202A1-4 - 202B1-4 comprises a plurality of buckets 220. The buckets 220 are used for tracking objects in a plane generally parallel to the floor and that are detected in the scan zones 202A1-4 - 202B1-4, as will be discussed herein. In a preferred embodiment, each scan zone 202A1-4 - 202B1-4 comprises between four and eleven buckets 220 (six buckets 220 are included in each scan zone 202A1-4 - 202B1-4 in the embodiment shown), although additional or fewer buckets 220 could be included in each scan zone 202A1-4 - 202B1-4. - The
history zone 204 also comprises a plurality of buckets 222. The buckets 222 in the first portion 2040A of the history zone 204 may be continuations of the buckets 220 from the scan zones 202A1-4 - 202B1-4. The buckets 222 are used for tracking objects that enter the history zone 204 from the scan zones 202A1-4 - 202B1-4, as will be discussed herein. - First and
second objects 272, 274 are illustrated in the environment 200 in Figs. 15A-15C. These objects 272, 274 are detected by the obstacle sensor 76 during operation, and the obstacle sensor 76 sends sensor data to the controller 103 about the objects 272, 274. The controller 103 uses the sensor data to assign the objects 272, 274 to the buckets 220 defined within the scanned zone 202 based on the sensor data from the obstacle sensor 76. Once the objects 272, 274 leave the scanned zone 202 and enter the history zone 204, the objects 272, 274 are assigned to the buckets 222 in the history zone 204. - The
buckets 220, 222 are used to track the objects 272, 274 within the environment 200 as the truck 10 moves. That is, as the truck 10 moves, the controller 103 tracks the objects 272, 274 by using subsequent sensor data from the obstacle sensor 76 to re-assign the objects 272, 274 to adjacent buckets 220, and/or by using dead reckoning to re-assign the objects 272, 274 to adjacent buckets 220, 222. As the objects 272, 274 are re-assigned to adjacent buckets 220, 222, the controller 103 is able to determine an updated axial distance that the objects 272, 274 are from the truck 10. The controller 103 is also able to determine an updated lateral distance that the objects 272, 274 are from the truck 10 using subsequent sensor data and/or dead reckoning. In a preferred embodiment, the objects 272, 274 are tracked by the controller 103 until they are no longer determined to be in the environment 200. - It is noted that, if the
obstacle sensor 76 scans in a scanning plane that is oriented at an angle from the sensor 76 downward toward the floor, some objects that are detected in one or more of the scan zones 202A1-4 - 202B1-4 may not be detected in an adjacent scan zone, even though that object is located within the axial dimension of the adjacent scan zone. For example, shorter objects may be detected by the obstacle sensor 76 in scan zone 202A1, but may not be detected by the obstacle sensor 76 upon entering the axial dimensions of the adjacent zone 202A2. While the sensor data provided by the obstacle sensor 76 may not indicate that the object is in the zone 202A2, i.e., since the object is located under the scanning plane of the sensor 76, the object is still tracked in the environment 200 via dead reckoning. - Referring to
Figs. 16A-16C, exemplary action zones 280 defined within the environment 200 are illustrated. The action zones 280 may be used for implementing various steer maneuvers as will be described herein. The action zones 280 in the embodiment shown are divided into left and right action zones 282, 284. The left action zone 282 is located on the left of the central axis CA of the truck 10, and the right action zone 284 is located on the right of the central axis CA of the truck 10. - The
exemplary action zones 280 illustrated in Figs. 16A-16C comprise left and right stop zones 300, 302, left and right no steer zones 304, 306, left and right steer zones 308, 310, and left and right hug zones 312, 314. - The left and
right stop zones 300, 302 are located closest to the truck 10. If an object is detected in either of the stop zones 300, 302, the controller 103 will initiate a brake operation to cause the truck 10 to stop. - Laterally outwardly from the
stop zones 300, 302 are left and right no steer zones 304, 306. The no steer zones 304, 306 comprise rear portions 304B, 306B and forward portions 304A, 306A. The forward portions 304A, 306A of the no steer zones 304, 306 are located in the scanned zone 202, whereas the rear portions 304B, 306B of the no steer zones 304, 306 are located in the second portion 2040B of the history zone 204. If an object is detected in one of the no steer zones 304, 306, the controller 103 does not permit the vehicle to turn toward that no steer zone 304, 306 until the object moves out of the corresponding no steer zone 304, 306. - Laterally outwardly from the no
steer zones 304, 306 are left and right steer zones 308, 310. The steer zones 308, 310 comprise rear portions 308B, 310B and forward portions 308A, 310A. The forward portions 308A, 310A of the steer zones 308, 310 are located in the scanned zone 202, whereas the rear portions 308B, 310B of the steer zones 308, 310 are located in the second portion 2040B of the history zone 204. If an object is detected in one of the rear portions 308B, 310B of the steer zones 308, 310, the controller 103 permits the vehicle to turn toward the steer zone 308, 310 in which the object is detected. Once the object enters the respective no steer zone 304, 306, the controller 103 does not permit additional turning of the truck 10 toward the respective no steer zone 304, 306, but the controller 103 may implement another steer maneuver as will be described herein. It is noted that, in the preferred embodiment, the controller 103 does not implement a steer maneuver to turn the truck 10 toward a steer zone 308, 310 when an object is detected in the forward portion 308A, 310A thereof, although the controller 103 could be programmed to implement such a steer maneuver. - Laterally outwardly from the
steer zones 308, 310 are left and right hug zones 312, 314. The hug zones 312, 314 may be used by the controller 103 to steer the truck 10 relative to selected objects such that the truck can be substantially maintained at a desired distance from the selected object, as will be described herein with reference to Figs. 17A-17C. Laterally inner boundaries of the hug zones 312, 314 are defined by left and right hug lines 312A, 314A, as illustrated in Figs. 16A-16C and 17A-17C. - Select ones of the
action zones 280, or portions thereof, may be used by the controller 103 for implementing additional steer maneuvers. For example, the no steer zones 304, 306 and the steer zones 308, 310, or portions thereof, may serve as steer away zones. If an object is detected in a steer away zone, the truck 10 may turn away from the object, as long as another object is not located in the stop zone 300, 302, the no steer zone 304, 306, or the forward portion 308A, 310A of the steer zone 308, 310 on the opposite side of the truck 10. It is noted that the exemplary steer away zones could comprise other action zones 280 or portions thereof. - The
controller 103 may implement various steer maneuvers upon the happening of certain predefined conditions. A first exemplary event occurs when an object is detected within the scanned zone 202 by the obstacle sensor 76 and is determined to be within the left or right hug line 312A, 314A. If an object is detected within the scanned zone 202 and within the left or right hug line 312A, 314A, the controller 103 will attempt to steer the truck 10 away from the detected object, as long as such a steer maneuver is permitted, i.e., as long as a second object is not detected within the stop zone 300, 302, the no steer zone 304, 306, or the forward portion 308A, 310A of the steer zone 308, 310 located on the opposite side of the truck 10. - A second exemplary event occurs when an object is detected or otherwise determined to be located, e.g., via dead reckoning, within a no
steer zone 304, 306 between the front edge 200A of the environment 200 and a predetermined axial location L5 associated with the truck 10, see Figs. 16A-16C. The predetermined location L5 associated with the truck 10 may be defined, for example, at the axial location where the forks 16 extend from the truck 10. The predetermined axial location L5 may alternatively be defined with respect to a predetermined distance from the front edge 200A of the environment 200. Upon the happening of the event according to this example, the controller 103 will attempt to steer away from the detected object, as long as such a steer maneuver is permitted, i.e., as long as a second object is not detected within the stop zone 300, 302, the no steer zone 304, 306, or the forward portion 308A, 310A of the steer zone 308, 310 located on the opposite side of the truck 10. - A third exemplary event occurs when a first object is detected by the
obstacle sensor 76 within the left hug line 312A and a second object is detected by the obstacle sensor 76 within the right hug line 314A. In this case, the controller 103 will implement a steer maneuver to maintain the truck 10 on a straight heading until one of the following occurs: one of the objects moves outside of the respective hug line 312A, 314A; one of the objects enters the rear portion 308B, 310B of the respective steer zone 308, 310; one of the objects is no longer determined to be in the environment 200; or one of the objects enters a stop zone 300, 302. If one of these occurs, the controller 103 may implement another steer maneuver or initiate a brake operation depending on the location of the object(s). - A fourth exemplary event occurs when a "hug" maneuver is implemented by the
controller 103. Additional details in connection with the hug maneuver will be described below with reference to Figs. 17A-17C. - Referring to
Figs. 16A-16C in succession, exemplary steer maneuvers implemented by the controller 103 during movement of the truck 10 will be described. The truck 10 may be traveling in response to receiving a remote wireless travel request, i.e., from a wireless transmitter, as discussed in detail herein. Alternatively, the truck 10 may be coasting to a stop or may be driven manually by a rider or a walker who is walking alongside the truck 10. - In
Fig. 16A, the obstacle sensor 76 detects first and second objects 272, 274 in the scanned zone 202. The obstacle sensor 76 sends sensor data to the controller 103 that includes information about the first and second objects 272, 274. The sensor data comprises data that is representative of which of the scan zones 202A1-A4, 202B1-B4 (see Figs. 15A-15C) the objects 272, 274 are located in. The sensor data also includes data representative of the lateral distance that the objects 272, 274 are from the reference coordinate RC, i.e., the central axis CA of the truck 10 in the embodiment shown. - In
Fig. 16A, the laterally innermost portion of the first object 272 is determined to be in the scanned zone 202 and located outside of the left hug line 312A in the left hug zone 312, and the laterally innermost portion of the second object 274 is determined to be in the scanned zone 202 and located inside of the right hug line 314A in the forward portion 310A of the right steer zone 310. It is noted that, while a portion of the first object 272 is located outside of the left hug zone 312 and a portion of the second object 274 is located in the right hug zone 314, the controller 103 may be primarily concerned with the portion of any detected object that is closest laterally to the truck 10. Based on the object location information derived from the sensor data, it is determined that the laterally innermost portion of the second object 274 is closer than the laterally innermost portion of the first object 272 to the central axis CA of the truck 10. Based on the locations of the first and second objects 272, 274 illustrated in Fig. 16A, the controller 103 will automatically implement a steer maneuver to steer the truck 10 toward the first object 272, so as to steer the truck 10 away from the second object 274. - The
truck 10 is continually steered toward the first object 272 and away from the second object 274 until one of two conditions occurs. The first condition is that the first object 272 (or another object determined to be in the environment 200) enters a predefined portion of the left action zone 282. The predefined portion of the left action zone 282 comprises a portion of the left action zone 282 wherein further steering of the truck 10 toward the first object 272 is determined to not be permitted. The predefined portion of the left action zone 282 in the exemplary embodiment shown is either the forward portion 308A of the left steer zone 308 or the rear portion 304B of the left no steer zone 304, but could be other left action zones 282 or portions thereof. The second condition is that the second object 274 (and any other objects determined to be in the right action zone 284) completely exits a predefined portion of the right action zone 284. The predefined portion of the right action zone 284 comprises a portion of the right action zone 284 wherein further steering of the truck 10 away from the second object 274 is determined to not be required. The predefined portion of the right action zone 284 in the embodiment shown is the forward portion 310A of the right steer zone 310 if the second object 274 is in the scanned zone 202, i.e., such that the second object 274 is completely outside of the right hug line 314A, or the rear portion 306B of the right no steer zone 306 forward of the location L5 if the second object 274 is in the second portion 2040B of the history zone 204, but could be other right action zones 284 or portions thereof. - In
Fig. 16B, the first condition is illustrated as being met, i.e., the first object 272 enters the forward portion 308A of the left steer zone 308. While the first and second objects 272, 274 are located in the scanned zone 202 such that they are being actively detected by the obstacle sensor 76, and while the laterally innermost portion of the first object 272 is in the forward portion 308A of the left steer zone 308 and the laterally innermost portion of the second object 274 is in the forward portion 310A of the right steer zone 310, the controller 103 will implement a steer maneuver such that the truck 10 will maintain a straight heading. As noted above, the truck 10 will maintain a straight heading until one of the following occurs: the laterally innermost portion of one of the objects 272, 274 moves outside of the respective hug line 312A, 314A; the laterally innermost portion of one of the objects 272, 274 enters the rear portion 308B, 310B of the respective steer zone 308, 310; or one of the objects 272, 274 is no longer determined to be in the environment 200. - In
Fig. 16C, the laterally innermost portion of the second object 274 is illustrated as having moved into the rear portion 310B of the right steer zone 310. In this scenario, the second object 274 has gone from being scanned by the obstacle sensor 76 in the scanned zone 202 to not being scanned in the second portion 2040B of the history zone 204, and, thus, is being tracked by dead reckoning. Since the laterally innermost portion of the first object 272 is in the forward portion 308A of the left steer zone 308 and the second object 274 is in the rear portion 310B of the right steer zone 310, the controller 103 automatically implements a steer maneuver to steer the truck 10 away from the first object 272 so as to steer the truck 10 toward the second object 274. The truck 10 will continue to steer away from the first object 272 and toward the second object 274 until one of the following exemplary conditions occurs: the laterally innermost portion of the first object 272 enters the rear portion 308B of the left steer zone 308; the first object 272 is located completely outside of the left hug line 312A; or an object is determined to be in the right no steer zone 306 or the forward portion 310A of the right steer zone 310. If one of these events occurs, the controller 103 may implement a subsequent steer maneuver as described herein. - If at any time during operation the first and/or
second object 272, 274 moves into one of the stop zones 300, 302, the controller 103 will initiate a brake operation to cause the truck 10 to stop, as discussed above. -
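The arbitration walked through in Figs. 16A-16C can be condensed into a short sketch. This is an illustrative reading only, not the claimed implementation: the zone labels, the per-side classification interface, and all function names are assumptions introduced here for illustration.

```python
def steer_decision(left_zone, right_zone):
    """Pick a maneuver from the zone classification of the laterally
    innermost detected point on each side of the truck.

    Per-side classifications (invented labels): None (side clear), 'stop',
    'no_steer', 'steer_fwd' (forward portion of a steer zone), 'steer_rear',
    or 'hug' (point lies in the hug zone, outside the hug line).
    """
    if "stop" in (left_zone, right_zone):
        return "brake"                  # stop zones always win (brake operation)
    # Portions in which further steering of the truck toward that side is barred.
    barred = {"no_steer", "steer_fwd"}
    left_close = left_zone in barred
    right_close = right_zone in barred
    if left_close and right_close:
        return "straight"               # Fig. 16B: hold a straight heading
    if right_close:
        return "steer_left"             # Fig. 16A: steer away from the right object
    if left_close:
        return "steer_right"
    return "straight"

# Fig. 16A situation: first object outside the left hug line, second object
# in the forward portion of the right steer zone.
print(steer_decision("hug", "steer_fwd"))   # → steer_left
```

The arbitration is deliberately stateless here; the description above additionally conditions each transition on whether objects are actively scanned or dead-reckoned, which this sketch omits.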
Figs. 17A-17C are successive views of a truck 10 performing steer maneuvers according to another embodiment of the invention. Figs. 17A-17C will be discussed in terms of the action zones 280 discussed above with reference to Figs. 16A-16C. The truck 10 may be traveling in response to receiving a remote wireless travel request, i.e., from a wireless transmitter, as discussed in detail herein. Alternatively, the truck 10 may be coasting to a stop or may be driven manually by a rider or a walker who is walking alongside the truck 10. - In
Fig. 17A, the obstacle sensor 76 detects a selected object 276 in the scanned zone 202. The obstacle sensor 76 sends sensor data to the controller 103 that includes information about the selected object 276. The sensor data comprises data that is representative of which of the scan zones 202A1-A4, 202B1-B4 (see Figs. 15A-15C) the selected object 276 is located in. The sensor data also includes data representative of the lateral distance that the selected object 276 is from the reference coordinate RC, i.e., the central axis CA of the truck 10 in the embodiment shown. The selected object 276 may be a rack or a stacked product face having a generally axially extending laterally inner edge portion 276A, although it is understood that the selected object 276 could be other objects. - In the
environment 200 illustrated in Fig. 17A, based on the sensor data from the obstacle sensor 76, it is determined that the edge portion 276A of the selected object 276 is in the right steer zone 310. Based on the detected location of the selected object 276 illustrated in Fig. 17A, the controller 103 automatically implements a steer maneuver to steer the truck 10 away from the selected object 276 with the intent of steering the truck 10 such that the truck 10 is substantially maintained at a desired distance from the edge portion 276A of the selected object 276, i.e., such that the truck 10 "hugs" the edge portion 276A of the selected object 276. In one embodiment, the intent of the steer maneuver may be such that the selected object 276 is at least partially maintained in the right hug zone 314. Additionally or alternatively, the intent of the steer maneuver may be such that a portion of the selected object 276, e.g., the edge portion 276A thereof, is substantially maintained on the right hug line 314A that is associated with the right hug zone 314. - In the exemplary embodiment shown, the intent of the steer maneuver is to continually steer the
truck 10 away from the selected object 276 until the selected object 276 is at least partially maintained in the right hug zone 314 and until the edge portion 276A of the selected object 276 is substantially maintained on the right hug line 314A. - Referring to
Fig. 17B, an exemplary condition is illustrated wherein the truck 10 "overshot" the right hug line 314A, such that the edge portion 276A of the selected object 276 went past the right hug line 314A. In this case, the controller 103 automatically implements a steer maneuver to steer the truck 10 toward the selected object 276 until the edge portion 276A of the selected object 276 is maintained on the right hug line 314A. It is noted that, since no portion of the selected object 276 is located in the right no steer zone 306 or in the forward portion 310A of the right steer zone 310 in Fig. 17B, the truck 10 is permitted to turn toward the selected object 276. - In
Fig. 17C, after the steer maneuver is implemented that steers the truck 10 toward the selected object 276 such that the edge portion 276A of the selected object 276 is positioned on the right hug line 314A, the controller 103 implements a steer maneuver to achieve a straight heading of the truck 10 in the axial direction, i.e., parallel to the central axis CA, so as to maintain the edge portion 276A of the selected object 276 on the right hug line 314A. The truck 10 continues to travel straight until the selected object 276 is no longer determined to be in the environment 200, or until the edge portion 276A of the selected object 276 is no longer determined to be located on the right hug line 314A, at which point the controller 103 could implement a steer maneuver such that the right hug line 314A coincides with the edge portion 276A of the selected object 276. - According to one embodiment, if multiple objects are located within the
environment 200, the selected object 276 may be an object that is determined to be located closest to the left hug line 312A or the right hug line 314A. Alternatively, the selected object 276 may be the first object that is detected in the scanned zone 202 by the obstacle sensor 76, or may be the first object that is determined to be in at least one of the steer zones 308, 310. As yet another alternative, the selected object 276 may be an object that is determined to be the closest object to the truck 10 within the environment 200, as measured in the lateral direction. - Further, the
controller 103 may be programmable to only perform a steer maneuver to "hug" a selected object if the object is detected in a select one of the left and right hug zones 312, 314. For example, it may be desired that the truck 10 only hug objects located on the right side of the truck 10. Under this arrangement, the truck 10 may travel in a controlled fashion down the right side of an aisle, while another truck travels in the opposite direction on the other side of the aisle. As another example, if an operator will only be picking items located on the right side of an aisle, the truck 10 may only hug a rack or stacked product face on the right side of the truck 10, so as to minimize the distance that the operator has to walk from the rack to the truck 10. - Further still, the hug maneuver described herein may be implemented by the
controller 103 in one embodiment only upon authorization to do so. For example, an operator may depress a button, which button may be located on the truck 10 or on a remote control device as described herein. Upon receiving authorization to implement a hug maneuver, the controller 103 enters into an "acquire hug" mode, wherein the controller 103 looks for objects in the scanned zone 202 to hug. Additionally, the operator may designate hug preferences, such as whether to hug an object on the left or right side of the truck 10, the first object detected in the scanned zone 202, the object that is determined to be located closest to the central axis CA of the truck 10, etc. Additionally, once an object that is being hugged is no longer located within the environment 200, the truck may continue forward on a straight heading until a new object to hug is detected by the obstacle sensor 76. If a new object is detected by the obstacle sensor 76 within the environment 200, the controller 103 may be programmed to automatically hug the new object, or the controller 103 may need to be authorized to do so by the operator. - Moreover, the hug maneuvers used in connection with the
hug zones 312, 314 described herein with reference to Figs. 17A-17C may be used in combination with the other action zones 280 described above with reference to Figs. 16A-16C. - Having thus described the invention of the present application in detail and by reference to embodiments thereof, it will be apparent that modifications and variations are possible without departing from the scope of the invention defined in the appended claims.
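The hug behavior of Figs. 17A-17C amounts to holding the tracked edge portion on the hug line, steering toward the object after an overshoot and away from it otherwise. The following minimal sketch illustrates that idea with a proportional controller; the hug-line distance, the gain, and the one-dimensional plant model are all assumptions made for this illustration and are not taken from the specification.

```python
# Assumed constants for the illustration (not from the specification).
HUG_LINE = 1.2   # desired lateral distance of the tracked edge from the axis (m)
GAIN = 0.8       # proportional steering gain (1/m)

def hug_step(edge_lateral):
    """Steer command: positive steers toward the object (edge is beyond the
    hug line), negative steers away (overshoot, as in Fig. 17B)."""
    return GAIN * (edge_lateral - HUG_LINE)

# Simulate the edge settling onto the hug line over successive control steps.
lateral = 2.0                            # edge starts 2.0 m out, in the hug zone
for _ in range(20):
    lateral -= 0.5 * hug_step(lateral)   # crude plant: steering closes the gap
print(round(lateral, 3))                 # settles at the hug-line distance, 1.2
```

A real controller would act through the steer axle kinematics and would be gated by the stop and no steer zones as described above; this sketch only shows the convergence onto the hug line.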
Claims (15)
- A materials handling vehicle (10) having detection zone control comprising: a power unit (14) for driving the vehicle (10); at least one contactless obstacle sensor (76) on the vehicle (10) that is operable to scan a scanned zone (202) within an environment (200) proximate to the vehicle (10), characterized in that the environment (200) comprises the scanned zone (202) and a history zone (204), the history zone (204) being not actively scanned by the at least one obstacle sensor (76), but objects that are detected by the at least one obstacle sensor (76) in the scanned zone (202) are capable of being tracked as they pass through the history zone (204) during movement of the vehicle (10); and a controller (103) configured to receive information obtained from the at least one obstacle sensor (76) and to define at least two zones within the environment (200) based on the received information, the at least two zones comprising at least one stop zone (300, 302) and at least one steer away zone (304, 306), wherein the controller (103): performs a stop action to bring the vehicle (10) to a stop if an object is detected in the at least one stop zone (300, 302); performs a steer maneuver to steer the vehicle (10) away from an object detected in the at least one steer away zone (304, 306); and tracks objects determined to be in the history zone (204) until such objects are no longer in the environment (200).
- A multiple detection zone control system for a materials handling vehicle (10) comprising: at least one contactless obstacle sensor (76) on the vehicle (10) that is operable to scan a scanned zone (202) within an environment (200) proximate to the vehicle (10), characterized in that the environment (200) comprises the scanned zone (202) and a history zone (204), the history zone (204) being not actively scanned by the at least one obstacle sensor (76), but objects that are detected by the at least one obstacle sensor (76) in the scanned zone (202) are capable of being tracked as they pass through the history zone (204) during movement of the vehicle (10); and a controller (103) configured to receive information obtained from the at least one obstacle sensor (76) and to define at least two zones within the environment (200) based on the received information, the at least two zones comprising at least one stop zone (300, 302) and at least one steer away zone (304, 306), wherein the controller (103): performs a stop action to bring the vehicle (10) to a stop if an object is detected in the at least one stop zone (300, 302); performs a steer maneuver to steer the vehicle (10) away from an object detected in the at least one steer away zone (304, 306); and tracks objects determined to be in the history zone (204) until such objects are no longer in the environment (200).
- The materials handling vehicle (10) according to claim 1 or the multiple detection zone control system of claim 2, wherein the at least one stop zone (300, 302) and the at least one steer away zone (304, 306) each define an area at least partially in front of a forward traveling direction of the vehicle (10).
- The materials handling vehicle (10) according to claim 1 or the multiple detection zone control system of claim 2, wherein the at least one stop zone (300, 302) comprises a closest one of the zones to a central axis of the vehicle (10).
- The materials handling vehicle (10) according to claim 4 or the multiple detection zone control system of claim 4, wherein the at least one steer away zone (304, 306) comprises left and right steer away zones (304, 306) located laterally outwardly from the at least one stop zone (300, 302) on opposed sides of the at least one stop zone (300, 302).
- The materials handling vehicle (10) according to claim 5 or the multiple detection zone control system of claim 5, wherein the history zone (204) comprises areas located laterally outwardly from the respective left and right steer away zones (304, 306).
- The materials handling vehicle (10) according to claim 1 or the multiple detection zone control system of claim 2, wherein the at least one contactless obstacle sensor (76) comprises at least one laser sensor.
- The materials handling vehicle (10) according to claim 1 or the multiple detection zone control system of claim 2, wherein objects that are determined to be in the history zone (204) but are not detected by the at least one contactless obstacle sensor (76) are tracked by the controller (103) via dead reckoning until such objects are no longer in the environment (200).
- The materials handling vehicle (10) according to claim 1 or the multiple detection zone control system of claim 2, wherein the scanned zone (202): includes the at least one stop zone (300, 302) and the at least one steer away zone (304, 306); and is actively scanned by the at least one contactless obstacle sensor (76) during operation of the vehicle (10).
- The materials handling vehicle (10) according to claim 9 or the multiple detection zone control system of claim 9, wherein at least a portion of the history zone (204) is located rearwardly from the scanned zone (202) and is not part of the scanned zone (202).
- The materials handling vehicle (10) according to claim 10 or the multiple detection zone control system of claim 10, wherein: the portion of the history zone (204) located rearwardly from the scanned zone (202) comprises a plurality of buckets (220) located adjacent to one another in an axial direction defined by a central axis of the vehicle (10); and objects being tracked in the portion of the history zone (204) located rearwardly from the scanned zone (202) during movement of the vehicle (10) are re-assigned to adjacent buckets (220) using dead reckoning so as to update the axial distance of the object relative to the vehicle (10).
- The materials handling vehicle according to claim 1 or the multiple detection zone control system of claim 2, wherein the environment (200) is defined by: a front edge (200A) that is displaced a predefined distance from a front of the vehicle (10); a rear edge (200B) that is located rearwardly from the front of the vehicle (10); a left edge (200C) that is displaced a predefined distance from a central axis of the vehicle (10); and a right edge (200D) that is displaced a predefined distance from the central axis of the vehicle (10).
- The materials handling vehicle (10) according to claim 1 or the multiple detection zone control system of claim 2, further comprising a receiver (102) at the vehicle (10) for receiving transmissions from a corresponding remote control device (70), the transmissions comprising at least a first type signal designating a travel request requesting the vehicle (10) to travel.
- The materials handling vehicle (10) according to claim 13 or the multiple detection zone control system of claim 13, wherein the controller (103) is further configured to refuse to implement a remote travel request if an object is detected within the at least one stop zone (300, 302) before the vehicle (10) begins travel.
- The materials handling vehicle (10) according to claim 1 or the multiple detection zone control system of claim 2, wherein the at least one steer away zone (304, 306) comprises at least one steer zone (308, 310) and at least one no steer zone (304, 306), wherein: if an object is detected in the at least one no steer zone (304, 306), the controller (103) does not permit the vehicle (10) to turn toward the object until the object moves out of the corresponding no steer zone (304, 306); and if an object is detected in the at least one steer zone (308, 310), the controller (103) permits the vehicle (10) to turn toward the object.
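The bucket-based dead reckoning recited in claims 11 and 12, and described in connection with Figs. 15A-15C, can be sketched as follows. This is a minimal illustration only: the bucket pitch, the rear-edge depth of the environment, and all names are assumptions introduced here, and negative bucket indices are used in this sketch to represent buckets in the history zone behind the scanned zone.

```python
BUCKET_PITCH = 0.5    # assumed axial length of one bucket (m)
REAR_EDGE = -6.0      # assumed axial position of the environment's rear edge (m)

def bucket_index(axial):
    """Bucket number for an axial offset measured forward from the sensor;
    negative indices fall behind the scanned zone, i.e. in the history zone."""
    return int(axial // BUCKET_PITCH)

def dead_reckon(tracked, distance_travelled):
    """Advance the truck by distance_travelled between scans and keep each
    tracked object's updated axial offset; objects that pass the rear edge
    leave the environment and are dropped (the end of tracking in claim 1)."""
    updated = []
    for axial in tracked:
        axial -= distance_travelled      # object recedes as the truck advances
        if axial > REAR_EDGE:
            updated.append(axial)
    return updated

objs = [2.0, 0.3, -5.8]                  # axial offsets of three tracked objects
objs = dead_reckon(objs, 0.6)            # truck moves 0.6 m with no new scan
print([bucket_index(a) for a in objs])   # → [2, -1]; the third object is gone
```

Re-assignment to an adjacent bucket falls out of recomputing `bucket_index` from the updated offset; the claims additionally require the lateral offset to be carried along, which is omitted here for brevity.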
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/033,169 US8731777B2 (en) | 2009-08-18 | 2011-02-23 | Object tracking and steer maneuvers for materials handling vehicles |
EP12706413.7A EP2678748B2 (en) | 2011-02-23 | 2012-02-21 | Object tracking and steer maneuvers for materials handling vehicles |
Related Parent Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP12706413.7A Division EP2678748B2 (en) | 2011-02-23 | 2012-02-21 | Object tracking and steer maneuvers for materials handling vehicles |
EP12706413.7A Division-Into EP2678748B2 (en) | 2011-02-23 | 2012-02-21 | Object tracking and steer maneuvers for materials handling vehicles |
Publications (3)
Publication Number | Publication Date |
---|---|
EP2889713A2 EP2889713A2 (en) | 2015-07-01 |
EP2889713A3 EP2889713A3 (en) | 2015-08-05 |
EP2889713B1 true EP2889713B1 (en) | 2016-12-14 |
Family
ID=45771947
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP12706413.7A Active EP2678748B2 (en) | 2011-02-23 | 2012-02-21 | Object tracking and steer maneuvers for materials handling vehicles |
EP14190556.2A Active EP2866113B1 (en) | 2011-02-23 | 2012-02-21 | Object tracking and steer maneuvers for materials handling vehicles |
EP15151549.1A Active EP2889713B1 (en) | 2011-02-23 | 2012-02-21 | Object tracking and steer maneuvers for materials handling vehicles |
EP14190558.8A Active EP2866114B1 (en) | 2011-02-23 | 2012-02-21 | Object tracking and steer maneuvers for materials handling vehicles |
EP15156819.3A Active EP2905668B1 (en) | 2011-02-23 | 2012-02-21 | Object tracking and steer maneuvers for materials handling vehicles |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP12706413.7A Active EP2678748B2 (en) | 2011-02-23 | 2012-02-21 | Object tracking and steer maneuvers for materials handling vehicles |
EP14190556.2A Active EP2866113B1 (en) | 2011-02-23 | 2012-02-21 | Object tracking and steer maneuvers for materials handling vehicles |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP14190558.8A Active EP2866114B1 (en) | 2011-02-23 | 2012-02-21 | Object tracking and steer maneuvers for materials handling vehicles |
EP15156819.3A Active EP2905668B1 (en) | 2011-02-23 | 2012-02-21 | Object tracking and steer maneuvers for materials handling vehicles |
Country Status (11)
Country | Link |
---|---|
US (3) | US8731777B2 (en) |
EP (5) | EP2678748B2 (en) |
KR (4) | KR20200008046A (en) |
CN (2) | CN103926929B (en) |
AU (1) | AU2012220819B2 (en) |
BR (2) | BR112013021044A2 (en) |
CA (3) | CA3004966C (en) |
IN (1) | IN2014CN03425A (en) |
MX (1) | MX339969B (en) |
RU (1) | RU2578831C9 (en) |
WO (1) | WO2012115920A2 (en) |
Families Citing this family (78)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8970363B2 (en) * | 2006-09-14 | 2015-03-03 | Crown Equipment Corporation | Wrist/arm/hand mounted device for remotely controlling a materials handling vehicle |
US9122276B2 (en) | 2006-09-14 | 2015-09-01 | Crown Equipment Corporation | Wearable wireless remote control device for use with a materials handling vehicle |
US10358331B2 (en) | 2010-12-20 | 2019-07-23 | Jlg Industries, Inc. | Work platform with protection against sustained involuntary operation |
US9522817B2 (en) | 2008-12-04 | 2016-12-20 | Crown Equipment Corporation | Sensor configuration for a materials handling vehicle |
US8731777B2 (en) | 2009-08-18 | 2014-05-20 | Crown Equipment Corporation | Object tracking and steer maneuvers for materials handling vehicles |
US8577551B2 (en) * | 2009-08-18 | 2013-11-05 | Crown Equipment Corporation | Steer control maneuvers for materials handling vehicles |
DE102010032909A1 (en) * | 2010-07-30 | 2012-02-02 | Wabco Gmbh | Monitoring system for monitoring the environment, in particular the rear space of motor vehicles |
US10124999B2 (en) | 2010-12-20 | 2018-11-13 | Jlg Industries, Inc. | Opto-electric system of enhanced operator control station protection |
US9146559B2 (en) * | 2011-03-18 | 2015-09-29 | The Raymond Corporation | System and method for gathering video data related to operation of an autonomous industrial vehicle |
EP2715286B1 (en) * | 2011-05-31 | 2020-11-25 | John Bean Technologies Corporation | Deep lane navigation system for automatic guided vehicles |
US9030332B2 (en) * | 2011-06-27 | 2015-05-12 | Motion Metrics International Corp. | Method and apparatus for generating an indication of an object within an operating ambit of heavy loading equipment |
US20140133944A1 (en) * | 2012-11-12 | 2014-05-15 | Lts Scale Company, Llc | Detection System Usable In Forklift Apparatus |
US9415983B2 (en) * | 2012-12-17 | 2016-08-16 | Shamrock Foods Company | Crash prevention system for a storage and retrieval machine |
US9144525B2 (en) * | 2013-03-14 | 2015-09-29 | Max Mobility, Llc. | Motion assistance system for wheelchairs |
FR3012387B1 (en) * | 2013-10-25 | 2015-10-30 | Rio Tinto Alcan Int Ltd | HANDLING VEHICLE AND ELECTROLYSIS FACTORY COMPRISING THIS VEHICLE |
US20150318765A1 (en) * | 2014-04-30 | 2015-11-05 | Rossie Owen Terry | Electrical motors and methods thereof having reduced electromagnetic emissions |
KR102075808B1 (en) * | 2013-12-30 | 2020-03-02 | 주식회사 두산 | Controller and control method of Forklift |
JP6267972B2 (en) * | 2014-01-23 | 2018-01-24 | 日立建機株式会社 | Work machine ambient monitoring device |
US9886036B2 (en) | 2014-02-10 | 2018-02-06 | John Bean Technologies Corporation | Routing of automated guided vehicles |
US9164514B1 (en) * | 2014-04-14 | 2015-10-20 | Southwest Research Institute | Cooperative perimeter patrol system and method |
CN103941736B (en) * | 2014-05-07 | 2017-06-30 | 山东理工大学 | A kind of intelligent carrier and its control method |
US9459349B2 (en) * | 2014-10-27 | 2016-10-04 | Hyster-Yale Group, Inc. | Vehicle and environmental detection system |
CN104483969B (en) * | 2014-12-05 | 2018-04-17 | 嘉兴市日新自动化科技有限公司 | The automatic patrol robot of road |
DE102015101381A1 (en) * | 2015-01-30 | 2016-08-04 | Hubtex Maschinenbau Gmbh & Co. Kg | Steering method, industrial truck and route guidance system |
US9864371B2 (en) | 2015-03-10 | 2018-01-09 | John Bean Technologies Corporation | Automated guided vehicle system |
US9989636B2 (en) * | 2015-03-26 | 2018-06-05 | Deere & Company | Multi-use detection system for work vehicle |
US10202267B2 (en) * | 2015-10-29 | 2019-02-12 | The Raymond Corporation | Systems and methods for sensing a load carried by a material handling vehicle |
EP3199486B1 (en) * | 2016-01-28 | 2018-06-20 | MOBA - Mobile Automation AG | Crane mechanism and work platform with a load measuring device and an integrated inclination sensor |
US20170253237A1 (en) * | 2016-03-02 | 2017-09-07 | Magna Electronics Inc. | Vehicle vision system with automatic parking function |
EP3269678B1 (en) | 2016-07-14 | 2019-03-06 | Toyota Material Handling Manufacturing Sweden AB | Floor conveyor |
EP3269680B1 (en) | 2016-07-14 | 2020-09-30 | Toyota Material Handling Manufacturing Sweden AB | Floor conveyor |
EP3269679B1 (en) | 2016-07-14 | 2019-09-11 | Toyota Material Handling Manufacturing Sweden AB | Floor conveyor |
DE102016115703A1 (en) | 2016-08-24 | 2018-03-01 | Jungheinrich Aktiengesellschaft | Industrial truck and method for controlling an industrial truck |
WO2018039556A1 (en) * | 2016-08-26 | 2018-03-01 | Crown Equipment Corporation | Multi-field scanning tools in materials handling vehicles |
US10589931B2 (en) | 2016-09-30 | 2020-03-17 | Staples, Inc. | Hybrid modular storage fetching system |
CA3038898A1 (en) | 2016-09-30 | 2018-04-05 | Staples, Inc. | Hybrid modular storage fetching system |
US10683171B2 (en) | 2016-09-30 | 2020-06-16 | Staples, Inc. | Hybrid modular storage fetching system |
CN107045677A (en) * | 2016-10-14 | 2017-08-15 | 北京石油化工学院 | A kind of harmful influence warehouse barrier Scan orientation restoring method, apparatus and system |
DE102016123541A1 (en) * | 2016-12-06 | 2018-06-07 | Jungheinrich Aktiengesellschaft | Method for automatic alignment of a truck in a warehouse and system of an industrial truck and a warehouse |
DE102016123542A1 (en) | 2016-12-06 | 2018-06-07 | Jungheinrich Aktiengesellschaft | Method for automatic alignment of a truck in a warehouse and system of an industrial truck and a warehouse |
AU2017279735C1 (en) * | 2016-12-23 | 2024-01-18 | The Raymond Corporation | Systems and methods for determining a rack interface for a material handling vehicle |
CN106671906B (en) * | 2016-12-30 | 2017-11-28 | 罗维迁 | Control method, control device and the AGV fork trucks of AGV fork trucks |
DE102017203514A1 (en) * | 2017-03-03 | 2018-09-20 | Robert Bosch Gmbh | Industrial truck with improved sensor concept and forklift system |
KR102569034B1 (en) | 2017-03-30 | 2023-08-22 | 크라운 이큅먼트 코포레이션 | warehouse mapping tool |
DE102017214185A1 (en) * | 2017-08-15 | 2019-02-21 | Zf Friedrichshafen Ag | Control of a transport vehicle |
WO2019042345A1 (en) * | 2017-08-30 | 2019-03-07 | 苏州宝时得电动工具有限公司 | Self-mobile device and control method for moving path thereof |
EP3466843A1 (en) * | 2017-10-03 | 2019-04-10 | AJ Produkter AB | A method and a device for controlling speed of a moving shuttle |
JP6753385B2 (en) | 2017-10-23 | 2020-09-09 | 株式会社豊田自動織機 | Remote control system for industrial vehicles, remote control device, remote control program for industrial vehicles, and remote control method for industrial vehicles |
CN107745908A (en) * | 2017-11-30 | 2018-03-02 | 无锡凯乐士科技有限公司 | A kind of new logistics shuttle |
DE102017128623A1 (en) | 2017-12-01 | 2019-06-06 | Jungheinrich Aktiengesellschaft | Method for coupling a second remote control unit with a first remote control unit |
CN112135715B (en) * | 2018-05-18 | 2024-03-19 | 科林达斯公司 | Remote communication and control system for robotic intervention procedures |
CA3108380A1 (en) * | 2018-08-01 | 2020-02-06 | Crown Equipment Corporation | Systems and methods for warehouse environment speed zone management |
US11590997B1 (en) | 2018-08-07 | 2023-02-28 | Staples, Inc. | Autonomous shopping cart |
US11084410B1 (en) | 2018-08-07 | 2021-08-10 | Staples, Inc. | Automated guided vehicle for transporting shelving units |
US11630447B1 (en) | 2018-08-10 | 2023-04-18 | Staples, Inc. | Automated guided vehicle for transporting objects |
DE102018121928A1 (en) | 2018-09-07 | 2020-03-12 | Jungheinrich Aktiengesellschaft | Industrial truck with a detection unit |
JP7180219B2 (en) | 2018-09-10 | 2022-11-30 | 株式会社豊田自動織機 | autonomous vehicle |
US20200125092A1 (en) * | 2018-10-17 | 2020-04-23 | Wellen Sham | Self-driving with onboard control system |
CN109361352B (en) * | 2018-11-09 | 2020-08-04 | 苏州瑞得恩光能科技有限公司 | Control method of cleaning system |
US11119487B2 (en) | 2018-12-31 | 2021-09-14 | Staples, Inc. | Automated preparation of deliveries in delivery vehicles using automated guided vehicles |
US11180069B2 (en) | 2018-12-31 | 2021-11-23 | Staples, Inc. | Automated loading of delivery vehicles using automated guided vehicles |
CN113302119A (en) | 2019-01-11 | 2021-08-24 | 杜尔系统股份公司 | Transport device, processing plant, method for transporting and/or processing objects |
EP4257406A3 (en) | 2019-02-01 | 2023-12-20 | Crown Equipment Corporation | On-board charging station for a remote control device |
US11641121B2 (en) | 2019-02-01 | 2023-05-02 | Crown Equipment Corporation | On-board charging station for a remote control device |
US11124401B1 (en) | 2019-03-31 | 2021-09-21 | Staples, Inc. | Automated loading of delivery vehicles |
US11453355B2 (en) | 2019-04-02 | 2022-09-27 | The Raymond Corporation | Systems and methods for a material handling vehicle with a multi-piece bumper assembly |
AU2020202311A1 (en) | 2019-04-02 | 2020-10-22 | The Raymond Corporation | Material handling vehicle having a multi-piece bumper assembly |
WO2021021008A1 (en) * | 2019-08-01 | 2021-02-04 | Telefonaktiebolaget Lm Ericsson (Publ) | Methods for risk management for autonomous devices and related node |
US11353877B2 (en) * | 2019-12-16 | 2022-06-07 | Zoox, Inc. | Blocked region guidance |
US11789155B2 (en) | 2019-12-23 | 2023-10-17 | Zoox, Inc. | Pedestrian object detection training |
US11462041B2 (en) | 2019-12-23 | 2022-10-04 | Zoox, Inc. | Pedestrians with objects |
CN115843346A (en) * | 2019-12-23 | 2023-03-24 | 祖克斯有限公司 | Pedestrian with object |
CN114929538A (en) * | 2020-02-21 | 2022-08-19 | 克朗设备公司 | Modifying vehicle parameters based on vehicle location information |
WO2022026144A1 (en) | 2020-07-31 | 2022-02-03 | Crown Equipment Corporation | On-board charging station for a remote control device |
EP4446840A3 (en) | 2020-08-11 | 2024-10-23 | Crown Equipment Corporation | Remote control device |
CA3189182A1 (en) | 2020-08-13 | 2022-02-17 | Michael J. Corbett | Method and system for testing a remote control device |
MX2023003898A (en) * | 2020-10-05 | 2023-04-24 | Crown Equip Corp | Systems and methods for relative pose sensing and field enforcement of materials handling vehicles using ultra-wideband radio technology. |
CN116746174A (en) | 2021-02-19 | 2023-09-12 | 克朗设备公司 | Power saving for remote control device |
Family Cites Families (166)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US1835808A (en) | 1930-02-06 | 1931-12-08 | Baker Raulang Co | Industrial truck |
US2959260A (en) | 1955-10-17 | 1960-11-08 | Thew Shovel Co | Dual control system for cranes and the like |
US3047783A (en) | 1957-08-27 | 1962-07-31 | Philips Corp | Remote control arrangements |
US3016973A (en) | 1958-12-29 | 1962-01-16 | Clark Equipment Co | Lift truck |
GB1002825A (en) | 1962-02-07 | 1965-09-02 | Lansing Bagnall Ltd | Improvements in or relating to industrial trucks |
US3587784A (en) | 1968-09-26 | 1971-06-28 | Hunter Manufacturing Co Inc | Telescopic load booster |
US3968893A (en) | 1972-06-26 | 1976-07-13 | Lapham Engineering Co., Inc. | Material handling vehicle with H-shaped walk area |
US3825130A (en) | 1972-06-26 | 1974-07-23 | S Lapham | Material handling system |
GB1515712A (en) | 1975-08-14 | 1978-06-28 | Total Mech Handling Ltd | Mechanical handling apparatus |
US4074120A (en) | 1976-03-02 | 1978-02-14 | Kenway Incorporated | Automated materials storage system and method |
US4258825A (en) | 1978-05-18 | 1981-03-31 | Collins Pat L | Powered manlift cart |
US4287966A (en) | 1978-11-06 | 1981-09-08 | Missouri Research Laboratories Inc. | Industrial truck |
JPS58217012A (en) | 1982-06-11 | 1983-12-16 | Kubota Ltd | Traveling vehicle with obstacle detecting sensor |
US4476954A (en) | 1982-09-22 | 1984-10-16 | Johnson Engineering Corporation | Remote control for motor vehicle |
GB8313338D0 (en) | 1983-05-14 | 1983-06-22 | Gen Electric Co Plc | Vehicle control |
US4551059A (en) | 1983-11-02 | 1985-11-05 | The United States Of America As Represented By The Secretary Of The Navy | Multi-directional straddle-lift carrier |
US4527651A (en) | 1983-11-17 | 1985-07-09 | Racine Federated Inc. | Remote control handle assembly |
US4665487A (en) | 1984-05-25 | 1987-05-12 | Kabushiki Kaisha Meidensha | Unmanned vehicle control system and method |
EP0243382B1 (en) | 1985-10-15 | 1992-08-26 | Hans-Reinhard Knepper | Process and installation for the automatic control of a utility vehicle |
US4716980A (en) | 1986-02-14 | 1988-01-05 | The Prime Mover Company | Control system for rider vehicles |
US4714140A (en) | 1986-03-17 | 1987-12-22 | Hatton John H | Multi-axis articulated all terrain vehicle |
US4785664A (en) | 1986-04-28 | 1988-11-22 | Kay-Ray, Inc. | Ultrasonic sensor |
US5457629A (en) | 1989-01-31 | 1995-10-10 | Norand Corporation | Vehicle data system with common supply of data and power to vehicle devices |
GB2197799A (en) | 1986-10-08 | 1988-06-02 | Synergistics Research | A remote controller for a toy vehicle |
US4954817A (en) | 1988-05-02 | 1990-09-04 | Levine Neil A | Finger worn graphic interface device |
JPH02152898A (en) | 1988-12-06 | 1990-06-12 | Komatsu Forklift Co Ltd | Stock control method in material handling vehicle |
US4967362A (en) | 1989-01-30 | 1990-10-30 | Eaton Corporation | Automatic steering apparatus for crop vehicle |
US5023790A (en) | 1989-02-17 | 1991-06-11 | Whs Robotics | Automatic guided vehicle system |
JP2636403B2 (en) | 1989-03-08 | 1997-07-30 | 株式会社豊田自動織機製作所 | Operation control device for unmanned vehicles |
JP2728174B2 (en) * | 1989-05-29 | 1998-03-18 | マツダ株式会社 | Forward car recognition device for mobile vehicles |
FR2648842B1 (en) | 1989-06-26 | 1992-05-15 | Screg Routes & Travaux | SECURITY SYSTEM FOR A MACHINE, PARTICULARLY FOR PUBLIC WORKS |
US5107946A (en) | 1989-07-26 | 1992-04-28 | Honda Giken Kogyo Kabushiki Kaisha | Steering control system for moving vehicle |
US5044472A (en) | 1989-12-05 | 1991-09-03 | Crown Equipment Corporation | Dual operator position for material handling vehicle |
US5170351A (en) | 1990-09-18 | 1992-12-08 | Matushita Electric Industrial Co., Ltd. | Automatic guided vehicle and method for controlling travel thereof |
US5307271A (en) | 1990-09-28 | 1994-04-26 | The United States Of America As Represented By The Secretary Of The Navy | Reflexive teleoperated control system for a remotely controlled vehicle |
US5141381A (en) | 1990-11-19 | 1992-08-25 | Daifuku Co., Ltd. | Safety arrangement for automatic warehousing facility |
US5117935A (en) * | 1990-12-21 | 1992-06-02 | Caterpillar Inc. | Load sensing hydrostatic steering system |
WO1992015977A1 (en) | 1991-03-04 | 1992-09-17 | Sydec N.V. | Selectively addressable programmable remote control system |
DE4111736C1 (en) | 1991-04-08 | 1992-08-27 | Mannesmann Ag, 4000 Duesseldorf, De | |
US5245144A (en) | 1991-10-25 | 1993-09-14 | Crown Equipment Corporation | Walk along hand grip switch control for pallet truck |
JP3210121B2 (en) * | 1992-02-10 | 2001-09-17 | 本田技研工業株式会社 | Obstacle avoidance route search method for moving objects |
US5502638A (en) | 1992-02-10 | 1996-03-26 | Honda Giken Kogyo Kabushiki Kaisha | System for obstacle avoidance path planning for multiple-degree-of-freedom mechanism |
RU2032926C1 (en) * | 1992-11-24 | 1995-04-10 | Лев Борисович Гурин | Method of control over equidistant traffic of vehicles and method of determination of angular and linear-side deviation of vehicle from reference path |
US5361861A (en) * | 1993-01-29 | 1994-11-08 | Trw Inc. | Power steering system |
EP0658467B1 (en) | 1993-11-10 | 1997-11-26 | Raymond Corporation | Guidewire controls for a manned, material handling vehicle |
JPH07138000A (en) | 1993-11-15 | 1995-05-30 | Shinko Electric Co Ltd | Manned/unmanned load handling vehicle |
DE4415736C2 (en) | 1994-05-04 | 2002-11-14 | Siemens Ag | Collision avoidance method using a steering angle field for an autonomous mobile unit |
SE514791C2 (en) * | 1994-06-06 | 2001-04-23 | Electrolux Ab | Improved method for locating lighthouses in self-propelled equipment |
US5652486A (en) | 1995-04-14 | 1997-07-29 | S.L.O.W. Corporation | Travel speed limiting system for forklift trucks |
WO1996039679A1 (en) | 1995-06-06 | 1996-12-12 | Mrigank Shekhar | Pointer device |
US5709523A (en) | 1995-06-07 | 1998-01-20 | Ware; Emmet P. | Material handling lift |
JPH0995194A (en) | 1995-09-29 | 1997-04-08 | Aisin Seiki Co Ltd | Detecting device for object in front of vehicle |
JP3239727B2 (en) * | 1995-12-05 | 2001-12-17 | トヨタ自動車株式会社 | Automatic driving control device for vehicles |
DE19613386A1 (en) | 1996-04-03 | 1997-10-09 | Fiat Om Carrelli Elevatori | Industrial truck, which can be operated either manually or automatically |
CA2210037C (en) | 1996-07-30 | 2001-01-23 | The Raymond Corporation | Motion control system for a materials handling vehicle |
US5939986A (en) | 1996-10-18 | 1999-08-17 | The United States Of America As Represented By The United States Department Of Energy | Mobile machine hazardous working zone warning system |
US5816741A (en) | 1997-04-03 | 1998-10-06 | Ingersoll-Rand Company | Remote control for walk-behind compactor |
FR2764091B1 (en) | 1997-05-30 | 1999-09-03 | Peugeot | REMOTE CONTROL AND OPERATING DEVICE FOR AT LEAST ONE URBAN VEHICLE, ESPECIALLY ELECTRIC |
NL1007225C2 (en) * | 1997-10-08 | 1999-04-09 | Maasland Nv | Vehicle combination. |
US6033366A (en) | 1997-10-14 | 2000-03-07 | Data Sciences International, Inc. | Pressure measurement device |
DE19746700C2 (en) | 1997-10-22 | 2000-01-13 | Wacker Werke Kg | Method and safety device for remote control of self-propelled work equipment |
US6173215B1 (en) | 1997-12-19 | 2001-01-09 | Caterpillar Inc. | Method for determining a desired response to detection of an obstacle |
CN1178467C (en) * | 1998-04-16 | 2004-12-01 | 三星电子株式会社 | Method and apparatus for automatically tracing moving object |
US6268803B1 (en) | 1998-08-06 | 2001-07-31 | Altra Technologies Incorporated | System and method of avoiding collisions |
US6030169A (en) | 1998-08-07 | 2000-02-29 | Clark Equipment Company | Remote attachment control device for power machine |
US6276485B1 (en) | 1998-12-30 | 2001-08-21 | Bt Industries Ab | Device at tiller truck |
US7133537B1 (en) * | 1999-05-28 | 2006-11-07 | It Brokerage Services Pty Limited | Method and apparatus for tracking a moving object |
US6226902B1 (en) | 1999-07-16 | 2001-05-08 | Case Corporation | Operator presence system with bypass logic |
GB2352859A (en) * | 1999-07-31 | 2001-02-07 | Ibm | Automatic zone monitoring using two or more cameras |
US6382359B1 (en) | 1999-10-29 | 2002-05-07 | Jungheinrich Aktiengesellschaft | Hand/co-traveller lift truck with a holding bar |
US6548982B1 (en) | 1999-11-19 | 2003-04-15 | Regents Of The University Of Minnesota | Miniature robotic vehicles and methods of controlling same |
US6686951B1 (en) | 2000-02-28 | 2004-02-03 | Case, Llc | Crop row segmentation by K-means clustering for a vision guidance system |
AT412196B (en) | 2000-03-17 | 2004-11-25 | Keba Ag | METHOD FOR ASSIGNING A MOBILE OPERATING AND / OR OBSERVATION DEVICE TO A MACHINE AND OPERATING AND / OR OBSERVATION DEVICE THEREFOR |
DE10015009B4 (en) | 2000-03-20 | 2006-02-23 | Jungheinrich Ag | Industrial truck with a display, control and monitoring system |
US20030205433A1 (en) | 2001-05-03 | 2003-11-06 | Hagman Earl L | Variable straddle transporter lift with programmable height positions |
DE10033857A1 (en) | 2000-07-12 | 2002-01-24 | Dambach Lagersysteme Gmbh | Storage system operating device is remotely controllable by operator via remote control unit; controller positions device coarsely, control unit automatically performs storage/extraction |
US6552661B1 (en) * | 2000-08-25 | 2003-04-22 | Rf Code, Inc. | Zone based radio frequency identification |
JP2002104800A (en) | 2000-09-29 | 2002-04-10 | Komatsu Forklift Co Ltd | Remote control device for battery forklift truck |
US20020163495A1 (en) | 2001-05-02 | 2002-11-07 | Plamen Doynov | Multi-functional ergonomic interface |
US6681638B2 (en) * | 2001-05-04 | 2004-01-27 | Homayoon Kazerooni | Device and method for wireless material handling systems |
US6464025B1 (en) | 2001-05-15 | 2002-10-15 | Crown Equipment Corporation | Coast control for walkie/rider pallet truck |
US6784800B2 (en) | 2001-06-19 | 2004-08-31 | Signal Tech | Industrial vehicle safety system |
JP3905727B2 (en) | 2001-07-13 | 2007-04-18 | 日産自動車株式会社 | Vehicle lane tracking control device |
US6595306B2 (en) | 2001-08-09 | 2003-07-22 | Crown Equipment Corporation | Supplemental walk along control for walkie/rider pallet trucks |
JP3865121B2 (en) | 2001-10-31 | 2007-01-10 | 株式会社小松製作所 | Vehicle obstacle detection device |
JP2003231422A (en) * | 2002-02-08 | 2003-08-19 | Hitachi Ltd | Automatic inter-vehicle distance control device and car |
US6862537B2 (en) * | 2002-03-21 | 2005-03-01 | Ford Global Technologies Llc | Sensor fusion system architecture |
DE10218010A1 (en) * | 2002-04-23 | 2003-11-06 | Bosch Gmbh Robert | Method and device for lateral guidance support in motor vehicles |
US6748292B2 (en) | 2002-07-15 | 2004-06-08 | Distrobot Systems, Inc. | Material handling method using autonomous mobile drive units and movable inventory trays |
DE10232295A1 (en) * | 2002-07-16 | 2004-02-05 | Daimlerchrysler Ag | Method for assisting the driver in driving maneuvers |
US7076366B2 (en) | 2002-09-06 | 2006-07-11 | Steven Simon | Object collision avoidance system for a vehicle |
GB0229700D0 (en) | 2002-12-19 | 2003-01-29 | Koninkl Philips Electronics Nv | Remote control system and authentication method |
FR2849160B1 (en) * | 2002-12-24 | 2005-03-18 | Alm | LIGHTING DEVICE AND USE THEREOF |
GB2398394B (en) | 2003-02-14 | 2006-05-17 | Dyson Ltd | An autonomous machine |
US6801125B1 (en) * | 2003-03-19 | 2004-10-05 | Delphi Technologies, Inc. | Rear steering hitch/docking mode |
EP1462880A3 (en) | 2003-03-24 | 2005-04-06 | Fila Luxembourg S.a.r.l. | Housing for electronic device wearable on user's finger |
US6813557B2 (en) | 2003-03-27 | 2004-11-02 | Deere & Company | Method and system for controlling a vehicle having multiple control modes |
US7016783B2 (en) | 2003-03-28 | 2006-03-21 | Delphi Technologies, Inc. | Collision avoidance with active steering and braking |
US7510038B2 (en) * | 2003-06-11 | 2009-03-31 | Delphi Technologies, Inc. | Steering system with lane keeping integration |
US7042438B2 (en) | 2003-09-06 | 2006-05-09 | Mcrae Michael William | Hand manipulated data apparatus for computers and video games |
JP2005094425A (en) | 2003-09-18 | 2005-04-07 | Fuji Xerox Co Ltd | Remote control device |
US7047132B2 (en) | 2004-01-12 | 2006-05-16 | Steven Jacobs | Mobile vehicle sensor array |
JP4257230B2 (en) | 2004-02-26 | 2009-04-22 | 株式会社東芝 | Mobile robot |
US7228231B2 (en) * | 2004-04-29 | 2007-06-05 | The Boeing Company | Multiple stayout zones for ground-based bright object exclusion |
US8075243B2 (en) | 2004-05-03 | 2011-12-13 | Jervis B. Webb Company | Automatic transport loading system and method |
WO2005108246A2 (en) | 2004-05-03 | 2005-11-17 | Jervis B. Webb Company | Automatic transport loading system and method |
US7017689B2 (en) * | 2004-05-06 | 2006-03-28 | Crown Equipment Corporation | Electrical steering assist for material handling vehicles |
DE102004027250A1 (en) * | 2004-06-03 | 2005-12-29 | Magna Donnelly Gmbh & Co. Kg | Method and device for assisted control of a motor vehicle |
US7188015B2 (en) * | 2004-07-14 | 2007-03-06 | Trimble Navigation Limited | Method and system for controlling a mobile machine |
CN101005981B (en) * | 2004-08-06 | 2010-06-16 | 本田技研工业株式会社 | Control device for vehicle |
US8406845B2 (en) * | 2004-09-01 | 2013-03-26 | University Of Tennessee Research Foundation | Method and apparatus for imaging tracking |
US20060125806A1 (en) | 2004-09-27 | 2006-06-15 | The Regents Of The University Of Minnesota | Human-activated displacement control appliance for use with computerized device/mechanism |
JP4400418B2 (en) * | 2004-10-29 | 2010-01-20 | 日産自動車株式会社 | Inter-vehicle distance control device, inter-vehicle distance control method, driving operation support device, and driving operation support method |
US7164118B2 (en) * | 2004-10-29 | 2007-01-16 | Deere & Company | Method and system for obstacle detection |
EP1813569A1 (en) | 2004-11-19 | 2007-08-01 | Mitsubishi Heavy Industries, Ltd. | Overturning prevention device for forklift truck |
KR20060059006A (en) | 2004-11-26 | 2006-06-01 | 삼성전자주식회사 | Method and apparatus of self-propelled mobile unit with obstacle avoidance during wall-following |
JP2006259877A (en) | 2005-03-15 | 2006-09-28 | Daifuku Co Ltd | Article conveyance equipment |
JP4093261B2 (en) | 2005-03-15 | 2008-06-04 | 松下電工株式会社 | Autonomous mobile device |
US7400415B2 (en) | 2005-03-15 | 2008-07-15 | Mitutoyo Corporation | Operator interface apparatus and method for displacement transducer with selectable detector area |
CA2531305A1 (en) | 2005-04-25 | 2006-10-25 | Lg Electronics Inc. | Self-moving robot capable of correcting movement errors and method for correcting movement errors of the same |
US20060250255A1 (en) | 2005-05-06 | 2006-11-09 | Flanagan Eugene E | Paired child to parent separation distance monitoring and alarm system and method of same |
US7679996B2 (en) | 2005-06-13 | 2010-03-16 | Robert Ray Gross | Methods and device for ultrasonic range sensing |
US7266477B2 (en) | 2005-06-22 | 2007-09-04 | Deere & Company | Method and system for sensor signal fusion |
DE102005045018A1 (en) * | 2005-09-21 | 2007-03-22 | Robert Bosch Gmbh | Device for longitudinal guidance of a motor vehicle |
US7996109B2 (en) | 2005-10-14 | 2011-08-09 | Aethon, Inc. | Robotic ordering and delivery apparatuses, systems and methods |
US7477973B2 (en) | 2005-10-15 | 2009-01-13 | Trimble Navigation Ltd | Vehicle gyro based steering assembly angle and angular rate sensor |
JP4887980B2 (en) * | 2005-11-09 | 2012-02-29 | 日産自動車株式会社 | VEHICLE DRIVE OPERATION ASSISTANCE DEVICE AND VEHICLE WITH VEHICLE DRIVE OPERATION ASSISTANCE DEVICE |
JP2007137126A (en) * | 2005-11-15 | 2007-06-07 | Mazda Motor Corp | Obstacle detecting device for vehicle |
US8050863B2 (en) | 2006-03-16 | 2011-11-01 | Gray & Company, Inc. | Navigation and control system for autonomous vehicles |
KR100776944B1 (en) * | 2006-05-01 | 2007-11-21 | 주식회사 한울로보틱스 | The map building method for mobile robot |
US9645968B2 (en) | 2006-09-14 | 2017-05-09 | Crown Equipment Corporation | Multiple zone sensing for materials handling vehicles |
US8452464B2 (en) | 2009-08-18 | 2013-05-28 | Crown Equipment Corporation | Steer correction for a remotely operated materials handling vehicle |
EP3190572B1 (en) | 2006-09-14 | 2018-12-12 | Crown Equipment Corporation | Method of remotely controlling a materials handling vehicle |
US8072309B2 (en) | 2006-09-14 | 2011-12-06 | Crown Equipment Corporation | Systems and methods of remotely controlling a materials handling vehicle |
US9207673B2 (en) | 2008-12-04 | 2015-12-08 | Crown Equipment Corporation | Finger-mounted apparatus for remotely controlling a materials handling vehicle |
JP4270259B2 (en) * | 2006-10-05 | 2009-05-27 | 日産自動車株式会社 | Obstacle avoidance control device |
US8983765B2 (en) * | 2006-10-11 | 2015-03-17 | GM Global Technology Operations LLC | Method and system for lane centering control |
EP2115692A4 (en) | 2006-12-13 | 2011-11-16 | Crown Equip Corp | Fleet management system |
JP4905156B2 (en) | 2007-01-26 | 2012-03-28 | 株式会社豊田自動織機 | Industrial vehicle travel control device |
KR20080073933A (en) * | 2007-02-07 | 2008-08-12 | 삼성전자주식회사 | Object tracking method and apparatus, and object pose information calculating method and apparatus |
JP2008226140A (en) * | 2007-03-15 | 2008-09-25 | Mazda Motor Corp | Vehicle operation support system |
US8068962B2 (en) * | 2007-04-05 | 2011-11-29 | Power Curbers, Inc. | 3D control system for construction machines |
DE102007027494B4 (en) | 2007-06-14 | 2012-12-06 | Daimler Ag | A method and apparatus for assisting the driver of a vehicle in vehicle guidance |
US8195366B2 (en) | 2007-09-13 | 2012-06-05 | The Raymond Corporation | Control system for a pallet truck |
US8027029B2 (en) * | 2007-11-07 | 2011-09-27 | Magna Electronics Inc. | Object detection and tracking system |
EP2085279B1 (en) * | 2008-01-29 | 2011-05-25 | Ford Global Technologies, LLC | A system for collision course prediction |
JP4775391B2 (en) | 2008-03-18 | 2011-09-21 | 株式会社デンソー | Obstacle detection device |
KR100946723B1 (en) * | 2008-04-12 | 2010-03-12 | 재단법인서울대학교산학협력재단 | Steering Method for vehicle and Apparatus thereof |
US8170787B2 (en) | 2008-04-15 | 2012-05-01 | Caterpillar Inc. | Vehicle collision avoidance system |
JP4538762B2 (en) * | 2008-05-20 | 2010-09-08 | トヨタ自動車株式会社 | Inter-vehicle distance control device |
DE102008027282A1 (en) * | 2008-06-06 | 2009-12-10 | Claas Industrietechnik Gmbh | Agricultural vehicle and operating procedure for it |
US8190364B2 (en) * | 2008-06-30 | 2012-05-29 | Deere & Company | System and method for providing towed implement compensation |
KR100962529B1 (en) * | 2008-07-22 | 2010-06-14 | 한국전자통신연구원 | Method for tracking object |
US8280560B2 (en) * | 2008-07-24 | 2012-10-02 | GM Global Technology Operations LLC | Adaptive vehicle control system with driving style recognition based on headway distance |
US8705792B2 (en) | 2008-08-06 | 2014-04-22 | Toyota Motor Engineering & Manufacturing North America, Inc. | Object tracking using linear features |
US8229618B2 (en) * | 2008-09-11 | 2012-07-24 | Deere & Company | Leader-follower fully autonomous vehicle with operator on side |
US8392065B2 (en) * | 2008-09-11 | 2013-03-05 | Deere & Company | Leader-follower semi-autonomous vehicle with operator on side |
EP2370870B1 (en) * | 2008-12-04 | 2019-01-30 | Crown Equipment Corporation | Multiple zone sensing for materials handling vehicles |
EP2381697B1 (en) | 2008-12-24 | 2014-11-12 | Doosan Infracore Co., Ltd. | Remote control system and method for construction equipment |
US8099214B2 (en) * | 2009-02-09 | 2012-01-17 | GM Global Technology Operations LLC | Path planning for autonomous parking |
US20100209888A1 (en) * | 2009-02-18 | 2010-08-19 | Gm Global Technology Operations, Inc. | Vehicle stability enhancement control adaptation to driving skill based on curve-handling maneuvers |
EP2533119B1 (en) | 2009-07-02 | 2017-02-08 | Crown Equipment Corporation | Apparatus for remotely controlling a materials handling vehicle |
US8120476B2 (en) * | 2009-07-22 | 2012-02-21 | International Truck Intellectual Property Company, Llc | Digital camera rear-view system |
AU2009351340B2 (en) | 2009-08-18 | 2015-06-18 | Crown Equipment Corporation | Steer correction for a remotely operated materials handling vehicle |
US8731777B2 (en) | 2009-08-18 | 2014-05-20 | Crown Equipment Corporation | Object tracking and steer maneuvers for materials handling vehicles |
2011
- 2011-02-23 US US13/033,169 patent/US8731777B2/en active Active
2012
- 2012-02-21 CN CN201410186886.3A patent/CN103926929B/en active Active
- 2012-02-21 RU RU2013138708/11A patent/RU2578831C9/en not_active IP Right Cessation
- 2012-02-21 CA CA3004966A patent/CA3004966C/en active Active
- 2012-02-21 EP EP12706413.7A patent/EP2678748B2/en active Active
- 2012-02-21 CA CA3005016A patent/CA3005016C/en active Active
- 2012-02-21 BR BR112013021044A patent/BR112013021044A2/en not_active Application Discontinuation
- 2012-02-21 MX MX2013009769A patent/MX339969B/en active IP Right Grant
- 2012-02-21 KR KR1020207001300A patent/KR20200008046A/en not_active Application Discontinuation
- 2012-02-21 KR KR1020137024683A patent/KR102038848B1/en active IP Right Grant
- 2012-02-21 IN IN3425CHN2014 patent/IN2014CN03425A/en unknown
- 2012-02-21 EP EP14190556.2A patent/EP2866113B1/en active Active
- 2012-02-21 KR KR1020147012563A patent/KR101940469B1/en active IP Right Grant
- 2012-02-21 AU AU2012220819A patent/AU2012220819B2/en active Active
- 2012-02-21 CN CN201280010186.8A patent/CN103392157B/en active Active
- 2012-02-21 EP EP15151549.1A patent/EP2889713B1/en active Active
- 2012-02-21 CA CA2827735A patent/CA2827735C/en active Active
- 2012-02-21 EP EP14190558.8A patent/EP2866114B1/en active Active
- 2012-02-21 BR BR122014017960A patent/BR122014017960A2/en not_active Application Discontinuation
- 2012-02-21 WO PCT/US2012/025849 patent/WO2012115920A2/en active Application Filing
- 2012-02-21 KR KR1020197025277A patent/KR102144781B1/en active IP Right Grant
- 2012-02-21 EP EP15156819.3A patent/EP2905668B1/en active Active
-
2013
- 2013-06-10 US US13/913,663 patent/US9002581B2/en active Active
-
2014
- 2014-03-12 US US14/205,449 patent/US9493184B2/en active Active
Non-Patent Citations (1)
Title |
---|
None * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2889713B1 (en) | | Object tracking and steer maneuvers for materials handling vehicles |
AU2015207833B2 (en) | | Steer control maneuvers for materials handling vehicles |
US8452464B2 (en) | | Steer correction for a remotely operated materials handling vehicle |
EP2685337B1 (en) | | Steer correction for a remotely operated materials handling vehicle |
AU2014268191B2 (en) | | Object tracking and steer maneuvers for materials handling vehicles |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20150116 |
|
AC | Divisional application: reference to earlier application |
Ref document number: 2678748 Country of ref document: EP Kind code of ref document: P |
|
AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
PUAL | Search report despatched |
Free format text: ORIGINAL CODE: 0009013 |
|
AK | Designated contracting states |
Kind code of ref document: A3 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: B66F 9/075 20060101ALI20150701BHEP Ipc: G05D 1/02 20060101AFI20150701BHEP Ipc: B62D 6/00 20060101ALI20150701BHEP Ipc: B62D 15/02 20060101ALI20150701BHEP Ipc: B66F 9/06 20060101ALI20150701BHEP |
|
R17P | Request for examination filed (corrected) |
Effective date: 20160205 |
|
RBV | Designated contracting states (corrected) |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G05D 1/02 20060101AFI20160613BHEP Ipc: B66F 9/075 20060101ALI20160613BHEP Ipc: B62D 15/02 20060101ALI20160613BHEP Ipc: B62D 6/00 20060101ALI20160613BHEP Ipc: B66F 9/06 20060101ALI20160613BHEP |
|
INTG | Intention to grant announced |
Effective date: 20160704 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AC | Divisional application: reference to earlier application |
Ref document number: 2678748 Country of ref document: EP Kind code of ref document: P |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 854139 Country of ref document: AT Kind code of ref document: T Effective date: 20170115 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602012026754 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 6 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20161214 |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: FP |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG4D |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170314 Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20161214 Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20161214 Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170315 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 854139 Country of ref document: AT Kind code of ref document: T Effective date: 20161214 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20161214 Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20161214 Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20161214 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170414 Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20161214 Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20161214 Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20161214 Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20161214 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170314 Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20161214 Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20161214 Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20161214 Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20161214 Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170414 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602012026754 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20161214 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20170228 Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20170228 |
|
26N | No opposition filed |
Effective date: 20170915 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: MM4A |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20161214 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20170221 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 7 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20161214 Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20170221 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MT Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20170221 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20120221 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20161214 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20161214 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20161214 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20161214 |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230529 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R079 Ref document number: 602012026754 Country of ref document: DE Free format text: PREVIOUS MAIN CLASS: G05D0001020000 Ipc: G05D0001430000 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: NL Payment date: 20240219 Year of fee payment: 13 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20240219 Year of fee payment: 13 Ref country code: GB Payment date: 20240219 Year of fee payment: 13 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: IT Payment date: 20240228 Year of fee payment: 13 Ref country code: FR Payment date: 20240221 Year of fee payment: 13 Ref country code: BE Payment date: 20240219 Year of fee payment: 13 |