MX2013009769A - Object tracking and steer maneuvers for materials handling vehicles. - Google Patents

Object tracking and steer maneuvers for materials handling vehicles.

Info

Publication number
MX2013009769A
Authority
MX
Mexico
Prior art keywords
vehicle
truck
zone
area
steering
Prior art date
Application number
MX2013009769A
Other languages
Spanish (es)
Other versions
MX339969B (en)
Inventor
William W Mccroskey
Anthony T Castaneda
James F Schloemer
Mark E Schumacher
Vernon W Siefring
Timothy A Wellman
Original Assignee
Crown Equip Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed. "Global patent litigation dataset" by Darts-ip (https://patents.darts-ip.com/?family=45771947) is licensed under a Creative Commons Attribution 4.0 International License.
Application filed by Crown Equip Corp filed Critical Crown Equip Corp
Publication of MX2013009769A publication Critical patent/MX2013009769A/en
Publication of MX339969B publication Critical patent/MX339969B/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B62: LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D: MOTOR VEHICLES; TRAILERS
    • B62D6/00: Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits
    • B62D6/002: Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits, computing target steering angles for front or rear wheels
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B62: LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D: MOTOR VEHICLES; TRAILERS
    • B62D15/00: Steering not otherwise provided for
    • B62D15/02: Steering position indicators; Steering position determination; Steering aids
    • B62D15/025: Active steering aids, e.g. helping the driver by actively influencing the steering system after environment evaluation
    • B62D15/0265: Automatic obstacle avoidance by steering
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B62: LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D: MOTOR VEHICLES; TRAILERS
    • B62D6/00: Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits
    • B62D6/001: Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits, the torque NOT being among the input parameters
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B66: HOISTING; LIFTING; HAULING
    • B66F: HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F9/00: Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
    • B66F9/06: Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
    • B66F9/075: Constructional features or details
    • B66F9/0755: Position control; Position detectors
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B66: HOISTING; LIFTING; HAULING
    • B66F: HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F9/00: Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
    • B66F9/06: Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
    • B66F9/075: Constructional features or details
    • B66F9/07568: Steering arrangements
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B66: HOISTING; LIFTING; HAULING
    • B66F: HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F9/00: Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
    • B66F9/06: Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
    • B66F9/075: Constructional features or details
    • B66F9/07581: Remote controls
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0255: Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultra-sonic signals
    • G: PHYSICS
    • G08: SIGNALLING
    • G08C: TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00: Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C17/02: Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • G: PHYSICS
    • G08: SIGNALLING
    • G08C: TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C2201/00: Transmission systems of control signals via wireless link
    • G08C2201/20: Binding and programming of remote control devices


Abstract

A materials handling vehicle automatically implements steer maneuvers when objects enter one or more zones proximate the vehicle, wherein the zones are monitored by a controller associated with the vehicle. The controller tracks objects in the zones via sensor data obtained from at least one obstacle sensor located on the vehicle and via dead reckoning. The objects are tracked by the controller until they are no longer in an environment proximate the vehicle. Different zones result in different steer maneuvers being implemented by the controller.

Description

OBJECT TRACKING AND STEER MANEUVERS FOR MATERIALS HANDLING VEHICLES TECHNICAL FIELD The present invention relates generally to materials handling vehicles and, more particularly, to object tracking and steering schemes for materials handling vehicles, such as remotely operated low level order picking trucks.
BACKGROUND OF THE INVENTION Low level order picking trucks are commonly used for picking stock in warehouses and distribution centers. Such order picking trucks typically include load carrying forks and a power unit having a platform upon which an operator may step and stand while controlling the truck. The power unit also has a steerable wheel and corresponding traction and steering control mechanisms, e.g., a movable steering arm that is coupled to the steerable wheel. A control handle attached to the steering arm typically includes the operational controls necessary for driving the truck and operating its load handling features.
In a typical stock picking operation, an operator fills orders from available stock items that are located in storage areas provided along a plurality of aisles of a warehouse or distribution center. In this regard, the operator drives a low level order picking truck to a first location where item(s) are to be picked. In a picking process, the operator typically steps off the order picking truck, walks over to the appropriate location and retrieves the ordered stock item(s) from their associated storage area(s). The operator then returns to the order picking truck and places the picked stock on a pallet, collection cage or other support structure carried by the forks of the truck. Upon completing the picking process, the operator advances the order picking truck to the next location where item(s) are to be picked. The above procedure is repeated until all stock items on the order have been picked.
It is not uncommon for an operator to repeat the picking procedure several hundred times per order. Moreover, the operator may be required to pick numerous orders per shift. As such, the operator may be required to spend a considerable amount of time repositioning the order picking truck, which reduces the time available for the operator to spend picking stock.
BRIEF DESCRIPTION OF THE INVENTION In accordance with various aspects of the present invention, methods and systems are provided for a materials handling vehicle to automatically perform a steer correction maneuver. Sensor data is received by a controller on a materials handling vehicle from at least one sensor device. Based on the received sensor data, a first object is detected that is located in a first zone defined at least partially on a first side of the vehicle, and a second object is detected that is located in a second zone defined at least partially on a second side of the vehicle, wherein the second object is closer to a central axis of the vehicle than the first object. A steer correction maneuver is automatically performed by steering the vehicle toward the first object so as to steer the vehicle away from the second object, until at least one of: the first object enters a predefined portion of the first zone; and the second object exits a predefined portion of the second zone.
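The steer correction decision described above can be sketched in a few lines. This is a hypothetical illustration only: the function names, the proportional gain and the completion tests are assumptions for clarity, not the patent's actual control law.

```python
def steer_correction_angle(first_lateral, second_lateral, gain=2.0):
    """Return a steer angle in degrees: positive steers toward the first
    (farther) object, which moves the vehicle away from the second
    (nearer) object.

    first_lateral / second_lateral: lateral distances (m) of the two
    detected objects from the vehicle's central axis; the second object
    is assumed closer to that axis than the first.
    """
    # Steer proportionally to how much closer the second object sits
    # to the central axis than the first object does.
    error = first_lateral - second_lateral
    return gain * error


def correction_complete(first_in_predefined_portion, second_exited_predefined_portion):
    """The maneuver ends when the first object enters a predefined portion
    of the first zone OR the second object exits a predefined portion of
    the second zone, per the description above."""
    return first_in_predefined_portion or second_exited_predefined_portion
```

With a gain of 2.0, an object pair at 1.0 m and 0.4 m from the central axis would yield a 1.2 degree correction toward the farther object; the maneuver then runs until either termination condition holds.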
In accordance with further aspects of the present invention, methods and systems are provided for tracking objects detected by at least one sensor device on a materials handling vehicle. Sensor data is received by a controller on a materials handling vehicle from at least one sensor device. The sensor data includes: data representative of whether an object is detected in a scanned area that is scanned by the at least one sensor device, the scanned area being a part of an environment in which objects are tracked; and data representative of a lateral distance of any detected object from a reference coordinate associated with the vehicle. Each detected object is tracked until the object is no longer located within the environment by: assigning the object to at least one bucket defined within the scanned area of the at least one sensor device; and using at least one of subsequent sensor data and dead reckoning to reassign the object to adjacent buckets and to determine an updated lateral distance of the object from the reference coordinate as the vehicle moves. The controller automatically implements a steer correction maneuver if a tracked object enters a defined steer away area within the environment.
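A minimal sketch of this bucket-based tracking scheme follows. The bucket depth, bucket count and steer-away boundary are illustrative assumptions; the patent text does not give numeric values or these names.

```python
BUCKET_DEPTH_M = 0.5   # depth of one bucket along the travel direction (assumed)
NUM_BUCKETS = 10       # the tracked environment spans 5 m ahead (assumed)


def update_tracks(tracks, distance_traveled_m):
    """Reassign each tracked object to an adjacent bucket via dead
    reckoning as the vehicle advances, and drop objects that have left
    the environment. tracks is a list of (bucket_index, lateral_m) pairs.
    """
    shift = int(distance_traveled_m / BUCKET_DEPTH_M)
    updated = []
    for bucket, lateral in tracks:
        new_bucket = bucket - shift          # object moves toward the vehicle
        if 0 <= new_bucket < NUM_BUCKETS:    # still inside the environment
            updated.append((new_bucket, lateral))
    return updated


def in_steer_away_area(lateral_m, boundary_m=0.3):
    """True when a tracked object's lateral offset from the vehicle's
    reference coordinate falls inside the steer-away boundary, which
    would trigger a steer correction maneuver."""
    return abs(lateral_m) < boundary_m
```

Between scans, only dead reckoning moves objects between buckets; when fresh sensor data arrives, the lateral distances would be refreshed from the scan rather than propagated.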
In accordance with still further aspects of the present invention, methods and systems are provided for a materials handling vehicle to automatically implement a steer maneuver. Sensor data is received by a controller on a materials handling vehicle from at least one sensor device. A selected object is detected in an environment proximate the vehicle. A steer maneuver is performed by steering the vehicle such that the vehicle remains substantially at a desired distance from the selected object.
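Holding a desired distance from a selected object can be sketched as a simple proportional controller. The gain, clamp and sign convention below are assumptions for illustration, not the patent's implementation.

```python
def hug_steer_angle(measured_distance, desired_distance, gain=5.0, max_angle=10.0):
    """Return a steer angle (degrees) that keeps the vehicle substantially
    at desired_distance from the selected object.

    Positive angles steer toward the object (vehicle too far away);
    negative angles steer away (vehicle too close). The output is clamped
    to a maximum steer angle.
    """
    error = measured_distance - desired_distance
    angle = gain * error
    return max(-max_angle, min(max_angle, angle))
```

For example, measuring 2.0 m against a desired 1.5 m yields a modest 2.5 degree correction toward the object, while a large error saturates at the 10 degree clamp.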
BRIEF DESCRIPTION OF THE DRAWINGS Figure 1 is an illustration of a materials handling vehicle capable of remote wireless operation in accordance with various aspects of the present invention; Figure 2 is a schematic diagram of several components of a materials handling vehicle capable of remote wireless operation in accordance with various aspects of the present invention; Figure 3 is a schematic diagram illustrating detection zones of a materials handling vehicle in accordance with various aspects of the present invention; Figure 4 is a schematic diagram illustrating an illustrative approach for detecting an object in accordance with various aspects of the present invention; Figure 5 is a schematic diagram illustrating a plurality of detection zones of a materials handling vehicle in accordance with further aspects of the present invention; Figure 6 is an illustration of a materials handling vehicle having spaced-apart obstacle detectors in accordance with various aspects of the present invention; Figure 7 is an illustration of a materials handling vehicle having obstacle detectors in accordance with further aspects of the present invention; Figure 8 is an illustration of a materials handling vehicle having obstacle detectors in accordance with still further aspects of the present invention; Figure 9 is a schematic block diagram of a control system of a materials handling vehicle that is coupled to sensors for detecting objects in the travel path of the vehicle in accordance with various aspects of the present invention; Figure 10 is a flow chart of a method of implementing steer correction in accordance with various aspects of the present invention; Figure 11 is a schematic illustration of a materials handling vehicle traveling down a narrow warehouse aisle under remote wireless operation, which automatically implements a steer correction maneuver in accordance with various aspects of the present invention; Figure 12 is a graph illustrating an illustrative speed of a materials handling vehicle implementing a steer correction maneuver under remote wireless operation in accordance with various aspects of the present invention; Figure 13 is a graph illustrating illustrative steer bumper input data to a controller, which illustrates whether an object is detected in the left or right steer bumper zones, in accordance with various aspects of the present invention; Figure 14 is a graph illustrating steer correction in degrees, to illustrate an exemplary steer correction maneuver applied to a materials handling vehicle under remote wireless operation in accordance with various aspects of the present invention; Figures 15A-15C are schematic illustrations of an illustrative environment used in connection with object tracking on a materials handling vehicle traveling under remote wireless operation in accordance with various aspects of the present invention; Figures 16A-16C are schematic illustrations of illustrative zones used to implement steer maneuvers on a materials handling vehicle traveling under remote wireless operation in accordance with various aspects of the present invention; and Figures 17A-17C are schematic illustrations of a materials handling vehicle traveling down a warehouse aisle under remote wireless operation, which automatically implements steer maneuvers in accordance with various aspects of the present invention.
MODES FOR CARRYING OUT THE INVENTION In the following detailed description of the illustrated embodiments, reference is made to the accompanying drawings that form a part hereof, and in which specific embodiments in which the invention may be practiced are shown by way of illustration, and not by way of limitation. It is to be understood that other embodiments may be utilized and that changes may be made without departing from the spirit and scope of the various embodiments of the present invention.
Low Level Order Picking Truck: Referring now to the drawings, and particularly to Figure 1, a materials handling vehicle, such as a low level order picking truck 10, is illustrated, which generally includes a load handling assembly 12 that extends from a power unit 14. The load handling assembly 12 includes a pair of forks 16, each fork 16 having a load supporting wheel assembly 18. The load handling assembly 12 may include other load handling features in addition to, or in lieu of, the illustrated arrangement of the forks 16, such as a load backrest, scissors-type elevating forks, outriggers or separate height-adjustable forks. Still further, the load handling assembly 12 may include load handling features such as a mast, a load platform, collection cage or other support structure carried by the forks 16 or otherwise provided for handling a load supported and carried by the truck 10.
The illustrated power unit 14 comprises a step-through operator's station dividing a first end section of the power unit 14 (opposite the forks 16) from a second end section (proximate the forks 16). The step-through operator's station provides a platform upon which an operator may stand to drive the truck 10 and/or to provide a position from which the operator may operate the various included features of the truck 10.
Presence sensors 58 may be provided to detect the presence of an operator on the truck 10. For example, presence sensors 58 may be located on, above or under the platform floor, or otherwise provided about the operator's station. In the illustrative truck of Figure 1, the presence sensors 58 are shown in dashed lines, indicating that they are positioned under the platform floor. Under this arrangement, the presence sensors 58 may comprise load sensors, switches, etc. As an alternative, the presence sensors 58 may be implemented above the platform floor, such as by using ultrasonic, capacitive or other suitable sensing technology. The utilization of presence sensors 58 will be described in greater detail herein.
An antenna 66 extends vertically from the power unit 14 and is provided for receiving control signals from a corresponding wireless remote control device 70. The remote control device 70 may comprise a transmitter that is worn or otherwise maintained by the operator. The remote control device 70 is manually operable by an operator, e.g., by pressing a button or other control, to cause the remote control device 70 to wirelessly transmit at least a first type of signal designating a travel request to the truck 10. The travel request is a command that requests the corresponding truck 10 to travel by a predetermined amount, as will be described in greater detail herein.
The truck 10 also comprises one or more obstacle sensors 76, which are provided on the truck 10, for example, toward the first end section of the power unit 14 and/or to the sides of the power unit 14. The obstacle sensors 76 include at least one contactless obstacle sensor on the truck 10, and are operable to define at least one detection zone. For example, at least one detection zone may define an area at least partially in front of a forward traveling direction of the truck 10 when the truck 10 is traveling in response to a wirelessly received travel request from the remote control device 70, as will also be described in greater detail herein.
The obstacle sensors 76 may comprise any suitable proximity detection technology, such as ultrasonic sensors, optical recognition devices, infrared sensors, laser scanner sensors, etc., which are capable of detecting the presence of objects/obstacles, or are capable of generating signals that can be analyzed to detect the presence of objects/obstacles, within the predefined detection zone(s) of the power unit 14.
In practice, the truck 10 may be implemented in other formats, styles and features, such as an end control pallet truck that includes a steering tiller arm coupled to a tiller handle for steering the truck. Similarly, although the remote control device 70 is illustrated as a glove-like structure 70, numerous implementations of the remote control device 70 may be implemented, including for example, finger worn, lanyard or sash mounted structures, etc. Still further, the truck, remote control system and/or components thereof, including the remote control device 70, may comprise any additional and/or alternative features or implementations, examples of which are described in U.S. Provisional Patent Application Serial No. 60/825,688, filed September 14, 2006, entitled "SYSTEMS AND METHODS FOR REMOTELY CONTROLLING A MATERIALS HANDLING VEHICLE"; U.S. Patent Application Serial No. 11/855,310, filed September 14, 2007, entitled "SYSTEMS AND METHODS FOR REMOTELY CONTROLLING A MATERIALS HANDLING VEHICLE"; U.S. Patent Application Serial No. 11/855,324, filed September 14, 2007, entitled "SYSTEMS AND METHODS FOR REMOTELY CONTROLLING A MATERIALS HANDLING VEHICLE"; U.S. Provisional Patent Application Serial No. 61/222,632, filed July 2, 2009, entitled "APPARATUS FOR REMOTELY CONTROLLING A MATERIALS HANDLING VEHICLE"; U.S. Patent Application Serial No. 12/631,007, filed December 4, 2009, entitled "MULTIPLE ZONE SENSING FOR MATERIALS HANDLING VEHICLES"; U.S. Provisional Patent Application Serial No. 61/119,952, filed December 4, 2008, entitled "MULTIPLE ZONE SENSING FOR REMOTELY CONTROLLED MATERIALS HANDLING VEHICLES"; and/or U.S. Patent No. 7,017,689, issued March 28, 2006, entitled "ELECTRICAL STEERING ASSIST FOR MATERIAL HANDLING VEHICLE", the entire disclosures of which are each hereby incorporated by reference herein.
Control System for Remote Operation of a Low Level Order Picking Truck: Referring to Figure 2, a block diagram illustrates a control arrangement for integrating remote control commands with the truck 10. The antenna 66 is coupled to a receiver 102 for receiving commands issued by the remote control device 70. The receiver 102 passes the received control signals to a controller 103, which implements the appropriate response to the received commands and may thus also be referred to herein as a master controller. In this regard, the controller 103 is implemented in hardware and may also execute software (including firmware, resident software, microcode, etc.). Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon. For example, the truck 10 may include memory that stores the computer program product, which, when implemented by a processor of the controller 103, implements steer correction as described more fully herein.
Thus, the controller 103 may define, at least in part, a data processing system suitable for storing and/or executing program code, and may include at least one processor coupled directly or indirectly to memory elements, e.g., through a system bus or other suitable connection. The memory elements can include local memory employed during actual execution of the program code, memory that is integrated into a microcontroller or application specific integrated circuit (ASIC), a programmable gate array or other reconfigurable processing device, etc.
The response implemented by the controller 103 in response to remotely received commands, e.g., via the wireless transmitter 70 and corresponding antenna 66 and receiver 102, may comprise one or more actions, or inaction, depending upon the logic that is being implemented. Positive actions may comprise controlling, adjusting or otherwise affecting one or more components of the truck 10. The controller 103 may also receive information from other inputs 104, e.g., from sources such as the presence sensors 58, the obstacle sensors 76, switches, load sensors, encoders and other devices/features available to the truck 10, to determine appropriate action in response to commands received from the remote control device 70. The sensors 58, 76, etc., may be coupled to the controller 103 via the inputs 104 or via a suitable truck network, such as a control area network (CAN) bus 110.
In an illustrative arrangement, the remote control device 70 is operative to wirelessly transmit a control signal that represents a first type of signal such as a travel command to the receiver 102 on the truck 10. The travel command is also referred to herein as a "travel signal", "travel request" or "go signal". The travel request is used to initiate a request to the truck 10 to travel by a predetermined amount, e.g., to cause the truck 10 to advance or jog in a first direction by a limited travel distance. The first direction may be defined, for example, by movement of the truck 10 in a power unit 14 first, i.e., forks 16 to the back, direction. However, other directions of travel may alternatively be defined. Moreover, the truck 10 may be controlled to travel in a generally straight direction or along a previously determined heading. Correspondingly, the limited travel distance may be specified by an approximate travel distance, travel time or other measure.
Thus, a first type of signal received by the receiver 102 is communicated to the controller 103. If the controller 103 determines that the travel signal is a valid travel signal and that the current vehicle conditions are appropriate (explained in greater detail below), the controller 103 sends a signal to the appropriate control configuration of the particular truck 10 to advance and then stop the truck 10. Stopping the truck 10 may be implemented, for example, by either allowing the truck 10 to coast to a stop or by initiating a brake operation to cause the truck 10 to brake to a stop.
As an example, the controller 103 may be communicably coupled to a traction control system, illustrated as a traction motor controller 106 of the truck 10. The traction motor controller 106 is coupled to a traction motor 107 that drives at least one steered wheel 108 of the truck 10. The controller 103 may communicate with the traction motor controller 106 so as to accelerate, decelerate, adjust and/or otherwise limit the speed of the truck 10 in response to receiving a travel request from the remote control device 70. The controller 103 may also be communicably coupled to a steer controller 112, which is coupled to a steer motor 114 that steers at least one steered wheel 108 of the truck 10. In this regard, the truck 10 may be controlled by the controller 103 to travel an intended path or maintain an intended heading in response to receiving a travel request from the remote control device 70.
As yet another illustrative example, the controller 103 may be communicably coupled to a brake controller 116 that controls truck brakes 117 to decelerate, stop or otherwise control the speed of the truck 10 in response to receiving a travel request from the remote control device 70. Still further, the controller 103 may be communicably coupled to other vehicle features, such as main contactors 118 and/or other outputs 119 associated with the truck 10, where applicable, to implement desired actions in response to implementing remote travel functionality.
In accordance with various aspects of the present invention, the controller 103 may communicate with the receiver 102 and with the traction controller 106 to operate the truck 10 under remote control in response to receiving travel commands from the associated remote control device 70. Moreover, the controller 103 may be configured to perform a first action if the truck 10 is traveling under remote control in response to a travel request and an obstacle is detected in a first one of the detection zone(s). The controller 103 may further be configured to perform a second action, different from the first action, if the truck 10 is traveling under remote control in response to a travel request and an obstacle is detected in a second one of the detection zones. In this regard, when a travel signal is received by the controller 103 from the remote control device 70, any number of factors may be considered by the controller 103 to determine whether the received travel signal should be acted upon to initiate and/or sustain movement of the truck 10.
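One plausible mapping of detection zones to the first and second actions is a nearer "stop" zone that brings the truck to rest and a farther "slow" zone that only limits speed. The zone names and the 1.5 m/s limit below are illustrative assumptions, not values from the patent.

```python
def zone_action(obstacle_zone):
    """Return the controller's response for an obstacle detection.

    obstacle_zone: 'stop' (nearer zone), 'slow' (farther zone), or None
    when no obstacle is detected. Values are hypothetical.
    """
    if obstacle_zone == 'stop':
        # First action: brake the truck to rest.
        return {'brake': True, 'speed_limit': 0.0}
    if obstacle_zone == 'slow':
        # Second action: keep traveling, but at a reduced speed (m/s).
        return {'brake': False, 'speed_limit': 1.5}
    # No obstacle: no restriction imposed by the detection zones.
    return {'brake': False, 'speed_limit': None}
```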
Correspondingly, if the truck 10 is moving in response to a command received by remote wireless control, the controller 103 may dynamically alter, control, adjust or otherwise affect the remote control operation, e.g., by stopping the truck 10, by changing the steer angle of the truck 10, or by taking other actions. Thus, the particular vehicle features, the state/condition of one or more vehicle features, the vehicle environment, etc., may influence the manner in which the controller 103 responds to travel requests from the remote control device 70.
The controller 103 may refuse to acknowledge a received travel request depending upon predetermined condition(s), e.g., relating to environmental or operational factor(s). For example, the controller 103 may disregard an otherwise valid travel request based upon information obtained from one or more of the sensors 58, 76. As an illustration, in accordance with various aspects of the present invention, the controller 103 may optionally consider factors such as whether an operator is on the truck 10 when determining whether to respond to a travel command from the remote control device 70. As noted above, the truck 10 may comprise at least one presence sensor 58 for detecting whether an operator is positioned on the truck 10. In this regard, the controller 103 may be further configured to respond to a travel request to operate the truck 10 under remote control only when the presence sensor(s) 58 designate that no operator is on the truck 10. Thus, in this implementation, the truck 10 cannot be operated in response to wireless commands from the transmitter unless the operator is physically off of the truck 10. Similarly, if the object sensors 76 detect that an object, including the operator, is adjacent to and/or proximate the truck 10, the controller 103 may refuse to acknowledge a travel request from the transmitter 70. Thus, in an illustrative implementation, an operator must be located within a limited range of the truck 10, e.g., close enough to the truck 10 to be in wireless communication range (which may be limited so as to set a maximum distance of the operator from the truck 10). Other arrangements may alternatively be implemented.
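The acknowledgment gating just described can be condensed into a single predicate. The argument names are assumptions chosen to mirror the text; a real controller would derive them from the presence sensors 58 and obstacle sensors 76.

```python
def acknowledge_travel_request(request_valid, operator_on_platform, object_adjacent):
    """Return True only when the controller should act on a travel request:
    the request itself must be valid, no operator may be on the platform
    (presence sensors 58), and no object, the operator included, may be
    detected adjacent to the truck (obstacle sensors 76)."""
    if not request_valid:
        return False
    if operator_on_platform:
        return False
    if object_adjacent:
        return False
    return True
```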
Any number of other reasonable conditions, factors, parameters or other considerations may also/alternatively be implemented by the controller 103 to interpret and take action in response to received signals from the transmitter. Other illustrative factors are set forth in greater detail in U.S. Provisional Patent Application Serial No. 60/825,688, entitled "SYSTEMS AND METHODS OF REMOTELY CONTROLLING A MATERIALS HANDLING VEHICLE"; U.S. Patent Application Serial No. 11/855,310, entitled "SYSTEMS AND METHODS OF REMOTELY CONTROLLING A MATERIALS HANDLING VEHICLE"; U.S. Patent Application Serial No. 11/855,324, entitled "SYSTEMS AND METHODS OF REMOTELY CONTROLLING A MATERIALS HANDLING VEHICLE"; U.S. Provisional Patent Application Serial No. 61/222,632, entitled "APPARATUS FOR REMOTELY CONTROLLING A MATERIALS HANDLING VEHICLE"; U.S. Patent Application Serial No. 12/631,007, entitled "MULTIPLE ZONE SENSING FOR MATERIALS HANDLING VEHICLES"; and U.S. Provisional Patent Application Serial No. 61/119,952, entitled "MULTIPLE ZONE SENSING FOR REMOTELY CONTROLLED MATERIALS HANDLING VEHICLES"; the disclosures of which are each incorporated by reference herein.
Upon acknowledgment of a travel request, the controller 103 interacts with the traction motor controller 106, e.g., directly or indirectly, such as over a bus such as the CAN bus 110 if utilized, to advance the truck 10 by a limited amount.
Depending on the particular implementation, the controller 103 may interact with the traction motor controller 106 and optionally, the steering controller 112, to advance the truck 10 by a predetermined distance. Alternatively, the controller 103 may interact with the traction motor controller 106 and optionally, the steering controller 112, to advance the truck 10 for a period of time in response to detection of sustained activation of a travel control on the remote 70. As yet another illustrative example, the truck 10 may be configured to jog for as long as a travel control signal is received. Still further, the controller 103 may be configured to "time out" and stop the travel of the truck 10 based on a predetermined event, such as exceeding a predetermined time period or travel distance, notwithstanding detection of sustained activation of a corresponding control on the remote control device 70.
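The jog termination conditions above (control released, time-out, or distance limit) can be sketched as a simple predicate. This is an illustrative assumption, not actual controller logic; the limit values are made-up defaults.

```python
def should_continue_jog(elapsed_s, traveled_m, request_active,
                        max_time_s=5.0, max_distance_m=3.0):
    """Return True while a remotely commanded advance may continue.

    Illustrative sketch of the "time out" behavior: the jog ends when the
    travel control on the remote is released, OR a predetermined time
    period is exceeded, OR a predetermined distance has been traveled.
    """
    if not request_active:            # travel control on remote 70 released
        return False
    if elapsed_s >= max_time_s:       # "time out" on elapsed time
        return False
    if traveled_m >= max_distance_m:  # predetermined distance reached
        return False
    return True
```

The controller would evaluate such a predicate periodically and command a stop on the first False result, regardless of continued activation of the remote control.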
The remote control device 70 may also be operative to transmit a second type signal, such as a "stop signal", designating that the truck 10 should brake and/or otherwise come to rest. The second type signal may also be implied, e.g., after implementing a "travel" command, e.g., after the truck 10 has traveled a predetermined distance, traveled for a predetermined time, etc., under remote control in response to the travel command. If the controller 103 determines that a wirelessly received signal is a stop signal, the controller 103 sends a signal to the traction controller 106, the brake controller 116 and/or another component to bring the truck 10 to rest. As an alternative to a stop signal, the second type signal may comprise a "coast signal" or a "controlled deceleration signal" designating that the truck 10 should coast, eventually decelerating to rest.
The time it takes to bring the truck 10 to a complete rest may vary, depending, for example, on the intended application, the environmental conditions, the capabilities of the particular truck 10, the load on the truck 10 and other similar factors. For example, after completing an appropriate jog movement, it may be desirable to allow the truck 10 to "coast" some distance before coming to rest so that the truck 10 stops slowly. This may be achieved by using regenerative braking to slow the truck 10 to a stop. Alternatively, a braking operation may be applied after a predetermined delay time to allow a predetermined amount of additional travel of the truck 10 after initiation of the stop operation. It may also be desirable to bring the truck 10 to a relatively quicker stop, e.g., if an object is detected in the travel path of the truck 10 or if an immediate stop is desired after a successful jog operation. For example, the controller may apply a predetermined torque to the braking operation. Under such conditions, the controller 103 may instruct the brake controller 116 to apply the brakes 117 to stop the truck 10.
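A minimal sketch of this stop-behavior selection, under stated assumptions: the mode names and the delay value are hypothetical placeholders, and a real controller would weigh load, environment and truck capability rather than two booleans.

```python
def select_stop_behavior(object_in_path, immediate_stop_requested):
    """Choose how to bring the truck to rest (illustrative policy only).

    Returns a (mode, delay_s) pair: full friction braking with no delay
    for urgent stops, otherwise regenerative coasting that begins after a
    short delay, allowing a limited amount of additional travel.
    """
    if object_in_path or immediate_stop_requested:
        # brake controller 116 applies brakes 117 at predetermined torque
        return ("friction_brake", 0.0)
    # gentle stop: slow via regenerative braking after an assumed delay
    return ("regenerative_brake", 1.5)
```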
Detection Zones of a Material Handling Vehicle

Referring to Figure 3, in accordance with various aspects of the present invention, one or more obstacle sensors 76 are configured to collectively enable detection of objects/obstacles within multiple "detection zones". In this regard, the controller 103 may be configured to alter one or more operational parameters of the truck 10 in response to detection of an obstacle in one or more of the detection zones, as set forth in greater detail herein. Control of the truck 10 using detection zones may be implemented when an operator is riding on/operating the truck 10. One or more detection zones may also be disabled or otherwise ignored by the controller 103 when an operator is riding on/controlling the truck 10, e.g., to allow the operator to navigate the truck 10 in tight spaces. Control of the truck 10 using detection zones may also be integrated with supplemental remote control as set forth and described more fully herein.
Although six obstacle sensors 76 are shown for purposes of clarity of discussion herein, any number of obstacle sensors 76 may be utilized. The number of obstacle sensors 76 will likely vary, depending on the technology utilized to implement the sensor, the size and/or range of the detection zones, the number of detection zones, and/or other factors.
In the illustrative example, a first detection zone 78A is located proximate to the power unit 14 of the truck 10. A second detection zone 78B is defined adjacent to the first detection zone 78A and generally circumscribes the first detection zone 78A. A third area is also conceptually defined as all of the area outside the first and second detection zones 78A, 78B. Although the second detection zone 78B is illustrated as substantially circumscribing the first detection zone 78A, any other practical arrangement that defines the first and second detection zones 78A, 78B may be realized. For example, all or certain portions of the detection zones 78A, 78B may intersect, overlap or be mutually exclusive. Moreover, the particular shape of the detection zones 78A, 78B may vary. Still further, any number of detection zones may be defined, further examples of which are described in greater detail herein.
Moreover, the detection zones need not surround the entire truck 10. Rather, the shape of the detection zones may depend on the particular implementation as described in greater detail herein. For example, if the detection zones 78A, 78B are to be used for speed control while the truck 10 is moving without an operator thereon, under remote travel control in a power-unit-first orientation (forks to the rear), then the detection zones 78A, 78B may be oriented at least forward of the direction of travel of the truck 10. However, the detection zones may also cover other areas, e.g., adjacent to the sides of the truck 10.
In accordance with various aspects of the present invention, the first detection zone 78A may further designate a "stop zone". Correspondingly, the second detection zone 78B may further designate a "first speed zone". Under this arrangement, if an object, e.g., some form of obstacle, is detected within the first detection zone 78A, and the materials handling vehicle, e.g., the truck 10, is traveling under remote control in response to a travel request, then the controller 103 may be configured to implement an action such as a "stop action" to bring the truck 10 to a stop. In this regard, travel of the truck 10 may continue once the obstacle is clear, or a second, subsequent travel request from the remote control device 70 may be required to restart travel of the truck 10 once the obstacle is cleared.
If a travel request is received from the remote control device 70 while the truck is at rest and an object is detected within the first detection zone 78A, then the controller 103 may reject the travel request and keep the truck at rest until the obstacle is cleared out of the stop zone.
If an object/obstacle is detected within the second detection zone 78B, and the materials handling truck 10 is traveling under remote control in response to a travel request, then the controller 103 may be configured to implement different actions. For example, the controller 103 may implement a first speed reduction action by reducing the speed of the truck 10 to a first predetermined speed, such as where the truck 10 is traveling at a speed greater than the first predetermined speed.
Thus, assume that the truck 10 is traveling, in response to implementing a travel request from the remote control device, at a speed V2 as established by a set of operating conditions where the obstacle sensors 76 do not detect an obstacle in any detection zone. If the truck is initially at rest, the truck may be accelerated up to speed V2. The detection of an obstacle within the second detection zone 78B (but not the first detection zone 78A) may cause the truck 10, e.g., via the controller 103, to alter at least one operational parameter, e.g., to slow the truck 10 down to a first predetermined speed V1, which is slower than the speed V2. That is, V1 < V2. Once the obstacle is cleared from the second detection zone 78B, the truck 10 may resume its speed V2, or the truck 10 may maintain its speed V1 until the truck stops and the remote control device 70 initiates another travel request. Still further, if the detected object is subsequently detected within the first detection zone 78A, the truck 10 will be stopped as described more fully herein.
Assume, as an illustrative example, that the truck 10 is configured to travel at a speed of approximately 4 kilometers per hour (km/h) for a limited, predetermined amount if the truck 10 is traveling without an operator on board and is under remote wireless control in response to a travel request from a corresponding remote control 70, so long as no object is detected in a defined detection zone. If an obstacle is detected in the second detection zone 78B, then the controller 103 may adjust the speed of the truck 10 to a speed of approximately 2.4 km/h or some other speed less than 4 kilometers per hour (km/h). If an obstacle is detected in the first detection zone 78A, then the controller 103 stops the truck 10.
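The two-zone speed selection described above (stop in zone 78A, reduced speed V1 in zone 78B, otherwise V2) can be sketched as follows, using the example speeds of 4 km/h and 2.4 km/h. This is an illustrative sketch, not actual controller 103 code.

```python
V1 = 2.4  # km/h, reduced speed when an object is in the second zone 78B
V2 = 4.0  # km/h, normal remote travel speed with no object detected

def commanded_speed(innermost_occupied_zone):
    """Speed commanded given the innermost occupied detection zone.

    `innermost_occupied_zone` is "first", "second", or None (no object);
    the zone names are illustrative labels for zones 78A and 78B.
    """
    if innermost_occupied_zone == "first":   # stop zone 78A
        return 0.0
    if innermost_occupied_zone == "second":  # first speed zone 78B
        return V1
    return V2                                # no obstacle detected
```

Note that only the innermost occupied zone matters: an object in zone 78A stops the truck even though it necessarily also lies inside the area covered by zone 78B.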
The above example assumes that the truck 10 is traveling under remote wireless control in response to a valid signal received from the transmitter 70. In this regard, the obstacle sensors 76 can be used to adjust the operating conditions of the unoccupied truck 10. However, the obstacle sensors 76 and corresponding controller logic may also be operative when the truck 10 is being driven by an operator, e.g., riding on the platform or other suitable location of the truck 10. Thus, in accordance with various aspects of the present invention, the controller 103 may stop the truck 10 or refuse to allow the truck 10 to move if an object is detected within the stop zone 78A, regardless of whether the truck is being driven by an operator or is operating automatically in response to receiving a corresponding wirelessly transmitted travel request. Correspondingly, depending on the specific implementation, the speed control/limiting capability of the controller 103, e.g., in response to detection of an object in the second detection zone 78B but not the first detection zone 78A, may be implemented regardless of whether the truck 10 is traveling in response to receiving a corresponding wirelessly transmitted travel request, or whether an operator is riding on the truck 10 while driving it.
However, in accordance with various aspects of the present invention and as briefly noted above, there may be situations where it is desirable to disable one or more of the detection zones when the truck 10 is being driven by an operator. For example, it may be desirable to disable/override the obstacle sensors 76/controller logic while the operator is driving the truck 10, regardless of external conditions. As a further example, it may be desirable to disable/override the obstacle sensors 76/controller logic while the operator is driving the truck 10, to allow the operator to navigate the truck 10 in tight quarters, e.g., to navigate tight spaces, travel around corners, etc., that might otherwise activate one or more of the detection zones. As such, activation of the controller logic, e.g., within the controller 103, to utilize the detection of objects in the detection zones to help control the truck 10 while the truck 10 is occupied by an operator, in accordance with various aspects of the present invention, may be manually controlled, programmably controlled, or otherwise selectively controlled.
Referring to Figure 4, in accordance with further aspects of the present invention, one or more of the obstacle sensors 76 may be implemented by ultrasonic technology or other suitable contactless technology capable of distance measurement and/or position determination. Thus, the distance to an object can be measured, and/or a determination can be made as to whether the detected object is within a detection zone 78A, 78B, e.g., by virtue of the distance of the object from the truck 10. As an example, an obstacle sensor 76 may be implemented by an ultrasonic sensor or transducer that provides a "ping" signal, such as a high frequency signal generated by a piezo element. The ultrasonic sensor 76 then rests and listens for a response. In this regard, time of flight information may be determined and utilized to define each zone. Thus, a controller, e.g., the controller 103 or a controller specifically associated with the obstacle sensors 76, may utilize software that looks at the time of flight information to determine whether an object is within a detection zone.
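The time-of-flight zone determination can be sketched as below. The zone boundary distances are assumptions for illustration only; the physics (round-trip echo time at the speed of sound) is standard for ultrasonic ranging.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def distance_from_echo(time_of_flight_s):
    """Distance to the object from the round-trip echo time."""
    return SPEED_OF_SOUND * time_of_flight_s / 2.0

def classify_zone(time_of_flight_s, zone_limits_m=(1.0, 2.0)):
    """Map an echo time to a detection zone.

    `zone_limits_m` gives the assumed outer boundaries of the first and
    second detection zones; beyond both lies the conceptual third area.
    """
    d = distance_from_echo(time_of_flight_s)
    if d <= zone_limits_m[0]:
        return "first"
    if d <= zone_limits_m[1]:
        return "second"
    return "outside"
```

For example, a 10 ms round trip corresponds to about 1.7 m, which the assumed boundaries place in the second detection zone.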
In accordance with further aspects of the present invention, multiple obstacle sensors 76 can work together to obtain object sensing. For example, a first ultrasonic sensor may send out a ping signal. The first ultrasonic sensor and one or more additional ultrasonic sensors may then listen for a response. In this way, the controller 103 may use diversity in identifying the existence of an object within one or more of the detection zones.
With reference to Figure 5, an implementation of multiple speed zone control is illustrated in accordance with still further aspects of the present invention. As illustrated, three detection zones are provided. If an object such as an obstacle is detected in the first detection zone 78A and the truck 10 is traveling in response to receiving a corresponding wirelessly transmitted travel request by the transmitter 70, then a first action may be performed, e.g., the truck 10 may be brought to a stop as described more fully herein. If an object such as an obstacle is detected in the second detection zone 78B and the truck 10 is traveling in response to receiving a corresponding wirelessly transmitted travel request by the transmitter 70, then a second action may be performed, e.g., the vehicle speed may be limited, reduced, etc. Thus, the second detection zone 78B may further designate a first speed zone. For example, the speed of the truck 10 may be reduced and/or limited to a relatively slow first speed, e.g., approximately 2.4 km/h.
If an object such as an obstacle is detected in the third detection zone 78C and the truck 10 is traveling in response to receiving a corresponding wirelessly transmitted travel request by the transmitter 70, then a third action may be performed, e.g., the truck 10 may be reduced in speed or otherwise limited to a second speed, e.g., approximately 4 km/h. Thus, the third detection zone may further designate a second speed zone. If no obstacles are detected in the first, second and third detection zones 78A, 78B, 78C, then the truck 10 may be remotely commanded to travel for a limited amount, e.g., at a speed that is greater than the speed when an obstacle is in the third detection zone, such as a speed of approximately 6.2 km/h.
As illustrated in Figure 5, the detection zones may be defined by different patterns relative to the truck 10. Also, in Figure 5, a seventh obstacle sensor 76 is utilized; however, any number of sensors may be provided, depending on the technology utilized and/or the features to be implemented. By way of illustration and not limitation, the seventh obstacle sensor 76 may be approximately centered, such as on the bumper or other suitable location on the truck 10. In an illustrative truck 10, the third zone 78C may extend approximately 2 meters forward of the power unit 14 of the truck 10.
In accordance with various aspects of the present invention, any number of detection zones of any shape may be implemented. For example, depending on the desired truck performance, many small zones may be defined at various coordinates relative to the truck 10. Similarly, a few large detection zones may be defined based on the desired truck performance. As an illustrative example, a table may be set up in the memory of the controller. If the travel speed while operating under remote travel control is an operational parameter of interest, then the table may associate travel speed with the detection zones defined by distance, range, position coordinates or some other measure. If the truck 10 is traveling in response to receiving a corresponding wirelessly transmitted travel request by the transmitter 70 and an obstacle sensor detects an object, then the distance to that detected object may be used as a "key" to look up a corresponding travel speed in the table. The travel speed retrieved from the table can be utilized by the controller 103 to adjust the truck 10, e.g., to slow it down, etc.
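Such a distance-keyed speed table can be sketched as follows. The threshold distances and speeds are illustrative assumptions (the 2.4, 4 and 6.2 km/h figures echo the example speeds above), not values from the actual controller memory.

```python
# Outer boundary of each zone (m) paired with the allowed speed (km/h),
# ordered nearest first. Values are illustrative assumptions.
SPEED_TABLE = [
    (1.0, 0.0),  # first zone: stop
    (2.0, 2.4),  # second zone: first reduced speed
    (3.0, 4.0),  # third zone: second reduced speed
]
DEFAULT_SPEED = 6.2  # km/h when no object is detected in any zone

def lookup_speed(object_distance_m):
    """Use the detected object distance as a key into the speed table."""
    for max_distance, speed in SPEED_TABLE:
        if object_distance_m <= max_distance:
            return speed
    return DEFAULT_SPEED
```

The controller would then command min(current speed request, looked-up speed) so that the table acts as a cap rather than a setpoint.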
The areas of each detection zone may be chosen, for example, based on factors such as the desired speed of the truck when the truck 10 is traveling in response to a valid, received travel request from the remote control device 70, the required stopping distance, the anticipated load to be transported by the truck 10, whether a certain amount of coasting is required for load stability, vehicle reaction time, etc. Moreover, factors such as the desired range of each detection zone, etc., may be considered in determining the number of obstacle sensors 76 required. In this regard, such information may be static or dynamic, e.g., based on operator experience, vehicle load, nature of the load, environmental conditions, etc. It is also contemplated that the controller 103 may generate a warning or alarm signal if an object or a person is detected in a detection zone.
As an illustrative example, in a configuration with multiple detection zones, e.g., three detection zones, as many as seven or more object detectors, e.g., ultrasonic sensors or laser sensors, may be used to provide a range of coverage desired by a corresponding application. In this regard, the detector(s) may be capable of looking ahead of the direction of travel of the truck 10 by a distance sufficient to allow an appropriate response, e.g., to slow down. In this regard, at least one sensor may be capable of looking several meters forward in the direction of travel of the truck 10.
In accordance with various aspects of the present invention, the multiple detection speed zones allow a relatively greater maximum forward travel speed while operating in response to wirelessly received travel commands. Such an arrangement can avoid unnecessarily early stopping of the vehicle by providing one or more intermediate zones in which the truck 10 slows down before deciding to come to a complete stop.
In accordance with further aspects of the present invention, the utilization of multiple detection zones allows a system that rewards the corresponding operator for better alignment of the truck 10 during pick operations. For example, an operator may position the truck 10 so as not to be aligned with a warehouse aisle. In this example, as the truck 10 is jogged forward, the second detection zone 78B may initially detect an obstacle such as a pick bin or storage rack. In response to detecting the rack, the truck 10 will slow down. If the rack is detected in the first detection zone 78A, then the truck will come to rest, even if the truck 10 has not jogged its entire programmed jog distance. Similar slow-downs or unnecessary stops may also occur in congested and/or cluttered aisles.
In accordance with various aspects of the present invention, the truck 10 may shape its speed and braking operation parameters based on the information obtained from the obstacle sensors 76. Moreover, the logic implemented by the truck 10 in response to the detection zones may be changed or varied depending on the desired application. As a few illustrative examples, the boundaries of each zone in a multiple zone configuration may be programmably (and/or reprogrammably) entered into the controller, e.g., flash programmed. In view of the defined zones, one or more operational parameters may be associated with each zone. The established operational parameters may define a condition, e.g., a maximum allowable travel speed, an action, e.g., brake, coast or otherwise come to a controlled stop, etc. The action may also be an avoidance action. For example, an action may comprise adjusting a steer angle or heading of the truck 10, as will be described in greater detail herein.
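A per-zone configuration of this kind, with programmable boundaries and an associated condition and action for each zone, might be represented as below. The zone names, speeds and action labels are illustrative assumptions standing in for flash-programmed parameters.

```python
# Each zone maps to a condition (max allowable speed) and an action.
# In concept these entries are programmably/reprogrammably entered into
# the controller; here they are simply a dictionary of assumed values.
ZONE_CONFIG = {
    "first":  {"max_speed_kmh": 0.0, "action": "controlled_stop"},
    "second": {"max_speed_kmh": 2.4, "action": "limit_speed"},
    "third":  {"max_speed_kmh": 4.0, "action": "limit_speed"},
}

def zone_parameters(zone):
    """Return the operational parameters configured for a zone.

    An unrecognized (or absent) zone means no obstacle constraint, so the
    default full remote travel speed applies.
    """
    cfg = ZONE_CONFIG.get(zone)
    if cfg is None:
        return {"max_speed_kmh": 6.2, "action": "none"}
    return cfg
```

Reprogramming a zone boundary or action then amounts to rewriting a table entry rather than changing control logic, which matches the flash-programmable arrangement described above.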
In accordance with a further embodiment of the present invention, one or more obstacle sensors, such as the obstacle sensors 76A, 76B shown in Figures 6 and 8, may be used to sense or detect objects within the first, second and third detection zones in front of the truck 10 when the truck 10 is traveling in response to a wirelessly received travel request from the transmitter 70. The controller 103 or other sensor processing device may also generate an object detected signal and optionally, a distance signal in response to sensing/detecting an object in front of the truck 10. As an illustrative example, a further input 104 into the controller 103 may be a load signal generated by a load sensor LS, as illustrated in Figures 7 and 8, which senses the combined weight of the forks 16 and any load on the forks 16. The load sensor LS is shown schematically in Figures 7 and 8 near the forks 16, but may be incorporated into a hydraulic system used to effect lifting of the forks 16. By subtracting the weight of the forks 16 (a known constant value) from the combined weight defined by the load signal, the controller 103 determines the weight of the load on the forks. Using the sensed load weight and whether an object has been detected in one of the first, second and third detection zones as inputs into a look-up table or suitable equations, the controller 103 generates an appropriate vehicle stop or maximum allowable speed signal.
Values defining the vehicle stop and maximum allowable speed signals may be determined experimentally and stored in a look-up table, calculated in real time based on a predetermined formula, etc. In the illustrated embodiment, the controller 103 determines the weight of a load on the forks 16 and whether an obstacle has been detected in one of the first, second and third detection zones and, using a look-up table, either issues a stop command or defines a maximum allowable speed for the truck 10 and generates a corresponding maximum allowable speed signal for the truck 10.
As an example, if there is no load on the forks 16 and no object is detected by the obstacle sensors 76A, 76B in any of the first, second and third detection zones, the controller 103 allows the truck 10 to be operated at any speed up to and including a maximum speed of 7.2 km/hour. If no object has been detected in any of the first, second and third detection zones, the maximum allowable speed of the truck 10 may be configured, for example, to decrease as the load on the truck 10 increases. As an illustration, for a load weight of 3632 kilograms, the maximum allowable speed of the truck 10 may be 4.02 km/hour. It should be noted that, in some locations, the maximum allowable speed of the truck 10, if not occupied by a rider, may be set to a predetermined upper limit, e.g., 5.63 km/hour. Hence, the maximum speed of the vehicle, if not occupied by a rider, may be set, e.g., by the controller 103, to this maximum allowable speed.
For any load weight on the forks 16, if an object is detected in the first detection zone, the controller 103 generates a "stop signal", designating that the truck 10 is to come to a substantially immediate stop. For any given load weight, the maximum allowable speed of the truck 10 is progressively greater the farther the object is from the truck 10. Also, for any given load weight, the maximum allowable speed of the truck 10 is lower if an object is detected in the second detection zone as compared to when an object is detected in the third detection zone. The maximum allowable vehicle speeds for the second and third detection zones are defined for each load weight so that the speed of the truck 10 can be reduced in a controlled manner as the truck 10 continues to move toward the object, such that the truck 10 can eventually be safely brought to a stop before it reaches the point where the object is located. These speeds may be determined experimentally, based on formulas, or a combination thereof, and may vary based on the type, size and braking capabilities of the truck.
As an illustrative example, assume that the load weight on the forks 16 is 681 kilograms and three detection zones are provided, including a first detection zone nearest the truck, followed by a second detection zone and a third detection zone farthest from the truck. If a detected object is located at a distance within the third detection zone, then the maximum allowable vehicle speed may be set to a speed such as 4.60 km/hour. Hence, if the truck 10 is traveling at a speed greater than 4.60 km/hour when the object is detected, the controller 103 effects a speed reduction so that the vehicle speed is reduced to 4.60 km/hour.
If the load weight on the truck 10 remains equal to 681 kilograms, and if a detected object is located at a distance from the truck 10 within the second detection zone, then the maximum allowable vehicle speed may be, for example, 3.21 km/hour. Hence, if the truck 10 is traveling at a speed greater than 3.21 km/hour when the object is detected in the second detection zone, the controller 103 effects a speed reduction so that the vehicle speed is reduced to 3.21 km/hour.
Continuing the previous example, if the load weight on the truck 10 is equal to 681 kilograms and an object is detected in the first detection zone, then a stop signal may be generated by the controller 103, whereupon the truck 10 is brought to a stop.
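The load-dependent look-up described above may be sketched as follows. The 681 kg row reuses the 4.60/3.21/stop figures from the example; the fork weight constant, the zero-load row and the row-selection policy are assumptions made purely for illustration.

```python
FORK_WEIGHT_KG = 100.0  # known constant fork weight (assumed value)

# (configured load weight kg, zone) -> max allowable speed km/h.
# 0.0 means stop; zone "none" means no object detected in any zone.
SPEED_BY_LOAD = {
    (0, "none"): 7.2,   (0, "third"): 5.0,
    (0, "second"): 3.5, (0, "first"): 0.0,
    (681, "none"): 6.0,   (681, "third"): 4.60,
    (681, "second"): 3.21, (681, "first"): 0.0,
}

def max_allowable_speed(combined_weight_kg, zone):
    """Look up the max allowable speed from net load weight and zone.

    The net load is the sensed combined weight minus the constant fork
    weight. For loads between configured rows we conservatively use the
    next-heavier row (slower speed); this policy is an assumption.
    """
    load = combined_weight_kg - FORK_WEIGHT_KG
    rows = sorted({w for w, _ in SPEED_BY_LOAD})
    key = min((w for w in rows if w >= load), default=rows[-1])
    return SPEED_BY_LOAD[(key, zone)]
```

For a combined weight of 781 kg (net load 681 kg) with an object in the second zone, the sketch returns the 3.21 km/h figure from the example above.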
The obstacle sensors may comprise ultrasonic transducers. Ultrasonic transducers are known to experience a phenomenon known as transducer "ringing". Essentially, "ringing" is the tendency of a transducer to continue to vibrate and transmit ultrasonic signals after the control signal used to initiate a transmit signal has ceased. This "ringing" signal decreases in magnitude rather quickly, but during the time it is decreasing to a level below a threshold detection level, each obstacle sensor may respond to such "ringing" signals if the signals are above a reference level associated with the listening sensor. As a result, a sensor may mistake a "ringing" signal for an object and thus falsely identify an object in a corresponding detection zone. A common technique to avoid this problem is to blank out all return signals generated by the obstacle sensors during a preselected period of time after initiation of a transmit signal. The preselected time is determined based on a number of factors, including the type of transducer used, but during this preselected time no valid returns are sensed. If the obstacle sensors are located near a front part 10A of the truck 10, see the obstacle sensors 76A in Figure 7, and if the blanking technique is used, this results in a "dead" or "non-detect" zone DZ existing immediately in front of the truck 10. Hence, if an object O is very near the front of the truck 10, e.g., 10 mm or less, and the obstacle sensors 76A are positioned at the front of the truck 10, see Figure 7, then the object O may not be detected.
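The blanking technique and the dead zone it implies can be sketched as below. The 2 ms blanking period is an assumed value for illustration; the dead-zone depth simply follows from the round-trip travel of sound during the blanked interval.

```python
SPEED_OF_SOUND = 343.0  # m/s in air

def is_valid_return(time_since_transmit_s, blanking_period_s=0.002):
    """Ignore returns during the blanking period after a transmit,
    so that transducer ringing is not mistaken for an object."""
    return time_since_transmit_s >= blanking_period_s

def dead_zone_depth_m(blanking_period_s=0.002):
    """Depth of the non-detect zone DZ implied by the blanking period:
    echoes from closer than this return while returns are blanked."""
    return SPEED_OF_SOUND * blanking_period_s / 2.0
```

With the assumed 2 ms blanking period, the implied dead zone extends about 0.34 m from the transducer face, which is why the second, rearward-spaced sensors 76B described below are needed to cover objects very near the front of the truck.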
In the embodiment illustrated in Figures 6 and 8, first and second obstacle sensors 76A and 76B, respectively, are spaced from one another along a longitudinal axis LA of the truck 10, see Figure 8. The obstacle sensors 76A are positioned at the front part 10A of the truck 10 and are capable of sensing objects located in, for example, the first, second and/or third detection zones. To ensure that objects O located in the non-detect zone DZ, which may be inherent to the first obstacle sensors 76A, are sensed, the second obstacle sensors 76B are located on the truck 10 spaced a distance behind the first sensors 76A, i.e., in a direction away from the front part 10A of the truck 10, as best illustrated in Figure 8. In this regard, the second sensors 76B operate at least to sense objects in the dead zone DZ of Figure 7.

Steer Correction

When a truck 10 is traveling in response to receiving a corresponding wirelessly transmitted travel request by the transmitter 70, e.g., so long as there is no person operating the truck 10 as described more fully herein, it is possible for the truck 10 to encounter obstacles that do not require the truck 10 to come to rest. Rather, a steer correction maneuver may be performed such that the truck 10 can continue to jog forward by the appropriate limited amount without requiring operator intervention.
In accordance with aspects of the present invention, steer correction allows the truck 10 to automatically steer around objects that are detected within the general area in front of the truck 10. This steer correction capability allows, for example, the truck 10, which may be traveling in response to a wirelessly received travel request from the transmitter 70, to generally stay in the center of an aisle in a warehouse environment as the truck 10 travels down the aisle. For example, it is possible that the truck 10 may have some drift in its steer angle due to steering calibration, floor crown, or any number of external factors. However, in accordance with various aspects of the present invention, a truck 10 traveling in response to receiving a corresponding wirelessly transmitted travel request by the transmitter 70 may implement steer corrections, e.g., to stay away from or otherwise avoid walls and racks, other trucks, persons, boxes and other obstacles, etc., thus freeing the operator from the need to periodically remount the truck 10 and steer the truck 10 manually to the center of the aisle or other desired position and heading.
In accordance with various aspects of the present invention, the controller 103 collects data from the various sensors, e.g., 76, 76A, 76B, that provide a picture of the landscape/environment in front of the truck 10, as will be discussed more fully herein. The controller 103 then uses the data collected from the sensors to determine whether to implement steer correction maneuvers as described more fully herein. In this regard, steer correction may be implemented in addition to, in lieu of and/or in combination with the various obstacle avoidance techniques described more fully herein. Thus, by way of illustration and not limitation, steer correction may be used in combination with multiple speed zones, a stop detection zone, weight dependent speed zones, etc.
As a further example, the object detection components of the truck 10 may still implement an alarm and/or cause the truck 10 to stop, reduce or otherwise limit the maximum travel speed of the truck 10, etc. Still further, the truck 10 may issue a first alarm if the truck is attempting an automated steer correction maneuver and a second alarm or signal if the truck 10 is slowing down and/or stopping in response to an object in a corresponding detection zone, if such features are implemented in combination with steer correction.
In this regard, as used herein, the term "steer bumper zone" will be used to distinguish a zone used for steer correction from a "detection zone," which is used to limit the maximum speed, stop the truck 10, etc., as described more fully herein above.
In the illustrative example, two steer bumper zone inputs are provided to the controller 103 to distinguish left and right orientations relative to the truck 10. However, depending upon the sensor technology and the manner in which the sensor data is made available, one or more inputs to the controller 103 may be required. By way of illustration, and not limitation, the truck 10 may be equipped with one or more sensor devices 76, 76A, 76B that collectively provide a first steer bumper zone and a second steer bumper zone, which are proximate to the truck 10. For example, the first steer bumper zone may be positioned to the left and generally toward the front of the forward traveling direction of the truck 10, to the left side of the truck 10, etc. Similarly, the second steer bumper zone may be positioned to the right and generally toward the front of the forward traveling direction of the truck 10, to the right side of the truck 10, etc. In this regard, the first and second steer bumper zones of the truck 10 may be used to implement steer correction, which may include steer angle and steer direction components. In this illustrative configuration, the first and second steer bumper zones may be mutually exclusive, or portions of the first and second steer bumper zones may overlap, thereby essentially providing a third steer bumper zone defined by the overlapping coverage of the first and second steer bumper zones.
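The two-input arrangement described above, with overlapping coverage acting as a third effective zone, can be sketched as follows. This is a hypothetical illustration only; the zone labels and function name are assumptions, not taken from the source.

```python
# Hypothetical sketch of how two steer bumper inputs to the controller
# could be interpreted, with overlapping coverage acting as a third zone.
# The zone labels are illustrative assumptions, not from the source.

def classify_bumper_inputs(left_input: bool, right_input: bool) -> str:
    """Map the left/right steer bumper inputs to an effective zone."""
    if left_input and right_input:
        return "overlap"  # object in the third zone defined by the overlap
    if left_input:
        return "left"
    if right_input:
        return "right"
    return "clear"
```

A design like this keeps the controller logic independent of whether the two zones are generated by one sensor or several.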
Moreover, the first and second steer bumper zones may substantially overlap with, partially overlap with, or not overlap one or more detection zones used by other techniques such as speed control, obstacle-triggered braking of the truck 10, etc. For example, the range of the steer bumper zones may be similar to or different from the range of one or more of the detection zones if speed-limiting control or other features are also implemented along with steer correction, as described in greater detail herein.
In addition, the sensor inputs provided to the controller 103 may be derived from a variety of similar sensor types or from a mixture of different sensor technologies, for example, ultrasonic sensors and/or laser scanner sensors. In this regard, various sensors and/or sensor technologies, e.g., laser scanning and ultrasonic, may be used in conjunction or cooperation with each other, for example, to use one or more sensors or sensor technologies for one or more zones (detection and/or steer bumper) and to use yet another one or more sensors or sensor technologies for one or more different zones (detection and/or steer bumper). As another example, two or more sensors or sensor technologies may provide redundancy, e.g., as a fail-safe, backup or confirmation data set.
In accordance with further aspects of the present invention, the controller 103 may be configured to process additional data beyond the two steer bumper zone inputs, examples of which may include object detection angle and distance data, etc. Thus, the techniques described herein are not limited to only two steer bumper zones.
In this way, steer correction according to aspects of the present invention provides an aid to the operator in keeping the truck 10 away from walls, racks, other vehicles, or other obstructions as the truck 10 is operated by the remote control device 70.
In accordance with various aspects of the present invention, a control system in the truck 10 provides steer correction control. Referring to Figure 9, a partial schematic view of the control system is illustrated. In the illustrated system, a first ultrasonic sensor 76' is used to generate a first detection zone 78', which is also referred to herein as a left detection zone. Correspondingly, a second ultrasonic sensor 76" is used to generate a second detection zone 78", which is also referred to herein as a right detection zone. Moreover, although only two ultrasonic detection zones are illustrated, it should be understood that any number of detection zones may be implemented.
Moreover, as described more fully herein, the detection zones implemented may overlap or may define discrete, mutually exclusive zones.
The output of each ultrasonic sensor 76', 76" is coupled to an ultrasonic controller 130, which is used, where required by the specific ultrasonic technology, to process the output of the ultrasonic sensors 76', 76". The output of the ultrasonic controller 130 is coupled, for example, as an input to the controller 103. The controller 103 may process the outputs of the ultrasonic sensor controller 130 to implement speed control, obstacle avoidance or other features, examples of which are set out in greater detail herein.
Also illustrated is a sensor 76''', which is shown as a laser scanning sensor to further illustrate exemplary configurations. In this example, the sensor 76''' is used to generate a first steer bumper zone 132A, also designated as a left steer bumper zone, and a second steer bumper zone 132B, also designated as a right steer bumper zone. For example, the laser scanning sensor 76''' may sweep a laser beam across an area in front of the truck 10. In this regard, multiple laser systems may be used, or one or more laser beams may be swept, e.g., to raster scan one or more areas in front of the truck 10. In this regard, the laser sensor may independently define and scan the left and right steer bumper zones, or the controller 103 may derive the left and right steer bumper zones from the raster scan of the laser(s). Still further, alternative scanning patterns may be used, so long as the controller 103 can determine whether a detected obstacle is to the left or to the right of the truck 10.
As some additional examples, although a laser scanner is illustrated for discussion purposes herein, other sensing technologies may be used, examples of which may include ultrasonic sensors, infrared sensors, etc. For example, ultrasonic sensors located toward the sides of the truck 10 may define the left and right steer bumper zones 132A, 132B, and other ultrasonic sensors may be used to define the detection zones, e.g., for speed limiting, etc.
As illustrated, the output of the laser scanner 76''' provides two inputs 110 into the controller 103. A first signal designates whether an object is detected in the left steer bumper zone. Correspondingly, a second signal designates whether an object is detected in the right steer bumper zone. Depending upon the sensor and sensor processing technologies used, the input(s) to the controller 103 designating an object in the steer bumper zones 132A, 132B may be provided in other formats. As an additional illustration, the first and second steer bumper zones 132A, 132B may be defined by both ultrasonic sensors and a scanning laser. In this example, the scanning laser is used as a redundant check to verify that the ultrasonic sensors properly detect an object in either the left or right steer bumper zone 132A, 132B. As a further example, the ultrasonic sensors may be used to detect an object in the left and right steer bumper zones 132A, 132B, and the scanning laser may be used to distinguish or otherwise locate the object to determine whether the object was detected in the left steer bumper zone or the right steer bumper zone. Other arrangements and configurations may alternatively be implemented.
Algorithm

In accordance with various aspects of the present invention, a steer correction algorithm is implemented, e.g., by the controller 103. Referring to Figure 10, the steer correction algorithm comprises determining whether a steer bumper zone warning is detected at 152. A steer bumper zone warning at 152 may comprise, for example, detecting the presence of an object within the first and/or second steer bumper zones 132A, 132B. If a steer bumper zone warning is received, a determination is made at 154 whether the steer bumper zone warning indicates that an object is detected to the right or to the left of the truck 10, e.g., whether the detected object is in the first steer bumper zone 132A or the second steer bumper zone 132B. For example, with brief reference back to Figure 9, a laser scanning sensor 76''' may generate two outputs, a first output signal designating whether an object is detected in the first (left) steer bumper zone 132A, and a second signal designating whether an object is detected in the second (right) steer bumper zone 132B. Alternatively, the controller 103 may receive raw laser scanner data and process/distinguish the first and second steer bumper zones 132A, 132B using a predetermined mapping.
If the steer bumper zone warning designates that an object is detected in the left steer bumper zone 132A, then a steer correction routine is implemented at 156, which includes computing a steer correction angle to steer the truck 10 to the right according to a first set of parameters. By way of illustration and not limitation, a right steer correction implemented at 156 may include steering the truck 10 to the right at a right steer angle. In this regard, the right steer angle may be fixed or variable. For example, the controller 103 may command the steer controller 112 to ramp up to some desired steer angle, e.g., 8-10 degrees to the right. By ramping up to a fixed steer angle, abrupt changes in the angle of the steered wheel(s) will not occur, resulting in smoother performance. The algorithm accumulates the distance traveled at the steer correction angle, which may be a function of how long the appropriate steer bumper input is engaged.
According to various aspects of the present invention, the angular change of the steered wheel may be controlled to achieve, for example, a substantially fixed truck angle correction as a function of accumulated travel distance. The accumulated travel distance while performing a steer correction maneuver may be determined based upon any number of parameters. For example, the distance traveled during the steer correction may comprise the distance traveled by the truck 10 until the detected object is no longer within the left steer bumper zone 132A. The accumulated travel distance may also/alternatively comprise, for example, traveling until a timeout is encountered, another object is detected in any of the bumper or detection zones, a predetermined maximum steer angle is exceeded, etc.
Upon exiting the right steer correction at 156, e.g., by maneuvering the truck 10 so that the object is no longer detected within the left steer bumper zone 132A, a left steer compensation maneuver is implemented at 158. The left steer compensation maneuver at 158 may comprise, for example, implementing a counter steer to adjust the travel direction of the truck 10 to an appropriate course. For example, the left steer compensation maneuver may comprise steering the truck 10 at a selected or otherwise determined angle for a distance that is a percentage of the previously accumulated travel distance. The left steer angle used by the left steer compensation maneuver may be fixed or variable, and may be the same as, or different from, the steer angle used to implement the right steer correction at 156.
By way of illustration and not limitation, the distance used by the steer compensation maneuver at 158 may be approximately one quarter to one half of the accumulated travel distance while implementing the right steer correction at 156. Similarly, the left steer angle used to implement the left steer compensation maneuver may be approximately one half of the angle used to implement the right steer correction at 156. By way of example, assume that the right steer angle is 8 degrees and the accumulated steer correction travel distance is 1 meter. In this example, the left steer compensation may be approximately one half of the right steer correction, or -4 degrees, and the left steer compensation will occur for a travel distance of approximately ¼ meter to ½ meter.
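The illustrative ratios above (half the angle, one quarter to one half the distance) can be captured in a short sketch. The function name and the fixed ratio defaults are assumptions for illustration only, using the worked values from the text.

```python
# Hypothetical sketch of the steer correction / counter steer compensation
# arithmetic described above. The ratios (one half of the angle, one
# quarter of the distance) follow the example in the text; names are
# illustrative assumptions.

def compensation_maneuver(correction_angle_deg, accumulated_distance_m,
                          angle_ratio=0.5, distance_ratio=0.25):
    """Return the (angle, distance) for the counter steer compensation,
    opposite in sign to the original correction."""
    comp_angle = -correction_angle_deg * angle_ratio
    comp_distance = accumulated_distance_m * distance_ratio
    return comp_angle, comp_distance

# Worked example from the text: an 8 degree right correction over 1 meter
angle, dist = compensation_maneuver(8.0, 1.0)
# angle == -4.0 (4 degrees to the left), dist == 0.25 (one quarter meter)
```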
The particular distance and/or angle associated with the left steer compensation maneuver at 158 may be selected, for example, to dampen the "bounce" of the truck 10 as the truck 10 steer corrects along its course past detected obstacles. As an illustration, if the truck 10 corrects its steering by a fixed amount per distance traveled, the controller 103 may be able to determine how much the corresponding truck heading has changed and, accordingly, adjust the left steer compensation maneuver at 158 to correct back to the original or another suitable course. In this way, the truck 10 will avoid "ping-ponging" back and forth down an aisle and will instead settle onto a substantially straight course down the center of the aisle without the tedious manual repositioning otherwise required of the truck operator. Moreover, the left steer compensation maneuver at 158 may vary depending upon the particular parameters used to implement the right steer correction at 156.
Correspondingly, if the steer bumper zone warning designates that an object is detected in the right steer bumper zone 132B, then a steer correction routine is implemented at 160, which includes computing a steer correction angle to steer the truck 10 to the left according to a second set of parameters. By way of illustration and not limitation, a left steer correction implemented at 160 may include steering the truck 10 to the left at a left steer angle. In this regard, the left steer correction maneuver at 160 may be implemented in a manner analogous to that described above at 156, except that the correction is to the right at 156 and to the left at 160.
Similarly, upon exiting the left steer correction at 160, e.g., by maneuvering the truck 10 so that the object is no longer detected within the right steer bumper zone 132B, a right steer compensation maneuver is implemented at 162. The right steer compensation maneuver at 162 may comprise, for example, implementing a counter steer to adjust the travel direction of the truck 10 to an appropriate course in a manner analogous to that described at 158, except that the steer compensation maneuver at 158 is to the left and the steer compensation maneuver at 162 is to the right.
After implementing the steer compensation maneuver at 158 or 162, the truck may return to a substantially straight course, e.g., 0 degrees, at 164, and the procedure loops back to the beginning to wait for detection of another object in either of the steer bumper zones 132A, 132B.
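One pass through the Figure 10 flow, as described in the preceding paragraphs, might be sketched as a simple control routine. This is an illustrative reading only; the callback structure and names are assumptions, not part of the source.

```python
from enum import Enum, auto

class Side(Enum):
    LEFT = auto()
    RIGHT = auto()

def steer_correction_cycle(warning_side, correct, compensate):
    """One pass through the Figure 10 flow as described in the text:
    a warning on one side triggers a correction toward the opposite side
    (156/160), then a counter steer compensation (158/162), then a
    return to a substantially straight course (164)."""
    if warning_side is None:
        return 0.0  # no steer bumper zone warning at 152: hold course
    if warning_side is Side.LEFT:
        distance = correct(Side.RIGHT)    # 156: steer right, accumulating distance
        compensate(Side.LEFT, distance)   # 158: left steer compensation
    else:
        distance = correct(Side.LEFT)     # 160: steer left, accumulating distance
        compensate(Side.RIGHT, distance)  # 162: right steer compensation
    return 0.0  # 164: steer angle returns to approximately 0 degrees
```

The `correct` and `compensate` callbacks stand in for the maneuvers at 156/160 and 158/162; an actual controller would drive the steer controller 112 and odometry from within them.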
The algorithm can further be modified to follow various control logic implementations and/or state machines to facilitate various anticipated circumstances. For example, it is possible that a second object will move into either steer bumper zone 132A or 132B while the truck 10 is in the process of implementing a steer compensation maneuver. In this regard, the truck 10 may iteratively attempt to steer correct around the second object. As another illustrative example, if an object (or objects) is simultaneously detected in both the left and right steer bumper zones 132A, 132B, the controller 103 may be programmed to maintain the truck 10 on its current course (e.g., a zero degree steer angle) until either one or both of the steer bumper zones 132A, 132B are cleared or the associated detection zones cause the truck 10 to come to a stop.
In accordance with still further aspects of the present invention, a user and/or service representative may be able to customize the response of the steer correction algorithm parameters. For example, a service representative may have access to programming tools to load customized variables, e.g., into the controller 103, for implementing steer correction. As an alternative, a truck operator may have controls that allow the operator to enter customized parameters into the controller, e.g., via potentiometers, encoders, a software user interface, etc.
The output of the algorithm illustrated in Figure 10 may comprise, for example, an output that defines a steer correction value that may be coupled from the controller 103 to an appropriate control mechanism of the truck 10. For example, the steer correction value may comprise a +/- steer correction value, e.g., corresponding to steer left or steer right, that is coupled to a vehicle control module, the steer controller 112, e.g., as illustrated in Figure 2, or other suitable controller. Still further, additional parameters that may be editable, e.g., to adjust operating feel, may comprise the steer correction angle, a steer correction angle ramp rate, a bumper detection zone size/range for each steer bumper zone, truck speed while steer correcting, etc.
Referring to Figure 11, assume in the illustrative example that the truck 10 is traveling in response to receiving a remote wireless travel request and that, before the truck 10 can travel a predetermined movement distance, the truck 10 travels to a position where a rack leg 172 and a corresponding pallet 174 are in the path of the left steer bumper zone 132A. Keeping with the illustrative algorithm of Figure 10, the truck 10, e.g., via the controller 103, may implement an obstacle avoidance maneuver by entering the steer correction algorithm to steer the truck to the right. For example, the controller 103 may compute or otherwise look up or retrieve a steer correction angle that is communicated to the steer controller 112 to turn the steered wheel(s) of the truck 10.
The truck 10 maintains the steer correction until an event occurs, such as the object disengaging, e.g., when the scanning laser or other implemented sensor technology no longer detects an object in the left steer bumper zone 132A. Assume that the truck 10 accumulated a half meter of travel distance during the steer correction maneuver, which was fixed at 8 degrees. Upon detecting that the left steer bumper zone has been disengaged, a counter steer compensation is implemented to compensate for the change in heading caused by the steer correction. By way of example, the steer compensation may steer the truck 10 to the left, at 4 degrees, for approximately one quarter meter of accumulated travel distance. For very narrow aisles, the left/right steer bumper zone sensing may provide more frequent/shorter duration inputs between detections compared to relatively wide aisles.
The various steer angle corrections and corresponding counter steer compensations may be determined empirically, or the angles, ramp rates, accumulated distances, etc., may be computed, modeled or otherwise derived.
In the illustrative arrangement, the system will attempt to maintain the truck 10 centered in the aisle as the truck 10 advances in response to receiving a corresponding wirelessly transmitted travel request from the transmitter 70. Moreover, bounce, e.g., as measured by the distance from the centerline of a warehouse aisle, is dampened. Still further, there may be certain conditions where the truck 10 may still require some operator intervention in order to maneuver around certain objects in the line of travel.
Referring to Figure 12, a graph of a speed measurement of the truck 10 during an obstacle avoidance maneuver is illustrated. The graph in Figure 13 illustrates a steer correction at the predetermined steer angle to illustrate a total correction applied by the algorithm. And the graph in Figure 14 illustrates the motion of the truck 10 as a function of when steer correction is active and when an object is sensed in the left and/or right steer bumper zones.
According to further aspects of the present invention, the steer correction algorithm may be configured to hug a wall/rack, as opposed to staying away from a wall and/or rack. For example, adding a small offset to the steering of the truck 10 will allow the truck 10 to maintain its distance from a fixed wall/rack with only a small amount of weave in its distance control.
Although the left and right steer bumper zones 132A, 132B are illustrated at least partially in front of the forward traveling direction of the truck 10, other arrangements may alternatively and/or additionally be implemented. For example, the left and right steer bumper zones could alternatively be positioned toward the sides of the truck 10, e.g., as illustrated by the left and right side steer bumper zones 132C, 132D. Also, the truck 10 may use a first pair of left and right steer bumper zones toward the forward traveling direction of the truck 10, e.g., the left and right steer bumper zones 132A, 132B, and a second pair of left and right side steer bumper zones 132C, 132D toward the sides of the truck 10. In this regard, the particular algorithm used to implement steer correction may be the same for, or different for, each pair of steer bumper zones.
As an example, the side steer bumper zones 132C, 132D may be used to keep the truck 10 generally adjacent to a rack, wall or other course. In this regard, a multi-zone steer bumper may be used, e.g., to establish a hysteresis, e.g., such that the controller 103 maintains a course by keeping the wall, rack or other structure between an outer steer bumper limit and an inner steer bumper limit. As yet another illustrative alternative, assume that the truck 10 is staying just to the right of a rack or other structure, which is to the left of the truck 10. The truck 10 may be automatically steered to the left by a small amount so as to drift toward the structure. In this regard, when the left side steer bumper zone 132C is breached by the structure, the steer correction described more fully herein will steer away from the structure. However, because the steering is configured to steer just slightly to the left, the truck 10 will eventually drift back toward the structure until the steer correction once again repositions the truck 10. As still another illustrative example, the steer compensation, e.g., at 158 in Figure 10, could be made to deliberately overcompensate, thereby keeping the truck 10 adjacent to the structure.
In yet another illustrative example, the steer bumper zones may be composed of multiple steer bumper sub-zones, where each sub-zone may be associated with different parameters for steer correction, for example, allowing more subtle steer correction for objects detected farther from the truck 10 than for objects detected closer to the truck 10. As an example, the steer correction may be a lesser amount, e.g., 2 degrees, when an object is detected in the region or sub-zone farthest from the vehicle; an intermediate amount, e.g., 4 degrees, when an object is detected in a middle region; and a greater amount, e.g., 8 degrees, when an object is detected in an inner region of a steer bumper zone. As additional alternatives, a distance measurement to the detected object may be used to dynamically adjust the steering algorithm to make appropriate steer correction maneuvers.
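The graded sub-zone response described above can be sketched as a simple range-to-angle mapping. The correction magnitudes (2, 4 and 8 degrees) come from the example in the text; the sub-zone boundary distances are assumptions for illustration.

```python
# Illustrative mapping from detected object range to steer correction
# magnitude, using the example values from the text (2, 4 and 8 degrees).
# The sub-zone boundaries (in meters from the truck) are assumed.

def correction_for_range(object_range_m,
                         inner_limit=1.0, middle_limit=2.0, outer_limit=3.0):
    """Closer objects get a larger steer correction, in degrees."""
    if object_range_m <= inner_limit:
        return 8.0   # inner region: greatest correction
    if object_range_m <= middle_limit:
        return 4.0   # middle region: intermediate correction
    if object_range_m <= outer_limit:
        return 2.0   # farthest sub-zone: subtle correction
    return 0.0       # outside the steer bumper zone: no correction
```

A continuous function of the measured distance, rather than discrete bands, would correspond to the dynamic-adjustment alternative mentioned at the end of the paragraph.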
As yet another illustrative example, it may be desirable to apply a first, greater amount of steer correction, e.g., 10 degrees, if certain predefined conditions are met, and to apply a second, lesser amount of steer correction, e.g., 7 degrees, under all other circumstances. For example, assume that an operator is controlling the truck 10 and comes to the end of an aisle or row. The operator then maneuvers the truck 10 through a 180 degree turn and enters an adjacent aisle. Perhaps the operator over steers or under steers upon entering the adjacent aisle, such that the course of the truck 10 cannot be straightened down the aisle with the second, lesser amount of steer correction. In this situation, it may be desirable to apply a greater amount of steer correction than is normally used so as to allow the truck 10 to achieve a straight course down the aisle.
The conditions that must occur before applying the greater amount of steer correction may vary, but in the above example may comprise the following: a first condition may be that a preselected travel speed, such as, for example, 4.82 km/hour, must be met or exceeded. A second condition may be that a minimum steered wheel angle, such as, for example, 45 degrees, must be met or exceeded. A third condition may be that an operator must be on the truck 10 during the occurrence of the first and second conditions. In the above example, if each of these three conditions is met, the controller 103 applies a single instance of the greater amount of steer correction, e.g., 10 degrees, if an object is detected in one of the steer bumper zones after the occurrence of the three conditions. Subsequently applied steer corrections would be of the lesser amount, e.g., 7 degrees, until all three conditions are once again satisfied, in which case another single instance of the greater amount of steer correction will be applied by the controller 103.
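The three-condition gate above can be sketched as follows. The thresholds mirror the example values in the text; representing condition three as a boolean flag, and the function and parameter names, are assumptions for illustration.

```python
# Sketch of the three-condition gate for selecting the greater steer
# correction amount (10 vs. 7 degrees in the example). The thresholds
# follow the text; the operator-presence flag is an assumed representation.

GREATER_CORRECTION_DEG = 10.0
LESSER_CORRECTION_DEG = 7.0

def select_correction(speed_kph, steered_angle_deg, operator_on_truck,
                      min_speed_kph=4.82, min_angle_deg=45.0):
    """Return the steer correction magnitude to apply, in degrees."""
    conditions_met = (speed_kph >= min_speed_kph
                      and steered_angle_deg >= min_angle_deg
                      and operator_on_truck)
    return GREATER_CORRECTION_DEG if conditions_met else LESSER_CORRECTION_DEG
```

A fuller implementation would also latch the "single instance" behavior, reverting to the lesser amount after the greater correction has been applied once.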
Referring to Figures 15A-15C, a scanned environment 200, also referred to herein as a landscape, is illustrated. The environment 200 may be derived by the controller 103 based on sensor data obtained by the controller 103 from an obstacle sensor 76, such as a laser scanning device. In this embodiment, a single obstacle sensor 76 is used to provide the sensor data, although additional sensors 76 could be used as desired. In an illustrative embodiment, the obstacle sensor 76 may be located at a distance above the floor upon which the truck 10 is traveling, wherein the obstacle sensor 76 scans in a scanning plane that is oriented at an angle downward from the sensor 76 toward the floor.
The illustrative environment 200 illustrated in Figures 15A-15C extends in an axial direction, i.e., parallel to a central axis CA of the truck 10, from a front edge 200A of the environment 200 to a trailing edge 200B of the environment 200. The front edge 200A is offset a predefined distance DF from the front of the truck 10. The distance DF may be any suitable distance and, in a preferred embodiment, is from about 1 meter to about 5 meters. The trailing edge 200B is located at a predetermined location L1 associated with the truck 10. As some non-limiting examples, the location L1 may be defined at a load wheel of the truck 10, at a rear end of the estimated position of a typical load carried by the truck 10, or at the tips of the forks 16, as illustrated in Figures 15A-15C.
The illustrative environment 200 in the embodiment shown in Figures 15A-15C extends in a lateral direction, i.e., perpendicular to the central axis CA of the truck 10, from a left edge 200C of the environment 200 to a right edge 200D of the environment 200. The left edge 200C is laterally offset a predefined distance DL to the left of the central axis CA of the truck 10. The right edge 200D is laterally offset a predefined distance DR to the right of the central axis CA of the truck 10. The distances DL and DR may comprise any suitable distances and, in a preferred embodiment, are each from about 2 meters to about 5 meters. It is noted that the distances DL and DR could be measured from the sides of the truck 10 or any other suitable location, rather than from the central axis CA. It is further noted that the edges 200A-200D of the environment 200 may comprise any shape and need not define straight edges. For example, the edges 200A-200D could be curved or could comprise uneven or jagged portions.
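The environment geometry just described (offsets DF, DL and DR about the central axis CA) might be represented in software as a small data structure. This is a hypothetical sketch; the field names, the single rear offset standing in for the trailing edge 200B, and the example dimensions are assumptions chosen within the preferred ranges quoted in the text.

```python
from dataclasses import dataclass

# Hypothetical container for the environment 200 geometry. df_m, dl_m and
# dr_m correspond to the offsets DF, DL and DR described in the text; the
# rearward extent to the trailing edge 200B is folded into rear_m purely
# for illustration.

@dataclass
class Environment:
    df_m: float    # front edge 200A, ahead of the truck front
    rear_m: float  # distance from the truck front back to trailing edge 200B
    dl_m: float    # left edge 200C, left of central axis CA
    dr_m: float    # right edge 200D, right of central axis CA

    def contains(self, x_m: float, y_m: float) -> bool:
        """True if a point (x ahead of the truck front, y left-positive
        from the central axis CA) lies within the rectangular bounds."""
        return (-self.rear_m <= x_m <= self.df_m
                and -self.dr_m <= y_m <= self.dl_m)

# Example dimensions within the preferred ranges from the text
env = Environment(df_m=3.0, rear_m=2.0, dl_m=2.5, dr_m=2.5)
```

Note that the text allows curved or jagged edges; a rectangle is the simplest case.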
The illustrative environment 200 illustrated in Figures 15A-15C comprises a scanned area 202 and a history area 204. The scanned area 202 is actively scanned by the obstacle sensor 76 during operation of the truck 10. The history area 204 is not actively scanned by the obstacle sensor 76, but objects that are detected in the scanned area 202 are able to be tracked as they pass through the history area 204 during movement of the truck 10, as will be described herein. The history area 204 comprises a first portion 2040A comprising unscanned areas laterally outside of the scanned area 202 and also comprises a second portion 2040B comprising an area that is located rearward of the scanned area 202, as shown in Figures 15A-15C.
The scanned area 202 extends from the front edge 200A of the environment 200 to a predetermined axial location L2, the location L2 in the embodiment shown being defined near the front end of the truck 10, although it could be defined at other areas. The scanned area 202 extends in the lateral direction between predetermined lateral locations L3 and L4, the locations L3 and L4 being laterally offset from the respective sides of the truck 10 and located between the sides of the truck 10 and the left and right edges 200C and 200D of the environment 200, as shown in Figures 15A-15C.
The first portion 2040A of the history area 204 extends laterally from both sides of the scanned area 202, i.e., from the respective locations L3 and L4, to the left and right edges 200C and 200D of the environment 200. The second portion 2040B of the history area 204 extends rearward from the scanned area 202, i.e., from the location L2 to the trailing edge 200B of the environment 200. The second portion 2040B of the history area 204 extends laterally between the left and right edges 200C and 200D of the environment 200.
The scanned area 202 and the history area 204 each comprise corresponding left and right sections 202A, 202B and 204A, 204B. The left section 202A of the scanned area 202 in the embodiment shown comprises four scan zones 202A1, 202A2, 202A3, 202A4 (collectively referred to hereinafter as scan zones 202A1-4), and the right section 202B of the scanned area 202 in the embodiment shown comprises four scan zones 202B1, 202B2, 202B3, 202B4 (collectively referred to hereinafter as scan zones 202B1-4). The scan zones 202A1-4, 202B1-4 illustrated in Figures 15A-15C are substantially the same size and are generally rectangular in shape, with the exception of the scan zones 202A4 and 202B4 located closest to the truck 10, which have angled lower corner portions. However, it is noted that the scan zones 202A1-4, 202B1-4 could have any suitable size and shape. Moreover, while the scan zones 202A4 and 202B4 located closest to the truck 10 in the embodiment shown extend slightly outwardly from the front of the truck 10, i.e., from the location L2, the scan zones 202A4 and 202B4 located closest to the truck 10 could extend to other locations without departing from the spirit and scope of the invention. Also, while each section 202A, 202B of the scanned area 202 in the embodiment shown comprises four scan zones 202A1-4, 202B1-4, additional scan zones could be provided in each section 202A, 202B.
The obstacle sensor 76 scans the scan zones 202A1-4, 202B1-4 and sends sensor data to the controller 103 with respect to objects detected in the scan zones 202A1-4, 202B1-4. Included in the sensor data sent by the obstacle sensor 76 are data for each scan zone 202A1-4, 202B1-4 that are representative of whether an object is detected in the corresponding scan zone 202A1-4, 202B1-4. Further, if an object is detected in a scan zone 202A1-4, 202B1-4, the sensor data includes data representative of the distance that the detected object is from a reference coordinate Rc associated with the vehicle. The reference coordinate Rc may be a predetermined location on the truck 10, such as a bumper, wheel, fork, the obstacle sensor 76, etc., or the reference coordinate Rc may be an axis or plane associated with the truck 10. In the embodiment shown, the reference coordinate Rc is the central axis CA of the truck 10.
As shown in Figures 15A-15C, each scan zone 202A1-4, 202B1-4 comprises a plurality of buckets 220. The buckets 220 are used to track objects that are detected in the scan zones 202A1-4, 202B1-4 in a plane generally parallel to the floor, as will be discussed herein. In a preferred embodiment, each scan zone 202A1-4, 202B1-4 comprises between four and eleven buckets 220 (six buckets 220 are included in each scan zone 202A1-4, 202B1-4 in the embodiment shown), although fewer or additional buckets 220 could be included in each scan zone 202A1-4, 202B1-4.
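The zone-and-bucket layout described above can be sketched as a simple data structure. This is an illustrative reconstruction, not code from the patent: the zone names and the six-buckets-per-zone count follow the embodiment shown, while the structure itself is an assumption.

```python
from dataclasses import dataclass

BUCKETS_PER_ZONE = 6   # six buckets 220 per scan zone in the embodiment shown
ZONES_PER_SIDE = 4     # scan zones 202A1-4 on the left, 202B1-4 on the right

@dataclass
class Bucket:
    zone: str            # e.g. "202A3"
    index: int           # axial position of the bucket within its zone
    occupied: bool = False

def make_scanned_area():
    """Build the buckets for the left (202A1-4) and right (202B1-4) scan zones."""
    area = {}
    for side in ("A", "B"):  # A = left section 202A, B = right section 202B
        for z in range(1, ZONES_PER_SIDE + 1):
            zone = f"202{side}{z}"
            area[zone] = [Bucket(zone, i) for i in range(BUCKETS_PER_ZONE)]
    return area

area = make_scanned_area()
```

A detected object would be assigned to the bucket whose axial span contains it, which is what lets the controller later update its distance from the truck as described below.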
The history zone 204 also comprises a plurality of buckets 222. The buckets 222 in the first portion 2040A of the history zone 204 may be continuations of the buckets 220 of the scan zones 202A1-4, 202B1-4. The buckets 222 are used to track objects that enter the history zone 204 from the scan zones 202A1-4, 202B1-4, as will be discussed herein.
The first and second objects 272, 274 are illustrated in the environment 200 in Figures 15A-15C. These objects 272, 274 are detected by the obstacle sensor 76 during operation, and the obstacle sensor 76 sends sensor data about the objects 272, 274 to the controller 103. The controller 103 uses the sensor data to assign the objects 272, 274 to the buckets 220 defined within the scanned area 202. Once the objects 272, 274 leave the scanned area 202 and enter the history zone 204, the objects 272, 274 are assigned to the buckets 222 in the history zone 204.
The buckets 220, 222 are used to track the objects 272, 274 in the environment 200 as the truck 10 moves. That is, as the truck 10 moves, the controller 103 tracks the objects 272, 274 by using subsequent sensor data from the obstacle sensor 76 to reassign the objects 272, 274 to adjacent buckets 220, and/or by using dead reckoning to reassign the objects 272, 274 to adjacent buckets 220, 222. By reassigning the objects 272, 274 to adjacent buckets 220, 222, the controller 103 is able to determine an updated axial distance that the objects 272, 274 are from the truck 10. The controller 103 is also able to determine an updated lateral distance that the objects 272, 274 are from the truck 10 using subsequent sensor data and/or dead reckoning. In a preferred embodiment, the objects 272, 274 are tracked by the controller 103 until they are determined to no longer be in the environment 200.
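The dead-reckoning reassignment described above can be sketched as follows: when the truck advances, each tracked object's axial distance shrinks by the distance travelled, and the object falls into the bucket whose axial span now contains it. The bucket depth and the (axial, lateral) track representation are assumptions for illustration; the patent gives no numerical values.

```python
BUCKET_DEPTH_MM = 200.0  # assumed axial span of one bucket

def bucket_index(axial_mm):
    """Index of the bucket whose axial span contains the given distance."""
    return int(axial_mm // BUCKET_DEPTH_MM)

def dead_reckon(tracks, travel_mm):
    """Update (axial_mm, lateral_mm) tracks after the truck moves travel_mm forward.

    Objects whose updated axial distance falls behind the truck are dropped,
    i.e., they are no longer determined to be in the environment.
    """
    updated = []
    for axial_mm, lateral_mm in tracks:
        axial_mm -= travel_mm      # the object is now closer to the truck
        if axial_mm >= 0.0:
            updated.append((axial_mm, lateral_mm))
    return updated

tracks = [(1500.0, -300.0), (250.0, 450.0)]
tracks = dead_reckon(tracks, 300.0)   # second object drops out of the environment
```

This is why an object that has left the scanned area (or dipped below the scanning plane) can still be steered around: its bucket assignment keeps being updated from odometry alone.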
It is noted that, if the obstacle sensor 76 scans in a scanning plane that is oriented at an angle downward from the sensor 76 toward the floor, some objects that are detected in one or more of the scan zones 202A1-4, 202B1-4 may not be detected in an adjacent scan zone, even though the object is located within the axial dimension of the adjacent scan zone. For example, shorter objects may be detected by the obstacle sensor 76 in the scan zone 202A1 but may not be detected by the obstacle sensor 76 after entering the axial dimensions of the adjacent scan zone 202A2. While the sensor data provided by the obstacle sensor 76 may not indicate that the object is in the scan zone 202A2, i.e., since the object is located below the scanning plane of the sensor 76, the object is still tracked in the environment 200 through dead reckoning.
Referring to Figures 16A-16C, exemplary action zones 280 defined within the environment 200 are illustrated. The action zones 280 may be used to implement various steering maneuvers as will be described herein. The action zones 280 in the embodiment shown are divided into left and right action zones 282, 284, where the left action zone 282 is located on the left of the central axis CA of the truck 10, and the right action zone 284 is located on the right of the central axis CA of the truck 10.
The illustrative action zones 280 shown in Figures 16A-16C comprise left and right stop zones 300, 302, left and right no-steer zones 304, 306, left and right steer zones 308, 310, and left and right hug zones 312, 314.
The left and right stop zones 300, 302 are located in front of and immediately to the sides of the truck 10. If an object is detected in either of the stop zones 300, 302, the controller 103 will initiate a brake operation to cause the truck 10 to stop.
Laterally outward of the stop zones 300, 302 are the left and right no-steer zones 304, 306. The left and right no-steer zones 304, 306 comprise front and rear portions 304A, 306A and 304B, 306B. The front portions 304A, 306A of the no-steer zones 304, 306 may comprise scanned portions of the no-steer zones 304, 306, i.e., the portions of the no-steer zones 304, 306 corresponding to the scanned area 202, while the rear portions 304B, 306B of the no-steer zones 304, 306 may comprise non-scanned portions of the no-steer zones 304, 306, i.e., the portions of the no-steer zones 304, 306 corresponding to the second portion 2040B of the history zone 204. If an object is detected in one of the no-steer zones 304, 306, the controller 103 does not permit the vehicle to turn toward the no-steer zone 304, 306 in which the object was detected until the object moves out of the respective no-steer zone 304, 306.
Laterally outward of the no-steer zones 304, 306 are the left and right steer zones 308, 310. The left and right steer zones 308, 310 comprise front and rear portions 308A, 310A and 308B, 310B. The front portions 308A, 310A of the steer zones 308, 310 may comprise scanned portions of the steer zones 308, 310, i.e., the portions of the steer zones 308, 310 corresponding to the scanned area 202, while the rear portions 308B, 310B of the steer zones 308, 310 may comprise non-scanned portions of the steer zones 308, 310, i.e., the portions of the steer zones 308, 310 corresponding to the second portion 2040B of the history zone 204. If an object is detected in one of the rear portions 308B, 310B of the steer zones 308, 310, the controller 103 permits the vehicle to turn toward the steer zone 308, 310 in which the object was detected, i.e., until the detected object enters the adjacent no-steer zone 304, 306, at which point the controller 103 does not permit further turning of the truck 10 toward the respective no-steer zone 304, 306, and at which point the controller 103 may implement another steering maneuver as will be described herein. It is noted that, in the preferred embodiment, the controller 103 does not implement a steering maneuver that turns the truck 10 toward a steer zone 308, 310 if an object has been detected in the front portion 308A, 310A thereof, although the controller 103 could be programmed to implement such a steering maneuver.
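The nesting of the zones described so far (stop, then no-steer, then steer, then the hug zones 312, 314 introduced next) can be sketched as a classifier on an object's lateral distance from the central axis CA. The boundary distances below are invented for illustration; the patent gives no dimensions.

```python
# Assumed lateral boundaries (mm from the central axis CA), innermost first.
STOP_MM = 400.0       # stop zones 300, 302
NO_STEER_MM = 700.0   # no-steer zones 304, 306
STEER_MM = 1000.0     # steer zones 308, 310 (their inner edge is the hug line)
HUG_MM = 1400.0       # hug zones 312, 314

def action_zone(lateral_mm):
    """Classify an object's action zone from its signed lateral offset."""
    d = abs(lateral_mm)   # same bands apply on the left and right sides
    if d < STOP_MM:
        return "stop"
    if d < NO_STEER_MM:
        return "no-steer"
    if d < STEER_MM:
        return "steer"
    if d < HUG_MM:
        return "hug"
    return "outside"
```

The controller's per-zone behavior (brake, forbid turning toward, permit turning toward, maintain distance) then keys off this label.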
Laterally outward of the steer zones 308, 310 are the left and right hug zones 312, 314. The hug zones 312, 314 are used by the controller 103 to steer the truck 10 relative to selected objects such that the truck 10 is maintained substantially at a desired distance from the selected object, as will be described herein with reference to Figures 17A-17C. The laterally inner boundaries of the hug zones 312, 314 are defined by the hug lines 312A, 314A, as illustrated in Figures 16A-16C and 17A-17C.
Select ones of the action zones 280, or portions thereof, may be used by the controller 103 to implement additional steering maneuvers. For example, the no-steer zones 304, 306 and all or portions of the steer zones 308, 310 may respectively define left and right steer-past zones 316, 318. For example, the steer-past zones 316, 318 may be defined by the no-steer zones 304, 306 and the front portions 308A, 310A, but not the rear portions 308B, 310B, of the steer zones 308, 310. If an object is detected in, or is otherwise determined to be located in, e.g., through dead reckoning, one of the steer-past zones 316, 318, the truck 10 may be steered past the object, provided that another object is not located in the stop zone 300, 302, the no-steer zone 304, 306, or the front portion 308A, 310A of the steer zone 308, 310 on the opposite side of the truck 10. It is noted that the steer-past zones 316, 318 described and illustrated herein could be defined by other ones of the action zones 280 or portions thereof.
The controller 103 may implement various steering maneuvers upon the occurrence of certain predefined conditions. A first illustrative event occurs when an object within the scanned area 202 is detected by the obstacle sensor 76 and is determined to be within the left or right hug line 312A, 314A. If an object is detected within the scanned area 202 and within the left or right hug line 312A, 314A, the controller 103 will attempt to steer the truck 10 past the detected object, as long as a steering maneuver is permitted, i.e., as long as a second object is not detected within the stop zone 300, 302, the no-steer zone 304, 306, or the front portion 308A, 310A of the steer zone 308, 310 on the opposite side of the truck 10.
A second illustrative event occurs when an object is detected or otherwise determined to be located, e.g., through dead reckoning, within a no-steer zone 304, 306, and the object is located between the front edge 200A of the environment 200 and a predetermined axial location L5 associated with the truck 10, see Figures 16A-16C. The predetermined axial location L5 associated with the truck 10 may be defined, for example, at the axial location where the forks 16 extend from the truck 10. The predetermined axial location L5 may alternatively be defined with respect to a predetermined distance from the front edge 200A of the environment 200. Upon the occurrence of the event according to this example, the controller 103 will attempt to steer the truck 10 past the detected object, as long as a steering maneuver is permitted, i.e., as long as a second object is not detected within the stop zone 300, 302, the no-steer zone 304, 306, or the front portion 308A, 310A of the steer zone 308, 310 on the opposite side of the truck 10.
A third illustrative event occurs when a first object is detected by the obstacle sensor 76 within the left hug line 312A and a second object is detected by the obstacle sensor 76 within the right hug line 314A. In this case, the controller 103 will implement a steering maneuver to maintain the truck 10 on a straight heading until one of the following occurs: one of the objects moves outside its hug line 312A, 314A; one of the objects enters a rear portion 308B, 310B of a steer zone 308, 310; one of the objects leaves the environment 200; or one of the objects enters a stop zone 300, 302. Upon the occurrence of one of these events, the controller 103 may implement another steering maneuver or initiate a brake operation depending on the location of the object(s).
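A decision rule covering this third event, holding a straight heading while an object is detected inside the line 312A, 314A on each side, might look like the following sketch, building on a per-side zone label such as the one above. The labels, the precedence of braking, and the exact release conditions are assumptions for illustration.

```python
def maneuver(left_zone, right_zone):
    """Pick a steering action from the zone labels of the nearest object on each side.

    Labels: "stop", "no-steer", "steer" (inside the hug line), "hug" or
    "outside" (outside the hug line or no object on that side).
    """
    if "stop" in (left_zone, right_zone):
        return "brake"                 # an object entered a stop zone
    inside = {"no-steer", "steer"}     # object is within that side's hug line
    if left_zone in inside and right_zone in inside:
        return "straight"              # third event: hold a straight heading
    if left_zone in inside:
        return "steer-right"           # steer past the object on the left
    if right_zone in inside:
        return "steer-left"            # steer past the object on the right
    return "straight"                  # nothing within either hug line
```

Once one side's object leaves its hug line or the environment, the function falls through to a single-sided steer-past action, mirroring the release conditions listed above.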
A fourth illustrative event occurs when a "hug" maneuver is implemented by the controller 103. Additional details in connection with the hug maneuver will be described below with reference to Figures 17A-17C.
Referring to Figures 16A-16C in succession, illustrative steering maneuvers implemented by the controller 103 during movement of the truck 10 will be described. The truck 10 may be traveling in response to receiving a wirelessly transmitted travel request, i.e., from a wireless transmitter, as discussed in detail herein. Alternatively, the truck 10 may be coasting to a stop or may be manually operated by a rider or by an operator walking alongside the truck 10.
In Figure 16A, the obstacle sensor 76 detects the first and second objects 272, 274 in the scanned area 202. The obstacle sensor 76 sends sensor data to the controller 103 that includes information about the first and second objects 272, 274. The sensor data comprise data representative of which of the scan zones 202A1-4, 202B1-4 (see Figures 15A-15C) the objects 272, 274 are located in. The sensor data also include data representative of a lateral distance that the objects 272, 274 are from the reference coordinate Rc, i.e., the central axis CA of the truck 10 in the embodiment shown.
In Figure 16A, the laterally innermost portion of the first object 272 is determined to be in the scanned area 202 and located outside the left hug line 312A in the left hug zone 312, and the laterally innermost portion of the second object 274 is determined to be in the scanned area 202 and located within the right hug line 314A in the front portion 310A of the right steer zone 310. It is noted that, while a portion of the first object 272 is located outside the left hug zone 312 and a portion of the second object 274 is located in the right hug zone 314, the controller 103 may be concerned primarily with the portion of any detected object that is laterally closest to the truck 10. Based on the object location information in the sensor data, it is determined that the laterally innermost portion of the second object 274 is closer than the laterally innermost portion of the first object 272 to the central axis CA of the truck 10. Based on the locations of the first and second objects 272, 274 in Figure 16A, the controller 103 will automatically implement a steering maneuver to steer the truck 10 toward the first object 272, so as to steer the truck 10 past the second object 274.
The truck 10 continues to be steered toward the first object 272 and past the second object 274 until one of two conditions occurs. The first condition is that the first object 272 (or another object determined to be in the environment 200) enters a predefined portion of the left action zone 282. The predefined portion of the left action zone 282 comprises a portion of the left action zone 282 in which further steering of the truck 10 toward the first object 272 is determined not to be permitted. The predefined portion of the left action zone 282 in the illustrative embodiment shown is either the front portion 308A of the left steer zone 308 or the rear portion 304B of the left no-steer zone 304, but could be another of the left action zones 282 or portions thereof. The second condition is that the second object 274 (and any other objects determined to be in the right action zone 284) completely leaves a predefined portion of the right action zone 284. The predefined portion of the right action zone 284 comprises a portion of the right action zone 284 in which further steering of the truck 10 away from the second object 274 is determined not to be required. The predefined portion of the right action zone 284 in the embodiment shown is the front portion 310A of the right steer zone 310 if the second object 274 is in the scanned area 202, i.e., such that the second object 274 is completely outside the right hug line 314A, or the rear portion 306B of the right no-steer zone 306 rearward of the location L5 if the second object 274 is in the second portion 2040B of the history zone 204, but could be another of the right action zones 284 or portions thereof.
In Figure 16B, the first condition is illustrated, i.e., the first object 272 has entered the front portion 308A of the left steer zone 308. While the first and second objects 272 and 274 are both in the scanned area 202 such that they are actively detected by the obstacle sensor 76, and while the laterally innermost portion of the first object 272 is in the front portion 308A of the left steer zone 308 and the laterally innermost portion of the second object 274 is in the front portion 310A of the right steer zone 310, the controller 103 will implement a steering maneuver such that the truck 10 maintains a straight heading. As noted above, the truck 10 will maintain a straight heading until one of the following occurs: the laterally innermost portion of one of the objects 272, 274 moves outside its hug line 312A, 314A; the laterally innermost portion of one of the objects 272, 274 enters a rear portion 308B, 310B of a steer zone 308, 310; or one of the objects leaves the environment 200.
In Figure 16C, the laterally innermost portion of the second object 274 is illustrated as having moved into the rear portion 310B of the right steer zone 310. In this scenario, the second object 274 has gone from being scanned by the obstacle sensor 76 in the scanned area 202 to not being scanned in the second portion 2040B of the history zone 204, and is thus being tracked by dead reckoning. Since the laterally innermost portion of the first object 272 is in the front portion 308A of the left steer zone 308 and the second object 274 is in the rear portion 310B of the right steer zone 310, the controller 103 automatically implements a steering maneuver to steer the truck 10 past the first object 272, so as to steer the truck 10 toward the second object 274. The truck 10 will continue to be steered past the first object 272 and toward the second object 274 until one of the following illustrative conditions occurs: the laterally innermost portion of the first object 272 enters the rear portion 308B of the left steer zone 308; the first object 272 is located completely outside the hug line 312A; or it is determined that an object is in the right no-steer zone 306 or the front portion 310A of the right steer zone 310. If one of these events occurs, the controller 103 may implement a subsequent steering maneuver as described herein.
If at any time during operation the first and/or second object 272, 274 enters one of the stop zones 300, 302, the controller 103 will initiate a brake operation causing the truck 10 to stop, as discussed previously.
Figures 17A-17C are successive views of the truck 10 performing steering maneuvers according to another aspect of the invention. Figures 17A-17C will be discussed in terms of the action zones 280 described above with reference to Figures 16A-16C. The truck 10 may be traveling in response to receiving a wirelessly transmitted travel request, i.e., from a wireless transmitter, as discussed in detail herein. Alternatively, the truck 10 may be coasting to a stop or may be manually operated by a rider or by an operator walking alongside the truck 10.
In Figure 17A, the obstacle sensor 76 detects a selected object 276 in the scanned area 202. The obstacle sensor 76 sends sensor data to the controller 103 that includes information about the selected object 276. The sensor data comprise data representative of which of the scan zones 202A1-4, 202B1-4 (see Figures 15A-15C) the selected object 276 is located in. The sensor data also include data representative of the lateral distance that the selected object 276 is from the reference coordinate Rc, i.e., the central axis CA of the truck 10 in the embodiment shown. The selected object 276 may be a rack or a stacked product face having a generally axially extending, laterally inner edge portion 276A, although it will be understood that the selected object 276 could be other objects.
In the environment 200 illustrated in Figure 17A, based on the sensor data from the obstacle sensor 76, it is determined that the edge portion 276A of the selected object 276 is in the right steer zone 310. Based on the detected location of the selected object 276 illustrated in Figure 17A, the controller 103 automatically implements a steering maneuver to steer the truck 10 past the selected object 276, with the intent of steering the truck 10 such that the truck 10 is maintained substantially at a desired distance from the edge portion 276A of the selected object 276, i.e., such that the truck 10 "hugs" the edge portion 276A of the selected object 276. In one embodiment, the intent of the steering maneuver may be such that the selected object 276 is at least partially maintained in the right hug zone 314. Additionally or alternatively, the intent of the steering maneuver may be such that a portion of the selected object 276, e.g., the edge portion 276A thereof, is substantially maintained on the right hug line 314A that is associated with the right hug zone 314.
In the illustrative embodiment shown, the intent of the steering maneuver is to continuously steer the truck 10 past the selected object 276 until the selected object 276 is at least partially maintained in the right hug zone 314 and until the edge portion 276A of the selected object 276 is substantially maintained on the right hug line 314A.
Referring to Figure 17B, an illustrative condition is shown in which the truck 10 "overshoots" the right hug line 314A, such that the edge portion 276A of the selected object 276 extends beyond the right hug line 314A. In this case, the controller 103 automatically implements a steering maneuver to steer the truck 10 toward the selected object 276 until the edge portion 276A of the selected object 276 is maintained on the right hug line 314A. It is noted that, since no portion of the selected object 276 is located in the right no-steer zone 306 or in the front portion 310A of the right steer zone 310 in Figure 17B, the truck 10 is permitted to turn toward the selected object 276.
In Figure 17C, after the steering maneuver that steers the truck 10 toward the selected object 276 is implemented such that the edge portion 276A of the selected object 276 is located on the right hug line 314A, the controller 103 implements a steering maneuver to achieve a straight heading of the truck 10 in the axial direction, i.e., parallel to the central axis CA, such that the edge portion 276A of the selected object 276 is maintained on the right hug line 314A. The truck 10 continues on the straight heading until the selected object 276 is determined to no longer be in the environment 200, or until the edge portion 276A of the selected object 276 is determined to no longer be located on the right hug line 314A, at which point the controller 103 could implement a steering maneuver so that the right hug line 314A again coincides with the edge portion 276A of the selected object 276.
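The maneuver of Figures 17A-17C amounts to steering so that the tracked edge portion 276A settles on the line 314A: steer past the object when its edge is inside the line, toward it when the line is overshot, and straight when the edge sits on the line. A minimal proportional-control sketch, with an assumed line position, gain, and steer-angle limit (none of which are given in the patent):

```python
HUG_LINE_MM = 1200.0   # assumed lateral position of the right-side line 314A
GAIN = 0.05            # assumed steering degrees per mm of lateral error
MAX_STEER_DEG = 15.0   # assumed steer-angle saturation

def hug_steer(edge_lateral_mm):
    """Steering command in degrees (positive = turn toward the object).

    Zero error means the edge portion lies on the line, so the truck
    holds a straight heading parallel to the central axis CA.
    """
    error = edge_lateral_mm - HUG_LINE_MM   # negative: overshoot past the line
    cmd = GAIN * error
    return max(-MAX_STEER_DEG, min(MAX_STEER_DEG, cmd))
```

A proportional law like this naturally reproduces the three figure states: a positive command while the edge is still outboard of the line, a corrective negative command after an overshoot, and zero once the edge is held on the line.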
According to one embodiment, if multiple objects are located within the environment 200, the selected object 276 may be an object that is determined to be located closest to the left hug line 312A or the right hug line 314A. Alternatively, the selected object 276 may be the first object that is detected in the scanned area 202 by the obstacle sensor 76, or it may be the first object that is determined to be in at least one of the steer zones 308, 310 and the no-steer zones 304, 306. As another example, the selected object 276 may be an object that is determined to be the object closest to the truck 10 within the environment 200, as measured in the lateral direction.
In addition, the controller 103 may be programmed to perform a steering maneuver to "hug" a selected object only if the object is detected in a select one of the left and right hug zones 312, 314. For example, it may be desired that the truck 10 only hug objects located on the right side of the truck 10. Under this arrangement, the truck 10 can travel down the right side of an aisle, while another truck travels in the opposite direction on the other side of the aisle. As another example, if an operator will only be picking items located on the right side of an aisle, the truck 10 can hug only a rack or stacked product face on the right side of the truck 10, so as to minimize the distance that the operator has to walk from the rack to the truck 10.
Still further, the hug maneuver described herein may be implemented by the controller 103 in one embodiment only after receiving authorization to do so. For example, an operator may press a button, which may be located on the truck 10 or on a remote control device as described herein. Upon receiving authorization to implement a hug maneuver, the controller 103 enters a "hug acquire" mode, wherein the controller 103 looks for objects in the scanned area 202 to hug. Additionally, the operator may designate hug preferences, such as whether to hug an object on the left or right side of the truck 10, the first object detected in the scanned area 202, the object determined to be located closest to the central axis CA of the truck 10, etc. Additionally, once an object that is being hugged is no longer located within the environment 200, the truck 10 may continue forward on a straight heading until a new object to be hugged is detected by the obstacle sensor 76. If a new object is detected by the obstacle sensor 76 within the environment 200, the controller 103 may be programmed to automatically hug the new object, or the controller 103 may need to be authorized to do so by the operator.
In addition, the hug maneuvers used in connection with the hug zones 312, 314 described herein with reference to Figures 17A-17C may be used in combination with the other action zones 280 described above with reference to Figures 16A-16C.
Having thus described the invention of the present application in detail and by reference to embodiments thereof, it will be apparent that modifications and variations are possible without departing from the scope of the invention defined in the appended claims.

Claims (34)

1. A method for a materials handling vehicle to automatically perform a steer correction maneuver, comprising: receiving sensor data from at least one sensing device by a controller on a materials handling vehicle; detecting, based on the received sensor data, that a first object is located in a first zone defined at least partially on a first side of the vehicle; detecting, based on the received sensor data, that a second object is located in a second zone defined at least partially on a second side of the vehicle, wherein the second object is closer to a central axis of the vehicle than the first object; and automatically performing a steer correction maneuver to steer the vehicle toward the first object so as to steer the vehicle past the second object until at least one of: the first object enters a predefined portion of the first zone; and the second object leaves a predefined portion of the second zone.
2. The method according to claim 1, wherein the predefined portion of the first zone comprises a portion of the first zone in which further steering of the vehicle toward the first object is determined not to be permitted.
3. The method according to claim 1, wherein the predefined portion of the second zone comprises a portion of the second zone in which further steering of the vehicle past the second object is determined not to be required.
4. The method according to claim 1, further comprising straightening a heading of the vehicle upon at least one of: the second object leaving the predefined portion of the second zone; and the first object entering the predefined portion of the first zone.
5. The method according to claim 1, further comprising initiating a brake operation if at least one of the first and second objects enters a portion of its corresponding zone comprising a stop zone.
6. The method according to claim 1, wherein automatically performing a steer correction maneuver comprises automatically performing the steer correction maneuver while the materials handling vehicle is traveling in response to receiving a wirelessly transmitted travel request from a corresponding wireless transmitter.
7. The method according to claim 1, wherein receiving the sensor data from the at least one sensing device comprises receiving the sensor data from a laser scanning device.
8. The method according to claim 1, wherein the first and second zones each comprise: a stop zone, wherein an object detected in the stop zone causes the vehicle to initiate a brake operation; a no-steer zone laterally outward of the stop zone, wherein, if an object is detected in at least a portion of the no-steer zone, the vehicle is not permitted to turn toward the no-steer zone in which the object was detected; and a steer zone laterally outward of the no-steer zone, wherein, if an object is detected in at least a portion of the steer zone, the vehicle is permitted to turn toward the steer zone in which the object was detected.
9. The method according to claim 8, wherein the predefined portion of the first zone comprises the no-steer zone of the first zone.
10. The method according to claim 8, wherein the predefined portion of the second zone comprises the steer zone of the second zone.
11. The method according to claim 8, wherein the first and second zones each further comprise a hug zone laterally outward of the steer zone, wherein the hug zone is used by the controller to steer the vehicle relative to selected objects detected in the corresponding hug zone such that the vehicle is maintained substantially at a desired distance from the selected object.
12. A method for tracking objects detected by at least one sensing device on a materials handling vehicle, the method comprising: receiving sensor data from the at least one sensing device by a controller on a materials handling vehicle, wherein the sensor data comprise: data representative of whether an object is detected in a scanned area that is scanned by the at least one sensing device, the scanned area being a part of an environment in which objects are tracked; and data representative of a lateral distance that any detected objects are from a reference coordinate associated with the vehicle; wherein each detected object is tracked until the object is no longer located in the environment by: assigning the object to at least one bucket defined within the scanned area by the at least one sensing device; and using at least one of subsequent sensor data and dead reckoning to reassign the object to adjacent buckets and to determine an updated lateral distance that the object is from the reference coordinate as the vehicle moves; and wherein the controller automatically implements a steer correction maneuver if a tracked object enters a steer-past zone defined within the environment.
13. The method according to claim 12, wherein the reference coordinate is a central axis of the vehicle.
14. The method according to claim 12, wherein the environment is defined by: a front edge that is located a predefined distance from the front of the vehicle; a rear edge that is located at a predetermined location associated with the vehicle; a left edge that is located a predefined distance from a central axis of the vehicle; and a right edge that is located a predefined distance from the central axis of the vehicle.
15. The method according to claim 14, wherein the environment comprises: the scanned area, which extends: in an axial direction from the front edge of the environment rearward to a predetermined axial location, wherein the axial direction is parallel to the central axis of the vehicle; and in a lateral direction from a left predetermined location to a right predetermined location, the left predetermined location being between the left side of the truck and the left edge of the environment, and the right predetermined location being between the right side of the truck and the right edge of the environment; and a history zone that includes: a first portion extending laterally from the scanned area to the left edge of the environment and extending laterally from the scanned area to the right edge of the environment; and a second portion extending rearward from the scanned area to the rear edge of the environment.
16. The method according to claim 15, wherein the at least one sensing device scans objects located in the scanned area and does not scan objects in the history zone.
17. The method according to claim 16, wherein: objects located in the scanned area are tracked using the sensor data from the at least one sensing device and dead reckoning; and once objects that have been detected by the at least one sensing device leave the scanned area and enter the history zone, the objects are tracked using only dead reckoning.
18. - The method according to claim 17, wherein the scanned area comprises a plurality of scan zones, each scan zone comprising a plurality of buckets, the buckets being located adjacent to one another in the axial direction.
19. - The method according to claim 18, wherein the history zone comprises a plurality of buckets located adjacent to one another in the axial direction, so that objects tracked in the history zone during movement of the vehicle are reassigned to adjacent buckets using dead reckoning to update the axial distance of the object relative to the vehicle.
20. - The method according to claim 12, wherein the environment defines a plurality of action zones, the controller implementing a different steer maneuver for each action zone, so that, depending on the action zone into which a tracked object enters, the controller automatically implements the steer maneuver corresponding to the entered action zone.
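The per-zone dispatch of claim 20 amounts to a lookup from the entered action zone to a maneuver. The zone names and maneuvers below are illustrative placeholders, not the patent's definitions.

```python
# Hypothetical mapping from action zone to the maneuver the controller runs.
MANEUVERS = {
    "stop": "brake",
    "no_steer": "limit_steer",
    "steer": "steer_correction",
}

def on_object_entered(zone: str) -> str:
    """Controller hook: select the maneuver corresponding to the entered zone."""
    return MANEUVERS[zone]

assert on_object_entered("stop") == "brake"
assert on_object_entered("no_steer") == "limit_steer"
```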
21. - A method for a materials handling vehicle to automatically implement a steer maneuver, the method comprising: receiving, by a controller on a materials handling vehicle, sensor data from at least one sensor device; detecting that a selected object is in an environment proximate the vehicle; and performing a steer maneuver by steering the vehicle so that the vehicle remains substantially at a desired distance from the selected object.
22. - The method according to claim 21, wherein performing a steer maneuver comprises steering the vehicle so that the selected object is at least partially maintained within a clamp zone defined within the environment.
23. - The method according to claim 22, wherein performing a steer maneuver further comprises steering the vehicle so that at least a portion of the selected object is substantially maintained on a clamp line associated with the clamp zone.
24. - The method according to claim 23, wherein: if a laterally innermost portion of the selected object is located between the clamp line and the vehicle, the vehicle is automatically steered away from the selected object until the laterally innermost portion of the object is located on the clamp line, at which point the vehicle is automatically steered to a desired heading; and if the laterally innermost portion of the selected object is located laterally on the opposite side of the clamp line from the vehicle, the vehicle is automatically steered toward the selected object until the laterally innermost portion of the selected object is located on the clamp line, at which point the vehicle is automatically steered to the desired heading.
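The clamp-line behavior of claim 24 can be sketched as a simple control rule: steer away when the object's innermost portion sits between the line and the vehicle, steer toward it when the portion lies on the far side of the line, and hold heading once it sits on the line. The proportional law, the clamp-line offset, and the gain are assumptions for illustration; the claim specifies the goal, not this control law.

```python
CLAMP_LINE = 1.0  # lateral offset of the clamp line from the vehicle, m (assumed)
GAIN = 0.5        # proportional steering gain (assumed)

def steer_command(innermost_lateral: float) -> float:
    """Positive command steers toward the object, negative steers away.
    The command is zero once the object's innermost portion is on the line."""
    error = innermost_lateral - CLAMP_LINE
    return GAIN * error

assert steer_command(1.0) == 0.0  # on the clamp line: hold the desired heading
assert steer_command(0.6) < 0    # object between line and vehicle: steer away
assert steer_command(1.8) > 0    # object beyond the line: steer toward it
```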
25. - The method according to claim 24, wherein the desired heading is substantially in the axial direction.
26. - The method according to claim 22, wherein the clamp zone extends in an axial direction that is parallel to a central axis of the vehicle and is laterally offset from a side of the vehicle.
27. - The method according to claim 21, wherein the environment comprises first and second clamp zones, the first clamp zone being laterally offset from the left side of the vehicle and the second clamp zone being laterally offset from the right side of the vehicle.
28. - The method according to claim 27, wherein the environment further comprises: first and second stop zones located laterally inward of the respective first and second clamp zones, wherein detection of an object in a stop zone causes the vehicle to initiate a braking operation; first and second no-steer zones located laterally outward of the respective stop zones, wherein detection of an object in at least a portion of a no-steer zone prevents the vehicle from steering toward the no-steer zone in which the object was detected; and first and second steer zones located laterally between the respective no-steer zones and the respective clamp zones, wherein detection of an object in at least a portion of a steer zone permits the vehicle to steer toward the steer zone in which the object was detected.
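The lateral ordering of the zones in claim 28 (inner to outer: stop, no-steer, steer, clamp) can be illustrated with a lookup by lateral offset. The boundary values are assumed examples, not distances from the patent.

```python
# Outer edge of each zone as a lateral offset from the vehicle side, m (assumed).
ZONE_EDGES = [(0.4, "stop"), (0.8, "no_steer"), (1.2, "steer"), (1.6, "clamp")]

def zone_for(lateral: float):
    """Map a detected object's lateral offset to the action zone it lies in."""
    d = abs(lateral)
    for outer_edge, name in ZONE_EDGES:
        if d < outer_edge:
            return name
    return None  # beyond the clamp zone: no zone entered

assert zone_for(0.2) == "stop"      # would trigger a braking operation
assert zone_for(1.0) == "steer"     # turning toward this side is permitted
assert zone_for(2.0) is None
```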
29. - The method according to claim 28, wherein the selected object is the first object detected in at least one of the steer zones and the no-steer zones.
30. - The method according to claim 27, wherein the controller is programmable to perform a steer maneuver only if an object is detected in a selected one of the first and second clamp zones.
31. - The method according to claim 21, wherein detecting that a selected object is in an environment proximate the vehicle comprises detecting the selected object in a scanned area of the environment, wherein the scanned area is scanned by the at least one sensor device.
32. - The method according to claim 21, wherein the selected object is an object determined to be closest to the vehicle within the environment, as measured in a lateral direction that is perpendicular to a central axis of the vehicle.
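Claim 32's selection rule reduces to choosing the tracked object with the smallest perpendicular (lateral) distance from the central axis. A minimal sketch, assuming objects are represented as (id, lateral distance) pairs:

```python
def select_object(objects):
    """Return the (id, lateral) pair nearest the vehicle's central axis,
    or None if no objects are tracked in the environment."""
    return min(objects, key=lambda o: abs(o[1]), default=None)

# Hypothetical tracked objects; negative lateral = left of the central axis.
objs = [("rack", 1.4), ("pallet", -0.9), ("post", 2.2)]
assert select_object(objs)[0] == "pallet"
assert select_object([]) is None
```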
33. - The method according to claim 21, wherein the selected object is the first object that is detected in a scanned area defined in the environment, wherein the scanned area is scanned by at least one sensor device.
34. - The method according to claim 21, wherein the selected object comprises one of a rack and a stacked product face having an edge portion that extends generally axially, so that the vehicle is maintained substantially at the desired distance from the edge portion of the rack or the stacked product face.
MX2013009769A 2011-02-23 2012-02-21 Object tracking and steer maneuvers for materials handling vehicles. MX339969B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/033,169 US8731777B2 (en) 2009-08-18 2011-02-23 Object tracking and steer maneuvers for materials handling vehicles
PCT/US2012/025849 WO2012115920A2 (en) 2011-02-23 2012-02-21 Object tracking and steer maneuvers for materials handling vehicles

Publications (2)

Publication Number Publication Date
MX2013009769A true MX2013009769A (en) 2013-10-01
MX339969B MX339969B (en) 2016-06-20

Family

ID=45771947

Family Applications (1)

Application Number Title Priority Date Filing Date
MX2013009769A MX339969B (en) 2011-02-23 2012-02-21 Object tracking and steer maneuvers for materials handling vehicles.

Country Status (11)

Country Link
US (3) US8731777B2 (en)
EP (5) EP2889713B1 (en)
KR (4) KR102144781B1 (en)
CN (2) CN103392157B (en)
AU (1) AU2012220819B2 (en)
BR (2) BR122014017960A2 (en)
CA (3) CA2827735C (en)
IN (1) IN2014CN03425A (en)
MX (1) MX339969B (en)
RU (1) RU2578831C9 (en)
WO (1) WO2012115920A2 (en)

Families Citing this family (77)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8970363B2 (en) 2006-09-14 2015-03-03 Crown Equipment Corporation Wrist/arm/hand mounted device for remotely controlling a materials handling vehicle
US9122276B2 (en) 2006-09-14 2015-09-01 Crown Equipment Corporation Wearable wireless remote control device for use with a materials handling vehicle
US10358331B2 (en) * 2010-12-20 2019-07-23 Jlg Industries, Inc. Work platform with protection against sustained involuntary operation
US9522817B2 (en) 2008-12-04 2016-12-20 Crown Equipment Corporation Sensor configuration for a materials handling vehicle
US8577551B2 (en) * 2009-08-18 2013-11-05 Crown Equipment Corporation Steer control maneuvers for materials handling vehicles
US8731777B2 (en) 2009-08-18 2014-05-20 Crown Equipment Corporation Object tracking and steer maneuvers for materials handling vehicles
DE102010032909A1 (en) * 2010-07-30 2012-02-02 Wabco Gmbh Monitoring system for monitoring the environment, in particular the rear space of motor vehicles
US10124999B2 (en) 2010-12-20 2018-11-13 Jlg Industries, Inc. Opto-electric system of enhanced operator control station protection
US9146559B2 (en) * 2011-03-18 2015-09-29 The Raymond Corporation System and method for gathering video data related to operation of an autonomous industrial vehicle
EP2715286B1 (en) * 2011-05-31 2020-11-25 John Bean Technologies Corporation Deep lane navigation system for automatic guided vehicles
US9030332B2 (en) * 2011-06-27 2015-05-12 Motion Metrics International Corp. Method and apparatus for generating an indication of an object within an operating ambit of heavy loading equipment
US20140133944A1 (en) * 2012-11-12 2014-05-15 Lts Scale Company, Llc Detection System Usable In Forklift Apparatus
US9415983B2 (en) * 2012-12-17 2016-08-16 Shamrock Foods Company Crash prevention system for a storage and retrieval machine
US9144525B2 (en) * 2013-03-14 2015-09-29 Max Mobility, Llc. Motion assistance system for wheelchairs
FR3012387B1 (en) * 2013-10-25 2015-10-30 Rio Tinto Alcan Int Ltd HANDLING VEHICLE AND ELECTROLYSIS FACTORY COMPRISING THIS VEHICLE
US20150318765A1 (en) * 2014-04-30 2015-11-05 Rossie Owen Terry Electrical motors and methods thereof having reduced electromagnetic emissions
KR102075808B1 (en) * 2013-12-30 2020-03-02 주식회사 두산 Controller and control method of Forklift
JP6267972B2 (en) * 2014-01-23 2018-01-24 日立建機株式会社 Work machine ambient monitoring device
US9886036B2 (en) 2014-02-10 2018-02-06 John Bean Technologies Corporation Routing of automated guided vehicles
US9164514B1 (en) * 2014-04-14 2015-10-20 Southwest Research Institute Cooperative perimeter patrol system and method
CN103941736B (en) * 2014-05-07 2017-06-30 山东理工大学 A kind of intelligent carrier and its control method
US9459349B2 (en) * 2014-10-27 2016-10-04 Hyster-Yale Group, Inc. Vehicle and environmental detection system
CN104483969B (en) * 2014-12-05 2018-04-17 嘉兴市日新自动化科技有限公司 The automatic patrol robot of road
DE102015101381A1 (en) 2015-01-30 2016-08-04 Hubtex Maschinenbau Gmbh & Co. Kg Steering method, industrial truck and route guidance system
US9864371B2 (en) 2015-03-10 2018-01-09 John Bean Technologies Corporation Automated guided vehicle system
US9989636B2 (en) * 2015-03-26 2018-06-05 Deere & Company Multi-use detection system for work vehicle
US10202267B2 (en) * 2015-10-29 2019-02-12 The Raymond Corporation Systems and methods for sensing a load carried by a material handling vehicle
EP3199486B1 (en) * 2016-01-28 2018-06-20 MOBA - Mobile Automation AG Crane mechanism and work platform with a load measuring device and an integrated inclination sensor
US20170253237A1 (en) * 2016-03-02 2017-09-07 Magna Electronics Inc. Vehicle vision system with automatic parking function
EP3269680B1 (en) 2016-07-14 2020-09-30 Toyota Material Handling Manufacturing Sweden AB Floor conveyor
EP3269678B1 (en) 2016-07-14 2019-03-06 Toyota Material Handling Manufacturing Sweden AB Floor conveyor
EP3269679B1 (en) 2016-07-14 2019-09-11 Toyota Material Handling Manufacturing Sweden AB Floor conveyor
DE102016115703A1 (en) 2016-08-24 2018-03-01 Jungheinrich Aktiengesellschaft Industrial truck and method for controlling an industrial truck
KR102231883B1 (en) * 2016-08-26 2021-03-29 크라운 이큅먼트 코포레이션 Multi-field scanning tools for material handling vehicles
CN109791477A (en) 2016-09-30 2019-05-21 史泰博公司 Hybrid modular storage extraction system
US10683171B2 (en) 2016-09-30 2020-06-16 Staples, Inc. Hybrid modular storage fetching system
US10589931B2 (en) 2016-09-30 2020-03-17 Staples, Inc. Hybrid modular storage fetching system
CN107045677A (en) * 2016-10-14 2017-08-15 北京石油化工学院 A kind of harmful influence warehouse barrier Scan orientation restoring method, apparatus and system
DE102016123542A1 (en) 2016-12-06 2018-06-07 Jungheinrich Aktiengesellschaft Method for automatic alignment of a truck in a warehouse and system of an industrial truck and a warehouse
DE102016123541A1 (en) * 2016-12-06 2018-06-07 Jungheinrich Aktiengesellschaft Method for automatic alignment of a truck in a warehouse and system of an industrial truck and a warehouse
EP3339238B1 (en) * 2016-12-23 2023-09-20 The Raymond Corporation Systems and methods for determining a rack interface for a material handling vehicle
CN106671906B (en) * 2016-12-30 2017-11-28 罗维迁 Control method, control device and the AGV fork trucks of AGV fork trucks
DE102017203514A1 (en) * 2017-03-03 2018-09-20 Robert Bosch Gmbh Industrial truck with improved sensor concept and forklift system
US10429188B2 (en) 2017-03-30 2019-10-01 Crown Equipment Corporation Warehouse mapping tools
DE102017214185A1 (en) * 2017-08-15 2019-02-21 Zf Friedrichshafen Ag Control of a transport vehicle
US20200201347A1 (en) * 2017-08-30 2020-06-25 Positec Power Tools (Suzhou) Co., Ltd Self-moving device and method for controlling movement path of same
EP3466843A1 (en) * 2017-10-03 2019-04-10 AJ Produkter AB A method and a device for controlling speed of a moving shuttle
JP6753385B2 (en) * 2017-10-23 2020-09-09 株式会社豊田自動織機 Remote control system for industrial vehicles, remote control device, remote control program for industrial vehicles, and remote control method for industrial vehicles
CN107745908A (en) * 2017-11-30 2018-03-02 无锡凯乐士科技有限公司 A kind of new logistics shuttle
DE102017128623A1 (en) 2017-12-01 2019-06-06 Jungheinrich Aktiengesellschaft Method for coupling a second remote control unit with a first remote control unit
JP7267306B2 (en) * 2018-05-18 2023-05-01 コリンダス、インコーポレイテッド Remote communication and control system for robotic interventional procedures
MX2021000678A (en) * 2018-08-01 2021-03-25 Crown Equip Corp Systems and methods for warehouse environment speed zone management.
US11084410B1 (en) 2018-08-07 2021-08-10 Staples, Inc. Automated guided vehicle for transporting shelving units
US11590997B1 (en) 2018-08-07 2023-02-28 Staples, Inc. Autonomous shopping cart
US11630447B1 (en) 2018-08-10 2023-04-18 Staples, Inc. Automated guided vehicle for transporting objects
DE102018121928A1 (en) 2018-09-07 2020-03-12 Jungheinrich Aktiengesellschaft Industrial truck with a detection unit
JP7180219B2 (en) * 2018-09-10 2022-11-30 株式会社豊田自動織機 autonomous vehicle
US20200125092A1 (en) * 2018-10-17 2020-04-23 Wellen Sham Self-driving with onboard control system
CN109361352B (en) * 2018-11-09 2020-08-04 苏州瑞得恩光能科技有限公司 Control method of cleaning system
US11119487B2 (en) 2018-12-31 2021-09-14 Staples, Inc. Automated preparation of deliveries in delivery vehicles using automated guided vehicles
US11180069B2 (en) 2018-12-31 2021-11-23 Staples, Inc. Automated loading of delivery vehicles using automated guided vehicles
CN117767471A (en) 2019-02-01 2024-03-26 克朗设备公司 On-board charging station for remote control equipment
US11641121B2 (en) 2019-02-01 2023-05-02 Crown Equipment Corporation On-board charging station for a remote control device
US11124401B1 (en) 2019-03-31 2021-09-21 Staples, Inc. Automated loading of delivery vehicles
EP3964404B1 (en) 2019-04-02 2023-06-07 The Raymond Corporation Material handling vehicle
EP3718831A1 (en) 2019-04-05 2020-10-07 The Raymond Corporation Material handling vehicle having a multi-piece bumper assembly
US20220276650A1 (en) * 2019-08-01 2022-09-01 Telefonaktiebolaget Lm Ericsson (Publ) Methods for risk management for autonomous devices and related node
US11353877B2 (en) * 2019-12-16 2022-06-07 Zoox, Inc. Blocked region guidance
US11462041B2 (en) 2019-12-23 2022-10-04 Zoox, Inc. Pedestrians with objects
US11789155B2 (en) 2019-12-23 2023-10-17 Zoox, Inc. Pedestrian object detection training
CN115843346A (en) * 2019-12-23 2023-03-24 祖克斯有限公司 Pedestrian with object
WO2021167928A1 (en) * 2020-02-21 2021-08-26 Crown Equipment Corporation Modify vehicle parameter based on vehicle position information
CA3187922A1 (en) 2020-07-31 2022-02-03 Trisha M. Luthman On-board charging station for a remote control device
MX2023001754A (en) 2020-08-11 2023-03-07 Crown Equip Corp Remote control device.
WO2022035732A1 (en) 2020-08-13 2022-02-17 Crown Equipment Corporation Method and system for testing a remote control device
US20220107635A1 (en) * 2020-10-05 2022-04-07 Crown Equipment Corporation Systems and methods for relative pose sensing and field enforcement of materials handling vehicles using ultra-wideband radio technology
CN116868595A (en) 2021-02-19 2023-10-10 克朗设备公司 Calculating missed messages expected to be received by a central device from peripheral devices

Family Cites Families (166)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US1835808A (en) 1930-02-06 1931-12-08 Baker Raulang Co Industrial truck
US2959260A (en) 1955-10-17 1960-11-08 Thew Shovel Co Dual control system for cranes and the like
US3047783A (en) 1957-08-27 1962-07-31 Philips Corp Remote control arrangements
US3016973A (en) 1958-12-29 1962-01-16 Clark Equipment Co Lift truck
GB1002825A (en) 1962-02-07 1965-09-02 Lansing Bagnall Ltd Improvements in or relating to industrial trucks
US3587784A (en) 1968-09-26 1971-06-28 Hunter Manufacturing Co Inc Telescopic load booster
US3825130A (en) 1972-06-26 1974-07-23 S Lapham Material handling system
US3968893A (en) 1972-06-26 1976-07-13 Lapham Engineering Co., Inc. Material handling vehicle with H-shaped walk area
GB1515712A (en) 1975-08-14 1978-06-28 Total Mech Handling Ltd Mechanical handling apparatus
US4074120A (en) 1976-03-02 1978-02-14 Kenway Incorporated Automated materials storage system and method
US4258825A (en) 1978-05-18 1981-03-31 Collins Pat L Powered manlift cart
US4287966A (en) 1978-11-06 1981-09-08 Missouri Research Laboratories Inc. Industrial truck
JPS58217012A (en) 1982-06-11 1983-12-16 Kubota Ltd Traveling vehicle with obstacle detecting sensor
US4476954A (en) 1982-09-22 1984-10-16 Johnson Engineering Corporation Remote control for motor vehicle
GB8313338D0 (en) 1983-05-14 1983-06-22 Gen Electric Co Plc Vehicle control
US4551059A (en) 1983-11-02 1985-11-05 The United States Of America As Represented By The Secretary Of The Navy Multi-directional straddle-lift carrier
US4527651A (en) 1983-11-17 1985-07-09 Racine Federated Inc. Remote control handle assembly
US4665487A (en) 1984-05-25 1987-05-12 Kabushiki Kaisha Meidensha Unmanned vehicle control system and method
WO1987002483A1 (en) 1985-10-15 1987-04-23 Knepper Hans Reinhard Process and installation for the automatic control of a utility vehicle
US4716980A (en) 1986-02-14 1988-01-05 The Prime Mover Company Control system for rider vehicles
US4714140A (en) 1986-03-17 1987-12-22 Hatton John H Multi-axis articulated all terrain vehicle
US4785664A (en) 1986-04-28 1988-11-22 Kay-Ray, Inc. Ultrasonic sensor
US5457629A (en) 1989-01-31 1995-10-10 Norand Corporation Vehicle data system with common supply of data and power to vehicle devices
GB2197799A (en) 1986-10-08 1988-06-02 Synergistics Research A remote controller for a toy vehicle
US4954817A (en) 1988-05-02 1990-09-04 Levine Neil A Finger worn graphic interface device
JPH02152898A (en) 1988-12-06 1990-06-12 Komatsu Forklift Co Ltd Stock control method in material handling vehicle
US4967362A (en) 1989-01-30 1990-10-30 Eaton Corporation Automatic steering apparatus for crop vehicle
US5023790A (en) 1989-02-17 1991-06-11 Whs Robotics Automatic guided vehicle system
JP2636403B2 (en) 1989-03-08 1997-07-30 株式会社豊田自動織機製作所 Operation control device for unmanned vehicles
JP2728174B2 (en) * 1989-05-29 1998-03-18 マツダ株式会社 Forward car recognition device for mobile vehicles
FR2648842B1 (en) 1989-06-26 1992-05-15 Screg Routes & Travaux SECURITY SYSTEM FOR A MACHINE, PARTICULARLY FOR PUBLIC WORKS
US5107946A (en) 1989-07-26 1992-04-28 Honda Giken Kogyo Kabushiki Kaisha Steering control system for moving vehicle
US5044472A (en) 1989-12-05 1991-09-03 Crown Equipment Corporation Dual operator position for material handling vehicle
US5170351A (en) 1990-09-18 1992-12-08 Matushita Electric Industrial Co., Ltd. Automatic guided vehicle and method for controlling travel thereof
US5307271A (en) 1990-09-28 1994-04-26 The United States Of America As Represented By The Secretary Of The Navy Reflexive teleoperated control system for a remotely controlled vehicle
US5141381A (en) 1990-11-19 1992-08-25 Daifuku Co., Ltd. Safety arrangement for automatic warehousing facility
US5117935A (en) * 1990-12-21 1992-06-02 Caterpillar Inc. Load sensing hydrostatic steering system
WO1992015977A1 (en) 1991-03-04 1992-09-17 Sydec N.V. Selectively addressable programmable remote control system
DE4111736C1 (en) 1991-04-08 1992-08-27 Mannesmann Ag, 4000 Duesseldorf, De
US5245144A (en) 1991-10-25 1993-09-14 Crown Equipment Corporation Walk along hand grip switch control for pallet truck
JP3210121B2 (en) * 1992-02-10 2001-09-17 本田技研工業株式会社 Obstacle avoidance route search method for moving objects
US5502638A (en) 1992-02-10 1996-03-26 Honda Giken Kogyo Kabushiki Kaisha System for obstacle avoidance path planning for multiple-degree-of-freedom mechanism
RU2032926C1 (en) * 1992-11-24 1995-04-10 Лев Борисович Гурин Method of control over equidistant traffic of vehicles and method of determination of angular and linear-side deviation of vehicle from reference path
US5361861A (en) * 1993-01-29 1994-11-08 Trw Inc. Power steering system
EP0658467B1 (en) 1993-11-10 1997-11-26 Raymond Corporation Guidewire controls for a manned, material handling vehicle
JPH07138000A (en) 1993-11-15 1995-05-30 Shinko Electric Co Ltd Manned/unmanned load handling vehicle
DE4415736C2 (en) * 1994-05-04 2002-11-14 Siemens Ag Collision avoidance method using a steering angle field for an autonomous mobile unit
SE514791C2 (en) * 1994-06-06 2001-04-23 Electrolux Ab Improved method for locating lighthouses in self-propelled equipment
US5652486A (en) 1995-04-14 1997-07-29 S.L.O.W. Corporation Travel speed limiting system for forklift trucks
WO1996039679A1 (en) 1995-06-06 1996-12-12 Mrigank Shekhar Pointer device
US5709523A (en) 1995-06-07 1998-01-20 Ware; Emmet P. Material handling lift
JPH0995194A (en) 1995-09-29 1997-04-08 Aisin Seiki Co Ltd Detecting device for object in front of vehicle
JP3239727B2 (en) * 1995-12-05 2001-12-17 トヨタ自動車株式会社 Automatic driving control device for vehicles
DE19613386A1 (en) 1996-04-03 1997-10-09 Fiat Om Carrelli Elevatori Industrial truck, which can be operated either manually or automatically
CA2210037C (en) 1996-07-30 2001-01-23 The Raymond Corporation Motion control system for a materials handling vehicle
US5939986A (en) 1996-10-18 1999-08-17 The United States Of America As Represented By The United States Department Of Energy Mobile machine hazardous working zone warning system
US5816741A (en) 1997-04-03 1998-10-06 Ingersoll-Rand Company Remote control for walk-behind compactor
FR2764091B1 (en) 1997-05-30 1999-09-03 Peugeot REMOTE CONTROL AND OPERATING DEVICE FOR AT LEAST ONE URBAN VEHICLE, ESPECIALLY ELECTRIC
NL1007225C2 (en) * 1997-10-08 1999-04-09 Maasland Nv Vehicle combination.
US6033366A (en) 1997-10-14 2000-03-07 Data Sciences International, Inc. Pressure measurement device
DE19746700C2 (en) 1997-10-22 2000-01-13 Wacker Werke Kg Method and safety device for remote control of self-propelled work equipment
US6173215B1 (en) 1997-12-19 2001-01-09 Caterpillar Inc. Method for determining a desired response to detection of an obstacle
CN1178467C (en) * 1998-04-16 2004-12-01 三星电子株式会社 Method and apparatus for automatically tracing moving object
US6268803B1 (en) 1998-08-06 2001-07-31 Altra Technologies Incorporated System and method of avoiding collisions
US6030169A (en) 1998-08-07 2000-02-29 Clark Equipment Company Remote attachment control device for power machine
US6276485B1 (en) 1998-12-30 2001-08-21 Bt Industries Ab Device at tiller truck
WO2000073996A1 (en) * 1999-05-28 2000-12-07 Glebe Systems Pty Ltd Method and apparatus for tracking a moving object
US6226902B1 (en) 1999-07-16 2001-05-08 Case Corporation Operator presence system with bypass logic
GB2352859A (en) * 1999-07-31 2001-02-07 Ibm Automatic zone monitoring using two or more cameras
US6382359B1 (en) 1999-10-29 2002-05-07 Jungheinrich Aktiengesellschaft Hand/co-traveller lift truck with a holding bar
US6548982B1 (en) 1999-11-19 2003-04-15 Regents Of The University Of Minnesota Miniature robotic vehicles and methods of controlling same
US6686951B1 (en) 2000-02-28 2004-02-03 Case, Llc Crop row segmentation by K-means clustering for a vision guidance system
AT412196B (en) 2000-03-17 2004-11-25 Keba Ag METHOD FOR ASSIGNING A MOBILE OPERATING AND / OR OBSERVATION DEVICE TO A MACHINE AND OPERATING AND / OR OBSERVATION DEVICE THEREFOR
DE10015009B4 (en) 2000-03-20 2006-02-23 Jungheinrich Ag Industrial truck with a display, control and monitoring system
US20030205433A1 (en) 2001-05-03 2003-11-06 Hagman Earl L Variable straddle transporter lift with programmable height positions
DE10033857A1 (en) 2000-07-12 2002-01-24 Dambach Lagersysteme Gmbh Storage system operating device is remotely controllable by operator via remote control unit; controller positions device coarsely, control unit automatically performs storage/extraction
US6552661B1 (en) * 2000-08-25 2003-04-22 Rf Code, Inc. Zone based radio frequency identification
JP2002104800A (en) 2000-09-29 2002-04-10 Komatsu Forklift Co Ltd Remote control device for battery forklift truck
US20020163495A1 (en) 2001-05-02 2002-11-07 Plamen Doynov Multi-functional ergonomic interface
US6681638B2 (en) * 2001-05-04 2004-01-27 Homayoon Kazerooni Device and method for wireless material handling systems
US6464025B1 (en) 2001-05-15 2002-10-15 Crown Equipment Corporation Coast control for walkie/rider pallet truck
US6784800B2 (en) 2001-06-19 2004-08-31 Signal Tech Industrial vehicle safety system
JP3905727B2 (en) 2001-07-13 2007-04-18 日産自動車株式会社 Vehicle lane tracking control device
US6595306B2 (en) * 2001-08-09 2003-07-22 Crown Equipment Corporation Supplemental walk along control for walkie/rider pallet trucks
JP3865121B2 (en) 2001-10-31 2007-01-10 株式会社小松製作所 Vehicle obstacle detection device
JP2003231422A (en) * 2002-02-08 2003-08-19 Hitachi Ltd Automatic inter-vehicle distance control device and car
US6862537B2 (en) * 2002-03-21 2005-03-01 Ford Global Technologies Llc Sensor fusion system architecture
DE10218010A1 (en) * 2002-04-23 2003-11-06 Bosch Gmbh Robert Method and device for lateral guidance support in motor vehicles
US6748292B2 (en) 2002-07-15 2004-06-08 Distrobot Systems, Inc. Material handling method using autonomous mobile drive units and movable inventory trays
DE10232295A1 (en) * 2002-07-16 2004-02-05 Daimlerchrysler Ag Method for assisting the driver in driving maneuvers
US7076366B2 (en) 2002-09-06 2006-07-11 Steven Simon Object collision avoidance system for a vehicle
GB0229700D0 (en) 2002-12-19 2003-01-29 Koninkl Philips Electronics Nv Remote control system and authentication method
FR2849160B1 (en) * 2002-12-24 2005-03-18 Alm LIGHTING DEVICE AND USE THEREOF
GB2398394B (en) 2003-02-14 2006-05-17 Dyson Ltd An autonomous machine
US6801125B1 (en) * 2003-03-19 2004-10-05 Delphi Technologies, Inc. Rear steering hitch/docking mode
EP1462880A3 (en) 2003-03-24 2005-04-06 Fila Luxembourg S.a.r.l. Housing for electronic device wearable on user's finger
US6813557B2 (en) 2003-03-27 2004-11-02 Deere & Company Method and system for controlling a vehicle having multiple control modes
US7016783B2 (en) 2003-03-28 2006-03-21 Delphi Technologies, Inc. Collision avoidance with active steering and braking
US7510038B2 (en) * 2003-06-11 2009-03-31 Delphi Technologies, Inc. Steering system with lane keeping integration
US7042438B2 (en) 2003-09-06 2006-05-09 Mcrae Michael William Hand manipulated data apparatus for computers and video games
JP2005094425A (en) 2003-09-18 2005-04-07 Fuji Xerox Co Ltd Remote control device
US7047132B2 (en) 2004-01-12 2006-05-16 Steven Jacobs Mobile vehicle sensor array
JP4257230B2 (en) 2004-02-26 2009-04-22 株式会社東芝 Mobile robot
US7228231B2 (en) * 2004-04-29 2007-06-05 The Boeing Company Multiple stayout zones for ground-based bright object exclusion
US8075243B2 (en) 2004-05-03 2011-12-13 Jervis B. Webb Company Automatic transport loading system and method
JP2007536177A (en) 2004-05-03 2007-12-13 ジエービス・ビー・ウエブ・インターナショナル・カンパニー System and method for automatically loading a load onto a carrier
US7017689B2 (en) * 2004-05-06 2006-03-28 Crown Equipment Corporation Electrical steering assist for material handling vehicles
DE102004027250A1 (en) * 2004-06-03 2005-12-29 Magna Donnelly Gmbh & Co. Kg Method and device for assisted control of a motor vehicle
US7188015B2 (en) * 2004-07-14 2007-03-06 Trimble Navigation Limited Method and system for controlling a mobile machine
JP4143104B2 (en) * 2004-08-06 2008-09-03 本田技研工業株式会社 Vehicle control device
US8406845B2 (en) * 2004-09-01 2013-03-26 University Of Tennessee Research Foundation Method and apparatus for imaging tracking
US20060125806A1 (en) 2004-09-27 2006-06-15 The Regents Of The University Of Minnesota Human-activated displacement control appliance for use with computerized device/mechanism
JP4400418B2 (en) * 2004-10-29 2010-01-20 日産自動車株式会社 Inter-vehicle distance control device, inter-vehicle distance control method, driving operation support device, and driving operation support method
US7164118B2 (en) * 2004-10-29 2007-01-16 Deere & Company Method and system for obstacle detection
US20100063682A1 (en) 2004-11-19 2010-03-11 Akaki Tomihiro Overturning prevention device for forklift vehicle
KR20060059006A (en) 2004-11-26 2006-06-01 삼성전자주식회사 Method and apparatus of self-propelled mobile unit with obstacle avoidance during wall-following
JP2006259877A (en) 2005-03-15 2006-09-28 Daifuku Co Ltd Article conveyance equipment
JP4093261B2 (en) 2005-03-15 2008-06-04 松下電工株式会社 Autonomous mobile device
US7400415B2 (en) 2005-03-15 2008-07-15 Mitutoyo Corporation Operator interface apparatus and method for displacement transducer with selectable detector area
CA2531305A1 (en) 2005-04-25 2006-10-25 Lg Electronics Inc. Self-moving robot capable of correcting movement errors and method for correcting movement errors of the same
US20060250255A1 (en) 2005-05-06 2006-11-09 Flanagan Eugene E Paired child to parent separation distance monitoring and alarm system and method of same
WO2006138241A2 (en) 2005-06-13 2006-12-28 Maxbotix, Inc. Methods and device for ultrasonic range sensing
US7266477B2 (en) 2005-06-22 2007-09-04 Deere & Company Method and system for sensor signal fusion
DE102005045018A1 (en) * 2005-09-21 2007-03-22 Robert Bosch Gmbh Device for longitudinal guidance of a motor vehicle
CA2625885C (en) 2005-10-14 2016-09-13 Aethon, Inc. Robotic ordering and delivery system software and methods
US7477973B2 (en) 2005-10-15 2009-01-13 Trimble Navigation Ltd Vehicle gyro based steering assembly angle and angular rate sensor
JP4887980B2 (en) * 2005-11-09 2012-02-29 日産自動車株式会社 VEHICLE DRIVE OPERATION ASSISTANCE DEVICE AND VEHICLE WITH VEHICLE DRIVE OPERATION ASSISTANCE DEVICE
JP2007137126A (en) * 2005-11-15 2007-06-07 Mazda Motor Corp Obstacle detecting device for vehicle
US8050863B2 (en) 2006-03-16 2011-11-01 Gray & Company, Inc. Navigation and control system for autonomous vehicles
KR100776944B1 (en) * 2006-05-01 2007-11-21 주식회사 한울로보틱스 The map building method for mobile robot
US9645968B2 (en) 2006-09-14 2017-05-09 Crown Equipment Corporation Multiple zone sensing for materials handling vehicles
US9207673B2 (en) 2008-12-04 2015-12-08 Crown Equipment Corporation Finger-mounted apparatus for remotely controlling a materials handling vehicle
US8072309B2 (en) 2006-09-14 2011-12-06 Crown Equipment Corporation Systems and methods of remotely controlling a materials handling vehicle
CN104991554A (en) 2006-09-14 2015-10-21 克朗设备公司 Systems and methods of remotely controlling a materials handling vehicle
US8452464B2 (en) 2009-08-18 2013-05-28 Crown Equipment Corporation Steer correction for a remotely operated materials handling vehicle
JP4270259B2 (en) * 2006-10-05 2009-05-27 日産自動車株式会社 Obstacle avoidance control device
US8983765B2 (en) * 2006-10-11 2015-03-17 GM Global Technology Operations LLC Method and system for lane centering control
EP2963596A1 (en) 2006-12-13 2016-01-06 Crown Equipment Corporation Fleet management system
JP4905156B2 (en) 2007-01-26 2012-03-28 株式会社豊田自動織機 Industrial vehicle travel control device
KR20080073933A (en) * 2007-02-07 2008-08-12 삼성전자주식회사 Object tracking method and apparatus, and object pose information calculating method and apparatus
JP2008226140A (en) * 2007-03-15 2008-09-25 Mazda Motor Corp Vehicle operation support system
WO2008124657A1 (en) * 2007-04-05 2008-10-16 Power Curbers, Inc. Methods and systems utilizing 3d control to define a path of operation for a construction machine
DE102007027494B4 (en) 2007-06-14 Daimler Ag Method and apparatus for assisting the driver of a vehicle in vehicle guidance
US8195366B2 (en) 2007-09-13 2012-06-05 The Raymond Corporation Control system for a pallet truck
US8027029B2 (en) * 2007-11-07 2011-09-27 Magna Electronics Inc. Object detection and tracking system
EP2085279B1 (en) * 2008-01-29 2011-05-25 Ford Global Technologies, LLC A system for collision course prediction
JP4775391B2 (en) 2008-03-18 2011-09-21 株式会社デンソー Obstacle detection device
KR100946723B1 (en) * 2008-04-12 재단법인서울대학교산학협력재단 Steering method for a vehicle and apparatus thereof
US8170787B2 (en) 2008-04-15 2012-05-01 Caterpillar Inc. Vehicle collision avoidance system
JP4538762B2 (en) * 2008-05-20 2010-09-08 トヨタ自動車株式会社 Inter-vehicle distance control device
DE102008027282A1 (en) * 2008-06-06 Claas Industrietechnik Gmbh Agricultural vehicle and operating method therefor
US8190364B2 (en) * 2008-06-30 2012-05-29 Deere & Company System and method for providing towed implement compensation
KR100962529B1 (en) * 2008-07-22 한국전자통신연구원 Method for tracking an object
US8280560B2 (en) * 2008-07-24 2012-10-02 GM Global Technology Operations LLC Adaptive vehicle control system with driving style recognition based on headway distance
US8705792B2 (en) 2008-08-06 2014-04-22 Toyota Motor Engineering & Manufacturing North America, Inc. Object tracking using linear features
US8392065B2 (en) * 2008-09-11 2013-03-05 Deere & Company Leader-follower semi-autonomous vehicle with operator on side
US8229618B2 (en) * 2008-09-11 2012-07-24 Deere & Company Leader-follower fully autonomous vehicle with operator on side
WO2010065864A2 (en) 2008-12-04 2010-06-10 Crown Equipment Corporation Multiple zone sensing for materials handling vehicles
EP2381697B1 (en) 2008-12-24 2014-11-12 Doosan Infracore Co., Ltd. Remote control system and method for construction equipment
US8099214B2 (en) * 2009-02-09 2012-01-17 GM Global Technology Operations LLC Path planning for autonomous parking
US20100209888A1 (en) * 2009-02-18 2010-08-19 Gm Global Technology Operations, Inc. Vehicle stability enhancement control adaptation to driving skill based on curve-handling maneuvers
CA2765565C (en) 2009-07-02 2017-06-20 Crown Equipment Corporation Apparatus for remotely controlling a materials handling vehicle
US8120476B2 (en) * 2009-07-22 2012-02-21 International Truck Intellectual Property Company, Llc Digital camera rear-view system
US8731777B2 (en) 2009-08-18 2014-05-20 Crown Equipment Corporation Object tracking and steer maneuvers for materials handling vehicles
EP2467761B1 (en) 2009-08-18 2017-02-08 Crown Equipment Corporation Steer correction for a remotely operated materials handling vehicle
US11995208B2 (en) 2018-12-12 2024-05-28 Hewlett-Packard Development Company, L.P. Updates of machine learning models based on confidential data

Also Published As

Publication number Publication date
CA3004966C (en) 2018-09-18
CA3004966A1 (en) 2012-08-30
EP2678748B2 (en) 2024-02-07
IN2014CN03425A (en) 2015-07-03
EP2678748B1 (en) 2015-04-01
CN103392157A (en) 2013-11-13
AU2012220819B2 (en) 2015-06-11
US9493184B2 (en) 2016-11-15
KR102144781B1 (en) 2020-08-14
EP2905668A1 (en) 2015-08-12
WO2012115920A3 (en) 2012-11-15
EP2866114B1 (en) 2016-12-14
BR122014017960A2 (en) 2019-08-13
US20110166721A1 (en) 2011-07-07
CN103926929B (en) 2018-06-26
RU2013138708A (en) 2015-03-27
RU2578831C2 (en) 2016-03-27
US8731777B2 (en) 2014-05-20
CA3005016A1 (en) 2012-08-30
EP2866114A2 (en) 2015-04-29
EP2866113A2 (en) 2015-04-29
CA3005016C (en) 2019-01-08
US20140195121A1 (en) 2014-07-10
RU2578831C9 (en) 2016-08-20
US9002581B2 (en) 2015-04-07
EP2866113B1 (en) 2016-12-14
KR20200008046A (en) 2020-01-22
WO2012115920A2 (en) 2012-08-30
EP2905668B1 (en) 2016-12-14
KR101940469B1 (en) 2019-04-10
MX339969B (en) 2016-06-20
KR20140012993A (en) 2014-02-04
AU2012220819A1 (en) 2013-05-02
CA2827735A1 (en) 2012-08-30
KR20190104238A (en) 2019-09-06
KR102038848B1 (en) 2019-10-31
EP2678748A2 (en) 2014-01-01
EP2866113A3 (en) 2015-08-05
EP2889713B1 (en) 2016-12-14
US20130297151A1 (en) 2013-11-07
EP2889713A3 (en) 2015-08-05
EP2889713A2 (en) 2015-07-01
KR20140072912A (en) 2014-06-13
CN103392157B (en) 2016-08-10
EP2866114A3 (en) 2015-08-05
BR112013021044A2 (en) 2016-10-18
CN103926929A (en) 2014-07-16
CA2827735C (en) 2019-10-22

Similar Documents

Publication Publication Date Title
MX2013009769A (en) Object tracking and steer maneuvers for materials handling vehicles.
AU2015207833B9 (en) Steer control maneuvers for materials handling vehicles
AU2014268191B2 (en) Object tracking and steer maneuvers for materials handling vehicles

Legal Events

Date Code Title Description
FG Grant or registration