WO2018101962A1 - Autonomous storage container - Google Patents
- Publication number
- WO2018101962A1 (PCT/US2016/064688)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- controller
- storage container
- detecting
- propulsion source
- Prior art date
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0088—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0238—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
- G05D1/024—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
- G05D1/0255—Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultrasonic signals
Abstract
A storage container, such as luggage, includes a drive unit, controller, and sensors for detecting proximity of a user and for performing obstacle detection. The sensors detect the user's proximity, and the container traverses a trajectory following the user. In response to detecting movement outside of a prescribed range from the user, an alert may be generated. Likewise, if movement of the storage container does not correspond to instructions from the controller, an alert may also be generated. The controller of the storage container may connect to a controller of a vehicle, which may provide a target location to the storage container and may also transmit obstacle locations to it.
Description
Title: AUTONOMOUS STORAGE CONTAINER
BACKGROUND
FIELD OF THE INVENTION
[001] This invention relates to performing obstacle detection, such as for use in autonomous vehicles.
BACKGROUND OF THE INVENTION
[002] Luggage for an extended trip may be very heavy and may include multiple items. People with disabilities may have a particularly difficult time transporting luggage. In a large airport or train station, a person may have a large distance to traverse with luggage.
[003] The apparatus and methods disclosed herein provide an improved approach for implementing autonomous luggage.
BRIEF DESCRIPTION OF THE DRAWINGS
[004] In order that the advantages of the invention will be readily understood, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments illustrated in the appended drawings. Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered limiting of its scope, the invention will be described and explained with additional specificity and detail through use of the accompanying drawings, in which:
[005] Fig. 1 is a schematic block diagram of an autonomous storage container in accordance with an embodiment of the present invention;
[006] Fig. 2 is a schematic block diagram of an example computing device suitable for implementing methods in accordance with embodiments of the invention;
[007] Fig. 3 is a schematic diagram illustrating operation of an autonomous storage container in accordance with an embodiment of the present invention;
[008] Fig. 4A is a process flow diagram of a method for operating an autonomous storage container in accordance with an embodiment of the present invention;
[009] Fig. 4B is a process flow diagram of a method for operating an autonomous storage container in a public transit vehicle in accordance with an embodiment of the present invention; and
[0010] Fig. 5 is a schematic diagram illustrating operation of an autonomous storage container in combination with a public transit vehicle in accordance with an embodiment of the present invention.
DETAILED DESCRIPTION
[0011] Referring to Fig. 1, an autonomous storage container 100 may include a controller 102 implementing logic as described herein for autonomous movement of the storage container 100. The controller 102 may be operably coupled to a drive unit 104. The drive unit 104 may include a propulsion source such as an electric motor. The drive unit 104 may be coupled to driven wheels 106 and rotate the driven wheels 106 in order to drive the storage container 100 translationally over a surface. The drive unit 104 may also adjust the angle of the driven wheels 106 in order to steer the storage container 100. The storage container may include one or more other pairs of driven or non-driven
wheels 108. In some embodiments, the wheels 108 may be coupled to a steering actuator controlled by the controller 102.
[0012] The storage container may further include a battery 110 providing power to the drive unit 104 under the control of the controller 102.
[0013] The storage container 100 may include sensors for detecting a user for purposes of following the user. Such sensors may include a detector 112 embodied as an RFID (radio frequency identification) detector, BLUETOOTH transceiver, or other radio signal detector. These detectors 112 may detect a signal transmitted by a device carried by the user. Tracking of a user may also be performed using one or more cameras 114. Outputs of these cameras 114 may be analyzed by the controller 102 to perform facial or other recognition in order to locate and follow the user.
[0014] The storage container 100 may further include other sensors for performing obstacle detection. These sensors may include the cameras 114, a light detection and ranging (LIDAR) sensor 116, radio detection and ranging (RADAR) sensors 118, ultrasonic sensors 120, infrared sensors, or the like.
[0015] The sensing devices 112-120 may be disposed around a shell 122 or walls 122 of the storage container 100. Likewise, the wheels 106, 108 may be fastened to or protrude from a lower surface or floor of the storage container 100. In the illustrated embodiment, the drive unit 104, controller 102, and battery 110 are positioned on the bottom of the storage container; however, other arrangements are also possible.
[0016] In the illustrated embodiment, the sensing devices 112-120, controller 102, drive unit 104, and battery 110 may be disposed around a storage volume 124 defined by the storage container. Storage volume 124 may be accessible by means of an opening
selectively covered by means of a lid, fabric cover, or other closure mechanism. The storage volume 124 and the means for closing and accessing the storage volume 124 may be according to any means for implementing a storage container known in the art. In particular, the storage container 100 may be embodied as any type of suitcase known in the art, such as luggage that may be loaded in the cargo hold of an airliner. In still other embodiments, the storage container 100 may be replaced with a cart bearing a separate structure for containing items.
[0017] Fig. 2 is a block diagram illustrating an example computing device 200. Computing device 200 may be used to perform various procedures, such as those discussed herein. The controller 102 may have some or all of the attributes of the computing device 200.
[0018] Computing device 200 includes one or more processor(s) 202, one or more memory device(s) 204, one or more interface(s) 206, one or more mass storage device(s) 208, one or more Input/Output (I/O) device(s) 210, and a display device 230 all of which are coupled to a bus 212. Processor(s) 202 include one or more processors or controllers that execute instructions stored in memory device(s) 204 and/or mass storage device(s) 208. Processor(s) 202 may also include various types of computer-readable media, such as cache memory.
[0019] Memory device(s) 204 include various computer-readable media, such as volatile memory (e.g., random access memory (RAM) 214) and/or nonvolatile memory (e.g., read-only memory (ROM) 216). Memory device(s) 204 may also include rewritable ROM, such as Flash memory.
[0020] Mass storage device(s) 208 include various computer readable media, such
as magnetic tapes, magnetic disks, optical disks, solid-state memory (e.g., Flash memory), and so forth. As shown in Fig. 2, a particular mass storage device is a hard disk drive 224. Various drives may also be included in mass storage device(s) 208 to enable reading from and/or writing to the various computer readable media. Mass storage device(s) 208 include removable media 226 and/or non-removable media.
[0021] I/O device(s) 210 include various devices that allow data and/or other information to be input to or retrieved from computing device 200. Example I/O device(s) 210 include cursor control devices, keyboards, keypads, microphones, monitors or other display devices, speakers, network interface cards, modems, lenses, CCDs or other image capture devices, and the like.
[0022] Display device 230 includes any type of device capable of displaying information to one or more users of computing device 200. Examples of display device 230 include a monitor, display terminal, video projection device, and the like.
[0023] Interface(s) 206 include various interfaces that allow computing device 200 to interact with other systems, devices, or computing environments. Example interface(s) 206 include any number of different network interfaces 220, such as interfaces to local area networks (LANs), wide area networks (WANs), wireless networks, and the Internet. Other interface(s) include user interface 218 and peripheral device interface 222. The interface(s) 206 may also include one or more peripheral interfaces such as interfaces for pointing devices (mice, track pad, etc.), keyboards, and the like.
[0024] Bus 212 allows processor(s) 202, memory device(s) 204, interface(s) 206, mass storage device(s) 208, I/O device(s) 210, and display device 230 to communicate
with one another, as well as other devices or components coupled to bus 212. Bus 212 represents one or more of several types of bus structures, such as a system bus, PCI bus, IEEE 1394 bus, USB bus, and so forth.
[0025] For purposes of illustration, programs and other executable program components are shown herein as discrete blocks, although it is understood that such programs and components may reside at various times in different storage components of computing device 200, and are executed by processor(s) 202. Alternatively, the systems and procedures described herein can be implemented in hardware, or a combination of hardware, software, and/or firmware. For example, one or more application specific integrated circuits (ASICs) can be programmed to carry out one or more of the systems and procedures described herein.
[0026] Fig. 3 illustrates a scenario in which the storage container 100 may be used. A user 300 may traverse a trajectory 302. The controller 102 detects proximity of the user 300 according to outputs of the sensing devices 112, 114 and causes the storage container 100 to follow a trajectory 304 following the user 300. The controller may select the trajectory 304 according to outputs of the sensing devices 114-120 in order to avoid obstacles 306a-306c. The sensor 112 may have a range 308 within which the user 300 may be sensed; alternatively, the output of the sensors 112 or cameras 114 may be interpreted to determine a distance to the user, such that it may be detected whether the user is within the range 308. If the user 300 is found to be outside this range 308, an alert may be transmitted to the user 300.
[0027] Referring to Fig. 4A, the illustrated method 400a may be executed by the controller 102 in order to maintain the storage container 100 to follow within a desired distance from the user 300.
[0028] The method 400a may include tracking 402 the user 300. This may include detecting a location of the user over time using the sensing device 112, cameras 114, or by receiving the location of the user 300, such as from the smart phone or other mobile device carried by the user 300, such as in the form of a GPS (global positioning system) coordinate of the mobile device carried by the user 300. Step 402 may include performing facial, physiological, heat signature, or other individual recognition techniques to identify the user 300 in outputs of the one or more cameras 114.
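The disclosure does not specify how these tracking inputs (radio detector, camera recognition, phone-reported GPS) are combined. A minimal sketch of one way to fuse them per tracking step is shown below; the source-priority order and the smoothing factor are assumptions for illustration, not taken from the patent:

```python
def update_user_estimate(prev, fixes, alpha=0.6):
    """Fuse the available user-position fixes for one step of step 402.

    fixes maps a source name to an (x, y) fix or None. The preference
    order (radio, then camera, then phone GPS) and the exponential
    smoothing constant alpha are illustrative assumptions. The chosen
    fix is blended with the previous estimate to damp sensor noise;
    with no fix this step, the tracker coasts on the old estimate.
    """
    for source in ("radio", "camera", "gps"):
        fix = fixes.get(source)
        if fix is not None:
            if prev is None:
                return fix
            return (alpha * fix[0] + (1 - alpha) * prev[0],
                    alpha * fix[1] + (1 - alpha) * prev[1])
    return prev
```

A dropped camera frame or momentary radio shadow therefore does not reset the track; the estimate simply persists until a fresh fix arrives.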
[0029] The method 400a may further include computing 404 a trajectory to a location within some safe following distance from the tracked location of the user 300. For example, computing a trajectory from the current location of the storage container 100 to the user 300 may include calculating the trajectory using model predictive control (MPC) with convex constraints to avoid obstacles detected according to outputs of one or more of the sensors 118-120. A search algorithm may be used to generate the trajectory as well. The manner in which object detection and avoidance is performed by the controller 102 may be according to any method of autonomous vehicle operation known in the art.
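The MPC formulation is not detailed further. As one concrete instance of the "search algorithm" alternative mentioned above, a grid-based A* search toward any free cell near the tracked user could look like the following; the occupancy grid, unit step costs, and the stop-short radius are illustrative assumptions:

```python
import heapq

def plan_to_user(grid, start, user, follow_dist=1):
    """A* over an occupancy grid (True = obstacle).

    The goal is any free cell within follow_dist (Manhattan) of the
    user's cell, so the container stops at a safe following distance
    rather than on top of the user. Returns the path as a list of
    cells, or None if no goal cell is reachable.
    """
    rows, cols = len(grid), len(grid[0])

    def h(c):  # admissible Manhattan-distance heuristic
        return abs(c[0] - user[0]) + abs(c[1] - user[1])

    frontier = [(h(start), start)]
    came_from, cost = {start: None}, {start: 0}
    while frontier:
        _, cur = heapq.heappop(frontier)
        if h(cur) <= follow_dist:          # close enough to the user
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and not grid[nxt[0]][nxt[1]]:
                new_cost = cost[cur] + 1
                if new_cost < cost.get(nxt, float("inf")):
                    cost[nxt] = new_cost
                    came_from[nxt] = cur
                    heapq.heappush(frontier, (new_cost + h(nxt), nxt))
    return None
```

Re-planning on a newly detected obstacle (step 410 below) then amounts to marking the corresponding cells occupied and calling the planner again from the current cell.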
[0030] The method 400a may further include the controller 102 causing the drive unit 104 to at least partially traverse 406 the trajectory. In the event that an object is found 408 to be detected on the trajectory, the trajectory may be adjusted 410 and the
controller 102 may cause the drive unit 104 to at least partially traverse the adjusted trajectory.
[0031] The method 400a may include evaluating 412 whether a separation between the storage container 100 and the user 300 is outside the range 308. This may include detecting absence of a signal from the user 300, such as an RFID signal, BLUETOOTH signal, or other radio signal. Alternatively, this may include detecting a distance to the user 300 and determining that the distance exceeds a maximum distance. Step 412 may include receiving a GPS coordinate from a mobile device carried by the user 300 and comparing it to a GPS coordinate determined using a GPS receiver included in the controller 102 to determine whether the distance between the coordinates exceeds the maximum range.
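For the GPS-based variant of step 412, the coordinate comparison can be sketched with the haversine great-circle distance; the 5 m default is an assumed value, since the patent only speaks of a maximum range:

```python
import math

def separation_exceeded(container, user, max_range_m=5.0):
    """Compare two (lat, lon) fixes in decimal degrees against an
    allowed following range, using the haversine formula on a
    spherical Earth (mean radius 6371 km). Returns (exceeded, meters).
    """
    lat1, lon1, lat2, lon2 = map(math.radians, (*container, *user))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    dist_m = 2 * 6371000 * math.asin(math.sqrt(a))
    return dist_m > max_range_m, dist_m
```

At following distances of a few meters, consumer GPS error can exceed the range itself, which is presumably why the disclosure also offers radio-signal absence and direct distance sensing as alternatives.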
[0032] If the user 300 is found 412 to be outside of the allowable range 308, then the controller may generate 414 an alert. The alert may be an audible or visible alert on an output device mounted to the storage container 100. The alert may be a message transmitted to a mobile phone or other device carried by the user 300, which may then output an audible, visual, tactile, or other perceptible output indicating to the user that the storage container 100 is not properly tracking the user 300.
[0033] The method 400a may further include evaluating 416 whether the storage container 100 is not moving in correspondence with control inputs provided by the controller 102 to the drive unit 104, e.g. not accelerating, slowing, or turning in correspondence with instructions to perform such actions input by the controller 102 to the drive unit 104. For example, failure to move in response to control inputs may indicate interference, such as an attempted theft or breakdown of the drive unit 104. If
so, the controller 102 may generate 418 an alert, such as in the same manner as for step 414. In some embodiments an alert may continue to be sent with the location of the storage container 100, such as by means of a cellular data network, to enable tracking and recovery of the storage container 100.
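One way to realize the check of step 416 is to compare the commanded speed against the speed actually measured, e.g. from wheel odometry; the tolerance and dead-band thresholds below are illustrative assumptions, not values from the disclosure:

```python
def command_mismatch(commanded_speed, measured_speed,
                     tol=0.2, min_cmd=0.1):
    """Flag possible interference or drive failure (step 416).

    Returns True when motion is being commanded (above the min_cmd
    dead band) but the measured speed deviates from the command by
    more than the fractional tolerance tol. Both thresholds are
    illustrative and would need tuning for a real drive unit.
    """
    if abs(commanded_speed) < min_cmd:
        return False  # not commanding motion; nothing to compare
    return abs(measured_speed - commanded_speed) > tol * abs(commanded_speed)
```

In practice this predicate would be evaluated over a short window of samples before raising the alert of step 418, so that a single noisy odometry reading does not trigger a false theft alarm.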
[0034] The processing of steps 402-418 may continue over time until the user 300 instructs the controller 102 to cease following the user 300.
[0035] Referring to Figs. 4B and 5, the illustrated method 400b may be executed by the controller 102 when boarding a public transit vehicle 500, such as a bus, airplane, or other large vehicle. The vehicle 500 may include its own controller 502 as well as one or more obstacles 504a-504b.
[0036] The method 400b may include establishing 420 a wireless data connection to the controller 502 of the vehicle 500 and receiving 422 a target location from the controller 502. The target location may be an identifier of a beacon at the target location, a GPS coordinate, or an LPS (local positioning system) coordinate defined by beacons within the vehicle 500.
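Where the target is an LPS coordinate defined by beacons, a position fix can be computed by trilateration from three beacons with known positions and measured ranges. A planar sketch, assuming exact, noise-free range measurements:

```python
def lps_position(beacons, ranges):
    """2D trilateration from three beacons.

    beacons is a list of (x, y) beacon positions and ranges the
    measured distances to each. Subtracting the first beacon's circle
    equation from the other two linearizes the system, which is then
    solved for (x, y) by Cramer's rule. Real deployments would use
    more beacons and a least-squares fit to absorb range noise.
    """
    (x1, y1), r1 = beacons[0], ranges[0]
    a, b = [], []
    for (xi, yi), ri in zip(beacons[1:3], ranges[1:3]):
        a.append((2 * (xi - x1), 2 * (yi - y1)))
        b.append(r1**2 - ri**2 + xi**2 - x1**2 + yi**2 - y1**2)
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]  # beacons must not be collinear
    x = (b[0] * a[1][1] - b[1] * a[0][1]) / det
    y = (a[0][0] * b[1] - a[1][0] * b[0]) / det
    return x, y
```

The same routine serves whether the ranges come from BLUETOOTH beacon signal strength or ultrasonic time-of-flight, provided the beacon geometry inside the vehicle 500 is known to the controller 102.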
[0037] In response to receiving the target location, the controller 102 calculates 424 a trajectory 506 to the target location 508. The controller 502 may transmit sufficient information regarding an entrance 510 and a layout of the interior of the vehicle 500 to enable the controller 102 to determine the trajectory 506. Alternatively, the controller 502 may transmit an initial trajectory to the controller 102 in lieu of step 424.
[0038] The controller 102 then causes the drive unit 104 to urge 426 the storage container along the trajectory. The vehicle 500 may advantageously provide a ramp enabling entrance of the storage container 100 and the controller 502 may lower the ramp
in response to connecting to or otherwise detecting the storage container 100. Obstacle avoidance, separation, and failure to respond to control inputs may be performed at steps 408-416 in the same manner as for the method 400a. Accordingly, the user 300 may be required to move with the storage container 100 such that the separation does not exceed the perimeter. In other embodiments, steps 412-414 may be omitted such that the storage container 100 may move independently of the trajectory 512 of the user 300 to the storage location 508, which may be in a baggage area separated from the seating area by more than the perimeter distance. The baggage area may include storage and retention mechanisms for automatically retaining the container. The vehicle controller 502 may then release the storage container 100 upon arriving at a destination. The user 300 may then transmit an instruction to the storage container 100 to exit the vehicle 500 or to autonomously travel to the user's location.
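The boarding sequence of steps 420-426, including re-planning when an obstacle is found (cf. step 410), can be summarized as a small state machine; the state and event names here are illustrative labels, not terminology from the disclosure:

```python
from enum import Enum, auto

class BoardState(Enum):
    CONNECT = auto()       # establishing the link to controller 502 (step 420)
    AWAIT_TARGET = auto()  # waiting for the target location (step 422)
    PLAN = auto()          # computing the trajectory 506 (step 424)
    TRAVERSE = auto()      # drive unit urged along the trajectory (step 426)
    STOWED = auto()        # retained at the storage location 508

TRANSITIONS = {
    (BoardState.CONNECT, "connected"): BoardState.AWAIT_TARGET,
    (BoardState.AWAIT_TARGET, "target_received"): BoardState.PLAN,
    (BoardState.PLAN, "trajectory_ready"): BoardState.TRAVERSE,
    (BoardState.TRAVERSE, "obstacle"): BoardState.PLAN,   # re-plan, cf. step 410
    (BoardState.TRAVERSE, "arrived"): BoardState.STOWED,
}

def advance(state, event):
    """Return the next state; unrecognized events leave it unchanged."""
    return TRANSITIONS.get((state, event), state)
```

Release by the vehicle controller 502 at the destination would simply be a further transition out of STOWED, symmetric to the boarding path shown.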
[0039] In some embodiments, when the storage container 100 is within the vehicle 500, the controller 502 may detect the location of the storage container 100 or receive it from the controller 102. The vehicle 500 may incorporate sensors such as LIDAR, RADAR, ultrasound, or the like. The controller 502 may therefore perform one or both of detection of the user 300 and detection of obstacles and transmit the locations of the user and the obstacles to the controller 102. In this manner, battery power required to operate the sensing devices 114-20 may be reduced.

[0040] Although the systems and methods disclosed above describe autonomous operation of the storage container 100, other implementations are possible. For example, an application executing on a mobile device of the user 300 may receive control inputs instructing movement of the storage container 100. The mobile device may then transmit
these control inputs to the controller 102 over a wireless connection, such as BLUETOOTH. The controller 102 may then cause the drive unit 104 to move the storage container 100 according to the control inputs.
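The manual-control path described above might map received control inputs to drive-unit commands along the following lines. The command names and velocity values are illustrative assumptions: the disclosure specifies only that inputs arrive over a wireless link such as BLUETOOTH, not their encoding.

```python
# Hypothetical mapping from app control inputs to drive-unit commands,
# expressed as (linear velocity, angular velocity) pairs.
COMMANDS = {
    "forward": (1.0, 0.0),
    "reverse": (-0.5, 0.0),
    "left": (0.0, 0.5),
    "right": (0.0, -0.5),
    "stop": (0.0, 0.0),
}

def handle_control_input(name):
    """Translate a received control input into a (linear, angular)
    velocity pair for the drive unit; unrecognized inputs stop the
    container as a safe default."""
    return COMMANDS.get(name, COMMANDS["stop"])
```

Defaulting unknown inputs to "stop" reflects the fail-safe behavior a remote-controlled container would reasonably adopt, though the disclosure does not mandate it.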
[0041] In the above disclosure, reference has been made to the accompanying drawings, which form a part hereof, and in which is shown by way of illustration specific implementations in which the disclosure may be practiced. It is understood that other implementations may be utilized and structural changes may be made without departing from the scope of the present disclosure. References in the specification to "one embodiment," "an embodiment," "an example embodiment," etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
[0042] Implementations of the systems, devices, and methods disclosed herein may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed herein. Implementations within the scope of the present disclosure may also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are computer storage media (devices). Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, implementations of the disclosure can comprise at least two distinctly different kinds of computer-readable media: computer storage media (devices) and transmission media.
[0043] Computer storage media (devices) includes RAM, ROM, EEPROM, CD-ROM, solid state drives ("SSDs") (e.g., based on RAM), Flash memory, phase-change memory ("PCM"), other types of memory, other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
[0044] An implementation of the devices, systems, and methods disclosed herein may communicate over a computer network. A "network" is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired and wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a network and/or data links, which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
[0045] Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
[0046] Those skilled in the art will appreciate that the disclosure may be practiced in network computing environments with many types of computer system configurations, including an in-dash vehicle computer, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, various storage devices, and the like. The disclosure may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
[0047] Further, where appropriate, functions described herein can be performed in one or more of: hardware, software, firmware, digital components, or analog components. For example, one or more application specific integrated circuits (ASICs) can be programmed to carry out one or more of the systems and procedures described herein. Certain terms are used throughout the description and claims to refer to particular system components. As one skilled in the art will appreciate, components may be referred to by different names. This document does not intend to distinguish between components that differ in name, but not function.
[0048] It should be noted that the sensor embodiments discussed above may comprise computer hardware, software, firmware, or any combination thereof to perform at least a portion of their functions. For example, a sensor may include computer code configured to be executed in one or more processors, and may include hardware logic/electrical circuitry controlled by the computer code. These example devices are provided herein for purposes of illustration, and are not intended to be limiting. Embodiments of the present disclosure may be implemented in further types of devices, as would be known to persons skilled in the relevant art(s).
[0049] At least some embodiments of the disclosure have been directed to computer program products comprising such logic (e.g., in the form of software) stored on any computer useable medium. Such software, when executed in one or more data processing devices, causes a device to operate as described herein.
[0050] While various embodiments of the present disclosure have been described above, it should be understood that they have been presented by way of example only, and not limitation. It will be apparent to persons skilled in the relevant art that various
changes in form and detail can be made therein without departing from the spirit and scope of the disclosure. Thus, the breadth and scope of the present disclosure should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents. The foregoing description has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. Further, it should be noted that any or all of the aforementioned alternate implementations may be used in any combination desired to form additional hybrid implementations of the disclosure.
[0051] The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative, and not restrictive. The scope of the invention is, therefore, indicated by the appended claims, rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.
Claims
1. A method comprising:
providing a storage container having a propulsion source, one or more sensors and a controller;
detecting, by the controller, proximity of a user according to one or more outputs of the one or more sensors; and
causing, by the controller, the propulsion source to follow the user.
2. The method of claim 1, further comprising:
detecting, by the controller, presence of an obstructing object in the one or more outputs; and
in response to detecting the presence of the obstructing object, causing, by the controller, the propulsion source to traverse a trajectory around the obstructing object toward the user.
3. The method of claim 1, further comprising:
detecting, by the controller, separation of the storage container from the user by more than a threshold proximity according to the one or more outputs; and
in response to detecting separation of the storage container from the user by more than the threshold proximity, transmitting, by the controller, an alert to the user.
4. The method of claim 3, wherein detecting separation of the storage container from the user by more than the threshold proximity comprises:
detecting, by the controller, failure to detect a signal from a beacon carried by the user.
5. The method of claim 4, wherein detecting separation of the storage container from the user by more than the threshold proximity comprises:
detecting, by the controller, failure to receive a response from a radio frequency identifier (RFID) tag carried by the user.
6. The method of claim 1, further comprising:
detecting, by the controller, failure of the storage container to follow a trajectory instructed by the controller to the propulsion source; and
in response to detecting failure of the storage container to follow the trajectory instructed by the controller to the propulsion source, transmitting an alert to the user.
7. The method of claim 1, wherein the one or more sensors comprise at least one of a light detection and ranging (LIDAR) sensor and an ultrasonic sensor.
8. The method of claim 1, wherein causing the propulsion source to follow the user comprises:
calculating, by the controller, a trajectory to the user using model predictive control (MPC) with convex constraints to avoid obstacles detected according to the one or more outputs.
9. The method of claim 1, further comprising:
connecting, by the controller, to a public transit vehicle;
receiving, by the controller, a target location from the public transit vehicle; and
causing, by the controller, the propulsion source to drive the storage container to the target location.
10. The method of claim 1, wherein the one or more sensors include a camera; and
wherein detecting proximity of the user according to the one or more outputs comprises detecting a face of the user in an output of the camera.
11. An apparatus comprising:
a storage container defining a storage volume;
a propulsion source coupled to the storage container;
one or more sensors mounted to the storage container;
a controller coupled to the propulsion source and the one or more sensors, the controller programmed to:
detect proximity of a user according to one or more outputs of the one or more sensors; and
cause the propulsion source to follow the user.
12. The apparatus of claim 11, wherein the controller is further programmed to:
detect presence of an obstructing object in the one or more outputs; and
in response to detecting the presence of the obstructing object, cause the propulsion source to traverse a trajectory around the obstructing object toward the user.
13. The apparatus of claim 11, wherein the controller is further programmed to:
detect separation of the storage container from the user by more than a threshold proximity according to the one or more outputs; and
in response to detecting separation of the storage container from the user by more than the threshold proximity, transmit an alert to the user.
14. The apparatus of claim 13, wherein the controller is further programmed to detect separation of the storage container from the user by more than the threshold proximity by:
detecting failure to detect a signal from a beacon carried by the user.
15. The apparatus of claim 14, wherein the controller is further programmed to detect separation of the storage container from the user by more than the threshold proximity by:
detecting failure to receive a response from a radio frequency identifier (RFID) tag carried by the user.
16. The apparatus of claim 11, wherein the controller is further programmed to:
detect failure of the storage container to follow a trajectory instructed by the controller to the propulsion source; and
in response to detecting failure of the storage container to follow the trajectory instructed by the controller to the propulsion source, transmit an alert to the user.
17. The apparatus of claim 11, wherein the one or more sensors comprise at least one of a light detection and ranging (LIDAR) sensor and an ultrasonic sensor.
18. The apparatus of claim 11, wherein the controller is further programmed to cause the propulsion source to follow the user by:
calculating a trajectory to the user using model predictive control with convex constraints to avoid obstacles detected according to the one or more outputs.
19. The apparatus of claim 11, wherein the controller is further programmed to:
connect to a public transit vehicle;
receive a target location from the public transit vehicle; and
cause the propulsion source to drive the storage container to the target location.
20. The apparatus of claim 11, wherein the one or more sensors include a camera; and
wherein the controller is further programmed to detect proximity of the user according to the one or more outputs by detecting a face of the user in an output of the camera.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/US2016/064688 WO2018101962A1 (en) | 2016-12-02 | 2016-12-02 | Autonomous storage container |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018101962A1 true WO2018101962A1 (en) | 2018-06-07 |
Family
ID=62241843
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2018101962A1 (en) |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060170565A1 (en) * | 2004-07-30 | 2006-08-03 | Husak David J | Location virtualization in an RFID system |
US7742840B2 (en) * | 2004-12-14 | 2010-06-22 | Honda Motor Co., Ltd. | Autonomous mobile robot |
US20110288684A1 (en) * | 2010-05-20 | 2011-11-24 | Irobot Corporation | Mobile Robot System |
US20120085458A1 (en) * | 2010-10-12 | 2012-04-12 | Craig Edward Wenzel | Intelligent grain bag loader |
US20130035914A1 (en) * | 2010-04-26 | 2013-02-07 | Mitsubishi Electric Corporation | Servo controller |
US20150032252A1 (en) * | 2013-07-25 | 2015-01-29 | IAM Robotics, LLC | System and method for piece-picking or put-away with a mobile manipulation robot |
US9321591B2 (en) * | 2009-04-10 | 2016-04-26 | Symbotic, LLC | Autonomous transports for storage and retrieval systems |
US9400187B2 (en) * | 2014-11-10 | 2016-07-26 | Hyundai Mobis Co., Ltd. | Autonomous driving vehicle, autonomous driving management apparatus, and method of controlling the same |
US20160243970A1 (en) * | 2015-02-25 | 2016-08-25 | Haitham Eletrabi | Dual function robot and storage bin |
US20160260161A1 (en) * | 2015-03-06 | 2016-09-08 | Wal-Mart Stores, Inc. | Shopping facility assistance systems, devices and methods |
US20160290117A1 (en) * | 2013-12-06 | 2016-10-06 | Halliburton Energy Services, Inc. | Controlling a bottom hole assembly in a wellbore |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020141289A1 (en) | 2019-01-04 | 2020-07-09 | Balyo | Companion robot system comprising an autonomously guided machine |
FR3091609A1 (en) * | 2019-01-04 | 2020-07-10 | Balyo | Robot companion system comprising an autonomous guided machine |
US20200346352A1 (en) * | 2019-04-30 | 2020-11-05 | Lg Electronics Inc. | Cart robot having auto-follow function |
US11511785B2 (en) * | 2019-04-30 | 2022-11-29 | Lg Electronics Inc. | Cart robot with automatic following function |
US11585934B2 (en) * | 2019-04-30 | 2023-02-21 | Lg Electronics Inc. | Cart robot having auto-follow function |
EP4056454A1 (en) * | 2021-03-09 | 2022-09-14 | Goodrich Aerospace Services Private Limited | Trolley alert system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3525992B1 (en) | Mobile robot and robotic system comprising a server and the robot | |
KR102302239B1 (en) | Method of controlling cart-robot in restricted area and cart-robot of implementing thereof | |
Di Paola et al. | An autonomous mobile robotic system for surveillance of indoor environments | |
WO2018101962A1 (en) | Autonomous storage container | |
US20170220040A1 (en) | Smart luggage systems | |
US11269327B2 (en) | Picking up and dropping off passengers at an airport using an autonomous vehicle | |
JP7119871B2 (en) | Lost-and-found delivery support device, lost-and-found delivery support system, lost-and-found delivery support method, and computer program for lost-and-found delivery support | |
US11160340B2 (en) | Autonomous robot system | |
JP2020527266A (en) | Autonomous robot system | |
Fernández et al. | Autonomous navigation and obstacle avoidance of a micro-bus | |
US20190050732A1 (en) | Dynamic responsiveness prediction | |
EP3956191A1 (en) | Methods and systems for emergency handoff of an autonomous vehicle | |
US20190286928A1 (en) | Mobile micro-location | |
CA3170561A1 (en) | Artificial intelligence methods and systems for remote monitoring and control of autonomous vehicles | |
CN111052193B (en) | Anti-theft technique for autonomous vehicles for transporting goods | |
KR20210057886A (en) | Apparatus and method for preventing vehicle collision | |
US20220089187A1 (en) | Multi-layer autonomous vehicle control architecture | |
US20210137438A1 (en) | Control system for mobile robots | |
GB2583604A (en) | Collision prevention for autonomous vehicles | |
JP2019078617A (en) | Mobility device and method of environment sensing in mobility device | |
CN115588098A (en) | Robot sensor data management | |
WO2022221242A1 (en) | Systems and methods for robotic detection of escalators and moving walkways | |
Raj et al. | Ocularone: Exploring drones-based assistive technologies for the visually impaired | |
Binh et al. | Deep Learning-Based Object Tracking and Following for AGV Robot | |
Yang et al. | Autonomous mobile platform for enhanced situational awareness in Mass Casualty Incidents |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16922638 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 16922638 Country of ref document: EP Kind code of ref document: A1 |