WO2022091907A1 - Unmanned Delivery System and Unmanned Delivery Method (無人配送システム及び無人配送方法) - Google Patents
- Publication number
- WO2022091907A1 · PCT/JP2021/038760 · JP2021038760W
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- self-propelled
- operator
- delivery system
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/008—Manipulators for service tasks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/003—Controls for manipulators by means of an audio-responsive input
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/02—Hand grip control means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/06—Control stands, e.g. consoles, switchboards
- B25J13/065—Control stands, e.g. consoles, switchboards comprising joy-sticks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/088—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices with position, velocity or acceleration sensors
- B25J13/089—Determining the position of the robot with reference to its environment
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/023—Optical sensing devices including video camera means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/026—Acoustical sensing devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J5/00—Manipulators mounted on wheels or on carriages
- B25J5/007—Manipulators mounted on wheels or on carriages mounted on wheels
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1682—Dual arm manipulator; Coordination of several manipulators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1689—Teleoperation
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C39/00—Aircraft not otherwise provided for
- B64C39/02—Aircraft not otherwise provided for characterised by special use
- B64C39/024—Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U10/00—Type of UAV
- B64U10/10—Rotorcrafts
- B64U10/17—Helicopters
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U10/00—Type of UAV
- B64U10/20—Vertical take-off and landing [VTOL] aircraft
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0038—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/08—Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/08—Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
- G06Q10/083—Shipping
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/00174—Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys
- G07C9/00896—Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys specially adapted for particular uses
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/30—Individual registration on entry or exit not involving the use of a pass
- G07C9/32—Individual registration on entry or exit not involving the use of a pass in combination with an identity check
- G07C9/37—Individual registration on entry or exit not involving the use of a pass in combination with an identity check using biometric data, e.g. fingerprints, iris scans or voice recognition
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/20—UAVs specially adapted for particular uses or applications for use as communications relays, e.g. high-altitude platforms
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/60—UAVs specially adapted for particular uses or applications for transporting passengers; for transporting goods other than weapons
- B64U2101/64—UAVs specially adapted for particular uses or applications for transporting passengers; for transporting goods other than weapons for parcel delivery or retrieval
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/60—UAVs specially adapted for particular uses or applications for transporting passengers; for transporting goods other than weapons
- B64U2101/67—UAVs specially adapted for particular uses or applications for transporting passengers; for transporting goods other than weapons the UAVs comprising tethers for lowering the goods
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/10—UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/20—Remote controls
Definitions
- the present invention relates to an unmanned delivery system and an unmanned delivery method.
- a delivery system using a drone is known.
- a vehicle transports a package to the vicinity of a destination, and the package is transported from there to the destination by a drone.
- because the package is finally delivered to the destination by an unmanned flying object, it is difficult to hand the package to the recipient as smoothly as in the current delivery system, in which a vehicle and its driver make the delivery.
- This disclosure is made to solve the above-mentioned problems, and aims to provide a delivery system and a delivery method capable of smoothly delivering a package to a recipient.
- according to one aspect, the unmanned delivery system includes a self-propelled robot, an unmanned aerial vehicle that transports the baggage to an intermediate point on the delivery route, and a robot operator for remotely operating the self-propelled robot.
- the self-propelled robot includes a robot controller configured to control the self-propelled robot so that it delivers the baggage unloaded at the intermediate point to the delivery destination while switching between autonomous operation and remote operation according to the operation of the robot operator.
- according to another aspect, the unmanned delivery system includes a self-propelled robot, an unmanned aerial vehicle that transports the baggage and the self-propelled robot to an intermediate point on the delivery route, and a robot operator for remotely operating the self-propelled robot.
- here too, the self-propelled robot includes a robot controller configured to control the self-propelled robot so that it delivers the baggage unloaded at the intermediate point to the delivery destination while switching between autonomous operation and remote operation according to the operation of the robot operator.
- in the corresponding unmanned delivery method, the unmanned aerial vehicle transports the parcel to an intermediate point on the delivery route, and the operator remotely operates the self-propelled robot with the robot operator.
- the self-propelled robot delivers the package unloaded at the intermediate point to the delivery destination while switching between autonomous operation and remote operation according to the operation of the robot operator.
- in another unmanned delivery method, the unmanned aerial vehicle transports the luggage and the self-propelled robot to an intermediate point on the delivery route, and the operator remotely operates the self-propelled robot with the robot operator.
- the self-propelled robot then delivers the baggage unloaded at the intermediate point to the delivery destination while switching between autonomous operation and remote operation according to the operation of the robot operator.
- this disclosure has the effect of providing a delivery system and a delivery method capable of smoothly delivering a package to a recipient.
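The two-stage delivery flow described in the aspects above (drone to an intermediate point, then a self-propelled robot that switches between autonomous and remote operation) can be sketched as follows. This is an illustrative Python model only, not part of the patent; all names (`last_mile_delivery`, the waypoint strings) are hypothetical.

```python
from enum import Enum

class Mode(Enum):
    AUTONOMOUS = "autonomous"
    REMOTE = "remote"

def last_mile_delivery(route, operator_commands):
    """Walk the robot along the last-mile route from the intermediate
    point, switching to remote operation at any waypoint for which the
    operator has issued a command (hypothetical model of the claims)."""
    log = []
    for waypoint in route:
        mode = Mode.REMOTE if waypoint in operator_commands else Mode.AUTONOMOUS
        log.append((waypoint, mode))
    return log

# The drone unloads at the intermediate point; the robot then drives the
# route autonomously, with the operator taking over only at the front door.
log = last_mile_delivery(
    ["intermediate_point", "sidewalk", "front_door"],
    operator_commands={"front_door"},
)
```

The point of the sketch is the mode switch itself: autonomous operation is the default, and operator input overrides it only where needed.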
- FIG. 1 is a schematic diagram showing an example of a schematic configuration of an unmanned delivery system according to the first embodiment of the present disclosure.
- FIG. 2 is a perspective view showing an example of a detailed configuration of the operation unit of FIG. 1.
- FIG. 3 is a side view showing an example of the configuration of the self-propelled robot of FIG. 1.
- FIG. 4 is a functional block diagram showing an example of the configuration of the control system of the unmanned delivery system of FIG. 1.
- FIG. 5 is a schematic diagram showing an example of delivery data stored in the storage unit of the robot controller.
- FIG. 6 is a flowchart showing an example of the contents of autonomous operation / remote operation switching control.
- FIG. 7 is a flowchart showing an example of the operation of the unmanned delivery system of FIG. 1.
- FIG. 8A is a schematic diagram showing an example of the operation of the unmanned delivery system of FIG. 1 in order.
- FIG. 8B is a schematic diagram showing an example of the operation of the unmanned delivery system of FIG. 1 in order.
- FIG. 8C is a schematic diagram showing an example of the operation of the unmanned delivery system of FIG. 1 in order.
- FIG. 8D is a schematic diagram showing an example of the operation of the unmanned delivery system of FIG. 1 in order.
- FIG. 8E is a schematic diagram showing an example of the operation of the unmanned delivery system of FIG. 1 in order.
- FIG. 8F is a schematic diagram showing an example of the operation of the unmanned delivery system of FIG. 1 in order.
- FIG. 8G is a schematic diagram showing an example of the operation of the unmanned delivery system of FIG. 1 in order.
- FIG. 8H is a schematic diagram showing an example of the operation of the unmanned delivery system of FIG. 1 in order.
- FIG. 8I is a schematic diagram showing an example of the operation of the unmanned delivery system of FIG. 1 in order.
- FIG. 8J is a schematic diagram showing an example of the operation of the unmanned delivery system of FIG. 1 in order.
- FIG. 8K is a schematic diagram showing an example of the operation of the unmanned delivery system of FIG. 1 in order.
- FIG. 8L is a schematic diagram showing an example of the operation of the unmanned delivery system of FIG. 1 in order.
- FIG. 9A is a side view showing an example of the configuration of a self-propelled robot used in the unmanned delivery system according to the second embodiment of the present disclosure.
- FIG. 9B is a plan view showing an example of the configuration of a self-propelled robot used in the unmanned delivery system according to the second embodiment of the present disclosure.
- FIG. 1 is a schematic diagram showing an example of a schematic configuration of the unmanned delivery system 100 according to the first embodiment of the present disclosure.
- the unmanned delivery system 100 of the first embodiment includes an unmanned aerial vehicle 1, a self-propelled robot 2, and an operation unit 3.
- the unmanned aerial vehicle will be referred to as a drone.
- the unmanned delivery system 100 is configured so that the drone 1 transports the package to an intermediate point on the delivery route from the collection/delivery base 5 to the delivery destination 4, and the self-propelled robot 2 delivers the package unloaded at that intermediate point to the delivery destination 4 while switching between autonomous operation and remote operation according to the operation of the robot operator.
- the "self-propelled robot” may be simply referred to as a "robot”.
- a point in the middle of the delivery route means a point in the middle of delivering the package.
- the drone 1 may be any aircraft capable of transporting the cargo to be delivered and the self-propelled robot 2.
- An airplane and a helicopter are exemplified as the drone 1.
- airplanes here include VTOL aircraft (Vertical Take-Off and Landing aircraft) as well as aircraft that take off and land with a conventional runway roll.
- here, the drone 1 is a VTOL machine.
- the drone 1 has a hangar 16, shown in FIG. 8C, formed inside it.
- a storage shelf 17 is arranged in the hangar 16 so as to surround the central space.
- the hangar 16 is configured so that the self-propelled robot 2 is stored in the central space and can carry out the work of loading and unloading luggage to and from the storage shelf 17.
- the side wall at the rear of the drone 1 is provided with a carry-in/carry-out door 13 that opens and closes by pivoting in the front-rear direction about its lower end.
- the inner surface of the carry-in/carry-out door 13 is flat, and when the door opens and its tip touches the ground, it serves as a ramp for carrying the luggage G and the like in and out.
- the drone 1 is provided with an elevating device 11.
- the elevating device 11 is composed of a winch here; hereinafter, it is referred to as the winch 11.
- a drone controller 101 is arranged in the drone 1.
- the drone controller 101 includes a processor Pr3 and a memory Me3.
- FIG. 2 is a perspective view showing an example of a detailed configuration of the operation unit 3 of FIG. 1.
- FIG. 3 is a side view showing an example of the configuration of the self-propelled robot 2 of FIG. 1.
- the operation unit 3 is arranged in the operation room 39.
- the location of the operation unit 3 is not particularly limited.
- the operation unit 3 includes a robot operator 31 for operating the self-propelled robot 2, a drone operator 32 for operating the drone 1, an operator display 33, an operator microphone 34, an operator speaker 35, and an operator camera 36.
- the robot operator 31 includes a traveling unit operator 31A for operating the traveling unit 21 of the self-propelled robot 2 and an arm operator 31B for operating the robot arm 22 of the self-propelled robot 2.
- the traveling unit 21 may be a dolly.
- the arm operator 31B is also provided with an operation unit for operating the display robot arm 27 that supports the customer display 23.
- the robot operator 31 may be composed of various operating devices; here, for example, it is composed of joysticks.
- the robot operator 31 is arranged on the desk 37.
- the drone operator 32 is composed of, for example, control sticks of the kind used to steer an aircraft; here it is a joystick-shaped control stick.
- the drone operator 32 is provided with various operation units for operating the drone 1.
- the drone operator 32 is arranged on the desk 37.
- the operator display 33 is composed of, for example, a liquid crystal display.
- the operator display 33 displays an image including information necessary to be presented to the operator P1.
- examples of such images include an image captured by the field-of-view camera 26 of the self-propelled robot 2, a field-of-view image captured by a field-of-view camera (not shown) of the drone 1, information required to steer the drone 1 such as its position, speed, and amount of fuel, and navigation images.
- the operator display 33 is arranged on the desk 37.
- the operator speaker 35 provides voice information necessary for the operator P1.
- the operator speaker 35 is configured here with headphones, but may be configured in other forms.
- the operator microphone 34 acquires the voice of the operator P1. Although the operator microphone 34 is provided in the headphone 35 here, it may be configured in another form.
- the operator camera 36 captures the operator P1.
- the operator camera 36 is provided here on the operator display 33, but may be provided at another location.
- An operation unit controller 301 is arranged on the desk 37.
- the operation unit controller 301 includes a processor Pr1 and a memory Me1.
- when the drone 1 is flying, the operator P1 operates the drone operator 32 with the right hand to steer the drone 1; when the self-propelled robot 2 is operating, the operator P1 operates the traveling unit operator 31A and the arm operator 31B with the left and right hands, respectively, to operate the self-propelled robot 2.
- the operator P1 belongs to, for example, a courier company, and may be, for example, a delivery person in charge.
- alternatively, the operator P1 may be a dedicated operator rather than a delivery person.
- the robot 2 which is an example of a self-propelled robot may be a robot capable of autonomously traveling and handling luggage.
- the robot 2 includes a traveling unit 21 capable of autonomous traveling and a robot arm 22 provided on the traveling unit 21.
- the traveling unit 21 may be, for example, a dolly.
- the component that handles the luggage does not necessarily have to be a robot arm.
- the left direction and the right direction in the drawing are the front direction and the rear direction in the traveling direction, respectively.
- FIG. 3 shows the robot 2 in a simplified form.
- the robot arm 22 of the robot 2 is configured in the same manner as the dual-arm robot arm 22 of the robot 2A of the second embodiment shown in FIGS. 9A and 9B; that is, it is a dual-arm vertical articulated robot arm.
- however, while the robot arm 22 of the robot 2A of the second embodiment is a 4-axis vertical articulated robot arm, the robot arm 22 of the robot 2 of FIG. 3 is a 5-axis vertical articulated robot arm.
- referring to FIGS. 9A and 9B, the tip of each of the pair of robot arms 22 is provided with a grip portion 221, a wrist portion having three claws 222, and the pair of robot arms 22 grips the luggage G with this pair of grip portions 221.
- the traveling unit 21 of the robot 2 is actually provided with a rectangular parallelepiped vehicle body frame, and a luggage storage portion 212 is provided on the vehicle body frame so as to be movable in the front-rear direction.
- the body frame is covered with an appropriate case, and the front surface of the case has an opening through which the luggage storage portion 212 enters and exits.
- the luggage storage portion 212 is formed as a rectangular box with an open top; when luggage is not being loaded or unloaded, it is located at a retracted position where its front end surface is flush with the case, and when luggage is loaded or unloaded, it is located at a forward position protruding ahead of the case.
- a pair of front wheels 211, 211 and a pair of rear wheels 211, 211 are provided on the bottom of the traveling unit 21.
- for example, one of the pair of front wheels 211, 211 and the pair of rear wheels 211, 211 serves as the steered wheels, and one of them serves as the drive wheels.
- a storage battery 28 and a motor are mounted on the traveling unit 21, and the motor drives the drive wheels using the storage battery 28 as a power source.
- the above-mentioned luggage storage portion 212 is also slid back and forth by a predetermined drive mechanism.
- a display robot arm 27 is provided on the traveling unit 21, behind the robot arm 22.
- a customer display 23 is attached to the tip of the display robot arm 27.
- a customer microphone 24, a customer speaker 25, and a field-of-view camera 26 are provided at appropriate positions on the customer display 23.
- the display robot arm 27 is composed of, for example, a vertical articulated robot arm that can take any posture, so that the customer display 23, the customer microphone 24, the customer speaker 25, and the field-of-view camera 26 can be pointed in any direction.
- the customer display 23 is composed of, for example, a liquid crystal display. As shown in FIG. 8F, the customer display 23 displays an image containing information that needs to be presented to the recipient P2. As such an image, an image taken by the operator camera 36 and the like are exemplified.
- the customer speaker 25 provides voice information necessary for the recipient P2; the voice of the operator P1 acquired by the operator microphone 34 is an example of such voice information.
- the traveling unit 21 is provided with a robot controller 201.
- the robot controller 201 includes a processor Pr2 and a memory Me2.
- the robot 2 configured in this way is autonomously operated or remotely operated under the control of the robot controller 201; it can handle the luggage G with the robot arm 22 and move in a desired direction by means of the traveling unit 21.
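The autonomous/remote switching attributed to the robot controller 201 (and detailed in the flowchart of FIG. 6) might be modeled as below. This is a hedged sketch only; the class, method, and command names are invented for illustration and do not appear in the patent.

```python
class RobotControllerSketch:
    """Illustrative model of robot controller 201: any incoming robot
    operation signal puts the robot into remote operation; with no
    signal, it falls back to autonomous operation on its planned path."""

    def __init__(self):
        self.mode = "autonomous"

    def on_tick(self, robot_operation_signal=None):
        if robot_operation_signal is not None:
            self.mode = "remote"
            return robot_operation_signal   # obey the operator's command
        self.mode = "autonomous"
        return "follow_planned_path"        # obey the autonomous planner

ctrl = RobotControllerSketch()
cmd1 = ctrl.on_tick()              # no operator input -> autonomous
cmd2 = ctrl.on_tick("turn_left")   # operator input -> remote takeover
```

The design choice mirrored here is that remote operation is event-driven: the operator never has to "request" control, because any operation of the robot operator 31 immediately overrides autonomy.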
- FIG. 4 is a functional block diagram showing an example of the configuration of the control system of the unmanned delivery system 100 of FIG. 1.
- the unmanned delivery system 100 includes an operation unit controller 301, a robot controller 201, and a drone controller 101.
- the operation unit controller 301 includes a robot operation signal generation unit 302, a drone operation signal generation unit 303, a display control unit 304, a microphone IF 305, a headphone IF 306, an operation unit communication unit 307, and a camera control unit 308.
- the operation unit communication unit 307 is composed of a communication device capable of data communication.
- in the operation unit controller 301, the robot operation signal generation unit 302, the drone operation signal generation unit 303, the display control unit 304, the microphone IF 305, the headphone IF 306, and the camera control unit 308 are functional blocks of an arithmetic unit having the processor Pr1 and the memory Me1, realized by the processor Pr1 executing a control program stored in the memory Me1.
- this arithmetic unit is composed of, for example, a microcontroller, an MPU, an FPGA (Field Programmable Gate Array), a PLC (Programmable Logic Controller), or the like. These may be composed of a single arithmetic unit that performs centralized control, or may be configured by a plurality of arithmetic units that perform distributed control.
- the robot operation signal generation unit 302 generates a robot operation signal in response to the operation of the robot operator 31.
- the drone operation signal generation unit 303 generates a drone operation signal in response to the operation of the drone operation device 32.
- the display control unit 304 causes the operator display 33 to display an image corresponding to the image signal transmitted from the operation unit communication unit 307.
- the microphone IF 305 converts the voice acquired by the operator microphone 34 into an appropriate voice signal.
- the headphone IF 306 causes the operator speaker to emit voice in response to the voice signal transmitted from the operation unit communication unit 307.
- the camera control unit 308 generates an image signal of the image captured by the operator camera 36.
- the operation unit communication unit 307 converts the robot operation signal transmitted from the robot operation signal generation unit 302, the drone operation signal transmitted from the drone operation signal generation unit 303, the audio signal transmitted from the microphone IF 305, and the image signal transmitted from the camera control unit 308 into wireless communication signals and transmits them wirelessly.
- the operation unit communication unit 307 also receives the wireless communication signal transmitted from the robot communication unit 202, converts it into an image signal or an audio signal, transmits the image signal to the display control unit 304, and transmits the audio signal to the headphone IF 306. Further, the operation unit communication unit 307 receives the wireless communication signal transmitted from the drone communication unit 102, converts it into an information signal, and transmits this to the display control unit 304.
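The receive-side routing of the operation unit communication unit 307 amounts to a dispatch on payload type. The sketch below is illustrative only: the `(kind, payload)` tuple encoding and the destination names are assumptions, not the patent's wire format.

```python
def route_incoming(frame):
    """Dispatch a decoded wireless frame to the functional block that
    consumes it, mirroring the routing described for unit 307:
    images and drone status to the display, audio to the headphones."""
    kind, payload = frame
    if kind == "image":
        return ("display_control_unit_304", payload)
    if kind == "audio":
        return ("headphone_if_306", payload)   # emitted on operator speaker
    if kind == "info":                         # status signal from the drone
        return ("display_control_unit_304", payload)
    raise ValueError(f"unknown frame kind: {kind!r}")

dest, _ = route_incoming(("audio", "voice-from-robot"))
```

A lookup table keyed on `kind` would serve equally well; the explicit branches are kept to match the prose, which lists each destination separately.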
- the robot controller 201 includes a robot communication unit 202, a robot control unit 203, and a storage unit 204.
- the robot communication unit 202 is composed of a communication device capable of data communication.
- the robot control unit 203 and the storage unit 204 are composed of an arithmetic unit having a processor Pr2 and a memory Me2.
- the robot control unit 203 and the storage unit 204 are functional blocks realized by the processor Pr2 executing the control program stored in the memory Me2 in this arithmetic unit.
- this arithmetic unit is composed of, for example, a microcontroller, an MPU, an FPGA (Field Programmable Gate Array), a PLC (Programmable Logic Controller), or the like. These may be composed of a single arithmetic unit that performs centralized control, or may be configured by a plurality of arithmetic units that perform distributed control.
- the robot communication unit 202 receives the wireless communication signal transmitted from the operation unit communication unit 307, converts it into a robot operation signal, an image signal, or a voice signal, and transmits these signals to the robot control unit 203.
- the robot control unit 203 controls the operation of the robot 2 according to the robot operation signal, causes the customer display 23 to display the image corresponding to the image signal, and causes the customer speaker 25 to emit the voice corresponding to the voice signal.
- the drone controller 101 includes a drone communication unit 102 and a drone control unit 103.
- the drone communication unit 102 is composed of a communication device capable of data communication.
- the drone control unit 103 is composed of an arithmetic unit having a processor Pr3 and a memory Me3.
- the drone control unit 103 is a functional block realized by the processor Pr3 executing a control program stored in the memory Me3 in this arithmetic unit.
- this arithmetic unit is composed of, for example, a microcontroller, an MPU, an FPGA (Field Programmable Gate Array), a PLC (Programmable Logic Controller), or the like. These may be composed of a single arithmetic unit that performs centralized control, or may be configured by a plurality of arithmetic units that perform distributed control.
- the drone communication unit 102 receives the wireless communication signal transmitted from the operation unit communication unit 307, converts it into a drone operation signal, and transmits this to the drone control unit 103. Further, the drone communication unit 102 converts the information signal transmitted from the drone control unit 103 into a wireless communication signal and transmits it wirelessly.
- the drone control unit 103 controls the operation of the drone main body 12 and the elevating device 11 of the drone 1 in response to the drone operation signal transmitted from the drone communication unit 102.
- the drone control unit 103 transmits to the drone communication unit 102, as information signals, a field-of-view image captured by a field-of-view camera (not shown) of the drone 1, information such as the position, speed, and fuel amount necessary for maneuvering the drone 1, a navigation image, and the like.
- the functions of the elements disclosed herein can be performed using circuits or processing circuits that include general-purpose processors configured or programmed to perform the disclosed functions, dedicated processors, integrated circuits, ASICs (Application Specific Integrated Circuits), conventional circuits, and/or combinations thereof.
- a processor is considered a processing circuit or circuit because it includes transistors and other circuits.
- a "unit" or "part" is hardware that performs the listed functions or is programmed to perform the listed functions.
- the hardware may be the hardware disclosed herein, or it may be other known hardware that is programmed or configured to perform the listed functions.
- when the hardware is a processor, which is regarded as a kind of circuit, the "unit" or "part" is a combination of hardware and software, and the software is used to configure the hardware and/or the processor.
- FIG. 5 is a schematic diagram showing an example of delivery data D stored in the storage unit 204 of the robot controller 201.
- the delivery data D includes, for example, the delivery address data D1, the authentication face image data D2, and the map data D3.
- the delivery address data D1 is a list of delivery addresses.
- the face image data D2 for authentication is face image data of the recipient P2 at the delivery destination, is acquired from the delivery requester when accepting the delivery, and is stored in the storage unit 204 of the robot controller 201. This face image data for authentication is stored in association with the delivery address data D1.
- the map data D3 is used for delivery by the robot 2.
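The delivery data D described above (delivery addresses D1, authentication face images D2 associated with them, and map data D3) can be sketched as a simple data structure. This is a hypothetical illustration; the field names and types are assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the delivery data D (D1-D3) held in the storage unit 204.
@dataclass
class DeliveryData:
    delivery_addresses: list = field(default_factory=list)  # D1: list of delivery addresses
    auth_face_images: dict = field(default_factory=dict)    # D2: face image keyed by address
    map_data: bytes = b""                                   # D3: map data used for delivery

d = DeliveryData()
d.delivery_addresses.append("destination-4")
# D2 is stored in association with the delivery address data D1:
d.auth_face_images["destination-4"] = b"<face image of recipient P2>"
print(d.delivery_addresses[0])  # destination-4
```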
- the robot control unit 203 of the robot controller 201 switches and controls the robot 2 between autonomous operation and remote operation.
- Remote operation means operation according to the operation of the robot operator 31, specifically, the robot operation signal.
- FIG. 6 is a flowchart showing an example of the contents of this autonomous operation / remote operation switching control.
- the robot control unit 203 causes the robot 2 to perform autonomous operation, that is, autonomous traveling (step S1).
- the robot control unit 203 determines whether or not a remote command has been input (step S2).
- the remote command is included in the robot operation signal.
- when a remote command is input (YES in step S2), the robot control unit 203 causes the robot 2 to perform remote operation, that is, to operate under remote control (step S5).
- if no remote command is input (NO in step S2), the robot control unit 203 determines whether or not a predetermined condition is satisfied (step S3).
- this predetermined condition is, for example, that the route to the delivery destination 4 of the package is a rough road 6 as shown in FIG. 8F, or that a person approaches the robot 2.
- when the predetermined condition is satisfied (YES in step S3), the robot control unit 203 causes the robot 2 to perform remote operation (step S5).
- the robot control unit 203 determines whether or not the end command has been input (step S4).
- the end command is included in the robot operation signal.
- if the end command is not input (NO in step S4), the robot control unit 203 returns this control to step S1.
- if the end command is input (YES in step S4), the robot control unit 203 ends this control.
- when remote operation is performed in step S5, the robot control unit 203 determines whether or not an autonomous command has been input (step S6).
- the autonomous command is included in the robot operation signal.
- when the autonomous command is input (YES in step S6), the robot control unit 203 returns this control to step S1.
- if the autonomous command is not input (NO in step S6), the robot control unit 203 determines whether or not an authentication command has been input (step S7).
- the authentication command is included in the robot operation signal.
- when the authentication command is input (YES in step S7), the robot control unit 203 performs face recognition (step S8).
- the face recognition is performed by the robot control unit 203 collating the face image data stored in the storage unit 204 with the image of the recipient P2 captured by the field-of-view camera 26.
- a well-known method can be used for the face recognition, so its description is omitted here.
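Although the patent leaves the face-recognition method to well-known techniques, the collation step can be sketched generically as comparing feature vectors against a threshold. Everything here is an illustrative assumption: the feature extractor is a stand-in stub, not a real face-embedding model, and the threshold value is invented.

```python
import math

# Hypothetical stand-in for a trained face-embedding model (assumption).
def extract_features(image: bytes):
    return [b / 255.0 for b in image[:8].ljust(8, b"\0")]

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a)) or 1.0
    nb = math.sqrt(sum(x * x for x in b)) or 1.0
    return dot / (na * nb)

# Collate the stored authentication image (D2) with the captured image;
# the face recognition "succeeds" when the similarity clears a threshold.
def face_recognition_succeeds(stored: bytes, captured: bytes, threshold=0.9):
    return cosine_similarity(extract_features(stored), extract_features(captured)) >= threshold

print(face_recognition_succeeds(b"recipient-P2", b"recipient-P2"))  # True
```

Under this sketch, identical images always match; a real system would of course use a proper face-recognition pipeline.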
- after the face recognition, the robot control unit 203 returns the robot 2 to remote operation (step S5).
- the subsequent handling is then carried out appropriately through dialogue between the operator P1 and the recipient P2.
- when the authentication command is not input (NO in step S7), the robot control unit 203 determines whether or not the end command has been input (step S9).
- if the end command is not input (NO in step S9), the robot control unit 203 returns this control to step S5.
- if the end command is input (YES in step S9), the robot control unit 203 ends this control.
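The switching control of FIG. 6 (steps S1 to S9) can be summarized as a small state machine. This is a minimal sketch; the command names and the condition callback are assumptions made for illustration, not taken from the patent.

```python
# Sketch of the autonomous/remote switching control (FIG. 6, steps S1-S9).
class SwitchingControl:
    def __init__(self, rough_road=False, person_near=False):
        self.mode = "autonomous"   # step S1: start with autonomous operation
        self.rough_road = rough_road
        self.person_near = person_near
        self.finished = False

    def predetermined_condition(self):
        # step S3: e.g. the route is the rough road 6, or a person approaches
        return self.rough_road or self.person_near

    def handle(self, command=None):
        if self.finished:
            return
        if self.mode == "autonomous":
            if command == "remote":               # S2: remote command -> S5
                self.mode = "remote"
            elif self.predetermined_condition():  # S3: condition met -> S5
                self.mode = "remote"
            elif command == "end":                # S4: end command -> finish
                self.finished = True
        else:
            if command == "autonomous":           # S6: autonomous command -> S1
                self.mode = "autonomous"
            elif command == "authenticate":       # S7/S8: face recognition, then
                pass                              # control returns to remote (S5)
            elif command == "end":                # S9: end command -> finish
                self.finished = True

ctrl = SwitchingControl(rough_road=True)
ctrl.handle()     # the rough road satisfies the predetermined condition
print(ctrl.mode)  # remote
```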
- next, person-avoidance control is described. The robot control unit 203 processes the image captured by the field-of-view camera 26 to determine whether or not a person is present in the image. Since methods of extracting a person from an image by image processing are well known, their description is omitted here.
- when the image of a person extracted from the captured image approaches the field-of-view camera, the robot control unit 203 moves the robot 2 in the direction opposite to the person. Whether or not the person's image is approaching the field-of-view camera is determined, for example, from the size of the person's image and the speed at which it enlarges.
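The approach judgment above can be sketched numerically: the person is treated as approaching when their image is already large and is growing quickly between frames, and the retreat direction is simply the person's bearing rotated by 180 degrees. The threshold values here are invented for illustration.

```python
# Hypothetical sketch of the person-avoidance judgment (assumed thresholds).
def person_approaching(prev_box_area, curr_box_area, min_area=5000.0, min_growth=1.10):
    if curr_box_area < min_area:          # image still small: person is far away
        return False
    growth = curr_box_area / prev_box_area if prev_box_area else float("inf")
    return growth >= min_growth           # image enlarging quickly: approaching

def avoidance_velocity(person_bearing_deg, speed=0.5):
    # Move in the direction opposite to the person (bearing rotated 180 degrees).
    return ((person_bearing_deg + 180.0) % 360.0, speed)

print(person_approaching(4000.0, 6000.0))  # grew 1.5x and is large enough -> True
print(avoidance_velocity(30.0))            # retreat bearing 210.0 at 0.5 m/s
```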
- FIG. 7 is a flowchart showing an example of the operation of the unmanned delivery system 100 of FIG. 1. FIGS. 8A to 8L are schematic views showing, in order, an example of the operation of the unmanned delivery system 100 of FIG. 1.
- in this operation, the drone 1 is operated by the operator P1, and the robot 2 is autonomously or remotely operated by the robot control unit 203 of the robot controller 201.
- loading is performed at the collection/delivery base 5 (step S11). There are three modes of this loading.
- in the first mode, the loading/unloading door 13 of the drone 1 is opened by the operator P1, and the luggage G is carried into the drone 1 by the transport vehicle 14 through the loading/unloading door 13.
- the robot 2 gets on the drone 1 through the carry-in / out door 13.
- in the second mode, the luggage G is carried into the drone 1 by the transport vehicle 14 as in the first mode.
- the robot 2 is mounted on the drone 1 by the winch 11.
- the drone 1 is put into a hovering state, that is, a stopped flight state, and the elevating door 15 is opened.
- hanging portions for hooking the tips of the wires of the winch 11 are provided on the traveling unit 21 of the robot 2.
- the robot 2, operating autonomously, hooks the hooks at the tips of the wires onto its hanging portions by itself. The robot 2 then takes a predetermined storage posture as shown in FIG. 8B.
- sensors are provided at the four hanging portions of the traveling unit 21 of the robot 2, and the robot control unit 203 confirms from the sensor signals that the hooks at the tips of the wires are hooked on the hanging portions. A signal to that effect is then transmitted to the operation unit communication unit 307, and this information is displayed on the operator display 33.
- the operator P1 winds up the winch 11 and mounts the robot 2 on the drone 1. After that, the elevating door 15 is closed.
- in the third mode, the robot 2 accommodates the luggage G in the accommodating portion 212 and is mounted on the drone 1 by the winch 11 as in the second mode.
- the robot 2 places the carried-in luggage G on the luggage storage shelf 17 in the hangar 16 by remote operation.
- when the luggage G is stored in the luggage accommodating portion 212 as in the third mode, the luggage G is taken out from the accommodating portion 212 and placed on the luggage storage shelf 17.
- the robot 2 charges the storage battery 28 from the drone 1 by autonomous operation, then fixes itself in the hangar 16 by appropriate means and takes the above-mentioned predetermined storage posture.
- the luggage G and the robot 2 are then airlifted (step S12).
- the package G is delivered to a plurality of destinations 4.
- unloading is performed at a point on the way to the delivery destination 4 (step S13).
- this unloading is performed by putting the drone 1 into a hovering state and lowering the robot 2 with the winch 11. The operator P1 performs this descent while checking the state of the ground in the field-of-view image captured by the field-of-view camera of the drone 1 and displayed on the operator display 33. This is to ensure safety.
- in this case, the altitude of the drone 1 is kept at or above a predetermined altitude.
- the predetermined altitude is set as appropriate; for example, it is set to 20 m.
- the robot 2 releases the storage posture by autonomous operation and then stores the luggage G to be delivered in the luggage accommodating portion 212 by remote operation.
- the luggage G is transported by the robot 2 to the destination 4 on the ground (step S14).
- the drone 1 waits for the return of the robot 2 in the sky.
- the robot 2 travels on suburban roads by autonomous operation while referring to the map data. When it encounters the rough road 6 on the way, it is switched to remote operation and travels according to the operation of the operator P1.
- when the robot 2 arrives at the delivery destination 4, the luggage G is handed over (step S15).
- the robot 2 is switched to remote operation by the operation of the operator P1, presses the intercom of the delivery destination 4, and performs face recognition when the recipient, that is, the customer P2, appears. Then, when the recipient P2 approaches, the robot 2 automatically stops and does not move unless triggered. From there, the robot 2 automatically switches to remote operation and hands the luggage G to the recipient P2. At this time, the robot 2 automatically takes a predetermined luggage hand-over posture as shown in FIG. 8H. If the recipient P2 comes too close, the robot 2 automatically moves in the direction opposite to the recipient P2.
- the robot 2 interacts with the recipient P2.
- specifically, the robot control unit 203 causes the customer speaker 25 to emit the voice of the operator P1 acquired by the operator microphone 34, and displays the image of the operator P1 captured by the operator camera 36 on the customer display 23.
- further, the voice of the recipient P2 acquired by the customer microphone 24 is emitted from the operator speaker 35, and the image of the recipient P2 captured by the field-of-view camera 26 is displayed on the customer display 23.
- This dialogue is, for example, as follows.
- after the hand-over, the robot 2 returns to the unloading point in the same manner as on the outward route (step S16). Then, the robot 2 is mounted on the waiting drone 1 (step S17). The mounting mode of the robot 2 is the same as the second mode of loading in step S11.
- in this case, for example, the delivery destination 4 is a room in a high-rise condominium.
- when the drone 1 reaches the sky above the high-rise condominium, it lowers the robot 2 to the rooftop. There are two modes of this descent.
- the first descent mode is the same as when the destination 4 is in the suburbs.
- in the second descent mode, the drone 1 lands on the rooftop, and the robot 2 descends to the rooftop through the opened loading/unloading door 13.
- next, the luggage G is transported by the robot 2 to the destination 4 in the condominium, that is, transported over the ground (step S14).
- the drone 1 waits for the return of the robot 2 in the sky.
- the robot 2 is remotely controlled.
- the robot 2 descends to the target floor using the elevator of the high-rise condominium.
- the elevator door is opened and closed wirelessly by the robot 2.
- the robot 2 returns to the rooftop by autonomous operation, with remote operation interposed as appropriate. Then, the robot 2 is mounted on the waiting drone 1 (step S17). The mounting mode of the robot 2 is the same as the second mode of loading in step S11.
- in Modification 1, the robot 2 is arranged in advance at a point on the way to the above-mentioned destination 4. In this case, the robot 2 may stay in the field or may be collected by the drone 1.
- as described above, the luggage G can be smoothly handed over to the recipient P2. Further, since relatively easy work of the robot 2 is performed by autonomous operation and relatively difficult work by remote operation, unmanned delivery can be performed more easily.
- the unmanned delivery system of the second embodiment differs from the unmanned delivery system 100 of the first embodiment in that a robot 2A is used instead of the robot 2 of the first embodiment; in other respects it is the same as the unmanned delivery system 100 of the first embodiment.
- FIG. 9A is a side view showing an example of the configuration of the robot 2A used in the unmanned delivery system according to the second embodiment of the present disclosure.
- FIG. 9B is a plan view showing an example of the configuration of the robot 2A used in the unmanned delivery system according to the second embodiment of the present disclosure.
- the robot 2A includes a traveling unit 21 and a pair of robot arms 22 provided on the traveling unit 21.
- the traveling unit 21 may be a dolly.
- Each of the pair of robot arms 22 is composed of a four-axis vertical articulated robot arm. That is, each robot arm 22 has a first link L1 that is rotatable around a vertical first rotation axis Ax1. This first link L1 is common to both robot arms 22.
- the base end of the second link L2 is rotatably provided around the second rotation axis Ax2 perpendicular to the first rotation axis Ax1.
- the base end portion of the third link L3 is rotatably provided around the third rotation axis Ax3 perpendicular to the second rotation axis Ax2.
- the base end portion of the fourth link L4 is rotatably provided around the fourth rotation axis Ax4 perpendicular to the third rotation axis Ax3.
- a grip portion 221 having three claws 222 is provided at the tip of the fourth link L4. The pair of robot arms 22 grip the luggage G with the pair of grip portions 221.
- the traveling unit 21 of the robot 2A is formed in the shape of a dolly, and a luggage accommodating portion 212 is provided at its front end portion.
- the luggage accommodating portion 212 is formed in the shape of a rectangular box with an open top, having a bottom wall 212a and side walls 212b.
- the upper portion of the rear side wall of the luggage accommodating portion 212 is notched, and the pair of robot arms 22 can insert the luggage G into the luggage accommodating portion 212 through the notched portion.
- a pair of front wheels 211 and 211 and a pair of rear wheels 211 and 211 are provided on the bottom of the traveling portion 21.
- one of the pair of front wheels 211, 211 and the pair of rear wheels 211, 211 serves as the steering wheels, and, for example, either pair serves as the drive wheels.
- a storage battery 28 and a motor are mounted on the traveling unit 21, and the motor drives the drive wheels using the storage battery 28 as a power source.
- a pair of outriggers 213 are provided on both sides of the central portion of the traveling unit 21. The outriggers 213 are configured to be retractable into the traveling unit 21. When the robot 2A stops to load or unload the luggage G, the outriggers 213 project left and right from the traveling unit 21 and land on the ground to prevent the traveling unit 21 from moving.
- a display robot arm 27 is provided on the traveling unit 21 behind the robot arms 22. Since the display robot arm 27 is the same as that of the first embodiment, its description is omitted.
- according to the second embodiment as well, the same effects as those of the unmanned delivery system 100 of the first embodiment can be obtained.
- the unmanned delivery system of the third embodiment includes a plurality of robots 2. An identification symbol is assigned to each of these plurality of robots 2.
- the robot operator 31 is provided with an operation unit for designating the robot 2 to be operated.
- the robot operation signal generation unit 302 attaches the identification symbol of the designated robot 2 to the robot operation signal in accordance with the operation of the operation unit.
- the robot control unit 203 of each robot 2 controls that robot 2 based on the robot operation signal carrying its own identification symbol.
- in this way, the operator P1 can operate a plurality of self-propelled robots 2 with one robot operator 31.
- unmanned delivery can be efficiently performed.
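The identification-symbol mechanism of Embodiment 3 can be sketched as a small router on the operator side: the operation unit designates a robot, and every operation signal is tagged with that robot's identifier so only the matching robot acts on it. The class and field names here are hypothetical illustrations, not from the patent.

```python
# Hypothetical sketch of one robot operator controlling several robots
# by tagging each operation signal with the designated robot's identifier.
class RobotOperationRouter:
    def __init__(self, robot_ids):
        self.queues = {rid: [] for rid in robot_ids}  # per-robot signal queues
        self.selected = None

    def designate(self, robot_id):
        # The operation unit designates which robot 2 is to be operated.
        if robot_id not in self.queues:
            raise KeyError("unknown robot: " + robot_id)
        self.selected = robot_id

    def send(self, operation):
        # The robot operation signal carries the designated robot's symbol.
        signal = {"robot_id": self.selected, "operation": operation}
        self.queues[self.selected].append(signal)
        return signal

router = RobotOperationRouter(["robot-1", "robot-2"])
router.designate("robot-2")
print(router.send("move_forward")["robot_id"])  # robot-2
```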
- the robot controller 201 basically causes the self-propelled robot 2 to perform the autonomous operation on the route from the partway point to the destination 4, and may be configured to cause the self-propelled robot 2 to perform the remote operation when a predetermined condition is satisfied.
- unmanned delivery can be performed more appropriately.
- the predetermined condition may be that the route to the destination 4 is a rough road 6 or that a person approaches the self-propelled robot 2.
- the robot controller 201 may be configured to cause the self-propelled robot 2 to perform the remote control when the luggage G is delivered to the delivery destination 4.
- the hand-over of the luggage G at the delivery destination 4, which requires a courteous response, can thus be appropriately performed under human judgment.
- the robot controller 201 may be configured to move the self-propelled robot 2 in the direction opposite to a person when the person approaches the self-propelled robot 2 while the luggage G is being handed over.
- the distance between the person and the self-propelled robot 2 can be maintained within a safe range.
- the self-propelled robot 2 includes a field-of-view camera 26 that captures images of its surroundings, and the robot controller 201 has face image data for authentication. When handing over the luggage G, the robot controller 201 performs face recognition of the recipient P2 of the luggage G based on the image captured by the field-of-view camera 26 and the face image data for authentication, and may be configured to hand over the luggage G when the face recognition succeeds.
- the unmanned delivery system 100 includes an operation unit 3. The operation unit 3 includes the robot operator 31, an operator camera 36 that captures images of the operator P1, an operator microphone 34 that acquires the voice of the operator P1, an operator display 33, and an operator speaker 35. The self-propelled robot 2 further includes a customer microphone 24 that acquires the voice of the recipient P2, a customer display 23, and a customer speaker 25. The robot controller 201 may be configured to cause the customer speaker 25 to emit the voice of the operator P1 acquired by the operator microphone 34, display the image of the operator P1 captured by the operator camera 36 on the customer display 23, cause the operator speaker 35 to emit the voice of the recipient P2 acquired by the customer microphone 24, and display the image of the recipient P2 captured by the field-of-view camera 26 on the customer display 23, thereby allowing the recipient P2 and the operator P1 to interact with each other.
- smooth delivery can be performed by dialogue between the recipient P2 and the operator P1.
- the robot controller 201 has map data D3, and may be configured to cause the self-propelled robot 2 to travel by the autonomous operation from the partway point to the destination 4 using the map data D3.
- the self-propelled robot 2 can be appropriately driven by autonomous driving.
- the unmanned delivery system 100 includes a plurality of the self-propelled robots 2, and the plurality of self-propelled robots 2 and the robot operator 31 may be configured such that the plurality of self-propelled robots 2 can be operated by one robot operator 31.
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Robotics (AREA)
- Business, Economics & Management (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Economics (AREA)
- Remote Sensing (AREA)
- Aviation & Aerospace Engineering (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Development Economics (AREA)
- Marketing (AREA)
- Quality & Reliability (AREA)
- General Business, Economics & Management (AREA)
- Entrepreneurship & Innovation (AREA)
- Human Resources & Organizations (AREA)
- Tourism & Hospitality (AREA)
- Strategic Management (AREA)
- Operations Research (AREA)
- Radar, Positioning & Navigation (AREA)
- Acoustics & Sound (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Automation & Control Theory (AREA)
- Manipulator (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
Description
FIG. 1 is a schematic diagram showing an example of the schematic configuration of the unmanned delivery system 100 according to Embodiment 1 of the present disclosure.
Referring to FIG. 1, the unmanned delivery system 100 of Embodiment 1 includes an unmanned aerial vehicle 1, a self-propelled robot 2, and an operation unit 3. Hereinafter, the unmanned aerial vehicle is referred to as a drone.
Referring to FIG. 1, the drone 1 may be any aircraft capable of transporting the luggage to be delivered and the self-propelled robot 2. Examples of the drone 1 include airplanes and helicopters. Airplanes include not only aircraft that take off and land by ordinary runway travel but also VTOL aircraft (Vertical Take-Off and Landing aircraft). Here, the drone 1 is a VTOL aircraft.
FIG. 2 is a perspective view showing an example of the detailed configuration of the operation unit 3 of FIG. 1. FIG. 3 is a side view showing an example of the configuration of the self-propelled robot 2 of FIG. 1.
Referring to FIG. 3, the robot 2, which is an example of the self-propelled robot, may be any robot capable of traveling autonomously and handling luggage. Here, the robot 2 includes a traveling unit 21 capable of autonomous traveling and a robot arm 22 provided on the traveling unit 21. The traveling unit 21 may be, for example, a dolly. The component that handles the luggage need not necessarily be a robot arm. In FIG. 3, the left and right directions of the drawing are respectively the forward and rearward directions in the traveling direction of the robot 2.
FIG. 4 is a functional block diagram showing an example of the configuration of the control system of the unmanned delivery system 100 of FIG. 1.
The functions of the elements disclosed herein can be performed using circuits or processing circuits that include general-purpose processors configured or programmed to perform the disclosed functions, dedicated processors, integrated circuits, ASICs, conventional circuits, and/or combinations thereof. A processor is regarded as a processing circuit or circuit because it includes transistors and other circuits. In the present disclosure, a "unit" or "part" is hardware that performs the listed functions or hardware programmed to perform the listed functions. The hardware may be the hardware disclosed herein, or other known hardware programmed or configured to perform the listed functions. When the hardware is a processor, which is regarded as a kind of circuit, the "unit" or "part" is a combination of hardware and software, and the software is used to configure the hardware and/or the processor.
FIG. 5 is a schematic diagram showing an example of the delivery data D stored in the storage unit 204 of the robot controller 201.
The robot control unit 203 of the robot controller 201 controls the robot 2 while switching between autonomous operation and remote operation. Remote operation means operation in accordance with the operation of the robot operator 31, specifically, the robot operation signal.
Next, person-avoidance control is described. The robot control unit 203 processes the image captured by the field-of-view camera 26 and determines whether or not a person is present in the image. Since methods of extracting a person from an image by image processing are well known, their description is omitted here. When the image of a person extracted from the image captured by the field-of-view camera 26 approaches the camera, the robot control unit 203 moves the robot 2 in the direction opposite to the person. Whether or not the person's image is approaching the camera is determined, for example, from the size of the person's image and the speed at which it enlarges.
Next, the operation of the unmanned delivery system 100 configured as described above is described with reference to FIGS. 1 to 8L. The operation of the unmanned delivery system 100 corresponds to the unmanned delivery method. FIG. 7 is a flowchart showing an example of the operation of the unmanned delivery system 100 of FIG. 1. FIGS. 8A to 8L are schematic diagrams showing, in order, an example of the operation of the unmanned delivery system 100 of FIG. 1. In this operation, the drone 1 is operated by the operator P1, and the robot 2 is autonomously or remotely operated by the robot control unit 203 of the robot controller 201.
Referring to FIG. 7, unloading is performed at a point partway to the delivery destination 4 (step S13). Referring to FIG. 8E, this unloading is performed by putting the drone 1 into a hovering state and lowering the robot 2 with the winch 11. The operator P1 performs this descent while checking the state of the ground in the field-of-view image captured by the field-of-view camera of the drone 1 and displayed on the operator display 33. This is to ensure safety. In this case, the altitude of the drone 1 is kept at or above a predetermined altitude. The predetermined altitude is set as appropriate; for example, it is set to 20 m.
In this case, the robot 2 releases the storage posture by autonomous operation and then stores the luggage G to be delivered in the luggage accommodating portion 212 by remote operation.
Referring to FIG. 8I, in this case, for example, the delivery destination 4 is a room in a high-rise condominium. When the drone 1 reaches the sky above the high-rise condominium, it lowers the robot 2 to the rooftop. There are two modes of this descent. The first descent mode is the same as when the delivery destination 4 is in the suburbs. In the second descent mode, the drone 1 lands on the rooftop, and the robot 2 descends to the rooftop through the opened loading/unloading door 13.
When the delivery work for one delivery destination 4 is completed, the delivery work for the next delivery destination 4 is performed in the same manner as above, and when the delivery work for all delivery destinations 4 is completed, the drone 1 returns to the collection/delivery base 5 (steps S18 and S19).
In Modification 1, the robot 2 is arranged at a point partway to the above-mentioned delivery destination 4. In this case, the robot 2 may stay in the field or may be collected by the drone 1.
The unmanned delivery system of Embodiment 2 differs from the unmanned delivery system 100 of Embodiment 1 in that a robot 2A is used instead of the robot 2 of Embodiment 1; in other respects it is the same as the unmanned delivery system 100 of Embodiment 1.
In Embodiment 3, the operator P1 can operate a plurality of the robots 2 of Embodiment 1 or Embodiment 2. The other points are the same as those of Embodiment 1 or Embodiment 2.
According to the embodiments of the present disclosure, the self-propelled robot 2 travels on the ground and can handle the luggage G, so the luggage G can be smoothly handed over to the recipient P2. Since the control of the self-propelled robot 2 is switched between autonomous operation and remote operation in accordance with the operation of the robot operator 31, relatively easy work is performed by autonomous operation and relatively difficult work is performed by remote operation, whereby unmanned delivery can be performed more easily.
Claims (12)
- An unmanned delivery system comprising: a self-propelled robot; an unmanned aerial vehicle for transporting a package to a point partway to a delivery destination; and a robot operator for remotely operating the self-propelled robot, wherein the self-propelled robot includes a robot controller configured to control the self-propelled robot so as to deliver the package, unloaded at the partway point, to the delivery destination while switching between autonomous operation and remote operation in accordance with operation of the robot operator.
- An unmanned delivery system comprising: a self-propelled robot; an unmanned aerial vehicle for transporting a package and the self-propelled robot to a point partway to a delivery destination; and a robot operator for remotely operating the self-propelled robot, wherein the self-propelled robot includes a robot controller configured to control the self-propelled robot so as to deliver the package, unloaded at the partway point, to the delivery destination while switching between autonomous operation and remote operation in accordance with operation of the robot operator.
- The unmanned delivery system according to claim 1 or 2, wherein the robot controller is configured to cause the self-propelled robot to basically perform the autonomous operation on the route from the partway point to the delivery destination, and to cause the self-propelled robot to perform the remote operation when a predetermined condition is satisfied.
- The unmanned delivery system according to claim 3, wherein the predetermined condition is that the route to the delivery destination is a rough road, or that a person has approached the self-propelled robot.
- The unmanned delivery system according to any one of claims 1 to 3, wherein the robot controller is configured to cause the self-propelled robot to perform the remote operation when handing over the package at the delivery destination.
- The unmanned delivery system according to claim 5, wherein the robot controller is configured to move the self-propelled robot in the direction opposite to a person when the person approaches the self-propelled robot while the package is being handed over.
- The unmanned delivery system according to claim 5, wherein the self-propelled robot includes a field-of-view camera that captures images of its surroundings, the robot controller has face image data for authentication, and the robot controller is configured to perform, when handing over the package, face recognition of the recipient of the package based on the image captured by the field-of-view camera and the face image data for authentication, and to hand over the package when the face recognition succeeds.
- The unmanned delivery system according to any one of claims 5 to 7, comprising an operation unit, wherein the operation unit includes the robot operator, an operator camera that captures images of the operator, an operator microphone that acquires the voice of the operator, an operator display, and an operator speaker; the self-propelled robot further includes a customer microphone that acquires the voice of the recipient, a customer display, and a customer speaker; and the robot controller is configured to cause the customer speaker to emit the voice of the operator acquired by the operator microphone, display the image of the operator captured by the operator camera on the customer display, cause the operator speaker to emit the voice of the recipient acquired by the customer microphone, and display the image of the recipient captured by the field-of-view camera on the customer display, thereby allowing the recipient and the operator to interact with each other.
- The unmanned delivery system according to any one of claims 3 to 8, wherein the robot controller has map data and is configured to cause the self-propelled robot to travel by the autonomous operation from the partway point to the delivery destination using the map data.
- The unmanned delivery system according to any one of claims 1 to 9, comprising a plurality of the self-propelled robots, wherein the plurality of self-propelled robots and the robot operator are configured such that the plurality of self-propelled robots can be operated by one robot operator.
- An unmanned delivery method comprising: transporting, by an unmanned aerial vehicle, a package to a point partway to a delivery destination; remotely operating a self-propelled robot by a robot operator; and delivering, by the self-propelled robot, the package unloaded at the partway point to the delivery destination while switching between autonomous operation and remote operation in accordance with operation of the robot operator.
- An unmanned delivery method comprising: transporting, by an unmanned aerial vehicle, a package and a self-propelled robot to a point partway to a delivery destination; remotely operating the self-propelled robot by a robot operator; and delivering, by the self-propelled robot, the package unloaded at the partway point to the delivery destination while switching between autonomous operation and remote operation in accordance with operation of the robot operator.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP21886026.0A EP4238914A4 (en) | 2020-10-30 | 2021-10-20 | UNMANNED DELIVERY SYSTEM AND METHOD |
US18/034,093 US20230405830A1 (en) | 2020-10-30 | 2021-10-20 | Unmanned delivery system and unmanned delivery method |
CN202180073800.4A CN116507569A (zh) | 2020-10-30 | 2021-10-20 | 无人配送系统以及无人配送方法 |
KR1020237014893A KR20230079423A (ko) | 2020-10-30 | 2021-10-20 | 무인 배송 시스템 및 무인 배송 방법 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020-183352 | 2020-10-30 | ||
JP2020183352A JP7522007B2 (ja) | 2020-10-30 | 2020-10-30 | 無人配送システム及び無人配送方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022091907A1 true WO2022091907A1 (ja) | 2022-05-05 |
Family
ID=81382385
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/038760 WO2022091907A1 (ja) | 2020-10-30 | 2021-10-20 | 無人配送システム及び無人配送方法 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20230405830A1 (ja) |
EP (1) | EP4238914A4 (ja) |
JP (1) | JP7522007B2 (ja) |
KR (1) | KR20230079423A (ja) |
CN (1) | CN116507569A (ja) |
WO (1) | WO2022091907A1 (ja) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015066632A (ja) * | 2013-09-27 | 2015-04-13 | 本田技研工業株式会社 | ロボット、ロボット制御方法、およびロボット制御プログラム |
US9305280B1 (en) * | 2014-12-22 | 2016-04-05 | Amazon Technologies, Inc. | Airborne fulfillment center utilizing unmanned aerial vehicles for item delivery |
JP2016083711A (ja) * | 2014-10-23 | 2016-05-19 | 公立大学法人首都大学東京 | テレプレゼンスロボット |
JP2019502975A (ja) * | 2015-10-13 | 2019-01-31 | スターシップ テクノロジーズ オサイヒング | 自律的又は半自律的配達の方法及びシステム |
JP2019028838A (ja) * | 2017-08-01 | 2019-02-21 | パナソニックIpマネジメント株式会社 | 配送管理システムおよび配送管理方法 |
US20190236741A1 (en) * | 2016-08-05 | 2019-08-01 | Starship Technologies Oü | System and mobile freight station and method for distribution, delivery, and collection of freight |
JP2020083600A (ja) | 2018-11-29 | 2020-06-04 | トヨタ自動車株式会社 | 配送システム及び処理サーバ |
JP2020128287A (ja) * | 2019-02-08 | 2020-08-27 | トヨタ自動車株式会社 | 車両 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9868526B2 (en) * | 2014-10-15 | 2018-01-16 | W. Morrison Consulting Group, Inc. | Airborne drone delivery network and method of operating same |
CN110945448A (zh) * | 2017-07-28 | 2020-03-31 | 纽诺有限公司 | 自主和半自主载具上的灵活隔间设计 |
-
2020
- 2020-10-30 JP JP2020183352A patent/JP7522007B2/ja active Active
-
2021
- 2021-10-20 CN CN202180073800.4A patent/CN116507569A/zh active Pending
- 2021-10-20 WO PCT/JP2021/038760 patent/WO2022091907A1/ja active Application Filing
- 2021-10-20 KR KR1020237014893A patent/KR20230079423A/ko unknown
- 2021-10-20 EP EP21886026.0A patent/EP4238914A4/en active Pending
- 2021-10-20 US US18/034,093 patent/US20230405830A1/en active Pending
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015066632A (ja) * | 2013-09-27 | 2015-04-13 | Honda Motor Co., Ltd. | Robot, robot control method, and robot control program |
JP2016083711A (ja) * | 2014-10-23 | 2016-05-19 | Tokyo Metropolitan University | Telepresence robot |
US9305280B1 (en) * | 2014-12-22 | 2016-04-05 | Amazon Technologies, Inc. | Airborne fulfillment center utilizing unmanned aerial vehicles for item delivery |
JP2019502975A (ja) * | 2015-10-13 | 2019-01-31 | Starship Technologies OÜ | Method and system for autonomous or semi-autonomous delivery |
US20190236741A1 (en) * | 2016-08-05 | 2019-08-01 | Starship Technologies Oü | System and mobile freight station and method for distribution, delivery, and collection of freight |
JP2019028838A (ja) * | 2017-08-01 | 2019-02-21 | Panasonic Intellectual Property Management Co., Ltd. | Delivery management system and delivery management method |
JP2020083600A (ja) | 2018-11-29 | 2020-06-04 | Toyota Motor Corporation | Delivery system and processing server |
JP2020128287A (ja) * | 2019-02-08 | 2020-08-27 | Toyota Motor Corporation | Vehicle |
Non-Patent Citations (1)
Title |
---|
See also references of EP4238914A4 |
Also Published As
Publication number | Publication date |
---|---|
EP4238914A4 (en) | 2024-06-12 |
KR20230079423A (ko) | 2023-06-07 |
JP2022073395A (ja) | 2022-05-17 |
JP7522007B2 (ja) | 2024-07-24 |
EP4238914A1 (en) | 2023-09-06 |
CN116507569A (zh) | 2023-07-28 |
US20230405830A1 (en) | 2023-12-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210132625A1 (en) | Modular delivery vehicle system | |
CN111512253B (zh) | Active position control of a tether hook | |
CN111512327B (zh) | Dynamic UAV transport tasks | |
JP4222510B2 (ja) | Transport method using an unmanned aerial vehicle | |
AU2022202827B2 (en) | Methods and systems for door-enabled loading and release of payloads in an unmanned aerial vehicle (UAV) | |
JP7159822B2 (ja) | Delivery system and processing server | |
CN111527510A (zh) | Anticipatory dispatch of a UAV to a standby location | |
CN111491824B (zh) | Method and apparatus for recharging at a delivery location during an air transport mission | |
CN116096636A (zh) | Landing pad with charging and loading functions for unmanned aerial vehicles | |
CN111003183A (zh) | Ground operations for autonomous object pickup | |
WO2021070439A1 (ja) | Robot device and control method therefor | |
WO2022091907A1 (ja) | Unmanned delivery system and unmanned delivery method | |
US20230069643A1 (en) | Flying body and method for transporting load using same | |
WO2022091910A1 (ja) | Unmanned delivery system and unmanned delivery method | |
CN118434633A (zh) | Package coupling apparatus with an attachment plate for securing a package to a UAV, and method of securing a package for delivery | |
JP2022073837A (ja) | Unmanned delivery system and unmanned delivery method | |
CN118556024A (zh) | Package coupling apparatus with straps and a hanger for securing a package to a UAV, and method of securing a package for delivery | |
EP4209417A1 (en) | Supply-airlifting system and supply-airlifting method | |
WO2022091882A1 (ja) | Work system and work method | |
JP2022073836A (ja) | Work system and work method | |
CN116490323A (zh) | Work system and work method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the EPO has been informed by WIPO that EP was designated in this application |
Ref document number: 21886026 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18034093 Country of ref document: US Ref document number: 202180073800.4 Country of ref document: CN |
|
ENP | Entry into the national phase |
Ref document number: 20237014893 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2021886026 Country of ref document: EP Effective date: 20230530 |