WO2019130973A1 - Work site management system and work site management method - Google Patents
Work site management system and work site management method
- Publication number
- WO2019130973A1 (PCT/JP2018/044060; JP2018044060W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unmanned vehicle
- data
- drone
- work site
- unit
- Prior art date
Links
- 238000007726 management method Methods 0.000 title claims description 103
- 230000005856 abnormality Effects 0.000 claims abstract description 94
- 238000003384 imaging method Methods 0.000 claims abstract description 43
- 238000004891 communication Methods 0.000 description 79
- 230000002159 abnormal effect Effects 0.000 description 28
- 238000000034 method Methods 0.000 description 18
- 238000010586 diagram Methods 0.000 description 14
- 230000006870 function Effects 0.000 description 8
- 238000001514 detection method Methods 0.000 description 5
- 230000003287 optical effect Effects 0.000 description 4
- 230000001133 acceleration Effects 0.000 description 2
- 238000005065 mining Methods 0.000 description 2
- 230000004044 response Effects 0.000 description 2
- water Substances 0.000 description 2
- 230000002547 anomalous effect Effects 0.000 description 1
- 230000006399 behavior Effects 0.000 description 1
- 238000002485 combustion reaction Methods 0.000 description 1
- 230000000295 complement effect Effects 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 238000005401 electroluminescence Methods 0.000 description 1
- 230000006698 induction Effects 0.000 description 1
- 229910052500 inorganic mineral Inorganic materials 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 229910044991 metal oxide Inorganic materials 0.000 description 1
- 150000004706 metal oxides Chemical class 0.000 description 1
- 239000011707 mineral Substances 0.000 description 1
- 239000011435 rock Substances 0.000 description 1
- 239000013049 sediment Substances 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
- 239000007921 spray Substances 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/20—Monitoring the location of vehicles belonging to a group, e.g. fleet of vehicles, countable or determined number of vehicles
- G08G1/205—Indicating the location of the monitored vehicles as destination, e.g. accidents, stolen, rental
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F02—COMBUSTION ENGINES; HOT-GAS OR COMBUSTION-PRODUCT ENGINE PLANTS
- F02D—CONTROLLING COMBUSTION ENGINES
- F02D29/00—Controlling engines, such controlling being peculiar to the devices driven thereby, the devices being other than parts or accessories essential to engine operation, e.g. controlling of engines by signals external thereto
- F02D29/02—Controlling engines, such controlling being peculiar to the devices driven thereby, the devices being other than parts or accessories essential to engine operation, e.g. controlling of engines by signals external thereto peculiar to engines driving vehicles; peculiar to engines driving variable pitch propellers
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F02—COMBUSTION ENGINES; HOT-GAS OR COMBUSTION-PRODUCT ENGINE PLANTS
- F02N—STARTING OF COMBUSTION ENGINES; STARTING AIDS FOR SUCH ENGINES, NOT OTHERWISE PROVIDED FOR
- F02N11/00—Starting of engines by means of electric motors
- F02N11/08—Circuits or control means specially adapted for starting of engines
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0094—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04Q—SELECTING
- H04Q9/00—Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C5/00—Registering or indicating the working of vehicles
- G07C5/006—Indicating maintenance
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C5/00—Registering or indicating the working of vehicles
- G07C5/008—Registering or indicating the working of vehicles communicating information to a remotely located station
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C5/00—Registering or indicating the working of vehicles
- G07C5/08—Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
- G07C5/0841—Registering performance data
- G07C5/085—Registering performance data using electronic data carriers
- G07C5/0866—Registering performance data using electronic data carriers the electronic data carrier being a digital video recorder in combination with video camera
Definitions
- the present invention relates to a work site management system and a work site management method.
- An unmanned vehicle may be used in a wide area work site such as a mine or a quarry (see Patent Document 1).
- An aspect of the present invention aims to suppress a decrease in productivity at a work site where an unmanned vehicle operates.
- According to an aspect, a work site management system includes an image data acquisition unit that acquires image data, captured by an imaging device mounted on a mobile body, of an unmanned vehicle stopped at the work site due to the occurrence of an abnormality.
- FIG. 1 is a view schematically showing an example of a management system of a work site according to the present embodiment.
- FIG. 2 is a sequence diagram showing processing of the management system according to the present embodiment.
- FIG. 3 is a functional block diagram showing a control device according to the present embodiment.
- FIG. 4 is a functional block diagram showing a control device according to the present embodiment.
- FIG. 5 is a functional block diagram showing a management apparatus according to the present embodiment.
- FIG. 6 is a flowchart showing the operation of the unmanned vehicle according to the present embodiment.
- FIG. 7 is a flowchart showing the operation of the management apparatus according to the present embodiment.
- FIG. 8 is a view showing an example of a display device according to the present embodiment.
- FIG. 9 is a view showing an example of a display device according to the present embodiment.
- FIG. 10 is a flowchart showing the operation of the flying object according to the present embodiment.
- FIG. 11 is a block diagram showing an example of a computer system according to the present embodiment.
- FIG. 12 is a functional block diagram showing the unmanned vehicle according to the present embodiment.
- FIG. 1 is a view schematically showing an example of a management system 1 of a work site according to the present embodiment. As shown in FIG. 1, at the work site, the unmanned vehicle 2 and the flying object 3 operate.
- the unmanned vehicle 2 refers to a vehicle that travels unmanned, without depending on a driver's operation.
- the unmanned vehicle 2 travels based on target travel data to be described later.
- the unmanned vehicle 2 may travel by remote control or may travel autonomously.
- the flying object 3 refers to an unmanned aerial vehicle that flies unmanned.
- the flying object 3 may fly by remote control or may fly autonomously. In the following description, the flying object 3 will be referred to as the drone 3 as appropriate.
- the work site is a mine or quarry.
- the unmanned vehicle 2 is a dump truck that travels the work site to transport cargo.
- the drone 3 can fly at the work site.
- Each of the unmanned vehicle 2 and the drone 3 is a movable body movable at the work site.
- a mine is a place or site where minerals are mined.
- a quarry is a place or site where rock is quarried. Examples of the cargo transported by the unmanned vehicle 2 include ore or sediment excavated at a mine or quarry.
- the management system 1 includes a management device 4, an input device 5, an output device 6, and a communication system 7.
- the management device 4, the input device 5, and the output device 6 are installed, for example, in a control facility 8 at a work site.
- the communication system 7 enables communication among the management device 4, the unmanned vehicle 2, and the drone 3.
- the wireless communication device 9 is connected to the management device 4.
- the communication system 7 includes a wireless communication device 9.
- the management device 4, the unmanned vehicle 2 and the drone 3 wirelessly communicate via the communication system 7.
- the unmanned vehicle 2 travels on the work site based on the target travel data from the management device 4.
- the input device 5 is operated by the administrator Wb at the control facility 8.
- the input device 5 generates input data by being operated by the administrator Wb.
- the input data generated by the input device 5 is output to the management device 4.
- the output device 6 is controlled by the management device 4 and outputs prescribed output data.
- the output device 6 is exemplified by at least one of a display device capable of displaying display data, an audio output device capable of outputting audio, and a printing device capable of outputting printed matter.
- the output device 6 includes a display device.
- the output device 6 is appropriately referred to as a display device 6.
- the display device 6 includes a flat panel display such as a liquid crystal display (LCD) or an organic electroluminescence display (OELD).
- the administrator Wb can view the display screen of the display device 6.
- the unmanned vehicle 2 can travel at the work site.
- the unmanned vehicle 2 includes a control device 20, a traveling device 21, a vehicle body 22 supported by the traveling device 21, a dump body 23 supported by the vehicle body 22, a vehicle speed sensor 24 for detecting the traveling speed of the unmanned vehicle 2, a non-contact sensor 25 for detecting an object in a non-contact manner, a position sensor 26 for detecting the position of the unmanned vehicle 2, and a wireless communication device 27.
- the traveling device 21 includes a drive device 21D, a brake device 21B, a steering device 21S, and wheels 21H.
- the unmanned vehicle 2 travels by the rotation of the wheels 21H.
- the wheels 21H include front wheels and rear wheels. Tires are mounted on the wheels 21H.
- the driving device 21D generates a driving force for accelerating the unmanned vehicle 2.
- the drive device 21D includes at least one of an internal combustion engine, such as a diesel engine, and an electric motor.
- the driving force generated by the driving device 21D is transmitted to the wheel 21H (rear wheel).
- the brake device 21B generates a braking force for decelerating or stopping the unmanned vehicle 2.
- the steering device 21S generates a steering force for adjusting the traveling direction of the unmanned vehicle 2.
- the steering force generated by the steering device 21S is transmitted to the wheels 21H (front wheels).
- Control device 20 outputs a driving command to traveling device 21.
- the driving command includes at least one of an accelerator command for operating the drive device 21D to accelerate the unmanned vehicle 2, a brake command for operating the brake device 21B to decelerate or stop the unmanned vehicle 2, and a steering command for operating the steering device 21S to adjust the traveling direction of the unmanned vehicle 2.
- the driving device 21D generates driving force for accelerating the unmanned vehicle 2 based on the accelerator command output from the control device 20.
- the brake device 21B generates a braking force for decelerating or stopping the unmanned vehicle 2 based on the brake command output from the control device 20.
- the steering device 21S generates a steering force for causing the unmanned vehicle 2 to go straight or turn based on the steering command output from the control device 20.
- the vehicle speed sensor 24 detects the traveling speed of the unmanned vehicle 2.
- the vehicle speed sensor 24 detects, for example, the rotational speed of the wheels 21H to detect the traveling speed of the unmanned vehicle 2.
- the non-contact sensor 25 detects an object around the unmanned vehicle 2 in a non-contact manner.
- the object includes an obstacle that prevents the unmanned vehicle 2 from traveling.
- the non-contact sensor 25 is provided at the front of the vehicle body 22.
- the non-contact sensor 25 may be provided on the side of the vehicle body 22.
- the non-contact sensor 25 includes a laser scanner device.
- the non-contact sensor 25 detects an object in a non-contact manner using laser light which is detection light.
- the non-contact sensor 25 can detect the presence or absence of an object, the relative position to the object, and the relative velocity to the object.
- the non-contact sensor 25 may include a radar device such as a millimeter wave radar device. The radar device can detect an object contactlessly using radio waves.
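The three quantities the non-contact sensor 25 reports (presence of an object, relative position, relative velocity) can be sketched from two successive range measurements. This is a simplified 2D illustration under assumed units and function names, not the sensor's actual processing.

```python
import math

def obstacle_data(p0, p1, dt):
    """Derive what non-contact sensor 25 reports (illustrative only):
    presence, relative position, and relative radial speed, from two
    successive detected positions p0 and p1 in the sensor frame [m],
    taken dt seconds apart. Returns None when nothing is detected."""
    if p1 is None:
        return None  # no object in the latest scan
    r1 = math.hypot(*p1)
    if p0 is None:
        rel_speed = 0.0  # object just appeared; no speed estimate yet
    else:
        r0 = math.hypot(*p0)
        rel_speed = (r1 - r0) / dt  # negative: object closing in
    return {"present": True, "position": p1, "range": r1,
            "rel_speed": rel_speed}

# An object 40 m ahead that moved 2 m closer over 0.1 s closes at 20 m/s.
d = obstacle_data((42.0, 0.0), (40.0, 0.0), 0.1)
print(d["range"], d["rel_speed"])  # 40.0 -20.0
```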
- the position sensor 26 detects the position of the unmanned vehicle 2.
- the position sensor 26 detects the position of the unmanned vehicle 2 using a Global Navigation Satellite System (GNSS).
- Global navigation satellite systems include the Global Positioning System (GPS).
- the global navigation satellite system detects the absolute position of the unmanned vehicle 2 defined by coordinate data of latitude, longitude, and altitude.
- the global navigation satellite system detects the position of the unmanned vehicle 2 defined in the global coordinate system.
- the global coordinate system is a coordinate system fixed to the earth.
- the position sensor 26 includes a GPS receiver and detects the absolute position (coordinates) of the unmanned vehicle 2.
- the wireless communication device 27 can wirelessly communicate with the management device 4.
- the communication system 7 includes a wireless communication device 27.
- the drone 3 can fly at the work site.
- the drone 3 includes a control device 30, a flight device 31, a main body 32 supported by the flight device 31, a position sensor 33 for detecting the position of the drone 3, an imaging device 34, and a wireless communication device 36.
- the flight device 31 has a propeller 31P and a drive device 31D.
- the driving device 31D generates a driving force for rotating the propeller 31P.
- the drive device 31D includes an electric motor.
- the drone 3 has a power supply for supplying electric power to the motor.
- the power supply includes a rechargeable battery.
- the main body 32 is supported by the flight device 31. The drone 3 flies by the rotation of the propeller 31P.
- the position sensor 33 detects the position of the drone 3.
- the position sensor 33 includes a GPS receiver and detects the absolute position (coordinates) of the drone 3.
- the imaging device 34 acquires image data of a subject.
- the imaging device 34 has an optical system and an image sensor.
- the image sensor includes a CCD (Charge-Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
- the wireless communication device 36 can wirelessly communicate with the management device 4 and the control device 20 of the unmanned vehicle 2.
- the communication system 7 includes a wireless communication device 36.
- FIG. 2 is a sequence diagram showing an outline of processing of the management system 1 according to the present embodiment.
- the management device 4 generates target travel data indicating a target travel condition of the unmanned vehicle 2.
- the management device 4 transmits the target travel data to the unmanned vehicle 2 via the communication system 7 (step S1).
- the target travel condition of the unmanned vehicle 2 refers to the target condition of the travel state required of the unmanned vehicle 2 by the management system 1.
- the target travel conditions of the unmanned vehicle 2 include the target travel speed, the target acceleration, and the target travel course of the unmanned vehicle 2.
- the target travel conditions are defined, for example, in a global coordinate system.
- the unmanned vehicle 2 receives target travel data.
- the unmanned vehicle 2 travels in accordance with the target travel data.
- when an abnormality occurs in the unmanned vehicle 2 while traveling, the unmanned vehicle 2 stops.
- the unmanned vehicle 2 transmits the abnormal data indicating that an abnormality has occurred and the position data of the unmanned vehicle 2 stopped due to the occurrence of the abnormality to the management device 4 via the communication system 7 (step S2).
- the management device 4 receives abnormal data and position data from the unmanned vehicle 2.
- the management device 4 starts processing for guiding the drone 3 to the unmanned vehicle 2 that has stopped due to the occurrence of an abnormality.
- the management device 4 transmits the position data of the unmanned vehicle 2 stopped and the request data for requesting to fly toward the unmanned vehicle 2 to the drone 3 via the communication system 7 (step S3).
- the drone 3 receives position data and request data of the unmanned vehicle 2 which has stopped.
- when the control device 30 of the drone 3 accepts the request to fly toward the unmanned vehicle 2, it generates consent data indicating that the drone 3 consents to fly toward the unmanned vehicle 2.
- the drone 3 transmits the consent data to the management device 4 via the communication system 7 (step S4).
- the management device 4 receives the consent data.
- the management device 4 transmits flight route data indicating the flight route to the unmanned vehicle 2 that has stopped to the drone 3 that has output the consent data via the communication system 7 (step S5).
- the drone 3 receives flight route data.
- the drone 3 having received the flight route data flies toward the unmanned vehicle 2 in which the abnormality has occurred based on the flight route data.
- the drone 3 that has arrived at the unmanned vehicle 2 confirms the situation of the unmanned vehicle 2.
- the situation of the unmanned vehicle 2 includes the situation around the unmanned vehicle 2.
- the drone 3 uses the imaging device 34 to acquire image data of the unmanned vehicle 2 from above.
- the image data of the unmanned vehicle 2 includes image data of at least the front of the unmanned vehicle 2 in the vicinity of the unmanned vehicle 2.
- the drone 3 transmits the image data of the unmanned vehicle 2 to the management device 4 via the communication system 7 (step S6).
- when the drone 3 arrives at the unmanned vehicle 2 that has stopped due to the occurrence of an abnormality, the control device 30 may transmit arrival data indicating that the drone 3 has arrived at the unmanned vehicle 2, based on the detection data of the position sensor 33, to the management device 4 via the communication system 7.
- the management device 4 receives image data of the unmanned vehicle 2.
- the management device 4 determines, based on the image data, whether or not the unmanned vehicle 2 can travel based on the target travel data.
- Alternatively, the administrator Wb may determine whether the unmanned vehicle 2 can travel based on the target travel data. If it is determined that the unmanned vehicle 2 can resume traveling, the management device 4 transmits, via the communication system 7, a restart command for causing the stopped unmanned vehicle 2 to travel based on the target travel data (step S7). The unmanned vehicle 2 thereby travels based on the target travel data.
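The exchange in steps S1 to S7 can be sketched as an ordered message sequence. This is a minimal simulation for illustration; the message labels are paraphrases of the steps above, the function name is hypothetical, and the real system communicates wirelessly via the communication system 7.

```python
def run_sequence(vehicle_ok_after_check=True):
    """Simulate steps S1-S7 between management device 4 ('mgmt'),
    unmanned vehicle 2 ('vehicle'), and drone 3 ('drone') as a list of
    (step, direction, payload) tuples."""
    log = []
    log.append(("S1", "mgmt->vehicle", "target travel data"))
    # The vehicle stops on an abnormality and reports it with its position.
    log.append(("S2", "vehicle->mgmt", "abnormal data + position data"))
    log.append(("S3", "mgmt->drone", "vehicle position + flight request"))
    log.append(("S4", "drone->mgmt", "consent data"))
    log.append(("S5", "mgmt->drone", "flight route data"))
    log.append(("S6", "drone->mgmt", "image data of the vehicle"))
    if vehicle_ok_after_check:
        # The management device judges from the images that travel can resume.
        log.append(("S7", "mgmt->vehicle", "restart command"))
    return log

seq = run_sequence()
print([step for step, _, _ in seq])
# ['S1', 'S2', 'S3', 'S4', 'S5', 'S6', 'S7']
```

The `vehicle_ok_after_check` flag reflects that step S7 is issued only when the image check concludes the vehicle can resume traveling.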
- FIG. 3 is a functional block diagram showing the control device 20 according to the present embodiment.
- Control device 20 includes a computer system.
- the control device 20 wirelessly communicates with the management device 4 via the communication system 7.
- the control device 20 includes a communication unit 201, a target travel data acquisition unit 202, a vehicle speed data acquisition unit 203, an obstacle data acquisition unit 204, a position data acquisition unit 205, a travel control unit 206, a determination unit 207, and an abnormal data output unit 208.
- the communication unit 201 receives data or a signal transmitted from the management device 4 via the communication system 7.
- the communication unit 201 also transmits data or a signal to the management device 4 via the communication system 7.
- the target travel data acquisition unit 202 acquires target travel data of the unmanned vehicle 2 from the management device 4.
- the vehicle speed data acquisition unit 203 acquires, from the vehicle speed sensor 24, vehicle speed data indicating the traveling speed of the unmanned vehicle 2.
- the obstacle data acquisition unit 204 acquires, from the non-contact sensor 25, obstacle data indicating at least one of the presence or absence of an obstacle around the unmanned vehicle 2, the relative position to the obstacle, and the relative speed to the obstacle.
- the position data acquisition unit 205 acquires position data indicating an absolute position of the unmanned vehicle 2 from the position sensor 26.
- the traveling control unit 206 controls the traveling device 21 based on the target traveling data acquired by the target traveling data acquisition unit 202.
- the travel control unit 206 outputs, to the traveling device 21, an operation command including an accelerator command to operate the drive device 21D, a brake command to operate the brake device 21B, and a steering command to operate the steering device 21S, such that the unmanned vehicle 2 travels according to the target travel data.
- the determination unit 207 determines whether or not an abnormality has occurred in the unmanned vehicle 2.
- the determination unit 207 determines whether or not an abnormality has occurred in the unmanned vehicle 2 based on at least one of the vehicle speed data acquired by the vehicle speed data acquisition unit 203, the obstacle data acquired by the obstacle data acquisition unit 204, and the position data acquired by the position data acquisition unit 205.
- the abnormality of the unmanned vehicle 2 includes both an abnormality in the traveling state of the unmanned vehicle 2 and a cause of causing an abnormality in the traveling state of the unmanned vehicle 2.
- An abnormality in the traveling state of the unmanned vehicle 2 includes a state in which the unmanned vehicle 2 is traveling under traveling conditions different from the target traveling conditions defined by the target traveling data.
- the abnormality in the traveling state of the unmanned vehicle 2 includes the state in which the unmanned vehicle 2 has stopped.
- the traveling control unit 206 stops the unmanned vehicle 2 based on the obstacle data acquired by the obstacle data acquisition unit 204 in order to avoid contact between the unmanned vehicle 2 and the obstacle.
- the traveling control unit 206 stops the unmanned vehicle 2 based on the position data of the unmanned vehicle 2 acquired by the position data acquisition unit 205.
- the unmanned vehicle 2 may slip when the travel path becomes wet due to, for example, rainwater or sprayed water.
- the abnormality in the traveling state of the unmanned vehicle 2 includes the state where the unmanned vehicle 2 is traveling at a traveling speed lower than the target traveling speed.
- the traveling control unit 206 may decelerate the unmanned vehicle 2 based on the obstacle data acquired by the obstacle data acquisition unit 204.
- the traveling control unit 206 may decelerate the unmanned vehicle 2 based on the position data of the unmanned vehicle 2 acquired by the position data acquisition unit 205.
- the abnormality in the traveling state of the unmanned vehicle 2 includes the abnormality in the traveling speed of the unmanned vehicle 2.
- the determination unit 207 determines that an abnormality has occurred in the traveling speed when the unmanned vehicle 2 is stopped or travels at a traveling speed lower than the target traveling speed even though a target traveling speed is specified, based on the target travel data acquired by the target travel data acquisition unit 202 and the vehicle speed data acquired by the vehicle speed data acquisition unit 203.
- the cause of causing an abnormality in the traveling state of the unmanned vehicle 2 includes at least one of the cause of stopping the unmanned vehicle 2 and the cause of traveling the unmanned vehicle 2 at a traveling speed lower than the target traveling speed.
- the cause of an abnormality in the traveling state of the unmanned vehicle 2 includes the state in which an obstacle is detected by the non-contact sensor 25.
- the cause of an abnormality in the traveling state of the unmanned vehicle 2 also includes the state in which the position sensor 26 detects that the unmanned vehicle 2 has deviated from the target travel course.
- the abnormality of the unmanned vehicle 2 includes the abnormality of the drive system of the unmanned vehicle 2.
- the abnormality in the drive system of the unmanned vehicle 2 refers to an abnormality in a drive system that drives a traveling device, such as an engine, a generator, and an electric motor.
- the abnormal data output unit 208 outputs abnormal data when an abnormality occurs in the unmanned vehicle 2.
- the abnormality data output from the abnormality data output unit 208 includes stop data indicating that the unmanned vehicle 2 has stopped due to the occurrence of an abnormality. Further, the abnormality data output from the abnormality data output unit 208 includes deceleration data indicating that the unmanned vehicle 2 is traveling at a traveling speed lower than the target traveling speed due to the occurrence of the abnormality.
- the abnormal data output from the abnormal data output unit 208 includes cause data indicating a cause of causing an abnormality in the traveling state of the unmanned vehicle 2.
- the abnormal data output unit 208 outputs cause data indicating that the non-contact sensor 25 has detected an obstacle, based on the obstacle data acquired by the obstacle data acquisition unit 204.
- the abnormal data output unit 208 outputs cause data indicating that the unmanned vehicle 2 has deviated from the target travel course, based on the position data of the unmanned vehicle 2 acquired by the position data acquisition unit 205.
- the abnormal data output from the abnormal data output unit 208 and the position data of the unmanned vehicle 2 in which the abnormality has occurred are transmitted to the management device 4 via the communication system 7.
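The logic of the determination unit 207 and the cause data of the abnormal data output unit 208 described above can be sketched as a single function. This is a simplified sketch: the deviation threshold, parameter names, and cause strings are assumptions for illustration, not taken from the patent.

```python
def detect_abnormality(target_speed, vehicle_speed, obstacle_detected,
                       course_deviation_m, max_deviation_m=1.0):
    """Return (abnormal, causes): whether the traveling state is abnormal
    and why, mirroring determination unit 207 and abnormal data output
    unit 208 in simplified form. Speeds in m/s, deviations in m; the
    1.0 m deviation threshold is purely illustrative."""
    causes = []
    if obstacle_detected:
        causes.append("obstacle detected by non-contact sensor 25")
    if course_deviation_m > max_deviation_m:
        causes.append("deviation from target travel course")
    # Travel-state abnormality: stopped, or slower than the target
    # speed, even though a target traveling speed is specified.
    abnormal = target_speed > 0 and vehicle_speed < target_speed
    return abnormal or bool(causes), causes

abnormal, causes = detect_abnormality(
    target_speed=30.0, vehicle_speed=0.0,
    obstacle_detected=True, course_deviation_m=0.2)
print(abnormal, causes)
```

A stopped vehicle with a detected obstacle yields an abnormality with one cause entry, matching the stop-data-plus-cause-data pairing in the text.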
- FIG. 4 is a functional block diagram showing the control device 30 according to the present embodiment.
- Control device 30 includes a computer system.
- the control device 30 wirelessly communicates with the management device 4 via the communication system 7.
- the control device 30 includes a communication unit 301, a flight route data acquisition unit 302, a position data acquisition unit 303, a flight control unit 304, an image data acquisition unit 305, a request data acquisition unit 306, a determination unit 307, and a reply output unit 308.
- the communication unit 301 receives data or a signal transmitted from the management device 4 via the communication system 7.
- the communication unit 301 also transmits data or a signal to the management device 4 via the communication system 7.
- the flight route data acquisition unit 302 acquires, from the management device 4, position data of the unmanned vehicle 2 in which an abnormality has occurred. Further, the flight route data acquisition unit 302 acquires flight route data indicating the flight route of the drone 3 from the management device 4.
- the position data acquisition unit 303 acquires position data indicating the absolute position of the drone 3 from the position sensor 33.
- the flight control unit 304 controls the flight device 31 so that the drone 3 flies toward the unmanned vehicle 2 based on the position data of the unmanned vehicle 2 acquired by the flight route data acquisition unit 302.
- the image data acquisition unit 305 acquires image data of the unmanned vehicle 2 captured by the imaging device 34 from the imaging device 34.
- the request data acquisition unit 306 acquires, from the management device 4, request data for requesting to fly toward the unmanned vehicle 2.
- the determination unit 307 determines whether to fly toward the unmanned vehicle 2 or not.
- the reply output unit 308 outputs consent data or rejection data for the request data based on the determination result by the determination unit 307.
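The request/consent exchange handled by the request data acquisition unit 306, the determination unit 307, and the reply output unit 308 can be sketched as follows. The battery-level acceptance criterion is purely an assumed example (the patent does not state how the drone decides), and all names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class FlightRequest:
    """Request data from management device 4: fly to the stopped vehicle."""
    vehicle_position: tuple  # (x, y) of the stopped unmanned vehicle 2

class DroneController:
    """Minimal stand-in for control device 30.

    decide() plays the role of determination unit 307 and reply output
    unit 308: it consents only if the battery level suffices, which is
    an assumed criterion for illustration."""
    def __init__(self, battery_level: float):
        self.battery_level = battery_level  # 0.0..1.0

    def decide(self, request: FlightRequest) -> dict:
        if self.battery_level >= 0.3:
            return {"reply": "consent", "target": request.vehicle_position}
        return {"reply": "rejection"}

drone = DroneController(battery_level=0.8)
reply = drone.decide(FlightRequest(vehicle_position=(120.0, 45.0)))
print(reply["reply"])  # consent
```

Step S5 (flight route data) would then be sent only to a drone whose reply is consent data, as in the sequence of FIG. 2.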
- FIG. 5 is a functional block diagram showing the management device 4 according to the present embodiment.
- the management device 4 includes a computer system.
- the management device 4 wirelessly communicates with the control device 20 and the control device 30 via the communication system 7.
- the management device 4 includes a communication unit 40, a target travel data generation unit 41, a position data acquisition unit 42, an abnormal data acquisition unit 43, a guidance unit 44, a selection unit 45, an image data acquisition unit 46, a start command unit 47, an input data acquisition unit 48, and an output control unit 49.
- the communication unit 40 receives data or signals transmitted from the control device 20 and the control device 30 via the communication system 7.
- the communication unit 40 also transmits data or signals to the control device 20 and the control device 30 via the communication system 7.
- the target travel data generation unit 41 generates target travel data indicating target travel conditions of the unmanned vehicle 2.
- the target travel data includes the target travel speed and the target travel direction at each of a plurality of spaced points.
- the target acceleration is defined based on the difference between the target traveling speeds of the adjacent points.
- a target travel course is defined by a trajectory connecting a plurality of points.
- the position of the point is defined in the global coordinate system.
- the target travel data generation unit 41 outputs target travel data to the control device 20 of the unmanned vehicle 2 via the communication system 7.
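The target travel data described above can be pictured as a sequence of course points, each carrying a position in the global coordinate system together with a target traveling speed and direction, with the target acceleration derived from the speed difference of adjacent points. The following is a minimal illustrative sketch; all names, the fixed time step, and the data layout are hypothetical and not part of the embodiment:

```python
from dataclasses import dataclass

@dataclass
class TargetPoint:
    """One point of the target travel course (global coordinate system)."""
    x: float        # easting [m]
    y: float        # northing [m]
    speed: float    # target traveling speed [m/s]
    heading: float  # target traveling direction [rad]

def target_accelerations(points, dt):
    """Derive target accelerations from the speed differences of adjacent
    points, assuming a fixed time step dt between points (an illustrative
    simplification)."""
    return [(b.speed - a.speed) / dt for a, b in zip(points, points[1:])]

# a short straight course whose trajectory connects the points
course = [
    TargetPoint(0.0, 0.0, 5.0, 0.0),
    TargetPoint(10.0, 0.0, 6.0, 0.0),
    TargetPoint(20.0, 0.0, 6.0, 0.0),
]
print(target_accelerations(course, dt=2.0))  # -> [0.5, 0.0]
```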
- the position data acquisition unit 42 acquires position data of the unmanned vehicle 2 at the work site.
- the position data acquisition unit 42 acquires position data of the unmanned vehicle 2 that has stopped at the work site due to the occurrence of an abnormality.
- Position data of the unmanned vehicle 2 is detected by a position sensor 26 mounted on the unmanned vehicle 2.
- the position data acquisition unit 42 acquires position data of the unmanned vehicle 2 from the control device 20 via the communication system 7.
- the position data acquisition unit 42 acquires position data of the drone 3 at the work site.
- the position data of the drone 3 is detected by the position sensor 33 mounted on the drone 3.
- the position data acquisition unit 42 acquires position data of the drone 3 from the control device 30 via the communication system 7.
- a plurality of unmanned vehicles 2 operate.
- the position data acquisition unit 42 acquires position data of each of the plurality of unmanned vehicles 2.
- a plurality of drones 3 operate.
- the position data acquisition unit 42 acquires position data of each of the plurality of drones 3.
- the abnormal data acquisition unit 43 acquires the abnormal data output from the abnormal data output unit 208 of the unmanned vehicle 2.
- the abnormal data acquisition unit 43 acquires abnormal data of the unmanned vehicle 2 from the control device 20 via the communication system 7.
- the guidance unit 44 outputs the position data of the unmanned vehicle 2 stopped due to the occurrence of the abnormality to the drone 3. That is, the guidance unit 44 outputs, to the drone 3, position data of the point at which the unmanned vehicle 2, having stopped at the work site due to the occurrence of the abnormality, output the abnormality data.
- the position data of the unmanned vehicle 2 is data for guiding the drone 3 to the unmanned vehicle 2 stopped at the work site due to the occurrence of an abnormality.
- the guidance unit 44 outputs, to the control device 30 of the drone 3 via the communication system 7, the position data of the unmanned vehicle 2 that has stopped due to the occurrence of an abnormality.
- the guidance unit 44 outputs flight route data indicating the flight route from the drone 3 to the unmanned vehicle 2 that has stopped due to the occurrence of the abnormality, based on the position data of the unmanned vehicle 2 that has output the abnormality data and the position data of the drone 3.
- the guidance unit 44 outputs the flight route data leading to the unmanned vehicle 2 in which the abnormality has occurred to the control device 30 of the drone 3 via the communication system 7.
- the flight route is the shortest route connecting the drone 3 and the unmanned vehicle 2.
- the selection unit 45 selects a specific drone 3 from the plurality of drones 3 based on the position data of the unmanned vehicle 2 and the position data of each of the plurality of drones 3.
- the guiding unit 44 outputs, to the specific drone 3 selected by the selection unit 45, the position data of the unmanned vehicle 2 that has stopped due to the occurrence of the abnormality.
- the specific drone 3 includes the drone 3 having the shortest distance to the unmanned vehicle 2 stopped due to the occurrence of the abnormality among the plurality of drones 3 operating at the work site.
- the drone 3 having the shortest distance to the unmanned vehicle 2 stopped due to the occurrence of the abnormality is guided to the unmanned vehicle 2. Thereby, the flight distance or flight time until the drone 3 arrives at the stopped unmanned vehicle 2 is shortened.
- the image data acquisition unit 46 acquires the image data of the unmanned vehicle 2 output from the image data acquisition unit 305 of the drone 3.
- the image data acquisition unit 46 acquires image data of the unmanned vehicle 2 stopped by the occurrence of an abnormality, which is imaged by the imaging device 34 mounted on the drone 3.
- the image data is data for determining whether the unmanned vehicle 2 can resume traveling based on the target traveling data.
- the drone 3 that has arrived at the stopped unmanned vehicle 2 acquires image data of the unmanned vehicle 2 using the imaging device 34.
- the image data acquisition unit 46 acquires the image data of the unmanned vehicle 2 captured by the imaging device 34 of the drone 3 via the communication system 7.
- the restart instruction unit 47 outputs a restart instruction to restart the unmanned vehicle 2 based on the image data of the unmanned vehicle 2 acquired by the image data acquisition unit 46.
- the restart of the unmanned vehicle 2 means that the unmanned vehicle 2 stopped by the occurrence of an abnormality restarts traveling based on the target travel data.
- the restart instruction refers to an instruction to cause the unmanned vehicle 2 stopped by the occurrence of an abnormality to restart traveling based on the target traveling data.
- the restart instruction unit 47 outputs the restart instruction to the control device 20 of the unmanned vehicle 2 via the communication system 7. When the restart command is output, the stopped unmanned vehicle 2 resumes traveling based on the target travel data.
- the input data acquisition unit 48 acquires, from the input device 5, input data generated by operating the input device 5.
- the output control unit 49 controls the display device 6.
- the output control unit 49 outputs display data to the display device 6.
- the display device 6 displays the display data output from the output control unit 49.
- the output control unit 49 outputs the image data of the unmanned vehicle 2 stopped by the occurrence of an abnormality, acquired by the image data acquisition unit 46, to the display device 6.
- FIG. 6 is a flowchart showing the operation of the unmanned vehicle 2 according to the present embodiment.
- the target travel data of the unmanned vehicle 2 generated by the target travel data generation unit 41 is transmitted from the management device 4 to the control device 20 via the communication system 7.
- the target travel data acquisition unit 202 receives the target travel data from the management device 4 via the communication system 7 (step S101).
- the traveling control unit 206 outputs a driving command to the traveling device 21 based on the target traveling data acquired by the target traveling data acquisition unit 202 (step S102).
- the unmanned vehicle 2 travels based on the target travel data.
- the determination unit 207 determines whether an abnormality has occurred in the traveling state of the unmanned vehicle 2 based on at least one of the obstacle data acquired by the obstacle data acquisition unit 204 and the position data of the unmanned vehicle 2 acquired by the position data acquisition unit 205 (step S103).
- when it is determined in step S103 that no abnormality has occurred in the traveling state of the unmanned vehicle 2 (step S103: No), the unmanned vehicle 2 continues traveling based on the target travel data.
- when it is determined in step S103 that an abnormality has occurred in the traveling state of the unmanned vehicle 2 (step S103: Yes), the traveling control unit 206 outputs a stop command for stopping the unmanned vehicle 2 to the traveling device 21 (step S104). Note that the traveling control unit 206 may instead output a deceleration command for decelerating the unmanned vehicle 2 to the traveling device 21.
- the abnormal data output unit 208 outputs abnormality data indicating that an abnormality has occurred in the unmanned vehicle 2.
- the abnormal data output unit 208 transmits the abnormal data to the management device 4 via the communication system 7. Further, the abnormal data output unit 208 transmits the position data of the unmanned vehicle 2 stopped by the occurrence of the abnormality to the management device 4 via the communication system 7 (step S105).
- the process of step S105 corresponds to the process of step S2 described with reference to FIG.
- the drone 3 flies to the stopped unmanned vehicle 2, and the imaging device 34 of the drone 3 acquires image data of the unmanned vehicle 2.
- when it is determined based on the image data that the unmanned vehicle 2 can be restarted, the management device 4 transmits a restart command to the control device 20 via the communication system 7 (see step S7 in FIG. 2).
- when it is determined that the unmanned vehicle 2 cannot be restarted, a restart command is not transmitted from the management device 4 to the control device 20.
- the traveling control unit 206 determines whether a restart command has been acquired from the management device 4 (step S106).
- when it is determined in step S106 that the restart command has not been acquired (step S106: No), the unmanned vehicle 2 maintains the stopped state.
- when it is determined in step S106 that the restart command has been acquired (step S106: Yes), the traveling control unit 206 outputs a driving command to the traveling device 21 based on the target travel data.
- the unmanned vehicle 2 resumes traveling based on the target traveling data.
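The vehicle-side flow of FIG. 6 (travel, detect an abnormality, stop and report, then wait for a restart command) can be condensed into a small state transition. This is a hypothetical sketch of the loop, not the claimed control logic; function and state names are illustrative:

```python
def vehicle_step(state, obstacle_detected, off_course, restart_cmd):
    """One iteration of the FIG. 6 loop. state is 'driving' or 'stopped';
    returns (new_state, event). An abnormality in the traveling state is
    modeled as an obstacle detection or a deviation from the course."""
    if state == "driving":
        if obstacle_detected or off_course:          # step S103: Yes
            return "stopped", "abnormality_data"     # steps S104-S105
        return "driving", None                       # continue on target travel data
    # stopped: maintain the stop until a restart command arrives (step S106)
    if restart_cmd:
        return "driving", None                       # resume on target travel data
    return "stopped", None

print(vehicle_step("driving", True, False, False))  # -> ('stopped', 'abnormality_data')
```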
- FIG. 7 is a flowchart showing the operation of the management device 4 according to the present embodiment.
- the target travel data generation unit 41 generates target travel data of the unmanned vehicle 2.
- the target travel data generation unit 41 transmits the target travel data to the control device 20 via the communication system 7 (step S201).
- the process of step S201 corresponds to the process of step S1 described with reference to FIG.
- the position data acquisition unit 42 acquires position data of the unmanned vehicle 2 operating at the work site and position data of the drone 3 via the communication system 7 (step S202). When there are a plurality of unmanned vehicles 2 at the work site, the position data acquisition unit 42 acquires position data of each of the plurality of unmanned vehicles 2. When a plurality of drones 3 exist at the work site, the position data acquisition unit 42 acquires position data of each of the plurality of drones 3.
- when an abnormality occurs, the control device 20 transmits the position data and the abnormality data of the unmanned vehicle 2 in which the abnormality has occurred to the management device 4 via the communication system 7 (see step S2 in FIG. 2).
- when no abnormality has occurred, no abnormality data is transmitted from the control device 20 to the management device 4.
- the abnormal data acquisition unit 43 determines whether abnormal data has been acquired from the unmanned vehicle 2 (step S203).
- when it is determined in step S203 that abnormality data has not been acquired (step S203: No), the management device 4 performs the process of step S201.
- the unmanned vehicle 2 maintains traveling based on the target traveling data.
- when it is determined in step S203 that abnormality data has been acquired (step S203: Yes), the management device 4 starts the process of guiding the drone 3 to the unmanned vehicle 2 that has stopped due to the occurrence of the abnormality.
- the selection unit 45 selects a specific drone 3 from the plurality of drones 3 based on the position data of the unmanned vehicle 2 in which the abnormality has occurred and the position data of each of the plurality of drones 3 existing at the work site (step S204).
- the selection unit 45 selects, as the specific drone 3, the drone 3 having the shortest distance (linear distance) to the unmanned vehicle 2 stopped due to the occurrence of the abnormality among the plurality of drones 3.
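The shortest-linear-distance selection of step S204 amounts to a minimum over the drones' straight-line distances to the stopped vehicle. A minimal sketch, assuming two-dimensional positions; the function name and data layout are illustrative, not from the embodiment:

```python
import math

def select_specific_drone(vehicle_pos, drone_positions):
    """Select the drone with the shortest linear distance to the stopped
    unmanned vehicle (step S204). drone_positions maps a drone id to an
    (x, y) position in the same coordinate system as vehicle_pos."""
    def dist(p):
        return math.hypot(p[0] - vehicle_pos[0], p[1] - vehicle_pos[1])
    return min(drone_positions, key=lambda d: dist(drone_positions[d]))

# e.g. drones 3A, 3B, 3C standing by at increasing distances
drones = {"3A": (100.0, 0.0), "3B": (300.0, 0.0), "3C": (900.0, 0.0)}
print(select_specific_drone((0.0, 0.0), drones))  # -> 3A
```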
- FIG. 8 is a view showing an example of the display device 6 according to the present embodiment.
- the output control unit 49 causes the display device 6 to display map data of a work site, position data of the unmanned vehicle 2, and position data of the drone 3.
- when the drone 3 is not flying, it stands by in the standby facility 10 defined at the work site.
- a charger for charging a rechargeable battery mounted on the drone 3 is provided in the standby facility 10.
- the drone 3 charges the rechargeable battery with the charger in the standby facility 10.
- the output control unit 49 causes the display device 6 to display the icon of the unmanned vehicle 2 as the position data of the unmanned vehicle 2. Further, the output control unit 49 causes the display device 6 to display the icon of the drone 3 as the position data of the drone 3. For example, when the position of the unmanned vehicle 2 changes due to traveling, the output control unit 49 updates or moves the icon of the unmanned vehicle 2 on the display screen of the display device 6 based on the position data of the unmanned vehicle 2.
- similarly, the output control unit 49 updates or moves the icon of the drone 3 on the display screen of the display device 6 based on the position data of the drone 3. As a result, the administrator Wb can visually and intuitively recognize the position of the unmanned vehicle 2 and the position of the drone 3 at the work site.
- the unmanned vehicle 2A has stopped due to the occurrence of an abnormality in the unmanned vehicle 2A.
- Another unmanned vehicle 2B is traveling based on the target traveling data.
- it is assumed that the drones 3A, 3B, and 3C are in the standby facility 10 defined at the work site.
- the output control unit 49 may cause the display device 6 to display the display mode of the unmanned vehicle 2A stopped due to the occurrence of an abnormality and the display mode of the other unmanned vehicle 2B differently.
- the output control unit 49 may cause the display device 6 to display the icon of the unmanned vehicle 2A and the icon of the unmanned vehicle 2B differently in at least one of design, hue, lightness, and saturation.
- the output control unit 49 may continuously display one of the icon of the unmanned vehicle 2A and the icon of the unmanned vehicle 2B and blink the other.
- the selection unit 45 can calculate the distance between the unmanned vehicle 2A and the drone 3A based on the position data of the unmanned vehicle 2A and the position data of the drone 3A. Similarly, the selection unit 45 can calculate the distance between the unmanned vehicle 2A and the drone 3B and the distance between the unmanned vehicle 2A and the drone 3C.
- the distance between the unmanned vehicle 2A and the drone 3A is the shortest, then the distance between the unmanned vehicle 2A and the drone 3B is the shortest, and the distance between the unmanned vehicle 2A and the drone 3C is the longest.
- the selection unit 45 selects, as the specific drone 3, the drone 3A having the shortest distance to the unmanned vehicle 2A stopped due to the occurrence of the abnormality among the drones 3A, 3B, and 3C.
- the guiding unit 44 outputs, to the drone 3A selected by the selection unit 45, request data requesting the drone 3A to fly toward the unmanned vehicle 2A that has stopped due to the occurrence of the abnormality.
- that is, the guiding unit 44 transmits, via the communication system 7 to the drone 3A selected by the selection unit 45, the position data of the unmanned vehicle 2A stopped due to the occurrence of the abnormality and the request data requesting the drone 3A to fly toward the unmanned vehicle 2A (step S205).
- the process of step S205 corresponds to the process of step S3 described with reference to FIG.
- the drone 3A receives position data and request data of the unmanned vehicle 2A.
- when the control device 30 of the drone 3A consents to fly toward the unmanned vehicle 2A, it generates consent data consenting to fly toward the unmanned vehicle 2A.
- when the control device 30 of the drone 3A refuses to fly toward the unmanned vehicle 2A, it generates refusal data refusing to fly toward the unmanned vehicle 2A.
- there are cases in which it is difficult or impossible for the drone 3A to fly toward the unmanned vehicle 2A, such as when the rechargeable battery of the drone 3A is insufficiently charged or when the drone 3A is performing another operation.
- when it is difficult or impossible for the drone 3A to fly toward the unmanned vehicle 2A, the control device 30 of the drone 3A generates refusal data refusing to fly toward the unmanned vehicle 2A.
- the consent data or refusal data generated by the control device 30 is transmitted to the management device 4 via the communication system 7.
- the selection unit 45 acquires consent data or rejection data for the request data from the drone 3A via the communication system 7.
- the selection unit 45 determines whether consent data has been acquired from the drone 3A (step S206).
- when it is determined in step S206 that refusal data has been acquired from the drone 3A (step S206: No), the selection unit 45 selects the next specific drone 3 from the plurality of drones 3 (step S207).
- the selection unit 45 selects, as the next specific drone 3, the drone 3B whose distance (linear distance) to the unmanned vehicle 2A in which the abnormality has occurred is the shortest after that of the drone 3A among the drones 3A, 3B, and 3C.
- the guiding unit 44 transmits the position data and request data of the unmanned vehicle 2A to the drone 3B selected by the selecting unit 45 (step S205).
- the selection unit 45 determines whether consent data has been acquired from the drone 3B (step S206).
- when refusal data is acquired from the drone 3B as well, the drone 3C, whose distance to the unmanned vehicle 2A is the shortest after that of the drone 3B, is selected as the next specific drone 3, and the position data and the request data of the unmanned vehicle 2A are transmitted to the drone 3C. Thereafter, until consent data is acquired, the drones 3 are sequentially selected in ascending order of distance to the unmanned vehicle 2A, and the process of transmitting the position data and the request data of the unmanned vehicle 2A is performed.
- here, it is assumed that consent data is output from the drone 3B in step S206.
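The sequential request procedure of steps S205 to S207 (request the nearest drone, fall back to the next nearest on refusal) can be sketched as a loop over drones ordered by distance. The function names and the callback shape are hypothetical, not from the embodiment:

```python
def guide_drone(vehicle_pos, drones_by_distance, request):
    """Steps S205-S207 sketched as a loop. drones_by_distance lists drone ids
    in ascending order of linear distance to the stopped vehicle; request()
    transmits the position data and request data and returns True for consent
    data, False for refusal data."""
    for drone_id in drones_by_distance:
        if request(drone_id, vehicle_pos):   # step S206: consent data acquired
            return drone_id                  # this drone is guided to the vehicle
    return None                              # every drone returned refusal data

# e.g. drone 3A refuses (battery low), drone 3B consents
answers = {"3A": False, "3B": True, "3C": True}
chosen = guide_drone((50.0, 20.0), ["3A", "3B", "3C"],
                     lambda d, pos: answers[d])
print(chosen)  # -> 3B
```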
- the guiding unit 44 generates the flight route from the drone 3B to the unmanned vehicle 2A based on the position data of the unmanned vehicle 2A and the position data of the drone 3B.
- the flight route is, for example, the shortest route (straight route) connecting the drone 3B and the unmanned vehicle 2A.
- the guiding unit 44 transmits flight route data indicating the flight route to the unmanned vehicle 2A to the drone 3B via the communication system 7 (step S208).
- the process of step S208 corresponds to the process of step S5 described with reference to FIG.
- the drone 3B flies toward the unmanned vehicle 2A in which the abnormality has occurred based on the flight route data.
- the drone 3B that has arrived above the unmanned vehicle 2A acquires image data of the unmanned vehicle 2A using the imaging device 34.
- the image data of the unmanned vehicle 2A captured by the imaging device 34 of the drone 3B is transmitted to the management device 4 via the communication system 7.
- the image data acquisition unit 46 acquires the image data of the unmanned vehicle 2A captured by the imaging device 34 via the communication system 7 (step S209).
- the output control unit 49 causes the display device 6 to display the image data of the unmanned vehicle 2A acquired by the image data acquisition unit 46 (step S210).
- FIG. 9 is a view showing an example of the display device 6 according to the present embodiment.
- the output control unit 49 causes the display device 6 to display the image data of the unmanned vehicle 2 captured by the imaging device 34.
- the drone 3B acquires image data of the unmanned vehicle 2 from above the unmanned vehicle 2 using the imaging device 34.
- the image data of the unmanned vehicle 2 includes image data of the periphery of the unmanned vehicle 2.
- the restart instruction unit 47 determines whether the unmanned vehicle 2A can be restarted based on the image data of the unmanned vehicle 2A acquired by the image data acquisition unit 46 (step S211).
- even when the control device 20 of the unmanned vehicle 2A determines, based on the detection data of the non-contact sensor 25, that an obstacle is present around the unmanned vehicle 2A and stops the unmanned vehicle 2A, the obstacle may not actually exist.
- for example, when the non-contact sensor 25 erroneously detects unevenness of the traveling path as an obstacle, the unmanned vehicle 2A may stop even though it can actually continue traveling.
- the restart instruction unit 47 performs image processing on the image data of the unmanned vehicle 2A captured by the imaging device 34, and determines whether an obstacle exists around the unmanned vehicle 2A.
- when it is determined in step S211 that an obstacle is present around the unmanned vehicle 2A and the unmanned vehicle 2A cannot be restarted (step S211: No), the output control unit 49 causes the display device 6 to display display data indicating that the unmanned vehicle 2A cannot be restarted. For example, when it is determined, as a result of performing image processing on the image data of the unmanned vehicle 2A, that an obstacle is present in front of the unmanned vehicle 2A and the unmanned vehicle 2A cannot travel, the restart instruction unit 47 determines that the unmanned vehicle 2A cannot be restarted, and the output control unit 49 causes the display device 6 to display display data indicating that the unmanned vehicle 2A cannot be restarted. The administrator Wb can then view the display device 6 and instruct a worker to, for example, remove the obstacle existing around the unmanned vehicle 2A.
- when it is determined in step S211 that no obstacle is present around the unmanned vehicle 2A and the unmanned vehicle 2A can be restarted (step S211: Yes), the restart instruction unit 47 outputs a restart command to restart the unmanned vehicle 2A. For example, when it is determined, as a result of performing image processing on the image data of the unmanned vehicle 2A, that no obstacle is present around the unmanned vehicle 2A and the unmanned vehicle 2A can travel, the restart instruction unit 47 outputs the restart command.
- the restart instruction unit 47 transmits a restart instruction to the unmanned vehicle 2A via the communication system 7 (step S212).
- the process of step S211 corresponds to the process of step S7 described with reference to FIG.
- the unmanned vehicle 2A that has acquired the restart command resumes traveling based on the target travel data.
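The step S211 decision can be summarized as a single branch on the image-processing result: no obstacle found means a restart command is transmitted (step S212), an obstacle found means only display data is output. A hypothetical sketch; the return structure and names are illustrative:

```python
def decide_restart(obstacle_found_in_image):
    """Step S211 sketched as one decision: image processing of the drone's
    image data determines whether an obstacle is present around the stopped
    vehicle."""
    if obstacle_found_in_image:
        # step S211: No - display that the vehicle cannot be restarted
        return {"restart_command": False,
                "display": "unmanned vehicle cannot be restarted"}
    # step S211: Yes - transmit the restart command (step S212)
    return {"restart_command": True, "display": None}

print(decide_restart(False))  # -> {'restart_command': True, 'display': None}
```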
- the administrator Wb may determine whether the unmanned vehicle 2A can be restarted. Since the image data of the unmanned vehicle 2A is displayed on the display device 6, the administrator Wb can view the display device 6 and confirm the condition of the unmanned vehicle 2A. When the administrator Wb views the display device 6 on which the image data of the unmanned vehicle 2A is displayed and determines that an obstacle actually exists and the unmanned vehicle 2A cannot travel, the administrator Wb operates the input device 5 to generate determination data indicating that the unmanned vehicle 2A cannot travel. The output control unit 49 causes the display device 6 to display display data indicating that the unmanned vehicle 2A cannot be restarted.
- when the administrator Wb determines that the unmanned vehicle 2A can travel, the restart instruction unit 47 transmits a restart command to the unmanned vehicle 2A via the communication system 7 (step S212).
- FIG. 10 is a flowchart showing the operation of the drone 3 according to the present embodiment.
- the management device 4 implements the process of step S205 (step S3) described above. That is, the management device 4 transmits the position data and the request data of the unmanned vehicle 2A stopped by the occurrence of the abnormality to the drone 3B via the communication system 7.
- the flight route data acquisition unit 302 acquires position data of the unmanned vehicle 2A from the management device 4.
- the request data acquisition unit 306 acquires request data from the management device 4 (step S301).
- the determination unit 307 determines whether to fly toward the unmanned vehicle 2A (step S302).
- when the drone 3B can fly toward the unmanned vehicle 2A, the determination unit 307 determines to fly toward the unmanned vehicle 2A.
- when it is difficult or impossible for the drone 3B to fly toward the unmanned vehicle 2A, the determination unit 307 determines not to fly toward the unmanned vehicle 2A.
- when it is determined in step S302 to fly toward the unmanned vehicle 2A (step S302: Yes), the reply output unit 308 generates consent data consenting to fly toward the unmanned vehicle 2A.
- the reply output unit 308 transmits the consent data to the management device 4 via the communication system 7 (step S303).
- the process of step S303 corresponds to the process of step S4 described with reference to FIG.
- when it is determined in step S302 not to fly toward the unmanned vehicle 2A (step S302: No), the reply output unit 308 generates refusal data refusing to fly toward the unmanned vehicle 2A.
- the reply output unit 308 transmits the refusal data to the management device 4 via the communication system 7 (step S306).
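The consent-or-refusal determination of step S302 could be implemented as a check against the conditions the description mentions (insufficient battery charge, another ongoing operation). This is a sketch under those assumptions; the function name, threshold, and return strings are illustrative, not from the embodiment:

```python
def drone_reply(battery_fraction, busy, min_battery=0.3):
    """Step S302 decision of the determination unit, sketched under the
    assumption that an insufficiently charged rechargeable battery or another
    ongoing operation leads to refusal data; the threshold is illustrative."""
    if battery_fraction < min_battery or busy:
        return "refusal data"   # transmitted to the management device (step S306)
    return "consent data"       # transmitted to the management device (step S303)

print(drone_reply(0.9, busy=False))  # -> consent data
print(drone_reply(0.1, busy=False))  # -> refusal data
```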
- the management device 4 generates flight route data, and transmits the flight route data to the control device 30 via the communication system 7.
- the flight route data acquisition unit 302 acquires flight route data (step S304).
- the flight control unit 304 operates the flight device 31 based on the flight route data to fly the drone 3B to the unmanned vehicle 2A.
- the imaging device 34 of the drone 3B that has arrived at the unmanned vehicle 2A captures an image of the unmanned vehicle 2.
- the administrator Wb can remotely operate the drone 3B to acquire image data of the unmanned vehicle 2A while changing the relative position of the drone 3B with respect to the unmanned vehicle 2A. Further, when the optical system of the imaging device 34 includes a zoom lens, the administrator Wb may remotely operate the optical system of the imaging device 34 to magnify or reduce the optical image of the unmanned vehicle 2A and acquire the image data.
- the image data acquisition unit 305 acquires the image data of the unmanned vehicle 2A acquired by the imaging device 34.
- the image data acquisition unit 305 transmits the image data of the unmanned vehicle 2 to the management device 4 via the communication system 7 (step S305).
- the process of step S305 corresponds to the process of step S6 described with reference to FIG.
- the position data of the unmanned vehicle 2A that has stopped due to the occurrence of an abnormality is output to the drone 3B.
- the drone 3B can fly toward the unmanned vehicle 2A based on the position data of the unmanned vehicle 2A.
- the drone 3B can quickly acquire image data of the unmanned vehicle 2A. Further, appropriate measures can be taken based on the image data of the unmanned vehicle 2A acquired by the drone 3B. Therefore, the decrease in productivity at the work site is suppressed.
- the restart instruction unit 47 outputs a restart instruction to restart the unmanned vehicle 2A based on the image data of the unmanned vehicle 2A. Thereby, if the unmanned vehicle 2A can travel, the unmanned vehicle 2A can resume traveling based on the target traveling data. Therefore, the decrease in productivity at the work site is suppressed.
- the guidance unit 44 outputs the flight route from the drone 3B to the unmanned vehicle 2A based on the position data of the unmanned vehicle 2A and the position data of the drone 3B. Thereby, the drone 3B can fly quickly to the unmanned vehicle 2A according to the flight route.
- the flight route is the shortest route (straight route) connecting the drone 3B and the unmanned vehicle 2A. Thereby, the flight distance of the drone 3B is shortened, so that the drone 3B can arrive at the unmanned vehicle 2A in a short time.
- the selection unit 45 selects a specific drone 3 from the plurality of drones 3, and the guidance unit 44 transmits, to the specific drone 3 selected by the selection unit 45, the position data of the unmanned vehicle 2A stopped due to the occurrence of the abnormality. Thereby, the optimum drone 3 selected from the plurality of drones 3 is guided to the unmanned vehicle 2A.
- the selection unit 45 selects, as the specific drone 3 that flies toward the unmanned vehicle 2A, the drone 3 having the shortest distance to the unmanned vehicle 2A among the plurality of drones 3. Thereby, the selected specific drone 3 can arrive at the unmanned vehicle 2A in a short time.
- the selection unit 45 acquires consent data or refusal data for the request data from the drone 3 and determines the drone 3 that is to fly toward the unmanned vehicle 2A. As a result, the task of confirming the condition of the unmanned vehicle 2A is prevented from being assigned to a drone 3 for which flying toward the unmanned vehicle 2A is difficult or impossible.
- FIG. 11 is a block diagram showing an example of a computer system 1000 according to the present embodiment.
- Each of the management device 4, the control device 20, and the control device 30 described above includes a computer system 1000.
- the computer system 1000 includes a processor 1001 such as a central processing unit (CPU), a main memory 1002 including nonvolatile memory such as a read only memory (ROM) and volatile memory such as a random access memory (RAM), a storage 1003, and an interface 1004 including an input/output circuit.
- the functions of the management device 4 described above, the functions of the control device 20, and the functions of the control device 30 are stored in the storage 1003 as a program.
- the processor 1001 reads a program from the storage 1003 and develops the program in the main memory 1002, and executes the above-described processing according to the program.
- the program may be distributed to computer system 1000 via a network.
- the control device 20 of the unmanned vehicle 2 may have at least a part of the functions of the management device 4, and the control device 30 of the drone 3 may have at least a part of the functions of the management device 4. That is, the control device 20 may function as the management device 4, or the control device 30 may function as the management device 4.
- for example, at least one of the control device 20 and the control device 30 may have the functions of the target travel data generation unit 41, the position data acquisition unit 42, the abnormal data acquisition unit 43, the guidance unit 44, the selection unit 45, the image data acquisition unit 46, and the restart instruction unit 47.
- the control device 30 may generate a flight route from the drone 3 to the unmanned vehicle 2 based on the position data of the unmanned vehicle 2 and the position data of the drone 3.
- the computer system 1000 including at least one of the management device 4, the control device 20, and the control device 30 acquires the position data of the unmanned vehicle 2 that has output abnormality data at the work site, and outputs the position data of the unmanned vehicle 2 that has output the abnormality data to the drone 3 capable of flying at the work site. Thereby, a decrease in productivity can be suppressed at the work site where the unmanned vehicles 2 operate.
- in the embodiment described above, the unmanned vehicle 2 in which an abnormality has occurred stops.
- the unmanned vehicle 2 in which the abnormality has occurred may be decelerated.
- as described above, an abnormality in the traveling state of the unmanned vehicle 2 includes a state in which the unmanned vehicle 2 is traveling at a speed lower than the target traveling speed.
- the guiding unit 44 may output, to the drone 3, position data of the unmanned vehicle 2 that has been decelerated.
- the drone 3 may have, in addition to the imaging device 34, for example, a microphone device capable of acquiring audio data around the unmanned vehicle 2A.
- the unmanned vehicle 2A may stop or may decelerate to a traveling speed lower than the target traveling speed without stopping.
- the selection unit 45 selects, as the specific drone 3 that flies toward the unmanned vehicle 2A, the drone 3 with the shortest distance to the unmanned vehicle 2A among the plurality of drones 3.
- the selection unit 45 may instead select, as the specific drone 3, the drone 3 with the highest flight speed, or the drone 3 with the largest amount of charge at the charging place.
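The distance-based selection above reduces to a minimization over the candidate drones. A minimal sketch, assuming a hypothetical list-of-dicts representation of the drones (the patent does not specify any data structure):

```python
import math

def select_drone(vehicle_pos, drones):
    """Select, from the plurality of drones, the one with the shortest
    distance to the stopped unmanned vehicle.

    vehicle_pos: (x, y) position data of the unmanned vehicle 2A.
    drones: list of dicts with hypothetical keys "id" and "pos".
    """
    return min(drones, key=lambda d: math.dist(d["pos"], vehicle_pos))
```

Selecting by flight speed or by remaining charge instead, as the alternative criteria mention, would only change the `key` function.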
- the flight route of the drone 3 to the unmanned vehicle 2A may not be the shortest route (straight route) connecting the drone 3 and the unmanned vehicle 2A.
- the guiding unit 44 may generate a flight route that bypasses the obstacle as a flight route.
- the image data of the unmanned vehicle 2A captured by the imaging device 34 is transmitted to the management device 4, and the restart command unit 47 of the management device 4 outputs a restart command for the unmanned vehicle 2A.
- the control device 30 of the drone 3 may perform image processing on the image data of the unmanned vehicle 2A, generate a restart command based on the image processing result, and transmit the restart command to the unmanned vehicle 2A without the intervention of the management device 4. That is, the restart command generated by the drone 3 may be transmitted from the drone 3 to the unmanned vehicle 2A.
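A rough sketch of this drone-side decision path: the predicate below stands in for the image-processing step, which the patent leaves unspecified, and all names are illustrative:

```python
def decide_restart(image_frames, frame_shows_clear_path):
    """Generate a restart command when the image processing result
    indicates the stopped vehicle can safely resume traveling.

    image_frames: frames of the unmanned vehicle 2A captured by the
        imaging device 34.
    frame_shows_clear_path: callable frame -> bool; a stand-in for the
        actual image-processing step, which the patent does not detail.
    Returns a command dict to transmit directly from the drone to the
    vehicle (without the management device 4), or None.
    """
    if image_frames and all(frame_shows_clear_path(f) for f in image_frames):
        return {"target": "unmanned_vehicle_2A", "command": "restart"}
    return None
```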
- the position data of the unmanned vehicle 2A that has stopped due to the occurrence of an abnormality may be transmitted from the control device 20 of the unmanned vehicle 2A to the control device 30 of the drone 3 without passing through the management device 4. In this way as well, the drone 3 can fly toward the unmanned vehicle 2A that has stopped due to the occurrence of an abnormality.
- the drone 3 is installed in the standby facility 10 defined at the work site.
- the drone 3 may be mounted on the unmanned vehicle 2.
- the drone 3 mounted on the unmanned vehicle 2 can communicate by wire or wirelessly.
- when the unmanned vehicle 2 stops, the drone 3 ascends to a position from which the imaging device 34 can capture an image of at least the region in front of the unmanned vehicle 2, and the imaging device 34 performs imaging.
- the imaging device 34 may include a single camera or a plurality of cameras.
- the management device 4 may not have the position data acquisition unit 42, and the guidance unit 44 may not output the position data of the stopped unmanned vehicle 2A to the drone 3.
- in the embodiment described above, the drone 3 is the mobile body on which the imaging device is mounted.
- the unmanned vehicle 2 may be a mobile unit on which the imaging device is mounted.
- the image data of the unmanned vehicle 2A that has stopped due to the occurrence of an abnormality may be captured by an imaging device mounted on the stopped unmanned vehicle 2A itself.
- that is, the mobile body on which the imaging device is mounted may be the same vehicle as the stopped unmanned vehicle 2A.
- FIG. 12 is a functional block diagram showing the unmanned vehicle 2A according to the present embodiment.
- the functional block diagram shown in FIG. 12 differs from the functional block diagram shown in FIG. 3 in that the stopped unmanned vehicle 2A has an imaging device 28, and in that the image data acquisition unit 209 of the control device 20 acquires the image data of the stopped unmanned vehicle 2A captured by the imaging device 28.
- the imaging device 28 captures image data of the unmanned vehicle 2A.
- the image data of the stopped unmanned vehicle 2A includes at least image data of the area in front of the unmanned vehicle 2A.
- One imaging device 28 may be provided in the unmanned vehicle 2A, or a plurality of imaging devices 28 may be provided.
- the image data may be, for example, an image of the area in front of the unmanned vehicle 2A captured by a single camera provided on the unmanned vehicle 2A, or an image of the entire periphery of the unmanned vehicle 2A captured by a plurality of cameras.
- the image data captured by the imaging device 28 is transmitted to the management device 4 via the communication system 7.
- the output control unit 49 of the management device 4 can cause the display device 6 to display the image data of the unmanned vehicle 2A captured by the imaging device 28.
- the management device 4 may not have the position data acquisition unit 42, and the guidance unit 44 may not output the position data of the stopped unmanned vehicle 2A to the drone 3.
- the unmanned vehicle 2 is a dump truck which is a kind of transport vehicle.
- the unmanned vehicle 2 may be a work machine equipped with working equipment, such as a hydraulic excavator or a bulldozer.
- the work machine equipped with the working equipment may be remotely operated.
- SYMBOLS: 1: management system; 2: unmanned vehicle; 3: drone (flight vehicle); 4: management device; 5: input device; 6: display device (output device); 7: communication system; 8: control facility; 9: wireless communication equipment; 10: standby facility; 20: control device; 21: traveling device; 21B: braking device; 21D: driving device; 21H: wheels; 21S: steering device; 22: vehicle body; 23: dump body; 24: vehicle speed sensor; 25: noncontact sensor; 26: position sensor; 27: wireless communication device; 28: imaging device; 30: control device; 31: flight device; 31D: drive device; 31P: propeller; 32: main body; 33: position sensor; 34: imaging device; 36: wireless communication device; 40: communication unit; 41: target travel data generation unit; 42: position data acquisition unit; 43: abnormality data acquisition unit; 44: guidance unit; 45: selection unit; 46: image data acquisition unit; 47: restart command unit; 48: input data acquisition unit; 49: output control unit; 201: communication unit; 202: target travel data acquisition unit; 203: vehicle speed data acquisition unit; 204: obstacle data acquisition unit; 205: position data acquisition unit; 206: travel control unit; 207: determination unit; 208: abnormality data output unit; 209: image data acquisition unit; 301: communication unit; 302: flight route data acquisition unit; 303: position data acquisition unit; 304: flight control unit; 305: image data acquisition unit; 306: request data acquisition unit; 307: determination unit; 308
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Automation & Control Theory (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- General Engineering & Computer Science (AREA)
- Chemical & Material Sciences (AREA)
- Mechanical Engineering (AREA)
- Combustion & Propulsion (AREA)
- Computer Networks & Wireless Communication (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Selective Calling Equipment (AREA)
- Time Recorders, Dirve Recorders, Access Control (AREA)
- Control Of Vehicle Engines Or Engines For Specific Uses (AREA)
Abstract
Description
FIG. 1 is a diagram schematically showing an example of a management system 1 of a work site according to the present embodiment. As shown in FIG. 1, an unmanned vehicle 2 and a flight vehicle 3 operate at the work site.
The unmanned vehicle 2 can travel at the work site. The unmanned vehicle 2 includes a control device 20, a traveling device 21, a vehicle body 22 supported by the traveling device 21, a dump body 23 supported by the vehicle body 22, a vehicle speed sensor 24 that detects the traveling speed of the unmanned vehicle 2, a noncontact sensor 25 that detects objects without contact, a position sensor 26 that detects the position of the unmanned vehicle 2, and a wireless communication device 27.
The drone 3 can fly at the work site. The drone 3 includes a control device 30, a flight device 31, a main body 32 supported by the flight device 31, a position sensor 33 that detects the position of the drone 3, an imaging device 34, and a wireless communication device 36.
FIG. 2 is a sequence diagram showing an overview of the processing of the management system 1 according to the present embodiment. The management device 4 generates target travel data indicating target travel conditions of the unmanned vehicle 2. The management device 4 transmits the target travel data to the unmanned vehicle 2 via the communication system 7 (step S1).
FIG. 3 is a functional block diagram showing the control device 20 according to the present embodiment. The control device 20 includes a computer system. The control device 20 wirelessly communicates with the management device 4 via the communication system 7.
FIG. 4 is a functional block diagram showing the control device 30 according to the present embodiment. The control device 30 includes a computer system. The control device 30 wirelessly communicates with the management device 4 via the communication system 7.
FIG. 5 is a functional block diagram showing the management device 4 according to the present embodiment. The management device 4 includes a computer system. The management device 4 wirelessly communicates with the control device 20 and the control device 30 via the communication system 7.
FIG. 6 is a flowchart showing the operation of the unmanned vehicle 2 according to the present embodiment. The target travel data of the unmanned vehicle 2 generated by the target travel data generation unit 41 is transmitted from the management device 4 to the control device 20 via the communication system 7. The target travel data acquisition unit 202 receives the target travel data from the management device 4 via the communication system 7 (step S101).
FIG. 7 is a flowchart showing the operation of the management device 4 according to the present embodiment. The target travel data generation unit 41 generates the target travel data of the unmanned vehicle 2 and transmits it to the control device 20 via the communication system 7 (step S201). The processing of step S201 corresponds to the processing of step S1 described with reference to FIG. 2.
FIG. 10 is a flowchart showing the operation of the flight vehicle 3 according to the present embodiment. When an abnormality occurs in the unmanned vehicle 2A, the management device 4 performs the processing of step S205 (step S3) described above. That is, the management device 4 transmits, via the communication system 7, the position data of the unmanned vehicle 2A that has stopped due to the abnormality and the request data to the drone 3B. The flight route data acquisition unit 302 acquires the position data of the unmanned vehicle 2A from the management device 4. The request data acquisition unit 306 acquires the request data from the management device 4 (step S301).
As described above, according to the present embodiment, the position data of the unmanned vehicle 2A that has stopped due to the occurrence of an abnormality is output to the drone 3B. The drone 3B can therefore fly toward the unmanned vehicle 2A based on its position data and quickly acquire image data of the unmanned vehicle 2A. Appropriate measures can then be taken based on the image data of the unmanned vehicle 2A acquired by the drone 3B. A decline in productivity at the work site is thus suppressed.
FIG. 11 is a block diagram showing an example of a computer system 1000 according to the present embodiment. Each of the management device 4, the control device 20, and the control device 30 described above includes the computer system 1000. The computer system 1000 has a processor 1001 such as a CPU (central processing unit), a main memory 1002 including nonvolatile memory such as ROM (read only memory) and volatile memory such as RAM (random access memory), a storage 1003, and an interface 1004 including an input/output circuit. The functions of the management device 4, the control device 20, and the control device 30 described above are stored in the storage 1003 as programs. The processor 1001 reads a program from the storage 1003, loads it into the main memory 1002, and executes the processing described above according to the program. The program may be distributed to the computer system 1000 via a network.
In the embodiment described above, the unmanned vehicle 2 in which an abnormality has occurred stops. The unmanned vehicle 2 in which an abnormality has occurred may instead decelerate. As described above, an abnormality in the traveling state of the unmanned vehicle 2 includes a state in which the unmanned vehicle 2 is traveling at a speed lower than the target traveling speed. The guidance unit 44 may output the position data of the decelerated unmanned vehicle 2 to the drone 3.
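The deceleration-type abnormality described in this paragraph reduces to a speed comparison. A minimal sketch; the 0.5 threshold ratio is an assumed illustrative value, not one given in the patent:

```python
def is_travel_state_abnormal(actual_speed, target_speed, ratio=0.5):
    """Return True when the unmanned vehicle 2 is traveling at a speed
    lower than the target traveling speed by more than the assumed
    margin (the ratio is illustrative, not from the patent)."""
    return actual_speed < target_speed * ratio
```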
Claims (9)
- A management system of a work site, comprising an image data acquisition unit that acquires image data, captured by an imaging device mounted on a mobile body, of an unmanned vehicle that has stopped at the work site due to the occurrence of an abnormality.
- The management system of a work site according to claim 1, further comprising: a position data acquisition unit that acquires position data of the stopped unmanned vehicle; and a guidance unit that outputs the position data of the stopped unmanned vehicle to the mobile body.
- The management system of a work site according to claim 2, wherein the mobile body includes a flight vehicle capable of flying at the work site, the position data acquisition unit acquires position data of the flight vehicle, and the guidance unit outputs, to the flight vehicle, a flight route from the flight vehicle to the unmanned vehicle based on the position data of the unmanned vehicle and the position data of the flight vehicle.
- The management system of a work site according to claim 3, wherein the flight route is the shortest route connecting the flight vehicle and the unmanned vehicle.
- The management system of a work site according to claim 3 or claim 4, wherein the position data acquisition unit acquires position data of each of a plurality of the flight vehicles, the management system further comprises a selection unit that selects a specific flight vehicle from the plurality of flight vehicles based on the position data of the unmanned vehicle and the position data of each of the plurality of flight vehicles, and the guidance unit outputs the position data of the unmanned vehicle to the specific flight vehicle selected by the selection unit.
- The management system of a work site according to claim 5, wherein the specific flight vehicle is, among the plurality of flight vehicles, the flight vehicle with the shortest distance to the unmanned vehicle.
- The management system of a work site according to claim 5 or claim 6, wherein the guidance unit outputs request data to the flight vehicle, and the selection unit acquires, from the flight vehicle, acceptance data or rejection data in response to the request data.
- The management system of a work site according to any one of claims 1 to 7, further comprising a restart command unit that outputs, based on the image data, a restart command for restarting the unmanned vehicle.
- A management method of a work site, comprising acquiring image data, captured by an imaging device mounted on a mobile body, of an unmanned vehicle that has stopped at the work site due to the occurrence of an abnormality.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/611,305 US11535374B2 (en) | 2017-12-27 | 2018-11-29 | Management system of work site and management method of work site |
CA3063022A CA3063022A1 (en) | 2017-12-27 | 2018-11-29 | Management system of work site and management method of work site |
AU2018394476A AU2018394476B2 (en) | 2017-12-27 | 2018-11-29 | Management system of work site and management method of work site |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017-252645 | 2017-12-27 | ||
JP2017252645A JP7026505B2 (ja) | 2017-12-27 | 2017-12-27 | Management system of work site and management method of work site |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019130973A1 true WO2019130973A1 (ja) | 2019-07-04 |
Family
ID=67067253
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/044060 WO2019130973A1 (ja) | 2017-12-27 | 2018-11-29 | Management system of work site and management method of work site |
Country Status (5)
Country | Link |
---|---|
US (1) | US11535374B2 (ja) |
JP (1) | JP7026505B2 (ja) |
AU (1) | AU2018394476B2 (ja) |
CA (1) | CA3063022A1 (ja) |
WO (1) | WO2019130973A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021131161A1 (ja) * | 2019-12-25 | 2021-07-01 | Kobelco Construction Machinery Co., Ltd. | Work support server and imaging device selection method |
JP2022057995A (ja) * | 2020-09-30 | 2022-04-11 | Toyota Motor Corporation | Information processing device and method |
WO2024024146A1 (ja) * | 2022-07-26 | 2024-02-01 | Mitsubishi Heavy Industries, Ltd. | Movement control method, program, and movement control system |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7349860B2 (ja) * | 2019-09-20 | 2023-09-25 | Fujita Corporation | Management system for a plurality of vehicles |
CN114572398A (zh) * | 2022-03-24 | 2022-06-03 | 上海顺诠科技有限公司 | Charging and inspection relay system for air-ground drones and method thereof |
WO2024143041A1 (ja) * | 2022-12-27 | 2024-07-04 | Kyocera Document Solutions Inc. | Mobile body control system |
WO2024189873A1 (ja) * | 2023-03-16 | 2024-09-19 | Hitachi Construction Machinery Co., Ltd. | Work machine |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016171441A (ja) * | 2015-03-12 | 2016-09-23 | Secom Co., Ltd. | Monitoring system and flying robot |
US20160363932A1 (en) * | 2016-08-26 | 2016-12-15 | Caterpillar Paving Products Inc. | Worksite monitoring system |
WO2017122278A1 (ja) * | 2016-01-12 | 2017-07-20 | Rakuten, Inc. | Information providing system, information providing method, and program |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8918540B2 (en) * | 2005-09-26 | 2014-12-23 | The Boeing Company | Unmanned air vehicle interoperability agent |
JP4728974B2 (ja) | 2007-01-30 | 2011-07-20 | Komatsu Ltd. | Engine remote starting device and method in an unmanned vehicle control system |
US9175966B2 (en) * | 2013-10-15 | 2015-11-03 | Ford Global Technologies, Llc | Remote vehicle monitoring |
US10633091B2 (en) | 2015-01-29 | 2020-04-28 | Scope Technologies Holdings Limited | Accident monitoring using remotely operated or autonomous aerial vehicles |
WO2016123424A1 (en) | 2015-01-29 | 2016-08-04 | Scope Technologies Holdings Limited | Remote accident monitoring and vehcile diagnostic distributed database |
US9505494B1 (en) * | 2015-04-30 | 2016-11-29 | Allstate Insurance Company | Enhanced unmanned aerial vehicles for damage inspection |
EP4361038A3 (en) * | 2016-02-26 | 2024-06-12 | SZ DJI Technology Co., Ltd. | Systems and methods for adjusting uav trajectory |
-
2017
- 2017-12-27 JP JP2017252645A patent/JP7026505B2/ja active Active
-
2018
- 2018-11-29 US US16/611,305 patent/US11535374B2/en active Active
- 2018-11-29 CA CA3063022A patent/CA3063022A1/en not_active Abandoned
- 2018-11-29 AU AU2018394476A patent/AU2018394476B2/en active Active
- 2018-11-29 WO PCT/JP2018/044060 patent/WO2019130973A1/ja active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016171441A (ja) * | 2015-03-12 | 2016-09-23 | Secom Co., Ltd. | Monitoring system and flying robot |
WO2017122278A1 (ja) * | 2016-01-12 | 2017-07-20 | Rakuten, Inc. | Information providing system, information providing method, and program |
US20160363932A1 (en) * | 2016-08-26 | 2016-12-15 | Caterpillar Paving Products Inc. | Worksite monitoring system |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021131161A1 (ja) * | 2019-12-25 | 2021-07-01 | Kobelco Construction Machinery Co., Ltd. | Work support server and imaging device selection method |
JP2021103840A (ja) * | 2019-12-25 | 2021-07-15 | Kobelco Construction Machinery Co., Ltd. | Work support server and imaging device selection method |
JP7409074B2 (ja) | 2019-12-25 | 2024-01-09 | Kobelco Construction Machinery Co., Ltd. | Work support server and imaging device selection method |
JP2022057995A (ja) * | 2020-09-30 | 2022-04-11 | Toyota Motor Corporation | Information processing device and method |
JP7351280B2 (ja) | 2020-09-30 | 2023-09-27 | Toyota Motor Corporation | Information processing device and method |
WO2024024146A1 (ja) * | 2022-07-26 | 2024-02-01 | Mitsubishi Heavy Industries, Ltd. | Movement control method, program, and movement control system |
Also Published As
Publication number | Publication date |
---|---|
US20200164980A1 (en) | 2020-05-28 |
JP7026505B2 (ja) | 2022-02-28 |
JP2019118084A (ja) | 2019-07-18 |
AU2018394476B2 (en) | 2021-05-13 |
AU2018394476A2 (en) | 2020-01-30 |
CA3063022A1 (en) | 2019-12-02 |
AU2018394476A1 (en) | 2019-12-05 |
US11535374B2 (en) | 2022-12-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019130973A1 (ja) | Management system of work site and management method of work site | |
US20190265710A1 (en) | Vehicle control device, vehicle control system, vehicle control method, and vehicle control program | |
CA3064189C (en) | Work system, work machine, and control method | |
US11077849B2 (en) | Vehicle control system, vehicle control method, and storage medium | |
JP7052692B2 (ja) | Platooning system | |
US11479246B2 (en) | Vehicle control device, vehicle control method, and storage medium | |
US20210086649A1 (en) | Vehicle control system, vehicle control method, and program | |
EP3835178A1 (en) | Automatic parking system | |
JP2022103827A (ja) | Vehicle control device, vehicle control method, and program | |
JP2021047794A (ja) | Management system for a plurality of vehicles | |
US11307592B2 (en) | Management system of work site and management method of work site | |
JP2018081623A (ja) | Vehicle control system, vehicle control method, and vehicle control program | |
JP2023030111A (ja) | Driving support device, driving support method, and program | |
US20220203866A1 (en) | Vehicle control device, vehicle control method, and storage medium | |
JP2022129695A (ja) | Vehicle control device, vehicle control method, and program | |
JP2022065386A (ja) | Work vehicle monitoring system | |
US20220315050A1 (en) | Vehicle control device, route generation device, vehicle control method, route generation method, and storage medium | |
JP7186210B2 (ja) | Vehicle control device, vehicle control method, and program | |
CN117241973B (zh) | Vehicle control device and vehicle control method | |
EP4148524A1 (en) | Mining vehicle task control | |
JP2023061093A (ja) | Construction management system | |
JP2021026483A (ja) | Management device for a plurality of vehicles | |
JP2022155702A (ja) | Vehicle control device, vehicle control method, and program | |
JP2022140008A (ja) | Movement control system, movement control method, control program, and control device | |
JP2022103474A (ja) | Vehicle control device, vehicle control method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18896946 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2018394476 Country of ref document: AU Date of ref document: 20181129 Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 18896946 Country of ref document: EP Kind code of ref document: A1 |