WO2022070567A1 - Remote operation support device and remote operation support system - Google Patents

Remote operation support device and remote operation support system

Info

Publication number
WO2022070567A1
WO2022070567A1 (PCT/JP2021/027284)
Authority
WO
WIPO (PCT)
Prior art keywords
aerial vehicle
unmanned aerial
remote control
landing
work machine
Prior art date
Application number
PCT/JP2021/027284
Other languages
English (en)
Japanese (ja)
Inventor
均 佐々木
誠司 佐伯
龍一 廣瀬
洋一郎 山崎
Original Assignee
コベルコ建機株式会社 (Kobelco Construction Machinery Co., Ltd.)
Application filed by コベルコ建機株式会社
Publication of WO2022070567A1

Classifications

    • B60R 1/00: Optical viewing arrangements; real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B64U 20/87: Constructional aspects of UAVs; arrangement of on-board electronics; mounting of imaging devices, e.g. mounting of gimbals
    • B64U 50/37: Propulsion; power supply; supply or distribution of electrical power; charging when not in flight
    • B64U 80/25: Transport or storage specially adapted for UAVs with arrangements for servicing the UAV, for recharging batteries or refuelling
    • E02F 9/26: Component parts of dredgers or soil-shifting machines; indicating devices
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • B64U 2101/30: UAVs specially adapted for imaging, photography or videography
    • B64U 70/99: Launching from or landing on platforms; means for retaining the UAV on the platform, e.g. dogs or magnets

Definitions

  • The present invention relates to a remote control support device and a remote control support system that support remote work performed by an operator who operates a work machine via a remote control device.
  • In Patent Document 1, in order to remotely control a work machine, image data of the work site captured by an image pickup device mounted on an unmanned aerial vehicle and by an image pickup device installed on a structure at the work site can be displayed.
  • In Patent Document 2, since a camera mounted on an unmanned aerial vehicle is used to manage the progress of the work, the progress can be managed easily.
  • A camera mounted on an unmanned aerial vehicle is convenient because it can capture the part that the operator wants to see (for example, an excavation target) from various angles. However, the time available for imaging is limited by the battery of the unmanned aerial vehicle.
  • On the other hand, a fixed camera may not be needed depending on the position where the work is performed, the content of the work, and the imaging direction of the camera. That is, cost is wasted on cameras that are not used by operators, work managers, and the like.
  • An operator who performs work by remotely operating a work machine uses an actual image pickup device (for example, a camera mounted inside the cab of the work machine or a camera fixed at the work site) as a main camera.
  • The present invention provides a remote control support device for supporting an operator's remote control of a work machine (for example, a hydraulic excavator) using a remote control device.
  • The remote control support device has a first support processing element and a second support processing element.
  • The first support processing element, in response to a designation operation on the remote input interface provided in the remote control device, transmits to an unmanned aerial vehicle (for example, a drone) equipped with a camera a landing command to land on either a predetermined position on the work machine (for example, a predetermined position on the zenith surface of the cab) or a predetermined position at the work site (for example, a predetermined position on the ground, on a pole installed at the work site, or on a building being demolished).
  • The second support processing element acquires an environmental image showing the surroundings of the work machine, captured by the camera mounted on the unmanned aerial vehicle while the unmanned aerial vehicle is landed at the predetermined position, and transmits the environmental image to the remote control device so that the image output device of the remote control device can output it.
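  As a hedged, illustrative sketch only (class names, message fields, and link objects below are hypothetical assumptions, not from the patent), the division of labor between the two support processing elements could look like this:

  ```python
  # Hypothetical sketch of the two support processing elements described above.
  # Names and message formats are illustrative assumptions, not from the patent.

  class FirstSupportElement:
      """Sends a landing command in response to a designation on the remote input interface."""

      def __init__(self, uav_link):
          self.uav_link = uav_link  # channel to the unmanned aerial vehicle

      def on_designation(self, target):
          # target is a predetermined position on the work machine or at the work site
          self.uav_link.send({"command": "land", "target": target})


  class SecondSupportElement:
      """Acquires the environmental image while the UAV is landed and forwards it."""

      def __init__(self, uav_link, remote_link):
          self.uav_link = uav_link
          self.remote_link = remote_link

      def relay_environment_image(self):
          image = self.uav_link.request_image()  # captured while landed
          self.remote_link.send_image(image)     # shown on the image output device
          return image
  ```

  The sketch only fixes the direction of the two data flows the claim describes: a landing command toward the vehicle, and an environmental image back toward the remote control device.
  
  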
  • Since the unmanned aerial vehicle equipped with the camera lands on either the predetermined position on the work machine or the predetermined position at the work site, the operator can continue to operate the work machine remotely without worrying about the remaining battery level of the unmanned aerial vehicle.
  • Further, since the unmanned aerial vehicle equipped with the camera lands on either the predetermined position on the work machine or the predetermined position at the work site, the person who manages the work cost can reduce the number of fixed cameras permanently installed at the work site, and thus the cost of installing fixed cameras.
  • When an unmanned aerial vehicle equipped with a camera lands at a predetermined position on the work machine, the operator can view the environmental image captured by that camera even at a work site where installing a fixed camera is difficult.
  • Since the unmanned aerial vehicle can land not only on the operator's own machine (the work machine targeted for remote control) but also at a predetermined position at the work site, the operator can use the camera mounted on the unmanned aerial vehicle as an auxiliary camera to view areas that are likely to be blind spots of the main camera (for example, a camera mounted inside the cab of the work machine), such as the area to the front right with respect to the orientation of the seat in the cab.
  • An explanatory diagram of the configuration of a remote control support system as an embodiment of the remote control support device according to the present invention.
  • An explanatory diagram of the configuration of the remote control device.
  • An explanatory diagram of the configuration of the work machine.
  • An explanatory diagram of an embodiment of the positioning mechanism according to the present invention.
  • An explanatory diagram of an embodiment of the landing positions provided on the work machine according to the present invention.
  • A bird's-eye view of a work site as an embodiment of the present invention.
  • An explanatory diagram of an embodiment of the landing port provided at the top of a fixed structure according to the present invention.
  • An explanatory diagram of a processing flow for remote control as an embodiment of the present invention.
  • An explanatory diagram of a processing flow including the first support process and the second support process as an embodiment of the present invention.
  • An explanatory diagram of a processing flow including an example of a second support process as another embodiment of the present invention.
  • The remote control support system shown in FIG. 1, as an embodiment of the remote control support device 100 having the configuration according to the present invention, is a system for assisting an operator's remote control of a work machine using a remote control device.
  • It is composed of a remote control support server 10 having the remote control support device 100, a remote control device 20 with which the operator remotely controls a work machine 40, and an unmanned aerial vehicle 50 having a UAV image pickup device 512 (for example, a camera) that captures an environmental image showing the state of the work site.
  • the remote control support server 10, the remote control device 20, the work machine 40, and the unmanned aerial vehicle 50 are configured to enable network communication with each other.
  • The communication network between the remote control support server 10 and the remote control device 20, the network between the remote control support server 10 and the work machine 40, and the network between the remote control support server 10 and the unmanned aerial vehicle 50 may be the same or different.
  • the number of the work machines 40 may be one or a plurality.
  • For example, the operator may interrupt the remote control of a first work machine 40A, which is the main target of remote control, and switch the target of remote control to a second work machine 40B. In this case, it is assumed that the first work machine 40A and the second work machine 40B exist at the same work site.
  • When there are a plurality of work machines 40, they may be of the same type or of different types.
  • the remote control is a concept that means that the operator operates the work machine 40 from a position away from the work machine 40 without getting on the work machine 40.
  • the operator is a concept that refers to a person who operates the remote control device 20 to operate the work machine 40.
  • the remote control support server 10 includes a remote control support device 100, a database 110, and a server wireless communication device 122.
  • the remote control support device 100 includes a first support processing element 101 and a second support processing element 102.
  • Each support processing element is composed of an arithmetic processing unit (a single-core processor, a multi-core processor, or a processor core constituting such a processor); it reads necessary data and software from a storage device such as a memory and executes the arithmetic processing described later on that data in accordance with the software.
  • the database 110 has a function of storing and holding operation information which is information on the operation status of the work machine 40. For example, when the work machine 40 is running, the database 110 stores and holds information about the running speed. Further, for example, when the upper swivel body 460 (see FIG. 3) of the work machine 40 is swiveling, the database 110 stores and holds information regarding the swivel speed.
  • the database 110 stores information on the size of the work machine 40 (for example, information that the operating hydraulic excavator is a 7t excavator or information that the operating hydraulic excavator is a 13t excavator, etc.). It may have a function of holding.
  • The database 110 has a function of reading out operation information and the like of the work machine 40 in response to a request from the operator, and a function of automatically writing such information by acquiring it from the work machine 40.
  • the database 110 has a function of storing and holding work position information including information on the position where the work machine 40 is working.
  • For example, the database 110 stores and holds the work position information of the work machine 40 by acquiring, with a GNSS (Global Navigation Satellite System) receiver, information on the coordinates at which the work machine 40 is working.
  • the information regarding the coordinates may be information regarding the world coordinates or information regarding the local coordinates at the work site where the work machine 40 is working.
  • the database 110 has a function of storing and holding the landing position information including the information regarding the position where the unmanned aerial vehicle 50 can land at the work site where the work machine 40 is working.
  • the database 110 stores and holds information about the position of the work machine 40 as the landing position information.
  • the database 110 may store and retain information as to whether or not the work machine 40 has a positioning mechanism for landing the unmanned aerial vehicle 50.
  • the database 110 may store and hold information on the position of the work machine 40 where the positioning mechanism is installed.
  • the database 110 may store and hold information on how many positioning mechanisms are installed in the work machine 40.
  • the database 110 stores and holds information regarding the installation position of the landing structure 60 (for example, a pole) installed at the work site as the landing position information. Further, the database 110 may store and retain information regarding the height (for example, altitude, etc.) of the landing structure 60 (for example, a pole) installed at the work site as the landing position information.
  • the database 110 may store and retain information regarding whether or not the landing structure 60 installed at the work site has a battery for charging the unmanned aerial vehicle 50.
  • the database 110 may store and retain information regarding whether or not the landing structure 60 installed at the work site has a magnet for fixing the unmanned aerial vehicle 50.
  • The database 110 also stores and retains the attributes of the work machine 40, the operation schedule of the work machine 40, information on the name, position, and scale of the work site, an environmental image showing the state of the work site where the work machine 40 operates, and the like.
  • The database 110 may store, as attributes of the work machine 40, information that the work machine 40 is a hydraulic excavator, the manufacturer name, the serial number, and the like. Further, when a plurality of work machines 40 exist at the same work site, the database 110 stores and retains identification information for identifying each work machine 40 (for example, a control number predetermined for each work machine 40 by the manager of the work site).
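  As a non-authoritative illustration of the records the database 110 is described as holding (all field names below are hypothetical assumptions, not from the patent), one possible schema sketch:

  ```python
  # Illustrative sketch of the records described for database 110.
  # Field names are hypothetical assumptions, not from the patent.
  from dataclasses import dataclass


  @dataclass
  class WorkMachineRecord:
      machine_id: str                 # control number assigned by the site manager
      machine_type: str               # e.g. "hydraulic excavator"
      manufacturer: str
      serial_number: str
      size_class_t: float             # e.g. 7.0 for a 7 t excavator
      work_position: tuple            # coordinates (world or site-local)
      has_positioning_mechanism: bool = False
      positioning_mechanism_count: int = 0


  @dataclass
  class LandingStructureRecord:
      structure_id: str
      position: tuple                 # installation position at the work site
      height_m: float                 # e.g. height of a pole top
      has_charging_battery: bool = False
      has_fixing_magnet: bool = False
  ```

  The two record types mirror the two kinds of landing position information described above: positions on a work machine and positions on a landing structure installed at the work site.
  
  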
  • The server wireless communication device 122 has a function of transmitting and receiving, among the remote control device 20, the work machine 40, and the unmanned aerial vehicle 50, the command signals that the operator uses to remotely control the work machine 40 while viewing an environmental image of the work site around the work machine 40 captured by the UAV image pickup device 512 of the unmanned aerial vehicle 50.
  • the remote control device 20 includes a remote control device 200, a remote input interface 210, and a remote output interface 220.
  • The remote control device 200 is composed of an arithmetic processing unit (a single-core processor, a multi-core processor, or a processor core constituting such a processor); it reads necessary data and software from a storage device such as a memory and executes the corresponding arithmetic processing on that data in accordance with the software.
  • the remote input interface 210 includes a remote control mechanism 211.
  • the remote output interface 220 includes an image output device 221 and a remote wireless communication device 222.
  • The remote input interface 210 includes a landing operation device for landing the unmanned aerial vehicle 50.
  • In one example, the landing operation device is a push button; when the operator presses it, the unmanned aerial vehicle 50 starts the landing control.
  • The push button is only one example; the landing operation device is not limited to a push button.
  • Another example of a landing operation device is a dial; the unmanned aerial vehicle 50 may start the landing control when the operator twists the dial.
  • The landing operation device may also be an operation lever included in the remote control mechanism 211; the unmanned aerial vehicle 50 may start the landing control when the operator operates the lever.
  • The landing operation device may also be a foot pedal included in the remote control mechanism 211; the unmanned aerial vehicle 50 may start the landing control when the operator steps on the pedal.
  • The unmanned aerial vehicle 50 may also start the landing control when the operator taps an icon displayed on the remote output interface 220.
  • The tap operation is only one example of operating the icon; other examples include a double tap, a long tap, a swipe, a pinch, a drag, and a flick.
  • When the remote output interface 220 serves as the landing operation device, shaking it is another example; for instance, the unmanned aerial vehicle 50 may start the landing control when the operator shakes a tablet terminal device.
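  The variety of landing operation devices above amounts to mapping several different input events onto one and the same landing command. A minimal sketch, assuming hypothetical event names:

  ```python
  # Minimal sketch: any of the listed input events triggers the same landing
  # control. Event names are illustrative assumptions, not from the patent.

  LANDING_EVENTS = {
      "push_button_pressed",
      "dial_twisted",
      "operation_lever_moved",
      "foot_pedal_pressed",
      "icon_tapped",
      "terminal_shaken",
  }


  def handle_input_event(event, start_landing):
      """Start landing control when any landing operation device fires."""
      if event in LANDING_EVENTS:
          start_landing()
          return True
      return False
  ```

  The point of the sketch is that the landing control itself is device-agnostic; only the set of recognized trigger events changes from one remote input interface to another.
  
  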
  • the remote control mechanism 211 includes a traveling operation device, a turning operation device, a boom operation device, an arm operation device, and a bucket operation device.
  • Each operating device has an operating lever that receives a rotation operation.
  • the operation lever (travel lever) of the travel operation device is operated to move the lower traveling body 450.
  • the travel lever may also serve as a travel pedal.
  • a traveling pedal fixed to the base or the lower end of the traveling lever may be provided.
  • the operation lever (swivel lever) of the swivel operation device is operated to move the hydraulic swivel motor constituting the swivel mechanism 430.
  • the operating lever (boom lever) of the boom operating device is operated to move the boom cylinder 442 of the work machine 40.
  • the operation lever (arm lever) of the arm operation device is operated to move the arm cylinder 444 of the work machine 40.
  • the operation lever (bucket lever) of the bucket operation device is operated to move the bucket cylinder 446 of the work machine 40.
  • Each operation lever constituting the remote control mechanism 211 is arranged around the seat St for the operator to sit on, for example, as shown in FIG.
  • the seat St is shaped like a high back chair with armrests.
  • the seat St may be any form of seating that can be seated by the operator, such as a low back chair without a headrest or a chair without a backrest.
  • a pair of left and right traveling levers 2110 corresponding to the left and right crawlers are arranged side by side in front of the seat St.
  • One operation lever may also serve as a plurality of operation levers.
  • For example, the left operating lever 2111 provided in front of the left frame of the seat St shown in FIG. 3 may function as an arm lever when operated in the front-rear direction and as a swivel lever when operated in the left-right direction.
  • Likewise, the right operating lever 2112 provided in front of the right frame of the seat St shown in FIG. 3 may function as a boom lever when operated in the front-rear direction and as a bucket lever when operated in the left-right direction.
  • the lever pattern may be arbitrarily changed by an operation instruction of the operator.
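  The switchable lever pattern described above can be sketched as a lookup from a lever and its operation axis to the driven actuator. The pattern name and table layout are assumptions; the entries mirror the example in the text (left lever: arm/swivel, right lever: boom/bucket):

  ```python
  # Sketch of a switchable lever pattern. The single pattern shown mirrors the
  # example in the text; pattern names and keys are illustrative assumptions.

  LEVER_PATTERNS = {
      "default": {
          ("left", "front_rear"): "arm",
          ("left", "left_right"): "swivel",
          ("right", "front_rear"): "boom",
          ("right", "left_right"): "bucket",
      },
  }


  def resolve_function(pattern_name, lever, axis):
      """Return which actuator a lever movement drives under the active pattern."""
      return LEVER_PATTERNS[pattern_name][(lever, axis)]
  ```

  Changing the lever pattern by an operator instruction then amounts to selecting a different table, with no change to the levers themselves.
  
  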
  • The image output device 221 is composed of, for example, as shown in FIG. 3, a central image output device 2210, a left image output device 2211, and a right image output device 2212, each having a substantially rectangular screen and arranged in front of the seat St, diagonally forward left, and diagonally forward right, respectively.
  • the shapes and sizes of the screens (image display areas) of the central image output device 2210, the left image output device 2211, and the right image output device 2212 may be the same or different.
  • The left image output device 2211 is arranged so that its right edge is adjacent to the left edge of the central image output device 2210 and the screens of the two devices form an inclination angle θ1 (for example, 120° ≤ θ1 ≤ 150°).
  • The right image output device 2212 is arranged so that its left edge is adjacent to the right edge of the central image output device 2210 and the screens of the two devices form an inclination angle θ2 (for example, 120° ≤ θ2 ≤ 150°).
  • The inclination angles θ1 and θ2 may be the same or different.
  • the screens of the central image output device 2210, the left image output device 2211, and the right image output device 2212 may be parallel to the vertical direction or tilted with respect to the vertical direction. At least one of the central image output device 2210, the left image output device 2211, and the right image output device 2212 may be configured by the image output device divided into a plurality of parts.
  • the central image output device 2210 may be composed of a pair of vertically adjacent image output devices having a substantially rectangular screen.
  • The image output device 221 is composed of the central image output device 2210, the left image output device 2211, and the right image output device 2212.
  • the remote wireless communication device 222 included in the remote output interface 220 has a function of transmitting and receiving a command signal input by the operator for remote control of the work machine 40.
  • the remote wireless communication device 222 has a function of transmitting a command signal input by the operator for remote control of the unmanned aerial vehicle 50.
  • The command signals input for remote control of the unmanned aerial vehicle 50 include a command to land the unmanned aerial vehicle 50 at a predetermined position designated by the operator (for example, the positioning mechanism 480).
  • the remote wireless communication device 222 has a function of receiving an image captured by the actual image pickup device 412 mounted on the work machine 40, which will be described later, via the remote control support server 10.
  • the remote wireless communication device 222 has a function of receiving an image captured by the UAV image pickup device 512 mounted on the unmanned aerial vehicle 50, which will be described later, via the remote control support server 10.
  • the work machine 40 refers to a work vehicle that can be operated at the work site.
  • the work machine 40 is, for example, a hydraulic excavator having a bucket.
  • a hydraulic excavator was mentioned as an example of the work machine 40, but the example of the work machine 40 is not limited to this example.
  • Other examples of the working machine 40 include a hybrid excavator, a crane, a lifting magnet, a cargo handling machine having a grappler, a dismantling machine having a crushing device, a bulldozer having a blade, and the like.
  • The work machine 40 is, for example, a crawler excavator (construction machine) and, as shown in FIG. 3, includes a crawler-type lower traveling body 450 and an upper swivel body 460 mounted on the lower traveling body 450 so as to be rotatable via a turning mechanism 430.
  • a cab 470 (driver's cab) is provided on the front left side of the upper swivel body 460.
  • An actuating mechanism 440 is provided at the front center portion of the upper swivel body 460.
  • the work machine 40 is provided with a marker for landing.
  • The landing marker is a sheet-like member whose outer edge is fixed to the upper swivel body 460 and which can be spread out horizontally.
  • the landing marker is, for example, a sticker that can be freely attached to and removed from the upper swivel body 460.
  • the landing marker has a plurality of AR markers recognizable by the unmanned aerial vehicle 50 on its upper surface.
  • the AR marker is a marker for designating a position for displaying additional information in an image recognition type AR system, and is, for example, a simple and clear black-and-white graphic.
  • A plurality of AR markers, combining large and small markers, are arranged in order to realize a highly accurate landing.
  • The image recognition device mounted on the unmanned aerial vehicle 50 can land accurately at the center position of the landing marker by recognizing progressively smaller AR markers as it approaches the work machine 40 from a distance.
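  One way to read this coarse-to-fine scheme: far from the work machine only the large markers are resolvable, while close in the small markers give a finer fix on the landing-marker center. A geometric sketch, where the pixel threshold and marker fields are assumptions rather than anything specified in the patent:

  ```python
  # Sketch of coarse-to-fine marker selection for landing. The pixel threshold
  # and the marker dict layout are illustrative assumptions.

  def select_target_marker(detected_markers, min_pixel_size=20):
      """Prefer the smallest marker that is still reliably resolvable.

      detected_markers: list of dicts with 'size_px' (apparent size in pixels)
      and 'center' (marker center in image coordinates).
      """
      usable = [m for m in detected_markers if m["size_px"] >= min_pixel_size]
      if not usable:
          return None
      # Far away, only large markers pass the threshold; close in, the small
      # markers also pass and give a finer fix on the landing-marker center.
      return min(usable, key=lambda m: m["size_px"])
  ```

  As the vehicle descends, the same selection rule automatically hands over from the large markers to the small ones, because the apparent pixel size of every marker grows with proximity.
  
  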
  • the actual machine input interface 410 includes an actual machine operation mechanism 411, an actual machine image pickup device 412, and an actual machine positioning device 414.
  • the actual machine operation mechanism 411 includes a plurality of operation levers arranged in the same manner as the remote control mechanism 211 around the seat arranged inside the cab 470.
  • the cab 470 is provided with a drive mechanism or a robot that receives a signal according to the operation mode of the remote control lever and moves the actual machine operation lever based on the received signal.
  • The actual image pickup device 412 is installed, for example, inside the cab 470, and images the environment, including at least a part of the actuating mechanism 440, through the front window, which is partitioned by a pair of left and right pillars on the front side of the cab 470 ("L" and "R" are appended to the reference numerals when distinguishing left and right), and through a pair of left and right side windows. Some or all of the front window and the side windows may be omitted.
  • the actual positioning device 414 is a device that detects the position of the work machine 40, and is composed of, for example, a GNSS receiver.
  • the actual machine positioning device 414 has a function of acquiring information on the altitude at which the work machine 40 exists.
  • For example, the actual machine positioning device 414 includes a known barometric pressure sensor and acquires information on the altitude at which the work machine 40 exists by measuring the atmospheric pressure around the work machine 40.
  • The barometric pressure sensor was mentioned as an example of how the actual machine positioning device 414 acquires information on the altitude at which the work machine 40 exists, but the method is not limited to this example.
  • As another example, the actual machine positioning device 414 may acquire the altitude at the position of the work machine 40 from altitude data included in GNSS map data.
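  The pressure-to-altitude conversion implied by the barometric sensor example is commonly done with the international barometric formula. The following is a generic sketch of that standard relation, not the patent's own method, and it assumes a known (or assumed) sea-level pressure p0:

  ```python
  # Generic barometric altitude estimate (international barometric formula).
  # This illustrates the sensor-based approach in the text; it is not taken
  # from the patent, and the sea-level pressure p0 must be known or assumed.

  def barometric_altitude_m(pressure_pa, p0_pa=101325.0):
      """Estimate altitude in metres from measured atmospheric pressure."""
      return 44330.0 * (1.0 - (pressure_pa / p0_pa) ** (1.0 / 5.255))
  ```

  In practice p0 drifts with weather, so such an estimate is usually calibrated against a reference (for example, a known site elevation) before use.
  
  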
  • The actual machine output interface 420 includes an actual machine wireless communication device 422. Information about the position of the work machine 40 detected by the actual machine positioning device 414 is transmitted to the remote control support server 10 through the actual machine wireless communication device 422.
  • The work attachment as the actuating mechanism 440 includes a boom 441 mounted on the upper swing body 460 so as to be able to rise and fall, an arm 443 rotatably connected to the tip of the boom 441, and a bucket 445 rotatably connected to the tip of the arm 443.
  • The actuating mechanism 440 is also equipped with a boom cylinder 442, an arm cylinder 444, and a bucket cylinder 446, each configured as a telescopic hydraulic cylinder.
  • the boom cylinder 442 is interposed between the boom 441 and the upper swing body 460 so as to expand and contract by receiving the supply of hydraulic oil and rotate the boom 441 in the undulating direction.
  • the arm cylinder 444 expands and contracts by receiving the supply of hydraulic oil, and is interposed between the arm 443 and the boom 441 so as to rotate the arm 443 about a horizontal axis with respect to the boom 441.
  • the bucket cylinder 446 expands and contracts by receiving the supply of hydraulic oil and is interposed between the bucket 445 and the arm 443 so as to rotate the bucket 445 about a horizontal axis with respect to the arm 443.
  • The positioning mechanism 480 (for example, a landing port) has a function of fixing the landed unmanned aerial vehicle 50, described later, so that the unmanned aerial vehicle 50 does not move from its landing position as long as a force of a predetermined value or less is applied to it in a predetermined direction.
  • For example, the positioning mechanism 480 functions as an electromagnet and fixes the unmanned aerial vehicle 50 by magnetically attracting the landing mechanism 580 of the unmanned aerial vehicle 50.
  • Magnetic attraction was shown as an example of the function of the positioning mechanism 480 for fixing the unmanned aerial vehicle 50, but this function is not limited to magnetic attraction.
  • the positioning mechanism 480 has a concave portion
  • the landing mechanism 580 of the unmanned aerial vehicle 50 has a convex portion.
  • the positioning mechanism 480 fixes the unmanned aerial vehicle 50 by fitting the convex portion into the concave portion when the convex portion is provided.
  • as the function of the positioning mechanism 480 fixing the unmanned aerial vehicle 50, an example is shown in which the convex portion of the landing mechanism 580 fits into the concave portion of the positioning mechanism 480, but the way the positioning mechanism 480 fixes the unmanned aerial vehicle 50 is not limited to this example.
  • the positioning mechanism 480 has a recess, and the landing mechanism 580 of the unmanned aerial vehicle 50 is fitted into the recess.
  • the positioning mechanism 480 may have a battery 4810 for charging the unmanned aerial vehicle 50 (see FIG. 4).
  • the battery 4810 can charge the unmanned aerial vehicle 50.
  • as an example of a method of charging the unmanned aerial vehicle 50, a known wireless power supply may be used.
  • the positioning mechanism 480 may be installed at one place of the work machine 40, or may be installed at a plurality of places of the work machine 40.
  • for example, a front positioning mechanism 481 may be installed at the front of the work machine 40 (for example, the front of the zenith surface of the cab 470), a left positioning mechanism 482 on the left side of the work machine 40 (for example, the left side of the zenith surface of the cab 470), a right positioning mechanism 483 on the right side of the work machine 40 (for example, the right side of the upper swing body 460), and a rear positioning mechanism 484 at the rear of the work machine 40 (for example, the rear of the upper swing body 460).
  • the symbol F in FIG. 5 indicates the front side (front side) of the work machine 40.
  • the symbol L indicates the left side of the work machine 40.
  • the symbol R indicates the right side of the work machine 40.
  • the symbol B indicates the rear side (back side) of the work machine 40.
  • as the positioning mechanism 480, an example using the front positioning mechanism 481, the left positioning mechanism 482, the right positioning mechanism 483, and the rear positioning mechanism 484 is shown, but the example of installing a plurality of positioning mechanisms 480 is not limited to this example.
  • other examples of installing the positioning mechanism 480 include installing it at the left front of the cab 470, at the left rear of the upper swing body 460, at the right front of the upper swing body 460, and at the right rear of the upper swing body 460.
  • the unmanned aerial vehicle 50 is an aircraft that cooperates with the work machine 40.
  • the unmanned aerial vehicle 50 is a known drone equipped with a UAV imager 512 (eg, a camera).
  • the unmanned aerial vehicle 50 includes a UAV control device 500, a UAV input interface 510, a UAV image pickup device 512, a UAV output interface 520, a UAV wireless communication device 522, a landing mechanism 580 (for example, landing legs), and an altitude sensor (not shown).
  • the unmanned aerial vehicle 50 may have an angle adjusting mechanism for adjusting the imaging angle of the UAV imaging device 512.
  • an example of the angle adjusting mechanism is a ball head in which a camera mount attached to the substrate of the unmanned aerial vehicle 50 and a platform on which the camera is mounted are connected by a ball joint.
  • the ball head has a function of adjusting the imaging angle of the camera by electrically operating the ball joint to pan (rotate about 360 degrees), tilt (rotate about 90 degrees), and roll (rotate about 180 degrees).
  • the example of the angle adjustment mechanism is not limited to the ball head.
  • another example of the angle adjusting mechanism is a known pan head that adjusts the imaging direction (yaw direction, pitch direction) of the camera.
  • the unmanned aerial vehicle 50 may be remotely operated by the operator who remotely operates the work machine 40 with the remote control device 20, or by another operator (for example, an operator dedicated to the unmanned aerial vehicle 50).
  • the unmanned aerial vehicle 50 may fly independently according to a predetermined program.
  • the UAV image pickup device 512 included in the UAV input interface 510 has a function of capturing an environmental image which is an image showing the state of the work site including at least a part of the work machine 40.
  • the UAV wireless communication device 522 included in the UAV output interface 520 has a function of transmitting and receiving information including the position information of the unmanned aerial vehicle 50 to the remote control support server 10.
  • the landing mechanism 580 has a function of landing the unmanned aerial vehicle 50.
  • the landing mechanism 580 may have a magnet.
  • this allows the unmanned aerial vehicle 50 to land on an iron landing surface.
  • when the unmanned aerial vehicle 50 lands on the work machine 40, even if the work machine 40 does not have the positioning mechanism 480, the landing mechanism 580 having a magnet can magnetically attach to an iron portion of the work machine 40 (for example, the upper surface of the upper swing body 460) to land.
  • as an example of the unmanned aerial vehicle 50 landing on the work machine 40, magnetically attaching the landing mechanism 580 to the upper surface of the upper swing body 460 is given, but the example of landing the unmanned aerial vehicle 50 on the work machine 40 is not limited to this example.
  • other examples include magnetically attaching the landing mechanism 580 to the side surface of the upper swing body 460 to land on the work machine 40, magnetically attaching to the iron landing structure 60, and magnetically attaching to an iron plate laid on the ground at the work site.
  • the altitude sensor has a function of measuring the altitude at which the unmanned aerial vehicle 50 flies, and also a function of measuring the altitude of the landing position of the unmanned aerial vehicle 50.
  • the altitude sensor is, for example, a barometric pressure sensor.
  • thereby, the UAV control device 500 can acquire information on the altitude at which the unmanned aerial vehicle 50 flies and information on the altitude of the landing position of the unmanned aerial vehicle 50.
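The altitude information above comes from a barometric pressure sensor. As a rough illustration only (the patent does not specify a conversion method), the international barometric formula relates a pressure reading to an altitude estimate; the function name and sea-level reference value are assumptions:

```python
def pressure_to_altitude_m(pressure_hpa: float, sea_level_hpa: float = 1013.25) -> float:
    """Estimate altitude in meters from barometric pressure.

    Hypothetical helper using the international barometric formula;
    altitude increases as pressure falls relative to the sea-level reference.
    """
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))
```

At the sea-level reference pressure the estimate is 0 m; around 900 hPa it is roughly 1 km, which matches the formula's intended use for low-altitude flight.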
  • the landing structure 60 is a structure used for the unmanned aerial vehicle 50 to land.
  • An example of the landing structure 60 is a pole installed at a work site.
  • a pole installed at a work site is mentioned as an example of the landing structure 60, but the example of the landing structure 60 is not limited to this example.
  • other examples of the landing structure 60 include the roof of an office installed at the work site, a gate installed at the work site, a building targeted for demolition in building demolition work at the work site, and a structure built at a height that gives a bird's-eye view of the work site.
  • the landing structure 60 may be installed at one place on the work site or at a plurality of places on the work site (see FIG. 6).
  • the landing structure 60 may be installed at a height at which the UAV image pickup device 512 mounted on the unmanned aerial vehicle 50 can capture a bird's-eye view of the work site; therefore, the landing structure 60 does not necessarily have to exist inside the work site.
  • the landing structure 60 may have a landing port 680 used for landing the unmanned aerial vehicle 50 on the upper surface of the landing structure 60 (see FIG. 7).
  • an example in which the landing structure 60 has the landing port 680 installed on its upper surface is given, but when the landing structure 60 has the landing port 680, the installation is not limited to this example.
  • for example, the landing port 680 may be attached to the upper part of a pole installed at the work site so that the bottom surface of the landing port 680 faces the outer peripheral surface of the pole.
  • the unmanned aerial vehicle 50 can land in a direction orthogonal to the extending direction of the pole.
  • therefore, the unmanned aerial vehicle 50 can land on the landing port 680 installed on the upper part of the pole even if there is no space for installing the landing port 680 on the upper surface of the pole, and the UAV image pickup device 512 provided in the unmanned aerial vehicle 50 can capture a bird's-eye view image of the work site.
  • as an example of attaching the landing port 680 to the upper part of a pole installed at the work site, attaching it so that the bottom surface of the landing port 680 faces the outer peripheral surface of the pole was given, but the example of attaching the landing port 680 to the upper part of the pole is not limited to this example.
  • for example, the landing port 680 may instead be attached so that its outer peripheral surface faces the outer peripheral surface of the pole; also in this case, the unmanned aerial vehicle 50 can land on the landing port 680 installed on the upper part of the pole even if there is no space for installing the landing port 680 on the upper surface of the pole.
  • the landing port 680 may have a function of fixing the landed unmanned aerial vehicle 50, which will be described later.
  • the landing port 680 has a function as an electromagnet, and fixes the unmanned aerial vehicle 50 by magnetically adhering to the landing mechanism 580 of the unmanned aerial vehicle 50.
  • the landing port 680 may have a charging device for charging the unmanned aerial vehicle 50 (for example, a power source drawn to a work site (not shown), a portable battery, etc.).
  • the charging device can charge the unmanned aerial vehicle 50.
  • as an example of a method of charging the unmanned aerial vehicle 50, a known wireless power supply may be used.
  • next, the process for storing the operation information, the landing position information, the position information, the positioning mechanism information, and the like of the work machine 40 will be described.
  • the remote control device 200 determines whether or not there is a system start operation (FIG. 8 / STEP201).
  • the system start operation is an operation in which the operator pushes the remote input interface 210 in order to specify the work machine 40 intended for remote control.
  • a push operation is given as an example of a system boot operation, but the example of a system boot operation is not limited to this example.
  • Other examples of system startup operations include tap operation, double tap operation, long tap operation, swipe operation, pinch operation, drag operation, flick operation, shake operation, and the like.
  • when a tablet terminal device is used as the remote output interface 220, other examples of the system startup operation include a push operation, tap operation, double tap operation, long tap operation, swipe operation, pinch operation, drag operation, flick operation, shake operation, and the like performed on the tablet terminal device in order to specify the work machine 40 intended for remote control.
  • the remote control device 200 transmits, to the remote control support server 10 through the remote wireless communication device 222, an environment confirmation request, which is a request for confirming the environment of the work site around the work machine 40 designated by the operator through the system start operation (FIG. 8 / STEP202).
  • when the remote control support server 10 receives the environment confirmation request through the server wireless communication device 122 (FIG. 8 / C10), it transmits the environment confirmation request to the work machine 40 designated by the operator through the server wireless communication device 122 (FIG. 8 / STEP101).
  • the actual machine control device 400 executes a process for acquiring the environmental image data (FIG. 8 / STEP401).
  • the environmental image data is data including an image showing the surroundings of the work machine.
  • the environmental image data may include not only an image showing the surroundings of the work machine but also information such as the time when the environmental image was captured.
  • the environmental image may be captured by the actual image pickup device 412 or may be captured by the UAV image pickup device 512.
  • when the actual machine control device 400 acquires the environmental image data, it transmits the environmental image data to the remote control support server 10 through the actual machine wireless communication device 422 (FIG. 8 / STEP402).
  • when the remote control support server 10 receives the environmental image data through the server wireless communication device 122 (FIG. 8 / C11), it stores and holds the environmental image data in the database 110, and transmits the environmental image data to the remote control device 20 through the server wireless communication device 122 (FIG. 8 / STEP102).
  • when the remote control device 20 receives the environmental image data through the remote wireless communication device 222 (FIG. 8 / C21), the remote control device 200 executes a process of outputting the environmental image data to the image output device 221 (FIG. 8 / STEP203).
  • for example, the remote control device 200 executes control for dividing the environmental image included in the environmental image data and displaying it on the central image output device 2210, the left image output device 2211, and the right image output device 2212.
  • the remote control device 200 recognizes the operation mode of the remote control mechanism 211 (FIG. 8 / STEP204) and transmits a remote control command corresponding to the operation mode to the remote control support server 10 through the remote wireless communication device 222 (FIG. 8 / STEP205).
  • when the remote control support server 10 receives the remote control command through the server wireless communication device 122 (FIG. 8 / C12), it transmits the remote control command to the work machine 40 (FIG. 8 / STEP103).
  • when the actual machine control device 400 receives the remote control command through the actual machine wireless communication device 422 (FIG. 8 / C42), it executes a process for controlling the operation of the actuating mechanism 440 and the like (FIG. 8 / STEP403).
  • for example, the actual machine control device 400 executes a process for scooping the soil in front of the work machine 40 with the bucket 445, swiveling the upper swing body 460, and then dropping the soil from the bucket 445.
  • the actual machine control device 400 recognizes the operation information which is the information related to the operation of the work machine, and transmits the operation information to the remote control support server 10 through the actual machine wireless communication device 422 (FIG. 8 / STEP404).
  • the actual machine control device 400 may also transmit, to the remote control support server 10 through the actual machine wireless communication device 422, landing position information regarding positions where the unmanned aerial vehicle 50 can land on the work machine 40, position information regarding the position of the work machine 40, and positioning mechanism information, which is information regarding whether or not the work machine 40 has the positioning mechanism 480.
  • the positioning mechanism information may include not only information on whether or not the work machine 40 has the positioning mechanism 480 but also information on whether or not the positioning mechanism 480 has the battery 4810 used for charging the unmanned aerial vehicle 50.
  • when the remote control support server 10 receives the operation information and the like through the server wireless communication device 122 (FIG. 8 / C13), it stores the operation information and the like in the database 110 (FIG. 8 / STEP104).
  • the remote control device 200 determines whether or not the operator has performed a designated operation through the remote input interface 210 (FIG. 9 / STEP211).
  • the "designated operation" is an operation in which the operator operates the remote input interface 210 in order to land the unmanned aerial vehicle 50. If the determination result is negative (FIG. 9 / STEP211 ... NO), the process ends. On the other hand, when the determination result is affirmative (FIG. 9 / STEP211 ... YES), the remote control device 200 transmits a landing command including a command signal for landing the unmanned aerial vehicle 50 to the remote control support server 10 through the remote wireless communication device 222 (FIG. 9 / STEP212).
  • the landing command may include a command signal for recognizing different types of landing markers and landing on the landing marker of the type specified by the operator.
  • thereby, since the unmanned aerial vehicle 50 lands on the landing marker of the type specified by the operator, the unmanned aerial vehicle 50 can land at a position arbitrarily specified by the operator even if position information regarding the landing position cannot be obtained.
  • the unmanned aerial vehicle 50 may start landing control upon detecting the landing marker.
  • when the landing of the unmanned aerial vehicle 50 is controlled autonomously, or semi-autonomously in combination with the operator's operation, the unmanned aerial vehicle 50 may start landing control for determining the landing position by recognizing light emitted at the landing position (for example, differences in the emission intensity of LEDs or the like).
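As a minimal sketch of the LED-based landing-position recognition described above, the following hypothetical helper scans a grid of measured emission intensities and picks the brightest cell above a threshold as the landing position; the grid representation, threshold value, and function name are all assumptions for illustration, not details from the patent:

```python
def select_landing_cell(intensity_grid, threshold=200):
    """Return the (row, col) of the brightest cell whose intensity
    exceeds `threshold`, or None if no cell qualifies.

    A sketch of choosing a landing position from LED emission
    intensity differences; all parameters are illustrative.
    """
    best, best_val = None, threshold
    for r, row in enumerate(intensity_grid):
        for c, val in enumerate(row):
            if val > best_val:          # strictly brighter than the best so far
                best, best_val = (r, c), val
    return best
```

Returning None when nothing exceeds the threshold lets the caller fall back to marker detection or operator input instead of committing to a dim, ambiguous target.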
  • when the first support processing element 101 included in the remote control support device 100 receives the landing command through the server wireless communication device 122 (FIG. 9 / C14), the first support processing element 101 recognizes the operation information and the like of the work machine 40 (FIG. 9 / STEP111).
  • the operation information includes information on the operation of the work machine 40.
  • the remote control support server 10 recognizes operation information and the like by referring to information and the like regarding the operation of the work machine 40 stored in the database 110 (FIG. 9 / STEP111).
  • the remote control support device 100 determines whether or not the operating speed of the work machine 40 is equal to or less than a predetermined value based on the acquired operation information (FIG. 9 / STEP112). For example, when the traveling speed of the work machine 40 is 5 km/h or less, the remote control support device 100 may determine that the operating speed of the work machine 40 is equal to or less than the predetermined value. If the determination is negative (FIG. 9 / STEP112 ... NO), the remote control support device 100 returns to the step immediately before the determination (FIG. 9 / STEP111). On the other hand, when the determination is affirmative (FIG. 9 / STEP112 ... YES), the remote control support device 100 transmits a landing command including a command signal for landing the unmanned aerial vehicle 50 to the unmanned aerial vehicle 50 through the server wireless communication device 122 (FIG. 9 / STEP113).
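The speed check in STEP112 can be sketched as a simple gate, using the 5 km/h figure given above as the example threshold (the function name is illustrative, not from the patent):

```python
def may_send_landing_command(travel_speed_kmh: float, limit_kmh: float = 5.0) -> bool:
    """Gate the landing command on the work machine's operating speed:
    allow landing only when the machine is moving at or below the limit,
    mirroring the affirmative branch of STEP112."""
    return travel_speed_kmh <= limit_kmh
```

A caller would poll the operation information and loop back to STEP111 while this returns False, then transmit the landing command once it returns True.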
  • the first support processing element 101 included in the remote control support device 100 performs the calculation process for the remote control support device 100 to transmit the landing command.
  • in response to a designated operation on the remote input interface 210 provided in the remote control device 20, the first support processing element 101 transmits, to the unmanned aerial vehicle 50, a landing command for landing at at least one of a predetermined position on the work machine 40 (for example, the cab 470 or the like) and a predetermined position at the work site (for example, the landing structure 60 or the like).
  • the first support processing element 101 may recognize whether or not the operation of the work machine 40 is stopped, and when it recognizes that there is a designated operation and that the operation of the work machine 40 is stopped, execute a process of transmitting to the unmanned aerial vehicle 50 a landing command for landing at a predetermined position on the work machine 40 (for example, the cab 470 or the like).
  • based on communication with the remote control device 20, the first support processing element 101 may recognize, as a designated landing position, one landing position (for example, the front positioning mechanism 481) designated by the operator through the remote input interface 210 from among a plurality of landing positions on the work machine 40 (for example, the front positioning mechanism 481, the left positioning mechanism 482, the right positioning mechanism 483, the rear positioning mechanism 484, and the like), and execute a process of transmitting to the unmanned aerial vehicle 50 a landing command for landing the unmanned aerial vehicle 50 at the designated landing position.
  • the first support processing element 101 may recognize the position of each work machine 40 at the work site based on communication with a plurality of work machines 40 existing at the work site, and execute a process of transmitting to the unmanned aerial vehicle 50 a landing command for landing the unmanned aerial vehicle 50 on one work machine 40 designated by the operator through the remote input interface 210 from among the plurality of work machines 40 existing at the work site.
  • the first support processing element 101 may recognize the position information of the work machine 40 in response to an operation on the remote input interface 210 provided in the remote control device 20, recognize the landing position of the unmanned aerial vehicle 50 based on communication with the unmanned aerial vehicle 50, and, when the landing position of the unmanned aerial vehicle 50 is not a predetermined position on the work machine 40 (for example, when the landing position of the unmanned aerial vehicle 50 is a landing structure installed at the work site), execute a process of transmitting to the unmanned aerial vehicle 50 a landing command including a command to image the direction in which the work machine 40 is located as viewed from a camera mounted on the unmanned aerial vehicle 50 (for example, the UAV image pickup device 512).
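The command to image "the direction in which the work machine 40 is located" as a three-dimensional direction could be derived from the two recognized positions. A hedged sketch, assuming a shared world frame with (x, y, z) coordinates and a pan/tilt gimbal; the function name and angle conventions are illustrative assumptions:

```python
import math

def camera_angles_deg(uav_xyz, machine_xyz):
    """Pan (yaw) and tilt (pitch) angles, in degrees, for pointing a
    gimbal-mounted camera from the UAV toward the work machine.

    Positions are (x, y, z) tuples in the same world frame; a negative
    tilt means the camera looks downward.
    """
    dx = machine_xyz[0] - uav_xyz[0]
    dy = machine_xyz[1] - uav_xyz[1]
    dz = machine_xyz[2] - uav_xyz[2]
    pan = math.degrees(math.atan2(dy, dx))                   # heading in the XY plane
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # elevation above/below horizontal
    return pan, tilt
```

For a UAV perched 30 m above a machine 10 m away horizontally, this yields a pan of 0 degrees and a steep negative tilt, i.e. "look down toward the machine".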
  • the direction in which the work machine 40 is located does not mean only the front-rear and left-right directions at the work site; it means a three-dimensional direction that also includes the height direction at the work site.
  • the first support processing element 101 acquires information about the XY coordinates of the work machine 40 in the world coordinate system (that is, information about the XY coordinates at which the work machine 40 is located at the work site) based on the position information of the work machine 40 positioned by the actual machine positioning device 414 (for example, a GNSS included in the actual machine positioning device 414).
  • together with the information about the XY coordinates of the work machine 40, the first support processing element 101 acquires height information regarding the altitude at the position of the work machine 40 based on information about the atmospheric pressure detected by a barometric pressure sensor included in the actual machine positioning device 414.
  • as an example in which the first support processing element 101 acquires the height information regarding the altitude at the position of the work machine 40, an example using information about the atmospheric pressure detected by a barometric pressure sensor is shown, but the example of acquiring the height information (that is, information about the Z coordinate at which the work machine 40 is located at the work site) is not limited to this example.
  • as another example, the height information regarding the altitude at the position of the work machine 40 may be acquired based on map information stored in advance in the database 110.
  • the first support processing element 101 acquires information about the XY coordinates of the unmanned aerial vehicle 50 in the world coordinate system (that is, information about the XY coordinates at which the unmanned aerial vehicle 50 is located at the work site) based on the position information of the unmanned aerial vehicle 50 positioned by a UAV positioning device (not shown; for example, a GNSS included in the UAV positioning device) mounted on the unmanned aerial vehicle 50. Further, together with the information about the XY coordinates of the unmanned aerial vehicle 50, the first support processing element 101 acquires height information regarding the altitude at the position of the unmanned aerial vehicle 50 (that is, information about the Z coordinate at which the unmanned aerial vehicle 50 is located at the work site) based on information about the atmospheric pressure detected by a barometric pressure sensor included in the UAV positioning device.
  • as an example in which the first support processing element 101 acquires the height information regarding the altitude at the position of the unmanned aerial vehicle 50, an example using the information about the atmospheric pressure detected by the barometric pressure sensor is shown, but the example of acquiring the height information (that is, information about the Z coordinate at which the unmanned aerial vehicle 50 is located at the work site) is not limited to this example.
  • as another example, the height information of the landing structure 60 on which the unmanned aerial vehicle 50 has landed (for example, information about the height of a pole installed at the work site, information about the height of a building on which the unmanned aerial vehicle 50 has landed, or information about the height of another work machine 40 on which the unmanned aerial vehicle 50 has landed) may be acquired based on the map information stored in advance in the database 110.
  • the first support processing element 101 determines whether or not the landing position of the unmanned aerial vehicle 50 is a predetermined position on the work machine 40 based on the acquired position information, including the altitude at which the unmanned aerial vehicle 50 is located.
  • for example, the first support processing element 101 may make this determination using known trigonometric functions based on the position information of the work machine 40 and the position information of the unmanned aerial vehicle 50.
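One way to realize the determination above is to compare the Euclidean distance between the two acquired positions with a tolerance; the tolerance value and function name are assumptions for illustration, since the text only says known trigonometric relations are used:

```python
import math

def is_at_machine_position(machine_xyz, uav_xyz, tolerance_m=1.0):
    """Decide whether the UAV's landing position coincides with a
    predetermined position on the work machine by comparing the 3D
    distance between the two (x, y, z) positions with a tolerance.

    The 1 m tolerance is a hypothetical value, not from the patent.
    """
    return math.dist(machine_xyz, uav_xyz) <= tolerance_m
```

Because the Z coordinate participates in the distance, a UAV hovering directly above the machine but far higher than it is correctly judged as not at the machine's position.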
  • the first support processing element 101 may transmit a landing command including a command signal for zooming in or zooming out the camera mounted on the unmanned aerial vehicle 50 to the unmanned aerial vehicle 50 after landing.
  • the first support processing element 101 may acquire information on the height from the lower traveling body 450 of the work machine 40 to be imaged to the cab 470 by referring to the database 110.
  • since the first support processing element 101 takes into account the altitude at which the work machine 40 is located and the height from the lower traveling body 450 to the cab 470 of the work machine 40, the focus of the camera mounted on the unmanned aerial vehicle 50 can be set near the cab 470 of the work machine 40 to be imaged, so that an environmental image showing the state of the work machine 40 and its surroundings can be captured clearly.
  • the UAV control device 500 executes control for landing the unmanned aerial vehicle 50 (FIG. 9 / STEP511).
  • the UAV control device 500 acquires an environmental image showing the surroundings of the work machine 40 by imaging the surroundings of the work machine 40 using the UAV image pickup device 512 (FIG. 9 / STEP512).
  • when the UAV control device 500 acquires the environmental image, it transmits environmental image data including the environmental image to the remote control support server 10 through the UAV wireless communication device 522 (FIG. 9 / STEP513).
  • when the second support processing element 102 included in the remote control support device 100 receives the environmental image data through the server wireless communication device 122 (FIG. 9 / C15), the second support processing element 102 transmits the environmental image data to the remote control device 20 (FIG. 9 / STEP114).
  • the second support processing element 102 included in the remote control support device 100 performs the calculation process for the remote control support device 100 to transmit the environmental image data.
  • the second support processing element 102 acquires an environmental image representing the state around the work machine 40 imaged by a camera (for example, the UAV image pickup device 512) of the unmanned aerial vehicle 50 that has landed at the predetermined position (for example, the cab 470), and executes a process of transmitting the environmental image to the remote control device 20 in order to have the image output device 221 of the remote control device 20 output the environmental image.
  • based on communication with the work machine 40, the second support processing element 102 recognizes whether or not the work machine 40 has a positioning mechanism (for example, the front positioning mechanism 481) for fixing the unmanned aerial vehicle 50 at the landing position (for example, the cab 470). If the determination is affirmative, the second support processing element 102 may execute a process of transmitting to the remote control device 20 position information regarding a landing position having the positioning mechanism 480 (for example, the front portion of the ceiling of the cab 470) and a landing position not having the positioning mechanism 480 (for example, the boarding door portion of the cab 470), or one of them.
  • Alternatively, the second support processing element 102 may acquire an environmental image that represents the state, around the work machine 40, of a direction corresponding to a predetermined position on the work machine 40 (for example, the left direction of the work machine 40 for the left positioning mechanism 482) captured by the camera (for example, the UAV image pickup device 512) of the unmanned aerial vehicle 50 that has landed at that predetermined position, and that includes a part of the work machine 40 (for example, the left edge portion of the ceiling of the cab 470), and may execute a process of transmitting the environmental image to the remote control device 20 so that the image output device 221 of the remote control device 20 outputs it.
  • For example, when the landing position is the front positioning mechanism 481, the unmanned aerial vehicle 50 lands so as to image the front direction of the work machine 40; when the landing position is the left positioning mechanism 482, the unmanned aerial vehicle 50 lands so as to image the left direction of the work machine 40; and when the landing position is the rear positioning mechanism 484, the unmanned aerial vehicle 50 lands so as to image the rear direction of the work machine 40.
  • As a result, the operator can visually recognize the surroundings of the work machine 40 without the field of view being obstructed by the cab 470, the operating mechanism 440, the top surface portion of the upper swing body 460, the exhaust device, and the like.
  • When the second support processing element 102 recognizes the landing position of the unmanned aerial vehicle 50 and the image captured by the camera (for example, the UAV image pickup device 512) of the unmanned aerial vehicle 50, and the unmanned aerial vehicle 50 has not landed on the work machine 40 and an environmental image, that is, an image that shows the surroundings of the work machine 40 and includes a part or the whole of the work machine 40, cannot be acquired from the captured image, the second support processing element 102 may execute a process of transmitting to the unmanned aerial vehicle 50 a camera control command, that is, a signal including a command signal for controlling the operation of the camera (for example, the UAV image pickup device 512) so as to acquire an environmental image.
  • When the remote control device 200 included in the remote control device 20 receives the environmental image data through the remote wireless communication device 222 (FIG. 9 / C22), it executes control for outputting the environmental image data to the image output device 221 of the remote output interface 220 (FIG. 9 / STEP213).
  • The second support processing element 102 may acquire information on the remaining battery level of the unmanned aerial vehicle 50 and execute a process of transmitting that information to the remote control device 20 through the server wireless communication device 122.
  • The remote control device 200 determines whether or not a flight operation, that is, an operation for flying the unmanned aerial vehicle 50, has been performed (FIG. 10 / STEP221). If the determination result is negative (FIG. 10 / STEP221 ... NO), the remote control device 200 ends the arithmetic processing. If the determination result is affirmative (FIG. 10 / STEP221 ... YES), the remote control device 200 transmits a flight command including a command signal for flying the unmanned aerial vehicle 50 to the remote control support server 10 through the remote wireless communication device 222 (FIG. 10 / STEP222).
  • When the remote control support device 100 receives the flight command through the server wireless communication device 122 (FIG. 10 / C16), the remote control support device 100 transmits the flight command to the unmanned aerial vehicle 50 (FIG. 10 / STEP121).
  • When the UAV control device 500 receives the flight command through the UAV wireless communication device 522 (FIG. 10 / C51), the UAV control device 500 executes flight control for flying the unmanned aerial vehicle 50 according to the operation mode of the operating device of the unmanned aerial vehicle 50 (FIG. 10 / STEP521).
  • As the operating device, the remote control device 20 may be used, or a device dedicated to operating the unmanned aerial vehicle 50 (for example, a remote controller) may be used. Alternatively, the flight control of the unmanned aerial vehicle 50 may be autonomous flight, executed in a command mode based on pre-programmed command signals, independently of any operating device.
  • The UAV control device 500 acquires battery remaining amount data including information on the remaining amount of the battery that powers the unmanned aerial vehicle 50 (FIG. 10 / STEP522). When the UAV control device 500 acquires the battery remaining amount data, it transmits the data to the remote control support server 10 through the UAV wireless communication device 522 (FIG. 10 / STEP523).
  • When the second support processing element 102 included in the remote control support device 100 receives the battery remaining amount data through the server wireless communication device 122 (FIG. 10 / C17), it determines, based on the received battery remaining amount data, whether or not the remaining battery level, or the flyable time of the unmanned aerial vehicle 50 derived from the remaining battery level, is equal to or less than a predetermined value (FIG. 10 / STEP122). If the determination result is negative (FIG. 10 / STEP122 ... NO), the second support processing element 102 ends the remote control support process. If the determination result is affirmative (FIG. 10 / STEP122 ... YES), the second support processing element 102 transmits, through the server wireless communication device 122, information regarding a landing position having a charging device (for example, the front positioning mechanism 481 having the battery 4810, or the landing structure 60 having the battery 4810) to the remote control device 20 (FIG. 10 / STEP123).
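The battery check in STEP122 to STEP123 amounts to a simple threshold rule. A minimal Python sketch follows; the numeric threshold and the position strings are illustrative assumptions, not values from the embodiment:

```python
# Hedged sketch of STEP 122-123: if the battery is low, return the landing
# positions with charging devices to notify the remote control device about.
LOW_BATTERY_THRESHOLD = 0.2  # assumed: 20 % remaining triggers the notification

CHARGING_LANDING_POSITIONS = [
    "front positioning mechanism 481 (battery 4810)",
    "landing structure 60 (battery 4810)",
]

def handle_battery_data(remaining_ratio: float) -> list:
    """Return charging landing positions when the battery is at or below the
    threshold, otherwise an empty list (STEP 122 ... NO ends the process)."""
    if remaining_ratio <= LOW_BATTERY_THRESHOLD:
        return CHARGING_LANDING_POSITIONS
    return []

print(handle_battery_data(0.5))   # battery fine -> []
print(handle_battery_data(0.15))  # low battery -> both charging positions
```

The same structure works if the flyable time derived from the remaining level is compared against a time threshold instead.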
  • When the remote control device 200 receives the data through the remote wireless communication device 222 (FIG. 10 / C22), it executes a process for controlling the output mode for outputting the information regarding the landing position having the charging device (for example, the front positioning mechanism 481 having the battery 4810, or the landing structure 60 having the battery 4810) to the image output device 221 (FIG. 8 / STEP223).
  • The second support processing element 102 may execute control for acquiring an environmental image by transmitting a camera control command to the unmanned aerial vehicle 50 even before the unmanned aerial vehicle 50 lands on the work machine 40.
  • The remote control device 200 determines whether or not the operator has performed the designated operation through the remote input interface 210 (FIG. 11 / STEP231). If the determination result is negative (FIG. 11 / STEP231 ... NO), the process ends. If the determination result is affirmative (FIG. 11 / STEP231 ... YES), the remote control device 200 transmits a landing command including a command signal for landing the unmanned aerial vehicle 50 to the remote control support server 10 through the remote wireless communication device 222 (FIG. 11 / STEP232).
  • When the remote control support server 10 receives the landing command through the server wireless communication device 122 (FIG. 10 / C18), the first support processing element 101 transmits to the unmanned aerial vehicle 50, in response to the designated operation through the remote input interface 210 of the remote control device 20, a landing command for landing at a predetermined position at the work site (for example, the landing structure 60) (FIG. 11 / STEP131).
  • When the UAV control device 500 receives the landing command through the UAV wireless communication device 522 (FIG. 11 / C52), the UAV control device 500 executes control processing for landing the unmanned aerial vehicle 50 at the predetermined position at the work site (for example, the landing structure 60) (FIG. 11 / STEP531).
  • The UAV control device 500 acquires landing position data, that is, information regarding the position where the unmanned aerial vehicle 50 has landed, by landing position information acquisition means (for example, GNSS) that is also used to acquire information including that position (FIG. 11 / STEP532).
  • When the unmanned aerial vehicle 50 lands, the UAV control device 500 operates the UAV image pickup device 512 to capture the state near the landing position, and acquires captured image data, that is, information including an image showing the state around the landing position (FIG. 11 / STEP533).
  • When the UAV control device 500 acquires the landing position data and the captured image data, the UAV control device 500 transmits them to the remote control support server 10 through the UAV wireless communication device 522 (FIG. 11 / STEP534).
  • When the second support processing element 102 receives the landing position data and the captured image data, it executes a determination as to whether or not the unmanned aerial vehicle 50 has landed on the work machine 40 (FIG. 11 / STEP132). If the determination result is affirmative (FIG. 11 / STEP132 ... YES), the second support processing element 102 ends the arithmetic processing. If the determination result is negative (FIG. 11 / STEP132 ... NO), the second support processing element 102 executes a determination as to whether or not an environmental image, that is, an image that represents the state around the work machine 40 and includes a part or the whole of the work machine 40, can be acquired from the captured image (FIG. 11 / STEP133).
  • If the determination result is affirmative (FIG. 11 / STEP133 ... YES), the second support processing element 102 ends the arithmetic processing. If the determination result is negative (FIG. 11 / STEP133 ... NO), the second support processing element 102 transmits to the unmanned aerial vehicle 50, through the server wireless communication device 122, a camera control command including a command signal for controlling the operation of the camera (for example, the UAV image pickup device 512) of the unmanned aerial vehicle 50 (FIG. 11 / STEP134).
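The branch structure of STEP132 through STEP134 can be restated as a small decision function (the function and parameter names are illustrative, not part of the embodiment):

```python
# Sketch of the STEP 132-134 branch: a camera control command is sent only
# when the UAV has NOT landed on the work machine AND no environmental image
# can be acquired from the captured image.
def second_support_process(landed_on_work_machine: bool,
                           environment_image_acquirable: bool) -> str:
    if landed_on_work_machine:            # STEP 132 ... YES
        return "end"
    if environment_image_acquirable:      # STEP 133 ... YES
        return "end"
    return "send_camera_control_command"  # STEP 134

print(second_support_process(True, False))   # -> end
print(second_support_process(False, True))   # -> end
print(second_support_process(False, False))  # -> send_camera_control_command
```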
  • One example of the camera control command, used when the image captured by the camera (for example, the UAV image pickup device 512) of the unmanned aerial vehicle 50 does not include the work machine 40 to be remotely operated by the operator (that is, when the captured image is not an environmental image), includes a command signal for turning the camera (for example, the UAV image pickup device 512) of the unmanned aerial vehicle 50 toward the direction in which the work machine 40 exists.
  • Examples of the operation of changing the direction of the camera include pan, tilt, and roll.
  • In the above, changing the direction of the camera (for example, the UAV image pickup device 512) of the unmanned aerial vehicle 50 was given as an example of the camera control command, but the camera control command is not limited to this example.
  • When the image captured by the camera (for example, the UAV image pickup device 512) of the unmanned aerial vehicle 50 includes the work machine 40 to be remotely operated by the operator but is out of focus so that the operator cannot recognize the work machine 40 that he or she is remotely operating (that is, an environmental image cannot be acquired), another example of the camera control command includes a command signal for acquiring an environmental image by zooming the camera mounted on the unmanned aerial vehicle 50 in or out to bring it into focus. In one case, the camera mounted on the unmanned aerial vehicle 50 is zoomed out to focus; when the work machine 40 to be imaged is a 7t-class small excavator whose machine height is relatively low, the camera mounted on the unmanned aerial vehicle 50 is zoomed in to focus.
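The zoom-direction choice described above reduces to a threshold rule on machine height. The numeric boundary below is purely an assumed illustration, not a value from the embodiment:

```python
# Illustrative sketch: choose zoom in/out to bring the work machine into
# focus based on its machine height.
ASSUMED_HEIGHT_THRESHOLD_M = 2.5  # hypothetical boundary between "low" and "tall"

def zoom_command(machine_height_m: float) -> str:
    # A relatively low machine (e.g. a 7t-class small excavator) fills little
    # of the frame, so zoom in; a relatively tall machine is zoomed out on.
    return "zoom_in" if machine_height_m <= ASSUMED_HEIGHT_THRESHOLD_M else "zoom_out"

print(zoom_command(2.4))  # small excavator -> zoom_in
print(zoom_command(3.2))  # taller machine  -> zoom_out
```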
  • In this way, when the second support processing element 102 recognizes the landing position of the unmanned aerial vehicle 50 and the image captured by the camera (for example, the UAV image pickup device 512) of the unmanned aerial vehicle 50, and the unmanned aerial vehicle 50 has not landed on the work machine 40 and an environmental image that shows the surroundings of the work machine 40 and includes a part or the whole of the work machine 40 cannot be acquired from the captured image, it executes a process of transmitting to the unmanned aerial vehicle 50 a camera control command, that is, a signal including a command signal for controlling the operation of the camera so as to acquire the environmental image.
  • Examples of the second support process executed by the second support processing element 102 include STEP132 to STEP134.
  • When the UAV control device 500 receives the camera control command through the UAV wireless communication device 522 (FIG. 11 / C53), the UAV control device 500 controls the operation of the camera (FIG. 11 / STEP535).
  • The UAV control device 500 controls the operation of the camera (for example, the UAV image pickup device 512) of the unmanned aerial vehicle 50 and acquires environmental image data, that is, information including an environmental image in which the captured image shows the surroundings of the work machine 40 and includes a part or the whole of the work machine 40 (FIG. 11 / STEP536).
  • When the UAV control device 500 acquires the environmental image data, the UAV control device 500 transmits the environmental image data to the remote control support server 10 through the UAV wireless communication device 522 (FIG. 11 / STEP537).
  • When the remote control support device 100 receives the environmental image data through the server wireless communication device 122 (FIG. 11 / C100), the remote control support device 100 transmits the environmental image data to the remote control device 20 through the server wireless communication device 122 (FIG. 11 / STEP135).
  • When the remote control device 200 receives the environmental image data through the remote wireless communication device 222 (FIG. 11 / C23), the remote control device 200 executes a process for controlling the output mode for outputting the environmental image data to the image output device 221 (FIG. 11 / STEP233).
  • The second support processing element of the present invention preferably executes a process for directing the imaging direction of the UAV image pickup device 512, mounted on the unmanned aerial vehicle 50 fixed to the positioning mechanism 480, toward the direction corresponding to the operation mode of the remote input interface 210 operated by the operator.
  • According to this configuration, the second support processing element 102 performs the processing for directing the UAV image pickup device 512 toward the direction corresponding to the operation mode of the remote input interface 210 operated by the operator after the unmanned aerial vehicle 50 is fixed to the positioning mechanism 480.
  • Even if the imaging direction of the UAV image pickup device 512 at the time the unmanned aerial vehicle 50 is fixed to the positioning mechanism 480 is not a direction desired by the operator, the imaging direction of the UAV image pickup device 512 can be adjusted afterwards. Therefore, after the unmanned aerial vehicle 50 is fixed to the positioning mechanism 480, the operator can point the imaging direction of the UAV image pickup device 512 in the direction he or she wants to see.
  • The remote control support device 100 may acquire an ID (identifier) for identifying each of the unmanned aerial vehicles 50 and each of the work machines 40.
  • In the above, an example was shown in which a known trigonometric function is used to determine the direction in which the work machine 40 is located as viewed from the camera mounted on the unmanned aerial vehicle 50, and a landing command including a command to image that direction is transmitted to the unmanned aerial vehicle 50. However, the way the first support processing element transmits a landing command including a command to image the direction in which the work machine 40 is located as viewed from the camera mounted on the unmanned aerial vehicle 50 is not limited to this example.
  • For example, the first support processing element may transmit to the unmanned aerial vehicle 50 a landing command including a command for control that drives an angle adjusting mechanism for adjusting the imaging angle of the UAV image pickup device 512 until the work machine 40 to be imaged is found by a known image recognition function.
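The image-recognition alternative can be sketched as a sweep loop that keeps driving a (stubbed) angle adjusting mechanism in fixed steps until a recognizer reports the work machine in frame. The step size and the recognizer are illustrative assumptions:

```python
# Sketch: drive the angle adjusting mechanism until the work machine is found.
def sweep_until_found(recognize, step_deg: float = 15.0, max_deg: float = 360.0):
    """recognize(angle_deg) -> bool stands in for the image recognition
    function. Returns the angle at which the work machine was found, or
    None if a full sweep finds nothing."""
    angle = 0.0
    while angle < max_deg:
        if recognize(angle):
            return angle
        angle += step_deg
    return None

# Stub recognizer: the work machine happens to be visible at 90 degrees.
found = sweep_until_found(lambda a: a == 90.0)
print(found)  # -> 90.0
```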
  • The first support processing element of the present invention preferably recognizes the operating speed of the work machine and, when the operating speed of the work machine is equal to or less than a predetermined value, transmits to the unmanned aerial vehicle a landing command for landing at the predetermined position on the work machine.
  • According to this configuration, the first support processing element recognizes the operating speed of the work machine and, when the operating speed of the work machine is equal to or less than the predetermined value, transmits to the unmanned aerial vehicle a landing command for landing at the predetermined position on the work machine. As a result, the operator can land the unmanned aerial vehicle at the predetermined position on the work machine, and can therefore reliably land the unmanned aerial vehicle on the work machine.
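The operating-speed gate described above can be sketched as a single comparison; the speed threshold here is an assumed value, not one from the embodiment:

```python
# Sketch: a landing command is issued only while the work machine's
# recognized operating speed is at or below a predetermined value.
SPEED_THRESHOLD = 0.1  # assumed m/s; "slow enough to land on safely"

def maybe_issue_landing_command(operating_speed: float) -> bool:
    """True if a landing command for the predetermined position is sent."""
    return operating_speed <= SPEED_THRESHOLD

print(maybe_issue_landing_command(0.05))  # -> True
print(maybe_issue_landing_command(1.0))   # -> False
```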
  • The first support processing element of the present invention preferably recognizes whether or not the operation of the work machine is stopped and, when it recognizes that the designated operation has been performed and that the operation of the work machine is stopped, transmits to the unmanned aerial vehicle a landing command for landing at the predetermined position on the work machine. According to this configuration, the first support processing element recognizes whether or not the operation of the work machine is stopped and, when it recognizes that the designated operation has been performed and that the operation of the work machine is stopped, transmits to the unmanned aerial vehicle a landing command for landing at the predetermined position on the work machine. As a result, the operator can land the unmanned aerial vehicle at the predetermined position on the work machine, and can therefore reliably and quickly land the unmanned aerial vehicle at the predetermined position.
  • The first support processing element of the present invention preferably recognizes, based on communication with the remote control device, one landing position designated by the operator through the remote input interface from among a plurality of landing positions on the work machine as the designated landing position, and transmits to the unmanned aerial vehicle a landing command for landing the unmanned aerial vehicle at the designated landing position. According to this configuration, the first support processing element recognizes the one landing position designated by the operator through the remote input interface from among the plurality of landing positions on the work machine as the designated landing position, and transmits to the unmanned aerial vehicle a landing command for landing the unmanned aerial vehicle at the designated landing position. As a result, the operator can land the unmanned aerial vehicle at the landing position designated by himself or herself, and can therefore visually recognize any direction he or she wants to confirm when remotely operating the work machine.
  • Preferably, the first support processing element transmits to the unmanned aerial vehicle a landing command including a command to image a predetermined imaging direction determined according to the designated landing position (for example, when the designated landing position is the left end of the zenith surface of the cab, the left direction of the work machine with reference to the orientation of the seat installed inside the cab).
  • According to this configuration, the unmanned aerial vehicle lands so as to image the predetermined imaging direction determined according to the designated landing position, so the operator can visually recognize the predetermined imaging direction determined according to the designated landing position.
  • For example, suppose that the unmanned aerial vehicle that has landed at the left end of the zenith surface of the cab does not image the left direction with reference to the orientation of the seat installed inside the cab. In that case, most of the captured image would be the zenith surface of the cab, and the operator could hardly see the environmental image that he or she wants to see.
  • Therefore, when the operator operates the remote control device so that the unmanned aerial vehicle lands at the left end of the zenith surface of the cab, the unmanned aerial vehicle lands so as to image the left direction of the work machine. As a result, the operator can visually recognize the surroundings without the field of view being obstructed by the portions of the zenith surface of the cab other than the left end portion (the front portion, the right end portion, the rear portion, the central portion, and the like).
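The mapping from a designated landing position to its predetermined imaging direction can be sketched as a lookup table. Only the left-end entry comes from the text; the remaining entries, and all the key strings, are assumed by symmetry for illustration:

```python
# Hedged sketch: designated landing position -> predetermined imaging
# direction (with reference to the orientation of the seat inside the cab).
IMAGING_DIRECTION_BY_LANDING_POSITION = {
    "cab_zenith_left_end": "left",    # from the text: left end -> image left
    "cab_zenith_right_end": "right",  # assumed by symmetry
    "cab_zenith_front": "front",      # assumed by symmetry
    "cab_zenith_rear": "rear",        # assumed by symmetry
}

def landing_command(designated_landing_position: str) -> dict:
    direction = IMAGING_DIRECTION_BY_LANDING_POSITION[designated_landing_position]
    return {"land_at": designated_landing_position, "image_direction": direction}

print(landing_command("cab_zenith_left_end"))
# -> {'land_at': 'cab_zenith_left_end', 'image_direction': 'left'}
```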
  • In the above, the zenith surface of the cab was mentioned as a part that obstructs the operator's field of view, but parts that obstruct the operator's field of view are not limited to this example. Other examples include an attachment (a boom, an arm, and a bucket), the top surface portion of the upper swing body, an exhaust device, and the like.
  • The first support processing element preferably recognizes the position of each of a plurality of work machines existing at the work site based on communication with those work machines, and transmits to the unmanned aerial vehicle a landing command for landing at either a predetermined position on the designated work machine, that is, the work machine designated by the operator through the remote input interface from among the plurality of work machines existing at the work site, or a predetermined position at the work site.
  • According to this configuration, the unmanned aerial vehicle equipped with the camera lands at either the predetermined position on the designated work machine designated by the operator through the remote input interface or the predetermined position at the work site. Moreover, the landing command can be transmitted to the unmanned aerial vehicle even when the designated work machine is switched from one work machine to another among the plurality of work machines existing at the work site.
  • Preferably, the first support processing element recognizes the position information of the work machine according to the operation on the remote input interface of the remote control device, recognizes the landing position of the unmanned aerial vehicle based on communication with the unmanned aerial vehicle, and, when the landing position of the unmanned aerial vehicle is not the predetermined position on the work machine (for example, when the landing position of the unmanned aerial vehicle is a pole installed at the work site), transmits to the unmanned aerial vehicle a landing command including a command to image the direction in which the work machine is located as viewed from the camera mounted on the unmanned aerial vehicle.
  • According to this configuration, the camera mounted on the unmanned aerial vehicle images the direction in which the work machine is present, and the captured image becomes an environmental image including the work machine and the surroundings of the work machine.
  • The first support processing element recognizes, as the position information of the work machine, the position information of the work machine in the world coordinate system and information about the altitude at which the work machine exists. When the landing position of the unmanned aerial vehicle is a pole installed at the work site, the first support processing element recognizes the landing position of the unmanned aerial vehicle and also acquires information on the altitude of the unmanned aerial vehicle as part of the information on that landing position. When the first support processing element recognizes the landing position of the unmanned aerial vehicle, it transmits to the unmanned aerial vehicle a landing command including a command to image the direction in which the work machine is located as viewed from the camera mounted on the unmanned aerial vehicle.
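Under a simplifying flat local east/north/up frame (an assumption, since the embodiment uses world coordinates and altitudes), the direction command for a camera on a pole can be computed from the two positions, altitudes included, as an azimuth and an elevation. This is a hedged sketch of the "known trigonometric function" approach:

```python
# Sketch: aim the camera from the UAV landing position (e.g. a pole) toward
# the work machine, using positions and altitudes of both.
import math

def camera_angles(uav_pos, machine_pos):
    """uav_pos, machine_pos: (east_m, north_m, altitude_m).
    Returns (azimuth_deg, elevation_deg): azimuth counter-clockwise from
    east, elevation above horizontal (negative = tilt down)."""
    dx = machine_pos[0] - uav_pos[0]
    dy = machine_pos[1] - uav_pos[1]
    dz = machine_pos[2] - uav_pos[2]
    azimuth = math.degrees(math.atan2(dy, dx))
    elevation = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return azimuth, elevation

# UAV on a 5 m pole, machine 10 m east at ground level: camera tilts down.
az, el = camera_angles((0.0, 0.0, 5.0), (10.0, 0.0, 0.0))
print(round(az, 1), round(el, 1))  # -> 0.0 -26.6
```

Because the altitude difference enters the elevation term, dents or inclines at either the landing position or the machine's position are accounted for automatically.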
  • As a result, the operator can visually recognize an environmental image including the work machine he or she is remotely operating and the state of its surroundings, and can therefore quickly recognize the state around the work machine.
  • Since the first support processing element acquires the landing position information including the information regarding the altitude of the unmanned aerial vehicle, even if the landing position of the unmanned aerial vehicle has a dent or an inclination, the operator can quickly recognize the surroundings of the work machine with the information about such terrain taken into account.
  • Since the position information of the work machine also includes information on the altitude at which the work machine exists, even when the position where the work machine exists has a dent or an inclination, the operator can quickly recognize the surroundings of the work machine with the information about such terrain taken into account. Therefore, according to the present invention, the operator can visually recognize an environmental image including the work machine he or she is remotely operating and the state around it, and can therefore quickly recognize the state around the work machine.
  • The second support processing element of the present invention preferably recognizes, based on communication with the unmanned aerial vehicle, a determination result as to whether or not the remaining amount of the battery in the unmanned aerial vehicle is equal to or less than a predetermined value, and, when the determination result is affirmative, transmits to the remote control device information regarding the landing position, among a plurality of landing positions on the work machine, that has a charging device capable of charging the battery mounted on the unmanned aerial vehicle when the unmanned aerial vehicle lands there.
  • The second support processing element of the present invention preferably recognizes, based on communication with the work machine, whether or not the work machine has a positioning mechanism for fixing the unmanned aerial vehicle at a landing position, and, if the determination is affirmative, transmits to the remote control device position information regarding the landing position having the positioning mechanism, the landing position not having the positioning mechanism, or one of them.
  • 10 ... remote control support server, 20 ... remote control device, 40 ... work machine, 50 ... unmanned aerial vehicle, 100 ... remote control support device, 101 ... first support processing element, 102 ... second support processing element, 210 ... remote input interface, 211 ... remote control mechanism, 220 ... remote output interface, 221 ... image output device.


Abstract

The present invention comprises a first support processing element and a second support processing element. The first support processing element transmits a landing command to an unmanned aerial vehicle to land the unmanned aerial vehicle either at a prescribed location on a work machine or at a prescribed location at a work site. The second support processing element acquires an environment image captured by a camera provided on the unmanned aerial vehicle, and executes processing for outputting the environment image to a remote operation device.
PCT/JP2021/027284 2020-09-29 2021-07-21 Dispositif d'assistance à la téléopération et système de d'assistance à la téléopération WO2022070567A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-163636 2020-09-29
JP2020163636A JP7521359B2 (ja) 2020-09-29 2020-09-29 遠隔操作支援装置及び遠隔操作支援システム

Publications (1)

Publication Number Publication Date
WO2022070567A1 true WO2022070567A1 (fr) 2022-04-07

Family

ID=80949855

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/027284 WO2022070567A1 (fr) 2020-09-29 2021-07-21 Dispositif d'assistance à la téléopération et système de d'assistance à la téléopération

Country Status (2)

Country Link
JP (1) JP7521359B2 (fr)
WO (1) WO2022070567A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017097640A (ja) * 2015-11-25 2017-06-01 株式会社Ihiエアロスペース 遠隔操縦用画像取得装置と方法および遠隔操縦装置
WO2017131194A1 (fr) * 2016-01-29 2017-08-03 住友建機株式会社 Excavatrice et corps volant autonome pour voler autour d'une excavatrice
WO2017170651A1 (fr) * 2016-03-31 2017-10-05 住友重機械工業株式会社 Système de gestion du travail pour machine de construction, et machine de construction
WO2019026169A1 (fr) * 2017-08-01 2019-02-07 J Think株式会社 Système opérationnel pour machine de travail
JP2019156575A (ja) * 2018-03-13 2019-09-19 株式会社三井E&Sマシナリー クレーン運転補助システムおよびクレーン運転補助方法

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180068469A * 2016-12-14 2018-06-22 Hyundai Motor Company Unmanned aerial vehicle and system including same

Also Published As

Publication number Publication date
JP2022055922A (ja) 2022-04-08
JP7521359B2 (ja) 2024-07-24

Similar Documents

Publication Publication Date Title
US11492783B2 (en) Shovel and autonomous aerial vehicle flying around shovel
US20220018096A1 (en) Shovel and construction system
JP2016181119A (ja) Surrounding situation presentation system for mobile device
JP2024028464A (ja) Display control system and display control method
WO2022168443A1 (fr) Work assistance system and composite work assistance system
WO2021153187A1 (fr) Work assistance system and work assistance method
WO2021176883A1 (fr) Work assistance server and work assistance method
JP7515570B2 (ja) Crane, crane body, and program
WO2022070567A1 (fr) Remote operation assistance device and remote operation assistance system
JP7537222B2 (ja) Image providing system
JP7508815B2 (ja) Work assistance server and work assistance method
WO2021240957A1 (fr) Remote operation assistance device, remote operation assistance system, and remote operation assistance method
JP2020165235A (ja) Shovel
JP2023156807A (ja) Work assistance device and work system including same
US20220317684A1 (en) Display device and route display program
WO2021106280A1 (fr) Work assistance server, work assistance method, and work assistance system
US20240012417A1 (en) Crane inspection system, inspection system, non-transitory computer readable medium storing route setting program, and information terminal
JP2021021253A (ja) Work machine and work machine support server
JP2024018270A (ja) Work assistance system, work machine, projection device, and control device
JP2021017695A (ja) Work machine and work machine support server
JP2023177374A (ja) Overhead line map creation device and driving assistance system
JP2021017694A (ja) Work machine and work machine support server
JP2021017696A (ja) Work machine and work machine support server
KR20240127962A (ko) Information processing system, program, and information processing method
JP2022057248A (ja) Work assistance device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21874871

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21874871

Country of ref document: EP

Kind code of ref document: A1