WO2018124098A1 - Remote operation terminal and remote operation system - Google Patents

Remote operation terminal and remote operation system

Info

Publication number
WO2018124098A1
WO2018124098A1 (PCT/JP2017/046711)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
work machine
crane
remote
display unit
Prior art date
Application number
PCT/JP2017/046711
Other languages
English (en)
Japanese (ja)
Inventor
洋幸 林
Original Assignee
株式会社タダノ
Priority date
Filing date
Publication date
Application filed by 株式会社タダノ filed Critical 株式会社タダノ
Priority to JP2018547491A priority Critical patent/JP6460294B2/ja
Publication of WO2018124098A1 publication Critical patent/WO2018124098A1/fr

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B66 — HOISTING; LIFTING; HAULING
    • B66C — CRANES; LOAD-ENGAGING ELEMENTS OR DEVICES FOR CRANES, CAPSTANS, WINCHES, OR TACKLES
    • B66C13/00Other constructional features or details
    • B66C13/18Control systems or devices
    • B66C13/40Applications of devices for transmitting control pulses; Applications of remote control devices

Definitions

  • the present invention relates to a remote operation terminal and a remote operation system for a work machine such as a mobile crane or an aerial work vehicle.
  • Patent Document 1 discloses a technique in which movement between a stock position and a member installation position is performed by automatic operation, while operations requiring fine control, such as slinging, loading, and installation, are performed by an operator via radio control.
  • An object of the present invention is to provide a remote operation terminal and a remote operation system that can be operated without being aware of delay.
  • A remote operation terminal according to the present invention is a remote operation terminal for remotely operating a work machine, comprising: a display unit; a display processing unit that composites a virtual object representing a part of the work machine or a moving object with a surrounding image showing the situation around the work machine, and displays the result on the display unit; an operation unit for performing a simulated operation of the virtual object on the display unit; and a control unit that sends the operation information input from the operation unit during the simulated operation to the work machine.
  • a remote operation system includes the above-described remote operation terminal and a work machine that operates based on operation information received from the remote operation terminal.
  • FIG. 1 is an example of a side view of a crane according to the first embodiment.
  • FIG. 2 is an example of a perspective view of the remote control terminal according to the first embodiment.
  • FIG. 3 is an example of a hardware configuration diagram of the remote control terminal according to the first embodiment.
  • FIG. 4 is an example of a functional block diagram of a crane control system according to the first embodiment.
  • FIG. 5 is an example of a functional block diagram of a control system of the remote control terminal according to the first embodiment.
  • FIG. 6 is an example of a timing chart during work.
  • FIG. 7 is an example of a perspective view of a work site.
  • FIG. 8 is an example of an image displayed on the display unit of the remote operation terminal.
  • FIG. 9 is an example of a control flowchart in the remote operation terminal according to the first embodiment.
  • FIG. 10 is another example of a perspective view of the work site.
  • FIG. 11 is another example of an image displayed on the display unit of the remote operation terminal.
  • FIG. 12 is a timing chart during work according to the second embodiment.
  • FIG. 13 is a timing chart during work according to the second embodiment.
  • FIG. 14 is an example of a control flowchart in the remote operation terminal according to the second embodiment.
  • FIG. 15 is another example of a control flowchart in the remote operation terminal according to the second embodiment.
  • a first embodiment according to the present invention will be described with reference to FIGS.
  • Examples of the work machine include a mobile crane and an aerial work platform.
  • Examples of the mobile crane include a rough terrain crane, an all-terrain crane, a truck crane, and a truck-mounted loader crane.
  • Although a rough terrain crane will be described as an example of the work machine according to the present embodiment, the remote operation terminal according to the present invention can be applied to other work machines.
  • A radio-control transmitter type will be described as an example, but the present invention can also be applied to a mobile terminal such as a smartphone, a wireless operation terminal such as a tablet terminal, a wired operation terminal, or a computer such as a personal computer.
  • the remote operation terminal according to the present embodiment is not limited to a portable type, and may be an installation type.
  • it may be a remote operation terminal installed in a room different from the room where the work machine is located, for example, a control room (operation room).
  • a rough terrain crane 1 (hereinafter referred to as a crane 1) according to the present embodiment includes a vehicle body 10, a plurality of outriggers 11, a swivel base 12, and a boom 14, as shown in FIG.
  • The vehicle body 10 constitutes the main body portion of a vehicle having a traveling function.
  • the outriggers 11 are respectively provided at the four corners of the vehicle body 10.
  • the turntable 12 is attached to the vehicle body 10 so as to be able to turn horizontally.
  • the boom 14 is attached to a bracket 13 erected on the swivel base 12.
  • the outrigger 11 slides in the width direction of the vehicle body 10 as the slide cylinder expands and contracts, and changes between a retracted state and an extended state. Further, the outrigger 11 extends in the vertical direction of the vehicle body 10 as the jack cylinder expands and contracts, and changes between the jack retracted state and the jack extended state.
  • The terms “width direction”, “front-rear direction”, and “vertical direction” refer to the respective directions of the vehicle body 10 unless otherwise noted.
  • the turntable 12 has a pinion gear to which the power of the turning motor is transmitted.
  • the swivel base 12 rotates about the swivel axis when the pinion gear and the circular gear provided on the vehicle body 10 mesh with each other.
  • the swivel base 12 has a cockpit 18 disposed on the right front side, a bracket 13 disposed in the rear center, and a counterweight 19 disposed in the lower rear portion.
  • the boom 14 has a proximal boom 141, an intermediate boom 142, and a distal boom 143 combined in a nested manner. Such a boom 14 is expanded and contracted by an expansion cylinder arranged inside.
  • a sheave (only the exterior portion is shown in FIG. 1) is rotatably provided on the boom head 144 provided at the forefront of the tip boom 143.
  • a wire rope 16 (hereinafter referred to as a wire 16) is hung around such a sheave.
  • a hook block 17 (hereinafter referred to as a hook 17) is fixed to the tip of the wire 16.
  • the proximal end of the wire 16 is fixed to a winch (not shown).
  • the wire 16 and the hook 17 are wound up or down by the rotation of the winch.
  • The base end portion of the outermost base end boom 141 is rotatably attached to a support shaft installed horizontally on the bracket 13, and the base end boom 141 rises and falls about the support shaft as a rotation center. Further, a hoisting cylinder 15 is bridged between the bracket 13 and the lower surface of the base end boom 141. The hoisting cylinder 15 raises and lowers the entire boom 14 by its own extension and retraction.
  • the jib 30 and the tension rod 20 are stored in a sideways posture on the side surface of the base end boom 141.
  • the jib 30 and the tension rod 20 are mounted / stored using a plurality of pins (not shown) and a side-up cylinder 31.
  • the boom head 144 of the crane 1 is provided with one or more detection devices 145 such as a camera and a radar.
  • the detection device 145 acquires information related to the situation around the crane 1 that is a work machine.
  • the detection device 145 generates a surrounding image (for example, a captured image by a camera or a radar image based on detection by a radar) based on the detected information.
  • the surrounding image generated by the detection device 145 is sent to the remote operation terminal 40 via the communication unit 75 by the control device 70 (see FIG. 4) of the crane 1.
  • the remote operation terminal 40 displays the received surrounding image on the display unit 44 (see FIG. 2).
  • the surrounding image may be sent from the crane 1 to the remote operation terminal 40 at predetermined time intervals. In this case, the surrounding image displayed on the display unit 44 is updated at a predetermined time interval.
  • The detection device 145 is preferably composed of a plurality of (for example, three) cameras. It is preferable that the three cameras photograph the periphery of the crane 1 from three directions: a vertical direction from above, a first horizontal direction perpendicular to the vertical direction, and a second horizontal direction perpendicular to the first horizontal direction.
  • The display unit 44 may individually display the images taken from the three directions. Alternatively, the display unit 44 may display a three-dimensional image generated from the images taken from the three directions by two or more cameras. Moreover, when a radar is used as the detection device 145, the detection device 145 generates a radar image based on the detection information.
  • FIG. 1 shows a detection device 145 constituted by a camera.
  • the detection device 145 is attached to a position (for example, a sheave position) above the hook 17 in the vertical direction.
  • the configuration of the detection device is not limited to the configuration shown in FIG.
  • The remote operation terminal 40 is a remote operation terminal for remotely operating the crane 1, which is an example of a work machine. The remote operation terminal 40 comprises: the display unit 44; a display processing unit that composites a virtual object 80 representing a part of the crane 1 (specifically, the hook 17) or a moving object (specifically, the load P) with a surrounding image showing the situation around the crane 1, and displays the result on the display unit 44; the operation unit 42 for performing a simulated operation of the virtual object 80 on the display unit 44; and the control unit 46 that sends the operation information input from the operation unit 42 during the simulated operation to the crane 1.
  • the remote operation terminal 40 has an operation surface 41. On the operation surface 41, an operation unit 42, a display unit 44, and a stop operation unit 45 are provided. Further, the remote operation terminal 40 has a communication unit 43 for communication connection with the crane 1.
  • the operation unit 42 is an interface for remotely controlling a specific operation of the crane 1.
  • the operation unit 42 is also an interface for moving a virtual object 80 (see FIG. 8) described later displayed on the display unit 44 on the display unit 44 (in other words, performing a simulated operation).
  • Control information including operation information input from the operation unit 42 is sent from the remote operation terminal 40 to the crane 1.
  • the crane 1 operates based on the control information received from the remote operation terminal 40.
  • The remote operation terminal 40 can not only perform the simulated operation described later but also actually (directly) operate the crane 1 using the operation unit 42 or another interface.
  • Alternatively, the remote operation terminal 40 may be configured so that only remote operation of the crane 1 based on the simulated operation described later can be performed.
  • the operation unit 42 includes an operation stick 42a and an operation button 42b.
  • the operation unit 42 is not limited to the illustrated case, and may have other interfaces.
  • the number of operation units 42 provided in one remote operation terminal 40 may be one (for example, only the operation stick 42a), or may be two or more.
  • the operation stick 42a is arranged, for example, on the left side of the operation surface 41 in the example shown in FIG.
  • By tilting the operation stick 42a in a predetermined direction, the user can move a member moved by remote operation, such as the hook 17 (hereinafter referred to as the “movement target”), in a direction corresponding to the predetermined direction at a speed corresponding to the operation amount.
  • the user can move the virtual object 80 displayed on the display unit 44 in a direction corresponding to the predetermined direction at a speed corresponding to the operation amount by tilting the operation stick 42a as described above in a predetermined direction.
  • The moving speed of the virtual object 80 increases as the operation stick 42a is tilted further, and decreases as the operation stick 42a is returned toward the upright position.
  • the operation button 42b is arranged, for example, on the right side of the operation surface 41 in the example shown in FIG.
  • Examples of the operation button 42b include an up button 42bup for raising the hook 17 and the virtual object 80, and a down button 42bdown for lowering the hook 17 and the virtual object 80.
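The relationship described above between the tilt of the operation stick 42a and the movement of the virtual object 80 can be sketched as follows. The gain constants, the normalized tilt range, and the function name are illustrative assumptions, not values given in this publication.

```python
import math

# Hypothetical constants; the publication does not specify actual gains.
MAX_SPEED = 1.0   # maximum virtual-object speed at full stick tilt
MAX_TILT = 1.0    # normalized full tilt of the operation stick 42a

def virtual_object_velocity(tilt_amount: float, tilt_direction_rad: float):
    """Map a stick tilt to a planar velocity for the virtual object 80.

    Speed is proportional to the operation amount, and the movement
    direction corresponds to the tilt direction, as described for the
    operation stick 42a.
    """
    speed = MAX_SPEED * min(abs(tilt_amount), MAX_TILT) / MAX_TILT
    return (speed * math.cos(tilt_direction_rad),
            speed * math.sin(tilt_direction_rad))
```

With this mapping, a fully tilted stick moves the virtual object at the maximum speed, and an upright stick (zero tilt) leaves it stationary, matching the behavior described for the operation stick.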
  • a display unit 44 is disposed at the center of the operation surface 41, for example.
  • the display unit 44 displays a surrounding image generated based on the detection information of the detection device 145.
  • the number of display units 44 provided in the remote operation terminal 40 may be one or plural.
  • the display unit 44 may have a configuration in which three display units are arranged side by side.
  • images captured from different directions may be displayed on each of the three display units.
  • the images of each camera may be individually displayed on the display unit 44.
  • the image displayed on the display unit 44 may be one new three-dimensional image generated from a plurality (for example, two or more) of camera videos.
  • a configuration in which a virtual three-dimensional image is displayed on the display unit 44 may be used.
  • an image obtained by converting a virtual three-dimensional space into a two-dimensional image viewed from a predetermined viewpoint position may be displayed on the display unit 44.
  • the display unit 44 includes, for example, at least one display of a liquid crystal display, an organic EL (Electro Luminescence) display, and an inorganic EL display.
  • the display unit 44 may include a touch sensor (not shown).
  • the display unit 44 does not need to be integrated with another interface such as the operation unit 42.
  • The display unit 44 may be, for example, a head-mounted display (HMD).
  • The HMD has optical units corresponding to the wearer's left and/or right eyes and can present information at least visually (for example, it may also be configured to present audio).
  • the wearer can visually recognize the operation image (video) presented through the HMD. Therefore, the wearer can operate intuitively while viewing the operation image by the HMD without confirming the input operation surface.
  • the operation image presented to the wearer may be a virtual reality image (video) generated based on the detection information of the detection device 145.
  • the HMD can project different images on the left and right eyes of the wearer.
  • the HMD can present a 3D image to the wearer by displaying an image with parallax in the left and right eyes. As described above, even when the display unit is the HMD, the surrounding image and the virtual object received from the crane 1 are displayed on the HMD, as in the case of the display unit 44 described above.
  • the remote operation terminal 40 preferably includes a stop operation unit 45 for stopping the operation of the actuator 50 of the crane 1 when a virtual object described later collides with an obstacle or the like.
  • FIG. 2 shows an example in which a button type interface is arranged above the operation stick 42a as the operation interface of the stop operation unit 45.
  • The operation interface of the stop operation unit 45 may be another interface such as a switch type. Further, the operation interface of the stop operation unit 45 may be provided at a position different from that shown in FIG. 2.
  • the control unit 46 of the remote operation terminal 40 includes, for example, a CPU (Central Processing Unit) 81, a ROM (Read Only Memory) 82, and a RAM (Random Access Memory) 83. Each component is connected to each other via a bus 84.
  • the CPU 81 comprehensively controls the remote operation terminal 40 while appropriately accessing the RAM 83 or the like as necessary to perform various arithmetic processes.
  • the ROM 82 is a non-volatile memory that stores an OS (Operating System) to be executed by the CPU 81, a program, and firmware such as various parameters.
  • the RAM 83 is used as a work area for the CPU 81 and temporarily holds the OS, the application being executed, the data being processed, and the like.
  • the remote operation terminal 40 includes a communication unit 43, a display unit 44, an operation unit 42, and a stop operation unit 45 (see FIG. 2) as input / output interfaces.
  • the communication unit 43 is a module for communication connection with the crane 1.
  • the display unit 44 is a display device. Since the display unit 44 has already been described, detailed description thereof is omitted here.
  • the operation unit 42 is an input device such as an operation stick or an operation button. The user remotely operates a specific operation of the crane 1 using the operation unit 42.
  • the operation unit 42 includes an operation interface 85, an operation sensor 86, and an operation input processing unit 87.
  • the operation interface 85 is for inputting a movement operation of the hook 17.
  • the operation sensor 86 detects operation information (for example, information about the operation amount of the operation stick 42a, information about the operation direction, etc.) input from the operation interface 85.
  • the operation sensor 86 sends the detected operation information to the operation input processing unit 87.
  • the operation input processing unit 87 calculates an operation command signal based on the operation information received from the operation sensor 86. Then, the operation input processing unit 87 sends the calculated operation command signal to the control unit 46.
  • the control unit 46 generates control information including the received operation command signal, and sends the control information to the crane 1.
  • The control device 70 of the crane 1 drives the actuator 50 based on the received control information to move the movement target (for example, the hook 17).
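The chain described above, from the operation sensor 86 through the operation input processing unit 87 and the control unit 46 to the crane 1, can be illustrated roughly as below. The data structures and field names are hypothetical, since the publication does not define the signal formats.

```python
from dataclasses import dataclass

@dataclass
class OperationInfo:
    """Raw operation information as detected by the operation sensor 86
    (illustrative fields: operation direction and normalized amount)."""
    direction: str   # e.g. "up", "down", "left", "right"
    amount: float    # normalized operation amount, 0.0 to 1.0

def process_operation(info: OperationInfo) -> dict:
    """Operation input processing unit 87: calculate an operation command
    signal from the operation information (structure is illustrative)."""
    return {"command": "move", "direction": info.direction, "speed": info.amount}

def build_control_info(command: dict) -> dict:
    """Control unit 46: wrap the operation command signal into control
    information to be sent to the crane 1 via the communication unit 43."""
    return {"destination": "crane1", "payload": command}
```

On the crane side, the control device 70 would unpack the payload and drive the actuator 50 accordingly; that step is omitted here.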
  • the stop operation unit 45 is an input device such as an operation button, and is an interface that stops the operation of the crane 1.
  • the crane 1 includes at least an actuator 50, a control device 70, a communication unit 75, and a detection device 145.
  • the actuator 50 corresponds to a work device, and has one or more cylinders and motors.
  • the actuator 50 is configured to receive control information from the control device 70.
  • the control device 70 has at least a bus, an arithmetic device, a storage device, and the like.
  • the control device 70 is configured to be able to transmit / receive control information to / from the communication unit 75 and configured to be able to receive detection information from the detection device 145.
  • control device 70 is configured to be able to transmit control information to the actuator 50.
  • FIG. 4 shows a configuration in which the control valve 60 can transmit control information to the actuator 50. In this case, it can be said that the control valve 60 and the control device 70 correspond to the control device.
  • the communication unit 75 is configured to be able to transmit and receive control information and / or images to and from the remote operation terminal 40.
  • the communication unit 75 is configured to be able to transmit and receive control information and / or images to and from the control device 70. That is, the remote control terminal 40 and the crane 1 (specifically, the control device 70) can transmit and receive control information and / or images.
  • the detection device 145 is configured to be able to transmit detection information to the control device 70.
  • the detection device 145 may include an image generation unit (not shown) that generates an image based on detection information. In this case, the detection device 145 transmits the image (for example, surrounding image) generated by the image generation unit to the control device 70.
  • the crane 1 usually has an operation unit 71.
  • the crane 1 operates based on operation information of the operation unit 71.
  • the operation unit 71 transmits operation information based on the operation to the control device 70.
  • Examples of the operation unit 71 include a turning lever 71a, a hoisting lever 71b, an extendable lever 71c, and a main winch lever 71d.
  • the remote operation terminal 40 includes a control unit 46, a communication unit 43, a display unit 44, an operation unit 42, and a stop operation unit 45.
  • the control unit 46 appropriately transmits / receives information to / from each unit configuring the remote operation terminal 40.
  • the control unit 46 is configured to be able to receive operation information from the operation unit 42 and the stop operation unit 45.
  • the control unit 46 is configured to be able to transmit and receive control information and / or images to and from the communication unit 43. Further, the control unit 46 is configured to be able to transmit and receive images to and from the display unit 44.
  • the control unit 46 causes the display unit 44 to display the surrounding image received from the crane 1. Further, the control unit 46 superimposes (combines) and displays a predetermined virtual object image (hereinafter simply referred to as “virtual object”) on the surrounding image displayed on the display unit 44.
  • the virtual object is an image showing a part of the crane 1 or a movement target.
  • Such a control unit 46 is also a display processing unit.
  • When the display unit 44 is divided into a plurality of display unit elements, the control unit 46 displays a surrounding image from a different viewpoint (for example, one of the above-described captured images from the three directions) on each display unit element. In this case, the control unit 46 displays a virtual object on each display unit element.
  • Based on the operation information input from the operation unit 42, the control unit 46 determines the position that the movement target would reach if the crane 1 were operated (hereinafter referred to as the “estimated position of the movement target”). At this time, the movement target is not actually moved.
  • the control unit 46 displays a virtual object at a position corresponding to the estimated position of the movement target (hereinafter referred to as “position after movement”) in the surrounding image displayed on the display unit 44.
  • When the display unit 44 is divided into a plurality of display unit elements, the control unit 46 displays a virtual object at the position after movement in each display unit element.
  • the communication unit 43 is configured to be able to transmit and receive control information and / or images to and from the communication unit 75 of the crane 1.
  • the communication unit 43 is configured to be able to transmit and receive control information and / or images to and from the control unit 46.
  • the display unit 44 is configured to be able to transmit and receive images to and from the control unit 46. Since the display unit 44 has already been described, detailed description thereof is omitted here.
  • the operation unit 42 and the stop operation unit 45 are each configured to be able to transmit operation information to the control unit 46. Since the operation unit 42 and the stop operation unit 45 have already been described, detailed description thereof is omitted here.
  • The vertical axis indicates whether or not there has been an operation input from the operation unit 42 (ON: operation input present; OFF: no operation input).
  • the horizontal axis indicates time (t).
  • The vertical axis indicates whether or not the hook 17 is actually moving based on the operation input (ON: moving; OFF: not moving).
  • the horizontal axis indicates time (t).
  • The vertical axis indicates whether or not the hook 17 is moving in the image displayed on the display unit 44 (ON: moving, corresponding to the actual movement of the hook 17; OFF: not moving).
  • the horizontal axis indicates time (t).
  • It takes time (that is, a delay occurs) from when an operation is input at the remote operation terminal 40 until the hook 17 actually moves. In the example shown in FIG. 6, this delay time is assumed to be 5 seconds (downlink delay: 5 seconds). Similarly, it takes time until the surrounding image of the crane 1 based on the detection information of the detection device 145 is transmitted to the remote operation terminal 40 and displayed on the display unit 44. In the example shown in FIG. 6, this delay time is assumed to be 6 seconds (uplink delay: 6 seconds).
  • Since the user operating the remote operation terminal 40 from a remote place cannot confirm the actual movement of the hook 17 on the display unit 44 during the first 11 seconds (0 to 11 seconds), the user must move the hook 17 while predicting its movement. Moreover, even when the operation is finished at 30 seconds, the hook 17 on the display unit 44 is still in the middle of its movement (it is shown at the position reached after 19 seconds of movement), so the user must also finish the operation while anticipating the remaining movement.
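The delay arithmetic of this example (downlink 5 seconds, uplink 6 seconds, hence 11 seconds before any movement becomes visible) can be captured in a small helper. The function name is an illustrative assumption.

```python
DOWNLINK_DELAY = 5.0  # seconds from operation input until the hook 17 actually moves
UPLINK_DELAY = 6.0    # seconds from detection until display on the display unit 44

def displayed_motion_time(t: float) -> float:
    """How many seconds of actual hook motion are visible on the display
    at time t (seconds) after the operation input started, using the
    FIG. 6 example delays. Before the round-trip delay elapses, nothing
    of the motion is visible yet."""
    round_trip = DOWNLINK_DELAY + UPLINK_DELAY  # 11 seconds in this example
    return max(0.0, t - round_trip)
```

This reproduces the figures in the text: at 11 seconds the display still shows no movement, and when the operation ends at 30 seconds the display shows only the first 19 seconds of motion.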
  • the remote operation terminal 40 superimposes and displays a virtual object corresponding to the movement target on the surrounding image displayed on the display unit 44.
  • Before the operation, the virtual object is displayed superimposed on the position of the movement target image in the surrounding image displayed on the display unit 44 (hereinafter referred to as the “position before movement”).
  • the virtual object moves with respect to the surrounding image displayed on the display unit 44 in accordance with the operation information input from the operation unit 42.
  • the virtual object moves in the display unit 44 under the same conditions as the movement of the movement target when it is assumed that the crane 1 is operated based on the operation information. That is, the user can perform a simulated operation (simulation) of the actual movement of the movement target by moving the virtual object with respect to the surrounding image.
  • In the present embodiment, the movement target is the hook 17 of the crane 1.
  • the virtual object is displayed superimposed on the position of the hook 17 in the surrounding image displayed on the display unit 44.
  • the control unit 46 calculates the estimated position of the movement target based on the operation information.
  • the control unit 46 displays the virtual object at a position corresponding to the estimated position of the movement target (that is, the position after the movement described above) in the surrounding image displayed on the display unit 44. That is, the virtual object moves from the position before the movement to the position after the movement in the display unit 44.
  • the operation input processing unit 87 of the operation unit 42 sends operation information of the operation unit 42 (for example, information on the operation direction of the operation stick 42a and information on the operation amount) to the control unit 46.
  • Based on the operation information, the control unit 46 calculates the position of the hook 17 assuming that neither of the two types of delay exists (that is, the estimated position of the movement target described above).
  • the control unit 46 superimposes a predetermined virtual object determined in advance on a position corresponding to the calculated estimated position of the movement target in the surrounding image. Then, the control unit 46 performs control to display the superimposed image on the display unit 44 after superimposition.
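A minimal sketch of this estimation-and-superimposition step, assuming a constant-velocity position estimate and images represented as 2-D pixel grids; neither the estimation model nor the image representation is specified in the publication.

```python
def estimate_position(current, velocity, dt):
    """Estimated position of the movement target assuming zero delay:
    integrate the commanded velocity from the current position
    (constant-velocity model assumed for illustration)."""
    return tuple(c + v * dt for c, v in zip(current, velocity))

def overlay_virtual_object(surrounding_image, sprite, screen_pos):
    """Display processing: superimpose the virtual-object sprite onto the
    surrounding image at the screen position corresponding to the
    estimated position. Images are 2-D lists of pixel values; the input
    image is left unmodified and a composited copy is returned."""
    x0, y0 = screen_pos
    out = [row[:] for row in surrounding_image]
    for dy, row in enumerate(sprite):
        for dx, px in enumerate(row):
            if 0 <= y0 + dy < len(out) and 0 <= x0 + dx < len(out[0]):
                out[y0 + dy][x0 + dx] = px
    return out
```

A real implementation would also convert the estimated 3-D position into screen coordinates; that mapping is omitted here.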
  • FIG. 7 is a perspective view of an actual work site.
  • FIG. 8 shows an image displayed on the display unit 44 of the remote operation terminal 40.
  • FIG. 8 shows the display unit 44 divided into a plurality of display unit elements.
  • Each display unit element displays a peripheral image of a different viewpoint (for example, a photographed image from the three directions described above) and a virtual object 80.
  • FIG. 8 shows an image around the crane 1 shown in FIG.
  • The surrounding image includes an actual image of the hook 17 and the load P shown in FIG. 7 (hereinafter referred to as the “movement target image”).
  • The virtual object 80 corresponding to the hook 17 and the load P shown in FIG. 7 is displayed at a position shifted from the positions of the hook 17 and the load P in the surrounding image.
  • FIG. 8 shows a state in which the user has operated the operation unit 42 to move the virtual object 80 on the display unit 44 from the position before movement to the position after movement shown in FIG. 8.
  • The position before movement is the position of the hook 17 and the load P in the surrounding image displayed on the display unit 44.
  • Before the operation, the virtual object 80 is located at the position before movement. The user can further move the virtual object 80 with respect to the surrounding image shown in FIG. 8 by operating the operation unit 42.
  • The user works while looking at the virtual object 80, rather than at the hook 17 or the load P, on the screen of the display unit 44. The hook 17 and the load P in the image displayed on the display unit 44 then move along the movement locus of the virtual object 80 after a delay (11 seconds in the above example).
  • the virtual object 80 is data obtained by projecting the virtual object 80 arranged in the virtual three-dimensional space into the two-dimensional space.
  • The virtual object 80 is an image obtained by converting (viewpoint conversion) the virtual object 80 arranged in the virtual three-dimensional space into a two-dimensional image viewed from a predetermined viewpoint position.
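The viewpoint conversion mentioned above can be illustrated with a simple pinhole-style projection. The publication does not fix a projection model, so this is only an assumed example.

```python
def project_point(point3d, viewpoint_z, focal=1.0):
    """Viewpoint conversion sketch: perspective-project a 3-D point in the
    virtual space onto a 2-D image plane, viewed from a viewpoint placed
    on the z-axis at height viewpoint_z (pinhole camera model assumed)."""
    x, y, z = point3d
    depth = viewpoint_z - z
    if depth <= 0:
        raise ValueError("point is behind the viewpoint")
    return (focal * x / depth, focal * y / depth)
```

Projecting each vertex of the virtual object 80 this way yields the two-dimensional image displayed on the display unit 44.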
  • the virtual object 80 is a virtual object that is not included in the real space.
  • the display mode of the display unit 44 is not limited to the case of FIG. 8, and a configuration in which a virtual three-dimensional image in which the virtual object 80 is arranged is displayed on the display unit 44 may be used.
  • the virtual object 80 is image data that can be handled by the display unit 44, for example.
  • The image data of the virtual object 80 is preferably image data of a captured image of the hook 17 and the load P captured at a timing different from that of the real-space image, but the present invention is not limited in this respect.
  • the virtual object 80 may be, for example, an image created separately by an external device or image data of a printed matter.
  • The virtual object 80 may be an image that schematically represents the movement target (for example, the hook 17 and the load P).
  • the mobile crane which moves the load P using the hook 17 was demonstrated as a working machine, this invention is not limited in this point, For example, it has a holding device and the load P is attached. It may be a mobile crane that is gripped and moved. Further, the present invention can be applied even when the working machine has a work floor such as a bucket, a basket, or a platform like an aerial work vehicle. In these cases, it is preferable that the virtual object 80 is an image relating to a gripping device or a work floor.
  • in step S1, when the remote operation terminal 40 is turned on, the control unit 46 acquires a surrounding image of the crane 1.
  • specifically, the control unit 46 acquires the surrounding image as follows.
  • the surrounding image is generated based on the detection information of the detection device 145.
  • the surrounding image generated by the detection device 145 is transmitted from the crane 1 to the remote operation terminal 40 via the communication unit 75.
  • the remote operation terminal 40 receives the surrounding image via the communication unit 43.
  • the communication unit 43 sends the received surrounding image to the control unit 46.
  • in this way, the control unit 46 acquires the surrounding image of the crane 1.
  • the control unit 46 may acquire the surrounding image at predetermined time intervals.
  • in step S2, the control unit 46 displays the acquired surrounding image on the display unit 44.
  • the control unit 46 may update the surrounding image on the display unit 44 at predetermined time intervals.
  • in step S3, the control unit 46 determines whether an input operation has been detected by the operation sensor 86 of the operation unit 42. At this time, the control unit 46 makes this determination based on, for example, the operation information (input signal) transmitted from the operation sensor 86.
  • when determining that no input operation has been detected (step S3: No), the control unit 46 repeats the detection routine until an input operation is detected. On the other hand, when determining that an input operation has been detected (step S3: Yes), the control unit 46 calculates, in step S4, the estimated position of the movement target described above based on the operation information input from the operation unit 42.
  • in step S5, the control unit 46 superimposes (synthesizes) the virtual object 80 on the surrounding image displayed on the display unit 44. That is, the control unit 46 generates an image in which the virtual object 80 is superimposed on the surrounding image (hereinafter referred to as a "superimposed image").
  • the position at which the virtual object 80 is superimposed corresponds, in the surrounding image, to the estimated position of the movement target calculated in step S4 (that is, the post-movement position described above).
  • in step S6, the control unit 46 displays the superimposed image on the display unit 44. Thereafter, the control unit 46 returns the control process to step S3.
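  • the loop of steps S1 to S6 above can be summarized in a minimal illustrative Python sketch. This sketch is not part of the patent disclosure; all function names, data structures, and the simple constant-velocity position estimate are hypothetical simplifications of the described behavior.

```python
# Illustrative sketch of the display-side loop (steps S3-S6).
# The position estimate and image representation are hypothetical.

def estimate_position(current_pos, velocity, dt):
    """Step S4: estimate where the movement target will be, from the
    operation input (here a velocity command) and the elapsed time dt."""
    x, y = current_pos
    vx, vy = velocity
    return (x + vx * dt, y + vy * dt)

def superimpose(surrounding_image, virtual_object, position):
    """Step S5: combine the surrounding image with the virtual object
    placed at the estimated position (modeled here as a simple dict)."""
    return {"image": surrounding_image, "overlay": virtual_object, "at": position}

# One pass: an input operation was detected (step S3: Yes), so the terminal
# computes the estimated position and builds the superimposed image (step S6
# would then show `frame` on the display unit).
frame = superimpose("surrounding_image_t0", "hook_and_load_sprite",
                    estimate_position((0.0, 0.0), (0.5, 0.0), dt=2.0))
```

  • the point of the sketch is that the overlay position is computed locally from the operation input, so the displayed virtual object responds immediately, independent of the communication delay.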
  • in this way, the user can operate the remote operation terminal 40 while viewing, on the screen of the display unit 44, the virtual object 80 at the position calculated based on the input operation from the operation unit 42. Therefore, the user can perform the operation without being aware of the delay due to communication.
  • when the operation unit 42 is operated by the user, the control unit 46 generates control information including the operation information input from the operation unit 42 and sends it to the crane 1.
  • the crane 1 operates based on the received control information.
  • specifically, the crane 1 operates so that the movement target (for example, the hook 17 or the load P) reproduces the movement of the virtual object 80 on the display unit 44.
  • as a result, the movement target actually moves in the same manner as the virtual object 80 moved on the display unit 44.
  • FIG. 10 is a perspective view of an actual work site.
  • FIG. 11 shows an image displayed on the display unit 44 of the remote operation terminal 40.
  • the vertical axis indicates whether there is an operation input from the operation unit 42 (ON: operation input present; OFF: no operation input).
  • the horizontal axis represents time (t).
  • the vertical axis indicates whether the hook 17 is actually moving based on the operation input (ON: moving; OFF: not moving).
  • the horizontal axis represents time (t).
  • the vertical axis indicates whether, in the image displayed on the display unit 44, the hook 17 is moving in correspondence with the actual movement of the hook 17 (ON: moving; OFF: not moving).
  • the horizontal axis represents time (t).
  • a case will be described in which the hook 17 is operated to move from 0 to 30 seconds as shown in FIG. 12, but the operation is stopped at the 15-second mark because the virtual object 80 collides with the obstacle B as shown in FIG.
  • the stop of the operation mentioned here may be either stopping the input at the operation unit 42 of the remote operation terminal 40 or performing a stop input with the stop operation unit 45.
  • the control information (in other words, the operation command) sent from the remote operation terminal 40 to the crane 1 is delayed by 5 seconds due to the downlink delay, so the hook 17 moves from 5 to 20 seconds.
  • the crane 1 receives the control information including the stop command at the 20-second mark and stops the movement of the hook 17.
  • due to the 6-second uplink delay, the user actually confirms on the screen of the display unit 44 that the hook 17 and the load P have collided with the obstacle B at the 26-second mark.
  • in this embodiment, a further delay, distinct from the two types of delay described above, is provided in the time from when an operation input is made at the remote operation terminal 40 until the crane 1 actually executes the operation.
  • this further delay is intentionally set, not caused by communication. The further delay may be set by the user, or may be stored in advance in a storage device such as the ROM 82 of the remote operation terminal 40.
  • the vertical axis indicates whether there has been an operation input from the operation unit 42 (ON: operation input present; OFF: no operation input).
  • the horizontal axis represents time (t).
  • the vertical axis indicates whether the hook 17 is actually moving based on the operation input (ON: moving; OFF: not moving).
  • the horizontal axis represents time (t).
  • the vertical axis indicates whether, in the image displayed on the display unit 44, the hook 17 is moving in correspondence with the actual movement of the hook 17 (ON: moving; OFF: not moving).
  • the horizontal axis represents time (t).
  • here, the stop of the operation means that a stop input is performed with the stop operation unit 45.
  • the stop command based on the stop input is sent from the remote operation terminal 40 to the crane 1 without the further delay.
  • the operation command is delayed by 5 seconds due to the downlink delay, and in this embodiment a delay of 20 seconds is additionally provided as the further delay.
  • accordingly, the hook 17 and the load P are scheduled to move from 25 to 40 seconds.
  • a stop input is performed with the stop operation unit 45 at the 15-second mark.
  • the stop command based on the stop input is sent from the remote operation terminal 40 to the crane 1 without the further delay. For this reason, the crane 1 receives the stop command at the 20-second mark, 5 seconds after the remote operation terminal 40 sent it at the 15-second mark.
  • the crane 1 cancels the scheduled movement of the hook 17 and the load P at the 20-second mark, when the stop command is received. As a result, the hook 17 and the load P do not move, and the collision with the obstacle B is avoided. According to the present embodiment, it is easy to avoid a collision between the hook 17 and the obstacle B, and to redo an operation after a user's operation error.
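  • the timing arithmetic of the two scenarios above can be checked with a short calculation. This is an illustrative sketch only, not part of the patent disclosure; the function names are hypothetical, while the delay values (5-second downlink, 20-second further delay, stop input at 15 seconds) are the ones given in the text.

```python
# Timing check for the FIG. 12 scenario (no further delay) versus the
# FIG. 13 scenario (20-second further delay). All times in seconds.

def crane_receives_stop(stop_input_time, downlink_delay):
    # The stop command is sent without the further delay.
    return stop_input_time + downlink_delay

def crane_starts_moving(first_input_time, downlink_delay, further_delay):
    return first_input_time + downlink_delay + further_delay

# Without the further delay: the hook already starts moving at 5 s, so the
# stop command arriving at 20 s cannot prevent the collision.
stop_arrival = crane_receives_stop(15, 5)       # 20 s
start_without = crane_starts_moving(0, 5, 0)    # 5 s  -> movement is underway

# With a 20 s further delay: movement would start at 25 s, after the stop
# command has already arrived at 20 s, so the hook never moves.
start_with = crane_starts_moving(0, 5, 20)      # 25 s
collision_avoided = stop_arrival < start_with   # True
```

  • the design point is that the stop command bypasses the further delay while the operation command does not, so the stop always "overtakes" the scheduled movement as long as the further delay exceeds the stop-reaction time.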
  • the further delay time is not limited as long as it is longer than 0 seconds. Considering the time it takes for the user to perform a stop input with the stop operation unit 45 after confirming the collision between the virtual object 80 and the obstacle B, the further delay time is preferably 5 seconds or more.
  • the further delay time is preferably longer than the time assumed for a series of work.
  • for example, the further delay is preferably 30 seconds or more.
  • the further delay is preferably 300 seconds or more.
  • in that case, the user can start the actual movement of the hook 17 and the load P after confirming on the screen of the display unit 44 that the virtual object 80 has completed all the operations safely.
  • when the further delay is managed by the remote operation terminal 40, there are the following two methods by which the control unit 46 of the remote operation terminal 40 sends control information to the crane 1.
  • the control information includes the operation information input from the operation unit 42 of the remote operation terminal 40.
  • the first method is a method in which the control unit 46 transmits the control information relating to the operation of the crane 1, based on the operation of the operation unit 42, to the crane 1 after a predetermined time (the further delay time) has elapsed.
  • in this case, the crane 1 that has received the control information immediately starts the operation based on the operation information included in the control information.
  • the second method is a method in which the control unit 46 transmits to the crane 1 control information including an instruction to start the operation after a predetermined time has elapsed.
  • in this case, the crane 1 that has received the control information starts the operation based on the operation information included in the control information after the predetermined time has elapsed since the control information was received.
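  • the two methods can be contrasted as events on a common timeline. The following is an illustrative sketch only, not part of the patent disclosure: the functions model the delays as arithmetic rather than real network transmission, and all names are hypothetical.

```python
# Two ways of realizing the further delay, modeled as timeline events.

def method1_events(input_time, further_delay, downlink_delay):
    """Method 1: the terminal holds the control information for
    `further_delay` seconds before sending; the crane acts immediately
    on reception (plus the downlink delay)."""
    send_time = input_time + further_delay
    act_time = send_time + downlink_delay
    return {"send": send_time, "act": act_time}

def method2_events(input_time, further_delay, downlink_delay):
    """Method 2: the terminal sends immediately, but the control
    information carries a delayed-start instruction; the crane waits
    `further_delay` seconds after reception before acting."""
    send_time = input_time
    act_time = send_time + downlink_delay + further_delay
    return {"send": send_time, "act": act_time}

m1 = method1_events(0, 20, 5)
m2 = method2_events(0, 20, 5)
```

  • both methods make the crane act at the same moment; they differ only in where the waiting happens (terminal side versus crane side), which matters for where a stop command can intervene.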
  • in step S11 of FIG. 14, the control unit 46 determines whether there is an input operation at the operation unit 42. Specifically, the control unit 46 makes this determination based on the detection result of the operation information at the operation unit 42 (specifically, the operation sensor 86).
  • when determining that there is no input operation at the operation unit 42 (step S11: No), the control unit 46 repeats the determination routine until it determines that there is an input operation at the operation unit 42.
  • when determining that there is an input operation (step S11: Yes), the control unit 46 acquires an operation command signal from the operation unit 42 in step S12 of FIG. 14.
  • the operation command signal is generated by the operation input processing unit 87 of the operation unit 42, as described above. The control unit 46 then generates control information including the received operation command signal.
  • in step S13 of FIG. 14, the control unit 46 determines whether a predetermined time (the further delay time) has elapsed.
  • when determining that the predetermined time has elapsed (step S13: Yes), the control unit 46 transmits the control information (operation command) to the crane 1 in step S14 of FIG. 14. The control unit 46 then returns the control process to step S11.
  • in step S21 of FIG. 15, the control unit 46 determines whether there is an input operation at the operation unit 42. Specifically, the control unit 46 makes this determination based on the detection result of the operation information at the operation unit 42 (specifically, the operation sensor 86).
  • when determining that there is no input operation at the operation unit 42 (step S21: No), the control unit 46 repeats the determination routine until it determines that there is an input operation at the operation unit 42.
  • when determining that there is an input operation at the operation unit 42 (step S21: Yes), the control unit 46 acquires an operation command signal from the operation unit 42 in step S22 of FIG. 15. The control unit 46 then generates control information including the received operation command signal.
  • next, the control unit 46 transmits the control information (operation command) to the crane 1.
  • the control information in this case includes operation information relating to the operation, such as the moving direction and moving speed of the hook 17, and delay control information for controlling the movement of the hook 17 so that it starts after the predetermined time.
  • the delay control information is information relating to the further delay described above.
  • the control unit 46 then returns the control process to step S21.
  • although the case where the further delay is managed by the remote operation terminal 40 has been described, the present invention is not limited in this respect, and the further delay may be managed on the crane 1 side.
  • for example, the control device 70 of the crane 1 that has received the control information from the remote operation terminal 40 may transmit the control information to the actuator 50 after the predetermined time has elapsed, or may transmit to the actuator 50 control information instructing that the work be performed after the predetermined time. By such methods, the further delay can be managed on the crane 1 side.
  • in any case, after the input operation at the operation unit 42, the crane 1 starts the actual operation after a predetermined time (that is, a time corresponding to the sum of the downlink delay time and the further delay time) has elapsed. Therefore, even when the crane 1 is expected to collide with the obstacle B, the collision between the crane 1 and the obstacle B can be avoided by stopping the crane 1 during this predetermined time.
  • in addition, the crane 1 can be started after the user has confirmed, from the movement locus of the virtual object 80 through a series of operations, that there is no problem.
  • Embodiment 3 according to the present invention will be described below.
  • the user moves the operation stick 42a in a predetermined direction, so that the hook 17 moves in the corresponding direction at a speed corresponding to the operation amount.
  • when the control unit 46 transmits control information to the crane 1, it is preferable that the control information is set so that the acceleration of the hook 17 calculated based on the operation of the operation unit 42 does not exceed a predetermined acceleration.
  • in this embodiment, the control unit 46 transmits the control information to the crane 1 so that the acceleration of the hook 17 calculated based on the operation of the operation unit 42 is equal to or less than the predetermined acceleration. For this reason, even when a load sway occurs in the operation of the virtual object 80, the work can be performed without causing load sway when the operation is reproduced by the crane 1.
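  • the acceleration limit described above can be sketched as a simple rate limiter on the commanded speed. This is an illustrative sketch only, not part of the patent disclosure; the function name, parameters, and limit value are hypothetical.

```python
# Limit the commanded speed change so the implied acceleration of the
# hook never exceeds max_accel (units: m/s and m/s^2, dt in seconds).

def limit_acceleration(prev_speed, commanded_speed, dt, max_accel):
    """Clamp the speed change per control step so that
    |acceleration| <= max_accel."""
    max_delta = max_accel * dt
    delta = commanded_speed - prev_speed
    if delta > max_delta:
        delta = max_delta
    elif delta < -max_delta:
        delta = -max_delta
    return prev_speed + delta

# Even if the stick input jumps from 0 to 2.0 m/s instantly, the speed in
# the transmitted control information ramps up smoothly.
s1 = limit_acceleration(0.0, 2.0, dt=0.1, max_accel=1.0)  # 0.1 m/s
s2 = limit_acceleration(s1, 2.0, dt=0.1, max_accel=1.0)   # 0.2 m/s
```

  • clamping at the terminal side means an abrupt simulated motion of the virtual object is reproduced by the crane as a smooth motion, which is why load sway is suppressed.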
  • Embodiment 4 according to the present invention will be described below. Even when the user operating the remote operation terminal 40 determines that there is no problem with the movement of the virtual object 80 based on the operation of the operation unit 42, it may become necessary to stop the work suddenly due to the appearance of a new obstacle or the like. A new obstacle may appear, for example, when a truck arrives at the work site or another worker enters the work site.
  • in this embodiment, when the detection device 145 detects an object that can collide with the movement target on the movement locus of the movement target calculated based on the control information received from the remote operation terminal 40, the control device 70 of the crane 1 (not the control unit 46 of the remote operation terminal 40) transmits control information to the actuator 50 so as to stop the operation of the crane 1 based on the operation information of the operation unit 42.
  • that is, the crane 1 determines whether the work can be performed safely based on, for example, at least one of the received control information (for example, the movement locus of the hook 17) and the surrounding situation of the crane 1 at that time. When determining that the work cannot be performed safely, the crane 1 stops the operation based on the control information.
  • in this way, a danger is detected on the actual machine (that is, crane 1) side, and the operation of the crane 1 is stopped based on this detection, whereby an unexpected danger can be avoided.
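  • the crane-side safety check can be sketched as a trajectory-versus-obstacle test. This is an illustrative sketch only, not part of the patent disclosure; the 2D geometry, clearance value, and function name are hypothetical simplifications of what the control device 70 would do with the detection information.

```python
# Before executing received control information, check whether any detected
# object lies close to the planned movement trajectory of the movement target.

def is_trajectory_safe(trajectory, detected_objects, clearance=1.0):
    """Return False if any detected object is within `clearance` of any
    point on the planned trajectory (all points are (x, y) tuples)."""
    for tx, ty in trajectory:
        for ox, oy in detected_objects:
            if ((tx - ox) ** 2 + (ty - oy) ** 2) ** 0.5 < clearance:
                return False
    return True

planned = [(0, 0), (1, 0), (2, 0), (3, 0)]
safe = is_trajectory_safe(planned, detected_objects=[(10, 10)])     # clear path
blocked = is_trajectory_safe(planned, detected_objects=[(2, 0.5)])  # stop needed
```

  • when the check fails, the control device would withhold the control information from the actuator 50 instead of executing it, which is the stop behavior described above.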
  • the remote operation terminal 40 (specifically, the control unit 46) sends control information to the crane 1 by one of the two methods described below.
  • the first method can be adopted in the configuration of the first embodiment described above.
  • in the first method, each time operation information is input, the remote operation terminal 40 (specifically, the control unit 46) sequentially generates control information based on the operation information and sends it to the crane 1.
  • the first method may also be employed in the configuration of the second embodiment described above in which the further delay is managed on the crane 1 side.
  • the second method can be employed in the configuration of the second embodiment described above in which the further delay is managed on the remote operation terminal 40 side.
  • in the second method, when operation information is input from the operation unit 42, the remote operation terminal 40 (specifically, the control unit 46) generates control information based on the operation information.
  • the remote operation terminal 40 then sequentially sends the control information to the crane 1 after a predetermined time (a time corresponding to the further delay) has elapsed since the operation information was input at the operation unit 42.
  • the method by which the remote operation terminal 40 sends control information to the crane 1 is not limited to the above two methods.
  • for example, the remote operation terminal 40 may send control information to the crane 1 by the following third method.
  • in the third method, the remote operation terminal 40 sends control information to the crane 1 for each work.
  • one work means, for example, an operation of moving the load P from the first state of the hook 17 and the load P shown in FIG. 8 to the second state in which the load P is placed on the upper surface of the obstacle A.
  • one work may include operations of a plurality of cranes 1.
  • one work may also be determined by the user indicating the start time (for example, the starting point of the hook 17 or the load P) and the end time (for example, the end point of the hook 17 or the load P).
  • the user operates the operation unit 42 of the remote operation terminal 40 to move the virtual object 80 displayed on the display unit 44 from the first state to the second state. In this state, one work is completed. At this point, the movement target has not actually moved.
  • during the work, the operation information input from the operation unit 42 is stored in a storage device (also referred to as a storage unit) such as the RAM of the remote operation terminal 40.
  • the information stored in the storage device may instead be control information generated by the control unit 46 based on the operation information input from the operation unit 42 during the work.
  • after the work is completed, the control information corresponding to the work is sent from the remote operation terminal 40 to the crane 1.
  • the user may instruct the timing at which the control information is sent from the remote operation terminal 40 to the crane 1.
  • in that case, the remote operation terminal 40 has an input unit (such as a switch) with which the user specifies the timing for sending the control information.
  • in this way, the user can operate the crane 1 after confirming the completion of one work on the display unit 44 of the remote operation terminal 40.
  • furthermore, when the virtual object 80 collides with the obstacle B on the display unit 44 (see FIG. 11), the user can redo the operation. Since no control information is sent to the crane 1 when the virtual object 80 collides with the obstacle B on the display unit 44, the user does not need to perform a stop operation or the like.
  • the user may also perform a plurality of simulated operations for one work at the remote operation terminal 40 and select one of them. At this time, the user may select, for example, the simulated operation with the best time and route conditions. Such a configuration is effective in improving work efficiency.
  • the crane 1 may start the operation after receiving all the control information corresponding to one work.
  • alternatively, the crane 1 may operate sequentially based on the received information each time it receives a part of the control information corresponding to one work.
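  • the record-then-send behavior of the third method can be sketched as a small recorder object. This is an illustrative sketch only, not part of the patent disclosure; the class and method names are hypothetical, and the "send" is modeled as a list handoff rather than real communication.

```python
# Record operation information for one work during the simulated operation,
# and release it to the crane only once the user confirms completion.

class WorkRecorder:
    def __init__(self):
        self.operations = []  # stored in the terminal's storage unit (e.g. RAM)
        self.sent = []

    def record(self, operation_info):
        """Called for each input during the simulated operation; nothing is
        transmitted yet, so a collision on the display can simply be redone."""
        self.operations.append(operation_info)

    def send_work(self):
        """Called only when the user confirms the completed work (e.g. via a
        dedicated switch); hands the whole work's control info to the crane."""
        self.sent = list(self.operations)
        self.operations = []
        return self.sent

rec = WorkRecorder()
rec.record({"dir": "right", "speed": 0.5})
rec.record({"dir": "down", "speed": 0.2})
batch = rec.send_work()  # sent to the crane as one work
```

  • because nothing reaches the crane until `send_work` is invoked, a simulated collision never requires a stop operation, matching the behavior described above.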
  • the crane 1 may also determine whether the work can be performed safely based on the received control information (for example, the movement locus of the hook 17) and the surrounding situation of the crane 1 at that time. When determining that the work cannot be performed safely, the crane 1 does not execute the operation based on the control information and sends information on the determination result to the remote operation terminal 40. On the other hand, when determining that the work can be performed safely, the crane 1 operates based on the control information. In addition, when the user determines that the crane 1 cannot perform the work safely, the user may stop the operation of the crane 1 with a stop unit such as a stop switch. In this case, the user is preferably in a place where the state of the crane 1 can be confirmed without delay. Moreover, it is preferable that the stop unit, such as a stop switch, can stop the operation of the crane 1 without delay.
  • the movement target is not limited to the hook.
  • the movement target may be not only a component of the crane 1, such as the hook or the boom, but also the crane 1 as a whole.
  • when the movement target is the boom, a virtual object corresponding to the boom 14 is superimposed on the surrounding image of the crane 1 displayed on the display unit 44. The user then operates the operation unit 42 of the remote operation terminal 40 to extend, retract, raise and lower, or turn the virtual object on the display unit 44.
  • as described above, the present embodiments provide a remote operation terminal that remotely operates the operation of a work machine including a work device, a control device, and a detection device.
  • the remote operation terminal includes a control unit configured to be communicable with the control device, a display unit for displaying an image generated based on the detection information of the detection device, and an operation unit for remotely operating the work device.
  • the control unit superimposes a predetermined virtual object on the image at a position calculated based on the operation information of the operation unit, and displays it on the display unit.
  • the present invention is applicable not only to movable work machines such as mobile cranes and aerial work platforms but also to fixed cranes. Therefore, the industrial applicability is great.

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Control And Safety Of Cranes (AREA)

Abstract

Disclosed is a remote operation terminal for remotely operating a work machine, configured to include: a display unit; a display processing unit for superimposing, on a surrounding image showing the state of the surroundings of the work machine, a virtual object representing a part of the work machine or an object to be moved, and displaying the composed image on the display unit; an operation unit for simulating the operation of the virtual object on the display unit; and a control unit for sending, to the work machine, operation information input from the operation unit during a simulated operation.
PCT/JP2017/046711 2016-12-27 2017-12-26 Terminal d'exploitation à distance et système d'exploitation à distance WO2018124098A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2018547491A JP6460294B2 (ja) 2016-12-27 2017-12-26 遠隔操作端末及び遠隔操作システム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016252978 2016-12-27
JP2016-252978 2016-12-27

Publications (1)

Publication Number Publication Date
WO2018124098A1 true WO2018124098A1 (fr) 2018-07-05

Family

ID=62708245

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/046711 WO2018124098A1 (fr) 2016-12-27 2017-12-26 Terminal d'exploitation à distance et système d'exploitation à distance

Country Status (2)

Country Link
JP (1) JP6460294B2 (fr)
WO (1) WO2018124098A1 (fr)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11149315A (ja) * 1997-11-19 1999-06-02 Mitsubishi Heavy Ind Ltd ロボット制御システム
JP2006000977A (ja) * 2004-06-17 2006-01-05 National Univ Corp Shizuoka Univ ロボット環境間力作用状態呈示装置
JP2012010468A (ja) * 2010-06-23 2012-01-12 Toshiba Corp 電子機器及び電源制御装置
JP2014053795A (ja) * 2012-09-07 2014-03-20 NEUSOFT Japan株式会社 情報処理装置及び情報処理システム
EP2922287A1 (fr) * 2014-03-19 2015-09-23 Commissariat à l'Énergie Atomique et aux Énergies Alternatives Dispositif de caméra portative à fixer a une pince de télémanipulateur
WO2016163559A1 (fr) * 2015-04-09 2016-10-13 ヤマハ発動機株式会社 Petit navire et système de remorque de petit navire

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7119674B2 (ja) 2018-07-11 2022-08-17 株式会社タダノ クレーン
JP2020007130A (ja) * 2018-07-11 2020-01-16 株式会社タダノ クレーン
JP2020066391A (ja) * 2018-10-26 2020-04-30 三菱重工交通機器エンジニアリング株式会社 ボーディングブリッジ及びボーディングブリッジ制御装置
WO2020235681A1 (fr) * 2019-05-22 2020-11-26 株式会社タダノ Terminal de commande à distance, et grue mobile équipée de celui-ci
JPWO2020235681A1 (fr) * 2019-05-22 2020-11-26
JP7416063B2 (ja) 2019-05-22 2024-01-17 株式会社タダノ 遠隔操作端末および遠隔操作端末を備える移動式クレーン
EP4036046A4 (fr) * 2019-09-27 2023-11-01 Tadano Ltd. Système d'affichage d'informations de grue
WO2021230092A1 (fr) * 2020-05-13 2021-11-18 コベルコ建機株式会社 Dispositif d'aide à la manœuvre à distance et procédé d'aide à la manœuvre à distance
EP4116041A4 (fr) * 2020-05-13 2023-08-23 Kobelco Construction Machinery Co., Ltd. Dispositif d'aide à la manoeuvre à distance et procédé d'aide à la manoeuvre à distance
JP7484401B2 (ja) 2020-05-13 2024-05-16 コベルコ建機株式会社 作業機械の遠隔操作支援システム
CN112479037A (zh) * 2020-12-01 2021-03-12 徐州重型机械有限公司 一种穿戴式监控系统、起重机及工程机械
CN112884710A (zh) * 2021-01-19 2021-06-01 上海三一重机股份有限公司 作业机械的辅助影像生成方法、远程操控方法及其装置
JP7382684B1 (ja) 2023-08-04 2023-11-17 サン・シールド株式会社 玉掛け作業シミュレーションシステム、及び、玉掛け作業シミュレーション方法

Also Published As

Publication number Publication date
JP6460294B2 (ja) 2019-01-30
JPWO2018124098A1 (ja) 2019-01-10

Similar Documents

Publication Publication Date Title
JP6460294B2 (ja) 遠隔操作端末及び遠隔操作システム
JP6772803B2 (ja) クレーン
JP6743676B2 (ja) 遠隔操作端末
ES2808029T3 (es) Dispositivo de control remoto para grúa, máquina de construcción y/o camión industrial, y sistema que comprende este dispositivo de control remoto y una grúa, una máquina de construcción y/o un camión industrial
US10921771B2 (en) Method and device for planning and/or controlling and/or simulating the operation of a construction machine
JP7070047B2 (ja) 旋回式作業機械の旋回制御装置
WO2018110707A1 (fr) Terminal d'actionnement à distance et engin de chantier équipé d'un terminal d'actionnement à distance
WO2017065093A1 (fr) Dispositif d'actionnement à distance et système d'instruction
JP5091447B2 (ja) 作業機搭載車両の周辺監視支援装置
CN109405804B (zh) 作业辅助方法及系统
US20210270013A1 (en) Shovel, controller for shovel, and method of managing worksite
WO2019172417A1 (fr) Terminal de commande à distance et véhicule de travail
KR101258851B1 (ko) 크레인 제어 장치 및 방법
WO2022176783A1 (fr) Pelle et dispositif de traitement d'informations
JP7172199B2 (ja) 遠隔操作端末及び作業車両
JP2019218198A (ja) 操作支援システム
JP2015154239A (ja) 建設機械用俯瞰画像表示装置
JP2022179081A (ja) 遠隔操作支援システム、遠隔操作支援装置
KR101369214B1 (ko) 건설 현장 안전 시스템
JP2021124987A (ja) 動作支援サーバ、作業機、及び性能情報の提供方法
CN114867921A (zh) 工程机械的远程操纵系统
JP2020164281A (ja) 作業機械用吊りシステム
US20220317684A1 (en) Display device and route display program
JP2019167220A (ja) 作業車用の周囲監視システム、作業車、表示方法、及びプログラム
WO2020218438A1 (fr) Grue, corps de grue et unité mobile

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2018547491

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17887367

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17887367

Country of ref document: EP

Kind code of ref document: A1