CN115244254A - Work support server and work support method - Google Patents

Work support server and work support method

Info

Publication number
CN115244254A
CN115244254A (application number CN202180020031.1A)
Authority
CN
China
Prior art keywords
work
work machine
image
machine
captured
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202180020031.1A
Other languages
Chinese (zh)
Inventor
佐佐木均
佐伯诚司
山崎洋一郎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kobelco Construction Machinery Co Ltd
Original Assignee
Kobelco Construction Machinery Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kobelco Construction Machinery Co Ltd filed Critical Kobelco Construction Machinery Co Ltd
Publication of CN115244254A publication Critical patent/CN115244254A/en
Pending legal-status Critical Current

Classifications

    • E FIXED CONSTRUCTIONS
    • E02 HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02F DREDGING; SOIL-SHIFTING
    • E02F9/00 Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
    • E02F9/20 Drives; Control devices
    • E02F9/2025 Particular purposes of control systems not otherwise provided for
    • E02F9/205 Remotely operated machines, e.g. unmanned vehicles
    • E FIXED CONSTRUCTIONS
    • E02 HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02F DREDGING; SOIL-SHIFTING
    • E02F9/00 Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
    • E02F9/26 Indicating devices
    • E02F9/261 Surveying the work-site to be treated
    • E02F9/262 Surveying the work-site to be treated with follow-up actions to control the work tool, e.g. controller
    • E FIXED CONSTRUCTIONS
    • E02 HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02F DREDGING; SOIL-SHIFTING
    • E02F3/00 Dredgers; Soil-shifting machines
    • E02F3/04 Dredgers; Soil-shifting machines mechanically-driven
    • E02F3/28 Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets
    • E02F3/36 Component parts
    • E02F3/42 Drives for dippers, buckets, dipper-arms or bucket-arms
    • E02F3/43 Control of dipper or bucket position; Control of sequence of drive operations
    • E02F3/435 Control of dipper or bucket position; Control of sequence of drive operations for dipper-arms, backhoes or the like

Landscapes

  • Engineering & Computer Science (AREA)
  • Mining & Mineral Resources (AREA)
  • Civil Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Structural Engineering (AREA)
  • Component Parts Of Construction Machinery (AREA)
  • Operation Control Of Excavators (AREA)
  • Selective Calling Equipment (AREA)

Abstract

The invention provides a work support server and a work support method. A work support server (10) includes a first auxiliary processing element (121) and a second auxiliary processing element (122). When the start of work of a work machine (40) is requested, the first auxiliary processing element (121) creates a surrounding image in which a surrounding area of the work machine (40) is captured. The second auxiliary processing element (122) permits the operation of the work machine (40) on the condition that the surrounding image is displayed on a display device (221) of a remote operation device (20).

Description

Work support server and work support method
Technical Field
The present invention relates to a work support server for supporting remote operation of a work machine and a work support method for supporting work of a work machine.
Background
Conventionally, a plurality of fixed-point cameras are installed at a work site where a work machine such as a remotely operated excavator performs work, so that the entire work site can be viewed from above or the operation or work of a specific excavator can be monitored.
In addition, a method of mounting a plurality of cameras on a work machine to capture images in order to check the situation around the work machine is also known.
For example, in an excavator disclosed in patent document 1, a left camera is attached to a bracket on the left front side of a cab, and a rear camera is attached to an upper portion of a counterweight. Further, a right camera is attached to the armrest on the right front side of the upper revolving structure.
The left and right cameras are attached so that their imaging ranges include the side surfaces of the upper revolving structure. The rear camera is attached so that its imaging range includes the upper rear end of the counterweight. With these cameras, the operator of the excavator can easily recognize the positional relationship between the excavator and objects in its periphery (patent document 1, paragraphs 0015 to 0019, fig. 2).
Documents of the prior art
Patent literature
Patent document 1: international publication No. 2017/115808
Disclosure of Invention
Problems to be solved by the invention
However, when a work machine is remotely operated, the operator (worker) is located at a position remote from the work machine. There is therefore a demand for the operator to be able to efficiently check, during the work, the images of the cameras capturing the area around the work machine that is the remote operation target.
The present invention has been made in view of the above problems, and an object of the present invention is to provide a work support server capable of efficiently reporting the peripheral situation of a work machine to an operator.
Means for solving the problems
A first aspect of the present invention is directed to a work support server for supporting a work of a work machine to be operated so that the work machine to be operated can be remotely operated in accordance with an operation performed on a remote operation device having a display device, the work support server including: a first auxiliary processing element that, when requested to start work on the work machine, creates a surrounding image in which a surrounding area of the work machine is captured; and a second auxiliary processing element that permits the work machine to perform work on condition that the surrounding image is displayed on the display device.
In the work support server according to the present invention, the captured image (still image or moving image) of the area around the work machine is transmitted to the display device, and the operator can confirm the captured image. Thus, the work support server can support the operator in operating the work machine.
When the start of the work machine is requested, the first auxiliary processing element creates a surrounding image in which a surrounding area of the work machine is captured, and displays the surrounding image on the display device of the remote operation device. The second auxiliary processing element permits the work machine to perform work on the condition that the peripheral image is displayed on the display device. This allows the operator to observe the area around the work machine before starting the work of the work machine. The work support server allows the remote operation device to operate the work machine after the peripheral image to be confirmed by the operator is reliably displayed on the display device of the remote operation device.
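As a rough illustration of this gating, the two auxiliary processing elements can be thought of as two cooperating steps: one that builds the surrounding image when work start is requested, and one that withholds permission until the remote operation device reports that the image is displayed. The following Python sketch is only illustrative; the class names, data fields, and the session object are assumptions, since the patent does not define any programming interface.

# Minimal sketch of the gating performed by the two auxiliary processing elements.
# All names, fields, and the session object are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class WorkSession:
    machine_id: str
    surrounding_image: bytes | None = None   # composed image sent to the display device
    image_displayed: bool = False            # set once the remote operation device confirms display
    work_permitted: bool = False


class FirstAssistElement:
    """Creates the surrounding image when the start of work is requested."""

    def on_work_start_request(self, session: WorkSession, camera_frames: list[bytes]) -> None:
        # In the patent the image is a composite of periphery cameras, device
        # cameras, or a drone camera; here it is reduced to simple concatenation.
        session.surrounding_image = b"".join(camera_frames)


class SecondAssistElement:
    """Permits operation only on condition that the surrounding image is displayed."""

    def try_permit(self, session: WorkSession) -> bool:
        if session.surrounding_image is not None and session.image_displayed:
            session.work_permitted = True   # corresponds to the operation acceptance permission signal
        return session.work_permitted


if __name__ == "__main__":
    session = WorkSession(machine_id="excavator-40")
    FirstAssistElement().on_work_start_request(session, [b"front", b"rear"])
    session.image_displayed = True           # reported back by the remote operation device
    print(SecondAssistElement().try_permit(session))   # True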
A second aspect of the present invention relates to a work assisting method for assisting a work of a work machine to be operated so that the work machine to be operated can be remotely operated in accordance with an operation performed on a remote operation device having a display device, the method including: a first step of creating a surrounding image in which a surrounding area of the work machine is captured when a request to start work on the work machine is made; and a second step of permitting the work machine to perform work on the condition that the surrounding image is displayed on the display device.
In the work assisting method of the present invention, in a first step, when a request to start a work of a work machine is made, a surrounding image in which a surrounding area of the work machine is captured is created, and the surrounding image is displayed on a display device of a remote operation device. In addition, in the second step, the work machine is permitted to perform the work on the condition that the surrounding image is displayed on the display device. This allows the operator to observe the peripheral region of the work machine before starting the work of the work machine.
According to the work support method, the operation of the work machine by the remote operation device can be permitted after the peripheral image to be confirmed by the operator is reliably displayed on the display device of the remote operation device.
Brief description of the drawings
Fig. 1 is a diagram illustrating an outline of a work support system according to an embodiment of the present invention.
Fig. 2 is a diagram illustrating a configuration of an operation mechanism of the working machine.
Fig. 3A is a diagram (1) illustrating a detailed configuration of the working machine.
Fig. 3B is a diagram (2) illustrating a detailed configuration of the working machine.
Fig. 4A is a flowchart (1) of each process performed by the work assistance server, the remote operation device, and the work machine.
Fig. 4B is a flowchart (2) of each process performed by the work support server, the remote operation device, and the work machine.
Fig. 5 is an example of an image displayed on the image output device (surrounding image 1).
Fig. 6 is a diagram (1) illustrating a unit for acquiring a surrounding image of the work machine.
Fig. 7 is an example of an image displayed on the image output device (surrounding image 2).
Fig. 8 is a diagram (2) illustrating a unit for acquiring a surrounding image of the work machine.
Fig. 9 is an example of an image displayed on the image output device (surrounding image 3).
Fig. 10 is an explanatory diagram of a work environment image displayed on the image output device.
Detailed Description
Hereinafter, the details of the work support server according to the present invention will be described with reference to the drawings.
First, a work support server 10 and a work support system 1 including the work support server 10 according to an embodiment of the present invention will be described with reference to fig. 1.
The work assistance system 1 for construction machines according to the present embodiment is configured so that, by operating the remote operation device 20, the operator OP can selectively remotely operate, as the operation target, any of the plurality of work machines 40 assigned to the operator OP or to the remote operation device 20. The work site where the plurality of work machines 40 to be operated by the remote operation device 20 are arranged may be a single work site or one of a plurality of work sites.
The work assistance system 1 is configured by at least the work assistance server 10, the remote operation device 20 for remotely operating the work machine 40, and the work machine 40. The work assistance server 10, the remote operation device 20, and the work machine 40 are configured to be able to communicate with each other via a network NW including a wireless communication network. The work assistance server 10 can automatically select an appropriate imaging device and capture images according to switching or movement of the work machine 40, the content of the work, and the like. The work assistance server 10 outputs the captured image to an image output device 221 (the "display device" of the present invention), described later, in order to assist the operation of the operator OP of the remote operation device 20.
The work assistance server 10 includes a database 102, a first assistance processing element 121, a second assistance processing element 122, and a server wireless communication device 125. The database 102 stores and holds captured images captured by an actual imaging device 412, a surrounding imaging device 413, and the like, which will be described later. The database 102 may be configured by a database server independent of the work support server 10.
The auxiliary processing elements 121 and 122 are configured by an arithmetic processing device (a single-core processor, a multi-core processor, or a processor core constituting the processor). The auxiliary processing elements 121 and 122 read necessary data and software from a storage device such as a memory, and execute arithmetic processing in accordance with the software with the data as an object. The server wireless communication device 125 transmits display instructions to the image output device 221 and receives captured images from the imaging devices via the network NW.
Next, the remote operation device 20 includes a remote control device 200, a remote input interface 210, and a remote output interface 220. The remote control device 200 is configured by an arithmetic processing device (a single-core processor, a multi-core processor, or a processor core constituting the processor). The remote control apparatus 200 reads necessary data and software from a storage device such as a memory, and executes arithmetic processing in accordance with the software with the data as an object.
The remote input interface 210 is provided with a remote operation mechanism 211. The remote output interface 220 includes an image output device 221, an operator imaging device 222, and a remote wireless communication device 223.
The operator imaging device 222 is a camera mounted on the remote operation device 20, and images at least the operator seat (seat St).
The remote wireless communication device 223 transmits an operation signal to the work machine 40 via the network NW or receives a captured image from the real-machine imaging device 412, the surrounding imaging device 413, or the like.
The remote operation mechanism 211 includes a travel operation device, a swing operation device, a boom operation device, an arm operation device, and a bucket operation device. Each operation device has an operation lever that receives a rotational operation. The operation lever (travel lever) of the travel operation device is operated to operate the lower traveling body 427 of the work machine 40 (see fig. 3A and 3B). The travel lever may also double as a travel pedal. For example, a travel pedal fixed to the base or lower end of the travel lever may be provided.
The operating lever (turning lever) of the turning operating device is used to operate a hydraulic turning motor constituting the turning mechanism 430 of the working machine 40. Further, an operation lever (boom lever) of the boom operation device is used to operate a boom cylinder 442 of the work machine 40 (see fig. 3A and 3B).
The operation lever (arm lever) of the arm operation device is used to operate the arm cylinder 444 of the work machine 40. The control lever (bucket lever) of the bucket control device is used to operate the bucket cylinder 446 of the work machine 40 (see fig. 3A and 3B).
Here, fig. 2 shows an example of each operation lever constituting the remote operation mechanism 211 and a seat St on which the operator OP sits. The seat St is a high back seat with armrests. The seat St may be any type that allows an operator to sit thereon, such as a low back seat without a headrest or a seat without a backrest.
A pair of left and right travel levers 2110 corresponding to the left and right crawler belts are arranged in parallel in the left and right in front of the seat St. One operation lever may double as a plurality of operation levers. For example, the right operation lever 2111 provided in front of the right frame of the seat St may function as a boom lever when operated in the front-rear direction, and may function as a bucket lever when operated in the left-right direction.
Similarly, the left operation lever 2112 provided in front of the left side frame of the seat St may function as an arm lever when operated in the front-rear direction, and may function as a swing lever when operated in the left-right direction. The lever mode can be arbitrarily changed in accordance with an operation instruction of the operator OP.
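As a concrete illustration of these lever modes, the assignment of lever axes to actuators can be expressed as a small lookup table. The sketch below follows the example assignment in the text (right lever: front-rear = boom, left-right = bucket; left lever: front-rear = arm, left-right = swing); the function name, signal range, and command format are assumptions for illustration only.

# Illustrative mapping of lever deflections to actuator commands, following the
# example lever-mode assignment in the text. Names and value ranges are assumptions.
LEVER_MODE = {
    ("right", "front_rear"): "boom",
    ("right", "left_right"): "bucket",
    ("left", "front_rear"): "arm",
    ("left", "left_right"): "swing",
}


def lever_to_command(lever: str, axis: str, deflection: float) -> dict:
    """Translate a normalized lever deflection (-1.0 to 1.0) into an actuator command."""
    actuator = LEVER_MODE[(lever, axis)]
    return {"actuator": actuator, "value": max(-1.0, min(1.0, deflection))}


print(lever_to_command("right", "front_rear", 0.6))   # {'actuator': 'boom', 'value': 0.6}
print(lever_to_command("left", "left_right", -0.3))   # {'actuator': 'swing', 'value': -0.3}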
As shown in fig. 2, the image output device 221 is composed of a center image output device 2210, a left side image output device 2211, and a right side image output device 2212, which are arranged in front of the seat St, diagonally left front, and diagonally right front, respectively, and have a substantially rectangular screen. The shape and size of the screen (image display area) of each of the center image output device 2210, the left image output device 2211, and the right image output device 2212 may be the same or different from each other.
The right edge of the left image output device 2211 is adjacent to the left edge of the center image output device 2210 so that the screen of the center image output device 2210 forms an inclination angle θ1 (e.g., 120° ≦ θ1 ≦ 150°) with the screen of the left image output device 2211. Similarly, the left edge of the right image output device 2212 is adjacent to the right edge of the center image output device 2210 so that the screen of the center image output device 2210 forms an inclination angle θ2 (e.g., 120° ≦ θ2 ≦ 150°) with the screen of the right image output device 2212. The inclination angle θ1 and the inclination angle θ2 may be the same or different.
The screens of the center image output device 2210, the left image output device 2211, and the right image output device 2212 may be parallel to the vertical direction or inclined to the vertical direction. Further, at least one of the center image output device 2210, the left image output device 2211, and the right image output device 2212 may be configured by an image output device divided into a plurality of image output devices. For example, the center image output device 2210 may be configured by a pair of vertically adjacent image output devices having a substantially rectangular screen. The image output devices 2210 to 2212 may further include speakers (audio output devices).
The work machine 40 includes a real machine control device 400, a real machine input interface 410, a real machine output interface 420, and a working mechanism (working attachment) 440. The real machine control device 400 is configured by an arithmetic processing device (a single-core processor, a multi-core processor, or a processor core constituting the processor). The real machine control device 400 reads necessary data and software from a storage device such as a memory, and executes arithmetic processing in accordance with the software with the data as an object.
Here, fig. 3A and 3B show an example of the work machine 40 according to the present embodiment. The work machine 40 is, for example, a crawler excavator (construction machine), and includes a machine body 450. The machine body 450 is composed of a crawler-type lower traveling body 427, an upper revolving body 435 mounted on the lower traveling body 427 so as to be able to revolve via a revolving mechanism 430, and a working mechanism 440. A cab 425 is provided in the front left portion of the upper revolving body 435, and the working mechanism 440 is provided in the front center portion of the upper revolving body 435. A machine housing section 436, which houses machinery such as the engine, is disposed behind the cab 425, and a counterweight 437 is disposed behind the machine housing section 436.
Returning to fig. 1, the real machine input interface 410 includes a real machine operation mechanism 411, a real machine imaging device 412, and a surrounding imaging device 413. The real machine operation mechanism 411 includes a plurality of operation levers arranged around a seat inside the cab 425, similarly to the remote operation mechanism 211. A drive mechanism or a robot provided in the cab 425 receives a signal corresponding to the operation mode of the remote operation levers and operates the real machine operation levers based on the received signal.
As shown in fig. 3A and 3B, the real machine imaging device 412 is installed, for example, inside the cab 425. The real machine imaging device 412 images the area in front of the work machine 40 through the front window of the cab 425.
As shown in fig. 3A and 3B, the periphery imaging device 413 is provided, for example, in a lower front portion of the cab 425. The periphery imaging device 413 includes a front camera 413A that images the front of the work machine 40, a right camera 413B that is provided on the machine body 450 and images the right side of the work machine 40, a left camera 413D that is provided on the machine body 450 and images the left side of the work machine 40, and a rear camera 413C that is provided on the machine body 450 and images the rear of the work machine 40. The imaging ranges of the cameras 413A to 413D are set to overlap each other, so that the entire range (360°) of the peripheral region of the work machine 40 can be imaged.
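Because the imaging ranges of the cameras 413A to 413D are said to overlap and together cover the full 360° around the machine, the arrangement can be checked with a simple angular-coverage test. The headings and fields of view in the sketch below are invented example values, not figures taken from the patent.

# Check that a set of horizontal camera fields of view covers the full 360 degrees
# around the machine. Headings and fields of view are example values only.
def covers_full_circle(cameras: list[tuple[float, float]]) -> bool:
    """cameras: list of (heading_deg, fov_deg). True if every bearing 0-359 is covered."""
    for bearing in range(360):
        if not any(
            abs((bearing - heading + 180) % 360 - 180) <= fov / 2
            for heading, fov in cameras
        ):
            return False
    return True


# Hypothetical front/right/rear/left cameras with 120-degree lenses (overlapping ranges).
example = [(0, 120), (90, 120), (180, 120), (270, 120)]
print(covers_full_circle(example))   # True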
The real machine output interface 420 is provided with a real machine wireless communication device 422. The real wireless communication device 422 receives a signal corresponding to the operation mode of the remote operation lever from the remote wireless communication device 223 of the remote operation apparatus 20 via the network NW. The signal is further transmitted to the real machine control device 400, and the working machine 40 operates in accordance with the signal. In the cab 425, a driving mechanism or a robot is provided, and the driving mechanism or the robot operates the real machine operation lever based on the signal.
The working mechanism 440 includes: a boom 441 attached to the upper revolving body 435; an arm 443 rotatably coupled to the distal end portion of the boom 441; and a bucket 445 rotatably coupled to the distal end portion of the arm 443. A boom cylinder 442, an arm cylinder 444, and a bucket cylinder 446, which are telescopic hydraulic cylinders, are attached to the working mechanism 440.
Further, the positioning device 460 is a device that detects the position of the work machine 40. The positioning device 460 is constituted by, for example, a GNSS (Global Navigation Satellite System) receiver. The position of the work machine 40 detected by the positioning device 460 is transmitted to the work support server 10 via the real machine wireless communication device 422 and stored in the database 102.
Boom cylinder 442 is interposed between boom 441 and upper revolving unit 435, and receives a supply of hydraulic oil to expand and contract, thereby rotating boom 441 in the raising and lowering direction. Arm cylinder 444 is interposed between arm 443 and boom 441 so as to receive the supply of hydraulic oil and expand and contract, thereby rotating arm 443 about a horizontal axis with respect to boom 441.
A bucket cylinder 446 is interposed between the bucket 445 and the arm 443 so as to be extended and contracted by receiving a supply of working oil, thereby rotating the bucket 445 about a horizontal axis with respect to the arm 443.
The work assistance system 1 may further include a device camera 70 installed at the work site where each work machine 40 is disposed, and an unmanned aerial vehicle 80 flying above the work site where each work machine 40 is disposed.
The work support server 10 and the remote operation device 20 can appropriately acquire the image captured by the imaging device 713 of the equipment camera 70 and the image captured by the imaging device 813 of the unmanned aerial vehicle 80 via the network NW and output the acquired images to the image output device 221.
Next, a flowchart relating to the function of the work support system 1 according to the present embodiment will be described with reference to fig. 4A and 4B. Further, selection of an imaging device, display of a captured image, and the like will be described with reference to fig. 5 to 10 as appropriate.
For simplicity of explanation, control blocks "C10" to "C44" are used in the flowcharts. Each control block represents transmission and/or reception of data, and processing in the branch direction is executed on condition that the data is transmitted and/or received.
(first function)
First, the remote operation device 20 determines whether or not there is a first designation operation by the operator OP through the remote input interface 210 (STEP 210 in fig. 4A). Here, the "first designated operation" refers to an operation of selecting the work machine 40 that cooperates with the remote operation device 20.
The presence or absence of the first designation operation is determined, for example, in the following manner. First, a map, a list, or the like indicating the positions of the respective work machines 40 that can cooperate with the remote operation device 20 is output to the remote output interface 220. Next, it is determined whether or not an operation such as a tap for specifying one work machine 40 to cooperate with the remote operation device 20 has been performed by the operator OP.
If the determination result is negative (NO in fig. 4A/STEP 210), the processing of the first function is ended. On the other hand, if the determination result is affirmative (YES in fig. 4A/STEP 210), the remote operation device 20 transmits a connection request signal to the work assistance server 10 via the remote wireless communication device 223 (STEP 211 in fig. 4A). The "connection request signal" includes a work machine identifier for identifying the work machine 40 that has established communication with the remote operation device 20 or the work machine 40 that is designated through the remote input interface 210.
When receiving the connection request signal (fig. 4A/C10), the work assistance server 10 causes the first assistance processing element 121 to instruct the work machine 40 identified by the work machine identifier to start transmitting captured images (STEP 111 in fig. 4A). Here, at least the real machine imaging device 412 of the work machine 40 is selected as the imaging device.
When receiving the transmission start instruction (fig. 4A/C40), the work machine 40 transmits the captured image to the work assistance server 10 (STEP 411/fig. 4A). When the job assistance server 10 receives the captured image via the server wireless communication device 125 (fig. 4A/C11), it creates a surrounding image (STEP 112 in fig. 4A), and transmits the surrounding image to the remote operation device 20 via the server wireless communication device 125 (STEP 113 in fig. 4A). Examples of the surrounding image include the following surrounding image 1 to surrounding image 3.
(surrounding image 1)
For example, when the work machine 40 includes the periphery imaging device 413 shown in fig. 3A and 3B, images captured by the cameras 413A to 413D constituting the periphery imaging device 413 are transmitted to the work assistance server 10 via the real wireless communication device 422 (fig. 4A/step 411).
When the job assistance server 10 receives the captured images of the cameras 413A to 413D via the server wireless communication device 125 (fig. 4A/C11), the captured images of the cameras 413A to 413D are combined by the first assistance processing element 121. Then, the work support server 10 creates a surrounding image viewed from above the work machine 40 as shown in fig. 5 (STEP 112 in fig. 4A), and transmits the surrounding image to the remote operation device 20 via the server wireless communication device 125 (STEP 113 in fig. 4A).
When the remote operation device 20 receives the surrounding image via the remote wireless communication device 223 (fig. 4A/C20), the surrounding image is output to the image output device 221 (fig. 4A/STEP 212).
In the surrounding image shown in fig. 5, the image of the work machine 40 shown in the central area may be a graphic stored in advance in the database 102 as graphic information. In the surrounding image, the dotted lines indicate the seams between the captured images of the cameras 413A to 413D. The dotted lines need not be output to the image output device 221.
Alternatively, the periphery imaging device 413 of the work machine 40 may synthesize the images captured by the cameras 413A to 413D to generate the surrounding image. In this case, the work machine 40 transmits the surrounding image created by the periphery imaging device 413 to the work support server 10 via the real machine wireless communication device 422 (fig. 4A/STEP 411).
Then, when the work assistance server 10 receives the surrounding image via the server wireless communication device 125 (fig. 4A/C11), it can transmit the surrounding image to the remote operation device 20 via the server wireless communication device 125 (fig. 4A/STEP 113).
The remote output interface 220 of the remote operation device 20 may also synthesize the images captured by the cameras 413A to 413D to generate the surrounding image. In this case, when the work assistance server 10 receives the captured images of the cameras 413A to 413D via the server wireless communication device 125 (fig. 4A/C11), it transmits the captured images to the remote operation device 20 via the server wireless communication device 125 (fig. 4A/STEP 113).
When the remote operation device 20 receives the captured images via the remote wireless communication device 223, it combines the captured images of the cameras 413A to 413D and outputs the created surrounding image to the image output device 221 (fig. 4A/STEP 212).
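A real bird's-eye composite would warp each camera frame into a top-view projection before blending, but the overall structure of surrounding image 1 (four camera views arranged around a stored machine graphic, as in fig. 5) can be sketched much more crudely. Everything in the following Python/NumPy example, including the quadrant layout and image sizes, is an illustrative assumption and not the method used in the patent.

import numpy as np


def compose_surrounding_image(front, right, rear, left, machine_icon):
    """Crude quadrant layout standing in for a proper bird's-eye composite."""
    h, w = front.shape[:2]                       # assume all frames share one size
    canvas = np.zeros((2 * h, 2 * w, 3), dtype=np.uint8)
    canvas[0:h, 0:w] = front                     # top-left quadrant
    canvas[0:h, w:2 * w] = right                 # top-right quadrant
    canvas[h:2 * h, 0:w] = left                  # bottom-left quadrant
    canvas[h:2 * h, w:2 * w] = rear              # bottom-right quadrant

    ih, iw = machine_icon.shape[:2]              # overlay a stored machine graphic at the centre,
    cy, cx = h - ih // 2, w - iw // 2            # mirroring the central graphic of fig. 5
    canvas[cy:cy + ih, cx:cx + iw] = machine_icon
    return canvas


frames = [np.full((120, 160, 3), v, dtype=np.uint8) for v in (60, 120, 180, 240)]
icon = np.full((40, 40, 3), 255, dtype=np.uint8)
print(compose_surrounding_image(*frames, icon).shape)   # (240, 320, 3)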
(surrounding image 2)
The work machine 40 transmits the captured image captured by the real machine imaging device 412 to the work assisting server 10 via the real machine wireless communication apparatus 422 (fig. 4A/step 411). Further, the position of the work machine 40 detected by the positioning device 460 is transmitted to the work support server 10 via the real machine wireless communication device 422, and is stored in the database 102.
When the work assistance server 10 receives the captured image captured by the real machine imaging device 412 via the server wireless communication device 125 (fig. 4A/C11), the first assistance processing element 121 creates a surrounding image as shown in fig. 7, that is, an image obtained by superimposing the captured images captured by the device cameras 70A and 70B on the captured image captured by the real machine imaging device 412 (fig. 4A/STEP 112).
Specifically, in addition to the captured image captured by the real machine imaging device 412, the work assistance server 10 can receive, via the server wireless communication device 125, the captured image captured by the imaging device 713 and transmitted via the communication device 723 of the device camera 70.
The first auxiliary processing element 121 grasps the position and the photographable range of each device camera 70 at the work site. Based on the position of the work machine 40 stored in the database 102, the work assistance server 10 then identifies the device cameras 70 whose photographable ranges include the position of the work machine 40. Further, the work assistance server 10 superimposes the captured images of the identified device cameras 70 that capture the work machine 40 on the captured image captured by the real machine imaging device 412, and outputs the superimposed image to the image output device 221 (fig. 4A/STEP 212).
As shown in fig. 6, when there are a plurality of device cameras 70 whose photographable ranges include the position of the work machine 40, the captured images of all of the device cameras 70 may be displayed, or the captured images of only some of the device cameras 70 may be displayed.
In fig. 7, the captured image of the work machine 40 captured from the left side by the device camera 70B is output to the left side of the image output device 221. In addition, the captured image of the work machine 40 captured from the right side by the device camera 70A is output to the right side of the image output device 221. By selecting captured images in which the work machine 40 is captured from different directions in this manner, the situation around the work machine 40 can be reflected in the surrounding image.
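The selection of device cameras described for surrounding image 2 amounts to a containment test: keep the cameras whose photographable range includes the position of the work machine stored in the database, then present views taken from different sides. The sector model (position, heading, field of view, range) and all names in the sketch below are assumptions made for illustration.

import math
from dataclasses import dataclass


@dataclass
class DeviceCamera:
    name: str
    x: float
    y: float
    heading_deg: float   # direction the camera faces
    fov_deg: float       # horizontal field of view
    max_range: float     # photographable distance


def sees(cam: DeviceCamera, mx: float, my: float) -> bool:
    """True if the work machine at (mx, my) lies inside the camera's sector."""
    dx, dy = mx - cam.x, my - cam.y
    if math.hypot(dx, dy) > cam.max_range:
        return False
    bearing = math.degrees(math.atan2(dy, dx))
    diff = (bearing - cam.heading_deg + 180) % 360 - 180
    return abs(diff) <= cam.fov_deg / 2


cams = [
    DeviceCamera("70A", x=30, y=0, heading_deg=180, fov_deg=90, max_range=60),   # east of the machine
    DeviceCamera("70B", x=-30, y=0, heading_deg=0, fov_deg=90, max_range=60),    # west of the machine
    DeviceCamera("70C", x=0, y=100, heading_deg=-90, fov_deg=60, max_range=50),  # too far away
]
machine_x, machine_y = 0.0, 0.0
print([c.name for c in cams if sees(c, machine_x, machine_y)])   # ['70A', '70B']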
(surrounding image 3)
Further, with reference to fig. 8, a case where an unmanned aerial vehicle 80 capable of flying over a work site in which the work machine 40 is disposed is present at the work site will be described. The work assistance server 10 can receive a captured image captured by the camera 813 mounted on the unmanned aerial vehicle 80 transmitted via the communication device 823.
The unmanned aerial vehicle 80 may be a rechargeable aircraft that flies without a cable, or may be an aircraft supplied with power by wire. The unmanned aerial vehicle 80 stands by at a landing site located at the work site or on the work machine 40. The imaging device 813 can image at least the area below the unmanned aerial vehicle 80.
When the work machine 40 receives the transmission start instruction (fig. 4A/C40), the unmanned aerial vehicle 80 moves into the airspace above the work machine 40 and starts imaging with the imaging device 813.
The work machine 40 transmits the captured image captured by the real machine imaging device 412 to the work assistance server 10 via the real machine wireless communication apparatus 422 (fig. 4A/step 411). Meanwhile, the captured image captured by the capturing device 813 is transmitted to the job assistance server 10 via the communication apparatus 823. Here, the captured image captured by the imaging device 813 may be transmitted to the work assistance server 10 via the real machine wireless communication device 422 of the work machine 40.
When the work assistance server 10 receives the captured image captured by the imaging device 813 via the server wireless communication device 125 (fig. 4A/C11), the first assistance processing element 121 uses the captured image as the surrounding image and transmits it to the remote operation device 20 via the server wireless communication device 125 (fig. 4A/STEP 113).
When receiving the surrounding image via the remote wireless communication device 223 (fig. 4A/C20), the remote operation device 20 outputs the surrounding image shown in fig. 9 to the image output device 221 (fig. 4A/STEP 212). In this surrounding image, the image of the work machine 40 shown in the central area and the image of the work site around it are captured images captured by the imaging device 813.
The height and orientation of the unmanned aerial vehicle 80 when it images the work machine 40 from above may be set in advance. For example, the height and orientation of the unmanned aerial vehicle 80 may be controlled so that the work machine 40 appears in the captured image at a predetermined size and orientation.
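One simple way to picture such control is a proportional adjustment of altitude based on how large the work machine appears in the frame. The gain, target fraction, and minimum height in the sketch below are invented example values, not parameters from the patent.

# Adjust the aircraft altitude so the work machine occupies a target fraction of
# the frame. Simple proportional rule; gain and target values are made-up examples.
def adjust_altitude(current_alt_m: float, machine_frac: float,
                    target_frac: float = 0.25, gain: float = 20.0) -> float:
    """machine_frac: fraction of the frame area occupied by the detected machine."""
    error = machine_frac - target_frac      # too small in the frame -> descend, too large -> climb
    return max(5.0, current_alt_m + gain * error)   # keep an assumed minimum safe height


alt = 30.0
for frac in (0.10, 0.18, 0.24):             # the machine appears larger as the drone descends
    alt = adjust_altitude(alt, frac)
    print(round(alt, 1))                    # 27.0, 25.6, 25.4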
The above surrounding images 1 to 3 can be selected and used as appropriate. For example, when the work machine 40 is equipped with the periphery imaging device 413, surrounding image 1 may be used.
In addition, when the work assistance server 10 and the remote operation device 20 can acquire, via the network NW, a captured image of a device camera 70 installed at the work site where the work machine 40 is located, surrounding image 2 may be used.
If the unmanned aerial vehicle 80 is present in the air above the work machine 40 and the captured image captured by its imaging device 813 can be acquired via the network NW, surrounding image 3 may be used.
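The choice among surrounding images 1 to 3 described in the last three paragraphs can be written as a short fallback chain. The priority order below follows the order of the text (on-board periphery cameras first, then device cameras, then the aircraft), but the order and the capability flags are otherwise assumptions.

# Fallback selection among the three kinds of surrounding image described above.
# The priority order and the capability flags are illustrative assumptions.
def choose_surrounding_image_source(has_periphery_cameras: bool,
                                    device_camera_available: bool,
                                    drone_image_available: bool) -> str:
    if has_periphery_cameras:
        return "surrounding image 1: composite of on-board cameras 413A-413D"
    if device_camera_available:
        return "surrounding image 2: real machine view overlaid with device camera views"
    if drone_image_available:
        return "surrounding image 3: overhead view from the aircraft camera"
    return "no surrounding image source available"


print(choose_surrounding_image_source(False, True, True))
# surrounding image 2: real machine view overlaid with device camera views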
As described above, when the surrounding image is output to the image output device 221 (STEP 212 in fig. 4A), the operator OP can check the area around the work machine 40 for vehicles, site workers, and the like, and can confirm safety.
Next, operator information is transmitted to the work assistance server 10 via the remote wireless communication device 223 (fig. 4A/STEP 213). The "operator information" is information with which it can be determined whether the operator OP has confirmed the surrounding image output to the image output device 221.
For example, the operator information is information indicating that the surrounding image has been displayed continuously on the image output device 221 for a predetermined time (for example, 10 seconds) or longer. When the work assistance server 10 receives this information via the server wireless communication device 125 (fig. 4A/C12), the second assistance processing element 122 determines whether the work permission condition is satisfied (fig. 4A/STEP 114).
When the display time of the surrounding image is shorter than the predetermined time, it is highly likely that the operator OP has not confirmed the surrounding image. Therefore, the second assistance processing element 122 determines that the work permission condition is not satisfied (NO in fig. 4A/STEP 114), creates the surrounding image again (fig. 4A/STEP 112), and updates the surrounding image.
On the other hand, when the display time of the surrounding image is equal to or longer than the predetermined time, it is highly likely that the operator OP has confirmed the surrounding image. Therefore, the second auxiliary processing element 122 determines that the work permission condition is satisfied (YES in fig. 4A/STEP 114) and transmits the operation acceptance permission signal to the work machine 40 via the server wireless communication device 125 (fig. 4A/STEP 115).
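The display-time criterion can be pictured as a check that either grants permission or triggers re-creation of the surrounding image. The 10-second threshold follows the example given above; the function names and return values are invented for illustration.

# Work permission check based on how long the surrounding image has been displayed.
# The 10 s threshold follows the example in the text; everything else is illustrative.
DISPLAY_THRESHOLD_S = 10.0


def handle_operator_info(displayed_seconds: float) -> str:
    if displayed_seconds >= DISPLAY_THRESHOLD_S:
        return "send operation acceptance permission signal to the work machine"
    # Otherwise the surrounding image is created again and the display is updated.
    return "recreate and resend the surrounding image"


print(handle_operator_info(4.2))    # recreate and resend the surrounding image
print(handle_operator_info(12.0))   # send operation acceptance permission signal to the work machine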
Alternatively, the operator information may be an image from which the movement of the operator OP confirming the surrounding image output to the image output device 221 can be detected. In this case, the remote operation device 20 captures an image of the operator OP seated on the seat St with the operator imaging device 222, and transmits the captured image as an operator image to the work assistance server 10 via the remote wireless communication device 223 (fig. 4A/STEP 213).
Upon receiving the operator image via the server wireless communication device 125 (fig. 4A/C12), the work assistance server 10 determines, with the second assistance processing element 122, whether or not the work permission condition is satisfied (fig. 4A/STEP 114).
The second auxiliary processing element 122 performs image analysis on the operator image to determine the operator's actions. For example, when it is detected that the line of sight or the face of the operator OP is directed toward the image output device 221, or that the operator OP is pointing at the image output device 221, the second auxiliary processing element 122 determines that the work permission condition is satisfied (YES in fig. 4A/STEP 114). The second auxiliary processing element 122 then transmits the operation acceptance permission signal to the work machine 40 via the server wireless communication device 125 (fig. 4A/STEP 115).
On the other hand, when the second auxiliary processing element 122 determines that the work permission condition is not satisfied (NO in fig. 4A/STEP 114), the surrounding image is created again (fig. 4A/STEP 112). The operator OP is thereby prompted to confirm the surrounding image output to the image output device 221.
In addition, when the analysis result of the operator image shows that the state in which the work permission condition is not satisfied (NO in fig. 4A/STEP 114) has continued for a predetermined time or longer, the second auxiliary processing element 122 may perform control to output an alarm sound or the like from the speaker of the remote operation device 20.
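A heavily reduced version of this confirmation-action test is an angle check between the operator's estimated gaze direction and the direction from the seat toward the display. How the gaze or face direction is actually extracted from the operator image is left out here, and the tolerance angle and all names are assumptions.

import math


def gaze_hits_display(gaze_vec: tuple[float, float],
                      seat_to_display: tuple[float, float],
                      tolerance_deg: float = 20.0) -> bool:
    """True if the gaze direction is within tolerance of the seat-to-display direction."""
    gx, gy = gaze_vec
    dx, dy = seat_to_display
    norm = math.hypot(gx, gy) * math.hypot(dx, dy)
    if norm == 0.0:
        return False
    cos_angle = max(-1.0, min(1.0, (gx * dx + gy * dy) / norm))
    return math.degrees(math.acos(cos_angle)) <= tolerance_deg


# Display straight ahead of the seat; a slightly off-axis gaze still counts as confirmation.
print(gaze_hits_display((0.1, 1.0), (0.0, 1.0)))   # True  (about 6 degrees off axis)
print(gaze_hits_display((1.0, 0.1), (0.0, 1.0)))   # False (about 84 degrees off axis)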
When receiving the operation acceptance permission signal via the real machine wireless communication device 422 (fig. 4A/C41), the work machine 40 starts accepting remote operation from the remote operation device 20 (fig. 4A/STEP 412).
(second function)
Next, with reference to the flowchart of fig. 4B, further functions of the work support system 1 will be described.
The work assistance server 10 transmits a captured image switching instruction to the work machine 40 via the server wireless communication device 125 (fig. 4B/STEP 120). The "captured image switching instruction" is an instruction for displaying, on the image output device 221, the image used when the work machine 40 is remotely operated by the remote operation device 20.
When the work machine 40 receives the captured image switching instruction via the real machine wireless communication device 422 (fig. 4B/C42), the real machine control device 400 acquires a real machine captured image from the real machine imaging device 412 (fig. 4B/STEP 421). The real machine control device 400 then transmits real machine captured image data representing the real machine captured image to the work support server 10 (fig. 4B/STEP 422).
When the first auxiliary processing element 121 receives the real machine captured image data via the server wireless communication device 125 (fig. 4B/C14), the work assistance server 10 transmits the real machine captured image data to the remote operation device 20 through the second auxiliary processing element 122 (fig. 4B/STEP 121).
When the remote operation device 20 receives the real machine captured image data via the remote wireless communication device 223 (fig. 4B/C21), the remote control device 200 outputs a work environment image corresponding to the real machine captured image data to the image output device 221 (fig. 4B/STEP 220).
As a result, for example, as shown in fig. 10, a work environment image in which a boom 441, an arm 443, and a bucket 445, which are part of the working mechanism 440, are reflected is output to the image output device 221.
The remote operation device 20 recognizes the operation mode of the remote operation mechanism 211 with the remote control device 200 (fig. 4B/STEP 221). Further, the remote operation device 20 transmits a remote operation command corresponding to the operation mode to the work assistance server 10 via the remote wireless communication device 223 (fig. 4B/STEP 222).
When the work assistance server 10 receives the remote operation command through the second assistance processing element 122 (fig. 4B/C15), the first assistance processing element 121 transmits the remote operation command to the work machine 40 (fig. 4B/STEP 122).
When the real machine control device 400 receives the remote operation command via the real machine wireless communication device 422 (fig. 4B/C43), the work machine 40 controls the operation of the working mechanism 440 and the like (fig. 4B/STEP 423). Remote operation of the work machine 40 is thereby started. For example, work is performed in which earth in front of the work machine 40 is scooped up with the bucket 445, the upper revolving body 435 is revolved, and the earth is then dumped from the bucket 445.
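The relay of a remote operation command from the remote operation device through the work assistance server to the work machine can be pictured as forwarding a small message. The message fields and handler names in the sketch below are assumptions, not a format defined in the patent.

# Sketch of relaying a remote operation command (remote device -> server -> machine).
# Message fields and handler names are illustrative assumptions.
def build_remote_command(machine_id: str, lever: str, axis: str, deflection: float) -> dict:
    return {"machine_id": machine_id, "lever": lever, "axis": axis, "deflection": deflection}


def server_relay(command: dict) -> dict:
    # The work assistance server forwards the command to the identified work machine.
    return dict(command, relayed_by="work-assistance-server")


def machine_execute(command: dict) -> str:
    # The real machine control device would drive the corresponding actuator here.
    return f"{command['machine_id']}: drive {command['lever']}/{command['axis']} at {command['deflection']}"


cmd = build_remote_command("excavator-40", "right", "front_rear", 0.5)
print(machine_execute(server_relay(cmd)))   # excavator-40: drive right/front_rear at 0.5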
Then, the operator OP performs, on the remote operation device 20, an operation to end the work performed by the work machine 40. At this time, the remote operation device 20 transmits a work end signal to the work assistance server 10 (fig. 4B/STEP 223).
When receiving the work end signal via the server wireless communication device 125 (fig. 4B/C16), the work assistance server 10 transmits a work end signal to the work machine 40 (fig. 4B/STEP 123). In this case, the work assistance server 10 may instruct not only the imaging devices that are currently transmitting captured images but all of the imaging devices to end imaging.
When the work machine 40 receives the work end signal (fig. 4B/C44), imaging by the imaging devices and driving of the work machine 40 are stopped (fig. 4B/STEP 424).
Finally, the remote operation device 20 stops the output of the captured image (fig. 4B/STEP 224). Thereby, the display of the captured image on the image output device 221 is ended.
The operator OP may operate a plurality of work machines 40 from the remote operation device 20 and may switch between the work machines 40. When switching to a new work machine 40, the work assistance system 1 may execute the processing of fig. 4A and 4B described above again. The series of processes (the first function and the second function) of the work assistance server 10 is thus completed.
In this way, the work assistance server 10 captures images using the real machine imaging device 412 and the periphery imaging device 413 of the work machine 40, the device camera 70, and the like, and displays the captured images of the area around the work machine 40 on the image output device 221. The operator OP can grasp not only the state around the work machine 40 but also the relative positional relationship with obstacles, and can therefore perform the work reliably. In addition, the operator OP does not need to switch the imaging devices himself or herself.
In the work assistance server according to the first aspect of the present invention, it is preferable that the second assistance processing element permits the work of the work machine based on information related to confirmation of the peripheral image displayed on the display device by an operator who operates the remote operation device.
The second auxiliary processing element permits the operation of the work machine by the remote operation device based on information related to confirmation of the peripheral image displayed on the display device by the operator. In this way, the work assistance server can permit the operation of the work machine in a situation where the operator has a high possibility of confirming the surrounding image.
In the work support server according to the first aspect of the present invention, it is preferable that the information is information on a time period during which the peripheral image is displayed on the display device, and the second support processing element permits the work machine to perform the work based on a case in which the peripheral image is displayed on the display device for a predetermined time period.
The second auxiliary processing element estimates that the operator has a high possibility of confirming the surrounding image based on the fact that the surrounding image is displayed on the display device for a predetermined time, and allows the work machine to perform work.
In the work support server according to the first aspect of the present invention, it is preferable that the information is information relating to a result of detection of a confirmation operation by the operator, and the second support processing element permits the work machine to perform the work based on detection of the confirmation operation by the operator with respect to the surrounding image displayed on the display device.
In this case, the second auxiliary processing element can permit the work machine to perform the work when a situation in which the operator is likely to have confirmed the surrounding image, such as the confirmation operation of the surrounding image displayed on the display device, is detected.
In the work assistance server according to the first aspect of the present invention, it is preferable that the first assistance processing element creates the surrounding image based on a captured image captured by a surrounding imaging device provided in the work machine and capturing an image of the surrounding of the work machine.
The first auxiliary processing element can create a surrounding image based on a captured image captured by a surrounding imaging device provided in the work machine. Then, since the surrounding image is displayed on the display device, the work support server can use the surrounding image for the work permission of the operator.
In the work assistance server according to the first aspect of the present invention, it is preferable that the first assistance processing element grasps positions and imaging directions of a plurality of device cameras disposed at a work site where the work machine is located, selects the device camera that captures the work machine based on the positions and imaging directions of the device cameras and the position of the work machine, and creates the surrounding image based on a captured image captured by the selected device camera.
In this case, the first auxiliary processing element can create a surrounding image based on a captured image captured by an equipment camera disposed at the work site. Thus, the work support server can create a surrounding image even when the work machine does not include a surrounding imaging device.
In the work assistance server according to the first aspect of the present invention, it is preferable that the first assistance processing element creates the surrounding image based on a captured image captured by an imaging device provided in an aircraft capable of flying over the work machine.
In this case, the first auxiliary processing element can create the surrounding image based on the captured image captured by the imaging device provided in the aircraft flying over the working machine. Thus, the work support server can create a surrounding image even when the work machine does not include a surrounding imaging device or an appropriate device camera.
Description of the symbols
1…work assistance system; 10…work assistance server; 20…remote operation device; 40…work machine; 70, 70A, 70B…device camera; 80…unmanned aerial vehicle; 121…first auxiliary processing element; 122…second auxiliary processing element; 200…remote control device; 221…image output device; 222…operator imaging device; 412…real machine imaging device; 413…periphery imaging device; 425…cab; 427…lower traveling body; 430…swing mechanism; 435…upper revolving body; 440…working mechanism (working attachment); 445…bucket; 813…imaging device (unmanned aerial vehicle).

Claims (8)

1. A work support server that supports a work of a work machine to be operated so that the work machine to be operated can be remotely operated in accordance with an operation performed on a remote operation device having a display device,
the work support server is characterized by comprising:
a first auxiliary processing element that, when requested to start work on the work machine, creates a surrounding image in which a surrounding area of the work machine is captured;
and a second auxiliary processing element which permits the working machine to perform work on the condition that the surrounding image is displayed on the display device.
2. The work assistance server according to claim 1,
the second auxiliary processing element permits the work machine to perform work based on information related to confirmation of the surrounding image displayed on the display device by an operator operating the remote operation device.
3. The work assistance server according to claim 2,
the information is information related to a time at which the surrounding image is displayed on the display device,
the second auxiliary processing element permits the work machine to perform work based on a case where the surrounding image is displayed on the display device for a predetermined time.
4. The work assistance server according to claim 2 or 3,
the information is information related to a detection result of a confirmation action of the operator,
the second auxiliary processing element permits the work machine to perform work based on detection of the confirmation operation of the operator on the surrounding image displayed on the display device.
5. The work assistance server according to any one of claims 1 to 4,
the first auxiliary processing element creates the surrounding image based on a captured image captured by a surrounding imaging device provided in the work machine and capturing a surrounding of the work machine.
6. The work assistance server according to any one of claims 1 to 4,
the first auxiliary processing element grasps positions and imaging directions of a plurality of device cameras disposed in a work site where the work machine is located, selects the device camera that captures the work machine based on the positions and imaging directions of the device cameras and the position of the work machine, and creates the surrounding image based on a captured image captured by the selected device camera.
7. The work assistance server according to any one of claims 1 to 4,
the first auxiliary processing element creates the surrounding image based on a captured image captured by a capturing device provided on an aircraft capable of flying over the work machine.
8. A work assisting method for assisting a work of a work machine to be operated so that the work machine to be operated can be remotely operated in accordance with an operation performed on a remote operation device having a display device,
the work support method is characterized by comprising:
a first step of creating a surrounding image in which a surrounding area of the work machine is captured when a request to start work on the work machine is made;
and a second step of permitting the work machine to perform work on the condition that the surrounding image is displayed on the display device.
CN202180020031.1A 2020-03-13 2021-01-25 Work support server and work support method Pending CN115244254A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2020043744A JP2021143541A (en) 2020-03-13 2020-03-13 Operation support server, operation support method
JP2020-043744 2020-03-13
PCT/JP2021/002418 WO2021181916A1 (en) 2020-03-13 2021-01-25 Work assistance server and work assistance method

Publications (1)

Publication Number Publication Date
CN115244254A true CN115244254A (en) 2022-10-25

Family

ID=77671352

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180020031.1A Pending CN115244254A (en) 2020-03-13 2021-01-25 Work support server and work support method

Country Status (5)

Country Link
US (1) US20230092296A1 (en)
EP (1) EP4079978A4 (en)
JP (1) JP2021143541A (en)
CN (1) CN115244254A (en)
WO (1) WO2021181916A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014181508A (en) * 2013-03-19 2014-09-29 Sumitomo Heavy Ind Ltd Periphery monitoring apparatus for working machine
CN205894129U (en) * 2016-07-25 2017-01-18 山推工程机械股份有限公司 Wireless video monitor system of remote control bull -dozer
JP2017102604A (en) * 2015-11-30 2017-06-08 住友重機械工業株式会社 Periphery monitoring system for work machine
CN107075840A (en) * 2015-04-28 2017-08-18 株式会社小松制作所 The periphery monitoring apparatus of Work machine and the environment monitoring method of Work machine
CN107431783A (en) * 2015-07-31 2017-12-01 株式会社小松制作所 The display system of Work machine and the display methods of Work machine
JP2018105064A (en) * 2016-12-28 2018-07-05 株式会社小松製作所 Work vehicle, and control system for work vehicle

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1028265A (en) * 1996-07-11 1998-01-27 Hitachi Constr Mach Co Ltd Working-site monitor device for remote control working machine
JP7023714B2 (en) 2015-12-28 2022-02-22 住友建機株式会社 Excavator
CN114640827A (en) * 2016-01-29 2022-06-17 住友建机株式会社 Shovel and autonomous flying body flying around shovel
JP7252137B2 (en) * 2017-12-04 2023-04-04 住友重機械工業株式会社 Perimeter monitoring device
JP7087545B2 (en) * 2018-03-28 2022-06-21 コベルコ建機株式会社 Construction machinery
WO2020032267A1 (en) * 2018-08-10 2020-02-13 住友建機株式会社 Shovel
JP2019060228A (en) * 2018-11-01 2019-04-18 住友重機械工業株式会社 Periphery monitoring device for work machine
JP7338514B2 (en) * 2020-03-04 2023-09-05 コベルコ建機株式会社 Work support server, work support method

Also Published As

Publication number Publication date
US20230092296A1 (en) 2023-03-23
WO2021181916A1 (en) 2021-09-16
EP4079978A1 (en) 2022-10-26
JP2021143541A (en) 2021-09-24
EP4079978A4 (en) 2023-06-14

Similar Documents

Publication Publication Date Title
EP2717570A1 (en) Device for monitoring area around working machine
JP7151392B2 (en) Remote control device for construction machinery
CN108432234B (en) Terminal device, control device, data integration device, work vehicle, imaging system, and imaging method
WO2021176883A1 (en) Work support server and work support method
JP2021097367A (en) Remote operation system and remote operation server
CN116848307A (en) Work support system and work support composite system
CN115244254A (en) Work support server and work support method
WO2021131161A1 (en) Work assisting server and method for selecting imaging device
WO2021106278A1 (en) Work assistance server, work assistance method, and work assistance system
WO2021153187A1 (en) Work assisting server and work assisting system
JP7283283B2 (en) working machine
WO2022070567A1 (en) Remote operation assistance device and remote operation assistance system
WO2023136070A1 (en) Remote operation support system and remote operation support method
WO2021106280A1 (en) Work assist server, work assist method, and work assist system
EP4365376A1 (en) Remote operation system and remote operation composite system
WO2021166475A1 (en) Remote operation device, remote operation assistance server, remote operation assistance system, and remote operation assistance method
JP2022152338A (en) Work support system and work support complex system
CN115118873A (en) Shooting function control system and shooting function control method
JP2023032997A (en) remote control system
CN117043732A (en) Remote operation support server and remote operation support system
CN114902311A (en) Work support server, work support method, and work support system
CN115516175A (en) Remote operation support device and remote operation support method
JP2021021253A (en) Work machine and work machine assistance server

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination