US20230092296A1 - Work assistance server and work assistance method - Google Patents

Work assistance server and work assistance method

Info

Publication number
US20230092296A1
US20230092296A1 (Application No. US 17/794,655)
Authority
US
United States
Prior art keywords
work
image
work machine
picked
assistance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/794,655
Other languages
English (en)
Inventor
Hitoshi Sasaki
Seiji Saiki
Yoichiro Yamazaki
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kobelco Construction Machinery Co Ltd
Original Assignee
Kobelco Construction Machinery Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kobelco Construction Machinery Co Ltd filed Critical Kobelco Construction Machinery Co Ltd
Assigned to KOBELCO CONSTRUCTION MACHINERY CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SASAKI, HITOSHI; SAIKI, SEIJI; YAMAZAKI, YOICHIRO
Publication of US20230092296A1 publication Critical patent/US20230092296A1/en
Pending legal-status Critical Current

Classifications

    • E FIXED CONSTRUCTIONS
    • E02 HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02F DREDGING; SOIL-SHIFTING
    • E02F9/00 Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
    • E02F9/26 Indicating devices
    • E02F9/261 Surveying the work-site to be treated
    • E02F9/262 Surveying the work-site to be treated with follow-up actions to control the work tool, e.g. controller
    • E FIXED CONSTRUCTIONS
    • E02 HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02F DREDGING; SOIL-SHIFTING
    • E02F9/00 Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
    • E02F9/20 Drives; Control devices
    • E02F9/2025 Particular purposes of control systems not otherwise provided for
    • E02F9/205 Remotely operated machines, e.g. unmanned vehicles
    • E FIXED CONSTRUCTIONS
    • E02 HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02F DREDGING; SOIL-SHIFTING
    • E02F3/00 Dredgers; Soil-shifting machines
    • E02F3/04 Dredgers; Soil-shifting machines mechanically-driven
    • E02F3/28 Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets
    • E02F3/36 Component parts
    • E02F3/42 Drives for dippers, buckets, dipper-arms or bucket-arms
    • E02F3/43 Control of dipper or bucket position; Control of sequence of drive operations
    • E02F3/435 Control of dipper or bucket position; Control of sequence of drive operations for dipper-arms, backhoes or the like

Definitions

  • the present invention relates to a work assistance server that assists remote operation of a work machine and a work assistance method for assisting the work of the work machine.
  • conventionally, a plurality of fixed-point cameras are installed at a work site where a work machine such as a remotely operated shovel works, which makes it possible to overlook the entire site and to monitor the motion or work of a specific shovel.
  • a left camera is attached to a stay on the left front side of a cabin, and a back camera is attached to an upper portion of a counter weight.
  • a right camera is attached to a handrail on the right front side of an upper turning body.
  • the left camera and the right camera are attached such that a side surface of the upper turning body is included in their respective image pickup ranges.
  • the back camera is attached such that a rear end of an upper surface of the counter weight is included in its image pickup range.
  • when a work machine is remotely operated, an operator (worker) is at a position spaced apart from the work machine. Accordingly, the worker desires to be able to efficiently confirm a video image from a camera that picks up an image of the surroundings of the work machine to be remotely operated when performing the work.
  • the present invention has been made in view of the above-described points, and is directed to providing a work assistance server that enables a worker to efficiently perceive the situation around a work machine.
  • a work assistance server that assists in work by a work machine to be operated such that the work machine to be operated can be remotely operated in response to an operation performed on a remote operation apparatus including a display device comprises a first assistance processing element that generates a surroundings image including a region around the work machine when a request to start the work by the work machine is made, and a second assistance processing element that permits the work by the work machine on condition that the surroundings image is displayed on the display device.
  • according to the work assistance server of the present invention, a picked-up image (a still image or a moving image) obtained by picking up an image of the region around the work machine is sent to the display device, and a worker can confirm the situation around the work machine. As a result, the work assistance server assists the worker in the work by the work machine.
  • the first assistance processing element generates the surroundings image including the region around the work machine when the request to start the work by the work machine is made, and displays the surroundings image on the display device of the remote operation apparatus.
  • the second assistance processing element permits the work by the work machine on condition that the surroundings image is displayed on the display device. As a result, an operator overlooks the region around the work machine before starting the work by the work machine.
  • the work assistance server can permit the remote operation apparatus to operate the work machine after the surroundings image to be confirmed by the operator is reliably displayed on the display device of the remote operation apparatus.
  • a work assistance method for assisting in work by a work machine to be operated such that the work machine to be operated can be remotely operated in response to an operation performed on a remote operation apparatus including a display device comprises a first step of generating a surroundings image including a region around the work machine when a request to start the work by the work machine is made, and a second step of permitting the work by the work machine on condition that the surroundings image is displayed on the display device.
  • the surroundings image including the region around the work machine is generated when the request to start the work by the work machine is made, and the surroundings image is displayed on the display device of the remote operation apparatus.
  • the work by the work machine is permitted on condition that the surroundings image is displayed on the display device.
  • the remote operation apparatus can be permitted to operate the work machine after the surroundings image to be confirmed by the operator is reliably displayed on the display device of the remote operation apparatus.
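As a rough orientation only, the two-step flow summarized above (generate and display a surroundings image when a start request is made, then permit the work only after the image has been displayed) could look like the following Python sketch. All names (WorkAssistanceServer, generate_surroundings_image, send_to_display) are assumptions for illustration, not the claimed implementation; the 10-second threshold reuses the example value given later in the embodiment.

```python
from dataclasses import dataclass, field
import time


@dataclass
class WorkAssistanceServer:
    """Minimal sketch of the two assistance processing elements."""
    display_log: dict = field(default_factory=dict)  # machine_id -> display start time

    # First assistance processing element: build the surroundings image
    # when a request to start work is made, and send it to the display device.
    def on_work_start_request(self, machine_id: str, camera_images: list):
        surroundings_image = self.generate_surroundings_image(camera_images)
        self.send_to_display(machine_id, surroundings_image)
        self.display_log[machine_id] = time.monotonic()

    # Second assistance processing element: permit work only once the
    # surroundings image has actually been shown on the display device.
    def may_permit_work(self, machine_id: str, min_display_s: float = 10.0) -> bool:
        started = self.display_log.get(machine_id)
        return started is not None and time.monotonic() - started >= min_display_s

    # --- placeholders standing in for functionality described elsewhere in the text ---
    def generate_surroundings_image(self, camera_images):
        return camera_images  # placeholder; synthesis is sketched further below

    def send_to_display(self, machine_id, image):
        pass  # placeholder for transmission over the network NW
```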
  • FIG. 1 is a diagram illustrating an outline of a work assistance system according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a configuration of an operation mechanism in a work machine.
  • FIG. 3 A is a diagram (1) for describing details of the work machine.
  • FIG. 3 B is a diagram (2) for describing details of the work machine.
  • FIG. 4 A is a flowchart (1) of processing to be performed by each of a work assistance server, a remote operation apparatus, and the work machine.
  • FIG. 4 B is a flowchart (2) of processing to be performed by each of the work assistance server, the remote operation apparatus, and the work machine.
  • FIG. 5 illustrates an example of an image (a surroundings image 1) to be displayed on an image output device.
  • FIG. 6 is a diagram (1) for describing means for acquiring a surroundings image of the work machine.
  • FIG. 7 illustrates an example of an image (a surroundings image 2) to be displayed on the image output device.
  • FIG. 8 is a diagram (2) for describing means for acquiring a surroundings image of the work machine.
  • FIG. 9 illustrates an example of an image (a surroundings image 3) to be displayed on the image output device.
  • FIG. 10 is a diagram illustrating a work environment image to be displayed on the image output device.
  • a work assistance server 10 according to an embodiment of the present invention and a work assistance system 1 including the work assistance server 10 will be described with reference to FIG. 1 .
  • the work assistance system 1 for a construction machine is a system configured to be able to selectively remotely operate a plurality of work machines 40 assigned as operation targets to an operator OP or a remote operation apparatus 20 by the operator OP operating the remote operation apparatus 20 .
  • a work site where the plurality of work machines 40 to be operated by the remote operation apparatus 20 are arranged may be one work site or any one of a plurality of work sites.
  • the work assistance system 1 comprises at least the work assistance server 10 and the remote operation apparatus 20 configured to remotely operate the work machine 40 in addition to the work machine 40 .
  • the work assistance server 10 , the remote operation apparatus 20 , and the work machine 40 are configured to be communicable with one another by a network NW including a wireless communication network.
  • the work assistance server 10 automatically selects an appropriate image pickup device or picked-up image in response to switching, movement, or a work content, for example, of the work machine 40 .
  • the work assistance server 10 outputs the picked-up image to an image output device 221 (a “display device” in the present invention), described below, in order to assist an operator OP of the remote operation apparatus 20 in performing an operation.
  • the work assistance server 10 comprises a database 102 , a first assistance processing element 121 , a second assistance processing element 122 , and server wireless communication equipment 125 .
  • the database 102 stores and holds a picked-up image picked up by an actual machine image pickup device 412 , a surroundings image pickup device 413 , or the like, described below.
  • the database 102 may be constituted by a database server separate from the work assistance server 10 .
  • Each of the assistance processing elements 121 and 122 is constituted by an arithmetic processing unit (a single core processor or a multi-core processor or a processor core constituting them). Each of the assistance processing elements 121 and 122 reads required data and software from a storage device such as a memory and performs arithmetic processing conforming to the software with the data used as a target.
  • the server wireless communication equipment 125 issues an instruction to perform display on the image output device 221 via the network NW, and receives the picked-up image from the image pickup device.
  • the remote operation apparatus 20 comprises a remote control device 200 , a remote input interface 210 , and a remote output interface 220 .
  • the remote control device 200 is constituted by an arithmetic processing unit (a single core processor or a multi-core processor or a processor core constituting them).
  • the remote control device 200 reads required data and software from a storage device such as a memory and performs arithmetic processing conforming to the software with the data used as a target.
  • the remote input interface 210 comprises a remote operation mechanism 211 .
  • the remote output interface 220 comprises the image output device 221 , a worker image pickup device 222 , and remote wireless communication equipment 223 .
  • the worker image pickup device 222 is a camera attached to the remote operation apparatus 20 , and picks up an image of at least an operation seat (a seat St).
  • the remote wireless communication equipment 223 transmits an operation signal to the work machine 40 via the network NW, and receives a picked-up image from the actual machine image pickup device 412 , the surroundings image pickup device 413 , or the like.
  • the remote operation mechanism 211 comprises a traveling operation device, a turning operation device, a boom operation device, an arm operation device, and a bucket operation device.
  • Each of the operation devices has an operation lever that receives a rotation operation.
  • the operation lever (traveling lever) of the traveling operation device is operated to operate a lower traveling body 427 in the work machine 40 (see FIGS. 3 A and 3 B ).
  • the traveling lever may also serve as a traveling pedal.
  • a traveling pedal fixed to a base portion or a lower end portion of the traveling lever may be provided.
  • the operation lever (turning lever) of the turning operation device is used to operate a hydraulic turning motor constituting a turning mechanism 430 in the work machine 40 .
  • the operation lever (boom lever) of the boom operation device is used to operate a boom cylinder 442 in the work machine 40 (see FIGS. 3 A and 3 B ).
  • the operation lever (arm lever) of the arm operation device is used to operate an arm cylinder 444 in the work machine 40 .
  • the operation lever (bucket lever) of the bucket operation device is used to operate a bucket cylinder 446 in the work machine 40 (see FIGS. 3 A and 3 B ).
  • the seat St has a form such as a high back chair with an armrest.
  • the seat St may have any form in which the operator can sit, for example, a form like a low back chair with no headrest or a form like a chair with no backrest.
  • a pair of left and right traveling levers 2110 respectively corresponding to left and right crawlers are laterally arranged side by side in front of the seat St.
  • the one operation lever may also serve as a plurality of operation levers.
  • a right-side operation lever 2111 provided in front of a right-side frame of the seat St may function as a boom lever when operated in a front-rear direction and function as a bucket lever when operated in a left-right direction.
  • a left-side operation lever 2112 provided in front of a left-side frame of the seat St may function as an arm lever when operated in the front-rear direction and function as a turning lever when operated in the left-right direction.
  • a lever pattern may be arbitrarily changed in response to an operation instruction from the operator OP.
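The dual-function lever assignment described above lends itself to a small mapping. The sketch below is a hypothetical illustration of the example lever pattern (right lever: boom and bucket, left lever: arm and turning); the function name and the normalized stroke range are assumptions, and other lever patterns would simply remap the axes.

```python
def interpret_lever_inputs(right_fb: float, right_lr: float,
                           left_fb: float, left_lr: float,
                           pattern: str = "default") -> dict:
    """Map dual-function lever deflections (-1.0 .. +1.0) to actuator commands.

    Follows the example in the text: the right-side operation lever acts as the
    boom lever (front-rear) and bucket lever (left-right); the left-side lever
    acts as the arm lever (front-rear) and turning lever (left-right).
    """
    if pattern == "default":
        return {"boom": right_fb, "bucket": right_lr,
                "arm": left_fb, "turning": left_lr}
    raise ValueError(f"unknown lever pattern: {pattern}")
```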
  • the image output device 221 comprises a central image output device 2210 , a left-side image output device 2211 , and a right-side image output device 2212 respectively having substantially rectangular screens arranged in front of, diagonally leftward in front of, and diagonally rightward in front of the seat St, as illustrated in FIG. 2 .
  • Respective shapes and sizes of the screens (image display regions) of the central image output device 2210 , the left-side image output device 2211 , and the right-side image output device 2212 may be the same as or different from one another.
  • a right edge of the left-side image output device 2211 is adjacent to a left edge of the central image output device 2210 such that the screen of the central image output device 2210 and the screen of the left-side image output device 2211 form an inclined angle θ1 (e.g., 120° ≤ θ1 ≤ 150°).
  • a left edge of the right-side image output device 2212 is adjacent to a right edge of the central image output device 2210 such that the screen of the central image output device 2210 and the screen of the right-side image output device 2212 form an inclined angle θ2 (e.g., 120° ≤ θ2 ≤ 150°).
  • the inclined angles θ1 and θ2 may be the same as or different from each other.
  • the respective screens of the central image output device 2210 , the left-side image output device 2211 , and the right-side image output device 2212 may be parallel to one another in a vertical direction, or may be inclined in the vertical direction. At least one of the central image output device 2210 , the left-side image output device 2211 , and the right-side image output device 2212 may be constituted by a plurality of separated image output devices.
  • the central image output device 2210 may be constituted by a pair of image output devices, which are vertically adjacent to each other, each having a substantially rectangular screen.
  • Each of the image output devices 2210 to 2212 may further comprise a speaker (voice output device).
  • the work machine 40 comprises an actual machine control device 400 , an actual machine input interface 410 , an actual machine output interface 420 , and an actuation mechanism (work attachment) 440 .
  • the actual machine control device 400 is constituted by an arithmetic processing unit (a single core processor or a multi-core processor or a processor core constituting them).
  • the actual machine control device 400 reads required data and software from a storage device such as a memory and performs arithmetic processing conforming to the software with the data used as a target.
  • FIGS. 3 A and 3 B illustrate an example of the work machine 40 according to the present embodiment.
  • the work machine 40 is, for example, a crawler shovel (construction machine), and comprises a machine body 450 constituted by the crawler-type lower traveling body 427 and an upper turning body 435 turnably mounted on the lower traveling body 427 via the turning mechanism 430 , and an actuation mechanism 440 .
  • a front left side portion of the upper turning body 435 is provided with a cab (operation room) 425 , and a front central portion of the upper turning body 435 is provided with the actuation mechanism 440 .
  • a machine housing portion 436 , which houses machines such as an engine, and a counter weight 437 , which is arranged behind the machine housing portion 436 , are arranged behind the cab (operation room) 425 .
  • the actual machine input interface 410 comprises an actual machine operation mechanism 411 , the actual machine image pickup device 412 , and the surroundings image pickup device 413 .
  • the actual machine operation mechanism 411 comprises a plurality of operation levers arranged similarly to the remote operation mechanism 211 around a seat arranged in the cab 425 .
  • the cab 425 is provided with a driving mechanism or a robot that receives a signal corresponding to an operation mode of the remote operation lever and moves the actual machine operation levers based on the received signal.
  • the actual machine image pickup device 412 is installed in the cab 425 , for example.
  • the actual machine image pickup device 412 picks up an image of a forward orientation of the work machine 40 through a front window of the cab 425 .
  • the surroundings image pickup device 413 is installed in a front lower portion of the cab 425 , for example.
  • the surroundings image pickup device 413 comprises a front camera 413 A that picks up an image of the front of the work machine 40 , a right camera 413 B that is installed on the machine body 450 and picks up an image of the right of the work machine 40 , a left camera 413 D that is installed on the machine body 450 and picks up an image of the left of the work machine 40 , and a rear camera 413 C that is installed on the machine body 450 and picks up an image of the rear of the work machine 40 .
  • Respective image pickup ranges of the cameras 413 A to 413 D are set to overlap one another, and can pick up an omnidirectional image (360°) of a region around the work machine 40 .
  • the actual machine output interface 420 comprises actual machine wireless communication equipment 422 .
  • the actual machine wireless communication equipment 422 receives a signal corresponding to the operation mode of the remote operation lever from the remote wireless communication equipment 223 in the remote operation apparatus 20 via the network NW.
  • the signal is further transmitted to the actual machine control device 400 , and the work machine 40 operates in response to the signal.
  • the cab 425 is provided with a driving mechanism or a robot that actuates the actual machine operation levers based on the signal.
  • the actuation mechanism 440 comprises a boom 441 mounted on the upper turning body 435 , an arm 443 rotatably connected to a distal end portion of the boom 441 , and a bucket 445 rotatably connected to a distal end portion of the arm 443 .
  • the boom cylinder 442 , the arm cylinder 444 , and the bucket cylinder 446 each constituted by a stretchable hydraulic cylinder are mounted on the actuation mechanism 440 .
  • a positioning device 460 is a device that detects a position of the work machine 40 .
  • the positioning device 460 is constituted by a GNSS receiver (GNSS: Global Navigation Satellite System), for example.
  • GNSS Global Navigation Satellite System
  • the position of the work machine 40 detected by the positioning device 460 is transmitted to the work assistance server 10 via the actual machine wireless communication equipment 422 , and is stored in the database 102 .
  • the boom cylinder 442 is interposed between the boom 441 and the upper turning body 435 , and expands and contracts upon being supplied with hydraulic oil to rotate the boom 441 in a rise and fall direction.
  • the arm cylinder 444 is interposed between the arm 443 and the boom 441 , and expands and contracts upon being supplied with hydraulic oil to rotate the arm 443 around a horizontal axis relative to the boom 441 .
  • the bucket cylinder 446 is interposed between the bucket 445 and the arm 443 , and expands and contracts upon being supplied with hydraulic oil to rotate the bucket 445 around a horizontal axis relative to the arm 443 .
  • the work assistance system 1 can further comprise an installation camera 70 installed at a work site where each of the work machines 40 is arranged and an unmanned aircraft 80 that flies above the work site where the work machine 40 is arranged.
  • the work assistance server 10 and the remote operation apparatus 20 can appropriately acquire a picked-up video image picked up by an image pickup device 713 in the installation camera 70 and a picked-up video image picked up by an image pickup device 813 in the unmanned aircraft 80 via the network NW and output the picked-up video images to the image output device 221 .
  • a function of the work assistance system 1 according to the present embodiment will be described with reference to the flowcharts of FIGS. 4 A and 4 B . Selection of the image pickup device, display of the picked-up image, and the like will be appropriately described with reference to FIGS. 5 to 10 .
  • blocks “C 10 ” to “C 44 ” in the flowcharts, which are used to simplify the description, mean transmission and/or receiving of data, and mean that processing in a branch direction is performed on condition that the data is transmitted and/or received.
  • the presence or absence of a first designation operation through the remote input interface 210 by the operator OP is determined ( FIG. 4 A /STEP 210 ).
  • the “first designation operation” is an operation for selecting the work machine 40 that cooperates with the remote operation apparatus 20 .
  • the presence or absence of the first designation operation is determined in the following manner, for example.
  • a map, a list, or the like representing respective existence positions of work machines 40 that can cooperate with the remote operation apparatus 20 is outputted to the remote output interface 220 .
  • if the first designation operation is performed, the remote operation apparatus 20 transmits a connection request signal to the work assistance server 10 . The connection request signal includes a work machine identifier for identifying the work machine 40 that has established communication with the remote operation apparatus 20 or the work machine 40 designated through the remote input interface 210 .
  • the first assistance processing element 121 issues an instruction to start transmission of a picked-up image to the work machine 40 identified by the work machine identifier ( FIG. 4 A /STEP 111 ). At that time, at least the actual machine image pickup device 412 of the work machine 40 is selected as an image pickup device.
  • the work machine 40 transmits, if it receives the transmission start instruction ( FIG. 4 A /C 40 ), a picked-up image to the work assistance server 10 ( FIG. 4 A /STEP 411 ).
  • the work assistance server 10 generates, if it receives the picked-up image via the server wireless communication equipment 125 ( FIG. 4 A /C 11 ), a surroundings image ( FIG. 4 A /STEP 112 ), and transmits the surroundings image to the remote operation apparatus 20 via the server wireless communication equipment 125 ( FIG. 4 A /STEP 113 ).
  • Examples of the surroundings image include surroundings images 1 to 3 listed below.
  • the work machine 40 transmits, if it comprises the surroundings image pickup device 413 illustrated in FIGS. 3 A and 3 B , images respectively picked up by the cameras 413 A to 413 D constituting the surroundings image pickup device 413 to the work assistance server 10 via the actual machine wireless communication equipment 422 ( FIG. 4 A /STEP 411 ).
  • if the work assistance server 10 receives the picked-up images respectively picked up by the cameras 413 A to 413 D via the server wireless communication equipment 125 ( FIG. 4 A /C 11 ), the first assistance processing element 121 synthesizes the picked-up images respectively picked up by the cameras 413 A to 413 D.
  • the work assistance server 10 thereby generates a surroundings image as viewed from above the work machine 40 ( FIG. 4 A /STEP 112 ), as illustrated in FIG. 5 , and transmits the surroundings image to the remote operation apparatus 20 via the server wireless communication equipment 125 ( FIG. 4 A /STEP 113 ).
  • the remote operation apparatus 20 outputs, if it receives the surroundings image via the remote wireless communication equipment 223 ( FIG. 4 A /C 20 ), the surroundings image to the image output device 221 ( FIG. 4 A /STEP 212 ).
  • in the surroundings image illustrated in FIG. 5 , the image of the work machine 40 at its center may be stored in advance as graphical information in the database 102 .
  • each broken line in FIG. 5 indicates a joint between the picked-up images respectively picked up by the cameras 413 A to 413 D. The broken lines need not be outputted to the image output device 221 .
  • the surroundings image pickup device 413 in the work machine 40 may generate the surroundings image by synthesizing the picked-up images respectively picked up by the cameras 413 A to 413 D.
  • the work machine 40 transmits the surroundings image generated by the surroundings image pickup device 413 to the work assistance server 10 via the actual machine wireless communication equipment 422 ( FIG. 4 A /STEP 411 ).
  • the work assistance server 10 can be configured to transmit, if it receives the surroundings image via the server wireless communication equipment 125 ( FIG. 4 A /C 11 ), the surroundings image to the remote operation apparatus 20 via the server wireless communication equipment 125 ( FIG. 4 A /STEP 113 ).
  • the remote output interface 220 in the remote operation apparatus 20 may generate the surroundings image by synthesizing the picked-up images respectively picked up by the cameras 413 A to 413 D.
  • the work assistance server 10 transmits, if it receives the picked-up images respectively picked up by the cameras 413 A to 413 D via the server wireless communication equipment 125 ( FIG. 4 A /C 11 ), the picked-up images to the remote operation apparatus 20 via the server wireless communication equipment 125 ( FIG. 4 A /STEP 113 ).
  • the remote operation apparatus 20 synthesizes, if it receives the picked-up images respectively picked up by the cameras 413 A to 413 D via the remote wireless communication equipment 223 , the picked-up images. Then, the remote operation apparatus 20 outputs the generated surroundings image to the image output device 221 ( FIG. 4 A /STEP 212 ).
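All three variants above end with the same synthesis step: the four picked-up images are merged into a single overhead view of the region around the work machine 40. A minimal sketch of such a synthesis, assuming OpenCV, NumPy, and pre-calibrated ground-plane homographies (none of which are named in the source text), is shown below; wherever the synthesis runs (work machine, server, or remote operation apparatus), the idea is the same.

```python
import cv2
import numpy as np


def synthesize_surroundings_image(images, homographies, machine_icon,
                                  canvas_size=(800, 800)):
    """Warp four camera images (e.g., cameras 413A-413D) onto a common
    ground-plane canvas and place the stored machine graphic at the center.

    images       : list of BGR frames, one per camera
    homographies : list of 3x3 matrices mapping each frame to the canvas,
                   obtained beforehand by extrinsic/ground-plane calibration
    machine_icon : small BGR image of the work machine stored in the database
    """
    h, w = canvas_size
    canvas = np.zeros((h, w, 3), dtype=np.uint8)
    for img, H in zip(images, homographies):
        warped = cv2.warpPerspective(img, H, (w, h))
        # Simple overwrite blend; a real system would feather the seams
        # (the broken lines shown in FIG. 5).
        mask = warped.any(axis=2)
        canvas[mask] = warped[mask]

    # Overlay the stored machine graphic at the canvas center.
    ih, iw = machine_icon.shape[:2]
    y0, x0 = (h - ih) // 2, (w - iw) // 2
    canvas[y0:y0 + ih, x0:x0 + iw] = machine_icon
    return canvas
```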
  • the work machine 40 transmits a picked-up image picked up by the actual machine image pickup device 412 to the work assistance server 10 via the actual machine wireless communication equipment 422 ( FIG. 4 A /STEP 411 ). Further, a position of the work machine 40 detected by the positioning device 460 is transmitted to the work assistance server 10 via the actual machine wireless communication equipment 422 , and is stored in the database 102 .
  • if the work assistance server 10 receives the picked-up image picked up by the actual machine image pickup device 412 via the server wireless communication equipment 125 ( FIG. 4 A /C 11 ), the first assistance processing element 121 generates a surroundings image obtained by superimposing picked-up images respectively picked up by the installation cameras 70 A and 70 B on the picked-up image picked up by the actual machine image pickup device 412 ( FIG. 4 A /STEP 112 ), as illustrated in FIG. 7 .
  • the work assistance server 10 can receive a picked-up image picked up by the image pickup device 713 , which is to be sent via a communication equipment 723 in the installation camera 70 , in addition to the picked-up image picked up by the actual machine image pickup device 412 via the server wireless communication equipment 125 .
  • the first assistance processing element 121 grasps the position and the shootable range of each installation camera 70 at the work site.
  • the work assistance server 10 specifies, based on the position of the work machine 40 stored in the database 102 , the installation camera 70 whose shootable range includes that position. Further, the work assistance server 10 superimposes the picked-up image picked up by the specified installation camera 70 , in which the work machine 40 is captured, on the picked-up image picked up by the actual machine image pickup device 412 , and outputs the obtained image to the image output device 221 ( FIG. 4 A /STEP 212 ).
  • all picked-up images respectively picked up by the installation cameras 70 may be displayed, or some of the picked-up images may be displayed.
  • a picked-up image picked up by an installation camera 70 B that shoots the work machine 40 from the left side is outputted to the left side of the image output device 221 .
  • a picked-up image picked up by an installation camera 70 A that shoots the work machine 40 from the right side is outputted to the right side of the image output device 221 .
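The camera selection and left/right placement described for surroundings image 2 can be reduced to a containment test against each installation camera's shootable range plus a side test relative to the machine heading. The data structures and field names below are assumptions for illustration only.

```python
import math
from dataclasses import dataclass


@dataclass
class InstallationCamera:
    camera_id: str
    x: float            # camera position at the work site (site coordinates)
    y: float
    heading_rad: float  # direction the camera faces
    fov_rad: float      # horizontal field of view
    range_m: float      # maximum useful shooting distance


def camera_sees(cam: InstallationCamera, mx: float, my: float) -> bool:
    """True if the work machine position (mx, my) lies in the camera's shootable range."""
    dx, dy = mx - cam.x, my - cam.y
    if math.hypot(dx, dy) > cam.range_m:
        return False
    bearing = math.atan2(dy, dx)
    diff = (bearing - cam.heading_rad + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= cam.fov_rad / 2


def assign_side(cam: InstallationCamera, mx: float, my: float,
                machine_heading_rad: float) -> str:
    """Decide whether the camera views the machine from its left or right side,
    so its image can be placed on the matching side of the image output device."""
    dx, dy = cam.x - mx, cam.y - my
    # Sign of the cross product between the machine heading and the direction toward the camera.
    cross = math.cos(machine_heading_rad) * dy - math.sin(machine_heading_rad) * dx
    return "left" if cross > 0 else "right"


def select_cameras(cams, mx, my, machine_heading_rad):
    return {c.camera_id: assign_side(c, mx, my, machine_heading_rad)
            for c in cams if camera_sees(c, mx, my)}
```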
  • the work assistance server 10 can receive a picked-up image picked up by the image pickup device 813 loaded into the unmanned aircraft 80 , which is to be transmitted via a communication equipment 823 .
  • the unmanned aircraft 80 may be a rechargeable aircraft that flies wirelessly, or may be supplied with power by wire.
  • the unmanned aircraft 80 waits at a work site or a station provided on the work machine 40 .
  • the image pickup device 813 can pick up an image of at least a portion below the unmanned aircraft 80 .
  • the unmanned aircraft 80 moves to the area above the work machine 40 , and the image pickup device 813 starts image pickup.
  • the work machine 40 transmits the picked-up image picked up by the actual machine image pickup device 412 to the work assistance server 10 via the actual machine wireless communication equipment 422 ( FIG. 4 A /STEP 411 ).
  • the picked-up image picked up by the image pickup device 813 is transmitted to the work assistance server 10 via the communication equipment 823 .
  • the picked-up image picked up by the image pickup device 813 may be transmitted to the work assistance server 10 via the actual machine wireless communication equipment 422 in the work machine 40 .
  • if the work assistance server 10 receives the picked-up image picked up by the image pickup device 813 via the server wireless communication equipment 125 ( FIG. 4 A /C 11 ), the first assistance processing element 121 treats the picked-up image as a surroundings image, and transmits the surroundings image to the remote operation apparatus 20 via the server wireless communication equipment 125 ( FIG. 4 A /STEP 113 ).
  • the remote operation apparatus 20 outputs, if it receives the surroundings image via the remote wireless communication equipment 223 ( FIG. 4 A /C 20 ), the surroundings image as illustrated in FIG. 9 to the image output device 221 ( FIG. 4 A /STEP 212 ).
  • in this surroundings image, the image of the work machine 40 illustrated at its center and the image of the work site around the work machine 40 are each a picked-up image picked up by the image pickup device 813 .
  • a height and a direction of the unmanned aircraft 80 when the work machine 40 is shot from above may be set in advance.
  • for example, an image of the work machine 40 whose direction and size in a picked-up image are appropriate is stored as a defined image, and the height and the direction of the unmanned aircraft 80 can be controlled to the height and the direction at which the defined image can be picked up.
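A simple way to hold the unmanned aircraft 80 at a height and direction matching such a defined image is proportional feedback on the apparent size and orientation of the work machine in the picked-up image. The sketch below assumes a detector already supplies the machine's bounding-box height and yaw; the gains and function name are not part of the source text.

```python
def adjust_aircraft(defined_box_height_px: float, observed_box_height_px: float,
                    defined_yaw_deg: float, observed_yaw_deg: float,
                    altitude_m: float, heading_deg: float,
                    k_alt: float = 0.5, k_yaw: float = 0.5):
    """Return updated (altitude, heading) that nudge the unmanned aircraft toward
    a pose in which the work machine appears with the size and direction of the
    stored defined image. Pure proportional control, for illustration only."""
    # If the machine looks smaller than in the defined image, descend; if larger, climb.
    size_error = observed_box_height_px / max(defined_box_height_px, 1e-6) - 1.0
    altitude_m += k_alt * size_error * altitude_m

    # Rotate toward the yaw at which the defined image was captured.
    yaw_error = (defined_yaw_deg - observed_yaw_deg + 180.0) % 360.0 - 180.0
    heading_deg = (heading_deg + k_yaw * yaw_error) % 360.0
    return altitude_m, heading_deg
```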
  • the above-described surroundings images 1 to 3 can be appropriately selected. If the work machine 40 is loaded with the surroundings image pickup device 413 , for example, the surroundings image 1 can be adopted.
  • if the work assistance server 10 and the remote operation apparatus 20 can acquire, via the network NW, the picked-up image picked up by the installation camera 70 installed at the work site where the work machine 40 is arranged, the surroundings image 2 can be adopted.
  • if the picked-up image picked up by the image pickup device 813 of the unmanned aircraft 80 can be acquired, the surroundings image 3 can be adopted.
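The selection among surroundings images 1 to 3 is essentially a priority check on the available equipment. A paraphrase in code, with assumed argument names, might look like this:

```python
def choose_surroundings_image_source(machine_has_surroundings_cameras: bool,
                                     usable_installation_cameras: int,
                                     unmanned_aircraft_available: bool) -> str:
    """Pick which kind of surroundings image (1, 2, or 3) to generate."""
    if machine_has_surroundings_cameras:
        return "surroundings image 1 (on-machine surroundings image pickup device 413)"
    if usable_installation_cameras > 0:
        return "surroundings image 2 (installation cameras 70)"
    if unmanned_aircraft_available:
        return "surroundings image 3 (unmanned aircraft 80)"
    raise RuntimeError("no image source available for the surroundings image")
```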
  • by confirming the surroundings image, the operator OP can confirm safety, for example, whether or not a vehicle or a field worker exists around the work machine 40 .
  • next, operator information is transmitted to the work assistance server 10 via the remote wireless communication equipment 223 ( FIG. 4 A /STEP 213 ).
  • the “operator information” is information by which it can be determined whether or not the operator OP has confirmed the surroundings image outputted to the image output device 221 .
  • for example, the operator information is information indicating that the surroundings image has been displayed on the image output device 221 for a predetermined time period (e.g., 10 seconds). If this information is transmitted and the work assistance server 10 receives it via the server wireless communication equipment 125 ( FIG. 4 A /C 12 ), the second assistance processing element 122 determines whether or not a work permission condition is satisfied ( FIG. 4 A /STEP 114 ).
  • if the surroundings image has not been displayed for the predetermined time period, the second assistance processing element 122 determines that the work permission condition is not satisfied ( FIG. 4 A /NO in STEP 114 ), and generates the surroundings image again ( FIG. 4 A /STEP 112 ), to update the surroundings image.
  • if the surroundings image has been displayed for the predetermined time period, the second assistance processing element 122 determines that the work permission condition is satisfied ( FIG. 4 A /YES in STEP 114 ), and transmits an operation receiving permission signal to the work machine 40 via the server wireless communication equipment 125 ( FIG. 4 A /STEP 115 ).
  • alternatively, the operator information may be an image capturing an operation of the operator OP confirming the surroundings image outputted to the image output device 221 .
  • the remote operation apparatus 20 picks up an image of the operator OP who sits on the seat St using the worker image pickup device 222 , and transmits the image as an operator image to the work assistance server 10 via the remote wireless communication equipment 223 ( FIG. 4 A /STEP 213 ).
  • if the work assistance server 10 receives the operator image via the server wireless communication equipment 125 , the second assistance processing element 122 determines whether or not the work permission condition is satisfied ( FIG. 4 A /STEP 114 ).
  • the second assistance processing element 122 analyzes the operator image, and specifies an operation of the operator. For example, if it is detected that the eye line and the face of the operator OP are oriented toward the image output device 221 and that the operator OP has pointed at the image output device 221 to confirm it, the second assistance processing element 122 determines that the work permission condition is satisfied ( FIG. 4 A /YES in STEP 114 ). The second assistance processing element 122 then transmits the operation receiving permission signal to the work machine 40 via the server wireless communication equipment 125 ( FIG. 4 A /STEP 115 ).
  • the second assistance processing element 122 may perform control to output a warning sound or the like from a speaker of the remote operation apparatus 20 if, as a result of the analysis of the operator image, a state where the work permission condition is not satisfied ( FIG. 4 A /NO in STEP 114 ) continues for a predetermined time period or more.
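The operator-image variant of the work permission condition could be organized as below, with the gaze and pointing detectors treated as black boxes supplied by the caller (the text does not specify how they are implemented); the warning path mirrors the speaker output mentioned above. Function and parameter names are assumptions.

```python
def evaluate_operator_confirmation(frames, gaze_on_display, pointing_at_display,
                                   warn_after_s: float = 15.0, warn=print):
    """Decide the work permission condition from analyzed operator images.

    frames              : iterable of (timestamp, frame) pairs from the worker
                          image pickup device 222
    gaze_on_display     : callable(frame) -> bool, assumed face/eye-line detector
    pointing_at_display : callable(frame) -> bool, assumed pointing-gesture detector
    """
    unsatisfied_since = None
    for ts, frame in frames:
        if gaze_on_display(frame) and pointing_at_display(frame):
            # Condition satisfied; the caller would send the operation receiving permission signal.
            return True
        if unsatisfied_since is None:
            unsatisfied_since = ts
        if ts - unsatisfied_since >= warn_after_s:
            warn("surroundings image not yet confirmed; please check the display")
            unsatisfied_since = ts  # avoid repeating the warning on every frame
    return False
```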
  • the work machine 40 starts, if it receives the operation receiving permission signal through the actual machine wireless communication equipment 422 ( FIG. 4 A /C 41 ), to receive a remote operation by the remote operation apparatus 20 ( FIG. 4 A /STEP 412 ).
  • the work assistance server 10 transmits a picked-up image switching instruction to the work machine 40 via the server wireless communication equipment 125 ( FIG. 4 B /STEP 120 ).
  • the “picked-up image switching instruction” is an instruction to display on the image output device 221 an image to be used when the remote operation apparatus 20 remotely operates the work machine 40 .
  • if the work machine 40 receives the picked-up image switching instruction, the actual machine control device 400 acquires an actual machine picked-up image from the actual machine image pickup device 412 ( FIG. 4 B /STEP 421 ). Then, the actual machine control device 400 transmits actual machine picked-up image data representing the actual machine picked-up image to the work assistance server 10 ( FIG. 4 B /STEP 422 ).
  • if the work assistance server 10 receives the actual machine picked-up image data, the second assistance processing element 122 transmits the actual machine picked-up image data to the remote operation apparatus 20 ( FIG. 4 B /STEP 121 ).
  • if the remote operation apparatus 20 receives the actual machine picked-up image data via the remote wireless communication equipment 223 ( FIG. 4 B /C 21 ), the remote control device 200 outputs a work environment image corresponding to the actual machine picked-up image data to the image output device 221 ( FIG. 4 B /STEP 220 ).
  • for example, a work environment image in which the boom 441 , the arm 443 , and the bucket 445 , which are parts of the actuation mechanism 440 , are reflected is outputted to the image output device 221 , as illustrated in FIG. 10 .
  • the remote control device 200 recognizes an operation mode of the remote operation mechanism 211 ( FIG. 4 B /STEP 221 ). Further, the remote operation apparatus 20 transmits a remote operation command corresponding to the operation mode to the work assistance server 10 via the remote wireless communication equipment 223 ( FIG. 4 B /STEP 222 ).
  • if the work assistance server 10 receives the remote operation command, the first assistance processing element 121 transmits the remote operation command to the work machine 40 ( FIG. 4 B /STEP 122 ).
  • the work machine 40 controls, if the actual machine control device 400 receives the remote operation command via the actual machine wireless communication equipment 422 ( FIG. 4 B /C 43 ), an operation of the actuation mechanism 440 or the like ( FIG. 4 B /STEP 423 ). As a result, the work machine 40 starts to be remotely operated. For example, work for scooping soil in front of the work machine 40 using the bucket 445 and dropping the soil from the bucket 445 after turning the upper turning body 435 is performed.
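For completeness, the relay of the operation mode from the remote operation mechanism 211 through the server to the work machine 40 (STEPS 222, 122, and 423 above) can be pictured as a small forwarding function gated by the operation receiving permission. The field names and the normalized lever strokes are assumptions, not part of the source text.

```python
from dataclasses import dataclass


@dataclass
class RemoteOperationCommand:
    """Operation mode of the remote operation mechanism 211, as normalized lever strokes."""
    machine_id: str
    boom: float          # -1.0 (lower) .. +1.0 (raise)
    arm: float
    bucket: float
    turning: float
    travel_left: float
    travel_right: float


def relay_command(cmd: RemoteOperationCommand, send_to_machine, permitted: bool) -> bool:
    """Server side of STEP 122: forward the remote operation command to the work
    machine only while the operation receiving permission is in effect."""
    if not permitted:
        return False
    send_to_machine(cmd.machine_id, cmd)  # assumed transport over the network NW
    return True
```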
  • when the work ends, the remote operation apparatus 20 transmits a work end signal to the work assistance server 10 ( FIG. 4 B /STEP 223 ).
  • if the work assistance server 10 receives the work end signal via the server wireless communication equipment 125 ( FIG. 4 B /C 16 ), it transmits the work end signal to the work machine 40 ( FIG. 4 B /STEP 123 ). At this time, the work assistance server 10 may issue a work end instruction not only to the image pickup device that is currently transmitting a picked-up image but also to all the image pickup devices.
  • the work machine 40 stops, if it receives the work end signal ( FIG. 4 B /C 44 ), image pickup by the image pickup device and driving of the work machine 40 ( FIG. 4 B /STEP 424 ).
  • the remote operation apparatus 20 stops outputting the picked-up image ( FIG. 4 B /STEP 224 ). As a result, display of the picked-up image on the image output device 221 ends.
  • the operator OP can operate the plurality of work machines 40 from the remote operation apparatus 20 , and may switch the work machine 40 .
  • in that case, the work assistance system 1 may again perform the processing illustrated in FIGS. 4 A and 4 B , described above. With the foregoing, a series of processes of the work assistance server 10 (the first function and the second function) ends.
  • as described above, the work assistance server 10 uses the picked-up images respectively picked up by the actual machine image pickup device 412 and the surroundings image pickup device 413 in the work machine 40 , the installation camera 70 , and the like to display a picked-up image of the region around the work machine 40 on the image output device 221 .
  • the operator OP can grasp not only a situation around the work machine 40 but also a relative positional relationship with an obstacle, for example, and reliably perform work. Time and effort required for the operator OP himself/herself to switch the image pickup device are also saved.
  • the second assistance processing element preferably permits the work by the work machine based on information about confirming the surroundings image displayed on the display device by an operator who operates the remote operation apparatus.
  • the second assistance processing element permits the remote operation apparatus to operate the work machine based on the information about confirming the surroundings image displayed on the display device by the operator.
  • the work assistance server can permit the operation of the work machine in a situation where the possibility that the operator has confirmed the surroundings image is high.
  • preferably, the information is information about a time period during which the surroundings image is displayed on the display device, and the second assistance processing element permits the work by the work machine based on the surroundings image being displayed on the display device for a predetermined time period.
  • the second assistance processing element can estimate that the possibility that the operator has confirmed the surroundings image is high based on the surroundings image being displayed on the display device for the predetermined time period and permit the work by the work machine.
  • preferably, the information is information about a detection result of a confirmation operation by the operator, and the second assistance processing element permits the work by the work machine based on the confirmation operation having been detected for the surroundings image displayed on the display device.
  • the second assistance processing element can permit the work by the work machine in a situation where the confirmation operation has been detected for the surroundings image displayed on the display device, which means that the possibility that the operator has reliably confirmed the surroundings image is high.
  • the first assistance processing element preferably generates the surroundings image based on a picked-up image picked up by the surroundings image pickup device that is provided in the work machine and picks up an image of the surroundings of the work machine.
  • the first assistance processing element can generate the surroundings image based on the picked-up image picked up by the surroundings image pickup device provided in the work machine. Then, the surroundings image is displayed on the display device.
  • the work assistance server can use the surroundings image to permit the operator to perform the work.
  • the first assistance processing element preferably grasps respective positions and image pickup directions of a plurality of installation cameras arranged at a work site where the work machine is positioned, selects the installation camera that captures the work machine based on the respective positions and image pickup directions of the installation cameras and the position of the work machine, and generates the surroundings image based on a picked-up image picked up by the selected installation camera.
  • the first assistance processing element can generate the surroundings image based on the picked-up image picked up by the installation camera arranged at the work site.
  • the work assistance server can generate the surroundings image even when the work machine comprises no surroundings image pickup device.
  • the first assistance processing element preferably generates the surroundings image based on a picked-up image picked up by an image pickup device provided in an aircraft that can fly above the work machine.
  • the first assistance processing element can generate the surroundings image based on the picked-up image picked up by the image pickup device provided in the aircraft that flies above the work machine.
  • the work assistance server can generate the surroundings image even when the work machine comprises no surroundings image pickup device and when there is no appropriate installation camera.
  • 430 . . . turning mechanism, 435 . . . upper turning body, 440 . . . actuation mechanism (work attachment), 445 . . . bucket, 460 . . . positioning device, 713 . . . image pickup device, 723 . . . communication equipment, 813 . . . image pickup device, 823 . . . communication equipment, 2110 . . . traveling lever, 2111 . . . right-side operation lever, 2112 . . . left-side operation lever.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mining & Mineral Resources (AREA)
  • Civil Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Structural Engineering (AREA)
  • Component Parts Of Construction Machinery (AREA)
  • Operation Control Of Excavators (AREA)
  • Selective Calling Equipment (AREA)
US17/794,655 2020-03-13 2021-01-25 Work assistance server and work assistance method Pending US20230092296A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2020-043744 2020-03-13
JP2020043744A JP2021143541A (ja) 2020-03-13 2020-03-13 作業支援サーバ、作業支援方法
PCT/JP2021/002418 WO2021181916A1 (ja) 2020-03-13 2021-01-25 作業支援サーバ、作業支援方法

Publications (1)

Publication Number Publication Date
US20230092296A1 true US20230092296A1 (en) 2023-03-23

Family

ID=77671352

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/794,655 Pending US20230092296A1 (en) 2020-03-13 2021-01-25 Work assistance server and work assistance method

Country Status (5)

Country Link
US (1) US20230092296A1 (ja)
EP (1) EP4079978A4 (ja)
JP (1) JP2021143541A (ja)
CN (1) CN115244254A (ja)
WO (1) WO2021181916A1 (ja)

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1028265A (ja) * 1996-07-11 1998-01-27 Hitachi Constr Mach Co Ltd 遠隔操縦作業機械の作業現場モニタ装置
JP6456584B2 (ja) * 2013-03-19 2019-01-23 住友重機械工業株式会社 作業機械用周辺監視装置
JPWO2015125979A1 (ja) * 2015-04-28 2018-02-15 株式会社小松製作所 作業機械の周辺監視装置及び作業機械の周辺監視方法
DE112015006345T5 (de) * 2015-07-31 2017-12-07 Komatsu Ltd. Anzeigesystem einer arbeitsmaschine und anzeigeverfahren einer arbeitsmaschine
JP2017102604A (ja) * 2015-11-30 2017-06-08 住友重機械工業株式会社 作業機械用周辺監視システム
JP7023714B2 (ja) 2015-12-28 2022-02-22 住友建機株式会社 ショベル
CN108699814B (zh) * 2016-01-29 2022-04-12 住友建机株式会社 挖土机以及在挖土机的周围飞行的自主式飞行体
CN205894129U (zh) * 2016-07-25 2017-01-18 山推工程机械股份有限公司 一种遥控推土机无线视频监控系统
JP6866155B2 (ja) * 2016-12-28 2021-04-28 株式会社小松製作所 作業車両および作業車両の制御システム
CN111344460B (zh) * 2017-12-04 2022-04-26 住友重机械工业株式会社 周边监视装置
JP7087545B2 (ja) * 2018-03-28 2022-06-21 コベルコ建機株式会社 建設機械
CN112567102B (zh) * 2018-08-10 2023-04-25 住友建机株式会社 挖土机
JP2019060228A (ja) * 2018-11-01 2019-04-18 住友重機械工業株式会社 作業機械用周辺監視装置
JP7338514B2 (ja) * 2020-03-04 2023-09-05 コベルコ建機株式会社 作業支援サーバ、作業支援方法

Also Published As

Publication number Publication date
WO2021181916A1 (ja) 2021-09-16
EP4079978A4 (en) 2023-06-14
CN115244254A (zh) 2022-10-25
JP2021143541A (ja) 2021-09-24
EP4079978A1 (en) 2022-10-26


Legal Events

Date Code Title Description
AS Assignment

Owner name: KOBELCO CONSTRUCTION MACHINERY CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SASAKI, HITOSHI;SAIKI, SEIJI;YAMAZAKI, YOICHIRO;SIGNING DATES FROM 20220526 TO 20220530;REEL/FRAME:060588/0408

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED