WO2020194882A1 - Remote operation system and remote operation server - Google Patents
Remote operation system and remote operation server
- Publication number
- WO2020194882A1 (PCT/JP2019/047061)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- remote control
- image
- control device
- line
- sight
- Prior art date
Links
- 238000001514 detection method Methods 0.000 claims abstract description 19
- 238000003384 imaging method Methods 0.000 claims abstract description 9
- 238000004891 communication Methods 0.000 claims description 24
- 230000006870 function Effects 0.000 description 9
- 230000004044 response Effects 0.000 description 5
- 230000007613 environmental effect Effects 0.000 description 4
- 239000010720 hydraulic oil Substances 0.000 description 3
- 238000000034 method Methods 0.000 description 2
- 239000002689 soil Substances 0.000 description 2
- 238000010276 construction Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 210000003128 head Anatomy 0.000 description 1
- 210000001747 pupil Anatomy 0.000 description 1
- 230000011514 reflex Effects 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/20—Drives; Control devices
- E02F9/2025—Particular purposes of control systems not otherwise provided for
- E02F9/205—Remotely operated machines, e.g. unmanned vehicles
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/26—Indicating devices
- E02F9/261—Surveying the work-site to be treated
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/26—Indicating devices
- E02F9/264—Sensors and their calibration for indicating the position of the work tool
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0038—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C23/00—Non-electrical signal transmission systems, e.g. optical systems
- G08C23/04—Non-electrical signal transmission systems, e.g. optical systems using light waves, e.g. infrared
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C2201/00—Transmission systems of control signals via wireless link
- G08C2201/30—User interface
Definitions
- the present invention relates to a system for remotely controlling a work machine or the like.
- Patent Document 1 A technique for remotely controlling a work machine has been proposed (see, for example, Patent Document 1).
- from the viewpoint of improving skill, it is preferable that an operator who is less skilled in remote control of the work machine can grasp how another, more skilled operator remotely operates the work machine.
- an object of the present invention is to provide a system capable of providing such an operator with information appropriate for grasping how another operator remotely operates the work machine.
- the present invention relates to a remote control system comprising: a work machine having an operating mechanism and an imaging device that images an environment including at least a part of the operating mechanism; and a first remote control device and a second remote control device, each having a wireless communication device, an image output device that displays an environment image corresponding to captured image data acquired by the imaging device of the work machine, and an operation mechanism for remotely operating the work machine.
- the first remote control device includes a line-of-sight detector that detects the line of sight of an operator, and a first client control device that causes the wireless communication device to transmit line-of-sight detection data corresponding to the operator's line of sight detected by the line-of-sight detector.
- the present invention also relates to a remote operation server having a function of intercommunicating, using a wireless communication function, with a work machine having an operating mechanism and an imaging device that images an environment including at least a part of the operating mechanism, and with each of a first remote control device and a second remote control device having an image output device that displays an environment image corresponding to captured image data acquired by the imaging device of the work machine and an operation mechanism for remotely operating the work machine.
- the remote control server of the present invention includes a first server arithmetic processing element that receives, from the first remote control device, line-of-sight detection data corresponding to the operator's line of sight detected by the first remote control device, and a second server arithmetic processing element that, by transmitting the line-of-sight detection data to the second remote control device, causes the image output device of the second remote control device to display a designated image area, which spreads in the environment image with reference to the operator's line of sight according to the line-of-sight detection data, in a form different from that of the surrounding image area.
- Explanatory drawing of the configuration of the remote control system as an embodiment of the present invention.
- Explanatory drawing of the configuration of the work machine.
- Explanatory drawing of the configuration of the first remote control device.
- Explanatory drawing of the functions of the remote control system as an embodiment of the present invention.
- Explanatory drawing of the image output mode in the first image output device.
- Explanatory drawing of the first mode of image output in the second image output device.
- Explanatory drawing of the third mode of image output in the second image output device.
- Explanatory drawing of the fourth mode of image output in the second image output device.
- Explanatory drawing of a remote control system as another embodiment of the present invention.
- the remote control system as an embodiment of the present invention shown in FIG. 1 includes a first remote control device 10, a second remote control device 20, and a work machine 40.
- the entity that remotely operates the common work machine 40 can be switched between the first remote control device 10 and the second remote control device 20.
- the work machine 40 includes a work machine control device 400, an image pickup device 401, a wireless communication device 402, and an operation mechanism 440.
- the work machine control device 400 is composed of an arithmetic processing unit (a single-core processor, a multi-core processor, or a processor core constituting such a processor), reads necessary data and software from a storage device such as a memory, and executes arithmetic processing on the data in accordance with the software.
- the work machine 40 is, for example, a crawler excavator (construction machine), and as shown in FIG. 2, includes a crawler-type lower traveling body 410 and an upper swing body 420 mounted on the lower traveling body 410 so as to be able to swivel via a swivel mechanism 430. A cab (driver's cab) 422 is provided on the front left side of the upper swing body 420. A work attachment 440 is provided at the front center portion of the upper swing body 420.
- a crawler excavator construction machine
- the work attachment 440 as the operating mechanism includes a boom 441 attached to the upper swing body 420 so as to be able to rise and fall, an arm 443 rotatably connected to the tip of the boom 441, and a bucket 445 rotatably connected to the tip of the arm 443.
- the work attachment 440 is equipped with a boom cylinder 442, an arm cylinder 444, and a bucket cylinder 446, each configured as a telescopic hydraulic cylinder.
- the boom cylinder 442 is interposed between the boom 441 and the upper swing body 420 so as to expand and contract when supplied with hydraulic oil, thereby rotating the boom 441 in the rising and falling direction.
- the arm cylinder 444 is interposed between the arm 443 and the boom 441 so as to expand and contract when supplied with hydraulic oil, thereby rotating the arm 443 about a horizontal axis with respect to the boom 441.
- the bucket cylinder 446 is interposed between the bucket 445 and the arm 443 so as to expand and contract when supplied with hydraulic oil, thereby rotating the bucket 445 about a horizontal axis with respect to the arm 443.
- the image pickup device 401 is installed, for example, inside the cab 422 and images an environment including at least a part of the operating mechanism 440 through the front window of the cab 422.
- actual-machine-side operation levers corresponding to the operation levers (described later) of the first remote control device 10, together with a drive mechanism or robot that receives signals corresponding to the operation mode of each operation lever from the remote control room and moves the actual-machine-side levers based on the received signals, are provided in the cab 422.
- the first remote control device 10 includes a first client control device 100, a first input interface 110, and a first output interface 120.
- the first client control device 100 is composed of an arithmetic processing unit (a single-core processor, a multi-core processor, or a processor core constituting such a processor), reads necessary data and software from a storage device such as a memory, and executes arithmetic processing on the data in accordance with the software.
- the first input interface 110 includes a first operation mechanism 111 and a line-of-sight detector 112.
- the first output interface 120 includes a first image output device 121 and a first wireless communication device 122.
- the first operation mechanism 111 includes a traveling operation device, a turning operation device, a boom operation device, an arm operation device, and a bucket operation device.
- Each operating device has an operating lever that receives a rotation operation.
- the operation lever (travel lever) of the travel operation device is operated to move the lower traveling body 410.
- the travel lever may also serve as a travel pedal.
- a traveling pedal fixed to the base or the lower end of the traveling lever may be provided.
- the operation lever (swivel lever) of the swivel operation device is operated to move the hydraulic swivel motor constituting the swivel mechanism 430.
- the operating lever (boom lever) of the boom operating device is operated to move the boom cylinder 442.
- the operating lever (arm lever) of the arm operating device is operated to move the arm cylinder 444.
- the operating lever (bucket lever) of the bucket operating device is operated to move the bucket cylinder 446.
- Each operating lever constituting the first operating mechanism 111 is arranged around the seat 1100 for the operator to sit on, for example, as shown in FIG.
- the seat 1100 is in the form of a high back chair with armrests, but may be in any form in which the operator can sit, such as a low back chair without a headrest or a chair without a backrest.
- a pair of left and right traveling levers 1110 corresponding to the left and right crawlers are arranged side by side in front of the seat 1100.
- One operating lever may also serve as a plurality of operating levers.
- for example, the right operating lever 1111 provided in front of the right frame of the seat 1100 shown in FIG. 3 may function as a boom lever when operated in the front-rear direction and as a bucket lever when operated in the left-right direction.
- similarly, the left operating lever 1112 provided in front of the left frame of the seat 1100 shown in FIG. 3 may function as an arm lever when operated in the front-rear direction and as a swivel lever when operated in the left-right direction.
- the lever pattern may be arbitrarily changed according to the operation instruction of the operator.
- the first image output device 121 is composed of a right diagonally forward image output device 1211, a forward image output device 1212, and a left diagonally forward image output device 1213 arranged diagonally forward right, forward, and diagonally forward left of the seat 1100, respectively.
- the image output devices 1211 to 1213 may further include a speaker (audio output device).
- the line-of-sight detector 112 detects the line of sight of the operator seated on the seat 1100 based on the position of a moving point (moving part) of the operator's eye relative to a reference point (non-moving part).
- the line-of-sight detector 112 is composed of one or a plurality of visible light cameras.
- the line-of-sight detector 112 is composed of one or more sets of an infrared LED and an infrared camera.
- the position and posture of the line-of-sight detector 112 may be changed to the optimum position and posture for capturing the operator's line of sight.
- switching may be performed to the one of a plurality of line-of-sight detectors 112 that is best suited for capturing the operator's line of sight.
- pixel regions corresponding to the reference points and moving points of the eyes of the first operator are defined.
- the vector (start point position and end point position) representing the operator's line of sight in the real space is defined.
- An additional ranging sensor may be used to determine the real space position of the eye.
- the intersection with the vector is defined as the center of the region of interest in the display image to which the operator's line of sight is directed.
- the second remote control device 20 includes a second client control device 200, a second input interface 210, and a second output interface 220.
- the second client control device 200 is composed of an arithmetic processing unit (single-core processor or multi-core processor or a processor core constituting the same), reads necessary data and software from a storage device such as a memory, and targets the data. Performs arithmetic processing according to the software.
- the second input interface 210 includes a second operation mechanism 211.
- the second output interface 220 includes a second image output device 221 and a second wireless communication device 222.
- the detailed configuration of the second remote control device 20 is substantially the same as that of the first remote control device 10, except that the line-of-sight detector 112 is omitted and the functions of the second client control device 200 differ as described later; its description is therefore omitted (see FIG. 3).
- a predetermined operation is performed by the first operator (FIG. 4 / STEP102).
- the predetermined operation is, for example, the operation of the button or the operation lever constituting the first input interface 110 or the first operation mechanism 111.
- the first client control device 100 transmits an operation start command from the first remote control device 10 to the work machine 40 through the first wireless communication device 122 (FIG. 4 / STEP104).
- a predetermined operation is performed by the second operator (FIG. 4 / STEP202).
- the predetermined operation is, for example, the operation of the button or the operation lever constituting the second input interface 210 or the second operation mechanism 211.
- the second client control device 200 transmits an operation start command from the second remote control device 20 to the work machine 40 through the second wireless communication device 222 (FIG. 4 / STEP204).
- the work machine control device 400 receives the operation start command through the wireless communication device 402 (FIG. 4 / STEP402). In response to this, the work machine control device 400 outputs a command to the image pickup device 401, and the image pickup device 401 acquires the captured image in response to the command (FIG. 4 / STEP404). The work machine control device 400 transmits the captured image data representing the captured image to the first remote control device 10 and the second remote control device 20 through the wireless communication device 402 (FIG. 4 / STEP406).
- the captured image data is received by the first client control device 100 through the first wireless communication device 122 (FIG. 4 / STEP106).
- the first client control device 100 displays an environment image (all or part of the captured image itself or a simulated environment image generated based on the captured image) on the first image output device 121 according to the captured image data. (Fig. 4 / STEP108).
- the second client control device 200 receives the captured image data through the second wireless communication device 222 (FIG. 4 / STEP206).
- the second client control device 200 displays an environment image corresponding to the captured image data on the second image output device 221 (FIG. 4 / STEP208).
- an environment image including the boom 441, the arm 443, the bucket 445, and the arm cylinder 444, which are part of the work attachment 440 as the operating mechanism, is displayed on each of the first image output device 121 and the second image output device 221.
- the first operation mechanism 111 is operated by the first operator (FIG. 4 / STEP110), and in response, the first client control device 100 transmits an operation command corresponding to the operation mode to the work machine 40 through the first wireless communication device 122 (FIG. 4 / STEP112).
- the work machine control device 400 receives an operation command through the wireless communication device 402 (FIG. 4 / STEP408).
- the operation of the work attachment 440 and the like is controlled by the work machine control device 400 (FIG. 4 / STEP410).
- the bucket 445 scoops soil in front of the work machine 40, the upper swing body 420 is swiveled, and the soil is then dropped from the bucket 445.
- the line of sight of the first operator is detected by the line of sight detector 112 (FIG. 4 / STEP114).
- the region of interest of the first operator in the environment image displayed on the first image output device 121 is identified.
- the first client control device 100 transmits the line-of-sight detection data according to the line-of-sight detection result to the second remote control device 20 through the first wireless communication device 122 (FIG. 4 / STEP116).
- the second client control device 200 receives the line-of-sight detection data through the second wireless communication device 222 (FIG. 4 / STEP210).
- the second client control device 200 displays, on the second image output device 221, the image area specified by the line-of-sight detection data with greater emphasis than the other image areas (FIG. 4 / STEP212). For example, when the line of sight of the first operator, and thus the region of interest in the environment image, falls on the image area corresponding to the bucket 445, that image area is highlighted as a "designated image area" so as to stand out from the surrounding image area.
- a figure such as a pointer may be superimposed and displayed on the designated image area S including the image area corresponding to the bucket 445.
- a figure such as a circle figure surrounding the designated image area S corresponding to the bucket 445 may be displayed.
- the image quality of the rectangular designated image area S including the image area corresponding to the bucket 445 may be higher than the image quality of the other image areas.
- the designated image area S may be displayed as a color image while the other image area may be displayed as a grayscale image.
- the designated image area S may be displayed as an image having a higher resolution than other image areas.
- the brightness of the rectangular designated image area S including the image area corresponding to the bucket 445 may be higher than the brightness of the other image areas.
- the second operator, who operates the work machine 40 through the second remote control device 20, can thereby be provided with appropriate information for grasping how the work machine 40 is remotely operated by the first operator through the first remote control device 10: namely, the designated image area displayed with greater emphasis than the surrounding image area in the environment image displayed on the second image output device 221 (see FIG. 4 / STEP212 and FIGS. 6A-6D).
- the second operator can recognize how the first operator moves or operates the operation levers through visual recognition of the designated image area highlighted in the environment image displayed on the second image output device 221.
- in the above embodiment, the first remote control device 10, the second remote control device 20, and the work machine 40 communicate directly with one another according to a wireless communication method.
- as another embodiment, the first remote control device 10, the second remote control device 20, and the work machine 40 may communicate indirectly with one another via the remote operation server 30 shown in FIG. 7.
- the remote control server 30 includes a first server arithmetic processing element 31 and a second server arithmetic processing element 32.
- the first server arithmetic processing element 31 receives the line-of-sight detection data according to the line-of-sight of the operator detected by the first remote control device 10 from the first remote control device 10.
- the second server arithmetic processing element 32, by transmitting the line-of-sight detection data to the second remote control device 20, causes the second image output device 221 to display the designated image area in a form different from that of the surrounding image area (see FIGS. 6A to 6D).
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Mining & Mineral Resources (AREA)
- Civil Engineering (AREA)
- Structural Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Signal Processing (AREA)
- Theoretical Computer Science (AREA)
- Automation & Control Theory (AREA)
- Human Computer Interaction (AREA)
- Aviation & Aerospace Engineering (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Ophthalmology & Optometry (AREA)
- Computer Networks & Wireless Communication (AREA)
- Selective Calling Equipment (AREA)
- Operation Control Of Excavators (AREA)
- Component Parts Of Construction Machinery (AREA)
- Closed-Circuit Television Systems (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The present invention provides a system capable of providing one operator with information appropriate for ascertaining how a work machine is remotely operated by another operator. In a first remote operation device 10, the gaze of a first operator is detected by a gaze detector 112, and gaze detection data corresponding to that gaze is transmitted. In a second remote operation device 20, a designated image area of the environment image corresponding to the captured image data acquired by an imaging device 401 of a work machine 40, the area spreading with reference to the gaze of the first operator according to the gaze detection data, is displayed on a second image output device 221 in a manner different from that of the surrounding image areas.
Description
The present invention relates to a system for remotely controlling a work machine or the like.
A technique for remotely controlling a work machine has been proposed (see, for example, Patent Document 1).
However, from the viewpoint of improving skill, it is preferable that an operator who is less skilled in remote control of a work machine can grasp how another, more skilled operator remotely operates the work machine.
Therefore, an object of the present invention is to provide a system capable of providing such an operator with information appropriate for grasping how another operator remotely operates the work machine.
The present invention relates to a remote control system comprising: a work machine having an operating mechanism and an imaging device that images an environment including at least a part of the operating mechanism; and a first remote control device and a second remote control device, each having a wireless communication device, an image output device that displays an environment image corresponding to captured image data acquired by the imaging device of the work machine, and an operation mechanism for remotely operating the work machine.
In the remote control system of the present invention, the first remote control device includes a line-of-sight detector that detects the line of sight of an operator, and a first client control device that causes the wireless communication device to transmit line-of-sight detection data corresponding to the operator's line of sight detected by the line-of-sight detector. The second remote control device includes a second client control device that causes its image output device to display, in the environment image, a designated image area that spreads with reference to the operator's line of sight according to the line-of-sight detection data received by the wireless communication device, in a form different from that of the surrounding image area.
The present invention also relates to a remote operation server having a function of intercommunicating, using a wireless communication function, with a work machine having an operating mechanism and an imaging device that images an environment including at least a part of the operating mechanism, and with each of a first remote control device and a second remote control device having an image output device that displays an environment image corresponding to captured image data acquired by the imaging device of the work machine and an operation mechanism for remotely operating the work machine.
The remote operation server of the present invention includes a first server arithmetic processing element that receives, from the first remote control device, line-of-sight detection data corresponding to the operator's line of sight detected by the first remote control device, and a second server arithmetic processing element that, by transmitting the line-of-sight detection data to the second remote control device, causes the image output device of the second remote control device to display a designated image area, which spreads in the environment image with reference to the operator's line of sight according to the line-of-sight detection data, in a form different from that of the surrounding image area.
(Configuration)
The remote control system as an embodiment of the present invention shown in FIG. 1 includes a first remote control device 10, a second remote control device 20, and a work machine 40. The entity that remotely operates the common work machine 40 can be switched between the first remote control device 10 and the second remote control device 20.
(Configuration of the work machine)
The work machine 40 includes a work machine control device 400, an image pickup device 401, a wireless communication device 402, and an operating mechanism 440. The work machine control device 400 is composed of an arithmetic processing unit (a single-core processor, a multi-core processor, or a processor core constituting such a processor), reads necessary data and software from a storage device such as a memory, and executes arithmetic processing on the data in accordance with the software.
The work machine 40 is, for example, a crawler excavator (construction machine), and as shown in FIG. 2, includes a crawler-type lower traveling body 410 and an upper swing body 420 mounted on the lower traveling body 410 so as to be able to swivel via a swivel mechanism 430. A cab (driver's cab) 422 is provided on the front left side of the upper swing body 420. A work attachment 440 is provided at the front center portion of the upper swing body 420.
The work attachment 440 as the operating mechanism includes a boom 441 attached to the upper swing body 420 so as to be able to rise and fall, an arm 443 rotatably connected to the tip of the boom 441, and a bucket 445 rotatably connected to the tip of the arm 443. The work attachment 440 is equipped with a boom cylinder 442, an arm cylinder 444, and a bucket cylinder 446, each configured as a telescopic hydraulic cylinder.
The boom cylinder 442 is interposed between the boom 441 and the upper swing body 420 so as to expand and contract when supplied with hydraulic oil, thereby rotating the boom 441 in the rising and falling direction. The arm cylinder 444 is interposed between the arm 443 and the boom 441 so as to expand and contract when supplied with hydraulic oil, thereby rotating the arm 443 about a horizontal axis with respect to the boom 441. The bucket cylinder 446 is interposed between the bucket 445 and the arm 443 so as to expand and contract when supplied with hydraulic oil, thereby rotating the bucket 445 about a horizontal axis with respect to the arm 443.
The image pickup device 401 is installed, for example, inside the cab 422 and images an environment including at least a part of the operating mechanism 440 through the front window of the cab 422.
The cab 422 is provided with actual-machine-side operation levers corresponding to the operation levers (described later) constituting the first remote control device 10, and with a drive mechanism or robot that receives signals corresponding to the operation mode of each operation lever from the remote control room and moves the actual-machine-side levers based on the received signals.
(Configuration of the first remote control device)
The first remote control device 10 includes a first client control device 100, a first input interface 110, and a first output interface 120. The first client control device 100 is composed of an arithmetic processing unit (a single-core processor, a multi-core processor, or a processor core constituting such a processor), reads necessary data and software from a storage device such as a memory, and executes arithmetic processing on the data in accordance with the software. The first input interface 110 includes a first operation mechanism 111 and a line-of-sight detector 112. The first output interface 120 includes a first image output device 121 and a first wireless communication device 122.
The first operation mechanism 111 includes a traveling operation device, a swivel operation device, a boom operation device, an arm operation device, and a bucket operation device. Each operation device has an operation lever that receives a rotating operation. The operation lever (travel lever) of the traveling operation device is operated to move the lower traveling body 410. The travel lever may also serve as a travel pedal; for example, a travel pedal fixed to the base or lower end of the travel lever may be provided. The operation lever (swivel lever) of the swivel operation device is operated to move the hydraulic swivel motor constituting the swivel mechanism 430. The operation lever (boom lever) of the boom operation device is operated to move the boom cylinder 442. The operation lever (arm lever) of the arm operation device is operated to move the arm cylinder 444. The operation lever (bucket lever) of the bucket operation device is operated to move the bucket cylinder 446.
Each operation lever constituting the first operation mechanism 111 is arranged, for example as shown in FIG. 3, around a seat 1100 on which the operator sits. The seat 1100 is in the form of a high-back chair with armrests, but may take any form in which the operator can sit, such as a low-back chair without a headrest or a chair without a backrest.
A pair of left and right travel levers 1110 corresponding to the left and right crawlers are arranged side by side in front of the seat 1100. One operation lever may serve as a plurality of operation levers. For example, the right operation lever 1111 provided in front of the right frame of the seat 1100 shown in FIG. 3 may function as a boom lever when operated in the front-rear direction and as a bucket lever when operated in the left-right direction. Similarly, the left operation lever 1112 provided in front of the left frame of the seat 1100 shown in FIG. 3 may function as an arm lever when operated in the front-rear direction and as a swivel lever when operated in the left-right direction. The lever pattern may be changed arbitrarily according to an operation instruction from the operator.
As shown in FIG. 3, for example, the first image output device 121 is composed of a right diagonally forward image output device 1211, a forward image output device 1212, and a left diagonally forward image output device 1213 arranged diagonally forward right, forward, and diagonally forward left of the seat 1100, respectively. The image output devices 1211 to 1213 may further include speakers (audio output devices).
The line-of-sight detector 112 detects the line of sight of the operator seated on the seat 1100 based on the position of a moving point (a moving part) of the operator's eye relative to a reference point (a non-moving part). When the position of the inner corner of the eye is used as the reference point and the position of the iris as the moving point, the line-of-sight detector 112 is composed of one or more visible-light cameras. When the position of the corneal reflection is used as the reference point and the position of the pupil as the moving point, the line-of-sight detector 112 is composed of one or more sets of an infrared LED and an infrared camera. In addition to the movement of the operator's eyes, the movement of the operator's head may be detected, and the position and posture of the line-of-sight detector 112 may be changed to those best suited for capturing the operator's line of sight. In this case, switching may be performed to the one of a plurality of line-of-sight detectors 112 that is best suited for capturing the operator's line of sight.
In the captured image acquired by the line-of-sight detector 112, pixel regions corresponding to the reference point and the moving point of the first operator's eye are determined. By determining the real-space positions corresponding to these pixel regions, a vector (start point position and end point position) representing the operator's line of sight in real space is determined. A ranging sensor may additionally be used to determine the real-space position of the eye. On the plane representing the display image coordinate system of each of the image output devices 1211 to 1213 in real space, the intersection with this vector is determined as the center of the region of interest in the display image toward which the operator's line of sight is directed.
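The mapping from this real-space gaze vector to an on-screen region of interest amounts to a ray-plane intersection. The following Python fragment is a minimal sketch of that computation; the plane parameterization, screen resolution, and function name are assumptions made for illustration and are not taken from the patent.

```python
import numpy as np

def gaze_point_on_screen(eye_pos, gaze_dir, screen_origin, screen_x_axis,
                         screen_y_axis, screen_px=(1920, 1080)):
    """Intersect the gaze ray with a display plane and return pixel coordinates.

    All geometric inputs are 3-element NumPy float arrays in a common real-space frame:
    eye_pos is the start point of the gaze vector, gaze_dir its direction (end minus
    start), and screen_origin / screen_x_axis / screen_y_axis describe the top-left
    corner and the two in-plane edge vectors of the screen.
    Returns (u_px, v_px) or None when the gaze does not hit the screen.
    """
    normal = np.cross(screen_x_axis, screen_y_axis)
    denom = np.dot(normal, gaze_dir)
    if abs(denom) < 1e-9:                      # gaze parallel to the screen plane
        return None
    t = np.dot(normal, screen_origin - eye_pos) / denom
    if t <= 0:                                 # screen lies behind the operator
        return None
    hit = eye_pos + t * gaze_dir               # intersection point in real space
    # Express the hit point in normalized screen coordinates (0..1 on each axis).
    u = np.dot(hit - screen_origin, screen_x_axis) / np.dot(screen_x_axis, screen_x_axis)
    v = np.dot(hit - screen_origin, screen_y_axis) / np.dot(screen_y_axis, screen_y_axis)
    if not (0.0 <= u <= 1.0 and 0.0 <= v <= 1.0):
        return None                            # gaze falls outside this display
    return int(u * screen_px[0]), int(v * screen_px[1])
```

In a three-display arrangement such as the image output devices 1211 to 1213, the same test would be applied to each display plane, and the display for which a valid pixel position is returned would host the region of interest.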
(Configuration of the second remote control device)
The second remote control device 20 includes a second client control device 200, a second input interface 210, and a second output interface 220. The second client control device 200 is composed of an arithmetic processing unit (a single-core processor, a multi-core processor, or a processor core constituting such a processor), reads necessary data and software from a storage device such as a memory, and executes arithmetic processing on the data in accordance with the software. The second input interface 210 includes a second operation mechanism 211. The second output interface 220 includes a second image output device 221 and a second wireless communication device 222.
The detailed configuration of the second remote control device 20 is substantially the same as that of the first remote control device 10, except that the line-of-sight detector 112 is omitted and the functions of the second client control device 200 differ as described later; its description is therefore omitted (see FIG. 3).
(Functions)
In the first remote control device 10, a predetermined operation is performed by the first operator (FIG. 4 / STEP102). The predetermined operation is, for example, an operation of a button or an operation lever constituting the first input interface 110 or the first operation mechanism 111. In response, the first client control device 100 transmits an operation start command from the first remote control device 10 to the work machine 40 through the first wireless communication device 122 (FIG. 4 / STEP104).
Similarly, in the second remote control device 20, a predetermined operation is performed by the second operator (FIG. 4 / STEP202). The predetermined operation is, for example, an operation of a button or an operation lever constituting the second input interface 210 or the second operation mechanism 211. In response, the second client control device 200 transmits an operation start command from the second remote control device 20 to the work machine 40 through the second wireless communication device 222 (FIG. 4 / STEP204).
In the work machine 40, the work machine control device 400 receives the operation start command through the wireless communication device 402 (FIG. 4 / STEP402). In response, the work machine control device 400 outputs a command to the image pickup device 401, and the image pickup device 401 acquires a captured image in response to the command (FIG. 4 / STEP404). The work machine control device 400 then transmits captured image data representing the captured image to the first remote control device 10 and the second remote control device 20 through the wireless communication device 402 (FIG. 4 / STEP406).
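The sequence of STEP402 to STEP406 on the work machine side can be summarized with a short sketch. The `radio` and `camera` objects below stand in for the wireless communication device 402 and the image pickup device 401; their interfaces, as well as the JSON framing, are hypothetical and only illustrate the order of operations.

```python
import json
import time

def work_machine_loop(radio, camera, remote_addresses):
    """Minimal sketch of FIG. 4 / STEP402-406 on the work machine side.

    `radio.receive` / `radio.send` and `camera.capture` are assumed interfaces
    standing in for the wireless communication device 402 and image pickup
    device 401; `remote_addresses` lists the first and second remote control devices.
    """
    started = False
    while True:
        msg = radio.receive(timeout=0.1)              # STEP402: operation start command
        if msg is not None and msg.get("type") == "operation_start":
            started = True
        if not started:
            continue
        frame = camera.capture()                      # STEP404: acquire captured image (assumed JPEG bytes)
        packet = {
            "type": "captured_image",
            "timestamp": time.time(),
            "jpeg": frame.hex(),                      # hex-encode the image bytes for the JSON payload
        }
        for addr in remote_addresses:                 # STEP406: send to devices 10 and 20
            radio.send(addr, json.dumps(packet).encode("utf-8"))
```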
In the first remote control device 10, the first client control device 100 receives the captured image data through the first wireless communication device 122 (FIG. 4 / STEP106). The first client control device 100 displays, on the first image output device 121, an environment image corresponding to the captured image data (all or part of the captured image itself, or a simulated environment image generated based on it) (FIG. 4 / STEP108). Similarly, in the second remote control device 20, the second client control device 200 receives the captured image data through the second wireless communication device 222 (FIG. 4 / STEP206) and displays an environment image corresponding to the captured image data on the second image output device 221 (FIG. 4 / STEP208). As a result, as shown in FIG. 5 for example, an environment image including the boom 441, the arm 443, the bucket 445, and the arm cylinder 444, which are part of the work attachment 440 as the operating mechanism, is displayed on each of the first image output device 121 and the second image output device 221.
In the first remote control device 10, the first operation mechanism 111 is operated by the first operator (FIG. 4 / STEP110), and in response, the first client control device 100 transmits an operation command corresponding to the operation mode to the work machine 40 through the first wireless communication device 122 (FIG. 4 / STEP112).
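How the operation mode of the levers might be encoded into an operation command (STEP110 to STEP112) can be illustrated as follows; the lever names, value ranges, and JSON encoding are assumptions for the example, not the patent's actual message format.

```python
import json

def build_operation_command(lever_state: dict) -> bytes:
    """Encode the current lever positions into an operation command (FIG. 4 / STEP112).

    `lever_state` maps lever names to normalized deflections in [-1.0, 1.0];
    the names and the JSON encoding are illustrative assumptions.
    """
    command = {
        "type": "operation_command",
        "travel_left": lever_state.get("travel_left", 0.0),
        "travel_right": lever_state.get("travel_right", 0.0),
        "swivel": lever_state.get("swivel", 0.0),   # swivel lever -> hydraulic swivel motor
        "boom": lever_state.get("boom", 0.0),       # boom lever -> boom cylinder 442
        "arm": lever_state.get("arm", 0.0),         # arm lever -> arm cylinder 444
        "bucket": lever_state.get("bucket", 0.0),   # bucket lever -> bucket cylinder 446
    }
    return json.dumps(command).encode("utf-8")
```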
In the work machine 40, the work machine control device 400 receives the operation command through the wireless communication device 402 (FIG. 4 / STEP408). In response, the work machine control device 400 controls the operation of the work attachment 440 and the like (FIG. 4 / STEP410). For example, the bucket 445 scoops soil in front of the work machine 40, the upper swing body 420 is swiveled, and the soil is then dropped from the bucket 445.
In the first remote control device 10, the line of sight of the first operator is detected by the line-of-sight detector 112 (FIG. 4 / STEP114). As a result, the region of interest of the first operator in the environment image displayed on the first image output device 121 is identified. The first client control device 100 transmits line-of-sight detection data corresponding to the detection result to the second remote control device 20 through the first wireless communication device 122 (FIG. 4 / STEP116).
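A minimal sketch of the line-of-sight detection data sent in STEP116 is shown below, assuming the data carries the identifier of the display being looked at and the pixel position of the region-of-interest center; the field names, transport (UDP), and addresses are illustrative assumptions rather than the patent's actual format.

```python
import json
import socket

def send_gaze_detection_data(sock: socket.socket, second_device_addr,
                             display_id: int, u_px: int, v_px: int,
                             timestamp: float) -> None:
    """Send line-of-sight detection data (FIG. 4 / STEP116) as a small JSON datagram.

    The payload identifies which display the first operator is looking at and the
    pixel position of the region-of-interest center; the field names are illustrative.
    """
    payload = {
        "type": "gaze_detection",
        "display_id": display_id,   # e.g. 1211, 1212 or 1213
        "u_px": u_px,
        "v_px": v_px,
        "timestamp": timestamp,
    }
    sock.sendto(json.dumps(payload).encode("utf-8"), second_device_addr)

# Example usage (address and values are placeholders):
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# send_gaze_detection_data(sock, ("192.0.2.20", 5000), 1212, 960, 540, 0.0)
```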
In the second remote control device 20, the second client control device 200 receives the line-of-sight detection data through the second wireless communication device 222 (FIG. 4 / STEP210). The second client control device 200 then displays, on the second image output device 221, the image area specified by the line-of-sight detection data with greater emphasis than the other image areas (FIG. 4 / STEP212). For example, when the line of sight of the first operator, and thus the region of interest in the environment image, falls on the image area corresponding to the bucket 445, that image area is highlighted as the "designated image area" so as to stand out from the surrounding image area.
As shown in FIG. 6A, a figure such as a pointer may be superimposed on the designated image area S including the image area corresponding to the bucket 445. As shown in FIG. 6B, a figure such as a circle surrounding the designated image area S corresponding to the bucket 445 may be displayed. As shown in FIG. 6C, the image quality of a rectangular designated image area S including the image area corresponding to the bucket 445 may be made higher than that of the other image areas; for example, the designated image area S may be displayed as a color image while the other image areas are displayed in grayscale, or the designated image area S may be displayed at a higher resolution than the other image areas. As shown in FIG. 6D, the brightness of a rectangular designated image area S including the image area corresponding to the bucket 445 may be made higher than that of the other image areas.
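The FIG. 6C and FIG. 6D styles of emphasis can be sketched with a few lines of NumPy, assuming the designated image area S is given as a rectangle; the function below is an illustration under those assumptions, not the patent's implementation.

```python
import numpy as np

def emphasize_designated_area(frame_rgb: np.ndarray, box, mode: str = "grayscale_bg"):
    """Return a copy of the environment image with the designated image area S emphasized.

    frame_rgb: HxWx3 uint8 image; box: (x0, y0, x1, y1) rectangle around the region
    of interest (e.g. the bucket 445). `mode` selects a FIG. 6C-like or 6D-like effect.
    """
    out = frame_rgb.astype(np.float32).copy()
    x0, y0, x1, y1 = box
    if mode == "grayscale_bg":                 # FIG. 6C: color inside S, grayscale outside
        gray = out.mean(axis=2, keepdims=True)
        designated = out[y0:y1, x0:x1].copy()
        out[:] = gray                          # whole image to grayscale ...
        out[y0:y1, x0:x1] = designated         # ... then restore color inside S
    elif mode == "brighten":                   # FIG. 6D: raise brightness inside S
        out[y0:y1, x0:x1] = np.clip(out[y0:y1, x0:x1] * 1.5, 0, 255)
    return out.astype(np.uint8)
```

In an actual system the rectangle would be derived from the received line-of-sight detection data, for example as a fixed-size window centered on the reported gaze point.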
(Effects)
According to the remote control system configured as above, the second operator, who operates the work machine 40 through the second remote control device 20, can be provided with appropriate information for grasping how the work machine 40 is remotely operated by the first operator, who operates it through the first remote control device 10, and in particular which object the first operator is paying attention to: namely, the designated image area displayed with greater emphasis than the surrounding image area in the environment image displayed on the second image output device 221 (see FIG. 4 / STEP212 and FIGS. 6A to 6D). Through visual recognition of the designated image area highlighted in the environment image displayed on the second image output device 221, the second operator can recognize how the first operator moves or operates the operation levers.
(Other embodiments of the present invention)
In the embodiment described above, the first remote control device 10, the second remote control device 20, and the work machine 40 communicate directly with one another according to a wireless communication method. As another embodiment, the first remote control device 10, the second remote control device 20, and the work machine 40 may communicate with one another indirectly via the remote operation server 30 shown in FIG. 7.
The remote operation server 30 includes a first server arithmetic processing element 31 and a second server arithmetic processing element 32. The first server arithmetic processing element 31 receives, from the first remote control device 10, the line-of-sight detection data corresponding to the operator's line of sight detected by the first remote control device 10. The second server arithmetic processing element 32, by transmitting the line-of-sight detection data to the second remote control device 20, causes the second image output device 221 to display the designated image area in a form different from that of the surrounding image area (see FIGS. 6A to 6D).
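The relay performed by the remote operation server 30 can be sketched as follows: receiving a datagram from the first remote control device corresponds to the first server arithmetic processing element 31, and forwarding it to the second remote control device corresponds to the second server arithmetic processing element 32. The addresses, ports, and JSON framing are assumptions for the example.

```python
import json
import socket

def run_gaze_relay(listen_addr=("0.0.0.0", 6000), second_device_addr=("192.0.2.20", 6001)):
    """Minimal sketch of the remote operation server 30 relaying line-of-sight detection data."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(listen_addr)
    while True:
        data, _src = sock.recvfrom(65535)          # element 31: receive from device 10
        try:
            msg = json.loads(data)
        except ValueError:
            continue                               # ignore malformed packets
        if msg.get("type") == "gaze_detection":
            sock.sendto(data, second_device_addr)  # element 32: forward to device 20
```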
10: first remote control device, 20: second remote control device, 30: remote operation server, 31: first server arithmetic processing element, 32: second server arithmetic processing element, 40: work machine, 100: first client control device, 110: first input interface, 111: first operation mechanism, 112: line-of-sight detector, 120: first output interface, 121: first image output device, 122: first wireless communication device, 200: second client control device, 210: second input interface, 211: second operation mechanism, 220: second output interface, 221: second image output device, 222: second wireless communication device, 401: imaging device, 402: wireless communication device, 440: work attachment (actuating mechanism).
Claims (2)
- A remote operation system comprising: a work machine having an actuating mechanism and an imaging device that captures an image of an environment including at least a part of the actuating mechanism; and a first remote control device and a second remote control device, each having a wireless communication device, an image output device that displays an environment image corresponding to captured image data acquired by the imaging device of the work machine, and an operation mechanism for remotely operating the work machine, wherein
the first remote control device comprises a line-of-sight detector that detects an operator's line of sight, and a first client control device that causes the wireless communication device to transmit line-of-sight detection data corresponding to the operator's line of sight detected by the line-of-sight detector, and
the second remote control device comprises a second client control device that causes the image output device to display, in the environment image, a designated image area, which is an image area extending with reference to the operator's line of sight indicated by the line-of-sight detection data received by the wireless communication device, in a form different from that of its surrounding image area.
- A remote operation server having a function of communicating, using a wireless communication function, with each of a work machine having an actuating mechanism and an imaging device that captures an image of an environment including at least a part of the actuating mechanism, and a first remote control device and a second remote control device each having an image output device that displays an environment image corresponding to captured image data acquired by the imaging device of the work machine and an operation mechanism for remotely operating the work machine, the remote operation server comprising:
a first server arithmetic processing element that receives, from the first remote control device, line-of-sight detection data corresponding to an operator's line of sight detected by the first remote control device; and
a second server arithmetic processing element that transmits the line-of-sight detection data to the second remote control device, thereby causing the image output device of the second remote control device to display, in the environment image, a designated image area, which is an image area extending with reference to the operator's line of sight indicated by the line-of-sight detection data, in a form different from that of its surrounding image area.
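For illustration only, a minimal sketch of the sender side recited in claim 1, assuming the line-of-sight detection data is encoded as normalized screen coordinates in JSON; all field names and the encoding convention are hypothetical:

```python
# Non-normative sketch of the data flow in claim 1, on the first remote control
# device side: the first client control device packages the detected line of sight
# as line-of-sight detection data for the wireless communication device.
import json
import time
from dataclasses import dataclass

@dataclass
class GazeSample:
    x_px: int       # gaze point on the first image output device, in pixels
    y_px: int
    screen_w: int
    screen_h: int

def to_line_of_sight_detection_data(sample: GazeSample) -> bytes:
    """Encode the gaze point as resolution-independent data that the second
    remote control device can map onto its own environment image."""
    packet = {
        "gaze_x": sample.x_px / sample.screen_w,   # 0.0 .. 1.0
        "gaze_y": sample.y_px / sample.screen_h,   # 0.0 .. 1.0
        "timestamp": time.time(),
    }
    return (json.dumps(packet) + "\n").encode()

# Example: a gaze fixation near the image area corresponding to the bucket 445.
data = to_line_of_sight_detection_data(GazeSample(812, 540, 1920, 1080))
```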
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP19921340.6A EP3923569B1 (en) | 2019-03-26 | 2019-12-02 | Remote operation system and remote operation server |
CN201980094507.9A CN113615164B (en) | 2019-03-26 | 2019-12-02 | Remote operation system and remote operation server |
US17/438,145 US11732440B2 (en) | 2019-03-26 | 2019-12-02 | Remote operation system and remote operation server |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019-058104 | 2019-03-26 | ||
JP2019058104A JP7318258B2 (en) | 2019-03-26 | 2019-03-26 | Remote control system and remote control server |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020194882A1 true WO2020194882A1 (en) | 2020-10-01 |
Family
ID=72608726
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2019/047061 WO2020194882A1 (en) | 2019-03-26 | 2019-12-02 | Remote operation system and remote operation server |
Country Status (5)
Country | Link |
---|---|
US (1) | US11732440B2 (en) |
EP (1) | EP3923569B1 (en) |
JP (1) | JP7318258B2 (en) |
CN (1) | CN113615164B (en) |
WO (1) | WO2020194882A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7310338B2 (en) * | 2019-06-10 | 2023-07-19 | コベルコ建機株式会社 | Remote control system and remote control server |
KR20220102765A (en) * | 2021-01-14 | 2022-07-21 | 현대두산인프라코어(주) | System and method of controlling construction machinery |
JPWO2023112217A1 (en) * | 2021-12-15 | 2023-06-22 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008144378A (en) * | 2006-12-06 | 2008-06-26 | Shin Caterpillar Mitsubishi Ltd | Controller for remote controlled working machine |
WO2010137165A1 (en) * | 2009-05-29 | 2010-12-02 | 新日本製鐵株式会社 | Technique managing device, and technique managing method |
JP2016076801A (en) | 2014-10-03 | 2016-05-12 | ヤンマー株式会社 | Remote monitoring system |
WO2017042873A1 (en) * | 2015-09-08 | 2017-03-16 | 株式会社日立製作所 | Remote operation system and operation assistance system |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4884939A (en) * | 1987-12-28 | 1989-12-05 | Laser Alignment, Inc. | Self-contained laser-activated depth sensor for excavator |
JP3217723B2 (en) | 1997-03-13 | 2001-10-15 | ▲すすむ▼ 舘 | Telecommunications system and telecommunications method |
CL2012000933A1 (en) * | 2011-04-14 | 2014-07-25 | Harnischfeger Tech Inc | A method and a cable shovel for the generation of an ideal path, comprises: an oscillation engine, a hoisting engine, a feed motor, a bucket for digging and emptying materials and, positioning the shovel by means of the operation of the lifting motor, feed motor and oscillation engine and; a controller that includes an ideal path generator module. |
US20140170617A1 (en) * | 2012-12-19 | 2014-06-19 | Caterpillar Inc. | Monitoring System for a Machine |
JP6326869B2 (en) * | 2014-03-05 | 2018-05-23 | 株式会社デンソー | Vehicle periphery image display device and vehicle periphery image display method |
US10503249B2 (en) * | 2014-07-03 | 2019-12-10 | Topcon Positioning Systems, Inc. | Method and apparatus for construction machine visualization |
EP3222042B1 (en) * | 2014-11-17 | 2022-07-27 | Yanmar Power Technology Co., Ltd. | Display system for remote control of working machine |
WO2016158000A1 (en) * | 2015-03-30 | 2016-10-06 | ソニー株式会社 | Information processing device, information processing method, and information processing system |
JP6754364B2 (en) * | 2015-08-25 | 2020-09-09 | 川崎重工業株式会社 | Robot system |
WO2017043108A1 (en) * | 2015-09-10 | 2017-03-16 | 富士フイルム株式会社 | Projection-type display device and projection control method |
AU2016402225B2 (en) * | 2016-04-04 | 2022-02-10 | Topcon Positioning Systems, Inc. | Method and apparatus for augmented reality display on vehicle windscreen |
JP2018004950A (en) * | 2016-07-01 | 2018-01-11 | フォーブ インコーポレーテッド | Video display system, video display method, and video display program |
DE102016011354A1 (en) * | 2016-09-20 | 2018-03-22 | Liebherr-Werk Biberach Gmbh | Control station for a crane, excavator and the like |
CN106531073B (en) * | 2017-01-03 | 2018-11-20 | 京东方科技集团股份有限公司 | Processing circuit, display methods and the display device of display screen |
EP3363684B1 (en) * | 2017-02-21 | 2021-12-08 | Deere & Company | Adaptive light system of an off-road vehicle |
DE102017205467A1 (en) * | 2017-02-21 | 2018-08-23 | Deere & Company | Adaptive light system of an off-road vehicle |
JP6581139B2 (en) * | 2017-03-31 | 2019-09-25 | 日立建機株式会社 | Work machine ambient monitoring device |
US11874659B2 (en) * | 2017-06-09 | 2024-01-16 | Volvo Construction Equipment Ab | Information system for a working machine |
WO2018228669A1 (en) * | 2017-06-13 | 2018-12-20 | Volvo Construction Equipment Ab | A working machine provided with an image projection arrangement |
DE102017116822A1 (en) * | 2017-07-25 | 2019-01-31 | Liebherr-Hydraulikbagger Gmbh | Work machine with display device |
WO2019176036A1 (en) * | 2018-03-14 | 2019-09-19 | 日立建機株式会社 | Work machine |
US10829911B2 (en) * | 2018-09-05 | 2020-11-10 | Deere & Company | Visual assistance and control system for a work machine |
- 2019
- 2019-03-26 JP JP2019058104A patent/JP7318258B2/en active Active
- 2019-12-02 US US17/438,145 patent/US11732440B2/en active Active
- 2019-12-02 CN CN201980094507.9A patent/CN113615164B/en active Active
- 2019-12-02 EP EP19921340.6A patent/EP3923569B1/en active Active
- 2019-12-02 WO PCT/JP2019/047061 patent/WO2020194882A1/en unknown
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008144378A (en) * | 2006-12-06 | 2008-06-26 | Shin Caterpillar Mitsubishi Ltd | Controller for remote controlled working machine |
WO2010137165A1 (en) * | 2009-05-29 | 2010-12-02 | 新日本製鐵株式会社 | Technique managing device, and technique managing method |
JP2016076801A (en) | 2014-10-03 | 2016-05-12 | ヤンマー株式会社 | Remote monitoring system |
WO2017042873A1 (en) * | 2015-09-08 | 2017-03-16 | 株式会社日立製作所 | Remote operation system and operation assistance system |
Also Published As
Publication number | Publication date |
---|---|
EP3923569A4 (en) | 2022-04-06 |
EP3923569A1 (en) | 2021-12-15 |
JP7318258B2 (en) | 2023-08-01 |
US11732440B2 (en) | 2023-08-22 |
US20220186465A1 (en) | 2022-06-16 |
CN113615164A (en) | 2021-11-05 |
EP3923569B1 (en) | 2023-02-08 |
CN113615164B (en) | 2024-06-07 |
JP2020161933A (en) | 2020-10-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020194882A1 (en) | Remote operation system and remote operation server | |
WO2019189430A1 (en) | Construction machine | |
JP6947101B2 (en) | Remote control system and main control device | |
JP5546427B2 (en) | Work machine ambient monitoring device | |
US20140111648A1 (en) | Device For Monitoring Area Around Working Machine | |
CN106664393A (en) | Information processing device, information processing method, and image display system | |
WO2019187660A1 (en) | Remote operation system for working machine | |
JP2018152738A (en) | Display system, display method, and remote control system | |
JP2015226094A (en) | Remote control system for work machine | |
US11993922B2 (en) | Remote operation system | |
JP7287262B2 (en) | Remote control system and remote control server | |
JP7099358B2 (en) | Display system for work machines | |
JP2021099017A (en) | Remote operation device and remote operation system | |
JP7021502B2 (en) | Visual expansion system for construction machinery | |
JP2020032320A (en) | Dismantling system | |
WO2021020292A1 (en) | Display system, remote operation system, and display method | |
WO2021166475A1 (en) | Remote operation device, remote operation assistance server, remote operation assistance system, and remote operation assistance method | |
WO2023026568A1 (en) | Remote operation system and remote operation composite system | |
WO2023136070A1 (en) | Remote operation support system and remote operation support method | |
WO2021124858A1 (en) | Remote control device and remote control system | |
JP2023032997A (en) | remote control system | |
WO2022195988A1 (en) | Remote operation assistance server and remote operation assistance system | |
KR20200104275A (en) | Appararus for remote-controlling speed sprayer using virtual reality | |
JP2023032998A (en) | Work support system and work support composite system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19921340 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2019921340 Country of ref document: EP Effective date: 20210906 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |