CN107111928B - Display system for remote control working machine

Display system for remote control working machine

Info

Publication number
CN107111928B
Authority
CN
China
Prior art keywords
eye
remote control
panoramic image
image
partial
Prior art date
Legal status
Expired - Fee Related
Application number
CN201480082777.5A
Other languages
Chinese (zh)
Other versions
CN107111928A
Inventor
马尔塔·尼科里尼
安东尼奥·阿尔巴
江口慎吾
楠野顺也
保罗·特里皮基奥
埃马努埃莱·鲁法尔迪
卡洛·阿尔贝托·阿维扎诺
保罗·西蒙内·加斯帕雷洛
Current Assignee
Yanmar Power Technology Co Ltd
Original Assignee
Yanmar Power Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Yanmar Power Technology Co Ltd filed Critical Yanmar Power Technology Co Ltd
Priority to CN202010638589.3A (CN111754750B)
Publication of CN107111928A
Application granted
Publication of CN107111928B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • EFIXED CONSTRUCTIONS
    • E02HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02FDREDGING; SOIL-SHIFTING
    • E02F5/00Dredgers or soil-shifting machines for special purposes
    • E02F5/02Dredgers or soil-shifting machines for special purposes for digging trenches or ditches
    • E02F5/14Component parts for trench excavators, e.g. indicating devices travelling gear chassis, supports, skids
    • E02F5/145Component parts for trench excavators, e.g. indicating devices travelling gear chassis, supports, skids control and indicating devices
    • EFIXED CONSTRUCTIONS
    • E02HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02FDREDGING; SOIL-SHIFTING
    • E02F9/00Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
    • E02F9/08Superstructures; Supports for superstructures
    • E02F9/0858Arrangement of component parts installed on superstructures not otherwise provided for, e.g. electric components, fenders, air-conditioning units
    • EFIXED CONSTRUCTIONS
    • E02HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02FDREDGING; SOIL-SHIFTING
    • E02F9/00Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
    • E02F9/20Drives; Control devices
    • E02F9/2025Particular purposes of control systems not otherwise provided for
    • E02F9/205Remotely operated machines, e.g. unmanned vehicles
    • EFIXED CONSTRUCTIONS
    • E02HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02FDREDGING; SOIL-SHIFTING
    • E02F9/00Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
    • E02F9/26Indicating devices
    • E02F9/261Surveying the work-site to be treated
    • EFIXED CONSTRUCTIONS
    • E02HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02FDREDGING; SOIL-SHIFTING
    • E02F9/00Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
    • E02F9/26Indicating devices
    • E02F9/261Surveying the work-site to be treated
    • E02F9/262Surveying the work-site to be treated with follow-up actions to control the work tool, e.g. controller
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0011Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement
    • G05D1/0038Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C17/02Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C2201/00Transmission systems of control signals via wireless link
    • G08C2201/30User interface

Abstract

The display system for the remote control working machine includes a plurality of camera units and an image processing device, both provided on the working machine, and a display device and a remote control device, both provided at the remote control end. The display device detects movement of the head of the operator and transmits the detected movement to the remote control device. The remote control device transmits the movement of the head of the operator received from the display device to the image processing device by wireless communication. The image processing device adjusts the partial left-eye and right-eye panoramic images, cut from the left-eye and right-eye panoramic images and transmitted to the display device, according to the movement of the head of the operator transmitted from the remote control device.

Description

Display system for remote control working machine
Technical Field
The present invention relates to a display system that displays an image transmitted from a work machine for remotely controlling the work machine.
Background
There have been display systems that display images transmitted from a working machine in order to remotely control the working machine. A conventional display system includes a camera or camera assembly attached to the work machine, which is remotely controlled from a location remote from the work site on which the work machine operates. The system transmits the images (video) shot by the camera to the remote control end of the display system through wireless communication, so that a display device at the remote control end can display the captured images in real time.
For example, JP2008111269A discloses a display system that displays a panoramic image of a work machine environment on a display device in real time.
As another example, JP2002345058A discloses a display system that displays a job-site image on a display device in real time based on data transmitted from a camera attached to a construction machine. Due to radio legislation constraints, data transfer is performed over a 2.4 GHz low-power radio link.
Disclosure of Invention
Technical problem to be solved by the invention
It is desirable for a display system for a remote control working machine to be able to track the movement of the operator's sight line and change the image being displayed on the display device accordingly, thereby improving the remote control operability.
However, JP2008111269A discloses only a display system that changes the panoramic image in response to a pedal operation. Because that display system does not track the movement of the operator's sight line to change the image being displayed on the display device accordingly, it cannot always provide a good operational feel. JP2002345058A likewise does not disclose a display system that tracks the movement of the operator's sight line to change the image being displayed on the display device accordingly.
The patent documents listed below represent exemplary techniques related to the present invention. However, none of them anticipates the present invention.
US6795109B2
US8274550B2
US6195204B1
US2006/0103723A1
US2007/0182812A1
US8355041B2
Accordingly, an object of the present invention is to provide a display system for a remote control working machine, which is capable of changing an image being displayed on a display device according to the movement of the operator's sight line.
Solution to the technical problem
In order to achieve the above object, the present invention provides a display system for a remote control working machine, the display system having the first to fourth aspects.
(1) First aspect of the display system
A first aspect of the display system is a display system for a remote control working machine, including: a plurality of camera units attached to a working machine having a working member capable of horizontal rotation and vertical swing; an image processing device provided in the working machine, which generates a panoramic image based on a plurality of images captured by the plurality of camera units and transmits a partial panoramic image of the generated panoramic image by wireless communication; a display device located at the remote control end of the display system; and a remote control device that receives the partial panoramic image from the image processing device and transmits the received partial panoramic image to the display device. Each of the camera units includes a pair of cameras, namely a left-eye camera and a right-eye camera. The image processing device generates a left-eye panoramic image and a right-eye panoramic image based on the left-eye images and right-eye images captured by the pairs of cameras, respectively, and transmits partial left-eye and right-eye panoramic images of the generated left-eye and right-eye panoramic images to the remote control device by wireless communication. The remote control device receives the partial left-eye and right-eye panoramic images from the image processing device and transmits them to the display device. The display device, which is designed to be mounted on the head of the operator, receives the partial left-eye and right-eye panoramic images from the remote control device, displays them, detects movement of the eyes of the operator, and transmits the detected movement of the eyes of the operator to the remote control device. The remote control device transmits the movement of the eyes of the operator transmitted from the display device to the image processing device by wireless communication. The image processing device adjusts the partial left-eye and right-eye panoramic images to be transmitted to the display device according to the movement of the eyes of the operator transmitted from the remote control device and, if the response speed of the movement of the eyes of the operator is faster than a predetermined response speed, performs the adjustment process or the cropping process by delaying its response to the movement of the eyes of the operator.
(2) Second aspect of the display system
A second aspect of the display system is a display system for a remote control working machine, including: a plurality of camera units attached to a working machine having a working member capable of horizontal rotation and vertical swing; an image processing device provided in the working machine, which generates a panoramic image based on a plurality of images captured by the plurality of camera units and transmits a partial panoramic image of the generated panoramic image by wireless communication; a display device located at the remote control end of the display system; and a remote control device that receives the partial panoramic image from the image processing device and transmits the received partial panoramic image to the display device. Each of the camera units includes a pair of cameras, namely a left-eye camera and a right-eye camera. The image processing device generates a left-eye panoramic image and a right-eye panoramic image based on the left-eye images and right-eye images captured by the pairs of cameras, respectively, and transmits partial left-eye and right-eye panoramic images of the generated left-eye and right-eye panoramic images to the remote control device by wireless communication. The remote control device receives the partial left-eye and right-eye panoramic images from the image processing device and transmits them to the display device. The display device, which is designed to be mounted on the head of the operator, receives the partial left-eye and right-eye panoramic images from the remote control device, displays them, detects movement of the head of the operator, and transmits the detected movement of the head of the operator to the remote control device. The remote control device transmits the movement of the head of the operator transmitted from the display device to the image processing device by wireless communication. The image processing device adjusts the partial left-eye and right-eye panoramic images to be transmitted to the display device according to the movement of the head of the operator transmitted from the remote control device. Based on a database prepared in advance in the display system or a database located at a position different from the display system, the display device displays partial work assistance information together with the partial left-eye and right-eye panoramic images so that the partial work assistance information coincides with the partial left-eye and right-eye panoramic images, the partial work assistance information being the portion of the work assistance information that corresponds to the partial left-eye and right-eye panoramic images.
(3) Third aspect of the display system
A third aspect of the display system is a display system for a remote control working machine, including: a plurality of camera units attached to a working machine having a working member capable of horizontal rotation and vertical swing; an image processing device provided in the working machine, which generates a panoramic image based on a plurality of images captured by the plurality of camera units and transmits the generated panoramic image by wireless communication; a display device located at the remote control end of the display system; and a remote control device that receives the panoramic image from the image processing device and transmits a partial panoramic image of the received panoramic image to the display device. Each of the camera units includes a pair of cameras, namely a left-eye camera and a right-eye camera. The image processing device generates a left-eye panoramic image and a right-eye panoramic image based on the left-eye images and right-eye images captured by the pairs of cameras, respectively, and transmits the generated left-eye and right-eye panoramic images to the remote control device by wireless communication. The remote control device receives the left-eye and right-eye panoramic images from the image processing device and transmits partial left-eye and right-eye panoramic images of the received left-eye and right-eye panoramic images to the display device. The display device, which is designed to be mounted on the head of the operator, receives the partial left-eye and right-eye panoramic images from the remote control device, displays them, detects movement of the eyes of the operator, and transmits the detected movement of the eyes of the operator to the remote control device. The remote control device adjusts the partial left-eye and right-eye panoramic images to be transmitted to the display device according to the movement of the eyes of the operator transmitted from the display device and, if the response speed of the movement of the eyes of the operator is faster than a predetermined response speed, the image processing device performs the adjustment process or the cropping process by delaying its response to the movement of the eyes of the operator.
(4) Fourth aspect of the display system
A fourth aspect of the display system is a display system for a remote control working machine, including: a plurality of camera units attached to a working machine having a working member capable of horizontal rotation and vertical swing; an image processing device provided in the working machine, which generates a panoramic image based on a plurality of images captured by the plurality of camera units and transmits the generated panoramic image by wireless communication; a display device located at the remote control end of the display system; and a remote control device that receives the panoramic image from the image processing device and transmits a partial panoramic image of the received panoramic image to the display device. Each of the camera units includes a pair of cameras, namely a left-eye camera and a right-eye camera. The image processing device generates a left-eye panoramic image and a right-eye panoramic image based on the left-eye images and right-eye images captured by the pairs of cameras, respectively, and transmits the generated left-eye and right-eye panoramic images to the remote control device by wireless communication. The remote control device receives the left-eye and right-eye panoramic images from the image processing device and transmits partial left-eye and right-eye panoramic images of the received left-eye and right-eye panoramic images to the display device. The display device, which is designed to be mounted on the head of the operator, receives the partial left-eye and right-eye panoramic images from the remote control device, displays them, detects movement of the head of the operator, and transmits the detected movement of the head of the operator to the remote control device. The remote control device adjusts the partial left-eye and right-eye panoramic images to be transmitted to the display device according to the movement of the head of the operator transmitted from the display device. Based on a database prepared in advance in the display system or a database located at a position different from the display system, the display device displays partial work assistance information together with the partial left-eye and right-eye panoramic images so that the partial work assistance information coincides with the partial left-eye and right-eye panoramic images, the partial work assistance information being the portion of the work assistance information that corresponds to the partial left-eye and right-eye panoramic images.
In an exemplary aspect of the present invention, the plurality of camera units are located at a placement position corresponding to a position at which a seat can be mounted on the work machine, such that the plurality of camera units have lens centers located at a sitting height corresponding to the eye height of a worker of average physique for the target market of the work machine who is assumed to sit on the seat.
In another exemplary aspect of the present invention, the above-described plurality of camera units are provided on a support member capable of upward/downward movement and vertical rotation.
In another exemplary aspect of the invention, the display system further includes an illumination device located at a periphery of the plurality of camera units.
Advantageous Effects of the Invention
The present invention can provide a display system for a remote control working machine, which can track the movement of the operator's sight line to change the image being displayed on the display device accordingly.
Drawings
Fig. 1 is a schematic diagram of a display system for a remote control working machine according to a first embodiment.
Fig. 2 is a schematic perspective view of the structure of a support provided in a support member on the work machine shown in fig. 1 and of a plurality of camera units supported by the support.
Fig. 3 is a block diagram showing a system configuration of an image processing device provided at the work machine end of the display system of the first embodiment.
Fig. 4 is a diagram illustrating a cropping process in which the image processing device crops out a partial panoramic image.
Fig. 5 is a block diagram showing a system configuration of a remote control device provided at the remote control end of the display system of the first embodiment.
Fig. 6 is a diagram showing an adjustment process in which the image processing apparatus adjusts a partial panoramic image in accordance with the movement of the head of the operator.
Fig. 7 is a flowchart depicting an exemplary control operation implemented by the display system of the first embodiment.
Fig. 8 is a schematic diagram of a display system for a remote control working machine according to a second embodiment.
Fig. 9 is a block diagram showing a system configuration of an image processing device provided at the work machine end of the display system for a remote control working machine of the third embodiment.
Fig. 10 is a block diagram showing a system configuration of a remote control device provided at the remote control end of the display system of the third embodiment.
Fig. 11 is a flowchart showing an exemplary control operation carried out by the display system of the third embodiment.
Fig. 12 is a schematic diagram of a display system for a remote control working machine according to a fourth embodiment.
Fig. 13 is a schematic diagram of a display system for a remote control working machine according to a fifth embodiment.
Fig. 14 is a schematic diagram of a pair of cameras constituting a camera unit and an exemplary illumination device mounted on the pair of cameras when viewed from the front with respect to the display system for a remote control working machine of the sixth embodiment.
Fig. 15 shows an exemplary display screen on a display device of the display system for a remote control working machine of the seventh embodiment; the display screen shows partial work assistance information together with the partial panoramic image in such a manner that the portion of the work assistance information associated with the partial panoramic image coincides with the partial panoramic image.
Detailed Description
Embodiments according to the present invention will be described below with reference to the accompanying drawings.
First embodiment
Fig. 1 is a schematic diagram of a display system 100 for remotely controlling a working machine 200 according to a first embodiment.
As shown in fig. 1, the display system 100 for remotely controlling the work machine 200 includes a plurality of camera units 310 attached to the work machine 200, which is remotely controlled from a location remote from the work site on which the work machine 200 operates. The images (videos) captured by the camera units 310 are transmitted to the remote control end 120 of the display system through wireless communication, so that the captured images can be displayed on the display device 410 at the remote control end 120 in real time.
In the display system 100, the work machine 200 is an unmanned work machine remotely controlled by an operator 500 from a location remote from the work site. The operator 500, viewing the display screen of the display device 410, transmits operation instructions from the remote control end 120 to control the unmanned work machine. Work machine 200 is therefore able to operate in hazardous work sites that human beings cannot reach (e.g., due to contamination of the site).
The working machine 200 includes a working member 220 that can rotate horizontally and swing vertically. In the present example, the work machine 200 is a construction machine (specifically, an excavator). The working machine 200 further includes a traveling body 210 configured as a crawler. The working member 220 can rotate horizontally (around an axis extending in the vertical direction) with respect to the traveling body 210. The working member 220 includes: an upper body 221 horizontally rotatable with respect to the traveling body 210; a boom 222 vertically swingable (about an axis extending in a horizontal direction) with respect to the upper body 221; an arm 223 vertically swingable with respect to the boom 222; and a bucket 224 vertically swingable with respect to the arm 223. The "horizontal" and "vertical" directions herein are defined with respect to work machine 200 placed on a horizontal plane.
In this example, the attachment fitted to the end of the arm 223 is the bucket 224. However, the present example is by no means limiting, and any suitable attachment may be fitted to the end of the arm 223 depending on the nature of the work.
In this example, work machine 200 is an excavator. However, the present example is by no means limiting, and work machine 200 may be any other construction machine or any agricultural machine.
Work machine 200 also includes a remote operation device 230. The remote operation device 230 remotely controls the main body of the work machine 200 and is capable of wireless communication with the main body of the work machine 200. The remote operation device 230 accepts operations from the operator 500 and transmits operation instructions based on those operations to the main body of the work machine 200 by wireless communication. The remote operation device 230 includes an operation portion 231, such as a lever, for operating the working member 220 (specifically, the upper body 221, the boom 222, the arm 223, the bucket 224, or any other member).
The display system 100 is made up of components 300 located at the work machine end 110 of the display system and components 400 located at the remote control end 120 of the display system.
The components 300 at the work machine end 110 include a plurality of camera units 310 and an image processing device 320, both of which are provided on the work machine 200. The components 400 at the remote control end 120 include a display device 410 and a remote control device 420.
The components 300 at the work machine end 110 also include a support member 330. The support member 330 has a lower end supported by the upper body 221 of the working member 220 of the working machine 200 (in this example, fixed to the upper body 221) and an upper end supporting (in this example, holding) the camera units 310 in a horizontal posture. The "horizontal posture" herein is defined with respect to the work machine 200 placed on a horizontal plane.
The support member 330 includes a support 331 and a support base (in this example, a post 332). The support 331 supports the camera units 310. The support base supports the support 331 on its upper end, and its lower end is supported by the upper body 221.
Fig. 2 is a schematic perspective view illustrating the structure of the support 331 provided in the support member 330 on the working machine 200 shown in fig. 1 and the camera units 310 supported by the support 331. Fig. 2 shows the (in this example, five) camera units 310 arranged on the support 331 with some of them drawn removed from the support 331 for illustration; in practice, all (in this example, five) camera units 310 are arranged on the support 331.
As shown in fig. 2, each camera unit 310 is composed of a pair of cameras: a left-eye camera 311 and a right-eye camera 312. More specifically, the pair of cameras 311 and 312 consists of a left-eye camera 311 for the left eye of the operator 500 and a right-eye camera 312 for the right eye of the operator 500. The lens centers of the pair of cameras 311 and 312 are spaced apart by a distance equal to the distance between the centers of the pupils of a normal person. In the present example, all the cameras 311 and 312 constituting the camera units 310 are small digital video cameras of the same type.
The support 331 is configured to support the camera units 310 omnidirectionally or substantially omnidirectionally so that the images photographed by the camera units 310 overlap at a predetermined photographing area ratio (captured area ratio). Specifically, the support 331 supports the left-eye cameras 311 omnidirectionally or substantially omnidirectionally so that the left-eye images captured by the left-eye cameras 311 overlap at a predetermined captured area ratio, and supports the right-eye cameras 312 omnidirectionally or substantially omnidirectionally so that the right-eye images captured by the right-eye cameras 312 overlap at a predetermined captured area ratio.
More specifically, when two camera units 310 are provided, the support 331 may arrange the camera units 310 at opposite positions spaced apart by a predetermined distance; when three or more camera units 310 are provided, the support 331 may arrange the camera units 310 radially at equidistantly spaced or substantially equidistantly spaced positions along an imaginary circle having a radius of a predetermined length.
Specifically, the support 331 is plate-shaped and may be shaped to have a number of sides matching the number of camera units 310. For example, when two camera units 310 are provided, the support 331 may be elongated in the direction in which the camera units 310 are arranged back-to-back in a plan view; when three or more camera units 310 are provided, the support 331 may be a polygon having as many sides as there are camera units 310 in a plan view. The support 331 may also be circular in plan view.
In this example, five camera units 310 (in other words, a total of ten cameras 311 and 312 constituting the camera units 310) are provided. These camera units 310 are radially arranged at equally spaced or substantially equally spaced positions (the center of each side) along an imaginary circle of a predetermined radius measured from the center of the support 331, which is pentagonal in plan view. The camera units 310 are fixed using fixing members 331a.
As explained in detail above, the support 331 supports the camera units 310 arranged at opposite positions or radially at equidistantly spaced or substantially equidistantly spaced positions. This arrangement enables the camera units 310 as a whole to cover a 360° horizontal view. The viewing angle of each pair of cameras 311 and 312 constituting a camera unit 310 may be determined in an appropriate manner according to the arrangement positions of the camera units 310. In this example, each pair of cameras 311 and 312 has a vertical viewing angle of 140° (specifically, 70° above an imaginary horizontal reference plane and another 70° below that plane).
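To make the radial arrangement concrete, the following is a minimal geometric sketch (not taken from the patent) of placing N camera units equally spaced along an imaginary circle, each unit facing outward and its two lenses offset to either side of the unit's optical axis by half the interpupillary distance. The function name, the numeric values, and the coordinate convention are illustrative assumptions only.

```python
import math

def camera_unit_layout(n_units=5, circle_radius_m=0.10, pupil_distance_m=0.065):
    """Return outward-facing yaw and lens positions for each camera unit.

    The units are spaced equally along an imaginary circle (cf. the pentagonal
    support with one unit at the center of each side). Which offset corresponds
    to the left-eye or right-eye camera depends on the chosen coordinate frame.
    """
    layout = []
    for k in range(n_units):
        yaw = 2.0 * math.pi * k / n_units                    # outward-facing direction
        ux, uy = math.cos(yaw), math.sin(yaw)                # outward unit vector
        px, py = -uy, ux                                     # in-plane perpendicular
        cx, cy = circle_radius_m * ux, circle_radius_m * uy  # unit center on the circle
        half = pupil_distance_m / 2.0
        layout.append({
            "yaw_deg": math.degrees(yaw),
            "lens_a": (cx + half * px, cy + half * py),
            "lens_b": (cx - half * px, cy - half * py),
        })
    return layout
```

With n_units=5 this reproduces the 72° spacing of the five units on the pentagonal support described above.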
The camera units 310 (see fig. 1) are located at a placement position horizontally corresponding to a position where a seat (in this example, a seat on which a worker would sit during operation if the work machine 200 were manually operated) can be mounted on the work machine 200 (specifically, corresponding in both the front/rear direction and the left/right direction). In addition, the camera units 310 are arranged to have lens centers located at a sitting height corresponding to the eye height of a worker (e.g., a worker from Europe, the United States, or Asia) having an average physique (e.g., height and sitting height) for the target market (e.g., specifications for the European, North American, or Asian markets) of the work machine 200, who is assumed to sit on a seat that can be mounted on the work machine 200.
In the present example, the boom 222 is provided at the front of the upper body 221, at the center of the upper body 221 in the left/right direction as seen by a person facing the traveling direction. The support member 330 therefore stands to the right or left of that center (to the left in the example of fig. 1), near the front of the upper body 221, in the left/right direction.
The support member 330 is also configured to support the camera units 310 such that the camera units 310 have lens centers located at a height corresponding to the eye height of an operator who is assumed to be seated on a seat that can be mounted on the working machine 200.
In the present example, the image processing device 320 is provided at the working member 220 of the work machine 200. The present example is by no means limiting, and the image processing device 320 may be provided at the support member 330, the traveling body 210, or any other suitable member.
The image processing device 320 generates a panoramic image (panoramic picture data) based on the plurality of images (picture data) photographed by the camera units 310, crops out a part of the generated panoramic image, and transmits the cropped-out part of the panoramic image through wireless communication. In the present example, the image processing device 320 is built around a computer (specifically, a personal computer).
Fig. 3 is a block diagram showing a system configuration of the image processing device 320 at the work machine end 110 of the display system 100 of the first embodiment.
As shown in fig. 3, the image processing apparatus 320 includes a control section 321 that executes various programs, calculations, and other processes, and a storage section 322 that stores various data.
The image processing device 320 has a predetermined communication interface for communicating with the camera units 310 and is electrically connected to the camera units 310 through this communication interface via the connection cable CB1. This configuration enables the image processing device 320 to receive the plurality of images (picture data) IM captured by the camera units 310.
In the present example, the communication interface between the camera units 310 and the image processing device 320 is a USB (universal serial bus) interface (specifically, a USB 3.0-compliant interface). The communication interface between the camera units 310 and the image processing device 320 is by no means limited to a USB interface and may be any other communication interface. In addition, the communication between the camera units 310 and the image processing device 320 is by no means limited to wired communication and may also be short-range wireless communication.
The storage section 322 includes volatile memory, such as RAM (random access memory), and nonvolatile memory, such as a hard disk device and flash memory. The volatile memory is used, for example, by the control section 321 as working memory needed to execute various calculations and other processes. The nonvolatile memory is preloaded (preinstalled) with software, including computer programs, from an external source such as a storage medium or an internet server.
The image processing device 320 further includes a communication section 323, which is used to perform wireless communication (in this example, short-range wireless communication) with the communication section 423 of the remote control device 420 at the remote control end 120.
Examples of short-range wireless communication (communication over approximately several meters to 100 meters) include wireless LAN (local area network) communication, which covers a range of approximately several tens of meters to 100 meters, and wireless PAN (personal area network) communication, which covers a range of approximately several meters to several tens of meters.
An example of wireless LAN communication is IEEE 802.11-compliant wireless LAN communication. An example of wireless PAN communication is IEEE 802.15-compliant wireless PAN communication.
The communication section 323 transmits data to and receives data from the communication section 423 of the remote control device 420 by short-range wireless communication. In this example, the communication section 323 performs IEEE 802.11-compliant wireless LAN communication.
The communication between the image processing device 320 and the remote control device 420 is by no means limited to short-range wireless communication, and may also be long-range (more than about 100m) wireless communication.
The image processing device 320 generates a left-eye panoramic image and a right-eye panoramic image (panoramic picture data) PI based on the left-eye images and right-eye images (picture data) IM taken by the paired cameras 311 and 312 constituting the camera units 310, respectively.
The panoramic images are generated, for example, by the following technique. For the left eye, a left-eye panoramic image (panoramic picture data) is generated by combining, frame by frame, the left-eye images IM taken by the left-eye cameras 311, which are arranged omnidirectionally or substantially omnidirectionally so that the captured images overlap, thereby synthesizing a single image in which the overlapping left-eye images IM are joined. Likewise, for the right eye, a right-eye panoramic image (panoramic picture data) is generated by combining, frame by frame, the right-eye images IM taken by the right-eye cameras 312, which are arranged omnidirectionally or substantially omnidirectionally so that the captured images overlap, thereby synthesizing a single image in which the overlapping right-eye images IM are joined. The panoramic images PI and PI may be generated using any well-known conventional technique; accordingly, no further details are given here.
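As a rough illustration of the frame-by-frame combination described above, the sketch below stitches one set of simultaneously captured frames into a single panorama using OpenCV's general-purpose stitcher. It only approximates whichever conventional technique is actually used; the function and variable names are assumptions, not the patent's implementation.

```python
import cv2  # OpenCV, assumed available

def stitch_frames(frames):
    """Combine one set of overlapping frames (all left-eye or all right-eye,
    captured at the same instant) into a single panoramic image."""
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, panorama = stitcher.stitch(frames)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return panorama

# Applied once per video frame to the left-eye frame set and once to the
# right-eye frame set, this yields the left-eye and right-eye panoramic images PI.
```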
The image processing device 320 temporarily stores the generated left-eye and right-eye panoramic images PI and PI in the storage section 322.
The image processing device 320 performs a cropping process to crop out a partial panoramic image PIP from the left-eye panoramic image PI stored in the storage section 322 and a partial panoramic image PIP from the right-eye panoramic image PI stored in the storage section 322.
Fig. 4 is a diagram illustrating the cropping process in which the image processing device 320 crops out the partial panoramic images PIP and PIP from the panoramic images PI and PI. The left-eye and right-eye panoramic images PI and PI have substantially the same configuration, so fig. 4 does not distinguish between them but shows the cropping process common to both the left-eye and right-eye images.
The cropping process in which the image processing device 320 crops out the partial panoramic images PIP and PIP from the panoramic images PI and PI is easier to understand if the head 510 of the operator 500 is imagined at the center of a hollow sphere on whose inner surface the panoramic images PI and PI are arranged omnidirectionally or substantially omnidirectionally (in this example, substantially omnidirectionally).
The image processing device 320 crops the partial panoramic images PIP and PIP from the panoramic images PI and PI at predetermined cropping angles (β and γ) corresponding to the angle of view of the operator 500, using a predetermined reference point α (e.g., a forward yaw reference angle Φy, roll reference angle Φr, or pitch reference angle Φp of the work machine 200) as a reference.
For example, when one of the camera units 310 is located near the front of the work machine 200, the predetermined reference point α may be the center pixel of the entire pixel area of the image transmitted from that camera unit 310. In addition, the predetermined cropping angles (β and γ) may be set to angles less than or equal to the viewing angles of a normal person; for example, the horizontal cropping angle β may be approximately 200° or less, and the vertical cropping angle γ may be approximately 125° or less (specifically, 75° below and 50° above the imaginary horizontal reference plane).
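A minimal sketch of such a cropping step is given below, assuming the panoramic image is stored as an equirectangular bitmap covering 360° horizontally and 140° vertically. The function name, the centered-window simplification, and the default angles are illustrative assumptions rather than the patent's actual implementation.

```python
import numpy as np

def crop_partial_panorama(panorama, yaw_ref_deg=0.0, pitch_ref_deg=0.0,
                          beta_deg=200.0, gamma_deg=125.0,
                          h_fov_deg=360.0, v_fov_deg=140.0):
    """Cut a partial panoramic image PIP of beta x gamma degrees around the
    reference point (yaw_ref_deg, pitch_ref_deg) out of the full panorama PI."""
    h, w = panorama.shape[:2]
    win_w = int(w * beta_deg / h_fov_deg)                    # window width in pixels
    win_h = int(h * gamma_deg / v_fov_deg)                   # window height in pixels
    cx = int(w * (yaw_ref_deg % 360.0) / h_fov_deg)          # column of the reference point
    cy = int(h * (0.5 - pitch_ref_deg / v_fov_deg))          # row of the reference point
    cols = np.arange(cx - win_w // 2, cx + win_w // 2) % w   # horizontal wrap-around
    top = max(0, cy - win_h // 2)
    bottom = min(h, cy + win_h // 2)
    return panorama[top:bottom][:, cols]
```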
The image processing device 320 (see fig. 3) transmits the cropped partial panoramic image PIP of the left-eye panoramic image (panoramic picture data) PI and the cropped partial panoramic image PIP of the right-eye panoramic image (panoramic picture data) PI to the remote control device 420 at the remote control end 120 through wireless communication. More specifically, the image processing device 320 compresses the cropped partial panoramic images PIP and PIP of the left-eye and right-eye panoramic images PI and PI and transmits them to the remote control device 420 in real time through wireless communication. In this example, the data transmitted by wireless communication is encrypted.
The remote control device 420 at the remote control end 120 receives the partial panoramic images PIP and PIP of the left-eye and right-eye panoramic images PI and PI from the image processing device 320 and transfers the received partial panoramic images PIP and PIP to the display device 410. In this example, the remote control device 420 is a computer (in this example, a tablet computer).
Fig. 5 is a block diagram showing a system configuration of the remote control device 420 at the remote control end 120 of the display system 100 of the first embodiment.
As shown in fig. 5, the remote control device 420 includes a control section 421 that executes various programs, calculations, and other processes, and a storage section 422 that stores various data.
The remote control device 420 has a predetermined communication interface for communicating with the display device 410 and is electrically connected to the display device 410 through this communication interface via the connection cable CB2. This configuration enables the remote control device 420 to transmit the partial panoramic images PIP and PIP of the left-eye and right-eye panoramic images PI and PI to the display device 410.
In the present example, the communication interface between the remote control device 420 and the display device 410 is an HDMI (high-definition multimedia interface) interface. The communication interface between the remote control device 420 and the display device 410 is by no means limited to HDMI and may be any other communication interface. In addition, the communication between the remote control device 420 and the display device 410 is by no means limited to wired communication and may also be short-range wireless communication.
The storage section 422 includes volatile memory, such as RAM, and nonvolatile memory, such as flash memory. The volatile memory is used, for example, by the control section 421 as working memory needed to execute various calculations and other processes. The nonvolatile memory is preloaded (preinstalled) with software, including computer programs, from an external source such as a storage medium or an internet server.
The remote control device 420 further includes a communication section 423, which is used to perform wireless communication (in this example, short-range wireless communication) with the communication section 323 of the image processing device 320 at the work machine end 110.
The wireless communication by the communication section 423 of the remote control apparatus 420 is the same as the wireless communication by the communication section 323 of the image processing apparatus 320; accordingly, no further details are given here.
In this example, the remote control device 420 is a tablet computer, but may also be any other mobile terminal device, such as a notebook computer, a multifunctional mobile handset ("smartphone"), a watch-style terminal device, or a wearable device (a terminal device that can be attached to clothing). The remote control device 420 may be provided in the display device 410. The remote control device 420 is by no means limited to a mobile terminal device, but may also be a desktop computer. Operator 500 may stand or sit during remote control of work machine 200.
More specifically, the remote control device 420 temporarily stores the received partial panoramic images PIP and PIP of the left-eye and right-eye panoramic images PI and PI in the storage section 422 and at the same time transfers them to the display device 410.
The display device 410 at the remote control end 120 receives the partial panoramic images PIP and PIP of the left-eye and right-eye panoramic images PI and PI from the remote control device 420 and displays them. More specifically, the display device 410 performs streaming playback, in which it receives and reproduces (displays) the partial panoramic images PIP and PIP of the left-eye and right-eye panoramic images (panoramic picture data) PI and PI transmitted in real time from the remote control device 420.
This configuration enables the display device 410 to implement a real-time (live) three-dimensional display ("3D image" or "stereoscopic image"). Any well-known conventional technique may be used to implement a three-dimensional display on display device 410; accordingly, no further details are given here.
In the present example, the display device 410 (see fig. 1) is mounted on the head 510 of the operator 500 (specifically, in front of the eyes). More specifically, the display device 410 is an HMD (head mounted display). The display device 410 is by no means limited to a Head Mounted Display (HMD), and may also be a desktop display device (monitor) or a video projector.
The components 400 at the remote control end 120 further include a line-of-sight detecting section 411 (see fig. 1 and 5), which is used to detect movement of the line of sight of the operator 500. In the present example, the line-of-sight detecting section 411 is a posture detecting section for detecting the posture of the head 510 of the operator 500 (see fig. 1).
As shown in fig. 1, the line-of-sight detecting section 411 is mounted on the head 510 of the operator 500 and is capable of detecting a yaw angle θy about the y-axis extending in the vertical direction of the head 510, a roll angle θr about the r-axis extending in the front/rear direction of the head 510, and a pitch angle θp about the p-axis extending in the left/right direction of the head 510, using a reference posture of the head 510 (for example, the posture of the head 510 when the operator 500 is looking in the horizontal direction) as a reference. Before the posture of the head 510 of the operator 500 is detected, a reference value of the reference posture may be specified.
In the present example, where the line-of-sight detecting section 411 is a posture detecting section, it may be, for example, one of various sensors including a three-dimensional gyro sensor, a three-dimensional gravitational acceleration sensor, a rotary encoder, and a potentiometer. The line-of-sight detecting section 411 may be of a well-known conventional type; accordingly, no further details are given here.
The line-of-sight detecting section 411 (see fig. 5) transmits data on the detected movement of the head 510 of the operator 500 (specifically, the yaw angle θy, the roll angle θr, and the pitch angle θp with respect to the reference posture of the head 510) to the remote control device 420.
In the present embodiment, the line-of-sight detecting section 411 is provided in the display device 410. In other words, the display device 410 includes the line-of-sight detecting section 411, detects the movement of the head 510 of the operator 500 (specifically, the yaw angle θy, the roll angle θr, and the pitch angle θp with respect to the reference posture of the head 510) using the line-of-sight detecting section 411, and transmits data on the detected movement of the head 510 of the operator 500 to the remote control device 420.
The transmission of the movement of the head 510 of the operator 500 from the line-of-sight detecting section 411 (in this example, the display device 410) to the remote control device 420 may be performed by communication similar to that between the remote control device 420 and the display device 410. The transmission of the movement of the head 510 of the operator 500 from the line-of-sight detecting section 411 (in this example, the display device 410) to the remote control device 420 is by no means necessarily performed by wired communication, and may also be performed by short-range wireless communication.
The remote control device 420 transmits the data on the movement of the head 510 of the operator 500 (specifically, the yaw angle θy, the roll angle θr, and the pitch angle θp with respect to the reference posture of the head 510) transmitted from the line-of-sight detecting section 411 (in this example, the display device 410) to the image processing device 320 at the work machine end 110 by wireless communication.
The image processing device 320 at the work machine end 110 receives the data on the movement of the head 510 of the operator 500 (specifically, the yaw angle θy, the roll angle θr, and the pitch angle θp with respect to the reference posture of the head 510) from the remote control device 420 and adjusts the partial panoramic images PIP and PIP of the left-eye and right-eye panoramic images PI and PI to be transmitted to the display device 410 according to the received movement of the head 510.
Fig. 6 is a diagram illustrating the adjustment process in which the image processing device 320 adjusts the partial panoramic images PIP and PIP of the panoramic images PI and PI according to the movement of the head 510 of the operator 500. The left-eye and right-eye panoramic images PI and PI have substantially the same configuration, so fig. 6 does not distinguish between them but shows the adjustment process common to both the left-eye and right-eye images.
The adjustment process in which the image processing device 320 adjusts the partial panoramic images PIP and PIP of the panoramic images PI and PI according to the movement of the head 510 of the operator 500 is easier to understand if the head 510 of the operator 500 is imagined at the center of a hollow sphere on whose inner surface the panoramic images PI and PI are arranged omnidirectionally or substantially omnidirectionally (in this example, substantially omnidirectionally).
For each of the panoramic images PI and PI, the image processing device 320 calculates a relative reference point Δα (e.g., a relative yaw reference angle Φy + θy, a relative roll reference angle Φr + θr, and a relative pitch reference angle Φp + θp with respect to the reference point at the front of the work machine 200) that deviates from the predetermined reference point α (e.g., the forward yaw reference angle Φy, roll reference angle Φr, or pitch reference angle Φp of the work machine 200) by the received movement of the head 510 of the operator 500 (specifically, the yaw angle θy, the roll angle θr, and the pitch angle θp with respect to the reference posture of the head 510).
More specifically, the image processing device 320 crops the partial panoramic images PIP and PIP from the panoramic images PI and PI at the cropping angles (β and γ) corresponding to the viewing angle of the operator 500, using the calculated relative reference point Δα as a reference.
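Continuing the cropping sketch above, the adjustment step can be pictured as recomputing the reference point by adding the reported head angles to the machine-fixed reference angles and then cropping again. Roll is omitted here for brevity (it would rotate the cropped window rather than shift it), and all names remain illustrative assumptions.

```python
def adjust_partial_panorama(panorama, phi_yaw_deg, phi_pitch_deg,
                            theta_yaw_deg, theta_pitch_deg, **crop_kwargs):
    """Re-crop the panorama PI around the relative reference point
    delta-alpha = (phi_yaw + theta_yaw, phi_pitch + theta_pitch)."""
    rel_yaw = phi_yaw_deg + theta_yaw_deg        # relative yaw reference angle
    rel_pitch = phi_pitch_deg + theta_pitch_deg  # relative pitch reference angle
    return crop_partial_panorama(panorama, rel_yaw, rel_pitch, **crop_kwargs)
```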
In the present example, the line-of-sight detecting section 411 is a posture detecting section that detects the posture of the head 510 of the operator 500; alternatively, it may be a detecting section that detects the line of sight (eye movement) of the operator 500 directly. In the latter case, if the line-of-sight detecting section responds too quickly to a change in the line of sight of the operator 500, the image processing device 320 may carry out the adjustment process or the cropping process with a delayed response to the line-of-sight detecting section.
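One conceivable way to implement such a delayed response, sketched here as an assumption rather than the patent's method, is to low-pass filter and rate-limit the detected gaze angle before it is used for the cropping or adjustment process:

```python
def damped_angle(prev_deg, detected_deg, alpha=0.2, max_step_deg=5.0):
    """Move the angle actually used for cropping toward the newly detected
    gaze angle, but no faster than the allowed response speed."""
    step = alpha * (detected_deg - prev_deg)            # exponential smoothing
    step = max(-max_step_deg, min(max_step_deg, step))  # per-frame rate limit
    return prev_deg + step
```

Applied once per frame to each of the yaw, roll, and pitch angles, this keeps the displayed window from jumping when the detected line of sight changes abruptly.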
Fig. 7 is a flowchart depicting an exemplary control operation implemented by the display system 100 of the first embodiment.
In the control operation carried out by the display system 100 of the first embodiment, as shown in fig. 7, the image processing device 320 at the work machine end 110 first generates the panoramic images PI and PI from the plurality of images IM captured by the camera units 310 (step S101).
Next, the image processing apparatus 320 performs a cropping process to crop out the partial panoramic images PIP and PIP of the generated panoramic images PI and PI (step S102).
Next, the image processing device 320 transmits the cropped partial panoramic images PIP and PIP of the panoramic images PI and PI to the remote control device 420 at the remote control end 120 through wireless communication (step S103).
Next, the remote control device 420 at the remote control end 120 receives the partial panoramic images PIP and PIP of the panoramic images PI and PI from the image processing device 320 at the work machine end 110 and transfers them to the display device 410 (step S104).
Next, the display device 410 simultaneously receives and displays the partial panoramic images PIP and PIP of the panoramic images PI and PI transmitted from the remote control device 420 (step S105).
Next, the display device 410 detects the movement of the head 510 of the operator 500 and transmits the detected movement to the remote control device 420 (step S106).
Next, the remote control device 420 receives the movement of the head 510 of the operator 500 from the display device 410 and transmits it to the image processing device 320 at the work machine end 110 by wireless communication (step S107).
Next, the image processing device 320 at the work machine end 110 receives the movement of the head 510 of the operator 500 from the remote control device 420 at the remote control end 120 and adjusts the partial panoramic images PIP and PIP of the panoramic images PI and PI according to the movement of the head 510 (step S108).
Then, the display system 100 of the first embodiment repeats the above-described steps S101 to S108 until the process is completed (step S109).
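As a reading aid only, the S101-S108 loop of fig. 7 can be summarised in the following sketch; the objects and method names are placeholders for the processing units described above and do not correspond to any actual API.

    def first_embodiment_loop(cameras, image_proc, remote_ctrl, display, job_done):
        # Placeholder objects standing in for the camera units, the image
        # processing device 320, the remote control device 420 and the
        # display device 410.
        while not job_done():                                           # S109
            pi_l, pi_r = image_proc.generate_panoramas(cameras.capture())  # S101
            pip_l, pip_r = image_proc.crop_partials(pi_l, pi_r)            # S102
            remote_ctrl.receive_partials(pip_l, pip_r)                     # S103/S104
            display.show(pip_l, pip_r)                                     # S105
            motion = display.detect_head_motion()                          # S106
            remote_ctrl.relay_motion(motion)                               # S107
            image_proc.adjust_partials(motion)                             # S108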
In the display system 100 of the first embodiment described above, the image processing device 320 adjusts the partial panoramic images PIP and PIP, of the left-eye and right-eye panoramic images PI and PI, that are to be transmitted to the display device 410 according to the movement of the head 510 of the operator 500 transmitted from the remote control device 420. The display system 100 is thus able to adjust the three-dimensional view being displayed on the display device 410 according to the movement of the head 510 of the operator 500, and is thus able to change the image being displayed on the display device 410 by tracking the movement of the line of sight of the operator 500. This configuration can improve remote control operability. Further, the image processing device 320 transmits only the partial panoramic images PIP and PIP of the panoramic images PI and PI to the remote control device 420 through wireless communication. Therefore, the display system 100 can reduce the amount of data transmitted by wireless communication compared to the case where the complete panoramic images PI and PI are transmitted to the remote control device 420 through wireless communication.
In a display system in which a single camera is physically moved in order to change the image being displayed on the display device, changing that image takes the time required to move the camera.
In contrast, because the image processing device 320 holds the panoramic images PI and PI (omnidirectional or substantially omnidirectional images), the display system 100 of the first embodiment can eliminate the time delay that inevitably occurs while the camera is being moved in such a moving-camera configuration.
In addition, in the first embodiment, the camera units 310 and 310 are located at placement positions that horizontally correspond (specifically, in the front/rear and left/right directions) to the position where a seat can be mounted on the work machine 200, and the lens centers of the camera units 310 and 310 are located at a sitting height corresponding to the eye height of a worker of average physique (e.g., height and sitting height) for the target market of the work machine 200 who is assumed to be sitting on that seat. Accordingly, among the panoramic images PI and PI, the display device 410 can display partial panoramic images PIP and PIP that are similar to the real view such a worker sitting on the seat would see.
Second embodiment
Fig. 8 is a schematic diagram of a display system 100 for remotely controlling a working machine 200 of the second embodiment.
As shown in fig. 8, the display system 100 according to the second embodiment differs from the display system 100 of the first embodiment in that the remote control device 420 in the remote control terminal 120 is replaced by a plurality of remote control devices 420 and 420, and the display device 410 is replaced by a plurality of display devices 410 and 410 respectively corresponding to the remote control devices 420 and 420.
Those components of the display system 100 of the second embodiment that are substantially the same as those of the display system 100 of the first embodiment are denoted by the same reference numerals. The differences from the display system 100 of the first embodiment will be described below with emphasis.
The image processing device 320 at the work machine end 110 crops out the partial panoramic images PIP-PIP and PIP-PIP of the panoramic images PI and PI, and transmits them to the remote control devices 420 and 420 in the remote control terminal 120 through wireless communication. The communication between the image processing device 320 at the work machine end 110 and the remote control devices 420 and 420 at the remote control end 120 uses multiplex communication, in which a plurality of signals are transmitted simultaneously over a single communication channel. The multiplexing scheme is not particularly limited; examples include FDM (frequency division multiplexing) communication, TDM (time division multiplexing) communication, and other multiplex communication schemes.
The remote control device 420 in the remote control terminal 120 receives the partial panorama images PIP-PIP and PIP-PIP of the panorama images PI and PI from the image processing device 320 and transmits the received partial panorama images PIP-PIP and PIP-PIP of the panorama images PI and PI to the related display device 410.
The display devices 410 and 410 in the remote control terminal 120 receive the partial panoramic images PIP-PIP and PIP-PIP of the panoramic images PI and PI from the associated remote control devices 420 and 420 and display the received partial panoramic images PIP-PIP and PIP-PIP of the panoramic images PI and PI. The plurality of sight line detection portions 411 and 411 (in this example, the plurality of display devices 410 and 410) detect the movement of the heads 510 and 510 of the respective operators 500 and 500, and transmit the detected movement of the heads 510 and 510 of the operators 500 and 500 to the associated remote control devices 420 and 420. The remote control devices 420 and 420 transmit the movement of the heads 510 and 510 of the operators 500 and 500 transmitted from the associated sight line detection portions 411 and 411 (in this example, the display devices 410 and 410) to the image processing device 320 through wireless communication.
The image processing device 320 at the work machine end 110 adjusts the partial panoramic images PIP-PIP and PIP-PIP of the panoramic images PI and PI according to the movement of the heads 510 and 510 of the operators 500 and 500 transmitted from the remote control devices 420 and 420.
The image processing apparatus 320 crops out partial panorama images PIP-PIP and PIP-PIP of the adjusted panorama images PI and PI from the panorama images PI and PI.
The process flow carried out by the display system 100 of the second embodiment is substantially the same as the process flow carried out by the display system 100 of the first embodiment shown in fig. 7, except that the remote control device 420 is replaced by the remote control devices 420 and 420 and the display device 410 is replaced by the display devices 410 and 410. Accordingly, no further details are given here.
In the display system 100 of the second embodiment, the operators 500 and 500 are able to simultaneously view the images being displayed on their respective display devices 410 and 410 in their desired fields of view. This configuration can improve job safety as compared with the case where a single operator 500 views an image being displayed on his/her display device 410 in the first embodiment.
Third embodiment
The display system 100 of the third embodiment differs from the display system 100 of the first embodiment in that the remote control device 420 in the remote control terminal 120, instead of the image processing device 320 at the work machine end 110, performs the cropping process on the panoramic images PI and PI and the adjustment process on the partial panoramic images PIP and PIP of the panoramic images PI and PI.
Those components of the display system 100 of the third embodiment that are substantially the same as those of the display system 100 of the first embodiment are denoted by the same reference numerals. The differences from the display system 100 of the first embodiment will be described below with emphasis.
Fig. 9 is a block diagram showing a system configuration of the image processing apparatus 320 provided at the work machine end 110 of the display system 100 for remotely controlling the work machine 200 of the third embodiment. Fig. 10 is a block diagram showing a system configuration of the remote control device 420 provided in the remote control terminal 120 of the display system 100 of the third embodiment.
As shown in fig. 9, the image processing device 320 in the job end 110 generates a left-eye panoramic image PI (data on a complete left-eye panoramic image PI) and a right-eye panoramic image PI (data on a complete right-eye panoramic image PI) and transmits them to the remote control device 420 in the remote control end 120 by wireless communication.
As shown in fig. 10, the remote control device 420 in the remote control terminal 120 receives a left-eye panorama image PI (data on a full left-eye panorama image PI) and a right-eye panorama image PI (data on a full right-eye panorama image PI) from the image processing device 320, and crops a partial panorama image PIP of the received left-eye panorama image PI and a partial panorama image PIP of the received right-eye panorama image PI, and transmits the cropped partial panorama images PIP and PIP of the left-and right-eye panorama images PI and PI to the display device 410.
More specifically, the remote controlling apparatus 420 temporarily stores the received left-eye and right-eye panoramic images PI and PI in the storage section 422.
The remote controlling apparatus 420 performs a cropping process to crop out the partial panoramic image PIP of the left-eye panoramic image PI stored in the storage part 422 and the partial panoramic image PIP of the right-eye panoramic image PI stored in the storage part 422.
The cropping process in which the remote control apparatus 420 crops the partial panoramic images PIP and PIP out of the panoramic images PI and PI is similar to the cropping process, illustrated in fig. 4, in which the image processing apparatus 320 crops the partial panoramic images PIP and PIP out of the panoramic images PI and PI; accordingly, no further details are given here.
The remote control apparatus 420 adjusts the partial panoramic images PIP and PIP, of the left-eye and right-eye panoramic images PI and PI, that are to be transmitted to the display apparatus 410 according to the movement of the head 510 of the operator 500 transmitted from the display apparatus 410.
The adjustment process in which the remote control apparatus 420 adjusts the partial panoramic images PIP and PIP of the panoramic images PI and PI according to the movement of the head 510 of the operator 500 is similar to the adjustment process, illustrated in fig. 6, in which the image processing apparatus 320 adjusts the partial panoramic images PIP and PIP of the panoramic images PI and PI according to the movement of the head 510 of the operator 500; accordingly, no further details are given here.
The remote controlling apparatus 420 cuts out the partial panorama images PIP and PIP of the adjusted panorama images PI and PI from the panorama images PI and PI.
Fig. 11 is a flowchart depicting an exemplary control operation carried out by the display system 100 of the third embodiment.
In the control operation carried out by the display system 100 of the third embodiment, as shown in fig. 11, the image processing apparatus 320 in the job end 110 first generates panoramic images PI and PI from the plurality of images IM-IM and IM-IM captured by the camera unit 310 and 310 (step S201).
Next, the image processing apparatus 320 transmits the generated panoramic images PI and PI to the remote control apparatus 420 in the remote control 120 through wireless communication (step S202).
Next, the remote control device 420 in the remote control terminal 120 receives the panoramic images PI and PI from the image processing device 320 in the job machine terminal 110, and simultaneously performs a cropping process to crop out partial panoramic images PIP and PIP of the panoramic images PI and PI (step S203).
Next, the remote controlling apparatus 420 transmits the partial panoramic images PIP and PIP of the cropped panoramic images PI and PI to the display apparatus 410 (step S204).
Next, the display device 410 simultaneously receives and displays the partial panoramic images PIP and PIP of the panoramic images PI and PI transmitted from the remote control device 420 (step S205).
Next, the display device 410 detects the movement of the head 510 of the operator 500, and simultaneously transmits the movement to the remote controlling device 420 (step S206).
Next, the remote controlling apparatus 420 receives the movement of the head 510 of the operator 500 from the display apparatus 410, and simultaneously adjusts the partial panoramic images PIP and PIP of the panoramic images PI and PI according to the movement of the head 510 (step S207).
Then, the display system 100 of the third embodiment repeats the above-described steps S201 to S207 until the processing is completed (step S208).
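Likewise for illustration only, the S201-S207 loop of fig. 11 can be summarised with the same kind of placeholder objects; here the cropping and adjustment run on the remote control side rather than on the work machine side.

    def third_embodiment_loop(cameras, image_proc, remote_ctrl, display, job_done):
        # Placeholder objects and method names; illustrative only.
        while not job_done():                                           # S208
            pi_l, pi_r = image_proc.generate_panoramas(cameras.capture())  # S201
            remote_ctrl.receive_panoramas(pi_l, pi_r)                      # S202
            pip_l, pip_r = remote_ctrl.crop_partials(pi_l, pi_r)           # S203
            display.show(pip_l, pip_r)                                     # S204/S205
            motion = display.detect_head_motion()                          # S206
            remote_ctrl.adjust_partials(motion)                            # S207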
In the display system 100 of the third embodiment, the remote control apparatus 420 adjusts the partial panoramic images PIP and PIP, of the left-eye and right-eye panoramic images PI and PI, that are to be transmitted to the display apparatus 410 according to the movement of the head 510 of the operator 500 transmitted from the display apparatus 410. The display system 100 is thus able to adjust the three-dimensional view being displayed on the display device 410 according to the movement of the head 510 of the operator 500, and is thus able to change the image being displayed on the display device 410 by tracking the movement of the line of sight of the operator 500. This configuration can improve remote control operability. Further, the remote control device 420 in the remote control terminal 120 performs the adjustment process that adjusts the partial panoramic images PIP and PIP of the panoramic images PI and PI according to the movement of the head 510 of the operator 500. Therefore, compared with the case where the image processing apparatus 320 at the work machine end 110 performs the adjustment processing, the display system 100 can eliminate the time taken to transfer the movement of the head 510 from the remote control apparatus 420 in the remote control terminal 120 to the image processing apparatus 320 at the work machine end 110.
In a display system in which a single camera is physically moved in order to change the image being displayed on the display device, changing that image takes the time required to move the camera.
In contrast, because the remote control device 420 holds the panoramic images PI and PI (omnidirectional or substantially omnidirectional images), the display system 100 of the third embodiment can eliminate the time delay that inevitably occurs while the camera is being moved in such a moving-camera configuration.
Fourth embodiment
Fig. 12 is a schematic diagram of a display system 100 for remotely controlling a working machine 200 of the fourth embodiment.
As shown in fig. 12, the display system 100 of the fourth embodiment is different from the display system 100 of the third embodiment in that the remote control device 420 in the remote control terminal 120 is replaced by a plurality of remote control devices 420 and 420, and the display device 410 is replaced by a plurality of display devices 410 and 410 respectively corresponding to the remote control devices 420 and 420.
Those components of the display system 100 of the fourth embodiment that are substantially the same as those of the display system 100 of the third embodiment are denoted by the same reference numerals. The description below focuses on the differences from the display system 100 of the third embodiment.
The image processing device 320 at the work machine end 110 transmits the panoramic images PI and PI (data on the complete panoramic images PI and PI) to the remote control devices 420 and 420 at the remote control end 120 through wireless communication. The communication between the image processing device 320 at the work machine end 110 and the remote control devices 420 and 420 at the remote control end 120 is similar to that of the second embodiment; accordingly, no further details are given here.
The remote control device 420 in the remote control terminal 120 receives the panorama images PI and PI from the image processing device 320, clips partial panorama images PIP-PIP and PIP-PIP of the received panorama images PI and PI, and transmits the partial panorama images PIP-PIP and PIP-PIP of the clipped panorama images PI and PI to the related display device 410.
The display device 410 and 410 in the remote control terminal 120 receives the partial panorama images PIP-PIP and PIP-PIP of the panorama images PI and PI from the associated remote control device 420 and displays the partial panorama images PIP-PIP and PIP-PIP of the received panorama images PI and PI. The plurality of sight line detection portions 411 and 411 (in this example, the plurality of display devices 410 and 410) detect the movement of the head portion 510 and 510 of the relevant operator 500 and 500, and transmit the detected movement of the head portion 510 and 510 of the operator 500 and 500 to the relevant remote control device 420 and 420.
The remote control device 420 and 420 in the remote control terminal 120 adjusts the partial panoramic images PIP-PIP and PIP-PIP of the panoramic images PI and PI according to the movement of the head 510 and 510 of the relevant operator 500 and 500 transmitted from the relevant sight line detection part 411 and 411 (in this example, the display device 410 and 410).
Then, the remote controlling device 420 clips out the adjusted partial panorama images PIP-PIP and PIP-PIP of the panorama images PI and PI from the panorama images PI and PI.
The process flow carried out by the display system 100 of the fourth embodiment is substantially the same as the process flow carried out by the display system 100 of the third embodiment shown in fig. 11, except that the remote control device 420 is replaced by the remote control devices 420 and 420 and the display device 410 is replaced by the display devices 410 and 410; accordingly, no further details are given here.
In the display system 100 of the fourth embodiment, the operators 500 and 500 are able to simultaneously view the images being displayed on their respective display devices 410 and 410 in their desired fields of view. This configuration can improve job safety when compared with the case where a single operator 500 views an image being displayed on his/her display device 410 in the first embodiment.
Fifth embodiment
Fig. 13 is a schematic side view of the display system 100 for remotely controlling the working machine 200 of the fifth embodiment.
As shown in fig. 13, the display system 100 of the fifth embodiment differs from the display systems 100 of the first to fourth embodiments in that the support member 330 provided on the working machine 200 is replaced with a support member 340 capable of moving up/down and rotating vertically.
Those components of the display system 100 of the fifth embodiment that are substantially the same as those of the display system 100 of the first to fourth embodiments are denoted by the same reference numerals, and further details will not be given here to those components.
The support member 340 includes the support 331 and a support base (in this example, support arms 342). The support 331 supports the camera units 310 and 310. The support base supports the support 331 at its upper end, and its lower end is in turn supported by the upper body 221, so that the camera units 310 and 310 are maintained in a horizontal posture.
In this example, the support arms 342 support the support 331 at their upper ends such that the support 331 is freely rotatable about an axis (the first fulcrum 342a) extending in the left/right direction, and their lower ends are supported by the upper body 221 such that the support arms 342 are freely rotatable about an axis (the second fulcrum 342b) extending in the left/right direction. More specifically, the support arms 342 may be configured such that the second fulcrum 342b is always located rearward of the first fulcrum 342a, regardless of the rotation of the support arms 342 about the axis (the second fulcrum 342b) extending in the left/right direction. A plurality of support arms 342 (in this example, two support arms 342 and 342) are provided. The support arms 342 and 342 are configured as parallel links arranged parallel to each other in the front/rear direction so as to support the support 331 in such a manner that the support 331 can move up/down. The first fulcrums 342a-342a and the second fulcrums 342b-342b of the support arms 342 and 342 are separated by distances that maintain the camera units 310 and 310 in a horizontal posture. In this example, the first fulcrums 342a-342a and the second fulcrums 342b-342b of the support arms 342 and 342 are positioned on the same horizontal line, and the first fulcrums 342a-342a and the second fulcrums 342b-342b are separated by equal distances. This configuration enables the support member 340 to maintain the camera units 310 and 310 in the horizontal posture regardless of the rotation of the support arms 342 and 342 about the axis (the second fulcrums 342b-342b) extending in the left/right direction. As previously mentioned, "horizontal posture" in this context is defined with reference to the attitude of the work machine 200 when the work machine 200 is placed on a horizontal surface.
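As a purely illustrative aid (not taken from the disclosure), the following sketch shows how, under the parallel-link geometry just described (equal link lengths, both fulcrums on the same horizontal line), only the lens height changes as the arms rotate while the camera units stay horizontal; the symbol names and numeric values are assumptions.

    import math

    def camera_lens_height(base_height, link_length, arm_angle_deg):
        # Parallel links of equal length keep the support 331 (and thus the
        # camera units) horizontal; rotating the arms about the second fulcrum
        # only shifts the lens centre vertically (0 deg = links horizontal).
        return base_height + link_length * math.sin(math.radians(arm_angle_deg))

    # For example, sweeping the arm angle over a range gives the vertical
    # movement range R of the lens centre discussed for the fifth embodiment:
    heights = [camera_lens_height(1.2, 0.8, a) for a in range(-30, 61, 10)]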
The support member 340 includes a driver section 343 that actuates the support arms 342 and 342. The driver section 343 is actuated in response to an operation instruction transmitted from the remote operation device 230 in the remote control terminal 120 so as to rotate the support arms 342 and 342 and thereby move the camera units 310 and 310 up/down. This configuration enables the operator 500 to move the camera units 310 and 310 up/down by remotely controlling the rotation of the support arms 342 and 342.
In the present example, the support member 340 is configured so that the vertical movement range R of the lens centers of the camera units 310 and 310 covers the sitting height H corresponding to the eye height of a worker of average physique (e.g., height and sitting height) for the target market of the working machine 200 who is assumed to be sitting on the seat. The image processing device 320 may transmit the current (vertical) height of the lens centers of the camera units 310 and 310 to the remote control end 120. This configuration enables the operator 500 to identify the current height of the lens centers of the camera units 310 and 310.
The support member 340 is configured such that, when the lens centers of the camera units 310 and 310 are located at the sitting height H, the camera units 310 and 310 are located at placement positions horizontally corresponding (specifically, in the front/rear direction) to the position where a seat can be mounted on the working machine 200.
The support arms 342 may instead be configured such that the support 331 is fixedly supported at the upper ends of the support arms 342 in such a manner that the support 331 can still move up/down.
In the display system 100 of the fifth embodiment, the camera units 310 and 310 are disposed on the support member 340, which can move up/down and can rotate in the vertical direction. This configuration enables the operator 500 to adjust the height of the lens centers of the camera units 310 and 310, and thus the height of the field of view of the operator 500, as necessary. As a result, for example, the operator 500 can grasp a wider field of view of the surroundings of the work machine 200 thanks to a field of view similar to a bird's-eye view. In addition, if the support member 340 is actuated to lower the height of the camera units 310 and 310, the working machine 200 can more easily enter a working site located in a pit or having a low ceiling. Further, when the operator 500 is controlling the work machine 200 by directly viewing the surrounding environment (by directly checking the surrounding environment), the camera units 310 and 310 can be retracted toward the work machine 200 by actuating the support member 340. In addition, the support arms 342 and 342 can be configured such that the second fulcrums 342b-342b are always located rearward of the first fulcrums 342a-342a. This arrangement enables the camera units 310 and 310 to reliably capture images of the portions of the working member 220 located in front of the first fulcrums 342a-342a.
Sixth embodiment
The display system 100 of the sixth embodiment differs from the display systems 100 of the first to fifth embodiments in that illumination devices 350 and 350 are additionally disposed at the periphery of (adjacent or close to) the camera units 310 and 310.
Fig. 14 is a schematic front view, in the display system 100 for remotely controlling the working machine 200 of the sixth embodiment, of the pair of cameras 311 and 312 constituting one camera unit 310 and of exemplary illumination devices 350 and 350 disposed on the pair of cameras 311 and 312. All of the illumination devices 350 and 350 disposed on the camera units 310 and 310 have the same structure; fig. 14 shows, as an example, the illumination devices 350 and 350 arranged on one of the camera units 310 and 310.
As shown in fig. 14, the illumination devices 350 and 350 are disposed on at least a part of the periphery (in this example, on the entire periphery) of the lenses 311a and 312a of the pair of cameras 311 and 312 in such a manner that the light-emitting side faces the object side. In the present example, the illumination devices 350 and 350 include a plurality of (eight in the present example) light emitting elements 351 and 351 (specifically, light emitting diodes) provided along the entire periphery of the lenses 311a and 312a of the pair of cameras 311 and 312. The light emitting elements 351 and 351 are disposed around the lenses 311a and 312a at equal or substantially equal distances from each other.
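A minimal sketch, assuming a circular arrangement, of how eight light-emitting elements could be placed at equal angular spacing around a lens of a given radius (the coordinates and radius are illustrative values, not taken from the disclosure):

    import math

    def led_positions(cx, cy, radius, count=8):
        # Centres of the light-emitting elements, spaced at equal angles
        # around the lens centre (cx, cy).
        step = 2.0 * math.pi / count
        return [(cx + radius * math.cos(i * step),
                 cy + radius * math.sin(i * step)) for i in range(count)]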
In the display system 100 according to the sixth embodiment, the illumination devices 350 and 350 are disposed at the periphery of the camera units 310 and 310. Therefore, the light emitted by the illumination devices 350 and 350 can effectively prevent shadows in the images captured by the camera units 310 and 310.
Seventh embodiment
In order to improve work-related issues such as work efficiency and safety at a work site, it is extremely important that the operator 500 knows, before starting a project, the work plan (e.g., the approximate length, width and depth of excavation), the rotation range of the working part 220 (specifically, the rotation range of the upper body 221 and the swing ranges of the boom 222, the arm 223 and the bucket 224), and the positions of natural gas pipelines, water pipelines, and other underground pipelines.
From this point of view, the display system 100 of the seventh embodiment differs from the display systems 100 of the first to sixth embodiments in that it provides AR (augmented reality) by displaying partial job assistance information Q1 and Q1 on the display device 410 together with the partial panoramic images PIP and PIP of the panoramic images PI and PI, in such a manner that the partial job assistance information Q1 and Q1 is associated with the partial panoramic images PIP and PIP of the panoramic images PI and PI. Here, the partial job assistance information Q1 and Q1 is the portion, among the job assistance information Q and Q (various supplementary information including the work plan, the rotation range of the working part 220, and the positions of natural gas pipelines, water pipelines, and other underground pipelines), that corresponds to the partial panoramic images PIP and PIP of the panoramic images PI and PI. Augmented reality is a technique for augmenting a real environment with visual information related to that environment; more specifically, it is a technique that combines, in real time, the video of the real environment captured by the camera units 310 and 310 with video related to that real environment generated by a computer, so as to generate a composite video in which the computer-generated video changes as the video captured by the camera units 310 and 310 changes.
Fig. 15 is a schematic view of exemplary display screens D and D on display device 410 of display system 100 for remotely controlling work machine 200 of the seventh embodiment, which display screens D and D display partial work assistance information Q1 and Q1 of work assistance information Q and Q together with partial panoramic images PIP and PIP of panoramic images PI and PI in such a manner that partial work assistance information Q1 and Q1 of work assistance information Q and Q is in agreement with partial panoramic images PIP and PIP of panoramic images PI and PI, wherein partial work assistance information Q1 and Q1 of work assistance information Q and Q is information related to partial panoramic images PIP and PIP of panoramic images PI and PI. Fig. 15 illustrates exemplary underground pipes such as natural gas pipes and water pipes (part of the work assistance information Q1 and Q1) as the work assistance information Q and Q based on the pipe network diagram below the ground G and G. The partial panorama image PIP of the left-eye panorama image PI and the partial panorama image PIP of the right-eye panorama image PI have substantially the same structure. Fig. 15 shows processing for augmented reality common to the left-eye and right-eye panoramic images PI and PI without distinction between the partial panoramic images PIP and PIP.
Based on a database prepared in advance in the display system 100 (for example, a database stored in advance in the storage section 422 of the remote control apparatus 420 or in the storage section 322 of the image processing apparatus 320) or a database located somewhere other than the display system 100 (for example, a database downloaded from an internet server), the remote control apparatuses 420 and 420 of the first and third embodiments and the image processing apparatus 320 of the first embodiment add the partial job assistance information Q1 and Q1 of the job assistance information Q and Q to the cropped partial panoramic images PIP and PIP of the panoramic images PI and PI in such a manner that the partial job assistance information Q1 and Q1 coincides with the partial panoramic images PIP and PIP, where the partial job assistance information Q1 and Q1 is the portion of the job assistance information Q and Q that corresponds to the cropped partial panoramic images PIP and PIP of the panoramic images PI and PI.
It is also possible that the image processing apparatuses 320 and 320 of the first and third embodiments and the remote control apparatus 420 of the third embodiment add the job assistance information Q and Q corresponding to the uncropped panoramic images PI and PI in such a manner that the job assistance information Q and Q coincides with the panoramic images PI and PI, based on a database prepared in advance in the display system 100 (for example, a database stored in advance in the storage section 422 of the remote control apparatus 420 or in the storage section 322 of the image processing apparatus 320) or a database located somewhere other than the display system 100 (for example, a database downloaded from an internet server). In that case, when the partial panoramic images PIP and PIP are cropped from the panoramic images PI and PI, the corresponding partial job assistance information Q1 and Q1 of the job assistance information Q and Q is cropped together with them.
The database may be, for example, a database built from a CAD (computer aided design) model. The panoramic images PI and PI contain, for example, position information (latitude and longitude) of the work machine 200 that can be obtained by using GNSS (a global navigation satellite system). The job assistance information Q and Q is, for example, assistance information that includes position information (latitude and longitude) corresponding to the position information (latitude and longitude) contained in the panoramic images PI and PI.
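For illustration only, one way the position-based matching between such a database and the current view could be organised is sketched below; database.query_near and the project callback are hypothetical helpers introduced for this sketch, not part of the disclosure.

    def attach_assistance_info(machine_lat, machine_lon, database, project):
        # Look up the job-assistance records (pipelines, work-plan limits, ...)
        # whose stored position lies near the work machine, then map each one
        # into the image coordinates of the current cropped partial panorama
        # so that it can be rendered over that view.
        overlays = []
        for record in database.query_near(machine_lat, machine_lon):
            uv = project(record.latitude, record.longitude, record.depth)
            if uv is not None:   # record falls inside the cropped field of view
                overlays.append((uv, record.label))
        return overlays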
The display apparatus 410 displays the partial job assistance information Q1 and Q1 of the job assistance information Q and Q together with the partial panoramic images PIP and PIP of the panoramic images PI and PI, in such a manner that the partial job assistance information Q1 and Q1 coincides with the partial panoramic images PIP and PIP, where the partial job assistance information Q1 and Q1 is the portion of the job assistance information Q and Q that corresponds to the partial panoramic images PIP and PIP of the panoramic images PI and PI.
This configuration enables the operator 500 to know the work assistance information Q and Q in advance. For example, the operator 500 can know whether the turning position of the working part 220 (specifically, the turning position of the upper body 221 and the swing positions of the boom 222, the arm 223, and the bucket 224) has reached a predetermined reference level specified in the work plan (for example, whether a reference work depth or a predetermined reference work range has been exceeded), has reached the limit of the work range at the work site (for example, is about to come into contact with the ceiling of the work site), or has reached an underground object at the work site (for example, is about to come into contact with a natural gas pipeline, a water pipeline, or another underground pipeline Q1).
Such augmented reality may be achieved using well-known conventional techniques; accordingly, no further details are given here.
In the display system 100 of the seventh embodiment, the display device 410 displays the partial job assistance information Q1 and Q1 of the job assistance information Q and Q together with the partial panoramic images PIP and PIP of the panoramic images PI and PI in such a manner that the partial job assistance information Q1 and Q1 of the job assistance information is in agreement with the partial panoramic images PIP and PIP of the panoramic images PI and PI based on a database prepared in advance in the display system 100 or a database located at a different location from the display system 100, wherein the partial job assistance information Q1 and Q1 of the job assistance information is assistance information corresponding to the partial panoramic images PIP and PIP of the panoramic images PI and PI among the job assistance information Q and Q. This configuration enables the operator 500 to know the work assistance information Q and Q (a variety of supplemental information including the work plan, the range of rotation of the work components, or the location of the natural gas, water, and other underground pipes). This in turn can improve problems associated with work, such as work efficiency and safety.
The present invention is by no means limited to the above-described embodiments and examples and can be implemented in various other ways. The embodiments are therefore to be considered in all respects as illustrative and not restrictive, and the description should not be interpreted restrictively. The scope of the invention is defined only by the claims and not by the description, and all variations and modifications that are equivalent to the claimed elements fall within the scope of the invention.
Industrial applicability
The present invention relates to a display system for a remote-controlled working machine, and is particularly suitable for tracking the movement of the operator's sight line to change the image being displayed on the display device accordingly.
List of reference numerals
100 display system
110 machine end
120 remote control terminal
200 working machine
210 traveling body
220 working part
221 upper body
222 boom
223 mechanical arm
224 excavator bucket
230 remote operation device
231 operating part
300 parts in the work machine end
310 camera unit
311 left eye Camera
311a lens
312 right eye camera
312a lens
320 image processing apparatus
321 control part
322 storage section
323 communication unit
330 supporting member
331 support
331a fastening member
332 support post
340 support member
342 support arm
342a first supporting shaft
342b second fulcrum
343 driver section
350 illumination device
351 light emitting element
400 parts in remote control terminal
410 display device
411 Sight line detection part
420 remote control device
421 control part
422 storage part
423 communication unit
500 operator
510 head
CB1 connecting cable
CB2 connecting cable
D display screen
G ground
H sitting height
IM image
MAX maximum height
MIN minimum height
PI panoramic image
PIP partial panoramic image
Q job assistance information
Q1 partial job assistance information
R vertical movement range
d distance
Δα relative reference point
α reference point
β horizontal cropping angle
γ vertical cropping angle
θp pitch angle
θr roll angle
θy yaw angle
Φp pitch reference angle
Φr roll reference angle
Φy yaw reference angle

Claims (5)

1. A display system for a remote control working machine, comprising:
a plurality of camera units attached to a working machine having a working member capable of horizontal rotation and vertical swing;
an image processing device that is provided to the work machine, generates a panoramic image based on a plurality of images captured by the plurality of camera units, and transmits a partial panoramic image of the generated panoramic image by wireless communication;
the display equipment is positioned at a remote control end of the display system; and
a remote control device that receives the partial panoramic image from the image processing device and transmits the received partial panoramic image to the display device,
wherein,
each camera unit includes a pair of left and right eye cameras,
the image processing device generates a left-eye panoramic image and a right-eye panoramic image based on a left-eye image and a right-eye image photographed by the pair of cameras, respectively, and transmits partial left-eye panoramic images and partial right-eye panoramic images of the generated left-eye panoramic images and right-eye panoramic images to the remote control device through wireless communication,
the remote control apparatus receiving the partial left-eye panoramic image and the right-eye panoramic image from the image processing apparatus and transmitting the received partial left-eye panoramic image and right-eye panoramic image to the display apparatus,
the display device is designed to be mounted on the head of an operator, receive the partial left-eye panoramic image and the right-eye panoramic image from the remote control device, and display the received partial left-eye panoramic image and right-eye panoramic image, and detect movement of the eyes of the operator, and transmit the detected movement of the eyes of the operator to the remote control device,
the remote control apparatus transmits the movement of the operator's eye transmitted from the display apparatus to the image processing apparatus by wireless communication,
the image processing device adjusts the partial left-eye panoramic image and right-eye panoramic image to be transmitted to the display device according to the movement of the operator's eyeball transmitted from the remote control device,
the image processing apparatus performs an adjustment process or a trimming process by delaying a response of the line-of-sight detecting section if the line-of-sight detecting section responds too quickly to a change in the operator's line of sight.
2. The display system according to claim 1, wherein the plurality of camera units are located at placement positions horizontally corresponding to a position where a seat can be mounted on the working machine, such that the plurality of camera units have lens centers located at a sitting height corresponding to an eye height of a worker of average physique for a target market of the working machine who is assumed to be sitting on the seat.
3. The display system according to claim 1 or 2, wherein the plurality of camera units are provided on a support member that is movable upward/downward and vertically rotatable.
4. The display system according to claim 1 or 2, wherein the display system further comprises an illumination device located at the periphery of the plurality of camera units.
5. A display system for a remote control working machine, comprising:
a plurality of camera units attached to a working machine having a working member capable of horizontal rotation and vertical swing;
an image processing device that is provided to the work machine, generates a panoramic image based on a plurality of images captured by the plurality of camera units, and transmits the generated panoramic image by wireless communication;
the display equipment is positioned at a remote control end of the display system; and
a remote control device that receives the panoramic image from the image processing device and transmits a partial panoramic image of the received panoramic image to the display device,
wherein,
each camera unit includes a pair of left and right eye cameras,
the image processing apparatus generates a left-eye panoramic image and a right-eye panoramic image based on a left-eye image and a right-eye image photographed by the pair of cameras, respectively, and transmits the generated left-eye panoramic image and right-eye panoramic image to the remote control apparatus by wireless communication,
the remote control apparatus receiving the left-eye panoramic image and the right-eye panoramic image from the image processing apparatus and transmitting a partial left-eye panoramic image and a partial right-eye panoramic image of the received left-eye panoramic image and right-eye panoramic image to the display apparatus,
the display device is designed to be mounted on the head of an operator, receive the partial left-eye panoramic image and the right-eye panoramic image from the remote control device, and display the received partial left-eye panoramic image and right-eye panoramic image, and detect movement of the eyes of the operator, and transmit the detected movement of the eyes of the operator to the remote control device,
the remote control apparatus adjusts the partial left-eye panoramic image and right-eye panoramic image to be transmitted to the display apparatus according to the movement of the operator's eye transmitted from the display apparatus,
the image processing apparatus performs an adjustment process or a trimming process by delaying a response of the line-of-sight detecting section if the line-of-sight detecting section responds too quickly to a change in the operator's line of sight.
CN201480082777.5A 2014-11-17 2014-11-17 Display system for remote control working machine Expired - Fee Related CN107111928B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010638589.3A CN111754750B (en) 2014-11-17 2014-11-17 Display device for remote control working machine, display system, and working machine

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IB2014/002551 WO2016079557A1 (en) 2014-11-17 2014-11-17 Display system for remote control of working machine

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202010638589.3A Division CN111754750B (en) 2014-11-17 2014-11-17 Display device for remote control working machine, display system, and working machine

Publications (2)

Publication Number Publication Date
CN107111928A CN107111928A (en) 2017-08-29
CN107111928B true CN107111928B (en) 2020-07-31

Family

ID=52774284

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201480082777.5A Expired - Fee Related CN107111928B (en) 2014-11-17 2014-11-17 Display system for remote control working machine
CN202010638589.3A Active CN111754750B (en) 2014-11-17 2014-11-17 Display device for remote control working machine, display system, and working machine

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202010638589.3A Active CN111754750B (en) 2014-11-17 2014-11-17 Display device for remote control working machine, display system, and working machine

Country Status (5)

Country Link
US (1) US10474228B2 (en)
EP (1) EP3222042B1 (en)
JP (1) JP6612865B2 (en)
CN (2) CN107111928B (en)
WO (1) WO2016079557A1 (en)

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2017159063A1 (en) * 2016-03-14 2019-01-17 ソニー株式会社 Display device and information processing terminal device
US10539797B2 (en) * 2016-05-06 2020-01-21 Colopl, Inc. Method of providing virtual space, program therefor, and recording medium
WO2018164152A1 (en) * 2017-03-07 2018-09-13 住友建機株式会社 Shovel
EP3594415B1 (en) 2017-03-07 2021-12-15 Sumitomo Heavy Industries, Ltd. Shovel and construction machinery work assist system
JP6929674B2 (en) * 2017-03-22 2021-09-01 株式会社東京エネシス Environmental image display system and environmental image display method
CN108933920B (en) * 2017-05-25 2023-02-17 中兴通讯股份有限公司 Video picture output and viewing method and device
TWI697692B (en) 2017-08-01 2020-07-01 緯創資通股份有限公司 Near eye display system and operation method thereof
US11010975B1 (en) 2018-03-06 2021-05-18 Velan Studios, Inc. Remote camera augmented reality system
JP7060418B2 (en) * 2018-03-15 2022-04-26 株式会社デンソーテン Vehicle remote control device and vehicle remote control method
JP7000957B2 (en) * 2018-03-29 2022-01-19 コベルコ建機株式会社 Work machine control device
JP7014007B2 (en) * 2018-03-29 2022-02-01 コベルコ建機株式会社 Remote control system for work machines
CN111226009B (en) * 2018-09-25 2022-03-04 日立建机株式会社 External shape measuring system for working machine, external shape display system for working machine, control system for working machine, and working machine
WO2020090985A1 (en) 2018-10-31 2020-05-07 株式会社小松製作所 Display control system and display control method
JP7151392B2 (en) * 2018-11-07 2022-10-12 コベルコ建機株式会社 Remote control device for construction machinery
SE1851590A1 (en) * 2018-12-14 2020-06-15 Brokk Ab Remote control demolition robot with improved area of use and a method for producing such a demolition robot.
JP7160702B2 (en) * 2019-01-23 2022-10-25 株式会社小松製作所 Work machine system and method
JP7318258B2 (en) * 2019-03-26 2023-08-01 コベルコ建機株式会社 Remote control system and remote control server
JP7287047B2 (en) * 2019-03-27 2023-06-06 コベルコ建機株式会社 Remote control system and remote control server
JP7372061B2 (en) * 2019-07-01 2023-10-31 株式会社日立製作所 Remote work support system
US11200654B2 (en) 2019-08-14 2021-12-14 Cnh Industrial America Llc System and method for determining field characteristics based on a displayed light pattern
JP7244197B2 (en) * 2019-12-04 2023-03-22 株式会社三井E&Sマシナリー Camera view movement mechanism
JP2021179105A (en) * 2020-05-13 2021-11-18 コベルコ建機株式会社 Remote control assist system for working machine
US11505919B2 (en) * 2020-07-27 2022-11-22 Caterpillar Inc. Method for remote operation of machines using a mobile device
EP4033035A1 (en) * 2021-01-20 2022-07-27 Volvo Construction Equipment AB A system and method therein for remote operation of a working machine comprising a tool

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005114235A (en) * 2003-10-07 2005-04-28 Hitachi Constr Mach Co Ltd Device for supporting treatment of buried matter for mobile working machine
JP2005308282A (en) * 2004-04-20 2005-11-04 Komatsu Ltd Firearm device
CN1912682A (en) * 2005-08-10 2007-02-14 精工爱普生株式会社 Display device, method of controlling the same, and game machine
JP2008111269A (en) * 2006-10-30 2008-05-15 Komatsu Ltd Image display system
JP2010256534A (en) * 2009-04-23 2010-11-11 Fujifilm Corp Head-mounted display for omnidirectional image display
CN202334770U (en) * 2011-12-04 2012-07-11 长安大学 Full-view monitoring system for engineering machine
CN103141090A (en) * 2010-09-29 2013-06-05 日立建机株式会社 Device for surveying surround of working machine
WO2014077046A1 (en) * 2012-11-13 2014-05-22 ソニー株式会社 Image display device and image display method, mobile body device, image display system, and computer program

Family Cites Families (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0556925A (en) * 1991-09-02 1993-03-09 A T R Tsushin Syst Kenkyusho:Kk Sight line information controller
JP3146080B2 (en) * 1993-01-08 2001-03-12 株式会社間組 Excavator driving device
JPH08292394A (en) * 1995-01-24 1996-11-05 Matsushita Electric Ind Co Ltd Head-mounted image display device
JP3604443B2 (en) * 1995-02-28 2004-12-22 株式会社島津製作所 Television system
JP2973867B2 (en) * 1995-05-26 1999-11-08 日本電気株式会社 View point tracking type stereoscopic display apparatus and viewpoint tracking method
US6195204B1 (en) 1998-08-28 2001-02-27 Lucent Technologies Inc. Compact high resolution panoramic viewing system
ATE420528T1 (en) 1998-09-17 2009-01-15 Yissum Res Dev Co SYSTEM AND METHOD FOR GENERATING AND DISPLAYING PANORAMIC IMAGES AND FILMS
US7477284B2 (en) 1999-09-16 2009-01-13 Yissum Research Development Company Of The Hebrew University Of Jerusalem System and method for capturing and viewing stereoscopic panoramic images
US6831677B2 (en) 2000-02-24 2004-12-14 Yissum Research Development Company Of The Hebrew University Of Jerusalem System and method for facilitating the adjustment of disparity in a stereoscopic panoramic image pair
US6795109B2 (en) 1999-09-16 2004-09-21 Yissum Research Development Company Of The Hebrew University Of Jerusalem Stereo panoramic camera arrangements for recording panoramic images useful in a stereo panoramic image pair
IL149724A0 (en) 1999-12-31 2002-11-10 Yissum Res Dev Co Streo panoramic camera arrangements for recording panoramic images useful in a stereo panoramic image pair
WO2001080545A2 (en) 2000-04-19 2001-10-25 Yissum Research Development Company Of The Hebrew University Of Jerusalem System and method for capturing and viewing stereoscopic panoramic images
JP2002345058A (en) 2001-05-14 2002-11-29 Komatsu Ltd Remote control device for working machine
US7463280B2 (en) 2003-06-03 2008-12-09 Steuart Iii Leonard P Digital 3D/360 degree camera system
JP4517336B2 (en) * 2004-02-24 2010-08-04 マツダ株式会社 Simulation apparatus and method
US20070182812A1 (en) 2004-05-19 2007-08-09 Ritchey Kurtis J Panoramic image-based virtual reality/telepresence audio-visual system and method
US20060103723A1 (en) 2004-11-18 2006-05-18 Advanced Fuel Research, Inc. Panoramic stereoscopic video system
US20070002131A1 (en) 2005-02-15 2007-01-04 Ritchey Kurtis J Dynamic interactive region-of-interest panoramic/three-dimensional immersive communication system and method
US7982777B2 (en) 2005-04-07 2011-07-19 Axis Engineering Technologies, Inc. Stereoscopic wide field of view imaging system
US9101279B2 (en) 2006-02-15 2015-08-11 Virtual Video Reality By Ritchey, Llc Mobile user borne brain activity data and surrounding environment data correlation system
US9344612B2 (en) 2006-02-15 2016-05-17 Kenneth Ira Ritchey Non-interference field-of-view support apparatus for a panoramic facial sensor
US20100045773A1 (en) 2007-11-06 2010-02-25 Ritchey Kurtis J Panoramic adapter system and method with spherical field-of-view coverage
JP2008144379A (en) * 2006-12-06 2008-06-26 Shin Caterpillar Mitsubishi Ltd Image processing system of remote controlled working machine
JP2009117990A (en) * 2007-11-02 2009-05-28 Sony Corp Information presentation device and information presentation method
US8615383B2 (en) * 2008-01-18 2013-12-24 Lockheed Martin Corporation Immersive collaborative environment using motion capture, head mounted display, and cave
US8355041B2 (en) 2008-02-14 2013-01-15 Cisco Technology, Inc. Telepresence system for 360 degree video conferencing
JP5393301B2 (en) * 2009-07-06 2014-01-22 住友建機株式会社 Construction machine monitoring equipment
CN101943982B (en) * 2009-07-10 2012-12-12 北京大学 Method for manipulating image based on tracked eye movements
KR101891786B1 (en) * 2011-11-29 2018-08-27 삼성전자주식회사 Operation Method For User Function based on a Eye-Tracking and Portable Device supporting the same
CN102880292A (en) * 2012-09-11 2013-01-16 上海摩软通讯技术有限公司 Mobile terminal and control method thereof
US9715008B1 (en) * 2013-03-20 2017-07-25 Bentley Systems, Incorporated Visualization of 3-D GPR data in augmented reality
US9908048B2 (en) * 2013-06-08 2018-03-06 Sony Interactive Entertainment Inc. Systems and methods for transitioning between transparent mode and non-transparent mode in a head mounted display
CN103345305B (en) * 2013-07-22 2016-08-31 百度在线网络技术(北京)有限公司 Candidate word control method, device and mobile terminal for mobile terminal input method
CN103472915B (en) * 2013-08-30 2017-09-05 深圳Tcl新技术有限公司 reading control method based on pupil tracking, reading control device and display device
CN103869977B (en) * 2014-02-19 2016-06-08 小米科技有限责任公司 Method for displaying image, device and electronics
KR102250821B1 (en) * 2014-08-20 2021-05-11 삼성전자주식회사 Display apparatus and operating method thereof

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005114235A (en) * 2003-10-07 2005-04-28 Hitachi Constr Mach Co Ltd Device for supporting treatment of buried matter for mobile working machine
JP2005308282A (en) * 2004-04-20 2005-11-04 Komatsu Ltd Firearm device
CN1912682A (en) * 2005-08-10 2007-02-14 精工爱普生株式会社 Display device, method of controlling the same, and game machine
JP2008111269A (en) * 2006-10-30 2008-05-15 Komatsu Ltd Image display system
JP2010256534A (en) * 2009-04-23 2010-11-11 Fujifilm Corp Head-mounted display for omnidirectional image display
CN103141090A (en) * 2010-09-29 2013-06-05 日立建机株式会社 Device for surveying surround of working machine
CN202334770U (en) * 2011-12-04 2012-07-11 长安大学 Full-view monitoring system for engineering machine
WO2014077046A1 (en) * 2012-11-13 2014-05-22 ソニー株式会社 Image display device and image display method, mobile body device, image display system, and computer program

Also Published As

Publication number Publication date
CN111754750B (en) 2022-03-01
US10474228B2 (en) 2019-11-12
JP2018501684A (en) 2018-01-18
CN107111928A (en) 2017-08-29
US20170322624A1 (en) 2017-11-09
EP3222042A1 (en) 2017-09-27
JP6612865B2 (en) 2019-11-27
CN111754750A (en) 2020-10-09
EP3222042B1 (en) 2022-07-27
WO2016079557A1 (en) 2016-05-26

Similar Documents

Publication Publication Date Title
CN107111928B (en) Display system for remote control working machine
EP3379525B1 (en) Image processing device and image generation method
US9479732B1 (en) Immersive video teleconferencing robot
KR101899531B1 (en) Omni-stereo capture for mobile devices
US11184597B2 (en) Information processing device, image generation method, and head-mounted display
AU2007252840B2 (en) Methods and system for communication and displaying points-of-interest
US10917560B2 (en) Control apparatus, movable apparatus, and remote-control system
KR20170044451A (en) System and Method for Controlling Remote Camera using Head mount display
ES2824049T3 (en) System and method for remote monitoring of at least one observation area
JP6482855B2 (en) Monitoring system
KR101297294B1 (en) Map gui system for camera control
CN113467731B (en) Display system, information processing apparatus, and display control method of display system
JP6482856B2 (en) Monitoring system
EP3430591A1 (en) System for georeferenced, geo-oriented real time video streams
JP2022105568A (en) System and method for displaying 3d tour comparison
JP6857701B2 (en) Work equipment, display devices and display systems for remote control of work equipment
WO2016208539A1 (en) Method for providing binocular stereoscopic image, observation device, and camera unit
JP2018056845A (en) Work support apparatus, system, method and program
JP6929674B2 (en) Environmental image display system and environmental image display method
JP2012244311A (en) Camera remote controller and camera remote control method
WO2019038885A1 (en) Information processing device and image output method
JP2022053421A (en) Display system, display method, and program for remote control
CN115437390A (en) Control method and control system of unmanned aerial vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Osaka Prefecture, Japan

Applicant after: Yangma Power Technology Co.,Ltd.

Address before: Osaka Prefecture, Japan

Applicant before: YANMAR Co.,Ltd.

GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200731
