WO2022070707A1 - Display control device and display method
- Publication number
- WO2022070707A1 (PCT/JP2021/031517)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- cutting edge
- image
- work machine
- control device
- display control
- Prior art date
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/22—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of characters or indicia using display control signals derived from coded signals representing the characters or indicia, e.g. with a character-code memory
- G09G5/30—Control of display attribute
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/26—Indicating devices
- E02F9/261—Surveying the work-site to be treated
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F3/00—Dredgers; Soil-shifting machines
- E02F3/04—Dredgers; Soil-shifting machines mechanically-driven
- E02F3/28—Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets
- E02F3/36—Component parts
- E02F3/42—Drives for dippers, buckets, dipper-arms or bucket-arms
- E02F3/43—Control of dipper or bucket position; Control of sequence of drive operations
- E02F3/435—Control of dipper or bucket position; Control of sequence of drive operations for dipper-arms, backhoes or the like
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/20—Drives; Control devices
- E02F9/2025—Particular purposes of control systems not otherwise provided for
- E02F9/205—Remotely operated machines, e.g. unmanned vehicles
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/26—Indicating devices
- E02F9/264—Sensors and their calibration for indicating the position of the work tool
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
- G09G2320/0613—The adjustment depending on the type of the information to be displayed
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0464—Positioning
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/12—Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2380/00—Specific applications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/37—Details of the operation on graphic patterns
- G09G5/377—Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
Definitions
- the present disclosure relates to a display control device and a display method.
- The present application claims priority based on Japanese Patent Application No. 2020-163449, filed in Japan on September 29, 2020, the contents of which are incorporated herein by reference.
- Patent Document 1 discloses a technique of displaying a mesh-like line image on the surface of a work object shown in a captured image in order to give an operator a sense of perspective.
- The working machine with which the work machine is equipped is driven by hydraulic cylinders.
- The stroke end is the end of the movable range of the rod. That is, the stroke end means the position of the rod when the hydraulic cylinder is fully retracted or fully extended.
- The operator controls the working machine so that the piston does not hit the stroke end, while keeping track of the posture of the working machine.
- An object of the present disclosure is to provide a display control device and a display method capable of presenting information to an operator for reducing the possibility that the piston of a hydraulic cylinder hits the stroke end.
- The display control device is a display control device for displaying an image used to operate a work machine equipped with a working machine. It comprises: a captured image acquisition unit that acquires, from a camera provided on the work machine, a captured image in which the working machine appears;
- a cutting edge shadow generation unit that generates a cutting edge shadow by projecting the cutting edge of the working machine onto a projection surface in the vertical direction;
- a display image generation unit that generates a display image in which the captured image, the cutting edge shadow, and a reference range graphic obtained by projecting the reachable range of the cutting edge onto the projection surface in the vertical direction are superimposed;
- and a display control unit that outputs a display signal for displaying the display image.
- FIG. 1 is a schematic view showing the configuration of the work system 1 according to the first embodiment.
- the work system 1 includes a work machine 100 and a remote cab 500.
- the work machine 100 operates at the work site. Examples of work sites include mines and quarries.
- the remote control room 500 is provided in a remote place away from the work site. Examples of remote areas include urban areas and work sites. That is, the operator remotely controls the work machine 100 from a distance at which the work machine 100 cannot be visually recognized.
- the work machine 100 is remotely controlled based on an operation signal transmitted from the remote cab 500.
- the remote control room 500 is connected to the work machine 100 via an access point 300 provided at the work site.
- the operation signal indicating the operation of the operator received in the remote driver's cab 500 is transmitted to the work machine 100 via the access point 300.
- the work machine 100 operates based on the operation signal received from the remote driver's cab 500. That is, the work system 1 includes a remote control system including a work machine 100 and a remote control room 500. Further, the work machine 100 captures an image of the work target, and the image is displayed in the remote cab 500. That is, the work system 1 is an example of a display control system.
- FIG. 2 is an external view of the work machine 100 according to the first embodiment.
- the work machine 100 according to the first embodiment is a loading shovel (face shovel).
- the work machine 100 according to another embodiment may be another work machine such as a backhoe, a wheel loader, or a bulldozer.
- the work machine 100 includes a traveling body 110, a swivel body 120 supported by the traveling body 110, and a working machine 130 operated by hydraulic pressure and supported by the swivel body 120.
- The swivel body 120 can swivel freely about the swivel center axis O.
- the working machine 130 is provided at the front portion of the swivel body 120.
- the working machine 130 includes a boom 130A, an arm 130B, and a bucket 130C.
- the base end portion of the boom 130A is attached to the swivel body 120 via a pin.
- the arm 130B connects the boom 130A and the bucket 130C.
- the base end portion of the arm 130B is attached to the tip end portion of the boom 130A via a pin.
- the bucket 130C includes a cutting edge 130D for excavating earth and sand and a container for accommodating the excavated earth and sand.
- the base end portion of the bucket 130C is attached to the tip end portion of the arm 130B via a pin.
- the working machine 130 is driven by the motion of the boom cylinder 131A, the arm cylinder 131B, and the bucket cylinder 131C.
- the boom cylinder 131A, the arm cylinder 131B, and the bucket cylinder 131C are collectively referred to as a hydraulic cylinder 131.
- the boom cylinder 131A is a hydraulic cylinder for operating the boom 130A.
- the base end portion of the boom cylinder 131A is attached to the swivel body 120.
- the tip of the boom cylinder 131A is attached to the boom 130A.
- the arm cylinder 131B is a hydraulic cylinder for driving the arm 130B.
- the base end portion of the arm cylinder 131B is attached to the boom 130A.
- the tip of the arm cylinder 131B is attached to the arm 130B.
- the bucket cylinder 131C is a hydraulic cylinder for driving the bucket 130C.
- the base end of the bucket cylinder 131C is attached to the boom 130A.
- the tip of the bucket cylinder 131C is attached to the bucket 130C.
- a boom posture sensor 132A, an arm posture sensor 132B, and a bucket posture sensor 132C that detect the postures of the boom 130A, the arm 130B, and the bucket 130C are attached to the work machine 130.
- the boom attitude sensor 132A, the arm attitude sensor 132B, and the bucket attitude sensor 132C are collectively referred to as an attitude sensor 132.
- the posture sensor 132 according to the first embodiment is a stroke sensor attached to the hydraulic cylinder 131. That is, the posture sensor 132 detects the stroke length of the hydraulic cylinder 131.
- the stroke length is the moving distance of the rod from the stroke end of the hydraulic cylinder 131.
- the stroke end is the end of the movable range of the rod. That is, the stroke end means the position of the rod in the state where the hydraulic cylinder 131 is most contracted, or the position of the rod in the state where the hydraulic cylinder 131 is in the most extended state.
- the boom posture sensor 132A is provided on the boom cylinder 131A and detects the stroke length of the boom cylinder 131A.
- the arm posture sensor 132B is provided on the arm cylinder 131B and detects the stroke length of the arm cylinder 131B.
- the bucket posture sensor 132C is provided in the bucket cylinder 131C and detects the stroke length of the bucket cylinder 131C.
- The posture sensor 132 may instead detect relative rotation angles with potentiometers provided at the base ends of the boom 130A, the arm 130B, and the bucket 130C, may detect rotation angles with respect to the vertical direction with IMUs, or may detect rotation angles with respect to the vertical direction with inclinometers.
- the swivel body 120 is provided with a driver's cab 121.
- a camera 122 is provided in the driver's cab 121.
- the camera 122 is installed in the front and upper part of the driver's cab 121.
- the camera 122 takes an image of the front of the driver's cab 121 through the windshield in front of the driver's cab 121.
- "Front" means the direction in which the working machine 130 is mounted on the swivel body 120.
- "Rear" means the direction opposite to "front".
- "Side" means a direction (left-right direction) intersecting the front-rear direction.
- Examples of the camera 122 include imaging devices using a CCD (Charge Coupled Device) sensor or a CMOS (Complementary Metal Oxide Semiconductor) sensor.
- the camera 122 does not necessarily have to be provided in the driver's cab 121, and may be provided at least at a position where the construction target and the working machine 130 can be imaged. That is, the imaging range of the camera 122 includes at least a part of the working machine 130.
- the work machine 100 includes a camera 122, a position / orientation calculator 123, an inclination measuring instrument 124, a hydraulic device 125, and a vehicle control device 126.
- the position / orientation calculator 123 calculates the position of the swivel body 120 and the direction in which the swivel body 120 faces.
- the position / orientation calculator 123 includes two receivers that receive positioning signals from artificial satellites constituting the GNSS. The two receivers are installed at different positions on the swivel body 120, respectively.
- the position / orientation calculator 123 detects the position of the representative point (origin of the vehicle body coordinate system) of the swivel body 120 in the field coordinate system based on the positioning signal received by the receiver.
- the position / orientation calculator 123 calculates the orientation of the swivel body 120 as the relationship between the installation position of one receiver and the installation position of the other receiver using each positioning signal received by the two receivers.
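As a concrete illustration of this two-receiver approach, the sketch below estimates the yaw of the swivel body from two antenna positions expressed in a local east-north plane. The coordinate convention, antenna placement, and function name are assumptions for illustration, not details taken from this publication.

```python
import math

def heading_from_receivers(p_rear, p_front):
    """Estimate the yaw of the swivel body from two GNSS antenna positions
    (east, north) in a local site plane. A hypothetical sketch: the rear
    and front antenna roles are assumed, not specified by the publication."""
    de = p_front[0] - p_rear[0]   # east offset between the two antennas
    dn = p_front[1] - p_rear[1]   # north offset between the two antennas
    return math.atan2(de, dn)     # 0 rad = north, clockwise positive

# If the front antenna is due east of the rear one, the yaw is 90 degrees.
yaw = heading_from_receivers((0.0, 0.0), (1.0, 0.0))
```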
- the position / orientation calculator 123 may detect the orientation of the swivel body 120 based on the measured values of the rotary encoder and the IMU.
- the inclination measuring instrument 124 measures the acceleration and the angular velocity of the swivel body 120, and detects the posture (for example, roll angle, pitch angle) of the swivel body 120 based on the measurement result.
- the inclination measuring instrument 124 is installed, for example, on the lower surface of the swivel body 120.
- Examples of the inclination measuring instrument 124 include an inertial measurement unit (IMU: Inertial Measurement Unit).
- the hydraulic device 125 supplies hydraulic oil to the hydraulic cylinder 131.
- the flow rate of the hydraulic oil supplied to the hydraulic cylinder 131 is controlled based on the control command received from the vehicle control device 126.
- the vehicle control device 126 transmits the image captured by the camera 122, the turning speed, position, direction and inclination angle of the turning body 120, the posture of the working machine 130, and the running speed of the traveling body 110 to the remote driver's cab 500. Further, the vehicle control device 126 receives an operation signal from the remote driver's cab 500, and drives the working machine 130, the turning body 120, and the traveling body 110 based on the received operation signal.
- the remote driver's cab 500 includes a driver's seat 510, a display device 520, an operation device 530, and a remote control device 540.
- the display device 520 is arranged in front of the driver's seat 510.
- the display device 520 is located in front of the operator when the operator sits in the driver's seat 510.
- the display device 520 may be composed of a plurality of arranged displays, or may be composed of one large display as shown in FIG. Further, the display device 520 may project an image on a curved surface or a spherical surface by a projector or the like.
- the operation device 530 is an operation device for a remote control system.
- In response to the operator's operation, the operation device 530 generates an operation signal for the boom cylinder 131A, an operation signal for the arm cylinder 131B, an operation signal for the bucket cylinder 131C, a left/right swing operation signal for the swivel body 120, and a forward/reverse travel operation signal for the traveling body 110, and outputs them to the remote control device 540.
- the operating device 530 is composed of, for example, a lever, a knob switch, and a pedal (not shown).
- the operating device 530 is arranged in the vicinity of the driver's seat 510.
- the operating device 530 is located within the operator's operable range when the operator sits in the driver's seat 510.
- the remote control device 540 generates a display image based on the data received from the work machine 100, and displays the display image on the display device 520. Further, the remote control device 540 transmits an operation signal indicating the operation of the operation device 530 to the work machine 100.
- the remote control device 540 is an example of a display control device.
- FIG. 3 is a schematic block diagram showing the configuration of the remote control device 540 according to the first embodiment.
- the remote control device 540 is a computer including a processor 610, a main memory 630, a storage 650, and an interface 670.
- the storage 650 stores the program.
- the processor 610 reads the program from the storage 650, expands it in the main memory 630, and executes the process according to the program.
- the remote control device 540 is connected to the network via the interface 670.
- Examples of the storage 650 include magnetic disks, optical disks, magneto-optical disks, semiconductor memories, and the like.
- The storage 650 may be an internal medium directly connected to the bus of the remote control device 540, or an external medium connected to the remote control device 540 via the interface 670.
- The storage 650 is a non-transitory tangible storage medium.
- By executing the program, the processor 610 functions as a data acquisition unit 611, a posture specifying unit 612, a cutting edge shadow generation unit 613, a display image generation unit 614, a display control unit 615, an operation signal input unit 616, and an operation signal output unit 617.
- the remote control device 540 may include a custom LSI (Large Scale Integrated Circuit) such as a PLD (Programmable Logic Device) in addition to or in place of the above configuration.
- PLDs include PAL (Programmable Array Logic), GAL (Generic Array Logic), CPLD (Complex Programmable Logic Device), and FPGA (Field Programmable Gate Array).
- the data acquisition unit 611 acquires data indicating the image captured by the camera 122, the turning speed, position, direction and inclination angle of the turning body 120, the posture of the working machine 130, and the running speed of the running body 110 from the working machine 100.
- the posture specifying unit 612 specifies the posture in the vehicle body coordinate system and the posture in the site coordinate system of the work machine 100 based on the data acquired by the data acquisition unit 611.
- the vehicle body coordinate system is a local coordinate system defined by three axes of the front-rear axis, the left-right axis, and the up-down axis of the turning body 120, with the intersection of the turning center axis O of the turning body 120 and the bottom surface of the traveling body 110 as the origin.
- the site coordinate system is a global coordinate system defined by three axes, a latitude axis, a longitude axis, and a vertical axis, with a predetermined point (reference station, etc.) at the work site as the origin.
- The posture specifying unit 612 specifies the positions, in the vehicle body coordinate system and in the site coordinate system, of the tip of the boom 130A, the tip of the arm 130B, and the left and right ends of the cutting edge 130D. The specific method by which the posture specifying unit 612 specifies the position of each part will be described later.
- The cutting edge shadow generation unit 613 generates a cutting edge shadow image showing the cutting edge shadow projected onto the projection surface in the vertical direction, based on the positions of both ends of the cutting edge 130D in the site coordinate system specified by the posture specifying unit 612.
- the projection plane according to the first embodiment is a plane passing through the bottom surface of the traveling body 110.
- the cutting edge shadow generation unit 613 generates a cutting edge shadow image by the following procedure.
- The cutting edge shadow generation unit 613 specifies the position, in the site coordinate system, of the cutting edge shadow projected onto the projection surface by rewriting the vertical axis component of the position of each end of the cutting edge 130D to zero.
- The cutting edge shadow generation unit 613 converts the position of the cutting edge shadow in the site coordinate system into a position in the image coordinate system, based on known camera parameters indicating the relationship between the site coordinate system and the image coordinate system, which is a two-dimensional orthogonal coordinate system associated with the captured image of the camera 122.
- the cutting edge shadow generation unit 613 generates a cutting edge shadow image by drawing a line segment representing the cutting edge 130D at the converted position.
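The three steps above (zero the vertical component, convert with camera parameters, draw the segment) can be sketched as below, assuming a standard pinhole camera model with intrinsic matrix K and extrinsics R, t; the publication only states that the camera parameters are known, so the model and the (x, y, z-up) axis order are illustrative assumptions.

```python
import numpy as np

def project_edge_shadow(edge_left, edge_right, K, R, t):
    """Map both cutting-edge endpoints to image coordinates after dropping
    them vertically onto the projection plane. K, R, t are assumed pinhole
    camera parameters, not values given in the publication."""
    pts = []
    for p in (edge_left, edge_right):
        ground = np.array([p[0], p[1], 0.0])   # zero the vertical component
        cam = R @ ground + t                   # site -> camera coordinates
        uvw = K @ cam                          # camera -> homogeneous image
        pts.append((uvw[0] / uvw[2], uvw[1] / uvw[2]))  # perspective divide
    return pts  # endpoints of the segment drawn as the shadow image G1
```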
- The display image generation unit 614 generates a display image by superimposing the cutting edge shadow image G1 and the cutting edge arrival gauge image G2 on the captured image acquired by the data acquisition unit 611.
- FIG. 4 is a diagram showing an example of a display image according to the first embodiment.
- the cutting edge arrival gauge image G2 includes a left line G21, a right line G22, a maximum arrival line G23, a scale line G24, a scale value G25, and a reference range graphic G26.
- the left line G21 is a line indicating the reachable range of the left end of the cutting edge 130D. As shown in FIG. 4, the left line G21 passes through the left end of the cutting edge shadow image G1.
- the right line G22 is a line indicating the reachable range of the right end of the cutting edge 130D. As shown in FIG. 4, the right line G22 passes through the right end of the cutting edge shadow image G1.
- the maximum reach line G23 is a line indicating the leading edge of the reachable range of the cutting edge 130D.
- the maximum reach line G23 connects the front end of the left line G21 and the front end of the right line G22.
- the scale line G24 is a line representing the distance of the swivel body 120 from the swivel center axis O.
- the scale lines G24 are provided at equal intervals. In the example shown in FIG. 4, the scale lines G24 are provided at intervals of 2 meters. Each scale line G24 is provided so as to connect the left line G21 and the right line G22.
- the maximum reach line G23 and the scale line G24 are lines parallel to the cutting edge shadow image G1.
- the scale value G25 is provided corresponding to the scale line G24, and the distance indicated by the scale line G24 is represented by a numerical value. In the example shown in FIG. 4, the scale value G25 is provided in the vicinity of the right end of the scale line G24.
- the reference range figure G26 is a figure showing the reachable range of the cutting edge 130D on the projection surface.
- The reference range figure G26 according to the first embodiment is a quadrangle bounded by the left line G21, the right line G22, the leading edge of the reachable range on the projection surface, and the trailing edge of the reachable range on the projection surface.
- the reachable range of the cutting edge 130D on the projection surface is the reachable range of the cutting edge 130D under the condition that the projection surface and the cutting edge 130D are in contact with each other.
- the reference range figure G26 is highlighted by hatching, coloring, or the like.
- The maximum reach line G23 and the front ends of the left line G21 and the right line G22 represent the leading edge of the reachable range of the cutting edge 130D when the condition of contact between the projection surface and the cutting edge 130D is not imposed.
- That is, the maximum reach line G23, the left line G21, and the right line G22 are examples of reachable range figures obtained by projecting the reachable range of the cutting edge when no such condition is imposed.
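As a rough illustration of how the near and far edges of the reachable range on the projection surface might be estimated, the sketch below treats the boom, arm, and bucket as a simple three-link chain anchored at the boom base pin. The link lengths, base position, and the folded-chain lower bound are made-up example values and a coarse approximation, not the machine's actual geometry or the method of the publication.

```python
import math

def reach_on_plane(boom_base=(0.5, 1.5), boom=6.0, arm=3.0, bucket=2.0,
                   plane_z=0.0):
    """Coarse estimate of the horizontal interval where the cutting edge
    can touch the projection plane. All dimensions are illustrative."""
    dz = boom_base[1] - plane_z            # boom base height above the plane
    r_max = boom + arm + bucket            # chain fully stretched
    r_min = abs(boom - arm - bucket)       # chain fully folded (coarse bound)
    far = boom_base[0] + math.sqrt(max(r_max**2 - dz**2, 0.0))
    near = boom_base[0] + (math.sqrt(r_min**2 - dz**2)
                           if r_min > abs(dz) else 0.0)
    return near, far                       # rear and front edge of figure G26
```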
- FIG. 5 is a side view showing the relationship between the cutting edge shadow image G1 and the cutting edge arrival gauge image G2 according to the first embodiment.
- The cutting edge shadow image G1 and the cutting edge arrival gauge image G2 according to the first embodiment are drawn on a projection surface F1, which is a plane passing through the bottom surface of the traveling body 110. Therefore, when the cutting edge shadow image G1 and the cutting edge arrival gauge image G2 are superimposed on the captured image, in portions where the ground surface F2 is higher than the projection surface F1, they appear to sink into the ground surface F2. In portions where the ground surface F2 is lower than the projection surface F1, the cutting edge shadow image G1 and the cutting edge arrival gauge image G2 appear to float above the ground surface F2.
- The leading edge of the cutting edge arrival gauge image G2, that is, the maximum reach line G23, is drawn at the position farthest from the turning center axis O within the reachable range R of the cutting edge 130D on the projection surface F1. Therefore, regardless of the posture of the cutting edge 130D, the cutting edge shadow image G1 is always located on the near side of the maximum reach line G23.
- the reference range figure G26 shows a range in which the reachable range of the cutting edge 130D and the projection surface overlap.
- the display image generation unit 614 generates a display image by superimposing the cutting edge arrival gauge image G2 prepared in advance on the captured image.
- the display control unit 615 outputs a display signal for displaying the display image generated by the display image generation unit 614 to the display device 520.
- the operation signal input unit 616 receives the operation signal from the operation device 530.
- the operation signal output unit 617 transmits the operation signal received by the operation signal input unit 616 to the work machine 100.
- The posture specifying unit 612 specifies the positions, in the vehicle body coordinate system and in the site coordinate system, of the tip of the boom 130A (tip pin), the tip of the arm 130B (tip pin), and both ends of the cutting edge 130D by the following procedure.
- the posture specifying unit 612 specifies the angle of the boom 130A with respect to the swivel body 120, that is, the angle of the vehicle body coordinate system with respect to the front-rear axis, based on the stroke length of the boom cylinder 131A.
- Based on the angle of the boom 130A and the known length of the boom 130A, the posture specifying unit 612 specifies the boom vector, which extends in the vehicle body coordinate system from the base end (base end pin) of the boom 130A to the tip (tip pin) of the boom 130A.
- The posture specifying unit 612 specifies the position vector of the tip (tip pin) of the boom 130A in the vehicle body coordinate system by adding the known position vector of the base end (base end pin) of the boom 130A in the vehicle body coordinate system and the boom vector.
- the posture specifying unit 612 specifies the angle of the arm 130B with respect to the boom 130A based on the stroke length of the arm cylinder 131B.
- the posture specifying unit 612 specifies the angle of the arm 130B with respect to the front-rear axis by adding the angle of the specified arm 130B and the angle of the boom 130A with respect to the front-rear axis of the vehicle body coordinate system.
- Based on the angle of the arm 130B and the known length of the arm 130B, the posture specifying unit 612 specifies the arm vector, which extends in the vehicle body coordinate system from the base end (base end pin) of the arm 130B to the tip (tip pin) of the arm 130B.
- the posture specifying unit 612 obtains the position vector of the tip (tip pin) of the arm 130B in the vehicle body coordinate system by adding the position vector of the tip (tip pin) of the boom 130A in the vehicle body coordinate system and the arm vector. Identify.
- the posture specifying unit 612 specifies the angle of the bucket 130C with respect to the arm 130B based on the stroke length of the bucket cylinder 131C.
- the posture specifying unit 612 specifies the angle of the bucket 130C with respect to the front-rear axis by adding the angle of the specified bucket 130C and the angle of the arm 130B with respect to the front-rear axis of the vehicle body coordinate system.
- The posture specifying unit 612 specifies the right bucket vector and the left bucket vector based on the angle of the bucket 130C, the known length from the base end (base end pin) of the bucket 130C to the cutting edge 130D, and the known width of the cutting edge 130D.
- the right bucket vector is a vector extending from the base end (base end pin) of the bucket 130C to the right end of the cutting edge 130D in the vehicle body coordinate system.
- the left bucket vector is a vector extending from the base end of the bucket 130C to the left end of the cutting edge 130D.
- the posture specifying unit 612 specifies the position vector of the left end of the cutting edge 130D in the vehicle body coordinate system by adding the position vector of the tip (pin of the tip) of the arm 130B in the vehicle body coordinate system and the left bucket vector. Further, the posture specifying unit 612 specifies the position vector of the right end of the cutting edge 130D in the vehicle body coordinate system by adding the position vector of the tip (pin of the tip) of the arm 130B in the vehicle body coordinate system and the right bucket vector.
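The angle-accumulation and vector-chain procedure above can be sketched in the vertical (front-rear/up-down) plane as follows; the link lengths, boom base position, and edge width are illustrative values, not machine data from the publication.

```python
import math

def edge_positions(boom_angle, arm_angle_rel, bucket_angle_rel,
                   boom_len=6.0, arm_len=3.0, bucket_len=2.0,
                   boom_base=(0.5, 1.5), edge_width=1.8):
    """Accumulate the boom, arm, and bucket vectors from the boom base pin
    and return both ends of the cutting edge in a simplified vehicle body
    coordinate system (x front, y left, z up). Dimensions are examples."""
    a1 = boom_angle                     # boom angle vs. the front-rear axis
    a2 = a1 + arm_angle_rel             # arm angle vs. the front-rear axis
    a3 = a2 + bucket_angle_rel          # bucket angle vs. the front-rear axis
    x, z = boom_base
    for ang, length in ((a1, boom_len), (a2, arm_len), (a3, bucket_len)):
        x += length * math.cos(ang)     # add each link vector in turn
        z += length * math.sin(ang)
    # the two edge ends share (x, z) and differ only along the left-right axis
    left = (x, +edge_width / 2.0, z)
    right = (x, -edge_width / 2.0, z)
    return left, right
```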
- The posture specifying unit 612 specifies the position of each part in the site coordinate system by translating the position of each part in the vehicle body coordinate system based on the position of the work machine 100 in the site coordinate system, and rotating it based on the orientation (yaw angle) of the swivel body 120 and the roll angle and pitch angle of the work machine 100.
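A minimal sketch of this rotate-and-translate step, assuming x-front/y-left/z-up axes and a Z-Y-X (yaw-pitch-roll) rotation order, which the publication does not specify:

```python
import numpy as np

def body_to_site(p_body, origin_site, yaw, pitch, roll):
    """Convert a vehicle-body-coordinate point to site coordinates: rotate
    by yaw, pitch, and roll, then translate by the machine's position in
    the site coordinate system. Axis and rotation-order conventions are
    assumptions for illustration."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw about z
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about y
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about x
    return Rz @ Ry @ Rx @ np.asarray(p_body) + np.asarray(origin_site)
```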
- FIG. 6 is a flowchart showing a display control process by the remote control device 540 according to the first embodiment.
- the remote control device 540 executes the display control process shown in FIG. 6 at regular intervals.
- the data acquisition unit 611 acquires, from the vehicle control device 126 of the work machine 100, data indicating the image captured by the camera 122, the turning speed, position, orientation, and tilt angle of the swivel body 120, the posture of the work machine 130, and the traveling speed of the traveling body 110 (step S1).
- the posture specifying unit 612 specifies the positions of both ends of the cutting edge 130D in the vehicle body coordinate system based on the data acquired in step S1 (step S2).
- the cutting edge shadow generation unit 613 specifies the position, in the vehicle body coordinate system, of the cutting edge shadow projected onto the projection surface by rewriting the vertical axis component of the positions of both ends of the cutting edge 130D specified in step S2 to zero (step S3).
- the cutting edge shadow generation unit 613 converts the position of the cutting edge shadow in the vehicle body coordinate system to the position in the image coordinate system based on the camera parameters (step S4).
- the cutting edge shadow generation unit 613 generates a cutting edge shadow image G1 by drawing a line segment at the converted position (step S5).
- the display image generation unit 614 generates a display image by superimposing the cutting edge shadow image G1 generated in step S5 and the cutting edge arrival gauge image G2 prepared in advance on the captured image acquired in step S1 (step S6).
- the display control unit 615 outputs a display signal for displaying the display image generated in step S6 to the display device 520 (step S7). As a result, the display image as shown in FIG. 4 is displayed on the display device 520.
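Steps S3 and S4 above (zeroing the vertical component, then converting to image coordinates with the camera parameters) can be sketched as follows; the pinhole-camera parameterization `K`, `R_cam`, `t_cam` is an assumption for illustration, not the patent's stated camera model:

```python
import numpy as np

def project_edge_to_image(edge_ends_body, K, R_cam, t_cam):
    """Drop each cutting edge endpoint onto the projection surface by zeroing
    its vertical component (step S3), then map the result into image
    coordinates with a pinhole camera model (step S4)."""
    pixels = []
    for p in edge_ends_body:
        shadow = np.array([p[0], p[1], 0.0])  # step S3: vertical component -> 0
        cam = R_cam @ shadow + t_cam          # body frame -> camera frame
        u, v, w = K @ cam                     # step S4: pinhole projection
        pixels.append((u / w, v / w))
    return pixels  # step S5 draws a line segment between these points
```

Drawing a line segment between the two returned pixel positions yields the cutting edge shadow image G1 of step S5.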
- the remote control device 540 causes the display device 520 to display a display image in which a captured image showing the work machine 130, a cutting edge shadow image G1 obtained by projecting the cutting edge 130D vertically onto the projection surface, and a left line G21 and a right line G22 passing through both ends of the cutting edge shadow image G1 and extending in the front-rear direction along the projection surface are superimposed.
- the remote control device 540 can suppress a decrease in work efficiency when working with the work machine 100.
- the display image according to the first embodiment includes a reference range graphic G26 showing a reachable range under the condition that the cutting edge 130D is in contact with the projection surface F1.
- the left line G21 and the right line G22 included in the display image according to the first embodiment extend to the position farthest from the turning center axis O of the work machine 100 within the reachable range of the cutting edge 130D, and the maximum reach line G23 is displayed at that farthest position.
- the operator can determine whether or not an excavation target ahead of the current position can be excavated by visually recognizing the display image.
- the same effect can be obtained even if the maximum reach line G23 is not displayed and the left line G21 and the right line G22 extend only to the leading edge of the reachable range.
- conversely, if the maximum reach line G23 is displayed, the same effect can be obtained even if the left line G21 and the right line G22 extend to infinity.
- the display image according to the first embodiment includes a scale line G24 and a scale value G25 indicating the distances from the rotation center axis O for each of the plurality of positions away from the rotation center axis O.
- the operator can recognize the position of the cutting edge 130D in the depth direction by visually recognizing the displayed image.
- the same effect can be obtained even if either the scale line G24 or the scale value G25 is not displayed.
- the cutting edge shadow image G1 and the cutting edge arrival gauge image G2 according to the first embodiment are images projected on a projection surface F1 which is a plane passing through the bottom surface of the traveling body 110.
- the cutting edge shadow image G1 and the cutting edge reaching gauge image G2 according to the second embodiment are projected on the ground surface F2. That is, the projection plane according to the second embodiment is the ground surface F2.
- FIG. 7 is an external view of the work machine 100 according to the second embodiment.
- the work machine 100 according to the second embodiment further includes a depth detection device 127 in addition to the configuration of the first embodiment.
- the depth detection device 127 is provided in the vicinity of the camera 122 and detects the depth in the same direction as the image pickup direction of the camera 122.
- the depth is the distance from the depth detection device 127 to the object.
- Examples of the depth detection device 127 include a LiDAR device, a radar device, and a stereo camera.
- the detection range of the depth detection device 127 is substantially the same as the image pickup range of the camera 122.
- FIG. 8 is a schematic block diagram showing the configuration of the remote control device 540 according to the second embodiment.
- the remote control device 540 according to the second embodiment further includes a terrain update unit 618 and a gauge generation unit 619 in addition to the configuration according to the first embodiment. Further, the remote control device 540 according to the second embodiment is different from the first embodiment in the processing of the cutting edge shadow generation unit 613.
- the terrain update unit 618 updates the terrain data indicating the three-dimensional shape of the work target in the site coordinate system based on the depth data acquired by the data acquisition unit 611 from the depth detection device 127. Specifically, the terrain update unit 618 updates the terrain data according to the following procedure.
- the terrain update unit 618 converts the depth data into three-dimensional data related to the vehicle body coordinate system. Since the depth detection device 127 is fixed to the swivel body 120, the conversion function between the depth data and the vehicle body coordinate system can be obtained in advance.
- the terrain updating unit 618 removes a portion of the working machine 130 from the generated three-dimensional data based on the posture of the working machine 130 in the vehicle body coordinate system specified by the posture specifying unit 612.
- the terrain updating unit 618 converts the three-dimensional data in the vehicle body coordinate system into the three-dimensional data in the field coordinate system based on the position and posture of the vehicle body acquired by the data acquisition unit 611.
- the terrain update unit 618 updates the terrain data already recorded in the main memory 630 using the newly generated three-dimensional data. That is, the portion of the already recorded topographical data that overlaps with the new 3D data is replaced with the value of the new 3D data. As a result, the terrain update unit 618 can always record the latest terrain data in the main memory 630.
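The replacement of overlapping portions described above can be sketched with a simple grid map; the dictionary-of-cells representation and the cell size are assumptions for illustration, not the patent's data structure:

```python
def update_terrain(terrain, new_points, cell=0.5):
    """Terrain is a dict mapping (ix, iy) grid cells to heights; cells covered
    by newly observed 3D points are overwritten, so the map always holds the
    latest measurement, as the terrain update unit 618 does."""
    for x, y, z in new_points:
        key = (round(x / cell), round(y / cell))  # quantize to a grid cell
        terrain[key] = z                          # replace with the newest value
    return terrain
```

Cells not covered by the new depth data keep their previously recorded values, so the recorded topography is only ever refined, never discarded.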
- the gauge generation unit 619 generates the cutting edge arrival gauge image G2 projected on the ground surface F2 based on the topographical data. For example, the gauge generation unit 619 generates the cutting edge arrival gauge image G2 by the following procedure.
- the gauge generation unit 619 converts the portion of the terrain data included in the imaging range into the vehicle body coordinate system based on the position and posture of the vehicle body acquired by the data acquisition unit 611.
- the gauge generation unit 619 uses the terrain data in the vehicle body coordinate system to project a known reach of the cutting edge 130D and a plurality of lines that divide the reach at equal intervals onto the ground surface F2. As a result, the gauge generation unit 619 specifies the positions of the left line G21, the right line G22, the maximum reach line G23, and the scale line G24 in the vehicle body coordinate system.
- the gauge generation unit 619 specifies, as the reference range figure G26, the surface where the known reachable range R of the cutting edge 130D overlaps the terrain data in the vehicle body coordinate system, which represents the reachable range under the condition that the cutting edge 130D is brought into contact with the ground surface F2.
- the gauge generation unit 619 converts the left line G21, the right line G22, the maximum reach line G23, the scale line G24, and the reference range graphic G26 into an image based on the camera parameters of the camera 122.
- the gauge generation unit 619 attaches a scale value G25 in the vicinity of each scale line G24 of the converted image.
- the gauge generation unit 619 can generate the cutting edge arrival gauge image G2 projected on the ground surface F2.
- the cutting edge shadow generation unit 613 generates a cutting edge shadow image G1 in which the cutting edge 130D is projected onto the ground surface F2 based on the topographical data.
- the display image generation unit 614 generates a display image by superimposing the blade edge shadow image G1 and the blade edge arrival gauge image G2 on the captured image acquired by the data acquisition unit 611.
- FIG. 9 is a diagram showing an example of a display image according to the second embodiment.
- the cutting edge arrival gauge image G2 includes a left line G21, a right line G22, a maximum arrival line G23, a scale line G24, a scale value G25, and a reference range graphic G26.
- FIG. 10 is a side view showing the relationship between the cutting edge shadow image G1 and the cutting edge arrival gauge image G2 according to the second embodiment.
- the cutting edge shadow image G1 and the cutting edge arrival gauge image G2 according to the second embodiment are drawn along the ground surface F2 detected by the depth detection device 127. Therefore, when the cutting edge shadow image G1 and the cutting edge reaching gauge image G2 are superimposed on the captured image, the cutting edge shadow image G1 and the cutting edge reaching gauge image G2 appear to be stuck on the ground surface F2.
- the reference range figure G26 according to the second embodiment represents the reachable range under the condition that the cutting edge 130D is in contact with the ground surface F2, but the reference range figure is not limited to this.
- the reference range figure G26 according to another embodiment may represent the reachable range under the condition that the cutting edge 130D is brought into contact with a plane passing through the bottom surface of the traveling body 110, as in the first embodiment.
- the gauge generation unit 619 generates the reference range figure G26 by projecting the reachable range under the condition that the cutting edge 130D is in contact with the plane passing through the bottom surface of the traveling body 110 on the ground surface F2.
- the reference range figure G26 generated by the remote control device 540 according to the first and second embodiments represents the reachable range under the condition that the cutting edge 130D is in contact with the projection surface (a plane passing through the bottom surface of the traveling body 110, or the ground surface).
- in contrast, the remote control device 540 according to the third embodiment represents the reachable range of the cutting edge 130D under the condition that only the arm 130B is driven. This is because, in typical use of a loading shovel, excavation of the work target is often performed by pushing the arm 130B, so the piston of the arm cylinder 131B is more likely to hit the stroke end than those of the boom cylinder 131A and the bucket cylinder 131C.
- the configuration of the work system 1 according to the third embodiment is basically the same as that of the first embodiment.
- FIG. 11 is a schematic block diagram showing the configuration of the remote control device 540 according to the third embodiment.
- the remote control device 540 according to the third embodiment further includes a reference range specifying unit 620 in addition to the configuration according to the first embodiment.
- the reference range specifying unit 620 calculates, based on the postures of the boom 130A and the bucket 130C specified by the posture specifying unit 612, the reachable range of the cutting edge 130D when the boom 130A and the bucket 130C are fixed and only the arm 130B is driven.
- FIG. 12 is a side view showing the relationship between the cutting edge shadow image G1 and the cutting edge arrival gauge image G2 according to the third embodiment.
- the reference range specifying unit 620 specifies the rotation center P (pin center) of the arm 130B based on the posture of the boom 130A, and specifies the length L from the rotation center to the cutting edge 130D based on the posture of the bucket 130C. Then, the reference range specifying unit 620 calculates the reachable range R1 of the cutting edge 130D when only the arm 130B is driven, based on the known rotation range of the arm 130B. The reference range specifying unit 620 generates the reference range figure G26 by projecting the calculated reachable range R1 onto the projection surface F1 from the vertical direction. The reference range figure G26 generated by the reference range specifying unit 620 changes each time the posture of at least one of the boom 130A and the bucket 130C changes.
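The reachable range R1 computed here is essentially a circular arc of radius L about the pin center P, swept through the arm's rotation range and projected vertically onto F1. A sketch under assumed angle conventions (the function name and angle measures are hypothetical):

```python
import math

def arm_only_reach(P, L, theta_min, theta_max, n=50):
    """With boom and bucket fixed, the cutting edge sweeps a circular arc of
    radius L about the arm pin center P within the arm's known rotation range.
    P is (x, z) in a side-view plane; angles are measured from the x axis."""
    px, pz = P
    arc = [(px + L * math.cos(t), pz + L * math.sin(t))
           for t in (theta_min + (theta_max - theta_min) * i / (n - 1)
                     for i in range(n))]
    # Projecting the arc onto the projection surface F1 (z = 0) from the
    # vertical direction keeps only the horizontal extent of the range.
    xs = [x for x, _ in arc]
    return min(xs), max(xs)  # trailing and leading edge of R1 on F1
```

The returned pair corresponds to the trailing and leading edges of the reference range figure G26, which shift whenever the boom or bucket posture changes P or L.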
- the operator can remotely control the work machine 100 so that the piston of the arm cylinder 131B does not hit the stroke end, by operating the work machine 130 so that the cutting edge shadow image G1 does not reach the end of the reference range figure G26.
- the cutting edge arrival gauge image G2 according to the third embodiment has a shape projected on the projection surface F1, but is not limited to this.
- the cutting edge arrival gauge image G2 according to another embodiment may have a shape projected on the ground surface F2 as in the second embodiment.
- the remote control device 540 may be configured by a single computer, or the functions of the remote control device 540 may be divided among a plurality of computers that cooperate with each other to function as the remote control device 540. In this case, some of the computers constituting the remote control device 540 may be provided inside the remote control room 500, and other computers may be provided outside the remote control room 500. For example, some of the computers constituting the remote control device 540 may be provided in the work machine 100.
- FIG. 13 is a diagram showing an example of a display image according to another embodiment.
- the cutting edge arrival gauge image G2 according to the above-described embodiment includes the left line G21 and the right line G22, so that the operator can recognize the range excavated by the working machine 130.
- the cutting edge arrival gauge image G2 according to another embodiment may include a center line G27 instead of the left line G21 and the right line G22 in the display image.
- the center line G27 passes through the center point of the cutting edge 130D and extends in the front-rear direction along the projection plane.
- the operator can recognize the position of the cutting edge 130D in the depth direction from at least one of the end point of the center line G27, the maximum reach line G23, the scale line G24, the scale value G25, and the reference range figure G26.
- the reference range figure G26 shows the leading edge and the trailing edge of the reachable range of the cutting edge 130D under predetermined conditions, but other embodiments are not limited to this.
- since the work machine 100 is a loading shovel, the cutting edge 130D of the bucket 130C faces forward, and excavation work is usually performed by pushing the arm 130B. Therefore, the leading edge of the reachable range is more likely to hit the stroke end than the trailing edge. Accordingly, the reference range figure G26 according to another embodiment may represent only the leading edge of the reachable range of the cutting edge 130D under predetermined conditions.
- the reference range figure G26 may represent only the trailing edge of the reachable range of the cutting edge 130D under predetermined conditions.
Abstract
Description
This application claims priority to Japanese Patent Application No. 2020-163449 filed in Japan on September 29, 2020, the contents of which are incorporated herein by reference.
Patent Document 1 discloses a technique for displaying a mesh-like line image on the surface of the work target shown in a captured image in order to give the operator a sense of perspective.
An object of the present disclosure is to provide a display control device and a display method capable of presenting an operator with information for reducing the possibility that the piston of a hydraulic cylinder hits the stroke end.
<<Work System 1>>
FIG. 1 is a schematic diagram showing the configuration of the work system 1 according to the first embodiment.
The work system 1 includes a work machine 100 and a remote control room 500. The work machine 100 operates at a work site. Examples of the work site include mines and quarries. The remote control room 500 is provided at a remote location away from the work site. Examples of the remote location include a city and a place within the work site. That is, the operator remotely operates the work machine 100 from a distance at which the work machine 100 cannot be visually recognized.
FIG. 2 is an external view of the work machine 100 according to the first embodiment.
The work machine 100 according to the first embodiment is a loading shovel (face shovel). A work machine 100 according to another embodiment may be another type of work machine such as a backhoe, a wheel loader, or a bulldozer.
The work machine 100 includes a traveling body 110, a swivel body 120 supported by the traveling body 110, and a work machine 130 that is hydraulically operated and supported by the swivel body 120. The swivel body 120 is supported so as to be able to swivel about the turning center axis O. The work machine 130 is provided at the front of the swivel body 120.
The base end of the boom 130A is attached to the swivel body 120 via a pin.
The arm 130B connects the boom 130A and the bucket 130C. The base end of the arm 130B is attached to the tip of the boom 130A via a pin.
The bucket 130C includes a cutting edge 130D for excavating earth and sand, and a container for storing the excavated earth and sand. The base end of the bucket 130C is attached to the tip of the arm 130B via a pin.
The boom cylinder 131A is a hydraulic cylinder for operating the boom 130A. The base end of the boom cylinder 131A is attached to the swivel body 120. The tip of the boom cylinder 131A is attached to the boom 130A.
The arm cylinder 131B is a hydraulic cylinder for driving the arm 130B. The base end of the arm cylinder 131B is attached to the boom 130A. The tip of the arm cylinder 131B is attached to the arm 130B.
The bucket cylinder 131C is a hydraulic cylinder for driving the bucket 130C. The base end of the bucket cylinder 131C is attached to the boom 130A. The tip of the bucket cylinder 131C is attached to the bucket 130C.
The arm posture sensor 132B is provided on the arm cylinder 131B and detects the stroke length of the arm cylinder 131B.
The bucket posture sensor 132C is provided on the bucket cylinder 131C and detects the stroke length of the bucket cylinder 131C.
The remote control room 500 includes a driver's seat 510, a display device 520, an operation device 530, and a remote control device 540.
The display device 520 is arranged in front of the driver's seat 510 and is positioned before the operator's eyes when the operator sits in the driver's seat 510. The display device 520 may be composed of a plurality of displays arranged side by side, or, as shown in FIG. 1, of one large display. The display device 520 may also project an image onto a curved or spherical surface with a projector or the like.
The operation device 530 is arranged in the vicinity of the driver's seat 510 and is positioned within the range the operator can operate when seated in the driver's seat 510.
The remote control device 540 is a computer including a processor 610, a main memory 630, a storage 650, and an interface 670. The storage 650 stores a program. The processor 610 reads the program from the storage 650, expands it in the main memory 630, and executes processing according to the program. The remote control device 540 is connected to a network via the interface 670.
In other embodiments, the remote control device 540 may include a custom LSI (Large Scale Integrated Circuit) such as a PLD (Programmable Logic Device) in addition to or instead of the above configuration. Examples of the PLD include a PAL (Programmable Array Logic), a GAL (Generic Array Logic), a CPLD (Complex Programmable Logic Device), and an FPGA (Field Programmable Gate Array). In this case, some or all of the functions realized by the processor 610 may be realized by the integrated circuit. Such an integrated circuit is also included among examples of a processor.
The right line G22 is a line indicating the reachable range of the right end of the cutting edge 130D. As shown in FIG. 4, the right line G22 passes through the right end of the cutting edge shadow image G1.
The scale lines G24 are provided at equal intervals. In the example shown in FIG. 4, the scale lines G24 are provided at intervals of 2 meters. Each scale line G24 is provided so as to connect the left line G21 and the right line G22.
The maximum reach line G23 and the scale lines G24 are lines parallel to the cutting edge shadow image G1.
Each scale value G25 is provided corresponding to a scale line G24 and numerically represents the distance indicated by that scale line G24. In the example shown in FIG. 4, the scale value G25 is provided near the right end of the scale line G24.
As shown in FIG. 5, the reference range figure G26 indicates the range where the reachable range of the cutting edge 130D overlaps the projection surface.
The operation signal input unit 616 receives an operation signal from the operation device 530.
The operation signal output unit 617 transmits the operation signal received by the operation signal input unit 616 to the work machine 100.
Here, the method of specifying the posture by the posture specifying unit 612 will be described. The posture specifying unit 612 specifies the positions, in the vehicle body coordinate system and in the site coordinate system, of the tip (tip pin) of the boom 130A, the tip (tip pin) of the arm 130B, and both ends of the cutting edge 130D in the following procedure.
FIG. 6 is a flowchart showing the display control process by the remote control device 540 according to the first embodiment. When the operator starts remote operation of the work machine 100 from the remote control room 500, the remote control device 540 executes the display control process shown in FIG. 6 at regular intervals.
As a result, a display image as shown in FIG. 4 is displayed on the display device 520.
As described above, according to the first embodiment, the remote control device 540 causes the display device 520 to display a display image in which a captured image showing the work machine 130, a cutting edge shadow image G1 obtained by projecting the cutting edge 130D vertically onto the projection surface, and a left line G21 and a right line G22 passing through both ends of the cutting edge shadow image G1 and extending in the front-rear direction along the projection surface are superimposed. This allows the operator to easily recognize the range of the work target to be excavated by the work machine 130. That is, the operator can recognize that the portion of the work target shown in the captured image that is sandwiched between the left line G21 and the right line G22 will be excavated, and can estimate the amount of soil to be excavated. Therefore, the remote control device 540 can suppress a decrease in work efficiency when working with the work machine 100.
The cutting edge shadow image G1 and the cutting edge arrival gauge image G2 according to the first embodiment are images projected onto the projection surface F1, which is a plane passing through the bottom surface of the traveling body 110. In contrast, the cutting edge shadow image G1 and the cutting edge arrival gauge image G2 according to the second embodiment are projected onto the ground surface F2. That is, the projection surface according to the second embodiment is the ground surface F2.
FIG. 7 is an external view of the work machine 100 according to the second embodiment. The work machine 100 according to the second embodiment further includes a depth detection device 127 in addition to the configuration of the first embodiment. The depth detection device 127 is provided in the vicinity of the camera 122 and detects the depth in the same direction as the imaging direction of the camera 122. The depth is the distance from the depth detection device 127 to the object. Examples of the depth detection device 127 include a LiDAR device, a radar device, and a stereo camera. The detection range of the depth detection device 127 is substantially the same as the imaging range of the camera 122.
FIG. 8 is a schematic block diagram showing the configuration of the remote control device 540 according to the second embodiment. The remote control device 540 according to the second embodiment further includes a terrain update unit 618 and a gauge generation unit 619 in addition to the configuration of the first embodiment. The remote control device 540 according to the second embodiment also differs from the first embodiment in the processing of the cutting edge shadow generation unit 613.
Like the gauge generation unit 619, the cutting edge shadow generation unit 613 generates a cutting edge shadow image G1 in which the cutting edge 130D is projected onto the ground surface F2 based on the terrain data.
The reference range figure G26 generated by the remote control device 540 according to the first and second embodiments represents the reachable range under the condition that the cutting edge 130D is brought into contact with the projection surface (a plane passing through the bottom surface of the traveling body 110, or the ground surface). In contrast, the remote control device 540 according to the third embodiment represents the reachable range of the cutting edge 130D under the condition that only the arm 130B is driven. This is because, in typical use of a loading shovel, excavation of the work target is often performed by pushing the arm 130B, so the piston of the arm cylinder 131B is more likely to hit the stroke end than those of the boom cylinder 131A and the bucket cylinder 131C. The configuration of the work system 1 according to the third embodiment is basically the same as that of the first embodiment.
FIG. 11 is a schematic block diagram showing the configuration of the remote control device 540 according to the third embodiment. The remote control device 540 according to the third embodiment further includes a reference range specifying unit 620 in addition to the configuration of the first embodiment. The reference range specifying unit 620 calculates, based on the postures of the boom 130A and the bucket 130C specified by the posture specifying unit 612, the reachable range of the cutting edge 130D when the boom 130A and the bucket 130C are fixed and only the arm 130B is driven.
The cutting edge arrival gauge image G2 according to the third embodiment has a shape projected onto the projection surface F1, but is not limited to this. For example, the cutting edge arrival gauge image G2 according to another embodiment may have a shape projected onto the ground surface F2 as in the second embodiment.
Although one embodiment has been described in detail above with reference to the drawings, the specific configuration is not limited to the above, and various design changes and the like are possible. That is, in other embodiments, the order of the above-described processing may be changed as appropriate, and some processing may be executed in parallel.
Claims (8)
- A display control device that displays an image used for operating a work machine including a work implement, the display control device comprising:
a captured image acquisition unit that acquires, from a camera provided on the work machine, a captured image showing the work implement;
a cutting edge shadow generation unit that generates a cutting edge shadow in which a cutting edge of the work implement is projected onto a projection surface in the vertical direction;
a display image generation unit that generates a display image in which the captured image, the cutting edge shadow, and a reference range figure in which a reachable range of the cutting edge is projected onto the projection surface in the vertical direction are superimposed; and
a display control unit that outputs a display signal for displaying the display image.
- The display control device according to claim 1, wherein the reference range figure is a figure obtained by projecting at least one of a leading edge and a trailing edge of the reachable range of the cutting edge.
- The display control device according to claim 1 or 2, wherein the reference range figure is a figure obtained by projecting the reachable range of the cutting edge under a predetermined condition.
- The display control device according to claim 3, wherein the reference range figure is a figure obtained by projecting the reachable range of the cutting edge under a condition that the cutting edge is brought into contact with the projection surface.
- The display control device according to claim 3, wherein the work implement includes a boom, an arm, and a bucket, and the reference range figure is a figure obtained by projecting the reachable range of the cutting edge under a condition that the arm is moved without moving the boom and the bucket.
- The display control device according to any one of claims 3 to 5, wherein the display image includes a reachable range figure obtained by projecting the reachable range of the cutting edge when the condition is not imposed.
- The display control device according to any one of claims 1 to 6, wherein the projection surface is a plane passing through a ground contact surface of the work machine.
- A display control method for displaying an image used for operating a work machine including a work implement, the method comprising:
a step of acquiring, from a camera provided on the work machine, a captured image showing the work implement;
a step of generating a cutting edge shadow in which a cutting edge of the work implement is projected onto a projection surface in the vertical direction;
a step of generating a display image in which the captured image, the cutting edge shadow, and a reference range figure in which a reachable range of the cutting edge is projected onto the projection surface in the vertical direction are superimposed; and
a step of displaying the display image.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA3187228A CA3187228A1 (en) | 2020-09-29 | 2021-08-27 | Display control device and display control method |
US18/019,913 US20230267895A1 (en) | 2020-09-29 | 2021-08-27 | Display control device and display control method |
AU2021352215A AU2021352215A1 (en) | 2020-09-29 | 2021-08-27 | Display control device and display control method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020-163449 | 2020-09-29 | ||
JP2020163449A JP2022055808A (ja) | 2020-09-29 | 2020-09-29 | 表示制御装置及び表示方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022070707A1 true WO2022070707A1 (ja) | 2022-04-07 |
Family
ID=80951341
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/031517 WO2022070707A1 (ja) | 2020-09-29 | 2021-08-27 | 表示制御装置及び表示方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20230267895A1 (ja) |
JP (1) | JP2022055808A (ja) |
AU (1) | AU2021352215A1 (ja) |
CA (1) | CA3187228A1 (ja) |
WO (1) | WO2022070707A1 (ja) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018146782A1 (ja) * | 2017-02-09 | 2018-08-16 | 株式会社小松製作所 | 作業車両及び表示装置 |
WO2019189430A1 (ja) * | 2018-03-28 | 2019-10-03 | コベルコ建機株式会社 | 建設機械 |
WO2019244574A1 (ja) * | 2018-06-19 | 2019-12-26 | 住友建機株式会社 | 掘削機、情報処理装置 |
JP2020002718A (ja) * | 2018-06-29 | 2020-01-09 | 株式会社小松製作所 | 表示制御装置、および表示制御方法 |
-
2020
- 2020-09-29 JP JP2020163449A patent/JP2022055808A/ja active Pending
-
2021
- 2021-08-27 CA CA3187228A patent/CA3187228A1/en active Pending
- 2021-08-27 WO PCT/JP2021/031517 patent/WO2022070707A1/ja active Application Filing
- 2021-08-27 US US18/019,913 patent/US20230267895A1/en active Pending
- 2021-08-27 AU AU2021352215A patent/AU2021352215A1/en active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018146782A1 (ja) * | 2017-02-09 | 2018-08-16 | 株式会社小松製作所 | 作業車両及び表示装置 |
WO2019189430A1 (ja) * | 2018-03-28 | 2019-10-03 | コベルコ建機株式会社 | 建設機械 |
WO2019244574A1 (ja) * | 2018-06-19 | 2019-12-26 | 住友建機株式会社 | 掘削機、情報処理装置 |
JP2020002718A (ja) * | 2018-06-29 | 2020-01-09 | 株式会社小松製作所 | 表示制御装置、および表示制御方法 |
Also Published As
Publication number | Publication date |
---|---|
US20230267895A1 (en) | 2023-08-24 |
CA3187228A1 (en) | 2022-04-07 |
JP2022055808A (ja) | 2022-04-08 |
AU2021352215A1 (en) | 2023-03-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7285051B2 (ja) | 表示制御装置、および表示制御方法 | |
JP7372029B2 (ja) | 表示制御装置、表示制御システムおよび表示制御方法 | |
JP6832548B2 (ja) | 作業機械の画像表示システム、作業機械の遠隔操作システム、作業機械及び作業機械の画像表示方法 | |
JP2018035645A (ja) | 作業機械の画像表示システム | |
JP7080750B2 (ja) | 表示制御システム、遠隔操作システム、表示制御装置、および表示制御方法 | |
JP2016160741A (ja) | 作業機械の画像表示システム、作業機械の遠隔操作システム及び作業機械 | |
JP7420733B2 (ja) | 表示制御システムおよび表示制御方法 | |
JP2024052764A (ja) | 表示制御装置及び表示方法 | |
JP2020033704A (ja) | 作業機械 | |
JP2022164713A (ja) | 作業機械の画像表示システム及び作業機械の画像表示方法 | |
WO2022070707A1 (ja) | 表示制御装置及び表示方法 | |
JP7128497B2 (ja) | 作業機械の画像表示システム | |
JP7333551B2 (ja) | 作業機械の画像表示システム | |
JP7390991B2 (ja) | 作業機械および施工支援システム | |
WO2021256528A1 (ja) | 校正装置および校正方法 | |
JP2020197045A (ja) | 表示システムおよび表示方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21875010 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 3187228 Country of ref document: CA |
|
ENP | Entry into the national phase |
Ref document number: 2021352215 Country of ref document: AU Date of ref document: 20210827 Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21875010 Country of ref document: EP Kind code of ref document: A1 |