WO2019159687A1 - Display system and generation method for a work vehicle - Google Patents
Display system and generation method for a work vehicle
- Publication number
- WO2019159687A1 (PCT/JP2019/003013; JP2019003013W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- work vehicle
- image
- shape
- controller
- surrounding environment
- Prior art date
Classifications
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/211—Output arrangements using visual output, e.g. blinking lights or matrix displays, producing three-dimensional [3D] effects, e.g. stereoscopic images
- B60K35/22—Display screens
- B60K35/23—Head-up displays [HUD]
- B60K35/28—Output arrangements characterised by the type or purpose of the output information, e.g. video entertainment or vehicle dynamics information
- B60K2360/176—Camera images
- B60K2360/21—Optical features of instruments using cameras
- B60K2360/334—Projection means
- B60K2360/61—Structural details of dashboards or instruments specially adapted for utility vehicles
- B60Y2200/411—Bulldozers, Graders
- E02F3/7618—Scraper blade mounted forwardly of the tractor on a pair of pivoting arms linked to the sides of the tractor, e.g. bulldozers, with the scraper blade adjustable relative to the pivoting arms about a horizontal axis
- E02F9/261—Surveying the work-site to be treated
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01C15/00—Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
- G06T17/05—Geographic models
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
- G06T7/70—Determining position or orientation of objects or cameras
- H04N13/122—Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
- H04N13/275—Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
- H04N13/302—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
- H04N13/363—Image reproducers using image projection screens
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
Definitions
- the present invention relates to a display system for a work vehicle and a method of generating a display image of its surrounding environment.
- in the prior art, the shape of the projection model is always hemispherical and constant. Therefore, it is difficult to grasp the actual shape of the surrounding environment of the work vehicle from the overhead image.
- likewise, the bottom surface of the projection model is always a flat plane. Therefore, even when the ground surface around the work vehicle is inclined or uneven, the captured image is projected onto a flat projection surface. For this reason, it is not easy to grasp from the image that the terrain is inclined or uneven.
- An object of the present invention is to generate a display image from which the shape of the surrounding environment of the work vehicle can be easily grasped.
- the work vehicle display system includes a camera, a shape sensor, and a controller.
- the camera captures an image of the surrounding environment of the work vehicle and outputs image data indicating the image.
- the shape sensor measures the three-dimensional shape of the surrounding environment and outputs 3D shape data indicating the three-dimensional shape.
- the controller acquires image data and 3D shape data.
- the controller generates a three-dimensional projection model based on the 3D shape data.
- the three-dimensional projection model represents the three-dimensional shape of the surrounding environment.
- according to this configuration, the three-dimensional shape of the environment surrounding the work vehicle is measured by the shape sensor, and the three-dimensional projection model is generated based on the measured shape. Therefore, the three-dimensional projection model has a shape that is the same as, or approximates, the actual shape of the environment surrounding the work vehicle. By projecting the image captured by the camera onto this three-dimensional projection model, it is possible to generate a display image from which the shape of the surrounding environment of the work vehicle can be easily grasped.
- a generation method according to the present invention is executed by a controller to generate display image data indicating a display image of the surrounding environment of the work vehicle, and includes the following processes.
- the first process is to acquire image data indicating an image of the surrounding environment of the work vehicle.
- the second process is to acquire 3D shape data indicating the three-dimensional shape of the surrounding environment.
- the third process is to generate a three-dimensional projection model that represents the three-dimensional shape of the surrounding environment based on the 3D shape data.
- the fourth process is to generate display image data by projecting an image onto a three-dimensional projection model based on the image data.
- according to this method, the three-dimensional shape of the environment surrounding the work vehicle is measured by the shape sensor, and the three-dimensional projection model is generated based on the measured shape. Therefore, the three-dimensional projection model has a shape that is the same as, or approximates, the actual shape of the environment surrounding the work vehicle. By projecting the image captured by the camera onto this three-dimensional projection model, it is possible to generate a display image from which the shape of the surrounding environment of the work vehicle can be easily grasped.
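The four processes above can be sketched in Python as follows. This is only an illustrative outline: `triangulate`, `DisplayImageData`, and `generate_display_image` are hypothetical names, and the placeholder triangulation stands in for the actual model generation described later in the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float, float]
Triangle = Tuple[Point, Point, Point]

def triangulate(points: List[Point]) -> List[Triangle]:
    # Placeholder for process 3: group consecutive point triples into triangles.
    return [tuple(points[i:i + 3]) for i in range(0, len(points) - 2, 3)]

@dataclass
class DisplayImageData:
    model: List[Triangle]   # the 3D projection model (process 3)
    has_texture: bool       # whether a camera image was projected (process 4)

def generate_display_image(image_data: object, shape_points: List[Point]) -> DisplayImageData:
    # Processes 1 and 2: image data and 3D shape data are assumed already acquired
    # from the cameras and the shape sensor.
    model = triangulate(shape_points)
    # Process 4: project the image onto the model (represented here by a flag).
    return DisplayImageData(model=model, has_texture=image_data is not None)
```

The key design point is that the projection model is rebuilt from measured shape data each time, rather than being a fixed hemisphere.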
- FIG. 1 is a side view showing a work vehicle 1 according to the embodiment.
- the work vehicle 1 is a bulldozer.
- the work vehicle 1 includes a vehicle main body 3, a work implement 4, and a traveling device 5.
- the vehicle body 3 has an engine room 6. Drive units such as an engine 7 and a hydraulic pump 8 are arranged in the engine room 6. A ripper device 9 is attached to the rear portion of the vehicle body 3.
- the traveling device 5 is a device that causes the work vehicle 1 to travel.
- the traveling device 5 has a pair of crawler belts 11 disposed on either side of the work vehicle 1 in the vehicle width direction.
- each crawler belt 11 is a loop-shaped chain extending in the longitudinal direction of the work vehicle 1. The work vehicle 1 travels when the crawler belts 11 are driven.
- the work machine 4 is disposed in front of the vehicle body 3.
- the work machine 4 is used for work such as excavating, carrying, or leveling soil.
- the work machine 4 includes a blade 12, a tilt cylinder 13, a lift cylinder 14, and an arm 15.
- the blade 12 is supported by the vehicle body 3 via the arm 15.
- the blade 12 is provided so as to be swingable in the vertical direction.
- the tilt cylinder 13 and the lift cylinder 14 are driven by the hydraulic oil discharged from the hydraulic pump 8 to change the posture of the blade 12.
- FIG. 2 is a block diagram showing the configuration of the display system 2 according to the first embodiment and the flow of processing by the display system 2.
- the display system 2 includes a plurality of cameras C1-C4.
- the plurality of cameras C1-C4 are attached to the vehicle main body 3.
- the plurality of cameras C1-C4 are fisheye cameras.
- the angle of view of each of the plurality of cameras C1-C4 is 180 degrees. However, the angle of view of each of the plurality of cameras C1-C4 may be smaller than 180 degrees. Alternatively, the angle of view of each of the plurality of cameras C1-C4 may be greater than 180 degrees.
- the plurality of cameras C1-C4 includes a front camera C1, a first side camera C2, a rear camera C3, and a second side camera C4.
- the front camera C1 is attached to the front portion of the vehicle body 3.
- the vehicle main body 3 has a front camera support portion 16.
- the front camera support portion 16 extends upward and forward from the front portion of the vehicle body 3.
- the front camera C1 is attached to the front camera support 16.
- the rear camera C3 is attached to the rear part of the vehicle main body 3.
- the first side camera C2 is attached to one side of the vehicle body 3.
- the second side camera C4 is attached to the other side portion of the vehicle body 3.
- the first side camera C2 is attached to the left side portion of the vehicle main body 3, and the second side camera C4 is attached to the right side portion of the vehicle main body 3.
- the first side camera C2 may be attached to the right side portion of the vehicle body 3, and the second side camera C4 may be attached to the left side portion of the vehicle body 3.
- the front camera C1 images the front of the vehicle body 3 and acquires an image including the surrounding environment of the work vehicle 1.
- the rear camera C3 captures the rear of the vehicle body 3 and acquires an image including the surrounding environment of the work vehicle 1.
- the first side camera C2 images the left side of the vehicle body 3 and acquires an image including the surrounding environment of the work vehicle 1.
- the second side camera C4 captures the right side of the vehicle body 3 and acquires an image including the surrounding environment of the work vehicle 1.
- Cameras C1-C4 output image data indicating the acquired image.
- the display system 2 includes a controller 20, a shape sensor 21, an attitude sensor 22, a position sensor 23, and a display 24.
- the shape sensor 21 measures the three-dimensional shape of the surrounding environment of the work vehicle 1 and outputs 3D shape data D1 indicating the three-dimensional shape.
- the shape sensor 21 measures the positions of a plurality of points on the surrounding environment of the work vehicle 1.
- the 3D shape data D1 indicates the positions of a plurality of points on the surrounding environment of the work vehicle 1.
- the surrounding environment of the work vehicle 1 includes, for example, the ground surface around the work vehicle 1. That is, the 3D shape data D1 includes the positions of a plurality of points on the ground surface around the work vehicle 1. In particular, the 3D shape data D1 includes the positions of a plurality of points on the ground surface in front of the work vehicle 1.
- the shape sensor 21 measures the distance from the work vehicle 1 to a plurality of positions in the surrounding environment. The positions of the plurality of points are obtained from these measured distances.
- the shape sensor 21 is, for example, a LIDAR (Laser Imaging Detection and Ranging) unit. The shape sensor 21 measures the distance to a measurement point by emitting laser light and measuring the reflected light.
- the shape sensor 21 includes, for example, a plurality of laser distance measuring elements arranged in the vertical direction.
- the shape sensor 21 measures the positions of a plurality of points at a predetermined cycle while rotating the plurality of laser distance measuring elements laterally around an axis extending in the vertical direction. The shape sensor 21 thus measures the distance to a point in the surrounding environment at every fixed rotation angle, and acquires the positions of a three-dimensional point group.
- the shape data includes, for each point, information identifying the element that measured it and the rotation angle at which it was measured, together with information on the positional relationship of the elements. Further, the controller 20 holds information indicating the positional relationship between each element and the work vehicle 1. Therefore, the controller 20 can obtain from the shape data the positional relationship between each point in the surrounding environment and the work vehicle 1.
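As an illustration of how a point's position can be recovered from an element's elevation angle, a scan rotation angle, and a measured distance, the sketch below applies a standard spherical-to-Cartesian conversion in the sensor frame; the function name and the axis conventions are assumptions, not part of the patent.

```python
import math

def lidar_point(distance_m: float, elevation_deg: float, azimuth_deg: float):
    """Convert one LIDAR measurement (elevation angle of the measuring element,
    lateral scan azimuth, measured distance) into sensor-frame coordinates."""
    el = math.radians(elevation_deg)
    az = math.radians(azimuth_deg)
    x = distance_m * math.cos(el) * math.cos(az)  # forward (assumed axis)
    y = distance_m * math.cos(el) * math.sin(az)  # left (assumed axis)
    z = distance_m * math.sin(el)                 # up (assumed axis)
    return (x, y, z)
```

A further fixed transform (the known mounting position of the sensor on the vehicle) would then map these sensor-frame points into the work vehicle's frame.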
- the attitude sensor 22 detects the attitude of the work vehicle 1 and outputs attitude data D2 indicating the attitude.
- the attitude sensor 22 is, for example, an IMU (Inertial Measurement Unit).
- the attitude data D2 includes the angle of the vehicle with respect to the horizontal in the longitudinal direction (pitch angle) and the angle with respect to the horizontal in the lateral direction (roll angle).
- the IMU outputs attitude data D2.
- the position sensor 23 is, for example, a GNSS (Global Navigation Satellite System) receiver.
- the GNSS receiver is a receiver for GPS (Global Positioning System), for example.
- the GNSS receiver receives a positioning signal from the satellite, and acquires position data D3 indicating the position coordinates of the work vehicle 1 from the positioning signal.
- the GNSS receiver outputs position data D3.
- the shape sensor 21 is attached to the front camera support portion 16, for example. Alternatively, the shape sensor 21 may be attached to another part of the vehicle body 3.
- the attitude sensor 22 and the position sensor 23 are attached to the vehicle body 3. Alternatively, the position sensor 23 may be attached to the work machine 4.
- the controller 20 is connected to the cameras C1-C4 so that they can communicate with each other by wire or wirelessly.
- the controller 20 receives image data from the cameras C1-C4.
- the controller 20 is connected to the shape sensor 21, the attitude sensor 22, and the position sensor 23 so that they can communicate with each other by wire or wirelessly.
- the controller 20 receives 3D shape data D1 from the shape sensor 21.
- the controller 20 receives the attitude data D2 from the attitude sensor 22.
- the controller 20 receives the position data D3 from the position sensor 23.
- the controller 20 is programmed to generate a display image Is for displaying the surrounding environment of the work vehicle 1 based on the above-described image data, 3D shape data D1, attitude data D2, and position data D3.
- the controller 20 may be disposed outside the work vehicle 1. Alternatively, the controller 20 may be disposed inside the work vehicle 1.
- the controller 20 includes an arithmetic device 25 and a storage device 26.
- the arithmetic device 25 includes a processor such as a CPU.
- the arithmetic device 25 performs a process for generating the display image Is.
- the storage device 26 includes a memory such as a RAM or a ROM, and an auxiliary storage device such as a hard disk.
- the storage device 26 stores data and programs used for generating the display image Is.
- the display 24 is, for example, a CRT, LCD, or OELD. However, the display 24 is not limited to these displays, and may be other types of displays.
- the display 24 displays the display image Is based on the output signal from the controller 20.
- the controller 20 acquires a front image Im1, a left image Im2, a rear image Im3, and a right image Im4 from the cameras C1-C4.
- the front image Im1 is an image of the area in front of the vehicle body 3.
- the left image Im2 is an image of the left side of the vehicle body 3.
- the rear image Im3 is an image of the rear of the vehicle body 3.
- the right image Im4 is an image of the right side of the vehicle body 3.
- the controller 20 generates a three-dimensional projection model M1 based on the 3D shape data D1 acquired from the shape sensor 21. As shown in FIG. 3, the controller 20 generates a polygon mesh that represents the shape of the surrounding environment based on the positions of a plurality of points on the surrounding environment of the work vehicle 1.
- the three-dimensional projection model M1 includes a polygon connecting adjacent points among a plurality of points.
- the controller 20 generates the mesh by connecting adjacent points among the plurality of points P(1,1), P(2,1), ..., P(i,j), ... measured in one scan by the shape sensor 21.
- P(i, j) represents the point measured by the i-th laser distance measuring element in the vertical direction at the j-th rotation angle in the lateral direction.
- for each set of points P(i, j), P(i+1, j), P(i, j+1), and P(i+1, j+1), the controller 20 generates triangles by dividing the quadrilateral formed by these four points into two.
- the controller 20 generates a three-dimensional projection model M1 represented by a triangular polygon.
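The meshing step described above can be sketched as follows, assuming the measured points are stored in a grid indexed by element number i and rotation step j; splitting each grid cell into two triangles is one common choice consistent with the triangular polygons described here.

```python
def grid_to_triangles(points, rows, cols):
    """Build triangular polygons from a rows x cols grid of measured points.
    points[i][j] is the point from the i-th element at the j-th rotation angle."""
    triangles = []
    for i in range(rows - 1):
        for j in range(cols - 1):
            a, b = points[i][j], points[i + 1][j]
            c, d = points[i][j + 1], points[i + 1][j + 1]
            # Split the quad formed by P(i,j), P(i+1,j), P(i,j+1), P(i+1,j+1)
            # into two triangles sharing the diagonal b-c.
            triangles.append((a, b, c))
            triangles.append((b, d, c))
    return triangles
```

An r x s grid of points yields 2 * (r - 1) * (s - 1) triangles, so the model resolution follows directly from the sensor's element count and angular step.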
- the shape sensor 21 periodically measures the three-dimensional shape of the surrounding environment.
- the controller 20 updates the 3D shape data D1, and generates a three-dimensional projection model M1 based on the updated 3D shape data D1.
- the controller 20 generates a surrounding composite image Is1 from the images Im1-Im4 acquired by the cameras C1-C4.
- the surrounding composite image Is1 is an image showing the surroundings of the work vehicle 1 in an overhead view.
- the controller 20 synthesizes the vehicle image Is2 showing the work vehicle 1 with the display image.
- the vehicle image Is2 is an image that three-dimensionally shows the work vehicle 1 itself.
- the controller 20 determines the attitude of the vehicle image Is2 on the display image Is from the attitude data D2.
- the controller 20 determines the direction of the vehicle image Is2 on the display image Is from the position data D3.
- the controller 20 synthesizes the vehicle image Is2 with the display image Is such that the posture and orientation of the vehicle image Is2 on the display image Is match the actual posture and orientation of the work vehicle 1.
- the controller 20 may generate the vehicle image Is2 from the images Im1-Im4 acquired by the cameras C1-C4. For example, parts of the work vehicle 1 are included in the images acquired by the cameras C1-C4, and the controller 20 may generate the vehicle image Is2 by projecting those parts onto the vehicle model M2.
- the vehicle model M2 is a projection model having the shape of the work vehicle 1 and may be stored in the storage device 26.
- the vehicle image Is2 may be a predetermined image taken in advance or three-dimensional computer graphics created in advance.
- FIG. 4 is a diagram illustrating an example of the display image Is.
- the display image Is is an image that three-dimensionally represents the work vehicle 1 and its surroundings.
- the display image Is is displayed using a three-dimensional projection model M1 having an inclined shape in accordance with the inclined actual terrain around the work vehicle 1.
- the vehicle image Is2 is displayed on the display image Is in a tilted state in accordance with the actual tilted posture of the work vehicle 1.
- the display image Is is updated in real time and displayed as a moving image. Therefore, when the work vehicle 1 is traveling, the posture, orientation, and position of the surrounding composite image Is1 and the vehicle image Is2 in the display image Is are changed and displayed in real time in accordance with the actual surrounding environment and the actual posture, orientation, and position of the work vehicle 1.
- specifically, the three-dimensional projection model M1 and the vehicle model M2 are rotated according to a rotation matrix representing the change in posture and orientation from when the work vehicle 1 started traveling, and are translated according to a translation vector. The rotation matrix and the translation vector are acquired from the attitude data D2 and the position data D3 described above.
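A minimal sketch of this pose update follows, assuming the rotation matrix is built from roll, pitch, and yaw angles; the patent does not specify the construction, so the ZYX Euler form below is an assumption for illustration.

```python
import math

def rotation_matrix(roll, pitch, yaw):
    """3x3 rotation matrix from roll (x), pitch (y), yaw (z) in radians, as Rz*Ry*Rx."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def transform_point(point, R, t):
    """Apply the pose update to one model vertex: rotate by R, then translate by t."""
    x, y, z = point
    return tuple(R[k][0] * x + R[k][1] * y + R[k][2] * z + t[k] for k in range(3))
```

Applying `transform_point` to every vertex of M1 and M2 keeps the displayed models aligned with the vehicle's measured attitude (roll and pitch from the IMU) and position (from the GNSS receiver).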
- the display image Is is an image of the work vehicle 1 and its surroundings viewed from the left side.
- the controller 20 can switch the display image Is to an image in which the work vehicle 1 and its surroundings are viewed from the front, rear, right side, upper side, or obliquely in each direction.
- as described above, the three-dimensional shape of the surrounding environment of the work vehicle 1 is measured by the shape sensor 21, and the three-dimensional projection model M1 is generated based on the measured three-dimensional shape. Therefore, the three-dimensional projection model M1 has a shape that is the same as, or approximates, the actual topography around the work vehicle 1, and an image of the surrounding environment can be presented in the display image Is in a shape reflecting that topography. The display system 2 according to the present embodiment can thus generate a display image Is from which the shape of the surrounding environment of the work vehicle 1 is easily grasped.
- the actual posture of the work vehicle 1 is measured by the attitude sensor 22, and the vehicle image Is2 is displayed on the display image Is in accordance with the measured posture. Therefore, the vehicle image Is2 can be presented in the display image Is in a posture that reflects the actual posture of the work vehicle 1. This makes posture changes of the work vehicle 1, such as entering an inclined surface or performing a turning operation, clearly visible to the operator.
- the controller 20 evaluates a plurality of regions included in the surrounding environment based on the 3D shape data D1.
- the controller 20 defines the triangular polygon of the three-dimensional projection model M1 described above as one area. Note that the configuration of the display system 2 and the method of generating the display image Is are the same as those in the first embodiment, and thus description thereof is omitted.
- the controller 20 classifies each area into a plurality of levels for evaluation.
- the controller 20 classifies each region into a first level and a second level.
- the first level indicates a region that the work vehicle 1 is allowed to enter.
- the second level indicates a region that the work vehicle 1 is prohibited from entering.
- FIG. 5 is a flowchart showing a process performed by the controller 20 in order to evaluate a region.
- the controller 20 determines whether the point cloud density warning condition is satisfied for each region.
- the point cloud density warning condition is expressed by the following formula (1).
- L1(i), L2(i), and L3(i) are the lengths of the line segments connecting the points that define each region.
- as shown in FIG. 6, the controller 20 calculates the lengths L1(i), L2(i), and L3(i) of the sides of the triangle (Pi, Pi+1, Pi+2) representing each region as the lengths of the line segments of that region.
- the controller 20 compares the length of each line segment of each region (Pi, Pi+1, Pi+2) with a predetermined threshold k·Lc and determines whether the region includes a line segment longer than the threshold k·Lc.
- when a certain region (Pi, Pi+1, Pi+2) satisfies the point cloud density warning condition, that is, when the region includes a line segment longer than the threshold k·Lc, the controller 20 determines that the region (Pi, Pi+1, Pi+2) is the second level.
- “Lc” is the length of the crawler belt 11.
- the length of the crawler belt 11 is the length over which a crawler belt placed on a flat surface is in contact with that surface, and is called the contact length.
- “k” is a predetermined coefficient larger than 0 and smaller than 1; for example, the coefficient “k” is 1/2. The threshold k·Lc is thus defined based on the length of the crawler belt 11. However, the coefficient “k” may be a value other than 1/2.
- the coefficient “k” may be a fixed value or may be arbitrarily set by an operator.
- the length Lc of the crawler belt 11 may be a length related to the contact length. For example, it may be the entire length of the crawler belt 11 in the front-rear direction. In that case, the value of the coefficient k is changed as appropriate.
- the warning condition for the point cloud density may further include a condition represented by the following expression (2).
- Lc′ is the distance between the centers of the left and right crawler belts 11 and is called the crawler gauge width.
- the coefficient k ′ is approximately 1.
- the controller 20 may determine that the warning condition is satisfied when both the expressions (1) and (2) are satisfied.
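The point cloud density check of condition (1) can be sketched as follows. This is an illustrative sketch only, not the patent's implementation; the function and parameter names are hypothetical, and the example distances are invented.

```python
import math

def density_warning(p1, p2, p3, contact_length, k=0.5):
    """Return True if any side of the triangular region (Pi, Pi+1, Pi+2)
    exceeds the threshold k * Lc, i.e. the point cloud is too sparse there."""
    threshold = k * contact_length
    sides = (math.dist(p1, p2), math.dist(p2, p3), math.dist(p3, p1))
    return any(side > threshold for side in sides)

# Example: with a contact length Lc of 3.0 m and k = 1/2 (threshold 1.5 m),
# a triangle with a 2.0 m side is flagged, one with shorter sides is not.
sparse = density_warning((0, 0, 0), (2, 0, 0), (0, 1, 0), contact_length=3.0)
dense = density_warning((0, 0, 0), (1, 0, 0), (0, 1, 0), contact_length=3.0)
```

A check against the crawler gauge width Lc′ with coefficient k′, as in the optional condition (2), could be added in the same way.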
- in step S103, the controller 20 determines whether or not the tilt warning condition is satisfied for each region that does not satisfy the point cloud density warning condition.
- the tilt warning condition is expressed by the following equation (3).
- as shown in FIG. 7, the controller 20 calculates the normal vectors Ni of the target region (Pi, Pi+1, Pi+2) and of the regions included in the surrounding predetermined range A1(i), and computes the average Nav of those normal vectors.
- the angle between the average normal Nav and the direction of gravity is determined as the inclination angle of the target region (Pi, Pi+1, Pi+2).
- the above-described tilt warning condition means that the tilt angle of the target region (Pi, Pi + 1, Pi + 2) exceeds the threshold ⁇ max.
- in formula (3), ez is the unit vector in the direction of gravity.
- the threshold value ⁇ max is, for example, an upper limit inclination angle at which the work vehicle 1 is allowed to enter. However, the threshold value ⁇ max may be another value.
- the threshold value ⁇ max may be a fixed value or may be arbitrarily set by an operator.
- the predetermined range A1 (i) is represented by, for example, a circle with a radius R centered on the centroid of the target region (Pi, Pi + 1, Pi + 2).
- the radius R may be a fixed value. Alternatively, the radius R may be arbitrarily set by the operator.
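The tilt check of condition (3) can be sketched as follows. This is an illustrative sketch only, not the patent's implementation; function names and example values are hypothetical, and the vertical axis is taken here as +z (opposite to gravity) so that level ground has inclination 0°.

```python
import math

def tilt_warning(normals, theta_max_deg):
    """Average the unit normals of the target region and its neighbours in
    range A1(i), then flag the region if the angle between that average
    and the vertical exceeds the threshold theta_max_deg."""
    n = len(normals)
    avg = [sum(v[i] for v in normals) / n for i in range(3)]
    norm = math.sqrt(sum(c * c for c in avg))
    cos_angle = avg[2] / norm  # dot product with the unit vertical (0, 0, 1)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    return angle > theta_max_deg

# Example with an upper limit inclination angle of 30 degrees:
flat = tilt_warning([(0.0, 0.0, 1.0)] * 4, theta_max_deg=30.0)   # level ground
s = math.sin(math.radians(45.0))
steep = tilt_warning([(s, 0.0, s)] * 4, theta_max_deg=30.0)      # 45-degree slope
```

Averaging over the neighbourhood, rather than using a single polygon's normal, smooths out noise in sparse regions of the point cloud.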
- when a certain region (Pi, Pi+1, Pi+2) satisfies the tilt warning condition, the process proceeds to step S102, and the controller 20 determines that the region (Pi, Pi+1, Pi+2) is the second level.
- when a certain region (Pi, Pi+1, Pi+2) does not satisfy the tilt warning condition, that is, when the inclination angle of the region (Pi, Pi+1, Pi+2) is less than or equal to the threshold θmax, the process advances to step S104.
- in step S104, the controller 20 determines whether or not the undulation warning condition is satisfied for each region that satisfies neither the point cloud density warning condition nor the tilt warning condition.
- the undulation warning condition is expressed by the following equation (4).
- as shown in FIG. 8, n is the number of points included in the target determination range A2(i).
- the determination range A2(i) here may be the same as or different from the predetermined range A1(i) in step S103.
- Zi is the height of the point Pi in the direction of gravity.
- Zav is the average of the heights of the points included in the determination range A2(i).
- σ²z indicates the variance of the heights of the points in the determination range A2(i).
- the above undulation warning condition means that the variance σ²z of the heights of the points included in the target determination range A2(i) exceeds the threshold σ²max. That is, the undulation warning condition means that the change in undulation within the determination range A2(i) is large.
- the threshold σ²max is, for example, the upper limit of undulation change at which the work vehicle 1 is allowed to enter. However, the threshold σ²max may be another value. The threshold σ²max may be a fixed value or may be arbitrarily set by an operator.
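The undulation check of condition (4) can be sketched as follows. This is an illustrative sketch only, not the patent's implementation; the function name and the example height values are hypothetical. The population variance (dividing by n) is used, matching the 1/n sum in the formula.

```python
def undulation_warning(heights, sigma2_max):
    """Return True if the variance of the point heights Zi within the
    determination range A2(i) exceeds the threshold sigma2_max."""
    n = len(heights)
    z_av = sum(heights) / n
    variance = sum((z - z_av) ** 2 for z in heights) / n
    return variance > sigma2_max

# Example: gently varying heights stay below the threshold, strongly
# alternating heights (variance 1.0) exceed it.
smooth = undulation_warning([1.0, 1.0, 1.1, 0.9], sigma2_max=0.5)
rough = undulation_warning([0.0, 2.0, 0.0, 2.0], sigma2_max=0.5)
```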
- when the undulation warning condition is satisfied, in step S102 the controller 20 determines that the regions included in the determination range A2(i) are the second level.
- when the undulation warning condition is not satisfied, the process proceeds to step S105.
- in step S105, the controller 20 determines that a region satisfying none of the point cloud density warning condition, the tilt warning condition, and the undulation warning condition is the first level.
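The overall flow of FIG. 5 condenses to: a region is the second level as soon as any warning condition holds, and the first level only when none does. A minimal sketch, with illustrative constant names not taken from the patent:

```python
FIRST_LEVEL = 1   # entry of the work vehicle is allowed
SECOND_LEVEL = 2  # entry of the work vehicle is prohibited

def evaluate_region(density_warn, tilt_warn, undulation_warn):
    """Map the three warning-condition results for a region to a level."""
    if density_warn or tilt_warn or undulation_warn:
        return SECOND_LEVEL
    return FIRST_LEVEL
```

The display step then only has to map each region's level to a rendering style, e.g. a distinct color for second-level regions.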
- the controller 20 displays the display image Is on the display 24.
- the controller 20 displays each of the plurality of regions in the display image Is in a manner corresponding to the evaluation. Specifically, the controller 20 displays the second level area in a different color from the first level area.
- FIG. 9 is a diagram illustrating an example of the display image Is according to the second embodiment.
- a steep down slope Sp2 exists on the right side of the work vehicle 1.
- Controller 20 determines area Sp1 in front of work vehicle 1 as the first level. Further, the controller 20 determines that the right downward slope Sp2 and the left upward slope Sp3 are the second level. The controller 20 represents the right downward slope Sp2 and the left upward slope Sp3 in the display image Is in a color different from the front area Sp1.
- as described above, in the present embodiment, the controller 20 evaluates a plurality of regions included in the surrounding environment based on the 3D shape data D1 and displays second-level regions in the display image Is in a manner different from first-level regions. The operator can therefore easily notice the presence of a second-level region from the display image Is. Further, the display image Is is projected onto the three-dimensional projection model M1, which reflects the actual terrain around the work vehicle 1, so a region determined to be the second level can be expressed in the display image Is in a shape close to the actual terrain.
- the controller 20 determines that the area satisfying the point cloud density warning condition is the second level, and displays the area on the display image Is in a manner different from the area of the first level.
- since the range between points is a portion where measurement by the shape sensor 21 is not performed, longer line segments L1(i), L2(i), and L3(i) in a region mean a larger unmeasured range. Therefore, as shown in FIG. 10A, even if there is a steep slope between the point Pi and the point Pi+1, the shape sensor 21 may not be able to measure it.
- in the display system 2 according to the present embodiment, such a region is determined to be the second level. Therefore, a region where a sufficient point cloud density is not obtained can be determined as the second level: for example, a region so far from the shape sensor 21 that a sufficient point cloud density is not obtained, or a region whose accurate terrain cannot be measured because the laser is blocked by the terrain.
- the threshold k·Lc is defined from the length of the crawler belt 11. If an unmeasured span is longer than the threshold k·Lc defined by the length of the crawler belt 11, and that span contains concave terrain, the inclination of the work vehicle 1 may exceed the upper limit inclination angle θmax.
- in the display system 2 according to the present embodiment, such a region can be determined as the second level and displayed in the display image Is in a manner different from first-level regions.
- the controller 20 determines that a region satisfying the tilt warning condition is the second level and displays it in the display image Is in a manner different from first-level regions. Therefore, for example, as shown in FIG. 10B, a region with a steep inclination exceeding the upper limit inclination angle θmax allowed for the work vehicle 1 can be determined as the second level and displayed in the display image Is in a manner different from first-level regions.
- here, the controller 20 evaluates the target region not only by the inclination angle of the region to be determined but also by the average of the inclination angles of the other regions included in the surrounding predetermined range A1(i). This mitigates the influence of the point cloud density varying with the distance from the shape sensor 21 or with the terrain, enabling accurate evaluation.
- the controller 20 determines the determination area A2 (i) satisfying the undulation warning condition as the second level, and displays it on the display image Is in a manner different from the determination area of the first level.
- the controller 20 evaluates the intensity of undulation in the determination range A2 (i) based on the variance of the height of each point in a certain determination range A2 (i).
- a region with a large undulation can be determined as the second level and displayed on the display image Is in a manner different from the region of the first level.
- the display image Is shown in FIG. 9 is an image generated from a viewpoint looking from the right front of the work vehicle 1, but the controller 20 may change the viewpoint arbitrarily to generate the display image Is.
- the controller 20 may switch the viewpoint according to, for example, an operator's operation. As a result, the display image Is can be generated so that a portion that the operator particularly wants to visually recognize can be seen in the surrounding environment of the work vehicle 1.
- Work vehicle 1 is not limited to a bulldozer, but may be another type of vehicle such as a wheel loader, a hydraulic excavator, or a dump truck.
- the work vehicle 1 may be a vehicle that is remotely operated from a controller 20 disposed outside the work vehicle 1. In that case, the cab may be omitted from the vehicle body 3 as in the work vehicle 100 shown in FIG. In FIG. 11, the same reference numerals are given to the portions corresponding to the work vehicle 1 shown in FIG.
- the work vehicle 1 may be a vehicle that is directly operated by an operator in a cab mounted on the work vehicle 1.
- the number of cameras is not limited to four; it may be three or fewer, or five or more.
- the camera is not limited to a fisheye camera, and may be another type of camera.
- the arrangement of the camera is not limited to the arrangement of the above embodiment, and may be a different arrangement.
- the attitude sensor 22 is not limited to the IMU, and may be another sensor.
- the position sensor 23 is not limited to a GNSS receiver, and may be another sensor.
- the shape sensor 21 is not limited to a lidar, and may be another measuring device such as a radar.
- a part of the warning condition may be omitted or changed.
- other warning conditions may be added.
- the content of the warning condition may be changed.
- the evaluation of regions is not limited to the two stages of the first level and the second level, and may be performed in a larger number of stages.
Abstract
Description
11 Crawler belt
21 Shape sensor
20 Controller
22 Attitude sensor
C1-C4 Cameras
Is Display image
M1 Three-dimensional projection model
Claims (15)
- A display system for a work vehicle, comprising: a camera that captures an image of a surrounding environment of the work vehicle and outputs image data representing the image; a shape sensor that measures a three-dimensional shape of the surrounding environment and outputs 3D shape data representing the three-dimensional shape; and a controller that acquires the image data and the 3D shape data, wherein the controller generates a three-dimensional projection model representing the three-dimensional shape of the surrounding environment based on the 3D shape data, and generates display image data representing a display image of the surrounding environment of the work vehicle by projecting the image onto the three-dimensional projection model based on the image data.
- The display system for a work vehicle according to claim 1, wherein the shape sensor measures positions of a plurality of points in the surrounding environment, and the 3D shape data indicates the positions of the plurality of points.
- The display system for a work vehicle according to claim 2, wherein the three-dimensional projection model includes polygons connecting adjacent ones of the plurality of points.
- The display system for a work vehicle according to claim 1, wherein the shape sensor periodically measures the three-dimensional shape of the surrounding environment, and the controller updates the 3D shape data with each periodic measurement and generates the three-dimensional projection model based on the updated 3D shape data.
- The display system for a work vehicle according to claim 1, further comprising an attitude sensor that detects a posture of the work vehicle and outputs attitude data indicating the posture, wherein the controller acquires the attitude data, composites a vehicle image representing the work vehicle into the display image, and changes the posture of the vehicle image in the display image according to the attitude data.
- The display system for a work vehicle according to claim 1, wherein the controller evaluates a plurality of regions included in the surrounding environment based on the 3D shape data, and displays each of the plurality of regions in the display image in a manner corresponding to the evaluation.
- The display system for a work vehicle according to claim 6, wherein the controller acquires an inclination angle of each of the plurality of regions and evaluates each of the plurality of regions based on the inclination angle.
- The display system for a work vehicle according to claim 7, wherein the controller compares the inclination angle of each of the plurality of regions with a predetermined threshold, and displays regions having an inclination angle less than or equal to the threshold and regions having an inclination angle greater than the threshold in the display image in different manners.
- The display system for a work vehicle according to claim 7, wherein the controller calculates an average of the inclination angle of a target region among the plurality of regions and the inclination angles of other regions included in a predetermined range around the target region, and evaluates the target region based on the average of the inclination angles.
- The display system for a work vehicle according to claim 6, wherein the controller acquires heights of a plurality of points included in each of the plurality of regions, calculates a variance of the heights of the plurality of points, and evaluates each of the plurality of regions based on the variance.
- The display system for a work vehicle according to claim 10, wherein the controller compares the variance with a predetermined threshold, and displays regions having a variance less than or equal to the threshold and regions having a variance greater than the threshold in the display image in different manners.
- The display system for a work vehicle according to claim 1, wherein the shape sensor measures positions of a plurality of points in the surrounding environment, the 3D shape data indicates the positions of the plurality of points, and the controller defines, based on the 3D shape data, a plurality of regions each enclosed by line segments connecting the plurality of points, calculates the lengths of the line segments for each of the plurality of regions, evaluates each of the plurality of regions based on the lengths of the line segments, and displays each of the plurality of regions in the display image in a manner corresponding to the result of the evaluation.
- The display system for a work vehicle according to claim 1, wherein the controller compares the lengths of the line segments with a predetermined threshold, and displays regions including a line segment longer than the threshold and regions not including a line segment longer than the threshold in the display image in different manners.
- The display system for a work vehicle according to claim 13, wherein the work vehicle includes a crawler belt, and the threshold is defined based on a length of the crawler belt.
- A generation method executed by a controller to generate display image data representing a display image of a surrounding environment of a work vehicle, the method comprising: acquiring image data representing an image of the surrounding environment of the work vehicle; acquiring 3D shape data representing a three-dimensional shape of the surrounding environment; generating a three-dimensional projection model representing the three-dimensional shape of the surrounding environment based on the 3D shape data; and generating the display image data by projecting the image onto the three-dimensional projection model based on the image data.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA3075426A CA3075426C (en) | 2018-02-19 | 2019-01-29 | Display system for work vehicle and generation method |
AU2019222009A AU2019222009B2 (en) | 2018-02-19 | 2019-01-29 | Display system for work vehicle and generation method |
CN201980004602.5A CN111149355B (zh) | 2018-02-19 | 2019-01-29 | 工作车辆的显示系统以及生成方法 |
US16/640,154 US11377824B2 (en) | 2018-02-19 | 2019-01-29 | Display system for work vehicle and generation method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018-027202 | 2018-02-19 | ||
JP2018027202A JP7232437B2 (ja) | 2018-02-19 | 2018-02-19 | 作業車両の表示システム及び生成方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019159687A1 true WO2019159687A1 (ja) | 2019-08-22 |
Family
ID=67619363
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2019/003013 WO2019159687A1 (ja) | 2018-02-19 | 2019-01-29 | 作業車両の表示システム及び生成方法 |
Country Status (6)
Country | Link |
---|---|
US (1) | US11377824B2 (ja) |
JP (1) | JP7232437B2 (ja) |
CN (1) | CN111149355B (ja) |
AU (1) | AU2019222009B2 (ja) |
CA (1) | CA3075426C (ja) |
WO (1) | WO2019159687A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP4060286A3 (en) * | 2021-03-19 | 2022-12-07 | Topcon Corporation | Surveying system, surveying method, and surveying program |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7160004B2 (ja) * | 2019-08-30 | 2022-10-25 | トヨタ自動車株式会社 | 表示システム、車両、および、二次電池の状態表示方法 |
US11595618B2 (en) * | 2020-04-07 | 2023-02-28 | Caterpillar Inc. | Enhanced visibility system for work machines |
JP2022042420A (ja) * | 2020-09-02 | 2022-03-14 | 株式会社Subaru | 車両制御装置 |
JP7244036B2 (ja) * | 2021-05-14 | 2023-03-22 | 学校法人 芝浦工業大学 | 作業支援システムおよび、作業支援方法 |
JP7076114B1 (ja) | 2021-05-14 | 2022-05-27 | 学校法人 芝浦工業大学 | 作業支援システムおよび、作業支援方法 |
JP2023050332A (ja) * | 2021-09-30 | 2023-04-11 | 株式会社トプコン | 測量システム |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012202063A (ja) * | 2011-03-24 | 2012-10-22 | Komatsu Ltd | 油圧ショベルの較正装置及び油圧ショベルの較正方法 |
JP2012233353A (ja) * | 2011-05-02 | 2012-11-29 | Komatsu Ltd | 油圧ショベルの較正システム及び油圧ショベルの較正方法 |
JP2012255286A (ja) * | 2011-06-08 | 2012-12-27 | Topcon Corp | 建設機械制御システム |
JP2013036243A (ja) * | 2011-08-09 | 2013-02-21 | Topcon Corp | 建設機械制御システム |
WO2016031009A1 (ja) * | 2014-08-28 | 2016-03-03 | 国立大学法人東京大学 | 作業車両の表示システム、表示制御装置、作業車両、及び表示制御方法 |
JP2017215240A (ja) * | 2016-06-01 | 2017-12-07 | 株式会社トプコン | 測定装置及び測量システム |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4727068B2 (ja) * | 2001-05-29 | 2011-07-20 | 株式会社トプコン | 施工監視システム、施工管理方法 |
US6782644B2 (en) * | 2001-06-20 | 2004-08-31 | Hitachi Construction Machinery Co., Ltd. | Remote control system and remote setting system for construction machinery |
KR100916638B1 (ko) * | 2007-08-02 | 2009-09-08 | 인하대학교 산학협력단 | 구조광을 이용한 토공량 산출 장치 및 방법 |
CN102448681B (zh) * | 2009-12-28 | 2014-09-10 | 松下电器产业株式会社 | 动作空间提示装置、动作空间提示方法以及程序 |
JP5550970B2 (ja) * | 2010-04-12 | 2014-07-16 | 住友重機械工業株式会社 | 画像生成装置及び操作支援システム |
KR101751405B1 (ko) * | 2010-10-22 | 2017-06-27 | 히다치 겡키 가부시키 가이샤 | 작업 기계의 주변 감시 장치 |
US9300954B2 (en) * | 2012-09-21 | 2016-03-29 | Tadano Ltd. | Surrounding information-obtaining device for working vehicle |
WO2014110502A1 (en) * | 2013-01-11 | 2014-07-17 | The Regents Of The University Of Michigan | Monitoring proximity of objects at construction jobsites via three-dimensional virtuality in real-time |
US9013286B2 (en) * | 2013-09-23 | 2015-04-21 | Volkswagen Ag | Driver assistance system for displaying surroundings of a vehicle |
EP3140613B1 (en) * | 2014-05-05 | 2024-04-03 | Hexagon Technology Center GmbH | Surveying system |
US9639958B2 (en) * | 2015-03-19 | 2017-05-02 | Caterpillar Inc. | Synthetic colorization of real-time immersive environments |
JP6496182B2 (ja) * | 2015-04-28 | 2019-04-03 | 株式会社小松製作所 | 施工計画システム |
US9824490B1 (en) * | 2015-06-08 | 2017-11-21 | Bentley Systems, Incorporated | Augmentation of a dynamic terrain surface |
AU2017302833B2 (en) * | 2016-07-29 | 2020-11-12 | Hitachi, Ltd. | Database construction system for machine-learning |
JP6581139B2 (ja) * | 2017-03-31 | 2019-09-25 | 日立建機株式会社 | 作業機械の周囲監視装置 |
CN110573680A (zh) * | 2017-04-26 | 2019-12-13 | 住友建机株式会社 | 挖土机、挖土机管理装置及挖土机管理辅助装置 |
JP7252137B2 (ja) * | 2017-12-04 | 2023-04-04 | 住友重機械工業株式会社 | 周辺監視装置 |
EP3733977B1 (en) * | 2017-12-27 | 2023-11-22 | Sumitomo (S.H.I.) Construction Machinery Co., Ltd. | Shovel |
2018
- 2018-02-19 JP JP2018027202A patent/JP7232437B2/ja active Active
2019
- 2019-01-29 AU AU2019222009A patent/AU2019222009B2/en active Active
- 2019-01-29 CN CN201980004602.5A patent/CN111149355B/zh active Active
- 2019-01-29 WO PCT/JP2019/003013 patent/WO2019159687A1/ja active Application Filing
- 2019-01-29 CA CA3075426A patent/CA3075426C/en active Active
- 2019-01-29 US US16/640,154 patent/US11377824B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN111149355B (zh) | 2022-04-22 |
CA3075426C (en) | 2021-07-20 |
AU2019222009B2 (en) | 2020-12-24 |
US20210214922A1 (en) | 2021-07-15 |
JP7232437B2 (ja) | 2023-03-03 |
JP2019145953A (ja) | 2019-08-29 |
CA3075426A1 (en) | 2019-08-22 |
AU2019222009A1 (en) | 2020-02-27 |
US11377824B2 (en) | 2022-07-05 |
CN111149355A (zh) | 2020-05-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019159687A1 (ja) | 作業車両の表示システム及び生成方法 | |
US9378554B2 (en) | Real-time range map generation | |
JP4671317B2 (ja) | 地形形状計測装置およびガイダンス装置 | |
US9449397B2 (en) | Real-time visual odometry system for determining motion of a machine with a range detection unit | |
US9678210B2 (en) | Error estimation in real-time visual odometry system | |
JP7365122B2 (ja) | 画像処理システムおよび画像処理方法 | |
US20170050566A1 (en) | Display system for work vehicle, display control device, work vehicle, and display control method | |
US20210209799A1 (en) | Position measurement system, work machine, and position measurement method | |
JP6867132B2 (ja) | 作業機械の検出処理装置及び作業機械の検出処理方法 | |
JP7203616B2 (ja) | 作業機械 | |
AU2019213435B2 (en) | Display system for work vehicle | |
US20240028042A1 (en) | Visual overlays for providing perception of depth | |
US11549238B2 (en) | System and method for work machine | |
US20210388580A1 (en) | System and method for work machine | |
Borthwick | Mining haul truck pose estimation and load profiling using stereo vision | |
JP7160701B2 (ja) | 作業機械のシステム及び方法 | |
AU2020212836B2 (en) | System and method for work machine | |
US11908076B2 (en) | Display system and display method | |
KR20190060127A (ko) | 굴삭기 작업반경 표시 방법 | |
US20230340755A1 (en) | Continuous calibration of grade control system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19753779; Country of ref document: EP; Kind code of ref document: A1 |
 | ENP | Entry into the national phase | Ref document number: 2019222009; Country of ref document: AU; Date of ref document: 20190129; Kind code of ref document: A |
 | ENP | Entry into the national phase | Ref document number: 3075426; Country of ref document: CA |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 19753779; Country of ref document: EP; Kind code of ref document: A1 |