EP3666977A1 - Road machine - Google Patents

Road machine

Info

Publication number
EP3666977A1
Authority
EP
European Patent Office
Prior art keywords
image
camera
screed
tractor
road machine
Legal status
Granted
Application number
EP18843681.0A
Other languages
German (de)
French (fr)
Other versions
EP3666977A4 (en)
EP3666977B1 (en)
Inventor
Nobuyuki Baba
Kazuaki Hagiwara
Current Assignee
Sumitomo SHI Construction Machinery Co Ltd
Original Assignee
Sumitomo SHI Construction Machinery Co Ltd
Application filed by Sumitomo SHI Construction Machinery Co Ltd
Publication of EP3666977A1
Publication of EP3666977A4
Application granted
Publication of EP3666977B1
Status
Active

Classifications

    • E FIXED CONSTRUCTIONS
    • E01 CONSTRUCTION OF ROADS, RAILWAYS, OR BRIDGES
    • E01C CONSTRUCTION OF, OR SURFACES FOR, ROADS, SPORTS GROUNDS, OR THE LIKE; MACHINES OR AUXILIARY TOOLS FOR CONSTRUCTION OR REPAIR
    • E01C 19/00 Machines, tools or auxiliary devices for preparing or distributing paving materials, for working the placed materials, or for forming, consolidating, or finishing the paving
    • E01C 19/48 Machines, tools or auxiliary devices for preparing or distributing paving materials, for working the placed materials, or for forming, consolidating, or finishing the paving for laying-down the materials and consolidating them, or finishing the surface, e.g. slip forms therefor, forming kerbs or gutters in a continuous operation in situ
    • E01C 2301/00 Machine characteristics, parts or accessories not otherwise provided for
    • E01C 2301/14 Extendable screeds
    • E01C 2301/16 Laterally slidable screeds

Definitions

  • the invention relates to road machines with a screw that feeds a paving material in an axial direction.
  • An image generator that generates an image showing an aerial view of an asphalt finisher and its surrounding area from above and presents the image to an operator of the asphalt finisher has been known (see Patent Document 1).
  • This image generator makes it possible for the operator to intuitively understand the positional relationship between the asphalt finisher and an object in its surrounding area by generating and displaying an image showing the entirety of an area around the asphalt finisher.
  • Patent Document 1 Japanese Patent No. 6029941
  • a space around the front end of the hopper wings is a blind area, hidden by the hopper wings, when viewed from a front camera, a left camera, and a right camera.
  • the above-described configuration is not well suited to letting the operator check the state of a predetermined local area, such as the area in front of a screed where a paving material accumulates during construction. The operator therefore has to check directly, with her/his own eyes, whether an object is present in a space the operator desires to see, how much paving material is in a predetermined local area, and so on.
  • a road machine including a tractor and a screed placed behind the tractor includes a work apparatus configured to feed a paving material in front of the screed and a display device configured to display a peripheral image, which is an image showing the surrounding area of the road machine, and to display, in a highlighted manner, a local image showing an area fed with the paving material by the work apparatus.
  • FIGS. 1A through 1C illustrate an example configuration of an asphalt finisher 100 serving as a road machine according to an embodiment of the invention.
  • FIG. 1A shows a side view, FIG. 1B shows a plan view, and FIG. 1C shows a rear view.
  • the asphalt finisher 100 mainly includes a tractor 1, a hopper 2, and a screed 3.
  • the tractor 1 is an apparatus for causing the asphalt finisher 100 to travel and tows the screed 3. According to this embodiment, the tractor 1 rotates two or four wheels using traveling hydraulic motors to move the asphalt finisher 100.
  • the traveling hydraulic motors are supplied with hydraulic oil from a hydraulic pump driven by a prime mover such as a diesel engine to rotate.
  • An operator seat 1S and an operation panel 65 are placed on top of the tractor 1.
  • Image capturing devices 51 (a right camera 51R, a left camera 51L, a front camera 51F, a right auxiliary camera 51V, and a left auxiliary camera 51U) are attached to the left side, the right side, and the front of the tractor 1.
  • a display device 52 is installed at such a position as to make it easy for an operator seated in the operator seat 1S to look at the display device 52.
  • the direction of the hopper 2 as viewed from the tractor 1 is defined as a forward direction (the +X direction), and the direction of the screed 3 as viewed from the tractor 1 is defined as a rearward direction (the -X direction).
  • the +Y direction corresponds to a leftward direction
  • the -Y direction corresponds to a rightward direction.
  • the hopper 2, which is an example of a work apparatus, is a mechanism for receiving a paving material (for example, an asphalt mixture).
  • the work apparatus is an apparatus to feed a paving material in front of the screed 3.
  • the hopper 2 can be opened and closed widthwise of the vehicle by hydraulic cylinders.
  • the asphalt finisher 100 fully opens the hopper 2 to receive a paving material from the bed of a dump truck.
  • the asphalt finisher 100 closes the hopper 2 to gather the paving material lying near the inner wall of the hopper 2 toward its center, thereby making it possible for a conveyor CV, which is an example of a work apparatus, to feed the paving material to the screed 3.
  • the screed 3 is a mechanism for spreading and smoothing a paving material.
  • the screed 3 can be vertically elevated and lowered and be extended and retracted widthwise of the vehicle by hydraulic cylinders. When extended widthwise of the vehicle, the screed 3 is wider than the tractor 1.
  • the screed 3 includes a front screed 30, a left rear screed 31L, and a right rear screed 31R.
  • the left rear screed 31L and the right rear screed 31R can extend and retract widthwise of the vehicle (Y-axis directions).
  • the left rear screed 31L and the right rear screed 31R, which can extend and retract widthwise of the vehicle, are arranged with an offset from each other in the travel direction (X-axis direction). Therefore, compared with the case where there is no offset, each can have a larger width (length in the vehicle width direction) and extend farther in the vehicle width direction, making it possible to construct a wider new pavement.
  • FIG. 2 schematically illustrates an example configuration of an image generating system SYS installed in the asphalt finisher 100.
  • the image generating system SYS generates an output image based on input images captured by the image capturing devices 51 installed in the asphalt finisher 100.
  • the image generating system SYS mainly includes a controller 50, the image capturing devices 51, the display device 52, a storage device 54, and an input device 55.
  • the controller 50 is, for example, a computer including a CPU, a volatile memory, and a nonvolatile memory.
  • the controller 50 causes the CPU to execute programs corresponding to an output image generating part 50A and a highlighting part 50B, thereby implementing the functions of those parts.
  • the image capturing devices 51 are devices to obtain input images for generating an output image.
  • the image capturing devices 51 are cameras including an imaging device such as a CCD or a CMOS.
  • the image capturing devices 51 are so attached to the tractor 1 as to be able to capture an image of the blind area of the operator seated in the operator seat 1S.
  • the blind area includes, for example, an internal space (especially near the tractor 1) of the hopper 2, a space outside the front end of the hopper 2, and a space near a road surface near the side of the asphalt finisher 100.
  • the image capturing devices 51 may be attached to positions other than the right side, the left side, and the front (for example, the rear) of the tractor 1.
  • a wide-angle lens, a fish-eye lens, or the like may be attached to the image capturing devices 51.
  • the image capturing devices 51 may be either attached to the hopper 2 or attached to the screed 3.
  • the image capturing devices 51 include the front camera 51F, the left camera 51L, the right camera 51R, the left auxiliary camera 51U, and the right auxiliary camera 51V.
  • the front camera 51F is attached to the upper end of the front of the tractor 1 such that an optical axis 51FX thereof extends forward in the travel direction and forms a predetermined angle with the road surface as viewed from the side.
  • the left camera 51L is attached to the upper end of the left side of the tractor 1 such that an optical axis 51LX thereof forms a predetermined angle with the left side surface of the tractor 1 as viewed from above and forms a predetermined angle with the road surface as viewed from the rear.
  • the right camera 51R is attached in the same manner as the left camera 51L with right and left reversed.
  • the left auxiliary camera 51U is attached to the upper end of the left side of the tractor 1 such that an optical axis 51UX thereof forms a predetermined angle with the left side surface of the tractor 1 as viewed from above and forms a predetermined angle with the road surface as viewed from the rear.
  • the right auxiliary camera 51V is attached in the same manner as the left auxiliary camera 51U with right and left reversed.
  • in FIG. 1B, an area 51FA surrounded by a dashed line indicates the imaging range of the front camera 51F.
  • an area 51LA surrounded by a one-dot chain line indicates the imaging range of the left camera 51L
  • an area 51RA surrounded by a one-dot chain line indicates the imaging range of the right camera 51R.
  • An area 51UA surrounded by a two-dot chain line indicates the imaging range of the left auxiliary camera 51U
  • an area 51VA surrounded by a two-dot chain line indicates the imaging range of the right auxiliary camera 51V.
  • the left camera 51L and the left auxiliary camera 51U are attached to the tractor 1 such that the area 51UA indicating the imaging range of the left auxiliary camera 51U is completely included in the area 51LA indicating the imaging range of the left camera 51L.
  • the left camera 51L and the left auxiliary camera 51U may be attached to the tractor 1 such that the area 51LA and the area 51UA overlap each other, namely, the area 51UA protrudes from the area 51LA.
  • the right camera 51R and the right auxiliary camera 51V are attached to the tractor 1 such that the area 51VA indicating the imaging range of the right auxiliary camera 51V is completely included in the area 51RA indicating the imaging range of the right camera 51R.
  • the right camera 51R and the right auxiliary camera 51V may be attached to the tractor 1 such that the area 51RA and the area 51VA overlap each other, namely, the area 51VA protrudes from the area 51RA.
  • the left auxiliary camera 51U and the right auxiliary camera 51V may be omitted.
  • the image capturing devices 51 are attached to the asphalt finisher 100 via, for example, a bracket, a stay, a bar, or the like. According to this embodiment, the image capturing devices 51 are attached to the tractor 1 via an attachment stay. Alternatively, the image capturing devices 51 may be directly attached to the tractor 1 without using an attachment stay or be embedded in the tractor 1.
  • the image capturing devices 51 output captured input images to the controller 50.
  • the image capturing devices 51 may output, to the controller 50, corrected input images in which an apparent distortion or tilt/shift caused by using such a lens has been corrected.
  • alternatively, the image capturing devices 51 may output the input images to the controller 50 as they are, without correcting the apparent distortion or tilt/shift. In this case, the apparent distortion or tilt/shift is corrected by the controller 50.
  • the image capturing devices 51 are arranged such that multiple blind areas to the left and to the right of the asphalt finisher 100 and inside and outside the hopper 2 are included in their imaging ranges.
  • the input device 55 is a device for enabling the operator to input various kinds of information to the image generating system SYS, and is, for example, a touchscreen, buttons, switches, or the like. According to this embodiment, the input device 55 includes a display change switch and a screw dial.
  • the display change switch is a switch for changing configurations of an output image displayed on the display device 52.
  • the screw dial is a dial for controlling the rotational speed of a screw SC that is an example of a work apparatus.
  • the storage device 54 is a device for storing various kinds of information. According to this embodiment, the storage device 54 is a nonvolatile storage device and is integrated into the controller 50. Alternatively, the storage device 54 may be placed outside the controller 50 as a structure different from the controller 50.
  • the display device 52 is a device for displaying various kinds of information. According to this embodiment, the display device 52 is a liquid crystal display installed in the operation panel 65, and displays various images output by the controller 50.
  • the output image generating part 50A is a functional element for generating an output image, and is composed of, for example, software, hardware, or their combination.
  • the output image generating part 50A refers to an input image-output image correspondence map 54a stored in the storage device 54 to correlate coordinates in input image planes in which input images captured by the image capturing devices 51 are positioned with coordinates in an output image plane in which an output image is positioned. Then, the output image generating part 50A generates the output image by associating the values (for example, luminance value, hue value, chromatic value, etc.) of pixels in the output image with the values of pixels in the input images.
  • the input image-output image correspondence map 54a stores the correspondence between the coordinates in the input image planes and the coordinates in the output image plane in such a manner as to allow reference to the correspondence.
  • the correspondence is preset based on various parameters of the image capturing devices 51, such as an optical center, a focal length, a CCD size, an optical axis direction vector, a camera horizontal direction vector, and a projection method.
  • the correspondence may be set in such a manner as to prevent an apparent distortion or tilt/shift from appearing in the output image if an input image includes the apparent distortion or tilt/shift.
  • a coordinate group forming a non-rectangular area in the input image plane is correlated with a coordinate group forming a rectangular area in the output image plane.
  • the correspondence may be set such that a coordinate group forming a rectangular area in the input image plane corresponds directly to a coordinate group forming a rectangular area in the output image if the apparent distortion or tilt/shift in the input image is already corrected when the input image is obtained.
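
As an illustration of the mapping just described, the Python/NumPy sketch below copies pixel values from an input image into an output image through a precomputed coordinate lookup table. The array names (`map_x`, `map_y`) and the three-channel image format are assumptions, and in practice a vectorized routine such as OpenCV's `cv2.remap` would typically replace the explicit loops; this is a sketch, not the patent's implementation.

```python
import numpy as np

def generate_output_image(input_image, map_x, map_y):
    """Minimal sketch of generating an output image through a lookup map.

    map_x[v, u] and map_y[v, u] hold the pre-set input-image coordinates
    that correspond to output-image coordinate (u, v); the value of each
    output pixel is taken from the corresponding input pixel, which is
    the essence of the coordinate correlation described above.
    """
    h_out, w_out = map_x.shape
    h_in, w_in = input_image.shape[:2]
    output = np.zeros((h_out, w_out, 3), dtype=input_image.dtype)
    for v in range(h_out):
        for u in range(w_out):
            x = int(round(float(map_x[v, u])))
            y = int(round(float(map_y[v, u])))
            if 0 <= x < w_in and 0 <= y < h_in:
                output[v, u] = input_image[y, x]   # copy the corresponding pixel value
    return output
```
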
  • the highlighting part 50B is a functional element for changing the contents of an output image displayed on the display device 52, and is composed of, for example, software, hardware, or their combination. According to this embodiment, the highlighting part 50B switches an output image displayed on the display device 52 between a first output image and a second output image when the display change switch serving as a highlighting switch is depressed. The highlighting part 50B may also switch the first output image to the second output image when the screw dial is operated and thereafter switch the second output image to the first output image when the length of time without screw dial operation (non-operating time) reaches a predetermined period.
  • the highlighting part 50B may also switch the first output image to the second output image when the highlighting switch is operated and thereafter switch the second output image to the first output image when the length of time without highlighting switch operation (non-operating time) reaches a predetermined period.
  • the non-operating time is counted using the timer function of the controller 50, for example.
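
The switching behaviour of the highlighting part 50B described above can be sketched as a small state holder; the class name, method names, and the 10-second default timeout are illustrative assumptions, not the patent's implementation.

```python
import time

class HighlightToggle:
    """Sketch of the switching behaviour of the highlighting part 50B.

    The display change switch toggles between the first output image
    (peripheral image only) and the second output image (peripheral
    image plus highlighted local image); operating the screw dial also
    switches to the second output image, and the display falls back to
    the first output image once the non-operating time exceeds
    `timeout_s`. All names and the default timeout are illustrative.
    """

    def __init__(self, timeout_s=10.0):
        self.timeout_s = timeout_s
        self.show_local = False          # False: first output image, True: second
        self.last_operation = None

    def on_display_change_switch(self):
        self.show_local = not self.show_local
        self.last_operation = time.monotonic()

    def on_screw_dial(self):
        self.show_local = True
        self.last_operation = time.monotonic()

    def update(self):
        """Called at each control interval; applies the non-operating timeout."""
        if (self.show_local and self.last_operation is not None
                and time.monotonic() - self.last_operation > self.timeout_s):
            self.show_local = False
        return self.show_local
```

In this sketch, `update()` would be called at each control interval, mirroring the repeated execution of the output image switching process described later for FIG. 6.
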
  • the first output image includes a peripheral image and does not include a local image.
  • the second output image includes a peripheral image and a local image.
  • the peripheral image is an image showing an area surrounding the asphalt finisher 100.
  • the local image is an image showing a predetermined local area associated with the asphalt finisher 100, for example, an image showing an area where a paving material is fed (scattered out) by the screw SC.
  • the area where a paving material is fed by the screw SC is, for example, an area in front of the screed 3 and surrounded by retaining plates 70 (see FIG. 1B ), side plates 71 (see FIG. 1B ), and mold boards 72 (see FIG. 1B ).
  • the operator can check the amount of a paving material enclosed in the local area while remaining seated in the operator seat 1S without moving around on the tractor 1 or twisting her/his body in the operator seat 1S to look into the local area. Furthermore, the operator can check the surroundings and the amount of a paving material enclosed in the local area substantially simultaneously without greatly moving the line of sight.
  • the asphalt finisher 100 in which the image generating system SYS is installed can reduce the operator's fatigue due to the work of checking the enclosed amount. As a result, it is possible to improve safety with respect to the asphalt finisher 100.
  • the local image may be an image showing an area in the hopper 2.
  • the highlighting part 50B displays the local image in a highlighted manner on the display device 52 so that the operator can distinguish between the peripheral image and the local image.
  • the highlighting part 50B displays the local image in a display frame different from a display frame surrounding the peripheral image.
  • the local image may be displayed over the peripheral image in such a manner as to overlap the peripheral image or be displayed at a different position than the peripheral image so as not to overlap the peripheral image.
  • an image part corresponding to the local area in the peripheral image may be enlarged and displayed.
  • at least part of the other image part of the peripheral image may be reduced in size and displayed, or the display of the other image part may be omitted.
  • in this case, the display of the local image in a separate display frame may be omitted.
  • FIG. 3 is an example of the display of the first output image displayed on the display device 52.
  • the first output image mainly includes a hopper image HG, a left peripheral image LG, a right peripheral image RG, and an illustration image 1CG.
  • the hopper image HG, the left peripheral image LG, and the right peripheral image RG compose the peripheral image.
  • the image generating system SYS displays the hopper image HG, the left peripheral image LG, the right peripheral image RG, and the illustration image 1CG at predetermined positions in predetermined size in the first output image so that the operator can understand that the front of the asphalt finisher 100 coincides with the upper side of the screen of the display device 52. This is for causing the operator to intuitively understand the positional relationship between the asphalt finisher 100 and an object in its surrounding area by showing the operator the first output image serving as an overhead view image showing an aerial view of the asphalt finisher 100 and its surrounding area from above.
  • the hopper image HG is generated based on an input image of the front camera 51F.
  • the hopper image HG is an image showing the state of the inside of the hopper 2 as seen when looking down at the hopper 2 from the tractor 1, and is generated by clipping part of the input image of the front camera 51F to be placed in the top center of the first output image.
  • the left peripheral image LG is generated based on an input image of the left camera 51L.
  • the left peripheral image LG is an image showing the state of a left peripheral area to the left of the asphalt finisher 100 in the travel direction as seen when looking down at the left peripheral area from the tractor 1.
  • the left peripheral image LG is generated by clipping part of the input image of the left camera 51L, performing distortion correction thereon, and further performing image rotation thereon, and is placed at the left end of the first output image.
  • the left peripheral image LG includes an image of the left end portion of the screed 3 and an image of the left end portion of the hopper 2.
  • the right peripheral image RG is generated based on an input image of the right camera 51R.
  • the right peripheral image RG is an image showing the state of a right peripheral area to the right of the asphalt finisher 100 in the travel direction as seen when looking down at the right peripheral area from the tractor 1.
  • the right peripheral image RG is generated by clipping part of the input image of the right camera 51R, performing distortion correction thereon, and further performing image rotation thereon, and is placed at the right end of the first output image.
  • the right peripheral image RG includes an image of the right end portion of the screed 3 and an image of the right end portion of the hopper 2.
  • the distortion correction is image processing for correcting an apparent distortion or tilt/shift caused by using a wide-angle lens or the like.
  • the image rotation is image processing for matching the front side of the asphalt finisher 100 in the travel direction (the upper side of the screen of the display device 52) and the respective orientations of the left peripheral image LG and the right peripheral image RG.
  • the correspondence between the coordinates in the input image planes associated with the respective input images of the left camera 51L and the right camera 51R and the coordinates in the output image plane is stored in the input image-output image correspondence map 54a with effects according to the distortion correction and the image rotation being incorporated in advance.
  • the distortion correction and the image rotation may be performed on the hopper image HG.
  • the illustration image 1CG is a computer-generated graphic of the tractor 1, and is so displayed as to enable the operator to understand the position of the tractor 1. According to this embodiment, the illustration image 1CG is placed in the bottom center of the first output image.
  • the display device 52 can display the first output image showing an aerial view of the asphalt finisher 100 and its surrounding area from above.
  • the hopper image HG, the left peripheral image LG, and the right peripheral image RG are adjacently placed as separate independent images.
  • the three images may be synthesized into a single continuous image.
  • image processing may be performed to prevent disappearance of an image of an object in the overlapping area of the imaging range of the front camera 51F and the imaging ranges of the left camera 51L and the right camera 51R.
  • each of the hopper image HG, the left peripheral image LG, and the right peripheral image RG is generated based on an input image captured by a corresponding single camera.
  • Each of the hopper image HG, the left peripheral image LG, and the right peripheral image RG may be generated based on input images captured by two or more cameras.
  • the left peripheral image LG may be generated based on input images captured by the left camera 51L and the left auxiliary camera 51U.
  • the right peripheral image RG may be generated based on input images captured by the right camera 51R and the right auxiliary camera 51V.
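
As a minimal sketch of how the FIG. 3 layout could be assembled, assuming the hopper, peripheral, and illustration sub-images have already been clipped, distortion-corrected, and rotated as described above; the canvas resolution and the requirement that the sub-images fit inside it without overlapping are assumptions.

```python
import numpy as np

def compose_first_output_image(hopper_img, left_img, right_img, illust_img,
                               out_h=480, out_w=640):
    """Illustrative composition of the first output image (FIG. 3).

    The hopper image goes in the top centre, the peripheral images at
    the left and right ends, and the illustration image of the tractor
    in the bottom centre. The canvas size is an assumption, and the
    sub-images are assumed small enough to fit without overlapping.
    """
    canvas = np.zeros((out_h, out_w, 3), dtype=np.uint8)

    def paste(src, top, left):
        h, w = src.shape[:2]
        canvas[top:top + h, left:left + w] = src

    paste(hopper_img, 0, (out_w - hopper_img.shape[1]) // 2)        # top centre
    paste(left_img, 0, 0)                                           # left end
    paste(right_img, 0, out_w - right_img.shape[1])                 # right end
    paste(illust_img, out_h - illust_img.shape[0],
          (out_w - illust_img.shape[1]) // 2)                       # bottom centre
    return canvas
```
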
  • FIGS. 4A through 4C are examples of the display of the second output image displayed on the display device 52.
  • the second output image mainly includes the hopper image HG, the left peripheral image LG, the right peripheral image RG, the illustration image 1CG, and a local image SG.
  • the local image SG is displayed over the illustration image 1CG.
  • FIG. 4A illustrates a second output image including a right local image SGR showing a right local area that is an area surrounded by the retaining plate 70 (see FIG. 1B ), the side plate 71 (see FIG. 1B ), and the mold board 72 ( FIG. 1B ) on the right side of the tractor 1.
  • FIG. 4B illustrates a second output image including a left local image SGL showing a left local area that is an area surrounded by the retaining plate 70 (see FIG. 1B ), the side plate 71 (see FIG. 1B ), and the mold board 72 ( FIG. 1B ) on the left side of the tractor 1.
  • FIG. 4C illustrates a second output image including the right local image SGR and the left local image SGL.
  • the right local image SGR is generated based on an input image of the right auxiliary camera 51V.
  • the right local image SGR is an image showing the right local area as seen when looking down at the right local area from the tractor 1.
  • the right local image SGR is generated by clipping part of the input image of the right auxiliary camera 51V, performing distortion correction thereon, and further performing image rotation thereon, and is placed along the right end of the illustration image 1CG over the illustration image 1CG.
  • the left local image SGL is generated based on an input image of the left auxiliary camera 51U.
  • the left local image SGL is an image showing the left local area as seen when looking down at the left local area from the tractor 1.
  • the left local image SGL is generated by clipping part of the input image of the left auxiliary camera 51U, performing distortion correction thereon, and further performing image rotation thereon, and is placed along the left end of the illustration image 1CG over the illustration image 1CG.
  • the display change switch may be configured to include a first switch for displaying the second output image including the right local image SGR (see FIG. 4A ) and a second switch for displaying the second output image including the left local image SGL (see FIG. 4B ).
  • the display change switch may be composed only of a switch for displaying the second output image including the right local image SGR and the left local image SGL (see FIG. 4C ).
  • the display change switch may be configured to include the three switches.
  • the screw dial may be configured to include a right dial for controlling the rotational speed of a right screw and a left dial for controlling the rotational speed of a left screw, or may be composed only of a common dial for controlling the rotational speeds of the right and left screws simultaneously, or may be configured to include the three dials.
  • the highlighting part 50B may display the second output image including the right local image SGR (see FIG. 4A ) when the right dial is operated, and display the second output image including the left local image SGL (see FIG. 4B ) when the left dial is operated.
  • the highlighting part 50B may display the second output image including the right local image SGR and the left local image SGL (see FIG. 4C ) when the common dial is operated.
  • a display frame surrounding the local image SG may be displayed differently from display frames surrounding the hopper image HG, the left peripheral image LG, the right peripheral image RG, and the illustration image 1CG.
  • the display frame surrounding the local image SG may be so displayed as to be different in color, line type, thickness, etc., or may be caused to blink.
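
One possible way to make the display frame of the local image SG distinguishable, here by drawing it in a different colour and blinking it, is sketched below with OpenCV; the colour, thickness, and blink rate are arbitrary choices for the sketch, not values from the patent.

```python
import time
import cv2

def draw_local_image_frame(output_image, top_left, bottom_right,
                           color=(0, 0, 255), thickness=3, blink_hz=2.0):
    """Illustrative highlighted display frame for the local image SG.

    Draws the frame in a colour and thickness different from the other
    display frames and blinks it by drawing it only during half of each
    blink period. Colour (BGR red here), thickness, and blink rate are
    arbitrary choices for the sketch.
    """
    phase = (time.monotonic() * blink_hz) % 1.0
    if phase < 0.5:                      # visible half of the blink period
        cv2.rectangle(output_image, top_left, bottom_right, color, thickness)
    return output_image
```
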
  • FIG. 5 is a flowchart of the output image generating process.
  • the output image includes the first output image and the second output image.
  • the image generating system SYS repeatedly executes this output image generating process at predetermined control intervals to selectively generate one of the first output image and the second output image.
  • the image generating system SYS may generate both the first output image and the second output image.
  • the output image generating part 50A of the controller 50 correlates the values of coordinates in the output image plane and the values of coordinates in the input image planes (step S1).
  • the output image generating part 50A refers to the input image-output image correspondence map 54a to obtain the values (for example, luminance value, hue value, chromatic value, etc.) of coordinates in the input image planes corresponding to the coordinates in the output image plane, and set the obtained values as the values of the corresponding coordinates in the output image plane.
  • the controller 50 determines whether the values of all the coordinates in the output image plane are correlated with the values of coordinates in the input image planes (step S2).
  • In response to determining that not all of the coordinates are correlated, the output image generating part 50A repeats the process of step S1 and step S2.
  • In response to determining that all of the coordinates are correlated, the output image generating part 50A ends the current output image generating process.
  • FIG. 6 is a flowchart of the output image switching process.
  • the image generating system SYS repeatedly executes this output image switching process at predetermined control intervals.
  • the highlighting part 50B of the controller 50 determines whether highlighting is turned on (step S11). For example, the highlighting part 50B determines that highlighting is turned on when the display change switch serving as a highlighting switch is depressed while the first output image is displayed on the display device 52. The highlighting part 50B may determine that highlighting is turned on when the screw dial is operated.
  • In response to determining that highlighting is turned on, the highlighting part 50B displays a local image in a highlighted manner (step S12). For example, the highlighting part 50B switches the first output image (see FIG. 3 ) displayed on the display device 52 to the second output image (see FIGS. 4A through 4C ) to display at least one of the left local image SGL and the right local image SGR along with a peripheral image.
  • In response to determining that highlighting is not turned on, the highlighting part 50B continues to display the first output image on the display device 52 without switching it to the second output image.
  • the highlighting part 50B determines whether highlighting is turned off (step S13). For example, the highlighting part 50B determines that highlighting is turned off when the display change switch serving as a highlighting switch is depressed while the second output image is displayed on the display device 52. The highlighting part 50B may determine that highlighting is turned off when a predetermined time has passed since the completion of the operation of the screw dial. Alternatively, the highlighting part 50B may determine that highlighting is turned off when a predetermined time has passed since the depression of the display change switch.
  • In response to determining that highlighting is turned off, the highlighting part 50B stops displaying the local image in a highlighted manner (step S14). For example, the highlighting part 50B stops displaying the local image in a highlighted manner by switching the second output image displayed on the display device 52 to the first output image.
  • In response to determining that highlighting is not turned off, the highlighting part 50B continues to display the second output image on the display device 52 without switching it to the first output image.
  • the image generating system SYS can display a local image on the display device 52 in response to the operator's request.
  • the operator can check the amount of a paving material enclosed in a local area while remaining seated in the operator seat 1S without moving around on the tractor 1 or twisting her/his body in the operator seat 1S to look into the local area.
  • the operator can check the surroundings and the amount of a paving material enclosed in the local area substantially simultaneously without greatly moving the line of sight.
  • the asphalt finisher 100 in which the image generating system SYS is installed can reduce the operator's fatigue due to the work of checking the enclosed amount. As a result, it is possible to improve safety with respect to the asphalt finisher 100.
  • FIG. 7 is another example of the display of the second output image displayed on the display device 52.
  • the second output image of FIG. 7 differs from the second output image of FIG. 4A in that the right local image SGR is displayed over the right peripheral image RG rather than over the illustration image 1CG, but is otherwise the same. Therefore, a description of the common portions is omitted, and the differences are described in detail.
  • the following description of the right local image SGR also applies to the left local image SGL.
  • the right local image SGR is generated based on an input image of the right camera 51R, the same as the right peripheral image RG. Therefore, the right auxiliary camera 51V may be omitted.
  • the right local image SGR may be an image generated based on an input image of the right auxiliary camera 51V.
  • the right local image SGR corresponds to part of the right peripheral image RG.
  • the right peripheral image RG is composed of a first image part RG1 through an eleventh image part RG11
  • the right local image SGR of FIG. 7 corresponds to the ninth image part RG9.
  • the image generating system SYS displays the right local image SGR, which is obtained by vertically enlarging the ninth image part RG9, in the region where the seventh image part RG7 through the eleventh image part RG11 were displayed. That is, the first image part RG1 through the sixth image part RG6 continue to be displayed, while the seventh image part RG7 through the eleventh image part RG11 are concealed by the right local image SGR and become invisible.
  • the image generating system SYS displays the right local image SGR over image parts showing the right local area in the right peripheral image RG. Therefore, the operator can intuitively understand that the right local area is shown in the right local image SGR. Furthermore, by displaying the right local image SGR, the right local area can be enlarged and displayed compared with the case where the right peripheral image RG is displayed. Therefore, it is possible to more clearly show the operator the state of the right local area fed with a paving material by the right screw.
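
A sketch of the FIG. 7 style display follows, modelling the eleven image parts RG1 through RG11 as equal horizontal bands of the right peripheral image; the equal band heights and the zero-based indices are assumptions.

```python
import cv2

def overlay_enlarged_part(peripheral, n_parts=11, part_index=8, cover_from=6):
    """Sketch of the FIG. 7 style display.

    The right peripheral image RG is treated as `n_parts` equal
    horizontal bands (RG1..RG11, zero-based here). The band at
    `part_index` (RG9) is enlarged vertically so that it covers the
    bands from `cover_from` (RG7) downwards, while the bands above
    remain visible, as described for FIG. 7.
    """
    h, w = peripheral.shape[:2]
    band_h = h // n_parts
    out = peripheral.copy()
    band = peripheral[part_index * band_h:(part_index + 1) * band_h]
    target_h = h - cover_from * band_h                  # region of RG7..RG11
    enlarged = cv2.resize(band, (w, target_h))          # vertical enlargement of RG9
    out[cover_from * band_h:, :] = enlarged
    return out
```
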
  • FIG. 8 is yet another example of the display of the second output image displayed on the display device 52.
  • the second output image of FIG. 8 differs from the second output image of FIG. 7 in that the right local image SGR covers not part but the entirety of the right peripheral image RG, but is otherwise the same. Therefore, a description of the common portions is omitted, and the differences are described in detail.
  • the following description of the right local image SGR also applies to the left local image SGL.
  • the right local image SGR corresponds to part of the right peripheral image RG.
  • the right peripheral image RG is composed of the first image part RG1 through the eleventh image part RG11
  • the right local image SGR of FIG. 8 corresponds to the ninth image part RG9.
  • the image generating system SYS displays the right local image SGR, which is obtained by vertically enlarging the ninth image part RG9, in the region where the first image part RG1 through the eleventh image part RG11 were displayed. That is, all of the first image part RG1 through the eleventh image part RG11 are concealed by the right local image SGR and become invisible.
  • the image generating system SYS displays the right local image SGR over the entirety of the right peripheral image RG including an image part showing the right local area. Therefore, the operator can intuitively understand that the right local area is shown in the right local image SGR. Furthermore, by displaying the right local image SGR sized to the overall vertical length of the display device 52, the right local area can be further enlarged and displayed compared with the case of the second output image of FIG. 7 . Therefore, it is possible to even more clearly show the operator the state of the right local area fed with a paving material by the right screw.
  • FIG. 9 is still another example of the display of the second output image displayed on the display device 52.
  • the second output image of FIG. 9 differs from the second output image of FIG. 8 in that the right local image SGR is generated using not part but the entirety of the right peripheral image RG and in that an indicator BG is displayed, but is otherwise the same. Therefore, a description of the common portions is omitted, and the differences are described in detail.
  • the following description of the right local image SGR also applies to the left local image SGL.
  • the right local image SGR corresponds to the entirety of the right peripheral image RG.
  • the right local image SGR of FIG. 9 is generated by vertically enlarging or reducing the size of each of the first image part RG1 through the eleventh image part RG11.
  • the right local image SGR of FIG. 9 is composed of vertically reduced versions of the first image part RG1 through the sixth image part RG6 and the eleventh image part RG11 and vertically enlarged versions of the seventh image part RG7 through the tenth image part RG10. That is, unlike in the cases of FIGS. 7 and 8, the view shown in the right peripheral image RG remains entirely visible even when the right local image SGR is displayed.
  • the indicator BG is a graphic image representing the state of enlargement or reduction of the image parts of the local image SG relative to the corresponding image parts of the peripheral image.
  • a right indicator BGR represents the state of enlargement or reduction of the image parts of the right local image SGR relative to the corresponding image parts of the right peripheral image RG.
  • the right indicator BGR is a vertically elongated bar-shaped indicator composed of eleven rectangular segments corresponding to the first image part RG1 through the eleventh image part RG11, and is displayed at the right end of the screen. When a left indicator is displayed, the left indicator may be displayed at the left end of the screen.
  • the left indicator represents the state of enlargement or reduction of the image parts of the left local image SGL relative to the corresponding image parts of the left peripheral image LG.
  • in the bar-shaped indicator, a vertically longer rectangular segment represents a higher enlargement rate, and a vertically shorter rectangular segment represents a higher reduction rate.
  • the display of the indicator BG may be omitted.
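
The FIG. 9 behaviour, that is, non-uniform vertical scaling that keeps every image part visible plus a bar-shaped indicator whose segment lengths mirror the scale factors, might be sketched as follows; the scale values, segment drawing style, and bar width are assumptions.

```python
import cv2
import numpy as np

def scale_parts_with_indicator(peripheral, scale, bar_width=12):
    """Sketch of the FIG. 9 style local image and its indicator BG.

    `scale` holds one relative factor per image part (e.g. eleven
    values, larger for RG7-RG10). Each horizontal band is resized
    vertically in proportion to its factor while the total height is
    preserved, so every part stays visible; the indicator is a bar of
    rectangular segments whose heights mirror those factors.
    """
    h, w = peripheral.shape[:2]
    bands = np.array_split(peripheral, len(scale), axis=0)      # RG1..RG11
    heights = [int(round(h * s / sum(scale))) for s in scale]
    heights[-1] += h - sum(heights)                             # absorb rounding error
    local_img = np.vstack([cv2.resize(b, (w, max(t, 1)))
                           for b, t in zip(bands, heights)])[:h]

    indicator = np.full((h, bar_width, 3), 255, dtype=np.uint8)
    y = 0
    for t in heights:
        cv2.rectangle(indicator, (0, y), (bar_width - 1, y + t - 1), (0, 0, 0), 1)
        y += t
    return local_img, indicator
```

For example, `scale_parts_with_indicator(right_peripheral, [0.5]*6 + [2.0]*4 + [0.5])` would reduce RG1 through RG6 and RG11 and enlarge RG7 through RG10, roughly as described for FIG. 9 (the factor values are illustrative).
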
  • the image generating system SYS displays the right local image SGR over the entirety of the right peripheral image RG including an image part showing the right local area. Therefore, the operator can intuitively understand that the right local area is shown in the right local image SGR. Furthermore, as in the case of FIG. 8, by displaying the right local image SGR over the overall vertical length of the display device 52, the right local area can be enlarged and displayed compared with the case of the second output image of FIG. 7. Therefore, it is possible to more clearly show the operator the state of the right local area fed with a paving material by the right screw. Furthermore, unlike in the cases of FIGS. 7 and 8, the view shown in the right peripheral image RG remains visible even while the right local image SGR is displayed.
  • the image generating system SYS can generate an output image that makes it possible for the operator to intuitively understand the positional relationship between the asphalt finisher 100 and a worker working in its surrounding area, etc., based on input images captured by multiple cameras.
  • the image generating system SYS displays the hopper image HG, the left peripheral image LG, the right peripheral image RG, and the illustration image 1CG so that the operator can be presented with an image showing an aerial view of the asphalt finisher 100 and its surrounding area from above. This makes it possible for the operator to visually check a blind area around the asphalt finisher 100 without leaving the operator seat 1S.
  • the image generating system SYS can improve the safety and operability of the asphalt finisher 100.
  • the image generating system SYS can show the operator the amount of a paving material remaining in the hopper 2, the position of a feature (for example, a maintenance hole) in a road surface to be paved, etc.
  • the image generating system SYS can show the operator the position of a worker working around the asphalt finisher 100, etc. Therefore, the operator can look at the display device 52 and check the position of a worker, etc., and thereafter perform various operations such as opening or closing the hopper, extending or retracting the screed, and raising or lowering the screed 3. Furthermore, the operator can suspend various operations or stop the asphalt finisher when finding that the positional relationship between a worker and the hopper, the screed, or a dump truck is dangerous.
  • the image generating system SYS displays a peripheral image that is an image showing the surrounding area of the asphalt finisher 100 and displays a local image that is an image showing an area fed with a paving material by the screw SC in a highlighted manner.
  • Displaying the local image in a highlighted manner includes displaying the local image in a different display frame, enlarging and displaying the local image, and changing the mode of display of the display frame of the local image. According to this process, the image generating system SYS can clearly show the operator the state of a predetermined local area as well as the state of the surrounding area of the asphalt finisher 100.
  • the image generating system SYS may include the right camera 51R serving as a first camera to capture an image of an area to the right of the asphalt finisher 100 and the right auxiliary camera 51V serving as a second camera to capture an image of an area fed with a paving material by the right screw.
  • the right peripheral image RG that is a component of the peripheral image may be generated based on an image captured by the right camera 51R and the right local image SGR composing the local image may be generated based on an image captured by the right auxiliary camera 51V.
  • the image generating system SYS may include the left camera 51L serving as a first camera to capture an image of an area to the left of the asphalt finisher 100 and the left auxiliary camera 51U serving as a second camera to capture an image of an area fed with a paving material by the left screw.
  • the left peripheral image LG that is a component of the peripheral image may be generated based on an image captured by the left camera 51L and the left local image SGL composing the local image may be generated based on an image captured by the left auxiliary camera 51U.
  • the image generating system SYS may include the right camera 51R that captures an image of an area to the right of the asphalt finisher 100 and the right local area fed with a paving material by the right screw.
  • the right peripheral image RG and the right local image SGR may be generated based on an image captured by the right camera 51R.
  • the right auxiliary camera 51V may be omitted.
  • the image generating system SYS may include the left camera 51L that captures an image of an area to the left of the asphalt finisher 100 and the left local area fed with a paving material by the left screw.
  • the left peripheral image LG and the left local image SGL may be generated based on an image captured by the left camera 51L.
  • the left auxiliary camera 51U may be omitted.
  • the local image may be displayed without overlapping the peripheral image as illustrated in FIGS. 4A through 4C , for example, or may be displayed over the peripheral image as illustrated in FIGS. 7 through 9 , for example.
  • the display device 52 may display the indicator BG that represents the state of enlargement or reduction of the local image SG. By looking at the indicator BG, the operator can instantaneously understand the state of enlargement or reduction of each image part of the local image SG.
  • An asphalt finisher that displays images captured by a front camera, a left camera, and a right camera side by side around a computer-generated graphic of a tractor has been known (see Patent Document 1).
  • the front camera is attached to the upper end of the front of the tractor to capture an image of the inside of a hopper in front of the tractor.
  • the left camera is attached to the upper end of the left side of the tractor to capture an image of a space to the left of the asphalt finisher.
  • the right camera is attached to the upper end of the right side of the tractor to capture an image of a space to the right of the asphalt finisher.
  • a space around the front end of the hopper wings is a blind area, hidden by the hopper wings, when viewed from the front camera, the left camera, and the right camera. Therefore, an operator of the asphalt finisher cannot understand the state of the space around the front end of the hopper wings by looking at the displayed image. The operator has to check the safety of a space that is invisible in the image directly with her/his eyes.
  • FIG. 10 is a side view of the asphalt finisher 100 that is an example of a road machine according to an embodiment of the invention.
  • FIG. 11 is a plan view of the asphalt finisher 100.
  • the asphalt finisher 100 mainly includes the tractor 1, the hopper 2, and the screed 3.
  • the direction of the hopper 2 as viewed from the tractor 1 is defined as a forward direction (the +X direction)
  • the direction of the screed 3 as viewed from the tractor 1 is defined as a rearward direction (the -X direction).
  • the tractor 1 is a vehicle for causing the asphalt finisher 100 to travel. According to this embodiment, the tractor 1 rotates rear wheels 5 using rear wheel traveling hydraulic motors and rotates front wheels 6 using front wheel traveling hydraulic motors to move the asphalt finisher 100.
  • the rear wheel traveling hydraulic motors and the front wheel traveling hydraulic motors are supplied with hydraulic oil from a hydraulic pump to rotate.
  • the rear wheels 5 and the front wheels 6 may be replaced with crawlers.
  • the tractor 1 includes a canopy 1C.
  • the canopy 1C is attached to the top of the tractor 1.
  • the controller 50 is a control device to control the asphalt finisher 100.
  • the controller 50 is composed of a microcomputer including a CPU, a volatile memory, and a nonvolatile memory, and is installed in the tractor 1.
  • the CPU executes programs stored in the nonvolatile memory to implement various functions of the controller 50.
  • the hopper 2 is a mechanism for receiving a paving material, and mainly includes hopper wings 20 and hopper cylinders 24. According to this embodiment, the hopper 2 is installed in front of the tractor 1 and receives a paving material in the hopper wings 20.
  • the hopper wings 20 include a left hopper wing 20L that can be opened and closed in the Y-axis directions (widthwise of the vehicle) by a left hopper cylinder 24L and a right hopper wing 20R that can be opened and closed in the Y-axis directions (widthwise of the vehicle) by a right hopper cylinder 24R.
  • the asphalt finisher 100 fully opens the hopper wings 20 to receive a paving material (for example, an asphalt mixture) from the bed of a dump truck.
  • FIG. 11 illustrates that the hopper wings 20 are fully open.
  • the hopper wings 20 are closed to gather the paving material lying near the inner wall of the hopper 2 toward the center of the hopper 2 so that the conveyor CV in the center of the hopper 2 can continuously feed the paving material to the back of the tractor 1, that is, so that the paving material is kept piled on the conveyor CV.
  • the screw SC has extension screws laterally coupled.
  • the screed 3 is a mechanism for spreading and smoothing a paving material.
  • the screed 3 includes a front screed 30 and rear screeds 31.
  • the screed 3 is a free floating screed towed by the tractor 1, and is coupled to the tractor 1 via leveling arms 3A.
  • the rear screeds 31 include the left rear screed 31L and the right rear screed 31R.
  • the left rear screed 31L extends and retracts widthwise of the vehicle using a left screed extendable and retractable cylinder 26L
  • the right rear screed 31R extends and retracts widthwise of the vehicle using a right screed extendable and retractable cylinder 26R.
  • the image capturing devices 51 are devices to capture images. According to this embodiment, the image capturing devices 51 are monocular cameras, and are connected to the controller 50 wirelessly or by wire. The controller 50 can generate an overhead view image by performing a viewpoint changing process on images captured by the image capturing devices 51.
  • the overhead view image is, for example, an image of a space around the asphalt finisher 100 as virtually viewed from substantially directly above.
  • the image capturing devices 51 may be stereo cameras.
  • the image capturing devices 51 include the front camera 51F, the left camera 51L, the right camera 51R, and a back camera 51B.
  • the back camera 51B may be omitted.
  • the front camera 51F captures an image of a space in front of the asphalt finisher 100.
  • the front camera 51F is attached to a hood forming the front of the tractor 1 so as to be able to capture an image of the inside of the hopper 2 that is a blind area from the viewpoint of the operator seated in the operator seat 1S (hereinafter, "operator seat viewpoint").
  • the front camera 51F may be attached to the front edge of the top plate of the canopy 1C.
  • a gray area Z1 indicates the imaging range of the front camera 51F.
  • the left camera 51L captures an image of a space to the left of the asphalt finisher 100.
  • the left camera 51L is attached to the end of a bar member BL extending in the +Y direction (leftward) from the left edge of the top plate of the canopy 1C so as to be able to capture an image of a space outside the left hopper wing 20L in the vehicle width direction that is a blind area from the operator seat viewpoint.
  • the left camera 51L may be attached to the end of the bar member BL extending in the +Y direction (leftward) from the right side of the tractor 1.
  • the left camera 51L is so attached as to protrude outward (leftward) in the vehicle width direction relative to the left end of the fully opened left hopper wing 20L.
  • a gray area Z2 indicates the imaging range of the left camera 51L.
  • the right camera 51R captures an image of a space to the right of the asphalt finisher 100.
  • the right camera 51R is attached to the end of a bar member BR extending in the -Y direction (rightward) from the right edge of the top plate of the canopy 1C so as to be able to capture an image of a space outside the right hopper wing 20R in the vehicle width direction that is a blind area from the operator seat viewpoint.
  • the right camera 51R may be attached to the end of the bar member BR extending in the -Y direction (rightward) from the right side of the tractor 1.
  • the right camera 51R is so attached as to protrude outward (rightward) in the vehicle width direction relative to the right end of the fully opened right hopper wing 20R.
  • a gray area Z3 indicates the imaging range of the right camera 51R.
  • the bar members BL and BR are desirably removable.
  • the bar members BL and BR may be extendable and retractable. This is for handling the case of transporting the asphalt finisher 100 in a trailer or the like.
  • the back camera 51B captures an image of a space behind the asphalt finisher 100.
  • the back camera 51B is attached to the rear edge of the top plate of the canopy 1C so as to be able to capture an image of a space behind the screed 3 that is a blind area from the operator seat viewpoint.
  • a gray area Z4 indicates the imaging range of the back camera 51B.
  • the imaging range of the front camera 51F and the imaging range of the left camera 51L may overlap.
  • the imaging range of the back camera 51B and the imaging range of the left camera 51L do not have to overlap. The same applies to the imaging range of the right camera 51R.
  • the controller 50 generates an overhead view image by changing the viewpoints of and synthesizing the respective captured images of the front camera 51F, the left camera 51L, the right camera 51R, and the back camera 51B.
  • the overhead view image includes an image of the internal space of the hopper 2, a space to the left of the left hopper wing 20L, a space to the right of the right hopper wing 20R, and a space behind the screed 3 as virtually viewed from substantially directly above and a computer graphics image (hereinafter, "model image") of the asphalt finisher 100.
  • the controller 50 may generate an overhead view image by changing the viewpoints of and synthesizing the respective captured images of the three cameras of the front camera 51F, the left camera 51L, and the right camera 51R. That is, the overhead view image may be generated without using the back camera 51B.
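
Assuming each camera image has already been viewpoint-changed (remapped) into the common overhead image plane, the synthesis step could look like the sketch below; the mask-based blending and the placement of the model image are assumptions, since the patent does not specify how overlapping areas are combined.

```python
import numpy as np

def synthesize_overhead_view(warped_views, masks, model_image, model_top_left):
    """Sketch of synthesizing the overhead view image (assumed inputs).

    `warped_views` are the front/left/right/back camera images already
    remapped into the common overhead image plane, and `masks` are
    boolean arrays choosing which output pixels each camera supplies;
    the model image of the asphalt finisher 100 is pasted on top. The
    mask-based selection is an assumption about how overlaps are handled.
    """
    h, w = masks[0].shape
    overhead = np.zeros((h, w, 3), dtype=np.uint8)
    for view, mask in zip(warped_views, masks):
        overhead[mask] = view[mask]
    top, left = model_top_left
    mh, mw = model_image.shape[:2]
    overhead[top:top + mh, left:left + mw] = model_image     # model image of the finisher
    return overhead
```
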
  • the display device 52 is a device to display various images.
  • the display device 52 is a liquid crystal display, and is connected to the controller 50 wirelessly or by wire.
  • the display device 52 can display an image captured by each of the image capturing devices 51, and is placed at such a position as to make it easy for an operator seated in the operator seat 1S to look at the display device 52.
  • the display device 52 may be placed where a rear controller is.
  • the controller 50 displays an image generated by performing a viewpoint changing process on images captured by the image capturing devices 51 on the display device 52.
  • FIG. 12 is a block diagram illustrating an example configuration of the display system GS.
  • the display system GS mainly includes the controller 50, the image capturing devices 51, the display device 52, an information obtaining device 53, and the storage device 54.
  • the display system GS generates an image for display (hereinafter, "output image") based on images captured by the image capturing devices 51 (hereinafter, "input images”) and displays the output image on the display device 52.
  • the information obtaining device 53 obtains information and outputs the obtained information to the controller 50.
  • the information obtaining device 53 includes at least one of, for example, a hopper cylinder stroke sensor, screed extendable and retractable cylinder stroke sensors, a steering angle sensor, a travel speed sensor, and a positioning sensor.
  • the hopper cylinder stroke sensor detects the stroke amount of the hopper cylinders 24.
  • the screed extendable and retractable cylinder stroke sensors detect the stroke amounts of screed extendable and retractable cylinders 26.
  • the steering angle sensor detects the steering angle of the front wheels 6.
  • the travel speed sensor detects the travel speed of the asphalt finisher 100.
  • the positioning sensor is, for example, a GNSS compass, and detects the position (latitude, longitude, and altitude) and the orientation of the asphalt finisher 100.
  • the storage device 54 is a device for storing various kinds of information. According to this example, the storage device 54 is a nonvolatile storage device that stores the input image-output image correspondence map 54a in such a manner as to allow reference to the input image-output image correspondence map 54a.
  • the input image-output image correspondence map 54a stores the correspondence between the coordinates in the input image planes and the coordinates in the output image plane.
  • the correspondence is preset based on various parameters of the image capturing devices 51, such as an optical center, a focal length, a CCD size, an optical axis direction vector, a camera horizontal direction vector, and a projection method, so that a viewpoint can be changed as desired.
  • the correspondence is so set as to prevent an apparent distortion or tilt/shift from appearing in the output image.
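  • As a concrete illustration, a correspondence of this kind could be preset as in the following Python sketch, which projects each output-plane (overhead) coordinate onto an assumed flat road plane and back into one camera's input plane with a pinhole model. This is a minimal sketch under stated assumptions (flat ground, illustrative parameter names), not the implementation of this embodiment.

    import numpy as np

    def build_correspondence_map(K, R, t, out_shape, metres_per_pixel, img_shape):
        # K: 3x3 intrinsic matrix; R, t: world-to-camera rotation and translation.
        # out_shape: (rows, cols) of the output (overhead) image plane.
        # img_shape: (rows, cols) of this camera's input image plane.
        H, W = out_shape
        h, w = img_shape
        corr = {}
        for oy in range(H):
            for ox in range(W):
                # Ground point (z = 0) that this output-plane coordinate is assumed to show.
                X = np.array([(ox - W / 2) * metres_per_pixel,
                              (oy - H / 2) * metres_per_pixel,
                              0.0])
                p = K @ (R @ X + t)          # pinhole projection into the camera
                if p[2] <= 0:                # point lies behind the camera
                    continue
                u, v = p[0] / p[2], p[1] / p[2]
                if 0 <= u < w and 0 <= v < h:
                    corr[(oy, ox)] = (int(v), int(u))   # output coordinate -> input coordinate
        return corr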
  • the controller 50 includes a viewpoint changing part 50a and an auxiliary line creating part 50b.
  • the viewpoint changing part 50a and the auxiliary line creating part 50b are composed of software, hardware, or firmware.
  • the viewpoint changing part 50a is a functional element to generate the output image.
  • the viewpoint changing part 50a refers to the input image-output image correspondence map 54a stored in the storage device 54 to correlate the coordinates in the input image planes in which input images captured by the image capturing devices 51 are positioned with the coordinates in the output image plane in which the overhead view image as the output image is positioned.
  • the viewpoint changing part 50a generates the output image by correlating the values (for example, luminance value, hue value, chromatic value, etc.) of pixels in the input images with the values of pixels in the output image.
  • the auxiliary line creating part 50b is a functional element to create auxiliary lines to be displayed over the output image. According to this embodiment, the auxiliary line creating part 50b creates auxiliary lines such that the auxiliary lines match the overhead view image generated by the viewpoint changing part 50a.
  • auxiliary lines include an auxiliary line indicating an expected pavement trajectory that is the expected trajectory of an end of the screed 3 and an auxiliary line indicating an expected travel trajectory that is the expected trajectory of a wheel.
  • the controller 50 refers to the input image-output image correspondence map 54a through the viewpoint changing part 50a. Then, the controller 50 obtains the values (for example, luminance value, hue value, chromatic value, etc.) of coordinates in the input image planes corresponding to the coordinates in the output image plane, and adopts the obtained values as the values of the corresponding coordinates in the output image plane.
  • the controller 50 determines whether the values of all the coordinates in the output image plane are correlated with the values of coordinates in the input image planes. In response to determining that the values of all the coordinates are not correlated, the controller 50 repeats the above-described process.
  • in response to determining that the values of all the coordinates are correlated, the controller 50 displays auxiliary lines indicating expected pavement trajectories, auxiliary lines indicating expected travel trajectories, etc., over the output image. The positions in the output image where the auxiliary lines are to be superposed are preset according to this embodiment, but may instead be derived dynamically. Furthermore, the controller 50 may correlate the coordinates in the input image planes with the coordinates in the output image plane after displaying the auxiliary lines.
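  • A minimal sketch of this per-coordinate copying and the subsequent overlay of auxiliary lines is given below. The map format (a single dictionary keyed by output-plane coordinates that also records which camera to read from) is an illustrative assumption.

    import numpy as np

    def render_output_image(input_images, corr_map, out_shape, auxiliary_points):
        # input_images: dict camera_id -> HxWx3 uint8 array (the input images).
        # corr_map: dict (out_y, out_x) -> (camera_id, in_y, in_x), preset offline.
        # auxiliary_points: iterable of ((out_y, out_x), colour) pixels to draw on top.
        output = np.zeros((out_shape[0], out_shape[1], 3), dtype=np.uint8)
        # Copy the value of every mapped output-plane coordinate from the
        # corresponding input-plane coordinate.
        for (oy, ox), (cam, iy, ix) in corr_map.items():
            output[oy, ox] = input_images[cam][iy, ix]
        # Superpose the auxiliary lines (expected pavement and travel trajectories)
        # once all coordinates have been correlated.
        for (oy, ox), colour in auxiliary_points:
            output[oy, ox] = colour
        return output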
  • FIGS. 13A and 13B are diagrams illustrating an example of the overhead view image.
  • FIG. 13A illustrates an example of the overhead view image displayed on the display device 52.
  • FIG. 13B illustrates the segments of input images used for generating the overhead view image of FIG. 13A .
  • the overhead view image of FIG. 13A is an image of a space around the asphalt finisher 100 as virtually viewed from substantially directly above.
  • the overhead view image of FIG. 13A mainly includes an image G1 (see the hatched area) generated by the viewpoint changing part 50a and a model image CG1.
  • the model image CG1 is an image representing the asphalt finisher 100, and includes a model image CGa of the hopper 2, a model image CGb of the tractor 1, and a model image CGc of the screed 3.
  • the model image CGa of the hopper 2 includes a model image WL of the left hopper wing 20L and a model image WR of the right hopper wing 20R.
  • the model images WL and WR change in shape according to the output of the hopper cylinder stroke sensor.
  • FIG. 13A illustrates the model image CGa when each of the left hopper wing 20L and the right hopper wing 20R is fully open.
  • Part of the overhead view image (an image of the inside of the hopper 2 as virtually viewed from substantially directly above) is placed in the hatched area between the model image WL and the model image WR.
  • the model image CGa of the hopper 2 may be omitted. In this case, part of the overhead view image based on an image captured by the front camera 51F is placed in that area instead.
  • the model image CGc of the screed 3 includes a model image SL of the left rear screed 31L and a model image SR of the right rear screed 31R.
  • the model images SL and SR change in shape according to the outputs of the screed extendable and retractable cylinder stroke sensors.
  • FIG. 13A illustrates the model image CGc when each of the left rear screed 31L and the right rear screed 31R extends most.
  • the model image CGc of the screed 3 may be omitted. In this case, part of the overhead view image based on an image captured by the back camera 51B is placed in that area instead.
  • the image G1 is an image generated using the respective captured input images of the four image capturing devices 51.
  • the image G1 includes an image Ga of a worker in front and to the left of the asphalt finisher 100 and an image Gb of a maintenance hole cover in front and to the right of the asphalt finisher 100.
  • the controller 50 generates the image G1 by synthesizing a front image R1, a left image R2, a right image R3, and a rear image R4.
  • the worker image Ga is included in the left image R2, and the maintenance hole cover image Gb is included in the right image R3.
  • the front image R1 is an image generated based on an input image captured by the front camera 51F.
  • the front image R1 includes an image showing the state of the inside of the hopper 2 as viewed down from the tractor 1 side.
  • the controller 50 generates the front image R1 by clipping and performing a viewpoint changing process on part of the input image captured by the front camera 51F.
  • the front image R1 is placed between and on the upper side of the model image WL and the model image WR.
  • the front image R1 may change in shape as the model images WL and WR change in shape.
  • the left image R2 is an image generated based on an input image captured by the left camera 51L.
  • the left image R2 includes an image of a space on the outer side (to the left) of the left hopper wing 20L in the vehicle width direction.
  • the controller 50 generates the left image R2 by clipping and performing a viewpoint changing process by the viewpoint changing part 50a on part of the input image captured by the left camera 51L.
  • the left image R2 is placed to the left of the model image CG1.
  • the left image R2 may change in shape as the model images WL and SL change in shape.
  • the right image R3 is an image generated based on an input image captured by the right camera 51R.
  • the right image R3 includes an image of a space on the outer side (to the right) of the right hopper wing 20R in the vehicle width direction.
  • the controller 50 generates the right image R3 by clipping and performing a viewpoint changing process by the viewpoint changing part 50a on part of the input image captured by the right camera 51R.
  • the right image R3 is placed to the right of the model image CG1.
  • the right image R3 may change in shape as the model images WR and SR change in shape.
  • the rear image R4 is an image generated based on an input image captured by the back camera 51B.
  • the rear image R4 includes an image showing the state of the screed 3 as viewed down from the tractor 1 side.
  • the controller 50 generates the rear image R4 by clipping and performing a viewpoint changing process on part of the input image captured by the back camera 51B.
  • the rear image R4 is placed on the lower side of the model image CGc of the screed 3.
  • the rear image R4 may change in shape as the model images SL and SR change in shape.
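  • To picture how the four partial images and the model image are composed, the following sketch simply pastes the clipped, viewpoint-changed images R1 through R4 and the model image CG1 at fixed canvas positions; the position arguments are illustrative and not values prescribed by this embodiment.

    import numpy as np

    def compose_overhead_image(canvas_shape, model_image, front, left, right, rear,
                               model_pos, front_pos, left_pos, right_pos, rear_pos):
        # All images are HxWx3 uint8 arrays; positions are (top, left) pixel offsets.
        canvas = np.zeros((canvas_shape[0], canvas_shape[1], 3), dtype=np.uint8)

        def paste(img, pos):
            y, x = pos
            h, w = img.shape[:2]
            canvas[y:y + h, x:x + w] = img

        paste(front, front_pos)        # front image R1: inside of the hopper
        paste(left, left_pos)          # left image R2: space to the left of the left hopper wing
        paste(right, right_pos)        # right image R3: space to the right of the right hopper wing
        paste(rear, rear_pos)          # rear image R4: space behind the screed
        paste(model_image, model_pos)  # model image CG1 drawn on top
        return canvas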
  • the correspondence between the coordinates in the respective input image planes of the four cameras and the coordinates in the output image plane is stored in the input image-output image correspondence map 54a with the effects of the viewpoint changing process incorporated in advance. Therefore, the controller 50 can correlate coordinates in the input image planes with coordinates in the output image plane by only referring to the input image-output image correspondence map 54a. As a result, the controller 50 can generate and display an output image at a relatively low computational load.
  • each of the front image R1, the left image R2, the right image R3, and the rear image R4 is generated based on an input image captured by a corresponding single camera.
  • the invention is not limited to this configuration.
  • each of the front image R1, the left image R2, the right image R3, and the rear image R4 may be generated based on input images captured by two or more cameras.
  • the asphalt finisher 100 includes the multiple image capturing devices 51 attached to the tractor 1, the controller 50 that generates an overhead view image by changing the viewpoints of and synthesizing the respective captured images of the image capturing devices 51, and the display device 52 that displays the overhead view image on a screen.
  • the overhead view image is configured to include an image of a space around the asphalt finisher 100 as virtually viewed from substantially directly above. Therefore, it is possible to further reduce a blind area that is an invisible area in the output image.
  • the asphalt finisher 100 can improve visibility, safety, operability and work efficiency. Specifically, the asphalt finisher 100 can cause the operator to intuitively understand the amount of a paving material remaining in the hopper 2, the position of a feature (for example, a maintenance hole) in a road surface to be paved, etc. Therefore, the operator can look at the overhead view image and check the position of a feature or a worker, and thereafter perform various operations such as opening or closing the hopper wings 20.
  • FIGS. 14A and 14B illustrate examples where auxiliary lines are displayed over the image G1 generated by the viewpoint changing part 50a.
  • FIG. 14A illustrates an overhead view image of the asphalt finisher 100 that is traveling straight.
  • FIG. 14B illustrates an overhead view image of the asphalt finisher 100 that is turning right.
  • the auxiliary line creating part 50b creates auxiliary lines L1 and L2 based on the respective outputs of the steering angle sensor and the travel speed sensor.
  • the auxiliary line L1 is the expected travel trajectory of a left rear wheel 5L, and the auxiliary line L2 is the expected travel trajectory of a right rear wheel 5R.
  • the auxiliary lines L1 and L2 are derived based on a current steering angle and travel speed, but may be derived based solely on the steering angle.
  • the auxiliary lines L1 and L2 represent, for example, travel trajectories during a period from a current point of time until a predetermined time (for example, several tens of seconds) passes.
  • the auxiliary line creating part 50b creates auxiliary lines L3 and L4, additionally referring to the outputs of the screed extendable and retractable cylinder stroke sensors.
  • the auxiliary line L3 is the expected trajectory of the left end of the left rear screed 31L, and the auxiliary line L4 is the expected trajectory of the right end of the right rear screed 31R.
  • the auxiliary lines L3 and L4 are derived based on a current steering angle, travel speed, and stroke amounts of the screed extendable and retractable cylinders 26.
  • the auxiliary lines L3 and L4 represent trajectories during a period from a current point of time until a predetermined time (for example, several tens of seconds) passes.
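  • One possible way to derive such expected trajectories (a sketch under simple assumptions, not necessarily the method of this embodiment) is a kinematic bicycle model: the steering angle and travel speed give a yaw rate, and a point offset laterally from the machine centreline, by the wheel track for a rear wheel or by half the current screed width obtained from the cylinder strokes for a screed end, is traced over the prediction horizon. The wheelbase, horizon, and step values below are illustrative.

    import math

    def expected_trajectory(steer_rad, speed_mps, lateral_offset_m,
                            wheelbase_m=3.0, horizon_s=30.0, step_s=0.5):
        # Returns (x, y) points in the machine frame (x forward, y to the left)
        # for a point offset laterally from the rear-axle centre.
        points = []
        xc, yc, heading = 0.0, 0.0, 0.0                      # rear-axle centre pose
        yaw_rate = speed_mps * math.tan(steer_rad) / wheelbase_m
        t = 0.0
        while t <= horizon_s:
            # Rigid-body position of the offset point (wheel or screed end).
            px = xc - lateral_offset_m * math.sin(heading)
            py = yc + lateral_offset_m * math.cos(heading)
            points.append((px, py))
            xc += speed_mps * math.cos(heading) * step_s     # integrate centre motion
            yc += speed_mps * math.sin(heading) * step_s
            heading += yaw_rate * step_s
            t += step_s
        return points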
  • the auxiliary line creating part 50b creates auxiliary lines L5 and L6 based on road design data and the output of the positioning sensor.
  • the road design data are data related to a road that is a construction target, and are prestored in a nonvolatile storage device, for example.
  • the road design data include, for example, data on the position of a feature in a road surface to be paved.
  • the auxiliary line L5 represents the left edge of the road that is a construction target, and the auxiliary line L6 represents the right edge of the road that is a construction target.
  • the controller 50 displays an image of a feature such as a maintenance hole over the overhead view image based on the road design data and the output of the positioning sensor. According to the illustration of FIGS. 14A and 14B , the controller 50 displays a model image CG2 of a maintenance hole cover over the overhead view image.
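  • The following sketch shows one way such a feature could be placed in the overhead view image from the road design data and the pose reported by the positioning sensor; the local east/north coordinate convention and all names are illustrative assumptions.

    import math

    def feature_to_overhead_pixel(feature_east, feature_north,
                                  machine_east, machine_north, machine_heading_rad,
                                  centre_px, metres_per_pixel):
        # machine_heading_rad: heading measured clockwise from north.
        # centre_px: (row, col) of the machine reference point in the overhead image.
        de = feature_east - machine_east          # offset to the feature in world axes
        dn = feature_north - machine_north
        # Rotate the offset into the machine frame (x forward, y to the left).
        x = dn * math.cos(machine_heading_rad) + de * math.sin(machine_heading_rad)
        y = dn * math.sin(machine_heading_rad) - de * math.cos(machine_heading_rad)
        # Forward maps to "up" in the image; left maps to smaller column indices.
        row = int(centre_px[0] - x / metres_per_pixel)
        col = int(centre_px[1] - y / metres_per_pixel)
        return row, col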
  • the operator of the asphalt finisher 100 can determine whether the current steering angle, travel speed, amounts of extension or retraction of the rear screeds 31, etc., of the asphalt finisher 100 are appropriate. For example, by looking at the overhead view image of FIG. 14A, the operator can be aware that if the asphalt finisher 100 is kept traveling straight, the right rear wheel 5R will run over the maintenance hole cover and the road will not be paved as designed. In contrast, by looking at the overhead view image of FIG. 14B, the operator can be aware that if a current steering condition (condition in which a steering wheel is turned to the right) and travel speed are maintained, the right rear wheel 5R can avoid running over the maintenance hole cover and the road will be paved as designed.
  • the controller 50, which displays the expected travel trajectories of the rear wheels 5 according to the illustration of FIGS. 14A and 14B, may display the expected travel trajectories of the front wheels 6 or display the respective expected travel trajectories of the rear wheels 5 and the front wheels 6.
  • the controller 50 can achieve the additional effect of being able to present the movement of the asphalt finisher 100 during a period from a current point of time until a predetermined time passes to the operator in advance.
  • the image part may be enlarged or reduced in size only laterally, or may be enlarged or reduced in size both vertically and laterally.
  • the indicator BG may be a laterally elongated bar-shaped indicator.
  • the indicator BG may be a vertically and laterally elongated matrix-shaped indicator.
  • the asphalt finisher 100 may be a guss asphalt finisher using a guss asphalt mixture.
  • the image generating system SYS may be installed in a guss asphalt finisher using a guss asphalt mixture.
  • 50b ... auxiliary line creating part
  • 51 ... image capturing device
  • 51B ... back camera
  • 51F ... front camera
  • 51L ... left camera
  • 51R ... right camera
  • 51U ... left auxiliary camera
  • 51V ... right auxiliary camera
  • 52 ... display device
  • 53 ... information obtaining device
  • 54 ... storage device
  • 54a ... input image-output image correspondence map
  • 55 ... input device
  • 65 ... operation panel
  • 70 ... retaining plate
  • 71 ... side plate
  • 72 ... mold board
  • 100 ... asphalt finisher
  • BL, BR ... bar member
  • CV ... conveyor
  • SC ... screw
  • SYS ... image generating system

Abstract

An asphalt finisher (100) serving as a road machine according to an embodiment of the invention includes a tractor (1) and a screed (3) placed behind the tractor (1). The asphalt finisher (100) includes a work apparatus configured to feed a paving material in front of the screed (3) and a display device (52) configured to display a peripheral image that is an image showing the surrounding area of the asphalt finisher (100), and to display a local image (SG) in a highlighted manner, the local image (SG) being an image showing an area fed with the paving material by the work apparatus.

Description

    TECHNICAL FIELD
  • The invention relates to road machines with a screw that feeds a paving material in an axial direction.
  • BACKGROUND ART
  • An image generator that generates an image showing an aerial view of an asphalt finisher and its surrounding area from above and presents the image to an operator of the asphalt finisher has been known (see Patent Document 1). This image generator makes it possible for the operator to intuitively understand the positional relationship between the asphalt finisher and an object in its surrounding area by generating and displaying an image showing the entirety of an area around the asphalt finisher.
  • PRIOR ART DOCUMENT
  • PATENT DOCUMENT
  • Patent Document 1: Japanese Patent No. 6029941
  • SUMMARY OF THE INVENTION
  • PROBLEMS TO BE SOLVED BY THE INVENTION
  • According to the above-described configuration, however, a space around the front ends of the hopper wings is a blind area blocked by the hopper wings when viewed from a front camera, a left camera, and a right camera. Furthermore, normally, the above-described configuration is not suitable for causing the operator to check the state of a predetermined local area such as an area in front of a screed where a paving material is accumulated during construction. Therefore, the operator has to check the presence or absence of an object in a space that the operator desires to see, the amount of a paving material in a predetermined local area, etc., directly with her/his eyes.
  • In view of the above, it is desired to provide a road machine that further reduces an area that is difficult to see in an image.
  • MEANS FOR SOLVING THE PROBLEMS
  • A road machine according to an embodiment of the invention, which is a road machine including a tractor and a screed placed behind the tractor, includes a work apparatus configured to feed a paving material in front of the screed and a display device configured to display a peripheral image that is an image showing the surrounding area of the road machine, and to display a local image in a highlighted manner, the local image being an image showing an area fed with the paving material by the work apparatus.
  • EFFECTS OF THE INVENTION
  • By the above-described means, a road machine that further reduces an area that is difficult to see in an image is provided.
  • BRIEF DESCRIPTION OF THE DRAWINGS
    • FIG. 1A is a side view of an asphalt finisher according to an embodiment of the invention.
    • FIG. 1B is a plan view of the asphalt finisher according to the embodiment of the invention.
    • FIG. 1C is a rear view of the asphalt finisher according to the embodiment of the invention.
    • FIG. 2 is a diagram illustrating an example configuration of an image generating system installed in the asphalt finisher of FIG. 1A.
    • FIG. 3 is an example of the display of a first output image.
    • FIG. 4A is an example of the display of a second output image.
    • FIG. 4B is an example of the display of the second output image.
    • FIG. 4C is an example of the display of the second output image.
    • FIG. 5 is a flowchart of an output image generating process.
    • FIG. 6 is a flowchart of an output image switching process.
    • FIG. 7 is another example of the display of the second output image.
    • FIG. 8 is yet another example of the display of the second output image.
    • FIG. 9 is still another example of the display of the second output image.
    • FIG. 10 is a side view of the asphalt finisher according to an embodiment of the invention.
    • FIG. 11 is a plan view of the asphalt finisher of FIG. 10.
    • FIG. 12 is a block diagram illustrating an example configuration of a display system installed in the asphalt finisher of FIG. 10.
    • FIG. 13A is a diagram illustrating an example of an image generated by the display system of FIG. 12.
    • FIG. 13B is a diagram illustrating the segments of input images used for generating the image of FIG. 13A.
    • FIG. 14A is a diagram illustrating another example of the image generated by the display system of FIG. 12.
    • FIG. 14B is a diagram illustrating yet another example of the image generated by the display system of FIG. 12.
    EMBODIMENTS OF THE INVENTION
  • A best mode for carrying out the invention is described below with reference to the drawings.
  • FIGS. 1A through 1C illustrate an example configuration of an asphalt finisher 100 serving as a road machine according to an embodiment of the invention. FIG. 1A shows a side view, FIG. 1B shows a plan view, and FIG. 1C shows a rear view.
  • The asphalt finisher 100 mainly includes a tractor 1, a hopper 2, and a screed 3.
  • The tractor 1 is an apparatus for causing the asphalt finisher 100 to travel and tows the screed 3. According to this embodiment, the tractor 1 rotates two or four wheels using traveling hydraulic motors to move the asphalt finisher 100. The traveling hydraulic motors are supplied with hydraulic oil from a hydraulic pump driven by a prime mover such as a diesel engine to rotate. An operator seat 1S and an operation panel 65 are placed on top of the tractor 1.
  • Image capturing devices 51 (a right camera 51R, a left camera 51L, a front camera 51F, a right auxiliary camera 51V, and a left auxiliary camera 51U) are attached to the left side, the right side, and the front of the tractor 1. A display device 52 is installed at such a position as to make it easy for an operator seated in the operator seat 1S to look at the display device 52. According to this embodiment, the direction of the hopper 2 as viewed from the tractor 1 is defined as a forward direction (the +X direction), and the direction of the screed 3 as viewed from the tractor 1 is defined as a rearward direction (the -X direction). The +Y direction corresponds to a leftward direction, and the -Y direction corresponds to a rightward direction.
  • The hopper 2, which is an example of a work apparatus, is a mechanism for receiving a paving material (for example, an asphalt mixture). The work apparatus is an apparatus to feed a paving material in front of the screed 3. According to this embodiment, the hopper 2 can be opened and closed widthwise of the vehicle by hydraulic cylinders. Normally, the asphalt finisher 100 fully opens the hopper 2 to receive a paving material from the bed of a dump truck. When the paving material in the hopper 2 decreases, the asphalt finisher 100 closes the hopper 2 to gather the paving material near the inner wall of the hopper 2 to the center of the hopper 2, thereby making it possible for a conveyor CV, which is an example of a work apparatus, to feed the paving material to the screed 3.
  • The screed 3 is a mechanism for spreading and smoothing a paving material. According to this embodiment, the screed 3 can be vertically elevated and lowered and be extended and retracted widthwise of the vehicle by hydraulic cylinders. When extended widthwise of the vehicle, the screed 3 is wider than the tractor 1. According to this embodiment, the screed 3 includes a front screed 30, a left rear screed 31L, and a right rear screed 31R. The left rear screed 31L and the right rear screed 31R can extend and retract widthwise of the vehicle (Y-axis directions). The left rear screed 31L and the right rear screed 31R, which can extend and retract widthwise of the vehicle, are arranged with an offset from each other in a travel direction (X-axis direction). Therefore, compared with the case where there is no offset, the left rear screed 31L and the right rear screed 31R can have a larger width (length in the vehicle width direction) and extend more in the vehicle width direction, thus making it possible to construct a wider new pavement.
  • FIG. 2 schematically illustrates an example configuration of an image generating system SYS installed in the asphalt finisher 100. For example, the image generating system SYS generates an output image based on input images captured by the image capturing devices 51 installed in the asphalt finisher 100. According to this embodiment, the image generating system SYS mainly includes a controller 50, the image capturing devices 51, the display device 52, a storage device 54, and an input device 55.
  • The controller 50 is, for example, a computer including a CPU, a volatile memory, and a nonvolatile memory. For example, the controller 50 causes the CPU to execute programs corresponding to an output image generating part 50A and a highlighting part 50B to implement functions corresponding to the output image generating part 50A and the highlighting part 50B.
  • The image capturing devices 51 are devices to obtain input images for generating an output image. According to this embodiment, the image capturing devices 51 are cameras including an imaging device such as a CCD or a CMOS. For example, the image capturing devices 51 are so attached to the tractor 1 as to be able to capture an image of the blind area of the operator seated in the operator seat 1S. The blind area includes, for example, an internal space (especially near the tractor 1) of the hopper 2, a space outside the front end of the hopper 2, and a space near a road surface near the side of the asphalt finisher 100.
  • The image capturing devices 51 may be attached to positions other than the right side, the left side, and the front (for example, the rear) of the tractor 1. A wide-angle lens, a fish-eye lens, or the like may be attached to the image capturing devices 51. The image capturing devices 51 may be either attached to the hopper 2 or attached to the screed 3.
  • According to this embodiment, the image capturing devices 51 include the front camera 51F, the left camera 51L, the right camera 51R, the left auxiliary camera 51U, and the right auxiliary camera 51V. As illustrated in FIGS. 1A and 1B, the front camera 51F is attached to the upper end of the front of the tractor 1 such that an optical axis 51FX thereof extends forward in the travel direction and forms an angle α with the road surface as viewed from the side. As illustrated in FIGS. 1A through 1C, the left camera 51L is attached to the upper end of the left side of the tractor 1 such that an optical axis 51LX thereof forms an angle β with the left side surface of the tractor 1 as viewed from above and forms an angle γ with the road surface as viewed from the rear. The right camera 51R is attached in the same manner as the left camera 51L with right and left reversed. As illustrated in FIGS. 1A through 1C, the left auxiliary camera 51U is attached to the upper end of the left side of the tractor 1 such that an optical axis 51UX thereof forms an angle δ with the left side surface of the tractor 1 as viewed from above and forms an angle ε with the road surface as viewed from the rear. The right auxiliary camera 51V is attached in the same manner as the left auxiliary camera 51U with right and left reversed. In FIG. 1B, an area 51FA surrounded by a dashed line indicates the imaging range of the front camera 51F, an area 51LA surrounded by a one-dot chain line indicates the imaging range of the left camera 51L, and an area 51RA surrounded by a one-dot chain line indicates the imaging range of the right camera 51R. An area 51UA surrounded by a two-dot chain line indicates the imaging range of the left auxiliary camera 51U, and an area 51VA surrounded by a two-dot chain line indicates the imaging range of the right auxiliary camera 51V.
  • The left camera 51L and the left auxiliary camera 51U are attached to the tractor 1 such that the area 51UA indicating the imaging range of the left auxiliary camera 51U is completely included in the area 51LA indicating the imaging range of the left camera 51L. Alternatively, the left camera 51L and the left auxiliary camera 51U may be attached to the tractor 1 such that the area 51LA and the area 51UA overlap each other, namely, the area 51UA protrudes from the area 51LA. Likewise, the right camera 51R and the right auxiliary camera 51V are attached to the tractor 1 such that the area 51VA indicating the imaging range of the right auxiliary camera 51V is completely included in the area 51RA indicating the imaging range of the right camera 51R. Alternatively, the right camera 51R and the right auxiliary camera 51V may be attached to the tractor 1 such that the area 51RA and the area 51VA overlap each other, namely, the area 51VA protrudes from the area 51RA. The left auxiliary camera 51U and the right auxiliary camera 51V may be omitted.
  • The image capturing devices 51 are attached to the asphalt finisher 100 via, for example, a bracket, a stay, a bar, or the like. According to this embodiment, the image capturing devices 51 are attached to the tractor 1 via an attachment stay. Alternatively, the image capturing devices 51 may be directly attached to the tractor 1 without using an attachment stay or be embedded in the tractor 1.
  • According to this embodiment, the image capturing devices 51 output captured input images to the controller 50. In the case of obtaining input images using a fish-eye lens or a wide-angle lens, the image capturing devices 51 may output corrected input images in which an apparent distortion or tilt/shift caused by using such a lens is corrected to the controller 50. Alternatively, the image capturing devices 51 may output input images in which the apparent distortion or tilt/shift is not corrected as they are to the controller 50. In this case, the apparent distortion or tilt/shift is corrected by the controller 50.
  • Thus, the image capturing devices 51 are arranged such that multiple blind areas to the left and to the right of the asphalt finisher 100 and inside and outside the hopper 2 are included in their imaging ranges.
  • The input device 55 is a device for enabling the operator to input various kinds of information to the image generating system SYS, and is, for example, a touchscreen, buttons, switches, or the like. According to this embodiment, the input device 55 includes a display change switch and a screw dial.
  • The display change switch is a switch for changing configurations of an output image displayed on the display device 52. The screw dial is a dial for controlling the rotational speed of a screw SC that is an example of a work apparatus.
  • The storage device 54 is a device for storing various kinds of information. According to this embodiment, the storage device 54 is a nonvolatile storage device and is integrated into the controller 50. Alternatively, the storage device 54 may be placed outside the controller 50 as a structure different from the controller 50.
  • The display device 52 is a device for displaying various kinds of information. According to this embodiment, the display device 52 is a liquid crystal display installed in the operation panel 65, and displays various images output by the controller 50.
  • The output image generating part 50A is a functional element for generating an output image, and is composed of, for example, software, hardware, or their combination. According to this embodiment, the output image generating part 50A refers to an input image-output image correspondence map 54a stored in the storage device 54 to correlate coordinates in input image planes in which input images captured by the image capturing devices 51 are positioned with coordinates in an output image plane in which an output image is positioned. Then, the output image generating part 50A generates the output image by associating the values (for example, luminance value, hue value, chromatic value, etc.) of pixels in the output image with the values of pixels in the input images.
  • The input image-output image correspondence map 54a stores the correspondence between the coordinates in the input image planes and the coordinates in the output image plane in such a manner as to allow reference to the correspondence. The correspondence is preset based on various parameters of the image capturing devices 51, such as an optical center, a focal length, a CCD size, an optical axis direction vector, a camera horizontal direction vector, and a projection method. The correspondence may be set in such a manner as to prevent an apparent distortion or tilt/shift from appearing in the output image if an input image includes the apparent distortion or tilt/shift. In this case, a coordinate group forming a non-rectangular area in the input image plane is correlated with a coordinate group forming a rectangular area in the output image plane. The correspondence may be set such that a coordinate group forming a rectangular area in the input image plane corresponds directly to a coordinate group forming a rectangular area in the output image if the apparent distortion or tilt/shift in the input image is already corrected when the input image is obtained.
  • The highlighting part 50B is a functional element for changing the contents of an output image displayed on the display device 52, and is composed of, for example, software, hardware, or their combination. According to this embodiment, the highlighting part 50B switches an output image displayed on the display device 52 between a first output image and a second output image when the display change switch serving as a highlighting switch is depressed. The highlighting part 50B may also switch the first output image to the second output image when the screw dial is operated and thereafter switch the second output image to the first output image when the length of time without screw dial operation (non-operating time) reaches a predetermined period. Likewise, the highlighting part 50B may also switch the first output image to the second output image when the highlighting switch is operated and thereafter switch the second output image to the first output image when the length of time without highlighting switch operation (non-operating time) reaches a predetermined period. The non-operating time is counted using the timer function of the controller 50, for example.
  • The first output image includes a peripheral image and does not include a local image. The second output image includes a peripheral image and a local image. The peripheral image is an image showing an area surrounding the asphalt finisher 100. The local image is an image showing a predetermined local area associated with the asphalt finisher 100, for example, an image showing an area where a paving material is fed (scattered out) by the screw SC. The area where a paving material is fed by the screw SC is, for example, an area in front of the screed 3 and surrounded by retaining plates 70 (see FIG. 1B), side plates 71 (see FIG. 1B), and mold boards 72 (see FIG. 1B). By looking at this local image, the operator can check the amount of a paving material enclosed in the local area while remaining seated in the operator seat 1S without moving around on the tractor 1 or twisting her/his body in the operator seat 1S to look into the local area. Furthermore, the operator can check the surroundings and the amount of a paving material enclosed in the local area substantially simultaneously without greatly moving the line of sight. Thus, the asphalt finisher 100 in which the image generating system SYS is installed can reduce the operator's fatigue due to the work of checking the enclosed amount. As a result, it is possible to improve safety with respect to the asphalt finisher 100. The local image may be an image showing an area in the hopper 2.
  • Furthermore, in the case of displaying the second output image on the display device 52, the highlighting part 50B displays the local image in a highlighted manner on the display device 52 so that the operator can distinguish between the peripheral image and the local image. For example, the highlighting part 50B displays the local image in a display frame different from a display frame surrounding the peripheral image. In this case, the local image may be displayed over the peripheral image in such a manner as to overlap the peripheral image or be displayed at a different position than the peripheral image so as not to overlap the peripheral image. Alternatively, an image part corresponding to the local area in the peripheral image may be enlarged and displayed. In this case, at least part of the other image part of the peripheral image may be reduced in size and displayed, or the display of the other image part may be omitted. In the case where an image part corresponding to the local area in the peripheral image is enlarged and displayed, the display of the local image with another display frame may be omitted.
  • Next, the first output image generated using the respective input images of the left camera 51L, the right camera 51R, and the front camera 51F is described with reference to FIG. 3. FIG. 3 is an example of the display of the first output image displayed on the display device 52.
  • The first output image mainly includes a hopper image HG, a left peripheral image LG, a right peripheral image RG, and an illustration image 1CG. The hopper image HG, the left peripheral image LG, and the right peripheral image RG compose the peripheral image. The image generating system SYS displays the hopper image HG, the left peripheral image LG, the right peripheral image RG, and the illustration image 1CG at predetermined positions in predetermined size in the first output image so that the operator can understand that the front of the asphalt finisher 100 coincides with the upper side of the screen of the display device 52. This is for causing the operator to intuitively understand the positional relationship between the asphalt finisher 100 and an object in its surrounding area by showing the operator the first output image serving as an overhead view image showing an aerial view of the asphalt finisher 100 and its surrounding area from above.
  • The hopper image HG is generated based on an input image of the front camera 51F. According to this embodiment, the hopper image HG is an image showing the state of the inside of the hopper 2 as seen when looking down at the hopper 2 from the tractor 1, and is generated by clipping part of the input image of the front camera 51F to be placed in the top center of the first output image.
  • The left peripheral image LG is generated based on an input image of the left camera 51L. According to this embodiment, the left peripheral image LG is an image showing the state of a left peripheral area to the left of the asphalt finisher 100 in the travel direction as seen when looking down at the left peripheral area from the tractor 1. Specifically, the left peripheral image LG is generated by clipping part of the input image of the left camera 51L, performing distortion correction thereon, and further performing image rotation thereon, and is placed at the left end of the first output image. Furthermore, the left peripheral image LG includes an image of the left end portion of the screed 3 and an image of the left end portion of the hopper 2.
  • The right peripheral image RG is generated based on an input image of the right camera 51R. According to this embodiment, the right peripheral image RG is an image showing the state of a right peripheral area to the right of the asphalt finisher 100 in the travel direction as seen when looking down at the right peripheral area from the tractor 1. Specifically, the right peripheral image RG is generated by clipping part of the input image of the right camera 51R, performing distortion correction thereon, and further performing image rotation thereon, and is placed at the right end of the first output image. Furthermore, the right peripheral image RG includes an image of the right end portion of the screed 3 and an image of the right end portion of the hopper 2.
  • The distortion correction is image processing for correcting an apparent distortion or tilt/shift caused by using a wide-angle lens or the like. The image rotation is image processing for matching the respective orientations of the left peripheral image LG and the right peripheral image RG with the front side of the asphalt finisher 100 in the travel direction (the upper side of the screen of the display device 52). According to this embodiment, the correspondence between the coordinates in the input image planes associated with the respective input images of the left camera 51L and the right camera 51R and the coordinates in the output image plane is stored in the input image-output image correspondence map 54a with the effects of the distortion correction and the image rotation incorporated in advance. The distortion correction and the image rotation may be performed on the hopper image HG.
  • The illustration image 1CG is a computer-generated graphic of the tractor 1, and is so displayed as to enable the operator to understand the position of the tractor 1. According to this embodiment, the illustration image 1CG is placed in the bottom center of the first output image.
  • Thus, the display device 52 can display the first output image showing an aerial view of the asphalt finisher 100 and its surrounding area from above.
  • According to the above-described embodiment, the hopper image HG, the left peripheral image LG, and the right peripheral image RG are adjacently placed as separate independent images. The three images, however, may be synthesized into a single continuous image. In this case, image processing may be performed to prevent disappearance of an image of an object in the overlapping area of the imaging range of the front camera 51F and the imaging ranges of the left camera 51L and the right camera 51R.
  • Furthermore, according to the above-described embodiment, each of the hopper image HG, the left peripheral image LG, and the right peripheral image RG is generated based on an input image captured by a corresponding single camera. Each of the hopper image HG, the left peripheral image LG, and the right peripheral image RG, however, may be generated based on input images captured by two or more cameras. For example, the left peripheral image LG may be generated based on input images captured by the left camera 51L and the left auxiliary camera 51U. Furthermore, the right peripheral image RG may be generated based on input images captured by the right camera 51R and the right auxiliary camera 51V.
  • Next, the second output image generated using the respective input images of the left camera 51L, the right camera 51R, the front camera 51F, the left auxiliary camera 51U, and the right auxiliary camera 51V is described with reference to FIGS. 4A through 4C. FIGS. 4A through 4C are examples of the display of the second output image displayed on the display device 52.
  • The second output image mainly includes the hopper image HG, the left peripheral image LG, the right peripheral image RG, the illustration image 1CG, and a local image SG. According to this embodiment, the local image SG is displayed over the illustration image 1CG. FIG. 4A illustrates a second output image including a right local image SGR showing a right local area that is an area surrounded by the retaining plate 70 (see FIG. 1B), the side plate 71 (see FIG. 1B), and the mold board 72 (FIG. 1B) on the right side of the tractor 1. FIG. 4B illustrates a second output image including a left local image SGL showing a left local area that is an area surrounded by the retaining plate 70 (see FIG. 1B), the side plate 71 (see FIG. 1B), and the mold board 72 (FIG. 1B) on the left side of the tractor 1. FIG. 4C illustrates a second output image including the right local image SGR and the left local image SGL.
  • The right local image SGR is generated based on an input image of the right auxiliary camera 51V. According to this embodiment, the right local image SGR is an image showing the right local area as seen when looking down at the right local area from the tractor 1. Specifically, the right local image SGR is generated by clipping part of the input image of the right auxiliary camera 51V, performing distortion correction thereon, and further performing image rotation thereon, and is placed along the right end of the illustration image 1CG over the illustration image 1CG.
  • The left local image SGL is generated based on an input image of the left auxiliary camera 51U. According to this embodiment, the left local image SGL is an image showing the left local area as seen when looking down at the left local area from the tractor 1. Specifically, the left local image SGL is generated by clipping part of the input image of the left auxiliary camera 51U, performing distortion correction thereon, and further performing image rotation thereon, and is placed along the left end of the illustration image 1CG over the illustration image 1CG.
  • The display change switch may be configured to include a first switch for displaying the second output image including the right local image SGR (see FIG. 4A) and a second switch for displaying the second output image including the left local image SGL (see FIG. 4B). Alternatively, the display change switch may be composed only of a switch for displaying the second output image including the right local image SGR and the left local image SGL (see FIG. 4C). Alternatively, the display change switch may be configured to include the three switches.
  • The screw dial may be configured to include a right dial for controlling the rotational speed of a right screw and a left dial for controlling the rotational speed of a left screw, or may be composed only of a common dial for controlling the rotational speeds of the right and left screws simultaneously, or may be configured to include the three dials. For example, the highlighting part 50B may display the second output image including the right local image SGR (see FIG. 4A) when the right dial is operated, and display the second output image including the left local image SGL (see FIG. 4B) when the left dial is operated. The highlighting part 50B may display the second output image including the right local image SGR and the left local image SGL (see FIG. 4C) when the common dial is operated.
  • A display frame surrounding the local image SG may be displayed differently from display frames surrounding the hopper image HG, the left peripheral image LG, the right peripheral image RG, and the illustration image 1CG. For example, the display frame surrounding the local image SG may be so displayed as to be different in color, line type, thickness, etc., or may be caused to blink.
  • Next, a process of generating an output image by the image generating system SYS (hereinafter, "output image generating process") is described with reference to FIG. 5. FIG. 5 is a flowchart of the output image generating process. The output image includes the first output image and the second output image. The image generating system SYS repeatedly executes this output image generating process at predetermined control intervals to selectively generate one of the first output image and the second output image. The image generating system SYS, however, may generate both the first output image and the second output image.
  • First, the output image generating part 50A of the controller 50 correlates the values of coordinates in the output image plane and the values of coordinates in the input image planes (step S1). According to this embodiment, the output image generating part 50A refers to the input image-output image correspondence map 54a to obtain the values (for example, luminance value, hue value, chromatic value, etc.) of coordinates in the input image planes corresponding to the coordinates in the output image plane, and set the obtained values as the values of the corresponding coordinates in the output image plane.
  • Thereafter, the controller 50 determines whether the values of all the coordinates in the output image plane are correlated with the values of coordinates in the input image planes (step S2).
  • In response to determining that the values of all the coordinates are not correlated (NO at step S2), the output image generating part 50A repeats the process of step S1 and step S2.
  • In response to determining that the values of all the coordinates are correlated (YES at step S2), the output image generating part 50A ends the output image generating process of this time.
  • Next, a process of switching an output image displayed on the display device 52 between the first output image and the second output image by the image generating system SYS (hereinafter, "output image switching process") is described with reference to FIG. 6. FIG. 6 is a flowchart of the output image switching process. The image generating system SYS repeatedly executes this output image switching process at predetermined control intervals.
  • First, the highlighting part 50B of the controller 50 determines whether highlighting is turned on (step S11). For example, the highlighting part 50B determines that highlighting is turned on when the display change switch serving as a highlighting switch is depressed while the first output image is displayed on the display device 52. The highlighting part 50B may determine that highlighting is turned on when the screw dial is operated.
  • In response to determining that highlighting is turned on (YES at step S11), the highlighting part 50B displays a local image in a highlighted manner (step S12). For example, the highlighting part 50B switches the first output image (see FIG. 3) displayed on the display device 52 to the second output image (see FIGS. 4A through 4C) to display at least one of the left local image SGL and the right local image SGR along with a peripheral image.
  • In response to determining that highlighting is not turned on (NO at step S11), the highlighting part 50B continues to display the first output image on the display device 52 without switching the displayed first output image to the second output image.
  • Thereafter, the highlighting part 50B determines whether highlighting is turned off (step S13). For example, the highlighting part 50B determines that highlighting is turned off when the display change switch serving as a highlighting switch is depressed while the second output image is displayed on the display device 52. The highlighting part 50B may determine that highlighting is turned off when a predetermined time has passed since the completion of the operation of the screw dial. Alternatively, the highlighting part 50B may determine that highlighting is turned off when a predetermined time has passed since the depression of the display change switch.
  • In response to determining that highlighting is turned off (YES at step S13), the highlighting part 50B stops displaying the local image in a highlighted manner (step S14). For example, the highlighting part 50B stops displaying the local image in a highlighted manner by switching the second output image displayed on the display device 52 to the first output image.
  • In response to determining that highlighting is not turned off (NO at step S13), the highlighting part 50B continues to display the second output image on the display device 52 without switching the displayed second output image to the first output image.
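  • A minimal sketch of this switching behaviour is given below; the timeout value and the method names are illustrative assumptions rather than part of this embodiment.

    import time

    class HighlightSwitcher:
        def __init__(self, timeout_s=10.0):
            self.timeout_s = timeout_s        # illustrative "predetermined time"
            self.highlight_on = False
            self.last_operation = None

        def on_display_change_switch(self):
            # Pressing the display change switch toggles highlighting (steps S11/S13).
            self.highlight_on = not self.highlight_on
            self.last_operation = time.monotonic()

        def on_screw_dial_operated(self):
            # Operating the screw dial also turns highlighting on (step S11).
            self.highlight_on = True
            self.last_operation = time.monotonic()

        def select_output(self, first_output, second_output):
            # Turn highlighting off once the non-operating time reaches the timeout
            # (steps S13/S14); otherwise keep showing the second output image with
            # the highlighted local image (step S12).
            if self.highlight_on and self.last_operation is not None:
                if time.monotonic() - self.last_operation >= self.timeout_s:
                    self.highlight_on = False
            return second_output if self.highlight_on else first_output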
  • According to this configuration, the image generating system SYS can display a local image on the display device 52 in response to the operator's request. By looking at this local image, the operator can check the amount of a paving material enclosed in a local area while remaining seated in the operator seat 1S without moving around on the tractor 1 or twisting her/his body in the operator seat 1S to look into the local area. Furthermore, the operator can check the surroundings and the amount of a paving material enclosed in the local area substantially simultaneously without greatly moving the line of sight. Thus, the asphalt finisher 100 in which the image generating system SYS is installed can reduce the operator's fatigue due to the work of checking the enclosed amount. As a result, it is possible to improve safety with respect to the asphalt finisher 100.
  • Next, another example of the display of the second output image is described with reference to FIG. 7. FIG. 7 is another example of the display of the second output image displayed on the display device 52. The second output image of FIG. 7 differs from the second output image of FIG. 4A in that the right local image SGR is displayed over the right peripheral image RG rather than over the illustration image 1CG, but is otherwise the same. Therefore, a description of the common portion is omitted, and the differences are described in detail. The following description of the right local image SGR also applies to the left local image SGL.
  • According to the illustration of FIG. 7, the right local image SGR is generated based on an input image of the right camera 51R, the same as the right peripheral image RG. Therefore, the right auxiliary camera 51V may be omitted. The right local image SGR, however, may be an image generated based on an input image of the right auxiliary camera 51V.
  • The right local image SGR corresponds to part of the right peripheral image RG. For example, when the right peripheral image RG is composed of a first image part RG1 through an eleventh image part RG11, the right local image SGR of FIG. 7 corresponds to the ninth image part RG9. Specifically, the image generating system SYS displays the right local image SGR, which is an image into which the ninth image part RG9 is vertically enlarged, where the seventh image part RG7 through the eleventh image part RG11 have been displayed. That is, the first image part RG1 through the sixth image part RG6 continue to be displayed, while the seventh image part RG7 through the eleventh image part RG11 are concealed by the right local image SGR to be invisible.
  • Thus, the image generating system SYS displays the right local image SGR over image parts showing the right local area in the right peripheral image RG. Therefore, the operator can intuitively understand that the right local area is shown in the right local image SGR. Furthermore, by displaying the right local image SGR, the right local area can be enlarged and displayed compared with the case where the right peripheral image RG is displayed. Therefore, it is possible to more clearly show the operator the state of the right local area fed with a paving material by the right screw.
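  • The layout of FIG. 7 can be sketched as follows, assuming eleven equal horizontal strips and simple nearest-neighbour enlargement; neither assumption is prescribed by this embodiment.

    import numpy as np

    def overlay_local_image(peripheral, part_index=8, cover_from=6, cover_to=10):
        # peripheral: HxWx3 uint8 array holding the right peripheral image RG.
        # part_index 8 = ninth image part RG9; cover_from..cover_to = RG7..RG11.
        h = peripheral.shape[0]
        strip_h = h // 11
        strips = [peripheral[i * strip_h:(i + 1) * strip_h] for i in range(11)]

        local = strips[part_index]
        target_h = (cover_to - cover_from + 1) * strip_h
        rows = np.arange(target_h) * local.shape[0] // target_h   # vertical enlargement
        enlarged = local[rows]

        output = peripheral.copy()
        # Paste the enlarged local image where RG7 through RG11 were displayed;
        # RG1 through RG6 remain visible above it.
        output[cover_from * strip_h:cover_from * strip_h + target_h] = enlarged
        return output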
  • Next, yet another example of the display of the second output image is described with reference to FIG. 8. FIG. 8 is yet another example of the display of the second output image displayed on the display device 52. The second output image of FIG. 8 differs from the second output image of FIG. 7 in that the right local image SGR is displayed to cover not part but the entirety of the right peripheral image RG, but is otherwise the same. Therefore, a description of the common portion is omitted, and the differences are described in detail. The following description of the right local image SGR also applies to the left local image SGL.
  • The same as in the case of FIG. 7, the right local image SGR corresponds to part of the right peripheral image RG. For example, when the right peripheral image RG is composed of the first image part RG1 through the eleventh image part RG11, the right local image SGR of FIG. 8 corresponds to the ninth image part RG9. Specifically, the image generating system SYS displays the right local image SGR, which is an image into which the ninth image part RG9 is vertically enlarged, where the first image part RG1 through the eleventh image part RG11 have been displayed. That is, the first image part RG1 through the eleventh image part RG11 are concealed by the right local image SGR to be invisible.
  • Thus, the image generating system SYS displays the right local image SGR over the entirety of the right peripheral image RG including an image part showing the right local area. Therefore, the operator can intuitively understand that the right local area is shown in the right local image SGR. Furthermore, by displaying the right local image SGR sized to the overall vertical length of the display device 52, the right local area can be further enlarged and displayed compared with the case of the second output image of FIG. 7. Therefore, it is possible to even more clearly show the operator the state of the right local area fed with a paving material by the right screw.
  • Next, still another example of the display of the second output image is described with reference to FIG. 9. FIG. 9 illustrates still another example of the second output image displayed on the display device 52. The second output image of FIG. 9 is the same as the second output image of FIG. 8 except that the right local image SGR is generated using not part but the entirety of the right peripheral image RG and that an indicator BG is displayed. Therefore, a description of the common portions is omitted, and the differences are described in detail. The following description of the right local image SGR also applies to the left local image SGL.
  • Unlike in the case of FIG. 8, the right local image SGR corresponds to the entirety of the right peripheral image RG. For example, when the right peripheral image RG is composed of the first image part RG1 through the eleventh image part RG11, the right local image SGR of FIG. 9 is generated by vertically enlarging or reducing each of the first image part RG1 through the eleventh image part RG11. Specifically, the right local image SGR of FIG. 9 is composed of images obtained by vertically reducing the first image part RG1 through the sixth image part RG6 and the eleventh image part RG11 and images obtained by vertically enlarging the seventh image part RG7 through the tenth image part RG10. That is, unlike in the case of FIGS. 7 and 8, the view shown in the right peripheral image RG remains visible even when the right local image SGR is displayed.
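  • The FIG. 9 style of display can be thought of as a non-uniform vertical rescaling in which every image part keeps some screen space. The sketch below is a simplified illustration under assumed names and shapes, not the described implementation: each of the eleven image parts is given a weight, low-weight parts are shrunk, high-weight parts are enlarged, and the total image height is preserved.

```python
import numpy as np

def accordion_scale(peripheral: np.ndarray, weights) -> np.ndarray:
    """Re-render a peripheral image so every horizontal strip stays visible
    but strips of interest get more vertical space (weight > 1 enlarges,
    weight < 1 reduces). Total height is preserved."""
    h, w = peripheral.shape[:2]
    n = len(weights)
    src_edges = np.linspace(0, h, n + 1).astype(int)

    # Output strip heights proportional to the weights, summing to h.
    weights = np.asarray(weights, dtype=float)
    out_heights = np.floor(weights / weights.sum() * h).astype(int)
    out_heights[-1] += h - out_heights.sum()                  # absorb rounding

    rows = []
    for i, oh in enumerate(out_heights):
        strip = peripheral[src_edges[i]:src_edges[i + 1]]
        idx = np.arange(oh) * strip.shape[0] // max(oh, 1)    # resample rows
        rows.append(strip[idx])
    return np.concatenate(rows, axis=0)

# Example: RG1..RG6 and RG11 shrink while RG7..RG10 grow (a FIG. 9-style view).
# out = accordion_scale(img, weights=[0.5] * 6 + [2.0] * 4 + [0.5])
```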
  • The indicator BG is a graphic image representing the state of enlargement or reduction of the image parts of the local image SG relative to the corresponding image parts of the peripheral image. According to the illustration of FIG. 9, a right indicator BGR represents the state of enlargement or reduction of the image parts of the right local image SGR relative to the corresponding image parts of the right peripheral image RG. Specifically, the right indicator BGR is a vertically elongated bar-shaped indicator composed of eleven rectangular segments corresponding to the first image part RG1 through the eleventh image part RG11, and is displayed at the right end of the screen. When a left indicator is displayed, the left indicator may be displayed at the left end of the screen. The left indicator represents the state of enlargement or reduction of the image parts of the left local image SGL relative to the corresponding image parts of the left peripheral image LG. In either case, a vertically longer rectangular segment of the bar-shaped indicator represents a higher enlargement rate, and a vertically shorter rectangular segment represents a higher reduction rate. The display of the indicator BG may be omitted.
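  • If the same per-part weights used for the rescaling are reused for the indicator BG, the height of each rectangular segment can simply be made proportional to its part's weight, so taller segments mark enlarged parts and shorter segments mark reduced parts. A minimal sketch of that idea follows; the function and its arguments are hypothetical, not the patent's implementation.

```python
def indicator_segments(weights, bar_height: int):
    """Heights of the indicator's rectangular segments: one segment per image
    part, taller for enlarged parts and shorter for reduced parts."""
    total = float(sum(weights))
    heights = [int(round(w / total * bar_height)) for w in weights]
    heights[-1] += bar_height - sum(heights)   # keep the bar exactly bar_height
    return heights

# Example: eleven segments for an indicator bar 440 pixels tall.
# print(indicator_segments([0.5] * 6 + [2.0] * 4 + [0.5], 440))
```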
  • Thus, the same as in the case of FIG. 8, the image generating system SYS displays the right local image SGR over the entirety of the right peripheral image RG including an image part showing the right local area. Therefore, the operator can intuitively understand that the right local area is shown in the right local image SGR. Furthermore, the same as in the case of FIG. 8, by displaying the right local image SGR over the overall vertical length of the display device 52, the right local area can be enlarged and displayed compared with the case of the second output image of FIG. 7. Therefore, it is possible to more clearly show the operator the state of the right local area fed with a paving material by the right screw. Furthermore, unlike in the case of FIG. 8, it is made possible for the operator to continuously see a view shown in the right peripheral image RG while enlarging and displaying the right local area. Therefore, the operator can check the state of the right local area while watching a worker to the right of the asphalt finisher 100, for example.
  • According to the above-described configuration, the image generating system SYS can generate an output image that makes it possible for the operator to intuitively understand the positional relationship between the asphalt finisher 100 and a worker working in its surrounding area, etc., based on input images captured by multiple cameras.
  • Furthermore, the image generating system SYS displays the hopper image HG, the left peripheral image LG, the right peripheral image RG, and the illustration image 1CG so that the operator can be presented with an image showing an aerial view of the asphalt finisher 100 and its surrounding area from above. This makes it possible for the operator to visually check a blind area around the asphalt finisher 100 without leaving the operator seat 1S. As a result, the image generating system SYS can improve the safety and operability of the asphalt finisher 100. Specifically, the image generating system SYS can show the operator the amount of a paving material remaining in the hopper 2, the position of a feature (for example, a maintenance hole) in a road surface to be paved, etc. Furthermore, the image generating system SYS can show the operator the position of a worker working around the asphalt finisher 100, etc. Therefore, the operator can look at the display device 52 and check the position of a worker, etc., and thereafter perform various operations such as opening or closing the hopper, extending or retracting the screed, and raising or lowering the screed 3. Furthermore, the operator can suspend various operations or stop the asphalt finisher when finding that the positional relationship between a worker and the hopper, the screed, or a dump truck is dangerous.
  • Furthermore, the image generating system SYS displays a peripheral image that is an image showing the surrounding area of the asphalt finisher 100 and displays a local image that is an image showing an area fed with a paving material by the screw SC in a highlighted manner. Displaying the local image in a highlighted manner includes displaying the local image in a different display frame, enlarging and displaying the local image, and changing the mode of display of the display frame of the local image. According to this process, the image generating system SYS can clearly show the operator the state of a predetermined local area as well as the state of the surrounding area of the asphalt finisher 100.
  • The image generating system SYS may include the right camera 51R serving as a first camera to capture an image of an area to the right of the asphalt finisher 100 and the right auxiliary camera 51V serving as a second camera to capture an image of an area fed with a paving material by the right screw. In this case, the right peripheral image RG that is a component of the peripheral image may be generated based on an image captured by the right camera 51R and the right local image SGR composing the local image may be generated based on an image captured by the right auxiliary camera 51V.
  • Furthermore, the image generating system SYS may include the left camera 51L serving as a first camera to capture an image of an area to the left of the asphalt finisher 100 and the left auxiliary camera 51U serving as a second camera to capture an image of an area fed with a paving material by the left screw. In this case, the left peripheral image LG that is a component of the peripheral image may be generated based on an image captured by the left camera 51L and the left local image SGL composing the local image may be generated based on an image captured by the left auxiliary camera 51U.
  • The image generating system SYS may include the right camera 51R that captures an image of an area to the right of the asphalt finisher 100 and the right local area fed with a paving material by the right screw. In this case, the right peripheral image RG and the right local image SGR may be generated based on an image captured by the right camera 51R. In this case, the right auxiliary camera 51V may be omitted.
  • Furthermore, the image generating system SYS may include the left camera 51L that captures an image of an area to the left of the asphalt finisher 100 and the left local area fed with a paving material by the left screw. In this case, the left peripheral image LG and the left local image SGL may be generated based on an image captured by the left camera 51L. In this case, the left auxiliary camera 51U may be omitted.
  • The local image may be displayed without overlapping the peripheral image as illustrated in FIGS. 4A through 4C, for example, or may be displayed over the peripheral image as illustrated in FIGS. 6 through 8, for example.
  • The display device 52 may display the indicator BG that represents the state of enlargement or reduction of the local image SG. By looking at the indicator BG, the operator can instantaneously understand the state of enlargement or reduction of each image part of the local image SG.
  • An asphalt finisher that displays images captured by a front camera, a left camera, and a right camera side by side around a computer-generated graphic of a tractor has been known (see Patent Document 1). The front camera is attached to the upper end of the front of the tractor to capture an image of the inside of a hopper in front of the tractor. The left camera is attached to the upper end of the left side of the tractor to capture an image of a space to the left of the asphalt finisher. The right camera is attached to the upper end of the right side of the tractor to capture an image of a space to the right of the asphalt finisher.
  • According to the above-described configuration, however, a space around the front end of the hopper wings is a blind area created by the hopper wings when viewed from the front camera, the left camera, and the right camera. Therefore, an operator of the asphalt finisher cannot understand the state of the space around the front end of the hopper wings by looking at the displayed image. The operator has to directly check, with her or his own eyes, the safety of a space that is invisible in the image.
  • In view of the above, it is desired to provide a road machine that further reduces an area that is invisible in an image.
  • FIG. 10 is a side view of the asphalt finisher 100 that is an example of a road machine according to an embodiment of the invention. FIG. 11 is a plan view of the asphalt finisher 100. The asphalt finisher 100 mainly includes the tractor 1, the hopper 2, and the screed 3. In the following, the direction of the hopper 2 as viewed from the tractor 1 is defined as a forward direction (the +X direction), and the direction of the screed 3 as viewed from the tractor 1 is defined as a rearward direction (the -X direction).
  • The tractor 1 is a vehicle for causing the asphalt finisher 100 to travel. According to this embodiment, the tractor 1 rotates rear wheels 5 using rear wheel traveling hydraulic motors and rotates front wheels 6 using front wheel traveling hydraulic motors to move the asphalt finisher 100. The rear wheel traveling hydraulic motors and the front wheel traveling hydraulic motors are supplied with hydraulic oil from a hydraulic pump to rotate. The rear wheels 5 and the front wheels 6 may be replaced with crawlers. Furthermore, the tractor 1 includes a canopy 1C. The canopy 1C is attached to the top of the tractor 1.
  • The controller 50 is a control device to control the asphalt finisher 100. According to this embodiment, the controller 50 is composed of a microcomputer including a CPU, a volatile memory, and a nonvolatile memory, and is installed in the tractor 1. The CPU executes programs stored in the nonvolatile memory to implement various functions of the controller 50.
  • The hopper 2 is a mechanism for receiving a paving material, and mainly includes hopper wings 20 and hopper cylinders 24. According to this embodiment, the hopper 2 is installed in front of the tractor 1 and receives a paving material in the hopper wings 20. The hopper wings 20 include a left hopper wing 20L that can be opened and closed in the Y-axis directions (widthwise of the vehicle) by a left hopper cylinder 24L and a right hopper wing 20R that can be opened and closed in the Y-axis directions (widthwise of the vehicle) by a right hopper cylinder 24R. Normally, the asphalt finisher 100 fully opens the hopper wings 20 to receive a paving material (for example, an asphalt mixture) from the bed of a dump truck. FIG. 11 illustrates that the hopper wings 20 are fully open. When the paving material in the hopper 2 decreases, the hopper wings 20 are closed to gather the paving material near the inner wall of the hopper 2 toward the center of the hopper 2, so that the conveyor CV in the center of the hopper 2 can continuously feed the paving material to the back of the tractor 1, that is, so that the paving material can be kept piled on the conveyor CV. Thereafter, the paving material fed to the back of the tractor 1 is laid and spread over the width of the vehicle behind the tractor 1 and in front of the screed 3 by the screw SC. According to this embodiment, the screw SC has extension screws laterally coupled.
  • The screed 3 is a mechanism for spreading and smoothing a paving material. According to this embodiment, the screed 3 includes a front screed 30 and rear screeds 31. The screed 3 is a free floating screed towed by the tractor 1, and is coupled to the tractor 1 via leveling arms 3A. The rear screeds 31 include the left rear screed 31L and the right rear screed 31R. The left rear screed 31L extends and retracts widthwise of the vehicle using a left screed extendable and retractable cylinder 26L, and the right rear screed 31R extends and retracts widthwise of the vehicle using a right screed extendable and retractable cylinder 26R.
  • The image capturing devices 51 are devices to capture images. According to this embodiment, the image capturing devices 51 are monocular cameras, and are connected to the controller 50 wirelessly or by wire. The controller 50 can generate an overhead view image by performing a viewpoint changing process on images captured by the image capturing devices 51. The overhead view image is, for example, an image of a space around the asphalt finisher 100 as virtually viewed from substantially directly above. The image capturing devices 51 may be stereo cameras. According to this embodiment, the image capturing devices 51 include the front camera 51F, the left camera 51L, the right camera 51R, and a back camera 51B. The back camera 51B may be omitted.
  • The front camera 51F captures an image of a space in front of the asphalt finisher 100. According to this embodiment, the front camera 51F is attached to a hood forming the front of the tractor 1 so as to be able to capture an image of the inside of the hopper 2 that is a blind area from the viewpoint of the operator seated in the operator seat 1S (hereinafter, "operator seat viewpoint"). The front camera 51F may be attached to the front edge of the top plate of the canopy 1C. In FIG. 11, a gray area Z1 indicates the imaging range of the front camera 51F.
  • The left camera 51L captures an image of a space to the left of the asphalt finisher 100. According to this embodiment, the left camera 51L is attached to the end of a bar member BL extending in the +Y direction (leftward) from the left edge of the top plate of the canopy 1C so as to be able to capture an image of a space outside the left hopper wing 20L in the vehicle width direction that is a blind area from the operator seat viewpoint. The left camera 51L may be attached to the end of the bar member BL extending in the +Y direction (leftward) from the left side of the tractor 1. For example, the left camera 51L is so attached as to protrude outward (leftward) in the vehicle width direction relative to the left end of the fully opened left hopper wing 20L. In FIG. 11, a gray area Z2 indicates the imaging range of the left camera 51L.
  • The right camera 51R captures an image of a space to the right of the asphalt finisher 100. According to this embodiment, the right camera 51R is attached to the end of a bar member BR extending in the -Y direction (rightward) from the right edge of the top plate of the canopy 1C so as to be able to capture an image of a space outside the right hopper wing 20R in the vehicle width direction that is a blind area from the operator seat viewpoint. The right camera 51R may be attached to the end of the bar member BR extending in the -Y direction (rightward) from the right side of the tractor 1. For example, the right camera 51R is so attached as to protrude outward (rightward) in the vehicle width direction relative to the right end of the fully opened right hopper wing 20R. In FIG. 11, a gray area Z3 indicates the imaging range of the right camera 51R.
  • The bar members BL and BR are desirably removable. The bar members BL and BR may be extendable and retractable. This is for handling the case of transporting the asphalt finisher 100 in a trailer or the like.
  • The back camera 51B captures an image of a space behind the asphalt finisher 100. According to this embodiment, the back camera 51B is attached to the rear edge of the top plate of the canopy 1C so as to be able to capture an image of a space behind the screed 3 that is a blind area from the operator seat viewpoint. In FIG. 11, a gray area Z4 indicates the imaging range of the back camera 51B.
  • The imaging range of the front camera 51F and the imaging range of the left camera 51L may overlap. The imaging range of the back camera 51B and the imaging range of the left camera 51L do not have to overlap. The same applies to the imaging range of the right camera 51R.
  • The controller 50 generates an overhead view image by changing the viewpoints of and synthesizing the respective captured images of the front camera 51F, the left camera 51L, the right camera 51R, and the back camera 51B. The overhead view image includes an image of the internal space of the hopper 2, a space to the left of the left hopper wing 20L, a space to the right of the right hopper wing 20R, and a space behind the screed 3 as virtually viewed from substantially directly above and a computer graphics image (hereinafter, "model image") of the asphalt finisher 100. The controller 50 may generate an overhead view image by changing the viewpoints of and synthesizing the respective captured images of the three cameras of the front camera 51F, the left camera 51L, and the right camera 51R. That is, the overhead view image may be generated without using the back camera 51B.
  • The display device 52 is a device to display various images. According to this embodiment, the display device 52 is a liquid crystal display, and is connected to the controller 50 wirelessly or by wire. The display device 52 can display an image captured by each of the image capturing devices 51, and is placed at such a position as to make it easy for an operator seated in the operator seat 1S to look at the display device 52. The display device 52 may be placed where a rear controller is. For example, the controller 50 displays an image generated by performing a viewpoint changing process on images captured by the image capturing devices 51 on the display device 52.
  • Next, a display system GS installed in the asphalt finisher 100 is described with reference to FIG. 12. FIG. 12 is a block diagram illustrating an example configuration of the display system GS. The display system GS mainly includes the controller 50, the image capturing devices 51, the display device 52, an information obtaining device 53, and the storage device 54. For example, the display system GS generates an image for display (hereinafter, "output image") based on images captured by the image capturing devices 51 (hereinafter, "input images") and displays the output image on the display device 52.
  • The information obtaining device 53 obtains information and outputs the obtained information to the controller 50. The information obtaining device 53 includes at least one of, for example, a hopper cylinder stroke sensor, screed extendable and retractable cylinder stroke sensors, a steering angle sensor, a travel speed sensor, and a positioning sensor. The hopper cylinder stroke sensor detects the stroke amount of the hopper cylinders 24. The screed extendable and retractable cylinder stroke sensors detect the stroke amounts of screed extendable and retractable cylinders 26. The steering angle sensor detects the steering angle of the front wheels 6. The travel speed sensor detects the travel speed of the asphalt finisher 100. The positioning sensor is, for example, a GNSS compass, and detects the position (latitude, longitude, and altitude) and the orientation of the asphalt finisher 100.
  • The storage device 54 is a device for storing various kinds of information. According to this example, the storage device 54 is a nonvolatile storage device that stores the input image-output image correspondence map 54a in such a manner as to allow reference to the input image-output image correspondence map 54a.
  • The input image-output image correspondence map 54a stores the correspondence between the coordinates in the input image planes and the coordinates in the output image plane. The correspondence is preset based on various parameters of the image capturing devices 51, such as an optical center, a focal length, a CCD size, an optical axis direction vector, a camera horizontal direction vector, and a projection method, so that a viewpoint can be changed as desired. The correspondence is so set as to prevent an apparent distortion or tilt/shift from appearing in the output image.
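  • One common way to preset such a correspondence is sketched below purely as an illustration (a flat-ground assumption, a pinhole camera model, and all names and shapes are assumptions introduced here, not the patent's method): every ground-plane point represented by an output pixel is projected through the camera's extrinsic and intrinsic parameters, and the resulting input-image coordinates are recorded.

```python
import numpy as np

def build_corr_map(out_shape, metres_per_px, cam_R, cam_t, K):
    """Precompute, for every output (overhead view) pixel, the input-image
    pixel it should take its value from, assuming a flat ground plane and a
    pinhole camera with intrinsics K, rotation cam_R (world -> camera) and
    translation cam_t."""
    out_h, out_w = out_shape
    # Ground-plane world points (Z = 0), centred on the machine.
    ys, xs = np.mgrid[0:out_h, 0:out_w]
    world = np.stack([(xs - out_w / 2) * metres_per_px,
                      (ys - out_h / 2) * metres_per_px,
                      np.zeros_like(xs, dtype=float)], axis=-1)

    cam = world @ cam_R.T + cam_t          # world -> camera coordinates
    uvw = cam @ K.T                        # pinhole projection
    u = uvw[..., 0] / uvw[..., 2]          # input-image column per output pixel
    v = uvw[..., 1] / uvw[..., 2]          # input-image row per output pixel
    valid = uvw[..., 2] > 0                # points in front of the camera
    # A real map would also discard (u, v) falling outside the input image.
    return u, v, valid
```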
  • The controller 50 includes a viewpoint changing part 50a and an auxiliary line creating part 50b. The viewpoint changing part 50a and the auxiliary line creating part 50b are implemented in software, hardware, or firmware.
  • The viewpoint changing part 50a is a functional element to generate the output image. According to this embodiment, the viewpoint changing part 50a refers to the input image-output image correspondence map 54a stored in the storage device 54 to correlate the coordinates in the input image planes in which input images captured by the image capturing devices 51 are positioned with the coordinates in the output image plane in which the overhead view image as the output image is positioned. Specifically, the viewpoint changing part 50a generates the output image by correlating the values (for example, luminance value, hue value, chromatic value, etc.) of pixels in the input images with the values of pixels in the output image.
  • The auxiliary line creating part 50b is a functional element to create auxiliary lines to be displayed over the output image. According to this embodiment, the auxiliary line creating part 50b creates auxiliary lines such that the auxiliary lines match the overhead view image generated by the viewpoint changing part 50a. Examples of auxiliary lines include an auxiliary line indicating an expected pavement trajectory that is the expected trajectory of an end of the screed 3 and an auxiliary line indicating an expected travel trajectory that is the expected trajectory of a wheel.
  • According to this embodiment, the controller 50 refers to the input image-output image correspondence map 54a through the viewpoint changing part 50a. Then, the controller 50 obtains the values (for example, luminance value, hue value, chromatic value, etc.) of coordinates in the input image planes corresponding to the coordinates in the output image plane, and adopts the obtained values as the values of the corresponding coordinates in the output image plane.
  • Thereafter, the controller 50 determines whether the values of all the coordinates in the output image plane are correlated with the values of coordinates in the input image planes. In response to determining that some coordinates are not yet correlated, the controller 50 repeats the above-described process.
  • In response to determining that the values of all the coordinates are correlated, the controller 50 displays auxiliary lines indicating expected pavement trajectories, auxiliary lines indicating expected travel trajectories, etc., over the output image. According to this embodiment, the positions in the output image where the auxiliary lines are to be superposed are preset, but they may alternatively be derived dynamically. Furthermore, the controller 50 may correlate the coordinates in the input image planes with the coordinates in the output image plane after displaying the auxiliary lines.
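  • The following separate, simplified sketch shows, under assumed data structures (the map layout, pixel-wise line drawing, and all function names are hypothetical rather than the described implementation), how such a precomputed correspondence map might be applied: every output pixel is filled with the value of its correlated input pixel, and the auxiliary lines are then superposed on the finished output image.

```python
import numpy as np

def render_output(inputs, corr_map, aux_lines):
    """Build the overhead view from a precomputed correspondence map, then
    draw auxiliary lines on top.

    inputs    -- dict: camera id -> input image of shape (H, W, 3)
    corr_map  -- three arrays of shape (out_h, out_w): camera id, source row,
                 and source column for every output pixel (the "input
                 image-output image correspondence map"); id < 0 marks output
                 pixels with no source
    aux_lines -- list of (list of (row, col) points, BGR colour) to overlay
    """
    cam_id, src_r, src_c = corr_map
    out = np.zeros(cam_id.shape + (3,), dtype=np.uint8)

    # Correlate every output coordinate with the value of its input coordinate.
    for cid, img in inputs.items():
        mask = cam_id == cid
        out[mask] = img[src_r[mask], src_c[mask]]

    # Superpose the auxiliary lines (expected pavement / travel trajectories).
    for points, colour in aux_lines:
        for r, c in points:
            out[int(r), int(c)] = colour
    return out
```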
  • Next, the overhead view image generated using input images captured by the four image capturing devices 51 (the front camera 51F, the left camera 51L, the right camera 51R, and the back camera 51B) installed in the asphalt finisher 100 is described with reference to FIGS. 13A and 13B. FIGS. 13A and 13B are diagrams illustrating an example of the overhead view image. Specifically, FIG. 13A illustrates an example of the overhead view image displayed on the display device 52. FIG. 13B illustrates the segments of input images used for generating the overhead view image of FIG. 13A.
  • The overhead view image of FIG. 13A is an image of a space around the asphalt finisher 100 as virtually viewed from substantially directly above. The overhead view image of FIG. 13A mainly includes an image G1 (see the hatched area) generated by the viewpoint changing part 50a and a model image CG1.
  • The model image CG1 is an image representing the asphalt finisher 100, and includes a model image CGa of the hopper 2, a model image CGb of the tractor 1, and a model image CGc of the screed 3.
  • The model image CGa of the hopper 2 includes a model image WL of the left hopper wing 20L and a model image WR of the right hopper wing 20R. The model images WL and WR change in shape according to the output of the hopper cylinder stroke sensor. FIG. 13A illustrates the model image CGa when each of the left hopper wing 20L and the right hopper wing 20R is fully open. Part of the overhead view image (an image of the inside of the hopper 2 as virtually viewed from substantially directly above) is placed in the hatched area between the model image WL and the model image WR. The model image CGa of the hopper 2 may be omitted. In this case, part of the overhead view image based on an image captured by the front camera 51F is placed in that area.
  • The model image CGc of the screed 3 includes a model image SL of the left rear screed 31L and a model image SR of the right rear screed 31R. The model images SL and SR change in shape according to the outputs of the screed extendable and retractable cylinder stroke sensors. FIG. 13A illustrates the model image CGc when each of the left rear screed 31L and the right rear screed 31R is fully extended. The model image CGc of the screed 3 may be omitted. In this case, part of the overhead view image based on an image captured by the back camera 51B is placed in that area.
  • The image G1 is an image generated using the respective captured input images of the four image capturing devices 51. According to this embodiment, the image G1 includes an image Ga of a worker in front and to the left of the asphalt finisher 100 and an image Gb of a maintenance hole cover in front and to the right of the asphalt finisher 100. As illustrated in FIG. 13B, the controller 50 generates the image G1 by synthesizing a front image R1, a left image R2, a right image R3, and a rear image R4. The worker image Ga is included in the left image R2, and the maintenance hole cover image Gb is included in the right image R3.
  • The front image R1 is an image generated based on an input image captured by the front camera 51F. According to this embodiment, the front image R1 includes an image showing the state of the inside of the hopper 2 as viewed down from the tractor 1 side. The controller 50 generates the front image R1 by clipping and performing a viewpoint changing process on part of the input image captured by the front camera 51F. The front image R1 is placed between and on the upper side of the model image WL and the model image WR. The front image R1 may change in shape as the model images WL and WR change in shape.
  • The left image R2 is an image generated based on an input image captured by the left camera 51L. According to this embodiment, the left image R2 includes an image of a space on the outer side (to the left) of the left hopper wing 20L in the vehicle width direction. The controller 50 generates the left image R2 by clipping and performing a viewpoint changing process by the viewpoint changing part 50a on part of the input image captured by the left camera 51L. The left image R2 is placed to the left of the model image CG1. The left image R2 may change in shape as the model images WL and SL change in shape.
  • The right image R3 is an image generated based on an input image captured by the right camera 51R. According to this embodiment, the right image R3 includes an image of a space on the outer side (to the right) of the right hopper wing 20R in the vehicle width direction. The controller 50 generates the right image R3 by clipping and performing a viewpoint changing process by the viewpoint changing part 50a on part of the input image captured by the right camera 51R. The right image R3 is placed to the right of the model image CG1. The right image R3 may change in shape as the model images WR and SR change in shape.
  • The rear image R4 is an image generated based on an input image captured by the back camera 51B. According to this embodiment, the rear image R4 includes an image showing the state of the screed 3 as viewed down from the tractor 1 side. The controller 50 generates the rear image R4 by clipping and performing a viewpoint changing process on part of the input image captured by the back camera 51B. The rear image R4 is placed on the lower side of the model image CGc of the screed 3. The rear image R4 may change in shape as the model images SL and SR change in shape.
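  • As a simple illustration of how the four region images and the model image might be arranged into one output image (the placement coordinates, canvas size, and names below are assumptions, not the described implementation), the region images already warped to the overhead viewpoint can be pasted onto a canvas and the model image drawn over the center last.

```python
import numpy as np

def compose_overhead(canvas_shape, regions, model_image, model_pos):
    """Paste viewpoint-changed region images (front R1, left R2, right R3,
    rear R4) and the model image CG1 onto one output canvas.

    regions   -- list of (image, (top, left)) placements, each image already
                 warped to the overhead viewpoint and sized to fit the canvas
    model_pos -- (top, left) of the model image of the asphalt finisher
    """
    canvas = np.zeros(canvas_shape, dtype=np.uint8)

    def paste(dst, img, top, left):
        h, w = img.shape[:2]
        dst[top:top + h, left:left + w] = img

    for img, (top, left) in regions:
        paste(canvas, img, top, left)
    paste(canvas, model_image, *model_pos)   # model image drawn last, on top
    return canvas
```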
  • According to this embodiment, the correspondence between the coordinates in the respective input image planes of the four cameras and the coordinates in the output image plane is stored in the input image-output image correspondence map 54a with the effects of the viewpoint changing process incorporated in advance. Therefore, the controller 50 can correlate coordinates in the input image planes with coordinates in the output image plane simply by referring to the input image-output image correspondence map 54a. As a result, the controller 50 can generate and display an output image at a relatively low computational load.
  • Furthermore, according to this embodiment, each of the front image R1, the left image R2, the right image R3, and the rear image R4 is generated based on an input image captured by a corresponding single camera. The invention, however, is not limited to this configuration. For example, each of the front image R1, the left image R2, the right image R3, and the rear image R4 may be generated based on input images captured by two or more cameras.
  • As described above, the asphalt finisher 100 includes the multiple image capturing devices 51 attached to the tractor 1, the controller 50 that generates an overhead view image by changing the viewpoints of and synthesizing the respective captured images of the image capturing devices 51, and the display device 52 that displays the overhead view image on a screen. The overhead view image is configured to include an image of a space around the asphalt finisher 100 as virtually viewed from substantially directly above. Therefore, it is possible to further reduce the blind area, that is, the area that is invisible in the output image. As a result, for example, it is possible to show an operator of the asphalt finisher 100 the state of a space around the front end of the hopper wings 20. It is also possible to show the operator the state of a paving material in the hopper 2, the state of a space behind the screed 3, etc.
  • Furthermore, by showing the overhead view image, the asphalt finisher 100 can improve visibility, safety, operability and work efficiency. Specifically, the asphalt finisher 100 can cause the operator to intuitively understand the amount of a paving material remaining in the hopper 2, the position of a feature (for example, a maintenance hole) in a road surface to be paved, etc. Therefore, the operator can look at the overhead view image and check the position of a feature or a worker, and thereafter perform various operations such as opening or closing the hopper wings 20.
  • Next, other examples of the output image are described with reference to FIGS. 14A and 14B. FIGS. 14A and 14B illustrate examples where auxiliary lines are displayed over the image G1 generated by the viewpoint changing part 50a. Specifically, FIG. 14A illustrates an overhead view image of the asphalt finisher 100 that is traveling straight. FIG. 14B illustrates an overhead view image of the asphalt finisher 100 that is turning right.
  • According to the illustration of FIGS. 14A and 14B, the auxiliary line creating part 50b creates auxiliary lines L1 and L2 based on the respective outputs of the steering angle sensor and the travel speed sensor. The auxiliary line L1 is the expected travel trajectory of a left rear wheel 5L, and the auxiliary line L2 is the expected travel trajectory of a right rear wheel 5R. According to this embodiment, the auxiliary lines L1 and L2 are derived based on a current steering angle and travel speed, but may be derived based solely on the steering angle. The auxiliary lines L1 and L2 represent, for example, travel trajectories during a period from a current point of time until a predetermined time (for example, several tens of seconds) passes.
  • Furthermore, the auxiliary line creating part 50b creates auxiliary lines L3 and L4, additionally referring to the outputs of the screed extendable and retractable cylinder stroke sensors. The auxiliary line L3 is the expected trajectory of the left end of the left rear screed 31L, and the auxiliary line L4 is the expected trajectory of the right end of the right rear screed 31R. According to this embodiment, the auxiliary lines L3 and L4 are derived based on a current steering angle, travel speed, and stroke amounts of the screed extendable and retractable cylinders 26. The auxiliary lines L3 and L4 represent trajectories during a period from a current point of time until a predetermined time (for example, several tens of seconds) passes.
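  • A minimal sketch of how such expected trajectories could be derived from the sensor outputs follows. It uses a simple kinematic bicycle model with constant steering angle and speed, ignores the longitudinal offset of the screed behind the rear axle for brevity, and all parameter names are assumptions rather than the patent's implementation; the screed half-width would, per the text, come from the screed extendable and retractable cylinder stroke sensors.

```python
import numpy as np

def expected_trajectories(steer_rad, speed, horizon_s, wheelbase,
                          track_half, screed_half_width, dt=0.1):
    """Predict the paths of the left/right rear wheels (auxiliary lines L1, L2)
    and of the left/right ends of the rear screeds (L3, L4) over horizon_s
    seconds, in machine coordinates (x forward, y left)."""
    x = y = yaw = 0.0
    pts = {"L1": [], "L2": [], "L3": [], "L4": []}
    offsets = {"L1": track_half, "L2": -track_half,
               "L3": screed_half_width, "L4": -screed_half_width}

    for _ in range(int(horizon_s / dt)):
        # Kinematic bicycle model: reference point at the rear axle centre.
        x += speed * np.cos(yaw) * dt
        y += speed * np.sin(yaw) * dt
        yaw += speed / wheelbase * np.tan(steer_rad) * dt

        left = np.array([-np.sin(yaw), np.cos(yaw)])   # unit vector to the left
        for name, off in offsets.items():
            pts[name].append((x + off * left[0], y + off * left[1]))
    return pts

# Example: 30 s horizon, slight right turn, 0.5 m/s paving speed.
# trajs = expected_trajectories(-0.05, 0.5, 30.0, 3.0, 1.0, 2.5)
```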
  • Furthermore, the auxiliary line creating part 50b creates auxiliary lines L5 and L6 based on road design data and the output of the positioning sensor. The road design data are data related to a road that is a construction target, and are prestored in a nonvolatile storage device, for example. The road design data include, for example, data on the position of a feature in a road surface to be paved. The auxiliary line L5 represents the left edge of the road that is a construction target, and the auxiliary line L6 represents the right edge of the road that is a construction target.
  • Furthermore, the controller 50 displays an image of a feature such as a maintenance hole over the overhead view image based on the road design data and the output of the positioning sensor. According to the illustration of FIGS. 14A and 14B, the controller 50 displays a model image CG2 of a maintenance hole cover over the overhead view image.
  • By looking at the overhead view image over which auxiliary lines, etc., are displayed, the operator of the asphalt finisher 100 can determine whether the current steering angle, travel speed, amounts of extension or retraction of the rear screeds 31, etc., of the asphalt finisher 100 are appropriate. For example, by looking at the overhead view image of FIG. 14A, the operator can be aware that if the asphalt finisher 100 is kept traveling straight, the right rear wheel 5R will run over the maintenance hole cover and the road will not be paved as designed. In contrast, by looking at the overhead view image of FIG. 14B, the operator can be aware that if a current steering condition (condition in which a steering wheel is turned to the right) and travel speed are maintained, the right rear wheel 5R can avoid running over the maintenance hole cover and the road will be paved as designed.
  • While the controller 50 displays the expected travel trajectories of the rear wheels 5 according to the illustration of FIGS. 14A and 14B, it may instead display the expected travel trajectories of the front wheels 6, or may display the expected travel trajectories of both the rear wheels 5 and the front wheels 6.
  • According to the above-described configuration, in addition to the effects of the overhead view image described with reference to FIGS. 13A and 13B, the controller 50 can achieve the additional effect of being able to present the movement of the asphalt finisher 100 during a period from a current point of time until a predetermined time passes to the operator in advance.
  • Preferred embodiments of the invention are described above. The invention, however, is not limited to the above-described embodiments. Various variations, replacements, etc., may be applied to the above-described embodiments without departing from the scope of the invention. Furthermore, the features described with reference to the above-described embodiments may be suitably combined as long as no technical contradiction is caused.
  • For example, while an image part of the local image SG is enlarged or reduced in size only vertically according to the above-described embodiments, the image part may be enlarged or reduced in size only laterally, or may be enlarged or reduced in size both vertically and laterally. In the case where the image part is enlarged or reduced in size only laterally, the indicator BG may be a laterally elongated bar-shaped indicator. In the case where the image part is enlarged or reduced in size both vertically and laterally, the indicator BG may be a vertically and laterally elongated matrix-shaped indicator.
  • Furthermore, the asphalt finisher 100 may be a guss asphalt finisher using a guss asphalt mixture. The image generating system SYS may be installed in a guss asphalt finisher using a guss asphalt mixture.
  • The invention is based upon and claims priority to Japanese patent application No. 2017-153668, filed on August 8, 2017, and Japanese patent application No. 2017-164687, filed on August 29, 2017, the entire contents of which are hereby incorporated herein by reference.
  • DESCRIPTION OF THE REFERENCE NUMERALS
  • 1 ... tractor, 1S ... operator seat, 2 ... hopper, 3 ... screed, 3A ... leveling arm, 5 ... rear wheel, 6 ... front wheel, 20 ... hopper wings, 20L ... left hopper wing, 20R ... right hopper wing, 24 ... hopper cylinder, 24L ... left hopper cylinder, 24R ... right hopper cylinder, 26 ... screed extendable and retractable cylinder, 26L ... left screed extendable and retractable cylinder, 26R ... right screed extendable and retractable cylinder, 30 ... front screed, 31 ... rear screed, 31L ... left rear screed, 31R ... right rear screed, 50 ... controller, 50A ... output image generating part, 50B ... highlighting part, 50a ... viewpoint changing part, 50b ... auxiliary line creating part, 51 ... image capturing device, 51B ... back camera, 51F ... front camera, 51L ... left camera, 51R ... right camera, 51U ... left auxiliary camera, 51V ... right auxiliary camera, 52 ... display device, 53 ... information obtaining device, 54 ... storage device, 54a ... input image-output image correspondence map, 55 ... input device, 65 ... operation panel, 70 ... retaining plate, 71 ... side plate, 72 ... mold board, 100 ... asphalt finisher, BL, BR ... bar member, CV ... conveyor, SC ... screw, SYS ... image generating system

Claims (14)

  1. A road machine including a tractor and a screed placed behind the tractor, the road machine comprising:
    a work apparatus configured to feed a paving material in front of the screed; and
    a display device configured to display a peripheral image that is an image showing a surrounding area of the road machine, and to display a local image in a highlighted manner, the local image being an image showing an area fed with the paving material by the work apparatus.
  2. The road machine as claimed in claim 1, comprising:
    a first camera configured to capture an image of an area to a side of the road machine; and
    a second camera configured to capture an image of the area fed with the paving material by the work apparatus,
    wherein the peripheral image is generated based on the image captured by the first camera, and
    the local image is generated based on the image captured by the second camera.
  3. The road machine as claimed in claim 1, comprising:
    a camera configured to capture an image of an area to a side of the road machine and the area fed with the paving material by the work apparatus,
    wherein the peripheral image and the local image are generated based on the image captured by the camera.
  4. The road machine as claimed in claim 1, wherein the local image is displayed without overlapping the peripheral image.
  5. The road machine as claimed in claim 1, wherein the local image is displayed over the peripheral image.
  6. The road machine as claimed in claim 1, wherein the display device is configured to display an indicator representing a state of enlargement or reduction of the local image.
  7. The road machine as claimed in claim 1, wherein
    the screed includes a left rear screed and a right rear screed that are extendable and retractable in a vehicle width direction, and
    the left rear screed and the right rear screed are arranged with an offset from each other in a travel direction.
  8. A road machine comprising:
    a tractor;
    a hopper installed in front of the tractor and configured to receive a paving material in a hopper wing;
    a conveyor configured to feed the paving material in the hopper to a back of the tractor;
    a screw configured to lay and spread the paving material fed by the conveyor behind the tractor;
    a screed configured to spread and smooth the paving material laid and spread by the screw behind the screw;
    a plurality of image capturing devices attached to the tractor; and
    a display device configured to display respective captured images of the image capturing devices,
    wherein at least one of the image capturing devices is placed at such a position as to be able to capture an image of a space outside the hopper wing in a vehicle width direction.
  9. The road machine as claimed in claim 8, wherein one of the image capturing devices is configured to capture an image of a space behind the screed as virtually viewed from substantially directly above.
  10. The road machine as claimed in claim 8, comprising:
    a control device configured to generate an overhead view image by changing viewpoints of and synthesizing the respective captured images of the image capturing devices,
    wherein the display device is configured to display the overhead view image on a screen, and
    the overhead view image includes an image of a space around the road machine as virtually viewed from substantially directly above.
  11. The road machine as claimed in claim 10, wherein the overhead view image includes an image of the space outside the hopper wing in the vehicle width direction as virtually viewed from substantially directly above.
  12. The road machine as claimed in claim 8, wherein the image capturing device is so attached to the tractor as to protrude outward in the vehicle width direction relative to the hopper wing that is fully opened.
  13. The road machine as claimed in claim 8, wherein
    a canopy is attached to the tractor, and
    the image capturing device is attached to the canopy.
  14. The road machine as claimed in claim 8, wherein the display device is configured to display an expected trajectory and an expected travel trajectory of an end of the screed.
EP18843681.0A 2017-08-08 2018-07-31 Road machine Active EP3666977B1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2017153668 2017-08-08
JP2017164687 2017-08-29
PCT/JP2018/028750 WO2019031318A1 (en) 2017-08-08 2018-07-31 Road machine

Publications (3)

Publication Number Publication Date
EP3666977A1 true EP3666977A1 (en) 2020-06-17
EP3666977A4 EP3666977A4 (en) 2021-01-27
EP3666977B1 EP3666977B1 (en) 2023-12-13

Family

ID=65271383

Family Applications (1)

Application Number Title Priority Date Filing Date
EP18843681.0A Active EP3666977B1 (en) 2017-08-08 2018-07-31 Road machine

Country Status (4)

Country Link
EP (1) EP3666977B1 (en)
JP (3) JP7146767B2 (en)
CN (1) CN111032958B (en)
WO (1) WO2019031318A1 (en)

Cited By (5)

Publication number Priority date Publication date Assignee Title
EP3779047A4 (en) * 2018-03-30 2021-04-21 Sumitomo (S.H.I.) Construction Machinery Co., Ltd. Road machine
WO2022100838A1 (en) * 2020-11-12 2022-05-19 Moba Mobile Automation Ag Control system for a construction machine
CN114763701A (en) * 2021-01-14 2022-07-19 现代斗山英维高株式会社 Control system and method for construction machine
US11724652B2 (en) 2019-06-11 2023-08-15 Kubota Corporation Protection mechanism for working vehicle, and working vehicle including the same
US11884141B2 (en) 2019-06-11 2024-01-30 Kubota Corporation Protection mechanism for working vehicle, and working vehicle including the same

Families Citing this family (7)

Publication number Priority date Publication date Assignee Title
WO2020196539A1 (en) * 2019-03-25 2020-10-01 住友建機株式会社 Asphalt finisher
EP3951061A4 (en) * 2019-03-29 2022-06-01 Sumitomo Construction Machinery Co., Ltd. Asphalt finisher
EP3951062A4 (en) * 2019-03-29 2022-05-11 Sumitomo Construction Machinery Co., Ltd. Asphalt finisher
JP7187389B2 (en) * 2019-06-11 2022-12-12 株式会社クボタ Work vehicle protection mechanism and work vehicle equipped with the same
WO2021193351A1 (en) * 2020-03-26 2021-09-30 住友建機株式会社 Asphalt finisher
CN115135832A (en) * 2020-03-27 2022-09-30 住友建机株式会社 Asphalt rolling machine and machine learning device
CN111809481B (en) * 2020-07-21 2022-04-19 三一汽车制造有限公司 Paver material conveying vehicle guiding system, paver and paver material conveying vehicle guiding method

Family Cites Families (26)

Publication number Priority date Publication date Assignee Title
JPS6029941B2 (en) 1979-06-27 1985-07-13 株式会社日立製作所 Composite electrophotographic board
JPH077368Y2 (en) * 1990-11-13 1995-02-22 株式会社新潟鐵工所 Construction monitoring equipment for asphalt finishers
JP2505210Y2 (en) * 1993-04-09 1996-07-24 建設省東北地方建設局長 Automatic steering device for paving vehicles
JPH089849B2 (en) * 1993-04-09 1996-01-31 建設省東北地方建設局長 Automatic control device for asphalt finisher
JPH11217853A (en) * 1998-01-30 1999-08-10 Komatsu Ltd Rearward monitoring device of construction machine and its rearward monitoring method
JP4907883B2 (en) * 2005-03-09 2012-04-04 株式会社東芝 Vehicle periphery image display device and vehicle periphery image display method
JP4776491B2 (en) * 2006-10-06 2011-09-21 日立建機株式会社 Work machine ambient monitoring device
JP4705611B2 (en) * 2007-07-10 2011-06-22 住友建機株式会社 Driver assistance device for paving machines
JP5269026B2 (en) * 2010-09-29 2013-08-21 日立建機株式会社 Work machine ambient monitoring device
JP5809988B2 (en) * 2012-01-10 2015-11-11 日立建機株式会社 Travel support device for work machine
WO2013136374A1 (en) 2012-03-16 2013-09-19 三菱電機株式会社 Driving assistance device
JP5917371B2 (en) 2012-11-08 2016-05-11 住友重機械工業株式会社 Image generator for paving machine and operation support system for paving machine
JP6029941B2 (en) 2012-11-08 2016-11-24 住友重機械工業株式会社 Image generator for paving machines
JP6073182B2 (en) 2013-04-26 2017-02-01 住友重機械工業株式会社 Image generator for paving machine and operation support system for paving machine
JP6279856B2 (en) 2013-08-27 2018-02-14 住友建機株式会社 Excavator display
JP2015104375A (en) * 2013-12-02 2015-06-08 ヤンマー株式会社 Combine harvester
JP6095592B2 (en) * 2014-02-17 2017-03-15 日立建機株式会社 Monitoring image display device for hydraulic excavator
CN104278616A (en) * 2014-09-28 2015-01-14 广东惠利普路桥信息工程有限公司 Driverless paver
JP2016139914A (en) 2015-01-27 2016-08-04 パイオニア株式会社 Display device, portable terminal and control method
DE102015002692A1 (en) * 2015-03-04 2016-09-08 Dynapac Gmbh Road construction machine and method for operating a self-propelled road construction machine
JPWO2016174754A1 (en) 2015-04-28 2018-02-15 株式会社小松製作所 Work machine periphery monitoring device and work machine periphery monitoring method
SE539312C2 (en) 2015-06-10 2017-06-27 Conny Andersson Med Firma Ca Konsult A method of determining the quality of a newly produced asphalt pavement
EP3985184A1 (en) 2015-08-24 2022-04-20 Komatsu Ltd. Control system for wheel loader, control method thereof, and method of controlling wheel loader
JP2017089325A (en) * 2015-11-16 2017-05-25 住友建機株式会社 Asphalt finisher
JP6677534B2 (en) 2016-03-01 2020-04-08 株式会社三共 Slot machine
JP2017164687A (en) 2016-03-16 2017-09-21 中間貯蔵・環境安全事業株式会社 Pcb contaminated equipment dismantling method

Cited By (6)

Publication number Priority date Publication date Assignee Title
EP3779047A4 (en) * 2018-03-30 2021-04-21 Sumitomo (S.H.I.) Construction Machinery Co., Ltd. Road machine
US11724652B2 (en) 2019-06-11 2023-08-15 Kubota Corporation Protection mechanism for working vehicle, and working vehicle including the same
US11884141B2 (en) 2019-06-11 2024-01-30 Kubota Corporation Protection mechanism for working vehicle, and working vehicle including the same
WO2022100838A1 (en) * 2020-11-12 2022-05-19 Moba Mobile Automation Ag Control system for a construction machine
CN114763701A (en) * 2021-01-14 2022-07-19 现代斗山英维高株式会社 Control system and method for construction machine
EP4029998A1 (en) * 2021-01-14 2022-07-20 Hyundai Doosan Infracore Co., Ltd. System and method for controlling a construction machine

Also Published As

Publication number Publication date
CN111032958B (en) 2023-03-10
EP3666977A4 (en) 2021-01-27
WO2019031318A1 (en) 2019-02-14
EP3666977B1 (en) 2023-12-13
JPWO2019031318A1 (en) 2020-09-24
JP2022111290A (en) 2022-07-29
JP2024028606A (en) 2024-03-04
JP7146767B2 (en) 2022-10-04
CN111032958A (en) 2020-04-17

Similar Documents

Publication Publication Date Title
EP3666977B1 (en) Road machine
US10650251B2 (en) Monitoring image display device of industrial machine
EP2918726B1 (en) Paving machine with an image generation device
EP2918725B1 (en) Image generation device for paving machine and operation assistance system for paving device
JP5888956B2 (en) Excavator and surrounding image display method of the excavator
AU2013293921B2 (en) Environment monitoring device for operating machinery
JP7445591B2 (en) road machinery
US20150009329A1 (en) Device for monitoring surroundings of machinery
US10621743B2 (en) Processing-target image creating device, processing-target image creating method, and operation assisting system
JP6896525B2 (en) Asphalt finisher
JP6746303B2 (en) Excavator
JP7178334B2 (en) excavator and excavator display

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20200210

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

A4 Supplementary search report drawn up and despatched

Effective date: 20210112

RIC1 Information provided on ipc code assigned before grant

Ipc: E01C 19/48 20060101AFI20201221BHEP

17Q First examination report despatched

Effective date: 20210122

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20230705

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

RIN1 Information on inventor provided before grant (corrected)

Inventor name: HAGIWARA, KAZUAKI

Inventor name: BABA, NOBUYUKI

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602018062708

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D