JP4951639B2 - Work machine with ambient monitoring device - Google Patents

Work machine with ambient monitoring device

Info

Publication number
JP4951639B2
JP4951639B2 (application number JP2009047989A)
Authority
JP
Japan
Prior art keywords
image
combined
work machine
position
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2009047989A
Other languages
Japanese (ja)
Other versions
JP2010204821A (en)
Inventor
小沼  知恵子
竜 弓場
將裕 清原
英史 石本
Original Assignee
日立建機株式会社 (Hitachi Construction Machinery Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日立建機株式会社 (Hitachi Construction Machinery Co., Ltd.)
Priority to JP2009047989A
Publication of JP2010204821A
Application granted
Publication of JP4951639B2
Application status: Active
Anticipated expiration

Description

  The present invention relates to a work machine provided with a surroundings monitoring device, and more particularly to a work machine provided with a surroundings monitoring device that monitors obstacles around the machine and presents them to the operator.

  To prevent accidents in which a vehicle or work machine comes into contact with a nearby person or object, vehicles equipped with cameras for monitoring the surroundings are known. For example, Patent Document 1 describes capturing the images of a plurality of externally installed cameras, synthesizing them into an image viewed from above the vehicle, detecting a three-dimensional object by finding luminance-change portions in the synthesized image, and displaying the detected three-dimensional object by connecting its camera image to the vehicle upper-viewpoint image at the front end of its base.

  Further, Patent Document 2 describes creating a panoramic image from a plurality of images, each consisting of a distant-view portion at the top of the screen and a near-view portion at the bottom, photographed so as to be continuous in the horizontal direction: the distant-view portions are aligned using the camera images as they are, the near-view portions are geometrically converted into upper-viewpoint images and then joined, and the joined result is geometrically converted again back to the normal viewpoint.

Patent Document 1: JP 2007-233876 A
Patent Document 2: JP 2000-312287 A

  In the apparatus disclosed in Patent Document 1, a three-dimensional object is detected from luminance-change portions in the screen. In outdoor images taken around a vehicle or work machine, however, disturbances such as shadows, changes in sunlight, and swaying trees occur frequently, and these may be erroneously detected as obstacles. Conversely, when the background and the three-dimensional object are similar, the object may not be detected at all.

  In the case of erroneous detection, a camera image of something other than a three-dimensional object is connected to the vehicle upper-viewpoint image and displayed. When a three-dimensional object cannot be detected, only the upper-viewpoint image is displayed; the object may then fall outside the displayed view even though the camera captured it, or, even when it remains in view, it appears as a severely distorted image, making it difficult to confirm the actual state of the object from the display.

  Moreover, when false detections or missed detections are frequent, regions where a camera image is connected to the upper-viewpoint image and regions of the upper-viewpoint image alone become mixed in the combined image, so it becomes difficult to confirm the actual state of objects from the display and obstructions to understanding increase. Checking the safety of the surroundings therefore takes time, and work efficiency falls.

  In the apparatus described in Patent Document 2, as noted above, a panoramic image is created from a plurality of horizontally continuous images by aligning the distant-view portions with the camera images as they are, geometrically converting the near-view portions into upper-viewpoint images, joining the converted images, and geometrically converting the result back to the normal viewpoint.

  In this apparatus, the position at which the images are aligned is fixed. If that fixed position is chosen to be optimal for a narrow shooting scene, the panoramic image becomes unnatural when the scene is wide; conversely, if it is chosen to be optimal for a wide scene, the image becomes unnatural when the scene is narrow. In addition, since the aligned portions are not adjusted to be continuous, an object spanning the joint between the near-view and distant-view portions produces a sense of incongruity.

  The present invention has been made in view of these problems, and its object is to combine, at an optimal position, a bird's-eye view image having a viewpoint above the work machine with the camera images photographing the surroundings, thereby generating a combined image without a sense of incongruity.

  In order to solve the above problems, the present invention employs the following means.

The present invention comprises: an overhead image creation unit that creates an overhead image having a viewpoint above the work machine based on camera images from a plurality of cameras attached to the work machine to photograph its surroundings; a combined position calculation unit that determines the operation of the work machine and calculates the combined position at which a camera image is joined to the overhead image to create a combined image; a combined image creation unit that creates a combined image by joining the camera image to the overhead image at the position calculated by the combined position calculation unit; and a combined image composition unit that synthesizes the combined images created by the combined image creation unit. The combined position calculation unit calculates the combined position from the height of the installation position of the camera.

  With the above configuration, the present invention can combine a bird's-eye view image having a viewpoint above the work machine with the camera images photographing the surroundings at an optimal position, and generate a combined image without a sense of incongruity.

FIG. 1 is a diagram showing the external appearance of a hydraulic excavator.
FIG. 2 is an external view of the hydraulic excavator from the upper viewpoint.
FIG. 3 is a diagram explaining the surroundings monitoring device according to an embodiment.
FIG. 4 is a diagram explaining another embodiment.
FIG. 5 is a diagram explaining the details of the posture data capture unit of the work machine.
FIG. 6 is a diagram explaining the details of the operation data capture unit of the work machine.
FIG. 7 is a diagram explaining the details of the combined position calculation unit.
FIG. 8 is a schematic diagram explaining the combined image creation unit.
FIG. 9 is a diagram explaining the case where the combined image creation unit sets the combined position near the camera.
FIG. 10 is a diagram explaining the case where the combined image creation unit sets the combined position far from the camera.
FIG. 11 is a schematic diagram showing another example of the combined image creation unit.
FIG. 12 is a diagram explaining a non-linear table.
FIG. 13 is a schematic diagram showing still another example of the combined image creation unit.
FIG. 14 is a diagram explaining the details of the combined image composition unit.
FIG. 15 is an example showing the center position of the right-side combined image.
FIG. 16 is an example in which the combined position in the image created by the combined image composition unit is far from the camera.
FIG. 17 is an example in which the combined position created by the combined image composition unit is close to the camera.
FIG. 18 is a diagram explaining image processing in the composite image obstacle detection unit.
FIG. 19 is a diagram explaining the details of the display image generation unit.
FIG. 20 is an example of a screen displayed on the display device (monitoring a narrow surrounding range when the tip of the front work machine is short).
FIG. 21 is an example of a screen displayed on the display device (when the tip of the front work machine is short).
FIG. 22 is an example of a screen displayed on the display device (monitoring a wide surrounding area when the tip of the front work machine is long).
FIG. 23 is an example of a screen displayed on the display device (monitoring a wide surrounding area when the tip of the front work machine is long).

  Hereinafter, the best mode for carrying out the invention will be described with reference to the accompanying drawings. FIG. 1 shows the appearance of a hydraulic excavator as an example of a work machine. The hydraulic excavator includes an articulated front work machine 1A, consisting of a boom 1a, an arm 1b, and a bucket 1c that each rotate in the vertical direction, and a vehicle body 1B consisting of an upper swing body 1d and a lower traveling body 1e.

  The upper swing body 1d is provided with an operator's cab 1f. The base end of the boom 1a of the front work machine 1A is supported at the front of the upper swing body 1d. The boom 1a, arm 1b, bucket 1c, upper swing body 1d, and lower traveling body 1e are driven by respective actuators: a boom cylinder 3a, an arm cylinder 3b, a bucket cylinder 3c, a swing motor 3d (not shown in FIG. 1), and left and right travel motors 3e and 3f (not shown). The boom 1a, arm 1b, bucket 1c, and upper swing body 1d are provided with angle detectors 8a, 8b, 8c, and 8d that detect their respective rotation angles. A worker 20 may be present, for example, behind the hydraulic excavator.

  FIG. 2 is an external view (overhead image) of the hydraulic excavator viewed from above. The vehicle body 1B is provided, on the upper swing body 1d, with a right-side monitoring camera 13a, a rear monitoring camera 13b, and a left-side monitoring camera 13c, whose monitoring ranges are 12a, 12b, and 12c, respectively. The excavator includes the lower traveling bodies 1e and 1e', and the monitoring ranges also cover the working range 14 of the front work machine 1A.

  FIG. 3 illustrates the surroundings monitoring device according to the present embodiment. In FIG. 3, 31 is the video signal of the camera 13a, 32 is the video signal of the camera 13b, 33 is the video signal of the camera 13c, 50 is an image processing device, 900 is a display device such as a monitor TV that displays the scenes around the work machine photographed by the cameras 13a, 13b, and 13c, and 1000 is the operator in the cab 1f who monitors the display content of the display device 900.

  The image processing device 50 includes an image input unit 200, an overhead image creation unit 300, a combined position calculation unit 400, a combined image creation unit 450, a combined image composition unit 500, a composite image obstacle detection unit 600, a work machine posture data capture unit 700, a work machine operation data capture unit 750, and a display image generation unit 800. The image processing device 50 can be implemented on a PC or on a dedicated image processing device capable of realizing these image processes.

  In the processing, first, the camera 13a photographs the scene 12a and transmits the video signal 31 to the image processing device 50, which stores it in the image input unit 200. Likewise, the camera 13b photographs the scene 12b and transmits the video signal 32, and the camera 13c photographs the scene 12c and transmits the video signal 33; these are also stored in the image input unit 200. The stored video signals 31, 32, and 33 are converted by the overhead image creation unit 300 to create overhead images. The creation of an overhead image can be realized by a known technique (see, for example, JP-A-2006-48451).
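
For reference, the overhead conversion of each stored video frame amounts to a planar inverse perspective mapping. The following is a minimal Python/OpenCV sketch, not part of the patent disclosure: it assumes flat ground and four known ground-plane correspondences per camera from an offline calibration, and all point coordinates, file names, and sizes are illustrative assumptions.

```python
import cv2
import numpy as np

def make_overhead(camera_frame, src_pts, dst_pts, out_size=(480, 480)):
    """Warp one camera frame to a top-down (bird's-eye) view.

    src_pts: pixel positions of four ground-plane reference points
             in the camera image (assumed known from calibration).
    dst_pts: where those points should fall in the overhead image,
             e.g. scaled so that 1 px corresponds to 1 cm of ground.
    """
    H = cv2.getPerspectiveTransform(np.float32(src_pts),
                                    np.float32(dst_pts))
    return cv2.warpPerspective(camera_frame, H, out_size)

# Illustrative use for a rear camera (all values are assumptions):
frame = cv2.imread("rear_camera_frame.png")
src = [(120, 300), (520, 300), (40, 470), (600, 470)]   # image pixels
dst = [(140, 0), (340, 0), (140, 400), (340, 400)]      # overhead pixels
overhead = make_overhead(frame, src, dst)
```

In practice one homography per camera would be computed once and reused for every frame, consistent with the known techniques the description refers to.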

  Next, the posture data of the front work machine 1A is captured by the posture data capture unit 700, and the operation data of the upper swing body 1d and the lower traveling body 1e is captured by the operation data capture unit 750. For the scene 12a photographed by the camera 13a, the combined position calculation unit 400 calculates, from the posture data and the fluctuation of the operation data, the combined position in the overhead image from the overhead image creation unit 300 and the corresponding combined position in the input image from the image input unit 200, and the combined image creation unit 450 creates a combined image from the overhead image and the input image using the calculated combined positions.

  Similarly, for the scene 12b photographed by the camera 13b, the combined positions in the overhead image and in the input image are calculated by the combined position calculation unit 400, and a combined image is created by the combined image creation unit 450; the same is done for the scene 12c photographed by the camera 13c. Using the combined images for the scenes 12a, 12b, and 12c, the combined image composition unit 500 synthesizes a composite image by arranging the combined images around a simulated work machine viewed from the upper viewpoint.

  Further, the composite image obstacle detection unit 600 performs obstacle detection on the synthesized composite image using an image processing method. The display image generation unit 800 superimposes the detection result, the danger range derived from the tip position data of the front work machine 1A captured by the posture data capture unit 700, and the turning range and/or the predicted travel path and/or the travel guideline derived from the operation data capture unit 750, and displays the result on the display device 900 such as a monitor.

  As a result, the obstacles around the work machine can be grasped at a glance, and the suitability of an operation of the work machine can be judged instantly and accurately. Work efficiency and safety can therefore be improved.

  FIG. 4 is a diagram for explaining another embodiment of the present invention. In this example, the image processing device 50 includes an image input unit 200, an image distortion correction unit 250, an overhead image creation unit 300, a combined position calculation unit 400, a combined image creation unit 450, a combined image composition unit 500, a composite image obstacle detection unit 600, a work machine posture data capture unit 700, a work machine operation data capture unit 750, and a display image generation unit 800. As before, the image processing device 50 can be implemented on a PC or on a dedicated image processing device capable of realizing these image processes.

  The distortion correction unit 250 may be omitted if the input images have no distortion. However, when monitoring the surroundings of a vehicle or work machine, cameras with wide-angle lenses capable of photographing a wide range are often used, and distortion occurs around the edges of the photographed scene; a distortion correction unit 250 that corrects this distortion is therefore required. Distortion correction can be realized by a known technique (see, for example, JP-A-2006-59270).
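
As a reference for this correction stage, a minimal sketch using OpenCV's standard radial/tangential lens model is shown below. It is not the patent's method, and the intrinsic matrix and distortion coefficients are placeholders that would come from a one-time calibration of each wide-angle camera.

```python
import cv2
import numpy as np

# Placeholder intrinsics and distortion coefficients from an offline
# calibration (e.g. cv2.calibrateCamera); the numbers are illustrative.
K = np.array([[400.0,   0.0, 320.0],
              [  0.0, 400.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.array([-0.32, 0.09, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3

def correct_distortion(frame):
    """Undo the lens distortion so that straight edges stay straight
    before the frame is passed to the overhead conversion."""
    return cv2.undistort(frame, K, dist)
```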

  FIG. 5 explains the details of the posture data capture unit 700 of the work machine. First, at 701, the output θ1 of the angle detector 8a that detects the rotation angle of the boom 1a is captured; at 702, the output θ2 of the angle detector 8b that detects the rotation angle of the arm 1b; at 703, the output θ3 of the angle detector 8c that detects the rotation angle of the bucket 1c; and at 704, the output θ4 of the angle detector 8d that detects the rotation angle of the upper swing body 1d.

  Next, at 705, the tip coordinates of the bucket 1c are calculated from the data θ1, θ2, θ3, and θ4. This yields the tip position of the front work machine 1A relative to the vehicle body 1B, from which the surrounding danger range of the work machine can be known.
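
The tip-coordinate calculation at 705 is a planar forward-kinematics chain rotated by the swing angle. The sketch below is an illustration only: the link lengths and base height are assumed values (the patent gives no dimensions), and θ1 to θ3 are taken here as absolute angles from the horizontal.

```python
import math

# Assumed link lengths and swing-axis height in metres (illustrative).
L_BOOM, L_ARM, L_BUCKET = 5.7, 2.9, 1.5
BASE_HEIGHT = 1.8

def bucket_tip(theta1, theta2, theta3, theta4):
    """Bucket tip position from the boom/arm/bucket angles (radians,
    measured from horizontal) and the swing angle theta4."""
    reach = (L_BOOM * math.cos(theta1)
             + L_ARM * math.cos(theta2)
             + L_BUCKET * math.cos(theta3))
    height = (BASE_HEIGHT + L_BOOM * math.sin(theta1)
              + L_ARM * math.sin(theta2)
              + L_BUCKET * math.sin(theta3))
    # The swing angle rotates the working plane about the vertical axis.
    x = reach * math.cos(theta4)
    y = reach * math.sin(theta4)
    return x, y, height
```

The horizontal distance `reach` is what the combined position calculation and the danger range described later depend on.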

  FIG. 6 explains the details of the operation data capture unit 750 of the work machine. First, at 704, the output θ4 of the angle detector 8d that detects the rotation angle of the upper swing body 1d is captured. Next, at 707, the turning direction of the upper swing body 1d is calculated; at 708, the traveling speed is calculated from the travel motor 3e; and at 709, the traveling direction is calculated from the rotation of the lower traveling body 1e. The turning direction and traveling speed of the work machine are thus obtained, the travel trajectory can be predicted, and a travel guideline can be presented.

  FIG. 7 explains the details of the combined position calculation unit 400. First, at 401, the operation of the work machine is determined. When the determined operation is front work, that is, excavation, then at 402 the distance of a combined position close to the camera is calculated (value Pa) in order to grasp a wide surrounding range; at 403, the region from the camera side up to the position Pa is cut out of the overhead image created by the overhead image creation unit 300 (overhead image PL); and at 404, the rectangular region beyond the position Pa is cut out of the input image (camera image RL). Next, at 405, to join the PL image and the RL image at the position Pa, the images are resized so that the RL image matches the PL image.

  When the operation determined at 401 is traveling, the same processing as for excavation is performed. When the operation determined at 401 is turning, then at 406 the posture of the work machine is determined from the posture data captured by the posture data capture unit 700. If the tip of the front work machine is long, the processes 402, 403, and 404 are performed as in the case of excavation. If the tip of the front work machine is short, then at 407 the distance of a combined position far from the camera is calculated (value Pb) in order to grasp a narrow surrounding range; the region from the camera side up to the position Pb is cut out of the created overhead image (overhead image PM); and at 409, the rectangular region beyond the position Pb is cut out of the input image (camera image RM). At 410, to join the PM image and the RM image at the position Pb, the images are resized so that the RM image matches the PM image.
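
The branching of FIG. 7 reduces to a small decision routine. The sketch below mirrors that logic; the concrete distances Pa and Pb and the threshold separating a long from a short front are hypothetical values, since the patent states the rule but not the numbers.

```python
def combined_position(operation, front_length_m,
                      pa_near=1.5, pb_far=3.0, long_front_m=6.0):
    """Choose the boundary distance from the camera at which the
    overhead image region ends and the raw camera image begins.

    operation:      'excavating', 'traveling' or 'turning'
    front_length_m: current reach of the front work machine
    pa_near, pb_far, long_front_m: hypothetical values
    """
    if operation in ("excavating", "traveling"):
        return pa_near                      # wide surroundings needed
    if operation == "turning":
        if front_length_m >= long_front_m:  # long front: wide range
            return pa_near
        return pb_far                       # short front: narrow range
    return pb_far                           # default: detailed near view
```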

  At 407, the distance of the combined position far from the camera is calculated (Pb) in order to grasp a narrow surrounding range; alternatively, the overhead image created by the overhead image creation unit 300 may be used as it is, without calculating a combined position.

  When a combined position far from the camera is calculated, the overhead image area is large, so a narrow range around the work machine can be grasped in detail and nearby workers can be displayed clearly. When a combined position close to the camera is calculated, a wide range around the work machine can be grasped roughly, and distant workers can be displayed without a sense of incongruity.

  When the camera installation position is high, the combined position is calculated slightly farther than the position calculated by the combined position calculation unit 400; when the camera installation position is low, it is calculated slightly closer. Here, with a reference camera installation height of, for example, 2 m, "high" means an installation position above 2 m and "low" means one below 2 m.
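
One way to read this height rule is as a proportional shift of the boundary relative to the 2 m reference. The linear scaling below is an assumption for illustration, since the text only says "slightly farther" and "slightly closer".

```python
def adjust_for_camera_height(boundary_m, camera_height_m, ref_height_m=2.0):
    """Push the combining boundary farther for cameras mounted above
    the 2 m reference height and pull it closer for lower cameras.
    Linear scaling is an illustrative choice, not from the patent."""
    return boundary_m * (camera_height_m / ref_height_m)
```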

  FIG. 8 is a schematic diagram illustrating the combined image creation unit 450. Up to the combined position distance 453 calculated by the combined position calculation unit 400 (the distance 453 from the camera), the overhead image 451 from the overhead image creation unit 300, viewed from the upper viewpoint, is used; beyond it, the camera image 452 captured by the image input unit 200 is used as it is, and the two are joined.

  When combined in this way, the person 20 present around the rear camera 13b appears deformed by the upper-viewpoint conversion in the overhead image 451 created by the overhead image creation unit 300, and appears in natural vertical perspective in the camera image 452 captured by the image input unit 200. When the two are joined at the combining position 453, the displayed person 20 is close to the actual appearance and the sense of incongruity is reduced.

  Here, FIG. 8 shows an example in which the combined position is a straight line, but it can also be set as an arc centered on the camera position.

  FIG. 9 explains the case where the combined image creation unit 450 calculates a combined position 454 close to the camera, used when the tip of the front work machine is long or when, as in excavation or traveling, a wide surrounding range must be grasped. Here, for a camera installation height of 2 m, for example, a wide range means a scene photographed out to a distance of 4 m or more.

  First, for the input image 200a of the rear camera 13b, consider the case where the feet of a person 20a are at a position 4 m away in a scene 204 that displays the surroundings out to that range.

  In the bird's-eye view image 300a created by the overhead image creation unit 300 from the input image 200a, the person is converted together with the background scene and becomes a distorted person image 20b. Therefore, the overhead image region 300b from the camera up to the combining position 454 is cut out, the region 200b of the input image 200a beyond the combining position 454 is cut out, and the two are joined. If the region 300b and the region 200b were joined as they are, their sizes would not match, so the sizes are adjusted at the combining position: the region 300b is resized to create the region 300c, the region 200b is resized to create the region 200c, and these are joined to create the combined image 451. In the resized region 200c, the person 20b is also resized and becomes the person 20c.
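
The crop-resize-join step of FIG. 9 can be sketched as follows. The sketch assumes both images are rendered at a known scale so that the combining distance maps to a row index, with rows counted outward from the machine; the function and variable names are illustrative.

```python
import cv2
import numpy as np

def join_at_boundary(overhead_img, camera_img, boundary_row_overhead,
                     boundary_row_camera, out_width=480):
    """Keep the overhead image from the machine out to the combining
    boundary and the camera image beyond it, resizing both strips to
    a common width so the seam lines up (regions 300b/200b resized
    to 300c/200c in the figure's numbering)."""
    near = overhead_img[:boundary_row_overhead]
    far = camera_img[boundary_row_camera:]
    near = cv2.resize(near, (out_width, near.shape[0]))
    far = cv2.resize(far, (out_width, far.shape[0]))
    return np.vstack([near, far])   # the combined image
```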

  FIG. 10 explains the case where the combined image creation unit 450 calculates a combined position 455 far from the camera, used to grasp a narrow surrounding range when the tip of the front work machine is short. A narrow range here means, for example, a scene photographed within 3 m when the camera installation height is 2 m.

  First, for the input image 200a of the rear camera 13b, consider the case where the feet of a person 20a are at a position 2 m away in a scene 205 that displays the surroundings out to 3 m.

  In the bird's-eye view image created by the overhead image creation unit 300, the person is converted together with the background scene and becomes a distorted person image 20b. Accordingly, the overhead image region 300d from the camera up to the combining position 455 is cut out, and the region 200d of the input image 200a beyond the combining position 455 is cut out. Since the sizes of the region 300d and the region 200d would not match if joined as they are, the sizes are adjusted at the combining position: the region 300d is resized to create the region 300e, the region 200d is resized to create the region 200e, and these are joined to create the combined image 452. In the resized region 200e, the person 20c is also resized and becomes the person 20d.

  As a result, the displayed persons 20c and 20d are close to the actual appearance, the sense of incongruity is reduced, the presence of persons, obstacles, and the like can be grasped accurately and effectively, and work efficiency and safety are improved.

FIG. 11 is a schematic diagram illustrating another example of the combined image creation unit 450. A region 461 is set in the vicinity of the combined position distance 453 calculated by the combined position calculation unit 400, and a non-linear table is created so that the joint at the combined position distance 453 changes continuously and smoothly; the overhead image region 451, the region 461, and the camera image region 452 are then combined.

  FIG. 12 illustrates the non-linear table. For example, as shown in FIG. 12, the table gives the mixing ratio of the camera image to the overhead image in the region 461 near the combining position, stepping from 0 (overhead image region 451) through 0.1, 0.2, 0.4, 0.6, and 0.8 up to 1 (camera image region 452).
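
In code, the table is simply a per-row blending weight that ramps the camera-image ratio from 0 to 1 across the transition band 461. The sketch below uses the ratios listed above; interpolating them to per-row weights is an illustrative choice.

```python
import numpy as np

# Camera-image mixing ratios across the transition band (0 = pure
# overhead image, 1 = pure camera image), as given in the description.
RATIOS = [0.0, 0.1, 0.2, 0.4, 0.6, 0.8, 1.0]

def blend_band(overhead_band, camera_band):
    """Blend two equally sized image strips so the seam changes
    continuously instead of switching abruptly at one row."""
    rows = overhead_band.shape[0]
    weights = np.interp(np.linspace(0.0, 1.0, rows),
                        np.linspace(0.0, 1.0, len(RATIOS)), RATIOS)
    w = weights[:, None, None]      # broadcast over width and channels
    blended = (1.0 - w) * overhead_band + w * camera_band
    return blended.astype(np.uint8)
```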

  FIG. 13 is a schematic diagram illustrating still another example of the combined image creation unit 450. As shown in FIG. 13, a logarithmic conversion table 463 may be applied to the upper-viewpoint overhead image 451 and the camera image 452.

  FIG. 14 explains the details of the combined image composition unit 500. First, at 501, the center position (OC) of the simulated work machine at the upper viewpoint is set. At 502, the right-side combined image storage memory of the right-side camera 13a is read; at 503, the rear combined image storage memory of the rear camera 13b; and at 504, the left-side combined image storage memory of the left-side camera 13c. At 505, the right-side combined image 502 is rotated 90° to the right to place it on the right side; at 506, the rear combined image 503 is turned upside down to place it at the rear; and at 507, the left-side combined image 504 is rotated 90° to the left to place it on the left side. Next, at 508, the center position of the right-side combined image is aligned with the center position OC of the simulated work machine at the upper viewpoint; at 509, the center position of the rear combined image is aligned with OC; and at 510, the center position of the left-side combined image is aligned with OC. Finally, at 511, the right-side combined image 508 is arranged on the right of the simulated work machine at the upper viewpoint, the rear combined image 509 behind it, and the left-side combined image 510 on its left, creating the display image.
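
The placement steps 505 to 511 amount to rotating each combined strip into its side of an upper-viewpoint canvas and anchoring it to the center OC. The sketch below is a simplified layout; the strip depth, machine footprint, and canvas proportions are assumptions, not values from the patent.

```python
import cv2
import numpy as np

STRIP = 200   # assumed depth of each combined image strip, in pixels
BODY = 240    # assumed footprint of the simulated machine, in pixels

def compose_display(right_img, rear_img, left_img, machine_icon):
    """Rotate the right strip 90 deg clockwise, turn the rear strip
    upside down, rotate the left strip 90 deg counter-clockwise, and
    anchor all three around the simulated machine center OC."""
    side = BODY + 2 * STRIP
    canvas = np.zeros((side, side, 3), np.uint8)

    right = cv2.rotate(cv2.resize(right_img, (BODY, STRIP)),
                       cv2.ROTATE_90_CLOCKWISE)
    rear = cv2.rotate(cv2.resize(rear_img, (BODY, STRIP)),
                      cv2.ROTATE_180)
    left = cv2.rotate(cv2.resize(left_img, (BODY, STRIP)),
                      cv2.ROTATE_90_COUNTERCLOCKWISE)

    canvas[STRIP:STRIP + BODY, STRIP + BODY:] = right      # right side
    canvas[STRIP + BODY:, STRIP:STRIP + BODY] = rear       # behind
    canvas[STRIP:STRIP + BODY, :STRIP] = left              # left side
    canvas[STRIP:STRIP + BODY, STRIP:STRIP + BODY] = \
        cv2.resize(machine_icon, (BODY, BODY))             # machine at OC
    return canvas
```

Areas covered by no strip remain black, which is how an uncovered blind spot such as the black portion 539 of FIG. 16 would show up on the display.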

  FIG. 15 is an example showing the center position 523 of the right-side combined image. The intersection of the extension line 521 of the upper edge and the extension line 522 of the lower edge of the right-side combined image is the center position 523.

  FIG. 16 is an example in which the combined position in the composite image created by the combined image composition unit 500 is far from the camera. First, the intersection 523 of the combined image 533, joined at the combined position 532 calculated by the combined position calculation unit 400 for the right-side input image, is aligned with the center 531 of the simulated work machine 1 viewed from the upper viewpoint. Next, the intersection 523 of the combined image 535, joined at the combined position 534 calculated for the rear input image, is aligned with the center 531. Further, the intersection 523 of the combined image 537, joined at the combined position 536 calculated for the left-side input image, is aligned with the center 531. Here, since the head of the person 20 present in the field of view of the rear camera is taken from the input image, the displayed person 538 causes little sense of incongruity. It can also be confirmed whether blind spots occur behind the simulated work machine 1; the black portion 539 shows that a blind spot does occur there. Thus, from the driver's seat of the simulated work machine 1 operated by the operator, the direction and position of workers around the work machine and the blind spots of the camera installation can be confirmed. Moreover, a narrow range around the work machine can be grasped in detail, and nearby workers can be displayed clearly.

  FIG. 17 shows an example in which the combined position created by the combined image composition unit 500 is close to the camera. First, the intersection 523 of the combined image 542, joined at the combined position 541 calculated by the combined position calculation unit 400 for the right-side input image, is aligned with the center 531 of the simulated work machine 1 viewed from the upper viewpoint. Next, the intersection 523 of the combined image 544, joined at the combined position 543 calculated for the rear input image, is aligned with the center 531. Further, the intersection 523 of the combined image 546, joined at the combined position 545 calculated for the left-side input image, is aligned with the center 531. Here, since the person 20 present in the field of view of the rear camera is taken from the input image, the displayed person 547 causes little sense of incongruity. Thus, from the driver's seat of the simulated work machine 1 operated by the operator, the direction and position of workers around the work machine and the blind spots of the camera installation can be confirmed. In addition, a wide range around the work machine can be grasped roughly, and distant workers are displayed without a sense of incongruity.

  FIG. 18 describes the image processing in the composite image obstacle detection unit 600. First, at 610, a composite image is captured, and at 615, masking is performed to exclude the region of the simulated work machine at the upper viewpoint. At 620, a per-pixel difference image between the immediately preceding composite image and the current composite image is created. At 630, the difference image created at 620 is compared with a predetermined threshold value (about 7 to 15) and binarized so that values below the threshold become 0 and values at or above it become 1, extracting the change regions of obstacles. At 640, it is determined whether a change region has been extracted.

  If a change region is extracted at 640, feature calculation is performed at 660. As features of the obstacle, the aspect ratio is computed, or the extracted region is divided into three parts (head, trunk, and lower part) and their respective contour shapes are calculated. For example, if the feature values calculated at 660 show several characteristics of a person, such as a vertically elongated aspect ratio, a fan-shaped contour at the head of the extracted region, oblique contours at the shoulders, vertical contours along the trunk, and an inverted-V or vertical contour at the lower part, it is determined at 690 that a person is present; if these features are not present in combination, it is determined at 680 that an obstacle other than a person is present.
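
As a reference for steps 610 to 690, the following is a minimal frame-differencing sketch using the 7-15 threshold band mentioned above. It is an illustration, not the patent's implementation: the minimum region area and the aspect-ratio cut that stands in for the full head/trunk/lower-part contour test are assumed values.

```python
import cv2

def detect_obstacles(prev_composite, cur_composite, machine_mask,
                     threshold=10, min_area=150):
    """Difference successive composite images, mask out the simulated
    machine region, binarize with a threshold in the 7-15 range, and
    crudely label each change region as 'person' or 'other'."""
    prev_gray = cv2.cvtColor(prev_composite, cv2.COLOR_BGR2GRAY)
    cur_gray = cv2.cvtColor(cur_composite, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(prev_gray, cur_gray)
    diff[machine_mask > 0] = 0               # exclude the machine area
    _, binary = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    results = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if w * h < min_area:                 # ignore small disturbances
            continue
        kind = "person" if h > 1.5 * w else "other obstacle"
        results.append(((x, y, w, h), kind))
    return results
```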

  Instead of a technique based on image processing, the composite image obstacle detection unit 600 may make the determination using a sensor, such as a distance sensor, installed on the work machine.

  FIG. 19 explains the details of the display image generation unit 800. First, at 810, the danger range is calculated from the tip position of the front work machine 1A. At 820, the positional relationship of the front work machine 1A to the lower traveling body 1e is determined using the length of the front work machine 1A and its direction from the lower traveling body 1e. At 830, the upper-viewpoint simulated work machine data, which is the drawing data of the vehicle body 1B, is created. Next, at 840, the distance data from the cab 1f to the obstacle 20 is calculated using the detection data of the composite image obstacle detection unit 600. Further, at 850, a display image is generated in which the simulated image of the vehicle body 1B, with the front work machine 1A facing upward (forward), is arranged at the center of an upper-viewpoint image centered on the vehicle body 1B, and the danger range, the mark of the obstacle 20, the turning range, the predicted travel path, and the travel guideline are superimposed; the result is displayed on the display device 900. As a result, the operator in the cab 1f sees the machine at the center facing forward and the positions of obstacles 20 around the work machine from directly above, over the whole area to be monitored, making it possible to judge at a glance the turning of the work machine, the travel guideline, and the predicted travel trajectory.
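
The superimposition at 850 can be sketched as drawing the danger range as a circle whose radius follows the bucket-tip reach, plus a mark per detected obstacle. Colors, line thicknesses, and the circular shape of the danger range are illustrative choices, not specified by the patent.

```python
import cv2

def draw_overlays(display_img, machine_center, tip_reach_px, obstacles):
    """Superimpose the danger range and obstacle marks on the
    upper-viewpoint display image (obstacles as returned by the
    detection sketch above: ((x, y, w, h), kind) pairs)."""
    cv2.circle(display_img, machine_center, tip_reach_px, (0, 0, 255), 2)
    for (x, y, w, h), kind in obstacles:
        color = (0, 255, 255) if kind == "person" else (255, 0, 0)
        cv2.rectangle(display_img, (x, y), (x + w, y + h), color, 2)
    return display_img
```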

  The display device 900 may be provided in the cab, or anywhere the operator can see it. When the composite image obstacle detection unit 600 detects an obstacle 20, the detection may be shown on the display device 900 and also announced to the driver by voice. At the start of turning, an alarm indicating the presence of an obstacle may be output, or the movement of the obstacle may be announced by voice.

  This makes it easy for the driver to understand the positional relationship between the machine and any obstacle, so the propriety of turning the front work machine can be judged accurately and instantaneously; the obstacle can be avoided, and work efficiency improves.

FIG. 20 is an example of a screen displayed on the display device 900 (monitoring a narrow surrounding range when the tip of the front work machine is short), in this case when an obstacle has been detected. The obstacle mark 905 is superimposed on the obstacle 538 joined by the combined image composition unit 500, and the danger range 905 is also superimposed. At the bottom of the screen, the obstacle detection is indicated so that it can easily be determined which of the rear camera 901, the right-side camera 902, and the left-side camera 903 made the detection; for example, when the rear camera 901 detects, its portion is displayed exaggerated, and the turning state of the work machine is displayed at 908. An I/F for enlarged display 906 or standard display 907 may also be provided.

  FIG. 21 is an example of a screen displayed on the display device 900 (when the tip of the front work machine is short). A traveling speed meter 905, a remaining oil meter 906, an engine output meter 906, and other indicators of the state of the work machine may also be displayed.

  FIG. 22 is an example of a screen displayed on the display device 900 (monitoring a wide surrounding area when the tip of the front work machine is long), in this case when an obstacle has been detected. The obstacle mark 905 is superimposed on the obstacle 538 joined by the combined image composition unit 500, and the danger range 905 is also superimposed. At the bottom of the screen, the obstacle detection is indicated so that it can easily be determined which of the rear camera 901, the right-side camera 902, and the left-side camera 903 made the detection; for example, when the rear camera 901 detects, its portion is displayed exaggerated, and the turning state of the work machine is displayed at 910. An I/F for enlarged display 906 or standard display 907 may also be provided.

  FIG. 23 is an example of a screen displayed on the display device 900 (monitoring a wide surrounding area when the tip of the front work machine is long). The traveling direction 912 of the lower traveling body 1e captured by the operation data capture unit 750 is superimposed, and the traveling speed 908 calculated from the travel motor 3e is displayed on the display device 900.

  This allows the operator to understand, whether the shooting scene is narrow or wide, the positions of the work machine and of surrounding obstacles and people, the direction of travel, and the speed when danger arises. Since the information is presented instantly, the operator can grasp the presence of an obstacle or intruder accurately and effectively without lowering work efficiency, improving both work efficiency and safety.

  As described above, according to the present embodiment, the combined position calculation means sets the combined position according to the camera installation state and the tip position of the front work machine captured by the work machine posture data capture means. For example, when the tip position of the front work machine is far from the camera, the combined position is set near the camera, so a wide range around the work machine can be grasped roughly and even distant workers are displayed without a sense of incongruity.

  Conversely, when the tip position of the front work machine is close to the camera, the combined position is set far from the camera, so a narrow range around the work machine can be grasped in detail and nearby workers can be displayed clearly.

  In addition, data on excavation, turning, and traveling of the work machine is captured by the operation data capture means: for turning, the combined position is set far from the camera; for traveling, it is set close to the camera, closer when the traveling speed is fast and farther when it is slow. Thus, when the combined position is far from the camera, a narrow range around the work machine can be grasped in detail and nearby workers are displayed clearly; when it is close to the camera, a wide range around the work machine can be grasped roughly and distant workers are displayed without a sense of incongruity.

  Further, the combined images created by the combined image creation unit are arranged around the simulated work machine at the upper viewpoint to create a composite image, on which the obstacle detection information from the composite image obstacle detection unit, the danger range derived from the tip position data captured by the posture data capture means, and the turning range, predicted travel path, or travel guideline captured by the operation data capture means can be superimposed and displayed. A natural combined image without a sense of incongruity can therefore be displayed whether the shooting scene is narrow or wide, whether or not an obstacle or person is present, and wherever in the scene it is located. When danger arises, the positions of the work machine and of surrounding obstacles and persons become obvious at a glance, and the propriety of operating the work machine can be judged quickly and accurately, improving work efficiency and safety.

  As described above, the present embodiment can present the driver with a natural combined image without a sense of incongruity regardless of whether the shooting scene is narrow or wide and regardless of where an object is located in the scene. The presence of obstacles and persons around the work machine can therefore be understood at a glance, and the propriety of operating the work machine can be judged quickly and accurately.

DESCRIPTION OF SYMBOLS
1A: front work machine; 1B: vehicle body; 1a: boom; 1b: arm; 1c: bucket; 1d: upper swing body; 1e: lower traveling body; 1f: operator's cab; 3a-3f: hydraulic actuators; 8a, 8b, 8c, 8d: angle detectors; 13a, 13b, 13c: cameras; 20: obstacle (worker)
200: image input unit; 250: image distortion correction unit; 300: overhead image creation unit; 400: combined position calculation unit; 450: combined image creation unit; 500: combined image composition unit; 600: composite image obstacle detection unit; 700: work machine posture data capture unit; 750: work machine operation data capture unit; 800: display image generation unit; 900: display device

Claims (11)

  1. An overhead image creation unit that creates an overhead image having a viewpoint above the work machine based on camera images from a plurality of cameras that are attached to the work machine and photograph the surroundings of the work machine;
    a combined position calculation unit that determines the operation of the work machine and calculates a combined position at which the camera image is joined to the overhead image to create a combined image;
    a combined image creation unit that creates a combined image by joining the camera image to the overhead image at the position calculated by the combined position calculation unit; and
    a combined image composition unit that synthesizes the combined images created by the combined image creation unit,
    a work machine provided with a surroundings monitoring device, wherein the combined position calculation unit calculates the combined position from the height of the installation position of the camera.
  2. The work machine provided with a surroundings monitoring device according to claim 1, further comprising a composite image obstacle detection unit that detects an obstacle from the composite image synthesized by the combined image composition unit.
  3. The work machine provided with a surroundings monitoring device according to claim 2, further comprising a display image generation unit that displays obstacle detection information superimposed on the composite image.
  4. The work machine provided with a surroundings monitoring device according to claim 1, wherein the combined position calculation unit calculates the combined position according to the tip position data of the front work machine of the work machine and operation data representing excavation, turning, and traveling of the front work machine.
  5. The work machine provided with a surroundings monitoring device according to claim 4, wherein the combined position calculation unit sets the combined position close to the camera when the tip position of the work machine is far from the camera, and far from the camera when the tip position is close to the camera.
  6. The work machine provided with a surroundings monitoring device according to claim 4, wherein the combined position calculation unit sets the combined position far from the camera when the operation data indicates turning and, when it indicates traveling, sets it close to the camera when the traveling speed is fast and far when the traveling speed is slow.
  7. The work machine provided with a surroundings monitoring device according to claim 1, wherein the combined image composition unit uses the overhead image within the position calculated by the combined position calculation unit from the camera, and joins the captured image of the camera beyond that position.
  8. The work machine provided with a surroundings monitoring device according to claim 1, wherein the combined image composition unit uses the overhead image within the position calculated by the combined position calculation unit from the camera, uses the captured image of the camera beyond that position, and joins the overhead image and the camera image with an image adjusted so as to be continuous near the calculated position.
  9. The work machine provided with a surroundings monitoring device according to claim 1, wherein the combined image composition unit synthesizes an image of a simulated vehicle at the upper viewpoint with the composite image obtained by joining the camera images to the overhead image.
  10. The work machine provided with a surroundings monitoring device according to claim 1, wherein the display image generation unit superimposes and displays the danger range created using the tip position data of the front work machine, the turning range of the front work machine, and the predicted travel path.
  11. An overhead image creation unit that creates an overhead image having a viewpoint above the work machine based on camera images from a plurality of cameras that photograph the periphery of the work machine in respective directions;
    a combined position calculation unit that determines the operation of the work machine and calculates combined positions at which the camera images of the respective directions are joined to the overhead image to create combined images;
    a combined image creation unit that creates a combined image for each direction by joining the camera image to the overhead image at the position calculated by the combined position calculation unit; and
    a combined image composition unit that synthesizes the combined images for the respective directions created by the combined image creation unit,
    a work machine provided with a surroundings monitoring device, wherein the combined position calculation unit calculates the combined position from the height of the installation position of the camera.
JP2009047989A 2009-03-02 2009-03-02 Work machine with ambient monitoring device Active JP4951639B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2009047989A JP4951639B2 (en) 2009-03-02 2009-03-02 Work machine with ambient monitoring device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2009047989A JP4951639B2 (en) 2009-03-02 2009-03-02 Work machine with ambient monitoring device

Publications (2)

Publication Number Publication Date
JP2010204821A JP2010204821A (en) 2010-09-16
JP4951639B2 (en) 2012-06-13

Family

ID=42966260

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2009047989A Active JP4951639B2 (en) 2009-03-02 2009-03-02 Work machine with ambient monitoring device

Country Status (1)

Country Link
JP (1) JP4951639B2 (en)

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5775283B2 (en) * 2010-10-08 2015-09-09 住友建機株式会社 Work machine monitoring device
JP5497617B2 (en) 2010-11-16 2014-05-21 住友重機械工業株式会社 Image generating apparatus and operation support system
JP5124671B2 (en) * 2011-06-07 2013-01-23 株式会社小松製作所 Work vehicle perimeter monitoring device
CN103080990B (en) 2011-06-07 2015-04-01 株式会社小松制作所 Work vehicle vicinity monitoring device
JP5722127B2 (en) * 2011-06-07 2015-05-20 株式会社小松製作所 Work vehicle perimeter monitoring device
JP5124672B2 (en) * 2011-06-07 2013-01-23 株式会社小松製作所 Work vehicle perimeter monitoring device
JP5974433B2 (en) * 2011-08-03 2016-08-23 日本精機株式会社 Peripheral image display device
JP5809988B2 (en) * 2012-01-10 2015-11-11 日立建機株式会社 Travel support device for work machine
CN104041018A (en) * 2012-01-12 2014-09-10 日立建机株式会社 Periphery monitoring device for self-propelled industrial machine
US9598836B2 (en) 2012-03-29 2017-03-21 Harnischfeger Technologies, Inc. Overhead view system for a shovel
JP2013253397A (en) * 2012-06-06 2013-12-19 Hitachi Constr Mach Co Ltd Operation support device for work machine
WO2013183536A1 (en) * 2012-06-08 2013-12-12 日立建機株式会社 Display device for self-propelled industrial machine
JP5961472B2 (en) * 2012-07-27 2016-08-02 日立建機株式会社 Work machine ambient monitoring device
JP6214976B2 (en) * 2013-09-11 2017-10-18 住友建機株式会社 Construction machinery
JP6339337B2 (en) * 2013-09-24 2018-06-06 日立建機株式会社 Moving object detection system around the vehicle
JP2014123955A (en) * 2014-01-17 2014-07-03 Sumitomo Heavy Ind Ltd Shovel
JP6262068B2 (en) 2014-04-25 2018-01-17 日立建機株式会社 Near-body obstacle notification system
KR101518361B1 (en) * 2014-05-27 2015-05-15 주식회사 피엘케이 테크놀로지 Method of monitoring around view preventing collision of operation vehicle, apparatus performing the same and storage media storing the same
JP6204884B2 (en) 2014-07-25 2017-09-27 日立建機株式会社 Peripheral display device for swivel work machine
JP6262644B2 (en) * 2014-12-19 2018-01-17 日立建機株式会社 Work machine ambient monitoring device
US10017112B2 (en) 2015-03-03 2018-07-10 Hitachi Construction Machinery Co., Ltd. Surroundings monitoring device of vehicle
KR20160111102A (en) * 2015-03-16 2016-09-26 두산인프라코어 주식회사 Method of displaying a dead zone of a construction machine and apparatus for performing the same
WO2016157462A1 (en) * 2015-03-31 2016-10-06 株式会社小松製作所 Work-machine periphery monitoring device
JP2016225865A (en) 2015-06-01 2016-12-28 東芝アルパイン・オートモティブテクノロジー株式会社 Overhead image generation apparatus
JP6246185B2 (en) * 2015-12-28 2017-12-13 住友重機械工業株式会社 Excavator
JP6068710B1 (en) * 2016-05-30 2017-01-25 株式会社ネクスコ・エンジニアリング北海道 Overhead image adjustment apparatus and overhead image adjustment program
JP2018085685A (en) * 2016-11-25 2018-05-31 株式会社Jvcケンウッド Overview video generation device, overview video generation system, overview video generation method, and program
WO2019066693A1 (en) * 2017-09-26 2019-04-04 Cargotec Patenter Ab Operator assistance system and a method in relation to the system
JP2018035669A (en) * 2017-11-14 2018-03-08 住友重機械工業株式会社 Shovel and surrounding image generating device of the same

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005311868A (en) * 2004-04-23 2005-11-04 Auto Network Gijutsu Kenkyusho:Kk Vehicle periphery visually recognizing apparatus
JP4847913B2 (en) * 2007-03-30 2011-12-28 日立建機株式会社 Work machine periphery monitoring device

Also Published As

Publication number Publication date
JP2010204821A (en) 2010-09-16


Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20110117

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20111021

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20111101

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20111209

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20120228

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20120312

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20150316

Year of fee payment: 3

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150