WO2015045904A1 - Vehicle-periphery-moving-object detection system - Google Patents

Vehicle-periphery-moving-object detection system

Info

Publication number
WO2015045904A1
WO2015045904A1 · PCT/JP2014/074212 · JP2014074212W
Authority
WO
WIPO (PCT)
Prior art keywords
moving object
object detection
vehicle
video
surrounding
Prior art date
Application number
PCT/JP2014/074212
Other languages
French (fr)
Japanese (ja)
Inventor
善文 福田
小沼 知恵子
守飛 太田
石本 英史
Original Assignee
日立建機株式会社 (Hitachi Construction Machinery Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 日立建機株式会社 (Hitachi Construction Machinery Co., Ltd.)
Publication of WO2015045904A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 - Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20 - Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22 - Real-time viewing arrangements for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23 - Real-time viewing arrangements for viewing an area outside the vehicle, with a predetermined field of view
    • B60R1/27 - Real-time viewing arrangements for viewing an area outside the vehicle, providing all-round vision, e.g. using omnidirectional cameras
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle, characterised by the type of image processing
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The present invention relates to a vehicle-surrounding moving object detection system, and more particularly to a vehicle-surrounding moving object detection system for avoiding contact with an object moving in the vicinity of the vehicle in a construction machine such as a large dump truck working in a mine.
  • As background art in this technical field, there is JP 2010-147523 A (Patent Document 1). This publication describes generating a composite bird's-eye view image that allows a supervisor to easily grasp three-dimensional objects around the vehicle that require attention, while avoiding the disappearance of such objects in the common area.
  • In the conventional vehicle-surroundings detection system, parts of the images around the host vehicle captured by a plurality of cameras are extracted, shape-converted, and synthesized to generate a composite overhead image that shows the user the situation around the host vehicle. In addition, if an object (three-dimensional object) that the user should pay attention to is present in the video, it is highlighted to alert the user.
  • However, in the conventional system described above, a composite overhead image is generated to display the situation around the host vehicle, and objects (three-dimensional objects) are detected only at the displayed locations. Therefore, when a person or another vehicle enters a place that is not displayed in the composite overhead image, such as beneath the body of the host vehicle, the three-dimensional object is not detected and the user cannot be alerted. This could compromise operational safety.
  • The present invention has been made in view of the above, and its purpose is to provide a vehicle-surrounding moving object detection system that detects a moving object even in a place that is not displayed in the composite bird's-eye view image, such as beneath the body of a large construction machine, and that can notify the user of the result.
  • To achieve the above object, a first invention provides a vehicle-surrounding moving object detection system including: a surrounding image input unit that captures a plurality of images around the host vehicle; a composite video construction unit that extracts composite-video composition areas from the plurality of surrounding images captured by the surrounding image input unit and constructs a composite overhead view video by combining the extracted areas; and an output unit that presents the composite overhead view video to the user. The system further includes a moving object detection processing unit that performs moving object detection on the plurality of surrounding images, and the moving object detection processing unit outputs the detection result to the composite overhead image and notifies the user even when a moving object is detected outside the regions shown in the composite overhead image.
  • A second invention, in the first invention, further includes a region information holding unit that holds information on a detection result display non-target region, i.e., a region for which a detection result indicating that a moving object has been detected is not displayed, and a moving object detection processing unit that takes in this information from the region information holding unit and performs moving object detection on the video obtained by excluding the detection result display non-target region from the plurality of surrounding images. The moving object detection processing unit outputs the detection result to the composite bird's-eye video and notifies the user even when a moving object is detected outside the area shown in the composite bird's-eye video.
  • A third invention is characterized in that, in the second invention, the detection result display non-target region is a part of the host vehicle reflected in the plurality of surrounding images.
  • In a fourth invention, in the second invention, when the moving object detection processing unit detects a moving object in the lower part of the host vehicle, it outputs the detection result as a message on the composite overhead view image to notify the user.
  • In a fifth invention, in the second invention, when the moving object detection processing unit detects a moving object in the lower part of the host vehicle, it draws the detection result at the position indicating the host vehicle in the composite overhead view image to notify the user.
  • A sixth invention, in the second invention, further includes an emphasis information holding unit that holds color information indicating the degree of emphasis of the highlighting according to the distance between the host vehicle and the detected moving object. The moving object detection processing unit takes in this color information from the emphasis information holding unit, sets a highlight color from the distance between the host vehicle and the detected moving object, and displays the detected moving object on the composite bird's-eye view image in the set highlight color to notify the user.
  • According to the present invention, even when a moving object is detected at a location that is not included in the portion displayed to the user as the composite bird's-eye view image, the detection result is reported to the user, so the user can quickly grasp the existence and positional relationship of objects requiring attention during the work of the host vehicle. As a result, the confirmation work needed to ensure work safety becomes more efficient, contact with objects such as people or other vehicles can be avoided, and the operational efficiency of the work as a whole can be improved.
  • FIG. 1 is a side view showing a dump truck provided with a first embodiment of a vehicle surrounding moving object detection system of the present invention.
  • A dump truck (vehicle) 1 shown in FIG. 1 mainly includes a vehicle body 2 formed with a sturdy frame structure, a vessel (bed) 3 mounted on the vehicle body 2 so that it can be raised and lowered, and a left front wheel 4A (L) and a left rear wheel 4B (L) mounted on the vehicle body 2.
  • The vehicle body 2 is provided with an engine (not shown) that drives the rear wheels 4B. The engine has, for example, an engine control unit (hereinafter, ECU), and its rotational speed is controlled by controlling the flow rate of fuel supplied in accordance with a command signal from the ECU.
  • The vessel 3 is a container provided to carry a load such as crushed stone, and is connected to the vehicle body 2 via a pin coupling portion 5 so that it can be raised and lowered. Two hoist cylinders 6 are installed at the lower portion of the vessel 3 at a predetermined interval in the width direction of the vehicle. When pressure oil is supplied to and discharged from the hoist cylinders 6, the cylinders extend and retract, raising and lowering the vessel 3. A canopy portion 7 is provided on the upper front side of the vessel 3. The canopy portion 7 protects the cab 8 installed below it (that is, at the front of the vehicle body 2) from scattered objects such as rocks, and also protects the cab 8 if the vehicle overturns.
  • Inside the cab 8 are installed a control device 100 (see FIG. 4) constituting the vehicle-surrounding moving object detection system, a steering wheel (not shown), an accelerator pedal, a brake pedal, and the like (not shown).
  • FIG. 2 is a conceptual diagram explaining the arrangement of the cameras constituting the surrounding image input unit in the first embodiment of the vehicle-surrounding moving object detection system of the present invention, and FIG. 3 is a conceptual diagram showing the images captured by the cameras constituting the first embodiment.
  • In FIG. 2, a front camera 301 that captures the area ahead of the dump truck 1 with a wide-angle lens and a rear camera 303 that captures the area behind the dump truck 1 with a wide-angle lens are provided on the front and rear faces of the vehicle body 2 of the dump truck 1. On the left and right side faces of the vehicle body 2, a left direction camera 302 that captures the area to the left of the dump truck 1 with a wide-angle lens and a right direction camera 304 that captures the area to the right with a wide-angle lens are provided. The vehicle body 2 of the dump truck 1 is also fitted with a right front wheel 4A (R) and a right rear wheel 4B (R).
  • FIG. 3 shows examples of the surrounding images captured by these cameras 301 to 304.
  • Reference numeral 401 denotes an example of a front image captured by the front camera 301.
  • Reference numeral 402 denotes an example of a left direction image captured by the left direction camera 302.
  • Reference numeral 403 denotes an example of a rear image taken by the rear camera 303.
  • Reference numeral 404 denotes an example of a right direction image captured by the right direction camera 304.
  • Since the surrounding images 401 to 404 are each captured with a wide-angle lens, the distant horizon located in the upper part of each image appears curved. In the lower part of each image, part of the near side of the vehicle body 2, parts of the left and right front wheels 4A (R) and 4A (L), and parts of the left and right rear wheels 4B (R) and 4B (L) are visible.
  • a synthesized overhead view video is generated based on these surrounding videos.
  • FIG. 4 is a block diagram showing the configuration of the first embodiment of the vehicle surrounding moving object detection system of the present invention.
  • the first embodiment of the vehicle surrounding moving object detection system includes a vehicle surrounding moving object detection device 100 and a surrounding image input unit 101.
  • The surrounding image input unit 101 includes a plurality of cameras 301 to 304 that each capture the surroundings of the dump truck 1.
  • the surrounding image input unit 101 transmits a plurality of surrounding images 401 to 404, which are photographing results, to the vehicle surrounding moving object detection device 100.
  • the vehicle surrounding moving object detection device 100 includes a moving object detection unit 102, a region information holding unit 103, a composite video construction unit 104, and an output unit 105.
  • The moving object detection unit 102 receives the plurality of surrounding images 401 to 404 transmitted by the surrounding image input unit 101 and an information signal of the detection result display non-target region (a region excluded from detection and display; details are described later) from the region information holding unit 103, and performs moving object detection in the region obtained by excluding the detection result display non-target region from the surrounding images.
  • the moving object detection unit 102 outputs an information request signal to the area information holding unit 103 and outputs a detection position coordinate signal indicating the detected position to the composite video construction unit 104 when a moving object is detected.
  • the area information holding unit 103 holds an information signal of a detection result display non-target area, and outputs the held information signal to the moving object detection unit 102 in response to a request from the moving object detection unit 102.
  • the detection result display non-target area refers to an area in which a detection result indicating that a moving object is detected is not displayed even if a moving object is detected in the surrounding images 401 to 404.
  • In the present embodiment, parts of the host vehicle 1 that appear in the surrounding images, such as the front wheels 4A and rear wheels 4B of the vehicle body 2, are set as the detection result display non-target region.
  • The composite video construction unit 104 receives the plurality of surrounding images 401 to 404 transmitted from the surrounding image input unit 101 and, from the moving object detection unit 102, the detection position coordinate signal indicating the position of the moving object. The composite video construction unit 104 cuts out the necessary parts from the input surrounding videos, transforms them, and combines them to generate the composite overhead video; based on the input detection position coordinate signal, it generates a composite bird's-eye view image in which the region of the detected moving object is highlighted, and outputs it to the output unit 105.
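  • A minimal sketch of the construction step described above, under the assumption that each cut-out region is warped to a top-down view with a per-camera homography obtained offline (for example from calibration markers); the image sizes, homographies, icon rectangle, and function names are assumptions for illustration, not values from the patent.

```python
import cv2
import numpy as np

BEV_SIZE = (600, 600)               # (width, height) of the composite overhead image (assumed)
ICON_RECT = (250, 200, 350, 400)    # area reserved for the own-vehicle icon (assumed)

def build_overhead(frames, homographies, icon):
    """frames and homographies are dicts keyed by 'front', 'left', 'rear', 'right'."""
    bev = np.zeros((BEV_SIZE[1], BEV_SIZE[0], 3), np.uint8)
    for view, frame in frames.items():
        warped = cv2.warpPerspective(frame, homographies[view], BEV_SIZE)
        bev = np.where(warped > 0, warped, bev)   # the four converted views tile without overlap
    x0, y0, x1, y1 = ICON_RECT
    bev[y0:y1, x0:x1] = cv2.resize(icon, (x1 - x0, y1 - y0))   # insert the own-vehicle icon
    return bev

def highlight(bev, box, color=(0, 0, 255)):
    """Draw a rectangle around a converted detection position on the overhead image."""
    x, y, w, h = box
    cv2.rectangle(bev, (x, y), (x + w, y + h), color, thickness=2)
    return bev
```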
  • the output unit 105 includes a display or the like, and displays the synthesized overhead view video input from the synthesized video construction unit 104 on the display or the like.
  • FIG. 5 is a flowchart showing the processing in the first embodiment of the vehicle-surrounding moving object detection system of the present invention. FIG. 6 is a conceptual diagram showing an example of the detection result display non-target regions in the first embodiment. FIG. 7 is a conceptual diagram showing an example of the extraction regions from the surrounding images in the first embodiment. FIG. 8 is a conceptual diagram showing an example of the composite bird's-eye view image and the moving object detection result display in the first embodiment. FIG. 9 is a conceptual diagram explaining moving object detection in the first embodiment. In FIGS. 5 to 9, the same reference numerals as those in FIGS. 1 to 4 denote the same parts, and detailed description thereof is omitted.
  • First, the vehicle-surrounding moving object detection device 100 performs surrounding image input (step S101). Specifically, the surrounding images 401 to 404 captured by the surrounding image input unit 101 are input. Next, the vehicle-surrounding moving object detection device 100 performs the exclusion process for the non-target regions (step S102). Specifically, based on the information on the detection result display non-target regions held by the region information holding unit 103 shown in FIG. 4, the regions in which detection results are not displayed even if a moving object is detected are set.
  • In FIG. 6, the portions surrounded by broken lines are set as the detection result display non-target regions: 501 denotes the front detection result display non-target region in the front video 401, 502 the left direction detection result display non-target region in the left direction video 402, 503 the rear detection result display non-target region in the rear video 403, and 504 the right direction detection result display non-target region in the right direction video 404.
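  • As a rough sketch of how these detection result display non-target regions 501 to 504 might be represented and excluded before detection, the following assumes each region is an axis-aligned rectangle per camera; the rectangle values, dictionary keys, and image size are illustrative assumptions, not values from the patent. A detector built on this mask would simply ignore any change that falls inside the False area.

```python
import numpy as np

# Hypothetical per-camera exclusion rectangles (x0, y0, x1, y1) in image coordinates.
# In the embodiment, regions 501-504 cover the parts of the host vehicle visible in each video.
NON_TARGET_REGIONS = {
    "front": [(0, 500, 1280, 720)],   # illustrative values only
    "left":  [(0, 520, 1280, 720)],
    "rear":  [(0, 500, 1280, 720)],
    "right": [(0, 520, 1280, 720)],
}

def detection_mask(image_shape, regions):
    """Return a boolean mask that is True where moving object detection results may be used."""
    mask = np.ones(image_shape[:2], dtype=bool)
    for x0, y0, x1, y1 in regions:
        mask[y0:y1, x0:x1] = False
    return mask
```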
  • Next, the vehicle-surrounding moving object detection device 100 constructs the composite image (step S103). Specifically, the composite video construction unit 104 shown in FIG. 4 cuts out parts of the front video 401, the left direction video 402, the rear video 403, and the right direction video 404 and arranges them so that their boundaries touch, thereby displaying the surroundings of the dump truck 1 to the front, rear, left, and right. For this purpose, a front extraction region 601, a left direction extraction region 602, a rear extraction region 603, and a right direction extraction region 604 are defined inside the surrounding images 401 to 404. Taking the characteristics of the wide-angle cameras into account, each extraction region is set so that it covers the area from directly beside the dump truck 1 out to a certain distance, and so that the extraction region in each camera's video is adjacent to, without overlapping, the extraction region in the video from the neighboring camera.
  • a synthesized overhead image 605 shown in FIG. 8 is constructed.
  • The front extraction region 601 is converted into a front converted image 606, the left direction extraction region 602 into a left direction converted image 607, the rear extraction region 603 into a rear converted image 608, and the right direction extraction region 604 into a right direction converted image 609.
  • these converted images 606 to 609 are sequentially arranged so as to be adjacent to each other, and in the present embodiment, an image of the bird's-eye view of the periphery of the dump truck 1 is represented.
  • the own vehicle icon 610 is inserted and synthesized in the central portion where the video does not exist to indicate the region where the vehicle body 2 of the dump truck 1 exists.
  • Next, the vehicle-surrounding moving object detection device 100 executes the detection process (step S104). Specifically, the moving object detection unit 102 shown in FIG. 4 performs moving object detection. The detection targets are the front video 401, the left direction video 402, the rear video 403, and the right direction video 404 input from the surrounding image input unit 101; within each video, the front detection result display non-target region 501, the left direction detection result display non-target region 502, the rear detection result display non-target region 503, and the right direction detection result display non-target region 504 are excluded from detection.
  • FIG. 9 illustrates the detection of a moving object in the left direction video 402 and shows, from top to bottom, (a) the image at time t0−1, a time preceding a certain specific time t0, (b) the image at the specific time t0, (c) an image showing the initial moving object detection result, and (d) an image showing the corrected detection result.
  • Part (a) shows the moving object 702 at time t0−1 captured in the left direction video 402, and part (b) shows the moving object 701 at the specific time t0 captured in the left direction video 402. These moving objects 701 and 702 are the same object; the left direction camera 302 has captured it moving as time passes.
  • Unlike the moving object 701 at the specific time t0, the moving object 702 at time t0−1, a time preceding the specific time t0, is present inside the left direction extraction region 602; it is therefore a target of moving object detection and, being a moving object, a target for display.
  • The moving object detection unit 102 shown in FIG. 4 compares these images input from the surrounding image input unit 101 and identifies the part of the image that has changed over time; specifically, it identifies a part such as the left direction detection result 703 indicated by a rectangle in FIG. 9(c). The moving object detection unit 102 then calculates and holds the coordinate information of the left direction detection result 703 on the left direction video 402. Note that no such detection result is obtained when there is no change in the image.
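  • The change-based detection between the frame at time t0−1 and the frame at time t0 can be sketched as simple frame differencing with a threshold, restricted by the exclusion mask from step S102; the threshold, minimum blob area, and OpenCV usage are illustrative assumptions rather than the patent's concrete method.

```python
import cv2
import numpy as np

def detect_motion(prev_frame, curr_frame, mask, thresh=30, min_area=200):
    """Return bounding boxes (x, y, w, h) of image parts that changed between the two frames."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(prev_gray, curr_gray)
    _, binary = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    binary[~mask] = 0    # ignore changes inside the detection result display non-target region
    # OpenCV 4.x: findContours returns (contours, hierarchy)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]
```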
  • the vehicle surrounding moving object detection apparatus 100 determines whether or not a moving object is detected by the detection processing in (Step S104) (Step S105). If a moving object is detected, the process proceeds to (Step S106). Otherwise, the process proceeds to (Step S108).
  • When a moving object has been detected, the vehicle-surrounding moving object detection device 100 calculates the coordinates of the detection location (step S106). Specifically, the moving object detection unit 102 shown in FIG. 4 converts the location of the moving object found in the front video 401, the left direction video 402, the rear video 403, or the right direction video 404 into coordinate information in the composite overhead image 605. The conversion applied to the coordinate information is the same as that used when constructing the composite bird's-eye view image 605 in step S103.
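  • Converting a detected location into coordinates on the composite overhead image 605 with the same transform used in step S103 could look like the following, assuming the per-camera homographies from the construction sketch above; this is an assumption about the form of the transform, not a statement of the patent's exact mapping.

```python
import cv2
import numpy as np

def to_overhead_coords(point, homography):
    """Map an (x, y) point in a surrounding video into the composite overhead image."""
    src = np.array([[point]], dtype=np.float32)   # shape (1, 1, 2), as perspectiveTransform expects
    dst = cv2.perspectiveTransform(src, homography)
    return float(dst[0, 0, 0]), float(dst[0, 0, 1])
```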
  • Here, the moving object 701 at the specific time t0 lies outside the left direction extraction region 602, although it is inside the left direction video 402.
  • Consequently, the converted coordinate information of the moving object 701 would point outside the left direction converted image 607 shown in FIG. 8. The moving object detection unit 102 therefore calculates the location inside the left direction extraction region 602 that is closest to the left direction detection result 703 at the specific time t0 shown in FIG. 9(c), and uses the calculated location as the corrected left direction detection result 704 at the specific time t0 shown in FIG. 9(d).
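  • If the extraction region is treated as an axis-aligned rectangle (an assumption made for this sketch), taking the closest location inside it, i.e. correcting detection result 703 to 704, reduces to clamping a point to the rectangle.

```python
def clamp_to_region(px, py, region):
    """Return the point inside the rectangle (x0, y0, x1, y1) that is closest to (px, py)."""
    x0, y0, x1, y1 = region
    return min(max(px, x0), x1), min(max(py, y0), y1)

# Example with illustrative numbers: a detection just below the extraction region
# is moved up onto its lower edge.
corrected = clamp_to_region(420, 700, (100, 150, 1180, 560))   # -> (420, 560)
```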
  • Next, the vehicle-surrounding moving object detection device 100 performs highlight setting for the detection location (step S107). Specifically, the composite video construction unit 104 sets up highlighting in order to indicate the detection of the moving object to the user. In the present embodiment, the coordinate conversion is applied to the corrected left direction detection result 704 at the specific time t0 shown in FIG. 9(d), and highlighting is set for the resulting post-conversion detection result 801 at the specific time t0 shown in FIG. 8. As described above, the corrected left direction detection result 704 at the specific time t0 lies inside the left direction extraction region 602 at the position closest to the left direction detection result 703 at the specific time t0; accordingly, the post-conversion detection result 801 at the specific time t0 shown in FIG. 8 appears on the left direction converted image 607 of the composite overhead image 605 at the position closest to the moving object indicated by the left direction detection result 703.
  • Next, the vehicle-surrounding moving object detection device 100 outputs the composite image (step S108). Specifically, the composite video construction unit 104 outputs the composite overhead video 605 to the output unit 105. When a moving object has been detected in the processing described above, the output unit 105 outputs a composite bird's-eye view image 605 in which the detection is highlighted, as in the post-conversion detection result 801 at the specific time t0.
  • the vehicle surrounding moving object detection apparatus 100 determines whether the detection and output of the moving object is completed (step S109). If these processes are completed, the process ends. Otherwise, the process returns to (Step S101).
  • By performing the moving object detection and display processing of the present embodiment, the presence of a moving object can be detected even in parts that are not shown in the composite overhead image 605, and its presence can be reported to the user by highlighting. Thus, with the vehicle-surrounding moving object detection system of the present invention, even when a moving object is detected at a location that is not included in the portion displayed to the user as the composite bird's-eye view image 605, the detection result is reported to the user, so the user can quickly grasp the presence and positional relationship of objects requiring attention in the operation of the host vehicle 1. As a result, the confirmation work needed to ensure work safety becomes more efficient, contact with objects such as people or other vehicles can be avoided, and the operational efficiency of the work as a whole can be improved.
  • FIG. 10 is a conceptual diagram showing an example of a composite bird's-eye view image and a moving object detection result display in the second embodiment of the vehicle surrounding moving object detection system of the present invention.
  • the same reference numerals as those shown in FIG. 1 to FIG. 9 are the same parts, and detailed description thereof will be omitted.
  • The vehicle-surrounding moving object detection system in the present embodiment has substantially the same configuration and operation as the first embodiment. The second embodiment differs from the first embodiment in that, when a moving object that is below the host vehicle 1 and therefore not shown in the composite bird's-eye view image 605 is detected, the user is notified by displaying a message to that effect.
  • The second embodiment differs from the first embodiment in the processing of steps S106 and S107 of the flowchart shown in FIG. 5 and in the display of the moving object detection result on the composite overhead view video shown in FIG. 8, so these parts are described below.
  • In step S106, the coordinates of the detection location are calculated in the same manner as in the first embodiment, and based on this information it is determined whether the detected moving object is below the host vehicle 1. Specifically, the moving object detection unit 102 shown in FIG. 4 determines that the moving object exists below the host vehicle 1 when both of the following conditions are satisfied: (1) a moving object is detected in any of the front video 401, the left direction video 402, the rear video 403, and the right direction video 404; and (2) the moving object lies below the lower side of the corresponding extraction region among the front extraction region 601, the left direction extraction region 602, the rear extraction region 603, and the right direction extraction region 604 shown in FIG. 7.
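  • A sketch of this two-part test, assuming each detection is a bounding box (x, y, w, h) in its source video and each extraction region is a rectangle whose fourth value is its lower side; the data layout and the message handling are assumptions made for illustration.

```python
def is_below_vehicle(detections, extraction_regions):
    """detections: dict view -> list of (x, y, w, h); extraction_regions: dict view -> (x0, y0, x1, y1)."""
    for view, boxes in detections.items():
        lower_side = extraction_regions[view][3]
        for _, y, _, h in boxes:
            if y + h > lower_side:   # condition (2): below the extraction region's lower side
                return True          # condition (1) holds because the box came from this video
    return False

def detection_message(detections, extraction_regions):
    if is_below_vehicle(detections, extraction_regions):
        return "A moving object has been detected in the lower part of the own vehicle body"
    return None   # nothing is sent to the output unit 105
```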
  • In step S107, highlight setting is performed for the detection location. Specifically, the composite video construction unit 104 makes the settings for indicating the detection of the moving object to the user. When it has been determined that a moving object exists below the host vehicle 1, the message character string "A moving object has been detected in the lower part of the own vehicle body" is output to the output unit 105; otherwise, the message character string is not output to the output unit 105.
  • FIG. 10 is a display example on the output unit 105 made by these processes, and shows a state in which a moving object is detected in the lower part of the host vehicle 1.
  • In FIG. 10, a detection message 901 indicating that a moving object has been detected in the lower part of the host vehicle 1 is displayed on the composite overhead image 605 shown on the output unit 105. In this way, the user can be informed of a moving object existing below the host vehicle 1 that does not appear in any of the front converted video 606, the left direction converted video 607, the rear converted video 608, and the right direction converted video 609.
  • FIG. 11 is a conceptual diagram showing an example of a composite overhead view image and a moving object detection result display in the third embodiment of the vehicle surrounding moving object detection system of the present invention.
  • the same reference numerals as those shown in FIGS. 1 to 10 are the same parts, and detailed description thereof is omitted.
  • The vehicle-surrounding moving object detection system in the present embodiment has substantially the same configuration and operation as the first embodiment. The third embodiment differs from the first embodiment in that, when a moving object that is below the host vehicle 1 and therefore not shown in the composite overhead view image 605 is detected, the user is notified by displaying the detection result at the position on the own-vehicle icon 610 corresponding to the detected position.
  • The third embodiment differs from the first embodiment in the processing of steps S106 and S107 of the flowchart shown in FIG. 5 and in the display of the moving object detection result on the composite overhead view video shown in FIG. 8, so these parts are described below.
  • In step S106, the coordinates of the detection location are calculated in the same manner as in the first embodiment, but this time the detection result is not corrected: the left direction detection result 703 at the specific time t0 is used in the subsequent processing as it is. Note that the left direction detection result 703 at the specific time t0 is located in the lower part of the host vehicle 1, behind the front wheel 4A, as shown in FIG. 9(c).
  • In step S107, highlight setting is performed for the detection location. Specifically, the composite video construction unit 104 sets up highlighting in order to indicate the detection of the moving object to the user. In the present embodiment, the left direction detection result 703 at the specific time t0 lies outside the left direction extraction region 602, so after coordinate conversion it falls within the display area of the own-vehicle icon 610. For this reason, in FIG. 11, the post-conversion detection result 1001 at the specific time t0 is displayed superimposed on the own-vehicle icon 610. In this way, the user can be informed of a moving object existing below the host vehicle 1 that does not appear in any of the front converted video 606, the left direction converted video 607, the rear converted video 608, and the right direction converted video 609, including its approximate position.
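  • For this third embodiment, drawing the converted detection position on top of the own-vehicle icon 610 might be sketched as follows; the drawing primitive, marker size, and color are assumptions.

```python
import cv2

def draw_detection_on_icon(bev, bev_point, icon_rect, color=(0, 0, 255)):
    """Mark the converted detection position when it falls inside the own-vehicle icon area."""
    x0, y0, x1, y1 = icon_rect
    px, py = bev_point
    if x0 <= px < x1 and y0 <= py < y1:    # the detection lies under the host vehicle
        cv2.circle(bev, (int(px), int(py)), 10, color, thickness=2)
    return bev
```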
  • FIG. 12 is a block diagram showing the configuration of the fourth embodiment of the vehicle-surrounding moving object detection system of the present invention. FIG. 13 is a table showing an example of the table held in the emphasis information holding unit shown in FIG. 12. FIG. 14 is a flowchart showing the processing in the fourth embodiment. FIG. 15 is a conceptual diagram showing an example of an image in which a moving object appears in the fourth embodiment. In FIGS. 12 to 15, the same reference numerals as those in FIGS. 1 to 11 denote the same parts, and detailed description thereof is omitted.
  • the vehicle surrounding moving object detection system in the present embodiment has a configuration and an operation method that are substantially the same as those in the first embodiment.
  • The fourth embodiment differs from the first embodiment in that the closer the moving object is to the host vehicle 1, the more strongly the detection result of the moving object is emphasized when it is displayed to the user. For this purpose, the vehicle-surrounding moving object detection device 100A further includes an emphasis information holding unit 106. The emphasis information holding unit 106 holds, for example as a table, a color information signal indicating the degree of emphasis of the highlighting according to the distance between the host vehicle 1 and the moving object in the video displayed by the output unit 105, and outputs it to the composite video construction unit 104. An example of the table held by the emphasis information holding unit 106 is shown in FIG. 13.
  • When the distance L between the moving object and the host vehicle 1 in the surrounding images 401 to 404 is 300 pixels or more, green is used as the highlight color; as the distance becomes shorter, the highlight color changes to yellow and, at the shortest distances, to red.
  • the emphasis information holding unit 106 can store other types of tables.
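  • The distance-to-color table could be held as an ordered list of thresholds, as sketched below. Only the 300-pixel boundary for green is stated in the text above; the intermediate threshold is a placeholder, not a value from the patent.

```python
GREEN, YELLOW, RED = (0, 255, 0), (0, 255, 255), (0, 0, 255)   # BGR highlight colors

# (minimum distance L in pixels, highlight color); 150 is an illustrative placeholder.
DISTANCE_TABLE = [
    (300, GREEN),    # L >= 300 pixels: far from the host vehicle
    (150, YELLOW),
    (0,   RED),      # closest range
]

def highlight_color(distance_px, table=DISTANCE_TABLE):
    """Return the highlight color for the distance between the moving object and the host vehicle."""
    for min_dist, color in table:
        if distance_px >= min_dist:
            return color
    return table[-1][1]
```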
  • In the flowchart of FIG. 14, step S206 is added between step S106 and step S107; the other steps are the same as in the first embodiment.
  • The vehicle-surrounding moving object detection device 100A calculates the coordinates of the detection location in step S106 in the same manner as in the first embodiment, and then performs the emphasis setting for the region based on those coordinates (step S206). Specifically, the composite video construction unit 104 shown in FIG. 12 calculates the distance L between the moving object and the host vehicle 1 in the surrounding images 401 to 404 and sets the highlight color by referring to the table held by the emphasis information holding unit 106. In the example shown in FIG. 15, the composite video construction unit 104 sets the highlight color to yellow in step S206 in accordance with the calculated distance L. Next, the vehicle-surrounding moving object detection device 100A performs highlight setting for the detection location (step S107); specifically, the composite video construction unit 104 applies the highlight color yellow. Thereafter, in the composite image output of step S108, the moving object 1001 is highlighted in yellow on the composite overhead image 605.
  • In this way, the presence of the moving object 1001 is highlighted and the highlight color is changed according to the distance, so the user can quickly and intuitively grasp, together with the presence of the moving object 1001, the distance L between the moving object 1001 and the host vehicle 1. As a result, safety can be further improved.
  • FIG. 16 is a table showing an example of the table in the emphasis information holding unit in the fifth embodiment of the vehicle-surrounding moving object detection system of the present invention, and FIG. 17 is a conceptual diagram showing an example of an image in which a moving object appears in the fifth embodiment.
  • the vehicle surrounding moving object detection system in the present embodiment has substantially the same configuration and operation method as in the fourth embodiment.
  • the table held by the emphasis information holding unit 106 is different from that in the fourth embodiment.
  • FIG. 16 shows an example of a table held by the emphasis information holding unit 106 of the present embodiment
  • FIG. 17 shows an example of an image showing a moving object.
  • In each surrounding image, an object photographed nearer the top of the image is located farther from the vehicle body 2 of the host vehicle 1, and an object photographed nearer the bottom of the image is located closer to the vehicle body 2. In the present embodiment, the y coordinate, which is the vertical coordinate in each surrounding image, is defined to be 0 at the upper end of the image and to increase toward the bottom; a larger y coordinate therefore indicates a position closer to the host vehicle 1.
  • In the table of FIG. 16, when the y coordinate of the detected moving object is small (that is, the object is far from the host vehicle 1), the highlight color is green; as the y coordinate increases, the highlight color becomes yellow; and when the y coordinate is 300 pixels or more, the highlight color is red.
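  • The fifth embodiment's table keyed on the y coordinate can be sketched in the same way, with a larger y (lower in the image, hence closer to the vehicle) selecting a stronger color; only the 300-pixel boundary for red is given above, and the other threshold is a placeholder.

```python
GREEN, YELLOW, RED = (0, 255, 0), (0, 255, 255), (0, 0, 255)   # BGR, as in the previous sketch

def highlight_color_from_y(y_px):
    """Larger y means lower in the surrounding image, i.e. closer to the host vehicle."""
    if y_px >= 300:      # stated above: 300 pixels or more -> red
        return RED
    if y_px >= 150:      # illustrative placeholder threshold
        return YELLOW
    return GREEN
```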
  • The vehicle-surrounding moving object detection device 100A calculates the coordinates of the detection location in step S106 as in the first embodiment, and these coordinates give the y coordinate value of the moving object 1001 described above. The device then performs the emphasis setting for the region based on this y coordinate of the detection location (step S206). Specifically, the composite video construction unit 104 calculates the y coordinate of the moving object in the surrounding images 401 to 404 and sets the highlight color by referring to the table held by the emphasis information holding unit 106. In the example shown in FIG. 17, the composite video construction unit 104 sets the highlight color to yellow in step S206. Next, the vehicle-surrounding moving object detection device 100A performs highlight setting for the detection location (step S107); specifically, the composite video construction unit 104 applies the highlight color yellow. Thereafter, in the composite image output of step S108, the moving object 1001 is highlighted in yellow on the composite overhead image 605.
  • In the present embodiment, a table is used to set the highlight color from the position of the moving object 1001, but the present invention is not limited to this; for example, the highlight color may be set using a mathematical formula.
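  • One hedged example of such a formula is a linear blend from green (far) to red (near) over the image height; the 720-pixel height is an assumption, not a value from the patent.

```python
def highlight_color_formula(y_px, image_height=720):
    """Blend from green (small y, far) to red (large y, near) in proportion to the y coordinate."""
    t = min(max(y_px / image_height, 0.0), 1.0)
    return (0, int(255 * (1.0 - t)), int(255 * t))   # BGR: green fades out, red fades in
```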
  • The present invention is not limited to the embodiments described above, and various modifications are included. The above embodiments have been described in detail to explain the present invention clearly, and the invention is not necessarily limited to configurations having all of the described elements.
  • Each of the above-described configurations, functions, processing units, processing means, and the like may be realized by hardware by designing a part or all of them with, for example, an integrated circuit.
  • Each of the above-described configurations, functions, and the like may be realized by software by interpreting and executing a program that realizes each function by the processor.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Image Analysis (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Processing (AREA)

Abstract

Provided is a vehicle-periphery-moving-object detection system for detecting a moving object in a location not displayed in a synthesized overhead video such as underneath the chassis of a large construction vehicle, and capable of reporting the results thereof to a user. A vehicle-periphery-moving-object detection system equipped with a peripheral-video input unit for capturing a plurality of videos around a vehicle, a synthesized-video-building unit for extracting regions for synthesized-video building from the plurality of peripheral videos captured by the peripheral-video input unit, and building a synthesized overhead video by synthesizing the plurality of extracted synthesized-video-building regions, and an output unit for presenting the synthesized overhead video to the user, the vehicle-periphery-moving-object detection system also having a moving-object-detection processing unit for performing the moving-object detection processing on the plurality of peripheral videos, and reporting the detection results to the user by outputting the same to the synthesized overhead video, even when a moving object is detected somewhere other than in the regions expressed in the synthesized overhead video.

Description

Vehicle-surrounding moving object detection system
 The present invention relates to a vehicle-surrounding moving object detection system, and more particularly to a vehicle-surrounding moving object detection system for avoiding contact with an object moving in the vicinity of the vehicle in a construction machine such as a large dump truck working in a mine.
 As background art in this technical field, there is JP 2010-147523 A (Patent Document 1). This publication describes generating a composite bird's-eye view image that allows a supervisor to easily grasp three-dimensional objects around the vehicle that require attention, while avoiding the disappearance of such objects in the common area.
JP 2010-147523 A
 In the conventional vehicle-surroundings detection system, parts of the images around the host vehicle captured by a plurality of cameras are extracted, shape-converted, and synthesized to generate a composite overhead image that shows the user the situation around the host vehicle; in addition, if an object (three-dimensional object) that the user should pay attention to is present in the video, it is highlighted to alert the user.
 On the other hand, a large construction machine such as a dump truck for mines has a large space under the body of the vehicle. To ensure safe operation of the construction machine, it is therefore important to confirm that no person or vehicle has entered the space below the vehicle body.
 However, in the conventional system described above, a composite overhead image is generated to display the situation around the host vehicle, and objects (three-dimensional objects) are detected only at the displayed locations. Therefore, when a person or another vehicle enters a place that is not displayed in the composite overhead image, such as beneath the body of the host vehicle, the three-dimensional object is not detected and the user cannot be alerted, which could compromise operational safety.
 The present invention has been made in view of the above, and its purpose is to provide a vehicle-surrounding moving object detection system that detects a moving object even in a place that is not displayed in the composite bird's-eye view image, such as beneath the body of a large construction machine, and that can notify the user of the result.
 To achieve the above object, a first invention provides a vehicle-surrounding moving object detection system including: a surrounding image input unit that captures a plurality of images around the host vehicle; a composite video construction unit that extracts composite-video composition areas from the plurality of surrounding images captured by the surrounding image input unit and constructs a composite overhead view video by combining the extracted areas; and an output unit that presents the composite overhead view video to the user. The system further includes a moving object detection processing unit that performs moving object detection on the plurality of surrounding images, and the moving object detection processing unit outputs the detection result to the composite overhead image and notifies the user even when a moving object is detected outside the regions shown in the composite overhead image.
 A second invention, in the first invention, further includes a region information holding unit that holds information on a detection result display non-target region, i.e., a region for which a detection result indicating that a moving object has been detected is not displayed, and a moving object detection processing unit that takes in this information from the region information holding unit and performs moving object detection on the video obtained by excluding the detection result display non-target region from the plurality of surrounding images; the moving object detection processing unit outputs the detection result to the composite bird's-eye video and notifies the user even when a moving object is detected outside the area shown in the composite bird's-eye video.
 A third invention is characterized in that, in the second invention, the detection result display non-target region is a part of the host vehicle reflected in the plurality of surrounding images.
 In a fourth invention, in the second invention, when the moving object detection processing unit detects a moving object in the lower part of the host vehicle, it outputs the detection result as a message on the composite overhead view image to notify the user.
 In a fifth invention, in the second invention, when the moving object detection processing unit detects a moving object in the lower part of the host vehicle, it draws the detection result at the position indicating the host vehicle in the composite overhead view image to notify the user.
 A sixth invention, in the second invention, further includes an emphasis information holding unit that holds color information indicating the degree of emphasis of the highlighting according to the distance between the host vehicle and the detected moving object; the moving object detection processing unit takes in this color information from the emphasis information holding unit, sets a highlight color from the distance between the host vehicle and the detected moving object, and displays the detected moving object on the composite bird's-eye view image in the set highlight color to notify the user.
 According to the present invention, even when a moving object is detected at a location that is not included in the portion displayed to the user as the composite bird's-eye view image, the detection result is reported to the user, so the user can quickly grasp the existence and positional relationship of objects requiring attention during the work of the host vehicle. As a result, the confirmation work needed to ensure work safety becomes more efficient, contact with objects such as people or other vehicles can be avoided, and the operational efficiency of the work as a whole can be improved.
 FIG. 1 is a side view showing a dump truck provided with the first embodiment of the vehicle-surrounding moving object detection system of the present invention.
 FIG. 2 is a conceptual diagram explaining the arrangement of the cameras constituting the surrounding image input unit in the first embodiment.
 FIG. 3 is a conceptual diagram showing images captured by the cameras constituting the first embodiment.
 FIG. 4 is a block diagram showing the configuration of the first embodiment.
 FIG. 5 is a flowchart showing the processing in the first embodiment.
 FIG. 6 is a conceptual diagram showing an example of the detection result display non-target regions in the first embodiment.
 FIG. 7 is a conceptual diagram showing an example of the extraction regions from the surrounding images in the first embodiment.
 FIG. 8 is a conceptual diagram showing an example of the composite bird's-eye view image and the moving object detection result display in the first embodiment.
 FIG. 9 is a conceptual diagram explaining moving object detection in the first embodiment.
 FIG. 10 is a conceptual diagram showing an example of the composite bird's-eye view image and the moving object detection result display in the second embodiment.
 FIG. 11 is a conceptual diagram showing an example of the composite bird's-eye view image and the moving object detection result display in the third embodiment.
 FIG. 12 is a block diagram showing the configuration of the fourth embodiment.
 FIG. 13 is a table showing an example of the table in the emphasis information holding unit shown in FIG. 12.
 FIG. 14 is a flowchart showing the processing in the fourth embodiment.
 FIG. 15 is a conceptual diagram showing an example of an image in which a moving object appears in the fourth embodiment.
 FIG. 16 is a table showing an example of the table in the emphasis information holding unit in the fifth embodiment.
 FIG. 17 is a conceptual diagram showing an example of an image in which a moving object appears in the fifth embodiment.
 Embodiments of the present invention will be described below with reference to the drawings, taking as an example the case where the invention is applied to a dump truck, a large haulage vehicle that transports crushed stone, minerals, and the like mined at a mine or similar site. The application of the present invention is not limited to dump trucks.
<Example 1>
 FIG. 1 is a side view showing a dump truck provided with the first embodiment of the vehicle-surrounding moving object detection system of the present invention.
 A dump truck (vehicle) 1 shown in FIG. 1 mainly includes a vehicle body 2 formed with a sturdy frame structure, a vessel (bed) 3 mounted on the vehicle body 2 so that it can be raised and lowered, and a left front wheel 4A (L) and a left rear wheel 4B (L) mounted on the vehicle body 2.
 The vehicle body 2 is provided with an engine (not shown) that drives the rear wheels 4B. The engine has, for example, an engine control unit (hereinafter, ECU), and its rotational speed is controlled by controlling the flow rate of fuel supplied in accordance with a command signal from the ECU.
 The vessel 3 is a container provided to carry a load such as crushed stone, and is connected to the vehicle body 2 via a pin coupling portion 5 so that it can be raised and lowered. Two hoist cylinders 6 are installed at the lower portion of the vessel 3 at a predetermined interval in the width direction of the vehicle. When pressure oil is supplied to and discharged from the hoist cylinders 6, the cylinders extend and retract, raising and lowering the vessel 3. A canopy portion 7 is provided on the upper front side of the vessel 3.
 The canopy portion 7 protects the cab 8 installed below it (that is, at the front of the vehicle body 2) from scattered objects such as rocks, and also protects the cab 8 if the vehicle overturns. Inside the cab 8 are installed a control device 100 (see FIG. 4) constituting the vehicle-surrounding moving object detection system, a steering wheel (not shown), an accelerator pedal, a brake pedal, and the like (not shown).
 FIG. 2 is a conceptual diagram explaining the arrangement of the cameras constituting the surrounding image input unit in the first embodiment of the vehicle-surrounding moving object detection system of the present invention, and FIG. 3 is a conceptual diagram showing the images captured by the cameras constituting the first embodiment.
 In FIG. 2, a front camera 301 that captures the area ahead of the dump truck 1 with a wide-angle lens and a rear camera 303 that captures the area behind the dump truck 1 with a wide-angle lens are provided on the front and rear faces of the vehicle body 2 of the dump truck 1. On the left and right side faces of the vehicle body 2, a left direction camera 302 that captures the area to the left of the dump truck 1 with a wide-angle lens and a right direction camera 304 that captures the area to the right with a wide-angle lens are provided. The vehicle body 2 of the dump truck 1 is also fitted with a right front wheel 4A (R) and a right rear wheel 4B (R).
 FIG. 3 shows examples of the surrounding images captured by these cameras 301 to 304. Reference numeral 401 denotes an example of the front video captured by the front camera 301, 402 an example of the left direction video captured by the left direction camera 302, 403 an example of the rear video captured by the rear camera 303, and 404 an example of the right direction video captured by the right direction camera 304.
 Since the surrounding images 401 to 404 are each captured with a wide-angle lens, the distant horizon located in the upper part of each image appears curved. In the lower part of each image, part of the near side of the vehicle body 2, parts of the left and right front wheels 4A (R) and 4A (L), and parts of the left and right rear wheels 4B (R) and 4B (L) are visible. A composite overhead view video is generated based on these surrounding videos.
FIG. 4 is a block diagram showing the configuration of the first embodiment of the vehicle surrounding moving object detection system of the present invention. In FIG. 4, the first embodiment of the vehicle surrounding moving object detection system includes a vehicle surrounding moving object detection device 100 and a surrounding image input unit 101.
The surrounding image input unit 101 includes the plurality of cameras 301 to 304 that capture the surroundings of the dump truck 1. The surrounding image input unit 101 transmits the captured surrounding images 401 to 404 to the vehicle surrounding moving object detection device 100.
The vehicle surrounding moving object detection device 100 includes a moving object detection unit 102, a region information holding unit 103, a composite video construction unit 104, and an output unit 105.
The moving object detection unit 102 receives the surrounding images 401 to 404 transmitted by the surrounding image input unit 101 and, from the region information holding unit 103, an information signal describing the detection result display non-target regions (described in detail later), which are regions excluded from detection and display. It then performs moving object detection on the surrounding images with the detection result display non-target regions excluded. The moving object detection unit 102 outputs an information request signal to the region information holding unit 103 and, when a moving object is detected, outputs a detection position coordinate signal indicating the detected position to the composite video construction unit 104.
The region information holding unit 103 holds the information signal describing the detection result display non-target regions and outputs it to the moving object detection unit 102 in response to a request from that unit. Here, a detection result display non-target region is a region of the surrounding images 401 to 404 in which, even if a moving object is detected, the detection result is not displayed; for example, parts of the host vehicle 1 such as the front wheels 4A and rear wheels 4B of the vehicle body 2 are set as such regions.
The composite video construction unit 104 receives the surrounding images 401 to 404 transmitted by the surrounding image input unit 101 and the detection position coordinate signal indicating the position of the moving object from the moving object detection unit 102. To generate the composite overhead view image, the composite video construction unit 104 cuts out the necessary portions of the input surrounding images, transforms them, and combines them, and, based on the input detection position coordinates, generates a composite overhead view image in which the detected position of the moving object is highlighted, which it outputs to the output unit 105.
The output unit 105 includes a display or the like, and shows the composite overhead view image received from the composite video construction unit 104 on that display.
Next, the processing for moving object detection and display in the first embodiment of the vehicle surrounding moving object detection system of the present invention will be described with reference to FIGS. 5 to 9. FIG. 5 is a flowchart showing the processing in the first embodiment of the vehicle surrounding moving object detection system of the present invention, FIG. 6 is a conceptual diagram showing examples of the detection result display non-target regions in the first embodiment, FIG. 7 is a conceptual diagram showing examples of the extraction regions taken from the surrounding images in the first embodiment, FIG. 8 is a conceptual diagram showing an example of the composite overhead view image and the moving object detection result display in the first embodiment, and FIG. 9 is a conceptual diagram explaining moving object detection in the first embodiment. In FIGS. 5 to 9, parts with the same reference numerals as in FIGS. 1 to 4 are the same parts, and their detailed description is omitted.
In FIG. 5, the vehicle surrounding moving object detection device 100 first inputs the surrounding images (step S101). Specifically, it inputs the surrounding images 401 to 404 captured by the surrounding image input unit 101.
The vehicle surrounding moving object detection device 100 then excludes the non-target regions (step S102). Specifically, based on the detection result display non-target region information held by the region information holding unit 103 shown in FIG. 4, it sets the regions in which detection results are not to be displayed even if a moving object is detected there.
An example of this setting is described with reference to FIG. 6. In the surrounding images 401 to 404 of FIG. 6, the portions enclosed by broken lines are set as detection result display non-target regions. In the front image 401, reference numeral 501 denotes the front detection result display non-target region; in the left image 402, 502 denotes the left detection result display non-target region; in the rear image 403, 503 denotes the rear detection result display non-target region; and in the right image 404, 504 denotes the right detection result display non-target region.
In these detection result display non-target regions, even if a moving object is detected there, no indication that a moving object has been detected is displayed. For example, even if motion of the front wheels 4A or rear wheels 4B of the vehicle body 2 is detected in the left image 402 or the right image 404, no moving object detection signal is output to the output unit 105, so the composite overhead view image does not indicate that a moving object has been detected.
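The suppression described above amounts to discarding any detection whose location falls inside an exclusion region before it reaches the display. A minimal sketch of this filtering step follows; the rectangle representation and the function names are illustrative assumptions and are not taken from the patent.

```python
from typing import List, Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height) in image pixels

def inside(point: Tuple[int, int], rect: Rect) -> bool:
    """Return True if the point lies inside the rectangle."""
    px, py = point
    x, y, w, h = rect
    return x <= px < x + w and y <= py < y + h

def filter_detections(detections: List[Rect], exclusion_regions: List[Rect]) -> List[Rect]:
    """Drop detections whose center falls in any detection-result-display
    non-target region (regions 501 to 504 in the patent)."""
    kept = []
    for (x, y, w, h) in detections:
        center = (x + w // 2, y + h // 2)
        if not any(inside(center, ex) for ex in exclusion_regions):
            kept.append((x, y, w, h))
    return kept

# Example: a detection on the host vehicle's own wheel is suppressed.
wheel_region = (0, 400, 200, 200)          # hypothetical stand-in for region 502
detections = [(50, 450, 40, 40), (300, 200, 30, 60)]
print(filter_detections(detections, [wheel_region]))  # -> [(300, 200, 30, 60)]
```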
Returning to FIG. 5, the vehicle surrounding moving object detection device 100 constructs the composite image (step S103). Specifically, the composite video construction unit 104 shown in FIG. 4 cuts out portions of the front image 401, the left image 402, the rear image 403, and the right image 404 and arranges them so that their boundaries adjoin, thereby displaying the situation all around the dump truck 1, front, rear, left, and right.
This is explained with reference to FIGS. 7 and 8. Inside the surrounding images 401 to 404 of FIG. 7, a front extraction region 601, a left extraction region 602, a rear extraction region 603, and a right extraction region 604 are respectively provided. Based on the characteristics of the wide-angle cameras, the extent of these extraction regions is set so that each covers the area from directly below the dump truck 1 out to a fixed distance, and so that the extraction region in the image from each camera adjoins, without overlapping, the extraction region in the image from the adjacent camera.
Next, coordinate transformation and composition are applied to these extraction regions 601 to 604 to construct the composite overhead view image 605 shown in FIG. 8. In the composite overhead view image 605 of FIG. 8, the front extraction region 601 has been converted into a front converted image 606. Similarly, the left extraction region 602 has been converted into a left converted image 607, the rear extraction region 603 into a rear converted image 608, and the right extraction region 604 into a right converted image 609.
In the composite overhead view image 605, these converted images 606 to 609 are arranged so as to adjoin one another in sequence; in this embodiment they represent an overhead view of the surroundings of the dump truck 1. In the central portion, where no image exists after the converted images are arranged, a host vehicle icon 610 is inset and composited to indicate the region occupied by the vehicle body 2 of the dump truck 1.
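One common way to realize this composition step is to warp each extraction region with a perspective transform into a ground-plane tile and arrange the tiles around a vehicle icon. The sketch below assumes OpenCV-style perspective warps, hand-picked extraction quadrilaterals, and a simple tile layout; a real device would use homographies obtained from camera calibration and would also rotate the side tiles into the overhead orientation.

```python
import numpy as np
import cv2

def warp_region(image, src_quad, dst_size):
    """Warp an extraction region (quadrilateral src_quad, 4 corners in clockwise
    order starting top-left) onto a dst_size = (width, height) tile."""
    w, h = dst_size
    dst_quad = np.float32([[0, 0], [w - 1, 0], [w - 1, h - 1], [0, h - 1]])
    H = cv2.getPerspectiveTransform(np.float32(src_quad), dst_quad)
    return cv2.warpPerspective(image, H, (w, h))

def build_overhead_view(images, quads, t=120):
    """Compose an overhead image in the style of 605: front converted image on
    top, rear at the bottom, left and right at the sides, icon in the centre."""
    canvas = np.zeros((3 * t, 3 * t, 3), dtype=np.uint8)
    canvas[0:t, :] = warp_region(images["front"], quads["front"], (3 * t, t))
    canvas[2 * t:3 * t, :] = warp_region(images["rear"], quads["rear"], (3 * t, t))
    canvas[t:2 * t, 0:t] = warp_region(images["left"], quads["left"], (t, t))
    canvas[t:2 * t, 2 * t:3 * t] = warp_region(images["right"], quads["right"], (t, t))
    canvas[t:2 * t, t:2 * t] = (80, 80, 80)   # placeholder for the vehicle icon 610
    return canvas

# Hypothetical 640x480 inputs with one hand-picked extraction quadrilateral.
imgs = {k: np.zeros((480, 640, 3), dtype=np.uint8) for k in ("front", "left", "rear", "right")}
quad = [(100, 200), (540, 200), (600, 470), (40, 470)]
overhead = build_overhead_view(imgs, {k: quad for k in imgs})
print(overhead.shape)   # (360, 360, 3)
```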
Returning to FIG. 5, the vehicle surrounding moving object detection device 100 executes the detection processing (step S104). Specifically, the moving object detection unit 102 shown in FIG. 4 performs moving object detection. The images subject to detection are the front image 401, the left image 402, the rear image 403, and the right image 404 input from the surrounding image input unit 101; furthermore, within each image, detection results in the front detection result display non-target region 501, the left detection result display non-target region 502, the rear detection result display non-target region 503, and the right detection result display non-target region 504 are excluded.
This is explained with reference to FIG. 9. FIG. 9 illustrates the detection of a moving object in the left image 402 and shows, from top to bottom: (a) the image at time t0-1, a time preceding a particular time t0; (b) the image at the particular time t0; (c) an image showing the initial moving object detection result; and (d) an image showing the corrected detection result.
Here, (a) shows the moving object 702 at time t0-1 captured in the left image 402, and (b) shows the moving object 701 at the particular time t0 captured in the left image 402. The moving objects 701 and 702 are one and the same object, whose movement over time is captured by the left camera 302. In this embodiment, the moving object 701 at the particular time t0 and the moving object 702 at the preceding time t0-1 do not lie inside the left extraction region 602, but because they also lie outside the left detection result display non-target region 502, they are subject to moving object detection and, being moving objects, are subject to display.
The moving object detection unit 102 shown in FIG. 4 compares these images input from the surrounding image input unit 101 and identifies the part of the image that has changed over time. Specifically, it identifies the part shown by the left detection result 703 indicated by the rectangle in FIG. 9(c). The moving object detection unit 102 calculates and holds the coordinates of this left detection result 703 in the left image 402. If no change has occurred in the image, no such detection result is obtained.
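The change detection described here can be sketched as frame differencing between the images at time t0-1 and time t0, followed by extraction of bounding boxes of the changed regions. The thresholds and the OpenCV-based implementation below are illustrative assumptions only.

```python
import numpy as np
import cv2

def detect_motion(prev_frame, curr_frame, diff_thresh=25, min_area=100):
    """Return bounding boxes (x, y, w, h) of regions that changed between two
    consecutive frames, in the style of detection result 703."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(prev_gray, curr_gray)
    _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
    mask = cv2.dilate(mask, None, iterations=2)      # close small gaps
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]

# Synthetic example: a bright square is at a different place in the two frames.
prev = np.zeros((480, 640, 3), dtype=np.uint8)
curr = prev.copy()
prev[300:340, 100:140] = 255
curr[300:340, 160:200] = 255
# Two boxes are reported: where the square left from and where it arrived.
print(detect_motion(prev, curr))
```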
Returning to FIG. 5, the vehicle surrounding moving object detection device 100 determines whether a moving object was detected by the detection processing of step S104 (step S105). If a moving object was detected, the flow proceeds to step S106; otherwise it proceeds to step S108.
The vehicle surrounding moving object detection device 100 calculates the coordinates of the detected location (step S106). Specifically, the moving object detection unit 102 shown in FIG. 4 converts the part of the moving object obtained in the front image 401, the left image 402, the rear image 403, or the right image 404 into coordinate information and calculates the corresponding coordinates in the composite overhead view image 605. The transformation applied to the coordinate information is the same as that used when constructing the composite overhead view image 605 in step S103.
In this step, however, as shown in FIG. 9(b), the moving object 701 at the particular time t0 lies inside the left image 402 but outside the left extraction region 602. Consequently, if this coordinate information were transformed as is, the transformed coordinates of the moving object 701 would point outside the left converted image 607 shown in FIG. 8. The moving object detection unit 102 therefore calculates the location inside the left extraction region 602 that is closest to the left detection result 703 at the particular time t0, and uses this calculated location as the corrected left detection result 704 at the particular time t0, as shown in FIG. 9(d).
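Finding the location inside the extraction region that is closest to a detection lying outside it reduces to clamping the detection coordinates to the region boundary. A minimal sketch, assuming the extraction region can be approximated by an axis-aligned rectangle; the coordinates used are hypothetical.

```python
from typing import Tuple

Rect = Tuple[int, int, int, int]   # (x, y, width, height)

def clamp_to_region(point: Tuple[int, int], region: Rect) -> Tuple[int, int]:
    """Return the point inside 'region' nearest to 'point' (the corrected
    detection result 704 for an initial detection 703 outside region 602)."""
    px, py = point
    x, y, w, h = region
    cx = min(max(px, x), x + w - 1)
    cy = min(max(py, y), y + h - 1)
    return cx, cy

# Hypothetical left extraction region and a detection below its lower edge.
left_extraction_region = (50, 100, 400, 200)   # stand-in for region 602
initial_detection = (220, 420)                 # stand-in for detection 703
print(clamp_to_region(initial_detection, left_extraction_region))  # (220, 299)
```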
Returning to FIG. 5, the vehicle surrounding moving object detection device 100 sets the highlighting of the detected location (step S107). Specifically, the composite video construction unit 104 configures the highlighting that will indicate the detection of the moving object to the user.
This is explained with reference to FIGS. 8 and 9. In the composite overhead view image 605 of FIG. 8, the corrected left detection result 704 at the particular time t0 shown in FIG. 9(d) is transformed, and highlighting is set for the resulting coordinate-converted detection result 801 at the particular time t0.
In FIGS. 9(c) and 9(d), the corrected left detection result 704 at the particular time t0 lies inside the left extraction region 602, at the position closest to the left detection result 703 at the particular time t0. The coordinate-converted detection result 801 at the particular time t0 shown in FIG. 8 therefore lies on the left converted image 607 of the composite overhead view image 605, at the position closest to the moving object indicated by the left detection result 703 at the particular time t0.
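Mapping the corrected detection point into the overhead view can reuse the same perspective transform as the image composition, after which a marker such as the triangle of FIG. 8 is drawn at the resulting position. The sketch below assumes a homography H of the kind produced by cv2.getPerspectiveTransform for the relevant extraction region; in the tiled layout sketched earlier, the tile's offset in the canvas would still have to be added to obtain canvas coordinates.

```python
import numpy as np
import cv2

def to_overhead_coords(point, H):
    """Apply the composition homography H to a single image point, giving its
    position in the converted image (for example 607)."""
    src = np.float32([[point]])                 # shape (1, 1, 2)
    dst = cv2.perspectiveTransform(src, H)
    return int(dst[0, 0, 0]), int(dst[0, 0, 1])

def draw_detection_marker(overhead, point, color=(0, 0, 255)):
    """Highlight the coordinate-converted detection result (801) with a small
    triangle, as in the embodiment's display."""
    x, y = point
    tri = np.array([[x, y - 12], [x - 10, y + 8], [x + 10, y + 8]], dtype=np.int32)
    cv2.fillPoly(overhead, [tri], color)
    return overhead

# Example with an identity homography (real code reuses H from composition).
H = np.eye(3, dtype=np.float32)
canvas = np.zeros((360, 360, 3), dtype=np.uint8)
pt = to_overhead_coords((120, 200), H)
draw_detection_marker(canvas, pt)
print(pt)   # (120, 200)
```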
Returning to FIG. 5, the vehicle surrounding moving object detection device 100 outputs the composite image (step S108). Specifically, the composite video construction unit 104 outputs the composite overhead view image 605 to the output unit 105. In the processing described above, when a moving object has been detected, a composite overhead view image 605 with highlighting such as the coordinate-converted detection result 801 at the particular time t0 is output to the output unit 105.
In this embodiment, the corrected left detection result 704 at the particular time t0, obtained by correcting the left detection result 703 at the particular time t0, is used for display, and it is highlighted with a triangular shape as shown in FIG. 8. If no moving object is detected in the processing described above, no highlighting is performed.
Returning to FIG. 5, the vehicle surrounding moving object detection device 100 determines whether the detection and output of moving objects is complete (step S109). If these processes are complete, the processing ends; otherwise the flow returns to step S101.
Because the moving object detection and display processing of this embodiment is carried out as described above, the presence or absence of a moving object can be detected even in a part of the scene that is not shown in the composite overhead view image 605, and the user can be informed of its presence by highlighting it.
According to the first embodiment of the vehicle surrounding moving object detection system of the present invention described above, even when a moving object is detected at a location around the host vehicle 1 that is not included in the portion displayed to the user as the composite overhead view image 605, the detection result is reported to the user, so the user can quickly grasp the presence and relative position of objects that require attention during operation of the host vehicle 1. This makes the confirmation work needed to ensure operational safety more efficient and makes it possible to avoid contact with objects such as people and other vehicles. As a result, the operational efficiency of the work as a whole can be improved.
<Example 2>
A second embodiment of the vehicle surrounding moving object detection system of the present invention will now be described with reference to the drawings. FIG. 10 is a conceptual diagram showing an example of the composite overhead view image and the moving object detection result display in the second embodiment of the vehicle surrounding moving object detection system of the present invention. In FIG. 10, parts with the same reference numerals as in FIGS. 1 to 9 are the same parts, and their detailed description is omitted.
The configuration and operation of the vehicle surrounding moving object detection system in this embodiment are largely the same as in the first embodiment. The second embodiment differs from the first in that, when a moving object that is under the host vehicle 1 and not shown in the composite overhead view image 605 is detected, the user is informed of this by a displayed message.
Specifically, this embodiment differs from the first embodiment in the processing of steps S106 and S107 of the flowchart shown in FIG. 5 and in the moving object detection result display in the composite overhead view image shown in FIG. 8, so these parts are described below.
In this embodiment, in step S106, the coordinates of the detected location are calculated as in the first embodiment, and based on this information it is judged whether the detected moving object is under the host vehicle 1. Specifically, the moving object detection unit 102 shown in FIG. 4 judges that the moving object is under the host vehicle 1 when all of the following conditions are satisfied (a sketch of this test follows the list).
(1) The moving object appears in at least one of the front image 401, the left image 402, the rear image 403, and the right image 404.
(2) The moving object lies below the lower edge of the corresponding front extraction region 601, left extraction region 602, rear extraction region 603, or right extraction region 604 shown in FIG. 7.
(3) The moving object lies in a region not included in any of the front detection result display non-target region 501, the left detection result display non-target region 502, the rear detection result display non-target region 503, and the right detection result display non-target region 504 shown in FIG. 6.
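A minimal sketch of this three-condition test is given below; the rectangle representation, the function names, and the numeric values are illustrative assumptions only.

```python
from typing import List, Optional, Tuple

Rect = Tuple[int, int, int, int]   # (x, y, width, height)

def contains(rect: Rect, point: Tuple[int, int]) -> bool:
    x, y, w, h = rect
    px, py = point
    return x <= px < x + w and y <= py < y + h

def is_under_vehicle(detection_center: Optional[Tuple[int, int]],
                     extraction_region: Rect,
                     exclusion_regions: List[Rect]) -> bool:
    """Apply the three conditions: (1) a detection exists in this camera image,
    (2) it lies below the extraction region's lower edge, and (3) it is not in
    any detection-result-display non-target region."""
    if detection_center is None:                       # condition (1)
        return False
    _, y, _, h = extraction_region
    below_lower_edge = detection_center[1] >= y + h    # condition (2)
    outside_exclusions = not any(contains(ex, detection_center)
                                 for ex in exclusion_regions)  # condition (3)
    return below_lower_edge and outside_exclusions

# Hypothetical values standing in for extraction region 602 and region 502.
print(is_under_vehicle((220, 420), (50, 100, 400, 200), [(0, 380, 120, 100)]))  # True
```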
Next, in step S107, the highlighting of the detected location is set. Specifically, the composite video construction unit 104 performs the settings for indicating the detection of the moving object to the user. When the detected moving object is under the host vehicle 1, it is set so that the message string "A moving object has been detected under the vehicle body." is output to the output unit 105. Likewise, when the detected moving object is not under the host vehicle 1, it is set so that no message string is output to the output unit 105.
FIG. 10 shows an example of the display on the output unit 105 produced by this processing, illustrating the situation when a moving object is detected under the host vehicle 1. As a result of the processing of steps S106 and S107, a detection message 901 indicating that a moving object has been detected under the host vehicle 1 is displayed on the composite overhead view image 605 shown on the output unit 105. This makes it possible to inform the user of the state of a moving object under the host vehicle 1 that does not appear in any of the front converted image 606, the left converted image 607, the rear converted image 608, and the right converted image 609.
According to the second embodiment of the vehicle surrounding moving object detection system of the present invention described above, the same effects as in the first embodiment can be obtained.
Furthermore, according to the second embodiment of the vehicle surrounding moving object detection system of the present invention described above, the user can quickly and specifically grasp that a moving object is present under the host vehicle 1, so the confirmation work needed to ensure operational safety can be made more efficient. As a result, greater safety can be ensured.
<Example 3>
A third embodiment of the vehicle surrounding moving object detection system of the present invention will now be described with reference to the drawings. FIG. 11 is a conceptual diagram showing an example of the composite overhead view image and the moving object detection result display in the third embodiment of the vehicle surrounding moving object detection system of the present invention. In FIG. 11, parts with the same reference numerals as in FIGS. 1 to 10 are the same parts, and their detailed description is omitted.
The configuration and operation of the vehicle surrounding moving object detection system in this embodiment are largely the same as in the first embodiment. The third embodiment differs from the first in that, when a moving object that is under the host vehicle 1 and not shown in the composite overhead view image 605 is detected, the user is informed by displaying the detection result at the position on the host vehicle icon 610 corresponding to the detected position.
Specifically, this embodiment differs from the first embodiment in the processing of steps S106 and S107 of the flowchart shown in FIG. 5 and in the moving object detection result display in the composite overhead view image shown in FIG. 8, so these parts are described below.
In this embodiment, in step S106, the coordinates of the detected location are calculated as in the first embodiment, but the detection result is not corrected; the left detection result 703 at the particular time t0 is used in the subsequent processing. As shown in FIG. 9(c), the left detection result 703 at the particular time t0 lies under the host vehicle 1, behind the front wheel 4A.
Next, in step S107, the highlighting of the detected location is set. Specifically, the composite video construction unit 104 configures the highlighting that will indicate the detection of the moving object to the user.
This is explained with reference to FIG. 11. In the composite overhead view image 605 of FIG. 11, the left detection result 703 at the particular time t0 shown in FIG. 9(c) is transformed, and highlighting is set for the resulting coordinate-converted detection result 1001 at the particular time t0.
In FIG. 9(c), the left detection result 703 at the particular time t0 lies outside the left extraction region 602, so after the coordinate transformation it falls inside the display area of the host vehicle icon 610 in FIG. 11. For this reason, in FIG. 11 the coordinate-converted detection result 1001 at the particular time t0 is displayed superimposed on the host vehicle icon 610. This makes it possible to inform the user, including its approximate position, of the state of a moving object under the host vehicle 1 that does not appear in any of the front converted image 606, the left converted image 607, the rear converted image 608, and the right converted image 609.
According to the third embodiment of the vehicle surrounding moving object detection system of the present invention described above, the same effects as in the first embodiment can be obtained.
Furthermore, according to the third embodiment of the vehicle surrounding moving object detection system of the present invention described above, the user can quickly grasp both that a moving object is present under the host vehicle 1 and its approximate position, so the confirmation work needed to ensure operational safety can be made more efficient. As a result, greater safety can be ensured.
<Example 4>
A fourth embodiment of the vehicle surrounding moving object detection system of the present invention will now be described with reference to the drawings. FIG. 12 is a block diagram showing the configuration of the fourth embodiment of the vehicle surrounding moving object detection system of the present invention, FIG. 13 is a table showing an example of the table in the emphasis information holding unit shown in FIG. 12, FIG. 14 is a flowchart showing the processing in the fourth embodiment of the vehicle surrounding moving object detection system of the present invention, and FIG. 15 is a conceptual diagram showing an example of an image in which a moving object appears in the fourth embodiment of the vehicle surrounding moving object detection system of the present invention. In FIGS. 12 to 15, parts with the same reference numerals as in FIGS. 1 to 11 are the same parts, and their detailed description is omitted.
The configuration and operation of the vehicle surrounding moving object detection system in this embodiment are largely the same as in the first embodiment. The fourth embodiment differs from the first in that, when a moving object is closer to the host vehicle 1, its detection result is displayed with stronger emphasis to inform the user.
Specifically, as shown in FIG. 12, the vehicle surrounding moving object detection device 100A further includes an emphasis information holding unit 106.
For the image displayed on the output unit 105, the emphasis information holding unit 106 holds, for example as a table, color information indicating the degree of emphasis of the highlighting according to the distance between the host vehicle 1 and the moving object, and outputs it to the composite video construction unit 104.
FIG. 13 shows an example of the table held by the emphasis information holding unit 106. In this embodiment, when the distance L between the moving object and the host vehicle 1 in the surrounding images 401 to 404 is 300 pixels or more, green is used as the highlight color. Similarly, when the distance L is 150 pixels or more and less than 300 pixels the highlight color is yellow, and when the distance is 0 pixels or more and less than 150 pixels the highlight color is red. The emphasis information holding unit 106 can also store other kinds of tables.
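The table lookup amounts to mapping the pixel distance onto one of three color bands. A minimal sketch using the thresholds quoted above; the concrete RGB values chosen for green, yellow, and red are assumptions.

```python
def highlight_color(distance_px: float) -> tuple:
    """Return an (R, G, B) highlight color for a given distance in pixels
    between the moving object and the host vehicle, per the FIG. 13 table."""
    if distance_px < 150:
        return (255, 0, 0)      # red: closest band, strongest emphasis
    if distance_px < 300:
        return (255, 255, 0)    # yellow: intermediate band
    return (0, 255, 0)          # green: 300 pixels or more

print(highlight_color(200))     # (255, 255, 0), the yellow case of FIG. 15
```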
Next, the processing for moving object detection and display in the fourth embodiment of the vehicle surrounding moving object detection system of the present invention will be described with reference to FIGS. 14 and 15. The processing in this embodiment differs from that of the first embodiment shown in FIG. 5 in that step S206 is added between step S106 and step S107; otherwise it is the same.
In this embodiment, in step S106 the vehicle surrounding moving object detection device 100A calculates the coordinates of the detected location as in the first embodiment, and then, based on these coordinates, sets the emphasis for the region contents (step S206). Specifically, the composite video construction unit 104 shown in FIG. 12 calculates the distance L between the moving object and the host vehicle 1 in the surrounding images 401 to 404 and sets the highlight color by referring to the table held in the emphasis information holding unit 106.
As shown in FIG. 15, for example, when the distance L between the moving object 1001 and the host vehicle 1 in the front image 401 is 200 pixels, the composite video construction unit 104 sets the highlight color to yellow in step S206.
Returning to FIG. 14, the vehicle surrounding moving object detection device 100A sets the highlighting of the detected location (step S107). Specifically, the composite video construction unit 104 sets the highlight color to yellow. Thereafter, in the composite image output of step S108, the moving object 1001 is highlighted in yellow on the composite overhead view image 605.
As a result of this processing, when the moving object 1001 approaches the host vehicle 1, for example, the degree of emphasis of the highlight color increases from green to yellow and from yellow to red as the distance decreases. This allows the presence of a moving object 1001 near the host vehicle 1, which is important for ensuring driving safety, to be emphasized to the user.
According to the fourth embodiment of the vehicle surrounding moving object detection system of the present invention described above, the same effects as in the first embodiment can be obtained.
Furthermore, according to the fourth embodiment of the vehicle surrounding moving object detection system of the present invention described above, the presence of the moving object 1001 is highlighted and the highlight color is changed according to the distance, so the user can intuitively and quickly grasp not only the presence of the moving object 1001 but also the distance L between the moving object 1001 and the host vehicle 1. As a result, greater safety can be ensured.
In this embodiment the highlighting of the moving object 1001 is varied by switching colors, but the invention is not limited to this. For example, the degree of emphasis can also be associated with information such as the thickness of the displayed line or the speed at which the display blinks.
<Example 5>
A fifth embodiment of the vehicle surrounding moving object detection system of the present invention will now be described with reference to the drawings. FIG. 16 is a table showing an example of the table in the emphasis information holding unit in the fifth embodiment of the vehicle surrounding moving object detection system of the present invention, and FIG. 17 is a conceptual diagram showing an example of an image in which a moving object appears in the fifth embodiment of the vehicle surrounding moving object detection system of the present invention. In FIGS. 16 and 17, parts with the same reference numerals as in FIGS. 1 to 15 are the same parts, and their detailed description is omitted.
The configuration and operation of the vehicle surrounding moving object detection system in this embodiment are largely the same as in the fourth embodiment. In the fifth embodiment, the table held by the emphasis information holding unit 106 differs from that of the fourth embodiment.
FIG. 16 shows an example of the table held by the emphasis information holding unit 106 of this embodiment, and FIG. 17 shows an example of an image in which a moving object appears. As shown in the surrounding images 401 to 404 described above, owing to the imaging conditions, an object captured higher up in a surrounding image is farther from the vehicle body 2 of the host vehicle 1, and an object captured lower down in a surrounding image is closer to the vehicle body 2 of the host vehicle 1.
Accordingly, in this embodiment the y coordinate, that is, the vertical coordinate in each surrounding image, is defined with the top edge of each surrounding image as 0 and increasing with downward movement. A larger y coordinate value therefore indicates a position closer to the host vehicle 1.
As shown in FIG. 16, in this embodiment, taking the uppermost vertical pixel position of the surrounding images 401 to 404 as 0 with the coordinate value increasing downward, the highlight color is green when the vertical coordinate is 0 pixels or more and less than 150 pixels. Similarly, the highlight color is yellow when the coordinate is 150 pixels or more and less than 300 pixels, and red when the coordinate is 300 pixels or more.
Next, the processing for moving object detection and display in the fifth embodiment of the vehicle surrounding moving object detection system of the present invention will be described with reference to FIGS. 14 and 17.
In this embodiment, in step S106 the vehicle surrounding moving object detection device 100A calculates the coordinates of the detected location as in the first embodiment, but these coordinates are converted into the y coordinate value of the moving object 1001 described above.
The vehicle surrounding moving object detection device 100A then sets the emphasis for the region contents based on the y coordinate of this detected location (step S206). Specifically, the composite video construction unit 104 calculates the y coordinate v of the moving object in the surrounding images 401 to 404 and sets the highlight color by referring to the table held in the emphasis information holding unit 106.
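Because a larger y coordinate means the object appears lower in the image and is therefore closer to the vehicle, the FIG. 16 lookup reverses the order of the bands used in the fourth embodiment. A minimal sketch with the quoted thresholds; the RGB values are again assumptions.

```python
def highlight_color_by_y(y_px: int) -> tuple:
    """Return an (R, G, B) highlight color from the vertical position of the
    detection in the source image (top edge = 0), per the FIG. 16 table."""
    if y_px < 150:
        return (0, 255, 0)      # green: high in the image, far from the vehicle
    if y_px < 300:
        return (255, 255, 0)    # yellow: intermediate band
    return (255, 0, 0)          # red: low in the image, close to the vehicle

print(highlight_color_by_y(250))   # (255, 255, 0), the v = 250 pixel case of FIG. 17
```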
As shown in FIG. 17, for example, when the y coordinate value v of the moving object 1001 in the front image 401 is 250 pixels, the composite video construction unit 104 sets the highlight color to yellow in step S206.
Returning to FIG. 14, the vehicle surrounding moving object detection device 100A sets the highlighting of the detected location (step S107). Specifically, the composite video construction unit 104 sets the highlight color to yellow. Thereafter, in the composite image output of step S108, the moving object 1001 is highlighted in yellow on the composite overhead view image 605.
According to the fifth embodiment of the vehicle surrounding moving object detection system of the present invention described above, the same effects as in the first and fourth embodiments can be obtained.
In this embodiment a table is used to set the highlight color from the position of the moving object 1001, but the invention is not limited to this; the color may, for example, be set using a formula or the like.
The present invention is not limited to the embodiments described above and includes various modifications. For example, the embodiments have been described in detail for ease of understanding of the present invention and are not necessarily limited to configurations having all of the described elements. Some or all of the configurations, functions, processing units, processing means, and the like described above may be realized in hardware, for example by designing them as integrated circuits. The configurations, functions, and the like described above may also be realized in software by having a processor interpret and execute programs that implement the respective functions.
1    Dump truck (vehicle)
2    Vehicle body
3    Vessel (loading platform)
4A   Front wheel
4B   Rear wheel
8    Cab
100  Vehicle surrounding moving object detection device
101  Surrounding image input unit
102  Moving object detection unit
103  Region information holding unit
104  Composite video construction unit
105  Output unit
301  Front camera
302  Left camera
303  Rear camera
304  Right camera
401  Front image
402  Left image
403  Rear image
404  Right image
501  Front detection result display non-target region
502  Left detection result display non-target region
503  Rear detection result display non-target region
504  Right detection result display non-target region
601  Front extraction region
602  Left extraction region
603  Rear extraction region
604  Right extraction region
605  Composite overhead view image
606  Front converted image
607  Left converted image
608  Rear converted image
609  Right converted image
610  Host vehicle icon
701  Moving object at time t0
702  Moving object at time t0-1
703  Left detection result at time t0
704  Corrected left detection result at time t0
801  Coordinate-converted detection result at time t0
901  Detection message
1001 Moving object

Claims (6)

1.  A vehicle surrounding moving object detection system comprising: a surrounding image input unit that captures a plurality of images of the surroundings of a host vehicle; a composite video construction unit that extracts composite-image construction regions from the plurality of surrounding images captured by the surrounding image input unit and combines the extracted composite-image construction regions to construct a composite overhead view image; and an output unit that presents the composite overhead view image to a user,
     the system further comprising a moving object detection processing unit that performs moving object detection processing on the plurality of surrounding images,
     wherein the moving object detection processing unit, even when it detects a moving object outside the region shown in the composite overhead view image, outputs the detection result onto the composite overhead view image to notify the user.
2.  The vehicle surrounding moving object detection system according to claim 1, further comprising:
     a region information holding unit that holds information on detection result display non-target regions, which are regions in which a detection result indicating that a moving object has been detected is not displayed; and
     a moving object detection processing unit that takes in the information on the detection result display non-target regions from the region information holding unit and performs moving object detection processing on the plurality of surrounding images with the detection result display non-target regions excluded,
     wherein the moving object detection processing unit, even when it detects a moving object outside the region shown in the composite overhead view image, outputs the detection result onto the composite overhead view image to notify the user.
3.  The vehicle surrounding moving object detection system according to claim 2,
     wherein the detection result display non-target regions are parts of the host vehicle that appear in the plurality of surrounding images.
4.  The vehicle surrounding moving object detection system according to claim 2,
     wherein the moving object detection processing unit, when it detects a moving object under the host vehicle, outputs the detection result as a message on the composite overhead view image to notify the user.
5.  The vehicle surrounding moving object detection system according to claim 2,
     wherein the moving object detection processing unit, when it detects a moving object under the host vehicle, draws the detection result at the position indicating the host vehicle in the composite overhead view image to notify the user.
6.  The vehicle surrounding moving object detection system according to claim 2, further comprising an emphasis information holding unit that holds color information indicating the degree of emphasis of highlighting according to the distance between the host vehicle and a detected moving object,
     wherein the moving object detection processing unit takes in the color information indicating the degree of emphasis of the highlighting from the emphasis information holding unit, sets a highlight color from the distance between the host vehicle and the detected moving object, and displays the detected moving object on the composite overhead view image in the set highlight color to notify the user.
PCT/JP2014/074212 2013-09-24 2014-09-12 Vehicle-periphery-moving-object detection system WO2015045904A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013-197091 2013-09-24
JP2013197091A JP6339337B2 (en) 2013-09-24 2013-09-24 Moving object detection system around the vehicle

Publications (1)

Publication Number Publication Date
WO2015045904A1 true WO2015045904A1 (en) 2015-04-02

Family

ID=52743046

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/074212 WO2015045904A1 (en) 2013-09-24 2014-09-12 Vehicle-periphery-moving-object detection system

Country Status (2)

Country Link
JP (1) JP6339337B2 (en)
WO (1) WO2015045904A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019121250A (en) * 2018-01-09 2019-07-22 日立建機株式会社 Transport vehicle

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160136779A (en) * 2015-05-21 2016-11-30 주식회사 케이에스에스이미지넥스트 System and Method for Displaying State of Vehicle
JP6581519B2 (en) * 2016-02-09 2019-09-25 日立建機株式会社 Obstacle detection system for construction machinery
WO2018055873A1 (en) * 2016-09-20 2018-03-29 株式会社Jvcケンウッド Overhead view video image generation device, overhead view video image generation system, overhead view video image generation method, and program
JP6644264B2 (en) * 2016-09-21 2020-02-12 株式会社Jvcケンウッド Overhead video generation device, overhead video generation system, overhead video generation method and program
JP6730617B2 (en) * 2016-09-20 2020-07-29 株式会社Jvcケンウッド Overhead video generation device, overhead video generation system, overhead video generation method and program
JP6700644B2 (en) * 2017-03-29 2020-05-27 日立建機株式会社 Excavation condition monitoring device
JP7374602B2 (en) * 2019-03-29 2023-11-07 日立建機株式会社 work vehicle

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007230319A (en) * 2006-02-28 2007-09-13 Hitachi Ltd Start security device
JP2009071790A (en) * 2007-09-18 2009-04-02 Denso Corp Vehicle surroundings monitoring apparatus
WO2012169359A1 (en) * 2011-06-07 2012-12-13 株式会社小松製作所 Dump truck
JP2012254660A (en) * 2011-06-07 2012-12-27 Komatsu Ltd Periphery monitoring device of work vehicle

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4601505B2 (en) * 2005-07-20 2010-12-22 アルパイン株式会社 Top-view image generation apparatus and top-view image display method
JP4687411B2 (en) * 2005-11-15 2011-05-25 株式会社デンソー Vehicle peripheral image processing apparatus and program
JP4846426B2 (en) * 2006-04-20 2011-12-28 パナソニック株式会社 Vehicle perimeter monitoring device
JP4951639B2 (en) * 2009-03-02 2012-06-13 日立建機株式会社 Work machine with ambient monitoring device

Also Published As

Publication number Publication date
JP6339337B2 (en) 2018-06-06
JP2015065515A (en) 2015-04-09

Similar Documents

Publication Publication Date Title
JP6339337B2 (en) Moving object detection system around the vehicle
JP6262068B2 (en) Near-body obstacle notification system
CN106797450B (en) Vehicle body external moving object detection device
JP5529943B2 (en) Work vehicle periphery monitoring system and work vehicle
US10781572B2 (en) Working machine
JP5456123B1 (en) Work vehicle periphery monitoring system and work vehicle
CN109691088B (en) Image processing apparatus, image processing method, and program
JP5550695B2 (en) Work vehicle periphery monitoring system and work vehicle
JP5926315B2 (en) Work vehicle periphery monitoring system and work vehicle
KR20170039615A (en) Periphery monitoring device for crawler-type working machine
US10017112B2 (en) Surroundings monitoring device of vehicle
WO2017038123A1 (en) Device for monitoring area around vehicle
WO2018159019A1 (en) Bird's-eye-view video image generation device, bird's-eye-view video image generation system, bird's-eye-view video image generation method, and program
JP6401141B2 (en) Vehicle obstacle detection device
JP2017074871A (en) Device for detecting obstacle around vehicle
JP6796518B2 (en) Moving object detection system
JP5788048B2 (en) Work vehicle periphery monitoring system and work vehicle
JP6802196B2 (en) Transport vehicle
JP7180172B2 (en) OVERALL VIEW IMAGE GENERATING DEVICE, OVERALL VIEW IMAGE GENERATING METHOD AND PROGRAM
JP2018142883A (en) Bird's eye video creation device, bird's eye video creation system, bird's eye video creation method, and program
JP7145137B2 (en) Working machine controller
JP2015029357A (en) Periphery monitoring system for dump truck and dump truck
JP2014143713A (en) Periphery monitoring system for work vehicle and work vehicle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 14849436
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 14849436
    Country of ref document: EP
    Kind code of ref document: A1