WO2015045904A1 - Vehicle surrounding moving object detection system - Google Patents

Vehicle surrounding moving object detection system Download PDF

Info

Publication number
WO2015045904A1
WO2015045904A1 (PCT/JP2014/074212)
Authority
WO
WIPO (PCT)
Prior art keywords
moving object
object detection
vehicle
video
surrounding
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2014/074212
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
善文 福田
小沼 知恵子
守飛 太田
石本 英史
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Construction Machinery Co Ltd
Original Assignee
Hitachi Construction Machinery Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Construction Machinery Co Ltd filed Critical Hitachi Construction Machinery Co Ltd
Publication of WO2015045904A1 publication Critical patent/WO2015045904A1/ja
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Images

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/27 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The present invention relates to a vehicle surrounding moving object detection system, and more particularly to a vehicle surrounding moving object detection system for avoiding contact with objects moving in the vicinity of a vehicle such as a construction machine, for example a large dump truck working in a mine.
  • As background art, JP 2010-147523 A (Patent Document 1) describes generating "a composite bird's-eye view image that allows a supervisor to easily grasp a three-dimensional object around the vehicle to be watched, while avoiding the disappearance of the three-dimensional object in the common area".
  • In the conventional vehicle surrounding detection system, parts of the images around the host vehicle captured by a plurality of cameras are extracted, shape-converted, and combined to generate a composite overhead image, which shows the user the situation around the host vehicle. In addition, if an object (three-dimensional object) that the user should heed appears in the video, the user is warned by highlighting it.
  • In other words, a composite overhead image is generated to display the situation around the host vehicle, and objects (three-dimensional objects) are detected only at the locations that are displayed.
  • Consequently, if a three-dimensional object is present at a location that is not displayed, it is not detected and the user cannot be alerted. For this reason, operational safety could be impaired.
  • The present invention has been made in view of the above, and its object is to provide a vehicle surrounding moving object detection system capable of detecting a moving object even at a location that is not displayed in the composite bird's-eye view image, such as the lower part of the vehicle body of a large construction machine, and of notifying the user of the detection result.
  • To achieve the above object, a first aspect of the invention provides a vehicle surrounding moving object detection system including: a surrounding image input unit that captures a plurality of images around the host vehicle; a composite video construction unit that extracts composite video composition areas from the plurality of surrounding images captured by the surrounding image input unit and combines the extracted areas to construct a composite overhead view video; an output unit that presents the composite overhead view video to a user; and a moving object detection processing unit that performs moving object detection on the plurality of surrounding images, wherein, even when the moving object detection processing unit detects a moving object in a region other than the region shown in the composite overhead image, the detection result is output onto the composite overhead image to notify the user.
  • A second aspect of the invention is the first aspect further including a region information holding unit that holds information on a detection result display non-target region, which is a region in which a detection result indicating that a moving object has been detected is not displayed, wherein the moving object detection processing unit takes in the information on the detection result display non-target region from the region information holding unit, performs moving object detection on the video obtained by excluding the detection result display non-target region from the plurality of surrounding images, and, even when it detects a moving object outside the area shown in the composite bird's-eye video, outputs the detection result onto the composite bird's-eye video and notifies the user.
  • A third aspect of the invention is characterized in that, in the second aspect, the detection result display non-target region is a part of the host vehicle reflected in the plurality of surrounding images.
  • A fourth aspect of the invention is characterized in that, when the moving object detection processing unit detects a moving object in the lower part of the host vehicle, the detection result is output as a message onto the composite overhead view image to notify the user.
  • A fifth aspect of the invention is characterized in that, when the moving object detection processing unit detects a moving object in the lower part of the host vehicle, the detection result is drawn at the position indicating the host vehicle in the composite overhead view image to notify the user.
  • A sixth aspect of the invention further includes an emphasis information holding unit that holds color information indicating the degree of emphasis display according to the distance between the host vehicle and the detected moving object, wherein the moving object detection processing unit takes in the color information indicating the degree of emphasis from the emphasis information holding unit, sets a highlight display color from the distance between the host vehicle and the detected moving object, and displays the detected moving object on the composite bird's-eye view image in the set highlight color to notify the user.
  • According to the present invention, even when a moving object is detected at a location that is not included in the portion displayed as the composite bird's-eye view image, the detection result is notified to the user. Therefore, the user can quickly grasp the presence and positional relationship of objects to be heeded in the operation of the host vehicle. As a result, the efficiency of the confirmation work for ensuring work safety can be improved, contact with objects such as people and other vehicles can be avoided, and the operational efficiency of the entire work can be improved.
  • FIG. 1 is a side view showing a dump truck provided with a first embodiment of a vehicle surrounding moving object detection system of the present invention.
  • The dump truck (vehicle) 1 shown in FIG. 1 mainly includes a vehicle body 2 formed with a sturdy frame structure, a vessel (loading platform) 3 mounted on the vehicle body 2 so that it can be raised and lowered, and a left front wheel 4A(L) and a left rear wheel 4B(L) mounted on the vehicle body 2.
  • The vehicle body 2 is provided with an engine (not shown) that drives the rear wheels 4B.
  • The engine has, for example, an engine control device (hereinafter referred to as the ECU), and its rotational speed is controlled by controlling the flow rate of fuel supplied in accordance with a command signal from the ECU.
  • The vessel 3 is a container provided for loading a load such as crushed stone, and is connected to the vehicle body 2 via a pin coupling portion 5 so that it can be raised and lowered.
  • Two hoist cylinders 6 are installed at the lower portion of the vessel 3 at a predetermined interval in the vehicle width direction. When pressure oil is supplied to and discharged from the hoist cylinders 6, they extend and retract, raising and lowering the vessel 3.
  • A canopy portion 7 is provided on the upper front side of the vessel 3.
  • The canopy portion 7 has the functions of protecting the cab 8 installed below it (that is, at the front portion of the vehicle body 2) from scattered objects such as rocks, and of protecting the cab 8 if the vehicle rolls over.
  • Inside the cab 8, a control device 100 constituting the vehicle surrounding moving object detection system, a steering wheel (not shown), an accelerator pedal, a brake pedal, and the like (not shown) are installed.
  • FIG. 2 is a conceptual diagram explaining the arrangement of the cameras constituting the surrounding image input unit in the first embodiment of the vehicle surrounding moving object detection system of the present invention, and FIG. 3 is a conceptual diagram showing examples of the surrounding images captured by the cameras constituting the surrounding image input unit in the first embodiment.
  • As shown in FIG. 2, a front camera 301 that captures the area ahead of the dump truck 1 through a wide-angle lens and a rear camera 303 that captures the area behind the dump truck 1 through a wide-angle lens are provided on the front and rear of the vehicle body 2 of the dump truck 1. Further, a left camera 302 that captures the area to the left of the dump truck 1 through a wide-angle lens and a right camera 304 that captures the area to the right of the dump truck 1 through a wide-angle lens are provided on the left and right side surfaces of the vehicle body 2.
  • The vehicle body 2 of the dump truck 1 is also equipped with a right front wheel 4A(R) and a right rear wheel 4B(R).
  • FIG. 3 shows examples of the surrounding images captured by these cameras 301 to 304.
  • Reference numeral 401 denotes an example of a front image captured by the front camera 301.
  • Reference numeral 402 denotes an example of a left direction image captured by the left direction camera 302.
  • Reference numeral 403 denotes an example of a rear image taken by the rear camera 303.
  • Reference numeral 404 denotes an example of a right direction image captured by the right direction camera 304.
  • Since the surrounding images 401 to 404 are each captured through a wide-angle lens, the distant horizon in the upper part of each image appears curved.
  • In the lower part of each image, a part of the near side of the vehicle body 2, parts of the left and right front wheels 4A(R) and 4A(L), and parts of the left and right rear wheels 4B(R) and 4B(L) appear.
  • In the present embodiment, a composite overhead view video is generated based on these surrounding images.
  • FIG. 4 is a block diagram showing the configuration of the first embodiment of the vehicle surrounding moving object detection system of the present invention.
  • As shown in FIG. 4, the first embodiment of the vehicle surrounding moving object detection system includes a vehicle surrounding moving object detection device 100 and a surrounding image input unit 101.
  • The surrounding image input unit 101 consists of the plurality of cameras 301 to 304 that capture the situation around the dump truck 1.
  • The surrounding image input unit 101 transmits the plurality of surrounding images 401 to 404, which are the captured results, to the vehicle surrounding moving object detection device 100.
  • The vehicle surrounding moving object detection device 100 includes a moving object detection unit 102, a region information holding unit 103, a composite video construction unit 104, and an output unit 105.
  • The moving object detection unit 102 receives the plurality of surrounding images 401 to 404 transmitted by the surrounding image input unit 101 and the information on the detection result display non-target regions (described later) from the region information holding unit 103, and performs moving object detection on the regions obtained by excluding the detection result display non-target regions from the surrounding images.
  • The moving object detection unit 102 outputs an information request signal to the region information holding unit 103 and, when a moving object is detected, outputs a detection position coordinate signal indicating the detected position to the composite video construction unit 104.
  • The region information holding unit 103 holds an information signal describing the detection result display non-target regions and outputs the held information signal to the moving object detection unit 102 in response to a request from the moving object detection unit 102.
  • The detection result display non-target region is a region for which a detection result indicating that a moving object has been detected is not displayed, even if a moving object is detected there in the surrounding images 401 to 404.
  • In the present embodiment, parts of the host vehicle 1 reflected in the images, such as the front wheels 4A and the rear wheels 4B of the vehicle body 2, are set as the detection result display non-target regions.
  • The composite video construction unit 104 receives the plurality of surrounding images 401 to 404 transmitted from the surrounding image input unit 101 and the detection position coordinate signal indicating the position of the moving object from the moving object detection unit 102.
  • The composite video construction unit 104 cuts out and transforms the necessary parts of the input surrounding images, combines them to generate the composite overhead video, and, based on the input detection position coordinate signal, generates a composite bird's-eye view image in which the detected position of the moving object is highlighted and outputs it to the output unit 105.
  • The output unit 105 includes a display or the like, and shows the composite overhead view video input from the composite video construction unit 104 on that display.
  • FIG. 5 is a flowchart showing the processing in the first embodiment of the vehicle surrounding moving object detection system of the present invention, FIG. 6 is a conceptual diagram showing an example of the detection result display non-target regions in the first embodiment, FIG. 7 is a conceptual diagram showing an example of the extraction regions from the surrounding images in the first embodiment, FIG. 8 is a conceptual diagram showing an example of the composite bird's-eye view image and the moving object detection result display in the first embodiment, and FIG. 9 is a conceptual diagram explaining moving object detection in the first embodiment. In FIGS. 5 to 9, the same reference numerals as in FIGS. 1 to 4 denote the same parts, and detailed description thereof is omitted.
  • First, the vehicle surrounding moving object detection device 100 performs surrounding image input (step S101). Specifically, the surrounding images 401 to 404 captured by the surrounding image input unit 101 are input. (Illustrative code sketches of this processing flow are appended after this section.)
  • Next, the vehicle surrounding moving object detection device 100 performs the exclusion processing for the non-target regions (step S102). Specifically, based on the information on the detection result display non-target regions held by the region information holding unit 103 shown in FIG. 4, the regions in which a detection result is not displayed even if a moving object is detected are set (a masking sketch of this step appears after this section).
  • In FIG. 6, the portions surrounded by broken lines are the detection result display non-target regions: reference numeral 501 denotes the front detection result display non-target region in the front image 401, 502 denotes the left detection result display non-target region in the left image 402, 503 denotes the rear detection result display non-target region in the rear image 403, and 504 denotes the right detection result display non-target region in the right image 404.
  • Next, the vehicle surrounding moving object detection device 100 constructs the composite image (step S103). Specifically, the composite video construction unit 104 shown in FIG. 4 cuts parts out of the front image 401, the left image 402, the rear image 403, and the right image 404 and arranges them so that their boundaries touch, thereby displaying the surroundings of the dump truck 1 to the front, rear, left, and right (a construction sketch appears after this section).
  • As shown in FIG. 7, a front extraction region 601, a left extraction region 602, a rear extraction region 603, and a right extraction region 604 are provided inside the surrounding images 401 to 404, respectively. Taking the characteristics of the wide-angle cameras into account, each extraction region is set so that it covers the area from directly below the dump truck 1 out to a certain distance, and so that the extraction region in the image from each camera adjoins, without overlapping, the extraction region in the image from the adjacent camera.
  • From these extraction regions, the composite overhead image 605 shown in FIG. 8 is constructed.
  • The front extraction region 601 is converted into a front converted image 606, the left extraction region 602 into a left converted image 607, the rear extraction region 603 into a rear converted image 608, and the right extraction region 604 into a right converted image 609.
  • These converted images 606 to 609 are arranged in sequence so as to adjoin one another, representing, in the present embodiment, a bird's-eye view of the periphery of the dump truck 1.
  • A host vehicle icon 610 is inserted and combined in the central portion, where no video exists, to indicate the region occupied by the vehicle body 2 of the dump truck 1.
  • Next, the vehicle surrounding moving object detection device 100 executes the detection processing (step S104). Specifically, the moving object detection unit 102 shown in FIG. 4 performs the moving object detection processing (a frame-difference sketch of this step appears after this section).
  • The images subjected to detection are the front image 401, the left image 402, the rear image 403, and the right image 404 input from the surrounding image input unit 101. In each image, the front detection result display non-target region 501, the left detection result display non-target region 502, the rear detection result display non-target region 503, and the right detection result display non-target region 504 are excluded from detection.
  • FIG. 9 illustrates the detection of a moving object in the left image 402 and shows, in order from the top, (a) the image at time t0-1, a time before a certain specific time t0, (b) the image at the specific time t0, (c) an image showing the initial moving object detection result, and (d) an image showing the corrected detection result.
  • Part (a) shows the moving object 702 captured in the left image 402 at time t0-1, and part (b) shows the moving object 701 captured in the left image 402 at the specific time t0.
  • The moving objects 701 and 702 are the same object, whose movement over time has been captured by the left camera 302.
  • Unlike the moving object 701 at the specific time t0, the moving object 702 at time t0-1, which is the time before the specific time t0, is inside the left extraction region 602; it is therefore a target of moving object detection and, being a moving object, a target of display.
  • The moving object detection unit 102 shown in FIG. 4 compares these images input from the surrounding image input unit 101 and identifies the part of the image that has changed over time. Specifically, it identifies a part such as the left detection result 703 indicated by the rectangle in FIG. 9(c).
  • The moving object detection unit 102 calculates and holds the coordinate information of the left detection result 703 on the left image 402. Note that no such detection result is obtained when there is no change in the image.
  • Next, the vehicle surrounding moving object detection device 100 determines whether a moving object has been detected by the detection processing of step S104 (step S105). If a moving object has been detected, the process proceeds to step S106; otherwise, it proceeds to step S108.
  • When a moving object has been detected, the vehicle surrounding moving object detection device 100 calculates the coordinates of the detection location (step S106). Specifically, the moving object detection unit 102 shown in FIG. 4 converts the location of the moving object obtained in the front image 401, the left image 402, the rear image 403, or the right image 404 into coordinate information in the composite overhead image 605. The conversion applied to the coordinate information is the same as that used when constructing the composite bird's-eye view image 605 in step S103 (a coordinate-handling sketch appears after this section).
  • Here, the moving object 701 at the specific time t0 is outside the left extraction region 602, although it is inside the left image 402.
  • Consequently, the converted coordinate information of the moving object 701 indicates a point outside the left converted image 607 shown in FIG. 8. Therefore, the moving object detection unit 102 calculates the location inside the left extraction region 602 that is closest to the left detection result 703 at the specific time t0 shown in FIG. 9(c), and uses the calculated location as the corrected left detection result 704 at the specific time t0 shown in FIG. 9(d).
  • Next, the vehicle surrounding moving object detection device 100 performs the highlight setting for the detection location (step S107). Specifically, the composite video construction unit 104 sets the highlighting so as to indicate to the user that a moving object has been detected.
  • In the present embodiment, the highlight display is set for the coordinate-converted detection result 801 at the specific time t0, which is obtained by applying the conversion to the corrected left detection result 704 at the specific time t0 shown in FIG. 9(d).
  • As described above, the corrected left detection result 704 at the specific time t0 lies inside the left extraction region 602, at the position closest to the left detection result 703 at the specific time t0. Accordingly, the coordinate-converted detection result 801 at the specific time t0 shown in FIG. 8 lies on the left converted image 607 of the composite overhead image 605, at the position closest to the moving object indicated by the left detection result 703 at the specific time t0.
  • Next, the vehicle surrounding moving object detection device 100 outputs the composite image (step S108). Specifically, the composite video construction unit 104 outputs the composite overhead video 605 to the output unit 105. When a moving object has been detected in the above processing, the output unit 105 displays the composite bird's-eye view image 605 with the highlighting applied to the coordinate-converted detection result 801 at the specific time t0.
  • Finally, the vehicle surrounding moving object detection device 100 determines whether the detection and output of moving objects should be ended (step S109). If so, the processing ends; otherwise, the process returns to step S101.
  • By performing the moving object detection and display processing of the present embodiment, the presence or absence of a moving object can be detected even in a part that is not displayed on the composite overhead image 605, and the user can be notified of its presence by the highlighting.
  • According to the first embodiment of the vehicle surrounding moving object detection system of the present invention described above, even when a moving object is detected at a location in the surroundings of the host vehicle 1 that is not included in the portion displayed to the user as the composite bird's-eye view image 605, the detection result is notified to the user, so that the user can quickly grasp the presence and positional relationship of objects to be heeded in the operation of the host vehicle 1. As a result, the efficiency of the confirmation work for ensuring work safety can be improved, contact with objects such as people and other vehicles can be avoided, and the operational efficiency of the entire work can be improved.
  • FIG. 10 is a conceptual diagram showing an example of a composite bird's-eye view image and a moving object detection result display in the second embodiment of the vehicle surrounding moving object detection system of the present invention.
  • In FIG. 10, the same reference numerals as in FIGS. 1 to 9 denote the same parts, and detailed description thereof is omitted.
  • The vehicle surrounding moving object detection system of the present embodiment has substantially the same configuration and operation as the first embodiment.
  • The second embodiment differs from the first embodiment in that, when a moving object that is located below the host vehicle 1 and is not displayed on the composite bird's-eye view image 605 is detected, the user is notified by displaying a message to that effect.
  • Since the processing of steps S106 and S107 in the flowchart of FIG. 5 of the first embodiment and the moving object detection result display on the composite overhead image shown in FIG. 8 of the first embodiment differ in this embodiment, this part is described below.
  • In step S106, the coordinates of the detection location are calculated in the same manner as in the first embodiment, and it is determined from this information whether the detected moving object is below the host vehicle 1. Specifically, when all of the following conditions are satisfied, the moving object detection unit 102 shown in FIG. 4 determines that the moving object exists below the host vehicle 1: (1) a moving object is included in any of the front image 401, the left image 402, the rear image 403, and the right image 404; and (2) the moving object lies below the lower edge of the corresponding one of the front extraction region 601, the left extraction region 602, the rear extraction region 603, and the right extraction region 604 shown in FIG. 7.
  • In step S107, the highlight setting for the detection location is performed.
  • Specifically, the composite video construction unit 104 performs the settings for indicating the detection of the moving object to the user.
  • When the moving object has been determined to be below the host vehicle, the message character string "A moving object has been detected in the lower part of the vehicle body" is output to the output unit 105.
  • Otherwise, the message character string is set not to be output to the output unit 105.
  • FIG. 10 is an example of the display produced on the output unit 105 by these processes and shows the state in which a moving object has been detected below the host vehicle 1.
  • A detection message 901 indicating that a moving object has been detected below the host vehicle 1 is displayed on the composite overhead image 605 shown on the output unit 105. This makes it possible to notify the user of a moving object existing below the host vehicle 1 that does not appear in any of the front converted image 606, the left converted image 607, the rear converted image 608, or the right converted image 609.
  • FIG. 11 is a conceptual diagram showing an example of a composite overhead view image and a moving object detection result display in the third embodiment of the vehicle surrounding moving object detection system of the present invention.
  • In FIG. 11, the same reference numerals as in FIGS. 1 to 10 denote the same parts, and detailed description thereof is omitted.
  • The vehicle surrounding moving object detection system of the present embodiment has substantially the same configuration and operation as the first embodiment.
  • The third embodiment differs from the first embodiment in that, when a moving object that is below the host vehicle 1 and is not displayed on the composite overhead image 605 is detected, the user is notified by displaying the detection result at the position on the host vehicle icon 610 corresponding to the detected position.
  • Since the processing of steps S106 and S107 in the flowchart of FIG. 5 of the first embodiment and the moving object detection result display on the composite overhead image shown in FIG. 8 of the first embodiment differ in this embodiment, this part is described below.
  • In step S106, the coordinates of the detection location are calculated in the same manner as in the first embodiment, but this time the detection result is not corrected, and the left detection result 703 at the specific time t0 is used in the subsequent processing. Note that the left detection result 703 at the specific time t0 is located under the host vehicle 1, behind the front wheel 4A, as shown in FIG. 9(c).
  • In step S107, the highlight setting for the detection location is performed. Specifically, the composite video construction unit 104 sets the highlighting so as to indicate to the user that a moving object has been detected.
  • The left detection result 703 at the specific time t0 is outside the left extraction region 602; therefore, after coordinate conversion it falls within the display region of the host vehicle icon 610 in FIG. 11. For this reason, in FIG. 11, the coordinate-converted detection result 1001 at the specific time t0 is displayed superimposed on the host vehicle icon 610. As a result, the user can be notified, including the approximate position, of a moving object existing below the host vehicle 1 that does not appear in any of the front converted image 606, the left converted image 607, the rear converted image 608, or the right converted image 609.
  • FIG. 12 is a block diagram showing the configuration of the fourth embodiment of the vehicle surrounding moving object detection system of the present invention, FIG. 13 is a table showing an example of the table in the emphasis information holding unit shown in FIG. 12, FIG. 14 is a flowchart showing the processing in the fourth embodiment, and FIG. 15 is a conceptual diagram showing an example of an image in which a moving object appears in the fourth embodiment. In FIGS. 12 to 15, the same reference numerals as in FIGS. 1 to 11 denote the same parts, and detailed description thereof is omitted.
  • The vehicle surrounding moving object detection system of the present embodiment has substantially the same configuration and operation as the first embodiment.
  • The fourth embodiment differs from the first embodiment in that the detection result of a moving object is displayed with stronger emphasis to the user when the moving object is present at a position closer to the host vehicle 1.
  • As shown in FIG. 12, the vehicle surrounding moving object detection device 100A further includes an emphasis information holding unit 106.
  • The emphasis information holding unit 106 holds, for example as a table, a color information signal indicating the degree of emphasis of the highlight display according to the distance between the host vehicle 1 and the moving object in the video displayed by the output unit 105, and outputs it to the composite video construction unit 104.
  • An example of the table held by the emphasis information holding unit 106 is shown in FIG. 13.
  • In this example, when the distance L between the moving object and the host vehicle 1 in the surrounding images 401 to 404 is 300 pixels or more, green is used as the highlight color; when the distance is shorter than that, the highlight color is yellow; and when the distance is shorter still, the highlight color is red.
  • The emphasis information holding unit 106 can also hold other types of tables.
  • In the processing of the present embodiment, step S206 is added between step S106 and step S107; the rest is the same as in the first embodiment.
  • The vehicle surrounding moving object detection device 100A calculates the coordinates of the detection location in step S106 in the same manner as in the first embodiment, and then sets the emphasis of the region based on the coordinates of the detection location (step S206). Specifically, the composite video construction unit 104 shown in FIG. 12 calculates the distance L between the moving object and the host vehicle 1 in the surrounding images 401 to 404 and sets the highlight color by consulting the table held by the emphasis information holding unit 106 (a color-selection sketch appears after this section).
  • In the example described here, the composite video construction unit 104 sets the highlight color to yellow in step S206.
  • Next, the vehicle surrounding moving object detection device 100A performs the highlight setting for the detection location (step S107). Specifically, the composite video construction unit 104 applies the yellow highlight color. Thereafter, in the composite video output of step S108, the moving object 1001 is highlighted in yellow on the composite overhead image 605.
  • In this way, the presence of the moving object 1001 is highlighted and the highlight color is changed according to the distance, so the user can quickly and intuitively grasp the distance L between the moving object 1001 and the host vehicle 1 together with the presence of the moving object 1001. As a result, safety can be further improved.
  • FIG. 16 is a table showing an example of the table in the emphasis information holding unit in the fifth embodiment of the vehicle surrounding moving object detection system of the present invention, and FIG. 17 is a conceptual diagram showing an example of an image in which a moving object appears in the fifth embodiment.
  • The vehicle surrounding moving object detection system of the present embodiment has substantially the same configuration and operation as the fourth embodiment; it differs in the table held by the emphasis information holding unit 106.
  • FIG. 16 shows an example of a table held by the emphasis information holding unit 106 of the present embodiment
  • FIG. 17 shows an example of an image showing a moving object.
  • In general, an object captured in the upper part of a surrounding image is farther from the vehicle body 2 of the host vehicle 1, and an object captured lower in the surrounding image is closer to the vehicle body 2 of the host vehicle 1.
  • In the present embodiment, the y-coordinate, which is the vertical coordinate in each surrounding image, is defined to be 0 at the upper end of the image and to increase downward. A larger y-coordinate value therefore indicates a position closer to the host vehicle 1.
  • In the table of FIG. 16, when the y-coordinate of the moving object is small, that is, close to the upper end of the image, the highlight color is green; when the y-coordinate is larger, the highlight color is yellow; and when the y-coordinate is 300 pixels or more, the highlight color is red.
  • The vehicle surrounding moving object detection device 100A calculates the coordinates of the detection location in step S106 in the same manner as in the first embodiment, and these coordinates give the y-coordinate value of the moving object 1001 described above.
  • The vehicle surrounding moving object detection device 100A then performs the emphasis setting of the region based on the y-coordinate of the detection location (step S206). Specifically, the composite video construction unit 104 calculates the y-coordinate v of the moving object in the surrounding images 401 to 404 and sets the highlight color by consulting the table held by the emphasis information holding unit 106.
  • In the example described here, the composite video construction unit 104 sets the highlight color to yellow in step S206.
  • Next, the vehicle surrounding moving object detection device 100A performs the highlight setting for the detection location (step S107). Specifically, the composite video construction unit 104 applies the yellow highlight color. Thereafter, in the composite video output of step S108, the moving object 1001 is highlighted in yellow on the composite overhead image 605.
  • In the fourth and fifth embodiments, a table is used to set the highlight color from the position of the moving object 1001, but the present invention is not limited to this; for example, the color may be set using a numerical formula or the like.
  • The present invention is not limited to the above-described embodiments and includes various modifications.
  • The above embodiments have been described in detail for ease of understanding of the present invention, and the invention is not necessarily limited to configurations having all of the described elements.
  • Each of the above-described configurations, functions, processing units, processing means, and the like may be realized by hardware by designing a part or all of them with, for example, an integrated circuit.
  • Each of the above-described configurations, functions, and the like may also be realized by software, with a processor interpreting and executing a program that implements each function.
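  • The sketches below illustrate, in Python, how the processing steps described above could be realized. They are illustrative sketches based on the description, not the patent's own implementation; all function names, coordinates, and thresholds that do not appear in the text are assumptions. The first sketch shows how the detection result display non-target regions of step S102 could be applied as simple exclusion masks before detection (NumPy assumed; the rectangles standing in for regions 501 to 504 are placeholders).

```python
import numpy as np

# Placeholder rectangles (x, y, width, height) standing in for the detection
# result display non-target regions 501-504 of FIG. 6.
NON_TARGET_REGIONS = {
    "front": [(0, 320, 640, 160)],
    "left":  [(200, 300, 240, 180)],
    "rear":  [(0, 320, 640, 160)],
    "right": [(200, 300, 240, 180)],
}

def build_exclusion_mask(shape, rects):
    """Boolean mask that is False inside the non-target rectangles."""
    mask = np.ones(shape[:2], dtype=bool)
    for x, y, w, h in rects:
        mask[y:y + h, x:x + w] = False
    return mask

def apply_exclusion(image, camera_name):
    """Zero out the non-target region so later detection ignores it."""
    mask = build_exclusion_mask(image.shape, NON_TARGET_REGIONS[camera_name])
    masked = image.copy()
    masked[~mask] = 0
    return masked, mask
```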
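  • The construction of the composite overhead image 605 in step S103 (cutting out the extraction regions 601 to 604, converting them into the converted images 606 to 609, and arranging them around the host vehicle icon 610) is commonly realized with per-camera ground-plane homographies. The sketch below assumes such homographies have already been calibrated; the canvas size and tile layout are placeholders.

```python
import cv2
import numpy as np

CANVAS_W, CANVAS_H = 600, 800      # size of the composite overhead image 605
TILES = {                          # where each converted image 606-609 is pasted
    "front": (0, 0, 600, 250),
    "rear":  (0, 550, 600, 250),
    "left":  (0, 250, 200, 300),
    "right": (400, 250, 200, 300),
}

def build_overhead(images, homographies, vehicle_icon):
    """Warp each camera's extraction region to the ground plane and tile the results."""
    canvas = np.zeros((CANVAS_H, CANVAS_W, 3), dtype=np.uint8)
    for name, (x, y, w, h) in TILES.items():
        warped = cv2.warpPerspective(images[name], homographies[name], (w, h))
        canvas[y:y + h, x:x + w] = warped
    # Paste the host vehicle icon 610 into the central area where no video exists.
    ih, iw = vehicle_icon.shape[:2]
    cx, cy = (CANVAS_W - iw) // 2, (CANVAS_H - ih) // 2
    canvas[cy:cy + ih, cx:cx + iw] = vehicle_icon
    return canvas
```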
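  • Step S104 identifies the part of each image that has changed between time t0-1 and time t0 (the detection result 703). A standard frame-differencing sketch of that idea is shown below, assuming OpenCV 4.x; the threshold and minimum area are placeholders.

```python
import cv2

def detect_moving_objects(frame_prev, frame_curr, exclusion_mask,
                          diff_thresh=25, min_area=150):
    """Return bounding boxes (x, y, w, h) of regions that changed between frames."""
    g0 = cv2.cvtColor(frame_prev, cv2.COLOR_BGR2GRAY)
    g1 = cv2.cvtColor(frame_curr, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(g0, g1)
    diff[~exclusion_mask] = 0                 # ignore the non-target regions
    _, binary = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
    binary = cv2.dilate(binary, None, iterations=2)
    # OpenCV 4.x return convention: (contours, hierarchy)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area]
```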
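  • Steps S106 and S107 convert the detected location into overhead-image coordinates and then decide what to show: the first embodiment clamps a detection lying outside the extraction region to the nearest point inside it (the corrected result 704), the second embodiment outputs a message when the object is under the vehicle body, and the third embodiment draws the result on the host vehicle icon 610. A sketch of that decision logic follows; the `mode` switch and the dictionary return values are assumptions, and the homography is the same one used when constructing the overhead image.

```python
import cv2
import numpy as np

def clamp_to_region(point, region):
    """Nearest point inside the extraction region (the corrected result 704)."""
    x, y = point
    rx, ry, rw, rh = region
    return (min(max(x, rx), rx + rw - 1), min(max(y, ry), ry + rh - 1))

def to_overhead(point, homography):
    """Apply the same projection used when building the overhead image 605."""
    src = np.array([[point]], dtype=np.float32)       # shape (1, 1, 2)
    dst = cv2.perspectiveTransform(src, homography)
    return tuple(dst[0, 0])

def handle_detection(box, region, homography, mode="clamp"):
    x, y, w, h = box
    center = (x + w // 2, y + h // 2)
    rx, ry, rw, rh = region
    below_vehicle = center[1] > ry + rh               # below the region's lower edge
    if not below_vehicle:
        return {"kind": "highlight", "pos": to_overhead(center, homography)}
    if mode == "clamp":                               # first embodiment
        corrected = clamp_to_region(center, region)
        return {"kind": "highlight", "pos": to_overhead(corrected, homography)}
    if mode == "message":                             # second embodiment
        return {"kind": "message",
                "text": "A moving object has been detected in the lower part "
                        "of the vehicle body"}
    return {"kind": "icon_overlay",                   # third embodiment
            "pos": to_overhead(center, homography)}
```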
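  • The fourth and fifth embodiments pick the highlight color from a small table keyed by the distance L in pixels or by the y-coordinate of the detection. Only the 300-pixel boundary is stated in the text, so the other boundaries below are placeholders; as the text notes, a formula could replace the table.

```python
# Only the 300-pixel boundary comes from the text; the middle boundary is a placeholder.
DISTANCE_TABLE = [(300, "green"), (150, "yellow"), (0, "red")]   # fourth embodiment
Y_TABLE = [(300, "red"), (150, "yellow"), (0, "green")]          # fifth embodiment

def color_from_distance(distance_px):
    """Larger distance L from the host vehicle gives a calmer color."""
    for bound, color in DISTANCE_TABLE:
        if distance_px >= bound:
            return color
    return "red"

def color_from_y(y_px):
    """Larger y (lower in the image) means closer to the host vehicle."""
    for bound, color in Y_TABLE:
        if y_px >= bound:
            return color
    return "green"
```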
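  • Finally, the per-frame loop of FIG. 5 (steps S101 to S108) can be organized as below, assuming the functions from the preceding sketches are in scope and that each entry of `cameras` is an object whose `read()` method returns the current image; the `state` dictionary keys are likewise assumptions.

```python
def process_frame(cameras, state):
    """One pass of steps S101-S108; returns the overhead image and the overlays."""
    images = {name: cam.read() for name, cam in cameras.items()}          # S101
    masked, masks = {}, {}
    for name, img in images.items():                                      # S102
        masked[name], masks[name] = apply_exclusion(img, name)
    overhead = build_overhead(images, state["homographies"],              # S103
                              state["vehicle_icon"])
    overlays = []
    for name, img in masked.items():                                      # S104
        prev = state["previous"].get(name, img)
        for box in detect_moving_objects(prev, img, masks[name]):         # S105
            overlays.append(handle_detection(box,                         # S106/S107
                                             state["regions"][name],
                                             state["homographies"][name],
                                             mode=state["mode"]))
        state["previous"][name] = img
    return overhead, overlays                                             # S108
```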

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
PCT/JP2014/074212 2013-09-24 2014-09-12 Vehicle surrounding moving object detection system Ceased WO2015045904A1 (ja)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013197091A JP6339337B2 (ja) 2013-09-24 2013-09-24 Vehicle surrounding moving object detection system
JP2013-197091 2013-09-24

Publications (1)

Publication Number Publication Date
WO2015045904A1 true WO2015045904A1 (ja) 2015-04-02

Family

ID=52743046

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/074212 Ceased WO2015045904A1 (ja) 2014-09-12 Vehicle surrounding moving object detection system

Country Status (2)

Country Link
JP (1) JP6339337B2
WO (1) WO2015045904A1

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019121250A (ja) * 2018-01-09 2019-07-22 Hitachi Construction Machinery Co., Ltd. Transport vehicle

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH082486B2 (ja) 1987-09-25 1996-01-17 Nippon Steel Corporation Method of supplying powder in continuous casting
KR20160136779A (ko) * 2015-05-21 2016-11-30 KSS ImageNext Co., Ltd. Vehicle status display system and method
JP6581519B2 (ja) * 2016-02-09 2019-09-25 Hitachi Construction Machinery Co., Ltd. Surrounding obstacle detection system for construction machinery
WO2018055873A1 (ja) * 2016-09-20 2018-03-29 JVC Kenwood Corporation Bird's-eye view video generation device, bird's-eye view video generation system, bird's-eye view video generation method, and program
JP6730617B2 (ja) * 2016-09-20 2020-07-29 JVC Kenwood Corporation Bird's-eye view video generation device, bird's-eye view video generation system, bird's-eye view video generation method, and program
JP6644264B2 (ja) * 2016-09-21 2020-02-12 JVC Kenwood Corporation Bird's-eye view video generation device, bird's-eye view video generation system, bird's-eye view video generation method, and program
JP6700644B2 (ja) * 2017-03-29 2020-05-27 Hitachi Construction Machinery Co., Ltd. Dumping condition monitoring device
JP7374602B2 (ja) * 2019-03-29 2023-11-07 Hitachi Construction Machinery Co., Ltd. Work vehicle

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007230319A (ja) * 2006-02-28 2007-09-13 Hitachi Ltd Starting safety device
JP2009071790A (ja) * 2007-09-18 2009-04-02 Denso Corp Vehicle periphery monitoring apparatus
WO2012169359A1 (ja) * 2011-06-07 2012-12-13 Komatsu Ltd. Dump truck
JP2012254660A (ja) * 2011-06-07 2012-12-27 Komatsu Ltd Periphery monitoring device for work vehicle

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4601505B2 (ja) * 2005-07-20 2010-12-22 Alpine Electronics, Inc. Top-view image generation apparatus and top-view image display method
JP4687411B2 (ja) * 2005-11-15 2011-05-25 Denso Corp Vehicle periphery image processing apparatus and program
JP4846426B2 (ja) * 2006-04-20 2011-12-28 Panasonic Corp Vehicle surroundings monitoring device
JP4951639B2 (ja) * 2009-03-02 2012-06-13 Hitachi Construction Machinery Co., Ltd. Work machine equipped with surroundings monitoring device

Also Published As

Publication number Publication date
JP2015065515A (ja) 2015-04-09
JP6339337B2 (ja) 2018-06-06

Similar Documents

Publication Publication Date Title
JP6339337B2 (ja) 車両周囲移動物体検知システム
JP6262068B2 (ja) 車体近傍障害物報知システム
JP5529943B2 (ja) 作業車両用周辺監視システム及び作業車両
CN106797450B (zh) 车身外部移动物体探测装置
US10781572B2 (en) Working machine
JP5456123B1 (ja) 作業車両用周辺監視システム及び作業車両
CN104584540A (zh) 作业机械的周围监视装置
WO2014148203A1 (ja) 作業機械用周辺監視装置
WO2017038123A1 (ja) 車両の周囲監視装置
US10017112B2 (en) Surroundings monitoring device of vehicle
JP5926315B2 (ja) 作業車両用周辺監視システム及び作業車両
JP2018142884A (ja) 俯瞰映像生成装置、俯瞰映像生成システム、俯瞰映像生成方法およびプログラム
JP2016119559A (ja) 作業機械の周囲監視装置
JP6401141B2 (ja) 車両周囲障害物検出装置
JP2017074871A (ja) 車両周囲障害物検出装置
JP6796518B2 (ja) 移動物体検知システム
JP7145137B2 (ja) 作業機械の制御装置
JP6802196B2 (ja) 運搬車両
JP7180172B2 (ja) 俯瞰画像生成装置、俯瞰画像生成方法およびプログラム
JP6724821B2 (ja) 俯瞰映像生成装置、俯瞰映像生成システム、俯瞰映像生成方法およびプログラム
CN109515321A (zh) 行车影像接口切换系统及行车影像切换方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14849436

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14849436

Country of ref document: EP

Kind code of ref document: A1