WO2015118806A1 - Image analysis apparatus and image analysis method - Google Patents

Image analysis apparatus and image analysis method

Info

Publication number
WO2015118806A1
WO2015118806A1 (PCT/JP2015/000182; international application JP2015000182W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
vehicle
front camera
amount
vehicle camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2015/000182
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
宗作 重村
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Denso Corp
Original Assignee
Denso Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Denso Corp filed Critical Denso Corp
Priority to US15/116,362 priority Critical patent/US10220782B2/en
Publication of WO2015118806A1 publication Critical patent/WO2015118806A1/ja
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Images

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/40 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the details of the power supply or the coupling to vehicle components
    • B60R2300/402 Image calibration

Definitions

  • The present disclosure relates to an image analysis apparatus and an image analysis method for detecting that the mounting position or angle of an in-vehicle camera attached to a vehicle is abnormal.
  • In Patent Document 1, a technology has been proposed in which in-vehicle cameras are mounted on the front, rear, left, and right sides of a vehicle and the images taken by these cameras are combined to display an image as if the vehicle were viewed from above.
  • In such a system, each in-vehicle camera captures an image of a preset monitoring area, so the camera must be attached at an appropriate position and at an appropriate angle.
  • For example, a front camera must be attached at a predetermined angle, in both the left-right and up-down directions, with respect to the traveling direction of the vehicle, and the cameras mounted on the front, rear, left, and right must each be attached at their respectively set angles.
  • However, the mounting position or angle can change without the driver noticing, for example because of vibration while the vehicle is traveling or because of some external force applied while the vehicle is stopped or traveling.
  • An object of this disclosure is to provide an image analysis apparatus and an image analysis method for detecting that the mounting position of the in-vehicle camera attached to the vehicle is abnormal.
  • In an example of the present disclosure, an image analysis apparatus is applied to an in-vehicle camera that captures an image of a predetermined monitoring area set in a predetermined direction with respect to the vehicle, and analyzes the images of that camera.
  • The apparatus stores in advance the feature amount of the monitoring area image, that is, the image obtained when the in-vehicle camera captures the monitoring area. It then acquires an image from the in-vehicle camera, extracts the feature amount of that image, compares it with the stored feature amount of the monitoring area image to determine whether there is an abnormality in the mounting position of the in-vehicle camera, and outputs the determination result.
  • If the in-vehicle camera is mounted properly so as to capture the monitoring area, it can be expected to produce an image having a predetermined feature amount.
  • Accordingly, the feature amount obtained when the in-vehicle camera is mounted normally is stored in advance and compared with the feature amount of the image actually captured by the camera to determine whether there is an abnormality in the mounting position. The presence or absence of an abnormality in the mounting position can therefore be determined with high accuracy without relying on human eyes.
  • The configuration of the above example is also provided as an image analysis method.
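  • The store / extract / compare structure summarized above can be sketched in a few lines of Python. This is only an illustrative skeleton, not the patent's implementation: the names FeatureStore and MountingMonitor, the single scalar feature, and the tolerance parameter are all assumptions made for the sketch.

```python
# Minimal sketch of the store / extract / compare idea described above.
# FeatureStore, MountingMonitor, and the scalar feature are illustrative only.
from dataclasses import dataclass
from typing import Callable

import numpy as np


@dataclass
class FeatureStore:
    """Holds the 'ideal feature amount' recorded while the camera is known to be
    mounted correctly (e.g. at factory shipment)."""
    ideal_feature: float
    tolerance: float  # allowed deviation before the mounting is flagged


class MountingMonitor:
    def __init__(self, store: FeatureStore,
                 extract: Callable[[np.ndarray], float]) -> None:
        self.store = store
        self.extract = extract  # turns a camera frame into an 'actual feature amount'

    def check(self, frame: np.ndarray) -> bool:
        """Return True if the mounting position looks abnormal."""
        actual = self.extract(frame)
        return abs(actual - self.store.ideal_feature) > self.store.tolerance
```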
  • FIG. 1 shows a configuration of an image analysis apparatus 10 provided in a vehicle 1 (also referred to as a host vehicle).
  • The image analysis apparatus 10 acquires the images taken by vehicle-mounted cameras 11a to 11d (front camera 11a, rear camera 11b, left camera 11c, and right camera 11d) attached to the front, rear, left, and right of the vehicle 1, determines whether there is an abnormality in the mounting position of each of the in-vehicle cameras 11a to 11d, and outputs the result.
  • Each of the in-vehicle cameras 11a to 11d is a camera for monitoring the periphery of the vehicle; a system separate from the image analysis apparatus 10 uses the images taken by the cameras 11a to 11d to detect lane positions, obstacles, pedestrians, and so on. Accordingly, each of the in-vehicle cameras 11a to 11d is mounted in advance at a predetermined position on the vehicle 1 and at a predetermined angle (with its shooting direction adjusted) so that it can capture the area around the vehicle required by the system using the images (referred to as the "monitoring area" in this specification).
  • The image analysis apparatus 10 includes a control device 12 that executes the processing for analyzing the images taken by the in-vehicle cameras 11a to 11d, and a display unit 13 that outputs the analysis result.
  • The control device 12 has a circuit board on which a CPU, memory, and related components are mounted.
  • As the display unit 13, for example, a liquid crystal display installed on the instrument panel with its screen facing the driver's seat, or a head-up display that projects the display contents onto the windshield, is used.
  • When the inside of the control device 12 is divided into functional blocks, the control device 12 has: a feature amount extraction unit 14 that extracts the feature amounts of the images photographed by the in-vehicle cameras 11a to 11d (hereinafter called "actual feature amounts" because they are feature amounts of actually photographed images); a feature amount storage unit 15 that stores in advance the feature amounts of the images obtained when the in-vehicle cameras 11a to 11d photograph the monitoring areas, that is, when their attachment positions are normal (hereinafter called "ideal feature amounts" because they are feature amounts of ideal images); an abnormality determination unit 16 that compares the actual feature amounts with the ideal feature amounts to determine whether there is an abnormality in the mounting positions of the in-vehicle cameras 11a to 11d; and a travel determination unit 17 that determines, based on a vehicle speed sensor provided in the vehicle 1, whether the vehicle 1 is traveling.
  • the feature amount extraction unit 14 is also referred to as an extraction unit / device / means.
  • the feature amount storage unit 15 is also referred to as storage unit / device / means.
  • the abnormality determination unit 16 and the display unit 13 are also referred to as output unit / device / means.
  • the traveling determination unit 17 is also referred to as a traveling determination unit / device / means.
  • the actual feature amount extracted by the feature amount extraction unit 14 is also referred to as the feature amount of the image obtained by the in-vehicle camera.
  • the ideal feature amount stored in the feature amount storage unit 15 is also referred to as a feature amount of the monitoring area image.
  • In the image analysis processing, the images taken by the in-vehicle cameras 11a to 11d are analyzed to determine whether there is an abnormality in the mounting positions of the cameras 11a to 11d, and the result is output to the display unit 13.
  • each section is expressed as S100, for example.
  • each section can be divided into a plurality of subsections, while a plurality of sections can be combined into one section.
  • each section can be referred to as a device, module, or means.
  • Each of the above sections, or a combination thereof, can be realized not only as (i) a software section combined with a hardware unit (e.g., a computer) but also as (ii) a hardware section (e.g., an integrated circuit or a hard-wired logic circuit), with or without the functions of related devices.
  • the hardware section can be included inside the microcomputer.
  • FIG. 2 shows a flowchart of the image analysis processing performed in the image analysis apparatus 10 of the present embodiment. This image analysis processing is actually realized by the CPU in the control device 12 executing a program stored in the ROM; in the following description, however, the control device 12 or the functional blocks 14 to 17 described above are treated as the subjects that execute it. The image analysis processing is performed when the engine of the vehicle 1 is started, or at predetermined intervals (for example, every 10 seconds).
  • As the image analysis processing, the control device 12 executes a front camera process (S100) for analyzing the image captured by the front camera 11a (the front camera image), a process for analyzing the image captured by the rear camera 11b (the rear camera image), a process for analyzing the image captured by the left camera 11c (the left camera image), and a right camera process (S106) for analyzing the image captured by the right camera 11d (the right camera image). Since the processes of S100 to S106 differ only in which on-board camera image is analyzed, the following description takes the front camera process of S100 as an example.
  • FIG. 3 shows a flowchart of the front camera process.
  • In the front camera process, whether there is an abnormality in the attachment position of the front camera 11a is determined based on the front camera images taken during a "monitoring period" extending from several seconds ago (3 seconds ago in the present embodiment) to the present.
  • First, the traveling determination unit 17 of the control device 12 determines whether the vehicle 1 is traveling (S200). This determination is made by checking, based on the vehicle speed sensor, whether the detected vehicle speed of the vehicle 1 is 0 km/h. If the vehicle 1 is not traveling (S200: NO), the front camera process shown in FIG. 3 ends as it is (that is, the image photographed by the front camera 11a is not analyzed), and the processing returns to the image analysis processing shown in FIG. 2.
  • If the vehicle 1 is traveling, the control device 12 causes the front camera 11a to capture an image by transmitting a photographing instruction signal to it, and acquires the resulting image (the front camera image) (S202). The feature amount extraction unit 14 of the control device 12 then calculates the luminance value of each pixel of the acquired front camera image (S204).
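  • A hedged sketch of the per-pixel luminance calculation of S204 follows. The patent only says that a "luminance value" is calculated for each pixel; the BT.601 weighting used here is one common choice and is an assumption of the sketch, as is the RGB input format.

```python
# Sketch of S204: per-pixel luminance from an RGB frame (BT.601 weights assumed).
import numpy as np


def luminance(frame_rgb: np.ndarray) -> np.ndarray:
    """frame_rgb: H x W x 3 uint8 RGB array -> H x W float32 luminance map."""
    r = frame_rgb[..., 0].astype(np.float32)
    g = frame_rgb[..., 1].astype(np.float32)
    b = frame_rgb[..., 2].astype(np.float32)
    return 0.299 * r + 0.587 * g + 0.114 * b
```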
  • FIG. 4 shows an example of an image (front camera image) taken by the front camera 11a.
  • As illustrated, a part of the bumper of the vehicle 1 appears in the lower area of the front camera image, and the situation in front of the vehicle 1 (such as the road conditions) appears in the remaining area; the front camera 11a is attached so that a predetermined area in front of the vehicle 1 becomes the monitoring area.
  • The bumper appears curved because of the lens distortion of the front camera 11a.
  • In S204, the luminance value of each pixel of such a front camera image is calculated.
  • Next, for each pixel, the calculated luminance value is compared with the luminance value of the front camera image captured last time (S206).
  • As described later, the processing of S200 to S204 is executed repeatedly, so in S206 the luminance value of each pixel of the previously captured front camera image (the previous frame) is read and compared with the luminance value of the corresponding pixel of the front camera image captured this time (the current frame).
  • When the front camera image is captured for the first time, there is no previously captured front camera image, so the processing of S206 is omitted.
  • For each pixel, the amount of change in luminance value from the previously captured front camera image is calculated, and the pixels whose change amount is equal to or less than a threshold change amount th are extracted.
  • Then the ratio of the extracted pixels, that is, the ratio of the number of pixels whose luminance change amount is equal to or less than the threshold change amount th to the total number of pixels of the front camera image (the unchanged pixel ratio R), is calculated (S208).
  • In other words, the unchanged pixel ratio R corresponds to the region of the currently captured front camera image that has not changed (or has changed only slightly) from the previously captured front camera image; for convenience, this region is referred to in this specification as the non-changed area.
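  • The frame-to-frame comparison of S206 to S208 can be sketched as below. The helper reuses the luminance map from the previous sketch; the default threshold change amount th is illustrative, since the text does not give a concrete value.

```python
# Sketch of S206-S208: per-pixel comparison with the previous frame and the
# "unchanged pixel ratio" R.
import numpy as np


def unchanged_pixel_ratio(lum_now: np.ndarray, lum_prev: np.ndarray,
                          th: float = 10.0) -> float:
    """Fraction of pixels whose luminance changed by th or less between frames."""
    change = np.abs(lum_now - lum_prev)          # per-pixel change amount
    unchanged = np.count_nonzero(change <= th)   # pixels in the "non-changed area"
    return unchanged / change.size
```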
  • Next, the abnormality determination unit 16 of the control device 12 determines whether the unchanged pixel ratio R is equal to or greater than a threshold ratio α (for example, 30 %) (S210). In other words, in S210 it is determined whether the amount of change in the luminance value of the currently captured front camera image (hereinafter, the amount of change in the luminance value of the front camera image) is smaller than a predetermined amount (the first predetermined amount).
  • If so, a small-change counter is incremented (S212). The small-change counter counts the number of times during the monitoring period that the amount of change in the luminance value of the front camera image is determined to be smaller than the first predetermined amount (that is, the number of frames in the monitoring period with a small luminance change).
  • Next, it is determined whether the monitoring period has ended (S214).
  • As described above, the front camera process shown in FIG. 3 determines whether there is an abnormality in the attachment position of the front camera 11a based on the front camera images taken during the monitoring period, so the processes of S200 to S212 described above are repeated until the monitoring period ends (S214: NO). That is, until the monitoring period ends, front camera images are captured repeatedly, and each time an image is captured it is determined whether the amount of change in its luminance value is smaller than the first predetermined amount.
  • When the monitoring period ends (S214: YES), the abnormality determination unit 16 of the control device 12 determines whether the value of the small-change counter, that is, the number of times during the monitoring period that the amount of change in the luminance value of the front camera image was determined to be smaller than the first predetermined amount (the number of frames with a small luminance change), is equal to or greater than a threshold number β (for example, 25) (S216). In other words, in S216 it is determined whether the amount of change in the luminance value of the front camera image over the monitoring period is smaller than a predetermined amount (the second predetermined amount).
  • If the value of the small-change counter is equal to or greater than the threshold number β, it is judged that the attachment position (attachment mode or attachment angle) of the front camera 11a is abnormal. The reason for this determination is as follows.
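  • Put together, the per-period decision of S200 to S216 can be sketched as one loop. Frame acquisition and the vehicle-speed check are abstracted into callables, since they depend on the actual camera and vehicle interfaces; the values for alpha and beta follow the examples in the text (30 % and 25 times), while n_frames and th are assumptions of the sketch. unchanged_pixel_ratio is the helper from the previous sketch.

```python
# Sketch of the front camera process (S200-S216) over one monitoring period.
from typing import Callable, Optional

import numpy as np


def front_camera_process(get_frame_luminance: Callable[[], np.ndarray],
                         is_traveling: Callable[[], bool],
                         n_frames: int,        # frames captured during the monitoring period
                         alpha: float = 0.30,  # threshold ratio (example value from the text)
                         beta: int = 25,       # threshold number (example value from the text)
                         th: float = 10.0) -> bool:
    """Return True if the camera mounting position is judged abnormal."""
    small_change_count = 0
    prev: Optional[np.ndarray] = None
    for _ in range(n_frames):
        if not is_traveling():
            return False              # S200: NO -> end the process without judging
        lum = get_frame_luminance()   # S202-S204: capture frame and compute luminance
        if prev is not None:
            # S206-S210: compare with the previous frame
            if unchanged_pixel_ratio(lum, prev, th) >= alpha:
                small_change_count += 1   # S212: count a "small change" frame
        prev = lum
    return small_change_count >= beta     # S216: abnormal if change stayed small
```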
  • FIG. 6 illustrates the front camera image when the attachment position of the front camera 11a is normal (when the monitoring area is photographed) and when it is abnormal (when an area different from the monitoring area is photographed).
  • The front camera 11a falls into an abnormal mounting position when, for example, it tilts downward due to vibration during traveling.
  • In that case, compared with the normal mounting position, the area of the front camera image in which the bumper appears increases, and the area in which the situation in front of the vehicle 1 (the road situation and the like) appears decreases.
  • Because the bumper is a part of the vehicle 1 itself, it keeps appearing in the same way while the vehicle travels, and the luminance values of the pixels in which it appears hardly change. Therefore, if the area in which the bumper appears (the area whose luminance values change little) increases because the attachment position of the front camera 11a is abnormal, the amount of change in the luminance value of the front camera image as a whole decreases.
  • FIG. 7 illustrates an abnormality warning image displayed on the display unit 13.
  • In the abnormality warning image, a host vehicle image showing the positions of the in-vehicle cameras 11a to 11d and an abnormal camera notification image indicating which in-vehicle camera has an abnormal mounting position are displayed.
  • The "amount of change in the luminance value of the front camera image captured this time" (the unchanged pixel ratio R) and the "amount of change in the luminance value of the front camera image over the monitoring period" (the number of times during the monitoring period that the amount of change was determined to be smaller than the first predetermined amount) correspond to the "actual feature amounts" (amounts of change in luminance value between a plurality of images taken by the in-vehicle camera) extracted from the front camera images by the feature amount extraction unit 14. The first predetermined amount and the second predetermined amount (the threshold ratio α and the threshold number β) are stored in advance in the feature amount storage unit 15 as the "ideal feature amounts" (amounts of change in luminance value of the monitoring area image).
  • These ideal feature amounts (the amounts of change in luminance value of the monitoring area image) are stored at the time of factory shipment or when the camera position is adjusted at a vehicle dealer.
  • The front camera process shown in FIG. 3 then ends, and the processing returns to the image analysis processing shown in FIG. 2.
  • For the rear camera 11b, the left camera 11c, and the right camera 11d, the same processing as S200 to S218 is then executed (S102 to S106).
  • By the above-described processing (the processing for determining the presence or absence of an abnormality in the mounting positions of the in-vehicle cameras 11a to 11d based on the respective camera images taken during the monitoring period), the following effects are obtained.
  • If the front camera 11a is attached normally so as to capture the monitoring area, it can be expected that front camera images having predetermined ideal feature amounts are captured. That is, it is expected that the number of times during the monitoring period that the ratio of the non-changed area of the currently captured front camera image is determined to be equal to or greater than the threshold ratio α (the number of times the amount of change in the luminance value of the currently captured front camera image is determined to be smaller than the first predetermined amount) remains smaller than the threshold number β; in other words, that the amount of change in the luminance value of the front camera image over the monitoring period is equal to or greater than the second predetermined amount.
  • In the image analysis apparatus 10 of the present embodiment, these ideal feature amounts for a normally attached front camera 11a are compared with the actual feature amounts. If the number of times that the unchanged pixel ratio R of the currently captured front camera image is determined to be equal to or greater than the threshold ratio α reaches the threshold number β or more, that is, if the amount of change in the luminance value of the front camera image over the monitoring period is smaller than the second predetermined amount, it is determined that the attachment position of the front camera 11a is abnormal.
  • As a result, the presence or absence of an abnormality in the attachment position of the front camera 11a can be determined with high accuracy without relying on human eyes. The same effect is obtained for the rear camera 11b, the left camera 11c, and the right camera 11d.
  • In particular, the determination uses front camera images taken while the vehicle is traveling, for which the difference in the amount of change in luminance value between a normally attached and an abnormally attached front camera 11a tends to be large, so the presence or absence of an abnormality in the attachment position of the front camera 11a can be determined with still higher accuracy.
  • In the embodiment described above, it is determined whether the ratio of the area of the currently captured front camera image whose luminance value has not changed (the feature based on the amount of change in luminance value between a plurality of images captured by the in-vehicle camera) is equal to or greater than the threshold ratio α, which corresponds to the ratio between the area of the monitoring area image whose luminance change is equal to or greater than a predetermined amount and the area whose luminance change is less than that amount.
  • In the embodiment described above, the luminance value is calculated for the pixels of the entire front camera image (S204 in FIG. 3), and the currently captured front camera image is compared with the previously captured front camera image for every pixel (S206).
  • Instead, in a modification, the luminance value may be calculated only for the pixels of a region (a specific region) of the front camera image in which the luminance value is likely to change when the front camera 11a is attached normally, and only that region of the currently captured front camera image may be compared with the previously captured front camera image.
  • In the front camera image obtained when the front camera 11a is attached normally (when the monitoring area is photographed), the area in which the situation in front of the vehicle 1 appears is hatched.
  • This hatched region is a region in which the luminance value is likely to change in accordance with the situation in front of the vehicle 1.
  • The region that is not hatched is a region in which the luminance value is unlikely to change, because the bumper remains reflected there even when the situation around the vehicle 1 changes.
  • In this modification, the luminance value is therefore calculated only for the hatched region in which the luminance value is likely to change, and the currently captured front camera image is compared with the previously captured front camera image over that region.
  • If the attachment position of the front camera 11a becomes abnormal, the bumper comes to be reflected in part of the hatched area, that is, in part of the region over which the currently captured and previously captured front camera images are compared. Since the luminance values of the part of the hatched area in which the bumper is reflected hardly change, the amount of change in the luminance value of the hatched area becomes small. When the amount of change in the luminance value of the hatched area becomes small in this way, it is determined that the attachment position of the front camera 11a is abnormal.
  • In this way, the luminance value is calculated, and the currently captured and previously captured front camera images are compared, only for the pixels of the region of the front camera image in which the luminance value is likely to change, so the processing load on the image analysis apparatus 10 can be reduced.
  • In addition, since only the region of the front camera image in which the luminance value is likely to change is used for determining whether the attachment position of the front camera 11a is abnormal, noise (regions whose amount of change does not differ between a normal and an abnormal attachment position) can be excluded from the determination, and the presence or absence of an abnormality in the attachment position of the front camera 11a can be determined with still higher accuracy.
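  • Restricting the comparison to the specific region can be sketched by adding a boolean mask to the earlier ratio helper. The mask stands in for the hatched region described above and would in practice be prepared per vehicle and camera; here it is simply an input array, which is an assumption of the sketch.

```python
# Sketch of the specific-region modification: the unchanged pixel ratio is computed
# only over a mask marking the region where luminance is expected to change when the
# camera is mounted correctly.
import numpy as np


def unchanged_ratio_in_region(lum_now: np.ndarray, lum_prev: np.ndarray,
                              region_mask: np.ndarray, th: float = 10.0) -> float:
    """Unchanged pixel ratio over region_mask only (True = pixel is considered)."""
    change = np.abs(lum_now - lum_prev)[region_mask]
    return np.count_nonzero(change <= th) / change.size
```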
  • In the embodiment described above, the ideal feature amounts for a normally attached front camera 11a (the threshold ratio α, the threshold number β, the first predetermined amount, and the second predetermined amount) are constant regardless of the speed of the vehicle 1 and are compared with the actual feature amounts (the amount of change in the luminance value of the currently captured front camera image and the amount of change in the luminance value of the front camera image over the monitoring period).
  • an ideal feature amount may be determined according to the speed of the vehicle 1, and the ideal feature amount and the actual feature amount may be compared.
  • The amount of change in the luminance value of the area of the front camera image in which the situation in front of the vehicle 1 appears is expected to increase as the speed of the vehicle 1 increases. Accordingly, if the ideal feature amounts (the threshold ratio α, the threshold number β, the first predetermined amount, and the second predetermined amount) are kept constant regardless of the speed of the vehicle 1, the actual feature amounts become larger as the speed of the vehicle 1 increases, and the attachment position of the front camera 11a becomes more likely to be determined to be normal.
  • Therefore, the thresholds used for the abnormality determination (the ideal feature amounts, that is, the threshold ratio α, the threshold number β, the first predetermined amount, and the second predetermined amount) may be set more strictly as the speed of the vehicle 1 increases: the threshold ratio α and the threshold number β are set smaller, while the first predetermined amount and the second predetermined amount are set larger.
  • the value of the threshold ratio ⁇ is set smaller as the speed of the vehicle 1 increases.
  • the value of the threshold number ⁇ is set smaller as the average speed of the vehicle 1 in the monitoring period increases.
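  • The speed-dependent thresholds can be sketched as simple lookup functions. The text only says that α and β become smaller (stricter) as the speed, or average speed over the monitoring period, increases; the breakpoints and concrete values below are illustrative assumptions.

```python
# Sketch of speed-dependent thresholds: stricter values at higher speed.
def threshold_ratio_alpha(speed_kmh: float) -> float:
    """Threshold ratio alpha, reduced as the vehicle speed increases."""
    if speed_kmh < 20.0:
        return 0.30
    if speed_kmh < 60.0:
        return 0.25
    return 0.20


def threshold_count_beta(avg_speed_kmh: float, n_frames: int = 30) -> int:
    """Threshold number beta, reduced as the average speed over the period increases."""
    if avg_speed_kmh < 20.0:
        return int(0.85 * n_frames)   # ~25 of 30 frames, matching the example value
    if avg_speed_kmh < 60.0:
        return int(0.75 * n_frames)
    return int(0.65 * n_frames)
```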
  • In a third modification, the control device 12 is provided with a wiper operation detection unit 21 that detects that the wipers are operating and a headlamp lighting detection unit 22 that detects that the headlamps are lit.
  • Before the image analysis processing described above with reference to FIG. 2 is performed, the wiper operation detection unit 21 detects whether the wipers are operating. Since the wipers normally operate when it is raining, whether it is raining is determined by detecting whether the wipers are operating. Likewise, the headlamp lighting detection unit 22 detects whether the headlamps are lit; since the headlamps are normally lit when fog has formed or when traveling through a tunnel, whether fog has formed or the vehicle is in a tunnel is determined by detecting whether the headlamps are lit. The image analysis processing described above with reference to FIG. 2 is then executed only if the wipers are not operating and the headlamps are not lit, that is, only if it is not raining, foggy, or in a tunnel.
  • According to the third modification, the front camera 11a can be prevented from being determined to be abnormal when it is raining, when fog has formed, or when the vehicle is traveling through a tunnel even though the camera is actually normal, so the presence or absence of an abnormality in the attachment position of the front camera 11a can be determined with high accuracy.
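  • The gating of the third modification amounts to one boolean check before the analysis runs. In the sketch below, wiper_active and headlamp_on stand in for the signals of the detection units 21 and 22; the function name is illustrative.

```python
# Sketch of the third modification: skip the image analysis when the wipers are
# operating (taken as "raining") or the headlamps are lit (taken as "fog or tunnel").
def should_run_image_analysis(wiper_active: bool, headlamp_on: bool) -> bool:
    """Run the analysis only when a normally mounted camera should see a changing scene."""
    return not wiper_active and not headlamp_on
```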
  • The wiper operation detection unit 21 is also referred to as a wiper operation detection unit / device / means.
  • the headlamp lighting detection unit 22 is also referred to as a headlamp lighting detection device / means.
  • FIG. 11 shows a flowchart of the front camera process of the fourth modification.
  • the control device 12 causes the front camera 11a to take an image by transmitting a shooting instruction signal toward the front camera 11a, and acquires the image (front camera image) (S300).
  • the feature amount extraction unit 14 of the control device 12 calculates the bumper region as the “actual feature amount” from the acquired front camera image (S302). This calculation is performed using a known edge detection process or the like.
  • Next, the abnormality determination unit 16 of the control device 12 compares the calculated bumper region with the bumper region stored in advance in the feature amount storage unit 15 as the ideal feature amount (S304), and determines whether the bumper region calculated as the actual feature amount matches the bumper region stored as the ideal feature amount (S306). For example, it is determined whether the area (number of pixels) of the bumper region calculated as the actual feature amount falls within a range of ±5 % of the area (number of pixels) of the bumper region stored in advance as the ideal feature amount.
  • Alternatively, it may be determined whether the edge of the bumper region calculated as the actual feature amount lies within a predetermined distance (for example, 10 pixels or less) of the edge of the bumper region stored in advance as the ideal feature amount.
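  • The comparison of S304 to S306 can be sketched as below, assuming the bumper regions are available as boolean masks. The ±5 % area tolerance and the 10-pixel distance follow the examples in the text; the row-wise comparison of the bumper's upper boundary is only one possible way to approximate the "edge within a predetermined distance" check and is an assumption of the sketch.

```python
# Sketch of the fourth modification's bumper-region comparison (S304-S306).
import numpy as np


def bumper_regions_match(actual: np.ndarray, ideal: np.ndarray,
                         area_tol: float = 0.05, max_edge_shift: int = 10) -> bool:
    """actual, ideal: H x W boolean masks of the detected / stored bumper region."""
    # Area check: within +/- area_tol of the stored bumper area.
    area_actual = np.count_nonzero(actual)
    area_ideal = np.count_nonzero(ideal)
    if abs(area_actual - area_ideal) > area_tol * area_ideal:
        return False

    # Edge check: top-most bumper row per column (H if the column has no bumper pixel).
    h = actual.shape[0]
    top_actual = np.where(actual.any(axis=0), actual.argmax(axis=0), h)
    top_ideal = np.where(ideal.any(axis=0), ideal.argmax(axis=0), h)
    return int(np.max(np.abs(top_actual - top_ideal))) <= max_edge_shift
```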
  • If the bumper region calculated as the actual feature amount matches the bumper region stored in advance as the ideal feature amount (S306: YES), it is determined that the attachment position of the front camera 11a is normal, and the front camera process shown in FIG. 11 ends.
  • On the other hand, if the bumper region calculated as the actual feature amount does not match the bumper region stored in advance as the ideal feature amount (S308: YES), it is determined that the attachment position of the front camera 11a is abnormal, and the abnormality warning image (see FIG. 7) is displayed on the display unit 13 (S310).
  • the rear camera 11b, the left camera 11c, and the right camera 11d also execute the same processing as S300 to S310.
  • the luminance value of the pixel is employed as the “actual feature amount” or the “ideal feature amount”, but the RGB value or YUV value of the pixel may be employed.
  • In the embodiments described above, the bumper is reflected in the front camera image even when the front camera 11a is attached normally; however, the bumper may not appear in the front camera image when the front camera 11a is attached normally.
  • In the embodiments described above, when the mounting position of the front camera 11a is abnormal, the abnormality is detected from the fact that the area in which the bumper appears increases and the amount of change in the luminance value consequently decreases. The determination is not limited to this: when the attachment position of the front camera 11a is abnormal, it may instead be detected that the area of road surface such as asphalt increases and the amount of change in the luminance value consequently decreases, and the attachment position may be determined to be abnormal on that basis.
  • In the embodiments described above, the region in which the bumper is reflected is used as the region that always appears when the front camera 11a is attached normally, but a region in which another part of the vehicle 1, such as a door mirror or the bonnet, is reflected may be used instead.

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
  • Image Analysis (AREA)
PCT/JP2015/000182 2014-02-06 2015-01-16 Image analysis apparatus and image analysis method Ceased WO2015118806A1 (ja)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/116,362 US10220782B2 (en) 2014-02-06 2015-01-16 Image analysis apparatus and image analysis method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014021572A JP6364797B2 (ja) 2014-02-06 2014-02-06 Image analysis apparatus and image analysis method
JP2014-021572 2014-02-06

Publications (1)

Publication Number Publication Date
WO2015118806A1 (ja) 2015-08-13

Family

ID=53777621

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/000182 Ceased WO2015118806A1 (ja) 2014-02-06 2015-01-16 画像解析装置、および画像解析方法

Country Status (3)

Country Link
US (1) US10220782B2 (en)
JP (1) JP6364797B2 (ja)
WO (1) WO2015118806A1 (ja)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3200445A4 (en) * 2014-09-26 2018-06-06 Kyocera Corporation Image capture device, vehicle, and fluctuation detecting method

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107848465B (zh) * 2015-05-06 2021-06-01 麦格纳镜片美国有限公司 Vehicle vision system with blind zone display and warning system
JP6565769B2 (ja) * 2016-04-03 2019-08-28 株式会社デンソー In-vehicle camera mounting angle detection device, mounting angle calibration device, mounting angle detection method, mounting angle calibration method, and computer program
KR101816423B1 (ko) * 2016-07-12 2018-01-08 현대자동차주식회사 Display device replacing a side mirror and method for controlling its output brightness
JP6767255B2 (ja) * 2016-12-20 2020-10-14 株式会社デンソーテン Image processing device and image processing method
JP6807771B2 (ja) * 2017-02-20 2021-01-06 株式会社アルファ Vehicle monitoring device
JPWO2018155142A1 (ja) * 2017-02-21 2019-07-11 日立オートモティブシステムズ株式会社 Vehicle control device
CN108965687B (zh) 2017-05-22 2021-01-29 阿里巴巴集团控股有限公司 Shooting direction recognition method, server, monitoring method, monitoring system, and imaging device
JP6820075B2 (ja) * 2017-07-25 2021-01-27 日本電気株式会社 Occupant number detection system, occupant number detection method, and program
US11206375B2 (en) 2018-03-28 2021-12-21 Gal Zuckerman Analyzing past events by utilizing imagery data captured by a plurality of on-road vehicles
US11138418B2 (en) 2018-08-06 2021-10-05 Gal Zuckerman Systems and methods for tracking persons by utilizing imagery data captured by on-road vehicles
JP2020051168A (ja) * 2018-09-28 2020-04-02 コベルコ建機株式会社 Mounting state display device
JP7081425B2 (ja) * 2018-09-28 2022-06-07 コベルコ建機株式会社 Mounting position recognition device
JP7385412B2 (ja) 2019-09-25 2023-11-22 株式会社Subaru Automated driving system
JP7759739B2 (ja) * 2021-05-28 2025-10-24 株式会社Subaru Vehicle exterior imaging device
KR20240033944A (ko) * 2022-09-06 2024-03-13 현대자동차주식회사 Vehicle and method of reflecting a safe driving index of the vehicle driver

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004001658A (ja) * 2002-06-03 2004-01-08 Nissan Motor Co Ltd Optical axis misalignment detection device for an in-vehicle camera
JP2007008325A (ja) * 2005-06-30 2007-01-18 Nissan Motor Co Ltd Installation state determination device and method
JP2007038773A (ja) * 2005-08-02 2007-02-15 Auto Network Gijutsu Kenkyusho:Kk In-vehicle camera inspection device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4151137B2 (ja) 1998-12-16 2008-09-17 株式会社豊田自動織機 Obstacle detection device in a vehicle, and vehicle
JP4026543B2 (ja) * 2003-05-26 2007-12-26 日産自動車株式会社 Vehicle information providing method and vehicle information providing device
JP4666049B2 (ja) * 2008-10-17 2011-04-06 株式会社デンソー Light source identification device, light source identification program, vehicle detection device, and light control device
KR101424421B1 (ko) * 2009-11-27 2014-08-01 도요타지도샤가부시키가이샤 Driving support device and driving support method
JP5724446B2 (ja) 2011-02-21 2015-05-27 日産自動車株式会社 Driving support device for a vehicle
DE102012214464A1 (de) * 2012-08-14 2014-02-20 Ford Global Technologies, Llc System for monitoring and analyzing the driving behavior of a driver in a motor vehicle

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3200445A4 (en) * 2014-09-26 2018-06-06 Kyocera Corporation Image capture device, vehicle, and fluctuation detecting method
US10242460B2 (en) 2014-09-26 2019-03-26 Kyocera Corporation Imaging apparatus, car, and variation detection method

Also Published As

Publication number Publication date
JP2015149614A (ja) 2015-08-20
US10220782B2 (en) 2019-03-05
US20160347251A1 (en) 2016-12-01
JP6364797B2 (ja) 2018-08-01

Similar Documents

Publication Publication Date Title
JP6364797B2 (ja) Image analysis apparatus and image analysis method
US10449899B2 (en) Vehicle vision system with road line sensing algorithm and lane departure warning
JP6163207B2 (ja) In-vehicle device
US10268902B2 (en) Outside recognition system, vehicle and camera dirtiness detection method
JP5682304B2 (ja) Image providing device
JP5233583B2 (ja) In-vehicle monitoring device
US20170305365A1 (en) Driving information display apparatus and driving information display method
US9965690B2 (en) On-vehicle control device
JP5680436B2 (ja) Foreign matter adhesion determination device for an in-vehicle camera lens
JP6081034B2 (ja) In-vehicle camera control device
EP3115930A1 (en) Malfunction diagnosis apparatus
CN110378836B (zh) Method, system and device for acquiring 3D information of an object
JP7183729B2 (ja) Imaging abnormality diagnosis device
WO2016157698A1 (ja) Vehicle detection system, vehicle detection device, vehicle detection method, and vehicle detection program
US10516848B2 (en) Image processing device
JP6424449B2 (ja) Rearward situation display device and rearward situation display method
CN111886858A (zh) Vehicle image system
JP5861584B2 (ja) Weaving determination device
JP7262043B2 (ja) Abnormality detection system, moving body, abnormality detection method, and program
JP2006107000A (ja) Image abnormality determination method and image abnormality determination device
US20230274554A1 (en) System and method for vehicle image correction
US12432435B2 (en) In-vehicle camera shield state determination device
US20250242746A1 (en) Poor visibility determination device
JP2020009386A (ja) Surroundings recognition device and in-vehicle camera system
CN108297691A (zh) Method and system for providing notifications on a camera display of a vehicle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15746849

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 15116362

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15746849

Country of ref document: EP

Kind code of ref document: A1