WO2015045501A1 - External environment recognition device - Google Patents

External environment recognition device

Info

Publication number
WO2015045501A1
WO2015045501A1 (PCT/JP2014/065667)
Authority
WO
WIPO (PCT)
Prior art keywords
imaging
region
image information
camera
image
Prior art date
Application number
PCT/JP2014/065667
Other languages
English (en)
Japanese (ja)
Inventor
永崎 健
未来 樋口
亮 太田
達朗 小船
修之 一丸
及川 浩隆
Original Assignee
日立オートモティブシステムズ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日立オートモティブシステムズ株式会社 filed Critical 日立オートモティブシステムズ株式会社
Priority to JP2015538948A
Publication of WO2015045501A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20: Image signal generators
    • H04N13/204: Image signal generators using stereoscopic image cameras
    • H04N13/239: Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/50: Context or environment of the image
    • G06V20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

Definitions

  • the present invention relates to an in-vehicle external environment recognition device, for example, an apparatus for recognizing an environment outside a vehicle based on image information of an image sensor.
  • external recognition devices such as a stereo camera device that is mounted on a vehicle and recognizes the situation in front of the vehicle are known.
  • a stereo camera device that accurately measures the distance to subjects over a wide range, from short distance to long distance, while securing a wide field of view at short distance with a wide-angle lens, and a vehicle exterior monitoring device using the stereo camera device, are disclosed (see Patent Document 1 below).
  • the stereo camera device described in Patent Document 1 includes first and second imaging units arranged on the left and right to obtain two stereo images, one for short distance and one for long distance, and a first camera control unit that processes the long-distance image and a second camera control unit that processes the short-distance image.
  • the first imaging unit includes a first imaging lens for long distance, a second imaging lens for short distance, an imaging element having respective imaging regions for short distance and long distance, and a prism and a polarization beam splitter disposed between each imaging lens and the imaging element.
  • the second imaging unit likewise includes a third imaging lens for long distance, an imaging lens for short distance, an imaging element having respective imaging regions for short distance and long distance, and a prism and a polarization beam splitter disposed between each imaging lens and the imaging element.
  • the prism and polarization beam splitter of each imaging unit allow only the light of the S polarization component out of the light transmitted through the long-distance imaging lens to pass through the long-distance imaging region of the imaging device. Further, only the P-polarized light component of the light transmitted through the short-distance imaging lens is passed through the short-distance imaging region of the imaging element. Accordingly, the first camera control unit combines the right image and the left image of the long-distance subject imaged in the long-distance imaging region of the imaging element of each imaging unit as one image. Similarly, the second camera control unit combines the right image and the left image of the short-distance subject imaged in the short-distance imaging region of the imaging element of each imaging unit as one image.
  • the stereo camera device described in Patent Document 1 can simultaneously obtain a stereo image of a long-distance subject and a stereo image of a short-distance subject with different polarization components; however, it is difficult for it to recognize the state of the imaging target region by comparing these images.
  • the present invention has been made in view of the above problems, and an object of the present invention is to provide an external environment recognition apparatus that can recognize an environment outside a vehicle by comparing a plurality of images having different optical conditions.
  • an external environment recognition apparatus according to the present invention includes a camera that is mounted on a vehicle and acquires image information of an imaging target area in front of the vehicle, and an image processing unit that processes the image information. The camera includes an imaging element having first and second imaging regions, a first optical member that forms an image of the imaging target area on each of the first and second imaging regions, and a second optical member that allows light of different optical conditions to pass to each of the first and second imaging regions; the image processing unit compares first image information obtained from the first imaging region with second image information obtained from the second imaging region to recognize the state of the imaging target area.
  • the environment outside the vehicle can be recognized by comparing a plurality of images having different optical conditions.
  • Other problems and effects of the present invention will be clarified by the following embodiments.
  • A block diagram showing the schematic configuration of the image processing unit of the external environment recognition apparatus shown in FIG. 1; a diagram showing the arrangement of the cameras of the external environment recognition apparatus shown in FIG. 1; and (a) a diagram showing the image of the imaging target region.
  • FIG. 1 is a block diagram illustrating a schematic configuration of an in-vehicle system including an external environment recognition device 100 according to Embodiment 1 of the present invention.
  • FIG. 2 is a block diagram showing a schematic configuration of the image processing unit 2 of the external environment recognition apparatus 100 shown in FIG.
  • the external environment recognition apparatus 100 is an apparatus that is mounted on a vehicle and recognizes the environment outside the vehicle based on image information of a shooting target area in front of the vehicle.
  • the external environment recognition apparatus 100 is used, for example, for identification of road surface wetness, snow accumulation, and patterns; for distinguishing pedestrians, cyclists, and animals from the background; or for image processing under bad weather conditions such as snowfall, rain, and fog.
  • the external environment recognition apparatus 100 includes two cameras 1L and 1R arranged on the left and right sides for acquiring image information, and an image processing unit 2 for processing the image information.
  • Each of the left camera 1L and the right camera 1R includes a first optical member 11, a second optical member 12, and an image sensor 13.
  • the imaging element 13 has a first imaging area 13a and a second imaging area 13b that convert the images of the imaging target areas of the cameras 1L and 1R into image information.
  • the first optical member 11 includes a first lens 11a and a second lens 11b that are arranged adjacent to each other on the optical paths of the cameras 1L and 1R.
  • the first lens 11a is disposed so as to overlap a part of the second lens 11b on the optical path.
  • the first lens 11a and the second lens 11b are separate members.
  • alternatively, the first lens 11a and the second lens 11b may be integrally formed so that the first optical member 11 is a single member.
  • the second optical member 12 is a polarizing filter, for example, and is an optical filter that selectively transmits light of a predetermined polarization component.
  • the second optical member 12 is disposed on the optical path to the first imaging region 13a and is not disposed on the optical path to the second imaging region 13b, so that light L1 of a predetermined polarization component is allowed to pass to the first imaging region 13a and unpolarized light L2 is allowed to pass to the second imaging region 13b. That is, the second optical member 12 allows the light L1 and L2 having different optical conditions to pass to the first and second imaging regions 13a and 13b, respectively.
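  • As a rough sketch of the behaviour of such a polarizing filter (Malus's law; the intensities and angles below are illustrative assumptions, not values from this publication):

```python
import math

def transmitted_intensity(i0: float, polarization_angle_deg: float,
                          filter_axis_deg: float) -> float:
    """Malus's law: a linear polarizer transmits I = I0 * cos^2(theta),
    where theta is the angle between the light's polarization direction
    and the filter's transmission axis."""
    theta = math.radians(polarization_angle_deg - filter_axis_deg)
    return i0 * math.cos(theta) ** 2

# Horizontally polarized light (e.g. specular glare from a wet road surface)
# hitting a vertically oriented filter is almost fully blocked.
print(transmitted_intensity(1.0, 0.0, 90.0))  # ~0.0
```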
  • the first optical member 11 causes the light of the predetermined polarization component that has passed through the second optical member 12 to enter the second lens 11b, forming an image of the imaging target region on the first imaging region 13a of the imaging element 13. The first optical member 11 also causes the unpolarized light that does not pass through the second optical member 12 to enter the first lens 11a and the second lens 11b, forming an image of the imaging target region on the second imaging region 13b of the imaging element 13.
  • in this way, the cameras 1L and 1R have a polarization region A1 in which the first and second optical members 11 and 12 are disposed on the optical path, and a non-polarization region A2 in which the first optical member 11 is disposed on the optical path but the second optical member 12 is not.
  • the cameras 1L and 1R form an image of the imaging target area on the first imaging region 13a via the polarization region A1, and form an image of the imaging target area on the second imaging region 13b via the non-polarization region A2.
  • the image processing unit 2 compares the first image information obtained from the first imaging regions 13a of the imaging elements 13 of the cameras 1L and 1R with the second image information obtained from the second imaging regions 13b, and recognizes the state of the imaging target areas of the cameras 1L and 1R.
  • the image processing unit 2 will be described in detail.
  • the image processing unit 2 is connected to the imaging elements 13 of the two cameras 1 arranged as the left camera 1L and the right camera 1R of the stereo camera and, as shown in FIG. 2, includes image processing means 21, parallax processing means 22, road surface estimation means 23, and road surface μ data 24.
  • Each means is composed of, for example, a single or a plurality of computer units, and is configured to be able to exchange data with each other.
  • the image processing unit 2 converts the signals output from the first imaging regions 13a of the imaging elements 13 of the left and right cameras 1L and 1R into first image information using the image processing means 21.
  • the first image information is image information based on the image of the imaging target area formed on the first imaging region 13a of the imaging element 13 via the polarization region A1 of each of the left and right cameras 1L and 1R, that is, image information obtained through the second optical member 12 (the polarizing filter).
  • the image processing unit 2 is connected to the control unit 3, the storage unit 4, and the input / output unit 5 via the bus 6.
  • the control unit 3 includes a single or a plurality of computer units that control the image processing unit 2, the storage unit 4, and the input / output unit 5.
  • the storage unit 4 is configured by, for example, a memory that stores image information and the like obtained by the image processing unit 2.
  • the input / output unit 5 outputs the information output from the external environment recognition device 100 to the control system 8 of the host vehicle via the CAN 7.
  • the control system 8 can issue a warning to the occupant, brake the host vehicle, or perform avoidance control of the object based on information from the external environment recognition device 100.
  • similarly, the image processing unit 2 converts the signals output from the second imaging regions 13b of the imaging elements 13 of the left and right cameras 1L and 1R into second image information using the image processing means 21.
  • the second image information is image information based on the image of the imaging target area formed on the second imaging region 13b of the imaging element 13 via the non-polarization region A2 of each of the left and right cameras 1L and 1R.
  • FIG. 3 is a diagram showing the arrangement and coordinate system of the cameras 1L and 1R of the external environment recognition apparatus 100 shown in FIG.
  • the external environment recognition apparatus 100 uses an xyz orthogonal coordinate system in which the intermediate point between the left and right cameras 1L and 1R is the origin o, the horizontal direction (vehicle width direction) is the x axis, the vertical direction is the y axis, and the traveling direction of the vehicle is the z axis.
  • the left and right cameras 1L and 1R each have, as an imaging target region, a predetermined angle range that is symmetrical with respect to the center lines c1 and c2 parallel to the z axis.
  • the image processing unit 2 uses the parallax processing means 22 to perform parallax processing on the first image information, which is based on the images obtained through the second optical members 12 of the left and right cameras 1L and 1R by the image processing means 21.
  • specifically, the external environment recognition apparatus 100 uses the parallax processing means 22 to perform parallax processing on the same point of the imaging target area between the first image information of the imaging target area of the left camera 1L and the first image information of the imaging target area of the right camera 1R, and the distance to each point in the imaging target region can be obtained by the principle of triangulation.
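  • As a minimal numerical sketch of this triangulation step (the focal length, baseline, and disparity values below are hypothetical, not taken from this publication):

```python
def distance_from_disparity(disparity_px: float, focal_length_px: float,
                            baseline_m: float) -> float:
    """Distance Z to a point from stereo disparity: Z = f * B / d.

    disparity_px    -- horizontal shift of the same point between the
                       left (P1a) and right (P2a) polarized images
    focal_length_px -- camera focal length expressed in pixels
    baseline_m      -- distance between the optical centers of 1L and 1R
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite distance")
    return focal_length_px * baseline_m / disparity_px

# Hypothetical values for illustration only.
print(distance_from_disparity(disparity_px=8.0,
                              focal_length_px=1400.0,
                              baseline_m=0.35))  # roughly 61 m ahead of the vehicle
```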
  • the image processing unit 2 further includes road surface estimation means 23 for estimating a road surface friction coefficient μ, and road surface μ data 24 serving as a road surface μ estimation dictionary.
  • the road surface estimation means 23 acquires the first image information, based on the image of the imaging target area obtained through the second optical member 12 by the image processing means 21, and the second image information, based on the image of the imaging target area not obtained through the second optical member 12. The road surface μ data 24 records the relationship between the first image information, the second image information, and the friction coefficient μ of the road surface, based on the calculation and learning of the road surface estimation means 23.
  • the road surface estimation means 23 refers to the road surface μ data 24 based on the first and second image information acquired from the image processing means 21 and estimates the friction coefficient μ of the road surface in the imaging target area. Details of the calculation and learning of the road surface estimation means 23 and the road surface μ data 24 will be described later.
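  • The contents of the road surface μ data 24 are not reproduced in this publication; the sketch below only illustrates, under assumed values, how a comparison of the polarized and unpolarized road-surface patches could feed a dictionary lookup of μ. The ratio feature and the table entries are illustrative assumptions.

```python
import numpy as np

# Hypothetical dictionary: mean polarized/unpolarized intensity ratio -> friction coefficient.
# A wet or icy surface produces stronger specular, strongly polarized reflection,
# which the polarizing filter suppresses, lowering the ratio.
ROAD_MU_DATA = [
    (0.95, 0.8),   # dry asphalt
    (0.75, 0.5),   # damp surface
    (0.55, 0.3),   # wet surface
    (0.35, 0.1),   # ice / compacted snow
]

def estimate_mu(polarized_patch: np.ndarray, unpolarized_patch: np.ndarray) -> float:
    """Estimate road friction from corresponding road-surface patches of the
    first (polarized, P1a/P2a) and second (unpolarized, P1b/P2b) image information."""
    ratio = float(polarized_patch.mean()) / max(float(unpolarized_patch.mean()), 1e-6)
    # Return the mu of the closest dictionary entry.
    return min(ROAD_MU_DATA, key=lambda entry: abs(entry[0] - ratio))[1]
```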
  • FIG. 4 is a diagram showing the relationship between the image information P1 of the shooting target area of the left camera 1L and the image information P2 of the shooting target area of the right camera 1R.
  • in a stereo camera, parallax processing is normally performed in the region where the image information P1 of the imaging target region of the left camera 1L and the image information P2 of the imaging target region of the right camera 1R overlap. That is, the stereoscopic imaging target is captured in the image information P1a on the right side of the image information P1 of the left camera 1L and in the image information P2a on the left side of the image information P2 of the right camera 1R.
  • the image information P1b of the left region of the image information P1 of the left camera 1L and the image information P2b of the right region of the image information P2 of the right camera 1R are not subjected to parallax processing and are not used for stereoscopic viewing.
  • the imaging targets of the left and right cameras 1L and 1R include, for example, the road surface RS, road elements RE such as water, ice, snow, or oil on the road surface RS, obstacles OB including pedestrians and cyclists, and weather phenomena WP such as rain, snow, and smoke.
  • FIGS. 5A and 5B are explanatory diagrams for estimating the friction coefficient of the road surface.
  • FIG. 5A is a diagram showing image information PA1 of the imaging target region based on light that has passed through the polarizing filter, and FIG. 5B is a diagram showing image information PA2 of the imaging target region based on light that has not passed through the polarizing filter.
  • the road surface estimation means 23 compares the road element RE and the road surface RS in the image information PA1 obtained through the polarizing filter shown in FIG. 5A with the road element RE and the road surface RS in the image information PA2 obtained without the polarizing filter shown in FIG. 5B, and refers to the road surface μ data 24 based on this comparison. The road surface estimation means 23 then estimates the friction coefficient μ of the road surface RS and the road element RE based on the result of referring to the road surface μ data 24.
  • FIG. 6 is an explanatory diagram of a camera provided in the external environment recognition apparatus 100 shown in FIG.
  • the light L from the imaging targets RE, RS, OB, and WP in the imaging target areas of the left and right cameras 1L and 1R passes through the second optical member 12, which is the polarizing filter, in the polarization region A1 of the left and right cameras 1L and 1R. The light L1 of the predetermined polarization component that has passed through the second optical member 12 is then collected by the second lens 11b of the first optical member 11 and forms an image on the first imaging region 13a of the imaging element 13.
  • in the non-polarization region A2 of the left and right cameras 1L and 1R, the light L from the imaging targets RE, RS, OB, and WP does not pass through the second optical member 12; it is collected by the first lens 11a and the second lens 11b of the first optical member 11 and forms an image on the second imaging region 13b of the imaging element 13. As a result, the image information P1 and P2 of the imaging target regions as shown in FIG. 7 is acquired by the left and right cameras 1L and 1R.
  • FIG. 7 is an explanatory diagram of the image information P1, P2 of the shooting target area acquired by the left and right cameras 1L, 1R shown in FIG.
  • the image information P1a on the right side of the image information P1 of the imaging target area of the left camera 1L is based on the image formed on the first imaging region 13a of the imaging element 13 of the left camera 1L in the polarization region A1.
  • the image information P1b in the left area of the image information P1 is based on the image formed on the second imaging region 13b of the imaging element 13 in the non-polarization region A2 of the left camera 1L.
  • the image information P2a of the left area of the image information P2 of the imaging target area of the right camera 1R is based on the image formed on the first imaging region 13a of the imaging element 13 of the right camera 1R in the polarization region A1.
  • the image information P2b in the right region of the image information P2 is based on the image formed on the second imaging region 13b of the imaging element 13 in the non-polarization region A2 of the right camera 1R.
  • that is, the polarization region A1 is provided corresponding to the right side of the imaging target area of the left camera 1L and the left side of the imaging target area of the right camera 1R, and the non-polarization region A2 is provided corresponding to the left side of the imaging target area of the left camera 1L and the right side of the imaging target area of the right camera 1R.
  • the area of the polarization region A1 of the left and right cameras 1L and 1R is larger than the area of the non-polarization region A2.
  • FIGS. 8A and 8B are diagrams showing examples of image information P1 and P2 of the shooting target area acquired by the cameras 1L and 1R shown in FIG.
  • the image information P1a and P2a of the region corresponding to the polarization region A1, in which the second optical member 12 is disposed, is used for parallax processing.
  • as the image information P1b and P2b of the region corresponding to the non-polarization region A2, in which the second optical member 12 is not disposed, image information covering the same range as the imaging target area of the image information P1a and P2a corresponding to the polarization region A1 is acquired.
  • optical filters FL and FR may be arranged on the optical path of at least one non-polarization region A2 of the left and right cameras 1L and 1R shown in FIG. 6.
  • in that case, at least one of the image information P1b and P2b in the region corresponding to the non-polarization region A2 of the image information P1 and P2 shown in FIG. 8A becomes image information based on light that has passed through the optical filter FL or FR.
  • as the optical filters FL and FR, for example, a polarizing filter that transmits a polarization component different from that of the second optical member 12, a near-infrared filter, an ultraviolet filter, or the like can be used.
  • when optical filters FL and FR are provided in the non-polarization regions A2 of both the left and right cameras 1L and 1R, optical filters with the same optical conditions or optical filters with different optical conditions may be used for FL and FR.
  • of the image information P1 and P2 of the left and right cameras 1L and 1R, the image information P1a and P2a of the region corresponding to the polarization region A1, in which the second optical member 12 is arranged, is used for parallax processing.
  • as the image information P1b and P2b of the region corresponding to the non-polarization region A2, in which the second optical member 12 is not arranged, the image information of the left half and the image information of the right half of the imaging target area of the image information P1a and P2a are acquired.
  • the external environment recognition apparatus 100 having the above configuration captures the imaging target area in front of the vehicle with the left and right cameras 1L and 1R constituting the stereo camera, and the image processing unit 2 compares the first image information P1a and P2a obtained from the first imaging regions 13a of the imaging elements 13 with the second image information P1b and P2b obtained from the second imaging regions 13b to recognize the state of the imaging target area. Therefore, according to the external environment recognition apparatus 100 of the present embodiment, the environment outside the vehicle can be recognized by comparing the first image information P1a and P2a and the second image information P1b and P2b, which are acquired under different optical conditions by the left and right cameras 1L and 1R constituting the stereo camera.
  • more specifically, the cameras 1L and 1R include the polarization region A1, in which the first and second optical members 11 and 12 are disposed on the optical path, and the non-polarization region A2, in which only the first optical member 11 is disposed on the optical path; an image of the imaging target area through the polarization region A1 is formed on the first imaging region 13a of the imaging element 13, and an image through the non-polarization region A2 is formed on the second imaging region 13b.
  • the image processing unit 2 includes the road surface μ data 24, which records the relationship between the friction coefficient μ of the road surface RS, the first image information P1a and P2a, and the second image information P1b and P2b; it refers to the road surface μ data 24 based on the first image information P1a and P2a and the second image information P1b and P2b, and estimates the friction coefficient μ of the road surface RS and the road element RE in the imaging target region.
  • based on the estimation result, information is output via the input/output unit 5, and the vehicle control system 8 can warn the vehicle occupant, perform braking control of the vehicle, or perform avoidance control for avoiding danger.
  • the two cameras are arranged as the left camera 1L and the right camera 1R, and the image processing unit 2 performs parallax processing between the first image information P1a obtained by the left camera 1L and the first image information P2a obtained by the right camera 1R. This reduces the influence of reflected light from in-vehicle equipment or glass, so that the distance to a subject in the imaging target region, such as a road element RE or an obstacle OB including a pedestrian or a cyclist, can be measured accurately.
  • moreover, since the polarization region A1 of the cameras is larger than the non-polarization region A2, the image information P1a and P2a subjected to parallax processing can be enlarged, enabling even more accurate distance measurement.
  • furthermore, when the optical filters FL and FR are arranged on the optical path of at least one non-polarization region A2 of the left and right cameras 1L and 1R, not only the determination of the road surface RS and the road element RE and the estimation of the friction coefficient μ, but also more diverse recognition of the outside world becomes possible.
  • for example, a near-infrared filter can be used as the optical filter FL or FR disposed on the optical path of one non-polarization region A2 of the left and right cameras 1L and 1R.
  • in that case, of the second image information P1b and P2b, the image information obtained through the near-infrared filter on one side can be compared with the image information obtained without the near-infrared filter on the other side.
  • more generally, a near-infrared filter, an infrared filter, an ultraviolet filter, or the like can be used as the optical filters FL and FR disposed on the optical path of at least one non-polarization region A2 of the left and right cameras 1L and 1R.
  • using the fact that the scattering coefficient of light differs for each wavelength band, the scattering coefficient between the ideal image and the actually captured image can be estimated, and a high-quality image can be obtained by applying the inverse operation based on the estimated scattering coefficient.
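  • A minimal sketch of this idea, assuming a simple exponential attenuation model per wavelength band (the model, the reference image, and the depth map are generic dehazing assumptions, not the specific method disclosed here):

```python
import numpy as np

def estimate_scatter_coeff(captured_band: np.ndarray,
                           reference_band: np.ndarray,
                           depth_m: np.ndarray) -> float:
    """Estimate a per-wavelength-band scattering coefficient beta assuming
    a simple attenuation model: captured = reference * exp(-beta * depth)."""
    eps = 1e-6
    ratio = np.clip(captured_band / (reference_band + eps), eps, 1.0)
    return float(np.mean(-np.log(ratio) / (depth_m + eps)))

def remove_scatter(captured_band: np.ndarray, beta: float,
                   depth_m: np.ndarray) -> np.ndarray:
    """Invert the attenuation model to recover a higher-quality image band."""
    return captured_band * np.exp(beta * depth_m)
```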
  • in addition, by making the optical conditions of the optical filters FL and FR of the left and right cameras 1L and 1R different from each other, more varied estimations become possible. In this case, haze removal can be performed using not only a polarizing filter but also a near-infrared filter, an infrared filter, or an ultraviolet filter.
  • in the present embodiment, the first optical member 11 includes the first and second lenses 11a and 11b; the second lens 11b and the second optical member 12 are disposed on the optical path of the polarization region A1, and the first lens 11a and the second lens 11b are disposed on the optical path of the non-polarization region A2.
  • according to the external environment recognition apparatus 100 of the present embodiment, it is possible to recognize the environment outside the vehicle by comparing a plurality of images with different optical conditions using the stereo camera.
  • in the example shown in FIG. 7, the image information P1b and P2b of the area corresponding to the non-polarization region A2 is arranged on the left side of the image information P1 of the left camera 1L and on the right side of the image information P2 of the right camera 1R.
  • FIGS. 9(a), 9(b), and 9(c) show modifications of this arrangement.
  • for example, the polarization regions A1 may be provided corresponding to both the right and left sides of the imaging target areas of the left and right cameras 1L and 1R.
  • alternatively, the polarization region A1 may be provided corresponding to the upper side of the imaging target areas of the left and right cameras 1L and 1R, and the non-polarization region A2 corresponding to the lower side.
  • in this case, the image information P1a and P2a corresponding to the polarization region A1 is arranged above the image information P1 and P2 of the imaging target areas of the left and right cameras 1L and 1R, and the image information P1b and P2b corresponding to the non-polarization region A2 is arranged below.
  • the lower areas of the image information P1 and P2 are image information of the area close to the vehicle and are not normally used for parallax processing, so these areas can be used effectively.
  • conversely, the polarization region A1 may be provided corresponding to the lower side of the imaging target areas of the left and right cameras 1L and 1R, and the non-polarization region A2 corresponding to the upper side.
  • in this case, the image information P1a and P2a corresponding to the polarization region A1 is arranged below the image information P1 and P2 of the imaging target areas of the left and right cameras 1L and 1R, and the image information P1b and P2b corresponding to the non-polarization region A2 is arranged above.
  • the upper areas of the image information P1 and P2 are image information corresponding to the sky ahead of the vehicle and are not normally used for parallax processing, so these areas can be used effectively.
  • the arrangements of the image information P1a and P2a and the image information P1b and P2b shown in FIGS. 7, 9(a), and 9(b) can also be combined as appropriate.
  • for example, the image information P1a and P2a corresponding to the polarization region A1 can be arranged on the lower side and on the right side (for P1) or the left side (for P2) of the image information P1 and P2 of the left and right cameras 1L and 1R, and the image information P1b and P2b corresponding to the non-polarization region A2 can be arranged on the upper side and on the left side (for P1) or the right side (for P2).
  • Embodiment 2 of the external environment recognition apparatus of the present invention will be described with reference to FIGS. 10 to 12.
  • FIG. 10 is a block diagram showing a schematic configuration of the image processing unit 2B of the external environment recognition apparatus 100B according to the second embodiment of the present invention.
  • FIG. 11 is an explanatory diagram of the monocular camera 1 provided in the external environment recognition device 100B of the present embodiment.
  • FIG. 12 is a diagram illustrating an image of the imaging target area acquired by the monocular camera 1 provided in the external environment recognition device 100B of the present embodiment.
  • the external environment recognition device 100B of the present embodiment differs from the external environment recognition apparatus 100 of the first embodiment in that it does not include a stereo camera composed of the left and right cameras 1L and 1R, but instead includes one monocular camera 1 having the same configuration as the left camera 1L or the right camera 1R described above. Furthermore, the image processing unit 2B included in the external environment recognition device 100B of the present embodiment includes object recognition processing means 22B instead of the parallax processing means 22. The other points are the same, so the same parts are denoted by the same reference numerals and their description is omitted.
  • the monocular camera 1 provided in the external environment recognition device 100B of the present embodiment includes a first optical member 11 that forms an image of the imaging target region on each of the first and second imaging regions 13a and 13b of the imaging element 13.
  • the monocular camera 1 also includes a second optical member 12 that allows light of different optical conditions to pass to each of the first and second imaging regions 13a and 13b of the imaging element 13.
  • the first optical member 11 includes a plurality of first lenses 11a; in the present embodiment, for example, two first lenses 11a are arranged on both sides of the second optical member 12 in the y direction.
  • correspondingly, the imaging element 13 has second imaging regions 13b on both sides of the first imaging region 13a in the y direction.
  • the external environment recognition apparatus 100B having the above configuration captures the imaging target area in front of the vehicle with the monocular camera 1, and recognizes the imaging targets RE, RS, OB, and WP by the object recognition processing means 22B of the image processing unit 2B. Then, as shown in FIG. 12, the first image information P1a obtained from the first imaging region 13a and the second image information P1b obtained from the second imaging regions 13b are compared in the same manner as in the first embodiment to recognize the state of the imaging target area. Therefore, according to the external environment recognition device 100B of the present embodiment, the environment outside the vehicle can be recognized, as with the external environment recognition device 100 of the first embodiment, by comparing the first image information P1a and the second image information P1b obtained by the monocular camera 1 under different optical conditions.
  • FIG. 13 is an explanatory diagram of the estimation and learning of the road surface friction coefficient using the road surface μ data 24.
  • FIG. 13 and Table 1 below describe a mechanism for learning the road surface μ estimation dictionary (hereinafter referred to as the road surface μ data 24) necessary for estimating the road surface friction coefficient μ (hereinafter also referred to as the road surface μ).
  • the process for learning the road surface μ data 24 for recognizing a specific pattern in an image is as follows. First, as shown in FIG. 13, image/original signals S1, S2, ... such as the first image information P1a and P2a and the second image information P1b and P2b are input; various vector operations, matrix operations, autocorrelation operations, convolution operations, and other operations OP are performed on them together with the parameters stored in the road surface μ data 24; the result of determining which category an unknown pattern belongs to is output; and its correctness is verified. Automatic differentiation (AD) is a basic numerical operation mechanism that supports this calculation process.
  • in the AD number represented by the above formula (1), v is the place where the function value is stored, and the various calculations are performed on AD numbers. The detailed procedures of these operations are shown in Table 1.
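  • Formula (1) and Table 1 are not reproduced in this extract; the following is only a generic sketch of such an AD number, a pair carrying the function value v together with its derivative dv, with the elementary operations propagating both:

```python
class ADNumber:
    """Forward-mode AD value: v holds the function value, dv its derivative."""
    def __init__(self, v, dv=0.0):
        self.v, self.dv = v, dv

    def __add__(self, other):
        other = other if isinstance(other, ADNumber) else ADNumber(other)
        return ADNumber(self.v + other.v, self.dv + other.dv)

    def __mul__(self, other):
        other = other if isinstance(other, ADNumber) else ADNumber(other)
        # Product rule: (f*g)' = f'*g + f*g'
        return ADNumber(self.v * other.v, self.dv * other.v + self.v * other.dv)

    __radd__ = __add__
    __rmul__ = __mul__

# f(x) = x*x + 3*x evaluated at x = 2; the value and df/dx are obtained together.
x = ADNumber(2.0, 1.0)   # seed derivative dx/dx = 1
y = x * x + 3 * x
print(y.v, y.dv)         # 10.0 7.0
```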
  • the recognition dictionary parameter adjustment by the learning process can be flexibly configured according to the purpose and situation.
  • in the learning process, a plurality of images such as an image with a polarizing filter and an image without a polarizing filter, for example the first image information P1a and P2a and the second image information P1b and P2b, are treated as the original signals S1, S2, ..., and information on what kind of situation they represent, for example whether the road surface is wet, icy, snow-covered, or merely dirty in a way that can be mistaken for wetness, together with the measured road surface μ value in that situation, is attached as correct-answer information.
  • the process of automatically calculating the discrimination result and the process of calculating the gap between that result and the correct-answer information can all be written as a program including arithmetic operations and conditional branches such as if statements.
  • here, an automatic differentiation mechanism that can calculate the value and the partial derivative values at the same time is useful.
  • when the target numerical value, for example the estimated value of the road surface μ, is calculated, the partial derivative value with respect to each parameter used in the calculation is obtained at the same time.
  • the method of finely adjusting the parameters using these values is called the gradient method, and it requires the partial derivative values of the objective function.
  • in the road surface μ estimation, an image without a polarizing filter, an image with a polarizing filter, and correct-answer road surface information for those images are input; the road surface μ value for the input images is calculated with reference to the parameters described in the recognition dictionary (the road surface μ data 24); and the objective is to minimize the error between the calculated value and the correct answer.
  • this objective function often changes depending on the learning situation. For example, when other filters (infrared, ultraviolet, and so on) are used in addition to the polarizing filter, the number of input images increases to three or more. Even with such changes in conditions, if automatic differentiation is used, the value of the function and the derivatives with respect to its parameters can be obtained simply by writing the objective function as a program, so that learning by the gradient method can be carried out easily.
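  • A minimal sketch of learning by the gradient method as described above (the linear model, the ratio feature, and the learning rate are illustrative assumptions; the actual dictionary parameters and objective function are those of the road surface μ data 24):

```python
def train_mu_dictionary(samples, lr=0.01, epochs=200):
    """samples: list of (ratio_feature, measured_mu) pairs built from the
    polarized / unpolarized image pairs and their correct-answer mu values.
    Fits a toy linear model mu = w * ratio + b by gradient descent."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for ratio, mu_true in samples:
            mu_pred = w * ratio + b
            err = mu_pred - mu_true
            # Partial derivatives of the squared error with respect to w and b.
            w -= lr * 2.0 * err * ratio
            b -= lr * 2.0 * err
    return w, b

# Hypothetical training data: (ratio feature, measured road-surface mu).
w, b = train_mu_dictionary([(0.95, 0.8), (0.75, 0.5), (0.55, 0.3), (0.35, 0.1)])
```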
  • in the embodiments above, the case where lenses are used as the first optical member has been described, but other optical members including a mirror, a prism, or a beam splitter may be used as the first optical member.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Measurement Of Optical Distance (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention relates to an external environment recognition device capable of comparing a plurality of images, which are obtained from a single monocular camera or a stereo camera and have different optical conditions, and of recognizing the environment outside a vehicle. A camera (1L, 1R), which is mounted on a vehicle and acquires image information of an imaging target area in front of the vehicle, is provided with an imaging element (13) having first and second imaging regions (13a, 13b), a first optical member (11) for forming images of the imaging target area on the first and second imaging regions (13a, 13b), and a second optical member (12) (for example, a polarizing filter) that allows light having different optical conditions to pass to the first and second imaging regions (13a, 13b). An image processing unit (2) compares first image information from the first imaging region (13a) with second image information from the second imaging region (13b) and recognizes a state (for example, a road surface friction coefficient) of the imaging target area.
PCT/JP2014/065667 2013-09-27 2014-06-13 Dispositif de reconnaissance d'environnement externe WO2015045501A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015538948A JPWO2015045501A1 (ja) 2013-09-27 2014-06-13 外界認識装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013201555 2013-09-27
JP2013-201555 2013-09-27

Publications (1)

Publication Number Publication Date
WO2015045501A1 true WO2015045501A1 (fr) 2015-04-02

Family

ID=52742659

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/065667 WO2015045501A1 (fr) 2013-09-27 2014-06-13 Dispositif de reconnaissance d'environnement externe

Country Status (2)

Country Link
JP (2) JPWO2015045501A1 (fr)
WO (1) WO2015045501A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017056822A1 (fr) * 2015-09-30 2017-04-06 ソニー株式会社 Dispositif de traitement d'image, procédé de traitement d'image et système de commande de véhicule
JP2017083352A (ja) * 2015-10-29 2017-05-18 Smk株式会社 車載センサ、車両用灯具、車両及び路面状態センサ
CN112470456A (zh) * 2018-07-24 2021-03-09 株式会社东芝 铁路车辆用摄像系统
CN112997189A (zh) * 2018-10-31 2021-06-18 罗伯特·博世有限公司 借助第一超声波传感器的道路潮湿信息来支持行进工具的基于摄像机的周围环境识别的方法
JP7021798B1 (ja) 2020-12-04 2022-02-17 国立研究開発法人土木研究所 学習済みモデル生成方法、路面滑り摩擦係数推定装置、路面滑り摩擦係数推定プログラムおよび路面滑り摩擦係数推定方法
WO2023203748A1 (fr) * 2022-04-22 2023-10-26 日立Astemo株式会社 Unité de caméra

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021064833A1 (fr) * 2019-09-30 2021-04-08 日本電気株式会社 Dispositif de détection d'anomalie, procédé de détection d'anomalie, et support d'enregistrement

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6015532A (ja) * 1983-07-07 1985-01-26 Nippon Doro Kodan 路上水面測定装置
JPH11211659A (ja) * 1998-01-23 1999-08-06 Nagoya Denki Kogyo Kk 路面状態判別方法およびその装置
JPH11230898A (ja) * 1998-02-10 1999-08-27 Mitsubishi Motors Corp 路面状態判別装置
JP2010004090A (ja) * 2008-06-18 2010-01-07 Ricoh Co Ltd 撮像装置
JP2010025915A (ja) * 2008-06-18 2010-02-04 Ricoh Co Ltd 撮像装置及び路面状態判別方法
JP2010261877A (ja) * 2009-05-11 2010-11-18 Ricoh Co Ltd ステレオカメラ装置及びそれを用いた車外監視装置
WO2013114891A1 (fr) * 2012-02-03 2013-08-08 パナソニック株式会社 Dispositif d'imagerie et système d'imagerie

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040169656A1 (en) * 2002-11-15 2004-09-02 David Piponi Daniele Paolo Method for motion simulation of an articulated figure using animation input
JP4974543B2 (ja) * 2005-08-23 2012-07-11 株式会社フォトニックラティス 偏光イメージング装置
JP5102718B2 (ja) * 2008-08-13 2012-12-19 株式会社Ihi 植生検出装置および方法
JP2012008947A (ja) * 2010-06-28 2012-01-12 Hitachi Ltd 営業活動分析方法及び営業支援システム

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6015532A (ja) * 1983-07-07 1985-01-26 Nippon Doro Kodan 路上水面測定装置
JPH11211659A (ja) * 1998-01-23 1999-08-06 Nagoya Denki Kogyo Kk 路面状態判別方法およびその装置
JPH11230898A (ja) * 1998-02-10 1999-08-27 Mitsubishi Motors Corp 路面状態判別装置
JP2010004090A (ja) * 2008-06-18 2010-01-07 Ricoh Co Ltd 撮像装置
JP2010025915A (ja) * 2008-06-18 2010-02-04 Ricoh Co Ltd 撮像装置及び路面状態判別方法
JP2010261877A (ja) * 2009-05-11 2010-11-18 Ricoh Co Ltd ステレオカメラ装置及びそれを用いた車外監視装置
WO2013114891A1 (fr) * 2012-02-03 2013-08-08 パナソニック株式会社 Dispositif d'imagerie et système d'imagerie

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017056822A1 (fr) * 2015-09-30 2017-04-06 ソニー株式会社 Dispositif de traitement d'image, procédé de traitement d'image et système de commande de véhicule
CN108028022A (zh) * 2015-09-30 2018-05-11 索尼公司 图像处理装置、图像处理方法和车辆控制系统
US10769951B2 (en) 2015-09-30 2020-09-08 Sony Corporation Image processing apparatus, image processing method, and vehicle control system to determine the presence of an object from an image of a peripheral area of a moving body
CN108028022B (zh) * 2015-09-30 2021-06-15 索尼公司 图像处理装置、图像处理方法和车辆控制系统
JP2017083352A (ja) * 2015-10-29 2017-05-18 Smk株式会社 車載センサ、車両用灯具、車両及び路面状態センサ
CN112470456A (zh) * 2018-07-24 2021-03-09 株式会社东芝 铁路车辆用摄像系统
CN112997189A (zh) * 2018-10-31 2021-06-18 罗伯特·博世有限公司 借助第一超声波传感器的道路潮湿信息来支持行进工具的基于摄像机的周围环境识别的方法
JP7021798B1 (ja) 2020-12-04 2022-02-17 国立研究開発法人土木研究所 学習済みモデル生成方法、路面滑り摩擦係数推定装置、路面滑り摩擦係数推定プログラムおよび路面滑り摩擦係数推定方法
JP2022089676A (ja) * 2020-12-04 2022-06-16 国立研究開発法人土木研究所 学習済みモデル生成方法、路面滑り摩擦係数推定装置、路面滑り摩擦係数推定プログラムおよび路面滑り摩擦係数推定方法
WO2023203748A1 (fr) * 2022-04-22 2023-10-26 日立Astemo株式会社 Unité de caméra

Also Published As

Publication number Publication date
JPWO2015045501A1 (ja) 2017-03-09
JP2017151121A (ja) 2017-08-31

Similar Documents

Publication Publication Date Title
WO2015045501A1 (fr) Dispositif de reconnaissance d'environnement externe
KR101458287B1 (ko) 거리 측정용 카메라 장치
US20160379066A1 (en) Method and Camera System for Distance Determination of Objects from a Vehicle
KR101411668B1 (ko) 교정 장치, 거리 측정 시스템, 교정 방법, 및 교정 프로그램을 기록한 컴퓨터 판독 가능한 기록 매체
US11620837B2 (en) Systems and methods for augmenting upright object detection
US10412370B2 (en) Photographing device and vehicle
EP3150961B1 (fr) Dispositif de caméra stéréo et véhicule pourvu d'un dispositif de caméra stéréo
JP5967463B2 (ja) 物体識別装置、並びに、これを備えた移動体制御装置及び情報提供装置
US20200404224A1 (en) Multispectrum, multi-polarization (msmp) filtering for improved perception of difficult to perceive colors
CN107122770B (zh) 多目相机系统、智能驾驶系统、汽车、方法和存储介质
US10992920B2 (en) Stereo image processing device
US20180276844A1 (en) Position or orientation estimation apparatus, position or orientation estimation method, and driving assist device
US11012684B2 (en) Vehicular camera testing using a slanted or staggered target
WO2011016257A1 (fr) Dispositif de calcul de la distance pour un véhicule
JP2015195489A (ja) 衝突防止システム、衝突防止方法およびコンピュータプログラム
JP6204844B2 (ja) 車両のステレオカメラシステム
JP7498364B2 (ja) 雨、侵入光および汚れがある場合のカメラの画像の補正
US10643077B2 (en) Image processing device, imaging device, equipment control system, equipment, image processing method, and recording medium storing program
CN111989541A (zh) 立体摄像机装置
CN111373411A (zh) 用于确定与对象的间距的方法、设备和计算机程序
Barros et al. Deep speed estimation from synthetic and monocular data
JP7207889B2 (ja) 測距装置および車載カメラシステム
JP7492599B2 (ja) 車載カメラ装置
WO2024018709A1 (fr) Dispositif de caméra stéréo et procédé d'étalonnage
JP2023547515A (ja) 画像および/または画像点を偏位修正する方法、カメラベースのシステムおよび車両

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14848357

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2015538948

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14848357

Country of ref document: EP

Kind code of ref document: A1