CN102783144A - Vehicle perimeter monitoring device - Google Patents
- Publication number
- CN102783144A CN102783144A CN2011800118582A CN201180011858A CN102783144A CN 102783144 A CN102783144 A CN 102783144A CN 2011800118582 A CN2011800118582 A CN 2011800118582A CN 201180011858 A CN201180011858 A CN 201180011858A CN 102783144 A CN102783144 A CN 102783144A
- Authority
- CN
- China
- Prior art keywords
- mentioned
- vehicle
- image
- display
- display unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Images
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/30—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles providing vision in the non-visible spectrum, e.g. night or infrared vision
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/24—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view in front of the vehicle
-
- G06T5/94—
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/161—Decentralised systems, e.g. inter-vehicle communication
- G08G1/163—Decentralised systems, e.g. inter-vehicle communication involving continuous checking
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/105—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/304—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images
- B60R2300/305—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images merging camera image with lines or icons
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/307—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing virtually distinguishing relevant parts of a scene from the background of the scene
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8033—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for pedestrian protection
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8093—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for obstacle warning
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/165—Anti-collision systems for passive traffic, e.g. including static obstacles, trees
Abstract
An imaging means mounted on a vehicle captures grayscale images whose brightness values correspond to object temperature, and objects around the vehicle are detected from these images. From the grayscale images, display images are generated and shown on a display device mounted on the vehicle. The display images are generated by lowering the brightness of the areas that do not correspond to the detected objects. In the vehicle-width direction, the display device is positioned no more than a prescribed distance from an imaginary line passing through the center of rotation of the vehicle's steering wheel and extending in the longitudinal direction of the vehicle. Because the generated display images highlight only the objects, the driver can quickly grasp the presence of those objects even when using a display device positioned in this way.
Description
Technical field
The present invention relates to a vehicle periphery monitoring device, and more specifically to a device that detects objects around a vehicle and displays them to the driver.
Background art
Patent Document 1 below discloses the following system. A head-up display (HUD) is provided on a vehicle, and an infrared camera detects objects around the vehicle. An approach determination area is set in the travel direction of the vehicle, and an entry determination area is set outside it. If a detected object is located in the approach determination area, the object is displayed highlighted in the central region of the display screen of the head-up display; if a detected object is located in the entry determination area, the object is shown as a normal image in the left or right region of the display screen.
As described in Patent Document 1, the advantage of a head-up display is that it is placed in front of the driver, so the driver only needs to move his or her line of sight a small distance to check the display screen. At present, however, many vehicles are not equipped with such a head-up display. On the other hand, display devices installed on the instrument panel are common. In particular, with the spread of navigation devices, display devices that show map information and other information provided by a navigation device are widely used.
Such a general-purpose display device is usually located not in front of the driver but to the left or right, so compared with a head-up display the driver must move his or her line of sight a greater distance to check the screen. It is therefore desirable that the driver be able to comprehend the image shown on the display device in as short a time as possible.
Prior art documents
Patent documentation
Patent Document 1: Japanese Patent No. 4334686
Summary of the invention
Accordingly, there is a need for a technique that, when such a general-purpose display device is used, allows the driver to recognize the presence of an object in as short a time as possible.
Solution to the problem
One aspect of the present invention relates to a vehicle periphery monitoring device comprising: an imaging means that images the surroundings of the vehicle using a camera mounted on the vehicle; a grayscale-image acquisition means that obtains, from the image captured by the imaging means, a grayscale image whose brightness values correspond to the temperature of objects; an object detection means that detects prescribed objects present around the vehicle from the grayscale image; a display-image generation means that generates, based on the grayscale image, a display image to be shown on a display device mounted on the vehicle; and a display means that shows the generated display image on the display device. The display-image generation means generates the display image by lowering the brightness values of the regions of the grayscale image other than the detected objects.
According to the invention, the display image is generated by lowering the brightness values of the regions of the grayscale image outside the objects, and that image is shown on the display device. The display device therefore shows a high-contrast image in which only the objects stand out, so a driver looking at the display image can quickly recognize the presence of those objects.
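As a minimal sketch of this idea (not the patent's actual implementation), the following dims every pixel outside the detected object regions. The bounding-box representation of objects and the dimming factor are assumptions for illustration:

```python
def spotlight(gray, boxes, dim=0.3):
    """Return a copy of `gray` (2-D list of 0-255 brightness values) in
    which every pixel outside the given object bounding boxes is dimmed
    by factor `dim`, leaving only the objects at full brightness."""
    h, w = len(gray), len(gray[0])
    inside = [[False] * w for _ in range(h)]
    for (x0, y0, x1, y1) in boxes:          # inclusive corners
        for y in range(y0, y1 + 1):
            for x in range(x0, x1 + 1):
                inside[y][x] = True
    return [[gray[y][x] if inside[y][x] else int(gray[y][x] * dim)
             for x in range(w)] for y in range(h)]

img = [[100, 200, 100],
       [100, 220, 100]]
out = spotlight(img, [(1, 0, 1, 1)], dim=0.5)
```

With the warm (bright) column marked as an object, the two background columns are halved in brightness while the object column is untouched, producing the high-contrast "spotlight" effect described above.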
In one embodiment of the invention, the display device is arranged at a position visible to the driver of the vehicle, at no more than a prescribed distance, in the vehicle-width direction, from a straight line that passes through the center of rotation of the vehicle's steering wheel and extends in the longitudinal direction of the vehicle.
Such a display device is not a HUD with the small line-of-sight movement described above but a general-purpose display device placed to the left or right of the steering wheel, so the driver must move his or her line of sight a larger distance to view its screen. According to the invention, the high-contrast display image shortens the time the driver needs to recognize an object, so the driver can quickly identify objects even with such a general-purpose display device.
In one embodiment of the invention, the object detection means determines whether the probability of a collision between the vehicle and the object is high, and when the probability is determined to be high, the display-image generation means lowers the brightness values of the regions outside the detected object.
According to the invention, when the probability of a collision with an object is judged to be high, a high-contrast display image in which the object is highlighted is generated, so the driver can quickly recognize the presence of the object with which a collision is likely.
In one embodiment of the invention, the display-image generation means superimposes a simulated image of the object on the position where the object appears in the grayscale image, and the display means shows the superimposed display image on the display device.
According to the invention, the simulated image is superimposed on the position of the object in the grayscale image of the vehicle surroundings, so the driver's attention can be drawn to the superimposed position. In addition, because the actual image of the object is covered by the simulated image, the driver is encouraged to look ahead at the road.
In one embodiment of the invention, the display device is the display device of a navigation device. According to the invention, the navigation device's display can be used effectively to inform the driver of the presence of objects around the vehicle.
Other features and advantages of the present invention are described in detail below.
Description of drawings
Fig. 1 is a structural block diagram of a vehicle periphery monitoring device that uses the display device of a navigation device, in one embodiment of the present invention;
Fig. 2 shows the display device and cameras mounted on a vehicle in one embodiment of the present invention;
Fig. 3 is a flowchart of the program executed by the image processing unit in one embodiment of the present invention;
Fig. 4 shows the real-space coordinate system and the image coordinate system in one embodiment of the present invention;
Fig. 5 illustrates the method of calculating the relative-movement vector in one embodiment of the present invention;
Fig. 6 is a flowchart of the alarm determination program in one embodiment of the present invention;
Fig. 7 shows the imaging range in front of the vehicle and its division into areas in one embodiment of the present invention;
Fig. 8 illustrates the entry determination processing in one embodiment of the present invention;
Fig. 9 illustrates the generation of the display image in one embodiment of the present invention;
Fig. 10 shows an example of the display image in one embodiment of the present invention;
Fig. 11 is a flowchart of the alarm determination processing program in one embodiment of the present invention;
Fig. 12 illustrates the generation of the display image in one embodiment of the present invention.
Embodiment
Specific embodiments of the invention are described below with reference to the drawings. Fig. 1 is a structural block diagram of a vehicle periphery monitoring device that uses the display device of a navigation device, in one embodiment of the present invention; Fig. 2 shows the display device and cameras mounted on a vehicle.
A navigation device comprising a navigation unit 5 and a display device 4 is mounted on the vehicle. As shown in Fig. 2(a), the display device 4 is arranged at a position visible to the driver, within a prescribed distance from a straight line L1 that passes through the center of the steering wheel 21 of the vehicle and extends in the longitudinal direction of the vehicle (for ease of understanding, drawn in the figure as extending in the vertical direction). In this embodiment, the display device 4 is embedded in the instrument panel 23 of the vehicle.
The navigation unit 5 is a computer with a central processing unit (CPU) and memory. The navigation unit 5 has a communication device (not shown) through which it receives, for example, GPS signals from satellites for measuring the position of the vehicle 10, and it detects the current position of the vehicle 10 from these signals. The navigation unit 5 superimposes an image representing the current position on map information of the vehicle's surroundings (the map information may be stored in a storage device of the navigation device or obtained from a server through the communication device) and shows it on the display screen 25 of the display device 4. The display screen 25 of the display device 4 is a touch screen, and an occupant can enter a destination into the navigation unit 5 through the touch screen or through other input devices 27 such as keys or buttons. The navigation unit 5 calculates the optimum route of the vehicle to the destination, superimposes an image representing the route on the map information, and shows it on the display screen 25.
In addition, recent navigation devices have various other functions, such as providing traffic information and announcing facilities near the vehicle; any suitable navigation device can be used in this embodiment.
The vehicle periphery monitoring device further has an image processing unit 2 and a speaker 3. The image processing unit 2 is mounted on the vehicle and detects objects around the vehicle from image data captured by two infrared cameras 1R and 1L that can detect far-infrared radiation. Based on the detection result, the speaker 3 issues an alarm by sound (including chimes and synthesized voice). The display device 4 shows the images obtained by cameras 1R and 1L and is also used for displays that make the driver aware of objects around the vehicle. The device further has a yaw-rate sensor 6 that detects the yaw rate of the vehicle and a vehicle-speed sensor 7 that detects the travel speed (vehicle speed); the detection results of these sensors are sent to the image processing unit 2.
In this embodiment, as shown in Fig. 2(b), the cameras 1R and 1L are arranged at the front of the vehicle, symmetrically with respect to the center line through the middle of the vehicle-width direction, so as to image the area ahead of the vehicle 10. The two cameras are fixed to the vehicle with their optical axes parallel to each other and at equal heights above the ground. The infrared cameras 1R and 1L have the characteristic that the higher the temperature of an object, the higher the level of their output signal (that is, the larger the brightness value in the captured image).
Thus, in this embodiment, the display device 4 of the navigation device is used both to show the images captured by cameras 1R and 1L and to present the displays needed to inform (alert) the driver that a prescribed object has been detected from those images. As stated above, unlike a head-up display (HUD), which is provided on the windshield so that the image is displayed in front of the driver, the display device 4 is set a prescribed distance away from the steering wheel 21 in the vehicle-width direction. Compared with a HUD, the driver must therefore move his or her line of sight a larger distance to observe the image on the display device 4, and the time needed to observe and comprehend it is correspondingly longer. Accordingly, for the driver to recognize the presence of an object through the display device 4, the display should be easier to comprehend than with a HUD, so that the driver can identify the object in a short time. The object of the invention, plainly put, is to make the objects in the image more conspicuous. The concrete implementation is explained below.
Fig. 3 is a flowchart of the program executed by the image processing unit 2. This program is executed at prescribed time intervals. The processing of steps S11 to S23 is described in detail in Japanese Patent Application Publication No. 2001-6096, so it is only briefly described here.
In steps S11 to S13, the output signals of cameras 1R and 1L (that is, the captured image data) are received as input, A/D-converted, and stored in an image memory. The stored image data are grayscale images containing brightness information.
In step S14, the right image captured by camera 1R is taken as the reference image (the left image may be used instead), and its image signal is binarized. Specifically, regions brighter than a luminance threshold ITH are set to "1" (white) and darker regions to "0" (black). This binarization extracts, as white regions, objects whose temperature is higher than a prescribed temperature, such as living bodies (animals, people). The luminance threshold ITH can be determined by any suitable method.
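The binarization of step S14 can be sketched as a per-pixel threshold test (the threshold value here is an arbitrary example, not a value from the patent):

```python
def binarize(gray, ith):
    """Pixels brighter than the luminance threshold ITH become 1 (white);
    the rest become 0 (black)."""
    return [[1 if v > ith else 0 for v in row] for row in gray]

row = binarize([[30, 180, 200, 90]], ith=100)[0]
```

Warm objects such as pedestrians, which appear bright in the far-infrared image, survive as 1-pixels.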
In step S15, the binarized image data are converted to run-length data. Specifically, for the regions that became white through binarization, each horizontal run of white pixels in the pixel matrix (called a "run" or "line") is represented by the coordinates of its starting point (the leftmost pixel of the run) and its length from the starting point to the rightmost pixel, expressed as a pixel count. Here the vertical direction of the image is the y axis and the horizontal direction is the x axis. For example, if the white region in the pixel row with y coordinate y1 is a run from (x1, y1) to (x3, y1) consisting of three pixels, it is represented as (x1, y1, 3).
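The run-length conversion of step S15 can be sketched as follows; the (x_start, y, length) triple matches the representation described above:

```python
def encode_runs(binary):
    """Encode each horizontal run of 1-pixels as (x_start, y, length)."""
    runs = []
    for y, row in enumerate(binary):
        x = 0
        while x < len(row):
            if row[x] == 1:
                start = x
                while x < len(row) and row[x] == 1:
                    x += 1
                runs.append((start, y, x - start))
            else:
                x += 1
    return runs

runs = encode_runs([[0, 1, 1, 1, 0]])   # one 3-pixel run starting at x=1
```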
In steps S16 and S17, objects are labelled and extracted. That is, among the run-length-encoded runs, runs that overlap in the y direction are together regarded as one object, and a label is attached to it. In this way one or more objects are extracted.
In step S18, the center of gravity G, the area S, and the aspect ratio ASPECT of the circumscribed quadrangle of each extracted object are calculated. The area S is calculated by summing the lengths of the run-length data of the same object. The x coordinate of the center of gravity G is the x coordinate of the line that bisects the area S in the x direction, and its y coordinate is the y coordinate of the line that bisects S in the y direction. The aspect ratio ASPECT is the ratio Dy/Dx of the length Dy in the y direction to the length Dx in the x direction of the circumscribed quadrangle. The position of the center of gravity of the circumscribed quadrangle may be used instead of the center of gravity G.
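The statistics of step S18 can be computed directly from one object's run-length triples; treating the centroid as the mean pixel position is a simplification of the bisector description above (for solid regions they coincide):

```python
def object_stats(runs):
    """Area S, center of gravity G and bounding-box aspect ratio Dy/Dx of
    one labelled object given its run-length triples (x_start, y, length)."""
    area = sum(n for _, _, n in runs)                       # S
    gx = sum(sum(range(x, x + n)) for x, _, n in runs) / area
    gy = sum(y * n for _, y, n in runs) / area              # G = (gx, gy)
    xs = [x for x, _, _ in runs] + [x + n - 1 for x, _, n in runs]
    ys = [y for _, y, _ in runs]
    dx = max(xs) - min(xs) + 1                              # Dx
    dy = max(ys) - min(ys) + 1                              # Dy
    return area, (gx, gy), dy / dx                          # ASPECT = Dy/Dx

stats = object_stats([(1, 0, 3), (1, 1, 3)])   # a 3x2 solid block
```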
In step S19, time tracking of objects is performed; that is, the same object is recognized in each sampling period. The sampling period may be the same as the execution period of the program of Fig. 3. Specifically, with the time t discretized at the sampling period, the identity between an object A extracted at time k and an object B extracted at the next sampling time k+1 is determined. The identity determination can be made according to prescribed conditions, for example: (1) the differences between the x and y coordinate values of the centers of gravity G of objects A and B on the image are smaller than a permissible value; (2) the ratio of the image area of object B to that of object A differs from 1 by less than a prescribed permissible value; and (3) the ratio of the circumscribed-quadrangle aspect ratio of object B to that of object A differs from 1 by less than a prescribed permissible value. When these hold, objects A and B are determined to be the same object.
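Conditions (1) to (3) of step S19 can be sketched as below; the tolerance values and the dictionary field names are assumptions for illustration:

```python
def same_object(a, b, dpos=10.0, darea=0.5, dasp=0.5):
    """Identity test between object `a` at time k and object `b` at k+1:
    (1) centroid shift, (2) area ratio, (3) aspect-ratio ratio, each
    within a permissible tolerance."""
    return (abs(b["gx"] - a["gx"]) < dpos and
            abs(b["gy"] - a["gy"]) < dpos and
            abs(b["area"] / a["area"] - 1.0) < darea and
            abs(b["aspect"] / a["aspect"] - 1.0) < dasp)

prev = {"gx": 40.0, "gy": 30.0, "area": 120, "aspect": 2.0}
curr = {"gx": 42.0, "gy": 31.0, "area": 130, "aspect": 2.1}
matched = same_object(prev, curr)
```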
The position of each object extracted in each sampling period (in this embodiment, the coordinates of its center of gravity G) is stored in memory as time-series data together with its label.
The processing of steps S14 to S19 described above is performed on the binarized reference image (the right image in this embodiment).
In step S20, the vehicle speed VCAR detected by the vehicle-speed sensor 7 and the yaw rate YR detected by the yaw-rate sensor 6 are read, and the yaw rate YR is integrated over time to calculate the angle through which the vehicle 10 has turned, that is, the turning angle θr.
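The time integration of step S20 can be sketched as a discrete sum; the sampling interval and units are assumptions:

```python
def turning_angle(yaw_rates, dt):
    """θr ≈ Σ YR·dt: discrete time-integration of yaw-rate samples
    (rad/s) taken every dt seconds."""
    return sum(yr * dt for yr in yaw_rates)

theta_r = turning_angle([0.10, 0.12, 0.08], dt=0.1)   # radians
```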
In parallel with the processing of steps S19 and S20, in steps S31 to S33 the distance from the vehicle 10 to the object is calculated. Because this calculation takes longer than the processing of steps S19 and S20, it may be executed with a longer period (for example, three times the period of steps S19 and S20).
In step S31, one of the objects tracked in the binarized reference image (the right image in this embodiment) is selected, and the image region enclosed by its circumscribed quadrangle is taken as the search image R1. In step S32, the image corresponding to the search image R1 (hereinafter called the corresponding image) is searched for in the left image. Specifically, this is done by performing a correlation operation between the search image R1 and the left image. The correlation operation uses the grayscale images, not the binary images.
Here the search image R1 has M × N pixels; IR(m, n) is the brightness value at coordinates (m, n) within the search image R1, and IL(a+m−M, b+n−N) is the brightness value at coordinates (m, n) within a local region of the left image that has the same shape as the search image R1 and a prescribed base point (a, b). The total luminance difference C(a, b) is obtained while varying the base-point coordinates (a, b), and the position at which C(a, b) is minimized determines the position of the corresponding image.
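The correlation search of step S32 can be sketched as a sum-of-absolute-differences (SAD) scan. The patent indexes the left image relative to a base point (a, b) via IL(a+m−M, b+n−N); the sketch below uses a top-left base point instead, which is a simplifying assumption:

```python
def sad(tmpl, image, a, b):
    """Total absolute brightness difference C(a, b) between the M x N
    template and the same-sized patch of `image` at top-left (a, b)."""
    return sum(abs(tmpl[n][m] - image[b + n][a + m])
               for n in range(len(tmpl)) for m in range(len(tmpl[0])))

def find_match(tmpl, image):
    """Base point (a, b) minimizing C(a, b) -- the corresponding image."""
    h, w = len(tmpl), len(tmpl[0])
    cands = [(a, b) for b in range(len(image) - h + 1)
                    for a in range(len(image[0]) - w + 1)]
    return min(cands, key=lambda ab: sad(tmpl, image, *ab))

pos = find_match([[200, 90], [40, 30]],
                 [[10, 200, 90], [20, 40, 30]])   # exact match at (1, 0)
```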
Alternatively, a region to be searched may be set in advance in the left image, and the correlation operation performed between the search image R1 and that region.
In step S33, the distance dR (in pixels) between the position of the center of gravity of the search image R1 and the image center line LCTR (the line that bisects the captured image in the x direction), and the distance dL (in pixels) between the position of the center of gravity of the corresponding image and the image center line LCTR, are obtained and substituted into formula (2) to obtain the distance z from the vehicle 10 to the object.
Here B denotes the base length, that is, the distance in the x (horizontal) direction between the centers of the image sensors of cameras 1R and 1L (the interval between the optical axes of the two cameras); F denotes the focal length of the lenses of cameras 1R and 1L; p denotes the pixel pitch of the image sensors of cameras 1R and 1L; and Δd (= dR + dL) denotes the magnitude of the parallax.
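Formula (2) itself does not survive in the text, but from the symbols defined above (and f = F/p later in the text) it is presumably the standard stereo triangulation relation z = B·F/(p·Δd), which can be sketched as:

```python
def stereo_distance(dR, dL, B, F, p):
    """Distance z from the parallax Δd = dR + dL (pixels), base length B,
    focal length F and pixel pitch p.  Reconstructed as the standard
    stereo relation z = B*F / (p*Δd); the exact formula body is missing
    from the text, so this is an assumption."""
    return B * F / (p * (dR + dL))

# Hypothetical numbers: 30 cm base, 6 mm lens, 6 µm pixel pitch,
# each centroid offset 10 pixels from the centre line LCTR.
z = stereo_distance(10, 10, B=0.30, F=6e-3, p=6e-6)   # metres
```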
In step S21, the position of the object in the image (as stated above, the coordinates (x, y) of the center of gravity G in this embodiment) and the distance z calculated by formula (2) are substituted into formula (3) to transform them into real-space coordinates (X, Y, Z). As shown in Fig. 4(a), the real-space coordinate system takes as its origin O the midpoint (fixed with respect to the vehicle) of the line connecting the mounting positions of the two cameras 1R and 1L, with the X axis in the vehicle-width direction of the vehicle 10, the Y axis in the height direction of the vehicle 10, and the Z axis in the travel direction of the vehicle 10. As shown in Fig. 4(b), coordinates on the image are expressed in a coordinate system with the center of the image as the origin, the horizontal direction as the x axis, and the vertical direction as the y axis.
f = F/p
Here, (xc, yc) are the coordinates obtained by converting the coordinates (x, y) in the right image into coordinates in a virtual image whose center is made to coincide with the real-space origin O, based on the relative positional relationship between the mounting position of camera 1R and the origin O of the real-space coordinate system. f denotes the ratio of the focal length F to the pixel pitch p.
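Formula (3), with f = F/p, maps the centered image coordinates and the stereo distance to real-space coordinates. A minimal illustrative sketch (function name and test values are assumptions):

```python
def image_to_real_space(xc, yc, z, F, p):
    """Formula (3): map centered image coordinates (xc, yc) of an object at
    distance z to real-space (X, Y, Z); f = F/p is the focal length in pixels."""
    f = F / p
    return xc * z / f, yc * z / f, z
```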
In step S22, a turning-angle correction is performed to correct the positional deviation in the image caused by turning of the vehicle 10. For example, if the vehicle 10 turns leftward through a turning angle θr during the period from time k to time k+1, a deviation Δx arises in the x direction (positive direction) of the image obtained by the camera, and the correction compensates for it.
Specifically, the real-space coordinates (X, Y, Z) are substituted into formula (4) to calculate corrected coordinates (Xr, Yr, Zr). The calculated real-space position data of each object are stored in memory in time series. In the description below, the corrected coordinates are denoted (X, Y, Z).
In step S23, for each object, an approximate straight line LMV representing the relative movement vector of the object with respect to the vehicle 10 is obtained from N items of turning-angle-corrected real-space position data (for example, N ≈ 10), i.e. time-series data, acquired during a period ΔT. Specifically, with (lx, ly, lz) (|L| = 1) being the direction vector L of the approximate straight line LMV, the straight line expressed by the following formula (5) is obtained.
X=u·lx+Xav
Y=u·ly+Yav (5)
Z=u·lz+Zav
Here, u is a parameter taking an arbitrary value, and Xav, Yav and Zav are respectively the mean X, Y and Z coordinates of the real-space position data series. Eliminating the parameter u turns formula (5) into the following formula (5a).
(X − Xav)/lx = (Y − Yav)/ly = (Z − Zav)/lz    (5a)
Fig. 5 illustrates the approximate straight line LMV. P(0), P(1), P(2), …, P(N−2), P(N−1) denote the turning-angle-corrected time-series data, and the approximate straight line LMV is the line obtained by the least-squares method, i.e. the line passing through the mean position Pav (= (Xav, Yav, Zav)) of the time-series data that minimizes the sum of the squares of the distances to the data points. The larger the number in parentheses following P, the older (further in the past) the data point: P(0) denotes the latest position coordinate, P(1) the position coordinate one sampling period earlier, and P(2) the position coordinate two sampling periods earlier. The same applies to X(j), Y(j), Z(j), etc. below. A concrete method for calculating the approximate straight line LMV is described in detail in Japanese patent laid-open publication No. 2001-6096.
Next, the latest position coordinate P(0) = (X(0), Y(0), Z(0)) and the position coordinate P(N−1) = (X(N−1), Y(N−1), Z(N−1)) of N−1 sampling periods earlier (i.e. a time ΔT earlier) are adjusted onto the approximate straight line LMV. Specifically, the Z coordinates Z(0) and Z(N−1) are substituted into formula (5a) to obtain formula (6), and formula (6) gives the corrected position coordinates Pv(0) = (Xv(0), Yv(0), Zv(0)) and Pv(N−1) = (Xv(N−1), Yv(N−1), Zv(N−1)).
Zv(j) = Z(j),  j = 0, N−1
The vector from the position coordinate Pv(N−1) calculated by formula (6) toward Pv(0) is the relative movement vector to be obtained.
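The line fit of step S23 and the corrected endpoints of formula (6) can be sketched as follows. This assumes a least-squares fit via the principal direction of the centered data (the SVD route is one possible realization, not necessarily the method of the cited publication; names are illustrative):

```python
import numpy as np

def relative_motion_vector(points):
    """Fit the approximate straight line LMV to N time-series positions,
    slide the newest and oldest points onto it via their Z coordinates
    (formula (6)), and return the vector Pv(N-1) -> Pv(0)."""
    pts = np.asarray(points, dtype=float)   # row j = P(j); larger j is older
    mean = pts.mean(axis=0)                 # (Xav, Yav, Zav)
    # Direction vector L: dominant right singular vector of the centered data
    _, _, vt = np.linalg.svd(pts - mean)
    lx, ly, lz = vt[0]
    def on_line(z):                         # point on LMV with given Z
        u = (z - mean[2]) / lz
        return mean + u * np.array([lx, ly, lz])
    pv0 = on_line(pts[0, 2])                # corrected latest position Pv(0)
    pvN = on_line(pts[-1, 2])               # corrected oldest position Pv(N-1)
    return pv0 - pvN
```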
In this way, an approximate straight line LMV close to the relative-movement track of the object with respect to the vehicle 10 is calculated from a plurality of (N) data items within the monitoring period ΔT, and the relative movement vector is obtained; the influence of position-detection errors can thus be reduced, and the possibility of a collision between the vehicle 10 and the object can be predicted more accurately.
In step S24, an alarm determination process is executed. Fig. 6 shows this alarm determination process, which is described below with reference to Fig. 7. In Fig. 7, AR0 denotes the imaging range of cameras 1R and 1L. The processing of steps S11 to S23 in Fig. 3 is performed on the captured image corresponding to this imaging range AR0.
Region AR1 is obtained by adding a margin β (which may be, for example, about 50 to 100 mm) on each side of the vehicle width α of the vehicle 10; in other words, it is a region having a width of (α/2 + β) on each side of the central axis of the vehicle 10 in the vehicle-width direction. If an object continues to be present in this region, the possibility of a collision with the vehicle is high, so this region is called the approach determination region. Regions AR2 and AR3 are regions in which the absolute value of the X coordinate is larger than in the approach determination region (i.e. laterally outside it); an object located there may enter the approach determination region, so these regions are called entering determination regions. Regions AR1, AR2 and AR3 have a predetermined height H in the Y direction and a predetermined distance Z1 in the Z direction.
Returning to Fig. 6, in step S41 the possibility of a collision with the vehicle is determined for each object. Specifically, objects present in regions AR1 to AR3 are extracted, the relative movement speed Vs of each object with respect to the vehicle in the Z direction is calculated by formula (7), and the objects satisfying the conditions of formulas (8) and (9) are extracted.
Vs = (Zv(N−1) − Zv(0)) / ΔT    (7)
Zv(0)/Vs ≤ T    (8)
|Yv(0)| ≤ H    (9)
Here, Zv(0) is the latest detected distance (the suffix v indicates data corrected by the approximate straight line LMV; the Z coordinate, however, is the same as before correction), and Zv(N−1) is the detected distance a time ΔT earlier. T is a margin time, set so that the possibility of a collision is determined a time T before the predicted collision moment; T may be set to about 2 to 5 seconds, for example. Vs × T corresponds to the regions AR1 to AR3 described above. H defines the size (range) in the Y direction, i.e. a height; it is set to, for example, about twice the overall height of the vehicle 10, and equals the predetermined height H of regions AR1 to AR3. In this way the vertical size of regions AR1 to AR3 is limited to the predetermined height H and their distance-direction size to the predetermined distance Z1, and the objects located in regions AR1 to AR3 are judged to have a possibility of collision and are extracted.
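Formulas (7) to (9) together form a simple screening test: an object is a candidate if it is closing, if its time to reach the vehicle is within the margin time T, and if it lies within the height range H. A hedged sketch (argument names are illustrative):

```python
def collision_candidate(Zv0, ZvN1, Yv0, dT, T, H):
    """Apply formulas (7)-(9): closing speed Vs, margin-time test, height test."""
    Vs = (ZvN1 - Zv0) / dT        # formula (7): speed toward the vehicle
    if Vs <= 0:                   # object is not approaching
        return False
    return Zv0 / Vs <= T and abs(Yv0) <= H   # formulas (8) and (9)
```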
Then, in step S42, an approach determination process is performed to judge, for each extracted object, whether it is located in the approach determination region AR1. Specifically, it is judged whether the X coordinate Xv(0) of the object position Pv(0) lies within region AR1. An object for which the judgment is affirmative (YES) is judged to have a high possibility of colliding with the vehicle, and the processing of step S44 is executed immediately. An object for which the judgment is negative (NO) is present in region AR2 or AR3, and the entering determination process of step S43 is performed for it.
In the entering determination process, it is judged whether the difference between the latest x coordinate xc(0) of the object on the image (the suffix c indicates a coordinate corrected so that the center of the image coincides with the real-space origin O) and the x coordinate xc(N−1) a time ΔT earlier satisfies formula (10). If there is an object satisfying formula (10), it is judged that the object may move into the approach determination region AR1 and has a high possibility of colliding with the vehicle 10 (YES in S43), and the process proceeds to step S44. If there is no object satisfying formula (10), it is judged that no object with a possibility of collision is currently present in regions AR1 to AR3 (NO in S43), and the process proceeds to step S48. In step S48, the gray-scale image obtained in step S13 is sent to the display unit 4 as the display image and displayed in the ordinary manner.
The origin (principle) of formula (10) is briefly explained here. Referring to Fig. 8, let XCL be the X coordinate of the intersection of the approximate straight line LMV, i.e. the straight line through the latest position coordinate of the object 20 and its position coordinate a time ΔT earlier, with the XY plane (the plane containing the X axis and the Y axis, that is, the plane containing the straight line corresponding to the front end of the vehicle 10 (the X axis) and perpendicular to the travel direction of the vehicle 10). Taking the vehicle width α into account, the collision condition is then as shown in formula (11).
-α/2≤XCL≤α/2 (11)
On the other hand, the straight line obtained by projecting the approximate straight line LMV onto the XZ plane is expressed by formula (12).
Substituting Z = 0 and X = XCL into this formula to solve for XCL yields formula (13).
Furthermore, since the real-space coordinate X and the coordinate xc on the image are related as in formula (3), the following formulas (14) and (15) hold.
Xv(0) = xc(0) × Zv(0)/f    (14)
Xv(N−1) = xc(N−1) × Zv(N−1)/f    (15)
Applying these two formulas to formula (13), the X coordinate XCL of the intersection is given by formula (16). Substituting this into formula (11) and rearranging yields formula (10). The concrete entering determination process is described in detail in Japanese patent laid-open publication No. 2001-6096.
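Since formulas (12), (13) and (16) are not reproduced in this text, the following sketch reconstructs XCL from the surrounding derivation: the line through (Xv(N−1), Zv(N−1)) and (Xv(0), Zv(0)) in the XZ plane, evaluated at Z = 0, with (14) and (15) substituted. Treat it as an illustrative reconstruction, not the patent's exact formula:

```python
def entering_judgment(xc0, xcN1, Zv0, ZvN1, f, alpha):
    """Extend the projected motion line to the vehicle front (Z = 0) and
    test the collision condition |XCL| <= alpha/2 (formula (11))."""
    # Reconstructed formula (16): X coordinate where the line crosses Z = 0
    XCL = Zv0 * ZvN1 * (xc0 - xcN1) / (f * (ZvN1 - Zv0))
    return abs(XCL) <= alpha / 2
```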
Returning to Fig. 6, for an object judged in the approach determination process and the entering determination process described above to have a high possibility of colliding with the vehicle, an alarm output determination process is performed in step S44 to decide whether an actual alarm should be issued to the driver.
In this embodiment, the alarm output determination process decides whether to issue an actual alarm according to whether the brake is being operated. Specifically, it is judged from the output signal of a brake sensor (not shown) whether the driver of the vehicle 10 is performing a brake operation; if no brake operation is being performed, it is judged that an alarm should be issued (YES in S44), and the process proceeds to step S45.
If a brake operation is being performed, the resulting deceleration Gs (positive in the decelerating direction) is calculated. For each object, a threshold GTH is calculated as shown in formula (17); if there is an object for which the deceleration Gs is at or below the threshold GTH (Gs ≤ GTH), it is judged that an alarm should be issued (YES in S44), and the process proceeds to step S45.
If there is no object satisfying the condition Gs ≤ GTH, it is judged that the collision will be avoided by the brake operation (NO in S44), and the process proceeds to step S48; as described above, no alarm is output, only the gray-scale image is output to the display unit 4, and the display unit 4 performs ordinary display. The value of formula (17) corresponds to the condition that, if the braking deceleration Gs is maintained, the vehicle 10 comes to a stop within a travel distance of Zv(0) or less.
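Formula (17) itself is not reproduced in this text. However, the stated meaning, that maintaining deceleration GTH brings the vehicle to a stop within the distance Zv(0), implies under constant deceleration GTH = Vs² / (2·Zv(0)). The following is a hypothetical reconstruction on that assumption:

```python
def braking_threshold(Vs, Zv0):
    """Hypothetical reconstruction of formula (17): under constant
    deceleration G the vehicle stops after Vs**2 / (2*G) metres, so
    stopping within Zv0 requires GTH = Vs**2 / (2 * Zv0)."""
    return Vs ** 2 / (2 * Zv0)
```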
By performing the alarm output determination process, no alarm is issued when the driver is already performing an appropriate brake operation, so unnecessary annoyance to the driver can be avoided. Alternatively, the alarm output determination process may be omitted, and the process may proceed from steps S41 to S43 directly to step S45 whenever an object with a high possibility of collision is judged to exist.
In step S45, in the gray-scale image obtained in step S13, the luminance values of the region other than the region corresponding to the object judged in steps S41 to S43 to have a high possibility of collision are reduced.
Referring to Fig. 9, (a) is a schematic diagram of the gray-scale image obtained by the infrared cameras 1R and 1L in step S13. Suppose that the object (a pedestrian in this embodiment) 101 has been judged to have a high possibility of collision. For ease of understanding, in this embodiment the luminance value of the object 101 is I1 and the luminance value of the region other than the object is I2.
In step S45, as shown in Fig. 9(b), a display image is generated in which the luminance values of the entire region B2 other than a prescribed region B1 containing the object (which may be, for example, the region enclosed by the circumscribed quadrangle set for the object in step S18 of Fig. 3) are lowered. In this embodiment, the luminance values of all pixels in the region B1 containing the object are maintained (so the luminance of the pixels of the object 101 remains I1), while the luminance values of all pixels in the remaining region B2 are reduced from I2 by a prescribed amount to I3. In step S47, the display image thus generated is displayed on the display unit 4.
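Step S45 amounts to dimming everything outside the rectangle B1 by a fixed amount. A minimal sketch (NumPy; the function name, box convention and the dimming amount `dim_step` are assumptions for illustration):

```python
import numpy as np

def make_display_image(gray, box, dim_step=80):
    """Keep the brightness of the rectangle B1 enclosing the object and
    lower the luminance of everything outside it by dim_step."""
    x0, y0, x1, y1 = box                    # circumscribed rectangle of the object
    out = gray.astype(int) - dim_step       # darken the whole frame ...
    out[y0:y1, x0:x1] = gray[y0:y1, x0:x1]  # ... then restore region B1
    return np.clip(out, 0, 255).astype(np.uint8)
```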
As shown in Fig. 9(b), because the luminance of the region B2 outside the object region B1 is lowered, the pedestrian 101 in the object region B1 is highlighted relative to it. A display image with higher contrast is thus generated; compared with Fig. 9(a), the pedestrian appears brighter, as if standing out from the other parts.
Fig. 10 shows an example: (a) is an actually obtained gray-scale image, and (b) is the display image generated by the processing of step S45 above. A pedestrian 103 is captured in the gray-scale image; suppose that this pedestrian 103 has been judged to be an object with a high possibility of collision.
In the image of Fig. 10(b), the luminance of the region outside the prescribed region surrounding the pedestrian 103 is lowered. Compared with (a), the driver can therefore identify the pedestrian 103 quickly and recognize his presence at a glance. For comparison, Fig. 10(c) shows a conventional alarm display in which the object in the luminance image is highlighted with a highlighting frame 111. As shown in (c), since the gray-scale image captures many other things besides the pedestrian 103, even if the pedestrian 103 is highlighted, the driver may find it difficult to grasp in a short time where attention should be directed.
In particular, when the display unit 4 used is arranged at a position separated from the steering wheel by a certain distance in the vehicle-width direction, the method of Fig. 10(c) makes the user slower to identify the object. With the present invention, as shown in Fig. 10(b), the parts other than the pedestrian 103 are darker, so the time required to identify the pedestrian 103 can be shortened compared with (c).
In the illustrated embodiment, the luminance of the region outside the prescribed region surrounding the object is lowered; however, instead of setting such a prescribed region, only the luminance of the region outside the object 101 itself may be lowered. Also, in the description above, the luminance values of the image are reduced by a prescribed amount in order to lower the luminance of the region outside the object (or outside the prescribed region surrounding it); alternatively, the luminance values of all pixels in that region may be reduced to a preset lower value (for example, all black, or a luminance value close to black).
Although not illustrated, in step S47 an alarm sound may be issued through the loudspeaker 3 together with the output of the display image (signal) to the display unit 4. The form of the alarm sound is arbitrary: it may be a simple buzzer sound or any audio message (including voice).
In this embodiment, the display image is generated so that the luminance of the objects judged in steps S41 to S43 to have a high possibility of collision is maintained. Alternatively, when it is judged in the alarm output determination process of step S44 that a brake operation is being performed, the display image may be generated so that the luminance is maintained only for objects that are judged to have a high possibility of collision and that satisfy the condition Gs ≤ GTH in step S44. In this way, the objects that require attention are made recognizable to the driver.
Fig. 11 is a flowchart of the alarm determination process executed in step S24 of Fig. 3 according to another embodiment of the present invention. It differs from Fig. 6 in that step S46 is added; this is described with reference to Fig. 12.
In this embodiment, as shown in Fig. 12(a), an analog image of the object is stored in advance in the storage device of the image processing unit 2. The analog image is an icon image imitating the object. In this embodiment the object is assumed to be a pedestrian, and an icon 105 imitating a pedestrian is stored.
In step S46, this analog image is read out from the storage device and superimposed at the position of the object (i.e. the object whose luminance was not lowered, as described above) in the image resulting from step S45, generating the display image. As shown in Fig. 12, superimposing the analog image 105 of (a) at the position of the object 103 in the image of (b) obtained in step S45 (the same as Fig. 10(b)) yields the image shown in (c). In step S47, the image with the superimposed analog image is output to the display unit 4.
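The superimposition of step S46 is a straightforward paste of the stored icon over the (undimmed) object position. A hedged sketch (NumPy; function name and coordinates are illustrative):

```python
import numpy as np

def overlay_icon(display, icon, top_left):
    """Paste the stored icon (analog image) over the object position in
    the dimmed display image; the icon hides the real captured object."""
    y, x = top_left
    h, w = icon.shape[:2]
    out = display.copy()
    out[y:y + h, x:x + w] = icon
    return out
```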
The analog image is superimposed on the image so that, in the displayed image, the object is visually distinguished more strongly from the surrounding region. It is therefore preferable that the analog image have a high luminance value and a color that easily attracts the driver's attention (for example, red or yellow). It is also preferable that the luminance value of the analog image be set higher, by at least a prescribed amount, than the luminance of the region outside the object (i.e. the luminance lowered by the processing of step S45).
The luminance value of the analog image may be preset, or it may be variable. When it is variable, for example, the luminance value of the region outside the object that was lowered in step S45 (which may be the mean luminance of the pixels in that region) is obtained, and a prescribed amount is added to it to give the luminance value of the analog image. The analog image having the calculated luminance value is superimposed on the image to generate the display image.
As shown in Fig. 12(d), the superimposed analog image may additionally be enclosed by a highlighting frame 107 so that the analog image is highlighted. Like the analog image, the highlighting frame 107 may have a color that easily attracts the driver's attention and a high luminance value.
In this way, an image with higher contrast is displayed on the display unit 4, in which only the analog image indicating the position of the object is highlighted. Moreover, because the analog image is an imitation, distinct from the various captured objects (i.e. actually photographed things), a display using the analog image can be recognized quickly by the driver, who can thus promptly recognize the presence of the object requiring attention. Furthermore, because the actual captured image of the object is hidden by the analog image, the driver is encouraged to look ahead.
As living bodies, animals as well as people may be detected as objects. For this, an image imitating a pedestrian and an image imitating an animal may be generated and stored in advance. In that case, a determination process for judging whether the object is a pedestrian or an animal is performed, for example before step S35; any appropriate known method may be selected for this determination. If the object is judged to be a pedestrian, the analog image corresponding to a pedestrian is read out in step S46 and subjected to the superimposing process described above; if the object is judged to be an animal, the analog image corresponding to an animal is read out and subjected to the same superimposing process. The driver can thus promptly recognize the presence of the object (pedestrian or animal) requiring attention.
In the above embodiments, the approach determination region and the entering determination regions are used in the process of judging the collision possibility; however, the judgment method is not limited to these embodiments, and any other appropriate method of judging the collision possibility may be adopted.
In the above embodiments, the display unit of a navigation device is used as the display unit 4. Because the present invention enables the driver to identify the object in a short time, a display unit arranged to the left or right of the driver can be adopted. However, other display units may also be used; for example, an existing head-up display may be adopted.
Also, far-infrared cameras are used in the above embodiments, but other cameras (for example, visible-light cameras) may be used in the present invention.
Specific embodiments of the present invention have been described above; however, the present invention is not limited to these embodiments.
Description of reference numerals
1R, 1L: infrared camera (imaging means); 2: image processing unit; 3: loudspeaker; 4: display unit.
Claims (5)
1. A vehicle periphery monitoring device, comprising:
an imaging means that images the surroundings of a vehicle using a camera mounted on the vehicle;
a gray-scale image acquisition means that obtains, from the captured image taken by the imaging means, a gray-scale image having luminance values corresponding to the temperature of the object;
an object detection means that detects a prescribed object present in the surroundings of the vehicle from the gray-scale image;
a display image generation means that generates, based on the gray-scale image, a display image to be shown on a display unit mounted on the vehicle; and
a display means that displays the generated display image on the display unit,
characterized in that
the display image generation means generates the display image by lowering the luminance values of the region other than the detected object in the gray-scale image.
2. The vehicle periphery monitoring device according to claim 1, characterized in that the display unit is arranged at a position that is visible to the driver of the vehicle and that is separated, in the vehicle-width direction, by a prescribed distance from a straight line passing through the center of rotation of the steering wheel of the vehicle and extending in the front-rear direction of the vehicle.
3. The vehicle periphery monitoring device according to claim 1 or 2, characterized in that
the object detection means judges whether the possibility of a collision between the vehicle and the object is high, and
when the possibility of a collision between the vehicle and the object is judged to be high, the display image generation means lowers the luminance values of the region other than the detected object.
4. The vehicle periphery monitoring device according to any one of claims 1 to 3, characterized in that the display image generation means superimposes an image imitating the object on the position where the object is present in the gray-scale image, and the display means displays this superimposed display image on the display unit.
5. The vehicle periphery monitoring device according to any one of claims 1 to 4, characterized in that the display unit is a display unit of a navigation device.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010044465 | 2010-03-01 | ||
JP2010-044465 | 2010-03-01 | ||
PCT/JP2011/000948 WO2011108217A1 (en) | 2010-03-01 | 2011-02-21 | Vehicle perimeter monitoring device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102783144A true CN102783144A (en) | 2012-11-14 |
CN102783144B CN102783144B (en) | 2016-06-29 |
Family
ID=44541887
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201180011858.2A Active CN102783144B (en) | 2010-03-01 | 2011-02-21 | The periphery monitoring apparatus of vehicle |
Country Status (5)
Country | Link |
---|---|
US (1) | US8953840B2 (en) |
EP (1) | EP2544449B1 (en) |
JP (1) | JP5503728B2 (en) |
CN (1) | CN102783144B (en) |
WO (1) | WO2011108217A1 (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104269070A (en) * | 2014-08-20 | 2015-01-07 | 东风汽车公司 | Active vehicle safety pre-warning method and safety pre-warning system with same applied |
CN108431631A (en) * | 2015-12-21 | 2018-08-21 | 株式会社小糸制作所 | Vehicle image acquiring device, control device, include vehicle image acquiring device or control device vehicle and vehicle image acquiring method |
CN109844828A (en) * | 2016-10-19 | 2019-06-04 | 罗伯特·博世有限公司 | For generating the method and apparatus for being used for the urgent call of vehicle |
CN109935107A (en) * | 2017-12-18 | 2019-06-25 | 姜鹏飞 | A kind of method and device promoting traffic whole-visible area |
US11187805B2 (en) | 2015-12-21 | 2021-11-30 | Koito Manufacturing Co., Ltd. | Image acquiring apparatus for vehicle, control device, vehicle having image acquiring apparatus for vehicle or control device, and image acquiring method for vehicle |
US11194023B2 (en) | 2015-12-21 | 2021-12-07 | Koito Manufacturing Co., Ltd. | Image acquiring apparatus for vehicle, control device, vehicle having image acquiring apparatus for vehicle or control device, and image acquiring method for vehicle |
US11204425B2 (en) | 2015-12-21 | 2021-12-21 | Koito Manufacturing Co., Ltd. | Image acquisition device for vehicles and vehicle provided with same |
CN115210790A (en) * | 2020-08-28 | 2022-10-18 | Jvc建伍株式会社 | Object recognition control device and object recognition method |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2546819B1 (en) * | 2010-04-19 | 2015-06-03 | Honda Motor Co., Ltd. | Device for monitoring vicinity of vehicle |
JP2013253961A (en) * | 2012-05-07 | 2013-12-19 | Denso Corp | Image display system |
JP2014116756A (en) * | 2012-12-07 | 2014-06-26 | Toyota Motor Corp | Periphery monitoring system |
JP6296056B2 (en) * | 2013-06-17 | 2018-03-20 | ソニー株式会社 | Image processing apparatus, image processing method, and program |
KR102092625B1 (en) | 2013-10-15 | 2020-04-14 | 현대모비스 주식회사 | Method for alarming status of vehicle and Apparatus for the same |
CN104079881B (en) * | 2014-07-01 | 2017-09-12 | 中磊电子(苏州)有限公司 | The relative monitoring method of supervising device |
KR102149276B1 (en) * | 2014-10-23 | 2020-08-28 | 한화테크윈 주식회사 | Method of image registration |
KR102225617B1 (en) * | 2014-11-03 | 2021-03-12 | 한화테크윈 주식회사 | Method of setting algorithm for image registration |
JP6432332B2 (en) * | 2014-12-15 | 2018-12-05 | 株式会社リコー | Photoelectric conversion element, image reading apparatus, and image forming apparatus |
JP6402684B2 (en) * | 2015-06-10 | 2018-10-10 | トヨタ自動車株式会社 | Display device |
JP7143728B2 (en) * | 2017-11-07 | 2022-09-29 | 株式会社アイシン | Superimposed image display device and computer program |
CN107820116B (en) * | 2017-11-14 | 2020-02-18 | 优酷网络技术(北京)有限公司 | Video playing method and device |
JP6605052B2 (en) * | 2018-01-05 | 2019-11-13 | 株式会社 ミックウェア | Navigation system |
KR20200005282A (en) | 2018-07-06 | 2020-01-15 | 현대모비스 주식회사 | Apparatus and method for lateral image processing of a mirrorless car |
CN114930804A (en) * | 2020-02-14 | 2022-08-19 | 索尼集团公司 | Imaging device and vehicle control system |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08305999A (en) * | 1995-05-11 | 1996-11-22 | Hitachi Ltd | On-vehicle camera system |
JP2001023091A (en) * | 1999-07-07 | 2001-01-26 | Honda Motor Co Ltd | Picture display device for vehicle |
US6327536B1 (en) * | 1999-06-23 | 2001-12-04 | Honda Giken Kogyo Kabushiki Kaisha | Vehicle environment monitoring system |
US20060126898A1 (en) * | 2004-11-30 | 2006-06-15 | Honda Motor Co., Ltd. | Vehicle surroundings monitoring apparatus |
JP2010044561A (en) * | 2008-08-12 | 2010-02-25 | Panasonic Corp | Monitoring device to be mounted on vehicle |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3987013B2 (en) * | 2003-09-01 | 2007-10-03 | 本田技研工業株式会社 | Vehicle periphery monitoring device |
JP2005318408A (en) * | 2004-04-30 | 2005-11-10 | Nissan Motor Co Ltd | Vehicle surrounding monitoring apparatus and method |
US20090066819A1 (en) * | 2005-03-15 | 2009-03-12 | Omron Corporation | Image processing apparatus and image processing method, program and recording medium |
JP4426535B2 (en) * | 2006-01-17 | 2010-03-03 | 本田技研工業株式会社 | Vehicle periphery monitoring device |
JP4456086B2 (en) * | 2006-03-09 | 2010-04-28 | 本田技研工業株式会社 | Vehicle periphery monitoring device |
US7671725B2 (en) * | 2006-03-24 | 2010-03-02 | Honda Motor Co., Ltd. | Vehicle surroundings monitoring apparatus, vehicle surroundings monitoring method, and vehicle surroundings monitoring program |
JP4171501B2 (en) * | 2006-04-25 | 2008-10-22 | 本田技研工業株式会社 | Vehicle periphery monitoring device |
JP4173902B2 (en) * | 2006-05-19 | 2008-10-29 | 本田技研工業株式会社 | Vehicle periphery monitoring device |
JP4173901B2 (en) * | 2006-05-19 | 2008-10-29 | 本田技研工業株式会社 | Vehicle periphery monitoring device |
US7741961B1 (en) * | 2006-09-29 | 2010-06-22 | Canesta, Inc. | Enhanced obstacle detection and tracking for three-dimensional imaging systems used in motor vehicles |
DE102007011180A1 (en) * | 2007-03-06 | 2008-09-11 | Daimler Ag | Rangierhilfe and method for drivers of vehicles or vehicle combinations, which consist of mutually bendable vehicle elements |
US7936923B2 (en) * | 2007-08-31 | 2011-05-03 | Seiko Epson Corporation | Image background suppression |
DE102007044535B4 (en) * | 2007-09-18 | 2022-07-14 | Bayerische Motoren Werke Aktiengesellschaft | Method for driver information in a motor vehicle |
EP2401176B1 (en) * | 2009-02-27 | 2019-05-08 | Magna Electronics | Alert system for vehicle |
US8164543B2 (en) * | 2009-05-18 | 2012-04-24 | GM Global Technology Operations LLC | Night vision on full windshield head-up display |
- 2011-02-21 JP JP2012502992A patent/JP5503728B2/en not_active Expired - Fee Related
- 2011-02-21 EP EP11750331.8A patent/EP2544449B1/en active Active
- 2011-02-21 CN CN201180011858.2A patent/CN102783144B/en active Active
- 2011-02-21 US US13/579,754 patent/US8953840B2/en active Active
- 2011-02-21 WO PCT/JP2011/000948 patent/WO2011108217A1/en active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08305999A (en) * | 1995-05-11 | 1996-11-22 | Hitachi Ltd | On-vehicle camera system |
US6327536B1 (en) * | 1999-06-23 | 2001-12-04 | Honda Giken Kogyo Kabushiki Kaisha | Vehicle environment monitoring system |
JP2001023091A (en) * | 1999-07-07 | 2001-01-26 | Honda Motor Co Ltd | Picture display device for vehicle |
US20060126898A1 (en) * | 2004-11-30 | 2006-06-15 | Honda Motor Co., Ltd. | Vehicle surroundings monitoring apparatus |
JP2010044561A (en) * | 2008-08-12 | 2010-02-25 | Panasonic Corp | Monitoring device to be mounted on vehicle |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104269070A (en) * | 2014-08-20 | 2015-01-07 | 东风汽车公司 | Active vehicle safety pre-warning method and safety pre-warning system with same applied |
CN104269070B (en) * | 2014-08-20 | 2017-05-17 | 东风汽车公司 | Active vehicle safety pre-warning method and safety pre-warning system with same applied |
CN108431631A (en) * | 2015-12-21 | 2018-08-21 | 株式会社小糸制作所 | Image acquiring apparatus for vehicle, control device, vehicle having image acquiring apparatus for vehicle or control device, and image acquiring method for vehicle |
US11187805B2 (en) | 2015-12-21 | 2021-11-30 | Koito Manufacturing Co., Ltd. | Image acquiring apparatus for vehicle, control device, vehicle having image acquiring apparatus for vehicle or control device, and image acquiring method for vehicle |
US11194023B2 (en) | 2015-12-21 | 2021-12-07 | Koito Manufacturing Co., Ltd. | Image acquiring apparatus for vehicle, control device, vehicle having image acquiring apparatus for vehicle or control device, and image acquiring method for vehicle |
US11204425B2 (en) | 2015-12-21 | 2021-12-21 | Koito Manufacturing Co., Ltd. | Image acquisition device for vehicles and vehicle provided with same |
CN109844828A (en) * | 2016-10-19 | 2019-06-04 | 罗伯特·博世有限公司 | Method and apparatus for generating an emergency call for a vehicle |
CN109935107A (en) * | 2017-12-18 | 2019-06-25 | 姜鹏飞 | Method and device for improving overall traffic visibility |
CN115210790A (en) * | 2020-08-28 | 2022-10-18 | Jvc建伍株式会社 | Object recognition control device and object recognition method |
Also Published As
Publication number | Publication date |
---|---|
US8953840B2 (en) | 2015-02-10 |
JPWO2011108217A1 (en) | 2013-06-20 |
CN102783144B (en) | 2016-06-29 |
JP5503728B2 (en) | 2014-05-28 |
EP2544449A4 (en) | 2014-01-01 |
EP2544449A1 (en) | 2013-01-09 |
EP2544449B1 (en) | 2016-03-16 |
WO2011108217A1 (en) | 2011-09-09 |
US20130004021A1 (en) | 2013-01-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102783144A (en) | Vehicle perimeter monitoring device | |
CN102782741B (en) | Vehicle periphery monitoring device | |
JP4456086B2 (en) | Vehicle periphery monitoring device | |
US8766816B2 (en) | System for monitoring the area around a vehicle | |
JP4410292B1 (en) | Vehicle periphery monitoring device | |
JP4173902B2 (en) | Vehicle periphery monitoring device | |
CN106463060A (en) | Processing apparatus, processing system, processing program, and processing method | |
CN107848416A (en) | Display control unit, display device and display control method | |
JP2007193445A (en) | Periphery monitoring device for vehicle | |
JP4528283B2 (en) | Vehicle periphery monitoring device | |
JP2008027309A (en) | Collision determination system and collision determination method | |
CN110936963A (en) | Driver assistance system and method for displaying traffic information | |
JP4813304B2 (en) | Vehicle periphery monitoring device | |
JP5192007B2 (en) | Vehicle periphery monitoring device | |
JP3999088B2 (en) | Obstacle detection device | |
JP3919975B2 (en) | Vehicle periphery monitoring device | |
JP3949628B2 (en) | Vehicle periphery monitoring device | |
JP4823753B2 (en) | Vehicle periphery monitoring device | |
JP4176558B2 (en) | Vehicle periphery display device | |
JP4943403B2 (en) | Vehicle periphery monitoring device | |
JP2008276787A (en) | Vehicle periphery monitoring device | |
JP2007334763A (en) | Vehicular periphery monitoring system, vehicle, vehicular periphery monitoring program, constitution method for vehicular periphery monitoring system, and server | |
CN104192065A (en) | Automobile periphery monitoring device capable of distinguishing monitored objects |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication ||
PB01 | Publication ||
C10 | Entry into substantive examination ||
SE01 | Entry into force of request for substantive examination ||
C14 | Grant of patent or utility model ||
GR01 | Patent grant ||
TR01 | Transfer of patent right | Effective date of registration: 2021-05-07; Address before: Tokyo, Japan; Patentee before: HONDA MOTOR Co., Ltd.; Address after: Wajiada, Sweden; Patentee after: Vennell Sweden |
TR01 | Transfer of patent right | Effective date of registration: 2022-08-24; Address before: Wajiada, Sweden; Patentee before: Vennell Sweden; Address after: Linköping; Patentee after: Anzher software Co. |