CN101844545B - Vehicle periphery display device and method for vehicle periphery image - Google Patents

Vehicle periphery display device and method for vehicle periphery image

Info

Publication number
CN101844545B
CN101844545B CN2010101419438A CN201010141943A
Authority
CN
China
Prior art keywords
image
vehicle
road
graphic information
white line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN2010101419438A
Other languages
Chinese (zh)
Other versions
CN101844545A (en)
Inventor
今西胜之
若山信彦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Denso Corp
Original Assignee
Denso Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Denso Corp filed Critical Denso Corp
Publication of CN101844545A publication Critical patent/CN101844545A/en
Application granted granted Critical
Publication of CN101844545B publication Critical patent/CN101844545B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 1/00 - Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R 1/20 - Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R 1/22 - Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R 1/23 - Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R 1/26 - Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the rear of the vehicle
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B62 - LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D - MOTOR VEHICLES; TRAILERS
    • B62D 15/00 - Steering not otherwise provided for
    • B62D 15/02 - Steering position indicators; Steering position determination; Steering aids
    • B62D 15/027 - Parking aids, e.g. instruction means
    • B62D 15/0275 - Parking aids, e.g. instruction means by overlaying a vehicle path based on present steering angle over an image without processing that image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/50 - Context or environment of the image
    • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/588 - Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 2300/00 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R 2300/30 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R 2300/304 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 2300/00 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R 2300/60 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
    • B60R 2300/607 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective from a bird's eye viewpoint
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 2300/00 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R 2300/70 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by an event-triggered choice to display a specific image among a selection of captured images
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 2300/00 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R 2300/80 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R 2300/806 - Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for aiding parking

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Combustion & Propulsion (AREA)
  • Theoretical Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Transportation (AREA)
  • General Physics & Mathematics (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)

Abstract

The invention relates to a vehicle periphery display device and a method for a vehicle periphery image. An imaging unit captures an image of the periphery of the vehicle. An object detection unit obtains, from the image, image information related to a brightness or a color of the image and position information on the obtained image, and detects an object from the image information according to a specific feature of the image. A storage unit prestores a vehicle image and stores history data, which includes the image information associated with the position information. A graphic unit superimposes the vehicle image and an image produced according to the history data and converts the superimposed image so as to be viewed from a predetermined viewpoint. A peripheral image display region indicates a present periphery of the vehicle. A history display region adjacent to the peripheral image display region indicates the converted image and the detected object at a position according to the position information.

Description

Vehicle periphery display device and method for displaying a vehicle periphery image
Technical field
The present invention relates to a vehicle periphery display device for presenting the conditions around a vehicle. The invention further relates to a method for displaying a vehicle periphery image.
Background art
Conventionally, JP-A-2003-189291 and JP-A-2006-7498, for example, disclose techniques for presenting a synthesized view by combining the following images: an image obtained by imaging the area behind a host vehicle in real time, an image of a view that has moved out of the field of view as the vehicle moves backward, and an image of the host vehicle. The positional relationship between the host vehicle and its surroundings is thereby provided to the driver.
According to the first conventional art described in JP-A-2003-189291, an image obtained by a camera is converted, for example, into data in ground surface coordinates projected from the viewpoint of the camera, so as to continuously produce image data of a bird's eye view image. The displayed image is therefore not an image particularly distorted with respect to the original image. Subsequently, a first bird's eye view image obtained in a previous imaging cycle is moved in accordance with the movement of the vehicle so as to produce image data of a bird's eye view image after the vehicle has moved. The image data of the bird's eye view image after the movement of the vehicle is combined with image data of a second bird's eye view image that is newer than the first bird's eye view image, so as to produce a synthesized bird's eye view image. The synthesized bird's eye view image is further converted so as to appear like a projected image captured by the camera, and the projected image is displayed on a monitor device or the like.
According to the second conventional art disclosed in JP-A-2006-7498, a GPU corrects the distortion aberration caused by a wide-angle lens in the current image data and the previous image data so as to obtain undistorted images. The GPU extracts only the portion of the corrected previous image data on the vehicle side so as to obtain past data for a composite image. The viewpoint of the past data for the composite image is converted to a virtual viewpoint above the center of the rear wheel axle of the vehicle. The GPU extracts only the portion of the corrected current image data farther from the vehicle so as to obtain current data for the composite image, and performs reduction conversion so as to reduce the current data for the composite image at a predetermined reduction ratio and thereby produce reduced data. The GPU then produces the composite image by combining, in a synthesis region, the reduced data with the viewpoint-converted past data for the composite image, and outputs the produced composite image on the screen of a display device. The GPU also converts extended lines of the vehicle sides and lines drawn from the rear wheels according to the virtual viewpoint, and outputs the converted lines on the screen of the display.
In each of these conventional arts, various kinds of image processing, such as distortion correction, viewpoint conversion, and reduction conversion of the obtained image, are performed. The image processing load therefore becomes large.
Summary of the invention
In view of the above and other problems, an object of the present invention is to provide a vehicle periphery display device capable of conveniently displaying an object around the vehicle, such as a white line outside the field of view. Another object of the present invention is to provide a method for displaying a vehicle periphery image.
According to one aspect of the present invention, a vehicle periphery display device displays an image of the periphery of a vehicle. The vehicle periphery display device includes an imaging unit configured to capture an image of the vehicle periphery. The vehicle periphery display device further includes a first object detection unit configured to: obtain, from the captured image, graphic information related to at least one of the brightness and the color of the captured image; obtain position information on the captured image from the captured image; detect, from the obtained graphic information, an object including at least one of a road marking and an object on the road according to a specific feature of the object in the captured image; and associate the graphic information on the object with the position information on the object. The vehicle periphery display device further includes a storage unit configured to prestore a vehicle image of the vehicle and to store history data of the associated graphic information and position information. The vehicle periphery display device further includes a drawing unit configured to generate an image from the history data, to superimpose the generated image on the prestored vehicle image, and to convert the superimposed image so as to be viewed from a predetermined viewpoint. The vehicle periphery display device further includes a display screen including: a peripheral image display region configured to display a present image of the vehicle periphery obtained by the imaging unit; and a history display region adjoining the peripheral image display region and configured to display the converted image and the object detected from the graphic information at a position in the converted image according to the associated position information. The display screen is further configured to display a road marking in the present image of the peripheral image display region and a road marking in the history display region continuously across the boundary between the peripheral image display region and the history display region.
According to another aspect of the present invention, a method for displaying a vehicle periphery image includes: capturing a present image of the vehicle periphery; obtaining, from the captured image, graphic information related to at least one of the brightness and the color of the captured image; obtaining position information on the captured image from the captured image; detecting, from the obtained graphic information, an object including at least one of a road marking and an object on the road according to a specific feature of the object in the captured image; associating the obtained graphic information on the object with the obtained position information on the object; storing history data of the associated graphic information and the associated position information; generating an image from the history data; superimposing the generated image on a prestored vehicle image of the vehicle; converting the superimposed image so as to be viewed from a predetermined viewpoint; displaying the present image of the vehicle periphery in a peripheral image display region; displaying the converted image in a history display region adjoining the peripheral image display region; and displaying the detected object in the converted image at a position according to the associated position information, wherein a road marking in the present image of the peripheral image display region and a road marking in the history display region are displayed continuously across the boundary between the peripheral image display region and the history display region.
Brief description of the drawings
The above and other objects, features, and advantages of the present invention will become more apparent from the following detailed description made with reference to the accompanying drawings, in which:
Fig. 1 is a block diagram showing a vehicle rear display device according to a first embodiment;
Fig. 2 is a schematic diagram showing the positional relationship between the vehicle and the camera;
Fig. 3 is a diagram showing the brightness distribution in an obtained image, used to describe a method for detecting a white line on the road surface as a detection target from the obtained image;
Fig. 4 is a main flowchart showing the operation of the vehicle rear display device when the vehicle is parked in reverse;
Fig. 5 is a subroutine of the main flowchart for image processing;
Fig. 6 is a view showing an original image obtained by the camera and displayed over the entire screen;
Fig. 7 is a view showing an image displayed when the object detection operation according to the embodiment is not performed;
Fig. 8 is a view showing an image displayed by the vehicle rear display device according to the embodiment when the vehicle is parked in reverse; and
Fig. 9 is a view showing the corrected white line obtained when a correction operation is performed on the white line in the history display region of Fig. 8.
Detailed description of the embodiments
A vehicle periphery display device can display the conditions around a vehicle and assist the driver's driving operation through what is presented on the screen. A vehicle rear display device will be described with reference to Figs. 1-9 as an example of a vehicle periphery display device. The vehicle rear display device displays a composite image that includes a current real image of the area behind the vehicle, a vehicle image, and an image of the vehicle periphery generated from previous images continuous with the real image. The vehicle rear display device can thereby inform the driver of the accurate position of the host vehicle and effectively assist parking in reverse.
Fig. 1 shows a block diagram of the vehicle rear display device. Fig. 2 is a schematic diagram showing the positional relationship between the vehicle 10 and the camera 7. The vehicle rear display device includes a camera 7, a display device 9, a graphic processing circuit (drawing unit) 4, a decoder 8, a memory device 5, a microcomputer 1, and a vehicle movement detection unit 2. The camera 7 obtains (captures) an image of the area behind the vehicle. The display device 9 displays the image obtained (captured) by the camera 7. The graphic processing circuit 4 generates the image to be displayed on the display device 9. The decoder 8 converts the analog signal sent from the camera 7 into a digital signal and sends the converted digital signal to the graphic processing circuit 4 as an image signal. The memory device 5 stores graphic information on the image obtained by the camera 7. The microcomputer 1 controls the display of images on the display device 9.
As shown in Fig. 2, the camera 7 is located on the upper side of the rear bumper 10a of the vehicle 10. The camera 7 has a predetermined field of view behind the vehicle within a predetermined angular range, such as 180 degrees, and a predetermined distance range. The camera 7 serves as an imaging unit for obtaining a rear image over this entire wide range.
The memory device 5 serves as a storage unit and includes a random access memory (RAM) for storing, as image data, the image signal sent from the decoder 8. The memory device 5 stores, as history data, graphic information on previous images obtained by the camera 7 in association with position information on the images. In addition, the memory device 5 prestores a rear image of the vehicle 10. The rear image of the vehicle 10 is generated so as to have an accurate positional relationship with the obtained image. The rear image of the vehicle 10 prestored in the memory device 5 includes the rear bumper 10a of the vehicle 10 and the like. In the embodiment described below, a computer graphics image (a vehicle computer graphic, vehicle CG) viewed from a predetermined viewpoint at a predetermined position on the upper side (high side) of the vehicle is generated in advance and used as an example of the rear image. The memory device 5 communicates with each of the microcomputer 1 and the graphic processing circuit 4.
The microcomputer 1 serves as an object detection unit for detecting, from the image, at least one of a road marking and an object on the road that is located around the vehicle and needs to be detected to assist the driving operation. An object on the road is, for example, another vehicle, a wall, or a pedestrian detected by using an ultrasonic sonar or the like. A road marking is an obstacle, a parking frame, a guide line, or the like located on the road or in the space into which the vehicle 10 is about to move. An obstacle is, for example, a groove, a step, a curb, a pole, or a person. A parking frame is, for example, a white line indicating a parking space. The road marking serves as a mark around the vehicle.
The present applicant has focused on detecting, from an image that has moved out of the field of view, the edge of a white line serving as a mark around the vehicle, and displaying this edge in combination with the current original image, so that a view of the vehicle periphery is obtained and the white line is detected without advanced image processing. In the processing of detecting the edge of the white line, however, an object other than the white line may be erroneously detected from the obtained image. As a result, the object (noise) other than the white line is drawn as a white line. Points having the same color as the white line are therefore drawn in regions of the displayed image other than the white line, and an image that is difficult to recognize is provided to the driver.
In view of this point, the microcomputer 1 obtains, from the image obtained by the camera 7, graphic information related to at least one of the brightness and the color of the image and position information on the image. The microcomputer 1 then detects the object based on the obtained graphic information. The obtained graphic information is associated with the position information on the image and stored in the memory device 5 as history data. The microcomputer 1 detects the object from the obtained graphic information according to a specific feature in the image. The specific feature in the image is a characteristic feature, generated from the graphic information, of the object in the image, for example a color feature, a brightness, a luminance feature, or a shape feature. The microcomputer 1 serves as the object detection unit for extracting the specific feature of the image so as to detect the detection target corresponding to the specific feature.
For example, the microcomputer 1 uses an object detection method that detects, as the detection target, a range of the image whose brightness or color differs from that of the surrounding image. An example of the object detection method used by the microcomputer 1 as the object detection unit is explained below. The microcomputer 1 controls the timing and the position at which the generated image is displayed on the display device 9. The microcomputer 1 also serves as a display control unit for controlling the display of images on the display device 9. The microcomputer 1 further serves as a control device that receives signals from the vehicle movement detection unit 2, the vehicle state detection unit 3, the memory device 5, the graphic processing circuit 4, and the like, performs predetermined calculations, and outputs image data as the calculation result to the graphic processing circuit 4. The microcomputer 1 includes a read-only memory (ROM) that stores calculation programs and the like, and a random access memory (RAM) used as a work space for the calculations. The microcomputer 1 executes the programs stored in the ROM and reads information that is required for executing the programs and is temporarily stored in the memory device 5 and the RAM.
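For purposes of illustration only, the association between graphic information and position information stored as history data can be sketched as below. The record layout, field names, and types are assumptions; the embodiment does not prescribe a concrete data format.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class HistoryRecord:
    """One detected feature stored as history data (assumed layout):
    graphic information (brightness, color, width) associated with
    position information (image coordinates of the feature)."""
    position: Tuple[float, float]     # (x, y) image coordinates
    brightness: int                   # brightness level taken from the image
    color: Tuple[int, int, int]       # (R, G, B) color levels
    width: int                        # e.g. white-line width in pixels

history: List[HistoryRecord] = []     # accumulated in the memory device 5

def store_detection(position, brightness, color, width):
    """Associate the graphic information with the position information
    and append the record to the history data."""
    history.append(HistoryRecord(position, brightness, color, width))
```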
The graphic processing circuit 4 serves as the drawing unit for generating images, outputs the generated images to the display device 9, and communicates with the microcomputer 1 serving as the object detection unit. The graphic processing circuit 4 generates (draws) the image to be shown on the display device 9 according to the result of the image processing performed by the microcomputer 1. The graphic processing circuit 4 generates a predetermined image based on an instruction for generating the predetermined image given by the microcomputer 1. The microcomputer 1 and the graphic processing circuit 4 thus serve as an image generation unit for generating images. The graphic processing circuit 4 outputs the original image of the area behind the vehicle obtained by the camera 7 so that the original image is placed in a rear image display region 20 (Fig. 7). The graphic processing circuit 4 superimposes, in a history display region 30, at least one of an object on the road and a road marking around the rear of the vehicle extracted from the history data, together with the vehicle CG, so as to be viewed from the predetermined viewpoint, and outputs the superimposed view to the display device 9. The graphic processing circuit 4 may display at least one of the axle portion 11 and the wheels 12 of the vehicle 10 in the history display region 30. The graphic processing circuit 4 adds, to the image in the history display region 30, the same distortion as that of the vehicle rear image in the rear image display region 20. The graphic processing circuit 4 also draws the extracted object on the road or road marking in the history display region 30 so that it is displayed continuously with the object on the road or road marking currently displayed in the rear image display region 20. The graphic processing circuit 4 performs drawing by generating gradation data for each pixel of the display device 9. The microcomputer 1 performs processing for obtaining the history data from the memory device 5 and, referring to information stored in the ROM, performs processing for generating the gradation data of the image while temporarily storing information needed for the processing in the RAM. The graphic processing circuit 4 outputs, to the display device 9, the gradation data of the image resulting from the calculation.
The display device 9 has a display screen that includes a peripheral image display region and the history display region 30. The peripheral image display region displays the current image of the vehicle periphery obtained by the camera 7. The history display region 30 adjoins the peripheral image display region and displays the image generated based on the history data. The display device 9 displays the vehicle periphery image in a predetermined vehicle state; for example, when the gear of the transmission of the vehicle 10 is in the reverse position, the image of the area behind the vehicle is displayed in the peripheral image display region. The display device 9 has the rear image display region 20 and the history display region 30. The rear image display region 20 displays the current vehicle rear image obtained by the camera 7 when the vehicle 10 is backing up, without correcting the distortion in the image. The rear image display region 20 is an example of the peripheral image display region. The history display region 30 adjoins the rear image display region 20 and displays the image generated from the previous history data. The history display region 30 displays, at a position determined based on the associated position information, the object detected by the microcomputer 1 from the obtained graphic information. The display device 9 is, for example, a liquid crystal display and is located in front of the steering wheel or in a portion of the instrument indicator at the central portion of the instrument panel. In the present embodiment, a dot-matrix TFT transflective liquid crystal panel is used as an example of the display device 9. The TFT transflective liquid crystal panel has, as a display unit for displaying images, a large number of pixels arranged in a matrix. Each pixel of the display device 9 includes sub-pixels having color filters of the three colors red (R), green (G), and blue (B), and can display images in color. The display device 9 drives each pixel based on the gradation data obtained from the graphic processing circuit 4 so as to display the image. The display device 9 includes a backlight device (not shown) on the rear side of the display for illuminating the image from behind, so that the image is displayed transmissively. The display device 9 may be located on the instrument panel near the windshield. With this arrangement, the driver can visually recognize the rear image while comfortably watching the area in front of the vehicle, without moving the line of sight to the vicinity of the front of the steering wheel. The display device 9 may also be provided at the center of the upper panel in the vehicle width direction, tilted on the instrument panel. Alternatively, the liquid crystal display of a vehicle navigation apparatus may be used as the display device 9.
The vehicle movement detection unit 2 detects the distance that the vehicle 10 has moved from a particular state based on the rotational speed of the wheels or the like, detects the current steering direction based on the rotation angle of the steering wheel or the like, and sends a signal representing the detected movement to the microcomputer 1. The movement and the steering direction may also be obtained by converting the image into a bird's eye view image and matching a previous image with the present image. The vehicle state detection unit 3 receives various state detection signals of the vehicle 10. In the present embodiment, the vehicle state detection unit 3 receives at least the gear signal of the transmission and sends the received signal to the microcomputer 1.
The microcomputer 1 can receive a signal output from a sonar device 6. The sonar device 6 serves as an object detection unit for detecting an object on the road around the vehicle and detecting the distance from the vehicle 10 to the object on the road. The sonar device 6 obtains position information on the object on the road. The obtained position information is associated with the graphic information obtained from the image obtained by the camera 7. The sonar device 6 is, for example, an ultrasonic sensor arranged to detect the distance between the vehicle and a solid object or the like existing around the vehicle, and outputs a detection signal to the microcomputer 1. For example, a predetermined number of sonar devices 6 are embedded at predetermined intervals in the rear surface of the vehicle. More specifically, each sonar device 6 has a detection unit that is exposed, for example, from the rear bumper 10a.
An example of the object detection operation will be described below with reference to Fig. 3. In the detection operation, the microcomputer 1 detects an object from the obtained graphic information according to a specific feature of the image. Fig. 3 is a diagram showing the brightness distribution in an obtained image and is used to describe a method for detecting a white line on the road surface as the detection target from the obtained image. The detection of the white line of a parking frame will be described below as an example of the detection target. In this example, information on the brightness level of the image is used as the graphic information obtained from the image.
The microcomputer 1 detects the change in brightness of the image, which accompanies the movement of the vehicle 10, from the previous images stored in the memory device 5, and plots the change on imaginary two-dimensional coordinates on the screen, thereby generating a brightness distribution on the two-dimensional coordinates. In the original image of the area behind the vehicle shown in Fig. 6, the two-dimensional coordinates consist of, for example, an X coordinate movable in the horizontal direction and a Y coordinate movable in the vertical direction. In the brightness distribution shown in Fig. 3, the X axis of the image is assigned to the horizontal axis and the magnitude of the brightness is assigned to the vertical axis. The microcomputer 1 first fixes the vertical position (Y coordinate) on the Y axis of the two-dimensional coordinates and extracts the change in brightness at the horizontal positions (X coordinates) of the image by changing only the X coordinate; that is, the image is scanned along the X axis. The microcomputer 1 thereby produces a brightness distribution with respect to the change in the X coordinate. In this brightness distribution, the brightness corresponding to the white line at a horizontal position (X coordinate) in the image is higher than the brightness of the other portions on the X axis. This is because the road surface appears in the image in a color darker than the color of the white line serving as the road marking, such as the black of an asphalt material.
A portion of the image whose brightness level differs from its surroundings appears as a range in which the brightness is greater than that of the surrounding portions. In Fig. 3, such a portion appears at each of the two points X1 and X2 on the X axis. The two points X1 and X2 correspond to ranges whose brightness level differs from their surroundings: a mountain-shaped portion on the left side of Fig. 3 having a width W1 on the X axis, and a mountain-shaped portion on the right side of Fig. 3 having a width W2 on the X axis. The mountain-shaped portion on the left side is substantially symmetric about the coordinate X1 on the X axis, and the mountain-shaped portion on the right side is substantially symmetric about the coordinate X2 on the X axis. The coordinate of the center of each mountain-shaped portion is the coordinate, on the X axis, of the center of a white line detected from the image. The widths W1 and W2 of the mountain-shaped portions are the widths of the white lines. Both ends of each of the widths W1 and W2 in the horizontal direction correspond to the edge portions of the white line. By changing the vertical position on the Y axis, a plurality of brightness distributions with respect to the change in horizontal position on the X axis can be extracted and detected in this way; that is, the scanning of the image in the X direction is continued while the vertical position on the Y axis is changed. As the Y coordinate increases, that is, as the vertical position is advanced while the scanning continues, the portions of the brightness distributions along the X axis whose brightness is greater than that of their surroundings trace the actual white lines in the image.
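A minimal sketch of this row-scan detection is given below. The threshold value, array representation, and function name are assumptions introduced only for illustration; they are not prescribed by the embodiment.

```python
import numpy as np

def detect_white_line_segments(gray, threshold_L):
    """Scan each row (fixed Y) of a grayscale image along the X axis and
    return the center coordinate and width of every range whose brightness
    exceeds the surrounding level, like the W1/W2 peaks of Fig. 3."""
    segments = []  # (x_center, y, width, peak_brightness)
    for y in range(gray.shape[0]):
        row = gray[y]
        above = row > threshold_L              # candidate white-line pixels
        x = 0
        while x < len(row):
            if above[x]:
                start = x
                while x < len(row) and above[x]:
                    x += 1
                width = x - start              # W1 or W2 in the figure
                center = start + width // 2    # X1 or X2 in the figure
                segments.append((center, y, width, int(row[start:x].max())))
            else:
                x += 1
    return segments
```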
By using the white line detection method in this way, the display on the display device 9 shown in Fig. 7 can be changed to the display shown in Fig. 8. Fig. 7 is a view showing the image displayed when the white line detection operation (object detection operation) according to this embodiment is not performed. Fig. 8 is a view showing the image displayed when the white line detection operation (object detection operation) according to this embodiment is performed and the vehicle is parked in reverse. In the display of Fig. 7, the display device 9 shows noise as a white line even though it is not actually a white line. Points having the same color as the white line 310 therefore appear on portions of the road surface 32 where no white line actually exists. As a result, an unclear image is provided to the driver. In contrast, when the white line detection operation according to the present embodiment is performed, as shown in Fig. 8, the white lines 31 are shown on the road surface 32 in the history display region 30 with less noise, and the white lines 31 can be more clearly recognized as lines.
The white line detection may also be performed by storing only the edge portions of the white line in association with the coordinates (position information), rather than detecting the distribution of brightness changes over the two-dimensional coordinates. In this way, the amount of data processing, the work area required for the data processing, and the processing time of the white line detection operation can be reduced. The white line may be drawn by detecting only the position information of the coordinates of the central portions of the widths W1 and W2 of the brightness distributions on the two-dimensional coordinates, and assigning a point (for example, a white point) having a predetermined color and a predetermined size to the position in the history display region 30 corresponding to the detected position information. For example, when the detected white line has a width of 11 pixels, a circle symbol having a diameter of 8 pixels may be drawn at each edge of the white line. In this way, the amount of data processing, the work area required for the data processing, and the processing time of the white line detection and drawing operations can be further reduced. The brightness of the detected white line can also be reflected in the image on the display device 9, for the reason described below. The predetermined threshold L shown in Fig. 3 is a reference value used to determine whether the object is a white line. The reference value may be a predetermined value stored in advance in the microcomputer 1, or may be a value obtained by a predetermined calculation method using the average brightness of the entire image.
Serving as the object detection unit, the microcomputer 1 detects the difference between the brightness of the image, which constitutes the graphic information on the object, and the threshold L. Since this difference changes with the position on the X axis, the difference is detected for each pixel, and a brightness is calculated for each pixel according to the detected difference. In this way, the microcomputer 1 calculates the brightness according to the difference between the graphic information of the detected white line (object) and the threshold L. The microcomputer 1 causes the graphic processing circuit 4 to draw the white line (object) in the history display region 30 so that each pixel has the brightness calculated for it. The graphic processing circuit 4 draws the white line with the calculated brightness, so that the history display region 30 shows a white line that reflects the detected brightness. The white line displayed in the history display region 30 therefore does not have uniform brightness as a whole; it reflects the original brightness at the time the image was obtained and is thus displayed with substantially its original brightness.
As shown in Fig. 3, the white line portion on the left side has a difference A1 between the brightness (graphic information) of the image at the center of the mountain-shaped portion and the threshold L, and the white line portion on the right side has a difference A2 in Fig. 3. The microcomputer 1 may instead calculate only the difference between the brightness at the center of each mountain-shaped portion and the threshold L, and may cause the graphic processing circuit 4 to draw the entire white line in the history display region 30 with the brightness calculated at that center. In this case, the graphic processing circuit 4 draws the white line with the brightness of the center, and the white line displayed in the history display region 30 has uniform brightness as a whole.
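The per-pixel and per-center variants of this brightness calculation could look roughly as follows. The mapping from the difference A to a drawing brightness is an assumption introduced for illustration; the embodiment only states that the brightness is calculated according to the difference from the threshold L.

```python
def drawing_brightness(pixel_brightness, threshold_L, base=128, gain=1.0):
    """Map the difference A = brightness - L to the brightness used when
    drawing the white line in the history display region (assumed mapping)."""
    diff = max(pixel_brightness - threshold_L, 0)   # A1 or A2 in Fig. 3
    return min(255, int(base + gain * diff))

# Per-pixel variant: every pixel of the drawn line keeps its own brightness.
line_pixels = [176, 190, 201, 188]
per_pixel = [drawing_brightness(b, threshold_L=150) for b in line_pixels]

# Per-center variant: the whole line is drawn with the center brightness.
center_brightness = drawing_brightness(max(line_pixels), threshold_L=150)
```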
The microcomputer 1 can cause the history display region 30 to display the white lines 31 continuously with the white lines 21 in the rear image display region 20, so that the driver can recognize the history of the white lines that are currently outside the driver's field of view. In this case, in the above processing, the number of changes of the Y coordinate is preferably reduced. In addition, the number of extracted and drawn points is preferably smaller than the number of drawing points needed to form the complete trajectory of the white line. Thus, in the white line detection performed by the microcomputer 1, a white line that appears as a line in the image is detected and drawn as a set of points. In this way, the amount of data processing, the work area required for the data processing, and the processing time can be further reduced.
Through using the method for inspection of white line, can reduce processing load and the work area of memory device in the rear view of vehicle read out instrument.Therefore, needn't be provided for controlling the for example high performance control device of vehicle navigation apparatus, and it is just enough to be used to control the control setup of instrument indicating device for example.And when the information about the color level of image was used as from graphicinformation that image obtains, the method for inspection of white line was equal to the method with reference to the description of figure 3.In this case, each of the R value of the three primary colours of composing images (red level), G value (green level) and B value (blueness grade) is equal to the brightness of the upright position appointment in the Y of Fig. 3 axle.And, come the color of detected image on color level, to be different from the scope of the periphery of image based on the value of the color that meets the object that will detect and the grade of this value, therefore, the scope that is detected is associated with location information, and is stored in the memory device 5.Other part of this method of inspection is equal to those parts based on the method for inspection of brightness.Therefore, be plotted in the history display zone 30 according to the brightness of the graphicinformation of the white line that is detected and the white line 31 of color.
Next, the operation performed when the vehicle is parked in reverse will be described as an example of the operation of the vehicle rear display device. Fig. 4 is a main flowchart showing the operation of the vehicle rear display device performed when the vehicle is parked in reverse. Fig. 5 shows a subroutine of the image processing in the main flowchart. The flowcharts of Figs. 4 and 5 are executed by the microcomputer 1, which also serves as the control device controlling the operation of the vehicle rear display device.
The main flowchart of Fig. 4 starts when the microcomputer 1 is powered on. First, in step 10, the display device 9 displays an image other than the vehicle rear image under normal conditions. In step 20, the microcomputer 1 determines whether it has received, from the vehicle state detection unit 3, a signal indicating that the gear of the transmission is in the reverse position. The determination in step 20 is repeated until the microcomputer 1 receives the signal indicating the reverse position. In step 20, it may instead be determined whether the vehicle state detection unit 3 and the microcomputer 1 have received a command signal for displaying the vehicle periphery image on the display device 9. The command signal is received, for example, when an occupant operates a switch for commanding the display of the vehicle periphery image. In this case, the image displayed on the display device 9 can be selected according to the direction in which the vehicle moves. When the vehicle moves forward, an image of the area in front of the vehicle can be displayed on the display device 9. The occupant may also operate a switch for commanding the selection of a vehicle side image.
When the signal indicating the reverse position is received in step 20, the processing proceeds to step 30, where the microcomputer 1 executes the image processing subroutine so as to produce the image to be output to the display device 9. The image is drawn (generated) through the processing of steps 300 to 311 of the subroutine for image processing in Fig. 5. When the drawn image is output from the graphic processing circuit 4 to the display device 9, a screen including the rear image display region 20 and the history display region 30 is displayed in step 40.
While the vehicle is moving, in accordance with the signal indicating backward movement of the vehicle obtained in step 20, the image to be displayed on the display device 9 is continuously received, obtained, and drawn in step 40. The real-time image in the rear image display region 20 and the image in the history display region 30 continuous with the image in the rear image display region 20 are continuously displayed on the display device 9 until it is determined in step 50 that the reverse gear position has been released. When a negative determination is obtained in step 50, the processing proceeds to step 30, where the image processing subroutine continues to be executed. The image in the history display region 30, continuous with the real-time image in the rear image display region 20, is thus drawn according to the movement of the vehicle 10, and the image drawn in the history display region 30 is displayed on the display device 9. When an affirmative determination is obtained in step 50, the processing returns to step 10, where the command to display the rear image on the display device 9 is cancelled. In general, the ignition switch is turned off when the vehicle is parked. When the ignition switch is turned off and the microcomputer 1 is deactivated, the processing of this flowchart ends.
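The main flow of Fig. 4 can be summarized with the sketch below. The function names and the gear-signal interface are assumptions introduced for illustration and are not part of the flowchart itself.

```python
def main_loop(vehicle_state, display, image_processing_subroutine):
    """Steps 10-50 of Fig. 4 (sketch): wait for the reverse gear, then run
    the image processing subroutine of Fig. 5 and update the display until
    the reverse gear is released."""
    while vehicle_state.powered_on():
        display.show_normal_screen()                  # step 10: normal display
        while vehicle_state.gear() != "REVERSE":      # step 20: wait for reverse
            pass
        while vehicle_state.gear() == "REVERSE":      # step 50: until released
            frame = image_processing_subroutine()     # step 30: Fig. 5 subroutine
            display.show(frame)                       # step 40: show both regions
```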
Next, the subroutine of steps 300 to 311 of the image processing will be described with reference to Fig. 5. The camera 7 obtains image data of the area behind the vehicle covering a predetermined region. First, the image data is obtained (step 300). In step 301, the signal of the image obtained by the camera 7 is output, through the decoder 8 and the graphic processing circuit 4, to the memory device 5 and the microcomputer 1 as image data. The image data is converted into data on imaginary coordinates on the screen that move with the vehicle 10, and the converted image data is stored and accumulated in the memory device 5. The obtained image data is not displayed on the display device 9 as it is. If the obtained image data were displayed as it is, the rear image would appear as shown in the view of Fig. 6, with the rear image displayed in the upper half of the screen and the rear portion of the vehicle 10 displayed in the lower half of the screen. Fig. 6 is a view showing the original image obtained by the camera 7 and displayed over the entire screen.
In step 301, the graphic processing circuit 4 performs processing for generating the history display region 30 and the rear image display region 20 and assigns them to the screen. As shown in Fig. 7, the rear image display region 20 is located in the upper portion of the screen, and the history display region 30 is located in the lower portion of the screen adjacent to the rear image display region 20. Fig. 7 is a view for explaining an example of the screen to be displayed when the vehicle is parked in reverse.
In step 302, the microcomputer 1 reads the image data stored in the memory device 5 and detects a road marking, by the white line detection method, from the graphic information obtained from the image. In this example, the road marking is the white line 21 of a parking frame. As a precondition, the detected white line 21 is converted into the form of an image on the imaginary coordinates (image coordinates), together with the position (position information), color, brightness, and size, such as the width, related to the edge of the white line (white line edge) to be detected in the image. In step 303, the image coordinates of the white line edge are converted into three-dimensional coordinates, and the position of the white line edge in three-dimensional space (three-dimensional position) is thereby calculated.
In step 304, when the sonar device (object detection unit) 6 detects an object on the road outside the vehicle, such as an obstacle, the sonar device 6 outputs, to the microcomputer 1, the position of the detected object on the road and the distance of the detected object from the vehicle 10. When the object on the road detected by the sonar device 6 is the same as the object detected from the image by the microcomputer 1, the position information including the position, distance, and the like detected by the sonar device 6 is used to correct and compensate the position information obtained from the image. In step 305, the microcomputer 1 associates the graphic information of the white line 21 converted into the image coordinates with the position information of the converted white line 21, and stores the associated information in the memory device 5 as history data.
In step 306, the three-dimensional positions of the white line edge and the object on the road, which move on the screen, are calculated according to the movement of the vehicle 10. Specifically, when the vehicle 10 moves, the microcomputer 1 reads the three-dimensional positions (history data) of the white line edge and the object on the road stored in the memory device 5, calculates how the vehicle has moved according to the three-dimensional position of the vehicle after the movement and the rotation angle (steering angle of the steering wheel) detected by the vehicle movement detection unit 2, and updates the three-dimensional positions accordingly.
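A minimal sketch of the step 306 update is shown below, assuming planar motion described by a travel distance and a yaw rotation derived from the steering angle; the concrete motion model and sign conventions are assumptions, not part of the embodiment.

```python
import math

def update_history_positions(points_3d, distance_moved, yaw_change):
    """Step 306 (sketch): re-express stored 3-D history points in the vehicle
    frame after the vehicle has moved `distance_moved` along its longitudinal
    axis and rotated by `yaw_change` radians (planar-motion assumption)."""
    cos_y, sin_y = math.cos(-yaw_change), math.sin(-yaw_change)
    updated = []
    for x, y, z in points_3d:              # x: lateral, y: longitudinal, z: height
        y -= distance_moved                # inverse of the vehicle translation
        xr = cos_y * x - sin_y * y         # inverse of the vehicle rotation
        yr = sin_y * x + cos_y * y
        updated.append((xr, yr, z))
    return updated
```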
In step 307, the microcomputer 1 calculates and updates the image coordinates corresponding to the three-dimensional positions of the white line edge and the object on the road updated in step 306, so that the white line 31 and the object on the road can be displayed in the history display region 30 continuously with the original image in the rear image display region 20. In step 308, the microcomputer 1 stores the calculated image coordinates in the memory device 5.
Subsequently, in steps 309 to 311, the graphic processing circuit 4 draws the image to be displayed on the display device 9 based on the signal output from the microcomputer 1. Specifically, in step 309, the graphic processing circuit 4 draws the current image (original image) of the area behind the vehicle and the like in the rear image display region 20. In step 310, the graphic processing circuit 4 reads the vehicle CG prestored in the memory device 5 and draws, in the history display region 30, the vehicle CG viewed from the predetermined viewpoint. In this drawing, a distortion similar to the distortion occurring in the original image in the rear image display region 20 is added to the vehicle CG, and the distorted vehicle CG is drawn in the history display region 30. In step 311, the graphic processing circuit 4 reads, from the memory device 5, the stored position, color, brightness, and size, such as the width, of the image coordinates of the white line, road marking, and the like. The graphic processing circuit 4 then draws, in the history display region 30, the white line 31 including the white line edge viewed from the predetermined viewpoint. In this drawing, as in step 310, a known viewpoint conversion is applied to the read white line so that it is viewed from the same predetermined viewpoint as that used for the vehicle CG, and the same distortion as that of the original image in the rear image display region 20 is added to the white line. The distorted white line 31 is thus drawn in substantially the same manner as the vehicle CG.
An example of the detailed operation for adding distortion to the image in steps 310 and 311 is explained below. First, correction is performed in advance to remove the distortion in the image of the image data obtained by the camera 7. The corrected image data is then converted into image data in three-dimensional coordinates. The three-dimensional space is transformed and moved (rotated, translated, and so on) according to the movement of the vehicle based on the signals representing the movement and rotation of the vehicle, so that the accurate position of the host vehicle is maintained. The image data converted into the data in the three-dimensional coordinates is further converted into data on the imaginary coordinates on the screen, thereby forming two-dimensional data. The distortion of the original image is then added to the converted two-dimensional data. The distorted vehicle CG and the distorted white line 31, each given a distortion similar to that of the original image, can thus be drawn in the history display region 30.
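For illustration, the whole pipeline of steps 310-311 could be organized as below. The camera-model helpers (undistort, lift, project, distort) are placeholders assumed for the sketch, and the motion update reuses the step 306 sketch above; the embodiment only states that undistortion, a 3-D transform, projection, and re-distortion are performed in this order.

```python
def draw_history_layer(raw_points, ego_motion, camera_model):
    """Sketch of the steps 310-311 pipeline with assumed helper objects:
    undistort -> lift to 3-D -> apply vehicle motion -> project to screen
    -> re-apply the original-image distortion before drawing."""
    pts = [camera_model.undistort(p) for p in raw_points]       # remove lens distortion
    pts_3d = [camera_model.lift_to_ground(p) for p in pts]      # image -> 3-D coordinates
    pts_3d = update_history_positions(pts_3d,                   # step 306 motion update
                                      ego_motion.distance,
                                      ego_motion.yaw_change)
    pts_2d = [camera_model.project(p) for p in pts_3d]          # 3-D -> screen coordinates
    return [camera_model.distort(p) for p in pts_2d]            # match original-image distortion
```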
In Fig. 4, in step 40, the image drawn in the history display region 30 in step 310 or step 311 and the image drawn in the rear image display region 20 in step 309 are combined (synthesized). The combined image of the two adjacent regions is output and displayed on the display device 9.
By executing the above image processing subroutine, the original image including the white line 21 in the rear image display region 20 and the white line 31 in the history display region 30 are displayed continuously across the boundary between the two regions, as shown in Fig. 8. Note that the white line 31 can be displayed clearly with significantly reduced noise. In addition, the white line 31 can be drawn so as to have substantially the same width and substantially the same color as the white line 21 at the boundary between the rear image display region 20 and the history display region 30. The white line 31 can therefore be displayed so as to coincide with the white line 21. In other words, the white line 31, which is located around the vehicle body and currently outside the field of view of the camera 7, and the white line 21, which is located behind the vehicle and currently seen by the camera 7, are displayed as a single whole on the display device 9. An image of the vehicle periphery with greatly enhanced visibility can thus be provided to the driver. In this way, the previous image and the present image, together with the boundary between them, can be presented to the driver, with the previous image showing the mark (white line 31) that allows the host vehicle position to be checked accurately and effectively. The previous image and the present image are clearly provided to the driver as a single image without a feeling of strangeness. The driver can therefore clearly and correctly grasp the positional relationship among the vehicle body, the road markings drawn around the vehicle body, and the current rear image.
As shown in Fig. 8, the graphic processing circuit 4 superimposes an expected path line 22 of the vehicle 10, extended lines of the vehicle sides, and the like on the image in the rear image display region 20. The expected path line 22 is an indicator showing the expected trajectory of the vehicle moving backward based on the steering angle. The expected path line 22 and the extended lines of the vehicle sides can be drawn in different colors so as to be clear and distinct for the driver.
In addition, as shown in Fig. 9, the graphic processing circuit 4 can reduce the distortion added to a white line 31A drawn in the history display region 30, correcting the white line 31A so that it becomes more linear with increasing distance from the boundary with the rear image display region 20. Fig. 9 is a view showing an example of the corrected white line 31A obtained by correcting the white line 31 in the history display region 30 of Fig. 8. In this way, the white line 31A is displayed as a line continuous with the white line 21 in the original image at the boundary with the rear image display region 20, while the white line 31A is curved only by a small amount at positions away from the boundary of the rear image display region 20.
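One way to realize this gradual straightening is to blend the distorted drawing position toward an undistorted (straight) position with a weight that grows with distance from the boundary. The blending function below is only an assumed illustration; the embodiment does not specify how the distortion is reduced.

```python
def correct_white_line(distorted_pts, straight_pts, region_height):
    """Blend each drawn point from its distorted position toward a straight
    (undistorted) position as it moves away from the boundary with the rear
    image display region (assumed linear blend)."""
    corrected = []
    for (xd, yd), (xs, ys) in zip(distorted_pts, straight_pts):
        w = min(yd / region_height, 1.0)   # 0 at the boundary, 1 at the far edge
        corrected.append((xd * (1 - w) + xs * w,
                          yd * (1 - w) + ys * w))
    return corrected
```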
The operational effects of the vehicle rear display device according to the present embodiment will be described below. The vehicle rear display device displays an image of the vehicle surroundings when it receives a signal commanding the vehicle 10 to start moving backward.
The display device 9 includes: the camera 7 for obtaining an image of the rear side of the vehicle; the microcomputer 1 (object detection unit) for detecting the white line 31 from the obtained image; the storage device 5 (storage unit) for storing, as history data, data obtained from previous images captured by the camera 7 and for storing in advance a pre-rendered image of the rear portion of the vehicle such that the pre-rendered image has an accurate positional relationship with the captured image; and the graphics processing circuit 4. The rear image display area 20 displays the current vehicle rear image captured by the camera 7 without correcting the distortion in the image. The history display area 30 adjoins the rear image display area 20 and displays an image generated using the history data. The white line 31 is an example of at least one of an on-road object and a road marking. The microcomputer 1 obtains, from the captured image, image information related to at least one of the brightness level and the color level of the image, together with position information about the image. The microcomputer 1 detects the white line 31 based on this image information. The storage device 5 stores the image of the rear portion of the vehicle in advance. In addition, the storage device 5 associates this image information with the position information about the object detected by the microcomputer 1 and stores them as history data. The graphics processing circuit 4 superimposes the image of the rear portion of the vehicle stored in advance on an image of the white line 31 around the rear portion of the vehicle in the history display area 30, the image of the white line 31 being extracted from the history data, so that the superimposed image is viewed from a predetermined viewpoint in the history display area 30. The microcomputer 1 detects the object from the obtained image information according to a specific characteristic of the image. For example, the microcomputer 1 detects, as the object, a range of the image information in which the image differs from its surroundings in brightness level or color level. With this configuration, the current image of the vehicle surroundings is displayed in the surrounding image display area. In addition, the object detected from the image information is drawn in the history display area at a position determined based on the associated position information.
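A much simplified illustration of the kind of detection and history-data recording described above is sketched below; the threshold, the use of a row median as the road level, and the stored fields are assumptions for the example, not the patent's concrete algorithm.

```python
# Sketch: detect a white-line candidate in one image row as a range whose brightness
# differs from its surroundings, and record image information together with its position.
import numpy as np

def detect_bright_ranges(row, threshold=40):
    """Return (start, end, mean_brightness) for runs brighter than the local road level."""
    road_level = np.median(row)                  # rough estimate of the road brightness
    mask = row > road_level + threshold
    ranges, start = [], None
    for i, m in enumerate(np.append(mask, False)):
        if m and start is None:
            start = i
        elif not m and start is not None:
            ranges.append((start, i, float(row[start:i].mean())))
            start = None
    return ranges

# Example: a synthetic row with a bright band (a white line) on darker asphalt.
row = np.full(100, 80, dtype=float)
row[40:48] = 200
for start, end, level in detect_bright_ranges(row):
    print({"position": (start, end), "brightness": level})   # stored as history data
```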
With the configuration described above, the following operational effects are obtained. In the object detection, image information related to at least one of the brightness level and the color level of the captured image is used. The object can therefore be detected while the processing load for handling the image information is reduced. In addition, the image information is obtained based on brightness or color, and the position information about the image is obtained through the detection operation. The image information and the position information are associated with each other and stored as history data. The position and size of the object can thus be detected with sufficient accuracy, and the amount of stored history data can also be reduced. The microcomputer 1 (object detection unit) detects the object based on a specific characteristic in the obtained image information. The graphics processing circuit 4 draws the object in the history display area 30 at the position based on the position information. Consequently, an image with little noise can be displayed on the display device 9, and the driver can see an image with reduced blur. The computational load imposed on the microcomputer 1 for the image processing can therefore be reduced. Moreover, the driver can correctly recognize the own-vehicle position on a screen with enhanced visibility, so the parking assistance function can be enhanced.
In addition, the white line 31 (object) is drawn in the history display area 30 with a brightness and a color corresponding to the image information of the detected white line (object). With this configuration, the white line is displayed in tones closer to those of the captured image than in a conventional display in which the white line is drawn with a uniform brightness and a uniform color. In particular, even when noise is detected in a part of the image where the white line 31 does not actually exist, that part of the image is displayed with the same brightness level and color level obtained from the image. Therefore, even when the detected noise is displayed together with the part of the image in which it was detected, that part is not displayed in the designated white-line color. As a result, the noise remains inconspicuous in the image on the display screen.
Furthermore, the microcomputer 1 (object detection unit) detects the difference between the image information of the white line 31 (object) and a predetermined threshold L. In the history display area 30, the white line 31 (object) is displayed with a brightness and a color corresponding to the difference between the image information of the detected white line 31 (object) and the threshold L. With this configuration, the white line 31 is displayed with a brightness and a color that reflect the difference between the brightness level or color level and the threshold L within the range detected as the white line 31. Therefore, compared with a conventional display in which the white line is drawn with a uniform brightness and a uniform color, the white line can be displayed in tones closer to those of the captured image. That is, even when noise is detected in a part of the image where the white line 31 does not actually exist, that part of the image is displayed with the same brightness level and color level obtained from the image. Unlike the conventional method, no dot of a predetermined size for indicating a white line is drawn in the part of the image in which noise was detected. Therefore, even when the detected noise is displayed together with that part of the image, the part is not displayed in the designated white. As a result, the noise remains inconspicuous in the image on the display screen.
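As an assumed rendering rule for illustration, a detected pixel can be drawn with an intensity that follows the margin by which its brightness exceeds the threshold L, instead of painting every detection pure white; the gain and threshold values below are placeholders.

```python
# Sketch of drawing detections with an intensity graded by the margin above threshold L.
import numpy as np

L = 40          # detection threshold above the road brightness (assumed)
GAIN = 2.0      # how strongly the margin above L maps to display brightness (assumed)

def render_line_pixels(row, road_level):
    """Return display intensities: 0 where nothing is detected, graded elsewhere."""
    margin = row - (road_level + L)
    out = np.clip(GAIN * margin, 0, 255)        # faint detections stay faint
    return np.where(margin > 0, out, 0)

# A weak noise spike barely over the threshold is drawn dimly; a real white line
# far above it is drawn bright, so noise stays inconspicuous on the screen.
row = np.array([80.0, 80, 125, 80, 220, 230, 80])
print(render_line_pixels(row, road_level=80.0))
```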
Furthermore, the microcomputer 1 (object detection unit) detects the width of the white line 31 (object) from the image information of the detected white line 31 (object). In the history display area 30, the white line 31 (object) is drawn with a width corresponding to the detected width. With this configuration, the width of the white line 31 (object) displayed in the history display area 30 is specified based on the detected width of the white line. The white line 31 can therefore be displayed clearly with the specified width, and its edges can also be displayed clearly. In this way, the presence of the white line 31 can be clearly communicated to the driver, and the driver assistance operation can be enhanced.
The white line 31 (object) and the road surface other than the image of the rear portion of the vehicle to be displayed in the history display area 30 can be drawn using at least one of the average color of the road surface and the average brightness of the road surface obtained from the obtained image information. With this configuration, the road surface can be displayed in the history display area 30 with an average color and an average brightness close to the actual road surface color. Displaying the road surface with at least one of the average color and the average brightness further reduces the noise on the screen. Moreover, the road surface can be distinguished more clearly from the white line 31 (object) in the image, further improving the driver's recognition.
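A small sketch of this idea, under the assumption that a mask of detected line pixels is available, is to average the remaining pixels and use the result as the background fill of the history display area:

```python
# Sketch of estimating the average road-surface colour and brightness (assumptions only).
import numpy as np

def road_surface_stats(image_rgb, line_mask):
    """image_rgb: HxWx3 array; line_mask: True where a white line was detected."""
    road_pixels = image_rgb[~line_mask]                  # everything except the line
    avg_color = road_pixels.mean(axis=0)                 # per-channel average colour
    avg_brightness = road_pixels.mean()                  # single average brightness
    return avg_color, avg_brightness

# Example with a synthetic 4x4 image whose centre column is a bright line.
img = np.full((4, 4, 3), 90.0)
mask = np.zeros((4, 4), dtype=bool)
img[:, 2] = 230.0
mask[:, 2] = True
print(road_surface_stats(img, mask))   # fill used for the history-area background
```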
Furthermore, a sonar device 6 (object detection unit) may also be provided to detect the distance from the vehicle 10 to on-road objects around the vehicle periphery and to detect position information about those on-road objects. With this configuration, the position information about the on-road objects can be supplemented by the sonar device 6. The position of an on-road object can therefore be communicated to the driver even more accurately, and the driver assistance operation can be further enhanced.
Grid lines indicating predetermined distances from the vehicle 10 can be drawn in the history display area 30. The distance between a displayed object and the vehicle indicated by the grid lines can be determined according to the position information about the object, and the current distance can be determined from the distance information detected by the sonar device 6 serving as an object detection unit. In this way, the on-road objects and/or road markings around the vehicle 10 and the grid lines are drawn in the history display area 30. The driver can therefore recognize in detail the position of the vehicle relative to the space around the vehicle, and the driver assistance operation can be further enhanced.
In the history display area 30, the vehicle 10 is represented using the positions of the axle portion 11 and the wheels 12. By displaying the positions of the axle portion 11 and the wheels 12 in the history display area 30, their positions relative to the situation around the vehicle can be clearly communicated to the driver. A further enhanced guide image for the driving operation can therefore be provided.
In addition, the vehicle rear display device detects, as the detected object, the white line 21 (parking space marking) or the like, which is a road marking indicating a parking space. The vehicle rear display device also displays the detected object continuously on the current image in the rear image display area 20 and on the image generated from the previous data in the history display area 30. In this way, road markings that are not three-dimensional objects are detected from the captured image. Therefore, compared with a detection method for detecting three-dimensional objects, the processing performance and detection equipment required by the image processing device, as well as the processing time, can be reduced, and an inexpensive device can be produced.
The rear image display area 20 displays the current image of the vehicle surroundings captured by the camera 7 without correcting the distortion in the image. In this way, a large area behind the vehicle, for example a 180-degree range, can be displayed. Compared with a display of an image in which the distortion is corrected, the displayed range of the rear side of the vehicle can therefore be enlarged, and the driver can obtain enhanced information about the vehicle surroundings.
The graphics processing circuit 4 adds the distortion of the vehicle rear image in the rear image display area 20 to the image in the history display area 30, and draws a distorted vehicle rear image. By drawing the distorted rear image of the vehicle in the history display area 30 in this way, the current original image in the rear image display area 20 can be more naturally associated visually with the image displayed in the history display area 30, which is currently outside the field of view of the camera 7. The driver can therefore more easily recognize the relationship between the vehicle and the periphery of the vehicle in the currently invisible portion. Furthermore, the graphics processing circuit 4 displays the extracted white line 31 (on-road object or road marking) in the history display area 30 so that it is continuous with the white line 21 (on-road object or road marking) displayed in the rear image display area 20. In this way, the current original image in the rear image display area 20 can be visually associated (connected) with the overall image in the history display area 30, which is outside the current field of view of the camera 7. The graphics processing circuit 4 also corrects the on-road object and/or white line 31A (road marking) in the history display area 30 so that it becomes straight in the portion away from the border with the rear image display area 20. In this way, the white line 31A and the like are drawn continuously near the border with the rear image display area 20, while in the portion away from the border they are drawn in a straight shape so that the distortion added to the rear image in the history display area 30 is removed. The white line 31A and the like are thus drawn in the history display area 30 so as to be substantially parallel to the vehicle 10 in its vicinity. Therefore, the discomfort caused by the image distortion in the history display area 30 can be reduced, while the consistency across the border between the image in the rear image display area 20 and the image in the history display area 30 is maintained.
The graphics processing circuit 4 may display, in the history display area 30, only a CG of the rear portion of the vehicle and the white line 31 (on-road object and/or road marking), to which the distortion of the original image in the rear image display area 20 has been added. In this way, only the CG of the vehicle and the white line 31 (on-road object and/or road marking) are drawn in the history display area 30, and no other information is drawn. The driver can therefore grasp directly and clearly the positional relationship between the vehicle and markings such as the white line 31. In addition, limiting the drawn information reduces the image processing load.
The background color of the vehicle CG and of the image of the on-road object and/or road marking in the history display area 30 may be different from the color of the road surface in the rear image display area 20. With this assignment of background colors, the border between the image in the rear image display area 20 and the image in the history display area 30 becomes clear. In the overall composite image on the display device 9, the driver can thus easily recognize the image in the upper part of the screen as the current original image and the image in the lower part of the screen as the previous image of the currently invisible portion. For example, the background color of the history display area 30 may be different from the actual color of the road surface; in particular, the background may be displayed in black or blue. Note that the background color is displayed in a color different from that of the vehicle CG. In this way, the two images are further distinguished as separate images in separate areas, and the vehicle CG, on-road objects, road markings, and the like can be displayed more distinguishably in the history display area 30.
The graphics processing circuit 4 may be controlled by a control device that controls the operation of a meter display unit. The control device that controls the graphics processing circuit 4 and the meter display unit may have a relatively small memory and relatively low processing performance. With this configuration, the reduction of the processing load on the memory and the working area is particularly effective for the image processing that generates the simplified previous image and the present image, which are connected to each other continuously.
The display device 9 is arranged in a part of the meter display unit. Specifically, the image of the rear side of the vehicle may be superimposed on the meters of the vehicle. With this configuration, the driver can check the image of the rear side of the vehicle while checking the meters, moving the viewpoint only slightly.
The graphics processing circuit 4 is controlled by the control device, and the control device controls the operation of the meter display unit. The display device 9 may be located on the instrument panel in the vehicle interior. With this configuration, the image of the rear side of the vehicle is displayed near the height of the driver's eyes. The driver can therefore check the rear-side image while moving the viewpoint only slightly.
(Other embodiments)
Desirable embodiments of the present invention have been described above. Note that the present invention is not limited to the embodiments described above. Various modifications can be made within the spirit of the present invention, and the invention can be put into practical use.
In the above embodiments, a vehicle rear display device is described as an example of a vehicle periphery display device. It is sufficient that the display screen shows a surrounding image display area for showing the current vehicle periphery and a history display area for showing the vehicle periphery outside the current field of view of the camera. For example, the front side of the vehicle and/or the sides of the vehicle may be displayed as the vehicle periphery.
In the above embodiments, the extraction of the white line (parking space marking) to be drawn in the image of the history display area 30 as a marking is described as an example. Note that on-road objects other than the white line, such as other road markings, grooves, curbs, and poles, may also be extracted. The extraction method for such on-road objects may be similar to the method for extracting the edges of the white line described above.
In the above embodiments, the graphics processing circuit 4 may draw grid lines in the history display area 30 to indicate predetermined distances from the vehicle 10. The grid lines may include lines drawn on the screen at predetermined intervals to indicate distances from the vehicle. For example, the grid lines are drawn in the shape of a grid arranged in the longitudinal direction and the width direction of the vehicle. According to this example, in addition to the white line 31 (on-road object and/or road marking), grid lines are drawn around the vehicle 10 in the history display area 30. The driver can therefore recognize in more detail the positional relationship of the vehicle with respect to the on-road objects and road markings around the vehicle 10.
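For illustration, grid lines at fixed ground intervals around the vehicle could be generated as below; the drawing scale and origin are assumed values for the example, not taken from the document.

```python
# Sketch of generating vehicle-centred grid segments at fixed ground intervals.
import numpy as np

PX_PER_M = 60          # assumed drawing scale of the history display area
ORIGIN = (160, 220)    # assumed pixel position of the rear axle centre

def grid_segments(max_lat_m=2.0, max_long_m=4.0, step_m=0.5):
    """Return pixel line segments ((x0, y0), (x1, y1)) of a vehicle-centred grid."""
    segs = []
    for lat in np.arange(-max_lat_m, max_lat_m + 1e-9, step_m):      # lines along the vehicle
        x = ORIGIN[0] + lat * PX_PER_M
        segs.append(((x, ORIGIN[1]), (x, ORIGIN[1] + max_long_m * PX_PER_M)))
    for lon in np.arange(0.0, max_long_m + 1e-9, step_m):            # lines across the vehicle
        y = ORIGIN[1] + lon * PX_PER_M
        segs.append(((ORIGIN[0] - max_lat_m * PX_PER_M, y),
                     (ORIGIN[0] + max_lat_m * PX_PER_M, y)))
    return segs

print(len(grid_segments()), "segments, one every 0.5 m")   # handed to the drawing routine
```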
In the above embodiments, a single camera 7 captures the image displayed on the display device 9 during parking. Alternatively, a plurality of cameras may be used to capture a plurality of images, and the captured images may be combined or modified and displayed on the display device 9.
To summarize the above embodiments, a vehicle periphery display device for displaying an image of a vehicle periphery comprises:
an imaging unit for obtaining an image of a rear side of the vehicle;
an object detection unit for obtaining, from the obtained image, image information related to at least one of the brightness and the color of the image and position information about the image, and for detecting at least one of a road marking and an on-road object based on the obtained image information;
a storage unit for storing an image of the vehicle in advance and for storing history data, the history data including the image information associated with the position information about the object detected by the object detection unit;
a display screen including a surrounding image display area and a history display area, the surrounding image display area for displaying the present image of the vehicle periphery captured by the imaging unit, the history display area adjoining the surrounding image display area and for displaying an image generated using the history data; and
a drawing unit for generating an image and for displaying the generated image in the history display area, the generated image including the vehicle image stored in advance and the image generated using the history data, the two images being superimposed on each other and converted so as to be viewed from a predetermined viewpoint.
In addition, the object detection unit detects the object from the image information according to a specific characteristic appearing in the image. The present image of the vehicle periphery is displayed in the surrounding image display area. In the history display area, the object detected from the image information is displayed at a position according to the associated position information.
In this way, image information related to at least one of the brightness and the color of the captured image is used in the object detection. The object detection can therefore be performed while the processing load for handling the image information is reduced. In addition, the image information is obtained based on at least one of brightness and color, together with the position information about the image. The image information is associated with the position information and stored as history data. Accordingly, the position and size of the object can be detected with sufficient accuracy, and the amount of stored history data can also be reduced. The object detection unit detects the object based on a specific characteristic in the obtained image information. The drawing unit draws the object in the history display area at the position based on the position information. An image with little noise can thereby be displayed, and the driver can view an image with enhanced visibility. The vehicle periphery display device can therefore reduce the discomfort in the displayed image of the vehicle periphery in the currently invisible portion. The on-road objects and road markings are located in the space into which the vehicle moves, within the surrounding image display area and the history display area, and serve as markers around the vehicle periphery. The on-road objects and road markings are, for example, grooves, steps, curbs, poles, people, other vehicles, and white lines designating parking spaces. The specific characteristic in the image is a characteristic of the object that appears in the image represented by the image information, for example a color characteristic, a brightness, a luminous emittance characteristic, or a shape characteristic. The object detection unit extracts the specific characteristic of the image to detect the object.
According to the embodiments, the object is displayed in the history display area using the brightness and the color of the image information detected as the object. By displaying the object with this brightness and color, a white line that looks closer to the captured image can be displayed on the display screen, compared with a conventional display in which the white line is drawn with a uniform brightness and a uniform color. In addition, even when noise arises in the detection of a road marking or an on-road object around the vehicle periphery and the noise is displayed on the screen, the noise remains inconspicuous in the image on the display screen. An image in which noise is relatively inconspicuous can therefore be generated, and an image with enhanced visibility can be provided to the driver.
According to the embodiments, the object detection unit may detect a width value of the object from the image information detected as the object, and the object may be displayed in the history display area with a width corresponding to the detected width value. In this way, the width of the object displayed in the history display area is specified based on the width value obtained from the image information detected as the object. The object can therefore be displayed clearly with that width, and its edges can also be displayed clearly. The driver can thus be clearly notified of the presence of the object, and the driver assistance operation can be enhanced.
According to the embodiments, the road surface other than the vehicle image and the object may be displayed in the history display area using at least one of the average color of the road surface and the average brightness of the road surface obtained from the obtained image information. In this way, the road surface can be displayed in the history display area with a color and brightness close to those of the actual road surface. Displaying the road surface with at least one of the average color and the average brightness reduces the noise on the screen. Moreover, the road surface can be distinguished more clearly from the object in the image, further improving the driver's recognition.
According to the embodiments, a detecting unit may further be provided for detecting the distance from the vehicle and for detecting on-road objects around the vehicle periphery. The detecting unit may detect position information about the on-road objects.
With this configuration, the position information about the on-road objects can be supplemented by the detecting unit. The position of an on-road object can therefore be communicated to the driver more accurately, and the driver assistance operation can be further enhanced.
According to the embodiments, the drawing unit may display grid lines in the history display area to indicate predetermined distances from the vehicle. With this configuration, the on-road objects and/or road markings around the vehicle and the grid lines are drawn in the history display area. The driver can therefore recognize in detail the position of the vehicle relative to the space around the vehicle periphery, and the driver assistance operation can be further enhanced.
According to the embodiments, at least one of the positions of the axle portion and the wheels of the vehicle may be displayed in the history display area. By displaying at least one of these positions in the history display area, the position of the axle and/or the wheels relative to the vehicle periphery can be clearly communicated to the driver, and the driver assistance operation can be further enhanced.
According to the embodiments, the present image of the vehicle periphery captured by the imaging unit may be displayed in the surrounding image display area without correcting the distortion caused in the image. In this way, compared with a display in which the distorted image is corrected, the displayed range of the rear side of the vehicle can be enlarged, and the driver can obtain enhanced information about the vehicle periphery.
According to the embodiments, the imaging unit may capture the image of the rear side of the vehicle, and the present image of the rear side of the vehicle captured by the imaging unit may be displayed in the surrounding image display area without correcting the distortion caused in the image. In this way, the actually distorted image of the rear side of the vehicle is displayed in the surrounding image display area. Compared with a display in which the distorted image is corrected, the displayed range of the rear side of the vehicle can thus be enlarged, and the driver can obtain enhanced information about the vehicle periphery. The distortion of the rear image is also added to the history display area. Therefore, on the display screen, the current original image in the surrounding image display area can be visually connected with the image in the history display area, which is currently outside the field of view of the imaging unit. The driver can therefore recognize the own-vehicle position more easily. As a result, the image range of the rear side of the vehicle can be secured, and the driver's visual capability can be enhanced.
According to the embodiments, the drawing unit adds the distortion of the image displayed in the surrounding image display area to the image in the history display area. In this way, the distortion of the image around the vehicle is added to the on-road objects and/or road markings, obtained from the history data, that are displayed in the history display area. Therefore, on the display screen, the current original image in the surrounding image display area can be visually connected with the image in the history display area, which is currently outside the field of view of the imaging unit. The driver can thus more easily recognize the relationship between the vehicle and the vehicle periphery in the currently invisible portion.
The processing described above, such as the calculations and determinations, is not limited to being performed by the microcomputer 1 and the graphics processing circuit 4. The control unit may have various configurations, including, for example, the microcomputer 1 and the graphics processing circuit 4 shown as an example.
The processing described above, such as the calculations and determinations, may be performed by any one or any combination of software, an electronic circuit, a mechanical device, and the like. The software may be stored in a storage medium and may be transmitted via a transmission device such as a network device. The electronic circuit may be an integrated circuit, or may be a discrete circuit such as hardware logic configured with electronic or electric elements. The elements that perform the above processing may be discrete elements, and may be partially or entirely integrated.
It should be appreciated that, although the processing has been described in the embodiments of the present invention as including a specific sequence of steps, alternative embodiments including various other sequences of these steps and/or additional steps not disclosed herein are also considered to be within the scope of the present invention.
Various modifications and alterations may be made to the above embodiments without departing from the spirit of the present invention.

Claims (12)

1. A vehicle periphery display device for displaying an image of a vehicle periphery, the vehicle periphery display device comprising:
an imaging unit (7) configured to capture an image of the vehicle periphery;
a first object detection unit (1) configured to:
obtain, from the captured image, image information related to at least one of a brightness and a color of the captured image,
obtain, from the captured image, position information about the captured image,
detect, from the obtained image information, an object including at least one of a road marking (21) and an on-road object according to a specific characteristic of the object in the captured image, and
associate the image information about the object with the position information about the object;
a storage unit (5) configured to:
store a vehicle image of the vehicle in advance, and
store history data of the associated image information and position information;
a drawing unit (4) configured to:
generate an image according to the history data,
superimpose the generated image and an image including the vehicle image stored in advance, and
convert the superimposed image so as to be viewed from a predetermined viewpoint; and
a display screen (9) including:
a surrounding image display area (20) configured to display a present image of the vehicle periphery obtained by the imaging unit (7); and
a history display area (30), which is adjacent to the surrounding image display area (20) and is configured to display the converted image and the object detected from the image information of the converted image at a position according to the associated position information,
wherein the display screen (9) is further configured to display the road marking of the present image in the surrounding image display area (20) and the road marking in the history display area (30) continuously across a border between the surrounding image display area (20) and the history display area (30).
2. The vehicle periphery display device according to claim 1, wherein the history display area (30) is configured to display the object using at least one of a brightness and a color associated with the image information.
3. The vehicle periphery display device according to claim 1 or 2,
wherein the first object detection unit (1) is configured to detect a width of the object detected from the image information, and
the history display area (30) is configured to display the object with a width corresponding to the detected width.
4. The vehicle periphery display device according to claim 1 or 2, wherein the history display area (30) is configured to display a road surface other than the vehicle image and the object using at least one of an average color of the road surface and an average brightness of the road surface obtained from the image information.
5. The vehicle periphery display device according to claim 1 or 2, further comprising:
a second object detection unit (6) configured to detect an on-road object around the vehicle periphery and to detect a distance between the detected on-road object and the vehicle,
wherein the second object detection unit (6) is configured to detect position information about the on-road object.
6. The vehicle periphery display device according to claim 5, wherein the drawing unit (4) is configured to cause the history display area (30) to display grid lines for designating a predetermined distance from the vehicle.
7. The vehicle periphery display device according to claim 1 or 2, wherein the history display area (30) is configured to display at least one of a position of a wheel of the vehicle and a position of an axle portion of the vehicle.
8. The vehicle periphery display device according to claim 1 or 2, wherein the surrounding image display area (20) is configured to display the present image of the vehicle periphery obtained by the imaging unit (7) without correcting distortion caused in the present image.
9. The vehicle periphery display device according to claim 8, wherein the drawing unit (4) is configured to add the distortion of the present image in the surrounding image display area (20) to the image in the history display area (30).
10. The vehicle periphery display device according to claim 1 or 2,
wherein the imaging unit (7) is configured to obtain a present rear image of a rear side of the vehicle, and
the surrounding image display area (20) is configured to display the present rear image obtained by the imaging unit (7) without correcting distortion caused in the present rear image.
11. The vehicle periphery display device according to claim 10, wherein the drawing unit (4) is configured to add the distortion of the present rear image in the surrounding image display area (20) to the image in the history display area (30).
12. A method for displaying an image of a vehicle periphery, the method comprising:
capturing a present image of the vehicle periphery;
obtaining, from the captured image, image information related to at least one of a brightness and a color of the captured image;
obtaining, from the captured image, position information about the captured image;
detecting, from the obtained image information, an object including at least one of a road marking (21) and an on-road object according to a specific characteristic of the object in the captured image;
associating the obtained image information about the object with the obtained position information about the object;
storing history data of the associated image information and the associated position information;
generating an image according to the history data;
superimposing the generated image and a vehicle image of the vehicle stored in advance;
converting the superimposed image so as to be viewed from a predetermined viewpoint;
displaying the present image of the vehicle periphery in a surrounding image display area (20);
displaying the converted image in a history display area (30) adjacent to the surrounding image display area (20); and
displaying the object detected in the converted image at a position according to the associated position information,
wherein the road marking of the present image in the surrounding image display area (20) and the road marking in the history display area (30) are displayed continuously across a border between the surrounding image display area (20) and the history display area (30).
CN2010101419438A 2009-03-25 2010-03-25 Vehicle periphery display device and method for vehicle periphery image Expired - Fee Related CN101844545B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009075035A JP5035284B2 (en) 2009-03-25 2009-03-25 Vehicle periphery display device
JP2009-075035 2009-03-25

Publications (2)

Publication Number Publication Date
CN101844545A CN101844545A (en) 2010-09-29
CN101844545B true CN101844545B (en) 2012-09-05

Family

ID=42664282

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2010101419438A Expired - Fee Related CN101844545B (en) 2009-03-25 2010-03-25 Vehicle periphery display device and method for vehicle periphery image

Country Status (4)

Country Link
US (1) US8514282B2 (en)
JP (1) JP5035284B2 (en)
CN (1) CN101844545B (en)
DE (1) DE102010016064A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI578271B (en) * 2012-10-23 2017-04-11 義晶科技股份有限公司 Dynamic image processing method and dynamic image processing system

Families Citing this family (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060061008A1 (en) 2004-09-14 2006-03-23 Lee Karner Mounting assembly for vehicle interior mirror
US10144353B2 (en) 2002-08-21 2018-12-04 Magna Electronics Inc. Multi-camera vision system for a vehicle
US7720580B2 (en) 2004-12-23 2010-05-18 Donnelly Corporation Object detection system for vehicle
US8655019B2 (en) * 2009-09-24 2014-02-18 Panasonic Corporation Driving support display device
JP5113881B2 (en) 2010-06-03 2013-01-09 株式会社デンソー Vehicle periphery monitoring device
DE102010042026B4 (en) * 2010-10-06 2020-11-26 Robert Bosch Gmbh Method for generating an image of at least one object in the surroundings of a vehicle
CN102073048A (en) * 2010-11-16 2011-05-25 东北电力大学 Method with logical judgment for monitoring rectangular high-voltage working area
JP5152309B2 (en) * 2010-12-01 2013-02-27 株式会社デンソー Electronic mirror
JP5870608B2 (en) * 2011-10-13 2016-03-01 アイシン精機株式会社 Image generation device
DE112012004354T5 (en) * 2011-10-18 2014-07-10 Hitachi Construction Machinery Co., Ltd. Device for monitoring the environment of machinery
FR2982552B1 (en) * 2011-11-10 2014-06-27 Denso Corp VEHICLE PERIMETER MONITORING DEVICE
CN103139532B (en) * 2011-11-22 2016-04-20 株式会社电装 vehicle periphery monitor
DE102011087797A1 (en) * 2011-12-06 2013-06-06 Robert Bosch Gmbh Method and device for localizing a predefined parking position
CN103167276A (en) * 2011-12-19 2013-06-19 富泰华工业(深圳)有限公司 Vehicle monitoring system and vehicle monitoring method
JP5915155B2 (en) * 2011-12-21 2016-05-11 アイシン・エィ・ダブリュ株式会社 Lane guidance display system, method and program
JP6254084B2 (en) * 2012-07-27 2017-12-27 クラリオン株式会社 Image processing device
RU2571368C1 (en) * 2012-07-27 2015-12-20 Ниссан Мотор Ко., Лтд. Device for detecting three-dimensional objects, method of detecting three-dimensional objects
BR112015001861B1 (en) * 2012-07-27 2022-05-31 Nissan Motor Co., Ltd Three-dimensional object detection device
JP6023216B2 (en) * 2012-11-27 2016-11-09 クラリオン株式会社 In-vehicle image processing device
JP6304242B2 (en) * 2013-04-04 2018-04-04 ソニー株式会社 Image processing apparatus, image processing method, and program
KR102062923B1 (en) * 2013-06-12 2020-01-06 현대모비스 주식회사 Parking assistance system
KR101723401B1 (en) 2013-08-12 2017-04-18 주식회사 만도 Apparatus for storaging image of camera at night and method for storaging image thereof
JP6375633B2 (en) * 2014-02-12 2018-08-22 株式会社デンソー Vehicle periphery image display device and vehicle periphery image display method
JP6326869B2 (en) * 2014-03-05 2018-05-23 株式会社デンソー Vehicle periphery image display device and vehicle periphery image display method
US9359009B2 (en) 2014-03-22 2016-06-07 Ford Global Technologies, Llc Object detection during vehicle parking
JP2015204516A (en) * 2014-04-14 2015-11-16 キヤノン株式会社 Imaging device, control method and control program thereof
JP6459226B2 (en) * 2014-06-02 2019-01-30 株式会社Soken In-vehicle image processing device
KR102176775B1 (en) * 2014-07-02 2020-11-09 현대모비스 주식회사 Around view system and the operating method
KR101637716B1 (en) * 2014-11-03 2016-07-07 현대자동차주식회사 Apparatus and method for recognizing position of obstacle in vehicle
EP3342644B1 (en) * 2015-08-27 2020-07-22 Jvc Kenwood Corporation Display device for vehicle and display method for vehicle
JP6565674B2 (en) * 2015-12-28 2019-08-28 株式会社Jvcケンウッド VEHICLE DISPLAY DEVICE AND VEHICLE DISPLAY METHOD
JP6519408B2 (en) * 2015-08-27 2019-05-29 株式会社Jvcケンウッド Display device for vehicle and display method for vehicle
JP6555056B2 (en) * 2015-09-30 2019-08-07 アイシン精機株式会社 Perimeter monitoring device
US9718404B2 (en) * 2015-10-01 2017-08-01 Ford Global Technologies, LLCS Parking obstruction locator and height estimator
US10937168B2 (en) 2015-11-02 2021-03-02 Cognex Corporation System and method for finding and classifying lines in an image with a vision system
DE102016120775A1 (en) * 2015-11-02 2017-05-04 Cognex Corporation System and method for detecting lines in an image with a vision system
JP6239205B2 (en) * 2015-11-06 2017-11-29 三菱電機株式会社 Image processing apparatus, image processing method, and image processing program
JP6545108B2 (en) * 2016-01-14 2019-07-17 アルパイン株式会社 Parking assistance apparatus and parking assistance method
JP6628672B2 (en) * 2016-04-05 2020-01-15 アルパイン株式会社 Vehicle periphery display device and vehicle periphery display method
KR102488512B1 (en) * 2016-04-15 2023-01-13 주식회사 에이치엘클레무브 Parking assistance device for a vechicle and method for controlling parking thereof
JP6642306B2 (en) * 2016-06-29 2020-02-05 アイシン精機株式会社 Perimeter monitoring device
JP2018036937A (en) * 2016-09-01 2018-03-08 住友電気工業株式会社 Image processing device, image processing system, image processing program and label
DE102016223391A1 (en) * 2016-11-25 2018-05-30 Conti Temic Microelectronic Gmbh METHOD AND DEVICE FOR PRODUCING A VEHICLE ENVIRONMENT VIEW ON A VEHICLE
JP2018097535A (en) * 2016-12-12 2018-06-21 株式会社デンソーテン Driving support device and driving support method
JP6733647B2 (en) * 2017-12-05 2020-08-05 トヨタ自動車株式会社 Image display
CN108297794B (en) * 2018-01-11 2022-11-18 阿尔派株式会社 Parking assistance device and travel prediction line display method
CN108357496A (en) * 2018-02-12 2018-08-03 北京小马智行科技有限公司 Automatic Pilot control method and device
JP7135339B2 (en) * 2018-02-28 2022-09-13 株式会社デンソー Imaging system mounted on vehicle, object identification device, and object identification method
DE102018119024A1 (en) * 2018-08-06 2020-02-06 Knorr-Bremse Systeme für Nutzfahrzeuge GmbH Camera surveillance system
CN111213153A (en) * 2019-01-30 2020-05-29 深圳市大疆创新科技有限公司 Target object motion state detection method, device and storage medium
JP7318377B2 (en) * 2019-07-10 2023-08-01 株式会社Soken Object detection device
CN112622761B (en) * 2019-09-24 2022-07-29 博泰车联网科技(上海)股份有限公司 Backing display method and device and computer storage medium
JP7160014B2 (en) * 2019-10-11 2022-10-25 トヨタ自動車株式会社 parking assist device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002120675A (en) * 2000-10-17 2002-04-23 Mitsubishi Electric Corp Vehicle surroundings monitoring support system
CN2538616Y (en) * 2002-04-15 2003-03-05 张成况 Monitoring alarm for backing aotomobile
CN1472095A (en) * 2003-05-20 2004-02-04 单文昌 Intelligent three-dimensional reverse sliding aiming systems
CN2769853Y (en) * 2003-10-02 2006-04-05 日产自动车株式会社 Vehicle backing auxiliary device
GB2434020A (en) * 2005-11-04 2007-07-11 Denso Corp Driver parking assisting system that generates a birds eye view of the parking space behind the vehicle.
CN101055176A (en) * 2006-04-12 2007-10-17 丰田自动车株式会社 Vehicle surrounding monitoring system and vehicle surrounding monitoring method
CN201181331Y (en) * 2008-03-12 2009-01-14 卢灿光 Radar type vehicle reverse monitoring apparatus

Family Cites Families (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5670935A (en) * 1993-02-26 1997-09-23 Donnelly Corporation Rearview vision system for vehicle including panoramic view
JPH07302346A (en) 1994-05-06 1995-11-14 Nippon Soken Inc Method for detecting white line on road
JP3553698B2 (en) 1995-09-07 2004-08-11 富士通テン株式会社 White line recognition device that recognizes inclined white lines in images
US8120652B2 (en) * 1997-04-02 2012-02-21 Gentex Corporation System for controlling vehicle equipment
JPH11219435A (en) 1998-02-03 1999-08-10 Honda Motor Co Ltd White line detector for automobile
JP3624769B2 (en) * 1999-09-30 2005-03-02 株式会社豊田自動織機 Image conversion device for vehicle rear monitoring device
JP3932379B2 (en) * 2001-10-02 2007-06-20 株式会社日立製作所 Image processing apparatus and image sensor
JP3778849B2 (en) 2001-12-18 2006-05-24 株式会社デンソー Vehicle periphery image processing apparatus and recording medium
JP3930366B2 (en) 2002-04-18 2007-06-13 株式会社デンソー White line recognition device
JP4374211B2 (en) * 2002-08-27 2009-12-02 クラリオン株式会社 Lane marker position detection method, lane marker position detection device, and lane departure warning device
US20050146607A1 (en) * 2004-01-06 2005-07-07 Romeo Linn Object Approaching Detection Anti Blind e-Mirrors System
JP4455943B2 (en) 2004-06-23 2010-04-21 株式会社リコー Droplet discharge head, liquid cartridge, droplet discharge device, recording device
JP3977368B2 (en) * 2004-09-30 2007-09-19 クラリオン株式会社 Parking assistance system
JP4561470B2 (en) * 2005-05-19 2010-10-13 アイシン・エィ・ダブリュ株式会社 Parking assistance device
JP4696691B2 (en) 2005-05-27 2011-06-08 アイシン・エィ・ダブリュ株式会社 Parking support method and parking support device
JP2007062649A (en) * 2005-09-01 2007-03-15 Clarion Co Ltd Distance information display device for vehicle
JP4815993B2 (en) * 2005-10-19 2011-11-16 アイシン・エィ・ダブリュ株式会社 Parking support method and parking support device
JP4575275B2 (en) * 2005-11-02 2010-11-04 三菱ふそうトラック・バス株式会社 Vehicle display device
JP2007241606A (en) 2006-03-08 2007-09-20 Niles Co Ltd White line detector
US7511607B2 (en) * 2006-03-28 2009-03-31 D. Larry Hubbard Vehicle back-up viewing system
JP5309442B2 (en) * 2006-05-29 2013-10-09 アイシン・エィ・ダブリュ株式会社 Parking support method and parking support device
WO2008134715A1 (en) * 2007-04-30 2008-11-06 Mobileye Technologies Ltd. Rear obstruction detection
WO2009027089A2 (en) * 2007-08-30 2009-03-05 Valeo Schalter Und Sensoren Gmbh Method and system for weather condition detection with image-based road characterization
JP5120880B2 (en) * 2007-10-15 2013-01-16 アルパイン株式会社 Image processing apparatus and image processing method
FR2925706B1 (en) * 2007-12-19 2010-01-15 Soc Tech Michelin DEVICE FOR EVALUATING THE SURFACE OF A TIRE.
US20110205365A1 (en) * 2008-10-28 2011-08-25 Pasco Corporation Road measurement device and method for measuring road
JP4914458B2 (en) * 2009-02-12 2012-04-11 株式会社日本自動車部品総合研究所 Vehicle periphery display device


Also Published As

Publication number Publication date
DE102010016064A1 (en) 2010-09-30
CN101844545A (en) 2010-09-29
US8514282B2 (en) 2013-08-20
JP2010232723A (en) 2010-10-14
JP5035284B2 (en) 2012-09-26
US20100245574A1 (en) 2010-09-30

Similar Documents

Publication Publication Date Title
CN101844545B (en) Vehicle periphery display device and method for vehicle periphery image
CN101808236B (en) Vehicle periphery displaying apparatus
JP4766841B2 (en) Camera device and vehicle periphery monitoring device mounted on vehicle
JP4677104B2 (en) Vehicle display device
US8928753B2 (en) Method and apparatus for generating a surrounding image
US9272731B2 (en) Driving-operation assist and recording medium
CN104185009B (en) enhanced top-down view generation in a front curb viewing system
EP1115250B1 (en) Method and apparatus for displaying image
KR101510655B1 (en) Around image generating method and apparatus
JP5178361B2 (en) Driving assistance device
US20140043466A1 (en) Environment image display apparatus for transport machine
CN102196242A (en) Self-adaptive scene image auxiliary system with image enhancing function
JP2011193485A (en) Camera device mounted on vehicle, and apparatus for monitoring vehicle periphery
JP2001114047A (en) Vehicle-surrounding situation indication device
JP4499319B2 (en) Driving support device, driving support method, and driving guide data creation method
JP3961969B2 (en) Vehicle periphery image processing system
CN204367998U (en) A kind of novel electron back-sight visual system
US11938863B2 (en) Peripheral image generation device and display control method
US20220222947A1 (en) Method for generating an image of vehicle surroundings, and apparatus for generating an image of vehicle surroundings
CN110576796A (en) Standard-definition 360-panorama system UI layout method
US11830409B2 (en) Peripheral image display device
CN109987027A (en) A kind of semitrailer looks around the design method of blind area display planning
JP2014021543A (en) Visual field support device and program for vehicle
CN105100702B (en) Method, display device, system and computer program for presenting vehicle environment
CN116071234A (en) Method and device for correcting looking-around splicing based on vehicle body posture and vehicle

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120905

Termination date: 20210325