EP1168241A2 - Bildwiedergabevorrichtung (Image Rendering Device) - Google Patents
- Publication number
- EP1168241A2 (application EP01115271A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- vehicle
- estimated path
- image
- rudder angle
- display image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—Two-dimensional [2D] image generation
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
Definitions
- The present invention relates to rendering devices and, more specifically, to a rendering device that can be incorporated in a drive assistant device.
- The rendering device generates a display image of the area around a vehicle based on an image captured by an image capture device fixedly placed in the vehicle.
- a conventional-type drive assistant device is mounted in a vehicle, and generally includes an image capture device, a rudder angle sensor, a computing unit, a rendering device, and a display device.
- the image capture device is fixedly placed in a predetermined position in the vehicle, and takes charge of capturing an image of an area defined by its own viewing angle. The resulting image is now referred to as a captured image.
- the rudder angle sensor is also fixed in a predetermined position in the vehicle, and detects to what degree the steering wheel of the vehicle is turned. Based on the detection result, the computing unit calculates a path estimated for the vehicle to take.
- the rendering device then renders the estimated path on the captured image, and the image generated thereby is such display image as shown in FIG. 20.
- the display image is displayed on the display device.
- From the display image, a driver of the vehicle can know whether his/her current steering will fit the vehicle into the parking space without colliding with any obstacle in close range. If the steering is not appropriate, the estimated path is displayed outside the parking space in the display image, and the driver can accordingly adjust the rudder angle of the steering wheel.
- the drive assistant device additionally carries an active sensor for measuring a distance between the vehicle and an obstacle observed near the estimated path. Based on the measurement result provided by the active sensor, the computing unit determines which part of the estimated path is to be rendered on the captured image. The part thus determined is now referred to as a rendering estimated path. In this manner, the rendering device accordingly renders on the captured image the rendering estimated path, which ends right before the obstacle.
- The above conventional drive assistant devices carry the following two problems.
- First, the estimated path is displayed in a fixed, unchangeable color.
- If that color is similar in tone to the predominant color of the display image, which is mainly determined by the road surface (for example, whether or not it is paved with asphalt), the driver finds it difficult to instantaneously locate the estimated path on the display image.
- Second, the estimated path rendered in the display image is represented simply by lines, failing to help the driver instantaneously perceive how far he/she can move the vehicle. More specifically, as shown in FIG. 21, a vehicle Vusr carrying the conventional drive assistant device is moving toward an obstacle Vbst. In this case, the vehicle Vusr first collides with the corner point Pcnr of the obstacle Vbst, not with the intersection points Pcrg of the estimated path Pp and the surface of the obstacle Vbst. This means that the farthest point possible for the vehicle Vusr to reach is the corner point Pcnr of the obstacle Vbst. As such, even if the estimated path is rendered so as to end immediately before the obstacle, the second problem remains unsolved.
- An object of the present invention is therefore to provide a rendering device that generates a display image showing the estimated path in an eye-catching manner, so that the driver can easily locate it.
- Another object of the present invention is to provide a rendering device that generates a display image indicating to the driver how far he/she can move the vehicle.
- the present invention has the following features to attain the objects above.
- a first aspect of the present invention is directed to a rendering device for generating a display image of around a vehicle for drive assistance.
- the rendering device comprises a reception part for receiving a current rudder angle of a steering wheel of the vehicle from a rudder angle sensor fixed therein; a derivation part for deriving an estimated path for the vehicle to take based on the rudder angle received by the reception part; and an image generation part for generating the display image based on a captured image captured by an image capture device fixed in the vehicle, and the estimated path derived by the derivation part.
- the estimated path is overlaid on an intermittent basis.
- a second aspect of the present invention is directed to a rendering device for generating a display image of around a vehicle for drive assistance.
- the rendering device comprises a first reception part for receiving a distance to an obstacle located around the vehicle from a measuring sensor placed in the vehicle; a first derivation part for deriving a farthest point for the vehicle to move based on the distance received by the first reception part; a second reception part for receiving a current rudder angle of a steering wheel of the vehicle from a rudder angle sensor fixed in the vehicle; a second derivation part for deriving an estimated path for the vehicle to take based on the rudder angle received by the second reception part; and an image generation part for generating the display image based on a captured image captured by an image capture device fixed in the vehicle, the farthest point derived by the first derivation part, and the estimated path derived by the second derivation part.
- FIG. 1 is a block diagram showing the hardware structure of a rendering device Urnd1 according to a first embodiment of the present invention.
- the rendering device Urnd1 includes a processor 1, program memory 2, and a working area 3.
- the program memory 2 is typified by ROM (Read Only Memory), and stores a program PGa for defining the processing procedure in the processor 1.
- By following the program PGa, the processor 1 generates such a display image Sout as shown in FIG. 2.
- the display image Sout shows a path Pp estimated for a vehicle Vusr (see FIG. 3) to take in the course of time.
- the estimated path Pp is composed of a left-side trajectory Pp1 and a right-side trajectory Pp2 indicated by, respectively, indicators Sind1 and Sind2.
- The left-side trajectory Pp1 is for the left-rear wheel of the vehicle Vusr, while the right-side trajectory Pp2 is for the right-rear wheel.
- the indicators Sind1 and Sind2 are both objects in a predetermined shape (e.g., circle, rectangle) previously stored in the program memory 2.
- The working area 3 is typified by RAM (Random Access Memory), and is used when the processor 1 executes the program PGa.
- the rendering device Urnd1 in the above structure is typically incorporated in a drive assistant device Uast1 .
- the drive assistant device Uast1 is mounted in the vehicle Vusr, and includes at least one image capture device 4, a rudder angle sensor 5, and a display device 6 together with the rendering device Urnd1.
- the image capture device 4 is embedded in the rear-end of the vehicle Vusr, and captures an image covering an area rear of the vehicle Vusr .
- the resulting image is a captured image Scpt as shown in FIG. 4.
- The rudder angle sensor 5 detects a rudder angle θ of the steering wheel of the vehicle Vusr, and transmits it to the processor 1.
- The rudder angle θ here indicates at what angle the steering wheel is turned with respect to the initial position.
- the steering wheel is considered in the initial position when not turned, that is, when the vehicle Vusr is in the straight-ahead position.
- the display device 6 is typically a liquid crystal display.
- the processor 1 first generates an image capture instruction Icpt , and transmits it to the image capture device 4 (step S1).
- the procedure returns to step S1 after step S10 is through, and the processor 1 generates another image capture instruction Icpt.
- The program PGa is so written that the time interval between those two image capture instructions Icpt is substantially t1 seconds.
- The value of t1 is selected so as to allow the display device 6 to display the display image Sout at 30 frames per second.
- the image capture instruction Icpt is a signal instructing the image capture device 4 for image capturing.
- the image capture device 4 responsively captures such captured image Scpt as shown in FIG. 4, and stores it in frame memory (not shown) reserved in the working area 3 (step S2).
- the processor 1 then watches a deriving timing T1 (step S3).
- This deriving timing T1 is previously written in the program PGa, and allows the processor 1 to derive the left- and right-side trajectories Pp1 and Pp2 once every t2 seconds.
- The value of t2 is selected to be larger than that of t1 (e.g., 0.1 second) since the rudder angle θ changes only slowly over time.
- At the deriving timing T1, the processor 1 generates a detection instruction Idtc, and transmits it to the rudder angle sensor 5 (step S4).
- The detection instruction Idtc is a signal instructing the rudder angle sensor 5 to detect the rudder angle θ.
- The rudder angle sensor 5 responsively detects the rudder angle θ, and stores it in the working area 3 (step S5).
- Based on the detected rudder angle θ, the processor 1 derives the left- and right-side trajectories Pp1 and Pp2 (step S6). More specifically, the processor 1 derives equations for the left- and right-side trajectories Pp1 and Pp2 under the Ackermann model.
- The left- and right-side trajectories Pp1 and Pp2 are defined as the trajectories traced by the left- and right-rear wheels of the vehicle Vusr on condition that the driver keeps the steering wheel at the current rudder angle θ.
- the left-side trajectory Pp1 calculated by such equation becomes an arc in a predetermined length.
- the arc is a segment of a circle traceable by the vehicle Vusr around a circling center.
- the radius of the circle is equal to a distance from the circling center to a point having a rotation center of the left-rear wheel projected onto the road surface.
- The equation for the right-side trajectory Pp2 is similar, except that the arc is traced by the rotation center of the right-rear wheel of the vehicle Vusr.
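As a rough illustration of the Ackermann-model derivation in step S6, the following Python sketch computes sample points on the two rear-wheel arcs. The wheelbase, track width, and steering ratio are illustrative assumptions, not values from the patent; the axes follow FIG. 17, with the Y-axis along the rear axle so the circling center lies at (0, R).

```python
import math

def rear_wheel_arcs(rudder_angle_deg, wheelbase=2.7, track=1.5,
                    steering_ratio=16.0, arc_len=5.0, n=30):
    """Hedged sketch of the rear-wheel arcs under a bicycle-model
    approximation of Ackermann steering.  All default parameters are
    assumptions made for illustration only."""
    # Front-wheel angle inferred from the steering-wheel (rudder) angle.
    delta = math.radians(rudder_angle_deg) / steering_ratio
    if abs(math.tan(delta)) < 1e-9:
        return None  # straight ahead: the trajectories degenerate to lines
    R = wheelbase / math.tan(delta)  # turning radius of the rear-axle midpoint
    arcs = []
    for y_wheel in (+track / 2, -track / 2):  # left and right rear wheel
        r = abs(R - y_wheel)                  # radius of that wheel's circle
        pts = []
        for i in range(n):
            phi = (arc_len / r) * i / (n - 1)  # angle swept over arc_len metres
            x = r * math.sin(phi)              # distance travelled rearwards
            y = R - (R - y_wheel) * math.cos(phi)  # circle centered at (0, R)
            pts.append((x, y))
        arcs.append(pts)
    return arcs
```

Each arc starts at the wheel's own position (0, ±track/2) and bends toward the circling center, matching the description of the arcs as segments of concentric circles.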
- the processor 1 generates overlaying position data Dsp indicating where to overlay the two indicators Sind1 and Sind2 , and stores the data in the working area 3 (step S7).
- the processor 1 calculates two points a0 and b0 being closest to the vehicle Vusr (not shown) on those trajectories Pp1 and Pp2, respectively.
- The processor 1 then calculates a point a1 away by a predetermined distance Δd from the point a0 on the left-side trajectory Pp1, and a point b1 away also by Δd from the point b0 on the right-side trajectory Pp2.
- the processor 1 repeats the same processing until i (where i is a natural number being 2 or larger) sets of coordinates such as ( a0 , b0 ), ( a1 , b1 ), ..., ( a ( i-1 ), b ( i-1 )) are calculated.
- the sets of coordinates are numbered starting from the one closest to the vehicle Vusr. Accordingly, as shown in FIG. 7, stored in the working area 3 is the overlaying position data Dsp including those numbered sets of coordinates.
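The construction of the overlaying position data Dsp in step S7 can be sketched as follows. The trajectories are assumed to be given as dense point lists, and arc length is approximated by summed chord lengths; both are our simplifications.

```python
import math

def overlay_positions(left_pts, right_pts, delta_d, count):
    """Sample `count` numbered sets of indicator coordinates, one every
    delta_d metres along each trajectory, starting from the point closest
    to the vehicle (a sketch of the overlaying position data Dsp)."""
    def every(pts):
        out, acc, prev = [pts[0]], 0.0, pts[0]
        for p in pts[1:]:
            acc += math.hypot(p[0] - prev[0], p[1] - prev[1])
            prev = p
            if acc >= delta_d:       # travelled another delta_d along the arc
                out.append(p)
                acc = 0.0
            if len(out) == count:
                break
        return out

    # Dsp: sets (a0, b0), (a1, b1), ... numbered from the vehicle outwards
    return list(zip(every(left_pts), every(right_pts)))
```

The returned list mirrors FIG. 7: set 0 holds the points closest to the vehicle, set (i-1) the farthest.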
- Based on the overlaying position data Dsp and the aforementioned captured image Scpt, the processor 1 then generates a frame of the display image Sout on the frame memory (step S8).
- the display image Sout is the one having the indicators Sind1 and Sind2 overlaid on the captured image Scpt.
- the processor 1 first selects, from the overlaying position data Dsp generated in step S7, a set of coordinates which is not yet selected and the smallest in number. In this example, since no set has yet been selected, selected now is the set of ( a0 , b0 ).
- the processor 1 then overlays the indicators Sind1 and Sind2 onto the points a0 and b0 in the captured image Scpt on the frame memory. After this overlaying process, such display image Sout as shown in FIG. 8 is generated for one frame on the frame memory.
- the processor 1 then transfers the display image Sout on the frame memory to the display device 6 for display thereon (step S9).
- the indicator Sind1 is overlaid on the point a0 on the left-side trajectory Pp1 , and the indicator Sind2 on the point b0 on the right-side trajectory Pp2 .
- In step S10, the processor 1 determines whether now is the time to end the processing of FIG. 5. If not yet, the procedure returns to step S1 for generating another display image Sout. By the time steps S1 and S2 are through, another captured image Scpt is newly stored in the frame memory. Then, in step S3, if the processor 1 determines that the timing T1 has not come yet, it watches a timing T2 to change the overlaying positions of the indicators Sind1 and Sind2 (step S11).
- The changing timing T2 is previously written in the program PGa, and allows the processor 1 to change the overlaying positions of the indicators Sind1 and Sind2 once every t3 seconds.
- If the value of t3 is set too small, the indicator Sind1 moves too fast from the point a0 to a1 for the driver to follow with his/her eyes on the display device 6.
- Therefore, the value of t3 is selected to be larger than that of t1 (e.g., 0.05 second).
- If the processor 1 determines that the timing T2 has not come yet, it generates a frame of the display image Sout on the frame memory (step S12). This frame is based on the captured image Scpt stored in step S2 and the set of coordinates currently selected in the overlaying position data Dsp (in this example, the set (a0, b0)). As such, the resulting display image Sout also has the indicators Sind1 and Sind2 overlaid on the points a0 and b0 of the captured image Scpt. The processor 1 then transfers the generated display image Sout on the frame memory to the display device 6 for display thereon (step S13).
- In step S10, if the processor 1 determines that now is not the time to end the processing of FIG. 5, the procedure returns to step S1. By the time steps S1 and S2 are through, another captured image Scpt is newly stored in the frame memory. Then, if the processor 1 determines in step S3 that the timing T1 has not come yet, and in step S11 that the timing T2 is now right, the procedure goes to step S14. The processor 1 then selects, from the overlaying position data Dsp in the working area 3, the set of coordinates that is not yet selected and smallest in number (step S14). Since the set selected last was (a0, b0), the set (a1, b1) is selected this time.
- the processor 1 generates a new frame of the display image Sout on the frame memory based on the captured image Scpt and the set of coordinate (in this example, the set of ( a1 , b1 )) currently selected in the overlaying position data Dsp (step S15).
- the resulting display image Sout is the one having the indicators Sind1 and Sind2 overlaid on the points a1 and b1 on the captured image Scpt .
- the processor 1 transfers thus generated display image Sout on the frame memory to the display device 6 for display thereon (step S16).
- steps S1 to S16 are repeated until the determination in step S10 becomes Yes to end the processing of FIG. 5.
- The overlaying positions of the indicators Sind1 and Sind2 change, in increments of the distance Δd, from the points a0 and b0 to a(i-1) and b(i-1), respectively.
- the indicators Sind1 and Sind2 are displayed as if moving in the same heading direction as the vehicle Vusr along the left- and right-side trajectories Pp1 and Pp2 .
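The interplay of the three timings above (t1 per frame, T1 every t2 seconds for re-deriving the trajectories, T2 every t3 seconds for advancing the indicators) can be sketched as a frame schedule. Wrapping the indicator selection back to set 0 after set (i-1), so the indicators appear to travel along the path repeatedly, is our assumption; the patent only describes the stepwise advance.

```python
def animation_schedule(frames, t1=1/30, t2=0.1, t3=0.05, i=5):
    """Sketch of the FIG. 5 timing loop: returns, per frame, whether the
    trajectories are re-derived (timing T1) and which coordinate set is
    selected for the indicators (advanced at timing T2)."""
    schedule = []
    sel = 0                      # currently selected coordinate set number
    next_derive, next_move = t2, t3
    for f in range(frames):
        now = f * t1             # frame timestamp in seconds
        derive = now >= next_derive - 1e-9
        if derive:
            next_derive += t2
        if now >= next_move - 1e-9:
            sel = (sel + 1) % i  # move indicators one step along the path
            next_move += t3
        schedule.append((f, derive, sel))
    return schedule
```

With the example values from the text (t1 = 1/30 s, t2 = 0.1 s, t3 = 0.05 s), the trajectories are re-derived every third frame while the indicators advance roughly every other frame, producing the impression of motion along the path.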
- the left- and right-side trajectories Pp1 and Pp2 are also displayed on an intermittent basis on the display device 6.
- The left- and right-side trajectories Pp1 and Pp2 thereby become more noticeable and further emphasized.
- the driver can instantaneously locate the trajectories Pp1 and Pp2 in the display image Sout.
- The processor 1 derives the left- and right-side trajectories Pp1 and Pp2 based on the current rudder angle θ. In this manner, the trajectories Pp1 and Pp2 displayed on the display device 6 are always responsive to the driver's steering.
- the changing timing T2 may be variable.
- the program PGa may be so written that the changing timing T2 comes earlier. If so, the left- and right-side trajectories Pp1 and Pp2 become easier to notice.
- The distance Δd between two successive points aj and a(j+1) is constant on the left-side trajectory Pp1.
- The value j is an integer between 0 and (i-1).
- The distance Δd may not necessarily be constant.
- The program PGa may be so written that the distance Δd is set relatively small when causing the processor 1 to select the point a(j+1).
- Conversely, the program PGa may be so written that the distance Δd is set relatively large when causing the processor 1 to select the point a(j+1).
- The left- and right-side trajectories Pp1 and Pp2 thereby become even more conspicuous.
- FIG. 10 is a block diagram showing the hardware structure of a rendering device Urnd2 according to a second embodiment of the present invention.
- the rendering device Urnd2 includes a processor 21, program memory 22, and a working area 23.
- the program memory 22 is typified by ROM (Read Only Memory), and stores a program PGb for defining the processing procedure in the processor 21.
- By following the program PGb, the processor 21 generates such a display image Sout as shown in FIG. 11.
- The display image Sout shows an estimated path Pp of the vehicle Vusr (see FIG. 3), traced by the left-rear wheel of the vehicle Vusr.
- The estimated path Pp is displayed only during a display time period Pdt, which will be described later.
- The working area 23 is typified by RAM (Random Access Memory), and is used when the processor 21 executes the program PGb.
- the rendering device Urnd2 in the above structure is typically incorporated in a drive assistant device Uast2.
- The only structural difference of the drive assistant device Uast2 from the drive assistant device Uast1 is that it includes the rendering device Urnd2 instead of Urnd1.
- Any component already appearing in FIG. 1 is denoted by the same reference numeral in FIG. 10, and is not described again.
- Compared with FIG. 5, the flowchart of FIG. 12 includes several identical steps; these are denoted by the same step numbers and not described again.
- the processor 21 derives an equation for the estimated path Pp .
- The procedure then goes to step S21, and the processor 21 generates the display image Sout based on the captured image Scpt stored in step S2 and the estimated path Pp derived in step S6. More specifically, the processor 21 renders the derived estimated path Pp in its entirety, and the resulting display image Sout looks as shown in FIG. 11.
- The procedure then goes to step S9, and the processor 21 transfers the display image Sout currently on the frame memory to the display device 6 for display thereon. The processor 21 then determines whether now is the time to end the processing of FIG. 12 (step S10), and if not yet, the procedure returns to step S1 for generating another display image Sout on the frame memory. By the time steps S1 and S2 are through, another captured image Scpt is newly stored in the frame memory. Then, in step S3, if the processor 21 determines that the timing T1 has not come yet, it determines whether now is within the display time period Pdt for the estimated path Pp (step S22). Here, the display time period is previously written in the program PGb, and comes every t4 seconds in this embodiment.
- The estimated path Pp thus appears on and disappears from the display with a lapse of t4 seconds.
- The value of t4 is selected to be larger than that of t1 (e.g., 0.1 second).
- If the processor 21 determines that now is within the display time period Pdt, the procedure goes to step S21. The processor 21 then generates, on the frame memory, the display image Sout including the estimated path Pp (see FIG. 11). The procedure then goes to step S9, and the processor 21 transfers the current display image Sout on the frame memory to the display device 6 for display thereon. The processor 21 then determines whether now is the time to end the processing of FIG. 12 (step S10), and if not yet, the procedure returns to step S1 for generating another display image Sout.
- In step S3, if the processor 21 determines that the deriving timing T1 has not come yet, and in step S22 that now is not within the display time period Pdt, the procedure goes to step S23.
- In step S23, the processor 21 transfers the captured image Scpt stored in step S2 (see FIG. 4) to the display device 6, for display as the display image Sout without any change.
- These steps are repeated until the determination in step S10 becomes Yes to end the processing of FIG. 12.
- the estimated path Pp is displayed only during the display time period Pdt .
- the estimated path Pp appears on and disappears from the display on an intermittent basis. Accordingly, the estimated path Pp becomes noticeable, and the driver finds it easy to locate the estimated path Pp in the display image Sout .
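The on/off decision of step S22 can be sketched as a simple periodic test. The text only says the path appears and disappears with a lapse of t4 seconds; the 50% duty cycle over a 2·t4 period used below is our assumption.

```python
def path_visible(t, t4=0.1):
    """Decide, at time t (seconds), whether the estimated path Pp is
    within its display time period Pdt.  The path is shown for t4
    seconds, then hidden for t4 seconds (assumed duty cycle)."""
    return (t % (2 * t4)) < t4
```

A frame generator would call this each frame: when it returns True, the path is rendered on the captured image (step S21); otherwise the captured image is displayed unchanged (step S23).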
- FIG. 13 is a block diagram showing the hardware structure of a rendering device Urnd3 according to a third embodiment of the present invention.
- the rendering device Urnd3 includes a processor 41, program memory 42, and a working area 43.
- the program memory 42 is typified by ROM (Read Only Memory), and stores a program PGc for defining the processing procedure in the processor 41.
- By following the program PGc, the processor 41 generates such a display image Sout as shown in FIG. 14.
- the display image Sout shows an estimated region Rpt on a road surface Frd for the vehicle Vusr (see FIG. 3) to move.
- the estimated region Rpt is defined by the left- and right-side trajectories Pp1 and Pp2 described in the first embodiment, and a line segment Llmt passing through a no-go point Plmt.
- The no-go point Plmt indicates the farthest limit for the vehicle Vusr to move; if the vehicle Vusr keeps moving beyond it, the vehicle might collide with the obstacle Vbst.
- the working area 43 is typified by RAM (Random Access Memory), and used when the processor 41 executes the program PGc .
- the rendering device Urnd3 in the above structure is typically incorporated in a drive assistant device Uast3.
- The structural difference from the drive assistant device Uast1 is that it includes the rendering device Urnd3 instead of Urnd1, and further includes four active sensors 441 to 444, which exemplify the measuring sensor recited in the claims.
- the active sensors 441 to 444 are embedded in the rear-end of the vehicle Vusr, preferably, in a lateral direction.
- the active sensors 441 to 444 arranged as such emit ultrasonic waves or radio waves toward the area rear of the vehicle Vusr , and monitor reflected waves.
- distances d1 to d4 to an obstacle Vbst located closest behind the vehicle Vusr are detected by the active sensors 441 to 444.
- the processor 41 first generates a distance measuring instruction Imsr , and transmits it to all of the active sensors 441 to 444 (step S41).
- the distance measuring instruction Imsr is a signal to instruct all of the active sensors 441 to 444 to detect the distances d1 to d4 , and transmit those to the processor 41.
- the active sensors 441 to 444 each responsively perform such detection, and store the resultant distances d1 to d4 to the working area 43 (step S42).
- In step S43, the processor 41 calculates coordinates (x1, y1) to (x4, y4) of four points P1 to P4 on the surface of the obstacle Vbst.
- FIG. 17 shows the vehicle Vusr, the obstacle Vbst , and a two-dimensional (2D) coordinate system.
- the Y-axis connects a rotation center of a left-rear wheel Wr1 and that of a right-rear wheel Wr2 .
- the X-axis is a perpendicular bisector parallel to a horizontal plane.
- positions A1 to A4 of the active sensors 441 to 444 from which the ultrasonic waves, for example, are emitted can be all defined by coordinates ( xa1, ya1 ) to ( xa4, ya4 ) known in the 2D coordinate system.
- Angles θ1 to θ4 at which the active sensors 441 to 444 emit the ultrasonic waves are also known.
- The angles θ1 to θ4 are formed by the X-axis and the emitted waves; FIG. 17 exemplarily shows only the angle θ1.
- The above coordinates (x1, y1) are equal to (d1·cos θ1 + xa1, d1·sin θ1 + ya1), and (x2, y2) to (x4, y4) are equal to (d2·cos θ2 + xa2, d2·sin θ2 + ya2) to (d4·cos θ4 + xa4, d4·sin θ4 + ya4), respectively.
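The point computation of step S43 is a direct polar-to-Cartesian conversion, sketched below. Each sensor is described by its known mounting position (xa, ya), its known emission angle θ from the X-axis, and its measured distance d.

```python
import math

def obstacle_points(sensors):
    """Step S43 sketch: convert each active sensor's (position, emission
    angle, measured distance) into one point on the obstacle surface,
    using (x, y) = (d*cos(theta) + xa, d*sin(theta) + ya)."""
    return [(d * math.cos(theta) + xa, d * math.sin(theta) + ya)
            for (xa, ya, theta, d) in sensors]
```

Feeding the four sensor readings d1 to d4 yields the four surface points P1 to P4 used in the subsequent corner-point calculation.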
- the processor 41 calculates coordinates ( xlmt, ylmt ) of the corner point Pcnr of the obstacle Vbst as one example of the no-go point Plmt (step S44).
- The processor 41 first performs a Hough transform with respect to the points P1 to P4, so that curves C1 to C4 are derived in the Hough space, which is defined by the ρ-axis and the θ-axis.
- the curves C1 to C4 are expressed as the following equations (1) to (4), respectively.
- The processor 41 calculates coordinates (ρ1, θ1) of an intersection point Pc1 of the curves C1 and C2 in the Hough space, and similarly calculates coordinates (ρ2, θ2) of an intersection point Pc2 of the curves C2 to C4 in the Hough space. From the intersection point Pc1, the processor 41 then derives an equation for a straight line P1P2.
- The line P1P2 is expressed by the following equation (5) in the 2D coordinate system.
- The line P2P4 is expressed by the equation (6).
- y = (−cos θ1 · x + ρ1) / sin θ1 ... (5)
- y = (−cos θ2 · x + ρ2) / sin θ2 ... (6)
- The processor 41 calculates the coordinates of the intersection point of the line P1P2 and the line P2P4, and the resulting coordinates are determined as the above-mentioned coordinates (xlmt, ylmt).
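The corner computation of step S44 can be sketched with the normal-form line equation x·cos θ + y·sin θ = ρ that underlies the Hough curves. For brevity this sketch fits each edge line from two of its points directly, rather than intersecting the full Hough curves; the intersection formula is the same either way.

```python
import math

def line_through(p, q):
    """(theta, rho) normal form of the line through points p and q,
    matching the Hough parameterization x*cos(theta) + y*sin(theta) = rho."""
    dx, dy = q[0] - p[0], q[1] - p[1]
    theta = math.atan2(dx, -dy)  # direction of a normal to the line
    rho = p[0] * math.cos(theta) + p[1] * math.sin(theta)
    return theta, rho

def corner_point(line_a, line_b):
    """Step S44 sketch: intersect the two obstacle edge lines to obtain
    the corner point Pcnr = (xlmt, ylmt)."""
    (t1, r1), (t2, r2) = line_a, line_b
    det = math.cos(t1) * math.sin(t2) - math.sin(t1) * math.cos(t2)
    if abs(det) < 1e-12:
        return None  # parallel edges: no corner
    x = (r1 * math.sin(t2) - r2 * math.sin(t1)) / det
    y = (r2 * math.cos(t1) - r1 * math.cos(t2)) / det
    return x, y
```

For example, the edge through P1 and P2 and the edge through P2 and P4 meet at the obstacle corner, which becomes the no-go point Plmt.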
- The processor 41 then receives the current rudder angle θ of the vehicle Vusr (steps S45 and S46).
- the processor 41 calculates, in the 2D coordinate system, coordinates ( xcnt, ycnt ) of a center point Pcnt (see FIG. 19) of the circle traceable by the vehicle Vusr when rotated (step S47).
- The processor 41 also derives equations for circles Cr1 and Cr2, which are traced respectively by the rotation centers of the left- and right-rear wheels Wr1 and Wr2 of the vehicle Vusr when rotated around the center point Pcnt (step S48).
- Since the coordinates (xcnt, ycnt) are easily calculated under the well-known Ackermann model, steps S47 and S48 are not described in detail.
- the circles Cr1 and Cr2 include the left- and right-side trajectories Pp1 and Pp2 described in the first embodiment.
- The processor 41 then derives an equation for a straight line Llmt, which passes through the coordinates (xlmt, ylmt) calculated in step S44 and the coordinates (xcnt, ycnt) calculated in step S47 (step S49).
- The straight line Llmt specifies the farthest limit for the vehicle Vusr to move without colliding with the obstacle Vbst.
- the processor 41 next generates the estimated region Rpt , which is a region enclosed by the circles Cr1 and Cr2 calculated in step S48, the straight line Llmt calculated in step S49, and a line segment Lr12 (step S410).
- the line segment Lr12 is the one connecting the rotation centers of the left- and right-rear wheels Wr1 and Wr2 .
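Since the estimated region Rpt of step S410 is bounded by two concentric circles about Pcnt, the rear-axle segment Lr12, and the radial line Llmt, it can be modeled as an annular sector. Representing it that way, and measuring the cut-off as an angle about the circling center, are our assumptions, consistent with the geometry of FIG. 19 but not stated in the text.

```python
import math

def in_estimated_region(p, center, r_inner, r_outer, phi_limit):
    """Sketch of the estimated region Rpt: a point is inside if it lies
    between the two rear-wheel circles (radii r_inner..r_outer about the
    circling center Pcnt) and within the angular sweep from the rear axle
    (angle 0) up to the no-go line Llmt (angle phi_limit, radians)."""
    dx, dy = p[0] - center[0], p[1] - center[1]
    r = math.hypot(dx, dy)                    # radial distance from Pcnt
    phi = math.atan2(dy, dx) % (2 * math.pi)  # sweep angle from axle line
    return r_inner <= r <= r_outer and 0.0 <= phi <= phi_limit
```

Rasterizing this test over the road plane and projecting the result into the camera view corresponds to the deformation and rendering performed in step S413 of the flowchart.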
- The processor 41 then receives the captured image Scpt from the image capture device 4 (steps S411 and S412). Based on the captured image Scpt and the estimated region Rpt generated in step S410, the processor 41 generates the display image Sout on the frame memory. More specifically, the processor 41 deforms the estimated region Rpt into the one viewed from the image capture device 4, and renders it on the captured image Scpt. The resulting display image Sout looks like the one shown in FIG. 14. The processor 41 then transfers the display image Sout on the frame memory to the display device 6 for display thereon (step S414). Steps S41 to S414 are repeated until the determination in step S415 becomes Yes to end the processing of FIG. 16. As the estimated region Rpt extends only up to the no-go point Plmt, the driver can instantaneously know the farthest limit to which the vehicle Vusr can move.
- The image capture device 4 is embedded in the rear end of the vehicle Vusr. This is not restrictive; embedding it in the front end of the vehicle Vusr will also do. Further, the number of image capture devices 4 is not limited to one, and may be larger depending on the design requirements of the drive assistant devices Uast1 to Uast3.
- the captured image Scpt is the one on which the left- and right-side trajectories Pp1 and Pp2 , the estimated path Pp , and the estimated region Rpt are rendered.
- the captured image Scpt may be subjected to some image processing by the processors 1, 21, and 41 before having those rendered thereon.
- Such image processing is typified by processing for generating an image of the area around the vehicle Vusr viewed from a virtual viewpoint set high above the vehicle Vusr.
- the captured image Scpt is stored in the frame memory in response to the image capture instruction Icpt transmitted from the processors 1, 21, and 41 to the image capture device 4.
- Alternatively, the captured image Scpt may be voluntarily generated by the image capture device 4, and then stored in the frame memory.
- Similarly, the rudder angle θ may be detected voluntarily by the rudder angle sensor 5 without responding to the detection instruction Idtc coming from the processors 1, 21, and 41.
- four active sensors 441 to 444 are placed in the vehicle Vusr.
- the number thereof is not restrictive, and may be one or more.
- If fewer sensors are used, the direction of the lens thereof needs to be dynamically changed so that the angle θ of the emitted waves is set wider.
- The active sensors 441 to 444 are provided as one example of the measuring sensor recited in the claims for measuring the distances d1 to d4 to the obstacle Vbst.
- A passive sensor may be used instead as the measuring sensor. In this case, two image capture devices are required to cover the area rear of the vehicle Vusr. These image capture devices each pick up an image of the obstacle Vbst located behind the vehicle Vusr. Based on the parallax of the obstacle between the two images, the processor 41 then measures the distance to the obstacle Vbst by stereoscopic vision.
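The stereoscopic alternative above reduces to the standard parallax relation Z = f·B/d (focal length times baseline over disparity). The parameter names and example values below are illustrative, not from the patent.

```python
def stereo_distance(focal_px, baseline_m, disparity_px):
    """Sketch of passive (stereo) ranging: with two cameras a baseline
    apart, the distance to the obstacle follows from the measured
    parallax (disparity, in pixels) as Z = f * B / d."""
    if disparity_px <= 0:
        return float('inf')  # no measurable parallax: obstacle too far
    return focal_px * baseline_m / disparity_px
```

For instance, a 700-pixel focal length, a 0.12 m baseline, and a 35-pixel disparity would place the obstacle 2.4 m behind the vehicle.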
- the programs PGa to PGc are stored in the rendering devices Urnd1 to Urnd3, respectively.
- those programs PGa to PGc may be distributed on a recording medium typified by CD-ROM, or over a communications network such as the Internet.
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2000199513 | 2000-06-30 | ||
| JP2000199512 | 2000-06-30 | ||
| JP2000199512 | 2000-06-30 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| EP1168241A2 true EP1168241A2 (de) | 2002-01-02 |
| EP1168241A3 EP1168241A3 (de) | 2004-01-14 |
Family
ID=26595166
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP01115271A Withdrawn EP1168241A3 (de) | 2000-06-30 | 2001-06-25 | Bildwiedergabevorrichtung |
Country Status (2)
| Country | Link |
|---|---|
| US (1) | US6825779B2 (de) |
| EP (1) | EP1168241A3 (de) |
Families Citing this family (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPWO2003034738A1 (ja) * | 2001-10-10 | 2005-02-10 | 松下電器産業株式会社 | 画像処理装置 |
| DE102004009924A1 (de) * | 2004-02-23 | 2005-09-01 | Valeo Schalter Und Sensoren Gmbh | Verfahren und Warnvorrichtung zum grafischen Aufbereiten eines Bildes einer Kamera |
| US8665116B2 (en) | 2010-07-18 | 2014-03-04 | Ford Global Technologies | Parking assist overlay with variable brightness intensity |
| KR101329510B1 (ko) * | 2011-10-07 | 2013-11-13 | 엘지이노텍 주식회사 | 주차 보조 장치 및 방법 |
| US9683861B2 (en) * | 2013-09-27 | 2017-06-20 | Nissan Motor Co., Ltd. | Estimated route presentation apparatus and estimated route presentation method |
| KR102592825B1 (ko) * | 2018-08-31 | 2023-10-23 | 현대자동차주식회사 | 충돌 회피 제어 장치 및 그 방법 |
| US11930583B1 (en) * | 2022-09-08 | 2024-03-12 | Ali Kaddoura | Heat conditioning through deflection/reflection/absorption of electromagnetic waves |
Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0236417A (ja) | 1988-07-26 | 1990-02-06 | Nec Corp | ファーストインファーストアウト型半導体メモリ |
Family Cites Families (19)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPS6414700A (en) | 1987-07-08 | 1989-01-18 | Aisin Aw Co | Device for displaying prospective track of vehicle |
| JPH01168538A (ja) * | 1987-12-23 | 1989-07-04 | Honda Motor Co Ltd | 車両の後方視界表示装置 |
| IT1240974B (it) | 1990-07-05 | 1993-12-27 | Fiat Ricerche | Metodo e apparecchiatura per evitare la collisione di un autoveicolo contro ostacoli. |
| DE69333543T2 (de) | 1992-09-30 | 2005-06-02 | Hitachi, Ltd. | Unterstützungssystem für den Fahrer eines Fahrzeugs und damit ausgerüstetes Fahrzeug |
| US5670935A (en) * | 1993-02-26 | 1997-09-23 | Donnelly Corporation | Rearview vision system for vehicle including panoramic view |
| DE69730570T2 (de) | 1996-10-09 | 2005-02-03 | Honda Giken Kogyo K.K. | Automatisches Lenksystem für ein Fahrzeug |
| KR100208806B1 (ko) | 1996-12-12 | 1999-07-15 | 윤종용 | 차량의 진행을 예측하여 안내하는 방법 |
| JP3575279B2 (ja) * | 1998-05-22 | 2004-10-13 | アイシン精機株式会社 | 駐車補助装置 |
| JP4519957B2 (ja) * | 1998-10-22 | 2010-08-04 | 富士通テン株式会社 | 車両の運転支援装置 |
| FR2785383B1 (fr) | 1998-10-30 | 2000-12-15 | Renault | Procede et dispositif d'assistance au deplacement d'un vehicule en vue en particulier de la parquer |
| JP4108210B2 (ja) | 1998-12-11 | 2008-06-25 | 富士通テン株式会社 | 車両の駐車支援装置 |
| JP4287532B2 (ja) * | 1999-03-01 | 2009-07-01 | 矢崎総業株式会社 | 車両用後側方監視装置 |
| JP4312883B2 (ja) | 1999-06-29 | 2009-08-12 | 富士通テン株式会社 | 車両の駐車支援装置 |
| JP4320873B2 (ja) | 1999-10-26 | 2009-08-26 | 株式会社エクォス・リサーチ | 車両用駐車スペース検出装置 |
| EP1158803A3 (de) * | 2000-05-24 | 2003-12-10 | Matsushita Electric Industrial Co., Ltd. | Wiedergabevorrichtung zur Erzeugung einer Bildanzeige |
| EP1167120B1 (de) * | 2000-06-30 | 2014-08-27 | Panasonic Corporation | Wiedergabevorrichtung für Einparkhilfe |
| US6369701B1 (en) * | 2000-06-30 | 2002-04-09 | Matsushita Electric Industrial Co., Ltd. | Rendering device for generating a drive assistant image for drive assistance |
| JP2002036991A (ja) * | 2000-07-27 | 2002-02-06 | Honda Motor Co Ltd | 駐車支援装置 |
| JP2003072495A (ja) * | 2001-09-06 | 2003-03-12 | Yazaki Corp | 駐車支援装置および駐車支援方法 |
- 2001-06-21: US US09/885,095 patent/US6825779B2/en not_active Expired - Lifetime
- 2001-06-25: EP EP01115271A patent/EP1168241A3/de not_active Withdrawn
Patent Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0236417A (ja) | 1988-07-26 | 1990-02-06 | Nec Corp | ファーストインファーストアウト型半導体メモリ |
Also Published As
| Publication number | Publication date |
|---|---|
| US20020097170A1 (en) | 2002-07-25 |
| US6825779B2 (en) | 2004-11-30 |
| EP1168241A3 (de) | 2004-01-14 |
Similar Documents
| Publication | Title | Publication Date |
|---|---|---|
| US6539288B2 (en) | Vehicle rendering device for generating image for drive assistance | |
| US9435879B2 (en) | Alert display device and alert display method | |
| EP1167120B1 (de) | Wiedergabevorrichtung für Einparkhilfe | |
| EP1005234B1 (de) | Dreidimensionales Visualisierungssystem für Fahrzeuge mit einer einzelnen Kamera | |
| JP5681569B2 (ja) | 情報処理システム、サーバ装置、および、車載装置 | |
| JP5729158B2 (ja) | 駐車支援装置および駐車支援方法 | |
| CN107818581B (zh) | 车辆的图像处理系统 | |
| CN108259879B (zh) | 图像生成装置及图像生成方法 | |
| WO2001021446A1 (en) | Device for assisting automobile driver | |
| JP2004114977A (ja) | 移動体周辺監視装置 | |
| JP4154980B2 (ja) | 移動体周辺監視装置 | |
| JP2011065219A (ja) | 道路曲率推定装置 | |
| JP4374850B2 (ja) | 移動体周辺監視装置 | |
| EP1168241A2 (de) | Bildwiedergabevorrichtung | |
| EP1168248A2 (de) | Bildwiedergabevorrichtung | |
| JP2020127171A (ja) | 周辺監視装置 | |
| JPH11264868A (ja) | 車両用表示装置 | |
| JP2011203643A (ja) | 車両用ヘッドアップディスプレイ装置 | |
| JP2018001899A (ja) | 周辺監視装置 | |
| JP2008310585A (ja) | 車両周辺監視装置 | |
| JP2012037924A (ja) | 運転支援装置 | |
| JP3638132B2 (ja) | 描画装置 | |
| JP5651491B2 (ja) | 画像表示システム、画像表示装置、及び、画像表示方法 | |
| JP2006153778A (ja) | 車両周辺監視装置 | |
| JP6999239B2 (ja) | 画像処理装置および画像処理方法 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| | AK | Designated contracting states | Kind code of ref document: A2; Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR |
| | AX | Request for extension of the european patent | Free format text: AL;LT;LV;MK;RO;SI |
| | PUAL | Search report despatched | Free format text: ORIGINAL CODE: 0009013 |
| | AK | Designated contracting states | Kind code of ref document: A3; Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR |
| | AX | Request for extension of the european patent | Extension state: AL LT LV MK RO SI |
| 2004-04-17 | 17P | Request for examination filed | Effective date: 20040417 |
| | AKX | Designation fees paid | Designated state(s): DE FR IT |
| 2005-03-31 | 17Q | First examination report despatched | Effective date: 20050331 |
| | RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: PANASONIC CORPORATION |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
| 2014-08-06 | 18D | Application deemed to be withdrawn | Effective date: 20140806 |