JPH1116097A - Operation supporting device for vehicle - Google Patents

Operation supporting device for vehicle

Info

Publication number
JPH1116097A
JPH1116097A (application JP9169063A / JP16906397A)
Authority
JP
Japan
Prior art keywords
vehicle
means
dimensional map
narrow road
ideal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP9169063A
Other languages
Japanese (ja)
Other versions
JP3917241B2 (en)
Inventor
Atsushi Ikeda
Masahiro Kinoshita
Original Assignee
Fuji Heavy Ind Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuji Heavy Ind Ltd
Priority to JP16906397A
Publication of JPH1116097A
Application granted
Publication of JP3917241B2
Anticipated expiration
Expired - Lifetime (Current)

Abstract

PROBLEM TO BE SOLVED: To provide a dependable, reliable, and practical driving support device that enables a driver to travel on a narrow road while avoiding contact with obstacles by making accurate judgments easily and quickly. SOLUTION: A vehicle speed V and a steering wheel angle θ are detected, the environment in the traveling direction is imaged by a CCD camera 3, and relative position information is calculated by an image recognizing part 21 and a road shape and obstacle recognizing part 22. When a narrow road determination processing part 23 judges that a narrow road is present in the traveling direction, two-dimensional maps prepared in the past are successively updated, and a two-dimensional map of the environment around the vehicle including the traveling direction is prepared by a two-dimensional map preparing part 25. An ideal path for the vehicle to enter the narrow road is then calculated from the two-dimensional map by an ideal path calculating part 26, and the expected position of the vehicle 1 after a set time is estimated on the two-dimensional map by an expected position estimating part 27. A notification controlling part 28 then outputs a signal to a state display part 8, and the ideal path and the expected position are combined and displayed on the two-dimensional map.

Description

DETAILED DESCRIPTION OF THE INVENTION

[0001]

BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to a vehicle driving support device that assists the driving of the driver by providing accurate information on the possibility of contact with obstacles such as guardrails, side walls, and parked vehicles, so that the vehicle can easily enter and travel on narrow roads and the like.

[0002]

2. Description of the Related Art In recent years, in order to improve the safety of vehicles, a comprehensive driving assistance system (ADA: Active Drive Assist system) that actively supports the driving operation of the driver has been developed. This ADA system estimates various possibilities, such as a collision with a preceding vehicle, contact with an obstacle, and lane departure, from the traveling environment information around the vehicle and the traveling state of the own vehicle, and when it is predicted that safety cannot be maintained, notifies the driver and performs other control.

[0003] As a device for obtaining the traveling environment information of the vehicle, a laser radar device and the like have conventionally been known. Recently, however, it has become possible to process images of the landscape and objects ahead of the vehicle captured by a plurality of cameras mounted on the vehicle, and to three-dimensionally recognize roads and the traffic environment with accuracy and processing time sufficient for practical use.

[0004] As a device employing the narrow road guide function, which is one of the functions of the ADA system, for determining whether or not the vehicle can enter a narrow road and for guiding the vehicle on the narrow road so as to prevent contact with an obstacle, a parking assist device is known. For example, JP-A-6-234341 discloses a technique for determining a parking space and efficiently guiding the host vehicle to the parking position by voice instructions along a guide path calculated based on the positional relationship between the parking position and the current position.

[0005]

However, since the guide path of this prior art is calculated based only on the positional relationship between the parking position and the current position, it is difficult to cope with cases where obstacles such as telephone poles and curbs exist in front of the parking position.

[0006] That is, a narrow road guide that must cope with various situations other than parking has to be formed in consideration of the various obstacles in the traveling direction, and it is necessary that the driver can effectively avoid any such obstacle and travel easily.

[0007] The present invention has been made in view of the above circumstances, and has as its object to provide a dependable, reliable, and practical vehicle driving support device that notifies the driver of any obstacle in the traveling direction so that the driver can easily and quickly make an accurate decision, and that guides the vehicle to travel on a narrow road while avoiding contact with the obstacle.

[0008]

According to a first aspect of the present invention, there is provided a vehicle driving support device comprising: traveling state detecting means for detecting a traveling state of the own vehicle; traveling environment detecting means for detecting the road shape and three-dimensional objects in the traveling direction of the own vehicle; environment position information forming means for forming position information of the environment around the own vehicle, including the traveling direction, based on the traveling state, the road shape, and the three-dimensional object information; ideal route calculating means for calculating, when there is a narrow road in the traveling direction of the own vehicle, an ideal route for the own vehicle to enter the narrow road; expected position estimating means for estimating the expected position of the own vehicle after a set time based on the traveling state of the own vehicle; and notifying means for guiding the own vehicle along the narrow road based on the position information of the environment around the own vehicle formed by the environment position information forming means, the ideal route calculated by the ideal route calculating means, and the expected position of the own vehicle estimated by the expected position estimating means.

[0009] In the vehicle driving support device according to the first aspect, the traveling state detecting means detects the traveling state of the own vehicle, and the traveling environment detecting means detects the road shape and three-dimensional objects in the traveling direction of the own vehicle. The environment position information forming means forms position information of the environment around the own vehicle, including the traveling direction, based on the traveling state, the road shape, and the three-dimensional object information. When there is a narrow road in the traveling direction of the own vehicle, the ideal route calculating means calculates the ideal route for the own vehicle to enter this narrow road, and the expected position estimating means estimates the expected position of the own vehicle after the set time based on the traveling state of the own vehicle. The notifying means then guides the own vehicle along the narrow road based on the position information of the environment around the own vehicle formed by the environment position information forming means, the ideal route calculated by the ideal route calculating means, and the expected position of the own vehicle estimated by the expected position estimating means.

[0010] According to a second aspect of the present invention, in the vehicle driving support device according to the first aspect, the notifying means displays, on the position information of the environment around the own vehicle formed by the environment position information forming means, the ideal route calculated by the ideal route calculating means and the expected position of the own vehicle estimated by the expected position estimating means. By visually checking the ideal route and the expected position of the own vehicle displayed on this position information, the driver can easily recognize the possibility of avoiding an obstacle, can quickly and easily recognize the driving operation to be performed, and can also obtain obstacle information that the driver has not noticed.

[0011] According to a third aspect of the present invention, in the vehicle driving support device according to the first or second aspect, the notifying means calculates, based on the ideal route calculated by the ideal route calculating means and the expected position estimated by the expected position estimating means, the deviation amount of the expected position of the own vehicle from the ideal route, calculates a speed correction amount and a steering angle correction amount that minimize the deviation amount, and displays them in a predetermined manner. The driver can visually recognize the speed correction amount and the steering angle correction amount to be applied, and can therefore recognize the driving operation to be performed even more quickly and easily.

[0012] According to a fourth aspect of the present invention, in the vehicle driving support device according to any one of the first to third aspects, the notifying means calculates, based on the ideal route calculated by the ideal route calculating means and the expected position estimated by the expected position estimating means, the deviation amount of the expected position of the own vehicle from the ideal route, calculates a speed correction amount and a steering angle correction amount that minimize the deviation amount, and outputs them as a predetermined voice to guide the vehicle along the narrow road. The driver is therefore guided reliably along the ideal route even when the driver cannot look at the display inside the vehicle because the position of the obstacle must be confirmed visually.

[0013] According to a fifth aspect of the present invention, in the vehicle driving support device according to the fourth aspect, the notifying means varies the timing of the voice output according to the traveling state of the own vehicle. Voice guidance is therefore provided at an appropriate time according to parameters such as the vehicle speed and acceleration, so that the driving operation is further facilitated.

[0014]

Embodiments of the present invention will be described below with reference to the drawings. FIGS. 1 to 9 show a first embodiment of the present invention: FIG. 1 is a functional block diagram of the vehicle driving support device, FIG. 2 is a schematic configuration diagram of the vehicle driving support device, FIG. 3 is a flowchart of the narrow road guide control, FIG. 4 is a flowchart of a two-dimensional map creation routine, FIG. 5 is an explanatory diagram of the narrow road determination range, FIG. 6 is an explanatory diagram of three-dimensional object position information around the vehicle, FIG. 7 is an explanatory diagram of moving the previous three-dimensional object position information, FIG. 8 is an explanatory diagram showing an example of setting an ideal route on a narrow road ahead of the vehicle, and FIG. 9 is an explanatory diagram showing an example of the display on a monitor.

In FIG. 2, reference numeral 1 denotes a vehicle such as an automobile (own vehicle), which is equipped with a vehicle driving support device 2 that supports the driving of the driver and has, as one of its functions, the function of determining whether or not the vehicle 1 can enter a narrow road and of preventing contact with an obstacle. In the first embodiment of the present invention, only this function of the vehicle driving support device 2, namely determining whether or not the vehicle can enter a narrow road and preventing contact with an obstacle, will be described; other functions will not be described.

The vehicle driving support device 2 has, as a stereo optical system, a set of (left and right) CCD cameras 3 using a solid-state imaging device such as a charge-coupled device (CCD). These left and right CCD cameras 3 are mounted at a certain interval at the front of the ceiling in the vehicle compartment, and capture stereoscopic images of targets outside the vehicle from different viewpoints. The video signals of the traveling direction of the vehicle 1 captured by the set of CCD cameras 3 are input to the control device 4.

The vehicle driving support device 2 also has, as traveling state detecting means, a vehicle speed sensor 5 for detecting the speed of the host vehicle 1 and a steering wheel angle sensor 6 for detecting the steering wheel angle, whose detection signals are input to the control device 4. The control device 4 processes the above information (the video signal from the CCD cameras 3, the signal from the vehicle speed sensor 5, and the signal from the steering wheel angle sensor 6) and controls the alarm device 7 and the status display unit 8 so as to determine whether or not the vehicle can enter a narrow road and to realize the function of guiding travel on a narrow road while preventing contact with an obstacle.

The alarm device 7 is, for example, a buzzer or the like. When the vehicle approaches a narrow road of a size that cannot be entered, or when there is a possibility of contact with an obstacle if the vehicle continues traveling, an alarm is issued by an output signal from the control device 4 to notify the driver.

The status display unit 8 is a monitor or the like provided in the vehicle and driven by an output signal from the control device 4. For example, as shown in FIG. 9, the positional relationship between the own vehicle 1 and obstacles (a fence H0, parked vehicles H1 and H2, and a telephone pole H3), the expected position 1' after a set time (for example, 2 seconds) when the host vehicle 1 maintains its current driving state (steering wheel angle θ, vehicle speed V), and an ideal route RR for traveling on the narrow road are visually displayed on a two-dimensional map viewed from above. In the example shown in FIG. 9, the ideal route RR is displayed in blue, the obstacles in red, and the expected position after the set time in yellow, so that the display is easy to understand at a glance by color.

The control device 4 is formed by a microcomputer and its peripheral circuits and, as shown in FIG. 1, is mainly composed of an image recognition unit 21, a road shape/obstacle recognition unit 22, a narrow road determination processing unit 23, an alarm control unit 24, a two-dimensional map creation unit 25, an ideal route calculation unit 26, an expected position estimation unit 27, and a notification control unit 28.

The image recognition unit 21 processes the pair of stereo images of the environment in the traveling direction of the vehicle 1 captured by the CCD cameras 3 to obtain distance information over the entire image from the corresponding positional deviation by the principle of triangulation, generates a distance image representing a three-dimensional distance distribution, and outputs the generated distance image to the road shape/obstacle recognition unit 22.

The road shape/obstacle recognition unit 22 recognizes roads and three-dimensional objects such as obstacles by applying histogram processing to the distance distribution of the distance image from the image recognition unit 21, calculates the relative position coordinates (relative position information) of the three-dimensional objects viewed from the own vehicle 1, and outputs them to the narrow road determination processing unit 23 and the two-dimensional map creation unit 25.

That is, the CCD cameras 3, the image recognition unit 21, and the road shape/obstacle recognition unit 22 described above form the traveling environment detecting means.

Based on the relative position information of the traveling direction of the vehicle 1 input from the road shape/obstacle recognition unit 22, the narrow road determination processing unit 23 determines whether there is a narrow road within a set range substantially in front of the vehicle 1 in its traveling direction.

Here, as shown in FIG. 5, the set range is, for example when the traveling direction is forward, the range up to about 20 m ahead bounded by the tangent lines α1L and α1R of the left and right outermost edges (for example, the door mirrors) of the vehicle 1 extended forward, or a range bounded by lines α2L and α2R obtained by adding margins to the left and right of this range. The range may also be bounded by lines α2L' and α2R' whose margins gradually increase with distance.

The actual width of the road or the like is detected by measuring the interval between obstacles such as extremely slow or stationary vehicles in the traveling direction, guardrails at the road edge, curbs, and house fences. When, in relation to the maximum width of the vehicle body of the own vehicle 1 and predetermined margins, the road width is, for example, smaller than the value obtained by adding a margin of 40 cm to the maximum body width and equal to or larger than the value obtained by adding a margin of 10 cm to the maximum body width, it is determined that there is a narrow road, and the result is output to the two-dimensional map creation unit 25.

If the result of the determination by the narrow road determination processing unit 23 is that there is no narrow road, it is further determined whether or not the vehicle can pass with a sufficient margin. If it is determined that the vehicle cannot pass (the width is smaller than the value obtained by adding the 10 cm margin to the maximum body width, or there is no passable road at all), this result is output to the alarm control unit 24.
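
The width determination described above can be sketched as follows; this is a minimal illustration only, using the 10 cm and 40 cm margins given in the text, and the function name and return labels are hypothetical.

```python
def classify_road(D_cm: float, W_cm: float) -> str:
    """Classify the measured gap D against the own vehicle's maximum body width W."""
    if D_cm >= W_cm + 40:
        return "passable"      # enough margin: normal travel, no guidance needed
    if D_cm >= W_cm + 10:
        return "narrow_road"   # start narrow-road guidance (two-dimensional map, ideal route)
    return "impassable"        # warn the driver via the alarm control unit

print(classify_road(D_cm=185, W_cm=170))  # -> "narrow_road"
```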

The alarm control unit 24 emits an alarm sound from the alarm device 7 based on the signal from the narrow road determination processing unit 23 to warn the driver that the road is impassable. In this case, the closer the vehicle is to the obstacle, the louder the alarm sound and the shorter the interval of the intermittent alarm, so that the driver is notified effectively. Furthermore, if a collision with an obstacle is clearly unavoidable, an automatic braking device (not shown) may be activated.

The two-dimensional map creation unit 25 is formed as the environment position information forming means. Based on the steering wheel angle θ detected by the steering wheel angle sensor 6, the vehicle speed V detected by the vehicle speed sensor 5, and the relative position information from the road shape/obstacle recognition unit 22, it successively updates the environment position information (two-dimensional map) created in the past (the previous time) to form a two-dimensional map of the environment around the own vehicle 1 including the traveling direction of the own vehicle 1, and outputs this two-dimensional map to the ideal route calculation unit 26 and the expected position estimation unit 27.

As shown in FIG. 6, the environment position information (two-dimensional map) around the vehicle is position information of three-dimensional objects in an area QRST set in advance on an X-Y plane and centered on the own vehicle 1, and is formed from the relative position information obtained this time from the road shape/obstacle recognition unit 22 (the information in the area PQR) and the information obtained from the road shape/obstacle recognition unit 22 up to the previous time.

That is, when the own vehicle 1 has moved by the movement amount M (= vehicle speed × measurement time) since the previous calculation and new relative position information of the area PQR is obtained from the road shape/obstacle recognition unit 22, the previously calculated and stored region of three-dimensional object position information (two-dimensional map) Q'R'S'T' is updated by moving it by the movement amount M so that it becomes information referenced to the current vehicle position. From this updated previous two-dimensional map Q'R'S'T', the data that has moved out of the storage area (the data in the area TSS'T') and the data in the area PEF that overlaps the newly obtained relative position information of the area PQR are deleted, and the relative position information of the area PQR is added to form the area QRST of the current two-dimensional map. Although FIG. 6 shows the case where the vehicle moves straight ahead for ease of understanding, the two-dimensional map is obtained in the same manner when the vehicle turns.

By using such a two-dimensional map for guidance on narrow roads, it is possible to grasp not only the position of a three-dimensional object currently in the traveling direction of the vehicle but also a three-dimensional object that was once recognized in the traveling direction and has since come to the side of the vehicle as the vehicle moves, so that three-dimensional objects over a wide range can be recognized without any additional camera or three-dimensional object recognition device.

Here, the previous position information of the three-dimensional objects is moved based on the detected movement amount of the own vehicle 1 using, for example, the following formulas.

In FIG. 7, when the vehicle 1 goes straight ahead, the object at the point A (xa, ya) comes to the point B (xb, yb) with xa = xb. Here, assuming that the actual steering angle based on the steering wheel angle θ is δ, δ = 0 during straight travel, and yb = ya − ΔM, where ΔM is the movement amount of the vehicle. That is, when traveling straight ahead, a point (xold, yold) on the previous two-dimensional map is moved to the point (xnew, ynew) on the new two-dimensional map by the following two equations:

xnew = xold …(1)
ynew = yold − ΔM …(2)

Even if the actual steering angle δ is not exactly 0, the vehicle is regarded as traveling straight if δ is within a preset range. This setting range may be varied according to a parameter such as the vehicle speed.

When the vehicle 1 turns (when δ ≠ 0), the object at the point B (xb, yb) moves to the point C (xc, yc). The center coordinates Pc (XCE, YCE) of this turn are obtained from the actual steering angle δ by referring to a table preset from the vehicle specifications (denoted by f(δ)):

XCE = f(δ) …(3)
YCE = (offset to the wheel axis) = 0 …(4)

Further, the turning angle θc of this turn is calculated as

θc = ΔM / (XCE − XW) …(5)

where XW is the offset in the X direction from the camera position to the rear left wheel.

Using the center coordinates Pc (XCE, YCE) and the turning angle θc, a point (xold, yold) on the previous two-dimensional map is moved to the point (xnew, ynew) on the new two-dimensional map as follows:

r = ((xold − XCE)^2 + (yold − YCE)^2)^(1/2)
a = arctan((yold − YCE) / (xold − XCE))
xnew = r·cos(a + θc) + XCE …(6)
ynew = r·sin(a + θc) + YCE …(7)
The ideal route calculation unit 26 is formed as the ideal route calculating means: when there is a narrow road in the traveling direction of the host vehicle 1, it calculates, based on the two-dimensional map calculated by the two-dimensional map creation unit 25, an ideal route for the host vehicle 1 to enter this narrow road. The ideal route calculated by the ideal route calculation unit 26 is output to the notification control unit 28.

For example, as shown in FIG. 8A, consider the case where a parked vehicle H1 and a parked vehicle H2 serving as obstacles are located in front of the host vehicle 1 and form a narrow road SP (between the tangent line/straight line L1 of the leftmost outer edge of the parked vehicle H1 and the tangent line/straight line L2 of the rightmost outer edge of the parked vehicle H2), and the ideal route for entering this narrow road SP is to be calculated. In this case, the two-dimensional map of the periphery of the vehicle created by the two-dimensional map creation unit 25, as shown in FIG. 8, is input.

Then, as shown in FIG. 8, a straight line L3 having a preset margin is drawn on the narrow road SP side of the straight line L2 on the two-dimensional map, the intersection on the own vehicle 1 side of the parked vehicles H1 and H2 with this straight line is set as the point Pt1, and the point on the side of the telephone pole H3 with the highest possibility of contact, allowing a certain margin width around the telephone pole, is set as the point Pt2.

With this point Pt2 as the origin and with the positive direction of the y-axis taken in the direction in which the vehicle travels along the narrow road SP, a curve x = −k1·tanh(k2·y) (where k1 is about 1) is set so that it is asymptotic to the straight line L3 and runs almost along the straight line L3 near the point Pt1, and the trajectory formed by this curve L4 is determined as the ideal trajectory through which the leftmost outer edge of the vehicle 1 passes.
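
A minimal sketch of sampling this ideal-route curve is given below; k1 is "about 1" per the description, while the value of k2 and the sampling points are illustrative assumptions.

```python
import math

def ideal_path(y_values, k1=1.0, k2=0.5):
    """Return (x, y) samples of the curve L4 in the Pt2-origin coordinate system,
    with y pointing into the narrow road SP and x = -k1 * tanh(k2 * y)."""
    return [(-k1 * math.tanh(k2 * y), y) for y in y_values]

samples = ideal_path([0.0, 1.0, 2.0, 4.0, 8.0])
```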

The expected position estimation unit 27 serves as the expected position estimating means. Based on the vehicle speed from the vehicle speed sensor 5, the steering wheel angle from the steering wheel angle sensor 6, and the two-dimensional map from the two-dimensional map creation unit 25, it obtains the expected position on the two-dimensional map after a set time (for example, 2 seconds) when the own vehicle 1 maintains its current driving state, using a vehicle motion equation set in advance from the vehicle specifications, and outputs the obtained expected position to the notification control unit 28.
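
The description only states that a vehicle motion equation preset from the vehicle specifications is used, so the sketch below assumes a simple kinematic single-track model purely for illustration; the steering gear ratio, wheelbase, and integration step are hypothetical values.

```python
import math

def predict_position(v, theta, dt=2.0, ratio=16.0, wheelbase=2.6, steps=20):
    """Integrate a constant speed / constant steering-wheel-angle trajectory
    for dt seconds; y is taken as the forward direction of the map."""
    delta = theta / ratio            # actual steering angle from the wheel angle
    x, y, yaw = 0.0, 0.0, 0.0        # start at the current vehicle position
    h = dt / steps
    for _ in range(steps):
        x += v * math.sin(yaw) * h                   # lateral displacement
        y += v * math.cos(yaw) * h                   # forward displacement
        yaw += v / wheelbase * math.tan(delta) * h   # yaw change of the model
    return x, y, yaw
```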

The notification control unit 28 combines the ideal route obtained by the ideal route calculation unit 26 with the expected position obtained by the expected position estimation unit 27, and outputs a signal to the status display unit 8, such as a monitor provided in the vehicle compartment, so that both are displayed together on the two-dimensional map created by the two-dimensional map creation unit 25; the notification control unit 28 and the status display unit 8 thus form the notifying means. As a result, the driver can easily recognize the possibility of avoiding an obstacle by looking at the status display unit 8, can quickly and easily recognize the driving operation to be performed, and can also obtain obstacle information that he or she has not noticed.

Next, the operation of the vehicle driving support device having the above configuration will be described with reference to the flowchart of FIG. 3. When the program starts, first, in step (hereinafter abbreviated as "S") 101, the speed V of the own vehicle 1 detected by the vehicle speed sensor 5 and the steering wheel angle θ of the own vehicle 1 detected by the steering wheel angle sensor 6 are read, the left and right CCD cameras 3 capture images of the environment in the traveling direction of the host vehicle 1, and the captured images are taken into the image recognition unit 21 of the control device 4. This pair of stereo images is processed by the image recognition unit 21 to obtain distance information over the entire image from the corresponding positional deviation by the principle of triangulation, and a distance image representing a three-dimensional distance distribution is generated and output to the road shape/obstacle recognition unit 22. The road shape/obstacle recognition unit 22 recognizes roads and three-dimensional objects such as obstacles by applying histogram processing to the distance distribution of the distance image from the image recognition unit 21, calculates the relative position coordinates (relative position information) of the three-dimensional objects viewed from the host vehicle 1, and outputs them to the narrow road determination processing unit 23 and the two-dimensional map creation unit 25 (that is, the road/obstacle information is read).

Thereafter, the program proceeds to S102, in which it is determined whether or not there is a narrow road in the traveling direction (within the set range substantially in front of the host vehicle 1 in its traveling direction). The substantial road width D is detected by measuring the interval between obstacles such as extremely slow or stationary vehicles in the traveling direction, guardrails at the road edge, curbs, and house fences, and from the relation between this road width D, the maximum body width W of the own vehicle 1, and the margins, it is determined that there is a narrow road when, for example, D is smaller than the value obtained by adding a 40 cm margin to the maximum body width W and equal to or larger than the value obtained by adding a 10 cm margin (W + 10 ≤ D < W + 40). When there is no narrow road (W + 10 > D or D ≥ W + 40), the process proceeds to S103.

In S103, it is further determined whether or not the above road (a road other than a narrow road) is passable. If it is a road with a sufficient margin for traveling, that is, D ≥ W + 40, the process returns to S101; if it is not passable, that is, W + 10 > D, the process proceeds to S104. Note that S102 and S103 are processing performed by the narrow road determination processing unit 23.

In S104, the alarm control unit 24 emits an alarm sound from the alarm device 7, such as a buzzer, in order to warn the driver that the road cannot be passed. In this case, the closer the vehicle is to the obstacle, the louder the alarm sound and the shorter the interval of the intermittent alarm, so that the driver is notified effectively. Furthermore, if a collision with an obstacle is clearly unavoidable, an automatic braking device (not shown) is activated. After the processing of S104, the program exits.

On the other hand, if there is a narrow road in the traveling direction of the vehicle 1 in S102 (W + 10 ≤ D < W + 40), the flow proceeds to S105. In S105, the two-dimensional map creation unit 25 successively updates, according to the two-dimensional map creation routine described later and based on the steering wheel angle θ, the vehicle speed V, and the relative position information (road/obstacle information), the two-dimensional map created in the past (the previous time) to form a two-dimensional map of the environment around the host vehicle 1 including its traveling direction.

Thereafter, the process proceeds to S106, where the ideal route calculation unit 26 calculates, based on the two-dimensional map calculated by the two-dimensional map creation unit 25, the ideal route for the own vehicle 1 to enter the narrow road in the traveling direction.

Then, the process proceeds to S107, in which the expected position estimation unit 27 obtains, based on the steering wheel angle θ, the vehicle speed V, and the two-dimensional map, the expected position on the two-dimensional map after the set time (for example, after 2 seconds) when the own vehicle 1 maintains its current driving state, using the vehicle motion equation set in advance from the vehicle specifications of the host vehicle 1.

Then, the process proceeds to S108, where the notification control unit 28 outputs a signal to the status display unit 8, such as a monitor provided in the vehicle interior, combines the ideal route obtained by the ideal route calculation unit 26 with the expected position obtained by the expected position estimation unit 27, displays both on the two-dimensional map as shown in FIG. 9, and then exits the program.

Therefore, the driver can easily recognize the possibility of avoiding the obstacle by looking at the status display unit 8, can quickly and easily recognize the driving operation to be performed, and can also obtain obstacle information that he or she has not noticed.

Next, FIG. 4 shows a flowchart of the two-dimensional map creation routine executed by the two-dimensional map creation unit 25. When this routine is started, first, in S201, the actual steering angle δ based on the steering wheel angle θ, the vehicle movement amount ΔM (calculated from the vehicle speed and the measurement time), and the previous two-dimensional map are read. In S202, it is determined whether the vehicle is traveling straight or turning; if the vehicle is traveling straight, the process proceeds to S203, and if it is turning, the process proceeds to S204.

When it is determined in S202 that the vehicle is traveling straight and the process proceeds to S203, the vehicle movement amount ΔM is applied to the previous two-dimensional map (the processing based on equations (1) and (2) is performed), and the process proceeds to S206.

On the other hand, when it is determined in S202 that the vehicle is turning and the process proceeds to S204, the turning center Pc and the turning angle θc are calculated from the actual steering angle δ and the vehicle movement amount ΔM (the processing based on equations (3), (4), and (5) above), and the process proceeds to S205, where the previous two-dimensional map is rotated by the turning angle θc about the turning center Pc (the processing based on equations (6) and (7) is performed), and the process proceeds to S206.

In S206, which is reached from S203 or S205, the data of the previous two-dimensional map that has gone outside the storage area as a result of the processing of S203 or S205 is erased.

Next, the process proceeds to S207, in which the data of the previous two-dimensional map, as moved by the processing of S203 or S205, that overlaps the new relative position information of the three-dimensional objects is deleted.

Next, the flow proceeds to S208, where the relative position coordinates (relative position information) of the three-dimensional objects viewed from the host vehicle 1 are read, and in S209 this new relative position information is added to the previous two-dimensional map processed up to S207 and stored. This three-dimensional object position information is the new two-dimensional map updated this time.

The stored new two-dimensional map is read and processed as the previous two-dimensional map when the control program is next executed. Since the two-dimensional map is created in this way, even if a three-dimensional object once recognized in front of the vehicle comes to the side of the vehicle as the vehicle moves, its position can be grasped, so that not only driving assistance for an obstacle existing in front of the vehicle but also driving assistance for an obstacle existing at the side of the vehicle can easily be performed.
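
The flow of the routine of FIG. 4 (S201 to S209) can be sketched as follows, reusing the hypothetical move_point_straight / move_point_turning helpers shown earlier; the map is held as a list of (x, y) points, and the storage-area extents, the "effectively straight" threshold, and the in_new_region predicate standing in for the newly measured area PQR are all illustrative assumptions.

```python
STRAIGHT_LIMIT = 0.01        # rad: assumed threshold for treating delta as straight travel
AREA_X, AREA_Y = 10.0, 20.0  # m: assumed half-extents of the storage area QRST

def update_two_dimensional_map(prev_map, new_points, delta, delta_M, XCE, XW,
                               in_new_region):
    # S202-S205: move the previous map according to the own-vehicle motion.
    if abs(delta) < STRAIGHT_LIMIT:
        moved = [move_point_straight(x, y, delta_M) for x, y in prev_map]
    else:
        moved = [move_point_turning(x, y, delta_M, XCE, XW) for x, y in prev_map]
    # S206: erase data that has left the storage area.
    moved = [(x, y) for x, y in moved if abs(x) <= AREA_X and abs(y) <= AREA_Y]
    # S207: erase old data overlapping the newly measured region.
    moved = [p for p in moved if not in_new_region(p)]
    # S208-S209: add the new relative position information; the returned list is
    # stored and read as the "previous" two-dimensional map on the next cycle.
    return moved + list(new_points)
```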

As described above, according to the first embodiment of the present invention, in various situations other than parking, even if there is an obstacle in the traveling direction, the obstacle is notified to the driver, so that the driver can easily and quickly make an accurate decision, avoid contact with the obstacle, and be guided to travel on the narrow road; the device is therefore dependable, highly reliable, and highly practical.

Next, FIGS. 10 to 13 relate to a second embodiment of the present invention: FIG. 10 is a functional block diagram of the vehicle driving support device, FIG. 11 is a schematic configuration diagram of the vehicle driving support device, FIG. 12 is a flowchart of the narrow road guide control, and FIG. 13 is an explanatory diagram showing an example of the display on a monitor. In the second embodiment of the present invention, the ideal route calculated by the ideal route calculation unit and the expected position of the own vehicle estimated by the expected position estimation unit are displayed; in addition, the deviation amount of the expected position of the own vehicle from the ideal route is calculated based on the ideal route and the expected position, a speed correction amount and a steering angle correction amount that minimize this deviation amount are calculated and displayed in a predetermined manner, and these correction amounts are also output as a predetermined voice, whose timing is varied according to the traveling state of the host vehicle, to guide the vehicle on the narrow road.

In FIG. 10, reference numeral 41 denotes a vehicle driving support device in which the video signals of the traveling direction of the vehicle 1 captured by the set of CCD cameras 3 are input to a control device 42.

In the vehicle driving support device 41, the detection signals from the vehicle speed sensor 5 and the steering wheel angle sensor 6, which serve as the traveling state detecting means, are input to the control device 42, as in the first embodiment of the present invention. The control device 42 determines, based on the above information, whether or not the vehicle can enter a narrow road, and, in order to realize the function of guiding travel on a narrow road while preventing contact with an obstacle, outputs control signals to the alarm device 7, the status display unit 8, an operation guide display unit 43, a left audio output unit 44L, and a right audio output unit 44R.

The operation guide display unit 43 is a monitor or the like provided in the vehicle and driven by an output signal (a steering angle correction amount and a speed correction amount, described later) from the control device 42. For example, as shown in FIG. 13, it is displayed together with the status display unit 8 (the Ap area), and visually indicates how much the steering wheel angle should be corrected (the APH area) and how much the vehicle speed should be corrected (the APV area).

The left and right audio output units 44L and 44R output audio signals recorded in advance on a recording medium in accordance with the output signal (the steering angle correction amount and the speed correction amount) from the control device 42. For example, if the instruction is to the left, the left audio output unit 44L is activated and outputs a voice such as "Please turn the steering wheel slightly to the left"; if the instruction is to the right, the right audio output unit 44R is activated and outputs, for example, "Please turn the steering wheel slightly to the right" (in the case of a speed-only instruction, the voice is output from both the left and right audio output units 44L and 44R).

Here, the timing of the audio output from the left and right audio output units 44L and 44R is varied by the control device 42 in accordance with the current vehicle speed and acceleration. For example, even if the current vehicle speed is low, the voice is output earlier during acceleration and later during deceleration.
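
As an illustration of this timing variation only, the sketch below assumes concrete lead-time values; the text states only that guidance is issued earlier when accelerating and later when decelerating, so the numbers and thresholds here are not taken from the description.

```python
def voice_lead_time(speed_mps: float, accel_mps2: float) -> float:
    """Seconds before the correction point at which to start the voice guidance."""
    base = 1.5 + 0.1 * speed_mps           # faster vehicle -> speak earlier
    if accel_mps2 > 0.2:
        return base + 0.5                  # accelerating: advance the output
    if accel_mps2 < -0.2:
        return max(base - 0.5, 0.5)        # decelerating: delay the output
    return base
```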

The control device 42 is formed by a microcomputer and its peripheral circuits and, as shown in FIG. 10, is mainly composed of the image recognition unit 21, the road shape/obstacle recognition unit 22, the narrow road determination processing unit 23, the alarm control unit 24, the two-dimensional map creation unit 25, the ideal route calculation unit 26, the expected position estimation unit 27, and a notification control unit 45.

The notification control unit 45 combines the ideal route obtained by the ideal route calculation unit 26 with the expected position obtained by the expected position estimation unit 27 and outputs a signal to the status display unit 8, such as a monitor provided in the vehicle interior, so that both are displayed together on the two-dimensional map created by the two-dimensional map creation unit 25. The notification control unit 45 also calculates, based on the ideal route and the expected position, the deviation amount of the expected position of the host vehicle 1 from the ideal route, calculates from the vehicle specifications a speed correction amount and a steering angle correction amount that minimize this deviation amount, outputs them to the operation guide display unit 43, and outputs them to the left and right audio output units 44L and 44R with a generation timing that is varied according to the traveling state of the host vehicle (the current speed and acceleration). That is, the notification control unit 45, the status display unit 8, the operation guide display unit 43, and the left and right audio output units 44L and 44R form the notifying means.

In the second embodiment of the present invention having this configuration, the program is executed as shown in the flowchart of FIG. 12. The same processing as in the first embodiment of the present invention is performed from S101 to S107; in S107, after the expected position after the set time when the own vehicle 1 maintains its current driving state is calculated, based on the steering wheel angle θ, the vehicle speed V, and the two-dimensional map, using the vehicle motion equation set in advance from the vehicle specifications of the host vehicle 1, the process proceeds to S301.

In S301, the notification control unit 45 calculates, based on the ideal route and the expected position, the deviation amount of the expected position of the vehicle 1 from the ideal route, and calculates from the vehicle specifications the speed correction amount and the steering angle correction amount that minimize this deviation amount.
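
The description does not give the correction formulas, so the sketch below assumes simple proportional corrections derived from the lateral deviation between the expected position and the nearest point of the ideal route; the gains and sign conventions are illustrative only.

```python
K_STEER = 2.0   # assumed: degrees of steering-wheel correction per metre of deviation
K_SPEED = 5.0   # assumed: km/h of speed reduction per metre of deviation

def correction_amounts(predicted_xy, ideal_route):
    """Return (steering angle correction, speed correction) reducing the deviation
    of the expected position from the ideal route (a list of (x, y) samples)."""
    px, py = predicted_xy
    nearest = min(ideal_route, key=lambda p: (p[0] - px) ** 2 + (p[1] - py) ** 2)
    deviation = px - nearest[0]                    # signed lateral deviation
    steer_correction = -K_STEER * deviation        # steer back toward the route
    speed_correction = -K_SPEED * abs(deviation)   # slow down when far off the route
    return steer_correction, speed_correction
```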

Then, the process proceeds to S302, where the notification control unit 45 combines the ideal route and the expected position and outputs a signal to the status display unit 8, such as a monitor provided in the vehicle compartment, so that both are displayed together on the two-dimensional map; it also outputs the speed correction amount and the steering angle correction amount to the operation guide display unit 43 for display, and outputs a signal to the left and right audio output units 44L and 44R, with a generation timing varied according to the traveling state of the host vehicle (the current speed and acceleration), so that the speed correction amount and the steering angle correction amount are announced by voice to guide the traveling.

As described above, in the second embodiment of the present invention, in addition to the effects of the first embodiment, the speed correction amount and the steering angle correction amount are calculated and displayed in a predetermined manner, so that the driver can visually recognize the speed correction amount and the steering angle correction amount to be applied and can recognize the driving operation to be performed even more quickly and easily.

Further, since the speed correction amount and the steering angle correction amount are also output as a voice, even when the driver cannot look at the display inside the vehicle because the position of the obstacle must be confirmed visually, the driver is guided reliably along the ideal route. Since the timing of the voice output is also varied in accordance with the running state of the vehicle, the voice guidance is given at an appropriate time according to parameters such as the vehicle speed and acceleration, so that the driving operation is further facilitated.

In each of the embodiments of the present invention described above, an example has been described in which a vehicle speed sensor and a steering wheel angle sensor are provided as the traveling state detecting means, but control may also be performed with further sensors added. For example, a yaw rate sensor or the like may be provided so that the yaw rate can be used as a control parameter.

[0074]

As described above, according to the present invention, even if there is an obstacle in the traveling direction, the driver is notified of the obstacle so that the driver can easily and quickly make an accurate decision, and the vehicle is guided so as to travel on a narrow road while avoiding contact with the obstacle; the invention therefore has the excellent effects of being dependable, highly reliable, and practical.

[Brief description of the drawings]

FIG. 1 is a functional block diagram of a vehicle driving assistance device according to a first embodiment of the present invention.

FIG. 2 is a schematic configuration diagram of the vehicle driving assistance device according to the first embodiment;

FIG. 3 is a flowchart of a narrow road guide control;

FIG. 4 is a flowchart of a two-dimensional map creation routine;

FIG. 5 is an explanatory diagram of a range for determining a narrow road;

FIG. 6 is an explanatory diagram of three-dimensional object position information around a vehicle according to the first embodiment;

FIG. 7 is an explanatory view when the previous three-dimensional object position information is moved.

FIG. 8 is an explanatory diagram showing an example of setting an ideal route on a narrow road ahead of the vehicle.

FIG. 9 is an explanatory diagram showing an example of display on a monitor according to the embodiment;

FIG. 10 is a functional block diagram of a vehicle driving support device according to a second embodiment of the present invention.

FIG. 11 is a schematic configuration diagram of the vehicle driving assistance device according to the second embodiment;

FIG. 12 is a flowchart of the narrow road guide control;

FIG. 13 is an explanatory diagram showing an example of display on a monitor according to the second embodiment;

[Explanation of symbols]

DESCRIPTION OF SYMBOLS: 1 Own vehicle; 2 Vehicle driving support device; 3 CCD camera (traveling environment detecting means); 4 Control device; 5 Vehicle speed sensor (traveling state detecting means); 6 Steering wheel angle sensor (traveling state detecting means); 7 Alarm device; 8 Status display unit (notifying means); 21 Image recognition unit (traveling environment detecting means); 22 Road shape/obstacle recognition unit (traveling environment detecting means); 23 Narrow road determination processing unit; 24 Alarm control unit; 25 Two-dimensional map creation unit (environment position information forming means); 26 Ideal route calculation unit (ideal route calculating means); 27 Expected position estimation unit (expected position estimating means); 28 Notification control unit (notifying means)

Continuation of front page: (51) Int.Cl.6 identification code FI: G08G 1/0969; H04N 7/18 J; H04N 7/18; G06F 15/62 380

Claims (5)

[Claims]
1. A vehicle driving support device comprising: traveling state detecting means for detecting a traveling state of an own vehicle; traveling environment detecting means for detecting a road shape and three-dimensional objects in a traveling direction of the own vehicle; environment position information forming means for forming position information of the environment around the own vehicle, including the traveling direction of the own vehicle, based on the traveling state, the road shape, and the three-dimensional object information; ideal route calculating means for calculating, when there is a narrow road in the traveling direction of the own vehicle, an ideal route for the own vehicle to enter the narrow road; expected position estimating means for estimating an expected position of the own vehicle after a set time based on the traveling state of the own vehicle; and notifying means for guiding the own vehicle along the narrow road based on the position information of the environment around the own vehicle formed by the environment position information forming means, the ideal route calculated by the ideal route calculating means, and the expected position of the own vehicle estimated by the expected position estimating means.
2. The vehicle driving support device according to claim 1, wherein the notifying means displays, on the position information of the environment around the own vehicle formed by the environment position information forming means, the ideal route calculated by the ideal route calculating means and the expected position of the own vehicle estimated by the expected position estimating means.
3. The vehicle driving support device according to claim 1 or 2, wherein the notifying means calculates, based on the ideal route calculated by the ideal route calculating means and the expected position estimated by the expected position estimating means, a deviation amount of the expected position of the own vehicle from the ideal route, and calculates and displays, in a predetermined manner, a speed correction amount and a steering angle correction amount that minimize the deviation amount.
4. The vehicle driving support device according to any one of claims 1 to 3, wherein the notifying means calculates, based on the ideal route calculated by the ideal route calculating means and the expected position estimated by the expected position estimating means, a deviation amount of the expected position of the own vehicle from the ideal route, calculates a speed correction amount and a steering angle correction amount that minimize the deviation amount, and outputs them as a predetermined voice to guide travel on the narrow road.
5. The vehicle driving support device according to claim 4, wherein the notifying means varies the timing of the voice output according to the traveling state of the own vehicle.
JP16906397A 1997-06-25 1997-06-25 Vehicle driving support device Expired - Lifetime JP3917241B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP16906397A JP3917241B2 (en) 1997-06-25 1997-06-25 Vehicle driving support device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP16906397A JP3917241B2 (en) 1997-06-25 1997-06-25 Vehicle driving support device

Publications (2)

Publication Number Publication Date
JPH1116097A true JPH1116097A (en) 1999-01-22
JP3917241B2 JP3917241B2 (en) 2007-05-23

Family

ID=15879659

Family Applications (1)

Application Number Title Priority Date Filing Date
JP16906397A Expired - Lifetime JP3917241B2 (en) 1997-06-25 1997-06-25 Vehicle driving support device

Country Status (1)

Country Link
JP (1) JP3917241B2 (en)

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001024527A1 (en) * 1999-09-30 2001-04-05 Kabushiki Kaisha Toyoda Jidoshokki Seisakusho Image conversion device for vehicle rearward-monitoring device
JP2001109999A (en) * 1999-10-13 2001-04-20 Fuji Heavy Ind Ltd Driving support device for vehicle
JP2003259356A (en) * 2002-03-05 2003-09-12 Nissan Motor Co Ltd Apparatus for monitoring surrounding of vehicle
WO2005014341A1 (en) * 2003-08-07 2005-02-17 Matsushita Electric Industrial Co., Ltd. Operation assisting system and operation assisting method
JP2007064804A (en) * 2005-08-31 2007-03-15 Clarion Co Ltd Obstacle detecting device for vehicle
DE102005062151A1 (en) * 2005-12-22 2007-07-05 Daimlerchrysler Ag Head-up display and camera process to guide a vehicle driver through a road constriction
JP2008049959A (en) * 2006-08-28 2008-03-06 Honda Motor Co Ltd Device for supporting contact avoidance of vehicle
DE102006041651A1 (en) * 2006-08-24 2008-03-13 Valeo Schalter Und Sensoren Gmbh Motor vehicle ability determining method for use in e.g. parking space, in roadway, involves detecting restricted gap in roadway, and defining vehicle image of vehicle, and comparing vehicle image with restricted gap
JP2008137442A (en) * 2006-11-30 2008-06-19 Toyota Motor Corp Traveling controller
JP2009262837A (en) * 2008-04-25 2009-11-12 Toyota Motor Corp Traveling control device for vehicle
JP2010149847A (en) * 2009-12-25 2010-07-08 Fujitsu Ten Ltd Driving assist device
JP2011245970A (en) * 2010-05-26 2011-12-08 Aisin Seiki Co Ltd Parking assist apparatus
JP2012192843A (en) * 2011-03-16 2012-10-11 Fuji Heavy Ind Ltd Driving support device for vehicle
JP2013203338A (en) * 2012-03-29 2013-10-07 Fuji Heavy Ind Ltd Travelling control device for hybrid vehicle
DE102012112395A1 (en) * 2012-12-17 2014-06-18 Deutsches Zentrum für Luft- und Raumfahrt e.V. Support device for supporting the lateral and longitudinal guides of e.g. motor car has representation device to optically, acoustically and haptically represent detected field border between outer hazard field and inner action field
US8854462B2 (en) 2006-08-24 2014-10-07 Valeo Vision Method of determining the passage of a vehicle through a gap
JP2015057688A (en) * 2013-08-12 2015-03-26 株式会社日本自動車部品総合研究所 Travel route generation apparatus
WO2016129301A1 (en) * 2015-02-10 2016-08-18 クラリオン株式会社 Entry possibility determining device for vehicle
US9731717B2 (en) 2014-10-27 2017-08-15 Hyundai Motor Company Driver assistance apparatus and method for operating the same
KR20170121562A (en) * 2016-04-25 2017-11-02 현대자동차주식회사 Navigation apparatus, vehicle and method for controlling vehicle
WO2018220912A1 (en) * 2017-06-02 2018-12-06 アイシン精機株式会社 Periphery monitoring device

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6985171B1 (en) 1999-09-30 2006-01-10 Kabushiki Kaisha Toyoda Jidoshokki Seisakusho Image conversion device for vehicle rearward-monitoring device
GB2361376A (en) * 1999-09-30 2001-10-17 Toyoda Automatic Loom Works Image conversion device for vehicle rearward-monitoring device
WO2001024527A1 (en) * 1999-09-30 2001-04-05 Kabushiki Kaisha Toyoda Jidoshokki Seisakusho Image conversion device for vehicle rearward-monitoring device
GB2361376B (en) * 1999-09-30 2004-07-28 Toyoda Automatic Loom Works Image conversion device for vehicle rearward-monitoring device
JP2001109999A (en) * 1999-10-13 2001-04-20 Fuji Heavy Ind Ltd Driving support device for vehicle
JP4563531B2 (en) * 1999-10-13 2010-10-13 富士重工業株式会社 Vehicle driving support device
JP2003259356A (en) * 2002-03-05 2003-09-12 Nissan Motor Co Ltd Apparatus for monitoring surrounding of vehicle
WO2005014341A1 (en) * 2003-08-07 2005-02-17 Matsushita Electric Industrial Co., Ltd. Operation assisting system and operation assisting method
US7756618B2 (en) 2003-08-07 2010-07-13 Panasonic Corporation Camera calibration and image adjusting for drive assisting system
JP2007064804A (en) * 2005-08-31 2007-03-15 Clarion Co Ltd Obstacle detecting device for vehicle
DE102005062151A1 (en) * 2005-12-22 2007-07-05 Daimlerchrysler Ag Head-up display and camera process to guide a vehicle driver through a road constriction
DE102005062151B4 (en) * 2005-12-22 2007-09-13 Daimlerchrysler Ag Method and device for assisting a driver in the passage of constrictions
DE102006041651A1 (en) * 2006-08-24 2008-03-13 Valeo Schalter Und Sensoren Gmbh Motor vehicle ability determining method for use in e.g. parking space, in roadway, involves detecting restricted gap in roadway, and defining vehicle image of vehicle, and comparing vehicle image with restricted gap
US8854462B2 (en) 2006-08-24 2014-10-07 Valeo Vision Method of determining the passage of a vehicle through a gap
JP2008049959A (en) * 2006-08-28 2008-03-06 Honda Motor Co Ltd Device for supporting contact avoidance of vehicle
JP2008137442A (en) * 2006-11-30 2008-06-19 Toyota Motor Corp Traveling controller
JP2009262837A (en) * 2008-04-25 2009-11-12 Toyota Motor Corp Traveling control device for vehicle
JP2010149847A (en) * 2009-12-25 2010-07-08 Fujitsu Ten Ltd Driving assist device
JP2011245970A (en) * 2010-05-26 2011-12-08 Aisin Seiki Co Ltd Parking assist apparatus
JP2012192843A (en) * 2011-03-16 2012-10-11 Fuji Heavy Ind Ltd Driving support device for vehicle
JP2013203338A (en) * 2012-03-29 2013-10-07 Fuji Heavy Ind Ltd Travelling control device for hybrid vehicle
DE102012112395A1 (en) * 2012-12-17 2014-06-18 Deutsches Zentrum für Luft- und Raumfahrt e.V. Support device for supporting the lateral and longitudinal guides of e.g. motor car has representation device to optically, acoustically and haptically represent detected field border between outer hazard field and inner action field
DE102012112395B4 (en) * 2012-12-17 2016-05-12 Deutsches Zentrum für Luft- und Raumfahrt e.V. Assistance system
JP2015057688A (en) * 2013-08-12 2015-03-26 株式会社日本自動車部品総合研究所 Travel route generation apparatus
US10407060B2 (en) 2014-10-27 2019-09-10 Hyundai Motor Company Driver assistance apparatus and method for operating the same
US9731717B2 (en) 2014-10-27 2017-08-15 Hyundai Motor Company Driver assistance apparatus and method for operating the same
WO2016129301A1 (en) * 2015-02-10 2016-08-18 クラリオン株式会社 Entry possibility determining device for vehicle
US10339396B2 (en) 2015-02-10 2019-07-02 Clarion Co., Ltd. Vehicle accessibility determination device
JP2016149594A (en) * 2015-02-10 2016-08-18 クラリオン株式会社 Vehicle entry propriety determination device
KR20170121562A (en) * 2016-04-25 2017-11-02 현대자동차주식회사 Navigation apparatus, vehicle and method for controlling vehicle
US10337881B2 (en) 2016-04-25 2019-07-02 Hyundai Motor Company Navigation device, vehicle, and method for controlling the vehicle
WO2018220912A1 (en) * 2017-06-02 2018-12-06 アイシン精機株式会社 Periphery monitoring device

Also Published As

Publication number Publication date
JP3917241B2 (en) 2007-05-23

Similar Documents

Publication Publication Date Title
JP6304086B2 (en) Automatic driving device
US10046803B2 (en) Vehicle control system
US9091558B2 (en) Autonomous driver assistance system and autonomous driving method thereof
US20170028985A1 (en) Parking assistance device
US9862416B2 (en) Automatic parking control device, and parking assistance device
US10005391B2 (en) Information presentation system
EP3088268B1 (en) Vehicle driving aid device and vehicle having same
CN105799700B (en) Avoid collision control system and control method
US8872919B2 (en) Vehicle surrounding monitoring device
US10179588B2 (en) Autonomous vehicle control system
EP2766237B1 (en) Device for assisting a driver driving a vehicle or for independently driving a vehicle
JP5620472B2 (en) Camera system for use in vehicle parking
JP5212748B2 (en) Parking assistance device
JP6269546B2 (en) Automatic driving device
JP5375752B2 (en) Vehicle driving support device
EP2011701B1 (en) Parking assistance device and parking assistance method
JP4933962B2 (en) Branch entry judgment device
JP3592043B2 (en) Intersection warning device
DE602004000990T2 (en) Driver assistance system for vehicles
US7190282B2 (en) Nose-view monitoring apparatus
JP5167051B2 (en) Vehicle driving support device
JP6323385B2 (en) Vehicle travel control device
EP2119602B1 (en) Parking assistance device and parking assistance method
JP4769625B2 (en) Parking assistance device and parking assistance method
JP5124875B2 (en) Vehicle travel support device, vehicle, vehicle travel support program

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20040601

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20060222

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20060322

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20060413

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20060627

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20060823

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20061024

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20061116

A911 Transfer of reconsideration by examiner before appeal (zenchi)

Free format text: JAPANESE INTERMEDIATE CODE: A911

Effective date: 20061222

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20070123

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20070208

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20100216

Year of fee payment: 3

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110216

Year of fee payment: 4

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120216

Year of fee payment: 5


FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130216

Year of fee payment: 6

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140216

Year of fee payment: 7

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

S531 Written request for registration of change of domicile

Free format text: JAPANESE INTERMEDIATE CODE: R313531

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

EXPY Cancellation because of completion of term