WO2000020257A1 - Driving operation assisting device and recording medium - Google Patents
Driving operation assisting device and recording medium
- Publication number
- WO2000020257A1 (PCT/JP1999/005509)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- vehicle
- driving operation
- assumed
- driving
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/28—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with an adjustable field of view
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/00—Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D15/00—Steering not otherwise provided for
- B62D15/02—Steering position indicators ; Steering position determination; Steering aids
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D15/00—Steering not otherwise provided for
- B62D15/02—Steering position indicators ; Steering position determination; Steering aids
- B62D15/027—Parking aids, e.g. instruction means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D15/00—Steering not otherwise provided for
- B62D15/02—Steering position indicators ; Steering position determination; Steering aids
- B62D15/027—Parking aids, e.g. instruction means
- B62D15/028—Guided parking by providing commands to the driver, e.g. acoustically or optically
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/05—Geographic models
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/102—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using 360 degree surveillance camera system
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/105—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/302—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing combining image information with GPS information or vehicle data, e.g. vehicle speed, gyro, steering angle data
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/303—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using joined images, e.g. multiple camera images
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/304—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images
- B60R2300/305—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images merging camera image with lines or icons
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/307—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing virtually distinguishing relevant parts of a scene from the background of the scene
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/60—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
- B60R2300/607—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective from a bird's eye viewpoint
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/806—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for aiding parking
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8086—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for vehicle path indication
Definitions
- the present invention relates to a driving operation assisting device that assists a driving operation of a vehicle, and a recording medium that stores a program that causes a computer to execute all or a part of the functions of each unit of the driving operation assisting device.
- a conventional, typical driving assist device uses a steering sensor that detects the steering angle of the steering wheel to predict, when the vehicle is reversing, the vehicle trajectory corresponding to that steering angle.
- the driver's operation is as follows. First, with the steering wheel fixed, move the vehicle to a place from which parking is possible. Next, at this location, while checking the vehicle movement trajectory predicted from the steering operation, find a steering angle at which the vehicle can be moved into the target parking space without further steering. Then, if the vehicle is reversed into the parking space while maintaining that steering angle, parking is in principle completed.
- the driver can easily enter the parking space while checking the combined image of the parking space, the situation around the space, and the route for guiding the vehicle to the parking space.
- however, with such a device, the driver could not intuitively find a place to which the vehicle could move.
Disclosure of the Invention
- the present invention, in view of the problems of the conventional driving operation assisting device, displays the vehicle motion that would result when a driver performs a predetermined series of driving operations, together with the surrounding situation, so that the driver can directly confirm the relationship between that vehicle motion and the surrounding situation. It is thereby intended to provide a device that reduces the driver's burden.
- the present invention is a driving operation assisting device comprising: surrounding situation imaging means that captures the surroundings of a vehicle using cameras to generate a surrounding situation image, and/or that stores the generated surrounding situation image; composite image generating means that generates a composite image; and display means that displays the composite image.
- the second invention (corresponding to the invention described in claim 26) is a recording medium characterized by storing a program for causing a computer to execute all or part of the functions of each means of the driving operation assisting device of the present invention.
- FIG. 1 is a block diagram showing a configuration of the driving assist device according to the first embodiment of the present invention.
- FIG. 2 is a plan view and an elevation view of the vehicle showing an example in which each camera of the imaging unit 101 of the driving assist device according to the first embodiment of the present invention is attached to the vehicle.
- FIG. 3 is an elevation view showing an example of a viewpoint of a virtual camera of the driving assist system according to the first embodiment of the present invention.
- FIG. 4 is a diagram illustrating an example of a surrounding situation image from the virtual camera of the driving assist device according to the first embodiment of the present invention.
- FIG. 5 is a diagram illustrating an example of assumed motion data of the driving assist system according to the first embodiment of the present invention.
- FIG. 6 is a diagram illustrating an example of assumed motion data of the driving operation assisting device according to the first embodiment of the present invention.
- FIG. 7 is a diagram illustrating an example of a composite image of the driving operation assisting device according to the first embodiment of the present invention.
- FIG. 8 is a diagram illustrating an example of a composite image of the driving operation assisting device according to the first embodiment of the present invention.
- FIG. 9 is a diagram showing the motion of the vehicle when performing parallel parking to the left.
- FIG. 10 is a diagram illustrating an example of a composite image of the driving operation assisting device according to the first embodiment of the present invention.
- FIG. 11 is a diagram showing an example of a variation of the assumed movement pattern stored in the assumed movement pattern storage means 108 of the driving assist system according to the first embodiment of the present invention.
- FIG. 12 is a diagram illustrating a modified example of a composite image of the driving operation assisting device according to the first embodiment of the present invention.
- FIG. 13 shows a configuration of a driving assist system according to a second embodiment of the present invention.
- FIG. 14 is a diagram illustrating an example of a composite image of the driving assist device according to the second embodiment of the present invention.
- FIG. 15 is a block diagram showing the configuration of the driving assist system according to the third embodiment of the present invention.
- FIG. 16 is a diagram illustrating an example of a composite image of the driving operation assisting device according to the third embodiment of the present invention.
- FIG. 17 is a block diagram illustrating a configuration of a driving assist system according to the fourth embodiment of the present invention.
- FIG. 18 is a block diagram showing a configuration of the driving assist system according to the fifth embodiment of the present invention.
- FIG. 19 is a diagram illustrating an example of a composite image of the driving operation assisting device according to the fifth embodiment of the present invention.
- FIG. 20 is a block diagram showing a configuration of the driving assist system according to the sixth embodiment of the present invention.
- FIG. 21 is a diagram illustrating an example of a composite image of the driving operation assisting device according to the sixth embodiment of the present invention.
- FIG. 22 is a block diagram showing a configuration of a modified example of the driving assist system according to the sixth embodiment of the present invention.
- FIG. 23 is a block diagram showing a configuration of a driving assist system according to a seventh embodiment of the present invention.
- FIG. 24 is a conceptual diagram showing an example of the mapping table stored in the mapping table 2302 of the driving assist system according to the seventh embodiment of the present invention.
- FIG. 25 is a block diagram showing a configuration of the driving assist system according to the eighth embodiment of the present invention.
- FIG. 26 is a diagram illustrating an example of a composite image of the driving operation assisting device according to the eighth embodiment of the present invention.
- FIG. 27 is a block diagram showing a configuration of the driving assist system according to the ninth embodiment of the present invention.
- FIG. 28 is a diagram illustrating an example of a composite image of the driving operation assisting device according to the ninth embodiment of the present invention.
- FIG. 29 is a block diagram showing the configuration of the driving assist system in the tenth embodiment of the present invention.
- FIG. 30 is a diagram illustrating an example of a composite image of the driving assist device according to the tenth embodiment of the present invention.
- FIG. 31 is a block diagram showing a configuration of the driving assist device in the eleventh embodiment of the present invention.
- FIG. 32 is a diagram showing an example of a composite image of the driving assist system according to the eleventh embodiment of the present invention.
- FIG. 33 is a diagram illustrating an example of a composite image of the driving operation assisting device according to the eleventh embodiment of the present invention.
- FIG. 34 is a diagram illustrating an example of a composite image of the driving operation assisting device according to the eleventh embodiment of the present invention.
- FIG. 35 is a diagram illustrating an example of a composite image of the driving assist system according to the eleventh embodiment of the present invention.
- FIG. 36 is a graph for explaining a contact risk evaluation function of the driving assist system according to the eleventh embodiment of the present invention.
- FIG. 37 is a diagram illustrating an example of a composite image of the driving operation assisting device according to the eleventh embodiment of the present invention.
- FIG. 38 is a diagram illustrating an example of a contact risk evaluation function of the driving assist system according to the eleventh embodiment of the present invention.
- FIG. 39 is a diagram illustrating an example of assumed motion data in the driving assist system according to the eleventh embodiment of the present invention.
- FIG. 40 is a diagram illustrating an example of a composite image of the driving operation assisting device according to the eleventh embodiment of the present invention.
- FIG. 41 is a diagram illustrating an example of a composite image of the driving operation assisting device according to the eleventh embodiment of the present invention.
- FIG. 42 is a block diagram showing a configuration of the driving assist device according to the twelfth embodiment of the present invention.
- FIG. 43 is a diagram illustrating an example of a composite image of the driving assist device according to the twelfth embodiment of the present invention.
- FIG. 44 is a block diagram showing a configuration of the driving assist system according to the twelfth embodiment of the present invention.
- FIG. 45 is a diagram illustrating an example of a composite image of the driving operation assisting device according to the twelfth embodiment of the present invention.
- FIG. 46 is a diagram illustrating an example of a composite image of the driving assist device according to the twelfth embodiment of the present invention.
(Explanation of reference numerals)
- FIG. 1 is a block diagram showing a configuration of the driving assist device according to the first embodiment of the present invention.
- the driving operation assisting device according to the present embodiment is mainly intended to assist driving operations such as garage parking and parallel parking.
- the driving assist system comprises: an imaging unit 101 composed of N cameras (cameras 1 to N); a camera parameter table 103 that stores the camera parameters characterizing each camera; spatial reconstruction means 104 that, based on the camera parameters, creates spatial data in which each pixel constituting the output image from each camera is associated with a point in a three-dimensional space; a spatial data buffer 105 that temporarily stores the spatial data; viewpoint conversion means 106 that generates, as the surrounding situation image, an image viewed from a predetermined viewpoint by referring to the spatial data; assumed movement pattern storage means 108 that holds assumed movement data including an assumed movement pattern; superimposing means 102 that superimposes the assumed movement pattern on the surrounding situation image to generate a composite image; and display means 107 that displays the composite image.
- the combination of the imaging unit 101, the camera parameter table 103, the spatial reconstruction means 104, and the viewpoint conversion means 106 corresponds to the surrounding situation imaging means of the present invention.
- the superimposing means 102 corresponds to the composite image generating means of the present invention.
- FIG. 2 is a plan view and an elevation view of the vehicle showing an example in which each camera of the imaging unit 101 is attached to the vehicle.
- in this example N = 6, and six cameras 201 to 206 are arranged on the roof of the vehicle.
- the six cameras 201 to 206 are arranged so that part of the imaging range of each camera overlaps with part of the imaging range of the other cameras, and so that no blind spot is generated in the plane.
- the camera parameter table 103 stores the camera parameters of each of the cameras (parameters representing camera characteristics such as a camera mounting position, a camera mounting angle, a camera lens distortion correction value, and a camera lens focal length).
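One way to sketch the camera parameter table 103 is as one record per camera holding the characteristics the text lists (mounting position, mounting angle, lens distortion correction value, focal length). The field names and values below are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass


@dataclass
class CameraParams:
    """One entry of the camera parameter table (field names are illustrative)."""
    position: tuple       # mounting position (x, y, z) in the vehicle frame, metres
    angle: tuple          # mounting angles (pan, tilt, roll) in radians
    focal_length: float   # lens focal length, expressed in pixels
    distortion_k1: float  # first radial lens-distortion correction coefficient


# A table for N cameras, keyed by camera number (values are made up for the sketch).
camera_parameter_table = {
    1: CameraParams(position=(0.0, 1.2, 1.5), angle=(0.0, -0.5, 0.0),
                    focal_length=400.0, distortion_k1=-0.1),
}
```

Keeping the parameters in one table like this lets the spatial reconstruction step look up each camera's geometry by camera number.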
- the spatial reconstruction means 104 creates spatial data in which each pixel constituting the output image from each camera is associated with a point in a three-dimensional space referenced to the vehicle, based on the camera parameters.
- the spatial data buffer 105 temporarily stores the spatial data, and the viewpoint conversion means 106 generates the surrounding situation image by composing each pixel of an image viewed from an arbitrary viewpoint, for example the viewpoint of the virtual camera 301 shown in FIG. 3, with reference to the spatial data.
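The pixel-to-3D-point association can be illustrated, under simplifying assumptions the patent does not state (a pinhole camera looking straight down onto a flat ground plane), by mapping each image pixel to the ground point it sees. This is a hypothetical sketch of the idea, not the patent's actual reconstruction method.

```python
def pixel_to_ground(u, v, cam_height, focal_length, cx, cy):
    """Map image pixel (u, v) of a camera looking straight down from
    cam_height metres onto a point (x, y) on a flat ground plane.
    (cx, cy) is the principal point; this is similar-triangle projection."""
    x = (u - cx) * cam_height / focal_length
    y = (v - cy) * cam_height / focal_length
    return (x, y)


# A pixel 100 px right of the image centre, camera 2 m up, 400 px focal length:
ground_point = pixel_to_ground(500, 300, 2.0, 400.0, 400, 300)
```

With the inverse of this mapping, the viewpoint conversion means can render the same ground points as seen from any chosen virtual camera.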
- FIG. 4 shows an example of the surrounding situation image from the viewpoint of the virtual camera 301 shown in FIG.
- in this example the vehicle is about to perform parallel parking, and two parked vehicles appear in the surrounding situation image as an obstacle 401 and an obstacle 402.
- the procedure from when the superimposing means 102 generates the composite image of the present invention until the display means 107 displays it will be described.
- the assumed motion pattern storage means 108 stores, as the assumed motion data of the present invention, an assumed motion pattern, which is image data representing the motion of the vehicle when a typical series of driving operations is performed, together with time-series data representing the relationship between the moving distance of the vehicle (the amount of rotation of the tires) and the steering angle of the steering wheel.
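Such time-series data (distance increment vs. steering angle) can be turned into a vehicle trajectory with a standard kinematic bicycle model. The patent does not give this computation, so the sketch below, including the wheelbase value, is only an assumed illustration.

```python
import math


def integrate_trajectory(steps, wheelbase=2.5):
    """Integrate (distance_increment, steering_angle) samples into rear-axle
    poses (x, y, heading) using a kinematic bicycle model (Euler steps)."""
    x = y = heading = 0.0
    poses = [(x, y, heading)]
    for ds, steer in steps:
        heading += ds * math.tan(steer) / wheelbase  # yaw change over the step
        x += ds * math.cos(heading)
        y += ds * math.sin(heading)
        poses.append((x, y, heading))
    return poses


# Straight reverse of 3 m in 0.5 m steps (negative ds means reversing).
path = integrate_trajectory([(-0.5, 0.0)] * 6)
```

Sampling the vehicle outline (or tire positions) along these poses would yield exactly the kind of trajectory image the assumed motion pattern represents.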
- Fig. 5 shows the assumed motion data for a parallel parking operation to the left, and Fig. 6 shows the assumed motion data for a garage parking operation to the right.
- the assumed motion pattern is shown in (a) of each figure, and corresponds to the case where the driving operation is performed according to the time-series data shown in (b) of each figure. It consists of the operation start positions 501 and 601 (corresponding to the assumed motion start area of the present invention), the operation end positions 502 and 602 (corresponding to the assumed motion end area of the present invention), and the tire trajectories 503 and 603 (corresponding to the image data representing the trajectory of the tires of the vehicle of the present invention).
- the driver selects one of the assumed movement patterns stored in the assumed movement pattern storage means 108 by the pattern selection means (not shown).
- the superimposing means 102 superimposes the selected assumed motion pattern (for example, Fig. 5 (a)) on the surrounding situation image (for example, Fig. 4) generated by the viewpoint conversion means 106 and synthesizes it.
- the composite image of the present invention is generated, and the display means 107 displays the composite image.
- at this time, the operation end position 502 indicates where the vehicle will end up, that is, the parking position, if the driving operation corresponding to the assumed motion pattern is started from the current position.
- FIG. 7 shows an example of a composite image obtained by superimposing the assumed motion pattern shown in FIG. 5, and FIG. 8 shows an example of a composite image obtained by superimposing the assumed motion pattern shown in FIG. 6.
- the driver moves the vehicle to the start position 701 (801) while checking that the obstacles 401 and 402 (803 and 804), the parking position 702 (802), the tire trajectory 503 (603), and the start position 701 (801) do not interfere with one another, and from there starts the series of driving operations according to the time-series data to park at the parking position 702 (802). In this way, parallel parking to the left (garage parking to the right) can be performed.
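The superimposition performed by the superimposing means 102 amounts to blending the assumed motion pattern image over the surrounding situation image. The minimal alpha-blend sketch below uses plain nested lists of greyscale values; a real implementation would use an image library, and the transparent-pixel convention is an assumption for the sketch.

```python
def superimpose(background, overlay, alpha=0.5):
    """Blend an overlay image onto a background of the same size.
    Pixels are greyscale values 0-255; overlay pixels of None are transparent."""
    out = []
    for bg_row, ov_row in zip(background, overlay):
        row = []
        for bg, ov in zip(bg_row, ov_row):
            if ov is None:
                row.append(bg)  # keep the surrounding situation image
            else:
                row.append(round((1 - alpha) * bg + alpha * ov))
        out.append(row)
    return out


scene = [[100, 100], [100, 100]]
pattern = [[None, 255], [None, None]]  # one pattern pixel, rest transparent
composite = superimpose(scene, pattern)
```

Only the pixels covered by the pattern (start position, trajectory, end position) change; everywhere else the camera-derived surrounding image shows through.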
- FIG. 9 is a diagram showing the motion of the vehicle when performing parallel parking to the left.
- in order to park the vehicle at the target parking position 902, the driver first applies the assumed motion pattern for parallel parking to the left (FIG. 5 (a)): the operation start position 501 obtained when the operation end position 502 is matched with the target parking position 902 becomes the target start position 903, and the vehicle located at the current position 901 must be moved to this target start position 903.
- the relative positional relationship between the operation end position 502 and the operation start position 501 in FIG. 5 (a) corresponds to the case where the driving operation is performed according to the time-series data in FIG. 5 (b).
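Because the relative pose between the operation start and end positions is fixed for a given pattern, the target start position 903 can be computed by applying the inverse of that relative motion to the target parking pose. The 2-D rigid-transform sketch below is a hypothetical illustration of this geometry, not a procedure given in the patent.

```python
import math


def start_pose_from_parking(parking_pose, rel_motion):
    """Given the target parking pose (x, y, theta) and the pattern's relative
    motion from start to end (dx, dy, dtheta, expressed in the start frame),
    return the start pose from which executing the pattern ends exactly at
    parking_pose."""
    px, py, ptheta = parking_pose
    dx, dy, dtheta = rel_motion
    theta0 = ptheta - dtheta  # heading the vehicle must have at the start
    x0 = px - (dx * math.cos(theta0) - dy * math.sin(theta0))
    y0 = py - (dx * math.sin(theta0) + dy * math.cos(theta0))
    return (x0, y0, theta0)


# If the pattern moves the car 5 m forward with no turn, the start pose
# sits 5 m behind the parking pose.
start = start_pose_from_parking((5.0, 0.0, 0.0), (5.0, 0.0, 0.0))
```

Displaying the pattern anchored at the current vehicle pose, as the embodiment does, lets the driver solve this alignment visually instead of numerically.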
- fine adjustment is possible by adjusting the steering wheel operation and the like during the maneuver.
- conventionally, the driver estimates the positions of the obstacles 401 and 402 and the target parking position 902 from what can be seen directly from inside the car or through the mirrors, and then moves the vehicle to the target start position 903. This task of estimating positions from the directly visible scene requires skill. In addition, when the size of the vehicle or the position of the mirrors changes, it is difficult for the driver to adapt immediately.
- in the present embodiment, by contrast, the assumed motion pattern as shown in Fig. 5 (a) is superimposed on the surrounding situation image from the viewpoint of the virtual camera, and a composite image as shown in Fig. 7 is generated and displayed to the driver.
- in the composite image, the operation start position 501 is displayed so as to coincide with the current position 901, and the operation end position 502 is displayed as the parking position 1001. When the vehicle reaches a current position 901 at which the parking position 1001 matches the target parking position 902, the movement to the target start position 903 is complete.
- at this position it can be confirmed at a glance that the tire trajectory does not overlap the obstacles 401 and 402 and that the parking position 1001 is a place suitable for parking, so the parking operation should be started from here.
- the operation start position, the operation end position, and the tire trajectory of the assumed motion pattern are unique to each vehicle; for example, they differ between a small vehicle and a large vehicle. This can be handled by storing an assumed movement pattern for each vehicle in the assumed movement pattern storage means 108 in FIG. 1. Therefore, even if the driver changes vehicles, the driver can perform the driving operation while watching the relationship between the assumed motion pattern corresponding to that vehicle and the surrounding obstacles.
- the driver can thus operate the vehicle while observing the relationship between the assumed motion pattern corresponding to the new vehicle and the surrounding obstacles, in the same way as before changing vehicles, which greatly reduces the driver's re-learning burden when changing vehicles.
- FIG. 11 shows an example of a variation of the assumed movement pattern stored in the assumed movement pattern storage means 108 in FIG.
- the driver selects one of these using the pattern selecting means (not shown).
- the area to be displayed as a composite image is also determined, as shown by the outer frame of each assumed movement pattern 1101 to 1104 in FIG. 11. That is, with the operation start position at the current vehicle position, the rectangular area including the tire trajectory and the operation end position becomes the composite image area.
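The rectangular display area described above can be derived as the axis-aligned bounding box of the start position, the trajectory points, and the end position. A minimal sketch (the optional padding parameter is an assumption, not from the patent):

```python
def composite_area(points, pad=0.0):
    """Axis-aligned bounding rectangle (xmin, ymin, xmax, ymax) covering all
    (x, y) points of the start position, tire trajectory and end position,
    expanded outward by an optional padding."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs) - pad, min(ys) - pad, max(xs) + pad, max(ys) + pad)


# Start, one trajectory sample, and end position:
area = composite_area([(0.0, 0.0), (2.0, 3.0), (1.0, -1.0)])
```

The virtual camera viewpoint can then be chosen so that exactly this rectangle fills the display.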
- since the vehicle body itself is generally not captured by the on-board cameras, CG data of the vehicle, actual vehicle image data, or the like may be retained and superimposed on the composite image in the same way as the trajectory data.
- the assumed motion pattern of the present invention has been described as including the operation start position (the assumed motion start region of the present invention), the operation end position (the assumed motion end region of the present invention), and image data representing the tire trajectory (the trajectory of the tires of the vehicle of the present invention), but the present invention is not limited to this. For example, instead of, or together with, the tire trajectory, the pattern may include the trajectory swept by the planar projection of the vehicle itself (image data representing the moving area of the vehicle of the present invention).
- the assumed motion pattern of the present invention only needs to be image data representing the motion of the vehicle when a predetermined series of driving operations are performed on the vehicle in advance.
- a margin line 1201 may be displayed, arranged a predetermined amount (for example, 50 cm) outside the outer edge of the moving area.
- the surrounding situation image of the present invention has been described as being a combination of image data captured in real time by the imaging unit 101, but the present invention is not limited to this.
- FIG. 13 is a block diagram showing a configuration of the driving assist system according to the second embodiment of the present invention.
- the driving operation assisting device according to the present embodiment is also mainly intended to assist driving operations such as entering a garage and parallel parking. Therefore, in the present embodiment, components that are not particularly described are the same as those in the first embodiment, and components assigned the same reference numerals as in the first embodiment have, unless otherwise described, the same functions as in the first embodiment. Unless otherwise specified, the modifications described in the first embodiment also apply to the present embodiment.
- the configuration of the driving operation assisting device according to the present embodiment differs from that of the first embodiment in that it comprises start detecting means 1301, integrating means 1302, and space conversion means 1303.
- the start detection means 1301 receives a gear signal indicating whether the vehicle is moving forward or backward and a steering wheel turning signal indicating the steering angle of the front wheels. When the gear signal indicates reverse and the steering angle of the front wheels becomes larger than a certain value, it determines that the driving operation (parking operation) corresponding to the assumed movement pattern stored in the assumed movement pattern storage means 108 has been started. It corresponds to the operation start detecting means of the present invention.
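The detection rule just described can be sketched in a few lines. The signal names and the threshold value below are illustrative assumptions, not values taken from the patent:

```python
# Illustrative sketch of the start-detection rule: the parking operation
# is considered started when the gear is in reverse and the front-wheel
# steering angle exceeds a threshold ("a certain value" in the text).

STEERING_THRESHOLD_DEG = 15.0  # assumed threshold, for illustration only

def operation_started(gear_signal, steering_angle_deg):
    """True once the reverse-gear + large-steering-angle condition holds."""
    return (gear_signal == "reverse"
            and abs(steering_angle_deg) > STEERING_THRESHOLD_DEG)

if __name__ == "__main__":
    print(operation_started("reverse", 30.0))  # True
    print(operation_started("forward", 30.0))  # False
    print(operation_started("reverse", 5.0))   # False
```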
- the integrating means 1302 integrates the steering wheel turning angle and the rear wheel rotation speed to calculate the change in the spatial position of the vehicle from the start of the driving operation (parking operation) to the present time, and corresponds to the moving position calculating means of the present invention.
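The integration performed by the integrating means 1302 amounts to dead reckoning of the vehicle pose. A minimal sketch follows, using a kinematic bicycle model; the wheelbase, time step, and function names are illustrative assumptions rather than details from the disclosure:

```python
import math

# Illustrative dead-reckoning sketch: advance the vehicle pose
# (x, y, heading) one time step from the rear wheel speed and the
# front-wheel steering angle, using a simple kinematic bicycle model.

WHEELBASE_M = 2.7  # assumed wheelbase, for illustration only

def integrate_pose(pose, rear_wheel_speed_mps, steering_angle_rad, dt):
    """Return the pose after one integration step of length dt."""
    x, y, th = pose
    x += rear_wheel_speed_mps * math.cos(th) * dt
    y += rear_wheel_speed_mps * math.sin(th) * dt
    th += rear_wheel_speed_mps * math.tan(steering_angle_rad) / WHEELBASE_M * dt
    return (x, y, th)

if __name__ == "__main__":
    pose = (0.0, 0.0, 0.0)
    for _ in range(10):  # 1 s of straight driving at 1 m/s, dt = 0.1 s
        pose = integrate_pose(pose, 1.0, 0.0, 0.1)
    print(round(pose[0], 6), round(pose[1], 6))  # 1.0 0.0
```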
- the space conversion means 1303 moves the assumed movement pattern in accordance with this change in spatial position, and the combination of the superimposing means 102 and the space conversion means 1303 corresponds to the composite image generating means of the present invention.
- the procedure from the capture of image data by the imaging unit 101 to the generation of the surrounding situation image of the present invention is the same as that described in the first embodiment. Also, in the procedure from the superimposing means 102 generating the composite image of the present invention to its display by the display means 107, the steps up to the start of the actual driving operation corresponding to the assumed motion pattern are the same as those described in the first embodiment.
- when the driver starts the driving operation corresponding to parallel parking to the left, the start detection means 1301 determines, because the gear signal is in the reverse state and the steering wheel turning signal indicates that the steering angle of the front wheels is larger than a certain value, that the driving operation (parking operation) corresponding to parallel parking to the left has been started, and notifies the integrating means 1302 accordingly. Thereafter, the steering wheel turning angle signal and the rear wheel speed signal are input to the integrating means 1302.
- the integrating means 1302 integrates the input steering wheel turning angle signal and rear wheel speed signal after the start of the driving operation and, as shown in FIG. 14(a), calculates the positional relationship of the parking operation start position 1401 with respect to the current vehicle position 1402.
- the space conversion means 1303 moves the assumed movement pattern 1403 corresponding to parallel parking to the left (see FIG. 14(b)) so that its operation start position (501) coincides with the parking operation start position 1401.
- that is, the space conversion means 1303 spatially fixes the assumed movement pattern 1403, after the start of the driving operation, at the position determined at the start of the parking operation.
- the superimposing means 102 superimposes the assumed motion pattern 1403, spatially fixed at the position determined at the start of the parking operation, and the current vehicle position 1402 on the surrounding situation image, thereby generating the composite image of the present invention.
- the display means 107 displays this composite image. Since the surrounding situation images of the obstacles 401, 402, and so on are of course fixed in space, the positional relationship between the surrounding situation image and the assumed motion pattern 1403 is fixed in this composite image. Also, since the composite image is an image viewed from a viewpoint fixed on the vehicle, in FIG. 14(c), when the vehicle moves, only the current vehicle position 1402 is fixed on the screen, and the surrounding situation image and the assumed motion pattern 1403 are displayed as moving relative to it.
- that is, the surrounding situation image from the viewpoint of the virtual camera moves according to the actual movement of the vehicle, and the superimposed assumed motion pattern 1403 also moves in accordance with the movement of the vehicle calculated by the integrating means 1302, so the two move together. Since the driver only has to operate the steering wheel along the tire trajectory of the assumed motion pattern displayed at each point in time, the vehicle can be operated more easily and safely.
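Redrawing a spatially fixed pattern relative to the moving vehicle is, at bottom, a 2-D rigid transform from the world frame into the vehicle frame. A minimal sketch, with illustrative names and values, follows:

```python
import math

# Illustrative sketch of the space conversion step: a point of the
# assumed movement pattern, fixed in world coordinates, is re-expressed
# in the frame of the moving vehicle before being drawn.

def world_to_vehicle(point, vehicle_pose):
    """Express a world-frame 2-D point in the vehicle frame (x, y, heading)."""
    px, py = point
    vx, vy, th = vehicle_pose
    dx, dy = px - vx, py - vy
    c, s = math.cos(-th), math.sin(-th)  # rotate by the inverse heading
    return (c * dx - s * dy, s * dx + c * dy)

if __name__ == "__main__":
    # Vehicle at (2, 0) heading 90 degrees; a world point 3 m "north" of it
    # lies 3 m straight ahead in the vehicle frame.
    x, y = world_to_vehicle((2.0, 3.0), (2.0, 0.0, math.pi / 2))
    print(round(x, 6), round(y, 6))
```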
- FIG. 15 is a block diagram showing the configuration of the driving assist system according to the third embodiment of the present invention.
- the driving operation assisting device according to the present embodiment is also mainly intended to assist driving operations such as entering a garage and parallel parking. Therefore, in the present embodiment, components that are not particularly described are the same as those in the first embodiment, and components assigned the same reference numerals as in the first embodiment have, unless otherwise described, the same functions as in the first embodiment. Unless otherwise specified, the modifications described in the first embodiment also apply to the present embodiment.
- the configuration of the driving operation assisting device according to the present embodiment differs from that of the first embodiment in that it comprises start detecting means 1501, image tracking means 1502, and space conversion means 1503.
- except that the start detection means 1501 does not output the steering wheel turning angle signal and the rear wheel speed signal to other means, it has the same function as the start detection means 1301 shown in FIG. 13.
- the image tracking means 1502 stores, at the time when the driving operation (parking operation) is started, the position information of all or part (for example, the operation end position) of the image data of the assumed movement pattern relative to all or part (for example, obstacles) of the image data of the surrounding situation image on the composite image, and corresponds to the position information storage means of the present invention.
- the space conversion means 1503 moves the assumed movement pattern in accordance with this position information, and the combination of the superimposing means 102 and the space conversion means 1503 corresponds to the composite image generating means of the present invention.
- the procedure from generation of image data captured by the imaging unit 101 to generation of the surrounding situation image of the present invention is the same as that described in the first embodiment.
- in the procedure from the superimposing means 102 generating the composite image of the present invention to its display by the display means 107, the steps up to the start of the actual driving operation corresponding to the assumed motion pattern are the same as those described in the first embodiment.
- when the driver starts the driving operation corresponding to parallel parking to the left, the start detection means 1501 determines, because the gear signal is in the reverse state and the steering wheel turning signal indicates that the steering angle of the front wheels is larger than a predetermined value, that the driving operation (parking operation) corresponding to parallel parking to the left has been started, and notifies the image tracking means 1502 accordingly.
- the image tracking means 1502 then obtains, via the spatial data buffer 105, from the image data of the surrounding situation image on the composite image at this time (FIG. 16(a)), the image data of the end-position surrounding image 1603, which includes the relevant part of the obstacle 402 and the parking operation end position 1602.
- thereafter, the image tracking means 1502 finds the relevant part of the obstacle 402 in the surrounding situation image (obtained via the spatial data buffer 105) at each point in time and, by making the relevant part in the end-position surrounding image 1603 coincide with it, determines the positional relationship between the parking operation end position 1602 and the surrounding situation image at that time. In other words, the image tracking means 1502 tracks the positional relationship between the parking operation end position 1602 and the surrounding situation image at each point in time.
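The tracking step above can be sketched as a minimal template match: the stored image patch of the obstacle is located in each new surrounding situation image by exhaustive sum-of-squared-differences search. This pure-Python sketch assumes grayscale images as nested lists; a practical implementation would use a faster, more robust matcher:

```python
# Illustrative sketch of the image tracking step: find where a stored
# template (a part of an obstacle) best matches the current image, by
# brute-force sum-of-squared-differences (SSD) over all positions.

def find_template(image, template):
    """Return (row, col) of the best SSD match of template in image."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best, best_pos = None, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            ssd = sum((image[r + i][c + j] - template[i][j]) ** 2
                      for i in range(th) for j in range(tw))
            if best is None or ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

if __name__ == "__main__":
    img = [[0.0] * 8 for _ in range(8)]
    for r in (3, 4):
        for c in (4, 5):
            img[r][c] = 1.0            # a small bright "obstacle corner"
    tpl = [[1.0, 1.0], [1.0, 1.0]]
    print(find_template(img, tpl))     # (3, 4)
```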
- at the same time, the space conversion means 1503 moves the assumed motion pattern corresponding to parallel parking to the left so that its operation end position (502 in FIG. 5) coincides with the parking operation end position 1602. In other words, the space conversion means 1503 spatially fixes the assumed motion pattern, after the start of the driving operation, at the position determined at the start of the parking operation.
- the superimposing means 102 superimposes the assumed motion pattern 1605, spatially fixed at the position determined at the start of the parking operation, and the current vehicle position 1604 on the surrounding situation image, thereby generating the composite image of the present invention.
- the display means 107 displays this composite image. Since the surrounding situation images of the obstacles 401, 402, and so on are of course fixed in space, the positional relationship between the surrounding situation image and the assumed motion pattern 1605 is fixed in this composite image.
- also, since the composite image is an image viewed from a viewpoint fixed on the vehicle, in FIG. 16(c), when the vehicle moves, only the current position of the vehicle is fixed on the screen, and the surrounding situation image and the assumed movement pattern 1605 are displayed as moving relative to it. That is, if the driving operation assisting device according to the present embodiment performs this procedure under the same conditions as the device according to the second embodiment, the composite image shown in FIG. 16(c) is the same as the composite image shown in FIG. 14(c).
- that is, the surrounding situation image from the viewpoint of the virtual camera moves in accordance with the actual vehicle movement, and the superimposed assumed motion pattern 1605 also moves in accordance with the movement of the vehicle, so the two move in unison. The driver only has to operate the steering wheel along the trajectory data of the assumed motion pattern displayed at each point in time, so that simpler and safer vehicle operation is possible.
- FIG. 17 shows the configuration of the driving assist system according to the fourth embodiment of the present invention.
- the driving operation assisting device according to the present embodiment is also mainly intended to assist driving operations such as entering a garage and parallel parking. Therefore, in the present embodiment, components that are not particularly described are the same as those in the first embodiment, and components assigned the same reference numerals as in the first embodiment have, unless otherwise described, the same functions as in the first embodiment. Unless otherwise specified, the modifications described in the first embodiment also apply to the present embodiment.
- the configuration of the driving operation assisting device in the present embodiment differs from that in the first embodiment in that it comprises start input means 1701, driving control means 1702, integrating means 1703, and space conversion means 1704.
- the start input means 1701 is for the driver to input an instruction to start the actual driving operation (parking operation) corresponding to the assumed movement pattern, and corresponds to the operation start detecting means of the present invention.
- the driving control means 1702 automatically controls the driving of the vehicle by controlling the steering wheel turning angle and the rear wheel rotation speed in accordance with the time-series data (for example, FIG. 5(b)) corresponding to the assumed motion pattern, and corresponds to the driving control means of the present invention.
- the integrating means 1703 integrates the steering wheel turning angle and the rear wheel speed to calculate the change in the spatial position of the vehicle from the start of the driving operation (parking operation) to the present time, and corresponds to the moving position calculating means of the present invention. That is, it has the same function as the integrating means 1302 in FIG. 13 described in the second embodiment.
- the space conversion means 1704 moves the assumed movement pattern in accordance with this change in spatial position, and the combination of the superimposing means 102 and the space conversion means 1704 corresponds to the composite image generating means of the present invention. That is, the space conversion means 1704 has the same function as the space conversion means 1303 shown in FIG. 13 described in the second embodiment.
- the procedure until the surrounding situation image of the present invention is generated from the image data captured by the imaging unit 101 is the same as that described in the first embodiment. Further, in the procedure from the superimposing means 102 generating the composite image of the present invention to its display by the display means 107, the steps up to the start of the actual driving operation corresponding to the assumed motion pattern are the same as those described in the first embodiment.
- the procedure by which, from the point in time when the actual driving operation corresponding to the assumed motion pattern is started, the superimposing means 102 generates the composite image of the present invention and the display means 107 displays it will be described below, taking parallel parking on the left side as an example.
- before starting the parking operation, the driver positions the vehicle at a position suitable for starting the parking operation while looking at the composite image displayed on the display means 107, and then inputs an instruction to start the parking operation via the start input means 1701.
- the start input means 1701 informs the driving control means 1702 and the integrating means 1703, via the assumed motion pattern storage means 108, that the driving operation start instruction has been input.
- the driving control means 1702 generates a steering wheel turning angle control signal and a rear wheel speed control signal in accordance with the time-series data (FIG. 5(b)) corresponding to the assumed motion pattern, and controls the steering wheel control system and the rear wheel drive system, thereby automatically controlling the operation of the vehicle.
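The automatic control just described amounts to replaying the stored time-series data as actuator commands. A minimal sketch follows; the record format (time, steering angle, rear wheel speed) and the callback names are illustrative assumptions:

```python
# Illustrative sketch of the driving control step: feed the time-series
# data associated with the assumed motion pattern to the steering and
# rear-wheel-drive actuators, then stop the vehicle at the end.

def replay_commands(time_series, send_steering, send_speed):
    """Replay (t, steering, speed) records, stopping when exhausted."""
    for t, steering, speed in time_series:
        send_steering(steering)
        send_speed(speed)
    send_speed(0.0)  # stop the vehicle: the parking operation has ended

if __name__ == "__main__":
    log = []
    series = [(0.0, 0.1, -0.5), (0.5, 0.3, -0.5), (1.0, 0.0, -0.2)]
    replay_commands(series,
                    lambda s: log.append(("steer", s)),
                    lambda v: log.append(("speed", v)))
    print(log[-1])  # ('speed', 0.0)
```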
- at the same time, the integrating means 1703 integrates the steering wheel turning angle signal and the rear wheel rotation speed signal and, as shown in FIG. 14(a), calculates the positional relationship of the parking operation start position 1401 with respect to the current vehicle position 1402.
- at the same time, the space conversion means 1704 moves the assumed movement pattern 1403 corresponding to parallel parking to the left so that its operation start position (501 in FIG. 5) coincides with the parking operation start position 1401.
- that is, the space conversion means 1704 spatially fixes the assumed movement pattern 1403, after the start of the driving operation, at the position determined at the start of the parking operation.
- the subsequent procedure in which the superimposing means 102 generates a composite image and the display means 107 displays the composite image is the same as that described in the second embodiment.
- the driving control unit 1702 stops the vehicle according to the time-series data, and the parking operation ends.
- in the present embodiment, the effect that the steering wheel operation and the like are performed automatically is obtained.
- the driver only needs to confirm at each point in time that the steering operation along the trajectory data of the displayed assumed motion pattern is being performed automatically, and that no new obstacle has appeared. Therefore, even easier and safer vehicle operation is possible.
- FIG. 18 shows the configuration of the driving assist system according to the fifth embodiment of the present invention.
- the driving operation assisting device according to the present embodiment is also mainly intended to assist driving operations such as entering a garage and parallel parking. Therefore, in the present embodiment, components that are not particularly described are the same as those in the first embodiment, and components assigned the same reference numerals as in the first embodiment have, unless otherwise described, the same functions as in the first embodiment. Unless otherwise specified, the modifications described in the first embodiment also apply to the present embodiment.
- the configuration of the driving operation assisting device in the present embodiment differs from that in the first embodiment in that it includes trajectory correction means 1801.
- the trajectory correction means 1801 corrects the assumed motion pattern and the time-series data based on the operation start position and the operation end position of the driving operation input by the driver.
- the procedure from generation of image data captured by the imaging unit 101 to generation of the surrounding situation image of the present invention is the same as that described in the first embodiment.
- in the procedure from the superimposing means 102 generating the composite image of the present invention to its display by the display means 107, the steps until the assumed motion pattern stored in the assumed movement pattern storage means 108 is displayed on the composite image with its operation start position coincident with the current position of the vehicle are the same as those described in the first embodiment.
- the procedure by which the driver then modifies the assumed motion pattern and the time-series data using the trajectory correction means 1801 and the modified pattern is displayed on the composite image will be described below, taking entering a garage on the left side as an example.
- suppose that the driver wishes to perform a garage entry operation with the target parking position 1902 as the operation end position so as not to touch the obstacles 1904 and 1905, but the tire trajectory 1903 of the assumed motion pattern interferes with an obstacle.
- if the assumed movement pattern storage means 108 stores an assumed movement pattern for another garage entry operation to the left, the driver may select it with the pattern selection means (not shown) and consider whether the parking operation can be performed successfully; but if no such pattern exists, or if the other assumed movement pattern also interferes with obstacles, the driver corrects the assumed movement pattern.
- as shown in FIG. 19(b), the driver moves the figure showing the vehicle at the current position 1901 in the composite image (FIG. 19(a)) displayed on the display means 107 to a new operation start position 1906 by numerical input, a pointer, or other means.
- in response, the trajectory correction means 1801 calculates a new tire trajectory 1907 (see FIG. 19(c)) such that the vehicle moves from the new operation start position 1906 to the target parking position 1902, and generates a new assumed motion pattern and the corresponding time-series data.
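As a rough sketch of what such a correction must produce, a smooth curve can be generated between the new start position and the target position. Here a cubic Hermite curve between the two endpoints stands in, purely as an illustration, for the real vehicle-kinematics computation the patent leaves unspecified:

```python
# Illustrative sketch (not the patent's method): generate a smooth
# trajectory from a new operation start position to a target parking
# position as a cubic Hermite curve between the two points, with
# tangents standing in for the headings at each end.

def hermite_trajectory(p0, t0, p1, t1, n=10):
    """Cubic Hermite curve from p0 (tangent t0) to p1 (tangent t1)."""
    pts = []
    for i in range(n + 1):
        u = i / n
        h00 = 2 * u**3 - 3 * u**2 + 1
        h10 = u**3 - 2 * u**2 + u
        h01 = -2 * u**3 + 3 * u**2
        h11 = u**3 - u**2
        x = h00 * p0[0] + h10 * t0[0] + h01 * p1[0] + h11 * t1[0]
        y = h00 * p0[1] + h10 * t0[1] + h01 * p1[1] + h11 * t1[1]
        pts.append((x, y))
    return pts

if __name__ == "__main__":
    traj = hermite_trajectory((0.0, 0.0), (0.0, -2.0), (3.0, -4.0), (2.0, 0.0))
    print(traj[0], traj[-1])  # endpoints coincide with start and target
```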
- the display means 107 displays this. The driver therefore moves the vehicle so that the operation end position 1908 of the new assumed motion pattern coincides with the target parking position 1902, and then performs the driving (parking) operation according to the new assumed motion pattern, whereby the vehicle can be parked at the target parking position 1902.
- the generated new assumed movement pattern and time-series data may be stored in the assumed movement pattern storage means 108 by updating the original assumed movement pattern, may be stored additionally in the assumed movement pattern storage means 108, or may be used ad hoc without being stored. Furthermore, the driver may be allowed to select "update storage", "additional storage", or "do not store" each time.
- in the above description, the assumed movement pattern to be updated or additionally stored in the assumed movement pattern storage means 108 is obtained automatically based on the operation start and end positions of the vehicle input by the driver; alternatively, an actual driving operation may be performed, time-series data such as the steering wheel angle and the wheel rotation speed collected at that time, and an assumed motion pattern generated based on this data. According to the present embodiment, it is possible to realize a more extensible driving operation assisting device than that of the first embodiment.
- FIG. 20 is a block diagram showing a configuration of the driving assist system according to the sixth embodiment of the present invention.
- the driving operation assisting device according to the present embodiment is also mainly intended to assist driving operations such as entering a garage and parallel parking. Therefore, in the present embodiment, components that are not particularly described are the same as those in the first embodiment, and components assigned the same reference numerals as in the first embodiment have, unless otherwise described, the same functions as in the first embodiment. Unless otherwise specified, the modifications described in the first embodiment also apply to the present embodiment.
- the configuration of the driving operation assisting device in the present embodiment differs from that in the first embodiment in that it has CG image synthesizing means 2001.
- the CG image synthesizing means 2001 stores three-dimensional data corresponding to each assumed movement pattern stored in the assumed movement pattern storage means 108, and generates a three-dimensional (or two-dimensional) image of that data corresponding to the viewpoint of the surrounding situation image. It corresponds to part of the function of the assumed motion pattern storage means of the present invention and part of the function of the composite image generating means of the present invention.
- the viewpoint conversion means 106 can switch the position of the viewpoint automatically or by an input from the driver.
- in addition to the stored assumed movement patterns (operation start position 501, operation end position 502, and tire trajectory 503), the assumed movement pattern storage means 108 stores the positions of a plurality of virtual poles 2001 arranged on the tire trajectory 503. On the basis of the assumed motion pattern and the virtual pole data, the CG image synthesizing means 2001 generates and stores in advance three-dimensional data (see FIG. 21(b)) corresponding to the assumed motion pattern.
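Rendering the stored 3-D data (such as the virtual poles) from a chosen viewpoint reduces to projecting camera-frame 3-D points onto the image plane. A minimal pinhole-projection sketch follows; the focal length, principal point, and axis convention are illustrative assumptions:

```python
# Illustrative sketch of the CG generation step: project a 3-D point,
# already expressed in the viewpoint camera's frame, onto the image
# plane with a pinhole model (x right, y down, z forward).

def project_point(pt_cam, focal_px=500.0, cx=320.0, cy=240.0):
    """Return pixel coordinates (u, v), or None if behind the camera."""
    x, y, z = pt_cam
    if z <= 0:
        return None  # behind the camera, not drawable
    return (focal_px * x / z + cx, focal_px * y / z + cy)

if __name__ == "__main__":
    # A pole base 5 m ahead, 1 m to the right, 1.2 m below the camera.
    print(project_point((1.0, 1.2, 5.0)))  # (420.0, 360.0)
```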
- the procedure from the capture of image data by the imaging unit 101 to the generation of the surrounding situation image of the present invention is the same as that described in the first embodiment. Further, in the procedure from the superimposing means 102 generating the composite image of the present invention to its display by the display means 107, the steps up to the start of the actual driving operation corresponding to the assumed motion pattern are the same as those described in the first embodiment.
- the driver switches the viewpoint of the surrounding situation image used by the viewpoint conversion means 106 from the position immediately above the vehicle to a viewpoint looking rearward from behind the vehicle.
- alternatively, the viewpoint conversion means 106 may switch to this viewpoint automatically upon detecting that the actual driving operation corresponding to the assumed movement pattern has been started.
- for this detection, the same means as the start detecting means 1301 described in the second embodiment may be used.
- at this time, the surrounding situation image output from the viewpoint conversion means 106 becomes as shown in FIG. 21(c).
- the CG image synthesizing means 2001 matches the current position of the vehicle with the operation start position 501 and generates a CG image viewed from the same viewpoint as that used by the viewpoint conversion means 106.
- the CG image at this time is as shown in FIG. 21 (d).
- the superimposing means 102 superimposes this CG image on the surrounding situation image to generate the composite image, as shown in FIG. 21(e).
- the display means 107 displays this composite image. Since the composite image is an image viewed from a viewpoint fixed on the vehicle, in FIG. 21(e), when the vehicle moves, the entire image is displayed as moving relative to the vehicle.
- therefore, while looking at the displayed image, the driver can determine the parking start position while observing at a glance the relationship between the virtual poles, the operation end position, and the actual obstacles, and can perform a reliable driving operation.
- in the above description, the CG image synthesizing means 2001 generates in real time a CG image viewed from the same viewpoint as that used by the viewpoint conversion means 106; alternatively, a configuration may be used in which CG images viewed from each viewpoint are generated in advance for each assumed motion pattern and stored.
- in the above description, the surrounding situation image viewed from the virtual camera is generated; however, when viewing the rear from behind the vehicle without switching the viewpoint, the image captured by a camera installed at the viewpoint position may be used as it is as the surrounding situation image.
- the configuration of the driving operation assisting device in this case is as shown in the block diagram of FIG. 22. That is, the CG image synthesizing means 2001 obtains data on the viewpoint of the single vehicle-mounted camera 2201 from the camera parameter table 103 to generate a CG image.
- FIG. 23 is a block diagram showing a configuration of the driving assist system according to the seventh embodiment of the present invention.
- the driving operation assisting device according to the present embodiment is also mainly intended to assist driving operations such as entering a garage and parallel parking. Therefore, in the present embodiment, components that are not particularly described are the same as those in the first embodiment, and components assigned the same reference numerals as in the first embodiment have, unless otherwise described, the same functions as in the first embodiment. Unless otherwise specified, the modifications described in the first embodiment also apply to the present embodiment.
- the configuration of the driving operation assisting device in the present embodiment differs from that in the first embodiment in that it comprises mapping means 2301 and a mapping table 2302.
- the mapping means 2301 performs, at high speed, the processing of converting the input images from the cameras of the imaging unit 101 into an image viewed from an arbitrary viewpoint.
- the mapping table 2302 stores data used when the mapping means 2301 performs conversion.
- FIG. 24 is a conceptual diagram showing an example of the mapping table stored in the mapping table 2302.
- the mapping table is composed of as many cells as there are pixels in the screen displayed on the display means 107 (that is, in the composite image generated by the superimposing means 102). That is, the table is configured so that the number of horizontal pixels of the display screen is the number of columns of the table, and the number of vertical pixels of the display screen is the number of rows of the table.
- each cell stores a camera number and the pixel coordinates of the corresponding pixel in the image taken by that camera.
- for example, the cell at the upper left of FIG. 24 corresponds to the upper left of the display screen, that is, to position (0, 0).
- the mapping means 2301 interprets the data content (1, 10, 10) stored in that cell as meaning "display the data of the pixel (10, 10) of the image taken by the first camera at position (0, 0) on the display screen".
- the mapping table 2302 needs to store a table as shown in FIG. 24 for each viewpoint.
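The table-driven composition described above can be sketched directly: each output pixel is a single lookup into one camera image. The cell layout (camera number, source x, source y) and the 1-based camera numbering below are illustrative assumptions consistent with the cell example given:

```python
# Illustrative sketch of the mapping means: compose the display image by
# one table-driven copy per output pixel. Each cell of the table names a
# camera (1-based) and a source pixel (x, y) in that camera's image.

def compose(table, camera_images):
    """table[row][col] = (camera_no, src_x, src_y) -> output image."""
    out = []
    for row in table:
        out_row = []
        for cam_no, sx, sy in row:
            out_row.append(camera_images[cam_no - 1][sy][sx])
        out.append(out_row)
    return out

if __name__ == "__main__":
    cam1 = [[0, 1], [2, 3]]           # tiny 2x2 "camera image"
    table = [[(1, 1, 1), (1, 0, 1)],
             [(1, 1, 0), (1, 0, 0)]]  # a 180-degree rotation mapping
    print(compose(table, [cam1]))     # [[3, 2], [1, 0]]
```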
- FIG. 25 is a block diagram showing a configuration of the driving assist system according to the eighth embodiment of the present invention.
- the driving operation assisting device according to the present embodiment is also mainly intended to assist driving operations such as entering a garage and parallel parking. Therefore, in the present embodiment, components that are not particularly described are the same as those in the first embodiment, and components assigned the same reference numerals as in the first embodiment have, unless otherwise described, the same functions as in the first embodiment. Unless otherwise specified, the modifications described in the first embodiment also apply to the present embodiment.
- the configuration of the driving assist system in the present embodiment differs from that of the first embodiment in that it is provided with the final position input means 2501, the start position determining means 2502, and the space fixing means 2503.
- the final position input means 2501 is for inputting a target end position of the driving operation with a pointer.
- the target end position may also be input numerically or by other means.
- the start position determining means 2502 obtains the start position of the driving operation corresponding to the target end position input by the final position input means 2501, according to the assumed motion pattern corresponding to that driving operation.
- the space fixing means 2503 fixes the assumed motion pattern corresponding to the driving operation in space between the input of the target end position and the end of the operation; it has the functions of the integrating means 1302 and the space conversion means 1303 (in FIG. 25, the rear wheel rotation signal input and the steering wheel turning angle signal input are not shown).
- alternatively, the image tracking means 1502 and the space conversion means 1503 in FIG. 15 may be combined, but in this case, as with the image tracking means 1502 in FIG. 15, it is necessary to receive input of spatial data from the spatial data buffer 105.
- the combination of the superimposing means 102 and the space fixing means 2503 corresponds to the composite image generating means of the present invention.
- the procedure from generation of image data captured by the imaging unit 101 to generation of the surrounding situation image of the present invention is the same as that described in the first embodiment.
- also, in the procedure from the superimposing means 102 generating the composite image of the present invention to its display by the display means 107, the steps up to displaying the assumed motion pattern stored in the assumed movement pattern storage means 108 on the composite image, with its operation start position coinciding with the current position of the vehicle, are the same as those described in the first embodiment.
- the subsequent procedure, in which the driver inputs the target end position of the driving operation using the final position input means 2501 after the assumed motion pattern is displayed on the composite image, and the assumed motion pattern, including the corresponding start position of the driving operation, is then displayed on the composite image, will be described below, taking leftward garage entry as an example.
- the driver, intending to park without touching the obstacles 401 and 402, has the composite image of the present invention displayed on the display means 107.
- suppose that, on this composite image, the assumed motion pattern 1403 is displayed with the current position 901 of the vehicle as its operation start position and the parking position 1001 as its operation end position.
- the driver then uses the pointer 2601 displayed on the screen of the display means 107 to move the parking position 1001 to the target position 2602.
- since the assumed motion pattern 1403 moves together with the parking position 1001, the operation start position of the assumed motion pattern 1403 is displayed as the start position 2603 from which the operation should start.
- the current position 901 of the vehicle is also shown on the screen of the display means 107, as shown in FIG. 26(c).
- the driver may move the vehicle to the start position 2603 while watching this screen.
- at this time, since the assumed movement pattern 1403 is fixed to the space by the space fixing means 2503, the relative positional relationship between the assumed movement pattern 1403 and the obstacles 401 and 402 does not change.
- in this way, the start position of the driving operation can be obtained efficiently, so the time required before starting the operation can be reduced.
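The computation by the start position determining means 2502 can be sketched as a rigid-transform inversion: if the assumed motion pattern fixes where the operation ends relative to where it starts, the start position 2603 follows directly from the target end position 2602. The pose encoding `(x, y, theta)` and the start-frame offset representation below are assumptions for illustration, not the patent's data format.

```python
import math

# Sketch: derive the operation start position from the target end position
# input by the driver. pattern_offset = (dx, dy, dtheta) is assumed to be the
# end pose of the assumed motion pattern expressed in its start frame.

def start_from_target_end(end_pose, pattern_offset):
    ex, ey, eth = end_pose          # target end pose in the world frame
    dx, dy, dth = pattern_offset    # end pose expressed in the start frame
    sth = eth - dth                 # heading at the operation start
    # rotate the start-frame offset into the world frame and subtract it
    sx = ex - (dx * math.cos(sth) - dy * math.sin(sth))
    sy = ey - (dx * math.sin(sth) + dy * math.cos(sth))
    return sx, sy, sth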
- when the start position 2603 is determined, the driving assist system in the present embodiment calculates its relative positional relationship to the current position 901, obtains time-series data of the steering wheel turning angle and the rear wheel speed required to guide the vehicle from the current position 901 to the start position 2603, and generates a steering wheel angle control signal and a rear wheel speed control signal in accordance with these data.
- start position guidance means for automatically guiding the vehicle from the current position 901 to the start position 2603 may thus be added.
- the vehicle is guided to the start position without any operation by the driver, so that simpler and safer vehicle operation is possible.
- FIG. 27 is a block diagram showing a configuration of the driving assist system according to the ninth embodiment of the present invention.
- the driving assisting device in the present embodiment differs from the driving assisting device in the second embodiment only in that the outputs of the start detecting means 1301 and the integrating means 1302 are also input to the viewpoint conversion means 106, which switches the viewpoint of the virtual camera accordingly.
- components that are not particularly described are the same as those in the second embodiment, and components bearing the same reference numerals as in the second embodiment have the same functions as in the second embodiment unless otherwise described.
- unless otherwise specified, the modifications described in the second embodiment also apply to the present embodiment.
- the procedure from generation of image data captured by the imaging unit 101 to generation of the surrounding situation image of the present invention is the same as that described in the first embodiment.
- the procedure from the superimposing means 102 generating the composite image of the present invention and its display by the display means 107, up to the start of the actual driving operation corresponding to the assumed motion pattern, is the same as that described in the first embodiment.
- after the actual driving operation corresponding to the assumed movement pattern is started, the superimposing means 102 generates the composite image of the present invention as follows and displays it on the display means 107.
- the viewpoint position of the virtual camera is fixed right above the vehicle.
- in Fig. 28(a), the current position 901 of the vehicle and the assumed motion pattern 1403 that takes this position as its operation start position are fixed on the screen, and the surrounding situation images, such as the obstacles 401 and 402, move relatively on the screen in accordance with the movement of the vehicle.
- in Fig. 28(b), when the current position 901 of the vehicle reaches a position corresponding to the target parking position 902, the driver starts the driving operation corresponding to leftward parallel parking.
- then, since the gear signal is in the reverse state and the steering wheel turning angle signal becomes larger than a certain value, the start detecting means 1301 detects the start of the driving operation corresponding to leftward parallel parking.
- the integrating means 1302 integrates the input steering angle signal and rear wheel speed signal after the start of the driving operation, and calculates the positional relationship of the parking operation start position 1401 with respect to the current vehicle position 1402, as shown in FIG. 14(a).
- the space conversion means 1303 moves the assumed movement pattern 1403 corresponding to leftward parallel parking so that its operation start position coincides with the parking operation start position 1401 (see FIG. 14(b)).
- in other words, after the start of the driving operation, the space conversion means 1303 spatially fixes the assumed movement pattern 1403 at the position it occupied at the start of the parking operation.
- at the same time, the viewpoint conversion means 106 fixes the viewpoint position of the virtual camera at this time with respect to the space (the ground). That is, after the start of the driving operation, the surrounding situation images (obstacles 401, 402, etc.) are fixed on the screen.
- the superimposing means 102 superimposes the assumed motion pattern 1403, spatially fixed at the position at the start of the parking operation, and the current vehicle position 1402 on the surrounding situation image to generate the composite image of the present invention.
- since the viewpoint of this composite image is the same as that of the surrounding situation image, with the viewpoint position of the virtual camera at the start of the parking operation spatially fixed, the superimposing means 102 generates the composite image by back-calculating, with respect to this viewpoint, the positional relationship calculated by the integrating means 1302. That is, in this composite image (FIG. 28(c)), the surrounding situation image (obstacles 401, 402, etc.) and the assumed motion pattern 1403 are fixed on the screen, while the current vehicle position 1402 is displayed moving relatively on the screen in accordance with the actual movement of the vehicle.
- since the viewpoint at the start of the parking operation is fixed to the space, the driver can grasp at a glance the state of movement of the vehicle relative to the situation around the parking space.
- for the surrounding situation image, the data stored in the spatial data buffer 105 may be used for the display.
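The role of the integrating means 1302 in this embodiment can be sketched as dead reckoning from the steering angle and rear wheel speed signals: because the resulting pose is expressed relative to the pose at the start of the operation, it is already in the frame of the spatially fixed viewpoint. A kinematic bicycle model with assumed wheelbase and sampling period is used here; the patent does not specify a vehicle model, so this is illustrative only.

```python
import math

# Sketch of the integration performed by the integrating means 1302 and the
# back-calculation into the viewpoint frozen at the operation start. The
# bicycle model, wheelbase, and dt are assumptions for illustration.

def dead_reckon(samples, wheelbase=2.7, dt=0.1):
    """samples: iterable of (rear_wheel_speed, steering_angle) pairs.
    Returns the pose (x, y, theta) relative to the pose at the first sample,
    i.e. in the frame of the viewpoint fixed at the start of the operation."""
    x = y = th = 0.0
    for v, delta in samples:
        x += v * math.cos(th) * dt
        y += v * math.sin(th) * dt
        th += v * math.tan(delta) / wheelbase * dt
    return x, y, th
```

Drawing the vehicle marker at this pose, while leaving the surrounding situation image untouched, reproduces the behavior of FIG. 28(c): the surroundings stay fixed and the vehicle position moves across the screen.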
- FIG. 29 is a block diagram showing a configuration of the driving assist system in the tenth embodiment of the present invention.
- the driving operation assisting device according to the present embodiment differs from the driving assisting device in the first embodiment in that the assumed movement pattern storage means 108 holds a default assumed movement pattern as shown in FIG. 30(b), and in that the superimposing means 102 simultaneously synthesizes two circumscribed area trajectories 604 of the space through which the entire vehicle passes, instead of the tire trajectory 603, and displays them on the display means 107.
- the default of the assumed movement pattern storage means 108 covers the simplest case shown in FIG. 30. If parking is possible with this default assumed movement pattern, there is no need to switch between multiple assumed movement patterns, which reduces the driver's operational burden.
- further, since the two circumscribed area trajectories 604 generated from the assumed motions of the two (left and right) cases are simultaneously synthesized by the superimposing means 102 and displayed on the display means 107, the driver need not perform the operation of selecting and switching between assumed movement patterns, which also reduces the driver's operational burden.
- in addition, since the superimposing means 102 synthesizes the circumscribed area trajectory 604 of the space through which the entire vehicle passes, instead of the tire trajectory 603, and displays it on the display means 107, the driver can more easily and accurately judge whether the vehicle body, which swings out beyond the tire trajectory, will contact an obstacle.
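The difference between the tire trajectory 603 and the circumscribed area trajectory 604 can be illustrated by sweeping the four body corners of the vehicle along the path: the union of these corner tracks bounds the area the whole vehicle passes through. The body dimensions, pose encoding, and function name below are illustrative assumptions.

```python
import math

# Sketch: corner tracks of the vehicle body along a path of rear-axle poses.
# length, width, and rear_overhang are assumed body dimensions in meters.

def body_corner_tracks(poses, length=4.5, width=1.7, rear_overhang=1.0):
    """poses: list of (x, y, theta) poses of the rear axle center.
    Returns, for each pose, the four body-corner positions; their union over
    the path bounds the circumscribed area the vehicle body sweeps."""
    tracks = []
    for x, y, th in poses:
        c, s = math.cos(th), math.sin(th)
        corners = []
        for lx in (-rear_overhang, length - rear_overhang):  # rear / front
            for ly in (-width / 2.0, width / 2.0):           # right / left
                # rotate the body-frame corner into the world frame
                corners.append((x + lx * c - ly * s, y + lx * s + ly * c))
        tracks.append(corners)
    return tracks
```

During a turn the outer front corner traces a wider arc than the tires, which is why the circumscribed area trajectory gives a safer clearance check than the tire trajectory alone.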
- FIG. 31 is a block diagram showing a configuration of the driving assist device in the eleventh embodiment of the present invention.
- the driving operation assisting device according to the present embodiment is also mainly intended to assist driving operations such as entering a garage and parallel parking. Components not particularly described in this embodiment are the same as those in the first and fifth embodiments, and components bearing the same reference numerals as in the first and fifth embodiments have the same functions as in those embodiments unless otherwise described. Unless otherwise specified, the modifications described in the first and fifth embodiments also apply to the present embodiment.
- the configuration of the driving assist system in the present embodiment differs from that in the first and fifth embodiments in that it includes the obstacle input means 3101 and the correction means 3102 shown in FIG. 31.
- another difference is that the assumed movement patterns stored in the assumed movement pattern storage means 108 include, as shown in FIG. 32(a), a movement pattern that switches between reverse and forward movement during the motion.
- in this case, the steering wheel angle corresponding to each tire rotation is stored in the assumed movement pattern storage means 108 in FIG. 31 as the time-series data of the assumed movement pattern.
- for example, tire rotations from 0 to 0.8 represent backward movement of the vehicle, after which the pattern switches from reverse to forward.
- at this point the vehicle is at the backward/forward switching position 3201 shown in Fig. 32(a).
- the vehicle then moves forward over tire rotations 0.6 to 1.4, after which it switches to reverse again.
- by including such backward/forward switching, the direction of the vehicle can be controlled even when there is only a small spatial margin with respect to obstacles.
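Replaying such a time-series pattern can be sketched as follows. The encoding below, signed tire-rotation increments whose sign selects reverse or forward, together with the wheel radius, wheelbase, and integration step count, are assumptions for illustration; FIG. 32(b) is described only as a table of (tire rotation, tire angle) items.

```python
import math

# Sketch: trace the vehicle path from assumed-movement-pattern time-series
# data with reverse/forward switching. Negative rotation increments are
# assumed to mean reverse travel; R and L are assumed vehicle parameters.

def replay_pattern(entries, wheel_radius=0.3, wheelbase=2.7, steps=50):
    """entries: list of (rotation_increment_in_turns, steering_angle_rad).
    Returns the poses (x, y, theta) traced from the operation start pose."""
    x = y = th = 0.0
    poses = [(x, y, th)]
    for turns, delta in entries:
        dist = turns * 2.0 * math.pi * wheel_radius  # signed travel distance
        ds = dist / steps
        for _ in range(steps):
            x += ds * math.cos(th)
            y += ds * math.sin(th)
            th += ds * math.tan(delta) / wheelbase
            poses.append((x, y, th))
    return poses
```

A two-entry list such as `[(-0.8, 0.0), (0.6, 0.0)]` models a reverse segment followed by a forward segment, with the sign change marking the backward/forward switching position.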
- the procedure from the image data captured by the imaging unit 101 to the generation of the surrounding situation image of the present invention is the same as that described in the first embodiment. Also, in the procedure from the superimposing means 102 generating the composite image of the present invention to its display by the display means 107, the steps up to displaying the assumed motion pattern stored in the assumed movement pattern storage means 108 on the composite image, with its operation start position coinciding with the current position of the vehicle, are the same as those described in the first embodiment.
- the subsequent procedure, in which the driver corrects the assumed motion pattern and its time-series data using the trajectory correction means 1801 and the obstacle input means 3101 and the corrected pattern is displayed on the composite image, will be described below, taking rightward garage entry as an example.
- suppose the driver intends to move the vehicle to the target parking position 1902 without touching the obstacles a (3204), b (3205), and c (3206).
- if the assumed movement pattern storage means 108 stores an assumed movement pattern for another rightward garage entry operation, it may be selected by pattern selection means (not shown) to examine whether the parking operation can be performed successfully; if it cannot, or if the other assumed movement pattern also interferes with obstacles, the driver corrects the assumed movement pattern as follows.
- first, the driver sets the figure indicating the vehicle at the current position 3200 in the composite image (FIG. 33) displayed on the display means 107 shown in FIG. 31 as the parking operation start position 1901.
- next, using the obstacle input means 3101 shown in FIG. 31, the driver indicates the areas where the obstacles a, b, and c are located in the image with an obstacle-indicating rectangle a (3207) or an obstacle-indicating circle b (3208), as shown in FIG. 34.
- as shown in FIG. 35, the trajectory correction means 1801 then sets a contact danger area 3209 in the 60 cm surrounding area including each obstacle indication area, and gives that region a contact risk evaluation function H (3210) as shown in FIG. 36.
- a trajectory contact risk evaluation function H'' (3213) is obtained as the sum of the contact risk evaluation functions H (3210) at the positions of the trajectory evaluation points 3212.
- this function H'' (3213) is a function of the N items (tire rotation tm, tire angle km) in the table shown in FIG. 32(b). Therefore, by sequentially correcting (tire rotation tm, tire angle km) by the partial differential method, the (tire rotation tm, tire angle km) that minimizes the trajectory contact risk evaluation function H'' (3213) can be obtained.
- that is, the assumed motion pattern can be corrected to the (tire rotation tm, tire angle km) that minimizes the trajectory contact risk evaluation function H'' (3213).
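The correction by the partial differential method can be sketched as finite-difference gradient descent over the N parameters (tire rotation tm, tire angle km). The step size, iteration count, and the stand-in quadratic risk used in the test are assumptions, since the patent defines H only through the contact danger areas around the input obstacle regions.

```python
# Sketch: minimize the trajectory contact risk H'' over the pattern
# parameters by numerical partial differentiation and gradient descent.
# lr, eps, and iters are assumed tuning values for illustration.

def minimize_risk(params, risk, lr=0.1, eps=1e-4, iters=200):
    """params: list of floats (e.g. alternating tire rotation tm and tire
    angle km). risk: callable mapping a parameter list to the H'' value."""
    p = list(params)
    for _ in range(iters):
        grad = []
        for i in range(len(p)):
            q = list(p)
            q[i] += eps
            grad.append((risk(q) - risk(p)) / eps)  # forward-difference partial
        p = [pi - lr * gi for pi, gi in zip(p, grad)]
    return p
```

In the device, `risk` would evaluate the contact risk functions H at the trajectory evaluation points generated by replaying the candidate parameters, and the returned parameters define the corrected assumed motion pattern 3214.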
- the superimposing means 102 then generates a composite image in which the corrected assumed motion pattern 3214 is displayed with its operation start position 1901 coinciding with the current position 3200 of the vehicle, and the display means 107 displays this composite image.
- as a result, the driver can move the vehicle to the target parking position 1902 and park it by a motion pattern with a margin from the obstacles.
- the new assumed movement pattern and time-series data generated here may be stored in the assumed movement pattern storage means 108 by updating the original assumed movement pattern, may be stored additionally, or, as a one-time pattern, need not be stored at all. The driver may also select among updating, additional storage, and no storage each time.
- although it was explained that the assumed movement pattern to be updated or additionally stored in the assumed movement pattern storage means 108 is obtained automatically from the positions of the vehicle at the start and end of the movement input by the driver, it is also possible to perform the actual driving operation, collect time-series data such as the steering wheel angle and the number of wheel rotations at that time, and generate and store an assumed motion pattern based on this data.
- in the above, one assumed movement pattern shown in Fig. 32(a) is corrected based on the obstacle region input by the driver; alternatively, a plurality of assumed movement patterns may be corrected in the same way and a preferable one selected.
- in this case, the movement pattern selection trajectory correction means 3301 is used instead of the trajectory correction means 3101.
- as shown in FIG. 43, the driver sets the current position of the vehicle as the parking operation start position, and designates and inputs the target parking position 1902 on the image displayed on the display means.
- from the rough positional relationship of the target parking position 1902 with respect to this parking operation start position, the movement pattern selection trajectory correction means 3301 first extracts, from the plurality of assumed movement patterns stored in the assumed movement pattern storage means 108, two assumed motion patterns for parking to the right rear, as shown in FIG. 44.
- next, each assumed motion pattern is corrected so as to minimize the trajectory contact risk evaluation function H'' (3213), as described above.
- by comparing the two minimized trajectory contact risk evaluation functions H'' (3213) and choosing the smaller one, the safer assumed motion pattern can be selected.
- alternatively, the simpler assumed motion pattern can be selected by raising the priority of the simpler driving operation in advance.
- thus, the driver only needs to input the target parking position and the obstacle areas for an optimal assumed movement pattern to be selected automatically, realizing optimal parking with a safer and easier driving operation.
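The selection step above can be sketched as follows: each candidate assumed movement pattern is first corrected to its minimum H'', then the one with the smaller minimized risk is chosen, with a pre-assigned simplicity priority breaking near-ties. The tie threshold and the priority values are assumptions for illustration.

```python
# Sketch: choose among candidate assumed movement patterns by minimized
# trajectory contact risk, preferring the simpler operation on near-ties.

def select_pattern(candidates, tie_threshold=1e-3):
    """candidates: list of (name, minimized_risk, priority); a higher
    priority means a simpler driving operation."""
    best = min(candidates, key=lambda c: c[1])
    # patterns whose minimized risk is nearly equal to the best one
    near = [c for c in candidates if c[1] - best[1] <= tie_threshold]
    return max(near, key=lambda c: c[2])[0]
```

With two right-rear candidates, this reproduces the behavior described above: the safer pattern wins outright, and the simpler pattern wins when both are about equally safe.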
- in this way, a more widely applicable driving operation assisting device can be realized as compared with the driving operation assisting device in the first embodiment.
- in each of the above embodiments, the surrounding situation imaging means of the present invention mainly generates an image from the viewpoint of a virtual camera using a plurality of in-vehicle cameras.
- however, the present invention is not limited to this.
- for example, a single camera installed on the ceiling of a covered parking lot may be used instead.
- in short, the surrounding situation imaging means of the present invention need only image the surrounding situation of a vehicle with a camera to generate a surrounding situation image, and/or store the generated surrounding situation image.
- as described above, the driving operation assisting device of the present invention comprises: surrounding situation imaging means for imaging the surrounding situation of a vehicle with a camera to generate a surrounding situation image and/or storing the generated surrounding situation image; composite image generating means for generating a composite image by superimposing, on the surrounding situation image, an assumed motion pattern, which is image data representing in advance the motion of the vehicle when a predetermined series of driving operations is performed; and display means for displaying the composite image.
- that is, the present invention displays, when the driver intends to perform a predetermined series of driving operations, the motion of the vehicle resulting from those operations together with the surrounding situation, and thus provides a driving operation assisting device that enables the driver to directly confirm the relationship between the vehicle motion and the surrounding situation and reduces the burden on the driver.
- by using the driving operation assisting device of the present invention, the driver can grasp at a glance from the displayed image the operation start position for garage entry or parallel parking, the final stop position, and the positional relationship of the vehicle to obstacles such as other vehicles, so that the driver's operational burden can be reduced and safety improved.
- further, if the driver merely moves the vehicle to the processing start point while watching the trajectory data, all subsequent processing such as garage entry can be performed automatically.
- the present invention also provides a recording medium storing a program for causing a computer to execute all or part of the functions of each means of the driving assist system of the present invention.
Publications (1)
Publication Number | Publication Date |
---|---|
WO2000020257A1 true WO2000020257A1 (fr) | 2000-04-13 |
Family
ID=26556225
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP1999/005509 WO2000020257A1 (fr) | 1998-10-08 | 1999-10-06 | Dispositif d'assistance a la conduite et support enregistre |
Country Status (5)
Country | Link |
---|---|
US (5) | US7277123B1 (ja) |
EP (1) | EP1038734B1 (ja) |
KR (1) | KR100568396B1 (ja) |
CN (1) | CN1132750C (ja) |
WO (1) | WO2000020257A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106379238A (zh) * | 2016-07-07 | 2017-02-08 | 广州勘帝德电子科技有限公司 | 无can_bus智能车辆动态轨迹线后视影像系统 |
Families Citing this family (80)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2000020257A1 (fr) * | 1998-10-08 | 2000-04-13 | Matsushita Electric Industrial Co., Ltd. | Dispositif d'assistance a la conduite et support enregistre |
JP3297040B1 (ja) * | 2001-04-24 | 2002-07-02 | 松下電器産業株式会社 | 車載カメラの画像合成表示方法及びその装置 |
JP3947375B2 (ja) | 2001-08-24 | 2007-07-18 | アイシン精機株式会社 | 駐車補助装置 |
DE10256770A1 (de) * | 2002-12-05 | 2004-06-17 | Bayerische Motoren Werke Ag | Verfahren zum Lenken eines rückwärts in eine Parklücke einzuparkenden Fahrzeugs |
JP4316960B2 (ja) * | 2003-08-22 | 2009-08-19 | 株式会社半導体エネルギー研究所 | 装置 |
DE102004009924A1 (de) * | 2004-02-23 | 2005-09-01 | Valeo Schalter Und Sensoren Gmbh | Verfahren und Warnvorrichtung zum grafischen Aufbereiten eines Bildes einer Kamera |
US20060178787A1 (en) * | 2005-02-09 | 2006-08-10 | Mccall Clark E | Rear obstacle avoidance system for vehicle |
JP3968375B2 (ja) | 2005-02-15 | 2007-08-29 | 松下電器産業株式会社 | 周辺監視装置および周辺監視方法 |
DE102005018408A1 (de) * | 2005-04-20 | 2006-10-26 | Valeo Schalter Und Sensoren Gmbh | Verfahren und Vorrichtung zur Auswertung von Abstandsmessdaten eines Abstandsmesssystems eines Kraftfahrzeugs |
EP1916846B1 (en) * | 2005-08-02 | 2016-09-14 | Nissan Motor Company Limited | Device and method for monitoring vehicle surroundings |
FR2891934B1 (fr) * | 2005-10-12 | 2008-01-18 | Valeo Electronique Sys Liaison | Dispositif de traitement de donnees video pour un vehicule automobile |
JP2007176324A (ja) * | 2005-12-28 | 2007-07-12 | Aisin Seiki Co Ltd | 駐車支援装置 |
JP4812510B2 (ja) * | 2006-05-17 | 2011-11-09 | アルパイン株式会社 | 車両周辺画像生成装置および撮像装置の測光調整方法 |
JP4818816B2 (ja) * | 2006-06-05 | 2011-11-16 | 富士通株式会社 | 駐車支援プログラム及び駐車支援装置 |
US20090128630A1 (en) * | 2006-07-06 | 2009-05-21 | Nissan Motor Co., Ltd. | Vehicle image display system and image display method |
DE102007009745A1 (de) * | 2007-02-28 | 2008-09-04 | Continental Automotive Gmbh | Einparkhalbautomat |
JP4595976B2 (ja) * | 2007-08-28 | 2010-12-08 | 株式会社デンソー | 映像処理装置及びカメラ |
JP5053776B2 (ja) * | 2007-09-14 | 2012-10-17 | 株式会社デンソー | 車両用視界支援システム、車載装置、及び、情報配信装置 |
KR100871044B1 (ko) * | 2007-09-28 | 2008-11-27 | 한국오므론전장주식회사 | 차량 후방안내 시스템의 온스크린 디스플레이 라인 사전생성 방법 및 이를 출력하는 차량 후방이동 궤적 안내방법 |
JP2009101718A (ja) * | 2007-10-19 | 2009-05-14 | Toyota Industries Corp | 映像表示装置及び映像表示方法 |
JP5132249B2 (ja) * | 2007-10-23 | 2013-01-30 | アルパイン株式会社 | 車載用撮像装置 |
JP5090126B2 (ja) * | 2007-10-23 | 2012-12-05 | アルパイン株式会社 | 車載用撮像装置 |
JP5057936B2 (ja) * | 2007-11-09 | 2012-10-24 | アルパイン株式会社 | 鳥瞰画像生成装置および方法 |
JP4900232B2 (ja) | 2007-12-26 | 2012-03-21 | 日産自動車株式会社 | 車両用駐車支援装置および映像表示方法 |
JP4902575B2 (ja) * | 2008-02-27 | 2012-03-21 | 日立オートモティブシステムズ株式会社 | 道路標示認識装置、および道路標示認識方法 |
JP4900326B2 (ja) * | 2008-06-10 | 2012-03-21 | 日産自動車株式会社 | 駐車支援装置及び駐車支援方法 |
DE102008027779A1 (de) * | 2008-06-11 | 2009-12-17 | Valeo Schalter Und Sensoren Gmbh | Verfahren zur Unterstützung eines Fahrers eines Fahrzeugs beim Einparken in eine Parklücke |
CA2672511A1 (en) * | 2008-07-16 | 2010-01-16 | Verint Systems Inc. | A system and method for capturing, storing, analyzing and displaying data relating to the movements of objects |
JP4661917B2 (ja) * | 2008-07-25 | 2011-03-30 | 日産自動車株式会社 | 駐車支援装置および駐車支援方法 |
JP4840427B2 (ja) * | 2008-07-29 | 2011-12-21 | 日産自動車株式会社 | 車両制御装置 |
JP2010128794A (ja) * | 2008-11-27 | 2010-06-10 | Aisin Seiki Co Ltd | 車両周辺認知支援装置 |
DE102009024083A1 (de) * | 2009-06-05 | 2010-12-09 | Valeo Schalter Und Sensoren Gmbh | Verfahren zum Durchführen eines zumindest semi-autonomen Parkvorgangs eines Fahrzeugs und Parkassistenzsystem für ein Fahrzeug |
US8174375B2 (en) * | 2009-06-30 | 2012-05-08 | The Hong Kong Polytechnic University | Detection system for assisting a driver when driving a vehicle using a plurality of image capturing devices |
CN102055956B (zh) | 2009-11-02 | 2017-05-10 | 通用汽车环球科技运作公司 | 车载三维视频系统及用其监测车辆周围环境的方法 |
WO2011070641A1 (ja) * | 2009-12-07 | 2011-06-16 | クラリオン株式会社 | 車両周辺監視システム |
US9457842B2 (en) * | 2010-06-09 | 2016-10-04 | Nissan Motor Co., Ltd. | Parking mode selection apparatus and method using the steering wheel |
DE102010023162A1 (de) * | 2010-06-09 | 2011-12-15 | Valeo Schalter Und Sensoren Gmbh | Verfahren zum Unterstützen eines Fahrers eines Kraftfahrzeugs beim Einparken in eine Parklücke, Fahrerassistenzeinrichtung und Kraftfahrzeug |
US9321400B2 (en) * | 2010-06-15 | 2016-04-26 | Aisin Seiki Kabushiki Kaisha | Drive assist device |
JP5444139B2 (ja) * | 2010-06-29 | 2014-03-19 | クラリオン株式会社 | 画像のキャリブレーション方法および装置 |
DE102010034142A1 (de) | 2010-08-12 | 2012-02-16 | Valeo Schalter Und Sensoren Gmbh | Verfahren zum Unterstützen eines Fahrers beim Führen eines Kraftfahrzeugs und Fahrerassistenzsystem |
DE102010034127A1 (de) | 2010-08-12 | 2012-02-16 | Valeo Schalter Und Sensoren Gmbh | Verfahren zum Anzeigen von Bildern auf einer Anzeigeeinrichtung in einem Kraftfahrzeug, Fahrerassistenzsystem und Kraftfahrzeug |
DE102010034139A1 (de) * | 2010-08-12 | 2012-02-16 | Valeo Schalter Und Sensoren Gmbh | Verfahren zur Unterstützung eines Parkvorgangs eines Kraftfahrzeugs, Fahrerassistenzsystem und Kraftfahrzeug |
JP5454934B2 (ja) * | 2010-09-21 | 2014-03-26 | アイシン精機株式会社 | 運転支援装置 |
DE102010048185B4 (de) * | 2010-10-13 | 2021-10-28 | Wirtgen Gmbh | Selbstfahrende Baumaschine |
CN102463989A (zh) * | 2010-11-18 | 2012-05-23 | 江彦宏 | 视频雷达辅助驾驶系统 |
US20120249342A1 (en) * | 2011-03-31 | 2012-10-04 | Koehrsen Craig L | Machine display system |
TWI421624B (zh) * | 2011-04-01 | 2014-01-01 | Ind Tech Res Inst | 環場監控裝置及其方法 |
JP2012253543A (ja) * | 2011-06-02 | 2012-12-20 | Seiko Epson Corp | 表示装置、表示装置の制御方法、及び、プログラム |
DE102011080930A1 (de) * | 2011-08-12 | 2013-02-14 | Robert Bosch Gmbh | Method and device for assisting a driver of a motor vehicle during a driving maneuver |
KR20130021988A (ko) * | 2011-08-24 | 2013-03-06 | 현대모비스 주식회사 | Apparatus and method for overlay processing of images acquired by a vehicle camera |
EP2581268B2 (en) * | 2011-10-13 | 2019-09-11 | Harman Becker Automotive Systems GmbH | Method of controlling an optical output device for displaying a vehicle surround view and vehicle surround view system |
CN103987582B (zh) * | 2011-12-15 | 2017-02-22 | 松下知识产权经营株式会社 | Driving assistance device |
JP5941292B2 (ja) * | 2012-02-10 | 2016-06-29 | 矢崎総業株式会社 | Vehicle display device |
JP5888087B2 (ja) * | 2012-04-25 | 2016-03-16 | ソニー株式会社 | Driving support image generation device, driving support image generation method, in-vehicle camera, and equipment operation support image generation device |
DE102012008858A1 (de) * | 2012-04-28 | 2012-11-08 | Daimler Ag | Method for autonomously parking a motor vehicle, driver assistance device for carrying out the method, and motor vehicle with the driver assistance device |
JP5814187B2 (ja) * | 2012-06-07 | 2015-11-17 | 日立建機株式会社 | Display device for a self-propelled industrial machine |
US20140057237A1 (en) * | 2012-08-27 | 2014-02-27 | Stephen Chen | Method for parking a vehicle by using a parking assistant system |
JP6143469B2 (ja) * | 2013-01-17 | 2017-06-07 | キヤノン株式会社 | Information processing apparatus, information processing method, and program |
US9167214B2 (en) * | 2013-01-18 | 2015-10-20 | Caterpillar Inc. | Image processing system using unified images |
JP6001792B2 (ja) * | 2013-09-09 | 2016-10-05 | 三菱電機株式会社 | Driving support device and driving support method |
TWI552907B (zh) * | 2013-10-30 | 2016-10-11 | 緯創資通股份有限公司 | Driving safety assistance system and method |
CA2932184A1 (en) * | 2013-11-29 | 2015-06-04 | Ims Solutions Inc. | Advanced context-based driver scoring |
US20150156391A1 (en) * | 2013-12-04 | 2015-06-04 | Chung-Shan Institute Of Science And Technology, Armaments Bureau, M.N.D | Vehicle image correction system and method thereof |
US9598012B2 (en) * | 2014-03-11 | 2017-03-21 | Toyota Motor Engineering & Manufacturing North America, Inc. | Surroundings monitoring system for a vehicle |
JP6274936B2 (ja) * | 2014-03-25 | 2018-02-07 | ダイハツ工業株式会社 | Driving support device |
US9592826B2 (en) * | 2015-02-13 | 2017-03-14 | Ford Global Technologies, Llc | System and method for parallel parking a vehicle |
KR101860610B1 (ko) * | 2015-08-20 | 2018-07-02 | 엘지전자 주식회사 | Display device and vehicle including the same |
CN105824592A (zh) * | 2016-03-07 | 2016-08-03 | 乐卡汽车智能科技(北京)有限公司 | Method and device for displaying a reversing trajectory |
JP6723820B2 (ja) * | 2016-05-18 | 2020-07-15 | 株式会社デンソーテン | Image generation device, image display system, and image display method |
CN105799596A (zh) * | 2016-05-20 | 2016-07-27 | 广州市晶华精密光学股份有限公司 | Intelligent automotive rear-view system and image display method |
CN107776489B (zh) * | 2016-08-26 | 2020-07-10 | 比亚迪股份有限公司 | Vehicle and display method and display system for its panoramic image |
KR101949438B1 (ko) * | 2016-10-05 | 2019-02-19 | 엘지전자 주식회사 | Vehicle display device and vehicle including the same |
US10162360B2 (en) * | 2016-12-01 | 2018-12-25 | GM Global Technology Operations LLC | Vehicle environment imaging systems and methods |
DE102017218921A1 (de) * | 2017-10-24 | 2019-04-25 | Bayerische Motoren Werke Aktiengesellschaft | Method, device, computer program and computer program product for operating a display unit of a vehicle |
JP6984373B2 (ja) * | 2017-12-07 | 2021-12-17 | トヨタ自動車株式会社 | Parking assistance device |
CN108332716A (zh) * | 2018-02-07 | 2018-07-27 | 徐州艾特卡电子科技有限公司 | Environment perception system for an autonomous vehicle |
CN110211256A (zh) * | 2018-02-28 | 2019-09-06 | 上海博泰悦臻电子设备制造有限公司 | Vehicle, and device and method for replaying the vehicle's driving conditions |
DE102018208513A1 (de) * | 2018-05-29 | 2019-12-05 | Continental Automotive Gmbh | Camera-monitor system for a motor vehicle and use of a mirror replacement system for a motor vehicle |
US10497232B1 (en) * | 2019-03-01 | 2019-12-03 | Motorola Solutions, Inc. | System and method for dynamic vehicular threat detection perimeter modification for an exited vehicular occupant |
CN113496626B (zh) * | 2020-03-19 | 2023-06-02 | 广州汽车集团股份有限公司 | Vehicle collision warning method and device, and automobile |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06234341A (ja) * | 1993-02-12 | 1994-08-23 | Toyota Motor Corp | Parking assistance device |
JPH10262240A (ja) * | 1997-03-17 | 1998-09-29 | Mitsubishi Motors Corp | Vehicle periphery visual recognition device |
Family Cites Families (63)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS5257867A (en) | 1975-11-06 | 1977-05-12 | Nippon Soken | Inspecting and indicating device |
US4214266A (en) * | 1978-06-19 | 1980-07-22 | Myers Charles H | Rear viewing system for vehicles |
FR2542540B1 (fr) * | 1983-03-08 | 1989-02-10 | Canon Kk | Image processing system |
JPS61113532A (ja) | 1985-09-05 | 1986-05-31 | Nippon Denso Co Ltd | Maintenance display device |
JPS62155140A (ja) | 1985-12-27 | 1987-07-10 | Aisin Warner Ltd | Road image input system for vehicle control |
JPS6414700A (en) | 1987-07-08 | 1989-01-18 | Aisin Aw Co | Device for displaying prospective track of vehicle |
JPH0441117Y2 (ja) | 1987-07-18 | 1992-09-28 | ||
US4931930A (en) * | 1988-04-19 | 1990-06-05 | Industrial Technology Research Institute | Automatic parking device for automobile |
US5172315A (en) | 1988-08-10 | 1992-12-15 | Honda Giken Kogyo Kabushiki Kaisha | Automatic travelling apparatus and method |
JP2792566B2 (ja) | 1989-05-24 | 1998-09-03 | マツダ株式会社 | Travel control device for a mobile vehicle |
JPH0399952A (ja) | 1989-09-12 | 1991-04-25 | Nissan Motor Co Ltd | Vehicle surroundings monitor |
JPH03166534A (ja) | 1989-11-25 | 1991-07-18 | Seikosha Co Ltd | Distance measuring device for a camera |
JPH05265547A (ja) | 1992-03-23 | 1993-10-15 | Fuji Heavy Ind Ltd | Vehicle exterior monitoring device |
EP0567059B1 (en) * | 1992-04-24 | 1998-12-02 | Hitachi, Ltd. | Object recognition system using image processing |
JPH05310078A (ja) | 1992-05-07 | 1993-11-22 | Clarion Co Ltd | Vehicle safety confirmation device and camera used in the device |
JP3391405B2 (ja) | 1992-05-29 | 2003-03-31 | 株式会社エフ・エフ・シー | Method for identifying objects in camera images |
EP0590588B2 (en) | 1992-09-30 | 2003-09-10 | Hitachi, Ltd. | Vehicle driving support system |
US5670935A (en) | 1993-02-26 | 1997-09-23 | Donnelly Corporation | Rearview vision system for vehicle including panoramic view |
JP3468428B2 (ja) | 1993-03-24 | 2003-11-17 | 富士重工業株式会社 | Distance detection device for vehicles |
JP2887039B2 (ja) | 1993-03-26 | 1999-04-26 | 三菱電機株式会社 | Vehicle periphery monitoring device |
US5638116A (en) | 1993-09-08 | 1997-06-10 | Sumitomo Electric Industries, Ltd. | Object recognition apparatus and method |
JP3431962B2 (ja) | 1993-09-17 | 2003-07-28 | 本田技研工業株式会社 | Automatic traveling vehicle equipped with a lane marking recognition device |
DE4333112A1 (de) | 1993-09-29 | 1995-03-30 | Bosch Gmbh Robert | Method and device for maneuvering a vehicle out of a parking space |
US5883739A (en) * | 1993-10-04 | 1999-03-16 | Honda Giken Kogyo Kabushiki Kaisha | Information display device for vehicle |
JP3381351B2 (ja) | 1993-12-24 | 2003-02-24 | 日産自動車株式会社 | Vehicle surroundings display device |
JP3522317B2 (ja) | 1993-12-27 | 2004-04-26 | 富士重工業株式会社 | Travel guidance device for vehicles |
JP3205477B2 (ja) | 1994-02-17 | 2001-09-04 | 富士フイルムマイクロデバイス株式会社 | Inter-vehicle distance detection device |
JP2919284B2 (ja) | 1994-02-23 | 1999-07-12 | 松下電工株式会社 | Object recognition method |
JPH07311857A (ja) * | 1994-05-16 | 1995-11-28 | Fujitsu Ltd | Image composition display device and simulation system |
JP3475507B2 (ja) | 1994-08-08 | 2003-12-08 | 日産自動車株式会社 | Vehicle surroundings monitoring device |
JPH0896118A (ja) | 1994-09-28 | 1996-04-12 | Nissan Motor Co Ltd | Vehicle surroundings display device |
JP3478432B2 (ja) | 1995-03-02 | 2003-12-15 | 矢崎総業株式会社 | Vehicle periphery monitoring device |
JP3503840B2 (ja) | 1995-04-06 | 2004-03-08 | 矢崎総業株式会社 | Vehicle periphery monitoring device |
JPH09305796A (ja) | 1996-05-16 | 1997-11-28 | Canon Inc | Image information processing device |
JP3328478B2 (ja) | 1995-10-18 | 2002-09-24 | 日本電信電話株式会社 | Camera system |
JP3293441B2 (ja) | 1996-01-09 | 2002-06-17 | トヨタ自動車株式会社 | Imaging device |
US6192145B1 (en) * | 1996-02-12 | 2001-02-20 | Sarnoff Corporation | Method and apparatus for three-dimensional scene processing using parallax geometry of pairs of points |
DE19611718A1 (de) * | 1996-03-25 | 1997-10-02 | Trw Repa Gmbh | Method for controlling the activation of a vehicle occupant restraint system, control system, and vehicle occupant restraint system |
JP3866328B2 (ja) | 1996-06-06 | 2007-01-10 | 富士重工業株式会社 | Three-dimensional object recognition device for vehicle surroundings |
JP3600378B2 (ja) | 1996-07-24 | 2004-12-15 | 本田技研工業株式会社 | Vehicle external environment recognition device |
JP3625622B2 (ja) | 1996-08-30 | 2005-03-02 | 三洋電機株式会社 | Three-dimensional model creation device, three-dimensional model creation method, and medium recording a three-dimensional model creation program |
JP3147002B2 (ja) | 1996-09-26 | 2001-03-19 | 富士電機株式会社 | Method for correcting detected distance values |
US5994701A (en) | 1996-10-15 | 1999-11-30 | Nippon Avonics Co., Ltd. | Infrared sensor device with temperature correction function |
JPH10164566A (ja) | 1996-11-28 | 1998-06-19 | Aiphone Co Ltd | Multi-ceiling camera device |
US6119068A (en) * | 1996-12-27 | 2000-09-12 | Kannonji; Michihiro | Rear-end collision alarming device and method linked to speed control device of a vehicle |
JPH10244891A (ja) | 1997-03-07 | 1998-09-14 | Nissan Motor Co Ltd | Parking assistance device |
JPH10257482A (ja) * | 1997-03-13 | 1998-09-25 | Nissan Motor Co Ltd | Vehicle surroundings display device |
JPH10264841A (ja) | 1997-03-25 | 1998-10-06 | Nissan Motor Co Ltd | Parking guidance device |
JP3571893B2 (ja) | 1997-12-03 | 2004-09-29 | キヤノン株式会社 | Image recording apparatus, image recording method, image database generation apparatus, and image database generation method |
JP3511892B2 (ja) * | 1998-05-25 | 2004-03-29 | 日産自動車株式会社 | Vehicle surroundings monitoring device |
WO2000020257A1 (fr) * | 1998-10-08 | 2000-04-13 | Matsushita Electric Industrial Co., Ltd. | Driving assistance device and recorded medium |
US6396535B1 (en) * | 1999-02-16 | 2002-05-28 | Mitsubishi Electric Research Laboratories, Inc. | Situation awareness system |
KR20010112433A (ko) * | 1999-04-16 | 2001-12-20 | 마츠시타 덴끼 산교 가부시키가이샤 | Image processing device and monitoring system |
JP3966673B2 (ja) * | 1999-10-26 | 2007-08-29 | 本田技研工業株式会社 | Object detection device and vehicle travel safety device |
GB2364192A (en) * | 2000-06-26 | 2002-01-16 | Inview Systems Ltd | Creation of a panoramic rear-view image for display in a vehicle |
EP1916846B1 (en) * | 2005-08-02 | 2016-09-14 | Nissan Motor Company Limited | Device and method for monitoring vehicle surroundings |
JP4812510B2 (ja) * | 2006-05-17 | 2011-11-09 | アルパイン株式会社 | Vehicle surroundings image generation device and photometric adjustment method for an imaging device |
JP5053776B2 (ja) * | 2007-09-14 | 2012-10-17 | 株式会社デンソー | Vehicle visibility support system, in-vehicle device, and information distribution device |
JP5090126B2 (ja) * | 2007-10-23 | 2012-12-05 | アルパイン株式会社 | In-vehicle imaging device |
JP5132249B2 (ja) * | 2007-10-23 | 2013-01-30 | アルパイン株式会社 | In-vehicle imaging device |
JP4900326B2 (ja) * | 2008-06-10 | 2012-03-21 | 日産自動車株式会社 | Parking assistance device and parking assistance method |
JP4840427B2 (ja) * | 2008-07-29 | 2011-12-21 | 日産自動車株式会社 | Vehicle control device |
US8174375B2 (en) * | 2009-06-30 | 2012-05-08 | The Hong Kong Polytechnic University | Detection system for assisting a driver when driving a vehicle using a plurality of image capturing devices |
- 1999
  - 1999-10-06 WO PCT/JP1999/005509 patent/WO2000020257A1/ja active IP Right Grant
  - 1999-10-06 US US09/581,004 patent/US7277123B1/en not_active Expired - Lifetime
  - 1999-10-06 CN CN998017868A patent/CN1132750C/zh not_active Expired - Lifetime
  - 1999-10-06 KR KR1020007006236A patent/KR100568396B1/ko active IP Right Grant
  - 1999-10-06 EP EP99970064.4A patent/EP1038734B1/en not_active Expired - Lifetime
- 2007
  - 2007-08-28 US US11/846,085 patent/US8111287B2/en not_active Expired - Fee Related
  - 2007-08-28 US US11/846,027 patent/US20080033606A1/en not_active Abandoned
  - 2007-08-28 US US11/846,048 patent/US8077202B2/en not_active Expired - Fee Related
- 2013
  - 2013-03-22 US US13/848,800 patent/US9272731B2/en not_active Expired - Fee Related
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106379238A (zh) * | 2016-07-07 | 2017-02-08 | 广州勘帝德电子科技有限公司 | Intelligent vehicle dynamic trajectory line rear-view image system without can_bus |
CN106379238B (zh) * | 2016-07-07 | 2018-11-02 | 广州勘帝德电子科技有限公司 | Intelligent vehicle dynamic trajectory line rear-view image system without can_bus |
Also Published As
Publication number | Publication date |
---|---|
US9272731B2 (en) | 2016-03-01 |
EP1038734A1 (en) | 2000-09-27 |
US7277123B1 (en) | 2007-10-02 |
US20080033606A1 (en) | 2008-02-07 |
US8111287B2 (en) | 2012-02-07 |
US8077202B2 (en) | 2011-12-13 |
EP1038734A4 (en) | 2005-10-12 |
US20130231863A1 (en) | 2013-09-05 |
US20070299572A1 (en) | 2007-12-27 |
US20070299584A1 (en) | 2007-12-27 |
CN1132750C (zh) | 2003-12-31 |
KR20010032902A (ko) | 2001-04-25 |
EP1038734B1 (en) | 2019-05-15 |
KR100568396B1 (ko) | 2006-04-05 |
CN1287532A (zh) | 2001-03-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2000020257A1 (fr) | Driving assistance device and recorded medium | |
JP3445197B2 (ja) | Driving operation assistance device | |
JP4561479B2 (ja) | Parking assistance method and parking assistance device | |
CN101676149B (zh) | Vehicle driving assistance device | |
JP3606816B2 (ja) | Driving operation assistance device | |
JP3938559B2 (ja) | Vehicle reversing assistance device | |
US7088262B2 (en) | Method of operating a display system in a vehicle for finding a parking place | |
JP4696691B2 (ja) | Parking assistance method and parking assistance device | |
JP4670463B2 (ja) | Parking space monitoring device | |
JP2006298115A (ja) | Driving support method and driving support device | |
JP2003063336A (ja) | Parking assistance device | |
JP2004114879A (ja) | Parking assistance device and image display device | |
JP5400316B2 (ja) | Parking assistance device | |
JP2007055378A (ja) | Parking assistance device and parking assistance method | |
JP4499367B2 (ja) | Driving operation assistance device and driving operation assistance method | |
JP2012023505A (ja) | Driving support device | |
JP2003259356A (ja) | Vehicle periphery monitoring device | |
JP4561470B2 (ja) | Parking assistance device | |
JP4059309B2 (ja) | Image display control method and device for an in-vehicle camera | |
JP4561512B2 (ja) | Parking assistance method and parking assistance device | |
JP2009232331A (ja) | Vehicle driving support device | |
JP2004009959A (ja) | Driving support device | |
JP2002274304A (ja) | Parking position setting device | |
JPH11283199A (ja) | Parking assistance device, parking lot data input method, parking assistance system, and program storage medium | |
JP2007090991A (ja) | Vehicle reverse driving support device
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 99801786.8; Country of ref document: CN |
| AK | Designated states | Kind code of ref document: A1; Designated state(s): CN KR US |
| AL | Designated countries for regional patents | Kind code of ref document: A1; Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| WWE | Wipo information: entry into national phase | Ref document number: 1020007006236; Country of ref document: KR; Ref document number: 1999970064; Country of ref document: EP |
| WWE | Wipo information: entry into national phase | Ref document number: 09581004; Country of ref document: US |
| WWP | Wipo information: published in national office | Ref document number: 1999970064; Country of ref document: EP |
| WWP | Wipo information: published in national office | Ref document number: 1020007006236; Country of ref document: KR |
| WWG | Wipo information: grant in national office | Ref document number: 1020007006236; Country of ref document: KR |