WO2012032809A1 - Driving support device - Google Patents
- Authority
- WO
- WIPO (PCT)
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D15/00—Steering not otherwise provided for
- B62D15/02—Steering position indicators ; Steering position determination; Steering aids
- B62D15/029—Steering assistants using warnings or proposing actions to the driver without influencing the steering system
- B62D15/0295—Steering assistants using warnings or proposing actions to the driver without influencing the steering system by overlaying a vehicle path based on present steering angle over an image without processing that image
Definitions
- The present invention relates to a driving support device that supports driving by superimposing a guide image on an image captured by a vehicle-mounted camera.
- As such a driving assistance device, Patent Document 1 discloses a parking assistance device intended to provide information useful for parking in an appropriate manner.
- The parking assist device obtains a predicted travel locus of the vehicle based on the steering angle and displays the predicted travel locus superimposed on the rear-view image. This predicted travel path is derived from a relational expression between the steering angle, the vehicle's wheelbase, and the turning radius.
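The relational expression itself is not reproduced here; a common form of it, under the kinematic bicycle-model assumption (an assumption of this sketch, not a formula stated in the patent), relates the turning radius R, the wheelbase L, and the road-wheel steering angle δ as R = L / tan δ:

```python
import math

def turning_radius(wheelbase_m: float, steer_rad: float) -> float:
    """Turning radius under the kinematic bicycle-model approximation:
    R = L / tan(delta). Returns infinity for a (near-)zero steering
    angle, i.e. straight-ahead travel."""
    t = math.tan(steer_rad)
    if abs(t) < 1e-9:
        return math.inf
    return wheelbase_m / t

# e.g. a 2.7 m wheelbase with 15 degrees of road-wheel steering
r = turning_radius(2.7, math.radians(15.0))
```

Computing this expression, and then projecting the resulting arc into image coordinates, is exactly the per-frame work the configuration below avoids.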
- Such calculation is executed with a CPU at its core, as shown in FIG. That is, between the camera device, which comprises an optical system including an image sensor plus a signal processing circuit, and the monitor device, an image processing device capable of relatively advanced arithmetic is required. Moreover, if the CPU's computing performance is low, drawing of the predicted travel locus is delayed, and the display may fail to follow the steering angle smoothly. Meanwhile, the amount of in-vehicle electrical equipment keeps rising, and market demand is growing both to suppress the power consumption of the vehicle as a whole and to reduce the cost of its electrical equipment.
- the characteristic configuration of the driving support device is as follows.
- An image receiving unit for receiving a photographed image of the periphery of the vehicle photographed by the in-vehicle camera;
- An image output unit that displays, on a display device in the vehicle, a display image in which a guide image using a guide display is superimposed on the captured image;
- A storage unit that stores, as the guide display, a plurality of graphic images of the expected course line generated in advance for each predetermined steering angle, the expected course line being a guide line that indicates the expected course according to the actual steering angle of the vehicle on the captured image and assists the driving operation;
- And a guide image providing unit that acquires the expected course line as the guide display from the storage unit according to at least the actual steering angle of the vehicle, composes the guide image using the acquired guide display, and provides it to the image output unit.
- With this configuration, instead of calculating and drawing the coordinates of the predicted course in the captured image according to the steering angle of the vehicle, the predicted course line generated in advance for each steering angle and stored in the storage unit is simply acquired according to the actual steering angle. A guide image using the predicted course line can therefore be obtained with the steering angle as the only argument, without any complicated calculation.
- The addition of the predicted course line to the captured image is realized by superimposing the guide image using the predicted course line on the captured image, so it too can be executed without complicated calculation. As a result, a guide image can be superimposed on the image captured by the in-vehicle camera with a simple system configuration, reducing the driver's operation burden.
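The lookup approach described above can be sketched as follows. The 5-degree quantization step and the placeholder image strings are assumptions for illustration; the patent does not specify the step or the storage format:

```python
# Pre-rendered course-line images indexed by a quantized steering angle,
# so that display-time code needs no geometry computation at all.
STEP_DEG = 5  # assumed interval between pre-rendered images

# In the real device the table lives in the memory chip; here each entry
# stands in for one pre-rendered graphic image of the expected course line.
course_line_table = {a: f"course_image_{a:+d}deg"
                     for a in range(-540, 541, STEP_DEG)}

def guide_for_steering(actual_angle_deg: float) -> str:
    """Fetch the pre-generated course-line image nearest the measured
    steering angle; the angle is the only argument."""
    key = STEP_DEG * round(actual_angle_deg / STEP_DEG)
    return course_line_table[key]
```

The per-frame cost is one rounding operation and one memory read, which is what lets the scheme run on a simple arithmetic device.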
- It is preferable that the guide image providing unit acquires the guide display from the storage unit in synchronization with the shooting cycle of the captured image or the rewrite cycle of the display image on the display device, and provides the guide image to the image output unit.
- the image output unit can obtain a new guide image for each display image rewrite cycle.
- the guide image also changes at the same speed as the captured image captured by the in-vehicle camera.
- the guide image may be provided at a cycle that is an integral multiple of the rewrite cycle of the display image. Even in this case, there is no change in being synchronized with the rewrite cycle of the display image.
- For example, a guide image may be provided at twice the display-image rewrite cycle, then at three times the cycle, then at twice again; such an irregular case, in which two guide images are provided over five rewrite cycles overall, is also included in "synchronized" here. If the video format that the monitor device can display does not match the video format of the captured image from the in-vehicle camera, the captured image cannot be displayed on the monitor device in the first place. Therefore, synchronizing with the rewrite cycle of the display image is substantially equivalent to synchronizing with the shooting cycle of the captured image.
- When the error between the ideal coordinates and the actual coordinates of a reference point in the two-dimensional coordinate system of the captured image is defined as the positional error of the optical system arising when the in-vehicle camera is attached to the vehicle, the guide image providing unit of the driving support device preferably corrects the position of the guide display in the display image based on a correction parameter for correcting this positional error on the display image.
- the ideal coordinates of the reference point in the two-dimensional coordinate system of the photographed image may not match the actual coordinates due to the effects of component tolerances and mounting tolerances.
- Since the error in the coordinates of the reference point is an offset on the projection plane of the two-dimensional coordinate system, it can easily be corrected by a correction parameter that defines the offset amount. That is, it suffices to add an offset to the coordinate values in the two-dimensional coordinate system, without complicated calculations such as coordinate transformation. By such correction, the positional error of the optical system can be corrected very easily.
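The offset correction amounts to one addition per axis. A minimal sketch, where the pixel values are illustrative:

```python
def corrected_anchor(ideal_xy, offset_xy):
    """Shift the guide display's anchor coordinates in the display image
    by the calibration offset (actual minus ideal reference-point
    coordinates). Only one addition per axis is needed; no coordinate
    transformation is performed."""
    return (ideal_xy[0] + offset_xy[0], ideal_xy[1] + offset_xy[1])

# Example: the camera sits 3 px right and 2 px low of its ideal position,
# so the guide is drawn with the same offset to stay aligned with the scene.
anchor = corrected_anchor((320, 400), (3, 2))  # -> (323, 402)
```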
- When the rotation angle of the captured image relative to its ideal state is defined as the rotational error of the optical system arising when the in-vehicle camera is attached to the vehicle, the storage unit of the driving support device according to the present invention may further store graphic images of the guide display according to the steering angle of the vehicle for each rotational error of the optical system, and the guide image providing unit preferably acquires the graphic image of the expected course line from the storage unit according to both the actual steering angle of the vehicle and a rotation parameter indicating the rotational error of the optical system when the in-vehicle camera is attached.
- The ideal angle of the coordinate axes in the two-dimensional coordinate system of the captured image may not match the actual angle owing to component and mounting tolerances. Since this angular error is a rotation of the captured image, correcting it by counter-rotating the image would require coordinate transformation, whose calculation load is relatively high. According to this configuration, however, the storage unit additionally stores graphic images according to the rotational error of the optical system, and the guide image providing unit acquires the graphic image of the expected course line from the storage unit according to the actual steering angle and the rotation parameter. The guide image providing unit can thus provide the image output unit with a guide image in which the rotational error of the optical system is suppressed, without heavy calculations such as coordinate transformation.
- Alternatively, with the rotation angle of the captured image relative to its ideal state again defined as the rotational error of the optical system arising when the in-vehicle camera is attached, the storage unit of the driving support device according to the present invention may store graphic images of the expected course line over a range wider than the range of actual steering angles, and the guide image providing unit preferably corrects the actual steering-angle value based on the rotation parameter indicating the rotational error of the optical system, and acquires the expected course line from the storage unit according to the corrected value.
- The calculation load of coordinate transformation is relatively high, but according to this configuration the actual steering-angle value is corrected based on the rotation parameter and the storage unit is accessed according to the corrected value, so no heavy calculation such as coordinate transformation is required. Since the storage unit stores graphic images of the expected course line over a range wider than the actual steering-angle range, correcting the steering-angle value based on the rotation parameter causes no problem.
- Thus, here too, the guide image providing unit can provide the image output unit with a guide image in which the rotational error of the optical system is suppressed, without heavy calculations such as coordinate transformation.
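This second alternative can be sketched as below. The table range, step, and especially the expression of the camera's rotational error as an additive steering-angle offset are illustrative assumptions, not formulas given in the patent:

```python
STEP_DEG = 5        # assumed quantization step of the stored images
MAX_STORED = 600    # stored range, deliberately wider than an assumed
                    # ±540 deg range of real steering travel

table = {a: f"course_{a:+d}" for a in range(-MAX_STORED, MAX_STORED + 1, STEP_DEG)}

def lookup_with_rotation(actual_deg: float, rotation_offset_deg: float) -> str:
    """Correct the steering-angle value by the rotation parameter, then
    index the over-provisioned table; no image rotation (and hence no
    coordinate transformation) is performed."""
    corrected = actual_deg + rotation_offset_deg
    key = STEP_DEG * round(corrected / STEP_DEG)
    return table[key]
```

Because the table extends beyond the physical steering range, a corrected value that falls outside that range still resolves to a stored image.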
- For the graphic images of the expected course line stored in the storage unit of the driving assistance device, it is preferable that the predetermined steering-angle interval in the vicinity of the neutral position of the steering wheel of the vehicle is smaller than the interval away from the neutral position. The resolution near the neutral position then becomes high, and the expected course line sensitively follows the steering from the neutral position where steering begins. The driver can therefore be promptly shown that the expected course line is changing in the display image; the driver recognizes the movement of the expected course at an early stage, and the driving-support effect is enhanced.
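Such a nonuniform quantization might look like the following, where the 1-degree/5-degree steps and the 30-degree boundary are assumptions for illustration:

```python
def quantize_steering(angle_deg: float) -> float:
    """Nonuniform quantization of the steering angle: a fine step near the
    neutral position so the course line follows the start of steering
    sensitively, and a coarse step elsewhere to save storage."""
    step = 1.0 if abs(angle_deg) <= 30.0 else 5.0
    return step * round(angle_deg / step)
```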
- It is preferable that the storage unit of the driving assistance device further stores, as a fixed guide, a guide display shown at a predetermined position on the display screen regardless of the actual steering angle of the vehicle on the captured image, and that the guide image providing unit acquires the expected course line and the fixed guide from the storage unit, combines them into one guide image, and provides the combined image to the image output unit. Even when several types of guide display are used, the guide image providing unit merges them into a single guide image, so the image output unit only has to superimpose the captured image and that one guide image. The calculation load is thereby reduced, and the driving support device can be built with a simple system configuration.
- It is preferable that the fixed guide includes, as a guide line, a vehicle extension line indicating an extension of the vehicle's travel direction at a predetermined steering angle, regardless of the actual steering angle of the vehicle on the captured image. Since both the vehicle extension line and the expected course line are superimposed on the captured image as the guide image, the relationship between the steering amount and the travel direction is conveyed to the driver easily.
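The layer-merging step can be sketched as below; the flat-list image representation and the transparent pixel value are simplifying assumptions:

```python
TRANSPARENT = 0  # pixel value treated as "nothing drawn here"

def compose_guide(*layers):
    """Merge several guide-display layers (expected course line, vehicle
    extension line, message) into a single guide image; later layers win
    where they are non-transparent. Images are flat pixel lists here."""
    out = list(layers[0])
    for layer in layers[1:]:
        for i, px in enumerate(layer):
            if px != TRANSPARENT:
                out[i] = px
    return out

def superimpose(captured, guide):
    """With one combined guide image, the image output unit needs only
    this single overlay onto the captured image."""
    return [g if g != TRANSPARENT else c for c, g in zip(captured, guide)]
```

Pre-combining the guide layers is what keeps the output stage to a single superimposition, regardless of how many guide displays are in use.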
- In a preferred aspect, the driving support device is an integrated in-vehicle camera module comprising: an image sensor device that includes an image sensor core (an image sensor that converts the scene around the vehicle into an analog image signal by photoelectric conversion, and a signal processing unit that processes the analog image signal to generate the captured image as a digital image signal) and an image sensor processor that acquires the guide display, including at least the expected course line, from the storage unit to generate the guide image, and that superimposes the captured image and the guide image to generate and output the display image; an optical unit that forms an image of the scene around the vehicle on the light-receiving surface of the image sensor; and a memory in which the graphic images of the guide display are stored.
- With this configuration, the integrated in-vehicle camera module handles everything from capturing the image to outputting the display image with the guide image superimposed. That is, if a monitor device is already mounted on the vehicle, the driving support device can be added very simply by displaying this display image on it. In general, such a driving support function must be added when the vehicle is produced at the manufacturer's factory; here, however, the driving support device of the present invention can be added merely by mounting the in-vehicle camera module on a vehicle and connecting it to a monitor device. It is therefore quite feasible to add the device even at a maintenance shop, such as a dealer, where the optical axis can easily be adjusted.
- Since the driving support device can be realized by an in-vehicle camera module built from simple arithmetic devices, without mounting a high-performance arithmetic unit on the vehicle, it can be added at low cost.
- When the vehicle includes an object detection unit that detects objects around the vehicle, it is preferable that the storage unit stores, as surrounding information indicating the situation around the vehicle, a plurality of graphic images generated in advance for each distance from the vehicle to the object and for each direction in which the object exists, and that the guide image providing unit acquires the surrounding information from the storage unit based on the detection result of the object detection unit, combines it into the guide image, and provides it to the image output unit.
- In this way, dynamic information from the object detection unit can be shown as a guide display distinct from the expected course line, and the display need be changed only as much as the surrounding situation actually changes, which is efficient.
- the surrounding information can be combined with the upper part in the guide image.
- the upper part of the captured image often includes an unnecessary scene such as the upper floor of the building or the sky.
- The surrounding information can thus be placed in a region of the captured image that contains an unnecessary scene, so the regions that actually show the situation around the vehicle (for example, regions requiring attention) are not obscured.
- the surrounding information may be combined with the lower part in the guide image.
- the lower part of the captured image often includes the vehicle body.
- In that case, the surrounding information can be placed in the region of the captured image occupied by the vehicle body, so the regions that show the situation around the vehicle (for example, regions requiring attention) are likewise not obscured.
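Placing the surrounding-information graphic at the top or bottom band of the guide image can be sketched as follows; the row-list image representation is a simplifying assumption:

```python
def place_surrounding_info(guide_rows, info_rows, position="top"):
    """Overwrite the top or bottom rows of the guide image with the
    surrounding-information graphic, covering the part of the captured
    image that typically shows only sky / upper floors (top) or the
    vehicle body (bottom)."""
    n = len(info_rows)
    out = list(guide_rows)
    if position == "top":
        out[:n] = info_rows
    else:
        out[len(out) - n:] = info_rows
    return out
```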
- Brief description of the drawings: a block diagram schematically showing an example of the system configuration of a vehicle; a partially cut-away perspective view of a vehicle; a block diagram schematically showing an example of the functional configuration of the driving support device
- A diagram schematically showing an example of the data format of the memory in which graphic images corresponding to rotation errors are stored; a diagram schematically showing another example of that data format
- As an example, a driving support device (parking support device / periphery monitoring device) 10 will be described in which the monitor device 4 displays, as a display image VI, a captured image PI of the surroundings of the vehicle 90, captured as a moving image by a camera 100 (in-vehicle camera) provided on the vehicle.
- The camera 100 is a digital camera that uses a CCD (charge-coupled device), a CIS (CMOS image sensor), or another imaging element to capture two-dimensional images in time series at 15 to 30 frames per second, converts them digitally, and outputs the moving-image data (captured images) in real time.
- the camera 100 is configured as a part of a camera module (vehicle camera module) 1 in which an optical unit 3 such as a lens and an optical path, an image sensor device 5, and a memory 7 are integrated.
- the camera module 1 corresponds to the driving support device 10 of the present invention.
- The image sensor device 5 is a semiconductor chip in which an image sensor and a signal processing circuit for the electrical signals produced by the image sensor's photoelectric conversion are integrated together. For such a chip, a CMOS process is desirable from the standpoint of integration density and power consumption; in this embodiment, therefore, a CIS is used as the image sensor.
- the image sensor device 5 includes an image sensor core 8, an image sensor processor (ISP) 9, an output interface 99, and a memory interface 6.
- The image sensor core 8 includes a CIS 81 serving as the image sensor that converts the scene around the vehicle 90 into an analog image signal by photoelectric conversion, and a signal processing unit that processes the analog image signal to generate the captured image PI as a digital image signal.
- the signal processing unit includes an analog signal processing circuit 82 configured by an amplifier circuit, a correlated double sampling circuit, and the like, and an A / D converter 83 that converts the analog signal into a digital signal.
- the analog signal that has passed through the analog signal processing circuit 82 is converted into a digital signal by the A / D converter 83, and a captured image PI is generated by the digital image signal.
- the camera 100 is configured by the image sensor core 8 and the optical unit 3.
- The optical unit 3 includes a lens and a lens barrel (optical path) that keeps stray light away from the path from the lens to the light-receiving surface of the CIS 81, and forms an image of the surroundings of the vehicle 90 on that light-receiving surface.
- the lens provided in the optical unit 3 is a wide-angle lens. In particular, in the present embodiment, a viewing angle of 140 to 190 ° is ensured in the horizontal direction.
- the camera module 1 is installed in the vehicle 90 with a depression angle of about 30 degrees on the optical axis of the optical unit 3, and the camera 100 can capture an area from the vehicle 90 to about 8 m.
- a back camera that captures the rear of the vehicle 90 is illustrated as the camera 100, but a camera that captures the front or side of the vehicle 90 may be used.
- The number of cameras 100 provided on the vehicle 90 is not limited to one; a plurality of cameras may be provided, for example at the rear and on the passenger side, at the front and the rear, or at the front, the rear, and both sides.
- the photographed image PI photographed by the camera 100 can be displayed as a display image VI on the monitor device 4 (display device) in the vehicle via the ISP 9, as shown in FIG.
- the ISP 9 is configured with a DSP (digital signal processor) as a core, but may be configured with another logical operation processor such as a CPU or a logic circuit as a core.
- the ISP 9 may execute image processing such as correcting distortion caused by shooting with a wide-angle lens, changing viewpoint, and adjusting color. Note that no image processing may be executed by the ISP 9, and the captured image PI and the display image VI may be the same image.
- the ISP 9 superimposes the guide image GI on the captured image PI to form the display image VI.
- the guide image GI is an image using a guide display V for assisting the driving operation of the vehicle 90 by the driver as shown in FIG.
- the guide display V is a graphic image including a guide line G and a message M that assist the driving operation of the vehicle 90 by the driver.
- The guide line G includes an expected course line C, which changes according to the steering angle of the vehicle 90, and a vehicle extension line E, a fixed guide F drawn at a fixed position regardless of the steering angle.
- the image sensor device 5 acquires the guide display V from the memory 7 via the memory interface 6 and generates a guide image GI.
- the memory interface 6 is, for example, a SPI (serial peripheral interface).
- When the image sensor device 5 generates the display image VI by superimposing the guide image GI on the captured image PI, it outputs the display image VI to the monitor device 4 via the output interface 99.
- Although the memory 7 is shown here as a chip separate from the image sensor device 5, nothing prevents the memory 7 from being integrated into the same package as the image sensor device 5.
- the monitor device 4 is also used as a monitor device for a navigation system, for example.
- the monitor device 4 includes a display unit 4a and a touch panel 4b formed on the display unit 4a.
- the display unit 4a displays the display image VI provided from the camera module 1.
- the display unit 4a is configured by a liquid crystal display.
- the touch panel 4b is a pressure-sensitive or electrostatic instruction input device that is formed together with the display unit 4a and can output a contact position by a finger or the like as location data.
- The image sensor device 5 is provided with a controller 91, which controls the various arithmetic units of the image sensor device 5.
- Apart from the memory interface 6, the controlled units are not shown, but the image sensor core 8, the functional units within the ISP 9 including the data calculation unit 92, the output interface 99, and so on are naturally among the controller's targets.
- the image sensor device 5 can also communicate with various systems and sensors via an in-vehicle network indicated by reference numeral 50 in FIG.
- a CAN (controller area network) 50 is illustrated as an in-vehicle network.
- The power steering system 31 and the brake system 37 are built around an electronic control unit (ECU) consisting of an electronic circuit such as a CPU together with its peripheral circuits.
- The power steering system 31 is an electric power steering (EPS) or steer-by-wire (SBW) system including an actuator 41 and a torque sensor 22.
- The brake system 37 includes an actuator 47 and a brake sensor 27, and comprises an anti-lock braking system (ABS) that prevents the brakes from locking, an electronic stability control (ESC) that suppresses skidding of the vehicle during cornering, and the like.
- a steering sensor 21, a wheel speed sensor 23, a shift lever switch 25, and an accelerator sensor 29 are connected to the CAN 50 as examples of various sensors.
- the steering sensor 21 is a sensor that detects the steering amount (rotation angle) of the steering wheel, and is configured using, for example, a Hall element.
- the camera module 1 constituting the driving support device 10 can acquire the steering amount of the steering wheel 2 by the driver from the steering sensor 21.
- the wheel speed sensor 23 is a sensor that detects the amount of rotation of the wheel of the vehicle 90 and the number of rotations per unit time, and is configured using, for example, a Hall element.
- the wheel speed sensor 23 may be provided in the brake system 37 in order to quickly detect a brake lock, an idling of the wheel, a sign of a skid, or the like from a difference in rotation between the left and right wheels.
- the driving support device 10 acquires information via the brake system 37.
- the brake sensor 27 is a sensor that detects an operation amount of a brake pedal.
- the shift lever switch 25 is a sensor or switch that detects the position of the shift lever, and is configured using a displacement sensor or the like. For example, when the shift is set to reverse, the driving support device 10 can start the support control, or can end the support control when the shift is changed from reverse to forward. Further, the torque sensor 22 for detecting the operation torque to the steering wheel can detect whether or not the driver is holding the steering wheel.
- the driving support device 10 of the present invention is configured in the camera module 1 with the ISP 9 of the image sensor device 5 as a core.
- the driving support apparatus 10 includes functional units including an image receiving unit 11, a storage unit 13, a guide image providing unit 17, and an image output unit 19.
- Each of these functional units does not need to have an independent physical configuration, and it is sufficient if the respective functions are realized. That is, each functional unit may share the same hardware, or each function may be realized by cooperation of software such as programs and parameters and hardware.
- the image receiving unit 11 is a functional unit that receives a captured image PI around the vehicle 90 captured by the in-vehicle camera 100, specifically, the optical unit 3 and the image sensor core 8.
- the data calculation unit 92 functions as the image receiving unit 11.
- the storage unit 13 is a functional unit that stores a graphic image constituting the guide display V.
- the memory 7 functions as the storage unit 13.
- the guide image providing unit 17 is a functional unit that acquires a graphic image constituting the guide display V from the storage unit 13 (memory 7) and provides it to the image output unit 19.
- the controller 91, the buffer 93, the decompression unit 94, the overlay unit 95, and the data calculation unit 92 function as the guide image providing unit 17.
- The guide image providing unit 17 acquires a graphic image of the expected course line C from the storage unit 13 according to the actual steering angle θ of the vehicle 90. Specifically, the guide image providing unit 17 obtains the steering angle θ of the vehicle 90 from the steering sensor 21, designates an address based on θ, accesses the memory 7, and acquires the graphic image of the expected course line C corresponding to θ.
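The steering-angle-to-address mapping A(θ) might be computed as below. All four constants (base address, image size, step, and stored range) are illustrative assumptions; the patent does not give a memory map:

```python
BASE_ADDR = 0x1000   # assumed base address of the course-line image bank
IMAGE_SIZE = 0x800   # assumed byte size of one pre-rendered graphic image
STEP_DEG = 5         # assumed steering-angle step between stored images
MIN_DEG = -540       # assumed smallest stored steering angle

def address_for_angle(theta_deg: float) -> int:
    """A(theta): map the measured steering angle to the memory address of
    the matching pre-rendered course-line image stored in the memory 7."""
    index = round((theta_deg - MIN_DEG) / STEP_DEG)
    return BASE_ADDR + index * IMAGE_SIZE
```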
- the image output unit 19 is a functional unit that displays the display image VI on the monitor device 4 in the vehicle 90 as an image obtained by superimposing the guide image GI using the guide display V on the captured image PI.
- the data calculation unit 92 and the output interface 99 function as the image output unit 19.
- the guide display V refers to a graphic image including a guide line G that assists the driving operation of the vehicle 90 by the driver.
- the guide display V includes a guide line G, a fixed guide F, a vehicle extension line E, a message M, and an expected course line C shown in FIG.
- the guide line G is a reference line for guiding the driver in order to assist the driving operation of the vehicle 90 by the driver.
- the guide line G includes a vehicle extension line E and an expected course line C.
- the vehicle extension line E is a rear vehicle width extension line obtained by extending the vehicle width of the vehicle 90 rearward.
- When a vehicle extension line is superimposed on a captured image of the area in front of the vehicle 90, it may instead be a front vehicle-width extension line.
- the vehicle extension line E may be an extension line of the axle in the front-rear direction without being limited to the vehicle width. Further, the vehicle extension line E may be such that the front and rear ends of the vehicle 90, the center in the vehicle length direction, and the like are extended toward the side of the vehicle 90. Since the position of the vehicle extension line E is determined by the relationship between the vehicle 90 and the captured image PI by the camera 100, it is superimposed on a certain position in the captured image PI. Therefore, the vehicle extension line E is included in the fixed guide F.
- the fixed guide F includes a message M.
- The expected course line C is a guide line G indicating the expected course according to the actual steering angle of the vehicle 90 on the captured image PI. The expected course line C is therefore not superimposed at a fixed position on the captured image PI; it is a dynamic guide that changes according to the actual steering angle θ of the vehicle 90. As shown in FIG. 5C, when the steering wheel 2 is in the neutral position, the steering angle θ is zero, and the expected course line C is drawn at a position substantially overlapping the vehicle extension line E.
- the expected course line C has a shape curved in the left direction.
- the captured image PI is a mirror image that is reversed left and right. Therefore, the expected route C superimposed on the captured image PI is also drawn as a mirror image.
- the expected route C has a shape curved to the left larger than (b).
- the expected route C is curved to the right.
- the expected course line C has a shape curved to the right larger than (d).
- The guide image providing unit 17, built around the ISP 9, reads up to four types of guide display from the memory 7 and superimposes them on the captured image PI within the time in which the monitor device 4 rewrites one screen, generating the display image VI. If the video format that the monitor device 4 can display does not match the video format of the captured image PI from the camera 100 (or of the display image VI), the captured image PI (display image VI) cannot be displayed on the monitor device 4.
- This video format is defined by the number of vertical and horizontal pixels such as NTSC / PAL, the number of screens per unit time (or screen frequency / line frequency), and a display method such as interlace / progressive.
- it can thus be said that the guide image providing unit 17 synchronizes with the shooting period of one screen of the captured image PI (or the generation period of the display image VI): within the time in which the camera 100 captures one screen of the captured image PI, it reads the guide display from the memory 7 and superimposes it on the captured image PI to generate the display image VI.
- for example, when the monitor device 4 displays an image at a frame rate of 30 fps, the guide image providing unit 17 generates one frame of the display image VI in 1/30 second (≈33 ms).
- accordingly, guide displays V of a type that the guide image providing unit 17 can process within 1/30 second can be stored in the memory 7 serving as the storage unit 13.
- the memory 7 can store four types of guide display V, indicated by OV1 to OV4. In the present embodiment, an example in which three types of guide display V, OV1 to OV3, are stored will be described.
- OV1 is a guide display V corresponding to the vehicle extension line E.
- OV2 is a guide display V corresponding to the expected route C.
- OV3 corresponds to the message M.
- although the vehicle extension line E and the message M correspond to the fixed guide F, this does not mean that only one design of each is stored. For example, if it is determined from the detection result of the clearance sonar 33 that an obstacle exists in the vicinity of the vehicle 90, the message M may be displayed in a different color. In this case, a guide display V stored at a different address in the memory 7 is read out as OV3. The same applies to the vehicle extension line E.
- the guide image providing unit 17 accesses the memory 7 at the address A(θ) corresponding to the steering angle θ, which is indicated by the rotation angle of the steering wheel 2 or the displacement angle of the wheels, and acquires the expected course line C corresponding to the steering angle θ. The guide image providing unit 17 also acquires the other guide displays V, such as the vehicle extension line E and the message M, from the memory 7 as needed under the control of the controller 91.
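The address lookup A(θ) described above can be sketched as a simple table index. This is a hypothetical illustration only: the base address, slot size, and 15-degree storage step are assumptions for the example, not values from the patent.

```python
# Hypothetical sketch of the A(theta) lookup: precomputed guide-display
# graphics are stored at addresses derived from the steering angle, so
# retrieval is a table index rather than a per-frame drawing computation.
# BASE_ADDR, SLOT_SIZE, and STEP_DEG are illustrative assumptions.

STEP_DEG = 15          # assumed storage increment of the steering angle
BASE_ADDR = 0x1000     # assumed start address of the expected-course images
SLOT_SIZE = 0x0400     # assumed bytes reserved per graphic image

def guide_address(theta_deg: float, max_angle: int = 540) -> int:
    """Map a steering-wheel angle to the memory address A(theta)."""
    # Clamp to the stored range, then snap to the nearest stored increment.
    theta = max(-max_angle, min(max_angle, theta_deg))
    index = round(theta / STEP_DEG) + max_angle // STEP_DEG  # 0-based slot
    return BASE_ADDR + index * SLOT_SIZE
```

With these assumed constants, the neutral position (θ = 0) maps to the middle slot of the table, and angles beyond the stored range are clamped rather than producing an out-of-range address.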
- a guide display V having a different shape is drawn and stored in advance for each predetermined steering angle, here every 15 degrees of the rotation angle of the steering wheel 2.
- the increment of the predetermined steering angle, i.e. the so-called resolution, need not be uniform:
- in the vicinity of the neutral position, the guide display V may be prepared and stored in the memory 7 at a higher resolution than in the regions where the steering wheel is turned further.
- for example, with the range of 90 degrees to the left and right of the neutral position treated as the vicinity of the neutral position, the expected course line C is generated every 5 degrees or every 10 degrees of the rotation angle of the steering wheel 2 within that range.
- outside that range, the expected course line C may be generated every 15 degrees. In this way the expected course line C follows operations from the neutral position, where steering begins, with high sensitivity, so the driver can be quickly shown in the display image VI that the expected course line C is changing, and can recognize its movement at an early stage.
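The variable-resolution storage above can be sketched as a quantization step: fine increments near neutral, coarse increments elsewhere. The function below is an assumed implementation; the 5-degree and 15-degree steps and the ±90-degree neutral region come from the text, but the rounding scheme is illustrative.

```python
# Illustrative sketch of variable-resolution storage of the expected
# course line: near the neutral position the line is stored every
# 5 degrees, elsewhere every 15 degrees (values from the description).

def quantize_steering(theta_deg: float) -> int:
    """Snap a steering angle to the nearest stored increment."""
    step = 5 if abs(theta_deg) <= 90 else 15  # finer resolution near neutral
    return step * round(theta_deg / step)
```

For example, a 7-degree wheel input snaps to the 5-degree graphic, while a 100-degree input snaps to the 105-degree graphic.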
- the guide display V stored in the memory 7 is compressed in order to reduce the data capacity.
- the guide display V, acquired from the memory 7 via the buffer 93 (a small-capacity memory, register, or the like), is decompressed (expanded) in the decompression unit 94. When a plurality of guide displays V are to be superimposed on the captured image PI, the decompressed guide displays V are superimposed in the overlay unit 95 and combined into one guide image GI.
- the guide image GI is superimposed on the captured image PI in the data calculation unit 92 to become a display image VI, and is output to the monitor device 4 via the output interface 99.
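The decompress → overlay → superimpose pipeline above can be sketched with images modeled as flat pixel lists. This is a minimal illustration under stated assumptions: the run-length encoding, the convention that pixel value 0 means "transparent", and the function names are all hypothetical.

```python
# Minimal sketch of the pipeline: compressed guide displays are
# decompressed, overlaid into one guide image GI, and GI is superimposed
# on the captured image PI to yield the display image VI.
# Images are flat lists of pixels; 0 means "transparent" (an assumption).

def decompress_rle(runs):
    """Expand [(count, value), ...] run-length data into a pixel list."""
    return [v for count, v in runs for _ in range(count)]

def overlay(layers):
    """Combine guide displays: the first non-transparent pixel wins."""
    return [next((p for p in pixels if p != 0), 0) for pixels in zip(*layers)]

def superimpose(captured, guide):
    """Draw the guide image over the captured image."""
    return [g if g != 0 else c for c, g in zip(captured, guide)]
```

A full frame would chain the three steps: decompress each stored guide display, overlay them into one GI, then superimpose GI on PI.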
- the guide image providing unit 17 obtains the graphic images constituting the guide display V from the storage unit 13 in synchronization with the rewriting cycle of the display image VI in the monitor device 4, and provides the guide image GI to the image output unit 19. Accordingly, the ISP 9 also acquires the steering angle θ from the steering sensor 21 in synchronization with the rewriting cycle of the display image VI. Even if the steering angle θ has not changed, or has changed by less than its minimum resolution, there is no problem in reading the expected course line C from the memory 7 each time: the same graphic image is read from the memory 7 and the same guide image GI is provided to the image output unit 19, but making the processing routine in this way reduces the computational load.
- as a result, the image output unit 19 can obtain a new guide image GI in each rewriting cycle of the display image VI.
- consequently, the guide image GI changes at the same rate as the captured image PI. Despite the simple configuration, the driver is therefore given exactly the same visual effect as if a guide display V generated by computation were drawn directly on the captured image PI. If a lower update rate of the guide display V causes no problem, the guide image may instead be provided at a cycle that is an integer multiple of the rewriting cycle of the display image VI.
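The rewrite-cycle synchronization above, including the option of refreshing the guide only every integer multiple of the display rewrite cycle, can be sketched as a frame-count test. The frame counter and the choice of multiple are illustrative assumptions.

```python
# Sketch of rewrite-cycle synchronization: the guide image GI is composed
# once per rewrite cycle of the display image VI, or, when a slower guide
# update is acceptable, once every integer multiple of that cycle.

def should_fetch_guide(frame: int, multiple: int = 1) -> bool:
    """Return True when a new guide image GI should be composed this frame."""
    assert multiple >= 1
    return frame % multiple == 0
```

With `multiple=1` the guide is refreshed on every frame, matching the rate of the captured image PI; with `multiple=2` it is refreshed every other frame.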
- blinking can be expressed by intentionally thinning out the rewriting periods in which the message M is added.
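The blinking technique above amounts to drawing the message layer only in some rewrite periods. The duty cycle below (on for 20 frames, off for 10) is an assumption chosen for illustration.

```python
# Sketch of blinking by thinning: the message M is included only in some
# rewrite periods, so it appears to blink on the display image VI.
# The on/off frame counts are illustrative assumptions.

def message_visible(frame: int, on_frames: int = 20, off_frames: int = 10) -> bool:
    """Return True if the message layer should be drawn this frame."""
    return frame % (on_frames + off_frames) < on_frames
```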
- when the camera 100 is mounted on the vehicle 90, there is an error between its actual mounting position and posture and the ideal ones. That is, the two-dimensional projective coordinate system in the captured image PI does not necessarily match the ideal design coordinate system.
- the error between the actual coordinate system and the ideal coordinate system is roughly divided into two.
- One is a translation error in which the coordinate center (intersection with the optical axis orthogonal to the projection plane) of the two-dimensional projection plane serving as the captured image PI deviates from the ideal coordinates in the three-dimensional world coordinate system.
- This is an error of the optical system (optical unit 3) regarding the position when the camera 100 is attached to the vehicle 90.
- the other is a rotation error in which the two-dimensional projection surface that is the captured image PI rotates in the three-dimensional world coordinate system with, for example, each axis of three-dimensional orthogonal coordinates as a rotation axis.
- This is an error of the optical system (optical unit 3) related to the posture when the in-vehicle camera 100 is attached to the vehicle 90.
- the camera 100 is calibrated when it is installed in the vehicle 90.
- calibration parameters for correcting translation errors and rotation errors are obtained by calibration, and the captured image PI is corrected based on the calibration parameters.
- a correction parameter for mainly correcting a translation error (position error) and a rotation parameter indicating a rotation error (attitude error) are obtained by calibration.
- the correction parameter for correcting the position error corrects the error of the optical system related to the position on the display image VI.
- the correction parameter defines the deviation between the coordinates of the optical center on the projection plane and a predetermined reference point as offset values along the two axes of the two-dimensional orthogonal coordinate system.
- the camera 100 captures a calibration index in which ideal coordinate values are defined. Since an ideal coordinate value is also defined for a reference point that is a specific point on the calibration index, an offset value is determined by obtaining an error from the coordinate of the reference point in the actual captured image PI. Specifically, the reference point on the calibration index set at a specified position in three dimensions is converted into an ideal coordinate value on the captured image based on an ideal conversion parameter. Then, a difference between the actual coordinate value of the reference point on the captured image PI obtained by photographing the calibration index with the camera 100 and the ideal coordinate value is obtained, and this is used as an error. Based on this error, an offset value is determined. Such an offset value (correction parameter) is calculated when the camera 100 is installed in the vehicle and stored in the driving support device 10. The same applies to rotation parameters described later, which are calculated when the camera 100 is installed in the vehicle and stored in the driving support device 10.
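The offset calibration described above compares the ideal reference-point coordinate on the captured image with the coordinate actually observed when the camera photographs the calibration index; their difference is the two-axis offset (correction parameter) applied to the guide display's reference coordinates. This is a minimal sketch; the sample coordinates and function names are illustrative, not from the patent.

```python
# Minimal sketch of the offset (correction-parameter) calibration:
# the difference between the ideal and observed reference-point
# coordinates becomes a two-axis offset applied to the guide display's
# reference coordinate position information.

def offset_parameters(ideal_xy, observed_xy):
    """Return the (dx, dy) offset between ideal and observed reference points."""
    return (observed_xy[0] - ideal_xy[0], observed_xy[1] - ideal_xy[1])

def apply_offset(reference_xy, offset):
    """Shift a guide display's reference coordinate by the stored offset."""
    return (reference_xy[0] + offset[0], reference_xy[1] + offset[1])
```

No coordinate transformation is needed at display time: the stored offset is simply added to the reference coordinate of each guide display, which is the ease-of-adjustment point made in the text.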
- the guide display V has graphic information and reference coordinate position information, and is stored in the memory 7 (storage unit 13).
- the guide image providing unit 17 adjusts the offset of the reference coordinate position information using the correction parameter. Since this guide image GI is superimposed on the captured image PI to become the display image VI, the error of the optical system related to the position is corrected on the display image VI. That is, the error of the optical system related to the position can be adjusted very easily without performing complicated calculations such as coordinate transformation.
- the errors related to pan (horizontal shake of the screen) and tilt (vertical shake of the screen), which are usually counted among the rotation errors, are here included in the position error and are adjusted when the position error is adjusted.
- the roll (rotation of the screen) that is the remaining element of the rotation error is corrected as follows. That is, the rotation error is adjusted by using the rotation angle of the actual captured image PI with respect to the ideal state of the captured image PI as an error of the optical system related to the rotation when the camera 100 is attached to the vehicle 90.
- here, the storage unit 13 and the guide image providing unit 17 cooperate to adjust the rotation error.
- the storage unit 13 stores guide displays V generated in advance for each predetermined value of the rotation error of the optical system, and the guide image providing unit 17 acquires the guide display V to which the adjustment amount has been applied, based on the rotation parameter indicating the rotation error.
- that is, the storage unit 13 stores graphic images representing the guide display V according to both the value of the rotation parameter roll of the optical system and the value of the steering angle θ of the vehicle 90.
- the graphic images are prepared and stored for steering angles θ up to B′, whose absolute value is slightly larger than the maximum rotation angle B, to provide a margin.
- the guide image providing unit 17 accesses the address A(θ, roll) of the memory 7 (storage unit 13), determined from the rotation parameter (optical-system error) roll and the value of the steering angle θ, and acquires the guide display V to which the adjustment amount has been applied.
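The two-key retrieval A(θ, roll) above can be sketched as an index into a table laid out by (rotation error, steering angle). All numeric choices below (steps, ranges, layout) are assumptions for illustration; the patent only states that both values select the stored graphic.

```python
# Hypothetical sketch of A(theta, roll): each (steering angle, rotation
# error) pair selects a pre-rotated graphic from a two-dimensional table.
# Step sizes, ranges, and row-major layout are illustrative assumptions.

THETA_STEP = 15      # assumed steering-angle increment (degrees)
ROLL_STEP = 1        # assumed rotation-error increment (degrees)
ROLL_RANGE = 5       # assumed +/- range of stored rotation errors (degrees)
SLOTS_PER_ROLL = 73  # assumed slots covering -540..540 deg in 15-deg steps

def address_index(theta_deg: float, roll_deg: float) -> int:
    """Return the table index for the pair (theta, roll)."""
    t = round(theta_deg / THETA_STEP) + 36        # 0..72
    r = round(roll_deg / ROLL_STEP) + ROLL_RANGE  # 0..10
    return r * SLOTS_PER_ROLL + t
```

The table grows multiplicatively with the number of roll values, which is exactly the memory-capacity concern the next paragraphs address.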
- however, the capacity of the memory 7 may be insufficient to prepare and store a graphic image for each predetermined value of the rotation error in addition to the large number of graphic images already stored for the expected course line C at each steering angle θ.
- the following method is also suitable for reducing the capacity of the memory 7.
- There is a correlation between the rotation error and the steering angle θ: for example, when there is a rotation error, the expected course line C is displayed at a position shifted by a steering angle corresponding to that rotation error. Focusing on this point, the steering angle θ, or the reference address A(θ) of the memory 7 based on the steering angle θ, is adjusted according to the rotation error.
- to this end, the storage unit 13 stores graphic images of the expected course line C over a range wider than the range of the actual steering angle θ.
- for example, the range of the steering angle θ is extended to a region whose absolute value exceeds the maximum rotation angle B by α. If this absolute value (B + α) is larger than B′, the value including the margin over the absolute value B of the maximum rotation angle of the steering angle θ described above with reference to FIG., it is preferable that graphic images up to that extended maximum rotation angle are prepared.
- the guide image providing unit 17 corrects the actual steering angle θ to, for example, θ′ based on the rotation parameter indicating the error of the optical system related to the actual rotation of the camera 100.
- the guide image providing unit 17 then acquires the expected course line C from the storage unit 13 according to the corrected value θ′. Since the storage unit 13 stores graphic images of the expected course line C over a range wider than the actual range of the steering angle θ, correcting the steering angle value based on the rotation parameter causes no problem. Alternatively, without correcting the steering angle θ itself, the address A(θ) derived from the steering angle θ may be corrected by the rotation parameter roll to obtain an address A′(θ), from which the guide display V is read out of the memory 7.
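The alternative correction above converts the rotation error into an equivalent steering-angle shift and indexes the widened table with the shifted angle θ′. The conversion factor and the clamp limit below are assumptions; the patent states only the correlation, not its numeric form.

```python
# Sketch of the steering-angle-shift correction: instead of storing
# pre-rotated graphics, the roll error is converted into an equivalent
# steering-angle offset, and theta' indexes the widened table (B + alpha).
# ROLL_TO_STEERING and the stored range are illustrative assumptions.

ROLL_TO_STEERING = 3.0  # assumed steering degrees per degree of roll error

def corrected_steering(theta_deg: float, roll_deg: float,
                       max_stored: float = 555.0) -> float:
    """Return theta' = theta shifted by the roll-equivalent angle, clamped
    to the widened stored range (B + alpha)."""
    theta_prime = theta_deg + ROLL_TO_STEERING * roll_deg
    return max(-max_stored, min(max_stored, theta_prime))
```

The same idea can be applied to the address instead of the angle: leave θ untouched and shift A(θ) by an offset derived from roll to obtain A′(θ).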
- the present invention is not limited to this method; the pan and tilt components may instead be corrected as part of the rotation error. In this case, only the translation error corresponds to the position error.
- the rotation parameter includes an adjustment value for adjusting an error of the three-axis rotation elements of pan, tilt, and roll.
- the rotation parameter is calculated when the camera 100 is installed in the vehicle at a production factory or the like, and is stored in the driving support device 10. Specifically, a plurality of calibration indices with defined ideal coordinate values are photographed by the camera 100, and the rotation parameter is calculated from the ideal coordinate values of a reference point, a specific point on each calibration index, and its actual coordinate values in the captured image PI. Since various methods of obtaining such rotation parameters are known, such as the method disclosed in Japanese Patent Laid-Open No. 2001-245326, a detailed description is omitted.
- the driving support device 10 corrects (regenerates) the guide display V based on the stored rotation parameter and stores the corrected guide display V in the storage unit 13. That is, guide displays V in which the rotation error of the optical system has already been adjusted are stored in advance. If the capacity of the storage unit 13 (memory 7) allows, it is preferable to store both the unadjusted initial guide display and the corrected guide display: even when the rotation parameter is changed by maintenance, a new corrected guide display can then be generated and stored from the unadjusted initial guide display and the latest rotation parameter.
- the generation of the corrected guide display V is not limited to the method of correcting the non-adjusted initial guide display based on the latest rotation parameter as described above.
- the mounting position of the camera 100 is, for example, the height from the road surface, the distance from the center position in the left-right direction (left-right direction offset position), the distance from the rear wheel shaft, and the like.
- the guide display V can be stored as follows.
- An error between the ideal coordinate of the reference point and the actual coordinate in the two-dimensional coordinate system of the captured image PI is an error of the optical system related to the position when the camera 100 is attached to the vehicle 90.
- the rotation angle of the captured image PI with respect to the ideal state of the captured image PI is an error of the optical system related to the rotation when the camera 100 is attached to the vehicle 90.
- the error related to the rotation is an error between the ideal angle of the coordinate axis in the two-dimensional coordinate system of the captured image PI and the actual angle, and can also be referred to as an error of the optical system related to the posture when the camera 100 is attached to the vehicle 90.
- the storage unit 13 stores in advance a corrected guide display V that is corrected based on a rotation parameter that indicates an error of the optical system related to rotation when the camera 100 is attached.
- the guide image providing unit 17 corrects the position of the guide display V in the display image VI based on a correction parameter for correcting the error of the optical system related to the position on the display image VI.
- the storage unit 13 may store in advance a guide display V corrected based on both the rotation parameter, indicating the error of the optical system related to rotation when the camera 100 is attached, and the correction parameter (translation parameter), indicating the error of the optical system related to position.
- in that case, the guide image providing unit 17 does not itself correct the guide display V for the optical-system errors; it acquires the expected course line C as the already-corrected guide display V from the storage unit 13 according to the actual steering angle of the vehicle 90, configures the guide image GI using the acquired guide display V, and provides it to the image output unit 19.
- the correction of the guide image GI need not be performed by the driving support device 10; it may be performed by an external device (not shown) that can be connected to a network of the vehicle 90 such as the CAN 50.
- the corrected guide display V is downloaded to the storage unit 13 (memory 7) via the network and stored in the storage unit 13 in advance.
- the corrected guide display V may be stored in a memory card or the like without downloading, and the memory card may be installed in the vehicle 90.
- alternatively, the storage unit 13 (memory 7) itself may be constituted by a memory card, which is temporarily removed from the vehicle 90, installed in the external device, and returned to the vehicle 90 after the corrected guide display V has been stored on it.
- the guide display V can be easily corrected and re-stored even when the communication speed of the in-vehicle network such as the CAN 50 is low.
- parameters necessary for correcting the guide display V, such as rotation parameters and correction parameters (translation parameters), may also be transferred to the external device via the memory card.
- the driving support device may also be configured with an independent camera and a separate image processing apparatus, built around an ECU having a CPU or DSP as its core, that receives the captured image PI from the camera and outputs the display image VI with the guide image GI superimposed.
- the display image VI is described as being generated by superimposing the guide image GI including the guide display V on the captured image PI.
- the display image VI may be generated by further superimposing a guide image GI that includes surrounding information S.
- An example of such a display image VI is shown in FIG.
- the vehicle 90 is provided with an object detection unit that detects an object around the vehicle 90.
- the clearance sonar 33 corresponds to the object detection unit.
- a millimeter wave radar or the like can be used instead of the clearance sonar 33.
- the clearance sonar 33 is described as being provided at four corners of the vehicle (right front, left front, right rear, and left rear).
- the storage unit 13 stores, as surrounding information S indicating the situation around the vehicle 90, a plurality of pieces of surrounding information S generated in advance as graphic images for each distance from the vehicle 90 to an object and each direction in which the object exists.
- the ambient information S is information indicating whether or not an object exists around the vehicle 90.
- the distance from the vehicle 90 to the object is displayed in, for example, three levels according to the distance.
- the detection directions correspond to the above four corners, and are the front right direction, the left front direction, the right rear direction, and the left rear direction of the vehicle 90.
- the ambient information S is stored in the storage unit 13 as a plurality of graphic images for each such distance and direction. Such an example is shown in FIG.
- the guide image providing unit 17 acquires the surrounding information S from the storage unit 13 based on the detection result of the clearance sonar 33, combines it with the guide image, and provides it to the image output unit 19.
- the guide image providing unit 17 acquires the ambient information S corresponding to the detection result of the clearance sonar 33 from the storage unit 13 together with the guide image GI. Since the acquisition of the guide image GI is the same as that in the above embodiment, the description thereof is omitted.
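The selection of surrounding information S per (direction, distance level) described above can be sketched as a keyed lookup. The distance thresholds and key names are assumptions for illustration; the text states only that there are three distance levels and four corner directions.

```python
# Illustrative sketch of selecting the surrounding information S:
# graphics are stored per (direction, distance level), and the sonar
# detection result picks the matching one. Thresholds and key format
# are assumptions; the patent specifies three levels, four corners.

DIRECTIONS = ("front_right", "front_left", "rear_right", "rear_left")

def distance_level(distance_m: float) -> int:
    """Map a sonar distance to one of three display levels (0 = nearest)."""
    if distance_m < 0.5:
        return 0
    if distance_m < 1.0:
        return 1
    return 2

def ambient_graphic_key(direction: str, distance_m: float) -> str:
    """Return the storage key of the graphic image to overlay."""
    assert direction in DIRECTIONS
    return f"S_{direction}_L{distance_level(distance_m)}"
```

At most 4 × 3 = 12 graphics are needed under these assumptions, so the same pre-rendered-lookup approach used for the guide lines extends naturally to the sonar display.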
- FIG. 8 shows an example in which a guide image GI including the three types of guide display V, OV1 to OV3, and a guide image GI including the surrounding information S of OV7 are superimposed on the captured image PI. In the example illustrated in FIG. 8, the surrounding information S is defined according to the distance from the vehicle 90 to the object.
- an object is detected at a relatively close distance on the right front side of the vehicle 90, and another object is detected at a position farther than that on the left front side.
- an object is detected at a relatively distant position in the left rear.
- the surrounding information S is synthesized in the upper part of the guide image GI. This allows the surrounding information S to be arranged in an unneeded area in the upper part of the captured image PI, so the area of the captured image PI showing the situation around the vehicle 90 (for example, the area to be watched) is not obscured.
- the surrounding information S may be combined with the lower part of the guide image GI.
- the surrounding information S can then be arranged in an unneeded area, such as the part of the body of the vehicle 90 that is often included in the lower part of the captured image PI, so the area of the captured image PI showing the situation around the vehicle 90 (for example, the area to be watched) is not obscured.
- the vehicle 90 is provided with a vehicle information acquisition unit that acquires information about the vehicle 90, and the storage unit 13 stores a plurality of pieces of vehicle information generated in advance as graphic images as vehicle information indicating information about the vehicle 90. Then, the guide image providing unit 17 can be configured to acquire the vehicle information from the storage unit 13 based on the detection result of the vehicle information acquisition unit, combine it with the guide image GI, and provide it to the image output unit 19.
- the ambient information S may be described specifically with a numerical value (for example, “XX meter”), or may be displayed with an indicator graduated at predetermined intervals.
- it is preferable that the OV3 guide image GI, the OV2 guide image GI, and the OV1 guide image GI are arranged on the captured image PI in this order from the bottom. Of course, the display image VI may also be configured with the order of the OV3, OV2, and OV1 guide images GI on the captured image PI rearranged.
- the OV3 guide image GI, the OV2 guide image GI, the OV1 guide image GI, and the OV7 guide image GI are arranged in this order on the captured image PI from the bottom.
- alternatively, the order of the OV3, OV2, and OV1 guide images GI on the captured image PI may be changed, with the OV7 guide image GI arranged on top, to constitute the display image VI.
- the present invention can be used for a driving support device that reduces a driver's operation burden by superimposing a guide image on an image captured by an in-vehicle camera with a simple system configuration.
Abstract
Description
An image receiving unit that receives a captured image of the surroundings of a vehicle captured by an in-vehicle camera;
an image output unit that causes a display device in the vehicle to display a display image in which a guide image, composed of guide displays using graphic images and including guide lines that assist the driver's driving operation of the vehicle, is superimposed on the captured image;
a storage unit that stores, as the guide displays, at least a plurality of expected course lines generated in advance as graphic images for each predetermined steering angle, the expected course line being the guide line that assists the driving operation by indicating, on the captured image, an expected course corresponding to the actual steering angle of the vehicle; and
a guide image providing unit that acquires at least the expected course line as the guide display from the storage unit according to the actual steering angle of the vehicle, and that configures the guide image using the acquired guide display and provides it to the image output unit; the characterizing feature resides in comprising these.
An image sensor core having an image sensor that converts a scene around the vehicle into an analog image signal by photoelectric conversion, and a signal processing unit that processes the analog image signal to generate the captured image as a digital image signal;
an image sensor device having an image sensor processor that acquires the guide display including at least the expected course line from the storage unit to generate the guide image, and that superimposes the captured image and the guide image to generate and output the display image;
an optical unit that forms an image of the scene around the vehicle on the light-receiving surface of the image sensor; and
a memory storing the graphic images of the guide display; the characterizing feature resides in being constituted by an integrated in-vehicle camera module comprising these.
2: steering wheel
3: optical unit
4: monitor device (display device)
5: image sensor device
7: memory
8: image sensor core
9: image sensor processor
10: driving support device
11: image receiving unit
13: storage unit
17: guide image providing unit
19: image output unit
81: image sensor
82: analog circuit (signal processing unit)
83: A/D converter (signal processing unit)
90: vehicle
100: camera (in-vehicle camera)
C: expected course line
E: vehicle extension line
F: fixed guide
G: guide line
GI: guide image
PI: captured image
S: surrounding information
V: guide display
VI: display image
θ: steering angle
roll: rotation parameter
Claims (12)
- An image receiving unit that receives a captured image of the surroundings of a vehicle captured by an in-vehicle camera;
an image output unit that causes a display device in the vehicle to display a display image in which a guide image, composed of guide displays using graphic images and including guide lines that assist the driver's driving operation of the vehicle, is superimposed on the captured image;
a storage unit that stores, as the guide displays, at least a plurality of expected course lines generated in advance as graphic images for each predetermined steering angle, the expected course line being the guide line that assists the driving operation by indicating, on the captured image, an expected course corresponding to the actual steering angle of the vehicle; and
a guide image providing unit that acquires at least the expected course line as the guide display from the storage unit according to the actual steering angle of the vehicle, and that configures the guide image using the acquired guide display and provides it to the image output unit; a driving support device comprising these. - The driving support device according to claim 1, wherein the guide image providing unit acquires the guide display from the storage unit in synchronization with the capture cycle of the captured image or the rewriting cycle of the display image in the display device, and provides the guide image to the image output unit.
- An error between the ideal coordinates and the actual coordinates of a reference point in the two-dimensional coordinate system of the captured image is taken as an error of the optical system related to the position at which the in-vehicle camera is attached to the vehicle, and
the guide image providing unit corrects the position of the guide display in the display image based on a correction parameter that corrects the error of the optical system related to the position on the display image; the driving support device according to claim 1 or 2. - The rotation angle of the captured image with respect to its ideal state is taken as an error of the optical system related to rotation when the in-vehicle camera is attached to the vehicle,
the storage unit further stores graphic images showing the guide display according to the steering angle of the vehicle for each error of the optical system related to rotation, and
the guide image providing unit acquires the graphic image of the expected course line from the storage unit according to the actual steering angle of the vehicle and a rotation parameter indicating the error of the optical system related to rotation when the in-vehicle camera is attached to the vehicle; the driving support device according to any one of claims 1 to 3. - The rotation angle of the captured image with respect to its ideal state is taken as an error of the optical system related to rotation when the in-vehicle camera is attached to the vehicle,
the storage unit stores graphic images of the expected course line over a range wider than the range of the actual steering angle, and
the guide image providing unit corrects the actual steering angle value based on a rotation parameter indicating the actual error of the optical system related to rotation, and acquires the expected course line from the storage unit according to the corrected value; the driving support device according to any one of claims 1 to 3. - The driving support device according to any one of claims 1 to 5, wherein the graphic images of the expected course line stored in the storage unit are generated such that the predetermined steering angle in the vicinity of the neutral position of the steering wheel of the vehicle is smaller than the predetermined steering angle in a turned state.
- With a guide display that is displayed at a predetermined position on the display screen, regardless of the actual steering angle of the vehicle on the captured image, defined as a fixed guide,
the storage unit further stores at least one fixed guide, and
the guide image providing unit acquires the expected course line and the fixed guide from the storage unit, combines them into one guide image, and provides it to the image output unit; the driving support device according to any one of claims 1 to 6. - The driving support device according to claim 7, wherein the fixed guide includes a vehicle extension line as the guide line indicating an extension line in the traveling direction of the vehicle at a predetermined steering angle, regardless of the actual steering angle of the vehicle on the captured image.
- An image sensor core having an image sensor that converts a scene around the vehicle into an analog image signal by photoelectric conversion, and a signal processing unit that processes the analog image signal to generate the captured image as a digital image signal;
an image sensor device having an image sensor processor that acquires the guide display including at least the expected course line from the storage unit to generate the guide image, and that superimposes the captured image and the guide image to generate and output the display image;
an optical unit that forms an image of the scene around the vehicle on the light-receiving surface of the image sensor; and
a memory storing the graphic images of the guide display; the driving support device according to any one of claims 1 to 8, constituted by an integrated in-vehicle camera module comprising these. - The vehicle is provided with an object detection unit that detects objects around the vehicle,
the storage unit stores, as surrounding information indicating the situation around the vehicle, a plurality of pieces of surrounding information generated in advance as graphic images for each distance from the vehicle to the object and each direction in which the object exists, and
the guide image providing unit acquires the surrounding information from the storage unit based on the detection result of the object detection unit, combines it with the guide image, and provides it to the image output unit; the driving support device according to any one of claims 1 to 9. - The driving support device according to claim 10, wherein the surrounding information is combined into an upper portion of the guide image.
- The driving support device according to claim 10, wherein the surrounding-situation information is combined into a lower portion of the guide image.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012532879A JP5561566B2 (ja) | 2010-09-06 | 2011-05-06 | 運転支援装置 |
RU2013108264/11A RU2548649C2 (ru) | 2010-09-06 | 2011-05-06 | Устройство содействия вождению |
US13/817,670 US9294733B2 (en) | 2010-09-06 | 2011-05-06 | Driving assist apparatus |
EP11823288.3A EP2614997B1 (en) | 2010-09-06 | 2011-05-06 | Driving assist apparatus |
CN201180041923.6A CN103079902B (zh) | 2010-09-06 | 2011-05-06 | 驾驶支援装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-198789 | 2010-09-06 | ||
JP2010198789 | 2010-09-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012032809A1 true WO2012032809A1 (ja) | 2012-03-15 |
Family
ID=45810416
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/060597 WO2012032809A1 (ja) | 2010-09-06 | 2011-05-06 | 運転支援装置 |
Country Status (6)
Country | Link |
---|---|
US (1) | US9294733B2 (ja) |
EP (1) | EP2614997B1 (ja) |
JP (1) | JP5561566B2 (ja) |
CN (1) | CN103079902B (ja) |
RU (1) | RU2548649C2 (ja) |
WO (1) | WO2012032809A1 (ja) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103373285A (zh) * | 2012-04-25 | 2013-10-30 | 索尼公司 | 图像生成设备和方法、摄像机及装备控制图像生成设备 |
US20140063246A1 (en) * | 2012-08-31 | 2014-03-06 | GM Global Technology Operations LLC | Vehicle back-up camera capability |
JP2014225819A (ja) * | 2013-05-17 | 2014-12-04 | 京セラ株式会社 | 較正処理装置、カメラ較正装置、およびカメラ較正方法 |
JP2015072269A (ja) * | 2013-10-01 | 2015-04-16 | アプリケーション・ソリューションズ・(エレクトロニクス・アンド・ヴィジョン)・リミテッド | 車載カメラのオンライン較正のためのシステム、車両及び方法 |
JP2016055793A (ja) * | 2014-09-10 | 2016-04-21 | アイテル株式会社 | 運転支援装置 |
JP2017001493A (ja) * | 2015-06-09 | 2017-01-05 | 富士通テン株式会社 | 画像処理装置および画像処理方法 |
KR101856064B1 (ko) * | 2016-08-25 | 2018-05-09 | 현대오트론 주식회사 | 주차 지원 장치 및 방법 |
CN110053625A (zh) * | 2018-01-19 | 2019-07-26 | 本田技研工业株式会社 | 距离计算装置和车辆控制装置 |
US11228729B2 (en) | 2018-05-15 | 2022-01-18 | Sony Semiconductor Solutions Corporation | Imaging device and imaging system having a stacked structure for pixel portion and signal processing circuit portion |
WO2022029953A1 (ja) * | 2020-08-06 | 2022-02-10 | 株式会社デンソーテン | 車載カメラ装置及び表示画像生成方法 |
JP2022039224A (ja) * | 2020-08-28 | 2022-03-10 | 株式会社デンソーテン | 車載装置、及び、舵角調整方法 |
Families Citing this family (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9514650B2 (en) * | 2013-03-13 | 2016-12-06 | Honda Motor Co., Ltd. | System and method for warning a driver of pedestrians and other obstacles when turning |
US9758099B2 (en) | 2013-03-15 | 2017-09-12 | Gentex Corporation | Display system and method thereof |
KR20140147205A (ko) * | 2013-06-18 | 2014-12-30 | 삼성전자주식회사 | 휴대 가능한 의료 진단 장치의 주행 경로 제공 방법 및 장치 |
DE102014200661A1 (de) * | 2014-01-16 | 2015-07-16 | Ford Global Technologies, Llc | Detektion eines Einparkvorgangs eines Kraftwagens |
JP5975051B2 (ja) * | 2014-02-26 | 2016-08-23 | トヨタ自動車株式会社 | 車両制御装置及び車両制御方法 |
US9981605B2 (en) * | 2014-05-16 | 2018-05-29 | GM Global Technology Operations LLC | Surround-view camera system (VPM) and vehicle dynamic |
JP6248836B2 (ja) * | 2014-07-10 | 2017-12-20 | 株式会社デンソー | 運転支援装置 |
EP3693203B1 (en) | 2014-12-10 | 2022-05-18 | Ricoh Company, Ltd. | Controller, display, method and carrier means for information provision |
JP6558733B2 (ja) * | 2015-04-21 | 2019-08-14 | パナソニックIpマネジメント株式会社 | 運転支援方法およびそれを利用した運転支援装置、運転制御装置、車両、運転支援プログラム |
JP2017021546A (ja) * | 2015-07-10 | 2017-01-26 | 田山 修一 | 車輌用画像表示システム及び方法 |
CN107179767B (zh) * | 2016-03-10 | 2021-10-08 | 松下电器(美国)知识产权公司 | 驾驶控制装置、驾驶控制方法以及非瞬时性记录介质 |
US10614721B2 (en) | 2017-06-08 | 2020-04-07 | International Business Machines Corporation | Providing parking assistance based on multiple external parking data sources |
CN110399622A (zh) * | 2018-04-24 | 2019-11-01 | Shanghai OFilm Smart Car Technology Co., Ltd. | Arrangement method and arrangement system for vehicle-mounted cameras |
DE102018217127B4 (de) * | 2018-10-08 | 2024-07-04 | Audi Ag | Method and display system for displaying sensor data of a sensor device on a display device, and motor vehicle with a display system |
JP7215231B2 (ja) | 2019-03-04 | 2023-01-31 | Toyota Motor Corp. | Information processing device, detection method, and program |
JP7331511B2 (ja) * | 2019-07-16 | 2023-08-23 | Aisin Corp. | Vehicle periphery display device |
JP7319593B2 (ja) * | 2020-02-13 | 2023-08-02 | Toyota Motor Corp. | Vehicle periphery monitoring device |
WO2022227020A1 (zh) * | 2021-04-30 | 2022-11-03 | Huawei Technologies Co., Ltd. | Image processing method and apparatus |
RU2771591C1 (ru) * | 2021-05-14 | 2022-05-06 | "Omnicomm Online" Limited Liability Company | User device for generating a graphical user interface |
US20220363285A1 (en) * | 2021-05-14 | 2022-11-17 | Boris Valerevich PANKOV | Device for generating a graphical user interface and a system for generating a graphical user interface |
RU2766546C1 (ru) * | 2021-05-14 | 2022-03-15 | Osaühing Omnicomm | Method for generating a graphical user interface and machine-readable data storage medium |
WO2023033671A1 (en) * | 2022-01-28 | 2023-03-09 | "Omnicomm Online" Limited Liability Company | Method for generating a modified energy-efficient driving route for the vehicle in operation |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4238663B2 (ja) | 2003-08-29 | 2009-03-18 | Toyota Motor Corp. | Calibration method and calibration device for in-vehicle camera |
JP4803450B2 (ja) * | 2006-11-20 | 2011-10-26 | Aisin Seiki Co., Ltd. | Calibration device for in-vehicle camera and vehicle production method using the device |
JP2009038309A (ja) * | 2007-08-03 | 2009-02-19 | Sharp Corp. | Solid-state imaging element, manufacturing method thereof, and electronic information device |
JP5380941B2 (ja) * | 2007-10-01 | 2014-01-08 | Nissan Motor Co., Ltd. | Parking assistance device and method |
JP5240517B2 (ja) * | 2008-09-30 | 2013-07-17 | Aisin Seiki Co., Ltd. | Calibration device for in-vehicle camera |
- 2011-05-06: US application US 13/817,670 filed, granted as US9294733B2 (not active, Expired - Fee Related)
- 2011-05-06: JP application 2012-532879 filed, granted as JP5561566B2 (not active, Expired - Fee Related)
- 2011-05-06: CN application 201180041923.6 filed, granted as CN103079902B (not active, Expired - Fee Related)
- 2011-05-06: RU application 2013108264/11 filed, granted as RU2548649C2 (active)
- 2011-05-06: WO application PCT/JP2011/060597 filed as WO2012032809A1 (active Application Filing)
- 2011-05-06: EP application 11823288.3 filed, granted as EP2614997B1 (not active, Not-in-force)
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11334470A (ja) | 1998-05-22 | 1999-12-07 | Aisin Seiki Co Ltd | Parking assistance device |
JP2001245326A (ja) | 1999-12-24 | 2001-09-07 | Aisin Seiki Co Ltd | Calibration device and calibration method for in-vehicle camera, and calibration index |
JP2004203365A (ja) * | 2002-11-01 | 2004-07-22 | Yazaki Corp. | Parking assistance device |
JP2005056320A (ja) * | 2003-08-07 | 2005-03-03 | Matsushita Electric Ind Co Ltd | Driving support device and driving support method |
JP2006216066A (ja) * | 2006-02-13 | 2006-08-17 | Matsushita Electric Works Ltd | Obstacle detection device for vehicle |
WO2009144893A1 (ja) * | 2008-05-30 | 2009-12-03 | Sanyo Electric Co., Ltd. | Driving support device |
Non-Patent Citations (1)
Title |
---|
See also references of EP2614997A4 |
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103373285B (zh) * | 2012-04-25 | 2017-04-26 | Sony Corp. | Image generation device and method, camera, and equipment-control image generation device |
JP2013228832A (ja) * | 2012-04-25 | 2013-11-07 | Sony Corp. | Cruise-assist image generation device, cruise-assist image generation method, in-vehicle camera and equipment-control assist image generation device |
US20130325205A1 (en) * | 2012-04-25 | 2013-12-05 | Sony Corporation | Cruise-assist image generation device, cruise-assist image generation method, in-vehicle camera and equipment-control assist image generation device |
CN103373285A (zh) * | 2012-04-25 | 2013-10-30 | Sony Corp. | Image generation device and method, camera, and equipment-control image generation device |
US11628882B2 (en) | 2012-04-25 | 2023-04-18 | Sony Group Corporation | Cruise-assist image generation device, cruise-assist image generation method, in-vehicle camera and equipment-control assist image generation device |
US10696328B2 (en) | 2012-04-25 | 2020-06-30 | Sony Corporation | Cruise-assist image generation device, cruise-assist image generation method, in-vehicle camera and equipment-control assist image generation device |
US9533709B2 (en) * | 2012-04-25 | 2017-01-03 | Sony Corporation | Cruise-assist image generation device, cruise-assist image generation method, in-vehicle camera and equipment-control assist image generation device |
US20140063246A1 (en) * | 2012-08-31 | 2014-03-06 | GM Global Technology Operations LLC | Vehicle back-up camera capability |
CN103661104A (zh) * | 2012-08-31 | 2014-03-26 | GM Global Technology Operations LLC | Vehicle back-up camera capability |
JP2014225819A (ja) * | 2013-05-17 | 2014-12-04 | Kyocera Corp. | Calibration processing device, camera calibration device, and camera calibration method |
JP2015072269A (ja) * | 2013-10-01 | 2015-04-16 | Application Solutions (Electronics and Vision) Ltd. | System, vehicle, and method for online calibration of an in-vehicle camera |
JP2016055793A (ja) * | 2014-09-10 | 2016-04-21 | アイテル株式会社 | Driving support device |
JP2017001493A (ja) * | 2015-06-09 | 2017-01-05 | Fujitsu Ten Ltd. | Image processing device and image processing method |
KR101856064B1 (ko) * | 2016-08-25 | 2018-05-09 | Hyundai Autron Co., Ltd. | Parking assistance apparatus and method |
CN110053625A (zh) * | 2018-01-19 | 2019-07-26 | Honda Motor Co., Ltd. | Distance calculation device and vehicle control device |
JP2019128153A (ja) * | 2018-01-19 | 2019-08-01 | Honda Motor Co., Ltd. | Distance calculation device and vehicle control device |
CN110053625B (zh) * | 2018-01-19 | 2022-03-11 | Honda Motor Co., Ltd. | Distance calculation device and vehicle control device |
US11228729B2 (en) | 2018-05-15 | 2022-01-18 | Sony Semiconductor Solutions Corporation | Imaging device and imaging system having a stacked structure for pixel portion and signal processing circuit portion |
US11722804B2 (en) | 2018-05-15 | 2023-08-08 | Sony Semiconductor Solutions Corporation | Imaging device and imaging system having a stacked structure for pixel portion and signal processing circuit portion |
WO2022029953A1 (ja) * | 2020-08-06 | 2022-02-10 | Denso Ten Ltd. | In-vehicle camera device and display image generation method |
JP2022039224A (ja) * | 2020-08-28 | 2022-03-10 | Denso Ten Ltd. | In-vehicle device and steering angle adjustment method |
JP7325389B2 (ja) | 2020-08-28 | 2023-08-14 | Denso Ten Ltd. | In-vehicle device and steering angle adjustment method |
Also Published As
Publication number | Publication date |
---|---|
JPWO2012032809A1 (ja) | 2014-01-20 |
RU2548649C2 (ru) | 2015-04-20 |
RU2013108264A (ru) | 2014-10-20 |
CN103079902B (zh) | 2016-02-24 |
EP2614997B1 (en) | 2018-02-28 |
EP2614997A4 (en) | 2016-11-23 |
EP2614997A1 (en) | 2013-07-17 |
US9294733B2 (en) | 2016-03-22 |
US20130147945A1 (en) | 2013-06-13 |
CN103079902A (zh) | 2013-05-01 |
JP5561566B2 (ja) | 2014-07-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5561566B2 (ja) | Driving support device | |
US10676027B2 (en) | Vehicle control apparatus and program | |
US9895974B2 (en) | Vehicle control apparatus | |
US9973734B2 (en) | Vehicle circumference monitoring apparatus | |
JP6115104B2 (ja) | Vehicle control device and control method | |
US20120007985A1 (en) | Calibration device, method, and program for on-board camera | |
US10467789B2 (en) | Image processing device for vehicle | |
US9994157B2 (en) | Periphery monitoring apparatus and periphery monitoring system | |
CN110945558B (zh) | 显示控制装置 | |
JP2001218197A (ja) | Vehicle periphery display device | |
JP6958163B2 (ja) | Display control device | |
JP2019054420A (ja) | Image processing device | |
US11669230B2 (en) | Display control device | |
JP2014225819A (ja) | Calibration processing device, camera calibration device, and camera calibration method | |
JP6816436B2 (ja) | Periphery monitoring device | |
JP3947117B2 (ja) | Vehicle periphery image processing system |
Legal Events
Code | Title | Description |
---|---|---|
WWE | Wipo information: entry into national phase | Ref document number: 201180041923.6; Country of ref document: CN |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 11823288; Country of ref document: EP; Kind code of ref document: A1 |
DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | |
WWE | Wipo information: entry into national phase | Ref document number: 2011823288; Country of ref document: EP |
WWE | Wipo information: entry into national phase | Ref document number: 13817670; Country of ref document: US |
ENP | Entry into the national phase | Ref document number: 2012532879; Country of ref document: JP; Kind code of ref document: A |
NENP | Non-entry into the national phase | Ref country code: DE |
ENP | Entry into the national phase | Ref document number: 2013108264; Country of ref document: RU; Kind code of ref document: A |