EP2963922A1 - Program and device for controlling vehicle
- Publication number
- EP2963922A1 (application EP14757301.8A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- vehicle
- image data
- captured image
- bird
- control
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/60—Rotation of whole images or parts thereof
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
- B60K35/22—Display screens
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/80—Arrangements for controlling instruments
- B60K35/81—Arrangements for controlling instruments for controlling displays
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/27—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/60—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
- B60R2300/602—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective with an adjustable viewpoint
- B60R2300/605—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective with an adjustable viewpoint the adjustment being automatic
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/60—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
- B60R2300/607—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective from a bird's eye viewpoint
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/802—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring and displaying vehicle exterior blind spot views
Definitions
- An embodiment of the present invention relates to a vehicle control apparatus and program.
- a technique is known for assisting the parking of a vehicle by providing the driver with image data of the surrounding environment of the vehicle captured by plural cameras installed at the vehicle.
- a technique is proposed for correcting the captured image data depending on the operation of the vehicle in a case where the image data is provided to the driver.
- a technique is also proposed for generating bird's eye view image data indicating the ground around the vehicle in an overhead view.
- however, issues are raised: the image data is not corrected on a real-time basis because, for example, the image data is corrected only when an instruction is made by the driver; and the image data is not displayed appropriately because, for example, when the inclination of the vehicle is detected by a vehicle height sensor, the inclination is not accurately obtained in a case where the wheels are not in contact with the ground.
- accordingly, an issue is raised in that it is difficult to accurately grasp the state around the vehicle from a display based on such image data.
- a vehicle control apparatus includes an acquisition portion acquiring captured image data output from an imaging portion that is provided at a vehicle and that images a surrounding of the vehicle, and vehicle state data output from a vehicle state detection portion provided at the vehicle, and a control portion performing a rotation control on the captured image data based on an inclination in a left-right direction of the vehicle relative to a horizontal direction which serves as a direction included in a horizontal plane orthogonal to a direction of gravity, the inclination in the left-right direction of the vehicle being calculated from the vehicle state data, the control portion performing the rotation control in a manner that a horizontal line included in a subject captured in the captured image data is substantially parallel to a lateral-direction side at a display region serving as an output destination. Accordingly, as an example, an effect that it is easy to grasp a state around the vehicle on a basis of the captured image data is obtained.
- the vehicle state detection portion acquires acceleration data serving as the vehicle state data and output from an acceleration detection portion provided at the vehicle, and the control portion performs the rotation control on the captured image data depending on a roll angle indicating an inclination around a front-rear axis of the vehicle obtained from the acceleration data with an origin at a position coordinate within the display region of the captured image data, the position coordinate corresponding to a center of a lens used for imaging by the imaging portion. Accordingly, as an example, an effect that it is easy to grasp the state around the vehicle on a basis of the captured image data is obtained. In addition, an effect that a height difference is easily visually recognizable is obtained.
- the control portion further performs an enlargement processing or a reduction processing on the captured image data. Accordingly, as an example, the captured image data is enlarged or reduced depending on the output destination, which obtains an effect where visibility improves.
- the control portion further moves the position coordinate corresponding to the center of the lens from the center of the display region relative to the captured image data. Accordingly, as an example, the movement control of the captured image data is performed depending on the output destination, which obtains an effect where visibility improves.
- the control portion further moves the position coordinate corresponding to the center of the lens from the center of the display region to an upper direction within the display region. Accordingly, as an example, because a region lower than the horizontal line included in the captured image data is displayed, an effect that the state around the vehicle may be easily grasped is obtained.
- the captured image data is displayed at a display device, the display device displaying information that represents at least one of a roll angle indicating an inclination around a front-rear axis of the vehicle and a pitch angle indicating an inclination around a left-right axis of the vehicle together with the captured image data. Accordingly, as an example, an effect that it is easy to grasp both the vehicle state and the state around the vehicle is obtained.
- the acquisition portion further acquires information indicating whether or not the vehicle is switched to a mode for off-road, and the control portion performs the rotation control on the captured image data depending on the vehicle state data in a case where the vehicle is switched to the mode for off-road. Accordingly, as an example, an effect that the state around the vehicle is visually recognizable in the mode for off-road is obtained.
- the aforementioned vehicle control apparatus further includes a generation portion generating bird's eye view image data indicating a ground in a surrounding of the vehicle in an overhead view based on the captured image data on which the rotation control is performed by the control portion. Accordingly, as an example, the surroundings of the vehicle are recognizable in the overhead view by referring to the bird's eye view image data on which a change of point of view is performed after the rotation control for leveling is conducted, which obtains an effect that the state around the vehicle is visually recognizable.
- a program according to the embodiments of the present invention is configured to cause a computer to execute an acquisition step acquiring captured image data output from an imaging portion that is provided at a vehicle and that images a surrounding of the vehicle, and vehicle state data output from a vehicle state detection portion provided at the vehicle and a control step performing a rotation control on the captured image data based on an inclination in a left-right direction of the vehicle relative to a horizontal direction which serves as a direction included in a horizontal plane orthogonal to a direction of gravity, the inclination in the left-right direction of the vehicle being calculated from the vehicle state data, the control step performing the rotation control in a manner that a horizontal line included in a subject captured in the captured image data is substantially parallel to a lateral-direction side at a display region serving as an output destination. Accordingly, as an example, an effect that it is easy to grasp the state around the vehicle on a basis of the captured image data is obtained.
- the aforementioned program is further configured to cause the computer to execute a generation step generating bird's eye view image data indicating a ground in a surrounding of the vehicle in the overhead view based on the captured image data on which the rotation control is performed by the control portion.
- the bird's eye view image data on which a change of point of view is performed after the rotation control for leveling is conducted is generated.
- a vehicle 1 may be a car (internal combustion car) including an internal combustion engine (an engine not illustrated) as a driving source, a car (an electric car, a fuel cell car, or the like) including an electric motor (a motor not illustrated) as the driving source, or a car (a hybrid car) including the engine and the motor as the driving sources, for example.
- the vehicle 1 may include various kinds of transmissions and various kinds of apparatuses (systems, parts and the like) necessary for driving the internal combustion engine or the electric motor. Further, the method, quantity, layout and the like of the apparatuses related to the driving of the wheels 3 of the vehicle 1 may be variously specified.
- a vehicle body 2 forms a vehicle interior 2a where a passenger (not illustrated) gets in.
- a steering portion 4, an acceleration operating portion 5, a braking operating portion 6, a speed change operating portion 7 and the like are provided within the vehicle interior 2a in a state facing a seat 2b of a driver as the passenger.
- the steering portion 4 is a steering wheel projecting from a dashboard (instrument panel) and the acceleration operating portion 5 is an accelerator pedal positioned at the feet of the driver.
- the braking operating portion 6 is a brake pedal positioned at the feet of the driver and the speed change operating portion 7 is a shift lever projecting from a center console.
- the steering portion 4, the acceleration operating portion 5, the braking operating portion 6 and the speed change operating portion 7 are not limited to the aforementioned members.
- a display device 8 (display output portion) and an audio output device 9 (audio output portion) are provided within the vehicle interior 2a.
- the display device 8 is, for example, an LCD (liquid crystal display), an OELD (organic electroluminescent display) and the like.
- the audio output device 9 is, as an example, a speaker.
- the display device 8 is covered by a clear operation input portion 10 (for example, a touch panel and the like), for example. The passenger and the like may visually confirm a projected image (image) on a display screen of the display device 8 via the operation input portion 10.
- the passenger and the like may perform an operation input (instruction input) by operating the operation input portion 10, i.e., touching, pressing or moving the operation input portion 10 with one's finger at a position corresponding to the projected image (image) displayed on the display screen of the display device 8.
- the display device 8, the audio output device 9, the operation input portion 10 and the like are provided at a monitor device 11 positioned at a center portion of the dashboard in a vehicle width direction (left-right direction).
- the monitor device 11 may include an operation input portion (not illustrated) such as a switch, a dial, a joy-stick and a pressing button, for example.
- an audio output device may be provided at another position within the vehicle interior 2a, i.e., a position different from that of the monitor device 11.
- sound may be output from an audio output device other than the audio output device 9 of the monitor device 11.
- the monitor device 11 is shared by a navigation system and an audio system.
- a monitor device of a surroundings monitoring apparatus may be provided separately from the aforementioned systems. In addition to the audio output device 9, a warning sound and the like may be output from an audio output portion such as a buzzer 24 (refer to Fig. 3), for example.
- the vehicle 1 is a four-wheel vehicle (four-wheel car) as an example.
- the vehicle 1 includes two right and left front wheels 3F and two right and left rear wheels 3R. Further, in the present embodiment, these four wheels 3 are configured to be steered (capable of being steered).
- the vehicle 1 includes a front wheel steering system 12 steering the front wheels 3F and a rear wheel steering system 13 steering the rear wheels 3R.
- the front wheel steering system 12 and the rear wheel steering system 13 are electrically controlled by a surroundings monitoring ECU 14 (electronic control unit) and the like to operate respective actuators 12a and 13a.
- Each of the front wheel steering system 12 and the rear wheel steering system 13 is, for example, an electric power steering system, an SBW (steer by wire) system, and the like.
- the front wheel steering system 12 and the rear wheel steering system 13 assist a steering force by adding torque (assist torque) to the steering portion 4 by the actuators 12a and 13a, and steer the corresponding wheels 3 (the front wheels 3F or the rear wheels 3R), for example.
- Each of the actuators 12a and 13a may steer one of or more than one of the wheels 3.
- the two front wheels 3F are steered substantially parallel to each other at the same phases (same phases, same steering directions, same rotation directions) and the two rear wheels 3R are steered substantially parallel to each other at the same phases.
- the driving wheels may be variously specified.
- imaging portions 16 (16a-16d) are provided at the vehicle 1 (vehicle body 2) as illustrated in Fig. 2 .
- Each of the imaging portions 16 is, for example, a digital camera incorporating an imaging element such as a CCD (charge coupled device), a CIS (CMOS image sensor) and the like.
- the imaging portions 16 may output image data (moving image data, frame data) at a predetermined frame rate.
- Each of the imaging portions 16 includes a wide-angle lens to thereby take a picture in a range from 140° to 220° in a horizontal direction (view angle).
- An optical axis of the imaging portion 16 is specified to face downward (obliquely downward).
- the imaging portion 16 takes a picture of outside environment around the vehicle body 2 including a road surface on which the vehicle 1 is movable.
- the horizontal direction is a direction included in a horizontal plane orthogonal to a direction of gravity (vertical direction).
- the imaging portion 16a is positioned at an end portion 2c (an end portion in a plan view) at a front side (a front side in a vehicle front-rear direction) of the vehicle body 2 and is provided at a front bumper, for example.
- the imaging portion 16b is positioned at an end portion 2d at a left side (a left side in a vehicle width direction) of the vehicle body 2 and is provided at a door mirror 2g (projecting portion) at a left side.
- the imaging portion 16c is positioned at an end portion 2e at a rear side (a rear side in the vehicle front-rear direction) of the vehicle body 2 and is provided at a wall portion at a lower side of a door 2h of a rear trunk.
- the imaging portion 16d is positioned at an end portion 2f at a right side (a right side in the vehicle width direction) of the vehicle body 2 and is provided at a door mirror 2g (projecting portion) at a right side.
- the method of mounting the camera at the vehicle is not limited, and the camera may be mounted so that image data in the front direction, the right and left side directions, and the rear direction relative to the vehicle is obtainable.
- the surroundings monitoring ECU 14 performs a calculation processing and an image processing based on the image data obtained by the plural imaging portions 16.
- the surroundings monitoring ECU 14 is able to generate a wider view angle image and a virtual bird's eye view image (planar image) where the vehicle 1 (vehicle body 2) is viewed from an upper side.
- a brake system 18, a steering angle sensor 19 (angular sensor), an accelerator sensor 20, a shift sensor 21, a wheel speed sensor 22, an acceleration sensor 26, and the like are electrically connected via an in-vehicle network 23 (electric telecommunication line), in addition to the surroundings monitoring ECU 14, the monitor device 11, the front wheel steering system 12, the rear wheel steering system 13, and the like.
- the in-vehicle network 23 is configured as a CAN (controller area network) as an example.
- the surroundings monitoring ECU 14 may send a control signal via the in-vehicle network 23 to control the front wheel steering system 12, the rear wheel steering system 13, the brake system 18, and the like.
- the surroundings monitoring ECU 14 may also receive detection results of a torque sensor 12b, a tire angle sensor 13b (for the rear wheels 3R), an actuator 18a, a brake sensor 18b, the steering angle sensor 19 (for the front wheels 3F), the accelerator sensor 20, the shift sensor 21, the wheel speed sensor 22, the acceleration sensor 26, and the like and indicator signals (control signals, operation signals, input signals, data) of the operation input portion 10 and the like via the in-vehicle network 23.
- the two acceleration sensors 26 are provided at the vehicle 1.
- the vehicle 1 is equipped with an ESC (electronic stability control).
- the acceleration sensors 26 (26a, 26b) conventionally mounted to a vehicle equipped with the ESC (electronic stability control) are employed.
- no restriction is made on the acceleration sensor; any sensor that is able to detect the acceleration in the left-right direction of the vehicle 1 is acceptable.
- Fig. 4 is a diagram illustrating an example of detection directions of the acceleration sensors 26a, 26b.
- a detection direction 401 is the detection direction of the acceleration sensor 26a while a detection direction 402 is the detection direction of the acceleration sensor 26b.
- the detection direction 401 illustrated in Fig. 4 corresponds to a direction inclined by 45 degrees from a travelling direction (front-rear direction) of the vehicle 1 on a plane in parallel with a ground (a plane on which the vehicle 1 is movable).
- the detection direction 402 forms an angle of 90 degrees relative to the detection direction 401 on the plane parallel to the ground. Because the two different detection directions are provided on the plane parallel to the ground, the acceleration in the front-rear direction and the acceleration in the left-right direction may be obtained. In the present embodiment, no restriction is made on the detection directions as long as at least the acceleration in the left-right direction may be obtained. The calculations of the acceleration in the front-rear direction and the acceleration in the left-right direction are made at the surroundings monitoring ECU 14.
- the front-rear direction of the vehicle 1 indicates the travelling direction and an opposite direction from the travelling direction of the vehicle 1.
- the left-right direction of the vehicle 1 is a direction included in a plane orthogonal to the travelling direction of the vehicle 1.
- the surroundings monitoring ECU 14 includes, as an example, a CPU 14a (central processing unit), a ROM 14b (read only memory), a RAM 14c (random access memory), a display control portion 14d, an audio control portion 14e, an SSD 14f (solid state drive, flash memory), and the like.
- the CPU 14a performs the image processing related to the image displayed at the display device 8 and the various calculation processing such as calculation of a moving path of the vehicle 1 and determination of whether or not interference with an object occurs, for example.
- the CPU 14a reads out a program stored (installed) in a nonvolatile memory device such as the ROM 14b, for example, and performs the calculation processing based on the aforementioned program.
- the RAM 14c tentatively stores various data used for the calculations at the CPU 14a.
- the display control portion 14d mainly performs the image processing using the image data obtained at the imaging portions 16 and the image processing (composition and the like, as an example) of the image data displayed at the display device 8, for example, within the calculation processing at the surroundings monitoring ECU 14.
- the audio control portion 14e mainly performs processing of audio data output at the audio output device 9 within the calculation processing at the surroundings monitoring ECU 14.
- the SSD 14f is a rewritable nonvolatile memory portion that is able to store data even in a case where a power source of the surroundings monitoring ECU 14 is turned off.
- the CPU 14a, the ROM 14b, the RAM 14c and the like may be integrated within the same package.
- the surroundings monitoring ECU 14 may be configured to include a logic operation processor other than the CPU 14a, such as a DSP (digital signal processor), or a logic circuit, for example.
- an HDD (hard disk drive) may be provided in place of the SSD 14f, and the SSD 14f or the HDD may be provided separately from the surroundings monitoring ECU 14.
- Fig. 5 is a block diagram illustrating a construction of a surroundings monitoring portion 500 realized within the surroundings monitoring ECU 14 according to the present embodiment.
- each construction within the surroundings monitoring portion 500 illustrated in Fig. 5 is realized when the CPU 14a configured as the surroundings monitoring ECU 14 in Fig. 4 executes software stored within the ROM 14b.
- the surroundings monitoring portion 500 realizes an acquisition portion 501, an angle calculation portion 502, a filtering control portion 503, an image processing portion 504 and an output portion 505 by executing the software (program) stored within the ROM 14b (computer readable storage medium).
- the surroundings monitoring portion 500 assists the driver by displaying image data from which the state around the vehicle 1 is recognizable, on the basis of the captured image data input from the imaging portions 16 in a case where the vehicle 1 moves to be parked, and the acceleration data, serving as an example of the vehicle state data, acquired by the acceleration sensor 26 (acceleration detection portion) functioning as an example of the vehicle state detection portion.
- the acquisition portion 501 acquires various pieces of information from various sensors, for example, provided at the vehicle 1.
- the acquisition portion 501 acquires the captured image data output from the imaging portions 16a to 16d provided at the vehicle 1 to capture the images in the surroundings of the vehicle 1 and the acceleration data output from the acceleration sensors 26a, 26b provided at the vehicle 1. Further, the acquisition portion 501 acquires information indicating whether or not a mode specified by a switch provided at the operation input portion 10 is an off-road mode.
- the acquisition portion 501 outputs the acquired information to the angle calculation portion 502 and the image processing portion 504.
- the acquisition portion 501 also correlates the captured image data with the acceleration data such that the time at which the image in the captured image data was captured and the time at which the acceleration in the acceleration data was detected substantially match each other.
- the angle calculation portion 502 calculates an inclination angle (a pitch angle and a roll angle) of the vehicle 1 based on the acceleration data acquired by the acceleration sensors 26a, 26b.
- the pitch angle is an angle indicating an inclination of the vehicle 1 around a left-right axis (axis 412 in Fig. 4 ) of the vehicle 1. In a case where the vehicle 1 is present on the horizontal plane (ground), the pitch angle is zero degrees.
- the roll angle is an angle indicating an inclination of the vehicle 1 around a longitudinal axis (axis 411 in Fig. 4 ) of the vehicle 1. In a case where the vehicle 1 is present on the horizontal plane (ground), the roll angle is zero degrees.
- the angle calculation portion 502 first calculates an acceleration a1 in the front-rear direction and an acceleration a2 in the left-right direction of the vehicle 1.
- the angle calculation portion 502 calculates the acceleration a1 in the front-rear direction using the following equation (1).
- the acceleration in the detection direction 401 is specified to be GL1 and the acceleration in the detection direction 402 is specified to be GL2.
- the acceleration a1 in the front-rear direction turns to 0G in a case where the pitch angle is 0° (in a case where the vehicle 1 is horizontal) and the acceleration a1 in the front-rear direction turns to 1G in a case where the pitch angle is 90° (in a case where the vehicle 1 is vertical).
- a1 = GL1 × cos 45° − GL2 × cos 45° ... (1)
- the angle calculation portion 502 calculates the acceleration a2 in the left-right direction using the following equation (2).
- a2 = −GL1 × sin 45° + GL2 × sin 45° ... (2)
- the angle calculation portion 502 calculates a pitch angle PA using the following equation (3).
- PA [deg] = 90 [deg] × a1 [G] ... (3)
- the angle calculation portion 502 calculates a roll angle RA using the following equation (4).
- RA [deg] = 90 [deg] × a2 [G] ... (4)
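- as a concrete illustration of equations (1) to (4), the following is a minimal Python sketch; it assumes GL1 and GL2 are the sensor readings in G along the detection directions 401 and 402, and the function name and structure are illustrative rather than taken from the patent:

```python
import math

def pitch_and_roll(gl1: float, gl2: float) -> tuple[float, float]:
    """Pitch angle PA and roll angle RA (degrees) from the two acceleration
    sensors whose detection directions are rotated 45 degrees from the
    vehicle's front-rear axis (equations (1)-(4))."""
    # Equations (1) and (2): project the two readings onto the vehicle's
    # front-rear (a1) and left-right (a2) axes; values remain in G.
    a1 = gl1 * math.cos(math.radians(45)) - gl2 * math.cos(math.radians(45))
    a2 = -gl1 * math.sin(math.radians(45)) + gl2 * math.sin(math.radians(45))
    # Equations (3) and (4): 1 G along an axis corresponds to a 90-degree
    # inclination, so the angles scale linearly with the acceleration.
    pa = 90.0 * a1
    ra = 90.0 * a2
    return pa, ra
```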
- the angle calculation portion 502 correlates the roll angle and the pitch angle calculated from the acceleration data with the captured image data that is correlated to the aforementioned acceleration data. Accordingly, the roll angle and the pitch angle of the vehicle 1 when the captured image data is captured are recognizable.
- the filtering control portion 503 applies low-pass filtering to the roll angle RA and the pitch angle PA calculated by the angle calculation portion 502.
- steep changes of the roll angle RA and the pitch angle PA, in other words, steep switching of the image data displayed at the display device 8, are restrained by the low-pass filtering. Accordingly, the driver may comfortably watch the image data displayed at the display device 8.
- a digital filter is used by the filtering control portion 503 provided within the surroundings monitoring portion 500; alternatively, analog filtering may be applied to the signal output from the acceleration sensor 26.
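- as an illustration of this low-pass filtering, a minimal first-order digital filter sketch follows; the smoothing coefficient is an assumed value, since the patent does not specify the filter design:

```python
class AngleLowPassFilter:
    """First-order IIR low-pass filter applied to a stream of angle samples,
    restraining steep changes such as those caused by a bump in the road."""

    def __init__(self, alpha: float = 0.1):
        # alpha is an assumed smoothing coefficient (0 < alpha <= 1);
        # smaller values smooth more strongly but respond more slowly.
        self.alpha = alpha
        self.state = None

    def update(self, sample: float) -> float:
        if self.state is None:
            self.state = sample  # initialize with the first sample
        else:
            self.state += self.alpha * (sample - self.state)
        return self.state
```

- one filter instance would be kept for each of the roll angle and the pitch angle so that the two are smoothed independently.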
- the image processing portion 504 includes a rotation control portion 521, a reduction/enlargement control portion 522, a movement control portion 523 and a composition portion 524 to generate the image data displayed at the display device 8.
- the rotation control portion 521 performs a rotation correction on the captured image data capturing the surroundings of a front side of the vehicle 1.
- a subject of the rotation correction is not limited to the captured image data captured by the imaging portion 16a and may be the captured image data captured by the imaging portion 16c capturing the surroundings of a rear side of the vehicle 1, for example.
- Fig. 6 is an example of the captured image data captured by the imaging portion 16a.
- the captured image data illustrated in Fig. 6 is captured from the vehicle 1 that is inclined.
- the driver tends to view the image displayed at the display device 8 objectively, and thus tends to perceive areas that appear at the same height in the vertical-axis direction of the displayed captured image data as being at the same height in reality, or as having a smaller height difference than the actual one.
- a region 601 and a region 602 are possibly recognized as the same heights.
- the rotation control portion 521 performs the rotation correction on the captured image data depending on the roll angle obtained by the acceleration sensors 26.
- the rotation control portion 521 performs the rotation correction (control) on the captured image data based on the inclination of the vehicle in the left-right direction relative to the horizontal direction serving as the direction included in the horizontal plane orthogonal to the direction of gravity calculated from the vehicle state data.
- the rotation control portion 521 performs the rotation correction (control) so that a horizontal line included in a subject captured in the captured image data is substantially parallel to a lateral-direction side at a display region of an output destination.
- the rotation control portion 521 performs the rotation correction, depending on the roll angle correlated with the captured image data, with the origin at the position coordinate within the display region that corresponds to the center of the lens used by the imaging portion 16 for image capturing.
- Fig. 7 is a diagram illustrating an example of a two-dimensional orthogonal coordinate system that indicates the display region of the captured image data in a case where the position coordinate corresponding to the center of the lens serves as the origin.
- the rotation control portion 521 converts the position coordinate by equation (5) indicated below so as to achieve the rotation correction of the captured image data.
- dx1 = dx0 × cos θ − dy0 × sin θ, dy1 = dx0 × sin θ + dy0 × cos θ ... (5)
- (dx0, dy0) is the coordinate value with the origin at the center of the lens.
- θ is the calculated roll angle.
- Fig. 8 is a diagram illustrating an example of the captured image data obtained after the rotation correction is performed by the rotation control portion 521.
- the rotation correction is performed so that the horizontal line included in the subject (environment outside the vehicle 1) captured in the captured image data is substantially in parallel with the lateral-direction side of the display region of the display device 8.
- the rotation correction is performed so that a lower direction of the captured image data corresponds to the direction of gravity of the subject (environment outside the vehicle 1) captured in the aforementioned captured image data.
- the lower direction and the direction of gravity do not need to completely coincide with each other; it is sufficient that they coincide to the extent that the height relation within the captured image data is recognizable.
- the driver may recognize an objective height in the surrounding environments of the vehicle 1. Accordingly, an appropriate driving is achievable, which may improve safety.
- the reduction/enlargement control portion 522 functioning as the control portion performs an enlargement processing or a reduction processing relative to the captured image data after the rotation correction is performed by the rotation control portion 521.
- the reduction/enlargement control portion 522 converts the position coordinate by equation (6) indicated below to achieve an enlargement correction or a reduction correction of the captured image data.
- dx2 = dx1 × magX, dy2 = dy1 × magY ... (6)
- (dx1, dy1) is the coordinate value with the origin at the center of the lens after the rotation correction is performed.
- magX and magY are the horizontal and vertical enlargement/reduction rates. The enlargement/reduction rate is decided on the basis of a relationship between the display size of the captured image data and the number of pixels of the display region of the display device 8.
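- a corresponding sketch of the enlargement/reduction of equation (6), with magX and magY chosen as described above (names illustrative):

```python
def scale_about_lens_center(dx1: float, dy1: float,
                            mag_x: float, mag_y: float) -> tuple[float, float]:
    """Enlarge or reduce a rotation-corrected coordinate about the lens
    center: rates above 1 enlarge, rates below 1 reduce (equation (6))."""
    return dx1 * mag_x, dy1 * mag_y
```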
- the movement control portion 523 functioning as the control portion performs a control on the captured image data after the enlargement or reduction processing is performed by the reduction/enlargement control portion 522 so that the position coordinate corresponding to the center of the lens moves from the center of the display region of the display device 8.
- the movement control portion 523 performs a control to move the position coordinate corresponding to the center of the lens from the center of the display region of the display device 8 to an upper direction within the display region.
- the movement control portion 523 performs the processing to move the position coordinate corresponding to the center of the lens from the center of the display region of the display device 8 to the upper direction within the display region. Accordingly, regions above the vehicle 1, such as the sky captured in the captured image data, are not displayed, while regions below the vehicle 1 are displayed. Thus, the user may recognize the ground conditions around the vehicle 1 by referring to the captured image data displayed at the display device 8. Accordingly, an appropriate steering assist is achievable.
- the movement control portion 523 converts the position coordinate by equation (7) indicated below to achieve the movement of the position coordinate of the captured image data.
- dx3 = dx2 + cx, dy3 = dy2 + cy ... (7)
- (dx2, dy2) is the coordinate value with the origin at the center of the lens after the enlargement/reduction correction is performed.
- (cx, cy) is the destination of the position coordinate of the center of the lens after the movement.
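- and a sketch of the movement of equation (7), which places the lens center at (cx, cy) within the display region, for example above the region's center so that the area below the horizon fills the screen:

```python
def move_lens_center(dx2: float, dy2: float,
                     cx: float, cy: float) -> tuple[float, float]:
    """Translate a scaled coordinate so that the lens center, previously at
    the origin, lands at the display-region coordinate (cx, cy)
    (equation (7))."""
    return dx2 + cx, dy2 + cy
```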
- the composition portion 524 performs a cutout relative to the captured image data after the movement control is performed by the movement control portion 523 so as to conform to the display region of the display device 8 and thereafter combines display information for assisting the steering of the driver.
- Fig. 9 is a diagram illustrating an example of the image data after the composition is performed by the composition portion 524.
- conditions around the left front wheel of the vehicle 1 captured by the imaging portion 16b are displayed at a display region 901.
- conditions around the right front wheel of the vehicle 1 captured by the imaging portion 16d are displayed at a display region 902.
- information by which the pitch angle and the roll angle of the vehicle 1 are recognizable is displayed at a display region 903. That is, the inclination of an icon 921 representing the vehicle 1 indicates the roll angle, while the distance between a center line 912 passing through the icon 921 and a line 911 indicates the pitch angle. In the present embodiment, information by which the roll angle and the pitch angle are recognizable is indicated in this way; however, the display method is not limited to the above, and other display methods are acceptable.
- the roll state and the pitch state of the vehicle 1 during the off-road driving may be displayed in real time. Accordingly, the driver may easily and objectively recognize the conditions of the surroundings of the vehicle 1.
- the captured image data after cut out by the composition portion 524 is displayed at a display region 904.
- the horizontal line within the image in the captured image data is corrected to be substantially in parallel with a lateral frame of the display device 8.
- the lower direction of the image in the captured image data is corrected to match the direction of gravity. Accordingly, the driver may easily recognize the surrounding state.
- the output portion 505 outputs the image data that is composited by the composition portion 524 to the display device 8. Accordingly, together with the aforementioned captured image data after the correction processing is performed, information by which the roll angle and the pitch angle are recognizable is displayed at the display device 8.
- an estimated course line 905 of each of the front wheels 3F is included.
- the surroundings monitoring ECU 14 (CPU 14a) is able to calculate a planned course based on detection results of the steering angle sensor 19 and the tire angle sensor 13b, for example, and to include (overlap) the estimated course line 905 conforming to the planned course in the output image.
- the estimated course line 905 is an example of a display element indicating the course that is planned.
- the surroundings monitoring ECU 14 corrects the display position, size, posture (inclination) and the like of the estimated course line 905 depending on the aforementioned rotation, enlargement/reduction and movement corrections.
- the surroundings monitoring ECU 14 is able to correct the display region and the estimated course line 905 in a direction where the deviation is reduced.
- the inclination of the icon 921 relative to a lateral-direction side of the display region 903 or 904 corresponds to the roll angle of the vehicle 1.
- the surroundings monitoring ECU 14 may constitute a tiltmeter 906 (roll angle display portion) using the icon 921 by including an angle scale 906a (tilt scale) surrounding the icon 921 in the output image in a manner that an angle of the angle scale 906a remains unchanged relative to the display region 903.
- the icon 921, which rotates (rolls) and pitches, and the tiltmeter 906 are displayed on the screen depending on the roll angle and the pitch angle, so that the horizontal direction, the vertical direction and the posture (the roll angle) of the vehicle 1 may be easily understood regardless of the state of the image of the display region 904. Accordingly, the display region 904 and the display region 903 are displayed together (displayed within the same screen or displayed in parallel with each other) so that the state around the vehicle and the state of the posture of the vehicle may be further easily understood.
- the aforementioned rotation, enlargement/reduction and movement corrections are not necessarily performed on a constant basis; the present embodiment may be specified to perform the corrections in a case where the vehicle 1 is brought to the off-road mode.
- the image processing portion 504 performs the aforementioned rotation, enlargement/reduction and movement corrections at the time of the off-road mode by referring to information acquired by the acquisition portion 501 indicating whether or not the vehicle 1 is in the off-road mode.
- the off-road mode corresponds to a mode for bringing out the four-wheel driving performance of the vehicle 1 during off-road driving and for specifying the total transfer gear to be low. That is, in the present embodiment, the captured image data displayed at the display device 8 is switched in association with the operation performed for off-road driving. The switching of the image displayed at the display device 8 is not limited to the case where the vehicle 1 is switched to the off-road mode; for example, in a case where the vehicle 1 is switched to four-wheel driving in a two/four wheel drive switching, the display may be controlled such that the rotation-corrected image is displayed.
- FIG. 10 is a flowchart illustrating procedures of the aforementioned processing in the surroundings monitoring portion 500 according to the present embodiment.
- the acquisition portion 501 acquires the captured image data from the imaging portions 16 (step S1001).
- the acquisition portion 501 acquires the acceleration data from the acceleration sensors 26 (step S1002).
- the angle calculation portion 502 calculates the roll angle and the pitch angle of the vehicle 1 from the acceleration data (step S1003).
- the filtering control portion 503 performs low-pass filtering on the calculated roll angle and the calculated pitch angle (step S1004).
- the rotation control portion 521 performs the rotation control relative to the captured image data depending on the roll angle (step S1005).
- the reduction/enlargement control portion 522 and the movement control portion 523 perform the enlargement control and the movement control on the captured image data after the rotation control is performed (step S1006).
- the composition portion 524 performs the cutout conforming to the display region displayed at the display device 8 relative to the captured image data after the enlargement control and the movement control are performed (step S1007).
- the composition portion 524 combines the captured image data indicating the state around the front wheels and the display information by which the pitch angle and the roll angle are recognizable relative to the captured image data that is cut out (step S1008).
- the output portion 505 outputs the image data after the composition by the composition portion 524 to the display device 8 (step S1009).
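- taken together, one pass of the Fig. 10 flow can be sketched as follows; each stage is injected as a callable because the patent describes the portions abstractly, so all names here are illustrative:

```python
from typing import Callable

def monitoring_pass(capture: Callable, read_accel: Callable,
                    angles: Callable, smooth: Callable,
                    correct: Callable, compose: Callable,
                    show: Callable) -> None:
    """One iteration of steps S1001-S1009 of the Fig. 10 flow."""
    frame = capture()                  # S1001: acquire captured image data
    gl1, gl2 = read_accel()            # S1002: acquire acceleration data
    pitch, roll = angles(gl1, gl2)     # S1003: calculate roll and pitch angles
    pitch, roll = smooth(pitch, roll)  # S1004: low-pass filtering
    frame = correct(frame, roll)       # S1005-S1007: rotate, enlarge/move, cut out
    show(compose(frame, pitch, roll))  # S1008-S1009: composite and output
```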
- with the aforementioned construction, the surroundings monitoring portion 500 makes the difference in height in the surroundings of the vehicle 1 easy to recognize. Accordingly, the load of steering may be reduced, thereby improving safety.
- Fig. 11 is a block diagram illustrating a construction of a surroundings monitoring portion 700 realized within the surroundings monitoring ECU 14 according to the present embodiment.
- the CPU 14a configured as the surroundings monitoring ECU 14 in Fig. 4 executes software stored within the ROM 14b to thereby realize an acquisition portion 701, an angle calculation portion 702, a filtering control portion 703, an image processing portion 704 and an output portion 705 illustrated in Fig. 11 .
- the surroundings monitoring portion 700 realizes a bird's eye view image storage portion 706 on the RAM 14c.
- the acquisition portion 701 acquires various pieces of information from various sensors, for example, provided at the vehicle 1.
- the acquisition portion 701 acquires the captured image data output from the imaging portions 16a to 16d provided at the vehicle 1 to capture the images in the surroundings of the vehicle 1 and the acceleration data serving as an example of the vehicle state data output from the acceleration sensors 26a, 26b (acceleration detection portion) provided at the vehicle 1 and functioning as an example of the vehicle state detection portion.
- the acquisition portion 701 outputs the acquired information to the angle calculation portion 702 and the image processing portion 704.
- the acquisition portion 701 also correlates the captured image data with the acceleration data such that the time at which the image in the captured image data was captured and the time at which the acceleration in the acceleration data was detected substantially match each other.
- the angle calculation portion 702 calculates the inclination angle (the pitch angle and the roll angle) of the vehicle 1 based on the acceleration data acquired by the acceleration sensors 26a, 26b.
- the pitch angle is an angle indicating an inclination of the vehicle 1 around the left-right axis (axis 412 in Fig. 4 ) of the vehicle. In a case where the vehicle 1 is present on the horizontal plane (ground), the pitch angle is zero degrees.
- the roll angle is an angle indicating an inclination of the vehicle 1 around the longitudinal axis (axis 411 in Fig. 4 ) of the vehicle 1. In a case where the vehicle 1 is present on the horizontal plane (ground), the roll angle is zero degrees.
- the angle calculation portion 702 first calculates the acceleration a1 in the front-rear direction and the acceleration a2 in the left-right direction of the vehicle 1.
- the angle calculation portion 702 calculates the acceleration a1 in the front-rear direction using the following equation (1).
- the acceleration in the detection direction 401 is specified to be GL1 and the acceleration in the detection direction 402 is specified to be GL2.
- the acceleration a1 in the front-rear direction turns to 0G in a case where the pitch angle is 0° (in a case where the vehicle 1 is horizontal) and the acceleration a1 in the front-rear direction turns to 1G in a case where the pitch angle is 90° (in a case where the vehicle 1 is vertical).
- a1 = GL1 × cos 45° − GL2 × cos 45° ... (1)
- the angle calculation portion 702 calculates the acceleration a2 in the left-right direction using the following equation (2).
- a2 = −GL1 × sin 45° + GL2 × sin 45° ... (2)
- the angle calculation portion 702 calculates the pitch angle PA using the following equation (3).
- PA [deg] = 90 [deg] × a1 [G] ... (3)
- the angle calculation portion 702 calculates the roll angle RA using the following equation (4).
- RA [deg] = 90 [deg] × a2 [G] ... (4)
- the angle calculation portion 702 correlates the roll angle and the pitch angle calculated from the acceleration data with the captured image data that is correlated to the aforementioned acceleration data. Accordingly, the roll angle and the pitch angle of the vehicle 1 when the captured image data is captured are recognizable.
- the filtering control portion 703 applies low-pass filtering to the roll angle RA and the pitch angle PA calculated by the angle calculation portion 702.
- steep changes of the roll angle RA and the pitch angle PA, in other words, steep switching of the image data displayed at the display device 8, are restrained by the low-pass filtering. Accordingly, the driver may comfortably watch the image data displayed at the display device 8.
- a digital filter is used by the filtering control portion 703 provided within the surroundings monitoring portion 700; alternatively, analog filtering may be applied to the signal output from the acceleration sensor 26.
- the image processing portion 704 includes a rotation control portion 711, a bird's eye view image generation portion 712 (generation portion), a moving amount calculation portion 713, a conversion portion 714 and a composition portion 715 each of which serves as the control portion.
- the image processing portion 704 generates the image data to be displayed at the display device 8.
- the rotation control portion 711 performs the rotation correction on the captured image data capturing the surroundings of a front side of the vehicle 1 (travelling direction) based on the inclination of the vehicle in the left-right direction relative to the horizontal direction calculated from the vehicle state data (in other words, depending on the roll angle).
- the horizontal direction is a direction orthogonal to the travelling direction, for example.
- the rotation correction may be performed on the captured image data in the same direction as a rotation direction where the vehicle becomes horizontal based on the inclination in the left-right direction of the vehicle calculated from the vehicle state data.
- the aforementioned rotation correction may be performed so that the captured image data appears as if captured in a state where the left-right direction of the vehicle 1 is horizontal (a state where the vehicle 1 is arranged on the horizontal plane orthogonal to the direction of gravity).
- the acceleration is used as the vehicle state data.
- the vehicle state data is not limited to the acceleration and may be information relevant to the state of the vehicle 1.
- a subject of the rotation correction is not limited to the captured image data captured by the imaging portion 16a and may be the captured image data captured by the imaging portion 16c that captures the surroundings of a rear side of the vehicle 1.
- Fig. 12 is a diagram illustrating an example of a state where the vehicle 1 drives over a stone, for example, during the off-road driving.
- because the vehicle 1 drives over the stone, for example, the vehicle 1 is inclined by a roll angle θ.
- in a case where bird's eye view image data is generated from the captured image data captured by the imaging portion 16a in the aforementioned state, distortion depending on the roll angle θ occurs.
- the rotation control portion 711 performs the rotation correction on the captured image data depending on the roll angle ⁇ obtained from the acceleration sensor 26.
- the rotation control portion 711 performs the rotation correction, depending on the roll angle correlated with the captured image data, with the origin at the position coordinate within the display region that corresponds to the center of the lens used by the imaging portion 16a for image capturing.
- Fig. 13 is a diagram illustrating an example of a two-dimensional orthogonal coordinate system that indicates the display region of the captured image data in a case where the position coordinate corresponding to the center of the lens serves as the origin.
- the rotation control portion 711 converts the position coordinate by equation (5) indicated below so as to achieve the rotation correction of the captured image data.
- dx1 = dx0 × cos θ − dy0 × sin θ, dy1 = dx0 × sin θ + dy0 × cos θ ... (5)
- (dx0, dy0) is the coordinate value with the origin at the center of the lens.
- the angle θ illustrated in Fig. 13 is the calculated roll angle.
- the rotation control portion 711 performs the rotation correction on all pixels included in a display region 801 so as to generate a display region 802 obtained by the rotation of the display region 801 by the angle ⁇ . Then, the surroundings monitoring portion 700 generates the bird's eye view image data based on the captured image data including the display region 802 which is obtained after the rotation control is performed. Accordingly, the bird's eye view image data where the inclination caused by the roll angle ⁇ generated at the vehicle 1 is corrected may be generated.
- the rotation control portion 711 does not limit the angle for the rotation control on the captured image data to the roll angle by which the vehicle 1 is inclined from the horizontal plane.
- the rotation control portion 711 may perform the rotation control on the captured image data depending on a difference between the roll angle previously calculated and the roll angle presently calculated. This is because, when the vehicle 1 is driven on ground that is inclined by a predetermined angle (the previously calculated roll angle), the state of the ground around the vehicle 1 is more easily recognizable in a case where the bird's eye view image data is generated from a viewpoint inclined by the predetermined angle from the position directly above the vehicle 1 in the vertical direction than in a case where it is generated from the position directly above the vehicle 1 in the vertical direction.
- in a case where the vehicle 1 drives over a stone, for example, the rotation control portion 711 performs the rotation control on the captured image data only for the difference of the roll angles resulting from that inclination (the difference between the previously calculated roll angle and the presently calculated roll angle).
- the bird's eye view image generation portion 712 generates, on a basis of the captured image data which is obtained after the rotation control is performed, the bird's eye view image data obtained by looking down, from the upper side, at the ground in the travelling direction of the vehicle 1 serving as the ground around the vehicle 1.
- any method for generating the bird's eye view image data from the captured image data is acceptable.
- a mapping table may be used for conversion.
- the generation of the bird's eye view image data is performed each time the captured image data is acquired.
- the bird's eye view image generation portion 712 generates first bird's eye view image data based on first captured image data on which the rotation control is performed by the rotation control portion 711 and thereafter generates second bird's eye view image data based on second captured image data which is captured by the imaging portions 16 after the first captured image data is captured and then the vehicle 1 moves and on which the rotation control is performed by the rotation control portion 711.
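- As an illustration of one common way to realize such a conversion, the following is a minimal sketch assuming OpenCV: four points on the ground plane in the rotation-corrected image are mapped to the corresponding points of a top-down view. The coordinates below are placeholders that would in practice come from camera calibration; a precomputed mapping table gives the same result.

```python
# A minimal sketch of a perspective-warp based bird's eye view conversion,
# assuming OpenCV. The corner coordinates are placeholder calibration values.
import cv2
import numpy as np

src = np.float32([[420, 480], [860, 480], [1180, 720], [100, 720]])  # road corners in the image
dst = np.float32([[0, 0], [400, 0], [400, 600], [0, 600]])           # same corners, top-down

H = cv2.getPerspectiveTransform(src, dst)

def to_birds_eye(image: np.ndarray) -> np.ndarray:
    # Equivalent to looking down at the ground in the travelling direction
    # from the upper side; a mapping table would give the same result.
    return cv2.warpPerspective(image, H, (400, 600))
```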
- the image data displayed at the display device 8 is updated each time the vehicle 1 moves by a predetermined moving amount.
- the moving amount calculation portion 713 compares the bird's eye view image data generated by the bird's eye view image generation portion 712 and the bird's eye view image data used upon previous updating so as to calculate the moving amount of the vehicle 1.
- the moving amount calculation portion 713 compares predetermined areas within the bird's eye view image data generated by the bird's eye view image generation portion 712.
- the moving amount calculation portion 713 cuts out the predetermined area (display range) from each of the first bird's eye view image data used upon the previous updating and the second bird's eye view image data generated after the first bird's eye view image data so as to calculate an optical flow.
- Fig. 14 is a diagram illustrating a concept of optical flow calculated by the moving amount calculation portion 713.
- (A) of Fig. 14 is the image data cut out at the predetermined display range from the first bird's eye view image data used upon the previous updating while (B) of Fig. 14 is the image data cut out at the predetermined display range from the second bird's eye view image data generated presently by the bird's eye view image generation portion 712.
- the moving amount calculation portion 713 calculates the optical flows indicating, as vectors, the shifting of (feature points of) each displayed object between the image data illustrated in (A) of Fig. 14 and the image data illustrated in (B) of Fig. 14.
- (C) of Fig. 14 illustrates an example of calculated optical flows.
- a length of each vector corresponding to a movement of the feature point (indicated by "X") in (A) of Fig. 14 to the feature point (indicated by "X") in (B) of Fig. 14 is indicated.
- the moving amount calculation portion 713 calculates the moving amount of the vehicle 1 from an average value of the calculated optical flows.
- Fig. 15 is a diagram illustrating a relation between the average value of the optical flows and the moving amount of the vehicle 1.
- an arrow 901 is specified to be the average value of the optical flows.
- the vehicle 1 turns about a rear wheel axis.
- the moving amount calculation portion 713 calculates the turning angle θ1 of the vehicle 1.
- the moving amount calculation portion 713 calculates the moving amount of the vehicle 1 from the length of each of the optical flows.
- the moving amount calculation portion 713 may separately calculate the moving amount of the vehicle 1 in the front-rear direction and the moving amount of the vehicle 1 in the left-right direction.
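- As an illustration of the moving amount calculation described above, the following is a minimal sketch assuming OpenCV and grayscale crops of the predetermined display ranges of the two bird's eye view images; the helper names are assumptions.

```python
# A minimal sketch of comparing the display ranges of the first and second
# bird's eye view image data, assuming OpenCV. crop_prev / crop_curr are
# 8-bit single-channel crops.
import cv2
import numpy as np

def average_optical_flow(crop_prev: np.ndarray, crop_curr: np.ndarray):
    # Detect trackable feature points (the "X" marks of Fig. 14) in the older crop.
    pts_prev = cv2.goodFeaturesToTrack(crop_prev, maxCorners=100,
                                       qualityLevel=0.3, minDistance=7)
    if pts_prev is None:
        return None
    # Track the points into the newer crop with pyramidal Lucas-Kanade.
    pts_curr, status, _err = cv2.calcOpticalFlowPyrLK(crop_prev, crop_curr,
                                                      pts_prev, None)
    ok = status.flatten() == 1
    good_prev = pts_prev[ok].reshape(-1, 2)
    good_curr = pts_curr[ok].reshape(-1, 2)
    if len(good_prev) == 0:
        return None
    flows = good_curr - good_prev      # one flow vector per feature point
    return flows.mean(axis=0)          # average flow, cf. arrow 901 in Fig. 15
```

The average flow is in pixels; converting it to a moving amount of the vehicle requires the ground resolution of the bird's eye view (metres per pixel, an assumed calibration value), and the turning angle θ1 can likewise be estimated from how the flow vectors rotate about the rear wheel axis.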
- the conversion portion 714 converts the bird's eye view image data generated presently by the bird's eye view image generation portion 712 into the bird's eye view image data for composition with the bird's eye view image data stored at the bird's eye view image storage portion 706 in a case where the moving amount calculated by the moving amount calculation portion 713 is equal to or greater than a predetermined distance.
- although the inclination is corrected by the rotation control portion 711, distortion resulting from the inclination remains in the captured image data captured by the imaging portions 16.
- a projective transformation is performed on the bird's eye view image data.
- the projective transformation is employed so that the captured image data in which torsion of the road surface is generated by the inclination of the vehicle 1 is converted into data equivalent to a case where the vehicle 1 is not inclined.
- a so-called trapezoidal correction where a trapezoidal-shaped area within the captured image data is converted to a rectangular or square area, for example, is included.
- Fig. 16 is a diagram illustrating the distortion of the captured image data caused by the inclination of the vehicle 1 and the display range of the captured image data after the conversion is performed by the conversion portion 714.
- distortion states (a1) and (a2) of the bird's eye view image data generated in a roll state of the vehicle 1 are illustrated and distortion states (b1) and (b2) of the bird's eye view image data generated in a pitch state of the vehicle 1 are illustrated.
- if the composition is performed without such conversion, the distortion is accumulated.
- the conversion portion 714 performs the projective transformation, determined on a basis of the roll angle, on the second captured image data cut out at the display range ((a1), (a2), (b1), (b2)) in Fig. 16 specified on a basis of the roll angle obtained from the acceleration data acquired when the captured image data is captured.
- the display range ((a1), (a2), (b1), (b2)) is converted to the display range illustrated in (c) of Fig. 16 .
- a method for specifying the position coordinates of four points indicating the display region serving as a subject of the projective transformation based on the roll angle may be any method regardless of whether it is a conventional method or not.
- a correlation of the roll angle ⁇ with each of the four position coordinates may be provided beforehand.
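- As an illustration of the trapezoidal correction described above, the following is a minimal sketch assuming the four corner coordinates of the display range have been correlated with the roll angle beforehand, e.g. in a lookup table; the table contents and sizes are placeholders.

```python
# A minimal sketch of the trapezoidal (projective) correction, assuming a
# hypothetical lookup table from roll angle to the four corners of the
# distorted display range ((a1)/(a2)/(b1)/(b2) in Fig. 16).
import cv2
import numpy as np

CORNER_TABLE = {
    0:  np.float32([[0, 0], [400, 0], [400, 600], [0, 600]]),
    5:  np.float32([[12, 0], [400, 8], [388, 600], [0, 592]]),
    10: np.float32([[25, 0], [400, 16], [375, 600], [0, 584]]),
}
RECT = np.float32([[0, 0], [400, 0], [400, 600], [0, 600]])  # display range (c) of Fig. 16

def correct_distortion(birds_eye: np.ndarray, roll_deg: float) -> np.ndarray:
    # Pick the corners registered for the nearest roll angle and map the
    # trapezoid back onto the rectangular display range.
    key = min(CORNER_TABLE, key=lambda k: abs(k - roll_deg))
    m = cv2.getPerspectiveTransform(CORNER_TABLE[key], RECT)
    return cv2.warpPerspective(birds_eye, m, (400, 600))
```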
- the composition portion 715 combines the bird's eye view image data stored at the bird's eye view image storage portion 706 and the bird's eye view image data obtained after the projective transformation is performed by the conversion portion 714.
- Fig. 17 is a diagram illustrating an example of the bird's eye view image data obtained after the composition is performed by the composition portion 715.
- a display range 1101 is a range most recently composited.
- a display range 1102 is a range composited before the display range 1101.
- a display range 1103 is a range composited before the display range 1102. Accordingly, in the present embodiment, the bird's eye view image data is composited each time the vehicle 1 moves.
- the bird's eye view image storage portion 706 stores the bird's eye view image data after the composition by the composition portion 715. Accordingly, the bird's eye view image storage portion 706 composites and stores the bird's eye view image data newly generated each time the vehicle 1 moves by the predetermined moving amount so that the bird's eye view image data indicating the condition of the ground below the vehicle 1 is stored at the bird's eye view image storage portion 706.
- Fig. 18 is a diagram illustrating an example of the bird's eye view image data stored at the bird's eye view image storage portion 706.
- the bird's eye view image data generated from the captured image data presently captured is composited and stored.
- the bird's eye view image data stored at the bird's eye view image storage portion 706 includes the ground below the vehicle 1.
- the configuration of the vehicle 1 illustrated in Fig. 18 is indicated for ease of explanation and is not included in the actual bird's eye view image data stored at the bird's eye view image storage portion 706.
- the composition portion 715 performs a rotation processing with the turning angle ⁇ 1 on the bird's eye view image data stored at the bird's eye view image storage portion 706 when the vehicle 1 turns at the turning angle ⁇ 1. Then, the composition portion 715 performs the composition with the bird's eye view image data obtained after the projective transformation is performed by the conversion portion 714. Accordingly, the bird's eye view image data conforming to the turning of the vehicle 1 may be displayed.
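- As an illustration of the composition described above, the following is a minimal sketch assuming numpy and OpenCV; the stored bird's eye view image data is modeled as a canvas, the rotation center and the paste offset are assumed inputs, and bounds handling is omitted for brevity.

```python
# A minimal sketch of the composition step. `canvas` stands in for the data
# stored at the bird's eye view image storage portion 706 and `tile` for the
# newly converted bird's eye view image data; offset and turning angle are
# assumed to come from the moving amount calculation.
import cv2
import numpy as np

def composite(canvas: np.ndarray, tile: np.ndarray,
              offset_xy: tuple, turn_deg: float) -> np.ndarray:
    h, w = canvas.shape[:2]
    # Rotate the stored data by the turning angle θ1 (here about the canvas
    # center, an assumption) so that it conforms to the turning of the vehicle.
    m = cv2.getRotationMatrix2D((w / 2, h / 2), turn_deg, 1.0)
    canvas = cv2.warpAffine(canvas, m, (w, h))
    # Paste the newly generated tile at the position given by the movement.
    x, y = offset_xy
    th, tw = tile.shape[:2]
    canvas[y:y + th, x:x + tw] = tile
    return canvas
```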
- Fig. 19 is a diagram illustrating an example of screen information output by the output portion 705.
- in the example illustrated in Fig. 19, captured image data 1302 capturing the travelling direction of the vehicle 1 by the imaging portion 16a and captured image data 1304 around the front right wheel of the vehicle 1 captured by the imaging portion 16d are displayed in addition to bird's eye view image data 1301.
- the pitch angle and the roll angle of the vehicle 1 are displayed as recognizable information.
- the roll angle is indicated by an inclination of an icon representing the vehicle 1, and the pitch angle is indicated by a distance between a center line passing through the icon and a horizontal line.
- the roll angle and the pitch angle are recognizable by the aforementioned method; however, the display method is not limited to the above and other display methods are acceptable.
- the captured image data 1302 where the travelling direction of the vehicle 1 is captured by the imaging portion 16a serves as the captured image data obtained after the rotation control is performed. Accordingly, the driver may recognize a height relation within the captured image data.
- Fig. 20 is a flowchart illustrating procedures of the aforementioned processing in the surroundings monitoring portion 700 according to the present embodiment.
- the acquisition portion 701 acquires the captured image data from the imaging portions 16 (step S1401).
- the acquisition portion 701 acquires the acceleration data from the acceleration sensors 26 (step S1402).
- the angle calculation portion 702 calculates the roll angle and the pitch angle of the vehicle 1 from the acceleration data (step S1403).
- the filtering control portion 703 performs filtering by low-pass filter relative to the calculated roll angle and the calculated pitch angle.
- the rotation control portion 711 performs the rotation control relative to the captured image data depending on the roll angle (step S1404).
- the bird's eye view image generation portion 712 generates the bird's eye view image data where a predetermined area in the travelling direction of the vehicle 1 which is present around the vehicle 1 is illustrated in an overhead view (step S1405).
- the moving amount calculation portion 713 extracts the image data of a predetermined display range (area) from the generated bird's eye view image data (step S1406). In addition, the moving amount calculation portion 713 holds the image data extracted from the similar range from the bird's eye view image data in the past (for example, in a case where the previous moving amount is determined to reach or exceed a predetermined threshold value).
- the moving amount calculation portion 713 calculates the moving amount of the vehicle 1 based on the image data of the predetermined display range (area) extracted from the bird's eye view image data (step S1407).
- the image processing portion 704 determines whether or not the calculated moving amount is equal to or greater than the predetermined threshold value (step S1408).
- the threshold value is specified to be 10 cm, for example; however, the threshold value may be specified appropriately depending on an implementation mode.
- in a case where the image processing portion 704 determines that the moving amount is equal to or greater than the threshold value (Yes in step S1408), the bird's eye view image generation portion 712 generates the bird's eye view image data from the presently captured image data on which the rotation control is performed as in step S1404 (step S1409).
- the conversion portion 714 performs the projective transformation on the bird's eye view image data depending on the present roll angle and pitch angle of the vehicle 1 (step S1410).
- the torsion of the bird's eye view image data generated by either one of the roll angle and the pitch angle is corrected by the projective transformation including the trapezoidal correction.
- the vehicle 1 is inclined about the axis of the wheel 3, for example, instead of being inclined with reference to the center of gravity. Therefore, a positional displacement occurs in the left-right direction.
- the conversion portion 714 according to the embodiment performs an offset correction on the displacement in the left-right direction. In the same manner, the offset correction in the front-rear direction is performed in a case where the pitch angle is generated.
- the composition portion 715 combines the present bird's eye view image data after the projective transformation is performed with the bird's eye view image data stored at the bird's eye view image storage portion (step S1411).
- the composition portion 715 performs the rotation control on the bird's eye view image data stored at the bird's eye view image storage portion 706 so as to conform to the turning angle θ1 obtained before the composition in a case where the vehicle 1 turns at the turning angle θ1.
- the output portion 705 cuts out the bird's eye view image data displayed at the display device 8 from the bird's eye view image data stored at the bird's eye view image storage portion 706 (step S1412). Thereafter, the output portion 705 adds various pieces of information to the bird's eye view image data that is cut out and outputs the data to the display device 8 (step S1413).
- in a case where the image processing portion 704 determines that the moving amount is smaller than the predetermined threshold value (No in step S1408), the image processing portion 704 continues outputting the bird's eye view image data and the like already displayed at the display device 8 (step S1414).
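- As an illustration of the overall flow of Fig. 20, the following rough sketch chains the hypothetical helpers from the earlier sketches; the display range, the ground resolution M_PER_PIXEL and the state handling are assumptions, not the exact processing of the surroundings monitoring portion 700.

```python
# A rough sketch of one pass of the flow in Fig. 20, reusing the hypothetical
# helpers rotate_about_lens_center, to_birds_eye, average_optical_flow,
# correct_distortion and composite defined in the earlier sketches.
import numpy as np

M_PER_PIXEL = 0.01   # assumed ground resolution of the bird's eye view [m/pixel]
THRESHOLD_M = 0.10   # the 10 cm threshold of step S1408

def process_frame(frame, roll_deg, pitch_deg, lens_cx, lens_cy,
                  prev_crop, canvas, offset_xy, turn_deg):
    rotated = rotate_about_lens_center(frame, roll_deg, lens_cx, lens_cy)  # S1404
    birds_eye = to_birds_eye(rotated)                                      # S1405
    crop = birds_eye[200:400, 100:300]                                     # S1406 (assumed display range)
    flow = average_optical_flow(prev_crop, crop)                           # S1407
    moved_m = 0.0 if flow is None else float(np.linalg.norm(flow)) * M_PER_PIXEL
    if moved_m >= THRESHOLD_M:                                             # S1408: Yes
        tile = correct_distortion(birds_eye, roll_deg)                     # S1409-S1410
        canvas = composite(canvas, tile, offset_xy, turn_deg)              # S1411
        return canvas, crop    # S1412-S1413: cut out, annotate and display
    return canvas, prev_crop   # S1414: keep the image already displayed
```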
- in the second embodiment, the example where the bird's eye view image data is displayed for confirming the state of the vehicle 1 is explained.
- nevertheless, the embodiment is not limited to the display of only the bird's eye view image data, and various pieces of information for confirming the present state may be added to the bird's eye view image data.
- in the third embodiment, an example where various pieces of information are added to the bird's eye view image data is explained.
- Fig. 21 is a block diagram illustrating the construction of the surroundings monitoring portion 1700 realized within the surroundings monitoring ECU 14 according to the present embodiment.
- the surroundings monitoring portion 1700 illustrated in Fig. 21 is different from the surroundings monitoring portion 700 in the second embodiment in a point where the acquisition portion 701 is changed to an acquisition portion 1701 performing a different processing from the acquisition portion 701 and the image processing portion 704 is changed to an image processing portion 1702 performing a different processing from the image processing portion 704.
- the acquisition portion 1701 acquires the captured image data and the acceleration data, in the same way as the second embodiment, and also acquires a suspension detection result indicating a depression degree of a suspension of the front wheels 3F from a stroke sensor (not illustrated) and a detection result of the steering angle of each of the front wheels 3F and the rear wheels 3R from the steering angle sensor 19.
- the acquired suspension detection result and steering angle detection result are output to the image processing portion 1702.
- the image processing portion 1702 is different from the image processing portion 704 in the second embodiment in a point where a tire outline calculation portion 1711 and a locus calculation portion 1712 are added and the composition portion 715 in the second embodiment is changed to a composition portion 1713 performing a different processing from the processing performed by the composition portion 715.
- the tire outline calculation portion 1711 calculates an outline of a tire that should be superimposed on the bird's eye view image data based on the suspension detection result and the detection result of the steering angle acquired by the acquisition portion 1701. For example, in a case where the camera is assumed to be placed at the bird's eye viewpoint, the front wheels 3F and the rear wheels 3R are shown larger as they approach the camera when the suspension is compressed, and are shown smaller when the suspension is extended. Thus, in the present embodiment, in order to display the tire outline of the vehicle 1 on the bird's eye view image data, the tire outline calculation portion 1711 calculates the tire outline configuration (size and angle of each of the front wheels 3F) that should be superimposed on the bird's eye view image data based on the suspension detection result and the steering angle, as in the sketch below.
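```python
# A minimal sketch of a tire outline calculation, assuming simple linear
# models: the drawn size grows as the suspension stroke brings the wheel
# closer to the camera, and the outline is rotated by the steering angle.
# All constants are placeholders, not values from the patent.
import math

def tire_outline(stroke_mm: float, steer_deg: float, base_px: float = 40.0):
    # Size of the mark: larger when the suspension is compressed, smaller
    # when it is extended.
    size = base_px * (1.0 + 0.005 * stroke_mm)
    # Corners of an axis-aligned tire rectangle, rotated by the steering angle.
    s, c = math.sin(math.radians(steer_deg)), math.cos(math.radians(steer_deg))
    half_w, half_h = size * 0.25, size * 0.5
    corners = [(-half_w, -half_h), (half_w, -half_h),
               (half_w, half_h), (-half_w, half_h)]
    return [(x * c - y * s, x * s + y * c) for x, y in corners]
```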
- the locus calculation portion 1712 calculates an estimated moving locus in the travelling direction of the vehicle 1 based on the steering angle of the front wheels 3F and the steering angle of the rear wheels 3R.
- the locus calculation portion 1712 according to the present embodiment calculates the estimated moving locus that should be added to the present captured image data as the estimated moving locus of the front wheels 3F and calculates the estimated moving locus that should be added to the bird's eye view image data as the estimated moving locus of the rear wheels 3R.
- the estimated moving loci that are calculated are added to the captured image data and the bird's eye view image data and then output by the output portion 705.
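- As an illustration of one way such an estimated moving locus can be computed, the following is a minimal sketch assuming a simple bicycle model in which the path is an arc determined by the wheelbase and the steering angle; the wheelbase and length values are placeholders.

```python
# A minimal sketch of an estimated moving locus under a bicycle-model
# assumption; returns (x, y) points of the predicted path in the vehicle frame.
import math

def estimated_locus(steer_deg: float, wheelbase_m: float = 2.7,
                    length_m: float = 5.0, steps: int = 20):
    steer = math.radians(steer_deg)
    if abs(steer) < 1e-3:
        # Straight ahead when the steering angle is (almost) zero.
        return [(0.0, length_m * i / steps) for i in range(steps + 1)]
    radius = wheelbase_m / math.tan(steer)      # turning radius
    points = []
    for i in range(steps + 1):
        phi = (length_m * i / steps) / radius   # swept arc angle
        points.append((radius * (1 - math.cos(phi)), radius * math.sin(phi)))
    return points
```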
- the composition portion 1713 combines the bird's eye view image data stored at the bird's eye view image storage portion 706 with the bird's eye view image data obtained after the projective transformation is performed by the conversion portion 714 in the same processing as the second embodiment, and thereafter adds a mark by which the steering angle and the size of each of the front wheels 3F are recognizable to a position where each of the front wheels 3 is estimated to presently exist on the bird's eye view image data obtained after the composition is performed.
- the output portion 705 outputs the screen information at the display device 8 based on the bird's eye view image data composited by the composition portion 1713.
- Fig. 22 is a diagram illustrating an example of the screen information output by the output portion 705 according to the third embodiment.
- captured image data 1602 capturing the travelling direction of the vehicle 1 by the imaging portion 16a is shown, for example.
- estimated moving loci 1611, 1612 of the front wheels 3F (display elements indicating a planned course) calculated by the locus calculation portion 1712 are indicated.
- moving loci 1621, 1622 of the vehicle 1, which are generated by the marks continuously added on a basis of the outlines of the front wheels 3F, are indicated.
- the driver may recognize protrusions and recesses on the road surface. That is, at a portion where a large mark is added, the suspension is largely compressed. In other words, an obstacle such as a stone, for example, is highly likely to be present.
- the driver may drive, while confirming the aforementioned marks, so as to operate the vehicle in a manner that the rear wheels 3R and a differential (not illustrated) are restrained from colliding with such an obstacle.
- estimated moving loci 1631, 1632 of the rear wheels 3R calculated by the locus calculation portion 1712 are indicated. Because the driver can recognize the estimated moving loci 1631, 1632, the driver may restrain the rear wheels 3R from colliding with an obstacle by driving the vehicle so that the rear wheels 3R overlap the moving loci of the front wheels 3F which have not collided with the obstacle.
- the image processing portion 1702 may differentiate the colors or shapes of the marks based on information other than the suspension detection results. For example, the color or shape at positions where the front wheels 3F slip may be differentiated. As a result, safety when the driver drives the vehicle may improve.
- the surroundings monitoring portions in the aforementioned embodiments include the aforementioned constructions so that the surrounding state including the ground below the vehicle 1 is easily recognizable. Accordingly, a load of driving is reduced to thereby enhance safety.
- the second embodiment or the third embodiment is an example of a vehicle control apparatus or program according to any of the following [1] to [8].
- 500: surroundings monitoring portion, 501: acquisition portion, 502: angle calculation portion, 503: filtering control portion, 504: image processing portion, 505: output portion, 521: rotation control portion, 522: reduction/enlargement control portion, 523: movement control portion, 524: composition portion, 700: surroundings monitoring portion, 701: acquisition portion, 702: angle calculation portion, 703: filtering control portion, 704: image processing portion, 705: output portion, 706: bird's eye view image storage portion, 711: rotation control portion, 712: bird's eye view image generation portion, 713: moving amount calculation portion, 714: conversion portion, 715: composition portion
Description
- An embodiment of the present invention relates to a vehicle control apparatus and program.
- Conventionally, a technique for providing a vehicle driver with image data captured as surrounding environments of a vehicle by plural cameras which are installed at the vehicle is known as a technique for assisting a parking of the vehicle. A technique for correcting the captured image data depending on an operation of the vehicle in a case where the image data is provided to the driver is proposed. In addition, in order to easily recognize the surrounding environments, a technique for generating bird's eye view image data indicating a ground around the vehicle in an overhead view is proposed.
- Patent document 1: JP2009-212703A
- Patent document 2: JP09-52555A
- Patent document 3: JP2003-009141A
- Patent document 4: JP2003-244688A
- Nevertheless, in the conventional art, the image data is corrected only when an instruction is made by the driver, for example, so that the image data is not corrected on a real-time basis. In addition, in a case where an inclination of the vehicle is detected by a vehicle height sensor, the inclination of the vehicle is not accurately obtained when the wheels are not in contact with the ground, for example, so that an appropriate display of the image data is not sufficiently made. Further, it is difficult to accurately grasp a state around the vehicle by a display based on such image data.
- A vehicle control apparatus according to embodiments of the present invention, as an example, includes an acquisition portion acquiring captured image data output from an imaging portion that is provided at a vehicle and that images a surrounding of the vehicle, and vehicle state data output from a vehicle state detection portion provided at the vehicle, and a control portion performing a rotation control on the captured image data based on an inclination in a left-right direction of the vehicle relative to a horizontal direction which serves as a direction included in a horizontal plane orthogonal to a direction of gravity, the inclination in the left-right direction of the vehicle being calculated from the vehicle state data, the control portion performing the rotation control in a manner that a horizontal line included in a subject captured in the captured image data is substantially parallel to a lateral-direction side at a display region serving as an output destination. Accordingly, as an example, an effect that it is easy to grasp a state around the vehicle on a basis of the captured image data is obtained.
- In addition, in the aforementioned vehicle control apparatus, as an example, the vehicle state detection portion acquires acceleration data serving as the vehicle state data and output from an acceleration detection portion provided at the vehicle, and the control portion performs the rotation control on the captured image data depending on a roll angle indicating an inclination around a front-rear axis of the vehicle obtained from the acceleration data with an origin at a position coordinate within the display region of the captured image data, the position coordinate corresponding to a center of a lens used for imaging by the imaging portion. Accordingly, as an example, an effect that it is easy to grasp the state around the vehicle on a basis of the captured image data is obtained. In addition, an effect that a height difference is easily visually recognizable is obtained.
- Further, in the aforementioned vehicle control apparatus, as an example, the control portion further performs an enlargement processing or a reduction processing on the captured image data. Accordingly, as an example, the captured image data is enlarged or reduced depending on the output destination, which obtains an effect where visibility improves.
- Furthermore, in the aforementioned vehicle control apparatus, as an example, the control portion further moves the position coordinate corresponding to the center of the lens from a center of the display region relative to the captured image data. Accordingly, as an example, the movement control of the captured image data is performed depending on the output destination, which obtains an effect where visibility improves.
- Furthermore, in the aforementioned vehicle control apparatus, as an example, the control portion further moves the position coordinate corresponding to the center of the lens from the center of the display region to an upper direction within the display region. Accordingly, as an example, because a lower region than the horizontal line included in the captured image data is displayed, an effect that the state around the vehicle may be easily grasped is obtained.
- Furthermore, in the aforementioned vehicle control apparatus, as an example, the captured image data is displayed at a display device, the display device displaying information that represents at least one of a roll angle indicating an inclination around a front-rear axis of the vehicle and a pitch angle indicating an inclination around a left-right axis of the vehicle together with the captured image data. Accordingly, as an example, an effect that it is easy to grasp both the vehicle state and the state around the vehicle is obtained.
- Furthermore, in the aforementioned vehicle control apparatus, as an example, the acquisition portion further acquires information indicating whether or not the vehicle is switched to a mode for off-road, and the control portion performs the rotation control on the captured image data depending on the vehicle state data in a case where the vehicle is switched to the mode for off-road. Accordingly, as an example, an effect that the state around the vehicle is visually recognizable in the mode for off-road is obtained.
- Furthermore, the aforementioned vehicle control apparatus, as an example, further includes a generation portion generating bird's eye view image data indicating a ground in a surrounding of the vehicle in an overhead view based on the captured image data on which the rotation control is performed by the control portion. Accordingly, as an example, the surroundings of the vehicle are recognizable in the overhead view by referring to the bird's eye view image data on which a change of point of view is performed after the rotation control for leveling is conducted, which obtains an effect that the state around the vehicle is visually recognizable.
- Furthermore, a program according to the embodiments of the present invention, as an example, is configured to cause a computer to execute an acquisition step acquiring captured image data output from an imaging portion that is provided at a vehicle and that images a surrounding of the vehicle, and vehicle state data output from a vehicle state detection portion provided at the vehicle and a control step performing a rotation control on the captured image data based on an inclination in a left-right direction of the vehicle relative to a horizontal direction which serves as a direction included in a horizontal plane orthogonal to a direction of gravity, the inclination in the left-right direction of the vehicle being calculated from the vehicle state data, the control step performing the rotation control in a manner that a horizontal line included in a subject captured in the captured image data is substantially parallel to a lateral-direction side at a display region serving as an output destination. Accordingly, as an example, an effect that it is easy to grasp the state around the vehicle on a basis of the captured image data is obtained.
- The aforementioned program, as an example, is further configured to cause the computer to execute a generation step generating bird's eye view image data indicating a ground in a surrounding of the vehicle in the overhead view based on the captured image data on which the rotation control is performed by the control portion. As an example, the bird's eye view image data on which a change of point of view is performed after the rotation control for leveling is conducted is generated. As a result, because the surroundings of the vehicle are recognizable in the overhead view, an effect where the state around the vehicle is visually recognizable is obtained.
- [Fig. 1] Fig. 1 is a perspective view illustrating an example of a state where a portion of an interior of a vehicle according to embodiments is viewed in a perspective manner;
- [Fig. 2] Fig. 2 is a plan view (bird's eye view) illustrating an example of the vehicle according to the embodiments;
- [Fig. 3] Fig. 3 is a block diagram illustrating an example of a surroundings monitoring system of the vehicle according to the embodiments;
- [Fig. 4] Fig. 4 is a diagram illustrating an example of a detection direction of an acceleration sensor according to the embodiments;
- [Fig. 5] Fig. 5 is a block diagram illustrating a construction of a surroundings monitoring portion realized within a surroundings monitoring ECU according to the first embodiment;
- [Fig. 6] Fig. 6 is an example of captured image data captured by an imaging portion according to the first embodiment;
- [Fig. 7] Fig. 7 is a diagram illustrating an example of a two-dimensional orthogonal coordinate system that indicates a display region of the captured image data in a case where a position coordinate corresponding to a center of a lens serves as an origin;
- [Fig. 8] Fig. 8 is a diagram illustrating an example of the captured image data after a rotation correction is performed by a rotation control portion according to the first embodiment;
- [Fig. 9] Fig. 9 is a diagram illustrating an example of image data after a composition portion performs composition according to the first embodiment;
- [Fig. 10] Fig. 10 is a flowchart illustrating procedures of a display processing on a display device in a surroundings monitoring portion according to the first embodiment;
- [Fig. 11] Fig. 11 is a block diagram illustrating a construction of a surroundings monitoring portion realized within a surroundings monitoring ECU according to the second embodiment;
- [Fig. 12] Fig. 12 is a diagram illustrating an example of a state where a vehicle rides on a stone, for example, during an off-road driving of the vehicle according to the second embodiment;
- [Fig. 13] Fig. 13 is a diagram illustrating an example of a two-dimensional orthogonal coordinate system that indicates a display region of captured image data in a case where a position coordinate corresponding to a center of a lens serves as an origin;
- [Fig. 14] Fig. 14 is a diagram illustrating a concept of optical flow calculated by a moving amount calculation portion according to the second embodiment;
- [Fig. 15] Fig. 15 is a diagram illustrating a relation between an average value of optical flows and a moving amount of the vehicle;
- [Fig. 16] Fig. 16 is a diagram illustrating a torsion of the captured image data caused by an inclination of the vehicle and a display range of the captured image data after a conversion portion performs conversion according to the second embodiment;
- [Fig. 17] Fig. 17 is a diagram illustrating an example of bird's eye view image data after a composition portion performs composition according to the second embodiment;
- [Fig. 18] Fig. 18 is a diagram illustrating an example of the bird's eye view image data stored at a bird's eye view image storage portion according to the second embodiment;
- [Fig. 19] Fig. 19 is a diagram illustrating an example of screen information output by an output portion according to the second embodiment;
- [Fig. 20] Fig. 20 is a flowchart illustrating procedures of a display processing on a display device 8 in a surroundings monitoring portion according to the second embodiment;
- [Fig. 21] Fig. 21 is a block diagram illustrating a construction of a surroundings monitoring portion realized within a surroundings monitoring ECU according to the third embodiment; and
- [Fig. 22] Fig. 22 is a diagram illustrating an example of screen information output by an output portion according to the third embodiment.
- The following plural embodiments include similar components to one another. Thus, the similar components bear common reference numerals, and duplicated explanation is omitted.
- In the embodiment, a vehicle 1 may be a car (internal combustion car) including an internal combustion engine (an engine, not illustrated) as a driving source, a car (an electric car, a fuel cell car, or the like) including an electric motor (a motor, not illustrated) as the driving source, or a car (a hybrid car) including the engine and the motor as the driving sources, for example. In addition, the vehicle 1 may include various kinds of transmissions and various kinds of apparatuses (systems, parts and the like) necessary for driving the internal combustion engine or the electric motor. Further, the method, quantity, layout and the like of an apparatus related to driving of wheels 3 of the vehicle 1 may be variously specified.
- As illustrated in Fig. 1, a vehicle body 2 forms a vehicle interior 2a where a passenger (not illustrated) gets in. A steering portion 4, an acceleration operating portion 5, a braking operating portion 6, a speed change operating portion 7 and the like are provided within the vehicle interior 2a in a state facing a seat 2b of a driver as the passenger. In the present embodiment, as an example, the steering portion 4 is a steering wheel projecting from a dashboard (instrument panel) and the acceleration operating portion 5 is an accelerator pedal positioned at the feet of the driver. The braking operating portion 6 is a brake pedal positioned at the feet of the driver and the speed change operating portion 7 is a shift lever projecting from a center console. Nevertheless, the steering portion 4, the acceleration operating portion 5, the braking operating portion 6 and the speed change operating portion 7 are not limited to the aforementioned members.
- In addition, a display device 8 (display output portion) and an audio output device 9 (audio output portion) are provided within the vehicle interior 2a. The display device 8 is, for example, an LCD (liquid crystal display), an OELD (organic electroluminescent display) and the like. The audio output device 9 is, as an example, a speaker. In the present embodiment, the display device 8 is covered by a clear operation input portion 10 (for example, a touch panel and the like). The passenger and the like may visually confirm a projected image (image) on a display screen of the display device 8 via the operation input portion 10. The passenger and the like may perform an operation input (instruction input) by operating the operation input portion 10, i.e., touching, pressing or moving the operation input portion 10 with one's finger at a position corresponding to the projected image (image) displayed on the display screen of the display device 8. In the present embodiment, as an example, the display device 8, the audio output device 9, the operation input portion 10 and the like are provided at a monitor device 11 positioned at a center portion of the dashboard in a vehicle width direction (left-right direction). The monitor device 11 may include an operation input portion (not illustrated) such as a switch, a dial, a joystick and a pressing button, for example. An audio output device (not illustrated) may be provided at another position within the vehicle interior 2a, i.e., a position different from the monitor device 11. In addition, sound may be output from an audio output device other than the audio output device 9 of the monitor device 11. In the present embodiment, as an example, the monitor device 11 is shared by a navigation system and an audio system. Alternatively, a monitor device of a surroundings monitoring apparatus may be provided separately from the aforementioned systems. It may be configured that, in addition to the audio output device 9, a warning sound and the like may be output from an audio output portion such as a buzzer 24 (refer to Fig. 3), for example.
- As illustrated in Figs. 1 and 2, in the present embodiment, the vehicle 1 is a four-wheel vehicle (four-wheel car) as an example. The vehicle 1 includes two right and left front wheels 3F and two right and left rear wheels 3R. Further, in the present embodiment, these four wheels 3 are configured to be steered (capable of being steered). Specifically, as illustrated in Fig. 3, the vehicle 1 includes a front wheel steering system 12 steering the front wheels 3F and a rear wheel steering system 13 steering the rear wheels 3R. The front wheel steering system 12 and the rear wheel steering system 13 are electrically controlled by a surroundings monitoring ECU 14 (electronic control unit) and the like to operate respective actuators 12a, 13a. Each of the front wheel steering system 12 and the rear wheel steering system 13 is, for example, an electric power steering system, an SBW (steer by wire) system, and the like. The front wheel steering system 12 and the rear wheel steering system 13 assist a steering force by adding torque (assist torque) to the steering portion 4 by the actuators 12a, 13a or steer the corresponding wheels 3 (the front wheels 3F or the rear wheels 3R), for example. Each of the actuators 12a, 13a may steer one of the wheels 3 or may steer the plural wheels 3. In the present embodiment, as an example, the two front wheels 3F are steered substantially parallel to each other at the same phases (same phases, same steering directions, same rotation directions) and the two rear wheels 3R are steered substantially parallel to each other at the same phases. The driving wheels may be variously specified.
- In the present embodiment, as an example, plural (in the embodiment, four, as an example) imaging portions 16 (16a-16d) are provided at the vehicle 1 (vehicle body 2) as illustrated in Fig. 2. Each of the imaging portions 16 is, for example, a digital camera incorporating an imaging element such as a CCD (charge coupled device), a CIS (CMOS image sensor) and the like. The imaging portions 16 may output image data (moving image data, frame data) at a predetermined frame rate. Each of the imaging portions 16 includes a wide-angle lens to thereby take a picture in a range from 140° to 220° in a horizontal direction (view angle). An optical axis of each imaging portion 16 is specified to face downward (obliquely downward). Thus, the imaging portion 16 takes a picture of the outside environment around the vehicle body 2 including a road surface on which the vehicle 1 is movable.
- In the embodiment, as an example, the
imaging portion 16a is positioned at anend portion 2c (an end portion in a plan view) at a front side (a front side in a vehicle front-rear direction) of thevehicle body 2 and is provided at a front bumper, for example. Theimaging portion 16b is positioned at anend portion 2d at a left side (a left side in a vehicle width direction) of thevehicle body 2 and is provided at adoor mirror 2g (projecting portion) at a left side. Theimaging portion 16c is positioned at anend portion 2e at a rear side (a rear side in the vehicle front-rear direction) of thevehicle body 2 and is provided at a wall portion at a lower side of adoor 2h of a rear trunk. Theimaging portion 16d is positioned at anend portion 2f at a right side (a right side in the vehicle width direction) of thevehicle body 2 and is provided at adoor mirror 2g (projecting portion) at a right side. In the present embodiment, the method of mounting the camera at the vehicle is not limited and the camera may be mounted so that the image data in a front direction, the image data in right and left side directions and the image data in a rear direction relative to the vehicle is obtainable. - The
surroundings monitoring ECU 14 performs a calculation processing and an image processing based on the image data obtained by theplural imaging portions 16. Thesurroundings monitoring ECU 14 is able to generate a wider view angle image and a virtual bird's eye view image (planar image) where the vehicle 1 (vehicle body 2) is viewed from an upper side. - In the present embodiment, as an example, in a
surroundings monitoring system 100 as illustrated inFig. 3 , a brake system 18, a steering angle sensor 19 (angular sensor), anaccelerator sensor 20, ashift sensor 21, a wheel speed sensor 22, anacceleration sensor 26, and the like are electrically connected, in addition to thesurroundings monitoring ECU 14, themonitor device 11, the frontwheel steering system 12, the rearwheel steering system 13, and the like, via an in-vehicle network 23 (electric telecommunication line). The in-vehicle network 23 is configured as a CAN (controller area network) as an example. Thesurroundings monitoring ECU 14 may send a control signal via the in-vehicle network 23 to control the frontwheel steering system 12, the rearwheel steering system 13, the brake system 18, and the like. Thesurroundings monitoring ECU 14 may also receive detection results of a torque sensor 12b, a tire angle sensor 13b (for therear wheels 3R), an actuator 18a, abrake sensor 18b, the steering angle sensor 19 (for thefront wheels 3F), theaccelerator sensor 20, theshift sensor 21, the wheel speed sensor 22, theacceleration sensor 26, and the like and indicator signals (control signals, operation signals, input signals, data) of theoperation input portion 10 and the like via the in-vehicle network 23. - In the present embodiment, the two acceleration sensors 26 (26a, 26b) are provided at the
vehicle 1. In the embodiment, thevehicle 1 is equipped with an ESC (electronic stability control). Then, the acceleration sensors 26 (26a, 26b) as conventionally mounted to the vehicle equipped with the ESC (electronic stability control) are employed. In the present embodiment, no restriction is made on the acceleration sensor. The sensor that is able to detect the acceleration in the left-right direction of thevehicle 1 is acceptable. -
- Fig. 4 is a diagram illustrating an example of detection directions of the acceleration sensors 26a, 26b. A detection direction 401 is the detection direction of the acceleration sensor 26a while a detection direction 402 is the detection direction of the acceleration sensor 26b. The detection direction 401 illustrated in Fig. 4 corresponds to a direction inclined by 45 degrees from a travelling direction (front-rear direction) of the vehicle 1 on a plane in parallel with the ground (a plane on which the vehicle 1 is movable). The detection direction 402 forms an angle of 90 degrees relative to the detection direction 401 on the plane in parallel with the ground. Because the two different detection directions are provided on the plane in parallel with the ground, the acceleration in the front-rear direction and the acceleration in the left-right direction may be obtained. In the present embodiment, no restriction is made on the detection directions as long as at least the acceleration in the left-right direction may be obtained. Calculations of the acceleration in the front-rear direction and the acceleration in the left-right direction are made at the surroundings monitoring ECU 14.
- The front-rear direction of the vehicle 1 indicates the travelling direction and an opposite direction from the travelling direction of the vehicle 1. The left-right direction of the vehicle 1 is a direction included in a surface orthogonal to the travelling direction of the vehicle 1.
- Back to Fig. 3, the surroundings monitoring ECU 14 includes, as an example, a CPU 14a (central processing unit), a ROM 14b (read only memory), a RAM 14c (random access memory), a display control portion 14d, an audio control portion 14e, an SSD 14f (solid state drive, flash memory), and the like. The CPU 14a performs the image processing related to the image displayed at the display device 8 and various calculation processing such as calculation of a moving path of the vehicle 1 and determination of whether or not interference with an object occurs, for example. The CPU 14a reads out a program stored (installed) at a nonvolatile memory device such as the ROM 14b, for example, and performs the calculation processing based on the aforementioned program.
- The RAM 14c tentatively stores various data used for the calculations at the CPU 14a. The display control portion 14d mainly performs, within the calculation processing at the surroundings monitoring ECU 14, the image processing using the image data obtained at the imaging portions 16 and the image processing (composition and the like, as an example) of the image data displayed at the display device 8, for example. In addition, the audio control portion 14e mainly performs, within the calculation processing at the surroundings monitoring ECU 14, processing of audio data output at the audio output device 9. The SSD 14f is a rewritable nonvolatile memory portion that is able to store data even in a case where a power source of the surroundings monitoring ECU 14 is turned off. The CPU 14a, the ROM 14b, the RAM 14c and the like may be integrated within the same package. The surroundings monitoring ECU 14 may be configured to include another logic operation processor such as a DSP (digital signal processor) or a logic circuit, for example, instead of the CPU 14a. In addition, instead of the SSD 14f, an HDD (hard disk drive) may be provided. Further, the SSD 14f or the HDD may be provided separately from the surroundings monitoring ECU 14.
- Fig. 5 is a block diagram illustrating a construction of a surroundings monitoring portion 500 realized within the surroundings monitoring ECU 14 according to the present embodiment. Each construction within the surroundings monitoring portion 500 illustrated in Fig. 5 is realized in a case where the CPU 14a configured as the surroundings monitoring ECU 14 in Fig. 4 performs software stored within the ROM 14b.
- The surroundings monitoring portion 500 realizes an acquisition portion 501, an angle calculation portion 502, a filtering control portion 503, an image processing portion 504 and an output portion 505 by performing software stored within the ROM 14b (computer readable storage medium). At this time, the software (program) may be provided via another computer readable storage medium.
- Then, the surroundings monitoring portion 500 according to the present embodiment assists the driving of the driver by displaying the image data by which a state around the vehicle 1 is recognizable, on the basis of the captured image data input from the imaging portions 16 in a case where the vehicle 1 moves to be parked, and the acceleration data as an example of vehicle state data acquired by the acceleration sensor 26 (acceleration detection portion) functioning as an example of a vehicle state detection portion.
- The acquisition portion 501 acquires various pieces of information from various sensors, for example, provided at the vehicle 1. The acquisition portion 501 according to the present embodiment acquires the captured image data output from the imaging portions 16a to 16d provided at the vehicle 1 to capture the images in the surroundings of the vehicle 1 and the acceleration data output from the acceleration sensors 26a, 26b provided at the vehicle 1. Further, the acquisition portion 501 acquires information indicating whether or not a mode specified by a switch provided at the operation input portion 10 is an off-road mode. The acquisition portion 501 outputs the acquired information to the angle calculation portion 502 and the image processing portion 504.
- The acquisition portion 501 also correlates the captured image data with the acceleration data where the time when the image is captured in the captured image data and the time when the acceleration is detected in the acceleration data substantially match each other.
- The angle calculation portion 502 calculates an inclination angle (a pitch angle and a roll angle) of the vehicle 1 based on the acceleration data acquired by the acceleration sensors 26a, 26b. The pitch angle is an angle indicating an inclination of the vehicle 1 around a left-right axis (axis 412 in Fig. 4) of the vehicle 1. In a case where the vehicle 1 is present on the horizontal plane (ground), the pitch angle is zero degrees.
- The roll angle is an angle indicating an inclination of the vehicle 1 around a longitudinal axis (axis 411 in Fig. 4) of the vehicle 1. In a case where the vehicle 1 is present on the horizontal plane (ground), the roll angle is zero degrees. In order to calculate the pitch angle and the roll angle, the angle calculation portion 502 first calculates an acceleration a1 in the front-rear direction and an acceleration a2 in the left-right direction of the vehicle 1.
- The angle calculation portion 502 calculates the acceleration a1 in the front-rear direction using equation (1). The acceleration in the detection direction 401 is specified to be GL1 and the acceleration in the detection direction 402 is specified to be GL2. In the present embodiment, as an example, the acceleration a1 in the front-rear direction turns to 0G in a case where the pitch angle is 0° (in a case where the vehicle 1 is horizontal) and the acceleration a1 in the front-rear direction turns to 1G in a case where the pitch angle is 90° (in a case where the vehicle 1 is vertical).
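- As an illustration, the following is a minimal sketch of how the accelerations a1, a2 and the angles PA and RA can be derived from the two detection directions of Fig. 4. It assumes both detection directions lie at 45° to the front-rear axis and 90° apart, that readings are in units of G, and that a 1G gravity component maps an inclination angle to its sine; the sign conventions and the arcsine form are assumptions, not the patent's exact equations (1) to (4).

```python
# A minimal sketch of deriving roll/pitch angles from two accelerometer
# readings taken at 45 degrees to the vehicle's front-rear axis.
import math

def angles_from_acceleration(gl1: float, gl2: float):
    # Resolve the two 45-degree-rotated readings into the vehicle axes.
    a1 = (gl1 + gl2) / math.sqrt(2.0)   # front-rear: 0G when level, 1G when vertical
    a2 = (gl1 - gl2) / math.sqrt(2.0)   # left-right
    # Map the gravity component to an inclination angle (values clamped to
    # keep asin in its domain).
    pitch = math.degrees(math.asin(max(-1.0, min(1.0, a1))))
    roll = math.degrees(math.asin(max(-1.0, min(1.0, a2))))
    return roll, pitch
```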
angle calculation portion 502 correlates the roll angle and the pitch angle calculated from the acceleration data with the captured image data that is correlated to the aforementioned acceleration data. Accordingly, the roll angle and the pitch angle of thevehicle 1 when the captured image data is captured are recognizable. - The filtering control portion 503 performs filtering by low-pass filter relative to the roll angle RA and the pitch angle PA calculated by the
angle calculation portion 502. - In the present embodiment, steep changes of the roll angle RA and the pitch angle PA, in other words, a steep switching of the image data displayed at the
display device 8 is restrained by performing the low-pass filter. Accordingly, the driver may comfortably watch the image data displayed at thedisplay device 8. In the present embodiment, an example where digital filter is used by the filtering control portion 503 provided within thesurroundings monitoring portion 500 is explained. Nevertheless, for example, analog filter, for example, may be performed relative to a signal output from theacceleration sensor 26. - The
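- As an illustration of such a digital low-pass filter, a minimal sketch follows, assuming a first-order IIR form; the smoothing factor is a placeholder to be tuned.

```python
# A minimal sketch of a first-order IIR low-pass filter for the roll/pitch angles.
class LowPassFilter:
    def __init__(self, alpha: float = 0.1):
        self.alpha = alpha      # 0 < alpha <= 1; smaller = stronger smoothing
        self.value = None

    def update(self, sample: float) -> float:
        # y[n] = y[n-1] + alpha * (x[n] - y[n-1]); restrains steep changes of
        # the angles and hence steep switching of the displayed image.
        if self.value is None:
            self.value = sample
        else:
            self.value += self.alpha * (sample - self.value)
        return self.value
```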
- The image processing portion 504 includes a rotation control portion 521, a reduction/enlargement control portion 522, a movement control portion 523 and a composition portion 524 to generate the image data displayed at the display device 8.
- The rotation control portion 521 performs a rotation correction on the captured image data capturing the surroundings of a front side of the vehicle 1. A subject of the rotation correction is not limited to the captured image data captured by the imaging portion 16a and may be the captured image data captured by the imaging portion 16c capturing the surroundings of a rear side of the vehicle 1, for example.
- Fig. 6 is an example of the captured image data captured by the imaging portion 16a. The captured image data illustrated in Fig. 6 is captured from the vehicle 1 that is inclined. The driver tends to recognize the image displayed at the display device 8 in an objective way, and thus tends to recognize areas in the captured image data that are at the same height in the vertical axis direction of the screen as being at the same height in reality, or at heights with a smaller height difference than the actual height difference. In the example illustrated in Fig. 6, a region 601 and a region 602 are possibly recognized as being at the same height.
- Thus, the rotation control portion 521 according to the present embodiment performs the rotation correction on the captured image data depending on the roll angle obtained by the acceleration sensors 26. In other words, the rotation control portion 521 performs the rotation correction (control) on the captured image data based on the inclination of the vehicle in the left-right direction relative to the horizontal direction serving as the direction included in the horizontal plane orthogonal to the direction of gravity, the inclination being calculated from the vehicle state data. For example, the rotation control portion 521 performs the rotation correction (control) so that a horizontal line included in a subject captured in the captured image data is substantially parallel to a lateral-direction side at a display region of an output destination.
- The rotation control portion 521 according to the present embodiment performs the rotation correction with an origin at a position coordinate within the display region of the captured image data corresponding to a center of a lens used by the imaging portion 16 for image capturing, depending on the roll angle correlated to the aforementioned captured image data.
- Fig. 7 is a diagram illustrating an example of a two-dimensional orthogonal coordinate system that indicates the display region of the captured image data in a case where the position coordinate corresponding to the center of the lens serves as the origin. For each position coordinate included in the coordinate system illustrated in Fig. 7, the rotation control portion 521 converts the position coordinate by the equation (5) indicated below so as to achieve the rotation correction of the captured image data. Here, dx0, dy0 is a coordinate value with the origin at the center of the lens. In addition, θ is the roll angle that is calculated.
- dx1 = dx0 × cosθ − dy0 × sinθ, dy1 = dx0 × sinθ + dy0 × cosθ ... (5)
- Fig. 8 is a diagram illustrating an example of the captured image data obtained after the rotation correction is performed by the rotation control portion 521. In the example illustrated in Fig. 8, the rotation correction is performed so that the horizontal line included in the subject (environment outside the vehicle 1) captured in the captured image data is substantially in parallel with the lateral-direction side of the display region of the display device 8. In other words, the rotation correction is performed so that the lower direction of the captured image data corresponds to the direction of gravity of the subject (environment outside the vehicle 1) captured in the aforementioned captured image data. At this time, the lower direction and the direction of gravity do not necessarily completely coincide with each other and may coincide with each other to the extent that a height relation within the captured image data is recognizable.
- For example, as for the region 601 and the region 602 which seem to be at the same height in Fig. 6, it is recognizable in Fig. 8 that the region 602 is present at a higher position than the region 601. Therefore, the driver may recognize objective heights in the surrounding environment of the vehicle 1. Accordingly, appropriate driving is achievable, which may improve safety.
enlargement control portion 522 functioning as the control portion performs an enlargement processing or a reduction processing on the captured image data after the rotation correction is performed by the rotation control portion 521. The reduction/enlargement control portion 522 converts the position coordinate by an equation (6) indicated below to achieve an enlargement correction or a reduction correction of the captured image data. Here, dx1, dy1 is a coordinate value with the origin at the center of the lens after the rotation correction is performed, and magX and magY are the horizontal and vertical enlargement/reduction rates. The enlargement/reduction rate is decided based on a relationship between a display size of the captured image data and the number of pixels of the display region of the display device 8. -
dx2 = magX × dx1
dy2 = magY × dy1 ... (6)
- The movement control portion 523 functioning as the control portion performs a control on the captured image data after the enlargement or reduction processing is performed by the reduction/enlargement control portion 522 so that the position coordinate corresponding to the center of the lens moves away from the center of the display region of the display device 8. In the present embodiment, the movement control portion 523 performs a control to move the position coordinate corresponding to the center of the lens from the center of the display region of the display device 8 in an upper direction within the display region. - That is, in a situation where the
vehicle 1 is inclined, the driver tends to want to confirm the ground conditions. Thus, the movement control portion 523 performs the processing to move the position coordinate corresponding to the center of the lens from the center of the display region of the display device 8 in the upper direction within the display region. Accordingly, conditions above the vehicle 1, such as the sky captured in the captured image data, are not displayed, and conditions below the vehicle 1 are displayed. The user may therefore recognize the ground conditions around the vehicle 1 by referring to the captured image data displayed at the display device 8. Accordingly, an appropriate steering assist is achievable. - The
movement control portion 523 converts the position coordinate by an equation (7) indicated below to achieve the movement of the position coordinate of the captured image data. Here, dx2, dy2 is a coordinate value with the origin at the center of the lens after the enlargement/reduction correction is performed, and the destination of the position coordinate of the center of the lens is (cx, cy). -
dx3 = dx2 + cx
dy3 = dy2 + cy ... (7)
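The enlargement/reduction of equation (6) and the move of equation (7) then compose as in the following sketch, in which magX, magY, cx and cy follow the notation above and the sample values are assumptions.

```python
def scale_and_move(dx1, dy1, magX, magY, cx, cy):
    # Equation (6): enlargement/reduction about the lens center.
    dx2, dy2 = magX * dx1, magY * dy1
    # Equation (7): shift so that the lens center lands at (cx, cy),
    # e.g. above the display-region center so that the ground fills the view.
    return dx2 + cx, dy2 + cy

# Lens-center point after rotation, doubled in size, lens center moved to
# an assumed destination (400, 180) of an 800x480 display region.
print(scale_and_move(0.0, 0.0, 2.0, 2.0, 400, 180))
```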
- The composition portion 524 performs a cutout on the captured image data after the movement control is performed by the movement control portion 523 so as to conform to the display region of the display device 8, and thereafter combines display information for assisting the steering of the driver. -
Fig. 9 is a diagram illustrating an example of the image data after the composition is performed by the composition portion 524. In the example illustrated in Fig. 9, conditions around the left front wheel of the vehicle 1 captured by the imaging portion 16b are displayed at a display region 901. In addition, conditions around the right front wheel of the vehicle 1 captured by the imaging portion 16d are displayed at a display region 902. Further, information by which the pitch angle and the roll angle of the vehicle 1 are recognizable is displayed at a display region 903. That is, an inclination of an icon 921 representing the vehicle 1 indicates the roll angle, while a distance between a center line 912 passing through the icon 921 and a line 911 indicates the pitch angle. In the present embodiment, information by which the roll angle and the pitch angle are recognizable is indicated in this manner; however, the display method is not limited thereto and other display methods are acceptable. - In the
vehicle 1 according to the present embodiment, the roll state and the pitch state of the vehicle 1 during off-road driving may be displayed in real time. Accordingly, the driver may easily and objectively recognize the conditions of the surroundings of the vehicle 1. - In addition, the captured image data after the cutout by the
composition portion 524 is displayed at a display region 904. The horizontal line within the image in the captured image data is corrected to be substantially parallel to a lateral frame of the display device 8. In other words, the lower direction of the image in the captured image data is corrected to match the direction of gravity. Accordingly, the driver may easily recognize the surrounding state. - Then, the
output portion 505 outputs the image data composited by the composition portion 524 to the display device 8. Accordingly, together with the captured image data after the correction processing is performed, information by which the roll angle and the pitch angle are recognizable is displayed at the display device 8. - In the example of
Fig. 8 or 9, an estimated course line 905 of each of the front wheels 3F is included. The surroundings monitoring ECU 14 (CPU 14a) is able to calculate a planned course based on detection results of the steering angle sensor 19 and the tire angle sensor 13b, for example, and to superimpose the estimated course line 905 conforming to the planned course on the output image. The estimated course line 905 is an example of a display element indicating the planned course. The surroundings monitoring ECU 14 corrects the display position, size, posture (inclination) and the like of the estimated course line 905 depending on the aforementioned rotation, enlargement/reduction and movement corrections. In addition, in a case where the position of the estimated course line 905 deviates greatly from the center of the screen, the surroundings monitoring ECU 14 is able to correct the display region and the estimated course line 905 in a direction in which the deviation is reduced. - In the example of
Fig. 9, the inclination of the icon 921 relative to a lateral-direction side of the display region 903 or 904 (an upper side or a lower side in Fig. 9) corresponds to the roll angle of the vehicle 1. Thus, the surroundings monitoring ECU 14 may constitute a tiltmeter 906 (roll angle display portion) using the icon 921 by including an angle scale 906a (tilt scale) surrounding the icon 921 in the output image in a manner that the angle of the angle scale 906a remains unchanged relative to the display region 903. From the display of the display region 904 alone, it may be difficult to understand the horizontal direction, the vertical direction, and the posture (the roll angle or the pitch angle) of the vehicle 1. In this regard, as in the example of Fig. 9, the icon 921 that performs rolling and pitching is displayed and the tiltmeter 906 is displayed on the screen depending on the roll angle and the pitch angle, so that the horizontal direction, the vertical direction and the posture (the roll angle) of the vehicle 1 may be easily understood regardless of the state of the image of the display region 904. Accordingly, the display region 904 and the display region 903 are displayed together (displayed within the same screen or displayed in parallel with each other), so that the state around the vehicle and the posture of the vehicle may be even more easily understood. - In addition, the present embodiment may not perform the aforementioned rotation, enlargement/reduction and movement corrections on a constant basis, and may be specified to perform the corrections in a case where the
vehicle 1 is brought into the off-road mode. For example, the image processing portion 504 performs the aforementioned rotation, enlargement/reduction and movement corrections at the time of the off-road mode by referring to information, acquired by the acquisition portion 501, indicating whether or not the vehicle 1 is in the off-road mode. - Here, the off-road mode corresponds to the mode for bringing out the four-wheel driving performance of the
vehicle 1 during off-road driving, that is, the mode for setting the transfer gear to low. In the present embodiment, the captured image data displayed at the display device 8 is thus switched in association with the operation performed for off-road driving. The switching of the image displayed at the display device 8 is, however, not limited to the case where the vehicle 1 is switched to the off-road mode. For example, in a case where the vehicle 1 is switched to four-wheel driving in a two/four-wheel drive switching, the image after the rotation correction may be controlled to be displayed. - Next, the display processing at the
display device 8 in the surroundings monitoring portion 500 according to the present embodiment is explained. Fig. 10 is a flowchart illustrating procedures of the aforementioned processing in the surroundings monitoring portion 500 according to the present embodiment. - First, the
acquisition portion 501 acquires the captured image data from the imaging portions 16 (step S1001). Next, the acquisition portion 501 acquires the acceleration data from the acceleration sensors 26 (step S1002). - Then, the angle calculation portion 502 calculates the roll angle and the pitch angle of the vehicle 1 from the acceleration data (step S1003). - Next, the filtering control portion 503 applies low-pass filtering to the calculated roll angle and the calculated pitch angle (step S1004). - Then, the rotation control portion 521 performs the rotation control on the captured image data depending on the roll angle (step S1005). - Next, the reduction/enlargement control portion 522 and the movement control portion 523 perform the enlargement control and the movement control on the captured image data after the rotation control is performed (step S1006). - Then, the composition portion 524 performs the cutout conforming to the display region of the display device 8 on the captured image data after the enlargement control and the movement control are performed (step S1007). - Next, the composition portion 524 combines the captured image data indicating the state around the front wheels and the display information by which the pitch angle and the roll angle are recognizable with the captured image data that is cut out (step S1008). - Then, the output portion 505 outputs the image data after the composition by the composition portion 524 to the display device 8 (step S1009). - The surroundings monitoring portion 500 according to the present embodiment includes the aforementioned construction so that the difference in height in the surroundings of the vehicle 1 is easily recognizable. Accordingly, the load of steering may be reduced, to thereby improve safety.
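As a rough sketch of steps S1005 through S1007 of the flow above, the rotation of equation (5), the enlargement/reduction of equation (6) and the move of equation (7) may be collapsed into a single affine warp. The Python/OpenCV code below is only one possible realization under assumed parameter values (display size, lens-center position, destination coordinate) and is not the implementation of the disclosure.

```python
import numpy as np
import cv2

def correct_frame(frame, roll_deg, lens_center, dest, mag=(1.0, 1.0),
                  out_size=(800, 480)):
    # Compose rotation (5) about the lens center, enlargement (6), and the
    # move (7) of the lens center to `dest` into one affine matrix.
    t = np.deg2rad(-roll_deg)              # counter-rotate by the roll angle
    R = np.array([[np.cos(t), -np.sin(t)],
                  [np.sin(t),  np.cos(t)]])
    A = np.diag(mag) @ R                   # enlargement applied after rotation
    shift = np.asarray(dest, float) - A @ np.asarray(lens_center, float)
    M = np.hstack([A, shift[:, None]])
    # warpAffine also realizes the cutout of step S1007: pixels mapped
    # outside out_size are simply dropped.
    return cv2.warpAffine(frame, M, out_size)

frame = np.zeros((720, 1280, 3), np.uint8)  # stand-in for imaging portion 16a
view = correct_frame(frame, roll_deg=7.5, lens_center=(640, 360), dest=(400, 180))
```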
- Fig. 11 is a block diagram illustrating a construction of a surroundings monitoring portion 700 realized within the surroundings monitoring ECU 14 according to the present embodiment. The CPU 14a configured as the surroundings monitoring ECU 14 in Fig. 4 executes software stored within the ROM 14b to thereby realize an acquisition portion 701, an angle calculation portion 702, a filtering control portion 703, an image processing portion 704 and an output portion 705 illustrated in Fig. 11. In addition, the surroundings monitoring portion 700 realizes a bird's eye view image storage portion 706 on the RAM 14c. - The
acquisition portion 701 acquires various pieces of information from various sensors, for example, provided at the vehicle 1. The acquisition portion 701 according to the present embodiment acquires the captured image data output from the imaging portions 16a to 16d provided at the vehicle 1 to capture the images in the surroundings of the vehicle 1, and the acceleration data serving as an example of the vehicle state data output from the acceleration sensors 26a and 26b provided at the vehicle 1 and functioning as an example of the vehicle state detection portion. The acquisition portion 701 outputs the acquired information to the angle calculation portion 702 and the image processing portion 704. - The acquisition portion 701 also correlates the captured image data with the acceleration data such that the time when the image is captured in the captured image data and the time when the acceleration is detected in the acceleration data substantially match each other. - The angle calculation portion 702 calculates the inclination angle (the pitch angle and the roll angle) of the vehicle 1 based on the acceleration data acquired by the acceleration sensors 26a and 26b. The pitch angle is an angle indicating the inclination of the vehicle 1 around the left-right axis (axis 412 in Fig. 4) of the vehicle. In a case where the vehicle 1 is present on the horizontal plane (ground), the pitch angle is zero degrees. - The roll angle is an angle indicating the inclination of the vehicle 1 around the longitudinal axis (axis 411 in Fig. 4) of the vehicle 1. In a case where the vehicle 1 is present on the horizontal plane (ground), the roll angle is zero degrees. In order to calculate the pitch angle and the roll angle, the angle calculation portion 702 first calculates the acceleration a1 in the front-rear direction and the acceleration a2 in the left-right direction of the vehicle 1. - The
angle calculation portion 702 calculates the acceleration a1 in the front-rear direction using the following equation (1). The acceleration in the detection direction 401 is denoted GL1 and the acceleration in the detection direction 402 is denoted GL2. In the present embodiment, as an example, the acceleration a1 in the front-rear direction is 0G in a case where the pitch angle is 0° (in a case where the vehicle 1 is horizontal) and is 1G in a case where the pitch angle is 90° (in a case where the vehicle 1 is vertical). -
a1 = (GL1 + GL2) × cos 45° ... (1)
In the same manner, the angle calculation portion 702 calculates the acceleration a2 in the left-right direction using an equation (2). -
a2 = (GL1 − GL2) × sin 45° ... (2)
The angle calculation portion 702 then calculates the pitch angle PA by an equation (3) and the roll angle RA by an equation (4). -
PA = sin⁻¹(a1 / 1G) ... (3)
RA = sin⁻¹(a2 / 1G) ... (4)
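A sketch of this angle computation is given below; it assumes, as in the reconstruction of equations (1) to (4) above, that the detection directions 401 and 402 are mounted at ±45° to the front-rear axis, which is an assumption rather than a statement of the disclosure.

```python
import numpy as np

G = 1.0  # accelerations expressed in units of g

def angles_from_accel(gl1, gl2):
    # Equations (1)-(4) as reconstructed above; the +/-45-degree mounting
    # of detection directions 401 and 402 is an assumption.
    c45 = np.cos(np.deg2rad(45.0))
    a1 = (gl1 + gl2) * c45          # (1) front-rear acceleration
    a2 = (gl1 - gl2) * c45          # (2) left-right acceleration
    pa = np.degrees(np.arcsin(np.clip(a1 / G, -1.0, 1.0)))  # (3) pitch angle PA
    ra = np.degrees(np.arcsin(np.clip(a2 / G, -1.0, 1.0)))  # (4) roll angle RA
    return pa, ra

# Vehicle pitched straight up: both sensors read cos(45 deg) of 1 g.
print(angles_from_accel(0.7071, 0.7071))  # -> (approx. 90.0, 0.0)
```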
- The angle calculation portion 702 correlates the roll angle and the pitch angle calculated from the acceleration data with the captured image data that is correlated to the acceleration data. Accordingly, the roll angle and the pitch angle of the vehicle 1 at the time when the captured image data is captured are recognizable. - The filtering control portion 703 applies low-pass filtering to the roll angle RA and the pitch angle PA calculated by the angle calculation portion 702. - In the present embodiment, steep changes of the roll angle RA and the pitch angle PA, in other words, a steep switching of the image data displayed at the display device 8, are restrained by the low-pass filtering. Accordingly, the driver may comfortably watch the image data displayed at the display device 8. In the present embodiment, an example in which a digital filter is used by the filtering control portion 703 provided within the surroundings monitoring portion 700 is explained; nevertheless, analog filtering may instead be applied to the signal output from the acceleration sensor 26.
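A first-order digital low-pass filter of the kind that could serve here is sketched below; the filter coefficient is an assumed value, not a parameter taken from the disclosure.

```python
class LowPass:
    # Simple first-order IIR low-pass; alpha closer to 0 smooths harder.
    def __init__(self, alpha=0.1):
        self.alpha = alpha
        self.state = None

    def __call__(self, x):
        if self.state is None:
            self.state = x
        else:
            self.state += self.alpha * (x - self.state)
        return self.state

lp_roll, lp_pitch = LowPass(), LowPass()
smoothed_roll = lp_roll(7.5)    # degrees, fed once per acquisition cycle
smoothed_pitch = lp_pitch(-2.0)
```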
- The image processing portion 704 includes a rotation control portion 711, a bird's eye view image generation portion 712 (generation portion), a moving amount calculation portion 713, a conversion portion 714 and a composition portion 715, each of which serves as the control portion. The image processing portion 704 generates the image data to be displayed at the display device 8. - The
rotation control portion 711 performs the rotation correction on the captured image data capturing the surroundings of the front side (travelling direction) of the vehicle 1, based on the inclination of the vehicle in the left-right direction relative to the horizontal direction calculated from the vehicle state data (in other words, depending on the roll angle). The horizontal direction is, for example, a direction orthogonal to the travelling direction. In addition, the rotation correction may be performed on the captured image data in the same direction as the rotation direction in which the vehicle becomes horizontal, based on the inclination in the left-right direction of the vehicle calculated from the vehicle state data. In other words, the rotation correction may be performed on the captured image data as if the image were captured in a state where the left-right direction of the vehicle 1 is horizontal (in a state where the vehicle 1 is placed on the horizontal plane orthogonal to the direction of gravity). In the present embodiment, as an example, the acceleration is used as the vehicle state data. The vehicle state data, however, is not limited to the acceleration and may be any information relevant to the state of the vehicle 1. A subject of the rotation correction is not limited to the captured image data captured by the imaging portion 16a and may be the captured image data captured by the imaging portion 16c that captures the surroundings of the rear side of the vehicle 1. -
Fig. 12 is a diagram illustrating an example of a state where the vehicle 1 drives over a stone, for example, during off-road driving. In the example illustrated in Fig. 12, because the vehicle 1 drives over the stone, the vehicle 1 is inclined by a roll angle θ. In a case where bird's eye view image data is generated from the captured image data captured by the imaging portion 16a in this state, distortion depending on the roll angle θ occurs. - Therefore, the
rotation control portion 711 according to the present embodiment performs the rotation correction on the captured image data depending on the roll angle θ obtained from the acceleration sensor 26. - The
rotation control portion 711 according to the present embodiment performs the rotation correction, depending on the roll angle correlated to the captured image data, with the origin at the position coordinate within the display region of the captured image data corresponding to the center of the lens used by the imaging portion 16a for image capturing. -
Fig. 13 is a diagram illustrating an example of a two-dimensional orthogonal coordinate system that indicates the display region of the captured image data in a case where the position coordinate corresponding to the center of the lens serves as the origin. For each position coordinate included in the coordinate system illustrated in Fig. 13, the rotation control portion 711 converts the position coordinate by the equation (5) indicated below so as to achieve the rotation correction of the captured image data. Here, dx0, dy0 is a coordinate value with the origin at the center of the lens, and the angle θ illustrated in Fig. 13 is the roll angle that is calculated. -
dx1 = dx0 × cos θ − dy0 × sin θ
dy1 = dx0 × sin θ + dy0 × cos θ ... (5)
- The rotation control portion 711 performs the rotation correction on all pixels included in a display region 801 so as to generate a display region 802 obtained by rotating the display region 801 by the angle θ. Then, the surroundings monitoring portion 700 generates the bird's eye view image data based on the captured image data including the display region 802 obtained after the rotation control is performed. Accordingly, the bird's eye view image data in which the inclination caused by the roll angle θ generated at the vehicle 1 is corrected may be generated. - In addition, the rotation control portion 711 according to the present embodiment does not limit the angle for the rotation control on the captured image data to the roll angle by which the vehicle 1 is inclined from the horizontal plane. The rotation control portion 711 may perform the rotation control on the captured image data depending on a difference between the roll angle previously calculated and the roll angle presently calculated. This is because, when the vehicle 1 is driven on ground that is inclined by a predetermined angle (the previously calculated roll angle), the state of the ground around the vehicle 1 is more easily recognizable in a case where the bird's eye view image data is generated from a viewpoint inclined by the predetermined angle from the vertically upper side of the vehicle 1 than in a case where it is generated from the vertically upper side of the vehicle 1. In this case, when the vehicle 1 drives over a stone, for example, the rotation control portion 711 performs the rotation control on the captured image data only for the difference of the roll angles resulting from that inclination (the difference between the previously calculated roll angle and the presently calculated roll angle). - The bird's eye view
image generation portion 712 generates, based on the captured image data obtained after the rotation control is performed, the bird's eye view image data that looks down, from above, on the ground in the travelling direction of the vehicle 1, i.e., the ground around the vehicle 1. Any method for generating the bird's eye view image data from the captured image data is acceptable; for example, a mapping table may be used for the conversion. - The generation of the bird's eye view image data is performed each time the captured image data is acquired. In other words, the bird's eye view image generation portion 712 generates first bird's eye view image data based on first captured image data on which the rotation control is performed by the rotation control portion 711, and thereafter generates second bird's eye view image data based on second captured image data which is captured by the imaging portions 16 after the first captured image data is captured and the vehicle 1 moves, and on which the rotation control is performed by the rotation control portion 711.
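One common way to realize such a conversion is a homography computed from four calibrated road-plane points, which plays the role of the per-pixel mapping table. The sketch below is only an illustration under assumed coordinate values, not the implementation of the disclosure.

```python
import numpy as np
import cv2

# Four points on the road plane in the rotation-corrected camera image and
# the positions they should occupy in the bird's eye view (assumed values,
# obtained from calibration in practice).
SRC = np.float32([[420, 300], [860, 300], [1180, 470], [100, 470]])
DST = np.float32([[300, 0], [500, 0], [500, 400], [300, 400]])
H = cv2.getPerspectiveTransform(SRC, DST)

def to_birds_eye(corrected_frame, out_size=(800, 400)):
    # Each output pixel is looked up in the source image through H,
    # which is exactly what a precomputed mapping table would encode.
    return cv2.warpPerspective(corrected_frame, H, out_size)
```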
- In the present embodiment, the image data displayed at the display device 8 is updated each time the vehicle 1 moves by a predetermined moving amount. Thus, the moving amount calculation portion 713 compares the bird's eye view image data generated by the bird's eye view image generation portion 712 with the bird's eye view image data used upon the previous updating so as to calculate the moving amount of the vehicle 1. - Nevertheless, comparing the entire bird's eye view image data causes a great processing load. Thus, the moving amount calculation portion 713 according to the present embodiment compares predetermined areas within the bird's eye view image data generated by the bird's eye view image generation portion 712. - Specifically, the moving amount calculation portion 713 according to the present embodiment cuts out the predetermined area (display range) from each of the first bird's eye view image data used upon the previous updating and the second bird's eye view image data generated after the first bird's eye view image data, so as to calculate an optical flow. -
Fig. 14 is a diagram illustrating the concept of the optical flow calculated by the moving amount calculation portion 713. (A) of Fig. 14 is the image data cut out at the predetermined display range from the first bird's eye view image data used upon the previous updating, while (B) of Fig. 14 is the image data cut out at the predetermined display range from the second bird's eye view image data generated presently by the bird's eye view image generation portion 712. - Then, the moving amount calculation portion 713 calculates the optical flows, which indicate as vectors the shift of (the feature points of) each displayed object, between the image data illustrated in (A) of Fig. 14 and the image data illustrated in (B) of Fig. 14. (C) of Fig. 14 illustrates an example of the calculated optical flows. In the example illustrated in (C) of Fig. 14, the length of each vector corresponds to the movement of a feature point (indicated by "X") in (A) of Fig. 14 to the feature point (indicated by "X") in (B) of Fig. 14. - Then, the moving amount calculation portion 713 calculates the moving amount of the vehicle 1 from the average value of the calculated optical flows. -
Fig. 15 is a diagram illustrating the relation between the average value of the optical flows and the moving amount of the vehicle 1. In the example illustrated in Fig. 15, an arrow 901 is specified to be the average value of the optical flows. The vehicle 1 turns about the rear wheel axle. Thus, for the average value 901 of the optical flows, the vehicle 1 turns by a turning angle θ1. Accordingly, the moving amount calculation portion 713 calculates the turning angle θ1 of the vehicle 1. Further, the moving amount calculation portion 713 calculates the moving amount of the vehicle 1 from the length of each of the optical flows. The moving amount calculation portion 713 may separately calculate the moving amount of the vehicle 1 in the front-rear direction and the moving amount of the vehicle 1 in the left-right direction.
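The optical-flow computation may be sketched as follows with pyramidal Lucas-Kanade tracking; the feature-detection parameters are assumptions, and converting the averaged pixel flow into the travel distance and the turning angle θ1 would additionally use the bird's eye view scale and the position of the rear wheel axle.

```python
import numpy as np
import cv2

def average_flow(prev_crop, cur_crop):
    # Mean optical flow (dx, dy) in pixels between the two bird's-eye
    # crops of (A) and (B) in Fig. 14; parameter values are assumptions.
    prev_gray = cv2.cvtColor(prev_crop, cv2.COLOR_BGR2GRAY)
    cur_gray = cv2.cvtColor(cur_crop, cv2.COLOR_BGR2GRAY)
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=7)
    if pts is None:
        return np.zeros(2)
    nxt, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, cur_gray, pts, None)
    flows = (nxt - pts)[status.ravel() == 1]
    if len(flows) == 0:
        return np.zeros(2)
    return flows.reshape(-1, 2).mean(axis=0)
```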
- The conversion portion 714 converts the bird's eye view image data generated presently by the bird's eye view image generation portion 712 into bird's eye view image data for composition with the bird's eye view image data stored at the bird's eye view image storage portion 706 in a case where the moving amount calculated by the moving amount calculation portion 713 is equal to or greater than a predetermined distance. - In a case where the vehicle 1 is inclined, the inclination is corrected by the rotation control portion 711; however, distortion resulting from the inclination remains in the captured image data captured by the imaging portions 16. Thus, in order to reduce the distortion, the conversion portion 714 performs a projective transformation on the bird's eye view image data. - In the present embodiment, the projective transformation is employed so that a case where torsion of the road surface within the captured image data is generated by the inclination of the vehicle 1 is converted into a case where the vehicle 1 is not inclined. For example, a so-called trapezoidal correction, where a trapezoidal-shaped area within the captured image data is converted to a rectangular or square area, is included. -
Fig. 16 is a diagram illustrating the distortion of the captured image data caused by the inclination of the vehicle 1 and the display range of the captured image data after the conversion is performed by the conversion portion 714. In the example illustrated in Fig. 16, distortion states (a1) and (a2) of the bird's eye view image data generated in a roll state of the vehicle 1 are illustrated, and distortion states (b1) and (b2) of the bird's eye view image data generated in a pitch state of the vehicle 1 are illustrated. In a case where the bird's eye view image data in the distorted state caused by the inclination of the vehicle 1 is combined with the bird's eye view image data stored at the bird's eye view image storage portion 706, the distortion is accumulated. - Therefore, the conversion portion 714 according to the present embodiment performs the projective transformation, determined based on the roll angle obtained from the acceleration data acquired when the captured image data is captured, on the second captured image data cut out at the display range ((a1), (a2), (b1), (b2)) specified based on that roll angle. The display range ((a1), (a2), (b1), (b2)) is thereby converted to the display range illustrated in (c) of Fig. 16. Here, any method for specifying the position coordinates of the four points indicating the display region serving as the subject of the projective transformation based on the roll angle may be used, regardless of whether it is a conventional method or not. For example, a correlation of the roll angle θ with each of the four position coordinates may be provided beforehand.
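The trapezoidal correction may be sketched with a four-point projective transformation as below; the quad stands for one of the roll-angle-dependent display ranges (a1) to (b2) of Fig. 16 and its values are assumptions, since in practice it would be looked up from the correlation with the roll angle θ described above.

```python
import numpy as np
import cv2

def untrapezoid(birds_eye, quad, out_size=(800, 400)):
    # Map the inclined display range (e.g. (a1) in Fig. 16) back onto the
    # upright rectangle of (c); `quad` lists its corners clockwise from
    # the top-left.
    w, h = out_size
    rect = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    P = cv2.getPerspectiveTransform(np.float32(quad), rect)
    return cv2.warpPerspective(birds_eye, P, out_size)

# Example with an assumed roll-dependent quad.
tile = untrapezoid(np.zeros((400, 800, 3), np.uint8),
                   quad=[[60, 0], [740, 0], [800, 400], [0, 400]])
```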
- The composition portion 715 combines the bird's eye view image data stored at the bird's eye view image storage portion 706 and the bird's eye view image data obtained after the projective transformation is performed by the conversion portion 714. -
Fig. 17 is a diagram illustrating an example of the bird's eye view image data obtained after the composition is performed by the composition portion 715. Within the bird's eye view image data illustrated in Fig. 17, a display range 1101 is the range most recently composited. A display range 1102 is a range composited before the display range 1101. A display range 1103 is a range composited before the display range 1102. Accordingly, in the present embodiment, the bird's eye view image data is composited each time the vehicle 1 moves. - The bird's eye view image storage portion 706 stores the bird's eye view image data after the composition by the composition portion 715. Accordingly, the bird's eye view image storage portion 706 composites and stores the bird's eye view image data newly generated each time the vehicle 1 moves by the predetermined moving amount, so that the bird's eye view image data indicating the condition of the ground below the vehicle 1 is stored at the bird's eye view image storage portion 706. -
Fig. 18 is a diagram illustrating an example of the bird's eye view image data stored at the bird's eye view image storage portion 706. As illustrated in Fig. 18, besides the bird's eye view image data generated from the captured image data presently captured, the bird's eye view image data generated from the captured image data captured up to the previous time is composited and stored. As illustrated in Fig. 18, the bird's eye view image data stored at the bird's eye view image storage portion 706 includes the ground below the vehicle 1. The outline of the vehicle 1 illustrated in Fig. 18 is indicated for ease of explanation and is not included in the actual bird's eye view image data stored at the bird's eye view image storage portion 706. - In addition, in a case where the composition portion 715 combines the bird's eye view image data stored at the bird's eye view image storage portion 706 with the bird's eye view image data obtained after the projective transformation is performed by the conversion portion 714, the composition portion 715 performs, when the vehicle 1 turns by the turning angle θ1, a rotation processing by the turning angle θ1 on the bird's eye view image data stored at the bird's eye view image storage portion 706. Then, the composition portion 715 performs the composition with the bird's eye view image data obtained after the projective transformation is performed by the conversion portion 714. Accordingly, the bird's eye view image data conforming to the turning of the vehicle 1 may be displayed.
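The composition with the turning angle θ1 may be sketched as follows; for simplicity the rotation center is taken here as the image center, whereas the text above turns the vehicle about the rear wheel axle, and the paste position is an assumed value.

```python
import numpy as np
import cv2

def composite(stored, new_tile, turn_deg, paste_xy):
    # Rotate the stored mosaic by the turning angle, then overwrite the
    # region where the newest converted tile belongs.
    h, w = stored.shape[:2]
    M = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), turn_deg, 1.0)
    rotated = cv2.warpAffine(stored, M, (w, h))
    x, y = paste_xy
    th, tw = new_tile.shape[:2]
    rotated[y:y + th, x:x + tw] = new_tile
    return rotated
```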
- The output portion 705 outputs, together with the bird's eye view image data stored at the bird's eye view image storage portion 706, the image data where various pieces of information are composited to the display device 8. Fig. 19 is a diagram illustrating an example of the screen information output by the output portion 705. In the example of the screen information illustrated in Fig. 19, captured image data 1302 capturing the travelling direction of the vehicle 1 by the imaging portion 16a, captured image data 1303 around the front left wheel of the vehicle 1 captured by the imaging portion 16b, and captured image data 1304 around the front right wheel of the vehicle 1 captured by the imaging portion 16d are displayed in addition to bird's eye view image data 1301. Further, in a display region 1305, the pitch angle and the roll angle of the vehicle 1 are displayed as recognizable information. That is, while the roll angle is indicated by an inclination of an icon, the pitch angle is indicated by a distance between a center line passing through the icon and a horizontal line. In the present embodiment, the roll angle and the pitch angle are made recognizable in this manner; however, the display method is not limited thereto and other display methods are acceptable. - In addition, the captured image data 1302 in which the travelling direction of the vehicle 1 is captured by the imaging portion 16a is the captured image data obtained after the rotation control is performed. Accordingly, the driver may recognize the height relation within the captured image data. - Next, the display processing on the
display device 8 in the surroundings monitoring portion 700 according to the present embodiment is explained. Fig. 20 is a flowchart illustrating procedures of the aforementioned processing in the surroundings monitoring portion 700 according to the present embodiment. - First, the acquisition portion 701 acquires the captured image data from the imaging portions 16 (step S1401). Next, the acquisition portion 701 acquires the acceleration data from the acceleration sensors 26 (step S1402). - Then, the angle calculation portion 702 calculates the roll angle and the pitch angle of the vehicle 1 from the acceleration data (step S1403). The filtering control portion 703 applies low-pass filtering to the calculated roll angle and the calculated pitch angle. - Then, the rotation control portion 711 performs the rotation control on the captured image data depending on the roll angle (step S1404). - Next, the bird's eye view image generation portion 712 generates the bird's eye view image data in which a predetermined area in the travelling direction of the vehicle 1, present around the vehicle 1, is illustrated in an overhead view (step S1405). - The moving amount calculation portion 713 extracts the image data of a predetermined display range (area) from the generated bird's eye view image data (step S1406). In addition, the moving amount calculation portion 713 holds the image data extracted from the similar range of the bird's eye view image data in the past (for example, in a case where the previous moving amount is determined to reach or exceed a predetermined threshold value). - Then, the moving amount calculation portion 713 calculates the moving amount of the vehicle 1 based on the image data of the predetermined display range (area) extracted from the bird's eye view image data (step S1407). - Then, the image processing portion 704 determines whether or not the calculated moving amount is equal to or greater than the predetermined threshold value (step S1408). The threshold value is specified to be 10 cm, for example; however, the threshold value may be specified appropriately depending on an implementation mode. - In a case where the
image processing portion 704 determines that the moving amount is equal to or greater than the threshold value (Yes in step S1408), the bird's eye view image generation portion 712 generates the bird's eye view image data from the presently captured image data, on which the rotation control has been performed as in step S1404 (step S1409). - Afterwards, the conversion portion 714 performs the projective transformation on the bird's eye view image data depending on the present roll angle and pitch angle of the vehicle 1 (step S1410). The torsion of the bird's eye view image data generated by either the roll angle or the pitch angle is corrected by the projective transformation including the trapezoidal correction. When the roll angle is generated at the vehicle 1, the vehicle 1 is inclined about the axis of the wheels 3, for example, instead of being inclined with reference to the center of gravity; therefore, a displacement occurs in the left-right direction. Thus, the conversion portion 714 according to the embodiment performs an offset correction on the displacement in the left-right direction. In the same manner, an offset correction in the front-rear direction is performed in a case where the pitch angle is generated. - Next, the composition portion 715 combines the present bird's eye view image data after the projective transformation is performed with the bird's eye view image data stored at the bird's eye view image storage portion 706 (step S1411). In a case where the vehicle 1 turns by the turning angle θ1, the composition portion 715 performs the rotation control on the bird's eye view image data stored at the bird's eye view image storage portion 706 so as to conform to the turning angle θ1 obtained before the composition. - Then, the output portion 705 cuts out the bird's eye view image data to be displayed at the display device 8 from the bird's eye view image data stored at the bird's eye view image storage portion 706 (step S1412). Thereafter, the output portion 705 adds various pieces of information to the bird's eye view image data that is cut out and outputs the data to the display device 8 (step S1413). - Meanwhile, in a case where the image processing portion 704 determines that the moving amount is smaller than the predetermined threshold value (No in step S1408), the image processing portion 704 continues outputting the bird's eye view image data and the like already displayed at the display device 8 (step S1414).
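Putting the flow of Fig. 20 together, a minimal sketch of the update-on-threshold loop could look as follows. It reuses the hypothetical helpers sketched above (angles_from_accel, correct_frame, to_birds_eye, average_flow, untrapezoid, composite); the threshold, crop range and paste position are all assumptions.

```python
import numpy as np

MOVE_THRESHOLD_PX = 40  # stands in for the 10 cm threshold (pixel scale assumed)

state = {"mosaic": np.zeros((1200, 800, 3), np.uint8),
         "prev_crop": None, "moved_px": 0.0}

def on_new_frame(frame, gl1, gl2):
    pa, ra = angles_from_accel(gl1, gl2)                            # S1403
    corrected = correct_frame(frame, ra, (640, 360), (400, 180))    # S1404
    bev = to_birds_eye(corrected)                                   # S1405
    crop = bev[100:300, 300:500]                                    # S1406 (range assumed)
    if state["prev_crop"] is not None:                              # S1407
        state["moved_px"] += float(np.linalg.norm(
            average_flow(state["prev_crop"], crop)))
    state["prev_crop"] = crop
    if state["moved_px"] >= MOVE_THRESHOLD_PX:                      # S1408: Yes
        tile = untrapezoid(bev, quad=[[60, 0], [740, 0],
                                      [800, 400], [0, 400]])        # S1410
        state["mosaic"] = composite(state["mosaic"], tile,
                                    turn_deg=0.0, paste_xy=(0, 800))  # S1411
        state["moved_px"] = 0.0
    return state["mosaic"]                                          # S1412-S1414
```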
- In the second embodiment, the configuration in which the bird's eye view image data is displayed for confirming the state of the vehicle 1 is explained. Nevertheless, the embodiment is not limited to the display of only the bird's eye view image data, and various pieces of information for confirming the present state may be added to the bird's eye view image data. Thus, in the third embodiment, an example where various pieces of information are added to the bird's eye view image data is explained. - First, a construction of a surroundings monitoring portion 1700 according to the third embodiment is explained. Fig. 21 is a block diagram illustrating the construction of the surroundings monitoring portion 1700 realized within the surroundings monitoring ECU 14 according to the present embodiment. - The surroundings monitoring portion 1700 illustrated in Fig. 21 is different from the surroundings monitoring portion 700 in the second embodiment in that the acquisition portion 701 is changed to an acquisition portion 1701 performing a different processing from the acquisition portion 701 and the image processing portion 704 is changed to an image processing portion 1702 performing a different processing from the image processing portion 704. - The
acquisition portion 1701 acquires the captured image data and the acceleration data in the same way as the second embodiment, and also acquires a suspension detection result indicating a compression degree of the suspension of the front wheels 3F from a stroke sensor (not illustrated) and a detection result of the steering angle of each of the front wheels 3F and the rear wheels 3R from the steering angle sensor 19. In the present embodiment, the acquired suspension detection result and steering angle detection result are output to the image processing portion 1702. - The image processing portion 1702 is different from the image processing portion 704 in the second embodiment in that a tire outline calculation portion 1711 and a locus calculation portion 1712 are added and the composition portion 715 in the second embodiment is changed to a composition portion 1713 performing a different processing from the processing performed by the composition portion 715. - The tire outline calculation portion 1711 calculates an outline of a tire to be superimposed on the bird's eye view image data based on the suspension detection result and the detection result of the steering angle acquired by the acquisition portion 1701. For example, when viewed from the bird's eye viewpoint, the front wheels 3F and the rear wheels 3R appear larger as they approach the camera when the suspension is compressed, and appear smaller when the suspension is extended. Thus, in the present embodiment, in order to display the tire outline of the vehicle 1 on the bird's eye view image data, the tire outline calculation portion 1711 calculates the tire outline configuration (the size and angle of each of the front wheels 3F) to be superimposed on the bird's eye view image data based on the suspension detection result and the steering angle. - The
locus calculation portion 1712 calculates an estimated moving locus in the travelling direction of the vehicle 1 based on the steering angle of the front wheels 3F and the steering angle of the rear wheels 3R. The locus calculation portion 1712 according to the present embodiment calculates the estimated moving locus to be added to the present captured image data as the estimated moving locus of the front wheels 3F, and calculates the estimated moving locus to be added to the bird's eye view image data as the estimated moving locus of the rear wheels 3R. The estimated moving loci that are calculated are added to the captured image data and the bird's eye view image data and then output by the output portion 705. - In addition, the composition portion 1713 combines the bird's eye view image data stored at the bird's eye view image storage portion 706 with the bird's eye view image data obtained after the projective transformation is performed by the conversion portion 714 in the same processing as the second embodiment, and thereafter adds a mark, by which the steering angle and the size of each of the front wheels 3F are recognizable, at the position where each of the front wheels 3F is estimated to presently exist on the bird's eye view image data obtained after the composition is performed. - Each time the composition portion 1713 performs the composition, the mark is added at the position where each of the front wheels 3F exists, so that the moving locus of the front wheels 3F is displayed on the bird's eye view image data. Then, the output portion 705 outputs the screen information to the display device 8 based on the bird's eye view image data composited by the composition portion 1713. -
Fig. 22 is a diagram illustrating an example of the screen information output by the output portion 705 according to the third embodiment. In the example illustrated in Fig. 22, in addition to bird's eye view image data 1601, captured image data 1602 capturing the travelling direction of the vehicle 1 by the imaging portion 16a is shown, for example. - In the captured image data 1602 capturing the travelling direction, estimated moving loci 1611, 1612 of the front wheels 3F (display elements indicating a planned course) calculated by the locus calculation portion 1712 are indicated. - Meanwhile, in the bird's eye view image data 1601, moving loci 1621, 1622 of the vehicle 1, generated from the marks which are continuously added based on the outlines of the front wheels 3F, are indicated. Based on the size of each of the marks included in the moving loci 1621, 1622, the driver may recognize protrusions and recesses on the road surface. That is, at a portion where a large mark is added, the suspension is largely compressed; in other words, an obstacle such as a stone, for example, is highly possibly present. Thus, the driver may drive while confirming the aforementioned marks, so as to operate the vehicle in such a way that the rear wheels 3R and a differential (not illustrated) are inhibited from collision. - In addition, in the bird's eye view image data 1601, estimated moving loci 1631, 1632 of the rear wheels 3R calculated by the locus calculation portion 1712 are indicated. Because the driver can recognize the estimated moving loci 1631, 1632, the driver may restrain the rear wheels 3R from colliding with an obstacle by driving the vehicle so that the rear wheels 3R overlap the moving loci of the front wheels 3F, which have not collided with the obstacle. - Here, in a case where the composition portion 1713 adds the marks indicating the positions of the front wheels 3F to the bird's eye view image data in the past, the composition portion 1713 may differentiate the colors or shapes based on information other than the suspension detection results. For example, the color or shape at positions where the front wheels 3F slip may be differentiated. As a result, safety when the driver drives the vehicle may improve. - The surroundings monitoring portions in the aforementioned embodiments include the aforementioned construction so that the surrounding state including the ground below the
vehicle 1 is easily recognizable. Accordingly, the load of driving is reduced, to thereby enhance safety. - The second embodiment and the third embodiment are examples of a vehicle control apparatus or program according to any of the following [1] to [8].
- [1] A vehicle control apparatus including:
an acquisition portion acquiring captured image data output from an imaging portion that is provided at a vehicle and that images the surroundings of the vehicle, and vehicle state data output from a vehicle state detection portion provided at the vehicle; -
- a control portion performing a rotation control on the captured image data based on an inclination in a left-right direction of the vehicle relative to a horizontal direction which serves as a direction included in a horizontal plane orthogonal to a direction of gravity, the inclination in the left-right direction of the vehicle being calculated from the vehicle state data; and
- a generation portion generating bird's eye view image data indicating a ground in a surrounding of the vehicle in an overhead view based on the captured image data on which the rotation control is performed by the control portion.
- [2] The vehicle control apparatus according to [1], wherein
the generation portion generates first bird's eye view image data based on first captured image data on which the rotation control is performed by the rotation control portion and generates second bird's eye view image data based on second captured image data which is captured by the imaging portion after the first captured image data is captured and then the vehicle moves and on which the rotation control is performed by the rotation control portion,
the vehicle control apparatus further including a composition portion combining the first bird's eye view image data and the second bird's eye view image data. - [3] The vehicle control apparatus according to [1] or [2], wherein
the acquisition portion acquires an acceleration of the vehicle as the vehicle state data from the vehicle state detection portion,
the rotation control portion further performs the rotation control on the captured image data depending on a roll angle indicating an inclination around a front-rear axis of the vehicle obtained from the acceleration data, with an origin at a position coordinate within a display region of the captured image data, the position coordinate corresponding to a center of a lens used for imaging by the imaging portion. - [4] The vehicle control apparatus according to any one of [1] through [3], further including a conversion portion performing a projective transformation which is specified on a basis of a second roll angle on the second captured image data which is cut out at a display range specified on a basis of the second roll angle obtained from second acceleration data which is acquired when the second captured image data is captured, wherein
the composition portion combines the first bird's eye view image data and the second bird's eye view image data which is converted by the conversion portion. - [5] The vehicle control apparatus according to any one of [1] through [4], wherein the rotation control portion performs the rotation control on the captured image data depending on a difference between a first roll angle obtained from first acceleration data that is acquired when the first captured image data is captured and a second roll angle obtained from the second acceleration data that is acquired when the second captured image data is captured.
- [6] The vehicle control apparatus according to any one of [1] through [5], wherein the composition portion combines the first bird's eye view image data captured before the vehicle moves and including a ground below the vehicle and the second bird's eye view image data.
- [7] The vehicle control apparatus according to any one of [1] through [6], further including an output portion outputting information that represents either the roll angle or a pitch angle indicating an inclination around a left-right axis of the vehicle and bird's eye view image data composited by the composition portion.
- [8] A program configured to cause a computer to execute:
an acquisition step acquiring captured image data output from an imaging portion that is provided at a vehicle and that images the surroundings of the vehicle, and vehicle state data output from a vehicle state detection portion provided at the vehicle;
a rotation control step performing a rotation control on the captured image data based on an inclination in a left-right direction of the vehicle relative to a horizontal direction which serves as a direction included in a horizontal plane orthogonal to a direction of gravity, the inclination in the left-right direction of the vehicle being calculated from the vehicle state data; and
a generation step generating bird's eye view image data indicating the ground in the surroundings of the vehicle in an overhead view based on the captured image data on which the rotation control is performed by the rotation control step. - The embodiments of the present invention have been explained; however, these embodiments are presented as examples and are not intended to limit the scope of the invention. The above new embodiments may be implemented in various other modes, and various omissions, replacements and changes may be made without departing from the spirit of the invention. The embodiments and alternatives thereof are included within the spirit and scope of the invention and within the invention described in the scope of claims and its equivalents.
- 500: surroundings monitoring portion, 501: acquisition portion, 502: angle calculation portion, 503: filtering control portion, 504: image processing portion, 505: output portion, 521: rotation control portion, 522: reduction/enlargement control portion, 523: movement control portion, 524: composition portion, 700: surroundings monitoring portion, 701: acquisition portion, 702: angle calculation portion, 703: filtering control portion, 704: image processing portion, 705: output portion, 706: bird's eye view image storage portion, 711: rotation control portion, 712: bird's eye view image generation portion, 713: moving amount calculation portion, 714: conversion portion, 715: composition portion
Claims (10)
- A vehicle control apparatus comprising: an acquisition portion acquiring captured image data output from an imaging portion that is provided at a vehicle and that images a surrounding of the vehicle, and vehicle state data output from a vehicle state detection portion provided at the vehicle; and a control portion performing a rotation control on the captured image data based on an inclination in a left-right direction of the vehicle relative to a horizontal direction which serves as a direction included in a horizontal plane orthogonal to a direction of gravity, the inclination in the left-right direction of the vehicle being calculated from the vehicle state data, the control portion performing the rotation control in a manner that a horizontal line included in a subject captured in the captured image data is substantially parallel to a lateral-direction side with respect to a display region serving as an output destination.
- The vehicle control apparatus according to claim 1, wherein
the vehicle state detection portion acquires acceleration data serving as the vehicle state data and output from an acceleration detection portion provided at the vehicle,
the control portion performs the rotation control on the captured image data depending on a roll angle indicating an inclination around a front-rear axis of the vehicle obtained from the acceleration data, with an origin at a position coordinate within the display region of the captured image data, the position coordinate corresponding to a center of a lens used for imaging by the imaging portion. - The vehicle control apparatus according to claim 2, wherein the control portion further performs an enlargement processing or a reduction processing on the captured image data.
- The vehicle control apparatus according to either claim 2 or 3, wherein the control portion further moves the position coordinate corresponding to the center of the lens from a center of the display region relative to the captured image data.
- The vehicle control apparatus according to claim 4, wherein the control portion further moves the position coordinate corresponding to the center of the lens from the center of the display region to an upper direction within the display region.
- The vehicle control apparatus according to any one of claims 1 through 5, wherein the captured image data is displayed at a display device, the display device displaying, together with the captured image data, information that represents at least one of a roll angle indicating an inclination around a front-rear axis of the vehicle and a pitch angle indicating an inclination around a left-right axis of the vehicle.
- The vehicle control apparatus according to any one of claims 1 through 6, wherein the acquisition portion further acquires information indicating whether or not the vehicle is switched to a mode for off-road, and the control portion performs the rotation control on the captured image data depending on the vehicle state data in a case where the vehicle is switched to the mode for off-road.
- The vehicle control apparatus according to any one of claims 1 through 7, further comprising a generation portion generating bird's eye view image data indicating a ground in a surrounding of the vehicle in an overhead view based on the captured image data on which the rotation control is performed by the control portion.
- A program configured to cause a computer to execute: an acquisition step acquiring captured image data output from an imaging portion that is provided at a vehicle and that images a surrounding of the vehicle, and vehicle state data output from a vehicle state detection portion provided at the vehicle; and a control step performing a rotation control on the captured image data based on an inclination in a left-right direction of the vehicle relative to a horizontal direction which serves as a direction included in a horizontal plane orthogonal to a direction of gravity, the inclination in the left-right direction of the vehicle being calculated from the vehicle state data, the control step performing the rotation control in a manner that a horizontal line included in a subject captured in the captured image data is substantially parallel to a lateral-direction side with respect to a display region serving as an output destination.
- The program according to claim 9, further configured to cause the computer to execute a generation step generating bird's eye view image data indicating a ground in a surrounding of the vehicle in an overhead view based on the captured image data on which the rotation control is performed in the control step.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013039895 | 2013-02-28 | ||
JP2013062440 | 2013-03-25 | ||
PCT/JP2014/050387 WO2014132680A1 (en) | 2013-02-28 | 2014-01-10 | Program and device for controlling vehicle |
Publications (3)
Publication Number | Publication Date |
---|---|
EP2963922A1 true EP2963922A1 (en) | 2016-01-06 |
EP2963922A4 EP2963922A4 (en) | 2016-04-13 |
EP2963922B1 EP2963922B1 (en) | 2019-02-27 |
Family
ID=51427954
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP14757301.8A Active EP2963922B1 (en) | 2013-02-28 | 2014-01-10 | Program and device for controlling vehicle |
Country Status (5)
Country | Link |
---|---|
US (2) | US10322672B2 (en) |
EP (1) | EP2963922B1 (en) |
JP (1) | JP6028848B2 (en) |
CN (1) | CN105075247B (en) |
WO (1) | WO2014132680A1 (en) |
Families Citing this family (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE602004026362D1 (en) | 2004-06-14 | 2010-05-12 | Gas Turbine Efficiency Ab | SYSTEM AND DEVICES FOR COLLECTING AND TREATING WASTEWATER FROM ENGINE CLEANING |
US9514650B2 (en) * | 2013-03-13 | 2016-12-06 | Honda Motor Co., Ltd. | System and method for warning a driver of pedestrians and other obstacles when turning |
JP6412934B2 (en) * | 2014-05-28 | 2018-10-24 | Kyocera Corporation | Object detection device, vehicle installed with object detection device, and program |
JP6467838B2 (en) * | 2014-09-26 | 2019-02-13 | Aisin Seiki Kabushiki Kaisha | Perimeter monitoring device and perimeter monitoring system |
US10890925B2 (en) * | 2014-12-05 | 2021-01-12 | Command Electronics, LLC | Vehicle leveling systems, devices and methods and computer program products for leveling vehicles using smart devices |
JP6507626B2 (en) * | 2014-12-19 | 2019-05-08 | Aisin Seiki Kabushiki Kaisha | Vehicle perimeter monitoring device |
KR102087588B1 (en) * | 2015-03-19 | 2020-03-11 | Gentex Corporation | Image Processing for Camera-Based Display Systems |
JP6609970B2 (en) * | 2015-04-02 | 2019-11-27 | Aisin Seiki Kabushiki Kaisha | Perimeter monitoring device |
US9884623B2 (en) * | 2015-07-13 | 2018-02-06 | GM Global Technology Operations LLC | Method for image-based vehicle localization |
US10318043B2 (en) * | 2016-03-24 | 2019-06-11 | GM Global Technology Operations LLC | Dynamic adjustment of touch sensitive area in a display assembly |
CN105872379A (en) * | 2016-04-15 | 2016-08-17 | Le Holdings (Beijing) Co., Ltd. | Image photographing device |
CN118112811A (en) * | 2016-08-29 | 2024-05-31 | Maxell, Ltd. | Vehicle with a vehicle body having a vehicle body support |
JP6766557B2 (en) * | 2016-09-29 | 2020-10-14 | Aisin Seiki Kabushiki Kaisha | Peripheral monitoring device |
US10726602B2 (en) * | 2017-02-03 | 2020-07-28 | Sony Corporation | Apparatus and method to generate realistic three-dimensional (3D) model animation |
JP6852465B2 (en) * | 2017-03-02 | 2021-03-31 | JVCKenwood Corporation | Bird's-eye view image generation device, bird's-eye view image generation system, bird's-eye view image generation method and program |
JP6724834B2 (en) * | 2017-03-23 | 2020-07-15 | Nissan Motor Co., Ltd. | Display device control method and display device |
WO2018200522A1 (en) * | 2017-04-24 | 2018-11-01 | Mobileye Vision Technologies Ltd. | Systems and methods for compression of lane data |
JP6859216B2 (en) * | 2017-07-03 | 2021-04-14 | Toyota Motor Corporation | Vehicle peripheral display device |
US10579067B2 (en) * | 2017-07-20 | 2020-03-03 | Huawei Technologies Co., Ltd. | Method and system for vehicle localization |
DE102018204409A1 (en) * | 2018-03-22 | 2019-09-26 | Volkswagen Aktiengesellschaft | A method of displaying vehicle information and display system information |
JP2020053734A (en) * | 2018-09-25 | 2020-04-02 | Alpine Electronics, Inc. | Electronic mirror system |
JP2020052916A (en) * | 2018-09-28 | 2020-04-02 | Nidec Sankyo Corporation | Image processing device, image scanner, and image processing method |
WO2020090512A1 (en) * | 2018-10-31 | 2020-05-07 | Sony Corporation | Imaging device, control method, and program |
JP7211047B2 (en) * | 2018-12-04 | 2023-01-24 | Aisin Corporation | Road surface detection device and road surface detection program |
US11885893B2 (en) * | 2019-08-12 | 2024-01-30 | Motional Ad Llc | Localization based on predefined features of the environment |
EP3848781B1 (en) | 2019-12-31 | 2023-05-10 | Seiko Epson Corporation | Circuit device, electronic apparatus, and mobile body |
CN113129224B (en) * | 2019-12-31 | 2023-11-28 | 精工爱普生株式会社 | Display system, electronic apparatus, moving object, and display method |
USD947893S1 (en) * | 2020-03-05 | 2022-04-05 | Jaguar Land Rover Limited | Display screen or portion thereof with icon |
DE102021109213A1 (en) * | 2020-04-16 | 2021-10-21 | Volkswagen Aktiengesellschaft | Method for making the inclination of a motor vehicle visible in the motor vehicle |
EP3967552A1 (en) * | 2020-09-11 | 2022-03-16 | Ficosa Adas, S.L.U. | Camera monitoring system for motor vehicles |
JP7540270B2 (en) * | 2020-09-29 | 2024-08-27 | Aisin Corporation | Parking assistance device, parking assistance method, and program |
US20240018746A1 (en) * | 2022-07-12 | 2024-01-18 | Caterpillar Inc. | Industrial machine remote operation systems, and associated devices and methods |
JP2024094471A (en) * | 2022-12-28 | 2024-07-10 | Kubota Corporation | Remote operation support system for work vehicle and remote device |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH01273746A (en) * | 1988-04-25 | 1989-11-01 | Mazda Motor Corp | Image processor for traveling vehicle |
JPH0952555A (en) * | 1995-08-11 | 1997-02-25 | Mitsubishi Electric Corp | Periphery monitoring device |
US6405975B1 (en) * | 1995-12-19 | 2002-06-18 | The Boeing Company | Airplane ground maneuvering camera system |
JP2002354467A (en) | 2001-05-25 | 2002-12-06 | Nissan Motor Co Ltd | Surrounding information display unit for vehicle |
JP3796417B2 (en) | 2001-06-27 | 2006-07-12 | Denso Corporation | Vehicle peripheral image processing apparatus and recording medium |
JP4593070B2 (en) | 2001-12-12 | 2010-12-08 | Kabushikikaisha Equos Research | Image processing apparatus for vehicle |
US7212653B2 (en) | 2001-12-12 | 2007-05-01 | Kabushikikaisha Equos Research | Image processing system for vehicle |
JP4321128B2 (en) * | 2003-06-12 | 2009-08-26 | Denso Corporation | Image server, image collection device, and image display terminal |
JP2007266930A (en) | 2006-03-28 | 2007-10-11 | Aisin Aw Co Ltd | Driving assist method and driving support apparatus |
EP2003019B1 (en) * | 2007-06-13 | 2014-04-23 | Aisin AW Co., Ltd. | Driving assist apparatus for vehicle |
JP5125619B2 (en) | 2008-03-03 | 2013-01-23 | Nissan Motor Co., Ltd. | In-vehicle camera system |
JP5161760B2 (en) * | 2008-12-26 | 2013-03-13 | Toshiba Corporation | In-vehicle display system and display method |
US8198555B2 (en) * | 2009-04-22 | 2012-06-12 | Honda Motor Co., Ltd | Multi-position switch assembly for controlling a vehicle display screen |
US9264672B2 (en) * | 2010-12-22 | 2016-02-16 | Magna Mirrors Of America, Inc. | Vision display system for vehicle |
EP2511137B1 (en) * | 2011-04-14 | 2019-03-27 | Harman Becker Automotive Systems GmbH | Vehicle Surround View System |
US20150022664A1 (en) * | 2012-01-20 | 2015-01-22 | Magna Electronics Inc. | Vehicle vision system with positionable virtual viewpoint |
US9671935B2 (en) * | 2012-02-16 | 2017-06-06 | Furuno Electric Co., Ltd. | Information display device, display mode switching method and display mode switching program |
US8854325B2 (en) * | 2012-02-29 | 2014-10-07 | Blackberry Limited | Two-factor rotation input on a touchscreen device |
JP6271858B2 (en) * | 2012-07-04 | 2018-01-31 | Canon Inc. | Display device and control method thereof |
JP6099333B2 (en) * | 2012-08-30 | 2017-03-22 | Fujitsu Ten Ltd. | Image generation apparatus, image display system, parameter acquisition apparatus, image generation method, and parameter acquisition method |
2014
- 2014-01-10 WO PCT/JP2014/050387 patent/WO2014132680A1/en active Application Filing
- 2014-01-10 CN CN201480010443.7A patent/CN105075247B/en active Active
- 2014-01-10 US US14/770,320 patent/US10322672B2/en active Active
- 2014-01-10 EP EP14757301.8A patent/EP2963922B1/en active Active
- 2014-01-10 JP JP2015502795A patent/JP6028848B2/en active Active
2019
- 2019-05-03 US US16/402,369 patent/US10676027B2/en active Active
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2990265A1 (en) * | 2014-08-29 | 2016-03-02 | Aisin Seiki Kabushiki Kaisha | Vehicle control apparatus |
US9895974B2 (en) | 2014-08-29 | 2018-02-20 | Aisin Seiki Kabushiki Kaisha | Vehicle control apparatus |
EP3176037A1 (en) * | 2015-12-03 | 2017-06-07 | Fico Mirrors, SA | A rear vision system for a motor vehicle |
WO2017198429A1 (en) * | 2016-05-17 | 2017-11-23 | Bayerische Motoren Werke Aktiengesellschaft | Ascertainment of vehicle environment data |
CN111629931A (en) * | 2018-01-22 | 2020-09-04 | 株式会社小糸制作所 | Imaging device for electronic mirror, electronic mirror system, and automobile |
EP3744569A4 (en) * | 2018-01-22 | 2021-06-30 | Koito Manufacturing Co., Ltd. | Electronic mirror imaging device, electronic mirror system, and automobile |
CN111629931B (en) * | 2018-01-22 | 2023-09-15 | 株式会社小糸制作所 | Image pickup device for electronic mirror, electronic mirror system, and automobile |
WO2019179686A1 (en) * | 2018-03-19 | 2019-09-26 | Jaguar Land Rover Limited | Controller for a vehicle |
CN111886627A (en) * | 2018-03-19 | 2020-11-03 | 捷豹路虎有限公司 | Controller for vehicle |
US11941847B2 (en) | 2018-03-19 | 2024-03-26 | Jaguar Land Rover Limited | Controller for a vehicle |
CN111886627B (en) * | 2018-03-19 | 2024-09-20 | 捷豹路虎有限公司 | Controller for vehicle |
Also Published As
Publication number | Publication date |
---|---|
CN105075247A (en) | 2015-11-18 |
JPWO2014132680A1 (en) | 2017-02-02 |
JP6028848B2 (en) | 2016-11-24 |
EP2963922A4 (en) | 2016-04-13 |
US20190255999A1 (en) | 2019-08-22 |
US10322672B2 (en) | 2019-06-18 |
US10676027B2 (en) | 2020-06-09 |
EP2963922B1 (en) | 2019-02-27 |
CN105075247B (en) | 2018-08-14 |
WO2014132680A1 (en) | 2014-09-04 |
US20150375680A1 (en) | 2015-12-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10676027B2 (en) | Vehicle control apparatus and program | |
EP2990265B1 (en) | Vehicle control apparatus | |
JP4914458B2 (en) | Vehicle periphery display device | |
JP6507626B2 (en) | Vehicle perimeter monitoring device | |
JP5003946B2 (en) | Parking assistance device | |
US10878253B2 (en) | Periphery monitoring device | |
US10315569B2 (en) | Surroundings monitoring apparatus and program thereof | |
EP3002727B1 (en) | Periphery monitoring apparatus and periphery monitoring system | |
CN103079902A (en) | Driving assistance device | |
WO2018150642A1 (en) | Surroundings monitoring device | |
CN107791951B (en) | Display control device | |
JP5516988B2 (en) | Parking assistance device | |
US11420678B2 (en) | Traction assist display for towing a vehicle | |
CN107925746A (en) | Periphery monitoring apparatus | |
JP2007168560A (en) | Parking support apparatus | |
US20200035207A1 (en) | Display control apparatus | |
CN109314770B (en) | Peripheral monitoring device | |
CN110877575A (en) | Periphery monitoring device | |
US20200193183A1 (en) | Periphery monitoring device | |
JP2008213647A (en) | Parking assist method and parking assist system | |
CN110546047A (en) | Parking assist apparatus | |
WO2018025441A1 (en) | Periphery monitoring device | |
WO2017057007A1 (en) | Image processing device for vehicles | |
JP7423970B2 (en) | Image processing device | |
CN114945083A (en) | Peripheral image display device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20150827 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: B60R 1/00 20060101ALI20160303BHEP
Ipc: H04N 7/18 20060101AFI20160303BHEP
Ipc: G06T 3/60 20060101ALI20160303BHEP |
|
A4 | Supplementary search report drawn up and despatched |
Effective date: 20160311 |
|
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20161129 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
INTG | Intention to grant announced |
Effective date: 20180802 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 1103066 Country of ref document: AT Kind code of ref document: T Effective date: 20190315 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602014041850 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20190227 |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG4D |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: FI
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190227
Ref country code: LT
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190227
Ref country code: NL
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190227
Ref country code: SE
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190227
Ref country code: PT
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190627
Ref country code: NO
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190527 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HR
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190227
Ref country code: LV
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190227
Ref country code: IS
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190627
Ref country code: GR
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190528
Ref country code: BG
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190527
Ref country code: RS
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190227 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1103066 Country of ref document: AT Kind code of ref document: T Effective date: 20190227 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: ES
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190227
Ref country code: AL
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190227
Ref country code: CZ
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190227
Ref country code: RO
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190227
Ref country code: SK
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190227
Ref country code: EE
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190227
Ref country code: IT
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190227
Ref country code: DK
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190227 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602014041850 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PL
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190227
Ref country code: SM
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190227 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190227 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
26N | No opposition filed |
Effective date: 20191128 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190227 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190227 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190227 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
GBPC | Gb: european patent ceased through non-payment of renewal fee |
Effective date: 20200110 |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20200131 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LU
Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES
Effective date: 20200110
Ref country code: GB
Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES
Effective date: 20200110 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE
Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES
Effective date: 20200131
Ref country code: CH
Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES
Effective date: 20200131
Ref country code: LI
Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES
Effective date: 20200131 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20200110 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: FR Payment date: 20201210 Year of fee payment: 8 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MT
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190227
Ref country code: CY
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190227 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190227 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: FR Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20220131 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20231128 Year of fee payment: 11 |