US20200231099A1 - Image processing apparatus - Google Patents
Image processing apparatus
- Publication number
- US20200231099A1 (U.S. application Ser. No. 16/844,294)
- Authority
- US
- United States
- Prior art keywords
- change
- image
- boundary
- display image
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R11/04—Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformation in the plane of the image
- G06T3/40—Scaling the whole image or part thereof
- G06T3/4038—Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/0007—Image acquisition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T7/85—Stereo camera calibration
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Abstract
An image processing apparatus is mounted on a vehicle. The image processing apparatus includes an image acquiring unit, a boundary setting unit, a display image generating unit, and a direction change acquiring unit. The display image generating unit generates a display image by synthesizing a plurality of images at the boundaries. The direction change acquiring unit acquires a direction of change from a straight traveling direction of the vehicle, and a change amount from the straight traveling direction. The display image generating unit sets a display range of the display image closer to the direction of change from the straight traveling direction as the change amount increases. The boundary setting unit sets a position of at least a part of a boundary existing closer to the direction of change, among the boundaries within the display image, closer to the direction of change as the change amount increases.
Description
- This application is a U.S. bypass application of International Application No. PCT/JP2018/037941, filed Oct. 11, 2018, which designated the U.S. and claims priority to Japanese Patent Application No. 2017-199383, filed Oct. 13, 2017, the contents of both of which are incorporated herein by reference.
- The present disclosure relates to an image processing apparatus.
- Conventionally, an image processing apparatus as described below is known. A plurality of images representing the surroundings of a vehicle are acquired using a plurality of cameras. A display image viewed from a viewpoint of a driver of the vehicle is created by synthesizing the plurality of images. This image processing apparatus is disclosed in JP 6014433 B, for example.
- One aspect of the present disclosure is an image processing apparatus configured to be mounted on a vehicle. An image processing apparatus according to one aspect of the present disclosure includes an image acquiring unit configured to acquire a plurality of images in which parts of ranges which can be displayed overlap with each other, using a plurality of cameras which capture the surroundings of the vehicle.
- An image processing apparatus according to one aspect of the present disclosure includes a boundary setting unit configured to set boundaries of images in which parts of ranges which can be displayed overlap with each other, included in the plurality of images, within ranges in which the ranges which can be displayed overlap.
- An image processing apparatus according to one aspect of the present disclosure includes a display image generating unit configured to generate a display image viewed from a viewpoint within a vehicle interior of the vehicle by synthesizing at least parts of the plurality of images at the boundaries.
- An image processing apparatus according to one aspect of the present disclosure includes an output unit configured to output the display image.
- An image processing apparatus according to one aspect of the present disclosure includes a direction change acquiring unit configured to acquire a direction of change from a straight traveling direction of the vehicle, and a change amount from the straight traveling direction.
- The display image generating unit is configured to set a display range of the display image closer to the direction of change as the change amount increases, and the boundary setting unit is configured to set a position of at least a part of the boundary existing closer to the direction of change, among the boundaries within the display image, closer to the direction of change as the change amount increases.
- The above objects and other objects, features and advantages of the present disclosure will be made clearer by the following detailed description, given referring to the appended drawings.
- In the accompanying drawings:
- FIG. 1 is a block diagram illustrating configurations of an in-vehicle system and an image processing apparatus;
- FIG. 2 is a block diagram illustrating a functional configuration of the image processing apparatus;
- FIG. 3 is a bird's-eye view illustrating a plurality of images and boundaries;
- FIG. 4 is a flowchart illustrating processing to be performed by the image processing apparatus;
- FIG. 5 is a bird's-eye view illustrating a configuration of a viewpoint, a line-of-sight direction, a display range, and the like;
- FIG. 6 is an explanatory diagram illustrating an example of a display image;
- FIG. 7 is a bird's-eye view illustrating a configuration of a viewpoint, a line-of-sight direction, a display range, and the like;
- FIG. 8 is an explanatory diagram illustrating an example of the display image; and
- FIG. 9 is an explanatory diagram illustrating a method for setting a boundary.
- As a result of detailed studies by the inventor, the following problems have been found. Within a display image, there are boundaries between the plurality of images used for synthesis. When the traveling direction of a vehicle changes, it is conceivable to move the display range of the display image toward the direction in which the traveling direction is changed (hereinafter referred to as the direction of change from the straight traveling direction). Suppose, for example, that the left and right boundaries between an image captured by a front camera and images captured by side cameras are each set at a position of 45 degrees with respect to the straight traveling direction when the vehicle travels straight forward. If the positions of the boundaries are always fixed, then when the display range of the display image is moved toward the direction of change from the straight traveling direction, the position of at least a part of the boundaries approaches the center of the display screen.
- Near a boundary, a phenomenon such as distortion, or a single object being displayed doubly, may occur owing to the difference between the imaging positions of the two images being synthesized or to the conversion applied to them. The imaging positions are the positions where the cameras are provided. The image near a boundary may therefore cause user discomfort. Moreover, if a boundary is located near the center of the display image, it becomes difficult for a passenger of the vehicle to recognize a surrounding object from the display image. In one aspect of the present disclosure, it is therefore preferable to provide an image processing apparatus capable of preventing the boundary from approaching the center of the display screen.
- The image processing apparatus according to one aspect of the present disclosure sets the display range of the display image closer to the direction of change from the straight traveling direction as the change amount from the straight traveling direction increases. If the positions of the boundaries with respect to a plurality of images were always fixed, the position of part of the boundaries would approach the center of the display image as the change amount from the straight traveling direction increases.
- The image near the boundary may be unclear and may cause the user to feel uncomfortable. If the boundary is located near the center of the display image, it becomes difficult for a passenger of the vehicle to recognize a surrounding target from the display image.
- An image processing apparatus according to one aspect of the present disclosure sets a position of at least a part of the boundary existing closer to the direction of change from the straight traveling direction among the boundaries within the display image, closer to the direction of change from the straight traveling direction as the change amount from the straight traveling direction increases. Therefore, the image processing apparatus according to one aspect of the present disclosure can prevent the position of the boundary from approaching the center of the display image even in a case where the traveling direction of the vehicle changes. As a result, it becomes easy for a passenger of the vehicle to recognize a target in the surrounding from the display image.
- Exemplary embodiments of the present disclosure will be described with reference to the drawings.
- 1. Configurations of In-Vehicle System 1 and Image Processing Apparatus 3
- The configurations of an in-vehicle system 1 and an image processing apparatus 3 will be described with reference to FIGS. 1 to 3. The in-vehicle system 1 illustrated in FIG. 1 is a system mounted on a vehicle. Hereinafter, the vehicle on which the in-vehicle system 1 is mounted will be referred to as the own vehicle. The in-vehicle system 1 includes the image processing apparatus 3, a front camera 5, a right camera 7, a left camera 9, and a rear camera 11. The image processing apparatus 3 and the cameras 5, 7, 9, and 11 are connected so that images captured by the respective cameras 5, 7, 9, and 11 can be transmitted to the image processing apparatus 3 as needed. The in-vehicle system 1 is connected to a shift sensor 13, a steering angle sensor 15, and an obstacle sensor 16 via an in-vehicle communication bus such as an in-vehicle CAN (registered trademark) bus or LIN. The in-vehicle system 1 and a display 17 are connected via an image transmission line.
- The image processing apparatus 3 includes a microcomputer having a CPU 19 and a semiconductor memory (hereinafter referred to as a memory 21) such as, for example, a RAM and a ROM. Each function of the image processing apparatus 3 is realized by the CPU 19 executing a program stored in a non-transitory computer-readable storage medium. In this example, the memory 21 corresponds to the non-transitory computer-readable storage medium which stores the program. Further, a method corresponding to the program is performed by the program being executed. Note that the image processing apparatus 3 may include one microcomputer or a plurality of microcomputers.
- As illustrated in FIG. 2, the image processing apparatus 3 includes a display direction determining unit 23, an image acquiring unit 25, a direction change acquiring unit 27, a boundary setting unit 29, a display image generating unit 31, an output unit 33, and an obstacle detecting unit 35.
- A method for realizing the functions of the respective units included in the image processing apparatus 3 is not limited to software; part or all of the functions may be realized using one or more pieces of hardware. For example, in a case where the above-described functions are realized by an electronic circuit, the electronic circuit may be a digital circuit, an analog circuit, or a combination thereof.
- As illustrated in FIG. 3, the front camera 5 captures a front portion of the surroundings of the own vehicle 36 and generates an image (hereinafter referred to as a front image 37). The front camera 5 includes, for example, a fisheye lens or the like. A displayable range 37A of the front image 37 is, for example, approximately 180 degrees.
- The right camera 7 captures a right portion of the surroundings of the own vehicle 36 and generates an image (hereinafter referred to as a right image 39). The right camera 7 includes, for example, a fisheye lens or the like. A displayable range 39A of the right image 39 is, for example, approximately 180 degrees. Part of the displayable range 37A of the front image 37 overlaps with part of the displayable range 39A of the right image 39. The range that overlaps is defined as an overlapping range 41. The overlapping range 41 is, for example, 90 degrees.
- The left camera 9 captures a left portion of the surroundings of the own vehicle 36 and generates an image (hereinafter referred to as a left image 43). The left camera 9 includes, for example, a fisheye lens or the like. A displayable range 43A of the left image 43 is, for example, approximately 180 degrees. Part of the displayable range 43A of the left image 43 overlaps with part of the displayable range 37A of the front image 37. The range that overlaps is defined as an overlapping range 45. The overlapping range 45 is, for example, 90 degrees.
- The rear camera 11 captures a rear portion of the surroundings of the own vehicle 36 and generates an image (hereinafter referred to as a rear image 47). The rear camera 11 includes, for example, a fisheye lens or the like. A displayable range 47A of the rear image 47 is, for example, approximately 180 degrees. Part of the displayable range 47A of the rear image 47 overlaps with part of the displayable range 39A of the right image 39. The range that overlaps is defined as an overlapping range 49. The overlapping range 49 is, for example, 90 degrees.
- Further, part of the displayable range 47A of the rear image 47 overlaps with part of the displayable range 43A of the left image 43. The range that overlaps is defined as an overlapping range 51. The overlapping range 51 is, for example, 90 degrees.
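- The four displayable ranges and their pairwise overlaps can be tabulated directly from the numbers above. The following Python sketch treats each displayable range as an angular interval around the vehicle; the bearing convention and the interval arithmetic are assumptions made for illustration, not part of the disclosure.

```python
# Displayable ranges as (center bearing, width) in degrees, with 0 meaning
# straight ahead and positive bearings running clockwise; the widths are
# the approximate 180-degree values given in the description.
RANGES = {
    "front image 37": (0.0,   180.0),  # displayable range 37A
    "right image 39": (90.0,  180.0),  # displayable range 39A
    "rear image 47":  (180.0, 180.0),  # displayable range 47A
    "left image 43":  (270.0, 180.0),  # displayable range 43A
}

def overlap_deg(a: str, b: str) -> float:
    """Angular overlap of two displayable ranges, in degrees."""
    (ca, wa), (cb, wb) = RANGES[a], RANGES[b]
    gap = abs((ca - cb + 180.0) % 360.0 - 180.0)   # center-to-center angle
    return max(0.0, (wa + wb) / 2.0 - gap)

print(overlap_deg("front image 37", "right image 39"))  # -> 90.0 (overlapping range 41)
print(overlap_deg("rear image 47", "left image 43"))    # -> 90.0 (overlapping range 51)
```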
- The shift sensor 13 detects a state of the gear shift of the own vehicle and creates shift information. The shift sensor 13 transmits the shift information to the in-vehicle CAN. The shift information is information representing the state of the gear shift of the own vehicle. Examples of the state of the gear shift include forward movement and backward movement.
- The steering angle sensor 15 detects a direction and an amount of the steering angle of the own vehicle, and creates steering angle information. The steering angle sensor 15 transmits the steering angle information to the in-vehicle CAN. The steering angle information is information representing the direction of the steering angle of the own vehicle and the amount of the steering angle. The direction of the steering angle corresponds to the direction of change from the straight traveling direction of the own vehicle. Examples of the direction of the steering angle include turning right after traveling straight, turning left after traveling straight, turning further right after turning right, and steering left after turning right. The amount of the steering angle corresponds to the change amount from the straight traveling direction of the own vehicle. The amount of the steering angle represents a degree of turning with respect to the straight traveling direction of the own vehicle and is expressed as an angle.
- The obstacle sensor 16 detects an obstacle existing around the own vehicle and creates obstacle information. The obstacle information is information regarding the obstacle. The obstacle information represents, for example, a position of the obstacle with respect to the own vehicle, a relative distance from the own vehicle to the obstacle, a relative speed of the obstacle with respect to the own vehicle, or the like. The obstacle sensor 16 transmits the obstacle information to the in-vehicle CAN. Examples of the obstacle sensor 16 include a millimeter wave radar and a lidar.
- The display 17 is provided in the vehicle interior of the own vehicle, for example, in an instrument panel portion. The instrument panel portion is aligned with a dashboard portion, a console portion, a meter, and the like. The display 17 can display an image. The image displayed on the display 17 includes a virtual screen after mapping, which will be described later, or a display image extracted from a synthesized image, a navigation screen, various kinds of indicators, an operation screen for air conditioning, audio, and the like.
- Processing to be repeatedly performed by the image processing apparatus 3 at predetermined time intervals will be described based on FIGS. 4 to 6. In step 1 in FIG. 4, the display direction determining unit 23 acquires the shift information from the shift sensor 13.
- In step 2, the display direction determining unit 23 determines the display direction. The display direction is the direction of display shown by a display image, which will be described later. The display direction includes a forward direction and a backward direction. In a case where the state of the gear shift represented by the shift information acquired in step 1 is forward movement, the display direction determining unit 23 sets the display direction to the forward direction. In a case where the state of the gear shift represented by the shift information acquired in step 1 is backward movement, the display direction determining unit 23 sets the display direction to the backward direction.
- In step 3, the image acquiring unit 25 acquires the front image 37, the right image 39, the left image 43, and the rear image 47 from the front camera 5, the right camera 7, the left camera 9, and the rear camera 11.
- In step 4, the direction change acquiring unit 27 acquires the steering angle information from the steering angle sensor 15.
- In step 5, the display image generating unit 31 sets a viewpoint 53 in the vehicle interior of the own vehicle 36 as illustrated in FIG. 5. The position of the viewpoint 53 with respect to the own vehicle 36 is always fixed. Further, the display image generating unit 31 sets a line-of-sight direction 55 based on the steering angle information, using the viewpoint 53 as a starting point.
- In a case where the display direction determined in step 2 is the forward direction, the straight traveling direction 57 of the own vehicle is the forward direction of the own vehicle 36. The line-of-sight direction 55 is a direction rotated from the straight traveling direction 57 in a steering angle direction X around the viewpoint 53. In a case where the straight traveling direction 57 is the forward direction, the steering angle direction X corresponds to the direction of change from the straight traveling direction 57.
- On the other hand, in a case where the display direction determined in step 2 is the backward direction, the straight traveling direction 57 is the backward direction of the own vehicle 36. The line-of-sight direction 55 is a direction rotated from the straight traveling direction 57 in a direction opposite to the steering angle direction X around the viewpoint 53. In a case where the straight traveling direction 57 is the backward direction, the direction opposite to the steering angle direction X corresponds to the direction of change from the straight traveling direction 57.
- Regardless of whether the display direction determined in step 2 is the forward direction or the backward direction, the angle Y formed by the line-of-sight direction 55 and the straight traveling direction 57 becomes larger as the amount of the steering angle increases. The steering angle direction X and the amount of the steering angle are included in the steering angle information acquired in step 4. A map which defines the relationship between the angle Y and the amount of the steering angle is stored in the memory 21 in advance. The display image generating unit 31 sets the line-of-sight direction 55 using this map and the steering angle information. Note that the relationship between the angle Y and the steering angle may instead be calculated from a predetermined formula.
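- To make the step 5 computation concrete, here is a minimal Python sketch of the map lookup. It is an illustration, not the apparatus's implementation: the map breakpoints are hypothetical values (the disclosure gives none), and linear interpolation between entries is likewise an assumption.

```python
import numpy as np

# Hypothetical map relating the amount of the steering angle (degrees of
# steering-wheel rotation) to the angle Y between the line-of-sight
# direction 55 and the straight traveling direction 57. The disclosure
# stores such a map in the memory 21 in advance; these values are
# illustrative only.
STEER_AMOUNTS = [0.0, 90.0, 180.0, 360.0, 540.0]
ANGLE_Y_DEG   = [0.0, 10.0, 25.0,  45.0,  60.0]

def line_of_sight_angle(display_direction: str, steer_sign: int,
                        steer_amount: float) -> float:
    """Return the line-of-sight direction 55 as an angle in degrees from
    the straight traveling direction 57 (positive = rightward).

    steer_sign is +1 when the steering angle direction X is rightward and
    -1 when it is leftward, matching the information acquired in step 4.
    """
    y = float(np.interp(steer_amount, STEER_AMOUNTS, ANGLE_Y_DEG))
    if display_direction == "forward":
        # Forward display: rotate toward the steering angle direction X.
        return steer_sign * y
    # Backward display: rotate opposite to the steering angle direction X.
    return -steer_sign * y

print(line_of_sight_angle("forward", +1, 90.0))   # -> 10.0
print(line_of_sight_angle("backward", +1, 90.0))  # -> -10.0
```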
- In step 6, the boundary setting unit 29 sets a boundary. This processing will be described based on the example illustrated in FIG. 5. In the example illustrated in FIG. 5, the display direction determined in step 2 is the forward direction, and the straight traveling direction 57 is the forward direction. Further, in the example illustrated in FIG. 5, the steering angle direction X included in the steering angle information acquired in step 4 is the rightward direction. The direction of change from the straight traveling direction 57 is the steering angle direction X.
- The boundary setting unit 29 sets a boundary 59 within the overlapping range 41. The boundary setting unit 29 sets a boundary 61 within the overlapping range 45. The boundary 59 is a line where a virtual screen, which will be described later, intersects with a vertical plane passing through an intersection A inside the overlapping range 41. The boundary 61 is a line where the virtual screen intersects with a vertical plane passing through an intersection B inside the overlapping range 45.
- Here, the intersections A and B are intersections of frame lines which define the ranges which can be displayed of the images captured by the respective cameras. The ranges which can be displayed of the images captured by the respective cameras are the image areas to be used for display among the images captured by the respective cameras. The boundaries 59 and 61 are boundaries within the display image. In the example illustrated in FIG. 5, the boundary 59 is the boundary existing closer to the direction of change from the straight traveling direction 57 among the boundaries in the display image.
- The boundary setting unit 29 sets a position of at least a part of the boundary 59 closer to the steering angle direction X as the amount of the steering angle increases. The steering angle direction X and the amount of the steering angle are included in the steering angle information acquired in step 4. A map which defines the relationship between the position of a part of the boundary 59 and the amount of the steering angle is stored in the memory 21 in advance. The boundary setting unit 29 sets the position of the boundary 59 using this map and the steering angle information. The boundary setting unit 29 sets the position of the boundary 61 at, for example, a standard position. The standard position is, for example, a position at 45 degrees with respect to the straight traveling direction 57. The standard position is stored in the memory 21 in advance.
- Unlike in the example illustrated in FIG. 5, in a case where the display direction determined in step 2 is the forward direction and the steering angle direction X included in the steering angle information acquired in step 4 is the leftward direction, the boundary 61 is the boundary existing closer to the direction of change from the straight traveling direction 57 among the boundaries within the display image. In this case, the boundary setting unit 29 sets a position of at least a part of the boundary 61 closer to the steering angle direction X as the amount of the steering angle increases. Further, the boundary setting unit 29 sets the position of the boundary 59 at the standard position.
- Unlike in the example illustrated in FIG. 5, in a case where the display direction determined in step 2 is the backward direction and the steering angle direction X included in the steering angle information acquired in step 4 is the rightward direction, the boundary setting unit 29 sets a boundary 63 in the overlapping range 49 and sets a boundary 65 in the overlapping range 51. The boundaries 63 and 65 are boundaries within the display image.
- In a case where the display direction determined in step 2 is the backward direction, the direction of change from the straight traveling direction 57 is a direction opposite to the steering angle direction X. In a case where the steering angle direction X included in the steering angle information acquired in step 4 is the rightward direction, the boundary 63 is the boundary existing closer to the direction of change from the straight traveling direction 57 among the boundaries within the display image. The boundary setting unit 29 sets a position of at least a part of the boundary 63 closer to the direction of change from the straight traveling direction 57 as the amount of the steering angle increases. Further, the boundary setting unit 29 sets the position of the boundary 65 at the standard position.
- Unlike in the example illustrated in FIG. 5, in a case where the display direction determined in step 2 is the backward direction and the steering angle direction X included in the steering angle information acquired in step 4 is the leftward direction, the boundary setting unit 29 sets the boundary 63 in the overlapping range 49 and sets the boundary 65 in the overlapping range 51.
- In a case where the display direction determined in step 2 is the backward direction, the direction of change from the straight traveling direction 57 is a direction opposite to the steering angle direction X. In a case where the steering angle direction X included in the steering angle information acquired in step 4 is the leftward direction, the boundary 65 is the boundary existing closer to the direction of change from the straight traveling direction 57 among the boundaries in the display image. The boundary setting unit 29 sets a position of at least a part of the boundary 65 closer to the direction of change from the straight traveling direction 57 as the amount of the steering angle increases. Further, the boundary setting unit 29 sets the position of the boundary 63 at the standard position.
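- Steps 2, 4, and 6 together select which boundary moves: the boundary on the same vehicle side as the steering angle direction X, in the image pair matching the display direction, while its counterpart stays at the standard position. The Python sketch below is one illustrative reading of this case analysis; the function name, the map breakpoints, and the use of linear interpolation are all assumptions.

```python
import numpy as np

STANDARD_DEG = 45.0  # standard position: 45 degrees from the straight traveling direction 57

# Hypothetical map: amount of the steering angle -> angle of the moved
# boundary from the straight traveling direction 57 (illustrative values).
STEER_AMOUNTS = [0.0, 180.0, 360.0, 540.0]
BOUNDARY_DEG  = [45.0, 55.0, 70.0, 80.0]

# The boundary that moves, per (display direction, steering angle direction X).
MOVED = {
    ("forward", "right"):  "boundary 59",  # within overlapping range 41
    ("forward", "left"):   "boundary 61",  # within overlapping range 45
    ("backward", "right"): "boundary 63",  # within overlapping range 49
    ("backward", "left"):  "boundary 65",  # within overlapping range 51
}
COUNTERPART = {
    "boundary 59": "boundary 61", "boundary 61": "boundary 59",
    "boundary 63": "boundary 65", "boundary 65": "boundary 63",
}

def set_boundaries(display_direction: str, steer_x: str,
                   steer_amount: float) -> dict:
    """Step 6: move the boundary existing closer to the direction of change
    from the straight traveling direction 57; keep its counterpart at the
    standard position."""
    moved = MOVED[(display_direction, steer_x)]
    moved_deg = float(np.interp(steer_amount, STEER_AMOUNTS, BOUNDARY_DEG))
    return {moved: moved_deg, COUNTERPART[moved]: STANDARD_DEG}

print(set_boundaries("forward", "right", 360.0))
# -> {'boundary 59': 70.0, 'boundary 61': 45.0}
```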
- In step 7, the obstacle detecting unit 35 performs processing of detecting an obstacle existing around the own vehicle, using the obstacle information transmitted from the obstacle sensor 16.
- In step 8, the boundary setting unit 29 judges whether the obstacle detected in step 7 is located on the boundary set in step 6 when viewed from the viewpoint 53. In a case where the obstacle is located on the boundary, the processing proceeds to step 9; in a case where the obstacle is not on the boundary, the processing proceeds to step 10.
- In step 9, the boundary setting unit 29 corrects the position of the boundary set in step 6 so as to avoid the obstacle detected in step 7. Here, when correcting the position of the boundary so as to avoid the obstacle, the boundary setting unit 29 takes the steering angle direction X into account and moves the boundary away from the obstacle in the same direction as the steering angle direction X.
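- Steps 8 and 9 can be pictured as a small angular test and correction. The following sketch is one possible reading under a simplified model in which boundary and obstacle are reduced to bearings from the viewpoint 53; the function name, the angular obstacle extent, and the step size are all assumptions.

```python
def avoid_obstacle(boundary_deg: float, obstacle_deg: float,
                   obstacle_halfwidth_deg: float, steer_sign: int,
                   step_deg: float = 1.0) -> float:
    """Steps 8-9 of FIG. 4, simplified: if the boundary set in step 6 lies
    on the obstacle as seen from the viewpoint 53, shift the boundary in
    the same direction as the steering angle direction X until it clears
    the obstacle. Angles are bearings from the viewpoint 53, in degrees,
    positive rightward; steer_sign is +1 for rightward X, -1 for leftward.
    """
    lo = obstacle_deg - obstacle_halfwidth_deg
    hi = obstacle_deg + obstacle_halfwidth_deg
    while lo <= boundary_deg <= hi:            # step 8: obstacle on the boundary?
        boundary_deg += steer_sign * step_deg  # step 9: move past it
    return boundary_deg

# A boundary at 50 degrees overlapping an obstacle spanning 46-54 degrees
# is pushed past the obstacle in the steering angle direction X.
print(avoid_obstacle(50.0, 50.0, 4.0, +1))  # -> 55.0
```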
- In step 10, the display image generating unit 31 generates a display image as follows. The display image generating unit 31 assumes a virtual screen around the own vehicle in three-dimensional space. The virtual screen is an image projection plane. Next, the display image generating unit 31 performs mapping by projecting the front image 37, the right image 39, the left image 43, and the rear image 47 acquired in step 3 onto the virtual screen.
- Next, as illustrated in FIG. 5, the display image generating unit 31 sets, as a display image 69, the image which is visible within a display range 67 from the viewpoint 53 on the virtual screen after mapping. Alternatively, as illustrated in FIG. 5, the display image generating unit 31 may use, as the display image 69, the image which is visible within the display range 67 from the viewpoint 53 in an image obtained by synthesizing the images acquired from the respective cameras at the same viewpoint. The display range 67 is a range having a certain spread around the line-of-sight direction 55. The display range 67 is set closer to the direction of change from the straight traveling direction 57 as the amount of the steering angle increases.
- That is, in a case where the straight traveling direction 57 is the forward direction, the display range 67 is set closer to the steering angle direction X as the amount of the steering angle increases. Further, in a case where the straight traveling direction 57 is the backward direction, the display range 67 is set closer to the side opposite to the steering angle direction X as the amount of the steering angle increases.
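- The placement of the display range 67 can be summarized in a few lines. The sketch below mirrors the line-of-sight computation from the earlier snippet; the 0.1 degree-per-degree gain stands in for the stored map and is a made-up value, as is the 90-degree spread.

```python
def display_range(display_direction: str, steer_sign: int,
                  steer_amount: float, fov_deg: float = 90.0):
    """Return (left edge, right edge) of the display range 67 as bearings
    in degrees from the straight traveling direction 57, positive
    rightward. The range keeps a fixed spread around the line-of-sight
    direction 55, which moves with the amount of the steering angle.
    """
    y = 0.1 * steer_amount            # stand-in for the map lookup
    if display_direction == "forward":
        center = steer_sign * y       # toward the steering angle direction X
    else:
        center = -steer_sign * y      # toward the opposite side when backing
    return (center - fov_deg / 2.0, center + fov_deg / 2.0)

print(display_range("forward", +1, 180.0))   # -> (-27.0, 63.0), shifted rightward
print(display_range("backward", +1, 180.0))  # -> (-63.0, 27.0)
```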
- The display image 69 is an image obtained by synthesizing at least parts of the front image 37, the right image 39, the left image 43, and the rear image 47 at the boundaries. An example of the display image 69 is illustrated in FIG. 6.
- The display image 69 illustrated in FIG. 6 is the display image 69 in a case where the display direction determined in step 2 is the forward direction and the steering angle direction X included in the steering angle information acquired in step 4 is the rightward direction. The position of the display range 67 in this display image 69 is set in the rightward direction compared to a case where the steering angle is 0 degrees. The rightward direction is the direction of change from the straight traveling direction 57 in this example.
- In the display image 69, a front extracted image 37B and a right extracted image 39B are joined at the boundary 59. The front extracted image 37B is the portion between the boundary 59 and the boundary 61 in the front image 37. The right extracted image 39B is the portion on the right side of the boundary 59 in the right image 39, when viewed from the viewpoint 53.
- Further, in the display image 69, the front extracted image 37B and a left extracted image 43B are joined at the boundary 61. The left extracted image 43B is the portion on the left side of the boundary 61 in the left image 43, when viewed from the viewpoint 53.
- In step 6, the position of at least a part of the boundary 59 is set closer to the direction of change from the straight traveling direction 57 as the amount of the steering angle increases. One method for doing so is, as illustrated in FIG. 6, to incline the boundary 59 more toward the direction of change from the straight traveling direction 57 as the amount of the steering angle increases. In this method, for example, the position of a lower end 59A of the boundary 59 can be set at a position fixed with respect to the own vehicle. Assume an axis 71 which is fixed to a frame representing the own vehicle 36 and which is located in a direction opposite to the steering angle direction X when viewed from the boundary 59. The axis 71 is a part of the frame line of the right image 39 and is substantially parallel to the straight traveling direction 57. An angle θ formed by the axis 71 and the boundary 59 becomes larger as the boundary 59 is inclined more toward the direction of change from the straight traveling direction 57.
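- The inclination method can be sketched geometrically: the lower end 59A stays fixed to the own vehicle, and the segment leans by the angle θ measured from the axis 71. The planar model, the coordinate frame, and the segment length below are illustrative assumptions.

```python
import math

def tilted_boundary(lower_end, theta_deg, length):
    """Return the upper end of the boundary 59 when its lower end 59A is
    held fixed and the boundary is inclined by theta, the angle it forms
    with the axis 71 (taken as parallel to the straight traveling
    direction 57). Coordinates: x to the right, y forward; theta = 0
    means the boundary runs straight ahead, and a larger theta leans it
    further toward the direction of change (rightward here).
    """
    x0, y0 = lower_end
    t = math.radians(theta_deg)
    return (x0 + length * math.sin(t), y0 + length * math.cos(t))

# theta grows with the amount of the steering angle, tilting the boundary.
for theta in (45.0, 60.0, 75.0):
    print(theta, tilted_boundary((1.0, 2.0), theta, 5.0))
```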
- The display image generating unit 31 generates a display image in a similar manner to the display image 69 illustrated in FIG. 6 also in a case where the display direction determined in step 2 is the backward direction and in a case where the steering angle direction X included in the steering angle information acquired in step 4 is the leftward direction.
- In step 11, the output unit 33 outputs the display image generated in step 10 to the display 17. The display 17 displays the display image.
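- Putting steps 1 to 11 together, one processing cycle of FIG. 4 can be skeletonized as below. Every name is hypothetical and the image work is reduced to bookkeeping; only the control flow follows the description.

```python
import numpy as np

class StubSensors:
    """Canned stand-ins for the shift sensor 13, the steering angle
    sensor 15, and the obstacle sensor 16 (hypothetical interface)."""
    def shift_info(self):         return "forward"          # step 1
    def steering_info(self):      return ("right", 180.0)   # step 4
    def obstacle_bearings(self):  return [50.0]             # step 7

def one_cycle(sensors):
    """One pass of the processing repeated at predetermined time intervals.
    Step 3 (image acquisition) is omitted from this sketch."""
    direction = sensors.shift_info()                         # steps 1-2
    steer_x, amount = sensors.steering_info()                # step 4
    sign = +1 if steer_x == "right" else -1
    y = float(np.interp(amount, [0, 540], [0, 60]))          # step 5, toy map
    sight = sign * y if direction == "forward" else -sign * y
    moved = float(np.interp(amount, [0, 540], [45, 80]))     # step 6, toy map
    boundaries = {"moved": sign * moved, "standard": -sign * 45.0}
    for bearing in sensors.obstacle_bearings():              # steps 7-9
        if abs(boundaries["moved"] - bearing) < 5.0:
            boundaries["moved"] = bearing + sign * 5.0
    fov = 90.0                                               # step 10
    display_range = (sight - fov / 2.0, sight + fov / 2.0)
    return {"display_range": display_range, "boundaries": boundaries}  # step 11

print(one_cycle(StubSensors()))
```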
- (1A) The image processing apparatus 3 sets the display range 67 closer to the direction of change from the straight traveling direction 57 as the amount of the steering angle increases. Therefore, if the position of the boundary with respect to the front image 37, the right image 39, the left image 43, and the rear image 47 were always fixed, the position of a part of the boundary would approach the center of the display image as the amount of the steering angle increases.
- For example, in the display image 69, if the position of the boundary with respect to the front image and the right image were always fixed, then in a case where the display range 67 is set closer to the steering angle direction X, the position of the boundary would approach the center of the display image 69, in a similar manner to a boundary 159.
- In the image near the boundary, a phenomenon such as distortion or one object being displayed doubly may occur due to the difference of the imaging positions of the two images to be synthesized or due to the conversion. The imaging positions are the positions where the cameras are provided. If the boundary is located in the vicinity of the center of the display image, it becomes difficult for the passenger of the own vehicle to recognize a surrounding target from the display image.
- The image processing apparatus 3 sets the position of at least a part of the boundary existing closer to the direction of change from the straight traveling direction 57, among the boundaries within the display image, closer to the direction of change from the straight traveling direction 57 as the amount of the steering angle increases. Therefore, the image processing apparatus 3 can prevent the position of the boundary from approaching the center of the display image even in a case where the line-of-sight direction 55 changes. As a result, it becomes easy for the passenger of the own vehicle to recognize a surrounding target from the display image.
- (1B) The image processing apparatus 3 tilts the boundary existing closer to the direction of change from the straight traveling direction 57, among the boundaries within the display image, further toward the direction of change from the straight traveling direction 57 as the amount of the steering angle increases. Therefore, the image processing apparatus 3 can make settings so that the boundary does not approach the center of the display screen. As a result, the image processing apparatus 3 can prevent the user from feeling uncomfortable.
- (1C) The image processing apparatus 3 specifies the position of the obstacle based on the information transmitted from the obstacle sensor 16. The image processing apparatus 3 then sets the boundary while avoiding the detected obstacle. Therefore, it becomes easy for the passenger of the own vehicle to recognize the obstacle in the display image.
- (1D) The image processing apparatus 3 generates the display image 69 viewed from the viewpoint 53 in the line-of-sight direction 55. Further, the image processing apparatus 3 sets the line-of-sight direction 55 closer to the direction of change from the straight traveling direction 57 as the amount of the steering angle increases. It is therefore possible to obtain the display image 69 with the position of the viewpoint 53 fixed.
- 1. Differences from First Embodiment
- Since the basic configuration of the second embodiment is similar to that of the first embodiment, only differences will be described below. Note that the same reference numerals as those in the first embodiment indicate the same components, and the preceding description is referred to.
- In the first embodiment described above, the position of the viewpoint 53 is fixed, and the line-of-sight direction 55 is changed in accordance with the steering angle direction X. In contrast, in the second embodiment, the line-of-sight direction 55 is fixed, as illustrated in FIG. 7. Further, in the second embodiment, the position of the viewpoint 53 is changed in accordance with the direction of change from the straight traveling direction 57. The position of the viewpoint 53 changes in the left-right direction of the own vehicle. The position of the viewpoint 53 is set closer to the direction of change from the straight traveling direction 57 as the amount of the steering angle increases. The direction of change from the straight traveling direction 57 is the steering angle direction X in a case where the straight traveling direction 57 is the forward direction. Further, the direction of change from the straight traveling direction 57 is a direction opposite to the steering angle direction X in a case where the straight traveling direction 57 is the backward direction.
- In step 10, the display image generating unit 31 sets, as the display image 69, the image which is visible within the display range 67 from the viewpoint 53 set as described above. The display range 67 is a range having a certain spread around the fixed line-of-sight direction 55. The display range 67 is set closer to the direction of change from the straight traveling direction 57 as the amount of the steering angle increases.
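- A sketch of the second embodiment's geometry: the line-of-sight direction 55 stays fixed, and the viewpoint 53 slides left-right with the amount of the steering angle. The offset map values below are hypothetical, as is linear interpolation between them.

```python
import numpy as np

# Hypothetical map: amount of the steering angle -> lateral offset of the
# viewpoint 53 in meters (illustrative values only).
STEER_AMOUNTS = [0.0, 180.0, 360.0, 540.0]
OFFSET_M      = [0.0, 0.15, 0.35, 0.50]

def viewpoint_offset(display_direction: str, steer_sign: int,
                     steer_amount: float) -> float:
    """Return the left-right offset of the viewpoint 53 (positive =
    rightward). The viewpoint moves closer to the direction of change
    from the straight traveling direction 57 as the amount of the
    steering angle increases: toward the steering angle direction X when
    traveling forward, and toward the opposite side when backing.
    """
    off = float(np.interp(steer_amount, STEER_AMOUNTS, OFFSET_M))
    change_sign = steer_sign if display_direction == "forward" else -steer_sign
    return change_sign * off

print(viewpoint_offset("forward", -1, 360.0))   # leftward steer -> -0.35
```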
- An example of the display image 69 is illustrated in FIG. 8. The display image 69 is the display image 69 in a case where the display direction is the forward direction and the steering angle direction X is the leftward direction. The position of the display range 67 in this display image 69 is set in the leftward direction compared to a case where the steering angle is 0 degrees. The leftward direction is the direction of change from the straight traveling direction 57 in this example.
- In the display image 69, the front extracted image 37B and the right extracted image 39B are joined at the boundary 59. Further, in the display image 69, the front extracted image 37B and the left extracted image 43B are joined at the boundary 61.
- The position of at least a part of the boundary 61 is set closer to the direction of change from the straight traveling direction 57 as the amount of the steering angle increases. One method for doing so is, as illustrated in FIG. 8, to incline the boundary 61 more toward the direction of change from the straight traveling direction 57 as the amount of the steering angle increases. In this method, for example, the position of a lower end 61A of the boundary 61 can be set at a position fixed with respect to the own vehicle 36. Assume an axis 72 which is fixed with respect to a frame representing the own vehicle 36 and which is located in a direction opposite to the steering angle direction X when viewed from the boundary 61. The axis 72 is a part of the frame line of the left image 43 and is substantially parallel to the straight traveling direction 57. An angle δ formed by the axis 72 and the boundary 61 becomes larger as the boundary 61 is inclined more toward the direction of change from the straight traveling direction 57.
- (2A) The
image processing apparatus 3 generates thedisplay image 69 viewed from theviewpoint 53 in the line-of-sight direction 55. Further, theimage processing apparatus 3 sets theviewpoint 53 closer to the steering angle direction X as the amount of the steering angle increases. Theimage processing apparatus 3 can obtain thedisplay image 69 with a fixed position of the line-of-sight direction 55. - While the embodiments of the present disclosure have been described above, the present disclosure is not limited to the above-described embodiments, and can be implemented with various modifications.
- (1) The direction
change acquiring unit 27 may calculate the direction of change from the straight travelingdirection 57 and the change amount from the straight travelingdirection 57 from parameters other than the steering angle. For example, the direction of change from the straight travelingdirection 57 and the change amount from the straight travelingdirection 57 may be calculated from a yaw rate of the own vehicle, display of a directional signal, display of a hazard light, or the like. - (2) A method for setting the boundary in step 6 may be, for example, the following method.
FIG. 9 illustrates thedisplay image 69 before and after the straight traveling direction changes to the rightward direction. Theboundary 59 is a boundary existing closer to the direction of change from the straight travelingdirection 57 among the boundaries within thedisplay image 69. The position of afixed point 59B, which is a part of theboundary 59, with respect to thedisplay range 67 is fixed before and after the straight travelingdirection 57 changes to the rightward direction. The fixedpoint 59B is, for example, a portion other than thelower end 59A. - (3) The
obstacle detecting unit 35 may detect an obstacle using results of image processing on images captured by thefront camera 5, theright camera 7, theleft camera 9, and therear camera 11 in place of the information from theobstacle sensor 16, or may detect an obstacle by integrating the information from theobstacle sensor 16 and results of image recognition. The image processing on the captured images is image recognition of the captured images. - It is also possible to change both the position of the
viewpoint 53 and the line-of-sight direction 55 in accordance with the amount of the steering angle to generate a display image by combining the first and second embodiments. - (4) A plurality of functions of one component in the above-described embodiments may be realized by a plurality of components, or a single function of one component may be realized by a plurality of components. Further, a plurality of functions of a plurality of components may be realized by one component, or one function realized by a plurality of components may be realized by one component. Further, part of the components of the above-described embodiments may be omitted. Further, at least a part of the components of the above-described embodiments may be added to or replaced with the components of other embodiments. Note that embodiments of the present disclosure incorporate any aspect included in technical idea specified from wording recited in the claims.
- (5) The present disclosure can be realized in various modes such as, in addition to the above-described image processing apparatus, a system having the image processing apparatus as a component, a program for causing a computer to function as the image processing apparatus, a non-transitory computer-readable storage medium such as a semiconductor memory in which this program is recorded, an image processing method, an image display method, and a drive assisting method.
Claims (5)
1. An image processing apparatus configured to be mounted on a vehicle, the image processing apparatus comprising:
an image acquiring unit configured to acquire a plurality of images in which parts of ranges which can be displayed overlap with each other, using a plurality of cameras which capture surroundings of the vehicle;
a boundary setting unit configured to set boundaries of the images in which parts of the ranges which can be displayed overlap with each other, included in the plurality of images, within ranges in which the ranges which can be displayed overlap;
a display image generating unit configured to generate a display image viewed from a viewpoint within a vehicle interior of the vehicle by synthesizing at least parts of images of the plurality of images at the boundaries;
an output unit configured to output the display image; and
a direction change acquiring unit configured to acquire a direction of change from a straight traveling direction of the vehicle, and a change amount from the straight traveling direction, wherein
the display image generating unit is configured to set a display range of the display image closer to the direction of change as the change amount increases, and
the boundary setting unit is configured to set a position of at least a part of at least the boundary existing closer to the direction of change among the boundaries within the display image closer to the direction of change as the change amount increases.
2. The image processing apparatus according to claim 1 , wherein
the boundary setting unit is configured to tilt the boundary existing closer to the direction of change among the boundaries within the display image toward the direction of change as the change amount increases.
3. The image processing apparatus according to claim 1 , further comprising:
an obstacle detecting unit configured to detect an obstacle, wherein
the boundary setting unit is configured to set the boundary so as to avoid the obstacle detected at the obstacle detecting unit.
4. The image processing apparatus according to claim 1 , wherein
the display image generating unit is configured to generate a display image viewed from the viewpoint in a line-of-sight direction, and
the display image generating unit is configured to set the line-of-sight direction closer to the direction of change as the change amount increases.
5. The image processing apparatus according to claim 1 , wherein
the display image generating unit is configured to set a position of the viewpoint closer to the direction of change as the change amount increases.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017199383A JP2019074850A (en) | 2017-10-13 | 2017-10-13 | Image processing apparatus |
JP2017-199383 | 2017-10-13 | ||
PCT/JP2018/037941 WO2019074065A1 (en) | 2017-10-13 | 2018-10-11 | Image processing device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/037941 Continuation WO2019074065A1 (en) | 2017-10-13 | 2018-10-11 | Image processing device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20200231099A1 (en) | 2020-07-23
Family
ID=66100756
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/844,294 Abandoned US20200231099A1 (en) | 2017-10-13 | 2020-04-09 | Image processing apparatus |
Country Status (5)
Country | Link |
---|---|
US (1) | US20200231099A1 (en) |
JP (1) | JP2019074850A (en) |
CN (1) | CN111201788A (en) |
DE (1) | DE112018004496T5 (en) |
WO (1) | WO2019074065A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7274918B2 (en) | 2019-04-10 | 2023-05-17 | 株式会社細川洋行 | Multilayer film for container and container containing same |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4674814B2 (en) * | 2006-03-20 | 2011-04-20 | アルパイン株式会社 | Image display device for vehicle and image composition method |
JP5158051B2 (en) * | 2009-09-18 | 2013-03-06 | 三菱自動車工業株式会社 | Driving assistance device |
US8655019B2 (en) * | 2009-09-24 | 2014-02-18 | Panasonic Corporation | Driving support display device |
CN103377372B (en) * | 2012-04-23 | 2017-12-22 | 无锡维森智能传感技术有限公司 | One kind looks around composite diagram overlapping region division methods and looks around composite diagram method for expressing |
JP2016199204A (en) * | 2015-04-14 | 2016-12-01 | トヨタ自動車株式会社 | Vehicle control device |
- 2017-10-13: JP application JP2017199383A filed, published as JP2019074850A (pending)
- 2018-10-11: WO application PCT/JP2018/037941 filed, published as WO2019074065A1 (application filing)
- 2018-10-11: DE application DE112018004496.3T filed, published as DE112018004496T5 (pending)
- 2018-10-11: CN application CN201880065858.2A filed, published as CN111201788A (pending)
- 2020-04-09: US application US16/844,294 filed, published as US20200231099A1 (abandoned)
Also Published As
Publication number | Publication date |
---|---|
WO2019074065A1 (en) | 2019-04-18 |
DE112018004496T5 (en) | 2020-10-08 |
CN111201788A (en) | 2020-05-26 |
JP2019074850A (en) | 2019-05-16 |
Legal Events
Date | Code | Title | Description |
---|---|---|---
| AS | Assignment | Owner name: DENSO CORPORATION, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: KOHARA, KENJI; REEL/FRAME: 052506/0295. Effective date: 20200401
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION