WO2015152691A2 - Apparatus and Method for Generating a Vehicle Surrounding Image - Google Patents
- Publication number
- WO2015152691A2 (PCT/KR2015/003394)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2628—Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/002—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles specially adapted for covering the peripheral part of the vehicle, e.g. for viewing tyres, bumpers or the like
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/24—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view in front of the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/27—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/265—Mixing
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/301—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing combining image information with other obstacle sensor information, e.g. using RADAR/LIDAR/SONAR sensors for estimating risk of collision
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/302—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing combining image information with GPS information or vehicle data, e.g. vehicle speed, gyro, steering angle data
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/304—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images
- B60R2300/305—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images merging camera image with lines or icons
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/60—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
- B60R2300/607—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective from a bird's eye viewpoint
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/806—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for aiding parking
Definitions
- The present invention relates to an apparatus and method for generating a vehicle surrounding image (APPARATUS AND METHOD FOR PERIPHERAL IMAGE GENERATION OF VEHICLE).
- More particularly, the present invention relates to a vehicle surrounding image generating apparatus and method that distinguish the past image from the current image when acquiring and displaying an image of the area behind a vehicle.
- A vehicle is a device that travels on roads and performs various tasks using a prime mover, such as an engine, mounted on its body; the driver must watch the driving direction in order to drive safely.
- As an aid, a display apparatus is known that outputs the image from a camera installed at the rear of a vehicle to a monitor as it is.
- In particular, Korean Patent Laid-Open Publication No. 2008-0024772 discloses a technique for more clearly identifying the relative position between a vehicle and a parking compartment in the image displayed on the monitor, by converting the input image received from the camera into a bird's eye view.
- An object of the present invention is to register bird's eye views of images around the vehicle taken at different times through the camera, so that objects outside the camera's current field of view can be displayed, while making the current image and the past image distinguishable in the registered bird's eye view image.
- Another object is to display, on the registered bird's eye view image, the danger zone around the vehicle, i.e., the area in which the vehicle is at risk of colliding with a surrounding object, using an ultrasonic sensor located in the vehicle.
- A further object is to extract the rotation angle of the vehicle through a steering wheel sensor located in the vehicle and, in response to that rotation angle, mark the area in which the vehicle is at risk of colliding with surrounding objects on the registered bird's eye view image.
- According to an aspect of the present invention, an apparatus for generating a surrounding image of a vehicle includes: a bird's eye view image generating unit that converts a photographed image, generated by a camera unit positioned on the vehicle to photograph the surroundings, into data of a ground coordinate system projected with the camera unit as the visual point, thereby generating a bird's eye view image; a moving region bird's eye view image generating unit that generates a moving region bird's eye view image, a bird's eye view of the region through which the vehicle has moved; a composite bird's eye view image generating unit that generates a composite bird's eye view image by synthesizing the subsequent bird's eye view image, the current image photographed after the previous bird's eye view image was generated, with the moving region bird's eye view image, which is the past image; and a composite bird's eye view image correction unit that generates a corrected composite bird's eye view image by processing the image so that the current image, the subsequent bird's eye view image, is distinguishable from the past image, the moving region bird's eye view image.
- The moving region bird's eye view image generating unit may extract the region through which the vehicle has moved based on wheel pulses for the left and right wheels of the vehicle, generated through a wheel speed sensor located in the vehicle.
- The composite bird's eye view image correction unit may include a past image processor that performs image processing on the moving region bird's eye view image, which is the past image.
- The past image processor may apply at least one of color adjustment, black-and-white conversion, blur, sketch, sepia, negative, emboss, and mosaic effects to the moving region bird's eye view image.
- The past image processor may process the moving region bird's eye view image in stages, applying different processing according to how far in the past each portion was captured.
- The past image processor may detect a parking line in the moving region bird's eye view image within the composite bird's eye view image and apply image processing to the parking line.
- Alternatively, the past image processor may detect the parking line in the moving region bird's eye view image and apply image processing to everything except the parking line, thereby highlighting it.
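The past-image processing described above (rendering the moving region in a distinguishing style such as black-and-white) can be sketched in a few lines of NumPy. This is an illustrative assumption, not the patent's implementation; the function name and the BT.601 grayscale weights are my own choices.

```python
import numpy as np

def process_past_region(bev: np.ndarray, past_mask: np.ndarray) -> np.ndarray:
    """Render the past (moving-region) portion of a composite bird's eye
    view image in grayscale so the driver can tell it from the live image.

    bev       -- H x W x 3 uint8 composite bird's eye view image
    past_mask -- H x W boolean mask, True where pixels come from past frames
    """
    out = bev.copy()
    # ITU-R BT.601 luminance, computed only where the mask is True.
    gray = 0.299 * bev[..., 0] + 0.587 * bev[..., 1] + 0.114 * bev[..., 2]
    out[past_mask] = np.repeat(gray[past_mask, None], 3, axis=1).astype(np.uint8)
    return out
```

Any of the other listed effects (sepia, blur, mosaic) would slot in the same way: compute the effect, then write it back only through the past-region mask.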
- The composite bird's eye view image correction unit may include a danger zone image processor that displays a danger zone in the composite bird's eye view image in response to the possibility of a collision of the vehicle.
- The danger zone image processor may, in response to the distance between the vehicle and an object detected through at least one ultrasonic sensor located in the vehicle, correct the composite bird's eye view image to display a virtual rotation area sweeping from the front corner of the vehicle in the lateral direction, thereby marking the danger zone.
- The danger zone image processor may also extract the rotation angle of the vehicle through a steering wheel sensor located in the vehicle and, corresponding to that rotation angle, correct the composite bird's eye view image to display a virtual rotation area sweeping from the front corner of the vehicle in the lateral direction, thereby marking the danger zone.
- According to another aspect, a method of generating a vehicle surrounding image includes: generating, by a bird's eye view image generating unit, a bird's eye view image by converting a photographed image, generated by a camera unit positioned on the vehicle to photograph the surroundings, into data of a ground coordinate system projected with the camera unit as the visual point; extracting, by a moving region bird's eye view image generating unit, the region through which the vehicle has moved after the previous bird's eye view image was generated, and generating a moving region bird's eye view image, which is a bird's eye view of that region; generating, by a composite bird's eye view image generating unit, a composite bird's eye view image by synthesizing the subsequent bird's eye view image, the current image generated after the previous bird's eye view image, with the moving region bird's eye view image; and generating, by a composite bird's eye view image correction unit, a corrected composite bird's eye view image by processing the image so that the current image, the subsequent bird's eye view image, is distinguishable from the past image, the moving region bird's eye view image.
- The generating of the moving region bird's eye view image may extract the region through which the vehicle has moved based on wheel pulses for the left and right wheels of the vehicle, generated through a wheel speed sensor located in the vehicle.
- The generating of the corrected composite bird's eye view image may include performing image processing on the moving region bird's eye view image, which is the past image, within the composite bird's eye view image.
- The image processing may apply at least one of color adjustment, black-and-white conversion, blur, sketch, sepia, negative, emboss, and mosaic effects to the moving region bird's eye view image.
- The image processing may be applied in stages, differing according to how far in the past each portion was captured.
- The image processing may include detecting a parking line in the moving region bird's eye view image and processing the parking line.
- Alternatively, the image processing may include detecting the parking line in the moving region bird's eye view image and processing everything except the parking line, thereby highlighting it.
- The generating of the corrected composite bird's eye view image may include displaying a danger zone on the composite bird's eye view image in response to the possibility of a collision of the vehicle.
- The displaying of the danger zone may include, in response to the distance between the vehicle and an object detected through at least one ultrasonic sensor located in the vehicle, generating the corrected composite bird's eye view image so as to display a virtual rotation area sweeping from the front corner of the vehicle in the lateral direction, thereby marking the danger zone.
- The displaying of the danger zone may also include extracting the rotation angle of the vehicle through a steering wheel sensor located in the vehicle and, corresponding to that rotation angle, generating the corrected composite bird's eye view image so as to display a virtual rotation area sweeping from the front corner of the vehicle in the lateral direction, thereby marking the danger zone.
- According to the present invention, bird's eye views of images around the vehicle taken at different times through the camera are registered, so that objects outside the camera's current field of view can be displayed while the current image and the past image remain distinguishable in the registered bird's eye view image.
- In addition, the danger zone around the vehicle, i.e., the area in which the vehicle is at risk of colliding with a surrounding object, can be displayed on the registered bird's eye view image.
- Furthermore, the rotation angle of the vehicle can be extracted through the steering wheel sensor located in the vehicle, and the danger zone corresponding to that rotation angle can be displayed on the registered bird's eye view image.
- FIG. 1 is a schematic diagram showing a main part of a vehicle surrounding image generating apparatus according to the present invention.
- FIG. 2 is a block diagram of a vehicle surroundings image generating apparatus according to the present invention.
- FIG. 3 is a diagram illustrating a positional relationship in coordinate transformation performed by the apparatus for generating a vehicle surrounding image according to the present invention.
- FIG. 4 is a view for explaining the concept of the vehicle surrounding image generating apparatus according to the present invention.
- FIG. 5 is a diagram illustrating wheel pulses of a left rear wheel of a vehicle generated by an embodiment of the apparatus for generating a vehicle surrounding image according to the present invention.
- FIG. 6 is a diagram illustrating wheel pulses of a right rear wheel of a vehicle generated by an embodiment of the apparatus for generating a vehicle surrounding image according to the present invention.
- FIG. 7 is a diagram for describing a method of extracting a region in which a vehicle moves according to an embodiment of the apparatus for generating a vehicle surrounding image according to the present invention.
- FIGS. 8 and 9 are views for explaining the state before correcting a composite bird's eye view image in the vehicle surrounding image generating apparatus according to the present invention.
- FIG. 10 is an embodiment of a composite bird's eye view image correction unit of a vehicle surrounding image generating apparatus according to the present invention.
- FIGS. 11 to 13 are output screens of the display unit according to an exemplary embodiment of the apparatus for generating a vehicle surrounding image according to the present invention.
- FIG. 14 is a diagram for describing a concept of a danger zone image processor of a synthesized bird's eye view image correction unit according to the present invention.
- FIGS. 15 and 16 are diagrams for describing the operating principle of the danger zone image processor of the composite bird's eye view image correction unit according to the present invention.
- FIG. 17 is a flowchart illustrating a method of generating a vehicle surrounding image according to the present invention.
- A basic system configuration of the apparatus for generating a vehicle surrounding image according to the present invention will be described with reference to FIGS. 1 and 2.
- FIG. 1 is a schematic diagram showing a main part of a vehicle surrounding image generating apparatus according to the present invention.
- FIG. 2 is a block diagram of a vehicle surrounding image generating apparatus according to the present invention.
- The apparatus 100 for generating a vehicle surrounding image includes a camera unit 110 positioned on the vehicle 1 to photograph the surroundings 5 and generate a photographed image, and a bird's eye view image generating unit 120 that converts the photographed image into a bird's eye view image.
- It further includes a moving region bird's eye view image generating unit 130 that extracts the region through which the vehicle 1 has moved and generates a moving region bird's eye view image, a bird's eye view of that region, and a composite bird's eye view image generating unit 140 that synthesizes this past image with the current image photographed after the previous bird's eye view image was generated.
- A composite bird's eye view image correction unit 150 generates a corrected composite bird's eye view image by processing the image so that the past portion, the moving region bird's eye view image, is distinguishable.
- The apparatus may further include a display unit 160 located in the vehicle 1 that displays the corrected composite bird's eye view image.
- the vehicle 1 may include at least one ultrasonic sensor 170 to detect an object around the vehicle.
- A steering wheel sensor 180 may also be included in the vehicle 1 to extract the rotation angle of the vehicle 1.
- The bird's eye view image generating unit 120, the moving region bird's eye view image generating unit 130, the composite bird's eye view image generating unit 140, and the composite bird's eye view image correction unit 150 form the main part of the vehicle surrounding image generating apparatus 100; they may be implemented as an electronic device for processing image data, including a microcomputer, and may be formed integrally with the camera unit 110.
- the camera unit 110 is located in the vehicle 1 and performs a function of generating a photographed image by photographing the periphery 5.
- the camera unit 110 is installed at the rear of the vehicle 1 and includes at least one camera (eg, a CCD camera).
- The bird's eye view image generating unit 120 generates a bird's eye view image by converting the photographed image generated by the camera unit 110 into data of a ground coordinate system projected with the camera position as the visual point.
- For this conversion, a known technique of perspective transformation may be used: the position of an image on the ground (e.g., a parking space indication) is obtained as a bird's eye view image.
- FIG. 3 is a diagram illustrating a positional relationship in coordinate transformation performed by the apparatus for generating a vehicle surrounding image according to the present invention.
- The perspective transformation is executed in such a manner that the positional data of the image on the ground is projected onto the screen plane T at focal length f from the position R of the camera unit 110.
- Suppose the camera unit 110 is located at the point R (0, 0, H) on the Z-axis and monitors an image on the ground (the x-y coordinate plane) at a look-down angle τ. Then, as shown in Equation 1, the two-dimensional coordinates (α, β) on the screen plane T can be converted (back-perspective transformation) into coordinates on the ground plane (bird's eye view coordinates).
- The projected image (the bird's eye view image) may then be converted into an image for the screen of the display unit 160 and displayed on the display unit 160.
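Equation 1 itself is not reproduced in this text, but the described geometry (camera at R = (0, 0, H), look-down angle τ, focal length f, screen plane T) admits a standard perspective / back-perspective pair. The sketch below is a reconstruction under those assumptions, with hypothetical function names; it is not the patent's exact formula.

```python
import math

def ground_to_screen(x, y, H, tau, f):
    """Project ground point (x, y, 0) onto the screen plane T of a camera
    at R = (0, 0, H) that looks down at angle tau with focal length f."""
    denom = y * math.cos(tau) + H * math.sin(tau)  # depth along the optical axis
    alpha = f * x / denom
    beta = f * (H * math.cos(tau) - y * math.sin(tau)) / denom
    return alpha, beta

def screen_to_ground(alpha, beta, H, tau, f):
    """Back-perspective transform: screen coordinates (alpha, beta) back to
    bird's eye view coordinates (x, y) on the ground plane."""
    y = (H * (f * math.cos(tau) - beta * math.sin(tau))
         / (f * math.sin(tau) + beta * math.cos(tau)))
    x = alpha * (y * math.cos(tau) + H * math.sin(tau)) / f
    return x, y
```

A round trip ground_to_screen followed by screen_to_ground recovers the original ground point, which is the invertibility the bird's eye view conversion relies on.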
- The vehicle 200 at time T and the vehicle 200' at time T+1, which is after a predetermined time has elapsed from time T, are shown.
- The bird's eye view image generated by the vehicle 200 at time T is called the previous bird's eye view image 10, and the bird's eye view image generated by the vehicle 200' at time T+1 is called the subsequent bird's eye view image 20.
- The previous bird's eye view image 10 and the subsequent bird's eye view image 20 share a bird's eye view image 30 of a common area, that is, the area photographed in common at time T and time T+1.
- The past bird's eye view image 40 covers an object outside the field of view of the camera unit 110 located behind the vehicle 200' at time T+1; that is, an area that is not photographed at time T+1.
- The vehicle surrounding image generating apparatus 100 includes the past bird's eye view image 40 in the image displayed by the display unit 160 inside the vehicle 200' at time T+1; the synthesized result is called a composite bird's eye view image.
- the driver should be able to distinguish between the image of the past and the image of the present.
- To perform the synthesis, the area through which the vehicle has moved must be extracted. Since the past bird's eye view image 40 is calculated by extracting that area, it is called the moving region bird's eye view image.
- In other words, the composite bird's eye view image is synthesized from the subsequent bird's eye view image 20, generated at time T+1, and the moving region bird's eye view image derived from the previous bird's eye view image 10 generated at time T.
- The moving region bird's eye view image generating unit 130 extracts the region through which the vehicle has moved after the previous bird's eye view image was generated by the bird's eye view image generating unit 120, and generates the moving region bird's eye view image, a bird's eye view of that region.
- The previous bird's eye view image 10 is displayed on the display unit 160 of the vehicle 200 at time T. On the display unit 160 of the vehicle 200' at time T+1, the past bird's eye view image 40 and the subsequent bird's eye view image 20 are displayed together; this combined display is the composite bird's eye view image.
- The subsequent bird's eye view image 20 is a bird's eye view of the image photographed by the camera unit 110 of the vehicle 200' at time T+1.
- To generate the composite image, the past bird's eye view image 40 must be extracted from the previous bird's eye view image; it corresponds to the region through which the vehicle 200 moves between time T and time T+1. To extract this region, the previous bird's eye view image 10 and the subsequent bird's eye view image 20 may be compared, and the region 40 obtained by subtracting from the previous bird's eye view image 10 the common area shared with the subsequent bird's eye view image is calculated as the region through which the vehicle has moved.
- Another method of estimating the region through which the vehicle has moved is to extract it based on driving information provided by the vehicle, including its speed and moving direction.
- A predetermined number of unit pixels is calculated from the speed information and the steering wheel information: the image is shifted by a predetermined number of unit pixels (for example, 20 pixels) in the driving direction, and by a predetermined number of unit pixels (for example, 10 pixels) to the left or right depending on the direction of the steering wheel, and the shifted region is thereby calculated.
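The dead-reckoning shift described above can be sketched as a pixel translation of the previous bird's eye view inside the new frame. The function below is an illustrative assumption: the sign conventions and the zero-fill of vacated pixels are my own choices, not the patent's implementation.

```python
import numpy as np

def shift_bird_eye(prev_bev: np.ndarray, drive_px: int, lateral_px: int) -> np.ndarray:
    """Re-position the previous bird's eye view inside the current frame:
    the scene slides drive_px pixels along the driving direction (image
    rows) and lateral_px pixels sideways (image columns; sign gives
    left/right).  Vacated pixels are zero-filled: they lie outside the
    previously photographed area."""
    h, w = prev_bev.shape[:2]
    out = np.zeros_like(prev_bev)
    src_y = slice(max(0, -drive_px), min(h, h - drive_px))
    dst_y = slice(max(0, drive_px), min(h, h + drive_px))
    src_x = slice(max(0, -lateral_px), min(w, w - lateral_px))
    dst_x = slice(max(0, lateral_px), min(w, w + lateral_px))
    out[dst_y, dst_x] = prev_bev[src_y, src_x]
    return out
```

With the example figures from the text, reversing straight would use drive_px = 20 and lateral_px = 0, and a steering input would add lateral_px = ±10.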
- A further method extracts the region through which the vehicle has moved based on wheel pulses for the left and right wheels of the vehicle, generated through a wheel speed sensor located in the vehicle; this will be described with reference to FIGS. 5 to 7.
- FIG. 5 is a diagram illustrating wheel pulses of a left rear wheel of a vehicle generated by an embodiment of the apparatus for generating a vehicle surrounding image according to the present invention.
- FIG. 6 is a diagram illustrating wheel pulses of a right rear wheel of a vehicle generated by an embodiment of the apparatus for generating a vehicle surrounding image according to the present invention.
- FIG. 7 is a diagram for describing a method of extracting a region in which a vehicle moves according to an embodiment of the apparatus for generating a vehicle surrounding image according to the present invention.
- the wheel speed sensor is positioned in the vehicle to generate a wheel pulse signal according to the movement of the left and right wheels of the vehicle.
- Since the front wheels of the vehicle steer, unlike the rear wheels, it is more effective to use the rear wheels when accurately extracting the distance the vehicle has traveled.
- Accordingly, the wheel speed sensor is described here with reference to the rear wheels; this description does not, however, limit the scope of the wheel speed sensor to the rear wheels.
- Referring to FIG. 5, the wheel pulse signal for the left rear wheel of the vehicle can be observed over time. The wheel pulse signal is counted up by 1 per cycle, and the distance moved per cycle is 0.0217 m.
- Referring to FIG. 6, the wheel pulse signal for the right rear wheel of the vehicle can likewise be observed over time, with the same count of 1 per cycle and 0.0217 m moved per cycle.
- At time point T, the wheel pulse signal for the left rear wheel has counted three cycles and therefore has a count value of 3, while the wheel pulse signal for the right rear wheel has counted five cycles and has a count value of 5.
- This confirms that the right rear wheel traveled farther during the same time. On the premise that the vehicle is reversing, it can be determined that the vehicle is moving backward with the steering wheel rotated clockwise.
- a moving distance of the vehicle may be extracted using Equations 2 to 4 below.
- K1 means the moving distance of the inner rear wheel. For example, when the vehicle 1 reverses with the steering wheel rotated clockwise, the right rear wheel becomes the inner rear wheel, and when the vehicle 1 reverses with the steering wheel rotated counterclockwise, the left rear wheel becomes the inner rear wheel.
- WP_in means the wheel pulse count value of the inner rear wheel.
- WP_res is the resolution of the wheel pulse signal, that is, the distance per cycle signal (0.0217 m). WP_res is therefore a constant (0.0217) here, but its value can vary depending on the type and setting of the wheel speed sensor. That is, K1 = WP_in × WP_res.
- t means time; specifically, the time taken from before the vehicle starts moving until the movement is complete.
- K2 means the moving distance of the outer rear wheel. For example, when the vehicle 1 reverses with the steering wheel rotated clockwise, the left rear wheel becomes the outer rear wheel, and when the vehicle 1 reverses with the steering wheel rotated counterclockwise, the right rear wheel becomes the outer rear wheel.
- WP_out means the wheel pulse count value of the outer rear wheel.
- WP_res is the resolution of the wheel pulse signal, that is, the distance per cycle signal (0.0217 m). WP_res is therefore a constant (0.0217) here, but its value can vary depending on the type and setting of the wheel speed sensor. That is, K2 = WP_out × WP_res.
- t means time; specifically, the time taken from before the vehicle starts moving until the movement is complete.
- In Equation 4, K means the moving distance of the axle, and the moving distance of the axle is the same as the moving distance of the vehicle.
- K1 means the moving distance of the inner rear wheel and K2 means the moving distance of the outer rear wheel. That is, the moving distance of the axle, which is the moving distance of the vehicle, is the average of the moving distances of the inner rear wheel and the outer rear wheel: K = (K1 + K2) / 2.
- the moving distance extracting unit 131 may extract the moving distance of the vehicle 1 through Equations 2 to 4.
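The distance computation of Equations 2 to 4 can be sketched as follows. The function names are illustrative (they are not from the patent), and the 0.0217 m resolution is the per-cycle value stated above, which varies with the wheel speed sensor:

```python
WP_RES = 0.0217  # metres travelled per wheel-pulse cycle (from the description)

def wheel_distance(pulse_count, wp_res=WP_RES):
    """Distance covered by one wheel: pulse count times resolution
    (the K1 and K2 computations of Equations 2 and 3)."""
    return pulse_count * wp_res

def axle_distance(inner_pulses, outer_pulses):
    """Vehicle (axle-centre) moving distance as the average of the
    inner and outer rear-wheel distances (Equation 4)."""
    k1 = wheel_distance(inner_pulses)   # inner rear wheel
    k2 = wheel_distance(outer_pulses)   # outer rear wheel
    return (k1 + k2) / 2

# Example from FIGS. 5 and 6: 3 cycles on the left rear wheel and
# 5 cycles on the right rear wheel during the same interval.
print(round(axle_distance(3, 5), 4))  # prints 0.0868
```

With the counts of the figures, the axle (vehicle) travels the average of 0.0651 m and 0.1085 m, i.e. about 0.087 m.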
- FIG. 7 is a diagram for describing a method of extracting a region in which a vehicle moves according to an embodiment of the apparatus for generating a vehicle surrounding image according to the present invention.
- the current position 6 of the vehicle 1 is halfway between the left rear wheel and the right rear wheel, which is equivalent to the current center position of the axle.
- the position 7 after the vehicle 1 has moved is halfway between the left rear wheel and the right rear wheel of the vehicle at the moved position.
- K1 means the moving distance of the inner rear wheel.
- R means the turning radius of the vehicle; specifically, the turning radius of the axle center.
- W means the vehicle width; specifically, the distance between the left rear wheel and the right rear wheel. Also, the amount of change in the vehicle angle during the time t is written Δθ(t) here.
- K2 means the moving distance of the outer rear wheel
- R is the rotation radius of the vehicle. Specifically, it means the radius of rotation of the axle.
- W means vehicle width
- K2 means the moving distance of the outer rear wheel
- K1 means the moving distance of the inner rear wheel
- W means vehicle width
- Equation 8 expresses, based on Equation 7, the amount of change in the vehicle angle during the time t. That is, subtracting K1 from K2 and dividing the difference by W, the predetermined width of the vehicle, gives the amount of change in the vehicle angle during the time t: Δθ(t) = (K2 − K1) / W.
- Since the values of K1, K2, Δθ(t), and W are all known, substituting them into Equation 5 or Equation 6 yields R, the turning radius of the vehicle.
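This derivation can be sketched in a few lines, assuming Equations 5 and 6 have the forms implied by the inner/outer wheel geometry, K1 = (R − W/2)·Δθ and K2 = (R + W/2)·Δθ. The function names are illustrative:

```python
def heading_change(k1, k2, width):
    """Delta-theta over the interval, per Equations 7 and 8:
    (K2 - K1) / W."""
    return (k2 - k1) / width

def turning_radius(k1, k2, width):
    """Solve K1 = (R - W/2) * delta-theta (the assumed Equation 5/6
    form) for R, the turning radius of the axle center."""
    dtheta = heading_change(k1, k2, width)
    return k1 / dtheta + width / 2

# Consistency check: with R = 5.0 m, W = 1.6 m and a 0.2 rad heading
# change, the wheel distances are K1 = 0.84 m and K2 = 1.16 m.
print(round(turning_radius(0.84, 1.16, 1.6), 6))  # prints 5.0
```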
- x c (t) means the position of the x coordinate of the rotation center
- x(t) means the x coordinate of the current center position of the axle, which is the current position of the vehicle. It has been described above that the current center position of the axle is halfway between the left rear wheel and the right rear wheel. Also, the angle of the current vehicle is written θ(t) here.
- In Equation 10, y_c(t) means the y coordinate of the rotation center, and y(t) means the y coordinate of the current center position of the axle, which is the current position of the vehicle. Also, θ(t) means the angle of the current vehicle.
- In Equation 11, the left-hand side denotes the x coordinate of the current center position of the axle when the rotation center is taken as the origin, and x(t) denotes the x coordinate of the current center position of the axle, which is the current position of the vehicle. Also, θ(t) means the angle of the current vehicle.
- In Equation 12, the left-hand side denotes the y coordinate of the current center position of the axle when the rotation center is taken as the origin, and y(t) denotes the y coordinate of the current center position of the axle, which is the current position of the vehicle. Also, θ(t) means the angle of the current vehicle.
- In Equation 13, the left-hand side denotes the x and y coordinates of the center position after the axle has moved, with the rotation center taken as the origin.
- Equation 13 is a rotation transformation formula that calculates the center position of the moved axle, with the rotation center as the origin, when the axle moves as time passes.
- In Equation 14, the left-hand side denotes the x coordinate of the center position after the movement of the axle, that is, the absolute position, no longer taking the rotation center as the origin. It is obtained by adding x_c(t), the x coordinate of the rotation center, to the x coordinate of the moved axle center expressed with the rotation center as the origin.
- In Equation 15, the left-hand side denotes the y coordinate of the center position after the movement of the axle, that is, the absolute position, no longer taking the rotation center as the origin. It is obtained by adding y_c(t), the y coordinate of the rotation center, to the y coordinate of the moved axle center expressed with the rotation center as the origin.
- Equation 16, obtained by substituting Equations 9 to 13 into Equation 14, is the final expression for the x coordinate of the center position after the movement of the axle.
- Equation 17, obtained by substituting Equations 9 to 13 into Equation 15, is the final expression for the y coordinate of the center position after the movement of the axle. Therefore, the area in which the vehicle has moved can be extracted.
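The chain of Equations 9 to 17 can be sketched as a single routine: locate the rotation center, express the axle center relative to it, rotate by the heading change, and translate back. The sign convention below is one common choice for this kind of kinematics; the patent's own figures fix the exact convention, so treat the signs as an assumption:

```python
import math

def moved_axle_center(x, y, theta, dtheta, radius):
    """Absolute position of the axle center after the vehicle rotates
    by dtheta about its instantaneous rotation center.

    Mirrors the scheme of Equations 9-17: find the rotation center
    (Eqs. 9-10), express the axle center relative to it (Eqs. 11-12),
    rotate by dtheta (Eq. 13), then translate back (Eqs. 14-15).
    """
    # Rotation center: offset R from the axle center, perpendicular
    # to the current heading theta (assumed sign convention).
    xc = x - radius * math.sin(theta)
    yc = y + radius * math.cos(theta)
    # Axle center relative to the rotation center.
    xr, yr = x - xc, y - yc
    # Rotate by the heading change dtheta.
    xr2 = xr * math.cos(dtheta) - yr * math.sin(dtheta)
    yr2 = xr * math.sin(dtheta) + yr * math.cos(dtheta)
    # Back to absolute coordinates.
    return xc + xr2, yc + yr2
```

A zero heading change leaves the axle center where it was, and a half-turn about the rotation center carries it to the diametrically opposite point, which makes the routine easy to sanity-check.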
- The composite bird's-eye view image generation unit 140 synthesizes the after bird's-eye view image 20, which is photographed and generated after the previous bird's-eye view image 10 shown in FIG. 4 was generated, with the moving area bird's-eye view image 40.
- Accordingly, the driver can view the bird's-eye view image 40 of the portion that is no longer captured by the camera unit 110.
- The composite bird's-eye view image correction unit 150 performs a function of generating a corrected composite bird's-eye view image by correcting the image so as to distinguish the after bird's-eye view image, which is the current image, from the moving area bird's-eye view image, which is the past image.
- As one method, the moving area bird's-eye view image generated by the moving area bird's-eye view image generation unit 130 may be corrected.
- That is, the moving area bird's-eye view image, which is the past image, is corrected before the composite bird's-eye view image generation unit 140 generates the composite bird's-eye view image, so that the after bird's-eye view image, which is the current image, is distinguished from the moving area bird's-eye view image, which is the past image. Accordingly, the composite bird's-eye view image is generated by the composite bird's-eye view image generation unit 140 with the current image and the past image already distinguished.
- As another method, the composite bird's-eye view image generated by the composite bird's-eye view image generation unit 140 may be corrected.
- That is, the composite bird's-eye view image is corrected so as to distinguish the after bird's-eye view image, which is the current image, from the moving area bird's-eye view image, which is the past image. Therefore, when the composite bird's-eye view image has been generated by the composite bird's-eye view image generation unit 140 with the current image and the past image not yet distinguished, the composite bird's-eye view image correction unit 150 corrects the composite image so that the current image can be distinguished from the past image.
- The display unit outputs the photographed image generated by the camera unit 110 located in the vehicle.
- FIG. 9 shows a composite bird's-eye view image in the state before correction by the composite bird's-eye view image correction unit 150.
- In the composite bird's-eye view image, the current bird's-eye view image 20, existing within the field of view of the camera unit 110 located in the vehicle 200′, and the past bird's-eye view image 40, existing outside the field of view of the camera unit 110, are both present.
- The composite bird's-eye view image correction unit 150 distinguishes the past bird's-eye view image 40 from the current image 20.
- FIG. 10 illustrates an embodiment of the composite bird's-eye view image correction unit of the vehicle surrounding image generating apparatus according to the present invention.
- FIGS. 11 to 13 are output screens of the display unit according to an exemplary embodiment of the apparatus for generating a vehicle surrounding image according to the present invention.
- the composite bird's eye view image corrector 150 includes a past image processor 151 and a danger zone image processor 152.
- The past image processing unit 151 performs a function of detecting the moving area bird's-eye view image, which is the past image, from the composite bird's-eye view image, and performing image processing on that moving area bird's-eye view image.
- Because the composite bird's-eye view image contains the moving area bird's-eye view image, which is the past image, detecting that past image within the composite bird's-eye view image precedes the image processing.
- Specifically, the past image processing unit 151 may perform at least one of black-and-white, blur, sketch, sepia, negative, emboss, and mosaic processing on the detected moving area bird's-eye view image.
- Here, the moving area bird's-eye view image, which is the past image, is processed in black and white.
- the driver may easily distinguish between the past image and the present image, thereby preventing a vehicle accident in advance.
- In addition, the past image processing unit 151 may perform at least one of black-and-white, blur, sketch, sepia, negative, emboss, and mosaic processing on the moving area bird's-eye view image, which is the past image, differently in stages according to how far in the past each portion was captured.
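As an illustration of the simplest of these treatments, the detected past region can be converted to black and white. The helper below is hypothetical: it uses plain nested lists of RGB tuples in place of a real image buffer, and the common luma weights, both of which are assumptions of the sketch rather than details from the patent:

```python
def to_grayscale(region):
    """Convert an RGB pixel region (a list of rows of (r, g, b)
    tuples) to gray, as one way the past image processing unit 151
    could mark the moving area bird's-eye view image."""
    gray_rows = []
    for row in region:
        gray_rows.append([
            (int(0.299 * r + 0.587 * g + 0.114 * b),) * 3  # luma weights
            for (r, g, b) in row
        ])
    return gray_rows

past_region = [[(200, 50, 50), (50, 200, 50)]]
print(to_grayscale(past_region))  # prints [[(94, 94, 94), (138, 138, 138)]]
```

The staged variant described above would simply apply a stronger effect (or an additional one, such as blur) to older portions of the region.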
- In addition, the past image processing unit 151 may detect a parking line in the moving area bird's-eye view image, which is the past image, within the composite bird's-eye view image, and perform image processing on the parking line.
- Alternatively, the past image processing unit 151 may highlight the parking line by noise-processing the remainder of the moving area bird's-eye view image, which is the past image.
- In this way, the driver can be induced not to place undue reliance on the moving area bird's-eye view image 40, which is the past image.
- Referring to FIG. 13, there is also a method of adding a warning display 41 to the moving area bird's-eye view image 40, which is the past image. That is, the driver can be clearly informed that the image is a past image.
- In addition, the apparatus may be designed to distinguish the past bird's-eye view image 40 from the current image 20 by depicting a V-shaped line showing the vehicle and its viewing angle in the composite bird's-eye view image.
- The dangerous area image processing unit 152 performs a function of displaying a dangerous area on the composite bird's-eye view image by correcting the composite bird's-eye view image in response to a possibility of collision of the vehicle.
- the dangerous area 42 may be displayed in the moving area aerial view image 40, which is a past image.
- A method of determining whether the vehicle may collide with another object includes a method using an ultrasonic sensor located in the vehicle and a method using a steering wheel sensor located in the vehicle; each is described below.
- First, the dangerous area image processing unit 152 corrects the composite bird's-eye view image in response to the distance between the vehicle and the object 21 detected through at least one ultrasonic sensor 170 located in the vehicle.
- Thus, the dangerous area 42 may be displayed on the front or side of the vehicle present in the bird's-eye view image.
- That is, by generating the corrected composite bird's-eye view image in response to the distance between the vehicle and the object detected by the at least one ultrasonic sensor located in the vehicle, the dangerous area can be marked by displaying a virtual rotation area of the front edge of the vehicle, extending along the lateral direction from the front edge.
- For example, the dangerous area 42 may be displayed extending from the front of the vehicle to the right side, or the dangerous area may be displayed extending from the front of the vehicle to the left side.
- When the dangerous area is displayed, a larger marking or a darker color may be used so that the driver can easily perceive it.
- Furthermore, the display of the dangerous area 42 may be designed to change in real time.
- That is, the indication of the dangerous area 42 may be changed in real time corresponding to the distance between the vehicle and the sensed object 21.
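A minimal sketch of such a real-time, distance-dependent display rule follows. The thresholds, scale factors, and color names are purely illustrative assumptions, not values from the patent:

```python
def danger_style(distance_m):
    """Choose how prominently to draw the dangerous area 42 from the
    sensed distance to the object.  Thresholds, scale factors, and
    color names are illustrative assumptions only."""
    if distance_m < 0.5:
        return {"scale": 1.5, "color": "dark red"}   # very close: largest, darkest
    if distance_m < 1.5:
        return {"scale": 1.2, "color": "red"}
    return {"scale": 1.0, "color": "yellow"}         # far: normal marking

# Re-evaluating this on every ultrasonic-sensor update makes the
# marking change in real time as the distance changes.
print(danger_style(0.3))  # prints {'scale': 1.5, 'color': 'dark red'}
```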
- Second, the dangerous area image processing unit 152 extracts the rotation angle of the vehicle through the steering wheel sensor 180 located in the vehicle, and corrects the composite bird's-eye view image corresponding to the rotation angle of the vehicle.
- the dangerous area 42 may be displayed on the front or side of the vehicle in the bird's eye view image.
- the danger zone can be indicated by displaying the virtual rotation area of the front edge of the vehicle.
- Conventionally, the dangerous area is displayed only on the image existing within the field of view of the camera unit 110 (for example, the rear of the vehicle), but according to the embodiment of the present invention, the dangerous area can also be displayed on portions existing outside the field of view of the camera unit 110, so that the driver can be notified of danger there as well.
- In addition, the risk of collision of the vehicle is determined in consideration of the rotation angle of the vehicle.
- the danger area may be displayed on the left side of the vehicle by extracting the rotation angle of the vehicle through the steering wheel sensor 180.
- The apparatus may also be designed so that, when the driver changes the direction or the angle of the steering wheel 4, the dangerous area 42 is displayed in real time accordingly.
- the indicated danger zone 42 may be changed according to the traveling direction of the vehicle.
- For example, when the dangerous area 42 is displayed on the front or right side of the vehicle and the vehicle then advances with its direction of travel changed, the previously indicated dangerous area 42 can be removed.
- In addition, the dangerous area 42 may be displayed extending from the front of the vehicle to the right side, or the dangerous area may be displayed extending from the front of the vehicle to the left side.
- An operation principle of extracting the rotation angle of the vehicle through the steering wheel sensor will be described with reference to FIGS. 15 and 16.
- FIG. 15 is a diagram for describing a steering wheel sensor of a vehicle.
- FIG. 16 is a view for explaining rotation angles of the left front wheel and the right front wheel of the vehicle.
- Referring to FIG. 15, the maximum angle of the steering wheel 4 of the vehicle 1 can be checked. Specifically, the steering wheel can rotate up to 535 degrees counterclockwise (that is, −535 degrees) and up to 535 degrees clockwise. To detect the angle of the steering wheel 4, a steering wheel sensor located in the vehicle 1 is utilized.
- the angles of the left front wheel 2 and the right front wheel 2 ′ of the vehicle may be checked.
- The maximum outer angle is 33 degrees and the maximum inner angle is 39 degrees.
- the outer maximum angle and the inner maximum angle may vary according to the type of vehicle 1 and technology development.
- That is, the steering wheel sensor 180 detects the rotation angle of the steering wheel 4 of the vehicle, and based on the rotation angle of the steering wheel 4, the rotation angle of each of the left front wheel 2 and the right front wheel 2′ of the vehicle can be calculated.
- In other words, the angle of the front wheels 2, 2′ of the vehicle 1 can be calculated based on the detected angle of the steering wheel 4.
- A specific method of calculating the angle of the front wheels 2, 2′ of the vehicle 1 uses the following equations.
- In Equations 18 and 19, θ_out (as written here) is the angle of the outer front wheel of the turning vehicle 1, and θ_in is the angle of the inner front wheel of the turning vehicle 1. Also, the maximum outer angle is 33 degrees and the maximum inner angle is 39 degrees, as described above.
- In one turning direction, the inner angle is calculated for the left front wheel 2 and the outer angle for the right front wheel 2′;
- in the opposite turning direction, the inner angle is calculated for the right front wheel 2′ and the outer angle for the left front wheel 2.
- the value obtained from the steering wheel sensor 180 has a range of -535 degrees to 535 degrees.
- In this way, the rotation angle of the vehicle can be calculated using the steering wheel sensor 180.
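Assuming the proportional mapping implied by Equations 18 and 19 (full steering lock maps to the maximum wheel angles), the calculation can be sketched as follows; the constant and function names are illustrative:

```python
MAX_STEER = 535.0   # steering-wheel range in degrees (FIG. 15)
MAX_OUTER = 33.0    # maximum outer front-wheel angle in degrees (FIG. 16)
MAX_INNER = 39.0    # maximum inner front-wheel angle in degrees (FIG. 16)

def front_wheel_angles(steer_deg):
    """Front-wheel angles from the steering-wheel angle, assuming the
    proportional relation of Equations 18 and 19.  Returns
    (outer, inner); which physical wheel is 'inner' depends on the
    turn direction, as noted in the description."""
    ratio = abs(steer_deg) / MAX_STEER
    return ratio * MAX_OUTER, ratio * MAX_INNER

print(front_wheel_angles(535.0))  # full lock -> (33.0, 39.0)
```

At half lock (267.5 degrees) the sketch gives 16.5 and 19.5 degrees; the true mapping may be nonlinear depending on the steering gear, which is why this is only an assumption.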
- FIG. 17 is a flowchart illustrating a method of generating a vehicle surrounding image according to the present invention.
- The method includes: generating a photographed image by photographing the surroundings through a camera unit located in a vehicle (S100); generating a bird's-eye view image by converting the photographed image into data of a ground coordinate system projected with the camera unit as a visual point (S110); after the previous bird's-eye view image is generated in the bird's-eye view image generating step, extracting an area in which the vehicle has moved and generating a moving area bird's-eye view image for the moved area (S120); generating a composite bird's-eye view image by combining the after bird's-eye view image, which is the current image photographed after the previous bird's-eye view image was generated, with the moving area bird's-eye view image, which is the past image (S130); and correcting the composite bird's-eye view image to generate a corrected composite bird's-eye view image in order to distinguish the after bird's-eye view image, which is the current image, from the moving area bird's-eye view image, which is the past image (S140).
- In addition, the method may further include displaying the corrected composite bird's-eye view image through a display unit located in the vehicle (S150).
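The S100 to S150 flow above can be summarized in a minimal sketch; every function name is an illustrative stand-in, and strings stand in for real image buffers:

```python
def capture_image(camera):                           # S100: photograph the surroundings
    return f"shot[{camera}]"

def to_birds_eye(image):                             # S110: ground-plane projection
    return f"bird({image})"

def moving_area_birds_eye(prev_bird, moved_area):    # S120: bird's-eye view of the moved area
    return f"moved({prev_bird}|{moved_area})"

def synthesize(after_bird, past_bird):               # S130: combine current and past views
    return f"composite({after_bird}+{past_bird})"

def correct(composite):                              # S140: distinguish the past image
    return f"corrected({composite})"

prev_bird = to_birds_eye(capture_image("rear-cam"))
past = moving_area_birds_eye(prev_bird, "0.87m arc")
after_bird = to_birds_eye(capture_image("rear-cam"))
final = correct(synthesize(after_bird, past))
print(final)                                         # S150: shown on the display unit
```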
- The apparatus and method for generating a vehicle surrounding image according to the present invention are not limited to the configurations and methods of the embodiments described above; all or some of the embodiments may be selectively combined so that various modifications can be made.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Closed-Circuit Television Systems (AREA)
- Image Processing (AREA)
- Studio Devices (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
Claims (20)
- A vehicle surrounding image generating apparatus comprising: a bird's-eye view image generation unit which generates a bird's-eye view image by converting a photographed image, generated by a camera unit located in a vehicle to photograph the surroundings, into data of a ground coordinate system projected with the camera unit as a visual point; a moving area bird's-eye view image generation unit which, after a previous bird's-eye view image is generated by the bird's-eye view image generation unit, extracts an area in which the vehicle has moved and generates a moving area bird's-eye view image, which is a bird's-eye view of the moved area; a composite bird's-eye view image generation unit which generates a composite bird's-eye view image by synthesizing an after bird's-eye view image, which is a current image photographed and generated after the previous bird's-eye view image was generated, with the moving area bird's-eye view image, which is a past image; and a composite bird's-eye view image correction unit which generates a corrected composite bird's-eye view image by correcting the image so as to distinguish the after bird's-eye view image, which is the current image, from the moving area bird's-eye view image, which is the past image.
- The apparatus of claim 1, wherein the moving area bird's-eye view image generation unit extracts the area in which the vehicle has moved based on wheel pulses for the left and right wheels of the vehicle, generated through a wheel speed sensor located in the vehicle.
- The apparatus of claim 1, wherein the composite bird's-eye view image correction unit includes a past image processing unit which performs image processing on the moving area bird's-eye view image, which is the past image.
- The apparatus of claim 3, wherein the past image processing unit performs at least one of color adjustment, black-and-white, blur, sketch, sepia, negative, emboss, and mosaic processing on the moving area bird's-eye view image, which is the past image.
- The apparatus of claim 3, wherein the past image processing unit performs image processing on the moving area bird's-eye view image, which is the past image, differently in stages based on the past time point.
- The apparatus of claim 3, wherein the past image processing unit detects a parking line in the moving area bird's-eye view image, which is the past image, within the composite bird's-eye view image, and performs image processing on the parking line.
- The apparatus of claim 3, wherein the past image processing unit detects a parking line in the moving area bird's-eye view image, which is the past image, within the composite bird's-eye view image, and highlights the parking line by image-processing the remainder other than the parking line.
- The apparatus of claim 1, wherein the composite bird's-eye view image correction unit includes a dangerous area image processing unit which displays a dangerous area on the composite bird's-eye view image in response to a possibility of collision of the vehicle.
- The apparatus of claim 8, wherein the dangerous area image processing unit corrects the composite bird's-eye view image in response to a distance between the vehicle and an object detected through at least one ultrasonic sensor located in the vehicle, thereby displaying the dangerous area by displaying a virtual rotation area of the front edge of the vehicle along the lateral direction from the front edge of the vehicle.
- The apparatus of claim 8, wherein the dangerous area image processing unit extracts a rotation angle of the vehicle through a steering wheel sensor located in the vehicle and corrects the composite bird's-eye view image in response to the rotation angle of the vehicle, thereby displaying the dangerous area by displaying a virtual rotation area of the front edge of the vehicle along the lateral direction from the front edge of the vehicle.
- A vehicle surrounding image generating method comprising: generating, by a bird's-eye view image generation unit, a bird's-eye view image by converting a photographed image, generated by a camera unit located in a vehicle to photograph the surroundings, into data of a ground coordinate system projected with the camera unit as a visual point; after a previous bird's-eye view image is generated in the bird's-eye view image generating step, extracting, by a moving area bird's-eye view image generation unit, an area in which the vehicle has moved and generating a moving area bird's-eye view image, which is a bird's-eye view of the moved area; generating, by a composite bird's-eye view image generation unit, a composite bird's-eye view image by synthesizing an after bird's-eye view image, which is a current image photographed and generated after the previous bird's-eye view image was generated, with the moving area bird's-eye view image, which is a past image; and generating, by a composite bird's-eye view image correction unit, a corrected composite bird's-eye view image by correcting the image so as to distinguish the after bird's-eye view image, which is the current image, from the moving area bird's-eye view image, which is the past image.
- The method of claim 11, wherein the generating of the moving area bird's-eye view image extracts the area in which the vehicle has moved based on wheel pulses for the left and right wheels of the vehicle, generated through a wheel speed sensor located in the vehicle.
- The method of claim 11, wherein the generating of the corrected composite bird's-eye view image includes performing image processing on the moving area bird's-eye view image, which is the past image, in the composite bird's-eye view image.
- The method of claim 13, wherein the image processing of the moving area bird's-eye view image, which is the past image, performs at least one of color adjustment, black-and-white, blur, sketch, sepia, negative, emboss, and mosaic processing on the moving area bird's-eye view image, which is the past image.
- The method of claim 13, wherein the image processing of the moving area bird's-eye view image, which is the past image, performs image processing differently in stages based on the past time point.
- The method of claim 13, wherein the image processing of the moving area bird's-eye view image, which is the past image, detects a parking line in the moving area bird's-eye view image, which is the past image, within the composite bird's-eye view image, and performs image processing on the parking line.
- The method of claim 13, wherein the image processing of the moving area bird's-eye view image, which is the past image, detects a parking line in the moving area bird's-eye view image, which is the past image, within the composite bird's-eye view image, and highlights the parking line by image-processing the remainder other than the parking line.
- The method of claim 11, wherein the generating of the corrected composite bird's-eye view image includes displaying a dangerous area on the composite bird's-eye view image in response to a possibility of collision of the vehicle.
- The method of claim 18, wherein the displaying of the dangerous area displays a virtual rotation area of the front edge of the vehicle along the lateral direction from the front edge of the vehicle by generating the corrected composite bird's-eye view image in response to a distance between the vehicle and an object detected through at least one ultrasonic sensor located in the vehicle.
- The method of claim 18, wherein the displaying of the dangerous area extracts a rotation angle of the vehicle through a steering wheel sensor located in the vehicle and displays a virtual rotation area of the front edge of the vehicle along the lateral direction from the front edge of the vehicle by generating the corrected composite bird's-eye view image in response to the rotation angle of the vehicle.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/277,050 US20170144599A1 (en) | 2014-04-04 | 2015-04-03 | Apparatus and method for generating image around vehicle |
MX2016012999A MX2016012999A (es) | 2014-04-04 | 2015-04-03 | Aparato y metodo para la generacion de una imagen periferica de un vehiculo. |
JP2016560769A JP2017520133A (ja) | 2014-04-04 | 2015-04-03 | 車両周辺イメージ生成装置および方法 |
EP15772355.2A EP3128738A4 (en) | 2014-04-04 | 2015-04-03 | Apparatus and method for generating image around vehicle |
CN201580025702.8A CN107079081A (zh) | 2014-04-04 | 2015-04-03 | 车辆周边图像生成装置及方法 |
BR112016023013A BR112016023013A2 (pt) | 2014-04-04 | 2015-04-03 | aparelho e método para geração de imagem periférica de veículo |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2014-0040633 | 2014-04-04 | ||
KR1020140040633A KR101611194B1 (ko) | 2014-04-04 | 2014-04-04 | 차량 주변 이미지 생성 장치 및 방법 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2015152691A2 true WO2015152691A2 (ko) | 2015-10-08 |
WO2015152691A3 WO2015152691A3 (ko) | 2017-02-02 |
Family
ID=54241408
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2015/003394 WO2015152691A2 (ko) | 2014-04-04 | 2015-04-03 | 차량 주변 이미지 생성 장치 및 방법 |
Country Status (8)
Country | Link |
---|---|
US (1) | US20170144599A1 (ko) |
EP (1) | EP3128738A4 (ko) |
JP (1) | JP2017520133A (ko) |
KR (1) | KR101611194B1 (ko) |
CN (1) | CN107079081A (ko) |
BR (1) | BR112016023013A2 (ko) |
MX (1) | MX2016012999A (ko) |
WO (1) | WO2015152691A2 (ko) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2550265A (en) * | 2016-03-24 | 2017-11-15 | Ford Global Tech Llc | System and method for generating a hybrid camera view in a vehicle |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10026450B2 (en) * | 2015-03-31 | 2018-07-17 | Jaguar Land Rover Limited | Content processing and distribution system and method |
CN106080390A (zh) * | 2016-06-07 | 2016-11-09 | 深圳市灵动飞扬科技有限公司 | 车辆行进全景系统及其方法 |
JP6642306B2 (ja) * | 2016-06-29 | 2020-02-05 | アイシン精機株式会社 | 周辺監視装置 |
US10150414B2 (en) * | 2016-07-08 | 2018-12-11 | Ford Global Technologies, Llc | Pedestrian detection when a vehicle is reversing |
KR101949961B1 (ko) * | 2016-11-15 | 2019-02-21 | 주식회사 와이즈오토모티브 | 운전 보조 장치 및 방법 |
KR20180062820A (ko) * | 2016-12-01 | 2018-06-11 | 주식회사 와이즈오토모티브 | 차량의 운전 보조 장치 및 방법 |
JP6743732B2 (ja) * | 2017-03-14 | 2020-08-19 | トヨタ自動車株式会社 | 画像記録システム、画像記録方法、画像記録プログラム |
US10654570B2 (en) | 2017-06-05 | 2020-05-19 | International Business Machines Corporation | Vehicular alert system |
KR102057216B1 (ko) | 2017-10-30 | 2019-12-18 | 주식회사 케이티앤지 | 에어로졸 생성 장치 및 에어로졸 생성 장치용 히터 조립체 |
CN107945575A (zh) * | 2017-11-20 | 2018-04-20 | 深圳市中通视际实业有限公司 | 一种车辆监控装置及其方法 |
CN108961838B (zh) * | 2018-08-16 | 2020-09-22 | 大连民族大学 | 道路行人分类系统 |
CN108985271B (zh) * | 2018-08-16 | 2021-10-08 | 大连民族大学 | 磁力模型的感兴趣行人判定方法 |
CN109375620B (zh) * | 2018-10-12 | 2020-06-02 | 深圳市今天国际智能机器人有限公司 | 利用单个光电传感器控制舵轮返回原点的方法及装置 |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20080024772A (ko) | 2006-09-14 | 2008-03-19 | 주식회사 만도 | 조감도를 이용한 주차구획 인식 방법, 장치 및 그를 이용한주차 보조 시스템 |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3275668B2 (ja) * | 1995-11-21 | 2002-04-15 | 三菱電機株式会社 | 映像監視システム |
US7366595B1 (en) * | 1999-06-25 | 2008-04-29 | Seiko Epson Corporation | Vehicle drive assist system |
US6476730B2 (en) * | 2000-02-29 | 2002-11-05 | Aisin Seiki Kabushiki Kaisha | Assistant apparatus and method for a vehicle in reverse motion |
JP3882517B2 (ja) * | 2001-03-21 | 2007-02-21 | 日産自動車株式会社 | 駐車位置設定装置 |
JP4156214B2 (ja) * | 2001-06-13 | 2008-09-24 | 株式会社デンソー | 車両周辺画像処理装置及び記録媒体 |
JP3886376B2 (ja) * | 2001-12-26 | 2007-02-28 | 株式会社デンソー | 車両周辺監視システム |
JP4020071B2 (ja) * | 2003-12-10 | 2007-12-12 | 日産自動車株式会社 | 周囲状況表示装置 |
JP2007088577A (ja) * | 2005-09-20 | 2007-04-05 | Denso Corp | 車両周辺画像処理システム |
JP4321543B2 (ja) * | 2006-04-12 | 2009-08-26 | トヨタ自動車株式会社 | 車両周辺監視装置 |
JP2007102798A (ja) * | 2006-10-11 | 2007-04-19 | Denso Corp | 車両周辺監視システム |
KR101393918B1 (ko) * | 2008-07-08 | 2014-05-13 | 현대자동차주식회사 | 차량 전방위 감시 시스템 |
JP4914458B2 (ja) * | 2009-02-12 | 2012-04-11 | 株式会社日本自動車部品総合研究所 | 車両周辺表示装置 |
US20100259371A1 (en) * | 2009-04-10 | 2010-10-14 | Jui-Hung Wu | Bird-View Parking Aid Apparatus with Ultrasonic Obstacle Marking and Method of Maneuvering the same |
JP5284206B2 (ja) * | 2009-07-10 | 2013-09-11 | 三菱電機株式会社 | 運転支援装置およびナビゲーション装置 |
KR101071666B1 (ko) * | 2010-03-02 | 2011-10-11 | (주) 엔네비솔루션 | 회전각 센서를 이용한 주차 유도 시스템 및 방법 |
JP5691339B2 (ja) * | 2010-09-21 | 2015-04-01 | アイシン精機株式会社 | 運転支援装置 |
KR101295618B1 (ko) * | 2011-12-09 | 2013-08-12 | 주식회사 와이즈오토모티브 | 사각 지대 표시 장치 및 방법 |
JP5853457B2 (ja) * | 2011-07-20 | 2016-02-09 | アイシン精機株式会社 | 車両周辺監視システム |
JP5790335B2 (ja) * | 2011-09-01 | 2015-10-07 | 株式会社デンソー | 車両周辺画像表示制御装置 |
JP2014036326A (ja) * | 2012-08-08 | 2014-02-24 | Honda Motor Co Ltd | 俯瞰画像表示装置 |
-
2014
- 2014-04-04 KR KR1020140040633A patent/KR101611194B1/ko active IP Right Grant
-
2015
- 2015-04-03 BR BR112016023013A patent/BR112016023013A2/pt not_active IP Right Cessation
- 2015-04-03 US US15/277,050 patent/US20170144599A1/en not_active Abandoned
- 2015-04-03 EP EP15772355.2A patent/EP3128738A4/en not_active Withdrawn
- 2015-04-03 CN CN201580025702.8A patent/CN107079081A/zh active Pending
- 2015-04-03 MX MX2016012999A patent/MX2016012999A/es unknown
- 2015-04-03 WO PCT/KR2015/003394 patent/WO2015152691A2/ko active Application Filing
- 2015-04-03 JP JP2016560769A patent/JP2017520133A/ja active Pending
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20080024772A (ko) | 2006-09-14 | 2008-03-19 | 주식회사 만도 | 조감도를 이용한 주차구획 인식 방법, 장치 및 그를 이용한주차 보조 시스템 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2550265A (en) * | 2016-03-24 | 2017-11-15 | Ford Global Tech Llc | System and method for generating a hybrid camera view in a vehicle |
US10576892B2 (en) | 2016-03-24 | 2020-03-03 | Ford Global Technologies, Llc | System and method for generating a hybrid camera view in a vehicle |
Also Published As
Publication number | Publication date |
---|---|
KR20150115488A (ko) | 2015-10-14 |
CN107079081A (zh) | 2017-08-18 |
WO2015152691A3 (ko) | 2017-02-02 |
MX2016012999A (es) | 2017-10-12 |
JP2017520133A (ja) | 2017-07-20 |
EP3128738A2 (en) | 2017-02-08 |
US20170144599A1 (en) | 2017-05-25 |
BR112016023013A2 (pt) | 2017-10-10 |
EP3128738A4 (en) | 2017-06-28 |
KR101611194B1 (ko) | 2016-04-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2015152691A2 (ko) | Apparatus and method for generating a vehicle surrounding image | |
WO2015152692A1 (ko) | Apparatus and method for generating a vehicle surrounding image | |
WO2020071839A1 (ko) | Device and method for monitoring ships and harbors | |
WO2019172725A1 (en) | Method and apparatus for performing depth estimation of object | |
WO2019054636A1 (ko) | Vehicle camera calibration apparatus and method | |
WO2018012674A1 (en) | Driver assistance apparatus and vehicle having the same | |
WO2012064106A2 (en) | Method and apparatus for video stabilization by compensating for view direction of camera | |
WO2015099465A1 (ko) | Vehicle driving assistance device and vehicle equipped with the same | |
WO2015088289A1 (ko) | Stereo camera, vehicle driving assistance device equipped with the same, and vehicle | |
WO2016145602A1 (en) | Apparatus and method for focal length adjustment and depth map determination | |
WO2015093828A1 (ko) | Stereo camera and vehicle equipped with the same | |
WO2020101420A1 (ko) | Method and apparatus for measuring optical characteristics of an augmented reality device | |
WO2018103187A1 (zh) | Method and system for forming a monitoring image of a monitoring device | |
WO2021141338A1 (ko) | Device and method for monitoring ships and harbors | |
WO2015093823A1 (ko) | Vehicle driving assistance device and vehicle equipped with the same | |
WO2017090833A1 (en) | Photographing device and method of controlling the same | |
WO2015099463A1 (ko) | Vehicle driving assistance device and vehicle equipped with the same | |
WO2019004530A1 (ko) | Method for removing an object to be processed from an image, and device for performing the method | |
WO2015093905A1 (ko) | Vehicle driving assistance device and vehicle equipped with the same | |
WO2021246758A1 (ko) | Electronic device and operating method thereof | |
WO2020145744A1 (ko) | Camera device and electronic device comprising the same | |
WO2021230559A1 (ko) | Electronic device and operating method thereof | |
WO2017146314A1 (ko) | Method for outputting a hologram using a display panel and a glasses-free multi-view lenticular sheet, and method for generating and outputting a three-dimensional image using two lenticular-sheet display panels | |
WO2022103121A1 (en) | Electronic device for estimating camera illuminant and method of the same | |
WO2017155365A1 (en) | Electronic apparatus for providing panorama image and control method thereof |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15772355; Country of ref document: EP; Kind code of ref document: A2
| WWE | Wipo information: entry into national phase | Ref document number: 15277050; Country of ref document: US
| ENP | Entry into the national phase | Ref document number: 2016560769; Country of ref document: JP; Kind code of ref document: A
| REEP | Request for entry into the european phase | Ref document number: 2015772355; Country of ref document: EP
| WWE | Wipo information: entry into national phase | Ref document number: 2015772355; Country of ref document: EP
| WWE | Wipo information: entry into national phase | Ref document number: MX/A/2016/012999; Country of ref document: MX
| NENP | Non-entry into the national phase | Ref country code: DE
| REG | Reference to national code | Ref country code: BR; Ref legal event code: B01A; Ref document number: 112016023013
2016-10-03 | ENP | Entry into the national phase | Ref document number: 112016023013; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20161003