KR101757201B1 - The Apparatus And Method For Around View Monitoring - Google Patents

The Apparatus And Method For Around View Monitoring Download PDF

Info

Publication number
KR101757201B1
Authority
KR
South Korea
Prior art keywords
top view
image
control parameter
image control
vehicle
Prior art date
Application number
KR1020150184443A
Other languages
Korean (ko)
Other versions
KR20170075142A (en)
Inventor
윤형식
Original Assignee
에스엘 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 에스엘 주식회사 filed Critical 에스엘 주식회사
Priority to KR1020150184443A priority Critical patent/KR101757201B1/en
Publication of KR20170075142A publication Critical patent/KR20170075142A/en
Application granted granted Critical
Publication of KR101757201B1 publication Critical patent/KR101757201B1/en

Links

Images

Classifications

    • H04N5/2257
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/06Road conditions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2625Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects for obtaining an image which is composed of images from a temporal image sequence, e.g. for a stroboscopic effect
    • H04N5/2627Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects for obtaining an image which is composed of images from a temporal image sequence, e.g. for a stroboscopic effect for providing spin image effect, 3D stop motion effect or temporal freeze effect
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2628Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation

Abstract

According to an aspect of the present invention, there is provided a peripheral image monitoring apparatus including: a plurality of cameras for capturing peripheral images of a vehicle; an electronic control unit (ECU) for controlling the processing of the captured images and the driving of the vehicle; a top view generation unit for converting each of the images captured by the plurality of cameras into an individual top view and generating one synthesized top view by rotating and moving the converted individual top views; a display device for displaying the captured images or the generated synthesized top view; and an input unit to which a user's input signal is applied so that all or part of the individual top views are rotated and moved. When the input signal is applied to the input unit, the rotation and movement of all or part of the individual top views are reflected in the synthesized top view and displayed in real time on the display device.

Description

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to an apparatus and method for monitoring images of a vehicle's surroundings, and more particularly, to a peripheral image monitoring apparatus and method capable of generating a top view of the area around the vehicle.

Generally, in order to park a vehicle in a parking space, the driver must judge the position of the current vehicle, the positions of and distances to surrounding obstacles or other vehicles, the steering angle, and the anticipated course of the vehicle, relying on his or her own sense and experience. However, novice drivers suffer considerable difficulty when parking because of a lack of sense and experience. In particular, when parking in a narrow space or a place with many blind spots, there is a high probability of a collision with another vehicle or an obstacle due to a mistake in judging position or an operating mistake.

To address this problem, technologies have been proposed in which cameras are installed at the front and rear of the vehicle to capture images in the corresponding directions, and those images are displayed through a display device installed in the vehicle interior. More recently, Around View Monitoring (AVM) technology has been introduced, in which cameras are installed at the front, rear, left, and right sides of the vehicle and the captured images are combined into a top view, as if looking down at the vehicle from above. AVM technology increases driving convenience by letting the driver easily recognize the surroundings of the vehicle. Accordingly, parking assist systems based on AVM technology are being actively developed.

FIG. 1 is a schematic view showing a method of synthesizing a plurality of images photographed from four sides of a vehicle to perform general AVM technology.

In order to carry out the Around View Monitoring (AVM) technique, it is important to accurately convert and synthesize the images of the four sides of the vehicle. Therefore, the factory generally performs the conversion and synthesis of the images before the vehicle is shipped. As shown in FIG. 1, pattern diagrams are positioned at regular intervals on flat ground, and the vehicle is positioned such that the pattern diagrams can all be photographed by the plurality of cameras 10a, 10b, 10c, and 10d installed on its four sides. Images are then captured with the plurality of cameras 10a, 10b, 10c, and 10d and automatically converted and synthesized so as to generate a top view.

After the conversion and synthesis, image control parameters containing information about how the images were rotated and shifted are generated, so that the images from the respective cameras are synthesized naturally and the pattern appears in the image at its actual interval and shape. These image control parameters may vary from one vehicle to another due to slight errors that can occur during manufacturing, such as in camera installation. Therefore, the parameter data, once generated, is stored in the electronic control unit (ECU) of the vehicle, and the top view displayed thereafter is created simply by reading the stored parameter data. That is, there is no need to repeat the process of correcting the position of the images. However, if the installation position of a camera changes due to a failure of the camera, an impact, or aging, the corresponding image control parameters also change, and must therefore be corrected again.

However, since such synthesis and correction must be performed precisely, it is not easy to carry out outside a designated plant or similar facility, and the vehicle must be moved to a designated place such as a factory or a service center. For heavy equipment such as an excavator working at a construction site in the mountains, in particular, moving to such a place consumes excessive cost and time.

Korean Patent No. 1406230
Korean Laid-Open Publication No. 2015-0053323

A problem to be solved by the present invention is to provide a peripheral image monitoring apparatus and method that can create a normal top view without moving the vehicle to a designated place, even if the installation position of a camera installed in the vehicle changes due to a failure, an impact, or aging.

The problems of the present invention are not limited to the above-mentioned problems, and other problems not mentioned can be clearly understood by those skilled in the art from the following description.

According to an aspect of the present invention, there is provided a peripheral image monitoring apparatus including: a plurality of cameras for capturing peripheral images of a vehicle; an electronic control unit (ECU) for controlling the processing of the captured images and the driving of the vehicle; a top view generation unit for converting each of the images captured by the plurality of cameras into an individual top view and generating one synthesized top view by rotating and moving the converted individual top views; a display device for displaying the captured images or the generated synthesized top view; and an input unit to which a user's input signal is applied so that all or part of the individual top views are rotated and moved. When the input signal is applied to the input unit, the rotation and movement of all or part of the individual top views are reflected in the synthesized top view and displayed in real time on the display device.

According to another aspect of the present invention, there is provided a method for monitoring a peripheral image, the method comprising: converting a plurality of peripheral images of a vehicle, captured by a plurality of cameras, into individual top views; generating one synthesized top view by rotating and moving the converted individual top views based on image control parameters; displaying the generated synthesized top view on a display device; selecting, from among the plurality of cameras, a camera to be subjected to tolerance correction; inputting a change value of the image control parameter through an input unit; changing the image control parameter and reflecting the change in the generated synthesized top view; displaying the updated synthesized top view in real time on the display device; and storing the changed image control parameter when the tolerance correction is completed.
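The sequence of steps above can be sketched as the following minimal Python outline. All function names and data representations here are hypothetical stand-ins for the patent's components (the top view generation unit and the image control parameters), not an actual implementation:

```python
def to_top_view(image):
    # Stand-in for the lookup-table conversion of one camera image
    # into an individual top view.
    return {"source": image, "x": 0.0, "y": 0.0, "rot": 0.0}

def rotate_and_move(view, params):
    # Apply the per-camera image control parameters to one
    # individual top view.
    view = dict(view)
    view["x"] += params.get("x", 0.0)
    view["y"] += params.get("y", 0.0)
    view["rot"] += params.get("rot", 0.0)
    return view

def synthesize(views):
    # Stand-in for overlay synthesis into one composite top view.
    return {"views": views}

def generate_composite_top_view(images, params_per_camera):
    individual = [to_top_view(img) for img in images]
    placed = [rotate_and_move(v, p)
              for v, p in zip(individual, params_per_camera)]
    return synthesize(placed)
```

Changing one camera's parameter dictionary and regenerating the composite corresponds to the real-time correction loop claimed later in the method.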

Other specific details of the invention are included in the detailed description and drawings.

The embodiments of the present invention have at least the following effects.

Even if the installation position of a camera installed in the vehicle changes due to a failure, shock, or aging, the tolerance can be corrected by manually changing the image control parameters. Accordingly, it is possible to provide a peripheral image monitoring apparatus and method capable of generating a normal top view through image synthesis and correction on the spot, without moving the vehicle to a factory or a service center.

In particular, even in places such as mountainous terrain where there is no pattern diagram or lane marking that the cameras could recognize automatically, when an image control parameter is changed, the top view with the corrected tolerance is displayed in real time, so the tolerance can be corrected easily.

The effects according to the present invention are not limited by the contents exemplified above, and more various effects are included in the specification.

FIG. 1 is a schematic view showing a method of synthesizing a plurality of images photographed from four sides of a vehicle to perform general AVM technology.
FIG. 2 is a block diagram of a peripheral image monitoring apparatus 1 according to an embodiment of the present invention.
FIG. 3 is a flowchart illustrating a method of monitoring a peripheral image according to an exemplary embodiment of the present invention.
FIG. 4 is a schematic view showing the rotation and movement directions of a forward-facing top view under correction of image control parameters according to an embodiment of the present invention.
FIG. 5 is a schematic view showing a forward-facing top view before and after correction of image control parameters according to an embodiment of the present invention.

BRIEF DESCRIPTION OF THE DRAWINGS

The advantages and features of the present invention and the manner of achieving them will become apparent with reference to the embodiments described in detail below together with the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art; the invention is defined only by the scope of the claims. Like reference numerals refer to like elements throughout the specification.

Unless defined otherwise, all terms (including technical and scientific terms) used herein may be used in a sense commonly understood by one of ordinary skill in the art to which this invention belongs. Also, commonly used predefined terms are not ideally or excessively interpreted unless explicitly defined otherwise.

The terminology used herein is for the purpose of describing embodiments and is not intended to limit the present invention. In this specification, the singular form includes the plural form unless otherwise specified. The terms "comprises" and/or "comprising" used in the specification do not exclude the presence or addition of one or more elements other than the stated elements.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.

FIG. 2 is a block diagram of a peripheral image monitoring apparatus 1 according to an embodiment of the present invention.

A peripheral image monitoring apparatus 1 according to an embodiment of the present invention includes: a plurality of cameras 10a, 10b, 10c, and 10d that capture peripheral images of a vehicle; an electronic control unit (ECU) 20; an image processing unit 31 that receives the encoded and compressed images from the plurality of cameras 10a, 10b, 10c, and 10d and performs image processing on them; an image converting unit 32 that converts the plurality of images into individual top views; an image synthesizing unit 33 that synthesizes the individual top views to generate one top view; and a display device 40 that displays the top view.

The plurality of cameras 10a, 10b, 10c, and 10d are installed around the vehicle to photograph its exterior. Generally, four cameras 10a, 10b, 10c, and 10d are installed at the front, rear, left, and right sides of the vehicle, but the present invention is not limited thereto, and various numbers of cameras 10 may be installed. The camera 10 mainly uses a wide-angle lens having a large angle of view; a fish-eye lens, an ultra-wide-angle lens having an angle of view of 180 degrees or more, may also be used. The image captured by the camera 10 is displayed through the display device 40 installed in the vehicle, allowing the driver to secure a view through the output image, easily grasp the external situation, avoid obstacles, and ensure safety. The camera 10 generally uses an image pickup element such as a CCD (Charge-Coupled Device) or CIS (CMOS Image Sensor). The camera 10 according to an embodiment of the present invention is preferably a digital camera that captures two-dimensional images at 15 to 30 frames per second, converts them into digital signals, and outputs moving-image data, but the present invention is not limited thereto. If the camera 10 is not a digital camera, the captured image is an analog RGB image signal, so a separate ADC must be provided; if it is a digital camera, no ADC is required. In addition, since the camera 10 has an image-encoding function, it encodes the captured image immediately and generates compressed image data.

In recent years, with the growing interest in UHD (Ultra High Definition) ultra-high-resolution video, standardization of HEVC (High Efficiency Video Coding) for UHD image encoding has been completed, improving encoding efficiency to more than twice that of H.264/MPEG-4 AVC. As the codec for encoding the image, it is preferable to use one of the codecs widely used in recent years, such as MPEG-4, H.264/MPEG-4 AVC, or HEVC, but the present invention is not limited thereto and various codecs may be used.

The electronic control unit (ECU) 20 receives inputs from various sensors mounted on the vehicle and controls the overall driving of the vehicle. When the plurality of compressed images are output from the plurality of cameras 10a, 10b, 10c, and 10d, the image data is decoded through the image processing unit 31 and then converted and synthesized into a top view through the image converting unit 32 and the image synthesizing unit 33. At this time, a signal indicating that the compressed images have been input to the image processing unit 31 is applied to the electronic control unit (ECU) 20. The electronic control unit (ECU) 20 according to an embodiment of the present invention causes the image processing unit 31 to process the decoded image so as to adjust properties such as resolution, frame rate, or image quality (bit depth). For example, the brightness of the image can be adjusted based on a day/night judgement. Specifically, a sensor such as the illuminance sensor 54 mounted on the vehicle applies a signal to the electronic control unit (ECU) 20. On receiving the signal, the electronic control unit (ECU) 20 judges whether it is daytime or nighttime outdoors and commands a brightness adjustment of the image. The brightness adjustment command may instruct the image processing unit 31 to adjust pixel brightness during image processing, including decoding, or the electronic control unit (ECU) 20 may directly instruct the display panel 41 to adjust the brightness of its backlight. If it is daytime, the brightness of the output image may be raised to about 500 cd/m² or more; if it is nighttime, it may be reduced to about 100 cd/m² or less. However, the present invention is not limited thereto, and the brightness can be set in various ranges.
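The day/night brightness command described above can be illustrated with a small sketch. The lux threshold and the function names are assumptions for illustration only; the brightness targets follow the approximate values in the text (about 500 cd/m² or more by day, about 100 cd/m² or less by night):

```python
DAYTIME_LUX_THRESHOLD = 1000.0  # assumed sensor threshold, not from the patent

def is_daytime(illuminance_lux: float) -> bool:
    """Day/night judgement from the illuminance sensor reading."""
    return illuminance_lux >= DAYTIME_LUX_THRESHOLD

def target_brightness(illuminance_lux: float) -> float:
    """Return a target panel brightness in cd/m^2: roughly 500 or
    more by day, roughly 100 or less by night."""
    return 500.0 if is_daytime(illuminance_lux) else 100.0
```

In the apparatus, the resulting value would be sent either to the image processing unit (pixel brightness) or to the display panel's backlight control.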
In addition, the electronic control unit (ECU) 20 performs overall control in the process of synthesizing images in the image converting unit 32 and the image synthesizing unit 33, which will be described later.

As the electronic control unit (ECU) 20, a body control module (BCM) that can control the camera 10 and other equipment installed in the vehicle may be used; recently, a Driver Information System (DIS) has also been used. Here, the driver information system presents all vehicle information, such as multimedia, navigation, the vehicle air-conditioning system, and various vehicle settings, to the driver through an 8-inch large monitor, and allows the driver to control parts of the vehicle easily and quickly by command via a touch screen or a rear-seat haptic controller.

In addition to the above functions, the electronic control unit (ECU) 20 implements various settings and controls internal signals. The settings include updating its own software as well as configuring the interface to suit the user. The interface settings include zoom, resolution, brightness, chroma, and saturation of the screen, and everything that can be set so that the user can output a desired image, such as rotation, monochrome, style, and view mode. Accordingly, when desired, the user can zoom the screen in and out and adjust its brightness. It is also possible to adjust the view mode so that a portion not shown in the captured image can be seen, and to rotate and move the camera 10 to photograph a desired portion.

The electronic control unit (ECU) 20 according to an embodiment of the present invention may include a storage unit (not shown). The storage unit (not shown) stores the encoded images and various final contents. To miniaturize the electronic control unit (ECU) 20, the storage unit (not shown) preferably uses flash memory, a nonvolatile memory, but the present invention is not limited thereto and various memory devices may be used.

The plurality of images captured by the cameras 10 are transmitted to the image processing unit 31. When the plurality of compressed images are input simultaneously, the image processing unit 31 can perform image processing such as decoding and rendering on them simultaneously and independently. The image processing unit 31 may include a decoder that receives and decodes the compressed images, a buffer storage unit, and a graphics renderer.

The decoder receives the encoded compressed image from the camera 10, decodes it, and generates a reconstructed image. As with the encoding described above, various types of codec may be used for decoding, but the codec must be of the same type as the one used for encoding, so that the image is restored exactly on the basis of the same standard.

If a display delay occurs in the display panel 41, the frames of the image captured and transmitted by the camera 10 must wait internally. The buffer storage unit temporarily stores the frame data of the waiting image; when an image is displayed on the display panel 41, the image data of the next frame is transmitted so that the image can be reproduced naturally. The graphics renderer performs rendering of the image. Rendering is a method of creating a three-dimensional image while taking into account external information such as light sources, positions, and colors, so that a two-dimensional image becomes more realistic. Rendering methods include wireframe rendering, which draws only the edges of an object, and ray-tracing rendering, which determines the color of each pixel by calculating the refraction and reflection of light and tracing its path back to the light source.

The decoded and otherwise image-processed images are output from the image processing unit 31 and input to the top view generation unit 30. The top view generation unit 30 converts and combines the input images into one top view, a view of the vehicle from above, through the image converting unit 32 and the image synthesizing unit 33. The image converting unit 32 and the image synthesizing unit 33 correct and convert the restored images as described above and synthesize them in an overlay manner, so that one top view image is generated.

The image converting unit 32 receives the plurality of processed images, converts them through a lookup table, and generates individual top views of the plurality of images. The lookup table can be generated by applying a distortion correction algorithm, an affine transformation algorithm, and a viewpoint transformation algorithm. The distortion correction algorithm corrects the geometric distortion caused by the lens of the camera 10; since an actual lens is generally aspherical, radial or tangential distortion may occur, and the algorithm corrects it. The distortion correction algorithm can be expressed as a function of correction parameters and distortion constants. The affine transformation is a point correspondence expressed as a linear equation in two-dimensional space, comprising rotation (R), translation (T), and scale (S) transformations. The viewpoint transformation algorithm converts the viewpoints of the images captured through the plurality of cameras 10a, 10b, 10c, and 10d into top view images viewed from above. These transformations can be implemented using various known techniques.
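The rotation (R), translation (T), and scale (S) components of the affine transformation can be sketched as a single 2×3 matrix applied to pixel coordinates. This is a generic illustration of the affine step only, not the patent's lookup table or its distortion/viewpoint algorithms:

```python
import numpy as np

def affine_matrix(angle_deg: float, tx: float, ty: float,
                  scale: float) -> np.ndarray:
    """Build a 2x3 affine matrix combining rotation (R),
    translation (T), and scale (S)."""
    a = np.deg2rad(angle_deg)
    c, s = np.cos(a) * scale, np.sin(a) * scale
    return np.array([[c, -s, tx],
                     [s,  c, ty]])

def apply_affine(M: np.ndarray, points) -> np.ndarray:
    """Apply the 2x3 affine matrix to an (N, 2) array of points."""
    pts = np.asarray(points, dtype=float)
    return pts @ M[:, :2].T + M[:, 2]
```

In practice such per-pixel mappings are precomputed into the lookup table so that conversion at runtime is a simple table read.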

The image synthesizing unit 33 synthesizes the converted individual top views in an overlapping manner to generate a single synthesized top view. Here, the image synthesizing unit 33 performs overlay synthesis using a mask image. The mask image holds, for each image captured by each camera 10, weight information for the pixels constituting the corrected and converted image. The electronic control unit (ECU) 20 adjusts the weights of the pixels included in the overlapping areas between the corrected and converted images so that the overlapping areas are displayed more naturally when the images are combined. In addition, each image has an image control parameter indicating the degree of rotation and movement of the image for image synthesis.
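The weighted overlay of two converted images in their overlapping area can be sketched as below; the per-pixel weight array stands in for the mask image described above, and the function name is hypothetical:

```python
import numpy as np

def blend_overlap(img_a: np.ndarray, img_b: np.ndarray,
                  weight_a: np.ndarray) -> np.ndarray:
    """Blend two converted top-view images in their overlapping
    region using per-pixel weights taken from a mask image."""
    w = np.clip(np.asarray(weight_a, dtype=float), 0.0, 1.0)
    w = w[..., None]  # broadcast weights over the color channels
    blended = w * img_a + (1.0 - w) * img_b
    return blended.astype(img_a.dtype)
```

Ramping the weights from 1 to 0 across the overlap produces the smooth seam between adjacent cameras' top views.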

In an embodiment of the present invention, the individual top views of the plurality of images are first generated by the image converting unit 32, and then a single top view is synthesized by the image synthesizing unit 33. However, the present invention is not limited to this: the plurality of images may be synthesized first and then converted into one top view, or various other schemes for generating one top view may be used.

Each of the plurality of images captured by each camera 10 is converted into a top view, and then each image is moved along the X and Y axes, zoomed in or out, and panned, tilted, and rotated about the X, Y, and Z axes so as to be synthesized into one top view. The six parameters required to represent the degree of rotation and movement of each image are referred to as image control parameters. These image control parameters may be stored in the storage unit (not shown) included in the electronic control unit (ECU) 20.
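The six per-camera values can be grouped into a simple record like the following; the field names are hypothetical labels for the movements named in the text (X/Y translation, zoom, and pan/tilt/rotate about the three axes):

```python
from dataclasses import dataclass, asdict

@dataclass
class ImageControlParameters:
    """Six parameters describing how one camera's individual top
    view is moved and rotated before synthesis."""
    x: float = 0.0       # translation along the X axis
    y: float = 0.0       # translation along the Y axis
    zoom: float = 1.0    # zoom in / zoom out
    pan: float = 0.0     # rotation about one axis, in degrees
    tilt: float = 0.0    # rotation about a second axis, in degrees
    rotate: float = 0.0  # rotation about the third axis, in degrees
```

One such record per camera would be what the ECU's storage unit persists and what the manual correction mode later modifies.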

The input unit 34 receives a command signal so that the user can change the image control parameters. If the user checks the synthesized top view and finds a mismatch, a command for correcting the mismatch can be input through the input unit 34. When the correction command signal is applied, the image control parameter is changed and the mismatch of the synthesized top view is corrected. Details are described later.

The display device 40 may include a display panel 41 for displaying the restored image and the synthesized top view image.

The display panel 41 displays the restored images and the top view generated by converting and synthesizing the images under the control of the electronic control unit (ECU) 20, so that the surroundings of the vehicle can be monitored. The display panel 41 may be implemented with various types such as an LCD, LED, or OLED panel, and may also have a touch panel function.

The electronic control unit (ECU) 20 can be connected to the vehicle-mounted sensors and systems via the network communication 50. The vehicle's sensors and systems include an accelerator sensor 51, a brake system 52, a wheel speed sensor 53, an illuminance sensor 54, a lamp control system 55, and the like.

The accelerator sensor 51 measures the pressure on the accelerator pedal in order to adjust the engine RPM. The wheel speed sensor 53 detects the amount of rotation of a vehicle wheel or the number of revolutions per unit time, and is configured using a rotation-detecting device or the like. The brake sensor 522 detects the operation amount of the brake pedal and is connected to the network communication via the brake system 52. The shift lever switch 56 is a sensor or switch that detects the position of the shift lever and is configured using a displacement sensor or the like. The illuminance sensor 54 is mounted outside the vehicle and judges day and night by measuring the amount of received light; it is constructed using a photovoltaic cell or a phototube.

The brake system 52 is an electric braking system having a brake assist that increases the braking force using the driving unit 521, and an anti-lock brake system (ABS) that prevents the brakes from locking. The lamp control system 55 adjusts the light output of the vehicle's lamps according to the illuminance sensor 54 or the user's control input. The external system 58 is an optional system for connecting an inspection system or an adjustment system used during the production, inspection, and maintenance of the vehicle through an external connector or the like; it is not always mounted on the vehicle and can be detachably attached. When the user checks the synthesized top view through the display device 40 and finds a mismatch, a command for correcting the mismatch can be input through the input unit 34. The input unit 34 may be formed separately so as to transmit the user's command signal directly to the top view generation unit 30, but a separate module such as a smartphone, notebook, or tablet may also be connected so that the correction command signal for the synthesized top view can be received.

The power steering system 57 measures the torque acting on the steering wheel using a torque sensor 572 and may be an electric power steering (EPS) system that provides convenience by adding an assist torque to the steering operation via the steering driving unit 571.

The sensors and systems described above are exemplary and their connection forms are also exemplary. Accordingly, the present invention is not limited thereto and may be formed in various configurations or connection forms.

According to an embodiment of the present invention, the connection method of the network communication 50 is preferably CAN (Controller Area Network), which is widely used recently, but the present invention is not limited thereto, and a LIN (Local Interconnect Network) or the like may also be used.

FIG. 3 is a flowchart illustrating a method of monitoring a peripheral image according to an exemplary embodiment of the present invention.

In order to perform the peripheral image monitoring method according to an exemplary embodiment of the present invention, a mismatch in the synthesized top view must first be found through prior verification, as shown in FIG. 3. If the user checks the synthesized top view and finds a mismatch, a command for correcting the mismatch can be input through the input unit 34. When the correction command signal is applied, the image control parameter is changed and the mismatch of the synthesized top view is corrected. The input unit 34 may be formed separately so as to transmit the user's command signal directly to the top view generation unit 30, but a separate module such as a smartphone, notebook, or tablet may also be connected so that the correction command signal for the synthesized top view can be received. In the following, it is assumed that a notebook is connected to the vehicle through the external system 58 and serves as the input unit 34.

First, the notebook is connected to the vehicle through the external system 58 (S301), and the manual correction mode for manually changing the current image control parameters is entered (S302). A storage unit (not shown) included in the electronic control unit (ECU) stores not only the image control parameters but also vehicle information and the version information of the parameters. Such information may be stored not only inside the electronic control unit (ECU) but also at other locations as backup data: it may be stored in the connected notebook or stored separately in the display device 40. However, the present invention is not limited thereto; the backup data may be stored in various locations, or may not exist at all.
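The backup arrangement described above can be sketched roughly as follows; every name here (record fields, camera keys) is an illustrative assumption, not something specified in the patent:

```python
# Hedged sketch: the ECU's storage holds the image control parameters together
# with vehicle information and a parameter version, and the same record may be
# mirrored in the connected notebook or the display device as backup data.
from dataclasses import dataclass, field

@dataclass
class BackupRecord:
    vehicle_info: str                 # identifies the vehicle (e.g. a VIN); assumed field
    parameter_version: int            # incremented each time parameters are stored
    params: dict = field(default_factory=dict)  # camera id -> image control parameters

# The ECU's record and a mirror kept in the display device as backup data.
ecu_record = BackupRecord("VIN-0001", 3, {"front": {"x": 0.0, "y": 0.0}})
display_backup = BackupRecord("VIN-0001", 3, dict(ecu_record.params))
```

The version field is what later allows the most recent parameter set to be identified when several backups exist.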

If backup data is present in the display device 40, it is checked whether the vehicle information is consistent (S303). If the vehicle information does not match, either the electronic control unit (ECU) 20 or the display device 40 has been replaced with a used part, and it cannot be determined automatically which part was replaced. Accordingly, a warning message is displayed on the display panel 41 so that the user can select which vehicle information correctly identifies the vehicle. When the vehicle information indicating the vehicle is selected, the accurate vehicle information is stored on both sides through synchronization (S304). However, if no backup data exists, steps S303 and S304 may be omitted.
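The consistency check and synchronization of steps S303 and S304 can be sketched as follows, with the user's selection abstracted into a callback; the function name and signature are assumptions for illustration:

```python
# Hedged sketch of steps S303-S304: if the ECU's vehicle information disagrees
# with the display device's backup, one of the two units was swapped with a
# used part, and only the user can tell which record is correct.
def synchronize_vehicle_info(ecu_info: str, backup_info: str, choose) -> str:
    """Return the agreed vehicle information, asking the user on mismatch."""
    if ecu_info == backup_info:
        return ecu_info                     # S303: information matches, nothing to do
    # Mismatch: show a warning and let the user select the correct record (S304).
    return choose(ecu_info, backup_info)

# Usage: the user confirms the ECU record identifies the vehicle.
agreed = synchronize_vehicle_info("VIN-A", "VIN-B", lambda ecu, backup: ecu)
```

After this call, the agreed value would be written back to both the ECU and the backup location so they stay synchronized.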

If the vehicle information matches, the user selects the camera corresponding to the individual top view in which the mismatch occurs among the plurality of cameras 10a, 10b, 10c, and 10d (S305), and the image control parameter is modified to correct the individual top view corresponding to the selected camera (S306). When the camera is selected, the current image control parameters corresponding to its top view are loaded from the storage unit (not shown). The user can then view and compare the loaded image control parameters in order to correct the part where the mismatch occurs, and input new image control parameters (S306). The method of modifying the image control parameters and the modified results are described in detail later.

After the image control parameter is modified, a new synthesized top view reflecting the corrected individual top view can be confirmed through the display device (S307). Steps S306 and S307 may be performed in sequence, or may be performed simultaneously.

When steps S306 and S307 are performed concurrently, the synthesized top view is displayed on the display device while the user modifies the image control parameters. When an image control parameter is modified, the individual top view corresponding to the modified parameter is rotated and moved in real time within the synthesized top view. Since the user can check the rotation and movement according to the image control parameter in real time, the individual top view can be corrected more easily.
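The concurrent behavior of steps S306 and S307 amounts to re-rendering the composite view on every parameter change; a minimal sketch, with the actual rendering pipeline reduced to a stand-in callback:

```python
# Hedged sketch: each edit of one image control parameter immediately
# re-renders the affected individual top view inside the composite top view.
# The render callback stands in for the real image conversion/synthesis path.
def live_edit(params, camera, key, delta, render):
    params[camera][key] += delta        # user nudges one image control parameter
    return render(params)               # composite top view refreshed in real time

# Usage: nudging the front camera's x value triggers an immediate re-render.
views = {"front": {"x": 0.0, "rotation": 0.0}}
frame = live_edit(views, "front", "x", 2.5, lambda p: dict(p))
```

In the real apparatus, the render callback would correspond to the image conversion unit regenerating the front individual top view and the image synthesizer recomposing the single composite view.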

The new image control parameters may be input while viewing the currently loaded values, but it is also possible, even without loading the current values, to use movement buttons that simply move the image in the x-axis or y-axis direction, so that the image can be corrected more easily. This depends on the type of input unit 34 through which the user applies the correction command signal; the present invention is not limited thereto, and the user's image correction command signal can be applied in various ways.

Steps S306 and S307 are repeated until the image correction is completed. When the image correction is completed (S308), it is determined whether another individual top view should be corrected. If there is another individual top view to be corrected, the user again selects the camera corresponding to that individual top view and repeats steps S306 and S307. If there is no individual top view left to correct, the single composite top view displayed on the display device is a top view with no mismatch. Accordingly, the image control parameters after the tolerance correction is completed are stored (S310). When the modified image control parameters are stored, the parameter version is increased. The parameter version indicates the update step or sequence of the parameters by a number or symbol; generally, the number indicating the version becomes larger with each update, so comparing the numerical values of the parameter versions shows which is the more recent update. However, the present invention is not limited thereto, and the update steps can be represented in various ways. If backup data exists, there may be a plurality of image control parameter sets; in this case, the most recently updated data can be determined through the parameter version.
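Determining the most recently updated data by comparing numeric parameter versions can be sketched as follows; the record layout is an assumption for illustration:

```python
# Hedged sketch: when several backups of the image control parameters exist,
# the highest numeric parameter version identifies the latest update.
def latest_backup(records):
    """Pick the record with the highest parameter version."""
    return max(records, key=lambda r: r["version"])

# Usage: three copies exist (ECU, display device, notebook); the display
# device holds the most recently stored set.
backups = [{"version": 2, "src": "ecu"},
           {"version": 3, "src": "display"},
           {"version": 1, "src": "notebook"}]
```

As the text notes, a non-numeric versioning scheme would need its own ordering rule instead of plain numeric comparison.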

Then, the manual correction mode is released to prevent the image control parameters from being changed further (S311), and the tolerance correction process is terminated by terminating the network connection with the vehicle (S312). As described above, when the input unit 34 is connected to the vehicle through the external system 58, as with the notebook serving as the input unit 34, the network connection must be terminated. However, if the tolerance correction has been performed through an input unit 34 mounted in the vehicle, step S312 may be omitted.

FIG. 4 is a schematic view showing the rotation and moving directions of the top view photographed toward the front according to the correction of the image control parameters according to an exemplary embodiment of the present invention. FIG. 5 is a schematic view showing the change of the top view photographed toward the front before and after the correction of the image control parameters.

In step S305, for example, as shown in FIG. 4, the first camera 10a capturing the image toward the front is selected. In this case, the individual top view to be corrected is the top view of the front area of the vehicle. The current image control parameters corresponding to this top view are then loaded from the storage unit (not shown). The user can view and compare the loaded image control parameters in order to correct the part where the mismatch occurs, and input new image control parameters (S306). When an image control parameter is modified, the corresponding individual top view is rotated and moved in real time within the synthesized top view, so the user can confirm the result immediately and correct the individual top view more easily.

As described above, the parameters expressing the degree of rotation and movement of each image are referred to as image control parameters. They include an x-axis coordinate value 61, a y-axis coordinate value 62, a zoom value 63 (z-axis coordinate value), a pan angle value 64, a tilt angle value 65, and a rotation angle value 66. To rotate and move an individual top view, these six parameters are adjusted to move the image along the x and y axes, zoom it in or out, and pan, tilt, and rotate it about the y, x, and z axes, respectively. The reference for the rotation and movement by the image control parameters is not the image captured by the camera but the image converted into the individual top view by the image conversion unit.
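The six image control parameters can be gathered into a single structure, sketched below; the field names mirror the description, while the structure itself is an illustrative assumption rather than part of the patent:

```python
# Hedged sketch of the six image control parameters (reference numerals from
# the description in parentheses). Defaults represent an uncorrected view.
from dataclasses import dataclass

@dataclass
class ImageControlParams:
    x: float = 0.0         # x-axis coordinate value (61): left/right translation
    y: float = 0.0         # y-axis coordinate value (62): up/down translation
    zoom: float = 1.0      # zoom value (63): enlargement about the view center
    pan: float = 0.0       # pan angle (64): rotation about the y axis
    tilt: float = 0.0      # tilt angle (65): rotation about the x axis
    rotation: float = 0.0  # rotation angle (66): in-plane rotation about the z axis

front = ImageControlParams()
```

One such parameter set would be stored per camera, i.e. per individual top view.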

The x-axis coordinate value 61 is a coordinate value indicating the x-axis position in the area converted into the individual top view, as shown in FIG. 4. Therefore, when the x-axis coordinate value 61 is changed, the individual top view moves linearly in the horizontal direction as seen by the user. As shown in FIG. 5, if the left-to-right direction is the + direction, the image moves linearly to the right when the x-axis coordinate value 61 increases, and linearly to the left when it decreases.

The y-axis coordinate value 62 is a coordinate value indicating the y-axis position in the area converted into the individual top view, as shown in FIG. 4. Therefore, when the y-axis coordinate value 62 is changed, the individual top view moves linearly in the vertical direction as seen by the user. As shown in FIG. 5, the image moves linearly upward when the y-axis coordinate value 62 increases, and linearly downward when it decreases.

The zoom value 63 is a value indicating the degree of enlargement or reduction of the image, as shown in FIG. 4. Accordingly, when the zoom value 63 is changed, the image is enlarged or reduced about the center of the individual top view. As shown in FIG. 5, if enlargement is the + direction, the image is enlarged when the zoom value 63 increases and reduced when it decreases.

The pan angle value 64 is an angle value indicating the degree of rotation of the image about the y-axis in the area converted into the individual top view, as shown in FIG. 4. Therefore, when the pan angle value 64 is changed, the individual top view rotates to the left and right as seen by the user. As shown in FIG. 5, if the right-to-left direction is the + direction, the image rotates such that the right side is enlarged and the left side is reduced when the pan angle value 64 increases, and such that the right side is reduced and the left side is enlarged when it decreases.

The tilt angle value 65 is an angle value indicating the degree of rotation of the image about the x-axis in the area converted into the individual top view. Accordingly, when the tilt angle value 65 is changed, the individual top view rotates in the vertical direction as seen by the user. As shown in FIG. 5, if the upward direction from the lower side is the + direction, the image rotates such that the lower side is enlarged and the upper side is reduced when the tilt angle value 65 increases, and such that the lower side is reduced and the upper side is enlarged when it decreases.

The rotation angle value 66 is an angle value indicating the degree of rotation of the image about the z-axis in the area converted into the individual top view. Accordingly, when the rotation angle value 66 is changed, the individual top view rotates clockwise or counterclockwise about its center as seen by the user. As shown in FIG. 5, if the counterclockwise direction is the + direction, the image rotates counterclockwise when the rotation angle value 66 increases, and clockwise when it decreases.

By adjusting the six parameters as described above, the user can rotate and move each individual top view and correct the mismatch of the single synthesized top view. In FIG. 5, an area in which a pattern diagram is located is photographed for ease of understanding; however, since the peripheral image monitoring apparatus and method according to an embodiment of the present invention correct the image manually, the tolerance can be corrected as long as any shape such as a pattern diagram, a lane, or a rod can be captured by the camera. In addition, since the user's image control parameter correction input is directly reflected in the single synthesized top view, the tolerance correction can be performed quickly and easily.
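How adjusting the parameters moves a point of an individual top view can be sketched for the in-plane parameters (x, y, zoom, rotation); pan and tilt would require a full perspective homography and are omitted here for brevity:

```python
# Hedged sketch: apply in-plane image control parameters to one point of an
# individual top view. (px, py) is the point, (cx, cy) the view center.
# Rotation is counterclockwise-positive, matching the sign convention above.
import math

def apply_params(px, py, cx, cy, x, y, zoom, rotation_deg):
    # Rotate about the view center (counterclockwise for positive angles).
    a = math.radians(rotation_deg)
    rx = cx + (px - cx) * math.cos(a) - (py - cy) * math.sin(a)
    ry = cy + (px - cx) * math.sin(a) + (py - cy) * math.cos(a)
    # Scale about the center (zoom), then translate by the x/y coordinates.
    return cx + (rx - cx) * zoom + x, cy + (ry - cy) * zoom + y

# Usage: rotate (1, 0) by 90 degrees about the origin, then zoom by 2.
sx, sy = apply_params(1.0, 0.0, 0.0, 0.0, x=0.0, y=0.0, zoom=2.0, rotation_deg=90.0)
```

The order of operations here (rotate, zoom, translate) is one possible convention; the patent does not fix an order, only the effect of each parameter.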

It will be understood by those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. It is therefore to be understood that the above-described embodiments are illustrative in all aspects and not restrictive. The scope of the present invention is defined by the appended claims rather than by the detailed description, and all changes or modifications derived from the meaning and scope of the claims and their equivalents are to be construed as being included within the scope of the present invention.

1: peripheral image monitoring device 10: camera
10a: first camera 10b: second camera
10c: third camera 10d: fourth camera
20: electronic control unit (ECU) 31: image processing unit
32: image conversion unit 33:
40: display device 41: display panel
50: Network communication

Claims (5)

A plurality of cameras for photographing a peripheral image of the vehicle;
An electronic control unit (ECU) for controlling the processing of the photographed image and controlling driving of the vehicle;
A top view generation unit including an image converter for converting the images taken by the plurality of cameras into individual top views, and an image synthesizer for generating a composite image or a composite top view based on the photographed images or the converted individual top views;
A display device for displaying the generated top view; And
And an input unit for receiving a user's input signal so that an image photographed by the plurality of cameras rotates and moves,
When the input signal is applied to the input unit,
When the image control parameter is displayed on the display device and the image control parameter is modified and input, the modified image control parameter is displayed together on the display device,
Wherein the captured image or the transformed individual top view is rotated and moved in real time within the composite image or the composite top view to correspond to the modified image control parameter and displayed on the display device.
The apparatus according to claim 1,
Wherein the image synthesizer comprises:
Wherein the image control parameter is changed when the input signal is applied to the input unit, all or part of the plurality of individual top views are rotated and moved by the changed image control parameter, and an image in which all or part of the individual top views are rotated and moved is displayed in real time on the display device.
The apparatus according to claim 1,
Wherein the image control parameter comprises:
An x-axis coordinate value indicating the x-axis position in the top view converted area;
A y-axis coordinate value indicating a y-axis position in the top view converted area;
A zoom value indicating an extent of enlargement or reduction of the top view;
A pan angle value indicating a rotation angle in which the top view is tilted to the left and right;
A tilt angle value indicating a rotation angle in which the top view is tilted up and down; And
A rotation angle value indicating a rotation angle in which the top view is rotated clockwise or counterclockwise.
Generating individual top views and composite top views based on image control parameters with a plurality of vehicle surroundings images photographed by a plurality of cameras;
Displaying the generated top view on a display device;
Selecting a camera to be subjected to tolerance correction among the plurality of cameras;
Inputting a change value of the image control parameter through an input unit;
Displaying the change value of the image control parameter on the display device;
Rotating and moving the photographed image or the individual top view in real time within the composite image or the composite top view so as to correspond to the changed image control parameter, and displaying it on the display device; And
And storing the modified image control parameter when the tolerance correction is completed.
The method according to claim 4,
Wherein the image control parameter comprises:
An x-axis coordinate value indicating the x-axis position in the top view converted area;
A y-axis coordinate value indicating a y-axis position in the top view converted area;
A zoom value indicating an extent of enlargement or reduction of the top view;
A pan angle value indicating a rotation angle in which the top view is tilted to the left and right;
A tilt angle value indicating a rotation angle in which the top view is tilted up and down; And
A rotation angle value indicating a rotation angle in which the top view is rotated clockwise or counterclockwise.
KR1020150184443A 2015-12-23 2015-12-23 The Apparatus And Method For Around View Monitoring KR101757201B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150184443A KR101757201B1 (en) 2015-12-23 2015-12-23 The Apparatus And Method For Around View Monitoring


Publications (2)

Publication Number Publication Date
KR20170075142A KR20170075142A (en) 2017-07-03
KR101757201B1 true KR101757201B1 (en) 2017-07-12

Family

ID=59352840


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021141143A1 (en) * 2020-01-06 2021-07-15 엘지전자 주식회사 Route provision device and route provision method therefor
KR102383086B1 (en) * 2021-09-10 2022-04-08 제이씨현오토 주식회사 Calibration system and method for 3d surround view monitoring by camera synthesis

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101579100B1 (en) * 2014-06-10 2015-12-22 엘지전자 주식회사 Apparatus for providing around view and Vehicle including the same




Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant