CN111133439A - Panoramic monitoring system - Google Patents

Panoramic monitoring system

Info

Publication number
CN111133439A
CN111133439A
Authority
CN
China
Prior art keywords
lane
image
panoramic
vehicle
recognition
Prior art date
Legal status
Granted
Application number
CN201880057393.6A
Other languages
Chinese (zh)
Other versions
CN111133439B (en)
Inventor
李定俊
文俊赫
金时郁
Current Assignee
Chemtronics Co ltd
Original Assignee
Chemtronics Co ltd
Priority date
Filing date
Publication date
Application filed by Chemtronics Co ltd
Publication of CN111133439A
Application granted
Publication of CN111133439B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588 - Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 - Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/18 - Propelling the vehicle
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 - Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 - Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 - Interaction between the driver and the control system
    • B60W50/14 - Means for informing the driver, warning the driver or prompting a driver intervention
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/40 - Image enhancement or restoration using histogram techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/80 - Geometric correction
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/70 - Determining position or orientation of objects or cameras
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 - Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/18 - Status alarms
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57 - Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30248 - Vehicle exterior or interior
    • G06T2207/30252 - Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256 - Lane; Road marking

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Automation & Control Theory (AREA)
  • Signal Processing (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Mathematical Physics (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Geometry (AREA)

Abstract

The present invention relates to a panoramic monitoring system. Specifically, the panoramic monitoring system of the present invention includes: a plurality of cameras installed at the front, rear, left, and right of the vehicle, respectively, to capture images of the vehicle's surroundings; a first lane change recognition module that discriminates whether a lane change is made using the front image among the surrounding images captured by the plurality of cameras; a second lane change recognition module that discriminates whether a lane change is made using a panoramic image formed from the surrounding images captured by the plurality of cameras; and a result integrating module that integrates the lane change determinations made by the first and second lane change recognition modules to generate a final result.

Description

Panoramic monitoring system
Technical Field
The present invention relates to panoramic monitoring systems.
Background
Nowadays, with the widespread use of vehicles, accidents caused by drowsy driving and driver inattention, particularly collisions between vehicles during lane changes, have increased.
Further, to prevent such accidents, a lane departure warning system (LDWS), which uses a camera to warn the driver of lane departure, has been developed; one cause of the increase in collision accidents is that a driver cannot fully grasp the constantly changing situation around the vehicle.
Further, for situations in which the front, rear, left, and right of the vehicle cannot be checked at a glance, such as when parking, a panoramic monitoring system (around view monitoring system) has been developed that converts the situation around the vehicle into an image seen from above (i.e., a top view) and provides it to the driver in order to prevent collision accidents.
However, the lane departure warning system and the panoramic monitoring system serve different purposes, and therefore differ in the configuration of their camera lenses. Consequently, if both systems are installed in a vehicle, each with its own cameras, the number of cameras increases, which is problematic; and as the number of cameras increases, so does the cost. Furthermore, if all the images captured by the cameras of both systems must be processed, a high-performance processor (such as a CPU) is required, which is also a problem.
To solve the above problems, schemes have been studied for discriminating whether a lane change is made by using only a panoramic monitoring system.
However, since the cameras of a panoramic monitoring system use wide-angle lenses, the captured images may be distorted. Whether a lane change is made is therefore determined by correcting the distortion, performing image processing, and extracting the lane positions.
However, since an image acquired through a wide-angle lens compresses a wide field of view into a small image area, the density of objects shown in the image is high, and the resolution available for correcting the distortion is low, which is a problem.
Further, unlike the camera of a lane departure warning system, which is mounted near the rear-view mirror, the front camera of a panoramic monitoring system is mounted low on the outside of the vehicle near the license plate. Because of this low mounting height, the difference in apparent size between distant and nearby lane markings becomes large, which is a problem.
Further, when the vehicle travels at high speed, the lane markings recede too far in the image captured by the camera at the front end of the vehicle, so cases where the lane cannot be recognized occur frequently, which is also a problem.
Further, the lane recognition rate can be improved by additionally using the cameras installed on the left and right sides of the vehicle (i.e., the left and right cameras), but the increased image-processing load then requires a higher-performance processor, or it may become impossible to process every image frame, which is a problem.
Disclosure of Invention
Technical problem
An aspect of the present invention is to provide a panoramic monitoring system that improves the lane recognition rate while minimizing the increase in image-processing load.
Technical scheme
In an embodiment of the present invention for solving the above technical problem, a panoramic monitoring system includes: a plurality of cameras installed at the front, rear, left, and right of the vehicle, respectively, to capture images of the vehicle's surroundings; a first lane change recognition module that discriminates whether a lane change is made using the front image among the surrounding images captured by the plurality of cameras; a second lane change recognition module that discriminates whether a lane change is made using a panoramic image generated from the surrounding images captured by the plurality of cameras; and a result integrating module that integrates the lane change determinations made by the first and second lane change recognition modules to generate a final result.
The first lane change recognition module includes: a front image selection unit that selects a front image from the surrounding images of the vehicle captured by the plurality of cameras; a front image processing unit that converts the front image selected by the front image selection unit into an image for lane recognition; a front image lane recognition unit that detects a straight line in the lane recognition image converted by the front image processing unit and specifies a position of a straight line recognized as a lane in the detected straight line; and a front image lane change recognition unit which receives information on the position of the specified straight line from the front image lane recognition unit for each frame and discriminates whether the lane is changed or not based on the received information.
The front image processing unit performs at least one of a lens distortion correction operation, a resizing operation, a histogram (histogram) correction operation, and an Edge (Edge) extraction operation on the front image selected by the front image selecting unit, and then converts the front image into an image for lane recognition.
The front image lane recognition unit performs Hough (Hough) conversion on the lane recognition image converted by the front image processing unit to detect a straight line, determines whether the detected straight line corresponds to a lane by using a first lane recognition filter, specifies the position of the straight line recognized as the lane in the detected straight line, eliminates noise included in the specified straight line position by using a first Kalman (Kalman) filter, and provides information on the specified straight line position to the front image lane change recognition unit.
The second lane-change identification module includes: a panoramic image generation unit that forms a panoramic image based on images of the surroundings of the vehicle captured by the plurality of cameras, respectively; a panoramic image processing unit that converts the panoramic image formed by the panoramic image generation unit into an image for lane recognition; a panoramic image lane recognition unit that detects straight lines from the lane recognition images converted by the panoramic image processing unit and identifies the positions of the straight lines identified as lanes from among the detected straight lines; and a panoramic image lane change recognition unit that receives information on the specified straight line position from the panoramic image lane recognition unit for each frame, and extracts a change in at least one of a lane distance, a lane width, and a lane position based on the received information to thereby determine whether or not to change the lane.
The panoramic image generation unit forms a panoramic image by forming panoramic image integrated information based on the images of the surroundings of the vehicle captured by the plurality of cameras, and reflecting the panoramic image integrated information on the images of the surroundings of the vehicle captured by the plurality of cameras.
The panoramic image processing unit performs at least one of a resizing operation, a denoising operation, and an edge extraction operation on the panoramic image selected by the panoramic image selection unit, and then converts the panoramic image into an image for lane recognition.
The panoramic image lane recognition unit detects a straight line by performing Hough (Hough) conversion on the lane recognition image converted by the panoramic image processing unit, determines whether the detected straight line corresponds to a lane by using a second lane recognition filter, specifies the position of the straight line recognized as the lane in the detected straight line, eliminates noise included in the specified straight line position by using a second Kalman filter, and provides information on the specified straight line position to the panoramic image lane change recognition unit.
The panoramic monitoring system further comprises a warning identification module that generates warning information based on the final result formed by the result integrating module, and the warning identification module outputs the warning information to the driver when a lane change is made.
The plurality of cameras includes: a first camera installed at a front end of a vehicle for photographing a front image of the vehicle; a second camera installed at a left side of the vehicle for photographing a left side image of the vehicle; a third camera installed at a right side of the vehicle for photographing a right side image of the vehicle; and a fourth camera installed at a rear end of the vehicle for photographing a rear image of the vehicle.
Effects of the invention
According to the present invention, the lane change recognition result for the front image and the lane change recognition result for the panoramic image are integrated to make the final determination of whether a lane change is made, which improves the lane recognition rate while minimizing the increase in image-processing load. Because the lane recognition rate is improved without increasing the image-processing load, a costly high-performance processor is not required. Moreover, lane departure can be discriminated without a separate lane departure warning system, reducing both the number of cameras mounted on the vehicle and the installation cost.
Drawings
FIG. 1 is a block diagram illustrating a panoramic surveillance system in relation to an embodiment of the present invention.
Fig. 2 is a block diagram illustrating a first lane change identification module with respect to fig. 1.
Fig. 3 is a block diagram illustrating a second lane-change identification module with respect to fig. 1.
Fig. 4 is a schematic diagram illustrating a panoramic image formed by the panoramic image generator of fig. 3.
Fig. 5 is a diagram illustrating a lane departure warning situation with respect to the panoramic monitoring system of fig. 1.
Detailed Description
To make the invention easier to understand, specific terms are defined below. Unless otherwise explicitly specified herein, scientific and technical terms used in the present invention have the meanings commonly understood by those of ordinary skill in the relevant art. Furthermore, unless explicitly specified otherwise, terms in the singular include the plural, and terms in the plural include the singular.
Hereinafter, a panoramic monitoring system according to an embodiment of the present invention will now be described in detail with reference to fig. 1 to 5.
Fig. 1 is a block diagram illustrating a panoramic monitoring system (Surround view monitoring system) according to an embodiment of the present invention. Fig. 2 is a block diagram illustrating a first lane change identification module with respect to fig. 1. Fig. 3 is a block diagram illustrating a second lane-change identification module with respect to fig. 1. Fig. 4 is a schematic diagram illustrating a panoramic image formed by the panoramic image generator of fig. 3. Fig. 5 is a diagram illustrating a lane departure warning situation with respect to the panoramic monitoring system of fig. 1.
First, referring to fig. 1, a panoramic monitoring system 1 according to an embodiment of the present invention may be installed in a vehicle, which includes a plurality of cameras 100, a first lane change recognition module 200, a second lane change recognition module 300, a result integration module 400, and a warning identification module 500.
The plurality of cameras 100 may be installed in front, rear, left, and right directions of the vehicle, respectively, and capture a surrounding image of the vehicle.
More specifically, the plurality of cameras 100 includes a first camera 100a installed at a front end of the vehicle and capturing a front image of the vehicle, a second camera 100b installed at a left side of the vehicle and capturing a left image of the vehicle, a third camera 100c installed at a right side of the vehicle and capturing a right image of the vehicle, and a fourth camera 100d installed at a rear end of the vehicle and capturing a rear image of the vehicle.
Further, the surrounding images of the vehicle respectively captured by the plurality of cameras 100 may be provided to the first lane-change recognition module 200 and the second lane-change recognition module 300.
For reference, a wide-angle lens may be used in each of the plurality of cameras 100, but the invention is not limited thereto.
Next, referring to fig. 2, the first lane change recognition module 200 recognizes whether a lane change is made by using front images among surrounding images of the vehicle captured by the plurality of cameras 100, respectively.
Specifically, the first lane change recognition module 200 includes a front image selection part 210, a front image processing part 230, a front image lane recognition part 250, and a front image lane change recognition part 270.
The front image selecting unit 210 can select a front image from the surrounding images of the vehicle captured by the plurality of cameras 100. Further, the front image selecting section 210 may supply the selected front image to the front image processing section 230.
The front image processing unit 230 may convert the front image selected by the front image selecting unit 210 into an image for lane recognition.
Specifically, the front image processing unit 230 may receive the front image from the front image selecting unit 210, perform at least one of a lens distortion correction operation, a noise removal operation, a resizing operation, a histogram correction operation, and an edge extraction operation on the received front image, and convert the result into an image for lane recognition.
Here, if the first camera 100a for capturing the image of the front of the vehicle uses a wide-angle lens, the captured image may be distorted, and thus the lens distortion correction operation may be performed.
Here, a histogram is, for example, a graph that classifies the brightness levels (i.e., the exposure) of an image captured by a camera and shows how many pixels occur at each brightness level; the brightness of an image corresponds to the amount of light reaching the camera's image sensor. Edge extraction is performed to find the line segments, and their intersection points, in the image captured by the camera.
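The histogram correction and edge extraction operations described above can be sketched as follows. This is a simplified NumPy illustration, not the patented implementation: the gradient-based edge operator and the threshold value are assumptions, and a real system would typically use library routines such as histogram equalization and Canny edge detection.

```python
import numpy as np

def equalize_histogram(gray):
    """Spread pixel intensities so the brightness histogram is roughly flat."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()
    # Map each intensity through the normalized cumulative distribution.
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255).astype(np.uint8)
    return lut[gray]

def extract_edges(gray, threshold=50):
    """Crude gradient-magnitude edge map (a stand-in for Canny edge extraction)."""
    g = gray.astype(np.int32)
    gx = np.zeros_like(g)
    gy = np.zeros_like(g)
    gx[:, 1:-1] = g[:, 2:] - g[:, :-2]   # horizontal brightness change
    gy[1:-1, :] = g[2:, :] - g[:-2, :]   # vertical brightness change
    mag = np.sqrt(gx ** 2 + gy ** 2)
    return (mag > threshold).astype(np.uint8)
```

The equalization stretches a dimly or unevenly exposed front image so that lane markings contrast with the road surface before the edge map is computed.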
In addition, the front image processing part 230 may supply the converted lane recognition image to the front image lane recognition part 250.
The front image lane recognition unit 250 detects a straight line from the lane recognition image converted by the front image processing unit 230, and specifies the position of the straight line recognized as the lane from among the detected straight lines.
Specifically, the front image lane recognition unit 250 detects a straight line by performing Hough transformation (Hough transformation) on the lane recognition image converted by the front image processing unit 230, and determines whether or not the detected straight line corresponds to a lane (i.e., an actual lane) by using a first lane recognition filter (not shown). The front-image lane recognition unit 250 specifies the position of a straight line recognized as a lane among the detected straight lines, eliminates noise included in the specified straight line position by a first kalman filter (not shown), and supplies information on the specified straight line position to the front-image lane change recognition unit 270.
For reference, the front image lane recognition unit 250 may use a line detection method other than the Hough transform; however, in this embodiment, for convenience of description, the case where the front image lane recognition unit 250 detects straight lines by the Hough transform is described as an example.
To determine whether a detected straight line corresponds to a lane (i.e., an actual lane), the front image lane recognition unit 250 first specifies the lane vanishing point and divides the detected straight lines, about that vanishing point, into left straight lines and right straight lines. Next, the front image lane recognition unit 250 may retain the left straight lines that lie within a predetermined angle about the vanishing point, and the right straight lines that lie within a predetermined angle about the vanishing point. Among the straight lines detected through this process, the pair whose spacing has the smallest error relative to the preset lane width is recognized as the actual lane.
The actual lane recognition method may be performed by using a first lane recognition filter.
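The first lane recognition filter's logic can be sketched as follows. The representation of a line as (angle in degrees, x-position at the image bottom), the nominal ±45° orientations of left and right lane markings in a perspective front view, and the angle window and lane width in pixels are all hypothetical values, not figures from the patent.

```python
def pick_lane_pair(lines, vp_x, max_angle_dev=30.0, lane_width=350.0):
    """Split detected lines around the vanishing point x-coordinate, keep those
    within an angle window of the nominal lane orientation, and return the
    left/right pair whose spacing best matches the preset lane width (pixels)."""
    left = [l for l in lines if l[1] < vp_x and abs(l[0] - (-45.0)) <= max_angle_dev]
    right = [l for l in lines if l[1] >= vp_x and abs(l[0] - 45.0) <= max_angle_dev]
    best, best_err = None, float("inf")
    for lf in left:
        for rt in right:
            err = abs((rt[1] - lf[1]) - lane_width)  # spacing vs. preset width
            if err < best_err:
                best, best_err = (lf, rt), err
    return best
```

Comparing the spacing of candidate pairs against a preset lane width rejects spurious lines such as curbs, cracks, and vehicle edges.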
In addition, the front image lane recognition unit 250 may use a filter other than the Kalman filter to eliminate the noise included in the specified straight-line position; however, in this embodiment, for convenience of description, the case where a Kalman filter is used is described as an example.
Here, the first lane recognition filter and the first Kalman filter may, for example, be implemented as algorithms within the front image lane recognition unit 250.
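A minimal scalar Kalman filter of the kind described, smoothing a lane marking's x-position across frames, could look like this. The process and measurement noise values q and r are illustrative, not taken from the patent.

```python
class Kalman1D:
    """Minimal constant-position Kalman filter for smoothing a lane x-position."""
    def __init__(self, x0, p0=1.0, q=0.01, r=4.0):
        # x: state estimate, p: estimate variance, q/r: process/measurement noise.
        self.x, self.p, self.q, self.r = x0, p0, q, r

    def update(self, z):
        self.p += self.q                  # predict: uncertainty grows each frame
        k = self.p / (self.p + self.r)    # Kalman gain
        self.x += k * (z - self.x)        # correct toward the new measurement
        self.p *= (1.0 - k)
        return self.x
```

Because r is much larger than q, single-frame detection jitter is damped while genuine lane position drift still passes through.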
The front image lane change recognition unit 270 receives the information on the specified straight-line positions from the front image lane recognition unit 250 every frame, and discriminates whether a lane change is made based on the received information.
Specifically, the front image lane change recognition unit 270 stores, for each frame, the specified straight-line position information supplied from the front image lane recognition unit 250, and learns the trend of change in the specified straight-line positions (i.e., the lane positions) from the stored information, so that it can discriminate whether a lane change is made.
Of course, the front image lane change recognition unit 270 can also discriminate a lane change from the stored information by further learning trends such as the specified straight-line spacing (i.e., the lane spacing: the distance between the lane markings on either side of the vehicle) or the straight-line width (i.e., the lane width: the lateral width of the lane itself).
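The trend-based discrimination described above can be sketched with a simple rule: track the vehicle's offset from the lane center (normalized by lane width) frame by frame, and flag a lane change when the offset nears a marking and then flips sides as the tracked lane "jumps" to the adjacent one. The 0.5 threshold is a hypothetical value for illustration.

```python
def detect_lane_change(center_offsets, threshold=0.5):
    """center_offsets: per-frame offset of the vehicle from the lane center,
    normalized by lane width. A lane change is flagged when the offset exceeds
    the threshold and then changes sign (the tracked lane switched sides)."""
    crossed = False
    for prev, cur in zip(center_offsets, center_offsets[1:]):
        if abs(prev) > threshold and prev * cur < 0:
            crossed = True  # offset flipped sides after approaching the marking
    return crossed
```

Requiring both a large offset and a sign flip keeps ordinary in-lane weaving from triggering a false lane change.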
Further, the front image lane-change recognition part 270 may provide the recognition result to the result integration module (400 of fig. 1).
On the other hand, referring to fig. 3 and 4, the second lane-change recognition module 300 discriminates whether or not to change lanes by using panoramic images formed based on surrounding images of the vehicle respectively captured by a plurality of cameras.
Specifically, the second lane change recognition module 300 may include a panorama image generating part 310, a panorama image processing part 330, a panorama image lane recognizing part 350, and a panorama image lane change recognizing part 370.
The panoramic image generation unit 310 may form a panoramic image based on the surrounding images of the vehicle captured by the plurality of cameras 100, respectively.
Specifically, the panoramic image generation unit 310 generates integrated information on the panoramic images based on the surrounding images of the vehicle captured by the plurality of cameras 100, and reflects the integrated information on the panoramic images to the surrounding images of the vehicle captured by the plurality of cameras 100, thereby forming the panoramic image.
That is, as shown in fig. 4, the panoramic image generation unit 310 integrates the front image FI, the rear image RI, the right side image RSI, and the left side image LSI of the vehicle C captured by the plurality of cameras 100 to form the panoramic image SVI.
For reference, the panoramic image integrated information may include information for stitching the images captured by the plurality of cameras 100 according to tolerance parameter information, and the tolerance parameter information may include, for example, parameters for correcting the tolerances between the cameras caused by differences in the mounting positions and angles of the plurality of cameras 100.
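The compositing step of the panoramic image generation can be sketched as follows. In a real system each camera image is first warped to the top view using its calibration (the tolerance parameters above); this sketch assumes the four images are already warped and only shows how they are pasted around a blank vehicle footprint, so the layout and region sizes are illustrative.

```python
import numpy as np

def compose_surround_view(front, rear, left, right):
    """front/rear: (fh, W) and (rh, W) top-view strips spanning the full width;
    left/right: (mh, sw) side strips for the middle band. The center of the
    middle band, where the vehicle itself sits, is left blank (zeros)."""
    fh, W = front.shape
    rh = rear.shape[0]
    mh, sw = left.shape
    canvas = np.zeros((fh + mh + rh, W), dtype=front.dtype)
    canvas[:fh, :] = front                    # front image across the top
    canvas[fh + mh:, :] = rear                # rear image across the bottom
    canvas[fh:fh + mh, :sw] = left            # left strip of the middle band
    canvas[fh:fh + mh, W - sw:] = right       # right strip of the middle band
    return canvas
```

This corresponds to the layout of fig. 4, where the front image FI, rear image RI, left side image LSI, and right side image RSI surround the vehicle C in the panoramic image SVI.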
Further, the panoramic image generation section 310 may supply the formed panoramic image to the panoramic image processing section 330.
The panoramic image processing unit 330 may convert the panoramic image formed by the panoramic image generation unit 310 into an image for lane recognition.
Specifically, the panoramic image processing unit 330 receives the panoramic image from the panoramic image generation unit 310, and converts the panoramic image into an image for lane recognition after performing at least one of a lens distortion correction operation, a histogram correction operation, a resizing operation, a denoising operation, and an edge extraction operation on the received panoramic image.
Further, the panoramic image processing unit 330 may supply the converted image for lane recognition to the panoramic image lane recognition unit 350.
The panoramic image lane recognition unit 350 detects a straight line in the lane recognition image converted by the panoramic image processing unit 330, and specifies the position of the straight line identified as a lane among the detected straight lines.
Specifically, the panoramic image lane recognition unit 350 performs hough transform on the lane recognition image converted by the panoramic image processing unit 330 to detect a straight line, and determines whether or not the detected straight line corresponds to a lane (i.e., a real lane) by using a second lane recognition filter (not shown). The panoramic image lane recognition unit 350 identifies the position of a straight line identified as a lane among the detected straight lines, removes noise included in the identified straight line position by a second kalman filter (not shown), and then provides information on the identified straight line position to the panoramic image lane change recognition unit 370.
For reference, the panoramic image lane recognition unit 350 may use a line detection method other than the Hough transform; however, in this embodiment, for convenience of description, the case where the panoramic image lane recognition unit 350 detects straight lines by the Hough transform is described as an example.
To determine whether a detected straight line corresponds to a lane (i.e., an actual lane), the panoramic image lane recognition unit 350 first selects, from the detected straight lines, those whose inclination is close to vertical (i.e., parallel to the driving direction), and then, among the selected straight lines, recognizes as the actual lane the pair whose spacing has the smallest error relative to the predefined lane width.
The actual lane recognition method is performed by using a second lane recognition filter.
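A plausible sketch of such a lane recognition filter is shown below; the representation of each line as an (x-position, tilt-from-vertical in degrees) pair and the threshold values are illustrative assumptions, not details from the patent:

```python
def select_lane_pair(lines, lane_width_px, max_tilt_deg=15.0):
    # Keep only lines whose inclination is close to the vertical
    # (i.e., roughly parallel to the traveling direction).
    verts = [ln for ln in lines if abs(ln[1]) <= max_tilt_deg]
    best, best_err = None, float("inf")
    # Among the kept lines, pick the pair whose spacing best matches
    # the predefined lane width.
    for i in range(len(verts)):
        for j in range(i + 1, len(verts)):
            err = abs(abs(verts[i][0] - verts[j][0]) - lane_width_px)
            if err < best_err:
                best, best_err = (verts[i], verts[j]), err
    return best

# Two near-vertical lines ~350 px apart are kept; the 40-degree line is rejected.
pair = select_lane_pair([(100, 2), (450, -3), (300, 40)], lane_width_px=350)
```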
For reference, the panoramic image lane recognition unit 350 may also recognize a real lane in the detected straight line by the same method as the real lane recognition method of the front image lane recognition unit 250.
In addition, the panoramic image lane recognition unit 350 may use a filter other than the Kalman filter when eliminating the noise included in the specified straight line position; however, in the embodiment of the present invention, for convenience of description, the case where the panoramic image lane recognition unit 350 uses the Kalman filter will be described as an example.
Here, the second lane recognition filter and the second Kalman filter may be implemented in the form of algorithms within the panoramic image lane recognition unit 350.
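The Kalman-filter noise removal can be illustrated with a one-dimensional constant-position filter applied to a lane line's x-position; the process noise `q` and measurement noise `r` values are illustrative assumptions, not values from the patent:

```python
class LanePositionKalman:
    # Scalar constant-position Kalman filter: state x is the smoothed
    # lane-line position, p its estimated variance.
    def __init__(self, x0, q=1e-3, r=1.0, p0=1.0):
        self.x, self.p, self.q, self.r = x0, p0, q, r

    def update(self, z):
        self.p += self.q                # predict: variance grows by q
        k = self.p / (self.p + self.r)  # Kalman gain
        self.x += k * (z - self.x)      # correct toward measurement z
        self.p *= (1.0 - k)
        return self.x

kf = LanePositionKalman(x0=5.3)
for z in [4.9, 5.1] * 15:               # noisy per-frame measurements around 5.0
    estimate = kf.update(z)
```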
The panoramic image lane change recognition unit 370 receives information on the specified straight line position from the panoramic image lane recognition unit 350 every frame, and extracts a change in at least one of the lane spacing, the lane width, and the lane position based on the received information to determine whether a lane change has occurred.
Specifically, the panoramic image lane change recognition unit 370 stores the information on the specified straight line position supplied from the panoramic image lane recognition unit 350 for each frame, and analyzes the change tendency of at least one of the lane spacing (e.g., the spacing between the lanes on both sides of the vehicle), the lane width (e.g., the lateral width of the lane marking itself), and the lane position based on the stored information, thereby determining whether a lane change has occurred.
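One hypothetical way to turn this change tendency into a lane-change decision is sketched below: in vehicle-centered panoramic coordinates, a lane change appears as the midpoint of the lane pair drifting sideways while the lane width stays roughly constant. The coordinate convention and the drift threshold are assumptions for illustration, not details from the patent:

```python
def is_lane_change(frames, drift_ratio=0.5):
    # frames: per-frame (left_x, right_x) lane positions, with the
    # vehicle fixed at x = 0, accumulated over several frames.
    centers = [(l + r) / 2.0 for l, r in frames]
    mean_width = sum(r - l for l, r in frames) / len(frames)
    # Flag a lane change once the lane-pair midpoint has drifted past
    # a fraction of the mean lane width.
    return abs(centers[-1] - centers[0]) > drift_ratio * mean_width

stable = [(-1.75, 1.75)] * 10                                      # vehicle holds its lane
drifting = [(-1.75 + 0.2 * i, 1.75 + 0.2 * i) for i in range(10)]  # lanes slide sideways
```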
Further, the panoramic image lane change recognition unit 370 may provide the recognition result to the result integration module (400 of fig. 1).
As described above, since the second lane change recognition module 300 recognizes whether a lane change has occurred from the panoramic image, it can compensate for the lane recognition accuracy of the first lane change recognition module 200, which recognizes whether a lane change has occurred from the front image.
In this way, unlike the conventional art, the left and right lanes can be detected simultaneously from the panoramic image without additionally using cameras installed on the left and right sides of the vehicle solely to solve the problem of poor lane recognition, and thus the increase in the number of image processing operations can be minimized.
Referring again to fig. 1, the result integration module 400 integrates the information on whether a lane change has occurred (i.e., the recognition results) recognized by the first and second lane change recognition modules 200 and 300, respectively, to generate a final result.
Specifically, the result integration module 400 integrates information such as the deviation between the lane position recognized by the first lane change recognition module 200 and the lane position recognized by the second lane change recognition module 300 to determine whether a lane change has occurred, and generates a final result based on the determination.
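The patent does not disclose the exact fusion rule, but one scheme consistent with this description can be sketched as follows, where `pos_dev` is the deviation between the lane positions recognized by the two modules and `dev_thresh` is a hypothetical consistency threshold:

```python
def integrate_results(front_change, pano_change, pos_dev, dev_thresh):
    # If both modules agree, adopt the shared decision as the final result.
    if front_change == pano_change:
        return front_change
    # On disagreement, trust the panoramic module only when its lane
    # position is consistent with the front module's (small deviation);
    # otherwise fall back to the front-image decision.
    return pano_change if pos_dev <= dev_thresh else front_change
```

For example, agreement passes through unchanged, while a disagreement with a large position deviation falls back to the front-image result.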
In addition, the result integration module 400 provides the generated final result to the warning identification module 500.
The warning identification module 500 generates warning information based on the final result generated by the result integration module 400.
Specifically, the warning identification module 500 generates warning information based on the final result generated by the result integration module 400, and may output the warning information to the driver if the final result indicates a lane change.
That is, the warning identification module 500 may provide the warning information to a display (not shown) carried by the vehicle, and the display may display the received warning information.
Referring to fig. 5, a case is shown in which the warning information a is displayed on a display carried by the vehicle.
That is, when the vehicle C deviates toward the left lane L, the display displays the warning information a on the left side of the vehicle C at the same time, so that the driver immediately recognizes the deviation toward the left lane L.
In addition, although not shown in the drawing, the warning identification module 500 may output the warning information in the form of a voice through a speaker (not shown) carried by the vehicle.
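The warning path described above can be sketched as follows; the `show`/`say` interfaces stand in for the vehicle's display and speaker and are hypothetical:

```python
class ConsoleDisplay:
    # Stand-in for the in-vehicle display (hypothetical interface).
    def __init__(self):
        self.shown = None

    def show(self, message):
        self.shown = message

def emit_warning(lane_changed, display, speaker=None):
    # Only a final result indicating a lane change triggers a warning.
    if not lane_changed:
        return None
    message = "Lane departure warning"
    display.show(message)      # visual warning on the in-vehicle display
    if speaker is not None:
        speaker.say(message)   # optional voice warning through the speaker
    return message

display = ConsoleDisplay()
emit_warning(True, display)
```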
As described above, according to the panoramic monitoring system 1 of the present invention, whether or not a lane change has occurred is finally determined by integrating the lane change determination result based on the front image with the lane change determination result based on the panoramic image, so that the lane recognition rate can be improved while the increase in the number of image processing operations is minimized. Further, since the lane recognition rate is improved without increasing the number of image processing operations, there is no need to mount a high-cost, high-performance processor. Further, since a separate camera for a lane departure warning system is not required, the number of cameras to be mounted on the vehicle and the mounting cost can be reduced.
The present invention has been described above; those skilled in the art may make various modifications and changes, such as adding, altering, or deleting components, without departing from the scope of the idea of the present invention described in the claims, and such modifications and changes are also included within the scope of the present invention.

Claims (10)

1. A panoramic surveillance system, comprising:
a plurality of cameras installed at front, rear, left, and right sides of a vehicle, respectively, for photographing a surrounding image of the vehicle;
a first lane change recognition module that recognizes whether a lane change is made using a front image among the surrounding images of the vehicle photographed by the plurality of cameras, respectively;
a second lane change recognition module that recognizes whether there is a lane change using a panoramic image generated based on surrounding images of the vehicle photographed by the plurality of cameras, respectively; and
a result integration module integrating information on whether the lane is changed or not, which is recognized by the first and second lane-change recognition modules, respectively, to generate a final result.
2. The panoramic surveillance system of claim 1, wherein the first lane change recognition module comprises:
a front image selection unit that selects the front image from the surrounding images of the vehicle captured by the plurality of cameras, respectively;
a front image processing unit that converts the front image selected by the front image selection unit into an image for lane recognition;
a front image lane recognition unit that detects a straight line in the lane recognition image converted by the front image processing unit and specifies a position of a straight line recognized as a lane from among the detected straight lines; and
a front image lane change recognition unit that receives information on the specified straight line position from the front image lane recognition unit every frame and recognizes whether a lane change has occurred based on the received information.
3. The panoramic monitoring system of claim 2, wherein,
the front image processing unit performs at least one of a lens distortion correction operation, a histogram correction operation, a resizing operation, a denoising operation, and an Edge (Edge) extraction operation on the front image selected by the front image selecting unit, and then converts the front image into the image for lane recognition.
4. The panoramic monitoring system of claim 2, wherein,
the front image lane recognition unit performs a Hough transform operation on the lane recognition image converted by the front image processing unit to detect straight lines, determines whether a detected straight line corresponds to a lane by using a first lane recognition filter, specifies the position of the straight line recognized as the lane among the detected straight lines, eliminates noise included in the specified straight line position by using a first Kalman filter, and provides information on the specified straight line position to the front image lane change recognition unit.
5. The panoramic surveillance system of claim 1, wherein the second lane change recognition module comprises:
a panoramic image generation unit configured to form the panoramic image based on the images of the surroundings of the vehicle captured by the plurality of cameras, respectively;
a panoramic image processing unit that converts the panoramic image formed by the panoramic image generation unit into an image for lane recognition;
a panoramic image lane recognition unit that detects straight lines in the lane recognition image converted by the panoramic image processing unit and specifies a position of a straight line recognized as a lane from among the detected straight lines; and
a panoramic image lane change recognition unit that receives information on the specified straight line position from the panoramic image lane recognition unit for each frame, and extracts a change in at least one of a lane spacing, a lane width, and a lane position based on the received information to determine whether or not a lane change has occurred.
6. The panoramic monitoring system of claim 5, wherein,
the panoramic image generation unit generates panoramic image integrated information based on the images of the surroundings of the vehicle captured by the plurality of cameras, and generates the panoramic image by reflecting the panoramic image integrated information on the images of the surroundings of the vehicle captured by the plurality of cameras.
7. The panoramic monitoring system of claim 5, wherein,
the panoramic image processing unit performs at least one of a lens distortion correction operation, a histogram correction operation, a resizing operation, a denoising operation, and an edge extraction operation on the panoramic image formed by the panoramic image generation unit, and then converts the panoramic image into the image for lane recognition.
8. The panoramic monitoring system of claim 5, wherein,
the panoramic image lane recognition unit performs a Hough transform operation on the lane recognition image converted by the panoramic image processing unit to detect straight lines, determines whether a detected straight line corresponds to a lane by using a second lane recognition filter, specifies the position of the straight line determined to be the lane among the detected straight lines, eliminates noise included in the specified straight line position by using a second Kalman filter, and then provides information on the specified straight line position to the panoramic image lane change recognition unit.
9. The panoramic monitoring system of claim 1, wherein,
further comprising a warning identification module that generates warning information based on the final result generated by the result integration module,
wherein, in the case of a lane change, the warning identification module outputs the warning information to a driver.
10. The panoramic surveillance system of claim 1, wherein the plurality of cameras comprises:
a first camera installed at a front end of the vehicle for capturing a front image of the vehicle;
a second camera installed at a left side of the vehicle for photographing a left side image of the vehicle;
a third camera installed at a right side of the vehicle for photographing a right side image of the vehicle; and
a fourth camera installed at a rear end of the vehicle for photographing a rear image of the vehicle.
CN201880057393.6A 2017-10-31 2018-10-26 Panoramic monitoring system Active CN111133439B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
KR10-2017-0143112 2017-10-31
KR1020170143112A KR101982091B1 (en) 2017-10-31 2017-10-31 Surround view monitoring system
PCT/KR2018/012838 WO2019088595A1 (en) 2017-10-31 2018-10-26 Surround view monitoring system

Publications (2)

Publication Number Publication Date
CN111133439A true CN111133439A (en) 2020-05-08
CN111133439B CN111133439B (en) 2023-03-28

Family

ID=66332092

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880057393.6A Active CN111133439B (en) 2017-10-31 2018-10-26 Panoramic monitoring system

Country Status (3)

Country Link
KR (1) KR101982091B1 (en)
CN (1) CN111133439B (en)
WO (1) WO2019088595A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120062745A1 (en) * 2009-05-19 2012-03-15 Imagenext Co., Ltd. Lane departure sensing method and apparatus using images that surround a vehicle
US20120154588A1 (en) * 2010-12-21 2012-06-21 Kim Gyu Won Lane departure warning system and method
CN103121448A (en) * 2011-11-17 2013-05-29 现代摩比斯株式会社 System for improving the traffic lane recognition by using a front image and side image of vehicle and method thereof
CN104085396A (en) * 2014-07-03 2014-10-08 上海纵目科技有限公司 Panoramic lane departure warning method and system
CN106891891A (en) * 2015-12-15 2017-06-27 现代自动车株式会社 Track keeps auxiliary/support system, the vehicle including it and its control method

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
KR101499956B1 (en) * 2008-06-24 2015-03-06 현대자동차주식회사 Warning system of traffic lane departure
KR101279712B1 (en) * 2011-09-09 2013-06-27 연세대학교 산학협력단 Apparatus and method for providing real-time lane detection, recording medium thereof
KR20160021944A (en) * 2014-08-18 2016-02-29 대성전기공업 주식회사 Lane departure warnig apprature and method with sensor for automabiles
DE102015205507B3 (en) * 2015-03-26 2016-09-29 Zf Friedrichshafen Ag Rundsichtsystem for a vehicle


Non-Patent Citations (4)

Title
RAVI KUMAR SATZODA et al.: "Vision-Based Front and Rear Surround Understanding Using Embedded Processors", IEEE Transactions on Intelligent Vehicles *
ROBIN SCHUBERT et al.: "Situation Assessment for Automatic Lane-Change Maneuvers", IEEE Transactions on Intelligent Transportation Systems *
WANG QINGHE: "Research on Decision Algorithm and Performance Test Method of Lane Departure Warning System", China Master's Theses Full-text Database, Engineering Science and Technology II *
WANG MINGHUI: "Research on Vision-Based Road Detection and Warning Algorithms for Intelligent Vehicles", China Master's Theses Full-text Database, Information Science and Technology *

Also Published As

Publication number Publication date
KR20190048284A (en) 2019-05-09
CN111133439B (en) 2023-03-28
KR101982091B1 (en) 2019-05-24
WO2019088595A1 (en) 2019-05-09

Similar Documents

Publication Publication Date Title
CN107845104B (en) Method for detecting overtaking vehicle, related processing system, overtaking vehicle detection system and vehicle
EP2924654B1 (en) Image processing apparatus and image processing method
CN109478324B (en) Image processing apparatus and external recognition apparatus
EP2924653B1 (en) Image processing apparatus and image processing method
US8305431B2 (en) Device intended to support the driving of a motor vehicle comprising a system capable of capturing stereoscopic images
US9659497B2 (en) Lane departure warning system and lane departure warning method
US20110234761A1 (en) Three-dimensional object emergence detection device
KR20170014168A (en) Camera device for vehicle
US20180114078A1 (en) Vehicle detection device, vehicle detection system, and vehicle detection method
CN109196304B (en) Object distance detection device
WO2017134982A1 (en) Imaging device
US9305222B2 (en) Image processing apparatus and image processing method
US11508156B2 (en) Vehicular vision system with enhanced range for pedestrian detection
JP6065629B2 (en) Object detection device
WO2011016257A1 (en) Distance calculation device for vehicle
JP2012252501A (en) Traveling path recognition device and traveling path recognition program
JP2014130429A (en) Photographing device and three-dimensional object area detection program
JP6701327B2 (en) Glare detection method and device
JP2018151999A (en) Object distance detection apparatus
CN111133439B (en) Panoramic monitoring system
JP6891082B2 (en) Object distance detector
EP3081433A1 (en) An improved camera module for vehicle
KR102051324B1 (en) Surround view monitoring system
JP2018073049A (en) Image recognition device, image recognition system, and image recognition method
JP4598011B2 (en) Vehicle display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant