KR20170075521A - Apparatus and method for monitoring environment of vehicle - Google Patents
- Publication number
- KR20170075521A (Application KR1020150185284A)
- Authority
- KR
- South Korea
- Prior art keywords
- vehicle
- image
- image frame
- camera
- based algorithm
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/14—Picture signal circuitry for video frequency region
- H04N5/147—Scene change detection
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/02—Rear-view mirror arrangements
- B60R1/08—Rear-view mirror arrangements involving special optical features, e.g. avoiding blind spots, e.g. convex mirrors; Side-by-side associations of rear-view and other mirrors
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
-
- H04N5/2257—
-
- H04N5/35572—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Y—INDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
- B60Y2400/00—Special features of vehicle units
- B60Y2400/30—Sensors
- B60Y2400/303—Speed sensors
Abstract
The present invention relates to a vehicle surroundings monitoring apparatus and method, and more particularly, to an apparatus and method for monitoring a monitored object existing around a vehicle using different algorithms according to the running state of the vehicle.
The vehicle surroundings monitoring apparatus according to an embodiment of the present invention includes an input unit that receives an image frame from a camera, a vehicle information sensing unit that senses the traveling speed of the vehicle equipped with the camera, and an image analysis unit that analyzes the image frame with a learning-based algorithm or a vector-based algorithm according to the traveling speed and determines whether a monitored object is included in the image frame.
Description
BACKGROUND OF THE INVENTION
At least one camera may be provided in the vehicle to provide the driver with information about the surroundings of the vehicle. The image captured by the camera may be provided to the driver as it is, or may be processed and provided in a different form.
The image captured by the camera includes not only surrounding objects such as other vehicles and pedestrians but also the environment of the vehicle in which the camera is mounted.
On the other hand, an object existing around the vehicle, such as another vehicle or a pedestrian, can move and stop. A moving object is easy to detect, but a stopped object is not. In particular, detection is not easy because the relative motion of the object differs according to the running state of the vehicle.
Therefore, an invention that allows an object to be easily detected according to the running state of the vehicle is required.
An object of the present invention is to monitor objects existing in the vicinity of a vehicle by using different algorithms according to the running state of the vehicle.
The objects of the present invention are not limited to those mentioned above, and other objects not mentioned will be clearly understood by those skilled in the art from the following description.
The vehicle surroundings monitoring apparatus according to an embodiment of the present invention includes an input unit that receives an image frame from a camera, a vehicle information sensing unit that senses the traveling speed of the vehicle equipped with the camera, and an image analysis unit that analyzes the image frame with a learning-based algorithm or a vector-based algorithm according to the traveling speed and determines whether a monitored object is included in the image frame.
According to another aspect of the present invention, there is provided a vehicle surroundings monitoring method including: receiving an image frame from a camera; sensing a traveling speed of the vehicle equipped with the camera; analyzing the image frame with a learning-based algorithm or a vector-based algorithm according to the traveling speed; and determining whether the monitored object is included in the image frame.
The details of other embodiments are included in the detailed description and drawings.
According to the vehicle surroundings monitoring apparatus and method of the embodiments of the present invention, objects existing around the vehicle are monitored using different algorithms according to the running state of the vehicle, so the objects can be detected effectively regardless of the running state.
1 is a view showing a camera attached to a vehicle according to an embodiment of the present invention.
2 is a diagram illustrating a peripheral surveillance system according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating movement of a monitored object around a vehicle according to an exemplary embodiment of the present invention.
4 is a diagram illustrating a movement of a monitored object in an image according to an embodiment of the present invention.
FIG. 5 is a diagram illustrating a positional change of a monitored object between image frames according to an embodiment of the present invention.
FIG. 6 is a diagram illustrating that light from a light source provided in a vehicle is detected by a camera according to an embodiment of the present invention.
FIG. 7 is a block diagram illustrating a peripheral device for a vehicle according to an embodiment of the present invention.
8 is an exemplary view illustrating modeling information according to an embodiment of the present invention.
FIG. 9 is a diagram illustrating that a monitoring object is monitored using a learning-based algorithm according to an embodiment of the present invention.
10 is a diagram illustrating motion vectors of feature points included in an image according to an embodiment of the present invention.
11 is a view showing a surveillance camera table showing a relationship between a speed change stage and a camera according to an embodiment of the present invention.
12 is a flowchart illustrating a vehicle periphery monitoring method according to an embodiment of the present invention.
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. The advantages and features of the present invention, and the manner of achieving them, will become apparent from the embodiments described below. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art, and the invention is defined only by the scope of the claims. Like reference numerals refer to like elements throughout the specification.
Unless defined otherwise, all terms (including technical and scientific terms) used herein may be used in a sense commonly understood by one of ordinary skill in the art to which this invention belongs. Also, commonly used predefined terms are not ideally or excessively interpreted unless explicitly defined otherwise.
1 is a view showing a camera attached to a vehicle according to an embodiment of the present invention.
Referring to FIG. 1, the
The
The
2 is a diagram illustrating a peripheral surveillance system according to an embodiment of the present invention.
Referring to FIG. 2, the
The
In the present invention, the
On the other hand, when a still image is generated, the
Hereinafter, each periodically transmitted still image, as well as the moving image, is referred to as an image frame. That is, the camera can sequentially transmit image frames constituting a moving image or image frames constituting a series of still images.
The vehicle
As shown in Fig. 3, when the
Whether or not the
The motion of the
5 shows an image frame 300 input at time n-1 (hereinafter, the n-1 image frame) and an image frame 300 input at time n (hereinafter, the n image frame).
On the other hand, in the present invention, the object moving in the vicinity of the
Referring again to FIG. 2, in the present invention, the vehicle
FIG. 6 is a diagram illustrating that light from a light source provided in a vehicle is detected by a camera according to an embodiment of the present invention.
The
The
On the other hand, when the presence or absence of the monitored object is determined using only the difference between image frames 300, detection of the monitored object may not be easy. There is a difference between image frames 300 when the monitored object moves, but there is no difference between image frames 300 when the monitored object is stopped.
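This limitation can be illustrated with a minimal frame-differencing sketch (illustrative only — the patent does not disclose a concrete implementation, and the threshold value is an assumption):

```python
import numpy as np

def frame_difference(prev_frame, curr_frame, threshold=25):
    """Binary mask of pixels that changed between two grayscale frames.

    A moving object produces a changed region; a stationary object produces
    none, which is why difference-based detection alone misses stopped objects.
    """
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

# Two synthetic 8x8 frames: "curr" contains an object absent from "prev".
prev = np.zeros((8, 8), dtype=np.uint8)
curr = prev.copy()
curr[2:4, 2:4] = 200
```

Comparing `prev` with itself yields an empty mask (a stopped object goes unseen), while comparing `prev` with `curr` flags the changed region.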
For example, when a person who is not moving is standing around the entrance of the vehicle, there is little difference between image frames 300, and the person may not be detected.
Accordingly, the vehicle surroundings monitoring apparatus according to the present invention selects the analysis algorithm according to the traveling speed of the vehicle.
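For illustration, this speed-dependent selection (learning-based at standstill, vector-based at low speed, per the claims) can be sketched as follows; the 30 km/h critical speed and the behaviour above it are assumptions, not values from the patent:

```python
def select_algorithm(speed_kmh, critical_speed_kmh=30.0):
    """Pick the analysis algorithm from the vehicle's traveling speed.

    Learning-based at standstill and vector-based while moving at or below
    the critical speed follow the claims; the critical-speed value and the
    behaviour above it are illustrative assumptions.
    """
    if speed_kmh == 0:
        return "learning-based"
    if speed_kmh <= critical_speed_kmh:
        return "vector-based"
    return "none"  # assumed: no analysis above the critical speed
```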
Hereinafter, the vehicle surroundings monitoring apparatus will be described in detail with reference to the drawings.
FIG. 7 is a block diagram illustrating a peripheral device for a vehicle according to an embodiment of the present invention.
7, the vehicular
The
The vehicle
In addition, the vehicle
The
Specifically, when the traveling speed of the vehicle is 0, the image analysis unit may analyze the image frame with the learning-based algorithm to determine whether the monitored object is included in the image frame.
FIG. 8 is a diagram illustrating modeling information according to an embodiment of the present invention, and FIG. 9 is a diagram illustrating monitoring of a monitored object using a learning-based algorithm according to an embodiment of the present invention.
The
The
The
In particular, the
That is, the
In applying the
The selection information of the modeling information for each camera can be stored in the
The
When the traveling speed of the vehicle exceeds 0 and is equal to or less than the critical speed, the image analysis unit may analyze the image frame with the vector-based algorithm to determine whether the monitored object is included in the image frame.
The shape of the subject may not be complete due to the exposure time of the camera.
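As a hedged sketch of the similarity computation a learning-based algorithm of this kind might perform, normalized cross-correlation between an image patch and pre-stored modeling information is one possibility (the patent does not name a specific similarity measure, and the decision threshold below is an assumption):

```python
import numpy as np

def similarity(patch, model):
    """Normalized cross-correlation between an image patch and modeling
    information of the same shape; the result lies in [-1, 1]."""
    p = np.asarray(patch, float) - np.mean(patch)
    m = np.asarray(model, float) - np.mean(model)
    denom = np.sqrt((p * p).sum() * (m * m).sum())
    return float((p * m).sum() / denom) if denom else 0.0

def matches_model(patch, model, threshold=0.8):
    # The 0.8 decision threshold is an illustrative assumption.
    return similarity(patch, model) >= threshold

# Example: a patch that is an affine-scaled copy of the model correlates
# perfectly, since NCC is invariant to brightness offset and positive gain.
model = [[0.0, 1.0], [2.0, 3.0]]
patch = [[5.0, 7.0], [9.0, 11.0]]  # 2 * model + 5
```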
The vector-based algorithm includes an algorithm for determining a movement pattern of feature points between image frames 300. FIG. 10 shows motion vectors of feature points included in an image according to an embodiment of the present invention.
The
10, all the feature points 510 and 520 included in the
When a person is photographed by the
The
Here, the feature points 510 of the surrounding background can be calculated in various ways. For example, the
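A minimal sketch of separating background feature points from moving-object feature points, assuming the dominant (median) motion vector represents the background motion induced by the vehicle's own movement — the patent describes the idea but not this particular computation:

```python
import numpy as np

def background_mask(points_prev, points_curr, tol=1.0):
    """True for feature points whose motion matches the dominant motion.

    The median motion vector is taken as the background motion caused by
    the vehicle's own movement; points deviating from it by more than
    `tol` pixels are treated as belonging to a moving object.
    """
    vectors = np.asarray(points_curr, float) - np.asarray(points_prev, float)
    dominant = np.median(vectors, axis=0)
    deviation = np.linalg.norm(vectors - dominant, axis=1)
    return deviation <= tol

# Five background points shift by (1, 0); two object points shift by (5, 5).
prev_pts = [(i, 0) for i in range(7)]
curr_pts = [(i + 1, 0) for i in range(5)] + [(10, 5), (11, 5)]
```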
On the other hand, when the traveling speed of the
The
11 is a view showing a surveillance camera table showing a relationship between a speed change stage and a camera according to an embodiment of the present invention.
Referring to FIG. 11, the surveillance camera table 600 includes a speed
The speed
The
As described above, each
In the case where the speed change stage is the P stage, the
When the speed change stage is the R-stage, the
When the speed change stage is the N-th stage, the
When the speed change stage is the D-stage, the
However, the operational relationship between the gear stage and the cameras described above is exemplary; various operational relationships may exist, and the relationship may be modified in real time by the user.
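The gear-to-camera relationship of FIG. 11 can be sketched as a simple lookup table; the specific pairings below are hypothetical, since the table in the patent is itself exemplary and user-modifiable:

```python
# Hypothetical gear-stage -> camera pairings (the patent's FIG. 11 table is
# exemplary, so these entries are assumptions for illustration).
SURVEILLANCE_CAMERA_TABLE = {
    "P": ["front", "rear", "left", "right"],  # parked: watch all around
    "R": ["rear", "left", "right"],           # reversing
    "N": ["front", "rear"],
    "D": ["front", "left", "right"],          # driving forward
}

def cameras_for_gear(gear_stage):
    """Cameras whose image frames should be analyzed for the given gear."""
    return SURVEILLANCE_CAMERA_TABLE.get(gear_stage, [])
```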
Referring again to FIG. 7, the
In addition to a visual manner such as an image, the
For example, the
Upon receiving the alarm, the user can take appropriate action. For example, if the user is the driver, the driver can stop the vehicle and take safety precautions.
The
12 is a flowchart illustrating a vehicle periphery monitoring method according to an embodiment of the present invention.
In order to perform the peripheral surveillance, the
The vehicle
The
On the other hand, when the running speed of the vehicle is 0, the image analysis unit analyzes the image frame with the learning-based algorithm to determine whether the monitored object is included in the image frame.
If the traveling speed is greater than 0 and less than or equal to the threshold speed, the image analysis unit analyzes the image frame with the vector-based algorithm to determine whether the monitored object is included in the image frame.
If it is determined that the monitored object exists according to the determination result of the
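The overall flow of FIG. 12 — receive a frame, sense the speed, select an algorithm, and alarm on detection — can be sketched end to end; the stand-in detectors and the 30 km/h threshold are assumptions, not the patented implementations:

```python
def learning_based_detect(frame):
    # Stand-in detector: "object present" if any pixel exceeds a model value.
    return max(max(row) for row in frame) > 128

def vector_based_detect(prev_frame, curr_frame):
    # Stand-in detector: any pixel changed between consecutive frames.
    return any(a != b
               for ra, rb in zip(prev_frame, curr_frame)
               for a, b in zip(ra, rb))

def monitor_step(prev_frame, curr_frame, speed_kmh, critical_speed_kmh=30.0):
    """One pass of the monitoring method: choose the algorithm from the
    traveling speed and report whether an alarm should be output."""
    if speed_kmh == 0:
        return learning_based_detect(curr_frame)
    if speed_kmh <= critical_speed_kmh:
        return vector_based_detect(prev_frame, curr_frame)
    return False  # assumed: no analysis above the critical speed
```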
While the present invention has been described in connection with what is presently considered to be practical exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims. The above-described embodiments are therefore illustrative in all aspects and not restrictive.
110: input unit
120: Vehicle information sensing unit
130:
140:
150: Image analysis section
160: Output section
Claims (12)
An input unit for receiving an image frame from a camera;
A vehicle information sensing unit for sensing a traveling speed of the vehicle equipped with the camera;
An image analyzer for analyzing the image frame with a learning-based algorithm or a vector-based algorithm according to the traveling speed of the vehicle and determining whether the monitored object is included in the image frame; And
And an output unit for outputting an alarm according to the determination result as an image.
Wherein the learning-based algorithm includes an algorithm for determining a degree of similarity between objects included in each of the image frames and modeling information provided in advance,
Wherein the vector-based algorithm includes an algorithm for determining a movement pattern of the feature points between the image frames.
Wherein the image analyzing unit analyzes the image frame with the learning-based algorithm when the traveling speed of the vehicle is 0, and determines whether the monitoring object is included in the image frame.
Wherein the image analyzing unit determines whether a monitoring object is included in the image frame by referring to at least one modeling information of the different monitoring objects.
Wherein the image analyzing unit analyzes the image frame based on the vector when the traveling speed of the vehicle exceeds 0 and is equal to or less than a critical speed to determine whether the monitored object is included in the image frame.
Wherein the image analyzing unit analyzes an image frame input from a camera corresponding to a gear range of the vehicle among the plurality of cameras to determine whether the object to be monitored is included in the image frame.
Receiving an image frame from a camera;
Sensing a traveling speed of the vehicle equipped with the camera; And
And analyzing the image frame with a learning-based algorithm or a vector-based algorithm according to the traveling speed of the vehicle, and determining whether the monitoring object is included in the image frame.
Wherein the learning-based algorithm includes an algorithm for determining a degree of similarity between objects included in each of the image frames and modeling information provided in advance,
Wherein the vector-based algorithm includes an algorithm for determining a movement pattern of feature points between the image frames.
Wherein the step of determining whether the monitored object is included in the image frame includes analyzing the image frame with the learning based algorithm when the running speed of the vehicle is 0 and determining whether the monitored object is included in the image frame The method comprising the steps of:
Wherein the step of determining whether the monitored object is included in the image frame includes determining whether or not the monitored object is included in the image frame by referring to at least one modeling information of the different monitored objects Surrounding monitoring method.
Wherein the step of determining whether the monitored object is included in the image frame includes analyzing the image frame based on the vector when the traveling speed of the vehicle is greater than 0 and less than or equal to the threshold speed, Determining whether or not the vehicle is in a first state;
Wherein the step of determining whether the monitored object is included in the image frame includes analyzing an image frame inputted from a camera corresponding to a gear stage of the vehicle among the plurality of cameras to determine whether the monitored object is included in the image frame And determining whether the vehicle is in the vicinity of the vehicle.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150185284A KR101953796B1 (en) | 2015-12-23 | 2015-12-23 | Apparatus and method for monitoring environment of vehicle |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150185284A KR101953796B1 (en) | 2015-12-23 | 2015-12-23 | Apparatus and method for monitoring environment of vehicle |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170075521A true KR20170075521A (en) | 2017-07-03 |
KR101953796B1 KR101953796B1 (en) | 2019-03-05 |
Family
ID=59357802
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150185284A KR101953796B1 (en) | 2015-12-23 | 2015-12-23 | Apparatus and method for monitoring environment of vehicle |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101953796B1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023277422A1 (en) * | 2021-06-30 | 2023-01-05 | 김배훈 | Camera backup system and method for autonomous vehicle |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006318060A (en) * | 2005-05-10 | 2006-11-24 | Olympus Corp | Apparatus, method, and program for image processing |
JP2009135663A (en) * | 2007-11-29 | 2009-06-18 | Clarion Co Ltd | Vehicle perimeter monitoring system |
JP2013190949A (en) * | 2012-03-13 | 2013-09-26 | Toyota Central R&D Labs Inc | Pedestrian detecting device and program |
KR20140099079A (en) | 2013-02-01 | 2014-08-11 | 조선대학교산학협력단 | Blind spot monitoring device |
KR20150062277A (en) * | 2013-11-29 | 2015-06-08 | 현대모비스 주식회사 | Apparatus and Method for Assisting Parking |
- 2015-12-23: KR KR1020150185284A patent/KR101953796B1/en active IP Right Grant
Also Published As
Publication number | Publication date |
---|---|
KR101953796B1 (en) | 2019-03-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6937443B2 (en) | Imaging device and control method of imaging device | |
EP3467698B1 (en) | Method for monitoring blind spot of vehicle and blind spot monitor using the same | |
CN109571468B (en) | Security inspection robot and security inspection method | |
JP6888950B2 (en) | Image processing device, external world recognition device | |
KR101967305B1 (en) | Pedestrian detecting method in a vehicle and system thereof | |
KR101729486B1 (en) | Around view monitor system for detecting blind spot and method thereof | |
JP4425642B2 (en) | Pedestrian extraction device | |
KR20140076415A (en) | Apparatus and method for providing information of blind spot | |
JP2001211466A (en) | Image processing system having self-diagnostic function | |
KR20140107880A (en) | method and system for checking doze at the vehicle | |
WO2020194584A1 (en) | Object tracking device, control method, and program | |
KR101953796B1 (en) | Apparatus and method for monitoring environment of vehicle | |
US11348347B2 (en) | In-vehicle device | |
KR101680833B1 (en) | Apparatus and method for detecting pedestrian and alert | |
KR20140088630A (en) | System and method for vehicle monitoring | |
KR20130053605A (en) | Apparatus and method for displaying around view of vehicle | |
JP5771955B2 (en) | Object identification device and object identification method | |
KR20180041525A (en) | Object tracking system in a vehicle and method thereof | |
KR20170075523A (en) | Apparatus and method for monitoring environment of vehicle | |
CN109074710B (en) | Anti-theft method and device for closed circuit television set top box | |
KR101750201B1 (en) | Blind spot detection device using behavior of vehicle | |
KR20170075526A (en) | Apparatus and method for monitoring environment | |
KR101633501B1 (en) | AVM system for providing information of side distance and method thereof | |
KR101445362B1 (en) | Device for Imagery Interpretation | |
KR101511567B1 (en) | System for monitoring image and thereof method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
AMND | Amendment | ||
E601 | Decision to refuse application | ||
AMND | Amendment | ||
X701 | Decision to grant (after re-examination) | ||
GRNT | Written decision to grant |