KR20170075521A - Apparatus and method for monitoring environment of vehicle - Google Patents

Apparatus and method for monitoring environment of vehicle Download PDF

Info

Publication number
KR20170075521A
Authority
KR
South Korea
Prior art keywords
vehicle
image
image frame
camera
based algorithm
Prior art date
Application number
KR1020150185284A
Other languages
Korean (ko)
Other versions
KR101953796B1 (en)
Inventor
이상헌
Original Assignee
에스엘 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 에스엘 주식회사 filed Critical 에스엘 주식회사
Priority to KR1020150185284A priority Critical patent/KR101953796B1/en
Publication of KR20170075521A publication Critical patent/KR20170075521A/en
Application granted granted Critical
Publication of KR101953796B1 publication Critical patent/KR101953796B1/en

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/14Picture signal circuitry for video frequency region
    • H04N5/147Scene change detection
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/02Rear-view mirror arrangements
    • B60R1/08Rear-view mirror arrangements involving special optical features, e.g. avoiding blind spots, e.g. convex mirrors; Side-by-side associations of rear-view and other mirrors
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • H04N5/2257
    • H04N5/35572
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60YINDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
    • B60Y2400/00Special features of vehicle units
    • B60Y2400/30Sensors
    • B60Y2400/303Speed sensors

Abstract

The present invention relates to a vehicle surroundings monitoring apparatus and method, and more particularly, to an apparatus and method for monitoring a monitored object existing around a vehicle using different algorithms according to the running state of the vehicle.
The vehicle surroundings monitoring apparatus according to an embodiment of the present invention includes an input unit for receiving an image frame from a camera, a vehicle information sensing unit for sensing the traveling speed of the vehicle equipped with the camera, and an image analysis unit that analyzes the image frame with a learning-based algorithm or a vector-based algorithm according to the traveling speed and determines whether the monitored object is included in the image frame.

Description

[0001] Apparatus and method for monitoring the environment of a vehicle

BACKGROUND OF THE INVENTION 1. Field of the Invention [0002] The present invention relates to a vehicle surroundings monitoring apparatus, and more particularly, to an apparatus and method for monitoring an object existing around the vehicle using different algorithms according to the running state of the vehicle.

At least one camera may be provided on the vehicle to give the driver information about the periphery of the vehicle. The image photographed by the camera may be provided to the driver as it is, or may be processed and provided in a separate form.

The image photographed by the camera includes not only the surrounding environment, such as other vehicles and pedestrians, but also part of the vehicle on which the camera is mounted.

On the other hand, an object existing in the vicinity of the vehicle, such as another vehicle or a pedestrian, can move and stop. A moving object is easy to detect, but a stopped object is not. In particular, detection is difficult because the relative motion of the object differs according to the running state of the vehicle.

Therefore, there is a need for an invention that allows an object to be easily detected according to the running state of the vehicle.

Korean Patent Publication No. 10-2014-0099079

An object of the present invention is to monitor objects existing in the vicinity of a vehicle by using different algorithms according to the running state of the vehicle.

The objects of the present invention are not limited to those mentioned above, and other objects not mentioned will be clearly understood by those skilled in the art from the following description.

The vehicle surroundings monitoring apparatus according to an embodiment of the present invention includes an input unit for receiving an image frame from a camera, a vehicle information sensing unit for sensing the traveling speed of the vehicle equipped with the camera, and an image analysis unit that analyzes the image frame with a learning-based algorithm or a vector-based algorithm according to the traveling speed and determines whether the monitored object is included in the image frame.

According to another aspect of the present invention, there is provided a vehicle surroundings monitoring method including: receiving an image frame from a camera; sensing a traveling speed of the vehicle having the camera; analyzing the image frame with a learning-based algorithm or a vector-based algorithm according to the traveling speed; and determining whether the monitored object is included in the image frame.

The details of other embodiments are included in the detailed description and drawings.

According to the vehicle surroundings monitoring apparatus and method of the embodiments of the present invention, since objects existing in the periphery of the vehicle are monitored using different algorithms according to the running state of the vehicle, a monitored object can be detected easily whether it is moving or stationary.

FIG. 1 is a view showing a camera attached to a vehicle according to an embodiment of the present invention.
FIG. 2 is a diagram illustrating a surroundings surveillance system according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating movement of a monitored object around a vehicle according to an embodiment of the present invention.
FIG. 4 is a diagram illustrating movement of a monitored object in an image according to an embodiment of the present invention.
FIG. 5 is a diagram illustrating a positional change of a monitored object between image frames according to an embodiment of the present invention.
FIG. 6 is a diagram illustrating that light from a light source provided in a vehicle is detected by a camera according to an embodiment of the present invention.
FIG. 7 is a block diagram illustrating a vehicle surroundings monitoring apparatus according to an embodiment of the present invention.
FIG. 8 is an exemplary view illustrating modeling information according to an embodiment of the present invention.
FIG. 9 is a diagram illustrating that a monitored object is monitored using a learning-based algorithm according to an embodiment of the present invention.
FIG. 10 is a diagram illustrating motion vectors of feature points included in an image according to an embodiment of the present invention.
FIG. 11 is a view showing a surveillance camera table showing a relationship between gear stages and cameras according to an embodiment of the present invention.
FIG. 12 is a flowchart illustrating a vehicle surroundings monitoring method according to an embodiment of the present invention.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. The advantages and features of the present invention, and the manner of achieving them, will become apparent with reference to the embodiments described in detail below together with the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art, and the invention is defined only by the scope of the claims. Like reference numerals refer to like elements throughout the specification.

Unless defined otherwise, all terms (including technical and scientific terms) used herein may be used in a sense commonly understood by one of ordinary skill in the art to which this invention belongs. Also, commonly used predefined terms are not ideally or excessively interpreted unless explicitly defined otherwise.

FIG. 1 is a view showing a camera attached to a vehicle according to an embodiment of the present invention.

Referring to FIG. 1, the vehicle 10 may include at least one camera 20. The camera 20 photographs a subject and generates an image of the subject. In order to receive a video signal, an image pickup device may be provided in the camera 20. A charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) may be used as the image pickup device, but the present invention is not limited thereto.

The camera 20 can be attached to at least one location on the vehicle 10. FIG. 1 shows cameras 20 on the front, rear, top, and both sides of the vehicle 10, but the locations of the camera 20 are not limited thereto. Also, a plurality of cameras 20 may be provided at similar positions, in which case the directions in which the cameras 20 face may differ from each other.

The camera 20 may photograph the periphery of the vehicle 10 or photograph the vehicle 10 itself. For example, the camera 20 may photograph another vehicle or a pedestrian present in the vicinity of the vehicle 10, and may photograph one side of the vehicle 10.

FIG. 2 is a diagram illustrating a surroundings surveillance system according to an embodiment of the present invention.

Referring to FIG. 2, the surroundings surveillance system 1 includes the vehicle surroundings monitoring apparatus 100 and the camera 20.

The camera 20 can transmit the photographed image to the vehicle surroundings monitoring apparatus 100. For this, a wired or wireless channel may be formed between the camera 20 and the vehicle surroundings monitoring apparatus 100.

In the present invention, the camera 20 can generate a still image or a moving image of a photographed subject. When a moving image is generated, the camera 20 can transmit the plurality of image frames constituting the moving image to the vehicle surroundings monitoring apparatus 100.

On the other hand, when still images are generated, the camera 20 can transmit a still image every time an event occurs. For example, the vehicle surroundings monitoring apparatus 100 can transmit an image transmission command to the camera 20; the camera 20 captures a subject in response to the command, generates a still image, and transmits the still image to the vehicle surroundings monitoring apparatus 100. Alternatively, the camera 20 may periodically transmit still images to the vehicle surroundings monitoring apparatus 100.

Hereinafter, not only the frames of a moving image but also each periodically transmitted still image will be referred to as an image frame. That is, the camera can sequentially transmit image frames constituting a moving image or image frames constituting a series of still images.

The vehicle surroundings monitoring apparatus 100 may analyze an image received from the camera 20 and perform an alarm providing function. For example, the vehicle surroundings monitoring apparatus 100 may analyze an image and output an alarm when it determines that a pedestrian is in the vicinity of the vehicle.

As shown in FIG. 3, when the pedestrian 30 moves behind the vehicle 10, the pedestrian can be photographed by the camera 20. The image generated by the camera 20 is transmitted to the vehicle surroundings monitoring apparatus 100, which analyzes the image, determines that the pedestrian 30 exists, and provides an alarm to the driver.

Whether or not the pedestrian 30 exists can be determined by checking whether the objects in the image are in motion. As shown in FIG. 4, when an object included in the image 200 moves, the vehicle surroundings monitoring apparatus 100 determines that the pedestrian 30 exists.

The motion of the object 210 included in the image 200 can be detected by analyzing sequentially input image frames. As shown in FIG. 5, a plurality of image frames 300 may be input from the camera 20, and the vehicle surroundings monitoring apparatus 100 can judge whether there is motion by comparing the plurality of image frames 300.

FIG. 5 shows an image frame 300 input at time n-1 (hereinafter, the n-1 image frame) and an image frame 300 input at time n (hereinafter, the n image frame). The n-1 image frame 300 includes the object 311 on the right side of the image frame area, while the n image frame 300 includes the object 312 on the left side of the image frame area. That is, there is a difference between the n-1 image frame 300 and the n image frame 300. The vehicle surroundings monitoring apparatus 100 can determine whether the pedestrian 30 exists around the vehicle 10 by using this difference between image frames.
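As an illustrative sketch (not the claimed implementation), the inter-frame difference described above can be expressed as a per-pixel comparison of two grayscale frames; the function name and both thresholds are assumptions for illustration:

```python
import numpy as np

def motion_detected(frame_prev, frame_curr, pixel_thresh=30, count_thresh=50):
    """Detect motion between two grayscale image frames by per-pixel
    absolute difference, as in the n-1 / n frame comparison above.
    Thresholds are illustrative, not from the patent."""
    diff = np.abs(frame_curr.astype(np.int16) - frame_prev.astype(np.int16))
    changed = np.count_nonzero(diff > pixel_thresh)  # pixels that changed
    return changed > count_thresh

# Example: an object on the right side of frame n-1 appears on the
# left side of frame n (cf. objects 311 and 312 in FIG. 5).
frame_n1 = np.zeros((100, 100), dtype=np.uint8)
frame_n1[40:60, 70:90] = 255   # object on the right
frame_n = np.zeros((100, 100), dtype=np.uint8)
frame_n[40:60, 10:30] = 255    # object on the left

print(motion_detected(frame_n1, frame_n))   # True: the frames differ
print(motion_detected(frame_n1, frame_n1))  # False: no motion
```

Note that this simple difference is exactly what fails for a stationary monitored object, which motivates the learning-based algorithm described later.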

On the other hand, in the present invention, the object moving in the vicinity of the vehicle 10 is not limited to the pedestrian 30. For example, a pet or another vehicle may move around the vehicle, and objects such as a bag or an umbrella carried by the pedestrian 30 may also move around the vehicle. Hereinafter, all objects that exist in the vicinity of the vehicle and are subject to alarm output will be referred to as monitored objects.

Referring again to FIG. 2, in the present invention, the vehicle surroundings monitoring apparatus 100 may be mounted on the vehicle 10, but is not limited thereto. For example, a portable terminal of a user, such as a smartphone, can serve as the vehicle surroundings monitoring apparatus 100. When the user carrying the portable terminal is adjacent to the vehicle 10, the camera 20 can establish a communication channel with the portable terminal and transmit the image. Accordingly, the portable terminal can perform the alarm providing function by analyzing the received image.

FIG. 6 is a diagram illustrating that light from a light source provided in a vehicle is detected by a camera according to an embodiment of the present invention.

The vehicle 10 according to the embodiment of the present invention may be a bus, and the camera 20 may be installed around the entrance. FIG. 6 shows the camera 20 installed above the entrance.

The camera 20 can photograph a moving object entering and exiting through the entrance and provide the generated image to the vehicle surroundings monitoring apparatus 100. Accordingly, the vehicle surroundings monitoring apparatus 100 analyzes the transmitted image to determine whether a moving object is present at the entrance, and can provide an alarm to the driver when it determines that a moving object exists. A driver who receives the alarm can stop driving and take safety measures.

On the other hand, when the presence or absence of the monitored object is determined using only the difference between the image frames 300, detection of the monitored object may not be easy. There is a difference between the image frames 300 when the monitored object moves, but there is no difference between the image frames 300 when the monitored object is stationary.

For example, when a person standing still near the entrance of the vehicle 10 is photographed to generate image frames 300, an alarm may not be output because there is no difference between the image frames 300.

Accordingly, the vehicle surroundings monitoring apparatus 100 according to the embodiment of the present invention can detect a monitored object by applying a learning-based algorithm or a vector-based algorithm according to the traveling speed of the vehicle 10.

Hereinafter, the vehicle surroundings monitoring apparatus 100 will be described in detail.

FIG. 7 is a block diagram illustrating a vehicle surroundings monitoring apparatus according to an embodiment of the present invention.

Referring to FIG. 7, the vehicle surroundings monitoring apparatus 100 includes an input unit 110, a vehicle information sensing unit 120, a storage unit 130, a control unit 140, an image analysis unit 150, and an output unit 160.

The input unit 110 receives the image frame 300 from the camera 20. For this, a wired or wireless communication channel may be formed between the input unit 110 and the camera 20. As described above, the vehicle 10 may be provided with at least one camera 20, and the input unit 110 may configure a communication channel for each camera 20.

The vehicle information sensing unit 120 senses the traveling speed of the vehicle 10 equipped with the camera 20. The vehicle information sensing unit 120 may receive the traveling speed of the vehicle 10 from a control system in the vehicle, or may receive it directly from an encoder provided on the wheels of the vehicle 10. Alternatively, a GPS (Global Positioning System) receiver may be included to sense the traveling speed of the vehicle 10 by receiving GPS signals. In this case, the vehicle information sensing unit 120 may be provided in the vehicle 10.

In addition, the vehicle information sensing unit 120 can sense various information of the vehicle 10 besides the traveling speed. For example, it can sense the starting state of the vehicle 10, the gear stage, and the like.

The image analysis unit 150 analyzes the image frame 300 with a learning-based algorithm or a vector-based algorithm according to the traveling speed of the vehicle 10 sensed by the vehicle information sensing unit 120, and determines whether the monitored object is included in the image frame 300.

Specifically, when the traveling speed of the vehicle 10 is zero, the image analysis unit 150 analyzes the image frame 300 using the learning-based algorithm and determines whether the monitored object is included in the image frame 300. The learning-based algorithm includes an algorithm for determining the degree of similarity between objects included in each image frame 300 and modeling information provided in advance.

FIG. 8 is a diagram illustrating modeling information according to an embodiment of the present invention, and FIG. 9 is a diagram illustrating monitoring of a monitored object using a learning-based algorithm according to an embodiment of the present invention.

The modeling information 400 may include modeling information 410 for a person, modeling information 420 for a pet, modeling information 430 for another vehicle, and modeling information 440 for a person's belongings. However, this is only an example, and modeling information for more various objects can be provided.

The modeling information 410 to 440 for a specific object may be composed of detailed modeling information. That is, the modeling information 410 for a person may include detailed modeling information on the entire appearance of a person as well as on the face, hands, body, and legs.

The modeling information 410 to 440 may be stored in the storage unit 130. The image analysis unit 150 compares the modeling information 400 stored in the storage unit 130 with the image frame 300 to determine whether the monitored object is included in the image frame 300.
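As an illustrative sketch (not the claimed implementation), the similarity determination between an image region and stored modeling information can be expressed as normalized cross-correlation against templates. The function names, the template format, and the 0.8 threshold are all assumptions; real learning-based detectors would use trained models rather than raw templates:

```python
import numpy as np

def similarity(patch, model):
    """Normalized cross-correlation between an image patch and a
    modeling template of the same size; 1.0 means an identical shape."""
    p = patch.astype(np.float64) - patch.mean()
    m = model.astype(np.float64) - model.mean()
    denom = np.sqrt((p * p).sum() * (m * m).sum())
    return float((p * m).sum() / denom) if denom else 0.0

def contains_object(frame_patch, modeling_info, thresh=0.8):
    """Return the name of the best-matching modeling information if any
    stored model is similar enough to the patch, else None."""
    best_name, best_score = None, 0.0
    for name, model in modeling_info.items():
        s = similarity(frame_patch, model)
        if s > best_score:
            best_name, best_score = name, s
    return best_name if best_score >= thresh else None

# Toy "person" template (cf. modeling information 410).
person = np.zeros((8, 8)); person[2:6, 3:5] = 1.0
models = {"person": person}
print(contains_object(person.copy(), models))  # prints "person"
```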

In particular, the image analysis unit 150 may determine whether the monitored object is included in the image frame 300 by referring to at least one piece of modeling information among those of the different monitored objects.

That is, the image analysis unit 150 can determine whether the monitored object is included in the image frame 300 by applying all of the modeling information 400 stored in the storage unit 130, or by applying only some of the modeling information. For example, the image analysis unit 150 may perform image analysis by applying modeling information selected by the user, and the user can apply different modeling information for each camera. That is, the modeling information 410, 420, and 440 for a person, a pet, and a belonging may be applied to the camera 20 facing the entrance, while image analysis for another camera may be performed by applying the modeling information 410 and 430 for a person and another vehicle.

In applying the modeling information 410 for a person, the image analysis unit 150 may perform image analysis by applying selected detailed modeling information among the entire appearance, face, hand, body, and leg.

Selection information of the modeling information for each camera can be stored in the storage unit 130, and the image analysis unit 150 can perform image analysis by applying only the selected modeling information with reference to this selection information.
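A minimal sketch of how per-camera selection information might be stored and applied follows; the camera identifiers, model names, and data structure are hypothetical, not taken from the patent:

```python
# Hypothetical per-camera selection of modeling information, as could be
# kept in the storage unit 130; names and structure are illustrative.
MODELING_SELECTION = {
    "entrance_cam": ["person", "pet", "belonging"],  # cf. 410, 420, 440
    "rear_cam":     ["person", "other_vehicle"],     # cf. 410, 430
}

def models_for(camera_id, modeling_info, selection=MODELING_SELECTION):
    """Return only the modeling information selected for this camera;
    if no selection is stored, apply all modeling information."""
    names = selection.get(camera_id)
    if names is None:
        return dict(modeling_info)
    return {n: m for n, m in modeling_info.items() if n in names}

all_models = {"person": "...", "pet": "...",
              "other_vehicle": "...", "belonging": "..."}
print(sorted(models_for("entrance_cam", all_models)))
# ['belonging', 'person', 'pet']
```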

The storage unit 130 stores the modeling information 400 and the selection information, and also serves to temporarily or permanently store the images input through the input unit 110. In addition, the storage unit 130 may temporarily or permanently store various types of information input through the input unit 110 and various data transmitted from within the vehicle surroundings monitoring apparatus 100.

When the traveling speed of the vehicle 10 exceeds 0 and is less than or equal to the threshold speed, the image analysis unit 150 analyzes the image frame 300 with the vector-based algorithm and determines whether the monitored object is included in the image frame 300.

The shape of the subject may not be captured completely due to the exposure time of the camera 20 while the vehicle 10 is running. Thus, it is not easy to apply the learning-based algorithm while the vehicle 10 is running. Accordingly, the image analysis unit 150 according to the embodiment of the present invention can determine whether the monitored object exists by using the vector-based algorithm while the vehicle 10 is running.

The vector-based algorithm includes an algorithm for determining the movement pattern of feature points between image frames 300. FIG. 10 shows motion vectors of feature points included in an image according to an embodiment of the present invention.

The image analysis unit 150 may analyze the image, extract the feature points included in the image, and calculate the motion vector of each feature point using the difference between the image frames 300.

As shown in FIG. 10, all the feature points 510 and 520 included in the image 200 are directed to the right, but the magnitude of the motion vectors of some feature points 520 differs from that of most other feature points 510. In particular, the motion vectors of some feature points 520 may be larger than those of most of the feature points 510.

When a person is photographed by the camera 20, the surrounding background is photographed together with the person. When the vehicle 10 moves, the camera 20 moves as well, so both the person and the surrounding background are recognized as moving within the photographed image. However, since the person is closer to the camera 20 than the background, the person's movement appears larger in the image.

The image analysis unit 150 analyzes the motion vectors of the feature points 510 and 520 included in the image, and can determine that a group of feature points 520 having motion vectors larger than those of the feature points 510 of the surrounding background is the monitored object. Alternatively, the image analysis unit 150 may analyze the motion vectors of the feature points included in the image and determine that a group of feature points having motion vectors in directions different from those of the feature points of the surrounding background is the monitored object.

Here, the feature points 510 of the surrounding background can be determined in various ways. For example, the image analysis unit 150 may determine feature points whose motion vector magnitudes are similar across a specific area of the image region as feature points 510 of the surrounding background. Alternatively, feature points in an area where the monitored object cannot exist may be determined as feature points 510 of the surrounding background. However, the determination of background feature points is not limited thereto.
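The grouping described above can be sketched as follows, assuming feature points and their motion vectors have already been computed (for example, by optical flow). Taking the median magnitude as the background estimate and the 1.5x ratio are illustrative assumptions, not values from the patent:

```python
import numpy as np

def find_monitored_object(points, vectors, ratio=1.5):
    """Separate background feature points from object feature points by
    motion-vector magnitude: vectors close to the dominant (background)
    magnitude are treated as background, and points with clearly larger
    vectors are taken as the monitored object."""
    mags = np.linalg.norm(vectors, axis=1)
    background_mag = np.median(mags)       # most points belong to the background
    is_object = mags > ratio * background_mag
    return points[is_object]

# Background points drift right by ~2 px per frame; a nearer pedestrian's
# points move by ~6 px (closer to the camera, so larger apparent motion).
pts = np.array([[10, 10], [20, 10], [30, 10], [50, 40], [52, 42]])
vecs = np.array([[2, 0], [2, 0], [2, 0], [6, 0], [6, 1]])
print(find_monitored_object(pts, vecs))
```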

On the other hand, when the traveling speed of the vehicle 10 exceeds the threshold speed, the motion vector magnitudes of the feature points 510 of the surrounding background and the feature points 520 of the monitored object become similar. Accordingly, when the traveling speed of the vehicle 10 exceeds the threshold speed, the image analysis unit 150 may not perform image analysis.

In addition, the image analysis unit 150 may analyze only the image frames 300 input from the cameras 20 corresponding to the gear stage of the vehicle 10, among the plurality of cameras 20 installed in the vehicle, to determine whether the monitored object is included.

FIG. 11 is a view showing a surveillance camera table showing the relationship between gear stages and cameras according to an embodiment of the present invention.

Referring to FIG. 11, the surveillance camera table 600 includes a gear stage field 610 and a camera field 620.

The gear stage field 610 contains the gear positions of the vehicle 10, and the camera field 620 contains the positions of the cameras installed in the vehicle 10.

The image analysis unit 150 may perform image analysis with reference to the surveillance camera table 600. That is, the image analysis unit 150 can perform image analysis only on the cameras corresponding to the current gear stage.

As described above, each camera 20 can generate an image including its own identification information, and the image analysis unit 150 can select the image frames 300 to analyze by referring to the identification information included in each image.

When the gear stage is P, the image analysis unit 150 can analyze the images received from the cameras installed on the front, rear, sides, and top of the vehicle 10.

When the gear stage is R, the image analysis unit 150 can perform analysis only on the images received from the cameras installed on the rear, sides, and top of the vehicle 10.

When the gear stage is N, the image analysis unit 150 can maintain the analysis according to the previous gear stage. For example, when the previous gear stage was P, the image analysis unit 150 analyzes only the images of the cameras corresponding to P; when the previous gear stage was R, it analyzes only the images of the cameras corresponding to R.

When the gear stage is D, the image analysis unit 150 can perform analysis only on the images received from the cameras installed on the front and sides of the vehicle 10.

However, the relationship between gear stages and cameras described above is exemplary; various relationships may exist, and the relationship may be modified in real time by the user.
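The surveillance camera table 600 can be sketched as a simple mapping; the camera position names and the fallback used when N has no recorded previous gear stage are assumptions for illustration:

```python
# Illustrative encoding of the surveillance camera table 600 (FIG. 11).
SURVEILLANCE_TABLE = {
    "P": {"front", "rear", "side", "top"},
    "R": {"rear", "side", "top"},
    "D": {"front", "side"},
}

def cameras_to_analyze(gear, prev_gear=None, table=SURVEILLANCE_TABLE):
    """Return the camera positions whose frames should be analyzed for
    the given gear stage; N keeps the previous stage's selection."""
    if gear == "N":
        # Fallback to P when no previous stage is known (assumption).
        gear = prev_gear if prev_gear is not None else "P"
    return table[gear]

print(sorted(cameras_to_analyze("R")))                 # ['rear', 'side', 'top']
print(sorted(cameras_to_analyze("N", prev_gear="D")))  # ['front', 'side']
```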

Referring again to FIG. 7, the output unit 160, provided in the vicinity of the driver's seat, outputs an alarm as an image according to the determination result of the image analysis unit 150. That is, the output unit 160 outputs an alarm when the monitored object exists, and may not output an alarm when the monitored object does not exist.

In addition to a visual manner such as an image, the output unit 160 may output an alarm in an audible or tactile manner. To this end, the output unit 160 may include video output means (not shown), audio output means (not shown), and vibration output means (not shown), and may output an alarm through each output means.

For example, the output unit 160 may output a notification that the monitored object exists in the image frame 300 generated by a specific camera. At this time, the output unit 160 can output the identification information of the corresponding camera: the camera 20 may generate an image including its own identification information, and the output unit 160 may output the identification information of the camera that detected the monitored object using the identification information included in the image. When image output means are included, the output unit 160 may also output the image captured by the camera.

Upon receiving the alarm, the user can take appropriate action. For example, if the user is the driver, he or she can stop driving and take safety measures.

The control unit 140 performs overall control of the input unit 110, the vehicle information sensing unit 120, the storage unit 130, the image analysis unit 150, and the output unit 160.

FIG. 12 is a flowchart illustrating a vehicle surroundings monitoring method according to an embodiment of the present invention.

In order to perform surroundings surveillance, the input unit 110 receives the image frame 300 from the camera 20 (S710). The input image frame is transmitted to the image analysis unit 150.

The vehicle information sensing unit 120 may determine whether the vehicle 10 is started (S720). When the vehicle 10 is started, the vehicle information sensing unit 120 checks the traveling speed of the vehicle 10 and transmits the checked traveling speed to the image analysis unit 150.

The image analysis unit 150 can check whether the traveling speed of the vehicle 10 is zero (S730). If the traveling speed of the vehicle 10 is zero, the image analysis unit 150 may perform image analysis using the learning-based algorithm to determine whether the monitored object exists (S740).

On the other hand, when the traveling speed of the vehicle 10 is not zero, the image analysis unit 150 can check whether the traveling speed of the vehicle 10 exceeds 0 and is less than or equal to the threshold speed (S750).

If the traveling speed is greater than 0 and less than or equal to the threshold speed, the image analysis unit 150 may perform image analysis using a vector-based algorithm to determine whether there is an object to be monitored (S760).

If it is determined that the monitored object exists according to the determination result of the image analysis unit 150, the output unit 160 outputs an alarm as an image (S770). The alarm output method of the output unit 160 is not limited to images; an alarm may also be output in an audible or tactile manner.
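The flow of steps S710 to S770 can be sketched as a single dispatch function; the analyzer and alarm callbacks are placeholders standing in for the image analysis unit 150 and output unit 160, not the patent's implementation:

```python
def monitor_step(frame, ignition_on, speed, threshold_speed,
                 learning_based, vector_based, output_alarm):
    """One pass of the monitoring flow of FIG. 12 (S710-S770):
    when the vehicle is started, choose the algorithm by speed,
    and raise an alarm if a monitored object is found."""
    if not ignition_on:                  # S720: vehicle not started
        return None
    if speed == 0:                       # S730 -> S740: learning-based
        found = learning_based(frame)
    elif 0 < speed <= threshold_speed:   # S750 -> S760: vector-based
        found = vector_based(frame)
    else:                                # above threshold: no analysis
        return None
    if found:                            # S770: output an alarm
        output_alarm()
    return found

# Usage with stub analyzers: a stationary vehicle uses the
# learning-based path and triggers the alarm callback.
alarms = []
monitor_step("frame", True, 0, 30,
             lambda f: True, lambda f: False,
             lambda: alarms.append("alarm"))
print(alarms)  # ['alarm']
```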

While the present invention has been described in connection with what is presently considered to be practical exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments; on the contrary, it is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims. It is therefore to be understood that the above-described embodiments are illustrative in all aspects and not restrictive.

110: Input unit
120: Vehicle information sensing unit
130:
140:
150: Image analysis unit
160: Output unit

Claims (12)

An input unit for receiving an image frame from a camera;
A vehicle information sensing unit for sensing a traveling speed of the vehicle equipped with the camera;
An image analysis unit for analyzing the image frame with a learning-based algorithm or a vector-based algorithm according to the traveling speed of the vehicle and determining whether a monitored object is included in the image frame; And
And an output unit for outputting an alarm according to the determination result as an image.
The apparatus according to claim 1,
Wherein the learning-based algorithm includes an algorithm for determining a degree of similarity between an object included in each image frame and modeling information provided in advance,
Wherein the vector-based algorithm includes an algorithm for determining a movement pattern of feature points between the image frames.
The apparatus according to claim 1,
Wherein the image analysis unit analyzes the image frame with the learning-based algorithm when the traveling speed of the vehicle is zero, and determines whether a monitored object is included in the image frame.
The apparatus of claim 3,
Wherein the image analysis unit determines whether a monitored object is included in the image frame by referring to at least one piece of modeling information for different monitored objects.
The apparatus according to claim 1,
Wherein the image analysis unit analyzes the image frame with the vector-based algorithm when the traveling speed of the vehicle exceeds zero and is equal to or less than a threshold speed, to determine whether a monitored object is included in the image frame.
The apparatus according to claim 1,
Wherein the image analysis unit analyzes an image frame input from a camera corresponding to a gear stage of the vehicle among a plurality of cameras to determine whether a monitored object is included in the image frame.
Receiving an image frame from a camera;
Sensing a traveling speed of the vehicle equipped with the camera; And
And analyzing the image frame with a learning-based algorithm or a vector-based algorithm according to the traveling speed of the vehicle, and determining whether a monitored object is included in the image frame.
8. The method of claim 7,
Wherein the learning-based algorithm includes an algorithm for determining a degree of similarity between an object included in each image frame and modeling information provided in advance,
Wherein the vector-based algorithm includes an algorithm for determining a movement pattern of feature points between the image frames.
8. The method of claim 7,
Wherein the determining whether the monitored object is included in the image frame comprises analyzing the image frame with the learning-based algorithm when the traveling speed of the vehicle is zero, and determining whether a monitored object is included in the image frame.
10. The method of claim 9,
Wherein the determining whether the monitored object is included in the image frame comprises determining whether a monitored object is included in the image frame by referring to at least one piece of modeling information for different monitored objects.
8. The method of claim 7,
Wherein the determining whether the monitored object is included in the image frame comprises analyzing the image frame with the vector-based algorithm when the traveling speed of the vehicle is greater than zero and equal to or less than the threshold speed, and determining whether a monitored object is included in the image frame.
8. The method of claim 7,
Wherein the determining whether the monitored object is included in the image frame comprises analyzing an image frame input from a camera corresponding to a gear stage of the vehicle among a plurality of cameras and determining whether a monitored object is included in the image frame.
KR1020150185284A 2015-12-23 2015-12-23 Apparatus and method for monitoring environment of vehicle KR101953796B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150185284A KR101953796B1 (en) 2015-12-23 2015-12-23 Apparatus and method for monitoring environment of vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150185284A KR101953796B1 (en) 2015-12-23 2015-12-23 Apparatus and method for monitoring environment of vehicle

Publications (2)

Publication Number Publication Date
KR20170075521A true KR20170075521A (en) 2017-07-03
KR101953796B1 KR101953796B1 (en) 2019-03-05

Family

ID=59357802

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150185284A KR101953796B1 (en) 2015-12-23 2015-12-23 Apparatus and method for monitoring environment of vehicle

Country Status (1)

Country Link
KR (1) KR101953796B1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023277422A1 (en) * 2021-06-30 2023-01-05 김배훈 Camera backup system and method for autonomous vehicle

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006318060A (en) * 2005-05-10 2006-11-24 Olympus Corp Apparatus, method, and program for image processing
JP2009135663A (en) * 2007-11-29 2009-06-18 Clarion Co Ltd Vehicle perimeter monitoring system
JP2013190949A (en) * 2012-03-13 2013-09-26 Toyota Central R&D Labs Inc Pedestrian detecting device and program
KR20140099079A (en) 2013-02-01 2014-08-11 조선대학교산학협력단 Blind spot monitoring device
KR20150062277A (en) * 2013-11-29 2015-06-08 현대모비스 주식회사 Apparatus and Method for Assisting Parking

Also Published As

Publication number Publication date
KR101953796B1 (en) 2019-03-05

Similar Documents

Publication Publication Date Title
JP6937443B2 (en) Imaging device and control method of imaging device
EP3467698B1 (en) Method for monitoring blind spot of vehicle and blind spot monitor using the same
CN109571468B (en) Security inspection robot and security inspection method
JP6888950B2 (en) Image processing device, external world recognition device
KR101967305B1 (en) Pedestrian detecting method in a vehicle and system thereof
KR101729486B1 (en) Around view monitor system for detecting blind spot and method thereof
JP4425642B2 (en) Pedestrian extraction device
KR20140076415A (en) Apparatus and method for providing information of blind spot
JP2001211466A (en) Image processing system having self-diagnostic function
KR20140107880A (en) method and system for checking doze at the vehicle
WO2020194584A1 (en) Object tracking device, control method, and program
KR101953796B1 (en) Apparatus and method for monitoring environment of vehicle
US11348347B2 (en) In-vehicle device
KR101680833B1 (en) Apparatus and method for detecting pedestrian and alert
KR20140088630A (en) System and method for vehicle monitoring
KR20130053605A (en) Apparatus and method for displaying around view of vehicle
JP5771955B2 (en) Object identification device and object identification method
KR20180041525A (en) Object tracking system in a vehicle and method thereof
KR20170075523A (en) Apparatus and method for monitoring environment of vehicle
CN109074710B (en) Anti-theft method and device for closed circuit television set top box
KR101750201B1 (en) Blind spot detection device using behavior of vehicle
KR20170075526A (en) Apparatus and method for monitoring environment
KR101633501B1 (en) AVM system for providing information of side distance and method thereof
KR101445362B1 (en) Device for Imagery Interpretation
KR101511567B1 (en) System for monitoring image and thereof method

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
AMND Amendment
E601 Decision to refuse application
AMND Amendment
X701 Decision to grant (after re-examination)
GRNT Written decision to grant