KR20160114323A - Environment monitoring apparatus for vehicle - Google Patents

Environment monitoring apparatus for vehicle

Info

Publication number
KR20160114323A
Authority
KR
South Korea
Prior art keywords
vehicle
correction
image
present
peripheral
Prior art date
Application number
KR1020150040639A
Other languages
Korean (ko)
Other versions
KR101678098B1 (en)
Inventor
예호철
Original Assignee
에스엘 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 에스엘 주식회사
Priority to KR1020150040639A
Publication of KR20160114323A
Application granted
Publication of KR101678098B1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/02 Rear-view mirror arrangements
    • B60R1/08 Rear-view mirror arrangements involving special optical features, e.g. avoiding blind spots, e.g. convex mirrors; Side-by-side associations of rear-view and other mirrors
    • B60R1/081 Rear-view mirror arrangements involving special optical features, e.g. avoiding blind spots, e.g. convex mirrors; Side-by-side associations of rear-view and other mirrors avoiding blind spots, e.g. by using a side-by-side association of mirrors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/20 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used
    • B60R2300/202 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used displaying a blind spot scene on the vehicle part responsible for the blind spot
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/802 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring and displaying vehicle exterior blind spot views

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Mechanical Engineering (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

The present invention relates to a peripheral monitoring apparatus for a vehicle, and more particularly, to a peripheral monitoring apparatus for a vehicle capable of correcting a tolerance of a camera installed in the vehicle for monitoring the surroundings of the vehicle.
A peripheral monitoring apparatus for a vehicle according to an embodiment of the present invention includes an image acquisition unit that acquires an image including a plurality of correction patterns in at least one direction around the vehicle, a peripheral image generating unit that generates a peripheral image of the vehicle using the acquired image, and a control unit that detects a correction point from the plurality of correction patterns included in the generated peripheral image and calculates a correction value for correcting the acquired image according to the detected correction point, wherein the correction pattern includes a plurality of pattern regions having different luminances, and at least a part of the plurality of pattern regions is a diffuse reflection region that diffusely reflects ambient light.


Description

[0001] The present invention relates to an environment monitoring apparatus for a vehicle.

BACKGROUND OF THE INVENTION The present invention relates to a peripheral monitoring apparatus for a vehicle, and more particularly, to a peripheral monitoring apparatus for a vehicle capable of correcting a tolerance of a camera installed in the vehicle to monitor the surroundings of the vehicle.

In general, a driver can easily monitor the area in front of the driver's seat, but it is relatively difficult to observe the sides or rear of the vehicle, resulting in blind spots. Most accidents that occur during operation of the vehicle, such as injuries to people or damage to structures, occur in these blind spots.

The driver monitors the surroundings, particularly the blind spots, through one or more mirrors provided on the vehicle. However, since monitoring by mirrors requires the driver's continuous attention, the driver's fatigue increases, and safe operation is limited when the driver's view is obstructed.

Recently, cameras have been installed in vehicles, and the images they acquire are displayed at the driver's seat so that the driver can identify obstacles around the vehicle.

The camera is installed at a predetermined height and angle. If the camera is not installed correctly, or if external factors cause it to deviate from the preset installation height or installation angle, the surrounding image cannot be acquired normally, and it is difficult for the driver to recognize and cope with such a situation.

Therefore, there is a demand for a method of acquiring a peripheral image normally even when the camera deviates from its predetermined installation height or installation angle.

Registered Patent Publication No. 10-0966288 (published on March 28, 2010)

SUMMARY OF THE INVENTION It is an object of the present invention to provide a vehicle surroundings monitoring apparatus that corrects the image acquired by a camera so that a surrounding image can be generated normally even when the camera, which acquires images of the surroundings of the vehicle, deviates from its predetermined installation height or installation angle.

Another object of the present invention is to provide a vehicle surroundings monitoring apparatus that diffusely reflects ambient light incident on the correction pattern used for correcting images acquired by the camera, thereby preventing the correction pattern from being recognized abnormally due to the ambient light.

The problems of the present invention are not limited to the above-mentioned problems, and other problems not mentioned can be clearly understood by those skilled in the art from the following description.

In order to achieve the above objects, a vehicle surroundings monitoring apparatus according to an embodiment of the present invention includes an image acquisition unit that acquires an image including a plurality of correction patterns in at least one direction around the vehicle, a peripheral image generating unit that generates a peripheral image of the vehicle using the acquired image, and a control unit that detects a correction point from the plurality of correction patterns included in the generated peripheral image and calculates a correction value for correcting the acquired image according to the detected correction point, wherein the correction pattern includes a plurality of pattern regions having different luminances, and at least a part of the plurality of pattern regions is a diffuse reflection region that diffusely reflects ambient light.

The details of other embodiments are included in the detailed description and drawings.

According to the vehicle surroundings surveillance apparatus of the present invention, one or more of the following effects can be obtained.

Since the image acquired by the camera installed in the vehicle can be corrected so that the surrounding image is generated normally even when a tolerance occurs in the camera, even a person who is not a skilled expert can easily correct the tolerance of the camera.

In addition, since ambient light incident on the correction pattern used for correcting the tolerance of the camera is diffusely reflected, the correction pattern is prevented from being recognized abnormally by the camera due to the ambient light, so that the correction can be performed accurately.

The effects of the present invention are not limited to the effects mentioned above, and other effects not mentioned can be clearly understood by those skilled in the art from the description of the claims.

FIG. 1 is a block diagram showing a vehicle surroundings monitoring apparatus according to an embodiment of the present invention.
FIG. 2 is a schematic diagram illustrating a plurality of cameras according to an embodiment of the present invention.
FIG. 3 is a schematic diagram illustrating a peripheral image according to an embodiment of the present invention.
FIG. 4 is a schematic diagram illustrating a peripheral image when a tolerance of a camera has occurred according to an embodiment of the present invention.
FIG. 5 is a schematic diagram showing correction patterns located around a vehicle according to an embodiment of the present invention.
FIG. 6 is a schematic diagram showing a correction pattern according to an embodiment of the present invention.
FIG. 7 is a schematic diagram illustrating correction points according to an embodiment of the present invention.
FIG. 8 is a schematic diagram showing reflected light reflected by a correction pattern according to an embodiment of the present invention.
FIG. 9 is a schematic diagram showing a correction pattern as acquired by a camera when ambient light is present according to an embodiment of the present invention.
FIG. 10 is a schematic diagram showing reflected light diffusely reflected by a correction pattern according to an embodiment of the present invention.
FIGS. 11 and 12 are schematic diagrams showing correction patterns that diffusely reflect incident ambient light according to an embodiment of the present invention.
FIG. 13 is a schematic diagram showing correction points detected from a peripheral image according to an embodiment of the present invention.

The advantages and features of the present invention, and the manner of achieving them, will become apparent with reference to the embodiments described below in detail in conjunction with the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art, and the invention is defined only by the scope of the claims. Like reference numerals refer to like elements throughout the specification.

Thus, in some embodiments, well-known process steps, well-known structures, and well-known techniques are not described in detail in order to avoid an undue interpretation of the present invention.

The terminology used herein is for the purpose of describing embodiments and is not intended to limit the present invention. In this specification, the singular includes the plural unless specifically stated otherwise. The terms "comprises" and/or "comprising" used in this specification do not exclude the presence or addition of one or more components, steps, and/or operations other than those stated. The term "and/or" includes each of the mentioned items and every combination of one or more of them.

Further, the embodiments described herein will be described with reference to perspective views, cross-sectional views, side views, and/or schematic views, which are idealized illustrations of the present invention. Accordingly, the shapes shown in the illustrations may be modified by manufacturing techniques and/or tolerances, and the embodiments of the present invention are not limited to the specific forms shown but also include changes in shape produced according to the manufacturing process. In addition, in the drawings of the present invention, each component may be shown somewhat enlarged or reduced for convenience of explanation.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, a vehicle surroundings monitoring apparatus according to embodiments of the present invention will be described with reference to the drawings.

FIG. 1 is a block diagram showing a vehicle surroundings monitoring apparatus according to an embodiment of the present invention.

The surroundings monitoring apparatus 1 for a vehicle according to an embodiment of the present invention includes an image acquisition unit 100, a distortion correction unit 200, a peripheral image generating unit 300, and a control unit 400.
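For illustration only, and not as part of the patent disclosure, the four units listed above can be viewed as stages of one simple processing chain. The sketch below is a hypothetical outline in Python; all names and interfaces are assumptions made for the example.

```python
# Illustrative sketch: the four units (image acquisition 100, distortion
# correction 200, peripheral image generation 300, control 400) modelled as
# stages of one processing chain. Names and interfaces are assumptions.
from typing import Callable, List

import numpy as np

Image = np.ndarray

def run_surround_monitoring(
    acquire_images: Callable[[], List[Image]],            # image acquisition unit (100)
    correct_distortion: Callable[[Image], Image],         # distortion correction unit (200)
    generate_peripheral: Callable[[List[Image]], Image],  # peripheral image generating unit (300)
    compute_correction: Callable[[Image], dict],          # control unit (400)
) -> dict:
    raw = acquire_images()
    undistorted = [correct_distortion(img) for img in raw]
    peripheral = generate_peripheral(undistorted)
    # The control unit inspects the peripheral image for correction patterns
    # and returns correction values fed back to the image processing unit.
    return compute_correction(peripheral)
```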

The image acquisition unit 100 may include a plurality of cameras 111 to 114 and an image processing unit 120.

The plurality of cameras 111 to 114 may serve to acquire images of different directions around the vehicle.

As shown in FIG. 2, the plurality of cameras 111 to 114 may be installed in the front, rear, left, and right directions of the vehicle, respectively, to acquire images in all directions around the vehicle. For example, they may be installed near the bumpers or near the left and right outside mirrors.

In the embodiment of the present invention, the plurality of cameras 111 to 114 will be referred to as a first camera 111, a second camera 112, a third camera 113, and a fourth camera 114, and the images acquired by the first to fourth cameras 111 to 114 will be referred to as a first image 111a, a second image 112a, a third image 113a, and a fourth image 114a.

In the embodiment of the present invention, a case where the plurality of cameras 111 to 114 are installed in the front, rear, left, and right directions of the vehicle is described as an example. However, the present invention is not limited thereto, and the number of cameras installed may vary.

For example, in a case where a wide angle lens having a relatively large angle of view is used for a plurality of cameras 111 to 114, the number of installations can be reduced, and when a narrow angle lens is used, the number of installations can be increased.

The images acquired by the plurality of cameras 111 to 114 may together cover all directions around the vehicle, that is, 360 degrees. As described above, when the plurality of cameras 111 to 114 are installed in the front, rear, left, and right directions of the vehicle, each camera may use a wide-angle lens having an angle of view of approximately 180 degrees.

Further, in order to acquire images in all directions around the vehicle, the plurality of cameras 111 to 114 do not necessarily have to be installed in the front, rear, left, and right directions of the vehicle; the installation positions may be changed in various ways.

As described above, when a wide-angle lens is used for the plurality of cameras 111 to 114, a wide angle of view can be achieved; on the other hand, image distortion, that is, radial distortion, may occur, so the images acquired by the plurality of cameras 111 to 114 may be subjected to distortion correction processing by a distortion correction algorithm.

The image processing unit 120 may convert the images acquired by the plurality of cameras 111 to 114 into a compression format to facilitate data transmission before transmitting them to the distortion correction unit 200; for example, a compression format such as MPEG-1 or MPEG-4 may be used.

In addition, the image processing unit 120 may correct a tolerance that occurs when the plurality of cameras 111 to 114 deviate from a preset installation position or installation angle, which will be described in detail later.

As described above, the distortion correction unit 200 may correct the radial distortion that occurs when a wide-angle lens is used in the plurality of cameras 111 to 114. The distortion correction unit 200 may determine a correction rate using a predetermined distortion correction algorithm and correct the acquired image according to the correction rate.
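As an illustration of the kind of radial distortion correction described above (not the patent's specific algorithm), the sketch below uses the OpenCV pinhole camera model; the intrinsic matrix and distortion coefficients are placeholder values that would normally come from calibrating each wide-angle camera.

```python
# Minimal sketch of radial distortion correction with OpenCV.
# K and dist are placeholder values, not parameters from the patent.
import cv2
import numpy as np

def correct_radial_distortion(image: np.ndarray) -> np.ndarray:
    h, w = image.shape[:2]
    # Hypothetical intrinsics: focal lengths and principal point in pixels.
    K = np.array([[500.0, 0.0, w / 2.0],
                  [0.0, 500.0, h / 2.0],
                  [0.0, 0.0, 1.0]])
    # Hypothetical distortion coefficients (k1, k2, p1, p2, k3).
    dist = np.array([-0.30, 0.08, 0.0, 0.0, 0.0])
    # Remap the image so that straight lines in the scene appear straight.
    return cv2.undistort(image, K, dist)
```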

Here, in the embodiment of the present invention, a case where a wide-angle lens is used for the plurality of cameras 111 to 114 and the radial distortion is corrected by the distortion correction unit 200 is described as an example. However, the present invention is not limited thereto; when a wide-angle lens is not used or the distortion correction unit 200 is omitted, the image acquired by the image acquisition unit 100 may be transmitted to the peripheral image generating unit 300 without undergoing radial distortion correction.

The peripheral image generating unit 300 synthesizes the images 111a to 114a acquired by the plurality of cameras 111 to 114 to generate a top-view peripheral image as seen from above the vehicle, and an icon or an image representing the vehicle may be inserted into the empty space at the center of the peripheral image.

At this time, depending on the presence or absence of the distortion correction unit 200, the peripheral image generating unit 300 may synthesize the images either before or after radial distortion correction. In the embodiment of the present invention, a case where the peripheral image generating unit 300 generates the peripheral image 310 by synthesizing images whose radial distortion has been corrected by the distortion correction unit 200 will be described.
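For illustration, a top-view peripheral image of this kind is often produced by warping each camera image onto the ground plane and overlaying the results; the sketch below shows one such approach with assumed point correspondences, and it is not the composition method disclosed in the patent.

```python
# Sketch: warp one camera image onto a top-view canvas and overlay the views.
# The four point correspondences and canvas size are placeholder assumptions.
import cv2
import numpy as np

def warp_to_top_view(image: np.ndarray,
                     img_pts: np.ndarray,
                     ground_pts: np.ndarray,
                     canvas_size: tuple) -> np.ndarray:
    # img_pts: pixel positions of four ground marks seen by the camera.
    # ground_pts: the same four marks in top-view canvas coordinates.
    H = cv2.getPerspectiveTransform(img_pts.astype(np.float32),
                                    ground_pts.astype(np.float32))
    return cv2.warpPerspective(image, H, canvas_size)

def compose_peripheral_image(warped_views: list, canvas_size: tuple) -> np.ndarray:
    # Overlay the per-camera top-view patches; black pixels count as empty.
    canvas = np.zeros((canvas_size[1], canvas_size[0], 3), dtype=np.uint8)
    for view in warped_views:
        mask = view.any(axis=2)
        canvas[mask] = view[mask]
    # An icon representing the vehicle would be drawn into the empty centre.
    return canvas
```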

The peripheral image 310 generated by the peripheral image generating unit 300 is displayed through various display devices provided in the vehicle, such as a navigation system, an AV system, or a HUD (Head Up Display), so that the driver can recognize obstacles such as a pedestrian 311 and a surrounding vehicle 312 in advance.

At this time, when the plurality of cameras 111 to 114 are at their predetermined installation heights and installation angles, the peripheral image generating unit 300 generates the peripheral image 310 normally, so that the driver can correctly recognize the obstacles 311 and 312. However, when a physical error, that is, a mechanical error, occurs and the plurality of cameras 111 to 114 deviate from the predetermined installation height or installation angle, a tolerance is generated. In this case, at least one of the obstacles 311 and 312, for example the obstacle 312, is not displayed normally, and it is difficult for the driver to recognize the obstacles 311 and 312 correctly.

Therefore, in the embodiment of the present invention, the control unit 400 performs a process of correcting the tolerances generated in the plurality of cameras 111 to 114, and this tolerance correction process will be described in detail below.

In the embodiment of the present invention, in order to correct the tolerances generated in the plurality of cameras 111 to 114, a plurality of correction patterns P are positioned around the vehicle as shown in FIG. 5, the plurality of cameras 111 to 114 acquire a plurality of images 111a to 114a including the plurality of correction patterns P, and the control unit 400 detects, from the plurality of correction patterns P included in the peripheral image 310 generated by the peripheral image generating unit 300, the correction points necessary for calculating the correction values used to correct the tolerances of the plurality of cameras 111 to 114.

The positions and the number of the correction patterns P used when correcting the tolerances generated in the plurality of cameras 111 to 114 can be changed in various ways.

The correction pattern P may include a plurality of pattern regions P11, P12, P21, and P22 having different luminances, as shown in FIG. 6. In the embodiment of the present invention, the pattern regions P11 and P12 having a luminance higher than a reference luminance are referred to as high-luminance regions, and the pattern regions P21 and P22 having a luminance lower than the reference luminance are referred to as low-luminance regions.

The correction pattern P is composed of the high-luminance regions P11 and P12 and the low-luminance regions P21 and P22 because the luminance difference between the two kinds of regions makes edge detection easy, and the correction point CP may be determined as a predetermined point on the detected edges.

For example, when the high-luminance regions P11 and P12 and the low-luminance regions P21 and P22 of the correction pattern P are arranged alternately in a plurality of directions as shown in FIG. 6, a transverse edge E1 and a longitudinal edge E2 can each be detected, and the point at which the transverse edge E1 and the longitudinal edge E2 intersect can be detected as the correction point CP.
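A minimal, illustrative sketch of this step follows: it builds a synthetic pattern with two bright and two dark regions arranged alternately and locates the point where the transverse and longitudinal edges intersect using a generic corner detector. The detector choice and all values are assumptions, not the detection method prescribed by the patent.

```python
# Sketch: synthesize a 2x2 high/low-luminance pattern and detect the edge
# intersection (correction point CP) with a Harris-based corner detector.
import cv2
import numpy as np

# Bright regions on one diagonal (P11, P12); the other quadrants stay dark (P21, P22).
pattern = np.zeros((200, 200), dtype=np.uint8)
pattern[:100, :100] = 255
pattern[100:, 100:] = 255

# The strongest corner response lies where the horizontal and vertical edges cross.
corners = cv2.goodFeaturesToTrack(pattern, maxCorners=1, qualityLevel=0.1,
                                  minDistance=10, useHarrisDetector=True)
cp = corners.reshape(-1, 2)[0]
print("detected correction point:", cp)   # close to (100, 100)
```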

In the embodiment of the present invention, the high-luminance regions P11 and P12 and the low-luminance regions P21 and P22 are each composed of two regions. However, the present invention is not limited thereto, and the number and arrangement direction of the high-luminance regions P11 and P12 and the low-luminance regions P21 and P22 can be changed in various ways according to the criterion for detecting the correction point CP.

In the embodiment of the present invention, detecting the point at which the transverse edge E1 intersects the longitudinal edge E2 as the correction point CP is merely an example for facilitating understanding of the present invention. The present invention is not limited thereto, and the correction point CP may be determined as any other predetermined point.

The shape of the correction pattern P is not limited to the above-described embodiment, and may have various shapes according to the criteria for detecting the correction point CP.

In the embodiment of the present invention, a white pattern is used for the high-luminance regions P11 and P12, and a black pattern is used for the low-luminance regions P21 and P22. However, the present invention is not limited thereto, and colors other than white and black that have a large luminance difference may be used to facilitate edge detection.

As shown in FIG. 5, since a plurality of correction patterns P are located around the vehicle, the control unit 400 can detect a correction point CP for each correction pattern P. In the embodiment of the present invention, a case where the control unit 400 stores in advance the positional relationship (for example, coordinates) between the correction patterns P will be described.

In other words, since the control unit 400 knows the positional relationship between the correction patterns P, when the correction point CP of each correction pattern P is detected, the positional relationship between two or more detected correction points CP can be compared with the previously stored positional relationship to determine whether a tolerance has occurred, and the tolerance can be corrected by adjusting the image so that the positional relationship between the detected correction points CP matches the previously stored positional relationship.
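The comparison described here can be illustrated with a simple residual check between the detected correction points and the pre-stored layout; the coordinates and threshold below are invented for the sketch and are not values from the patent.

```python
# Sketch: decide whether a tolerance has occurred by comparing detected
# correction points with their pre-stored positions (all values made up).
import numpy as np

stored_cp = np.array([[120.0, 300.0], [480.0, 300.0],
                      [120.0, 620.0], [480.0, 620.0]])
detected_cp = np.array([[131.0, 297.0], [489.0, 305.0],
                        [118.0, 624.0], [483.0, 615.0]])

# Per-point displacement between the detected and stored correction points.
residual = detected_cp - stored_cp
max_error_px = float(np.linalg.norm(residual, axis=1).max())

TOLERANCE_THRESHOLD_PX = 5.0   # hypothetical acceptance threshold
if max_error_px > TOLERANCE_THRESHOLD_PX:
    print(f"tolerance detected: max deviation {max_error_px:.1f} px")
else:
    print("camera within tolerance; no correction needed")
```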

At this time, when ambient light such as natural light or artificial light is incident on the correction pattern P, the light reflected by the correction pattern P may cause the luminance of the correction pattern P to appear different when the image is acquired by the camera.

That is, as shown in FIG. 8, when ambient light L1 is incident on at least one of the low-luminance regions P21 and P22 of the correction pattern P and the reflected light L2 is captured by the camera, at least one of the low-luminance regions P21 and P22 of the correction pattern P included in the acquired image may be recognized as a high-luminance region as shown in FIG. 9, making it difficult to detect the edges and therefore the correction point CP.

To address this, in the embodiment of the present invention, at least a part of the correction pattern P is formed as a diffuse reflection region so that the reflected light L2 from the correction pattern P is scattered, that is, the reflected light L2 incident on the camera is reduced, thereby preventing the luminance increase caused by the ambient light L1.

For example, as shown in FIG. 11, the surface of the correction pattern P may be roughened with sandpaper or the like, or the correction pattern P may be formed of a material whose surface is itself rough, so that the reflected light L2 incident on the camera is reduced and the correction point CP can be detected accurately.

As shown in FIG. 12, a diffuse reflection layer M may be disposed on at least a part of the correction pattern P so that the reflected light L2 is diffusely reflected. The diffuse reflection layer M may be a diffuse reflection coating or film, or a cover for diffuse reflection (for example, diffuse reflection glass).

It is also possible to reduce the reflected light L2 by forming at least one of the pattern regions P11, P12, P21, and P22 of the correction pattern P using diffusely reflecting ink, paint, or the like.

The method of forming the diffuse reflection region in at least a partial region of the correction pattern P is not limited to the above-described examples, and various methods that scatter the incident ambient light L1 can be used.

As described above, in the embodiment of the present invention, at least a partial region of the correction pattern P is formed as a diffuse reflection region in which the reflected light L2 is scattered. In particular, it can be understood that the low-luminance regions diffusely reflect the reflected light by using the various methods described above.

As described above, in the embodiment of the present invention, reflected light is scattered by the correction pattern P to prevent the brightness from increasing, so that the correction pattern can be correctly recognized by the camera.

When the peripheral image generating unit 300 synthesizes the images including the plurality of correction patterns P acquired by the plurality of cameras 111 to 114 to generate the peripheral image 310, the control unit 400 detects the correction points CP from the plurality of correction patterns P and calculates, from the positional relationship of the detected correction points CP, a correction value for correcting the tolerance of the plurality of cameras 111 to 114.

For example, when the correction points CP detected from the peripheral image 310 generated by the peripheral image generating unit 300 are as shown in FIG. 13, the control unit 400 can judge the positional relationship between the correction points CP.

That is, as described above, since the control unit 400 stores in advance the positional relationship among the plurality of correction patterns P located around the vehicle, it can calculate a correction value for correcting the images acquired by the plurality of cameras 111 to 114 so that the positional relationship between the detected correction points CP corresponds to the pre-stored positional relationship.

For example, when the control unit 400 detects that the positional relationship of any two correction points CP included in the first image 111a acquired by the first camera 111 is tilted by a certain angle from the predetermined positional relationship, it can calculate a correction value for compensating for the tolerance caused thereby.

The correction value calculated by the control unit 400 is transmitted to the image processing unit 120 of the image acquisition unit 100, and the image processing unit 120 may correct at least one of the images acquired by the cameras 111 to 114 in accordance with the transmitted correction value, for example, by rotating the image in a predetermined direction so that the correction points CP have the predetermined positional relationship.
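As an illustration of the rotation-type correction mentioned here, the sketch below estimates how far a pair of detected correction points is tilted from its expected orientation and rotates the image to compensate; the point coordinates are invented for the example, and a rotation is only one of the possible correction values.

```python
# Sketch: estimate a compensating rotation from two correction points and
# apply it to a camera image (example values only).
import cv2
import numpy as np

def line_angle_deg(p0: np.ndarray, p1: np.ndarray) -> float:
    dx, dy = p1 - p0
    return float(np.degrees(np.arctan2(dy, dx)))

def rotation_correction_deg(detected: np.ndarray, expected: np.ndarray) -> float:
    # How far the detected point pair is tilted away from its expected orientation.
    return line_angle_deg(*detected) - line_angle_deg(*expected)

def apply_rotation(image: np.ndarray, angle_deg: float) -> np.ndarray:
    h, w = image.shape[:2]
    # Rotate about the image centre by the compensating angle.
    M = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), angle_deg, 1.0)
    return cv2.warpAffine(image, M, (w, h))

# Example: the detected pair is tilted roughly 3 degrees from the expected line.
expected = np.array([[100.0, 200.0], [400.0, 200.0]])
detected = np.array([[100.0, 200.0], [399.6, 215.7]])
angle = rotation_correction_deg(detected, expected)
frame = np.zeros((480, 640, 3), dtype=np.uint8)   # stand-in for image 111a
corrected = apply_rotation(frame, angle)
print(f"compensating rotation: {angle:.1f} degrees")
```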

In the embodiment of the present invention, the image processing unit 120 corrects the tolerance of the image according to the correction value transmitted from the control unit 400. However, the present invention is not limited thereto, and the tolerance correction may be performed at any stage before the image is displayed through the display device.

In the embodiment of the present invention, the image processing unit 120 is included in the image acquisition unit 100. However, the present invention is not limited thereto, and the image processing unit 120 may be configured separately to perform the tolerance correction.

As described above, in the embodiment of the present invention, when an image cannot be acquired normally due to a tolerance generated in the plurality of cameras 111 to 114, the image can be corrected easily through the correction pattern P without going through a separate tolerance correction process.

It will be understood by those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The above-described embodiments are therefore to be understood as illustrative in all aspects and not restrictive. The scope of the present invention is defined by the appended claims rather than by the foregoing detailed description, and all changes or modifications derived from the meaning and scope of the claims and their equivalents should be construed as being included in the scope of the present invention.

100: image acquisition unit
111 to 114: camera
120: image processing unit
200: distortion correction unit
300: peripheral image generating unit
400: control unit

Claims (8)

1. A surroundings monitoring apparatus for a vehicle, comprising:
an image acquisition unit that acquires an image including a plurality of correction patterns in at least one direction around the vehicle;
a peripheral image generating unit that generates a peripheral image of the vehicle using the acquired image; and
a control unit that detects a correction point from the plurality of correction patterns included in the generated peripheral image and calculates a correction value for correcting the acquired image according to the detected correction point,
wherein each of the plurality of correction patterns includes a plurality of pattern regions having different luminances, and
at least a part of the plurality of pattern regions is a diffuse reflection region that diffusely reflects ambient light.
2. The apparatus according to claim 1,
wherein the plurality of pattern regions include a low-luminance region having a luminance lower than a reference luminance and a high-luminance region having a luminance higher than the reference luminance, and
the low-luminance region and the high-luminance region are arranged alternately in at least one direction.
3. The apparatus according to claim 2,
wherein the diffuse reflection region is the low-luminance region.
4. The apparatus according to claim 1,
wherein the diffuse reflection region is a region in which the surface of the plurality of pattern regions is roughened.
5. The apparatus according to claim 1,
wherein the diffuse reflection region is a region in which a diffuse reflection layer is located.
6. The apparatus according to claim 5,
wherein the diffuse reflection layer includes a diffuse reflection film or a diffuse reflection coating.
7. The apparatus according to claim 5,
wherein the diffuse reflection layer is made of diffuse reflection glass.
8. The apparatus according to claim 1,
wherein the diffuse reflection region is a region in which diffuse reflection ink or diffuse reflection paint is used.
KR1020150040639A 2015-03-24 2015-03-24 Environment monitoring apparatus for vehicle KR101678098B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150040639A KR101678098B1 (en) 2015-03-24 2015-03-24 Environment monitoring apparatus for vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150040639A KR101678098B1 (en) 2015-03-24 2015-03-24 Environment monitoring apparatus for vehicle

Publications (2)

Publication Number Publication Date
KR20160114323A (en) 2016-10-05
KR101678098B1 (en) 2016-11-23

Family

ID=57153852

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150040639A KR101678098B1 (en) 2015-03-24 2015-03-24 Environment monitoring apparatus for vehicle

Country Status (1)

Country Link
KR (1) KR101678098B1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08276787A (en) * 1995-04-03 1996-10-22 Suzuki Motor Corp On-vehicle image processing device and image display system
JP2003312408A (en) * 2002-04-17 2003-11-06 Toshiba Corp Image diagnosing device, image diagnosing system of on- vehicle image monitoring device and passing vehicle monitoring device
JP2006178667A (en) * 2004-12-21 2006-07-06 Nissan Motor Co Ltd Video correcting apparatus and method for vehicle
KR100966288B1 (en) 2009-01-06 2010-06-28 주식회사 이미지넥스트 Around image generating method and apparatus
KR101272257B1 (en) * 2011-08-30 2013-06-19 에스엘 주식회사 Apparatus and method for compensating around image of vehicle

Also Published As

Publication number Publication date
KR101678098B1 (en) 2016-11-23

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20190925

Year of fee payment: 4