KR20160144643A - Apparatus for providing around view and vehicle including the same - Google Patents

Apparatus for providing around view and vehicle including the same

Info

Publication number
KR20160144643A
Authority
KR
South Korea
Prior art keywords
vehicle
information
parking
processor
mobile terminal
Prior art date
Application number
KR1020150081049A
Other languages
Korean (ko)
Inventor
오성찬
김태경
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 filed Critical 엘지전자 주식회사
Priority to KR1020150081049A priority Critical patent/KR20160144643A/en
Publication of KR20160144643A publication Critical patent/KR20160144643A/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W10/00Conjoint control of vehicle sub-units of different type or different function
    • B60W10/18Conjoint control of vehicle sub-units of different type or different function including control of braking systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W10/00Conjoint control of vehicle sub-units of different type or different function
    • B60W10/20Conjoint control of vehicle sub-units of different type or different function including control of steering systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W2050/0078
    • B60W2050/0081
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/04Monitoring the functioning of the control system
    • B60W50/045Monitoring control system parameters
    • B60W2050/046Monitoring control system parameters involving external transmission of data to or from the vehicle, e.g. via telemetry, satellite, Global Positioning System [GPS]
    • B60W2050/14
    • B60W2550/406

Abstract

The present invention relates to an apparatus for providing an around view image and a vehicle having the same. An apparatus for providing an around view image comprises: a plurality of around view cameras for providing around view images; a communication unit for exchanging data with a mobile terminal; and a processor that generates an around view image based on images captured by the plurality of around view cameras when the vehicle is reversing or moving at a speed lower than a predetermined speed, extracts a plurality of parking zone identification markers from the plurality of around view images generated during a first period when the vehicle is parked, and controls the extracted parking zone identification markers to be transmitted to the mobile terminal. Accordingly, the present invention makes it possible to provide position information of the vehicle at the time of parking by using the around view cameras.

Description

BACKGROUND OF THE INVENTION Field of the Invention [0001] The present invention relates to an apparatus for providing an around view image and, more particularly, to an apparatus for providing an around view image of a vehicle when the vehicle is parked.

A vehicle is a device that moves a user on board in a desired direction. A typical example is an automobile.

Meanwhile, various sensors and electronic devices are provided for the convenience of users of the vehicle. In particular, various devices for the driver's convenience have been developed; for example, an image captured by a rear camera is provided when the vehicle is reversed or parked.

It is an object of the present invention to provide an apparatus for providing position information of a vehicle when the vehicle is parked, using around view cameras, and a vehicle having the same.

According to an aspect of the present invention, there is provided an apparatus for providing an around view image, including: a plurality of around view cameras for providing around view images; a communication unit for exchanging data with a mobile terminal; and a processor that generates an around view image based on images captured by the plurality of around view cameras when the vehicle is reversing or traveling below a predetermined speed, extracts a plurality of parking zone identification markers from the plurality of around view images generated during a first period when the vehicle is parked, and controls the extracted parking zone identification markers to be transmitted to the mobile terminal.

According to another aspect of the present invention, there is provided a vehicle including: a steering driving unit for driving a steering device; a brake driving unit for driving a brake device; a power source driving unit for driving a power source; a plurality of around view cameras; a communication unit for exchanging data with a mobile terminal; and a processor that generates an around view image based on images captured by the plurality of around view cameras when the vehicle is reversing or traveling below a predetermined speed, extracts a plurality of parking zone identification markers from the plurality of around view images generated during a first period when the vehicle is parked, and controls the extracted parking zone identification markers to be transmitted to the mobile terminal.

The apparatus for providing an around view image according to an embodiment of the present invention, and the vehicle including the same, comprise a plurality of around view cameras, a communication unit for exchanging data with a mobile terminal, and a processor that generates around view images when the vehicle is reversing or traveling below a predetermined speed, extracts a plurality of parking zone identification markers from the around view images generated during a first period when the vehicle is parked, and controls the extracted parking zone identification markers to be transmitted to the mobile terminal. Accordingly, position information of the vehicle at the time of parking can be provided using the around view cameras.
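
As a rough illustration of this flow, the sketch below collects candidate markers while the vehicle is reversing or moving slowly and pushes them to the phone once parked. The class and function names (AroundViewProvider, extract_markers, send_to_terminal), the speed threshold, and the contrast-based marker detector are illustrative assumptions, not the patent's implementation; an actual system would recognize pillar signs such as parking zone numbers with a dedicated text or sign detector.

```python
# Minimal sketch of the described flow; names and thresholds are assumptions.
import cv2
import numpy as np

SPEED_THRESHOLD_KPH = 10.0               # assumed "predetermined speed"

class AroundViewProvider:
    def __init__(self, send_to_terminal):
        self.send_to_terminal = send_to_terminal   # e.g. a Bluetooth/Wi-Fi callback
        self.markers = []

    def should_run(self, speed_kph, in_reverse):
        # Cameras are used when the vehicle is reversing or below the predetermined speed.
        return in_reverse or speed_kph < SPEED_THRESHOLD_KPH

    def compose_around_view(self, left, rear, right, front):
        # Stand-in composition: the four (already top-down, same-sized) views are tiled.
        return np.vstack([np.hstack([left, front]), np.hstack([rear, right])])

    def extract_markers(self, around_view):
        # Placeholder detector: bright, high-contrast regions stand in for
        # painted parking zone identification markers (e.g. "B2-17").
        gray = cv2.cvtColor(around_view, cv2.COLOR_BGR2GRAY)
        _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 500]

    def on_frame(self, speed_kph, in_reverse, left, rear, right, front):
        if self.should_run(speed_kph, in_reverse):
            view = self.compose_around_view(left, rear, right, front)
            self.markers.extend(self.extract_markers(view))

    def on_parked(self):
        # After parking, the collected markers are transmitted to the mobile terminal.
        self.send_to_terminal(self.markers)
```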

Meanwhile, by transmitting the plurality of parking zone identification markers sequentially according to the traveling route of the vehicle, or by attaching travel route information of the vehicle to the plurality of parking zone identification markers before transmission, the parking position information can be recognized intuitively.

Meanwhile, when the ignition is turned off after the vehicle is parked, the plurality of around view cameras may be activated for a second period, an around view image may be generated based on the images captured by the plurality of around view cameras, walking path information or parking lot entrance/exit information may be extracted, and the walking path information or the parking lot entrance/exit information may additionally be transmitted to the mobile terminal, so that more accurate parking position information of the vehicle can be recognized.

Meanwhile, by generating a parking map using the plurality of around view images generated during the first period, adding the extracted plurality of parking zone identification markers to the parking map, and transmitting the parking map to the mobile terminal, the parking position information of the vehicle can be recognized easily.

FIG. 1 is a conceptual diagram of a vehicle communication system including an around view providing apparatus according to an embodiment of the present invention.
FIG. 2A is a diagram showing the appearance of a vehicle having various cameras.
FIG. 2B is a view showing the appearance of a stereo camera attached to the vehicle of FIG. 2A.
FIG. 2C is a view showing an example of the positions of the around view cameras attached to the vehicle of FIG. 2A.
FIG. 2D illustrates an around view image based on images captured by the around view cameras of FIG. 2C.
FIGS. 2E to 2F are views showing various examples of the positions of the around view cameras attached to the vehicle of FIG. 2A.
FIGS. 3A to 3B illustrate various examples of an internal block diagram of the vehicle driving assistance device of FIG. 1.
FIGS. 3C to 3D illustrate various examples of an internal block diagram of the around view providing apparatus of FIG. 1.
FIG. 3E is an internal block diagram of the vehicle display device of FIG. 1.
FIGS. 4A to 4B illustrate various examples of internal block diagrams of the processors of FIGS. 3A to 3D.
FIG. 5 is a diagram illustrating object detection in the processor of FIGS. 4A to 4B.
FIGS. 6A and 6B are views referred to in the description of the operation of the vehicle driving assistance device of FIGS. 3A to 3B.
FIG. 7 is an example of an internal block diagram of a vehicle according to an embodiment of the present invention.
FIG. 8 is a diagram illustrating an example of an operation method of an around view providing apparatus according to an embodiment of the present invention.
FIGS. 9A and 9B are diagrams referred to in the description of depth map generation using the around view image.
FIG. 10 is an example of an internal block diagram of the mobile terminal of FIG. 1.
FIGS. 11A to 12C are views referred to in the description of an operation method of an around view providing apparatus according to an embodiment of the present invention.

Hereinafter, the present invention will be described in detail with reference to the drawings.

The suffix "module" and " part "for components used in the following description are given merely for convenience of description, and do not give special significance or role in themselves. Accordingly, the terms "module" and "part" may be used interchangeably.

The vehicle described herein may include an automobile and a motorcycle. Hereinafter, the description will be given mainly with respect to an automobile.

Meanwhile, the vehicle described in the present specification may include a vehicle having an engine as a power source, a hybrid vehicle having an engine and an electric motor as power sources, and an electric vehicle having an electric motor as a power source.

FIG. 1 is a conceptual diagram of a vehicle communication system including an around view providing apparatus according to an embodiment of the present invention.

Referring to the drawing, a vehicle communication system 10 may include a vehicle 200, terminals 600a and 600b, and a server 500.

The vehicle 200 may include an autonomous traveling device 100, a vehicle display device 400, and the like in the interior of the vehicle.

Meanwhile, the autonomous traveling device 100 may include a vehicle driving assistance device 100a, an around view providing apparatus 100b, and the like.

For example, for autonomous driving of the vehicle, when the vehicle speed is equal to or greater than a predetermined speed, autonomous travel of the vehicle may be performed through the vehicle driving assistance device 100a, and when the vehicle speed is less than the predetermined speed, autonomous travel may be performed through the around view providing apparatus 100b.

As another example, the vehicle driving assistance device 100a and the around view providing apparatus 100b may be operated together for autonomous driving of the vehicle; at or above the predetermined speed, a greater weight may be given to the vehicle driving assistance device 100a so that autonomous travel is performed mainly by the vehicle driving assistance device 100a, and below the predetermined speed, a greater weight may be given to the around view providing apparatus 100b so that autonomous travel is performed mainly by the around view providing apparatus 100b.
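
As a minimal sketch of this speed-dependent weighting, the function below blends the steering commands of the two modules around an assumed predetermined speed; the linear transition band and the module interfaces are illustrative assumptions only, not the patent's control law.

```python
# Illustrative blend of the two modules' steering commands by vehicle speed.
# The predetermined speed and the transition band are assumed values.
PREDETERMINED_SPEED_KPH = 30.0
TRANSITION_BAND_KPH = 5.0

def driving_assist_weight(speed_kph: float) -> float:
    """Weight of the vehicle driving assistance device 100a; the around view
    providing apparatus 100b receives the complementary weight (1 - w)."""
    low = PREDETERMINED_SPEED_KPH - TRANSITION_BAND_KPH
    high = PREDETERMINED_SPEED_KPH + TRANSITION_BAND_KPH
    if speed_kph <= low:
        return 0.0
    if speed_kph >= high:
        return 1.0
    return (speed_kph - low) / (high - low)

def blended_steering(speed_kph: float, steer_100a: float, steer_100b: float) -> float:
    w = driving_assist_weight(speed_kph)
    return w * steer_100a + (1.0 - w) * steer_100b
```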

The vehicle driving assistance device 100a, the around view providing apparatus 100b, and the vehicle display device 100c may each exchange data with the terminals 600a and 600b or the server 500 using an internal communication unit (not shown) or a communication unit provided in the vehicle 200.

For example, when the mobile terminal 600a is located inside or near the vehicle, at least one of the vehicle driving assistance device 100a, the around view providing apparatus 100b, and the vehicle display device 100c can exchange data with the terminal 600a by short-range communication.

As another example, when the terminal 600b is located at a remote place outside the vehicle, at least one of the vehicle driving assistance device 100a, the around view providing apparatus 100b, and the vehicle display device 100c can exchange data with the terminal 600b or the server 500 via the network 570 by long-range communication.

The terminals 600a and 600b may be mobile terminals such as mobile phones, smartphones, tablet PCs, and wearable devices such as smart watches, or may be fixed terminals such as TVs or monitors. Hereinafter, the terminal 600 will be described mainly as a mobile terminal such as a smartphone.

On the other hand, the server 500 may be a server provided by a vehicle manufacturer or a server operated by a provider providing a vehicle-related service. For example, it may be a server operated by a provider providing information on road traffic conditions and the like.

On the other hand, the vehicle driving assistant 100a can generate and provide vehicle-related information by signal processing the stereo image received from the stereo camera 195 based on computer vision. Here, the vehicle-related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information for a driving guide to the vehicle driver.

The around view providing apparatus 100b may transfer a plurality of images captured by the plurality of around view cameras 295a, 295b, 295c, and 295d to a processor (270 in FIG. 3C or 3D) in the vehicle 200, and the processor (270 in FIG. 3C or 3D) may combine the plurality of images to generate and provide an around view image.

On the other hand, the vehicle display device 100c may be an AVN (Audio Video Navigation) device.

Meanwhile, the vehicle display device 100c may include a space recognition sensor unit and a touch sensor unit, so that the approach of a user's hand from a distance can be sensed by the space recognition sensor unit and a near touch can be sensed through the touch sensor unit. A user interface corresponding to the sensed user gesture or touch can then be provided.

Meanwhile, the around view providing apparatus 100b according to an embodiment of the present invention calculates a disparity for an overlapping area among the images obtained from the plurality of around view cameras and, based on the calculated disparity, can calculate the distance to an object in the overlapping area.

Meanwhile, the surrounding view providing apparatus 100b according to the embodiment of the present invention outputs the calculated distance information, so that the driver can easily recognize the distance to the object around the vehicle.

On the other hand, the surrounding view providing apparatus 100b according to the embodiment of the present invention generates and outputs a disparity map based on the disparity calculated for the overlapped area, thereby providing distance information about the vehicle periphery .

2A is a diagram showing the appearance of a vehicle having various cameras.

Referring to the drawing, the vehicle 200 may include wheels 203FR, 103FL, 103RL, ... rotated by a power source, a steering wheel 250 for adjusting the traveling direction of the vehicle 200, a stereo camera 195 provided in the vehicle 200 for the vehicle driving assistance device 100a, and a plurality of around view cameras 295a, 295b, 295c, and 295d mounted on the vehicle 200 for the around view providing apparatus 100b of FIG. 1. In the figure, only the left camera 295a and the front camera 295d are shown for convenience.

The stereo camera 195 may include a plurality of cameras, and the stereo image obtained by the plurality of cameras may be signal-processed in the vehicle driving assistance apparatus (100a in Fig. 3).

On the other hand, the figure illustrates that the stereo camera 195 includes two cameras.

The plurality of around view cameras 295a, 295b, 295c, and 295d can each be activated to acquire a captured image when the vehicle speed is less than or equal to a predetermined speed or when the vehicle is reversed. The images obtained by the plurality of cameras can be signal-processed within the around view providing apparatus (100b in FIG. 3C or 3D).

FIG. 2B is a view showing the appearance of a stereo camera attached to the vehicle of FIG. 2A.

Referring to the drawing, the stereo camera module 195 may include a first camera 195a having a first lens 193a, and a second camera 195b having a second lens 193b.

The stereo camera module 195 may include a first light shield 192a and a second light shield 192b for shielding light incident on the first lens 193a and the second lens 193b, respectively.

The stereo camera module 195 in the drawing may be a structure detachable from the ceiling or the windshield of the vehicle 200.

The vehicle driving assistance device 100a (FIG. 3) having such a stereo camera module 195 can obtain a stereo image of the area in front of the vehicle from the stereo camera module 195, perform disparity detection based on the stereo image, perform object detection on at least one of the stereo images based on the disparity information, and, after the object detection, continuously track the motion of the object.

FIG. 2C is a view showing an example of the positions of the around view cameras attached to the vehicle of FIG. 2A, and FIG. 2D illustrates an example of an around view image based on the images captured by the around view cameras of FIG. 2C.

First, referring to FIG. 2C, a plurality of surrounding view cameras 295a, 295b, 295c, and 295d may be disposed on the left, rear, right, and front of the vehicle, respectively.

In particular, the left camera 295a and the right camera 295c may be disposed in a case that surrounds the left side mirror and a case that surrounds the right side mirror, respectively.

Meanwhile, the rear camera 295b and the front camera 295d can be disposed near the trunk switch and near the emblem or the radiator grille, respectively.

Each of the plurality of images captured by the plurality of around view cameras 295a, 295b, 295c, and 295d is transmitted to the processor (270 in FIG. 3C or 3D) in the vehicle 200, and the processor (270 in FIG. 3C or 3D) combines the plurality of images to generate an around view image.
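
One common way to realize this combining step is to warp each camera image onto a common ground plane with a per-camera homography and paste the warped images into one canvas. The sketch below assumes the homographies come from an offline calibration; it is not necessarily the composition method of the patent.

```python
# Sketch: warp each camera image to a common top-down canvas and overlay it.
# The homographies are assumed to come from an offline calibration step.
import cv2
import numpy as np

def make_around_view(images, homographies, canvas_size=(800, 800)):
    """images and homographies are dicts keyed by camera name
    ("left", "rear", "right", "front"); each homography is a 3x3 matrix
    mapping that camera's image plane to the top-down canvas."""
    canvas = np.zeros((canvas_size[1], canvas_size[0], 3), dtype=np.uint8)
    for name, img in images.items():
        warped = cv2.warpPerspective(img, homographies[name], canvas_size)
        covered = warped.any(axis=2)       # pixels this camera actually maps to
        canvas[covered] = warped[covered]  # later cameras overwrite the overlap areas
    return canvas
```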

On the other hand, the plurality of surrounding view cameras 295a, 295b, 295c, and 295d can obtain the left image Ara1, the rear image Arb1, the right image Arc1, and the forward image Ard1, respectively.

On the other hand, the plurality of surrounding view cameras 295a, 295b, 295c, and 295d are cameras for providing the surround view image, and can be a wide angle camera and can cover an angle of approximately 170 to 180 degrees.

Accordingly, as shown in the figure, overlapping areas are generated on the images photographed by the plurality of surrounding cameras 295a, 295b, 295c, and 295d.

In the present invention, the disparity is calculated based on the distance between the cameras, and the distance to the object in the overlapping area is calculated based on the calculated disparity .

In the drawing, an overlapping area Ara1 occurring between the left image Ara1 and the rear image Arb1, an overlapping area Arbc1 occurring between the rear image Arb1 and the right image Arc1, an overlapping area Arcd1 occurring between the right image Arc1 and the front image Ard1, and an overlapping area Arad1 occurring between the front image Ard1 and the left image Ara1 are illustrated.

The processor 270 in the around view providing apparatus 100b can calculate a disparity for at least one of the overlapping area Ara1, the overlapping area Arbc1, the overlapping area Arcd1, and the overlapping area Arad1, and can calculate the distance to an object in the overlapping area based on the calculated disparity.
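
In a rectified stereo pair, disparity and distance are related by Z = f * B / d, where f is the focal length in pixels, B the baseline between the two cameras that see the same overlap area, and d the measured disparity. The numbers in the sketch below are made-up values for illustration only.

```python
def distance_from_disparity(disparity_px: float,
                            focal_length_px: float,
                            baseline_m: float) -> float:
    """Stereo triangulation Z = f * B / d (rectified overlap images assumed)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Example with assumed numbers: 700 px focal length, 1.5 m camera spacing,
# 35 px measured disparity -> the object is about 30 m away.
print(distance_from_disparity(35.0, 700.0, 1.5))   # 30.0
```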

Meanwhile, the processor 270 in the around view providing apparatus 100b can generate an around view image based on the images obtained from the plurality of around view cameras when the speed of the vehicle is less than or equal to a first speed, and can control the generated around view image and the calculated distance information to be output.

On the other hand, the processor 270 in the surrounding view providing apparatus 100b can generate a disparity map based on the disparity calculated for the overlapped area, and control to output the generated disparity map.

Meanwhile, the processor 270 in the around view providing apparatus 100b can control the generated disparity map to be stored in the memory 240, and can control the disparity map to be displayed on the display 280 when the vehicle travels in an area associated with the disparity map stored in the memory 240.

On the other hand, the processor 270 in the surrounding view providing apparatus 100b can check an object in the overlapping area and control the display 280 to display the checked object and the distance information on the object in the overlapping area .

The processor 270 in the around view providing apparatus 100b may include a disparity calculating unit 420 for performing disparity calculation on the overlapping area based on the images obtained from the plurality of around view cameras, an object detecting unit 434 for performing object detection on the overlapping area based on the calculated disparity information, and an object tracking unit 440 for performing tracking on the detected object.

Meanwhile, the processor 270 in the around view providing apparatus 100b may further include a segmentation unit 432 for segmenting an object in the overlapping area based on the disparity information, and an object verification unit 436 for classifying and verifying the detected object.

FIG. 2D illustrates an example of the around view image 210. The around view image 210 may include a first image area 295ai from the left camera 295a, a second image area 295bi from the rear camera 295b, a third image area 295ci from the right camera 295c, and a fourth image area 295di from the front camera 295d.

Figs. 2E to 2F are views showing various examples of the positions of the surround-view cameras attached to the vehicle of Fig. 2A.

First, referring to FIG. 2E, a plurality of surround view cameras 295a, 295b, 295c, 295d1, and 295d2 may be disposed on the left, rear, right, and front of the vehicle, respectively.

In particular, unlike FIG. 2C, two front cameras are disposed in the left front lamp and the right front lamp, respectively.

Each of the plurality of images captured by the plurality of around view cameras 295a, 295b, 295c, 295d1, and 295d2 is transmitted to the processor (270 in FIG. 3C or 3D) in the vehicle 200, and the processor (270 in FIG. 3C or 3D) combines the plurality of images to generate an around view image.

Meanwhile, the plurality of around view cameras 295a, 295b, 295c, 295d1, and 295d2 can acquire the left image Ara1, the rear image Arb1, the right image Arc1, the first front image Ard1, and the second front image Ard2, respectively.

Meanwhile, the plurality of around view cameras 295a, 295b, 295c, 295d1, and 295d2 are cameras for providing the around view image, and each may be a wide-angle camera covering approximately 170 to 180 degrees.

As a result, overlapping areas are generated on the images captured by the plurality of around view cameras 295a, 295b, 295c, 295d1, and 295d2.

In the present invention, the disparity is calculated based on the distance between the cameras, and the distance to the object in the overlapping area is calculated based on the calculated disparity .

In the drawing, an overlapping area Ara2 occurring between the left image Ara1 and the rear image Arb1, an overlapping area Arbc2 occurring between the rear image Arb1 and the right image Arc1, an overlapping area Arcd2 occurring between the right image Arc1 and the second front image Ard2, an overlapping area Ardd occurring between the first front image Ard1 and the second front image Ard2, and an overlapping area Arad2 occurring between the first front image Ard1 and the left image Ara1 are illustrated.

The processor 270 in the around view providing apparatus 100b can calculate a disparity for at least one of the overlapping areas Ara2, Arbc2, Arcd2, Ardd, and Arad2, and can calculate the distance to an object in the overlapping area based on the calculated disparity.

In addition, various operations as described in the description of FIG. 2C can be performed.

Next, referring to FIG. 2F, a plurality of surrounding view cameras 295d1, 295b1, 295b2, and 295d2 may be disposed on the left front side, the left rear side, the right rear side, and the right front side of the vehicle.

That is, unlike FIG. 2C, the four cameras are disposed in the vicinity of the left front lamp, the left rear lamp, the right rear lamp, and the right front lamp, respectively.

Each of the plurality of images captured by the plurality of around view cameras 295d1, 295b1, 295b2, and 295d2 is transmitted to the processor (270 in FIG. 3C or 3D) in the vehicle 200, and the processor (270 in FIG. 3C or 3D) combines the plurality of images to generate an around view image.

Meanwhile, the plurality of around view cameras 295d1, 295b1, 295b2, and 295d2 can acquire the left front image Ard1, the left rear image Arb1, the right rear image Arb2, and the right front image Ard2, respectively.

Meanwhile, the plurality of around view cameras 295d1, 295b1, 295b2, and 295d2 are cameras for providing the around view image, and each may be a wide-angle camera covering approximately 170 to 180 degrees.

As a result, overlapping areas are generated on the images photographed by the plurality of surround view cameras 295d1, 295b1, 295b2, and 295d2.

In the present invention, the disparity is calculated based on the distance between the cameras, and the distance to the object in the overlapping area is calculated based on the calculated disparity .

In the drawing, an overlapping area Ardd occurring between the left front image Ard1 and the right front image Ard2, and an overlapping area Arbb occurring between the left rear image Arb1 and the right rear image Arb2, are illustrated.

The processor 270 in the around view providing apparatus 100b can calculate a disparity for at least one of the overlapping area Ardd and the overlapping area Arbb, and can calculate the distance to an object in the overlapping area based on the calculated disparity.

In addition, various operations as described in the description of FIG. 2C can be performed.

FIGS. 3A to 3B illustrate various examples of an internal block diagram of the vehicle driving assistance device of FIG. 1.

The vehicle driving assistant 100a of FIGS. 3A and 3B can generate the vehicle-related information by signal processing the stereo image received from the stereo camera 195 based on computer vision. Here, the vehicle-related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information for a driving guide to the vehicle driver.

Referring to FIG. 3A, the vehicle driving assistance device 100a may include a communication unit 120, an interface unit 130, a memory 140, a processor 170, a power supply unit 190, and a stereo camera 195.

The communication unit 120 can exchange data with the mobile terminal 600 or the server 500 in a wireless manner. In particular, the communication unit 120 can exchange data with a mobile terminal of a vehicle driver wirelessly. As a wireless data communication method, various data communication methods such as Bluetooth, WiFi Direct, WiFi, and APiX are possible.

The communication unit 120 can receive weather information and traffic situation information on the road, for example, TPEG (Transport Protocol Expert Group) information from the mobile terminal 600 or the server 500. Meanwhile, the vehicle driving assistant 100a may transmit real-time traffic information based on the stereo image to the mobile terminal 600 or the server 500. [

On the other hand, when the user is aboard the vehicle, the user's mobile terminal 600 and the vehicle driving assistant 100a can perform pairing with each other automatically or by execution of the user's application.

The interface unit 130 can receive vehicle-related data or transmit a signal processed or generated by the processor 170 to the outside. To this end, the interface unit 130 can perform data communication with the ECU 770, the AVN (Audio Video Navigation) device 400, the sensor unit 760, and the like in the vehicle by a wired or wireless communication method.

The interface unit 130 can receive map information related to the vehicle driving by data communication with the vehicle display device 400. [

On the other hand, the interface unit 130 can receive the sensor information from the ECU 770 or the sensor unit 760.

Here, the sensor information may include at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward information, battery information, fuel information, lamp information, vehicle interior temperature information, and vehicle interior humidity information.

Such sensor information may be obtained from a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/backward sensor, a wheel sensor, a vehicle speed sensor, a vehicle body inclination sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, and the like. Meanwhile, the position module may include a GPS module for receiving GPS information.

On the other hand, among the sensor information, the vehicle direction information, the vehicle position information, the vehicle angle information, the vehicle speed information, the vehicle tilt information, and the like relating to the vehicle running can be referred to as vehicle running information.

The memory 140 may store various data for operation of the entire vehicle driving assistant device 100a, such as a program for processing or controlling the processor 170. [

An audio output unit (not shown) converts an electric signal from the processor 170 into an audio signal and outputs the audio signal. For this purpose, a speaker or the like may be provided. The audio output unit (not shown) can also output sound corresponding to the operation of the input unit 110, that is, the button.

An audio input unit (not shown) can receive a user's voice. For this purpose, a microphone may be provided. The received voice may be converted to an electrical signal and transmitted to the processor 170.

The processor 170 controls the overall operation of each unit in the vehicle driving assistant 100a.

In particular, the processor 170 performs signal processing based on computer vision. Accordingly, the processor 170 obtains a stereo image for the vehicle front from the stereo camera 195, performs a disparity calculation for the vehicle front based on the stereo image, and based on the calculated disparity information , Perform object detection for at least one of the stereo images, and continue to track object motion after object detection.

Particularly, when the object is detected, the processor 170 performs lane detection, vehicle detection, pedestrian detection, traffic sign detection, road surface detection, and the like .

The processor 170 may perform a distance calculation to the detected nearby vehicle, a speed calculation of the detected nearby vehicle, a speed difference calculation with the detected nearby vehicle, and the like.

Meanwhile, the processor 170 can receive weather information, traffic situation information on the road, and TPEG (Transport Protocol Expert Group) information, for example, through the communication unit 120.

On the other hand, the processor 170 can grasp, in real time, the traffic situation information on the surroundings of the vehicle based on the stereo image in the vehicle driving assistant device 100a.

On the other hand, the processor 170 can receive map information and the like from the vehicle display device 400 through the interface unit 130. [

On the other hand, the processor 170 can receive the sensor information from the ECU 770 or the sensor unit 760 through the interface unit 130. [ Here, the sensor information includes at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward / backward information, battery information, fuel information, Lamp information, vehicle interior temperature information, and vehicle interior humidity information.

The power supply unit 190 can supply power necessary for the operation of each component under the control of the processor 170. [ Particularly, the power supply unit 190 can receive power from a battery or the like inside the vehicle.

The stereo camera 195 may include a plurality of cameras. Hereinafter, as described with reference to FIG. 2B and the like, it is assumed that two cameras are provided.

The stereo camera 195 may be attachable to and detachable from the ceiling or the windshield of the vehicle 200, and may include a first camera 195a having a first lens 193a and a second camera 195b having a second lens 193b.

The stereo camera 195 may include a first light shield 192a and a second light shield 192b for shielding light incident on the first lens 193a and the second lens 193b, respectively.

Next, referring to FIG. 3B, the vehicle driving assistance device 100a of FIG. 3B further includes an input unit 110, a display 180, and an audio output unit 185 in addition to the configuration of the vehicle driving assistance device 100a of FIG. 3A. Hereinafter, only the input unit 110, the display 180, and the audio output unit 185 will be described.

The input unit 110 may include a plurality of buttons or a touch screen attached to the vehicle driving assistance apparatus 100a, particularly, the stereo camera 195. [ It is possible to turn on and operate the vehicle driving assistant 100a through a plurality of buttons or a touch screen. In addition, it is also possible to perform various input operations.

The display 180 may display an image related to the operation of the vehicle driving assist system. For this image display, the display 180 may include a cluster or HUD (Head Up Display) on the inside of the vehicle interior. On the other hand, when the display 180 is the HUD, it may include a projection module that projects an image on the windshield of the vehicle 200. [

The audio output unit 185 outputs the sound to the outside based on the audio signal processed by the processor 170. [ To this end, the audio output unit 185 may include at least one speaker.

FIGS. 3C to 3D illustrate various examples of an internal block diagram of the around view providing apparatus of FIG. 1.

The around view providing apparatus 100b of FIGS. 3C to 3D can combine a plurality of images received from the plurality of cameras 295a, ..., 295d of FIG. 2B to generate an around view image.

On the other hand, the surrounding view providing apparatus 100b performs object detection, confirmation, and tracking on objects located in the vicinity of the vehicle based on the plurality of images received from the plurality of cameras 295a, ..., 295d .

Referring to FIG. 3C, the around view providing apparatus 100b of FIG. 3C may include a communication unit 220, an interface unit 230, a memory 240, a processor 270, a display 280, a power supply unit 290, and a plurality of cameras 295a, ..., 295d.

The communication unit 220 can exchange data with the mobile terminal 600 or the server 500 in a wireless manner. In particular, the communication unit 220 can exchange data with the mobile terminal of the vehicle driver wirelessly. As a wireless data communication method, various data communication methods such as Bluetooth, WiFi Direct, WiFi, and APiX are possible.

The communication unit 220 can receive, from the mobile terminal 600 or the server 500, schedule information related to the vehicle driver's scheduled time or movement position, weather information, and road traffic situation information, for example, TPEG (Transport Protocol Experts Group) information. Meanwhile, the around view providing apparatus 100b may transmit real-time traffic information based on images to the mobile terminal 600 or the server 500.

On the other hand, when the user is boarding the vehicle, the user's mobile terminal 600 and the surrounding view providing apparatus 100b can perform pairing with each other automatically or by execution of the user's application.

The interface unit 230 may receive the vehicle-related data or may transmit the processed or generated signal to the outside by the processor 270. For this purpose, the interface unit 230 can perform data communication with the ECU 770, the sensor unit 760, and the like in the vehicle by a wire communication or a wireless communication method.

On the other hand, the interface unit 230 can receive the sensor information from the ECU 770 or the sensor unit 760.

Here, the sensor information includes at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward / backward information, battery information, fuel information, Lamp information, vehicle interior temperature information, and vehicle interior humidity information.

On the other hand, among the sensor information, the vehicle direction information, the vehicle position information, the vehicle angle information, the vehicle speed information, the vehicle tilt information, and the like relating to the vehicle running can be referred to as vehicle running information.

The memory 240 may store various data for operation of the entire surround view providing apparatus 100b, such as a program for processing or controlling the processor 270. [

On the other hand, the memory 240 may store map information related to the vehicle driving.

The processor 270 controls the overall operation of each unit in the surrounding view providing apparatus 100b.

Particularly, the processor 270 can acquire a plurality of images from the plurality of cameras 295a, ..., 295d, and combine the plurality of images to generate an around view image.

Meanwhile, the processor 270 may perform signal processing based on computer vision. For example, based on the plurality of images or the generated around view image, the processor 270 can perform a disparity calculation for the surroundings of the vehicle, perform object detection in the image based on the calculated disparity information, and, after the object detection, continuously track the motion of the object.

Particularly, when the object is detected, the processor 270 can perform lane detection, vehicle detection, pedestrian detection, obstacle detection, parking area detection, road surface detection, and the like .

Then, the processor 270 can perform a distance calculation on the detected nearby vehicles or pedestrians.

On the other hand, the processor 270 can receive the sensor information from the ECU 770 or the sensor unit 760 via the interface unit 230. [ Here, the sensor information includes at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward / backward information, battery information, fuel information, Lamp information, vehicle interior temperature information, and vehicle interior humidity information.

The display 280 may display the around view image generated by the processor 270. Meanwhile, it is also possible to provide various user interfaces when displaying the around view image, or to provide a touch sensor enabling touch input to the provided user interface.

Meanwhile, the display 280 may include a cluster or an HUD (Head Up Display) on the inside of the vehicle interior. On the other hand, when the display 280 is the HUD, it may include a projection module that projects an image on the windshield of the vehicle 200. [

The power supply unit 290 can supply power necessary for the operation of each component under the control of the processor 270. [ Particularly, the power supply unit 290 can receive power from a battery or the like inside the vehicle.

The plurality of cameras 295a, ..., 295d are cameras for providing the around view image, and are preferably wide-angle cameras.

The around view providing apparatus 100b of FIG. 3D is similar to the around view providing apparatus 100b of FIG. 3C, but differs in that an input unit 210, an audio output unit 285, and an audio input unit 286 are further provided. Hereinafter, only the input unit 210, the audio output unit 285, and the audio input unit 286 will be described.

The input unit 210 may include a plurality of buttons attached to the periphery of the display 280 or a touch screen disposed on the display 280. It is possible to turn on the power of the surrounding view providing apparatus 100b and operate it through a plurality of buttons or a touch screen. In addition, it is also possible to perform various input operations.

The audio output unit 285 converts an electric signal from the processor 270 into an audio signal and outputs the audio signal. For this purpose, a speaker or the like may be provided. The audio output unit 285 can also output sound corresponding to the operation of the input unit 210, that is, the button.

The audio input unit 286 can receive user voice. For this purpose, a microphone may be provided. The received voice may be converted to an electrical signal and transmitted to the processor 270.

Meanwhile, the around view providing apparatus 100b of FIG. 3C or 3D may be an AVN (Audio Video Navigation) apparatus.

FIG. 3E is an internal block diagram of the vehicle display device of FIG. 1.

The vehicle display device 400 according to an embodiment of the present invention includes an input unit 310, a communication unit 320, a spatial recognition sensor unit 321, a touch sensor unit 326, an interface unit 330, a memory 340, a processor 370, a display 380, an audio input unit 383, an audio output unit 385, and a power supply unit 390.

The input unit 310 includes a button attached to the display device 400. For example, a power button may be provided. In addition, it may further include at least one of a menu button, an up / down button, and a left / right button.

The input signal through the input unit 310 may be transmitted to the processor 370.

The communication unit 320 can exchange data with an adjacent electronic device. For example, data can be exchanged with a vehicle internal electronic device or a server (not shown) in a wireless manner. Particularly, data can be exchanged wirelessly with the mobile terminal of the vehicle driver. Various wireless data communication methods such as Bluetooth, WiFi, and APiX are available.

For example, when the user is boarded in the vehicle, the user's mobile terminal and the display device 400 can perform the pairing with each other automatically or by execution of the user's application.

On the other hand, the communication unit 320 may include a GPS receiving device, and can receive GPS information, that is, position information of the vehicle.

The space recognition sensor unit 321 can detect the approach or movement of the user's hand. For this purpose, it may be disposed around the display 380.

The spatial recognition sensor unit 321 may perform spatial recognition based on an optical basis or may perform spatial recognition based on an ultrasonic wave. Hereinafter, description will be made mainly on performing spatial recognition under an optical basis.

The spatial recognition sensor section 321 can sense the approach or movement of the user's hand based on the output of the output light and the reception of the corresponding received light. In particular, the processor 370 can perform signal processing on the electrical signals of the output light and the received light.

For this purpose, the spatial recognition sensor unit 321 may include a light output unit 322 and a light receiving unit 324.

The light output unit 322 may output infrared light, for example, for detecting a user's hand located on the front of the display device 400. [

The light receiving unit 324 receives the light scattered or reflected when the light output from the light output unit 322 is scattered or reflected in the user's hand located on the front of the display device 400. [ Specifically, the light receiving unit 324 may include a photo diode and convert the received light into an electric signal through a photodiode. The converted electrical signal may be input to the processor 370.

The touch sensor unit 326 senses a floating touch and a direct touch. For this purpose, the touch sensor unit 326 may include an electrode array, an MCU, and the like. When the touch sensor unit is operated, an electric signal is supplied to the electrode array, and an electric field is formed on the electrode array.

The touch sensor unit 326 can operate when the intensity of light received by the spatial recognition sensor unit 321 is equal to or higher than the first level.

That is, when a part of the user's body such as the user's hand approaches within a predetermined distance, an electric signal may be supplied to the electrode array in the touch sensor unit 326. An electric field is formed on the electrode array by the supplied electric signal, and a change in capacitance is sensed using the electric field. Based on the sensed capacitance change, the touch sensor unit senses a floating touch and a direct touch.

In particular, the z-axis information can be sensed by the touch sensor unit 326 in addition to the x- and y-axis information according to the approach of the user's hand.

The interface unit 330 can exchange data with other electronic devices in the vehicle. For example, the interface unit 330 can perform data communication with an ECU or the like in the vehicle by a wired communication method.

Specifically, the interface unit 330 can receive the vehicle status information by data communication with an ECU or the like in the vehicle.

Here, the vehicle status information may include at least one of battery information, fuel information, vehicle speed information, tire information, steering information based on steering wheel rotation, vehicle lamp information, vehicle interior temperature information, and vehicle exterior temperature information.

The interface unit 330 may further receive GPS information from an ECU or the like in the vehicle. Alternatively, it is also possible to transmit GPS information, which is received by the display device 400, to an ECU or the like.

The memory 340 may store various data for operation of the display device 400, such as a program for processing or controlling the processor 370. [

For example, the memory 340 may store a map for guiding the traveling path of the vehicle.

As another example, the memory 340 may store user information, user's mobile terminal information, for pairing with a user's mobile terminal.

The audio output unit 385 converts an electric signal from the processor 370 into an audio signal and outputs the audio signal. For this purpose, a speaker or the like may be provided. The audio output unit 385 can also output sound corresponding to the operation of the input unit 310, that is, the button.

The audio input unit 383 can receive user voice. For this purpose, a microphone may be provided. The received voice may be converted to an electrical signal and transmitted to the processor 370.

The processor 370 controls the overall operation of each unit in the vehicle display device 400. [

When the user's hand successively approaches the display device 400, the processor 370 can sequentially compute x-, y-, and z-axis information for the user's hand based on the light received by the light receiving unit 324. At this time, the z-axis information decreases progressively.

On the other hand, when the user's hand approaches within a second distance closer to the display 380 than the first distance, the processor 370 can control the touch sensor unit 326 to operate. That is, the processor 370 can control the touch sensor unit 326 to operate when the intensity of the electric signal from the spatial recognition sensor unit 321 is equal to or higher than the reference level. Thereby, an electric signal is supplied to each electrode array in the touch sensor unit 326. [

On the other hand, the processor 370 can sense the floating touch based on the sensing signal sensed by the touch sensor unit 326 when the user's hand is located within the second distance. In particular, the sensing signal may be a signal indicative of a change in capacitance.

Based on the sensing signal, the processor 370 can compute the x- and y-axis information of the floating touch input, and can compute the z-axis information based on the magnitude of the capacitance change.

On the other hand, the processor 370 can change the grouping for the electrode array in the touch sensor unit 326 according to the distance of the user's hand.

Specifically, the processor 370 can change the grouping of the electrode array in the touch sensor unit 326 based on the approximate z-axis information computed from the light received by the spatial recognition sensor unit 321. The greater the distance, the larger the size of the electrode array group can be set.

That is, the processor 370 can vary the size of the touch sensing cell with respect to the electrode array in the touch sensor unit 326 based on the distance information of the user's hand, that is, the z-axis information.
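
A minimal sketch of this variable touch-cell grouping maps the hand's z distance to the edge length of one sensing cell; the distance bands and group sizes below are assumptions for illustration, not values from the patent.

```python
def touch_cell_group_size(z_distance_mm: float) -> int:
    """Edge length (in electrodes) of one touch sensing cell.

    Farther hands get coarser grouping (larger cells, lower resolution);
    a hand close to the display is sensed at full, per-electrode resolution.
    The distance bands below are assumptions for illustration."""
    if z_distance_mm > 50:
        return 9      # e.g. 9x9 electrodes driven as one cell
    if z_distance_mm > 20:
        return 3      # 3x3 grouping
    return 1          # floating/direct touch, full resolution
```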

The display 380 may separately display an image corresponding to the function set for the button. For this image display, the display 380 may be implemented as a variety of display modules such as an LCD, an OLED, and the like. On the other hand, the display 380 may be implemented as a cluster on the inside of the vehicle interior.

The power supply unit 390 can supply power necessary for the operation of each component under the control of the processor 370. [

Figures 4A-4B illustrate various examples of internal block diagrams of the processors of Figures 3A-3D, and Figure 5 is a diagram illustrating object detection in the processors of Figures 4A-4B.

FIG. 4A shows an example of an internal block diagram of the processor 170 of the vehicle driving assistance device 100a of FIGS. 3A to 3B or the processor 270 of the around view providing apparatus 100b of FIGS. 3C to 3D.

The processor 170 or 270 may include an image preprocessing unit 410, a disparity computing unit 420, an object detecting unit 434, an object tracking unit 440, and an application unit 450.

The image preprocessing unit 410 may receive the plurality of images from the plurality of cameras 295a, ..., 295d, or the generated around view image, and perform preprocessing.

Specifically, the image preprocessing unit 410 may perform noise reduction, rectification, calibration, color enhancement, color space conversion (CSC), interpolation, camera gain control, and the like on the plurality of images or the generated around view image. Thus, an image sharper than the images captured by the plurality of cameras 295a, ..., 295d, or than the generated around view image, can be acquired.
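
A hedged OpenCV sketch of a few of the listed preprocessing steps is shown below; the calibration data (camera matrix, distortion coefficients) and the particular denoising and gain choices are assumptions, not the patent's parameters.

```python
import cv2
import numpy as np

def preprocess(img, camera_matrix, dist_coeffs, gain=1.0):
    """Noise reduction, lens-distortion correction (calibration/rectification),
    color space conversion, and a simple gain control, in that order."""
    denoised = cv2.GaussianBlur(img, (3, 3), 0)                  # noise reduction
    undistorted = cv2.undistort(denoised, camera_matrix, dist_coeffs)
    ycrcb = cv2.cvtColor(undistorted, cv2.COLOR_BGR2YCrCb)       # color space conversion
    luma = ycrcb[:, :, 0].astype(np.float32) * gain              # camera gain control (luma only)
    ycrcb[:, :, 0] = np.clip(luma, 0, 255).astype(np.uint8)
    return cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2BGR)
```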

The disparity calculating unit 420 receives the plurality of images or the generated around view image signal-processed by the image preprocessing unit 410, performs stereo matching on the received images, and obtains a disparity map according to the stereo matching. That is, disparity information about the surroundings of the vehicle can be obtained.

At this time, the stereo matching may be performed on a pixel-by-pixel basis or on a predetermined block basis. Meanwhile, the disparity map may mean a map in which the binocular parallax information between images, that is, between left and right images, is expressed as numerical values.
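
A minimal block-based stereo matching example using OpenCV's semi-global matcher is shown below; the disparity range and block size are arbitrary example parameters, and the inputs are assumed to be rectified grayscale images.

```python
import cv2

def disparity_map(left_gray, right_gray):
    """Block-based stereo matching on rectified grayscale images."""
    matcher = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=64,   # must be a multiple of 16
        blockSize=7,
    )
    disp = matcher.compute(left_gray, right_gray)
    return disp.astype("float32") / 16.0   # SGBM returns disparities scaled by 16
```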

The segmentation unit 432 may perform segmenting and clustering in the image based on the disparity information from the disparity calculating unit 420. [

Specifically, the segmentation unit 432 can separate the background and the foreground for at least one of the images based on the disparity information.

For example, an area of the disparity map in which the disparity information is equal to or less than a predetermined value can be computed as the background, and the corresponding part can be excluded. Thereby, the foreground can be relatively separated.

As another example, an area of the disparity map in which the disparity information is equal to or greater than a predetermined value can be computed as the foreground, and the corresponding part can be extracted. Thereby, the foreground can be separated.

Thus, by separating the foreground and the background based on the disparity information extracted from the image, it becomes possible to reduce the signal processing time, the amount of signal processing, and the like in the subsequent object detection.
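
The described separation can be sketched as a simple threshold on the disparity map: small disparities (far away) become background and large disparities (close) become foreground. The threshold value below is an assumption.

```python
import numpy as np

def split_foreground(image, disparity, threshold=10.0):
    """Keep only pixels whose disparity is at or above the threshold
    (close objects = foreground); background pixels are zeroed so that
    later object detection can skip them."""
    foreground_mask = disparity >= threshold
    out = image.copy()
    out[~foreground_mask] = 0
    return out, foreground_mask
```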

Next, the object detector 434 can detect the object based on the image segment from the segmentation unit 432. [

That is, the object detecting unit 434 can detect an object for at least one of the images based on the disparity information.

More specifically, the object detecting unit 434 can detect an object for at least one of the images. For example, an object can be detected from a foreground separated by an image segment.

Next, the object verification unit 436 classifies and verifies the separated objects.

For this purpose, the object verification unit 436 may use a neural network identification method, a support vector machine (SVM) method, an AdaBoost method using Haar-like features, a histograms of oriented gradients (HOG) technique, or the like.

On the other hand, the object verification unit 436 can verify objects by comparing the detected objects with objects stored in the memory 240.

For example, the object verification unit 436 can verify nearby vehicles, lanes, roads, signs, hazardous areas, tunnels, and the like, located around the vehicle.
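
As one concrete example of such a verification step, the sketch below uses OpenCV's built-in HOG descriptor with its pretrained linear SVM pedestrian detector; this is a generic stand-in for the HOG/SVM techniques named above, not the specific classifier of this apparatus.

import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def verify_pedestrians(image_bgr):
    # Returns bounding boxes (x, y, w, h) and confidence weights for detected pedestrians
    boxes, weights = hog.detectMultiScale(image_bgr, winStride=(8, 8),
                                          padding=(8, 8), scale=1.05)
    return list(zip(boxes, weights))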

The object tracking unit 440 performs tracking on the verified objects. For example, it can sequentially verify the objects in the acquired images, calculate the motion or motion vector of each verified object, and track the movement of the object based on the calculated motion or motion vector. Accordingly, it is possible to track nearby vehicles, lanes, roads, signs, hazardous areas, and the like, located in the vicinity of the vehicle.
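
A minimal motion-vector tracking sketch is given below, computed from bounding-box centroids of a verified object in consecutive frames; the box format and the simple extrapolation are assumptions made for illustration.

def motion_vector(prev_box, curr_box):
    # Boxes are (x, y, w, h); the motion vector is the displacement of the box centre
    px, py = prev_box[0] + prev_box[2] / 2.0, prev_box[1] + prev_box[3] / 2.0
    cx, cy = curr_box[0] + curr_box[2] / 2.0, curr_box[1] + curr_box[3] / 2.0
    return (cx - px, cy - py)

def predict_next_box(curr_box, vector):
    # Predict the object position in the next frame by extrapolating the motion vector
    x, y, w, h = curr_box
    return (x + vector[0], y + vector[1], w, h)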

FIG. 4B is another example of an internal block diagram of the processor.

Referring to FIG. 4B, the processor 170 or 270 of FIG. 4B has the same internal configuration unit as the processor 170 or 270 of FIG. 4A, but differs in the signal processing order. Only the difference will be described below.

The object detecting unit 434 may receive the plurality of images, or the generated around view image, and detect objects in them. Unlike FIG. 4A, the object may be detected directly from the plurality of images or from the generated around view image, rather than from an image segmented on the basis of disparity information.

Next, the object verification unit 436 classifies and verifies the separated objects, based on the image segments from the segmentation unit 432 and the objects detected by the object detection unit 434.

For this purpose, the object verification unit 436 may use a neural network identification method, a support vector machine (SVM) method, an AdaBoost method using Haar-like features, a histograms of oriented gradients (HOG) technique, or the like.

FIG. 5 is a diagram referred to for explaining the operation method of the processor 170 or 270 of FIGS. 4A to 4B, based on images obtained respectively in the first and second frame periods.

Referring to FIG. 5, during the first and second frame periods, the plurality of cameras 295a, ..., and 295d sequentially acquire images FR1a and FR1b, respectively.

The disparity calculating unit 420 in the processor 170 or 270 receives the images FR1a and FR1b signal-processed by the image preprocessing unit 410, performs stereo matching on the received images FR1a and FR1b, and obtains a disparity map 520.

The disparity map 520 expresses the parallax between the images FR1a and FR1b as levels. The higher the disparity level, the closer the distance from the vehicle can be calculated to be; the lower the disparity level, the farther the distance.

On the other hand, when such a disparity map is displayed, it may be displayed so as to have a higher luminance as the disparity level becomes larger, and a lower luminance as the disparity level becomes smaller.
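
The relation between disparity level, distance, and display luminance described above can be sketched as follows; the focal length, baseline, and maximum disparity are placeholder values rather than parameters of the cameras described here.

import numpy as np

def disparity_to_distance(disparity_px, focal_px=700.0, baseline_m=0.5):
    d = np.maximum(disparity_px, 1e-6)      # avoid division by zero
    return focal_px * baseline_m / d        # larger disparity -> smaller distance

def disparity_to_luminance(disparity_px, max_disparity=64.0):
    # Larger disparity level -> brighter pixel, smaller level -> darker pixel
    return np.clip(disparity_px / max_disparity * 255.0, 0, 255).astype(np.uint8)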

In the figure, the first to fourth lanes 528a, 528b, 528c, and 528d, the construction area 522, the first front vehicle 524, and the second front vehicle 526 each have corresponding disparity levels in the disparity map 520.

The segmentation unit 432, the object detection unit 434, and the object verification unit 436 perform segmentation, object detection, and object verification on at least one of the images FR1a and FR1b, based on the disparity map 520.

In the figure, object detection and verification for the second image FR1b are performed using the disparity map 520.

That is, object detection and verification can be performed for the first to fourth lanes 538a, 538b, 538c, and 538d, the construction area 532, the first front vehicle 534, and the second front vehicle 536 in the image 530.

On the other hand, by continuously acquiring the image, the object tracking unit 440 can perform tracking on the identified object.

FIGS. 6A and 6B are views referred to in the description of the operation of the vehicle driving assistance apparatus.

First, FIG. 6A is a diagram illustrating a vehicle forward situation photographed by a stereo camera 195 provided inside a vehicle. In particular, the vehicle front view is indicated by a bird eye view.

Referring to the drawing, a first lane 642a, a second lane 644a, a third lane 646a, and a fourth lane 648a are located from left to right; the construction area 610a is located between the first lane 642a and the second lane 644a; the first front vehicle 620a is located between the second lane 644a and the third lane 646a; and the second front vehicle 630a is located between the third lane 646a and the fourth lane 648a.

Next, FIG. 6B illustrates the display of the vehicle front state, which is grasped by the vehicle driving assist system, together with various information. In particular, the image as shown in FIG. 6B may be displayed on the display 180 or the vehicle display device 400 provided in the vehicle driving assistance device.

FIG. 6B differs from FIG. 6A in that information is displayed on the basis of the image photographed by the stereo camera 195.

A first lane 642b, a second lane 644b, a third lane 646b, and a fourth lane 648b are located from left to right; the construction area 610b is located between the first lane 642b and the second lane 644b; the first front vehicle 620b is located between the second lane 644b and the third lane 646b; and the second front vehicle 630b is located between the third lane 646b and the fourth lane 648b.

The vehicle driving assistance apparatus 100a performs signal processing on the basis of the stereo image photographed by the stereo camera 195, and thereby verifies the objects corresponding to the construction area 610b, the first front vehicle 620b, and the second front vehicle 630b. In addition, the first lane 642b, the second lane 644b, the third lane 646b, and the fourth lane 648b can be verified.

On the other hand, in the drawing, it is exemplified that each of them is highlighted by a frame to indicate object identification for the construction area 610b, the first forward vehicle 620b, and the second forward vehicle 630b.

On the other hand, the vehicle driving assistance apparatus 100a can calculate distance information to the construction area 610b, the first front vehicle 620b, and the second front vehicle 630b, based on the stereo image photographed by the stereo camera 195.

In the figure, the calculated first distance information 611b, second distance information 621b, and third distance information 631b, corresponding respectively to the construction area 610b, the first front vehicle 620b, and the second front vehicle 630b, are displayed.

On the other hand, the vehicle driving assistance apparatus 100a can receive sensor information about the vehicle from the ECU 770 or the sensor unit 760. In particular, it can receive the vehicle speed information, the gear information, the yaw rate information indicating the rate at which the rotational angle (yaw angle) of the vehicle changes, and the angle information of the vehicle, and display them.

The figure illustrates that the vehicle speed information 672, the gear information 671, and the yaw rate information 673 are displayed on the upper portion 670 of the vehicle front image, and that the vehicle angle information 682 is displayed on the lower portion 680 of the vehicle front image; however, various other examples are possible. In addition, the vehicle width information 683 and the road curvature information 681 can be displayed together with the vehicle angle information 682.

On the other hand, the vehicle driving assistance apparatus 100a can receive speed limit information and the like for the road on which the vehicle is traveling, through the communication unit 120 or the interface unit 130. In the figure, it is exemplified that the speed limit information 640b is displayed.

The vehicle driving assistance apparatus 100a may display the various information shown in FIG. 6B through the display 180 or the like, or may store the information without displaying it separately. Such information can then be utilized for various applications.

FIG. 7 is an example of an internal block diagram of a vehicle according to an embodiment of the present invention.

Referring to the drawings, the vehicle 200 may include an electronic control device 700 for vehicle control.

The electronic control device 700 may include an input unit 710, a communication unit 720, a memory 740, a lamp driving unit 751, a steering driving unit 752, a brake driving unit 753, a power source driving unit 754, a sunroof driving unit 755, a suspension driving unit 756, an air conditioner driving unit 757, a window driving unit 758, an airbag driving unit 759, a sensor unit 760, an ECU 770, a display 780, an audio output unit 785, an audio input unit 786, a power supply unit 790, a stereo camera 195, and a plurality of cameras 295.

Meanwhile, the ECU 770 may be a concept including the processor 270 described in FIG. 3C or FIG. 3D. Alternatively, in addition to the ECU 770, a separate processor for signal processing of images from the camera may be provided.

The input unit 710 may include a plurality of buttons or a touch screen disposed inside the vehicle 200. Through a plurality of buttons or a touch screen, it is possible to perform various input operations.

The communication unit 720 can exchange data with the mobile terminal 600 or the server 500 in a wireless manner. In particular, the communication unit 720 can exchange data with the mobile terminal of the vehicle driver wirelessly. As a wireless data communication method, various data communication methods such as Bluetooth, WiFi Direct, WiFi, and APiX are possible.

The communication unit 720 can receive, from the mobile terminal 600 or the server 500, schedule information related to the driver's scheduled time or movement position, weather information, and road traffic situation information, for example, TPEG (Transport Protocol Expert Group) information.

On the other hand, when the user boards the vehicle, the user's mobile terminal 600 and the electronic control device 700 can be paired with each other automatically or upon execution of an application by the user.

The memory 740 may store various data for the operation of the electronic control device 700, such as programs for processing or control by the ECU 770.

On the other hand, the memory 740 may store map information related to the vehicle driving.

The lamp driving unit 751 can control the turn-on / turn-off of the lamps disposed inside and outside the vehicle. Also, the intensity, direction, etc. of the light of the lamp can be controlled. For example, it is possible to perform control on a direction indicating lamp, a brake lamp, and the like.

The steering driver 752 may perform electronic control of a steering apparatus (not shown) in the vehicle 200. Thus, the traveling direction of the vehicle can be changed.

The brake driver 753 can perform electronic control of a brake apparatus (not shown) in the vehicle 200. For example, the speed of the vehicle 200 can be reduced by controlling the operation of the brakes disposed on the wheels. As another example, it is possible to adjust the traveling direction of the vehicle 200 to the left or right by operating the brakes on the left wheel and the right wheel differently.

The power source driving section 754 can perform electronic control of the power source in the vehicle 200.

For example, when a fossil fuel-based engine (not shown) is a power source, the power source drive unit 754 can perform electronic control of the engine. Thus, the output torque of the engine and the like can be controlled.

As another example, when the electric motor (not shown) is a power source, the power source driving unit 754 can perform control on the motor. Thus, the rotation speed, torque, etc. of the motor can be controlled.

The sunroof driving unit 755 may perform electronic control of a sunroof apparatus (not shown) in the vehicle 200. For example, the opening or closing of the sunroof can be controlled.

The suspension driving unit 756 may perform electronic control of a suspension apparatus (not shown) in the vehicle 200. For example, when the road surface is uneven, the suspension apparatus can be controlled so as to reduce the vibration of the vehicle 200.

The air conditioning driving unit 757 can perform electronic control on an air conditioner (not shown) in the vehicle 200. For example, when the temperature inside the vehicle is high, the air conditioner can be operated to control the cooling air to be supplied into the vehicle.

The window driving unit 758 can perform electronic control of a window apparatus (not shown) in the vehicle 200. For example, the opening or closing of the left and right windows on the sides of the vehicle can be controlled.

The airbag driver 759 may perform electronic control of the airbag apparatus in the vehicle 200. For example, in a dangerous situation, the airbag can be controlled to deploy.

The sensor unit 760 senses signals relating to the running of the vehicle 200 and the like. To this end, the sensor unit 760 may include a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/backward sensor, a wheel sensor, a vehicle speed sensor, a vehicle body inclination sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, and the like.

Thereby, the sensor unit 760 can acquire sensing signals for the vehicle position information (GPS information), the vehicle angle information, the vehicle speed information, the vehicle acceleration information, the vehicle tilt information, the vehicle forward/backward information, the battery information, the tire information, the vehicle lamp information, the vehicle interior temperature information, the vehicle interior humidity information, and the like.

In addition, the sensor unit 760 may include an accelerator pedal sensor, a pressure sensor, an engine speed sensor, an air flow sensor (AFS), an intake air temperature sensor (ATS), a water temperature sensor (WTS), a throttle position sensor (TPS), a TDC sensor, a crank angle sensor (CAS), and the like.

The ECU 770 can control the overall operation of each unit in the electronic control device 700.

The ECU 770 can perform a specific operation according to an input through the input unit 710, receive a sensing signal from the sensor unit 760 and transmit it to the surrounding view providing apparatus 100b, receive map information from the memory 740, and control the operation of the respective driving units.

Also, the ECU 770 can receive weather information and road traffic situation information, for example, TPEG (Transport Protocol Expert Group) information, from the communication unit 720.

On the other hand, the ECU 770 can combine a plurality of images received from the plurality of cameras 295 to generate an around view image. In particular, the around view image can be generated when the vehicle is traveling below a predetermined speed or when the vehicle is reversing.
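
One common way of combining the four camera images into a single top-view composite is sketched below, assuming each camera has a pre-calibrated homography to the ground plane; the homographies, canvas size, and simple overwrite blending are assumptions for illustration only, not the composition method of this disclosure.

import cv2
import numpy as np

def compose_around_view(images, homographies, canvas_size=(400, 400)):
    # 'images' and 'homographies' are dicts keyed by 'front', 'rear', 'left', 'right'
    canvas = np.zeros((canvas_size[1], canvas_size[0], 3), dtype=np.uint8)
    for key in ("front", "rear", "left", "right"):
        # Warp each camera image onto the common ground-plane canvas
        warped = cv2.warpPerspective(images[key], homographies[key], canvas_size)
        mask = warped.any(axis=2)
        canvas[mask] = warped[mask]   # simple overwrite; real systems blend the seams
    return canvas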

The display 780 can display an image of the area in front of the vehicle, or the around view image, while the vehicle is traveling. In addition to the around view image, various user interfaces can also be provided.

For the display of the around view image or the like, the display 780 may include a cluster or a HUD (Head Up Display) on the front inside of the vehicle interior. When the display 780 is a HUD, it may include a projection module for projecting an image onto the windshield of the vehicle 200. The display 780 may also include a touch screen capable of receiving input.

The audio output unit 785 converts the electrical signal from the ECU 770 into an audio signal and outputs the audio signal. For this purpose, a speaker or the like may be provided. The audio output unit 785 can also output a sound corresponding to the operation of the input unit 710, that is, the button.

The audio input unit 786 can receive user voice. For this purpose, a microphone may be provided. The received voice may be converted to an electrical signal and transmitted to the ECU 770.

The power supply unit 790 can supply power necessary for the operation of each component under the control of the ECU 770. In particular, the power supply unit 790 can receive power from a battery (not shown) inside the vehicle.

The stereo camera 195 is used for the operation of the vehicle driving assistance apparatus; the description given above applies here.

A plurality of cameras 295 are used to provide the surround view image, and for this purpose, as shown in FIG. 2C, four cameras may be provided. For example, a plurality of surrounding view cameras 295a, 295b, 295c, and 295d may be disposed on the left, rear, right, and front of the vehicle, respectively. The plurality of images photographed by the plurality of cameras 295 may be transmitted to the ECU 770 or a separate processor (not shown).

FIG. 8 is a diagram illustrating an operating method of an around view providing apparatus according to an exemplary embodiment of the present invention, and FIGS. 9A and 9B are diagrams referred to in the description of depth map generation using around view images. In the following description, the arrangement of the plurality of around view cameras is assumed to be as shown in FIG. 2C.

Referring to FIG. 8, the processor 270 of the surrounding view providing apparatus 100b performs stereo matching on the overlap area Arad1 between the front and left regions, based on the front image from the front camera 295d and the left image from the left camera 295a (S810). That is, the disparity calculation is performed on the overlap area Arad1. Then, the depth is calculated based on the disparity calculated for the overlap area Arad1 (S820).

Next, the processor 270 of the surrounding view providing apparatus 100b performs stereo matching on the overlap area Arcd1 between the front and right regions, based on the front image from the front camera 295d and the right image from the right camera 295c (S812). That is, the disparity calculation is performed on the overlap area Arcd1. Then, the depth is calculated based on the disparity calculated for the overlap area Arcd1 (S822).

Next, the processor 270 of the surrounding view providing apparatus 100b performs stereo matching on the overlap area Arab1 between the rear and left regions, based on the rear image from the rear camera 295b and the left image from the left camera 295a (S814). That is, the disparity calculation is performed on the overlap area Arab1. Then, the depth is calculated based on the disparity calculated for the overlap area Arab1 (S824).

Next, the processor 270 of the surrounding view providing apparatus 100b performs stereo matching on the overlap area Arbc1 between the rear and right regions, based on the rear image from the rear camera 295b and the right image from the right camera 295c (S816). That is, the disparity calculation is performed on the overlap area Arbc1. Then, the depth is calculated based on the disparity calculated for the overlap area Arbc1 (S826).

On the other hand, the processor 270 of the surrounding view providing apparatus 100b can generate a 3D map, i.e., a depth map, based on the depths calculated for the overlap areas Arad1, Arcd1, Arab1, and Arbc1 (S830). In particular, while the vehicle is traveling, the 3D map, i.e., the depth map, can be generated based on the depths obtained sequentially.
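
A rough sketch of steps S810 to S830 follows: each pair of adjacent cameras is treated as a stereo pair over its overlap area, a disparity/depth result is computed per area, and the four results are kept together as the depth map. The focal length, baseline, and matcher settings are assumptions for illustration, not parameters of the apparatus.

import cv2
import numpy as np

def overlap_depth(img_a, img_b, focal_px=700.0, baseline_m=0.8):
    gray_a = cv2.cvtColor(img_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(img_b, cv2.COLOR_BGR2GRAY)
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disp = matcher.compute(gray_a, gray_b).astype(np.float32) / 16.0
    disp[disp <= 0] = np.nan              # discard invalid matches
    return focal_px * baseline_m / disp   # per-pixel depth over the overlap area

def build_depth_map(front, left, right, rear):
    # Depth for each of the four overlap areas (S820-S826), kept together as the map (S830)
    return {
        "Arad1": overlap_depth(front, left),   # front / left overlap
        "Arcd1": overlap_depth(front, right),  # front / right overlap
        "Arab1": overlap_depth(rear, left),    # rear / left overlap
        "Arbc1": overlap_depth(rear, right),   # rear / right overlap
    }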

FIG. 9A illustrates a method of generating a 3D map, that is, a depth map, based on the depths computed sequentially along the traveling direction car-dr of the vehicle 200.

The processor 270 of the surrounding view providing apparatus 100b can generate a depth map for the left region Arle of the vehicle 200 and a depth map for the right region Arri.

On the other hand, the processor 270 of the surrounding view providing apparatus 100b can perform object detection and distance calculation for the left area Arle of the vehicle 200, and object detection and distance calculation for the right area Arri.

That is, the processor 270 of the surrounding view providing apparatus 100b can calculate the depths of the front, rear, left, and right regions based on the depths of the overlap areas Arad1, Arcd1, Arab1, and Arbc1 (S840). In particular, while the vehicle is traveling, the depths for the front, rear, left, and right regions can be calculated based on the depths for the overlap areas obtained sequentially (S840).

Thus, the depth of the object relative to the vehicle periphery can be calculated. In addition, a depth map for the surroundings of the vehicle can be generated.

On the other hand, the processor 270 of the surrounding view providing apparatus 100b may enter the around view mode when the vehicle is reversing or the vehicle speed is equal to or lower than a first speed and, after each image is obtained from the around view cameras, perform stereo matching and depth calculation for the overlap areas and generate a depth map of the vehicle surroundings.

On the other hand, the processor 270 of the surrounding view providing apparatus 100b may combine the images photographed by the plurality of activated around view cameras 295a, 295b, 295c, and 295d to generate the around view image.

Processor 270, on the other hand, may perform detection, identification, and tracking for an object within the surrounding view image, as described in Figures 4A-4B.

Meanwhile, the processor 270 in the surrounding view providing apparatus 100b can generate the around view image and control the generated around view image and the calculated distance information to be output.

FIG. 9B illustrates that the display 280 displays distance information of 10 m and 30 m for the objects BU1 and BU2, respectively, in the generated around view image 1002. Thus, the user can easily grasp the distance to an object near the vehicle.
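
The kind of overlay shown in FIG. 9B can be sketched as follows, drawing a box and a distance label per detected object on the around view image; the object list format and drawing style are assumptions made for illustration.

import cv2

def annotate_distances(around_view_bgr, objects):
    # 'objects' is a list of ((x, y, w, h), distance_m) pairs
    out = around_view_bgr.copy()
    for (x, y, w, h), dist_m in objects:
        cv2.rectangle(out, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(out, "%.0fm" % dist_m, (x, y - 5),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    return out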

On the other hand, the processor 270 of the surrounding view providing apparatus 100b can perform vehicle detection, pedestrian detection, lane detection, road surface detection, obstacle detection, visual odometry, and the like, based on the around view image.

On the other hand, the processor 270 of the surrounding view providing apparatus 100b can perform dead reckoning based on the vehicle running information from the ECU 770 or the sensor unit 760.

Then, the processor 270 of the surrounding view providing apparatus 100b can perform the vehicle motion (egomotion) tracking based on the dead reckoning. At this time, in addition to the dead reckoning, it is also possible to perform the vehicle motion (egomotion) tracking based on the visual odometry.

Further, since the image areas of the side cameras 295a and 295c and the rear camera 295b overlap with respect to the rear, the right rear, and the left rear of the vehicle, the disparity around the vehicle can be calculated by combining the photographed images. The processor 270 may then perform object detection and verification for the rear, right rear, and left rear of the vehicle.

FIG. 10 is an example of an internal block diagram of the mobile terminal.

Referring to the drawing, the internal configuration of the mobile terminal 600 is described.

The mobile terminal 600 may include a wireless communication unit 610, an audio/video (A/V) input unit 620, a user input unit 630, a sensing unit 640, an output unit 650, a memory 660, an interface unit 625, a controller 670, and a power supply unit 690.

The wireless communication unit 610 may include a broadcast receiving module 611, a mobile communication module 613, a wireless communication module 615, an acoustic communication unit 617, and a GPS module 619.

The broadcast receiving module 611 may receive at least one of a broadcast signal and broadcast related information from an external broadcast management server through a broadcast channel. At this time, the broadcast channel may include a satellite channel, a terrestrial channel, and the like.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 611 may be stored in the memory 660.

The mobile communication module 613 transmits and receives a radio signal to at least one of a base station, an external terminal, and a server on a mobile communication network. Here, the wireless signal may include various types of data according to a voice call signal, a video call signal, or a text / multimedia message transmission / reception.

The wireless communication module 615 refers to a module for wireless Internet access, and may be embedded in the mobile terminal 600 or provided externally. For example, the wireless communication module 615 may perform WiFi-based wireless communication or WiFi Direct-based wireless communication.

Meanwhile, the wireless communication module 615 may include a receiving unit 423 for receiving a device identification signal and a transmitting unit 421 for transmitting a remote control signal.

The receiving unit 423 can receive, from the transmitting apparatus 101, a device identification signal which is any one of an IR (infrared) signal, an RF (radio frequency) signal, a ZigBee signal, a Bluetooth signal, and a laser signal.

Accordingly, the receiving unit 423 may include an IR receiving unit (not shown), an RF receiving unit (not shown), and the like.

Meanwhile, the transmitting unit 421 can output a remote control signal which is any one of an IR (infrared) signal, an RF (radio frequency) signal, a ZigBee signal, a Bluetooth signal, and a laser signal.

Accordingly, the transmitting unit 421 may include an IR transmitting unit (not shown), an RF transmitting unit (not shown), and the like.

The acoustic communication unit 617 can perform acoustic communication. In the acoustic communication mode, the acoustic communication unit 617 can output sound by adding predetermined information data to the audio data to be output. Further, in the acoustic communication mode, the acoustic communication unit 617 can extract predetermined information data from the sound received from the outside.

In addition, Bluetooth, RFID (radio frequency identification), infrared data association (IrDA), ultra wideband (UWB), ZigBee, and the like can be used as the short distance communication technology.

The Global Positioning System (GPS) module 619 may receive position information from a plurality of GPS satellites.

The A / V (Audio / Video) input unit 620 is for inputting an audio signal or a video signal, and may include a camera 621 and a microphone 623.

The user input unit 630 generates key input data that the user inputs to control the operation of the terminal. The user input unit 630 may include a key pad, a dome switch, and a touch pad (static pressure / electrostatic). In particular, when the touch pad forms a mutual layer structure with the display 680, it can be called a touch screen.

The sensing unit 640 can sense the current state of the mobile terminal 600, such as the open/close state of the mobile terminal 600 and the position of the mobile terminal 600, and can generate a corresponding sensing signal.

The sensing unit 640 may include a sensing sensor 641, a pressure sensor 643, a motion sensor 645, and the like. The motion sensor 645 can detect the movement or the position of the mobile terminal 600 using an acceleration sensor, a gyro sensor, a gravity sensor, or the like. In particular, the gyro sensor is a sensor for measuring the angular velocity, and it can sense the direction (angle) of rotation about the reference direction.

The output unit 650 may include a display 680, an audio output unit 653, an alarm unit 655, a haptic module 657, and the like.

The display 680 displays and outputs information processed in the mobile terminal 600.

Meanwhile, when the display 680 and the touch pad form a mutual layer structure to constitute a touch screen, the display 680 may be used not only as an output device but also as an input device through which information can be input by the user's touch.

The audio output unit 653 outputs audio data received from the wireless communication unit 610 or stored in the memory 660. The audio output unit 653 may include a speaker, a buzzer, and the like.

The alarm unit 655 outputs a signal for notifying the occurrence of an event of the mobile terminal 600. For example, it is possible to output a signal in a vibration mode.

The haptic module 657 generates various tactile effects that the user can feel. A typical example of the haptic effect generated by the haptic module 657 is a vibration effect.

The memory 660 may store programs for the processing and control of the controller 670 and application data, and may temporarily store input or output data (e.g., a phone book, messages, still images, and moving images).

The interface unit 625 serves as an interface with all external devices connected to the mobile terminal 600. The interface unit 625 may receive data or power from an external device and deliver it to the respective components in the mobile terminal 600, and may transmit data in the mobile terminal 600 to the external device.

The controller 670 typically controls the operation of the respective units and thereby the overall operation of the mobile terminal 600. For example, it performs related control and processing for voice calls, data communication, video calls, and the like. In addition, the controller 670 may include a multimedia playback module 681 for multimedia playback. The multimedia playback module 681 may be configured as hardware in the controller 670, or as software separate from the controller 670.

The power supply unit 690 receives external power and internal power under the control of the controller 670 and supplies the power necessary for the operation of the respective components.

Meanwhile, the block diagram of the mobile terminal 600 shown in FIG. 10 is a block diagram for an embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted according to the specifications of the mobile terminal 600 actually implemented. That is, two or more constituent elements may be combined into one constituent element, or one constituent element may be constituted by two or more constituent elements, if necessary. In addition, the functions performed in each block are intended to illustrate the embodiments of the present invention, and the specific operations and apparatuses do not limit the scope of the present invention.

FIGS. 11A to 12C are views referred to in the description of an operating method of an around view providing apparatus according to an embodiment of the present invention.

FIG. 11A illustrates that the vehicle 200 enters the parking lot in the building 1101 through the entrance 1102.

11B and 11C are diagrams illustrating how the vehicle 200 moves into the parking lot 1100 having a parking area.

The processor 270 of the surrounding view providing apparatus 100b generates around view images based on the images captured by the plurality of around view cameras 295a, 295b, 295c, and 295d when the vehicle is reversing or traveling below a predetermined speed. When the vehicle is parked, a plurality of parking zone identification markers are extracted from the plurality of around view images generated during a first time, and the extracted parking zone identification markers are controlled to be transmitted to the mobile terminal 600.

FIGS. 11B and 11C illustrate the parking zone identification markers 1107a, 1107b, and 1107c (B06, B08, and B05) in the parking lot 1100, consisting of letters and numbers, which the vehicle 200 passes while moving.

In Fig. 11B, the Aro area is an area in which the surround view image is generated for a predetermined time, and represents an area where the parking map can be generated.

On the other hand, while the vehicle 200 is moving, the processor 270 of the surrounding view providing apparatus 100b can recognize and extract the plurality of parking zone identification markers 1107a, 1107b, and 1107c (B06, B08, B05) by analyzing the images captured by the plurality of around view cameras 295a, 295b, 295c, and 295d, or the generated around view images.
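
A hedged sketch of such marker extraction is shown below: each generated around view image is passed to an off-the-shelf OCR engine and strings matching a zone-marker pattern such as 'B06' are collected in time order. The use of pytesseract and the particular regular expression are assumptions for illustration, not the recognition method of this disclosure.

import re
import pytesseract

MARKER_PATTERN = re.compile(r"\b[A-Z]\d{2}\b")  # e.g. B05, B06, B08

def extract_markers(around_view_images):
    markers = []
    for img in around_view_images:
        text = pytesseract.image_to_string(img)
        for marker in MARKER_PATTERN.findall(text):
            if marker not in markers:   # keep the first occurrence, preserving time order
                markers.append(marker)
    return markers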

On the other hand, assume that the vehicle is parked in the specific parking area 1110b of the parking lot 1100 of FIG. 11B, and that after parking the driver walks to the entrance 1109 and leaves the parking lot 1100.

The processor 270 of the surrounding view providing apparatus 100b can control to transmit the extracted multiple parking zone identification markers to the mobile terminal 600 when the parking of the vehicle is completed. Thus, the vehicle driver can easily grasp his / her vehicle position without taking another photograph using the mobile terminal.

In particular, the processor 270 of the surrounding view providing apparatus 100b can control only a predetermined number of markers, among the plurality of parking zone identification markers extracted until the parking of the vehicle is completed, to be transmitted to the mobile terminal 600.

For example, the processor 270 of the surrounding view providing apparatus 100b may transmit only three of the six extracted parking zone identification markers to the mobile terminal 600 using the communication unit 220. Accordingly, the transmission can focus on the information most relevant to the vehicle parking position, reducing unnecessary data transmission.

On the other hand, the processor 270 of the surrounding view providing apparatus 100b can calculate the floor (layer) information of the vehicle 200 based on the vehicle sensor information received through the interface unit 230 and on the around view image, and can control the floor information to be further transmitted to the mobile terminal 600.

For example, when the parking lot 1100 of FIG. 11B is on the second basement level, the processor 270 can receive the descent information of the vehicle through the vehicle sensor information received via the interface unit 230 and calculate the descent height of the vehicle 200. From the descent height of the vehicle 200, the floor on which the vehicle is currently located can be calculated; for example, it can be calculated as the second basement floor.
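
The floor calculation from the accumulated descent height can be illustrated with the simple arithmetic below; the assumed 3 m height per parking level is a placeholder, not a value from this disclosure.

def parking_level(descent_height_m, level_height_m=3.0):
    # Derive the basement level label from the accumulated descent height of the vehicle
    if descent_height_m <= 0:
        return "ground level"
    level = max(int(round(descent_height_m / level_height_m)), 1)
    return "B%d" % level

print(parking_level(6.2))  # a descent of about 6 m -> "B2"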

As another example, the processor 270 of the surrounding view providing apparatus 100b can receive the descent information of the vehicle through the vehicle sensor information received via the interface unit 230, extract text such as 'B2' from the images captured by the plurality of around view cameras 295a, 295b, 295c, and 295d, and calculate the floor on which the vehicle is currently located based on the extracted text; for example, it can be calculated as the second basement floor.

By further transmitting this layer information to the mobile terminal 600, the driver can more clearly grasp the parking position of his / her own vehicle.

FIG. 11D illustrates an example of a multiple parking zone identification marker displayed on the display 680 of the mobile terminal 600. FIG.

The processor 270 of the surrounding view providing apparatus 100b may control the plurality of parking zone identification markers to be transmitted sequentially according to the traveling route of the vehicle, or may control them to be transmitted with the traveling route information of the vehicle added thereto.

In particular, the processor 270 of the surrounding-view providing apparatus 100b can control to sequentially transmit a plurality of parking zone identification markers according to a time sequence.

As shown in FIG. 11D, a parking information screen 1110 including the plurality of parking zone identification markers 1112, 1114, and 1116 (B06, B08, B05) and the traveling route information 1117 of the vehicle can thereby be displayed on the display 680 of the mobile terminal 600.

FIG. 11E illustrates another example of a multiple parking zone identification marker displayed on the display 680 of the mobile terminal 600. FIG.

The processor 270 of the surrounding view providing apparatus 100b may control to transmit a plurality of parking zone identification markers in the reverse order of the time order.

As shown in FIG. 11E, a parking information screen 1110 including the plurality of parking zone identification markers 1116, 1114, and 1112 (B05, B08, B06) and the traveling route information 1119 of the vehicle can thereby be displayed on the display 680 of the mobile terminal 600.

On the other hand, when the ignition is turned off after the vehicle is parked, the processor 270 of the surrounding view providing apparatus 100b activates the plurality of around view cameras 295a, 295b, 295c, and 295d for a second time, generates around view images based on the images captured by the around view cameras 295a, 295b, 295c, and 295d, extracts the driver's walking path information from the generated around view images, extracts the parking lot entrance information 1109, and can control the walking path information or the parking lot entrance information 1109 to be further transmitted to the mobile terminal 600.

On the other hand, the processor 270 of the surrounding view providing apparatus 100b may control the plurality of parking zone identification markers and the walking path information or the parking lot entrance information to be transmitted sequentially, or may control them to be transmitted with the traveling route information of the vehicle added thereto.

FIG. 11F illustrates another example of a multiple parking zone identification marker displayed on the display 680 of the mobile terminal 600. FIG.

The processor 270 of the surrounding view providing apparatus 100b can control to sequentially transmit a plurality of parking zone identification markers and parking lot entrance and exit information according to a time sequence.

As shown in FIG. 11F, a parking information screen 1110 including the plurality of parking zone identification markers 1112, 1114, and 1116 (B06, B08, B05), the entrance information 1132, and the traveling route information 1117 of the vehicle can be displayed on the display 680 of the mobile terminal 600.

FIG. 11G illustrates another example of a multiple parking zone identification marker displayed on the display 680 of the mobile terminal 600. FIG.

The processor 270 of the surrounding view providing apparatus 100b may control the plurality of parking zone identification markers and the parking lot entrance information to be transmitted in the reverse order of the time order.

Thus, as shown in FIG. 11G, a parking information screen 1110 including the entrance information 1132, the plurality of parking zone identification markers 1116, 1114, and 1112 (B05, B08, B06), and the traveling route information 1119 of the vehicle can be displayed on the display 680 of the mobile terminal 600.

The processor 270 of the surrounding view providing apparatus 100b may generate a parking map using the plurality of around view images generated during the first time, add the extracted plurality of parking zone identification markers to the parking map, and control the result to be transmitted to the mobile terminal 600.

Further, the processor 270 of the surrounding-view providing apparatus 100b can control to highlight the parking position of the vehicle on the parking map.

FIG. 11H illustrates that the parking map image 1150 and the parking position 1152 of the vehicle are highlighted on the display 680 of the mobile terminal 600. The entrance 1154 is also highlighted, and the walking route 1153 from the parking position to the entrance is displayed. Thus, the driver can intuitively recognize the parking position of the vehicle.

The processor 270 of the surrounding view providing apparatus 100b may generate a parking map using the plurality of around view images generated during the first time, generate a moving picture including the moving path of the vehicle and the extracted parking zone identification markers on the parking map, and control the moving picture to be transmitted to the mobile terminal 600.

Specifically, the processor 270 may generate, using the plurality of around view images generated during the first time, a moving picture that shows the moving path of the vehicle together with the parking map.

The moving picture at this time may be a moving picture showing a plurality of surround view images in a slide show. Alternatively, it may be a moving image obtained by combining a plurality of surround view images on a frame basis.
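
The slide-show style moving picture can be sketched as below, writing the stored around view images as frames of a short video file to be sent to the mobile terminal; the codec, frame rate, and file name are assumptions for illustration.

import cv2

def make_parking_video(around_view_images, path="parking_route.mp4", fps=2):
    h, w = around_view_images[0].shape[:2]
    writer = cv2.VideoWriter(path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
    for frame in around_view_images:
        writer.write(frame)               # each around view image becomes one frame
    writer.release()
    return path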

On the other hand, at the completion of the vehicle parking, the moving picture is terminated, and the parking map image 1150 and the parking position 1152 of the vehicle can be highlighted as shown in Fig. 11H. In addition, the entrance 1154 and the walking path 1153 from the parking position to the entrance can also be displayed.

Fig. 11I illustrates that a display 680 of the mobile terminal 600 displays a parking map image 1150 and an icon 1160 indicating reproduction of a moving picture associated with the parking map. When the icon 1160 indicating reproduction is selected, a moving picture associated with the parking map image 1150 is reproduced. Accordingly, the driver can accurately grasp the vehicle parking position after parking.

FIG. 12A shows a case where the driver Pd utters a voice command 205 for finding his or her own vehicle; the mobile terminal 600 performs voice recognition and, after the voice recognition, exchanges data with the surrounding view providing apparatus 100b of the driver's vehicle 200.

In the figure, an object 1210 indicating that vehicle parking information has been received from the surrounding view providing apparatus 100b is displayed on the mobile terminal 600.

The vehicle parking information at this time may be any one of various information shown in Figs. 11D to 11I.

On the other hand, FIG. 12B illustrates that the parking map image 1150 and the parking position 1152 of the vehicle are highlighted on the display 680 of the mobile terminal 600. The entrance 1154 is also highlighted, and the walking route 1153 from the parking position to the entrance is displayed. Thus, the driver can intuitively recognize the parking position of the vehicle.

On the other hand, when the driver Pd flicks the parking map image 1150 of FIG. 12B in one direction, the detailed parking zone information screen 1110 can be displayed as shown in FIG. 12C.

That is, a plurality of parking zone identification markers can be displayed sequentially in time order.

As shown in FIG. 12C, a parking information screen 1110 including the plurality of parking zone identification markers 1112, 1114, and 1116 (B06, B08, B05) and the traveling route information 1117 of the vehicle may be displayed on the display 680 of the mobile terminal 600.

On the other hand, among the plurality of parking zone identification markers 1112, 1114, and 1116 (B06, B08, B05), the parking zone identification marker 1116 (B05) corresponding to the final parking position can be highlighted and displayed.

Meanwhile, the around view providing apparatus and the vehicle operating method according to the present invention can be implemented as processor-readable code on a processor-readable recording medium provided in the around view providing apparatus or the vehicle. The processor-readable recording medium includes all kinds of recording devices in which data readable by the processor is stored. Examples of the processor-readable recording medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and it may also be implemented in the form of a carrier wave such as transmission over the Internet. In addition, the processor-readable recording medium may be distributed over network-connected computer systems so that processor-readable code can be stored and executed in a distributed fashion.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those skilled in the art that the invention is not limited to the disclosed exemplary embodiments and that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention.

Claims (15)

A plurality of around view cameras for providing an around view;
A communication unit for exchanging data with a mobile terminal;
And a processor configured to generate an around view image based on the images captured by the plurality of around view cameras when the vehicle is reversing or traveling below a predetermined speed, to extract a plurality of parking zone identification markers from a plurality of around view images generated during a first time when the vehicle is parked, and to control the extracted plurality of parking zone identification markers to be transmitted to the mobile terminal.
The apparatus according to claim 1,
Wherein the processor:
Controls the plurality of parking zone identification markers to be transmitted sequentially according to the traveling route of the vehicle, or controls the plurality of parking zone identification markers to be transmitted with the traveling route information of the vehicle added thereto.
The apparatus according to claim 1,
Wherein the processor:
Controls the plurality of parking zone identification markers to be transmitted sequentially in time order.
The apparatus according to claim 1,
Wherein the processor:
Controls the plurality of parking zone identification markers to be transmitted in the reverse of the time order.
The apparatus according to claim 1,
Wherein the processor:
Activates the plurality of around view cameras for a second time when the ignition is turned off after the vehicle is parked, generates around view images based on the images captured by the plurality of around view cameras, extracts the walking path information of the driver from the generated around view images, extracts the parking lot entrance information, and controls the walking path information or the parking lot entrance information to be further transmitted to the mobile terminal.
6. The apparatus according to claim 5,
Wherein the processor:
Controls the plurality of parking zone identification markers and the walking path information or the parking lot entrance information to be transmitted sequentially, or controls them to be transmitted with the traveling route information of the vehicle added thereto.
The apparatus according to claim 1,
Wherein the processor:
Generates a parking map using the plurality of around view images generated during the first time,
And adds the extracted plurality of parking zone identification markers to the parking map and controls the result to be transmitted to the mobile terminal.
8. The apparatus according to claim 7,
Wherein the processor:
Controls the parking position of the vehicle to be highlighted on the parking map.
The apparatus according to claim 1,
Wherein the processor:
Generates a parking map using the plurality of around view images generated during the first time, generates a moving picture including the moving path of the vehicle and the extracted parking zone identification markers on the parking map, and controls the moving picture to be transmitted to the mobile terminal.
The apparatus according to claim 1,
Further comprising an interface unit for receiving sensor information from the vehicle,
Wherein the processor:
Calculates the floor (layer) information of the vehicle based on the vehicle sensor information received through the interface unit and on the around view image, and controls the floor information to be further transmitted to the mobile terminal.
A steering driving unit for driving a steering apparatus;
A brake driving unit for driving a brake apparatus;
A power source driving unit for driving a power source;
A plurality of around view cameras for providing an around view;
A communication unit for exchanging data with a mobile terminal;
And a processor configured to generate an around view image based on the images captured by the plurality of around view cameras when the vehicle is reversing or traveling below a predetermined speed, to extract a plurality of parking zone identification markers from a plurality of around view images generated during a first time when the vehicle is parked, and to control the extracted plurality of parking zone identification markers to be transmitted to the mobile terminal.
12. The vehicle according to claim 11,
Wherein the processor:
Controls the plurality of parking zone identification markers to be transmitted sequentially according to the traveling route of the vehicle, or controls the plurality of parking zone identification markers to be transmitted with the traveling route information of the vehicle added thereto.
The vehicle according to claim 11,
Wherein the processor:
Activates the plurality of around view cameras for a second time when the ignition is turned off after the vehicle is parked, generates around view images based on the images captured by the plurality of around view cameras, extracts the walking path information of the driver from the generated around view images, extracts the parking lot entrance information, and controls the walking path information or the parking lot entrance information to be further transmitted to the mobile terminal.
The vehicle according to claim 11,
Wherein the processor:
Generates a parking map using the plurality of around view images generated during the first time,
And adds the extracted plurality of parking zone identification markers to the parking map and controls the result to be transmitted to the paired mobile terminal.
The vehicle according to claim 11,
Wherein the processor:
Generates a parking map using the plurality of around view images generated during the first time, generates a moving picture including the moving path of the vehicle and the extracted parking zone identification markers on the parking map, and controls the moving picture to be transmitted to the mobile terminal.
KR1020150081049A 2015-06-09 2015-06-09 Apparatus for prividing around view and vehicle including the same KR20160144643A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150081049A KR20160144643A (en) 2015-06-09 2015-06-09 Apparatus for prividing around view and vehicle including the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150081049A KR20160144643A (en) 2015-06-09 2015-06-09 Apparatus for prividing around view and vehicle including the same

Publications (1)

Publication Number Publication Date
KR20160144643A true KR20160144643A (en) 2016-12-19

Family

ID=57735437

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150081049A KR20160144643A (en) 2015-06-09 2015-06-09 Apparatus for prividing around view and vehicle including the same

Country Status (1)

Country Link
KR (1) KR20160144643A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11037328B1 (en) 2019-12-31 2021-06-15 Lyft, Inc. Overhead view image generation
WO2021138357A1 (en) * 2019-12-31 2021-07-08 Lyft, Inc. Map feature extraction using overhead view images
US11244500B2 (en) 2019-12-31 2022-02-08 Woven Planet North America, Inc. Map feature extraction using overhead view images
US11288522B2 (en) 2019-12-31 2022-03-29 Woven Planet North America, Inc. Generating training data from overhead view images
US11727601B2 (en) 2019-12-31 2023-08-15 Woven Planet North America, Inc. Overhead view image generation

Similar Documents

Publication Publication Date Title
KR102043060B1 (en) Autonomous drive apparatus and vehicle including the same
KR101551215B1 (en) Driver assistance apparatus and Vehicle including the same
KR101741433B1 (en) Driver assistance apparatus and control method for the same
KR101582572B1 (en) Driver assistance apparatus and Vehicle including the same
KR101750876B1 (en) Display apparatus for vehicle and Vehicle
US10782405B2 (en) Radar for vehicle and vehicle provided therewith
CN105270179B (en) Vehicle parking assistance device and vehicle
KR102309316B1 (en) Display apparatus for vhhicle and vehicle including the same
KR20170010645A (en) Autonomous vehicle and autonomous vehicle system including the same
KR101632179B1 (en) Driver assistance apparatus and Vehicle including the same
KR101698781B1 (en) Driver assistance apparatus and Vehicle including the same
KR20150139368A (en) Vehivle charge assistance device and Vehicle including the same
KR20170011885A (en) Antenna, radar for vehicle, and vehicle including the same
KR20160148394A (en) Autonomous vehicle
KR101641491B1 (en) Driver assistance apparatus and Vehicle including the same
KR20160148395A (en) Autonomous vehicle
KR20160144643A (en) Apparatus for prividing around view and vehicle including the same
KR101822896B1 (en) Driver assistance apparatus and control method for the same
KR20180073540A (en) Parking Assistance Apparatus and Vehicle Having The Same
KR101872477B1 (en) Vehicle
KR20150072942A (en) Driver assistance apparatus and Vehicle including the same
KR20160064762A (en) Display apparatus for vhhicleand vehicle including the same
KR20160131580A (en) Apparatus for prividing around view and vehicle including the same
KR20170011881A (en) Radar for vehicle, and vehicle including the same
KR20160144645A (en) Apparatus for prividing around view and vehicle including the same