KR20170006753A - Apparatus for detecting navigation and vehicle including the same - Google Patents

Apparatus for detecting navigation and vehicle including the same

Info

Publication number
KR20170006753A
KR20170006753A
Authority
KR
South Korea
Prior art keywords
vehicle
light
information
processor
unit
Prior art date
Application number
KR1020150097875A
Other languages
Korean (ko)
Inventor
김성민
임채환
홍기현
Original Assignee
엘지전자 주식회사
Priority date
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to KR1020150097875A priority Critical patent/KR20170006753A/en
Publication of KR20170006753A publication Critical patent/KR20170006753A/en

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 - Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 - Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • B60W40/06 - Road conditions
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W10/00 - Conjoint control of vehicle sub-units of different type or different function
    • B60W10/20 - Conjoint control of vehicle sub-units of different type or different function including control of steering systems
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W10/00 - Conjoint control of vehicle sub-units of different type or different function
    • B60W10/22 - Conjoint control of vehicle sub-units of different type or different function including control of suspension systems
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 - Interaction between the driver and the control system
    • B60W50/14 - Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/0077
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00 - Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40 - Photo or light sensitive means, e.g. infrared sensors
    • B60W2550/40

Abstract

The present invention relates to a travel detection apparatus and a vehicle including the same. The travel detection apparatus comprises: a light transmission unit that outputs pattern light downward from a vehicle; a light reception unit that receives the reflected light produced when the output pattern light is reflected; and a processor that determines the state of the road surface around the vehicle based on the difference between the pattern light and the reflected light, or on the difference between successively received reflections, and provides a vehicle attitude control signal corresponding to the determined road surface state. The vehicle attitude can thereby be controlled in correspondence with the road surface state around the vehicle, based on the pattern light.

Description

[0001] The present invention relates to a travel detection apparatus and a vehicle including the same.

BACKGROUND OF THE INVENTION 1. Field of the Invention: The present invention relates to a travel detection device and a vehicle having the same, and more particularly, to a travel detection device capable of performing vehicle attitude control corresponding to the road surface condition around the vehicle based on pattern light, and a vehicle having the same.

A vehicle is a device that moves a boarding user in a desired direction. A typical example is an automobile.

Meanwhile, for the convenience of users, vehicles are equipped with various sensors and electronic devices.

Meanwhile, light-based sensors have been used to calculate position information of the vehicle, but they have been limited to calculating only two-dimensional position information along the x and y axes.

An object of the present invention is to provide a travel detection device capable of performing vehicle attitude control corresponding to the road surface condition around the vehicle based on pattern light, and a vehicle equipped with the same.

According to an aspect of the present invention, there is provided a travel detection apparatus including: a light transmitting unit for outputting pattern light downward from a vehicle; a light receiving unit for receiving the reflected light of the output pattern light; and a processor for determining the road surface condition around the vehicle based on the difference between the pattern light and the reflected light, or on the difference between successively received reflections, and for providing a vehicle attitude control signal corresponding to the determined road surface condition.

According to another aspect of the present invention, there is provided a vehicle comprising: a steering driver for driving a steering device; a brake driver for driving a brake device; a power source driver for driving a power source; a light transmitting unit for outputting pattern light downward from the vehicle; a light receiving unit for receiving the reflected light of the output pattern light; and a processor for determining the road surface condition around the vehicle based on the difference between the pattern light and the reflected light, or on the difference between successively received reflections, and for providing a vehicle attitude control signal corresponding to the determined road surface condition.

The travel detection apparatus and the vehicle including the same according to an embodiment of the present invention include a light transmitting unit that outputs pattern light downward from the vehicle, a light receiving unit that receives the reflected pattern light, and a processor that determines the road surface condition around the vehicle based on the difference between the pattern light and the reflected light, or on the difference between successively received reflections, and provides a vehicle attitude control signal corresponding to the determined road surface condition. It thereby becomes possible to perform vehicle attitude control corresponding to the road surface condition around the vehicle, based on the pattern light.

Meanwhile, when a speed bump is located in front of the vehicle, the travel detection apparatus and the vehicle having the same detect the speed bump based on the difference between the pattern light and the reflected light, or on the difference between successively received reflections, and provide a suspension drive control signal for controlling the suspension drive unit in the vehicle. A stable ride can thus be maintained despite the speed bump.
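
By way of illustration only (this sketch is not part of the patent; the height-profile representation, threshold value, and control-signal fields are assumptions), the bump detection and the resulting suspension signal described above might look like this in Python:

    import numpy as np

    BUMP_HEIGHT_M = 0.04  # assumed threshold for a bump-like rise above road level

    def detect_speed_bump(height_profile: np.ndarray) -> bool:
        # height_profile: road heights (m) along the driving direction,
        # recovered from the deformation of the reflected light pattern
        baseline = np.median(height_profile)  # nominal road level
        return bool(np.max(height_profile - baseline) > BUMP_HEIGHT_M)

    def suspension_control_signal(height_profile: np.ndarray) -> dict:
        # hypothetical suspension drive control signal: soften damping
        # before the wheels reach the detected bump
        if detect_speed_bump(height_profile):
            return {"damping": "soft", "reason": "speed_bump_ahead"}
        return {"damping": "normal", "reason": None}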

Meanwhile, when an uphill road, a downhill road, or a road inclined in one direction lies in front of the vehicle, the travel detection device and the vehicle having the same detect the incline based on the difference between the pattern light and the reflected light, or on the difference between successively received reflections, and provide a suspension drive control signal for controlling the suspension drive unit in the vehicle in correspondence with the incline. A stable ride can thus be maintained on inclined roads as well.
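
Again purely as an illustration (the point-cloud layout is an assumption, not from the patent), such an incline could be estimated by fitting a plane to road-surface points triangulated from the pattern light:

    import numpy as np

    def estimate_road_incline(points: np.ndarray) -> tuple[float, float]:
        # points: (N, 3) samples (x, y, z) of the road surface in metres,
        # triangulated from the pattern light; fits z = a*x + b*y + c
        x, y, z = points[:, 0], points[:, 1], points[:, 2]
        A = np.column_stack([x, y, np.ones_like(x)])
        (a, b, _c), *_ = np.linalg.lstsq(A, z, rcond=None)
        pitch = np.degrees(np.arctan(a))  # uphill/downhill along the driving direction
        roll = np.degrees(np.arctan(b))   # road inclined in one (lateral) direction
        return pitch, roll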

Meanwhile, when a pothole is located in front of the vehicle, the travel detection apparatus and the vehicle having the same detect the pothole based on the difference between the pattern light and the reflected light, or on the difference between successively received reflections, and provide a steering drive control signal for controlling the steering drive unit in the vehicle. Driving stability can thus be improved.
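
A corresponding pothole check might look as follows (illustrative only; the depth threshold and the fixed steering offset are assumptions):

    import numpy as np

    POTHOLE_DEPTH_M = 0.05  # assumed depth threshold below road level

    def steering_offset(heights: np.ndarray, lateral_pos_m: np.ndarray) -> float:
        # heights: road heights (m) sampled across the lane width;
        # lateral_pos_m: matching positions relative to the vehicle centre.
        # Returns a hypothetical lateral offset (m) steering away from the
        # deepest depression, or 0.0 when no pothole is detected.
        depth = np.median(heights) - heights
        i = int(np.argmax(depth))
        if depth[i] < POTHOLE_DEPTH_M:
            return 0.0
        return -0.3 if lateral_pos_m[i] > 0 else 0.3  # steer to the opposite side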

Meanwhile, the travel detection apparatus and the vehicle having the same calculate the three-dimensional position or the three-dimensional moving distance of the vehicle based on the difference between the pattern light and the reflected light, or on the difference between successively received reflections, and can thus provide information on the three-dimensional position or moving distance.
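
The planar part of such a displacement can be illustrated with phase correlation between two successively received reflected-light frames; this standard technique is used here only as a stand-in for whatever the patent's processor actually computes:

    import numpy as np

    def estimate_shift(frame_prev: np.ndarray, frame_curr: np.ndarray) -> tuple[int, int]:
        # Phase correlation between successive reflected-light images;
        # scaled by the projection geometry, the peak gives the planar
        # displacement (height changes come from the pattern deformation).
        cross = np.fft.fft2(frame_prev) * np.conj(np.fft.fft2(frame_curr))
        corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-9)).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        h, w = corr.shape
        if dy > h // 2:
            dy -= h  # wrap large shifts into negative values
        if dx > w // 2:
            dx -= w
        return int(dy), int(dx)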

Meanwhile, the travel detection apparatus and the vehicle having the same can vary at least one of the luminance, color, and area of the output pattern light based on the level of the received reflected light, so that the reflected light is obtained stably.
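
A minimal feedback loop for this adaptation of the pattern-light output (all constants are assumptions, and only luminance is varied here) could be:

    def adapt_pattern_luminance(received_level: float, luminance: float) -> float:
        # received_level and luminance normalised to [0, 1]: raise the
        # output when the reflection is weak, lower it near saturation
        if received_level < 0.3:
            return min(1.0, luminance * 1.2)
        if received_level > 0.9:
            return max(0.1, luminance * 0.8)
        return luminance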

Meanwhile, by outputting infrared-based pattern light, the travel detection apparatus and the vehicle having the same can obtain the reflected light stably both during the day and at night.

FIG. 1 is a conceptual diagram of a vehicle communication system having a travel detection device according to an embodiment of the present invention.
FIG. 2A is a diagram showing the appearance of a vehicle having various cameras.
FIG. 2B is a view showing the appearance of a stereo camera attached to the vehicle of FIG. 2A.
FIG. 2C is a view schematically showing the positions of a plurality of cameras attached to the vehicle of FIG. 2A.
FIG. 2D illustrates an around view image based on images photographed by the plurality of cameras of FIG. 2C.
FIGS. 3A to 3B illustrate various examples of an internal block diagram of the autonomous travel apparatus of FIG. 1.
FIGS. 3C to 3D illustrate various examples of an internal block diagram of the around view providing apparatus of FIG. 1.
FIG. 3E is an internal block diagram of the vehicle display device of FIG. 1.
FIGS. 4A to 4B illustrate various examples of internal block diagrams of the processors of FIGS. 3A to 3D.
FIG. 5 is a diagram illustrating object detection in the processor of FIGS. 4A to 4B.
FIGS. 6A to 6B are views referred to in the description of the operation of the autonomous travel apparatus of FIG. 1.
FIG. 7 is an example of an internal block diagram of a vehicle according to an embodiment of the present invention.
FIG. 8 is an example of an internal block diagram of a travel detection device according to an embodiment of the present invention.
FIG. 9 is a view showing an example of a vehicle mounting position of the travel detection device of FIG. 8.
FIGS. 10 to 12B are diagrams referred to in the description of the operation principle of the travel detection device of FIG. 8.
FIGS. 13A to 16D are diagrams referred to in the description of the operation of the travel detection device of FIG. 8.
FIGS. 17A to 18B are diagrams referred to in the description of the operation of the travel detection device of FIG. 8.

Hereinafter, the present invention will be described in detail with reference to the drawings.

The suffixes "module" and "part" for components used in the following description are given merely for convenience of description and carry no special significance or role in themselves. Accordingly, the terms "module" and "part" may be used interchangeably.

The vehicle described herein may include both a car and a motorcycle. Hereinafter, the description will focus on a car.

Meanwhile, the vehicle described in the present specification may include a vehicle having an engine as a power source, a hybrid vehicle having both an engine and an electric motor as power sources, and an electric vehicle having an electric motor as a power source.

FIG. 1 is a conceptual diagram of a vehicle communication system having a travel detection device according to an embodiment of the present invention.

Referring to the drawing, a vehicle communication system 10 may include a vehicle 200, terminals 600a and 600b, and a server 500.

The vehicle 200 may include an autonomous travel apparatus 100, a vehicle display apparatus 400, a travel detection apparatus 300, and the like in the interior of the vehicle.

The travel detection device 300 is a device for detecting the position or moving distance of the vehicle while the vehicle is traveling.

The travel detection device 300 according to an embodiment of the present invention may be a light-based apparatus for detecting navigation.

Specifically, the travel detection device 300 outputs pattern light downward from the vehicle, receives the reflected light of the output pattern light, determines the road surface condition around the vehicle based on the difference between the pattern light and the reflected light, or on the difference between successively received reflections, and can provide a vehicle attitude control signal corresponding to the determined road surface condition. Vehicle attitude control corresponding to the road surface condition around the vehicle thereby becomes possible, based on the pattern light.

Meanwhile, when a speed bump is located in front of the vehicle, the travel detection device 300 detects the speed bump based on the difference between the pattern light and the reflected light, or on the difference between successively received reflections, and provides a suspension drive control signal for controlling the suspension drive unit in the vehicle, so that a stable ride can be maintained despite the speed bump.

Meanwhile, when an uphill road, a downhill road, or a road inclined in one direction lies in front of the vehicle, the travel detection device 300 detects the incline based on the difference between the pattern light and the reflected light, or on the difference between successively received reflections, and provides a suspension drive control signal for controlling the suspension drive unit in the vehicle in correspondence with the incline, so that a stable ride can be maintained.

Meanwhile, when a pothole is located in front of the vehicle, the travel detection device 300 detects the pothole based on the difference between the pattern light and the reflected light, or on the difference between successively received reflections, and provides a steering drive control signal for controlling the steering drive unit in the vehicle, thereby improving driving stability.

Meanwhile, the travel detection device 300 calculates the three-dimensional position or the three-dimensional moving distance of the vehicle based on the difference between the pattern light and the reflected light, or on the difference between successively received reflections, and can thus provide information on the three-dimensional position or moving distance.

The travel detection device 300 will be described in detail with reference to FIG. 8 and the subsequent figures.

Meanwhile, the autonomous travel apparatus 100 may include a vehicle driving assistance device 100a, an around view providing apparatus 100b, and the like.

For example, for autonomous driving of the vehicle, autonomous travel may be performed through the vehicle driving assistance device 100a when the vehicle speed is equal to or greater than a predetermined speed, and through the around view providing apparatus 100b when the vehicle speed is less than the predetermined speed.

As another example, the vehicle driving assistance device 100a and the around view providing apparatus 100b may be operated together for autonomous driving: at or above a predetermined speed, greater weight is given to the vehicle driving assistance device 100a, so that autonomous travel is performed mainly by the vehicle driving assistance device 100a; below the predetermined speed, greater weight is given to the around view providing apparatus 100b, so that autonomous travel is performed mainly by the around view providing apparatus 100b.
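
Purely as a sketch of this speed-dependent weighting (the threshold and weight values are assumptions, not from the patent):

    def fusion_weights(speed_kmh: float, threshold_kmh: float = 30.0) -> tuple[float, float]:
        # returns (w_assist, w_around_view), summing to 1: above the
        # threshold rely mainly on the driving assistance device, below
        # it rely mainly on the around view providing apparatus
        if speed_kmh >= threshold_kmh:
            return 0.8, 0.2
        return 0.2, 0.8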

Meanwhile, the vehicle driving assistance device 100a, the around view providing apparatus 100b, and the vehicle display apparatus 400 may each exchange data with the terminals 600a and 600b or the server 500 via their own communication unit (not shown) or a communication unit provided in the vehicle 200.

For example, when the mobile terminal 600a is located inside or near the vehicle, at least one of the vehicle driving assistance device 100a, the around view providing apparatus 100b, and the vehicle display apparatus 400 can exchange data with the terminal 600a by short-range communication.

As another example, when the terminal 600b is located at a remote place outside the vehicle, at least one of the vehicle driving assistance device 100a, the around view providing apparatus 100b, and the vehicle display apparatus 400 can exchange data with the terminal 600b or the server 500 by long-range communication via the network 570.

The terminals 600a and 600b may be mobile terminals such as mobile phones, smartphones, tablet PCs, and wearable devices such as smart watches, or fixed terminals such as a TV or a monitor. Hereinafter, the terminal 600 will be described mainly as a mobile terminal such as a smartphone.

On the other hand, the server 500 may be a server provided by a vehicle manufacturer or a server operated by a provider providing a vehicle-related service. For example, it may be a server operated by a provider providing information on road traffic conditions and the like.

On the other hand, the vehicle driving assistant 100a can generate and provide vehicle-related information by signal processing the stereo image received from the stereo camera 195 based on computer vision. Here, the vehicle-related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information for a driving guide to the vehicle driver.

Alternatively, the vehicle driving assistance device 100a may generate a control signal for autonomous travel of the vehicle based on the stereo image received from the stereo camera 195 and distance information to objects around the vehicle received from the radar 797. For example, it is possible to output a control signal for controlling at least one of the steering driver, the brake driver, and the power source driver during autonomous travel.

Meanwhile, the around view providing apparatus 100b transfers a plurality of images photographed by the plurality of cameras 295a, 295b, 295c, and 295d to a processor (270 in FIG. 3C or FIG. 3D), and the processor (270 in FIG. 3C or FIG. 3D) can combine the plurality of images to generate and provide an around view image.

Meanwhile, the vehicle display device 400 may be an AVN (Audio Video Navigation) device.

Meanwhile, the vehicle display apparatus 400 may include a space recognition sensor unit and a touch sensor unit, so that a remote approach can be sensed by the space recognition sensor unit and a near touch approach can be sensed through the touch sensor unit. A user interface corresponding to the sensed user gesture or touch can then be provided.

FIG. 2A is a diagram showing the appearance of a vehicle having various cameras.

Referring to the drawing, the vehicle 200 may include wheels 203FR, 103FL, 103RL, ... rotated by a power source, a steering wheel 250 for adjusting the traveling direction of the vehicle 200, a stereo camera 195 provided inside the vehicle 200 for the vehicle driving assistance device 100a, and a plurality of cameras 295a, 295b, 295c, and 295d mounted on the vehicle 200 for the around view providing apparatus 100b of FIG. 1. In the figure, only the left camera 295a and the front camera 295d are shown for convenience.

The stereo camera 195 may include a plurality of cameras, and the stereo image obtained by the plurality of cameras may be signal-processed in the vehicle driving assistance apparatus (100a in Fig. 3).

On the other hand, the figure illustrates that the stereo camera 195 includes two cameras.

The plurality of cameras 295a, 295b, 295c, and 295d can be activated when the vehicle speed is equal to or lower than a predetermined speed or when the vehicle moves backward, and each can acquire a photographed image. The images obtained by the plurality of cameras can be signal-processed within the around view providing apparatus (100b in FIG. 3C or 3D).

FIG. 2B is a view showing the appearance of a stereo camera attached to the vehicle of FIG. 2A.

Referring to the drawing, the stereo camera module 195 may include a first camera 195a having a first lens 193a, and a second camera 195b having a second lens 193b.

The stereo camera module 195 may include a first light shield 192a and a second light shield 192b for shielding light incident on the first lens 193a and the second lens 193b, respectively.

The stereo camera module 195 in the drawing may have a structure detachable from the ceiling or the windshield of the vehicle 200.

A vehicle driving assistance device 100a (FIG. 3) having such a stereo camera module 195 can obtain a stereo image of the area in front of the vehicle from the stereo camera module 195, perform disparity detection based on the stereo image, perform object detection for at least one of the stereo images based on the disparity information, and continuously track the motion of an object after object detection.

FIG. 2C is a view schematically showing the positions of a plurality of cameras attached to the vehicle of FIG. 2A, and FIG. 2D illustrates an example of an around view image based on images photographed by the plurality of cameras of FIG. 2C.

First, referring to FIG. 2C, a plurality of cameras 295a, 295b, 295c, and 295d may be disposed on the left, rear, right, and front of the vehicle, respectively.

In particular, the left camera 295a and the right camera 295c may be disposed in a case that surrounds the left side mirror and a case that surrounds the right side mirror, respectively.

Meanwhile, the rear camera 295b and the front camera 295d can be disposed near the trunk switch and near the emblem or the radiator grille, respectively.

Each of the plurality of images photographed by the plurality of cameras 295a, 295b, 295c, and 295d is transmitted to a processor (270 in FIG. 3C or 3D) in the vehicle 200, and the processor (270 in FIG. 3C or 3D) combines the plurality of images to generate an around view image.

FIG. 2D illustrates an example of the around view image 210. The around view image 210 may include a first image area 295ai from the left camera 295a, a second image area 295bi from the rear camera 295b, a third image area 295ci from the right camera 295c, and a fourth image area 295di from the front camera 295d.

FIGS. 3A to 3B illustrate various examples of an internal block diagram of the autonomous travel apparatus of FIG. 1.

FIGS. 3A and 3B illustrate internal block diagrams of the autonomous travel apparatus 100, centered on the vehicle driving assistance device 100a.

The vehicle driving assistant 100a can process the stereo image received from the stereo camera 195 based on computer vision to generate vehicle related information. Here, the vehicle-related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information for a driving guide to the vehicle driver.

Referring to FIG. 3A, the vehicle driving assistance device 100a may include a communication unit 120, an interface unit 130, a memory 140, a processor 170, a power supply unit 190, and a stereo camera 195, and may further include an audio input unit and an audio output unit (not shown).

The communication unit 120 can exchange data with the mobile terminal 600 or the server 500 in a wireless manner. In particular, the communication unit 120 can exchange data with a mobile terminal of a vehicle driver wirelessly. As a wireless data communication method, various data communication methods such as Bluetooth, WiFi Direct, WiFi, and APiX are possible.

The communication unit 120 can receive weather information and road traffic situation information, for example, TPEG (Transport Protocol Expert Group) information, from the mobile terminal 600 or the server 500. Meanwhile, the vehicle driving assistance device 100a may transmit real-time traffic information, grasped on the basis of the stereo image, to the mobile terminal 600 or the server 500.

On the other hand, when the user is aboard the vehicle, the user's mobile terminal 600 and the vehicle driving assistant 100a can perform pairing with each other automatically or by execution of the user's application.

The interface unit 130 can receive vehicle-related data or transmit signals processed or generated by the processor 170 to the outside. To this end, the interface unit 130 can perform data communication with the ECU 770, the AVN (Audio Video Navigation) device 400, the sensor unit 760, and the like in the vehicle by a wired or wireless communication method.

The interface unit 130 can receive map information related to vehicle driving through data communication with the vehicle display apparatus 400.

On the other hand, the interface unit 130 can receive the sensor information from the ECU 770 or the sensor unit 760.

Here, the sensor information may include at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward movement information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, and vehicle interior humidity information.

Such sensor information may be obtained from a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/backward movement sensor, a wheel sensor, a vehicle speed sensor, a vehicle body inclination sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, and the like. Meanwhile, the position module may include a GPS module for receiving GPS information.

On the other hand, among the sensor information, the vehicle direction information, the vehicle position information, the vehicle angle information, the vehicle speed information, the vehicle tilt information, and the like relating to the vehicle running can be referred to as vehicle running information.

The memory 140 may store various data for the overall operation of the vehicle driving assistance device 100a, such as a program for processing or control by the processor 170.

An audio output unit (not shown) converts an electric signal from the processor 170 into an audio signal and outputs the audio signal. For this purpose, a speaker or the like may be provided. The audio output unit (not shown) can also output sound corresponding to the operation of the input unit 110, that is, the button.

An audio input unit (not shown) can receive a user's voice. For this purpose, a microphone may be provided. The received voice may be converted to an electrical signal and transmitted to the processor 170.

The processor 170 controls the overall operation of each unit in the vehicle driving assistant 100a.

In particular, the processor 170 performs signal processing based on computer vision. Specifically, the processor 170 obtains a stereo image of the area in front of the vehicle from the stereo camera 195, performs a disparity calculation for the area in front of the vehicle based on the stereo image, performs object detection for at least one of the stereo images based on the calculated disparity information, and continuously tracks the motion of the object after object detection.

In particular, when detecting objects, the processor 170 can perform lane detection, vehicle detection, pedestrian detection, traffic sign detection, road surface detection, and the like.

The processor 170 may perform a distance calculation to the detected nearby vehicle, a speed calculation of the detected nearby vehicle, a speed difference calculation with the detected nearby vehicle, and the like.

Meanwhile, the processor 170 can receive weather information, traffic situation information on the road, and TPEG (Transport Protocol Expert Group) information, for example, through the communication unit 120.

On the other hand, the processor 170 can grasp, in real time, the traffic situation information on the surroundings of the vehicle based on the stereo image in the vehicle driving assistant device 100a.

Meanwhile, the processor 170 can receive map information and the like from the vehicle display apparatus 400 through the interface unit 130.

Meanwhile, the processor 170 can receive sensor information from the ECU 770 or the sensor unit 760 through the interface unit 130. Here, the sensor information may include at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward movement information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, and vehicle interior humidity information.

The power supply unit 190 can supply power necessary for the operation of each component under the control of the processor 170. In particular, the power supply unit 190 can receive power from a battery or the like inside the vehicle.

The stereo camera 195 may include a plurality of cameras. Hereinafter, as described with reference to FIG. 2B and the like, it is assumed that two cameras are provided.

The stereo camera 195 may be detachably attachable to the ceiling or the windshield of the vehicle 200, and may include a first camera 195a having a first lens 193a and a second camera 195b having a second lens 193b.

The stereo camera 195 may include a first light shield 192a and a second light shield 192b for shielding light incident on the first lens 193a and the second lens 193b, respectively.

Next, referring to FIG. 3B, the vehicle driving assistance device 100a of FIG. 3B further includes an input unit 110, a display 180, and an audio output unit 185, compared with the vehicle driving assistance device 100a of FIG. 3A. Hereinafter, only the input unit 110, the display 180, and the audio output unit 185 will be described.

The input unit 110 may include a plurality of buttons or a touch screen attached to the vehicle driving assistance device 100a, particularly to the stereo camera 195. The vehicle driving assistance device 100a can be powered on and operated through the plurality of buttons or the touch screen. Various other input operations can also be performed.

The display 180 may display an image related to the operation of the vehicle driving assistance device. For such image display, the display 180 may include a cluster or a HUD (Head Up Display) on the front inside of the vehicle. When the display 180 is a HUD, it may include a projection module that projects an image onto the windshield of the vehicle 200.

The audio output unit 185 outputs sound to the outside based on the audio signal processed by the processor 170. To this end, the audio output unit 185 may include at least one speaker.

FIGS. 3C to 3D illustrate various examples of an internal block diagram of the around view providing apparatus of FIG. 1.

FIGS. 3C to 3D illustrate internal block diagrams of the around view providing apparatus 100b in the autonomous travel apparatus 100.

The around view providing apparatus 100b of FIGS. 3C to 3D can combine a plurality of images received from the plurality of cameras 295a, ..., 295d to generate an around view image.

Meanwhile, the around view providing apparatus 100b can perform object detection, verification, and tracking for objects located in the vicinity of the vehicle based on the plurality of images received from the plurality of cameras 295a, ..., 295d.

Referring to FIG. 3C, the around view providing apparatus 100b of FIG. 3C may include a communication unit 220, an interface unit 230, a memory 240, a processor 270, a display 280, a power supply unit 290, and a plurality of cameras 295a, ..., 295d.

The communication unit 220 can exchange data with the mobile terminal 600 or the server 500 in a wireless manner. In particular, the communication unit 220 can exchange data with the mobile terminal of the vehicle driver wirelessly. As a wireless data communication method, various data communication methods such as Bluetooth, WiFi Direct, WiFi, and APiX are possible.

The communication unit 220 can receive, from the mobile terminal 600 or the server 500, schedule information related to the vehicle driver's scheduled times or travel positions, weather information, and road traffic situation information, for example, TPEG (Transport Protocol Expert Group) information. Meanwhile, the around view providing apparatus 100b may transmit real-time traffic information, grasped on the basis of images, to the mobile terminal 600 or the server 500.

Meanwhile, when a user boards the vehicle, the user's mobile terminal 600 and the around view providing apparatus 100b can be paired with each other automatically or by execution of the user's application.

The interface unit 230 can receive vehicle-related data or transmit signals processed or generated by the processor 270 to the outside. For this purpose, the interface unit 230 can perform data communication with the ECU 770, the sensor unit 760, and the like in the vehicle by a wired or wireless communication method.

On the other hand, the interface unit 230 can receive the sensor information from the ECU 770 or the sensor unit 760.

Here, the sensor information may include at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward movement information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, and vehicle interior humidity information.

On the other hand, among the sensor information, the vehicle direction information, the vehicle position information, the vehicle angle information, the vehicle speed information, the vehicle tilt information, and the like relating to the vehicle running can be referred to as vehicle running information.

The memory 240 may store various data for the overall operation of the around view providing apparatus 100b, such as a program for processing or control by the processor 270.

On the other hand, the memory 240 may store map information related to the vehicle driving.

The processor 270 controls the overall operation of each unit in the around view providing apparatus 100b.

In particular, the processor 270 can acquire a plurality of images from the plurality of cameras 295a, ..., 295d and combine them to generate an around view image.

Meanwhile, the processor 270 may perform signal processing based on computer vision. For example, based on the plurality of images or the generated around view image, the processor 270 can perform a disparity calculation for the surroundings of the vehicle, perform object detection in the image based on the calculated disparity information, and continuously track the motion of an object after object detection.

In particular, when detecting objects, the processor 270 can perform lane detection, vehicle detection, pedestrian detection, obstacle detection, parking area detection, road surface detection, and the like.

Then, the processor 270 can perform a distance calculation on the detected nearby vehicles or pedestrians.

Meanwhile, the processor 270 can receive sensor information from the ECU 770 or the sensor unit 760 through the interface unit 230. Here, the sensor information may include at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward movement information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, and vehicle interior humidity information.

The display 280 may display the around view image generated by the processor 270. Meanwhile, it is also possible to provide various user interfaces when displaying the around view image, or to provide a touch sensor enabling touch input to the provided user interface.

Meanwhile, the display 280 may include a cluster or a HUD (Head Up Display) on the front inside of the vehicle. When the display 280 is a HUD, it may include a projection module that projects an image onto the windshield of the vehicle 200.

The power supply unit 290 can supply power necessary for the operation of each component under the control of the processor 270. In particular, the power supply unit 290 can receive power from a battery or the like inside the vehicle.

The plurality of cameras 295a, ..., 295d are cameras for providing the around view image, and are preferably wide-angle cameras.

The around view providing apparatus 100b of FIG. 3D is similar to the around view providing apparatus 100b of FIG. 3C, but further includes an input unit 210, an audio output unit 285, and an audio input unit 286. Hereinafter, only the input unit 210, the audio output unit 285, and the audio input unit 286 will be described.

The input unit 210 may include a plurality of buttons attached to the periphery of the display 280, or a touch screen disposed on the display 280. The around view providing apparatus 100b can be powered on and operated through the plurality of buttons or the touch screen. Various other input operations can also be performed.

The audio output unit 285 converts an electric signal from the processor 270 into an audio signal and outputs the audio signal. For this purpose, a speaker or the like may be provided. The audio output unit 285 can also output sound corresponding to the operation of the input unit 210, that is, the button.

The audio input unit 286 can receive user voice. For this purpose, a microphone may be provided. The received voice may be converted to an electrical signal and transmitted to the processor 270.

Meanwhile, the around view providing apparatus 100b of FIG. 3C or FIG. 3D may be an AVN (Audio Video Navigation) apparatus.

FIG. 3E is an internal block diagram of the vehicle display device of FIG. 1.

The vehicle display apparatus 400 according to an embodiment of the present invention may include an input unit 310, a communication unit 320, a space recognition sensor unit 321, a touch sensor unit 326, an interface unit 330, a memory 340, a processor 370, a display 480, an audio input unit 383, an audio output unit 385, and a power supply unit 390.

The input unit 310 includes a button attached to the display device 400. For example, a power button may be provided. In addition, it may further include at least one of a menu button, an up / down button, and a left / right button.

The input signal through the input unit 310 may be transmitted to the processor 370.

The communication unit 320 can exchange data with an adjacent electronic device. For example, data can be exchanged with a vehicle internal electronic device or a server (not shown) in a wireless manner. Particularly, data can be exchanged wirelessly with the mobile terminal of the vehicle driver. Various wireless data communication methods such as Bluetooth, WiFi, and APiX are available.

For example, when the user is boarded in the vehicle, the user's mobile terminal and the display device 400 can perform the pairing with each other automatically or by execution of the user's application.

On the other hand, the communication unit 320 may include a GPS receiving device, and can receive GPS information, that is, position information of the vehicle.

The space recognition sensor unit 321 can detect the approach or movement of the user's hand. For this purpose, it may be disposed around the display 480.

The space recognition sensor unit 321 may perform space recognition on an optical basis or on an ultrasonic basis. Hereinafter, the description will focus on optical space recognition.

The space recognition sensor unit 321 can detect the approach or movement of the user's hand based on the output light and the reception of the corresponding reflected light. In particular, the processor 370 can perform signal processing on the electric signals of the output light and the reflected light.

For this purpose, the spatial recognition sensor unit 321 may include a light output unit 322 and a light receiving unit 324.

The light output unit 322 may output, for example, infrared light for detecting a user's hand located in front of the display device 400.

The light receiving unit 324 receives the light that is scattered or reflected when the light output from the light output unit 322 hits the user's hand located in front of the display device 400. Specifically, the light receiving unit 324 may include a photodiode and convert the received light into an electric signal through the photodiode. The converted electric signal may be input to the processor 370.

The touch sensor unit 326 senses a floating touch and a direct touch. For this purpose, the touch sensor unit 326 may include an electrode array, an MCU, and the like. When the touch sensor unit is operated, an electric signal is supplied to the electrode array, and an electric field is formed on the electrode array.

The touch sensor unit 326 can operate when the intensity of light received by the space recognition sensor unit 321 is equal to or higher than a first level.

That is, when the user's hand approaches within a predetermined distance, an electric signal may be supplied to the electrode array and the like in the touch sensor unit 326. An electric field is formed on the electrode array by the electric signal supplied to it, and this electric field is used to sense a change in capacitance. Based on the sensed capacitance change, the touch sensor unit senses a floating touch and a direct touch.

In particular, the z-axis information can be sensed by the touch sensor unit 326 in addition to the x- and y-axis information according to the approach of the user's hand.

The interface unit 330 can exchange data with other electronic devices in the vehicle. For example, the interface unit 330 can perform data communication with an ECU or the like in the vehicle by a wired communication method.

Specifically, the interface unit 330 can receive the vehicle status information by data communication with an ECU or the like in the vehicle.

Here, the vehicle status information may include at least one of battery information, fuel information, vehicle speed information, tire information, steering information based on steering wheel rotation, vehicle lamp information, vehicle interior temperature information, vehicle exterior temperature information, and vehicle interior humidity information.

The interface unit 330 may further receive GPS information from an ECU or the like in the vehicle. Alternatively, it is also possible to transmit GPS information, which is received by the display device 400, to an ECU or the like.

The memory 340 may store various data for the operation of the display device 400, such as a program for processing or control by the processor 370.

For example, the memory 340 may store a map for guiding the traveling path of the vehicle.

As another example, the memory 340 may store user information and the user's mobile terminal information for pairing with the user's mobile terminal.

The audio output unit 385 converts an electric signal from the processor 370 into an audio signal and outputs the audio signal. For this purpose, a speaker or the like may be provided. The audio output unit 385 can also output sound corresponding to the operation of the input unit 310, that is, the button.

The audio input unit 383 can receive user voice. For this purpose, a microphone may be provided. The received voice may be converted to an electrical signal and transmitted to the processor 370.

The processor 370 controls the overall operation of each unit in the vehicle display device 400.

When the user's hand successively approaches the display device 400, the processor 370 can sequentially compute x-, y-, and z-axis information for the user's hand based on the light received by the light receiving unit 324. At this time, the z-axis information decreases sequentially.

Meanwhile, when the user's hand approaches within a second distance closer to the display 480 than a first distance, the processor 370 can control the touch sensor unit 326 to operate. That is, the processor 370 can control the touch sensor unit 326 to operate when the intensity of the electric signal from the space recognition sensor unit 321 is equal to or higher than a reference level. Thereby, an electric signal is supplied to each electrode array in the touch sensor unit 326.

On the other hand, the processor 370 can sense the floating touch based on the sensing signal sensed by the touch sensor unit 326 when the user's hand is located within the second distance. In particular, the sensing signal may be a signal indicative of a change in capacitance.

Based on the sensing signal, the processor 370 can compute the x- and y-axis information of the floating touch input, and can compute the z-axis information, that is, the distance between the display device 400 and the user's hand, based on the magnitude of the capacitance change.
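
As an illustration of this computation (the electrode pitch and the capacitance-to-distance constant are assumptions, not values from the patent):

    import numpy as np

    def floating_touch_position(delta_c: np.ndarray, pitch_mm: float = 5.0,
                                k: float = 100.0) -> tuple[float, float, float]:
        # delta_c: 2-D array of capacitance changes over the electrode array;
        # x, y follow the strongest electrode, z shrinks as the change grows
        iy, ix = np.unravel_index(np.argmax(delta_c), delta_c.shape)
        z_mm = k / max(float(delta_c[iy, ix]), 1e-6)
        return ix * pitch_mm, iy * pitch_mm, z_mm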

On the other hand, the processor 370 can change the grouping for the electrode array in the touch sensor unit 326 according to the distance of the user's hand.

Specifically, the processor 370 can change the grouping of the electrode array in the touch sensor unit 326 based on the approximate z-axis information calculated from the light received by the space recognition sensor unit 321. The greater the distance, the larger the electrode array group can be set.

That is, the processor 370 can vary the size of the touch sensing cell with respect to the electrode array in the touch sensor unit 326 based on the distance information of the user's hand, that is, the z-axis information.
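
An illustrative mapping from the z-axis information to the touch sensing cell size (the distance bands and group sizes are assumptions):

    def touch_cell_group_size(z_mm: float) -> int:
        # side length, in electrodes, of one touch sensing cell:
        # a far hand is sensed coarsely, a near hand finely
        if z_mm > 60:
            return 4  # e.g. 4x4 electrodes grouped into one cell
        if z_mm > 30:
            return 2
        return 1      # per-electrode sensing for near/direct touch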

The display 480 may separately display an image corresponding to the function assigned to each button. For such image display, the display 480 may be implemented with various display modules such as an LCD or an OLED. Meanwhile, the display 480 may be implemented as a cluster on the front inside of the vehicle.

The power supply unit 390 can supply power necessary for the operation of each component under the control of the processor 370.

FIGS. 4A to 4B illustrate various examples of internal block diagrams of the processors of FIGS. 3A to 3D, and FIG. 5 is a diagram illustrating object detection in the processor of FIGS. 4A to 4B.

FIG. 4A shows an example of an internal block diagram of the processor 170 of the vehicle driving assistance device 100a of FIGS. 3A to 3B, or of the processor 270 of the around view providing apparatus 100b of FIGS. 3C to 3D.

The processor 170 or 270 may include an image preprocessing unit 410, a disparity computing unit 420, an object detecting unit 434, an object tracking unit 440, and an application unit 450.

The image preprocessing unit 410 may receive a plurality of images from the plurality of cameras 295a, ..., 295d, or the generated around view image, and perform preprocessing.

Specifically, the image preprocessing unit 410 may perform noise reduction, rectification, calibration, color enhancement, color space conversion (CSC), interpolation, camera gain control, and the like on the plurality of images or the generated around view image. As a result, an image sharper than the images photographed by the plurality of cameras 295a, ..., 295d, or than the generated around view image, can be acquired.

The disparity computing unit 420 receives the plurality of images or the generated around view image signal-processed by the image preprocessing unit 410, performs stereo matching on the plurality of images or the generated around view image, and obtains a disparity map according to the stereo matching. That is, disparity information about the surroundings of the vehicle can be obtained.

At this time, the stereo matching may be performed on a pixel basis or on a predetermined block basis. Meanwhile, the disparity map may mean a map in which the binocular parallax information of the images, that is, of the left and right images, is expressed as numerical values.
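
For a concrete picture of block-based stereo matching, OpenCV's block matcher can stand in for the matching performed here (illustrative only; the file paths are placeholders):

    import cv2

    # load a rectified stereo pair as grayscale
    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

    # disparity searched per block, as described in the text above
    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = stereo.compute(left, right)            # fixed-point, scaled by 16
    disparity_px = disparity.astype("float32") / 16.0  # disparity in pixels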

The segmentation unit 432 may perform segmenting and clustering in the image based on the disparity information from the disparity computing unit 420.

Specifically, the segmentation unit 432 can separate the background and the foreground for at least one of the images based on the disparity information.

For example, an area whose disparity information in the disparity map is equal to or less than a predetermined value can be classified as the background, and the corresponding part can be excluded. Thereby, the foreground can be relatively separated.

As another example, an area whose disparity information in the disparity map is equal to or greater than a predetermined value can be classified as the foreground, and the corresponding part can be extracted. Thereby, the foreground can be separated.

In this way, separating the foreground and the background based on the disparity information extracted from the image can shorten the signal processing time and reduce the amount of signal processing in the subsequent object detection.
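
The thresholding described in the two examples above amounts to a one-line mask (the threshold value is an assumption):

    import numpy as np

    DISPARITY_BG_MAX = 8.0  # assumed boundary between far background and foreground

    def foreground_mask(disparity_px: np.ndarray) -> np.ndarray:
        # True where disparity exceeds the threshold, i.e. points close
        # to the vehicle; the complement is treated as background
        return disparity_px > DISPARITY_BG_MAX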

Next, the object detecting unit 434 can detect objects based on the image segments from the segmentation unit 432.

That is, the object detecting unit 434 can detect an object for at least one of the images based on the disparity information.

More specifically, the object detecting unit 434 can detect an object for at least one of the images. For example, an object can be detected from a foreground separated by an image segment.

Next, the object verification unit 436 classifies and verifies the isolated object.

For this purpose, the object verification unit 436 may use an identification method based on a neural network, an SVM (Support Vector Machine) technique, an AdaBoost technique using Haar-like features, a HOG (Histograms of Oriented Gradients) technique, or the like.
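
As one concrete instance of the HOG-plus-SVM combination named above, OpenCV ships a HOG descriptor with a pretrained pedestrian SVM (illustrative only; the image path is a placeholder):

    import cv2

    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    img = cv2.imread("frame.png")
    # candidate pedestrian boxes and their SVM confidence weights
    boxes, weights = hog.detectMultiScale(img, winStride=(8, 8))
    for (x, y, w, h), score in zip(boxes, weights):
        print(f"pedestrian candidate at ({x},{y},{w},{h}), score {float(score):.2f}")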

On the other hand, the object checking unit 436 can check the objects by comparing the objects stored in the memory 240 with the detected objects.

For example, the object checking unit 436 can identify nearby vehicles, lanes, roads, signs, hazardous areas, tunnels, and the like, which are located around the vehicle.

The object tracking unit 440 performs tracking of the verified objects. For example, it can sequentially verify the objects in the acquired images, calculate the motion or motion vectors of the verified objects, and track the movement of the objects based on the calculated motion or motion vectors. Accordingly, it is possible to track nearby vehicles, lanes, road surfaces, signs, hazardous areas, and the like located in the vicinity of the vehicle.
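
The per-object motion vector mentioned above reduces, in its simplest form, to the displacement of a bounding-box centre between successive verified detections (a sketch, with (x, y, w, h) boxes assumed):

    def motion_vector(box_prev: tuple, box_curr: tuple) -> tuple[float, float]:
        # displacement of the bounding-box centre between two frames
        cx_p = box_prev[0] + box_prev[2] / 2
        cy_p = box_prev[1] + box_prev[3] / 2
        cx_c = box_curr[0] + box_curr[2] / 2
        cy_c = box_curr[1] + box_curr[3] / 2
        return cx_c - cx_p, cy_c - cy_p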

FIG. 4B is another example of an internal block diagram of the processor.

Referring to FIG. 4B, the processor 170 or 270 of FIG. 4B has the same internal configuration unit as the processor 170 or 270 of FIG. 4A, but differs in the signal processing order. Only the difference will be described below.

The object detecting unit 434 may receive a plurality of images or the generated around view image, and may detect objects in the plurality of images or the generated around view image. Unlike in FIG. 4A, the objects are detected directly from the plurality of images or the generated around view image, instead of from an image segmented on the basis of disparity information.

Next, the object verification unit 436 classifies and verifies the detected and separated objects, based on the image segments from the segmentation unit 432 and the objects detected by the object detecting unit 434.

For this purpose, the object verification unit 436 may use an identification method based on a neural network, an SVM (Support Vector Machine) technique, an AdaBoost technique using Haar-like features, a HOG (Histograms of Oriented Gradients) technique, or the like.

FIG. 5 is a diagram referred to for explaining the operation method of the processor 170 or 270 of FIGS. 4A to 4B, based on images obtained respectively in the first and second frame periods.

Referring to FIG. 5, during the first and second frame periods, the plurality of cameras 295a, ..., and 295d sequentially acquire images FR1a and FR1b, respectively.

The disparity computing unit 420 in the processor 170 or 270 receives the images FR1a and FR1b signal-processed by the image preprocessing unit 410, performs stereo matching on the received images FR1a and FR1b, and obtains a disparity map 520.

The disparity map 520 expresses the parallax between the images FR1a and FR1b as levels. The higher the disparity level, the closer the distance to the vehicle; the lower the disparity level, the farther the distance can be calculated to be.

Meanwhile, when such a disparity map is displayed, it may be displayed with higher luminance for larger disparity levels and lower luminance for smaller disparity levels.
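
The inverse disparity/distance relation described above follows from stereo geometry, Z = f * B / d; a minimal sketch (the focal length and baseline values are placeholders, not from the patent):

    def depth_from_disparity(disparity_px: float, focal_px: float = 700.0,
                             baseline_m: float = 0.12) -> float:
        # Z = f * B / d: a larger disparity means a closer point
        return focal_px * baseline_m / max(disparity_px, 1e-6)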

In the figure, the first to fourth lanes 528a, 528b, 528c, and 528d have corresponding disparity levels in the disparity map 520, and the construction area 522, the first front vehicle 524, and the second front vehicle 526 each have corresponding disparity levels.

Based on the disparity map 520, the segmentation unit 432, the object detecting unit 434, and the object verification unit 436 perform segmentation, object detection, and object verification for at least one of the images FR1a and FR1b.

In the figure, object detection and verification are performed for the second image FR1b using the disparity map 520.

That is, object detection and verification can be performed for the first to fourth lanes 538a, 538b, 538c, and 538d, the construction area 532, the first front vehicle 534, and the second front vehicle 536 in the image 530.

On the other hand, by continuously acquiring the image, the object tracking unit 440 can perform tracking on the identified object.

FIGS. 6A to 6B are views referred to in the description of the operation of the autonomous travel apparatus of FIG. 1.

First, FIG. 6A is a diagram illustrating the situation in front of the vehicle photographed by the stereo camera 195 provided inside the vehicle. In particular, the situation in front of the vehicle is shown as a bird's eye view.

Referring to the drawing, a first lane 642a, a second lane 644a, a third lane 646a, and a fourth lane 648a are located from left to right; the construction area 610a is positioned between the first lane 642a and the second lane 644a; the first front vehicle 620a is positioned between the second lane 644a and the third lane 646a; and the second front vehicle 630a is positioned between the third lane 646a and the fourth lane 648a.

Next, FIG. 6B illustrates the display of the situation in front of the vehicle, as grasped by the vehicle driving assistance apparatus, together with various information. In particular, the image shown in FIG. 6B may be displayed on the display 180 provided in the vehicle driving assistance apparatus or on the vehicle display apparatus 400.

FIG. 6B differs from FIG. 6A in that information is displayed on the basis of the image photographed by the stereo camera 195.

A first lane 642b, a second lane 644b, a third lane 646b, and a fourth lane 648b are located from left to right; the construction area 610b is positioned between the first lane 642b and the second lane 644b; the first front vehicle 620b is positioned between the second lane 644b and the third lane 646b; and the second front vehicle 630b is positioned between the third lane 646b and the fourth lane 648b.

The vehicle driving assistance apparatus 100a performs signal processing on the basis of the stereo image photographed by the stereo camera 195, and can thereby verify the objects for the construction area 610b, the first front vehicle 620b, and the second front vehicle 630b. In addition, the first lane 642b, the second lane 644b, the third lane 646b, and the fourth lane 648b can be verified.

Meanwhile, in the drawing, each of the construction area 610b, the first front vehicle 620b, and the second front vehicle 630b is highlighted with a frame to indicate object verification.

Meanwhile, the vehicle driving assistance apparatus 100a can compute distance information to the construction area 610b, the first front vehicle 620b, and the second front vehicle 630b on the basis of the stereo image photographed by the stereo camera 195.

In the figure, the computed first distance information 611b, second distance information 621b, and third distance information 631b, corresponding respectively to the construction area 610b, the first front vehicle 620b, and the second front vehicle 630b, are displayed.
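
The distance computation itself follows the standard stereo range relation Z = f * B / d; the following is a hedged sketch, with the focal length and baseline assumed to come from calibration (the disclosure does not give these values).

```python
# Sketch: recovering distance information such as 611b/621b/631b from disparity.
def distance_from_disparity(disparity_px, focal_px, baseline_m):
    """Stereo range equation Z = f * B / d.

    disparity_px: disparity of the object in pixels
    focal_px:     camera focal length in pixels (assumed calibrated)
    baseline_m:   spacing between the two stereo lenses in meters (assumed)
    """
    if disparity_px <= 0:
        return float("inf")  # zero disparity: object effectively at infinity
    return focal_px * baseline_m / disparity_px

# Example: f = 700 px, B = 0.3 m, d = 10 px  ->  Z = 21 m ahead.
```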

Meanwhile, the vehicle driving assistance apparatus 100a can receive sensor information about the vehicle from the ECU 770 or the sensor unit 760. In particular, it can receive and display vehicle speed information, gear information, yaw rate information indicating the speed at which the rotational angle (yaw angle) of the vehicle changes, and angle information of the vehicle.

The figure illustrates that the vehicle speed information 672, the gear information 671, and the yaw rate information 673 are displayed in the upper portion 670 of the vehicle front image, and that the angle information 682 of the vehicle is displayed in the lower portion 680 of the vehicle front image, but various other examples are possible. In addition, the width information 683 of the vehicle and the curvature information 681 of the road can be displayed together with the angle information 682 of the vehicle.

Meanwhile, the vehicle driving assistance apparatus 100a can receive speed limit information and the like for the road on which the vehicle is traveling, through the communication unit 120 or the interface unit 130. In the figure, the speed limit information 640b is displayed.

The vehicle driving assistance apparatus 100a may display the various information shown in FIG. 6B through the display 180 or the like, or may store the various information without separately displaying it. Such information can then be utilized for various applications.

FIG. 7 is an example of an internal block diagram of a vehicle according to an embodiment of the present invention.

Referring to the drawing, the vehicle 200 may include an electronic control apparatus 700 for vehicle control.

The electronic control apparatus 700 may include an input unit 710, a communication unit 720, a memory 740, a lamp driving unit 751, a steering driving unit 752, a brake driving unit 753, a power source driving unit 754, a sunroof driving unit 755, a suspension driving unit 756, an air conditioner driving unit 757, a window driving unit 758, an airbag driving unit 759, a sensor unit 760, an ECU 770, a display 780, an audio output unit 785, an audio input unit 786, a power supply unit 790, a stereo camera 195, a plurality of cameras 295, a radar 797, an internal camera 708, a seat driving unit 761, and a driver detection sensor 799.

Meanwhile, the ECU 770 may be understood as a concept including the processor 270 described with reference to FIG. 3C or FIG. 3D. Alternatively, in addition to the ECU 770, a separate processor for signal processing of images from the cameras may be provided.

The input unit 710 may include a plurality of buttons or a touch screen disposed inside the vehicle 200. Various input operations can be performed through the plurality of buttons or the touch screen.

The communication unit 720 can exchange data with the mobile terminal 600 or the server 500 in a wireless manner. In particular, the communication unit 720 can exchange data with the mobile terminal of the vehicle driver wirelessly. As a wireless data communication method, various data communication methods such as Bluetooth, WiFi Direct, WiFi, and APiX are possible.

The communication unit 720 can receive, from the mobile terminal 600 or the server 500, schedule information related to the vehicle driver's scheduled times or movement positions, weather information, and road traffic situation information, for example, TPEG (Transport Protocol Expert Group) information.

Meanwhile, when a user boards the vehicle, the user's mobile terminal 600 and the electronic control apparatus 700 can be paired with each other automatically or upon execution of an application by the user.

The memory 740 may store various data for the operation of the electronic control apparatus 700 as a whole, such as programs for processing or control by the ECU 770.

Meanwhile, the memory 740 may store map information related to vehicle driving.

The lamp driving unit 751 can control the turn-on/turn-off of lamps disposed inside and outside the vehicle. It can also control the intensity, direction, and the like of the light of the lamps. For example, it can perform control of turn signal lamps, brake lamps, and the like.

The steering driving unit 752 may perform electronic control of a steering apparatus (not shown) in the vehicle 200. The traveling direction of the vehicle can thereby be changed.

The brake driving unit 753 can perform electronic control of a brake apparatus (not shown) in the vehicle 200. For example, the speed of the vehicle 200 can be reduced by controlling the operation of the brakes disposed at the wheels. As another example, the traveling direction of the vehicle 200 can be adjusted to the left or right by operating the brakes disposed at the left wheel and the right wheel differently.

The power source driving unit 754 can perform electronic control of a power source in the vehicle 200.

For example, when a fossil-fuel-based engine (not shown) is the power source, the power source driving unit 754 can perform electronic control of the engine. The output torque of the engine and the like can thereby be controlled.

As another example, when an electric motor (not shown) is the power source, the power source driving unit 754 can perform control of the motor. The rotation speed, torque, and the like of the motor can thereby be controlled.

The sunroof driving unit 755 may perform electronic control of a sunroof apparatus (not shown) in the vehicle 200. For example, the opening or closing of the sunroof can be controlled.

The suspension driving unit 756 may perform electronic control of a suspension apparatus (not shown) in the vehicle 200. For example, when the road surface is uneven, the suspension apparatus can be controlled so as to reduce the vibration of the vehicle 200.

The air conditioner driving unit 757 can perform electronic control of an air conditioner (not shown) in the vehicle 200. For example, when the temperature inside the vehicle is high, the air conditioner can be operated so that cool air is supplied into the vehicle.

The window driving unit 758 can perform electronic control of a window apparatus (not shown) in the vehicle 200. For example, the opening or closing of the left and right windows on the sides of the vehicle can be controlled.

The airbag driving unit 759 can perform electronic control of an airbag apparatus in the vehicle 200. For example, in a dangerous situation, the airbag can be controlled to deploy.

The seat driving unit 761 can perform position control of the seats or backrests of the vehicle 200. For example, when the driver sits in the driver's seat, the driver's seat can be adjusted to the driver by adjusting the front/rear spacing of the seat, the front/rear spacing of the backrest, and the like.

Meanwhile, the seat driving unit 761 can drive rollers disposed in the seat or the backrest, and can control them so as to provide pressure, such as a massage, to the driver.

The sensor unit 760 senses signals related to the traveling of the vehicle 200 and the like. To this end, the sensor unit 760 may include a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/backward sensor, a wheel sensor, a vehicle speed sensor, a vehicle body inclination sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, and the like.

Thereby, the sensor unit 760 can acquire and output vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, vehicle interior humidity information, and the like.

In addition, the sensor unit 760 may further include an accelerator pedal sensor, a pressure sensor, an engine speed sensor, an air flow sensor (AFS), an intake air temperature sensor (ATS), a water temperature sensor (WTS), a throttle position sensor (TPS), a TDC sensor, a crank angle sensor (CAS), and the like.

The ECU 770 can control the overall operation of each unit in the electronic control apparatus 700.

The ECU 770 can perform a specific operation according to an input from the input unit 710, receive a sensed signal from the sensor unit 760 and transmit it to the around view providing apparatus 100b, receive map information from the memory 740, and control the operations of the respective driving units 753, 754, 756, and the like.

Also, the ECU 770 can receive weather information and road traffic situation information, for example, TPEG (Transport Protocol Expert Group) information, from the communication unit 720.

Meanwhile, the ECU 770 can combine a plurality of images received from the plurality of cameras 295 to generate an around view image. In particular, the around view image can be generated when the vehicle is below a predetermined speed or when the vehicle is reversing.
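
The following is a minimal sketch of such image combination, assuming each camera's ground-plane homography is known from prior calibration; the disclosure does not specify the combination method, so this is illustrative only.

```python
# Sketch: combining four camera images into one around view image, assuming
# each camera's ground-plane homography H was obtained by prior calibration.
import cv2
import numpy as np

def around_view(images, homographies, out_size=(400, 400)):
    """Warp each camera image onto a common top-down plane and overlay them."""
    canvas = np.zeros((out_size[1], out_size[0], 3), dtype=np.uint8)
    for img, H in zip(images, homographies):
        warped = cv2.warpPerspective(img, H, out_size)
        mask = warped.sum(axis=2) > 0      # pixels this camera contributes
        canvas[mask] = warped[mask]
    return canvas
```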

The display 780 can display an image of the area in front of the vehicle while the vehicle is traveling, or an around view image. In particular, various user interfaces can also be provided in addition to these images.

For the display of such an around view image and the like, the display 780 may include a cluster or a HUD (Head Up Display) at the front inside of the vehicle interior. When the display 780 is a HUD, it may include a projection module that projects an image onto the windshield of the vehicle 200. Meanwhile, the display 780 may include a touch screen capable of receiving input.

The audio output unit 785 converts an electrical signal from the ECU 770 into an audio signal and outputs it. To this end, a speaker or the like may be provided. The audio output unit 785 can also output sound corresponding to the operation of the input unit 710, that is, of its buttons.

The audio input unit 786 can receive a user's voice. To this end, a microphone may be provided. The received voice can be converted into an electrical signal and transmitted to the ECU 770.

The power supply unit 790 can supply the power necessary for the operation of each component, under the control of the ECU 770. In particular, the power supply unit 790 can receive power from a battery (not shown) inside the vehicle.

The stereo camera 195 is used for the operation of the vehicle driving assistance apparatus. The description given above applies to it.

The plurality of cameras 295 are used to provide the around view image and, for this purpose, may be provided as four cameras, as shown in FIG. 2C. For example, the plurality of cameras 295a, 295b, 295c, and 295d may be disposed on the left side, the rear, the right side, and the front of the vehicle, respectively. The plurality of images photographed by the plurality of cameras 295 may be transmitted to the ECU 770 or a separate processor (not shown).

The internal camera 708 photographs an image of the interior of the vehicle, including the driver. For example, an RGB camera, an IR camera that senses heat, and the like can be used.

The driver detection sensor 799 detects the driver's body information. For example, the driver's blood pressure information, sleep state, and the like can be detected.

The radar 797 transmits a transmission signal and receives a reception signal reflected from an object in the vicinity of the vehicle. Distance information can be output based on the difference between the transmission signal and the reception signal. Phase information may be output as well.
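
For illustration, the distance can be recovered from the round-trip delay between the transmission and reception signals; the following minimal sketch assumes idealized timestamps, and the function name is invented.

```python
# Minimal time-of-flight range sketch for the radar 797 (illustrative only).
C = 299_792_458.0  # speed of light, m/s

def radar_range(tx_time_s, rx_time_s):
    """Distance from the round-trip delay between transmit and receive."""
    return C * (rx_time_s - tx_time_s) / 2.0

# Example: a 0.4 microsecond round trip corresponds to roughly 60 m.
```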

FIG. 8 is an example of an internal block diagram of a travel detection apparatus according to an embodiment of the present invention.

The travel detection apparatus 300 may include a sensor unit 310, a communication unit 320, an interface unit 330, a memory 340, a processor 370, a power supply unit 390, a light transmitting unit 380, a light receiving unit 390, and a camera 395.

The sensor unit 310 can sense the moving distance of the vehicle. To this end, the sensor unit 310 may include a gyro sensor, an acceleration sensor, and the like.

The communication unit 320 can exchange data with the mobile terminal 600 or the server 500 in a wireless manner.

In particular, the communication unit 320 can exchange data with the mobile terminal of the vehicle driver wirelessly. As a wireless data communication method, various data communication methods such as Bluetooth, WiFi Direct, WiFi, and APiX are possible.

The communication unit 320 can receive weather information, road information, road traffic situation information, and the like from the mobile terminal 600 or the server 500.

Meanwhile, when a user boards the vehicle, the user's mobile terminal 600 and the travel detection apparatus 300 can be paired with each other automatically or upon execution of an application by the user.

The interface unit 330 can receive vehicle-related data, or transmit signals processed or generated by the processor 370 to the outside.

To this end, the interface unit 330 can exchange data with the ECU 770, the AVN (Audio Video Navigation) apparatus 400, the sensor unit 760, the vehicle driving assistance apparatus 100a, the around view providing apparatus 100b, and the like.

Meanwhile, the interface unit 330 can receive sensor information from the ECU 770 or the sensor unit 760.

Here, the sensor information may include at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, and vehicle interior humidity information.

The memory 340 may store various data for the operation of the travel detection apparatus 300 as a whole, such as programs for processing or control by the processor 370.

The processor 370 controls the overall operation of each unit in the travel detection apparatus 300.

In particular, the processor 370 can calculate, based on light, the three-dimensional position or three-dimensional moving distance of the vehicle.

Specifically, the processor 370 can calculate the three-dimensional position or three-dimensional moving distance of the vehicle based on the difference between the pattern light output in the downward direction of the vehicle from the light transmitting unit 380 and the reflected light of that pattern light, or based on the difference between successively received reflected light. This makes it possible to provide information on the three-dimensional position or the three-dimensional moving distance of the vehicle.

Meanwhile, the processor 370 can control the pattern of the pattern light output from the light transmitting unit 380 to be varied based on the received reflected light.

For example, when pattern recognition of the received reflected light is not successful, the processor 370 can control pattern light with a more easily recognized pattern to be output, for smooth pattern recognition.

Meanwhile, when the light transmitting unit 380 includes a plurality of light sources, for example a first light source that outputs visible light and a second light source that outputs infrared light, the processor 370 can control infrared-based pattern light to be output when this is advantageous for pattern recognition.

Meanwhile, when the light transmitting unit 380 includes a plurality of light sources, for example a first light source that outputs visible light and a second light source that outputs infrared light, the processor 370 can control visible-light-based pattern light to be output during the day, and infrared-based pattern light to be output at night.

Meanwhile, the processor 370 may vary at least one of the luminance, color, and area of the output pattern light based on the level of the received reflected light.

For example, when the reception level of the received reflected light is lower than a predetermined level and pattern recognition is not performed well, the processor 370 may vary at least one of the luminance, color, and area of the output pattern light.
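
The following toy sketch combines the day/night source selection and the level-based variation described above; the thresholds, step size, and names are invented for illustration and are not part of the disclosure.

```python
# Hypothetical control loop for varying the pattern light output.
def choose_pattern_light(is_daytime, reflected_level, recognized,
                         min_level=0.3, brightness=0.5):
    # Day: visible-light pattern; night: infrared pattern (as described above).
    source = "visible" if is_daytime else "infrared"
    # If the reflected light is too weak or the pattern was not recognized,
    # raise the output brightness (color or pattern area could be varied too).
    if reflected_level < min_level or not recognized:
        brightness = min(1.0, brightness + 0.2)
    return source, brightness
```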

Meanwhile, the processor 370 can calculate the position or moving distance on the x and y axes based on the time difference between a first reflected light and a second reflected light that are successively received, and can calculate the position or moving distance on the z axis based on the pattern difference between the first reflected light and the second reflected light.

Meanwhile, the processor 370 can calculate a homography using the received reflected light, and can calculate the z-axis position or the z-axis moving distance based on the calculated homography.
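
The following is a hedged sketch of these two calculations: the x/y displacement from the shift of matched pattern points over the frame interval, and a z-related scale from the homography between consecutive patterns. The point matching, the pixel-to-meter scale, and the determinant-based scale heuristic are assumptions, not the disclosed method.

```python
# Sketch: motion estimate from two consecutive reflected-light patterns.
# pts1/pts2 are matched pattern-point coordinates (pixels, at least 4 points)
# in frames t and t+dt; scale_m_per_px is an assumed calibration constant.
import cv2
import numpy as np

def motion_from_patterns(pts1, pts2, dt_s, scale_m_per_px):
    # x/y displacement: mean shift of the pattern points between the frames.
    shift_px = np.mean(pts2 - pts1, axis=0)
    dx, dy = shift_px * scale_m_per_px
    vx, vy = dx / dt_s, dy / dt_s   # velocity from the time difference

    # z information: homography between the two patterns; its scale part
    # changes as the road surface gets closer to or farther from the lamp.
    H, _ = cv2.findHomography(pts1.astype(np.float32), pts2.astype(np.float32))
    scale = np.sqrt(abs(np.linalg.det(H[:2, :2])))  # heuristic scale change
    return (dx, dy), (vx, vy), scale
```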

With such light-based calculation of the three-dimensional position or three-dimensional moving distance of the vehicle, the error is reduced to within a few centimeters.

Meanwhile, the position or moving-distance information of the vehicle sensed by the sensor unit 310 shows an error of approximately 50 cm; accordingly, the pattern-light-based calculation of the three-dimensional position or three-dimensional moving distance of the vehicle according to the present invention makes a more accurate calculation possible.

Meanwhile, the processor 370 may calculate the final position or moving distance of the vehicle using both the information related to the position or moving distance of the vehicle obtained from the sensor unit 310 and the information related to the three-dimensional position or three-dimensional moving distance of the vehicle calculated based on the difference between the pattern light and the reflected light.

In this case, as described above, the processor 370 may calculate the final position or moving distance while giving a greater weight to the information related to the three-dimensional position or three-dimensional moving distance of the vehicle calculated based on the difference between the pattern light and the reflected light.
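
A one-function sketch of such a weighted combination follows; the 0.8/0.2 split is an assumed weighting that merely reflects the greater accuracy attributed above to the light-based estimate.

```python
# Illustrative weighted fusion of the two position estimates (weights assumed).
def fuse_positions(light_based_xyz, sensor_xyz, w_light=0.8):
    w_sensor = 1.0 - w_light
    return tuple(w_light * a + w_sensor * b
                 for a, b in zip(light_based_xyz, sensor_xyz))

# Example: fuse_positions((10.0, 2.0, -3.0), (10.5, 1.8, -2.7))
```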

When a position-based signal is not received in the communication unit 320, or when the level of the position-based signal is lower than a predetermined level, the processor 370 can calculate the three-dimensional position or three-dimensional moving distance of the vehicle based on the difference between the pattern light and the reflected light.

Specifically, when the vehicle enters the basement of a building, the processor 370 calculates the three-dimensional position or three-dimensional moving distance of the vehicle based on the difference between the pattern light and the reflected light, and can provide floor information within the building for the vehicle based on the calculated three-dimensional position or three-dimensional moving distance. The user's convenience is thereby increased.

At this time, the processor 370 can control so that a plurality of parking zone identification markers are extracted, based on images captured by the camera 395 during a first time while the vehicle is being parked in the basement of the building, and so that the extracted parking zone identification markers and the floor information are transmitted to the mobile terminal 600 of the driver. The user's convenience is thereby increased.

Meanwhile, the processor 370 can control so that the camera 395 is activated for a second time when the ignition is turned off after the vehicle is parked, so that the driver's walking path information is extracted based on images captured by the camera 395, so that parking lot entrance information is extracted, and so that the walking path information or the parking lot entrance information is further transmitted to the mobile terminal 600. The user's convenience is thereby further increased.

Meanwhile, the processor 370 can determine the road surface condition around the vehicle based on the difference between the pattern light output in the downward direction of the vehicle from the light transmitting unit 380 and the reflected light of that pattern light, or based on the difference between successively received reflected light, and can provide a vehicle posture control signal corresponding to the determined road surface condition.

The vehicle posture control signal at this time may include a suspension drive control signal, a brake drive control signal, a steering drive control signal, and a power source drive control signal.

For example, when a speed bump is located in front of the vehicle, the processor 370 senses the speed bump based on the difference between the pattern light and the reflected light or the difference between successively received reflected light, and can provide a suspension drive control signal for controlling the suspension driving unit in the vehicle, corresponding to the speed bump.

As another example, when an uphill road, a downhill road, or a road inclined in one direction is located in front of the vehicle, the processor 370 senses the uphill road, downhill road, or road inclined in one direction based on the difference between the pattern light and the reflected light or the difference between successively received reflected light, and can provide a suspension drive control signal for controlling the suspension driving unit in the vehicle, corresponding to the uphill road, downhill road, or road inclined in one direction.

As another example, when a hole is located in front of the vehicle, the processor 370 can sense the hole based on the difference between the pattern light and the reflected light or the difference between successively received reflected light, and can provide a steering drive control signal for controlling the steering driving unit in the vehicle, corresponding to the hole.
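
Summarizing the three examples, a condition-to-signal mapping might be sketched as follows; the condition labels and signal contents are illustrative only, not the disclosed signal format.

```python
# Sketch: mapping a detected road-surface condition to a posture control signal.
def posture_control_signal(condition):
    if condition == "speed_bump":
        return {"suspension": "soften_impact"}    # prepare wheels for the bump
    if condition in ("uphill", "downhill", "laterally_inclined"):
        return {"suspension": "level_body"}       # keep the vehicle body level
    if condition == "hole":
        return {"steering": "avoid_hole"}         # steer around the hole
    return {}                                     # flat road: no action needed
```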

The light transmitting unit 380 may include a light source unit (382 in FIG. 10) that outputs light and a light diffusing unit (384 in FIG. 10) that diffuses the light to output pattern light.

The light source unit (382 in FIG. 10) may include a plurality of light sources, for example, a first light source that outputs visible light and a second light source that outputs infrared light.

Meanwhile, the light source unit may include an LED or a laser diode.

The light diffusing unit (384 in FIG. 10) can diffuse the light from the light source unit and output pattern light according to a light pattern attached to or formed on a plate.

The light receiving unit 390 receives the reflected light corresponding to the pattern light output to the outside of the vehicle. To this end, a lens 398 for collecting light and a light detecting unit 396 for detecting light may be provided. The light detecting unit 396, as an image sensor, can convert the reflected light into an electrical signal.

The camera 395 can capture an image of the surroundings of the vehicle.

The camera 395 may be a camera separate from the plurality of cameras 295a, 295b, 295c, and 295d in the around view providing apparatus 100b, the stereo camera 195 in the vehicle driving assistance apparatus 100a, and the like.

Alternatively, the camera 395 may be one of the plurality of cameras 295a, 295b, 295c, and 295d in the around view providing apparatus 100b, or the stereo camera 195 in the vehicle driving assistance apparatus 100a.

The power supply unit 390 can supply the power necessary for the operation of each component, under the control of the processor 370. In particular, the power supply unit 390 can receive power from a battery or the like inside the vehicle.

FIG. 9 is a view showing an example of the vehicle mounting positions of the travel detection apparatus of FIG. 8.

Referring to the drawing, travel detection apparatuses 300 according to an embodiment of the present invention can be mounted at a plurality of positions on the lower part of the vehicle 200.

In the figure, three travel detection apparatuses 300a, 300b, and 300c are mounted at the front ends of the front wheels 103FR and 103FL of the vehicle, and two travel detection apparatuses 300d and 300e are mounted at the rear ends of the rear wheels.

Meanwhile, unlike the drawing, it is also possible for two travel detection apparatuses to be mounted at the front ends of the front wheels 103FR and 103FL, and two travel detection apparatuses 300d and 300e at the rear ends of the rear wheels 103RR and 103RL of the vehicle.

FIGS. 10 to 12B are diagrams referred to in the description of the operating principle of the travel detection apparatus of FIG. 8.

FIG. 10 shows an example in which the light transmitting unit 380 outputs pattern light in an inclined direction onto the road, and the light receiving unit 390 receives the reflected light reflected from the road surface.

The light transmitting unit 380 may include a light source unit 382 that outputs light and a light diffusing unit 384 that diffuses the light and outputs pattern light. Unlike the drawing, a lens may further be provided between the light source unit 382 and the light diffusing unit 384 in order to improve the straightness of the light.

The light receiving unit 390 receives the reflected light corresponding to the pattern light output to the outside of the vehicle. To this end, a lens 398 for collecting light and a light detecting unit 396 for detecting light may be provided. The light detecting unit 396, as an image sensor, can convert the reflected light into an electrical signal.

Referring to FIG. 11, the travel detection apparatus 300c of the vehicle 200 traveling on a flat road, as at the position Pa, can output rectangular pattern light and receive rectangular reflected pattern light Ima.

Next, the travel detection apparatus 300c of the vehicle 200 traveling on a road where an uphill begins, as at the position Pb, can receive trapezoidal reflected pattern light Imb whose lower edge is shorter.

Next, the travel detection apparatus 300c of the vehicle 200 traveling on a road where the uphill ends, as at the position Pc, can receive trapezoidal reflected pattern light Imc whose upper edge is shorter.

The processor 370 can recognize that the uphill begins, based on the trapezoidal reflected pattern light Imb received at the position Pb.

In particular, the processor 370 can calculate the z-axis position or moving distance according to the inclination of the trapezoidal shape of the reflected pattern light Imb.

Meanwhile, the processor 370 can calculate the x-axis and y-axis position or moving distance based on the time difference between the trapezoidal reflected pattern light Imb and the rectangular reflected pattern light Ima, between Pb and Pa.

Meanwhile, at the position Pc, the processor 370 can recognize that the uphill ends, based on the received trapezoidal reflected pattern light Imc.

In particular, the processor 370 can calculate the z-axis position or moving distance according to the inclination of the trapezoidal shape of the reflected pattern light Imc.

Meanwhile, the processor 370 can calculate the x-axis and y-axis position or moving distance based on the time difference between the trapezoidal reflected pattern light Imc and the trapezoidal reflected pattern light Imb, between Pc and Pb.
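
The following sketch classifies the received pattern shape along the lines of FIG. 11, using only the measured top and bottom edge lengths of the reflected pattern; the tolerance value is an assumption, and the edge measurements themselves are taken as given.

```python
# Sketch: classifying the road ahead from the shape of the reflected pattern.
# top_w/bottom_w are the measured top and bottom edge lengths (pixels) of the
# received pattern; tol is an assumed noise tolerance.
def classify_reflection(top_w, bottom_w, tol=2.0):
    if abs(top_w - bottom_w) <= tol:
        return "flat"            # rectangular pattern Ima: flat road (Pa)
    if bottom_w < top_w:
        return "uphill_start"    # trapezoid Imb, shorter bottom edge (Pb)
    return "uphill_end"          # trapezoid Imc, shorter top edge (Pc)
```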

FIGS. 12A to 12B are diagrams illustrating a method of outputting pattern light in the light transmitting unit 380.

Referring to FIG. 12A, the light output from the light source unit 382 is diffused by the light diffusing unit 384, and the pattern light resulting from the diffusion is output to the outside as pattern light Ipa through the lens 388.

FIG. 12B illustrates various examples of the light diffusing unit 384.

(a) of FIG. 12B illustrates a first light diffusing unit 384a in which light is dispersed in two directions, and (b) of FIG. 12B illustrates a second light diffusing unit 384b in which light is dispersed in three directions. (c) of FIG. 12B illustrates a third light diffusing unit 384c in which light is dispersed in nine directions, and (d) of FIG. 12B illustrates a fourth light diffusing unit 384d.

Meanwhile, when the light diffusing unit 384 includes the first to fourth light diffusing units shown in FIG. 12B, the processor 370 can select the corresponding one of the first to fourth light diffusing units so that the pattern of the pattern light output from the light transmitting unit 380 is varied.

FIGS. 13A to 16D are diagrams referred to in the description of the operation of the travel detection apparatus of FIG. 8.

First, FIG. 13A illustrates a case where the vehicle 200 travels on a flat road 1300a. In this case, the travel detection apparatuses 300a and 300d can calculate the position or moving distance on the x and y axes, with no change on the z axis, based on the pattern light and the reflected light.

Next, FIG. 13B illustrates a case where the vehicle 200 travels on an uphill road 1300b. In this case, the travel detection apparatuses 300a and 300d can calculate the position or moving distance on the x, y, and z axes based on the pattern light and the reflected light.

Next, FIG. 13C illustrates a case where the vehicle 200 travels over an uphill road 1300b and a downhill road 1300c. In this case, the travel detection apparatuses 300a and 300d can calculate the position or moving distance on the x, y, and z axes based on the pattern light and the reflected light.

Meanwhile, FIG. 13D illustrates a case where the vehicle 200 travels on a road 1300d that is inclined so that the left side of the vehicle is lower than the right side.

In this case, the travel detection apparatuses 300a, 300b, and 300c can calculate the position or moving distance on the x, y, and z axes based on the pattern light and the reflected light.

FIG. 14A illustrates the vehicle 200 entering the parking lot in the building 1101 through the entrance 1102.

FIG. 14B illustrates the vehicle 200 entering the inside of the building 1101, passing along the rotary road, and entering the second basement floor of the building.

When the vehicle 200 enters the basement, the communication unit 320 may not receive a position-based signal, or the level of the position-based signal may be lower than a predetermined level.

Even in such a case, the travel detection apparatus 300 according to the embodiment of the present invention can calculate the position or moving distance on the x, y, and z axes based on the pattern light and the reflected light.

In particular, the processor 370 may calculate the z-axis position or moving distance, and provide floor information within the building for the vehicle based on the calculated z-axis position or moving distance.

For example, if the height of one floor is set to 3 m, the processor 370 can calculate, from the z-axis moving distance accumulated since entering the parking lot, that the current vehicle 200 is located on the second basement floor.
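
The following is a hedged sketch of this floor calculation, assuming a 3 m floor height and a signed, accumulated z-axis moving distance; the function name and the label format are illustrative.

```python
# Sketch: mapping the accumulated z-axis moving distance to a basement floor.
def basement_floor(z_moved_m, floor_height_m=3.0):
    """Negative z_moved_m means descent; returns a label such as 'B2'."""
    if z_moved_m >= 0:
        return "ground"
    floors_down = max(1, round(-z_moved_m / floor_height_m))
    return f"B{floors_down}"

# Example: basement_floor(-6.0) -> 'B2' (two 3 m floors below ground).
```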

As another example, taking into consideration the moving distance and moving direction on the x, y, and z axes, the processor 370 may calculate that the vehicle 200 has gone around the 360-degree rotary road twice, as shown in FIG. 14B, and thus that the current vehicle 200 is located on the second basement floor.

In this way, the processor 370 calculates the three-dimensional position or three-dimensional moving distance of the vehicle based on the difference between the pattern light and the reflected light, and can provide floor information within the building based on the calculated three-dimensional position or three-dimensional moving distance. As a result, convenience of use is increased.

FIGS. 15A and 15B are diagrams illustrating the vehicle 200 entering and moving through a two-level underground parking lot 1100 having parking areas.

The processor 370 of the travel detection apparatus 300 can control so that a plurality of parking zone identification markers are extracted from the images captured by the camera 395 during the first time while the vehicle is being parked, and so that the extracted parking zone identification markers are transmitted to the mobile terminal 600.

FIGS. 15A and 15B illustrate parking zone identification markers 1107a, 1107b, and 1107c (B06, B08, B05), composed of letters and numbers, in the parking lot 1100 as the vehicle 200 moves.

Meanwhile, the processor 370 of the travel detection apparatus 300 can analyze the images captured by the camera 395 while the vehicle 200 is moving, and recognize and extract the plurality of parking zone identification markers 1107a, 1107b, and 1107c (B06, B08, B05).

Meanwhile, in FIG. 15A, it is assumed that the vehicle is parked in the specific parking area 1110b of the parking lot 1100, and that after the vehicle is parked, the driver moves to the entrance 1109 and then to the outside of the parking lot 1100.

The processor 370 of the travel detection apparatus 300 can control so that the extracted plurality of parking zone identification markers are transmitted to the mobile terminal 600 when the parking of the vehicle is completed. The vehicle driver can thus easily grasp the position of his or her vehicle without taking separate photographs with the mobile terminal.

In particular, the processor 370 of the travel detection apparatus 300 can control so that, of the plurality of parking zone identification markers extracted up to the completion of parking, only a predetermined number of markers are transmitted to the mobile terminal 600.

For example, the processor 370 of the travel detection apparatus 300 may control, through the communication unit 320, so that only three of the six extracted parking zone identification markers are transmitted to the driver's mobile terminal 600. Accordingly, transmission to the mobile terminal can focus on the vehicle parking position information, reducing unnecessary data transmission.
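
The following is an illustrative selection of what might be sent to the terminal, namely the markers nearest the parking bay plus the floor label; the payload layout and function name are assumptions, not the disclosed format.

```python
# Sketch: choose the last few markers on the route (three of six, as above)
# and attach the floor label before transmission to the mobile terminal.
def parking_payload(markers_in_time_order, floor_label, n=3, newest_first=False):
    selected = markers_in_time_order[-n:]      # markers nearest the bay
    if newest_first:                           # reverse order, as in FIG. 16B
        selected = list(reversed(selected))
    return {"floor": floor_label, "markers": selected}

# Example: parking_payload(['A01', 'A05', 'B06', 'B08', 'B05'], 'B2F')
#          -> {'floor': 'B2F', 'markers': ['B06', 'B08', 'B05']}
```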

Meanwhile, the processor 370 of the travel detection apparatus 300 can control so that the floor information calculated as described above is further transmitted to the mobile terminal 600.

For example, when the parking lot 1100 of FIG. 15A is on the second basement level, further transmitting the second-basement-level information to the mobile terminal 600 allows the driver to grasp the parking position of his or her vehicle more clearly.

FIG. 16A illustrates an example of the plurality of parking zone identification markers displayed on the display 680 of the mobile terminal 600.

The processor 370 of the travel detection apparatus 300 can control so that the floor information within the building for the vehicle and the plurality of parking zone identification markers are transmitted sequentially, in accordance with the traveling route of the vehicle.

In particular, the processor 370 of the travel detection apparatus 300 can control so that the plurality of parking zone identification markers are transmitted sequentially in chronological order.

Accordingly, as shown in FIG. 16A, a parking information screen 1110 including the floor information 1109a (B2F) within the building for the vehicle, the plurality of parking zone identification markers 1112, 1114, and 1116 (B06, B08, B05), and the traveling route information 1117 of the vehicle can be displayed on the display 680 of the mobile terminal 600.

FIG. 16B illustrates another example of the plurality of parking zone identification markers displayed on the display 680 of the mobile terminal 600.

The processor 370 of the travel detection apparatus 300 may control so that the plurality of parking zone identification markers are transmitted in reverse chronological order.

Accordingly, as shown in FIG. 16B, a parking information screen 1110 including the floor information 1109a (B2F) within the building for the vehicle, the plurality of parking zone identification markers 1116, 1114, and 1112 (B05, B08, B06), and the traveling route information 1119 of the vehicle can be displayed on the display 680 of the mobile terminal 600.

Meanwhile, the processor 370 of the travel detection apparatus 300 can control so that the camera 395 is activated for the second time when the ignition is turned off after the vehicle has been parked, so that the driver's walking path information is extracted based on the images captured by the camera 395, so that the parking lot entrance information 1109 is extracted, and so that the walking path information or the parking lot entrance information 1109 is further transmitted to the mobile terminal 600.

Meanwhile, the processor 370 of the travel detection apparatus 300 can control so that the floor information within the building for the vehicle, the plurality of parking zone identification markers, and the walking path information or parking lot entrance information are transmitted sequentially, with the traveling route information of the vehicle added to the parking zone identification markers, the walking path information, or the parking lot entrance information.

FIG. 16C illustrates another example of the plurality of parking zone identification markers displayed on the display 680 of the mobile terminal 600.

The processor 370 of the travel detection apparatus 300 can control so that the floor information within the building for the vehicle, the plurality of parking zone identification markers, and the parking lot entrance information are transmitted sequentially in chronological order.

Accordingly, as shown in FIG. 16C, a parking information screen 1110 including the floor information 1109a (B2F) within the building for the vehicle, the plurality of parking zone identification markers 1112, 1114, and 1116 (B06, B08, B05), the entrance information 1132, and the traveling route information 1117 of the vehicle can be displayed on the display 680 of the mobile terminal 600.

FIG. 16D illustrates another example of the plurality of parking zone identification markers displayed on the display 680 of the mobile terminal 600.

The processor 370 of the travel detection apparatus 300 may control so that the plurality of parking zone identification markers and the parking lot entrance information are transmitted in reverse chronological order.

Accordingly, as shown in FIG. 16D, a parking information screen 1110 including the floor information 1109a (B2F) within the building for the vehicle, the entrance information 1132, the plurality of parking zone identification markers 1116, 1114, and 1112 (B05, B08, B06), and the traveling route information 1119 of the vehicle can be displayed on the display 680 of the mobile terminal 600.

FIGS. 17A to 18B are diagrams referred to in the description of the operation of the travel detection apparatus of FIG. 8.

FIG. 17A illustrates the vehicle 200 traveling on a flat road and judging the road surface condition around the vehicle based on the pattern light and reflected light of the travel detection apparatuses 300a and 300d.

On the other hand, the travel detecting apparatus 300 according to the embodiment of the present invention may detect obstacles around the vehicle in advance.

As shown in FIG. 17A, the processor 370 of the travel detection apparatus 300a disposed at the front of the vehicle detects in advance the speed bump 1700 in front of the vehicle based on the pattern light and the reflected light, and can provide a suspension drive control signal for controlling the suspension driving unit 756 in the vehicle, corresponding to the speed bump 1700 in front of the vehicle.

In such a case, the processor 370 or the ECU 770 can control the suspension driving unit 756 so that the suspension apparatuses corresponding to the front wheels 113FR and 113FL of the vehicle are driven in consideration of the speed bump 1700 in front of the vehicle.

Specifically, as shown in FIG. 17B, when the vehicle front wheels 113FR and 113FL are located on, or immediately before, the speed bump 1700 in front of the vehicle, the processor 370 or the ECU 770 can control the suspension driving unit 756 so that the suspension apparatuses corresponding to the front wheels 113FR and 113FL are driven.

A stable ride can thus be provided in spite of the speed bump 1700 in front of the vehicle.

Meanwhile, when the vehicle rear wheels 113RR and 113RL are located on, or immediately before, the speed bump 1700, the processor 370 or the ECU 770 can control the suspension driving unit 756 so that the suspension apparatuses corresponding to the vehicle rear wheels 113RR and 113RL are driven.

FIG. 18A illustrates the vehicle 200 traveling on a flat road and judging the road surface condition around the vehicle based on the pattern light and reflected light of the travel detection apparatuses 300a and 300d.

On the other hand, the travel detecting apparatus 300 according to the embodiment of the present invention may detect obstacles around the vehicle in advance.

As shown in FIG. 18A, when there is an inclined road 1800 in front of the vehicle, the processor 370 of the travel detection apparatus 300a detects the inclined road 1800 in front of the vehicle based on the pattern light and the reflected light, and can provide a suspension drive control signal for controlling the suspension driving unit 756 in the vehicle, corresponding to the inclined road 1800 in front of the vehicle.

In such a case, the processor 370 or the ECU 770 can control the suspension driving unit 756 so that, of the vehicle wheels 113FR, 113FL, 113RR, and 113RL, the suspension apparatuses corresponding to the left wheels 113FL and 113RL are driven in consideration of the inclined road 1800 in front of the vehicle.

Specifically, as shown in FIG. 18B, when the left side of the vehicle would otherwise sit lower than the right side, the processor 370 or the ECU 770 can control the suspension driving unit 756 so that, of the vehicle wheels 113FR, 113FL, 113RR, and 113RL, the suspension apparatuses corresponding to the left wheels 113FL and 113RL are driven.

A stable ride can thus be provided in spite of the inclined road 1800 in front of the vehicle.

Meanwhile, although not shown in the figure, when an uphill road or a downhill road is located in front of the vehicle, the processor 370 senses the uphill road or downhill road based on the difference between the pattern light and the reflected light, and can provide a suspension drive control signal for controlling the suspension driving unit in the vehicle, corresponding to the uphill road or downhill road.

Meanwhile, although not shown in the figure, when a hole is located in front of the vehicle, the processor 370 can detect the hole based on the difference between the pattern light and the reflected light or the difference between successively received reflected light, and can also provide a steering drive control signal for controlling the steering driving unit in the vehicle, corresponding to the hole.

Meanwhile, the operating method of the travel detection apparatus or the vehicle of the present invention can be implemented as processor-readable code on a processor-readable recording medium provided in the travel detection apparatus or the vehicle. The processor-readable recording medium includes all kinds of recording devices in which data readable by a processor is stored. Examples of the processor-readable recording medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the medium may also be implemented in the form of a carrier wave, such as transmission over the Internet. In addition, the processor-readable recording medium may be distributed over network-connected computer systems so that processor-readable code can be stored and executed in a distributed fashion.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments; it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention.

Claims (13)

A travel detection apparatus comprising:
a light transmitting unit for outputting pattern light in a downward direction of a vehicle;
a light receiving unit for receiving reflected light of the output pattern light; and
a processor for determining a road surface condition around the vehicle based on a difference between the pattern light and the reflected light or a difference between successively received reflected light, and providing a vehicle posture control signal corresponding to the determined road surface condition.
The travel detection apparatus according to claim 1, wherein, when a speed bump is located in front of the vehicle, the processor senses the speed bump based on the difference between the pattern light and the reflected light or the difference between successively received reflected light, and provides a suspension drive control signal for controlling a suspension driving unit in the vehicle, corresponding to the speed bump.
The travel detection apparatus according to claim 1, wherein, when an uphill road, a downhill road, or a road inclined in one direction is located in front of the vehicle, the processor senses the uphill road, downhill road, or road inclined in one direction based on the difference between the pattern light and the reflected light or the difference between successively received reflected light, and provides a suspension drive control signal for controlling a suspension driving unit in the vehicle, corresponding to the uphill road, downhill road, or road inclined in one direction.
The travel detection apparatus according to claim 1, wherein, when a hole is located in front of the vehicle, the processor detects the hole based on the difference between the pattern light and the reflected light or the difference between successively received reflected light, and provides a steering drive control signal for controlling a steering driving unit in the vehicle, corresponding to the hole.
The travel detection apparatus according to claim 1, wherein the processor calculates a three-dimensional position or a three-dimensional moving distance of the vehicle based on the difference between the pattern light and the reflected light or the difference between successively received reflected light.
The travel detection apparatus according to claim 1, wherein the light transmitting unit includes:
a light source unit for outputting light; and
a light diffusing unit for diffusing the light and outputting the pattern light.
The travel detection apparatus according to claim 1, wherein the processor controls the pattern of the pattern light output from the light transmitting unit to be varied based on the received reflected light.
The travel detection apparatus according to claim 1, wherein the processor controls infrared-based pattern light to be output.
The travel detection apparatus according to claim 1, wherein the processor controls visible-light-based pattern light to be output during the day and infrared-based pattern light to be output at night.
The travel detection apparatus according to claim 1, wherein the processor varies at least one of a luminance, a color, and an area of the output pattern light based on a level of the received reflected light.
The travel detection apparatus according to claim 5, wherein the processor calculates a position or moving distance on x and y axes based on a time difference between a first reflected light and a second reflected light that are successively received, and calculates a position or moving distance on a z axis based on a pattern difference between the first reflected light and the second reflected light.
The travel detection apparatus according to claim 5, further comprising a communication unit for receiving a position-based signal, wherein the processor calculates the three-dimensional position or three-dimensional moving distance of the vehicle based on the difference between the pattern light and the reflected light when the position-based signal is not received in the communication unit or when a level of the position-based signal is equal to or lower than a predetermined level.
A vehicle comprising the travel detection apparatus of any one of claims 1 to 12.
KR1020150097875A 2015-07-09 2015-07-09 Apparatus for detecting navigation and vehicle including the same KR20170006753A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150097875A KR20170006753A (en) 2015-07-09 2015-07-09 Apparatus for detecting navigation and vehicle including the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150097875A KR20170006753A (en) 2015-07-09 2015-07-09 Apparatus for detecting navigation and vehicle including the same

Publications (1)

Publication Number Publication Date
KR20170006753A true KR20170006753A (en) 2017-01-18

Family

ID=57991917

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150097875A KR20170006753A (en) 2015-07-09 2015-07-09 Apparatus for detecting navigation and vehicle including the same

Country Status (1)

Country Link
KR (1) KR20170006753A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113044022A (en) * 2019-12-26 2021-06-29 现代自动车株式会社 Apparatus and method for controlling travel of vehicle
KR20220044031A (en) * 2020-09-29 2022-04-06 한국자동차연구원 System and method for controlling charging of online electric vehicle

Similar Documents

Publication Publication Date Title
KR102043060B1 (en) Autonomous drive apparatus and vehicle including the same
KR101741433B1 (en) Driver assistance apparatus and control method for the same
KR101850795B1 (en) Apparatus for Parking and Vehicle
KR101750876B1 (en) Display apparatus for vehicle and Vehicle
KR101551215B1 (en) Driver assistance apparatus and Vehicle including the same
US10782405B2 (en) Radar for vehicle and vehicle provided therewith
CN105270179B (en) Vehicle parking assistance device and vehicle
KR20170010645A (en) Autonomous vehicle and autonomous vehicle system including the same
KR101582572B1 (en) Driver assistance apparatus and Vehicle including the same
KR20190106845A (en) Method and apparatus for passenger recognition and boarding support of autonomous vehicle
KR20170003133A (en) Advanced Driver Assistance System, Display apparatus for vehicle and Vehicle
KR101632179B1 (en) Driver assistance apparatus and Vehicle including the same
KR101698781B1 (en) Driver assistance apparatus and Vehicle including the same
KR20170011885A (en) Antenna, radar for vehicle, and vehicle including the same
KR20160148394A (en) Autonomous vehicle
KR20170140284A (en) Vehicle driving aids and vehicles
KR101641491B1 (en) Driver assistance apparatus and Vehicle including the same
KR20160148395A (en) Autonomous vehicle
KR101822896B1 (en) Driver assistance apparatus and control method for the same
KR20170043212A (en) Apparatus for providing around view and Vehicle
KR101980547B1 (en) Driver assistance apparatus for vehicle and Vehicle
KR20170006753A (en) Apparatus for detecting navigation and vehicle including the same
KR20160144643A (en) Apparatus for prividing around view and vehicle including the same
KR101872477B1 (en) Vehicle
KR20160064762A (en) Display apparatus for vhhicleand vehicle including the same

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E90F Notification of reason for final refusal
E601 Decision to refuse application
E801 Decision on dismissal of amendment