KR20160146280A - Driver Assistance Apparatus and Vehicle Having The Same - Google Patents


Info

Publication number
KR20160146280A
Authority
KR
South Korea
Prior art keywords
vehicle
laser
automatic parking
processor
unit
Prior art date
Application number
KR1020150083334A
Other languages
Korean (ko)
Inventor
윤종화
박상민
윤상열
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 filed Critical 엘지전자 주식회사
Priority to KR1020150083334A priority Critical patent/KR20160146280A/en
Publication of KR20160146280A publication Critical patent/KR20160146280A/en


Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60W — CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 — Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/06 — Automatic manoeuvring for parking
    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60W — CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 — Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/02 — Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60W — CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 — Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 — Interaction between the driver and the control system
    • B60W50/14 — Means for informing the driver, warning the driver or prompting a driver intervention
    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 — Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88 — Sonar systems specially adapted for specific applications
    • G01S15/93 — Sonar systems specially adapted for specific applications for anti-collision purposes
    • G01S15/931 — Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S17/936
    • G — PHYSICS
    • G08 — SIGNALLING
    • G08G — TRAFFIC CONTROL SYSTEMS
    • G08G1/00 — Traffic control systems for road vehicles
    • G08G1/14 — Traffic control systems for road vehicles indicating individual free spaces in parking areas
    • G08G1/141 — Traffic control systems for road vehicles indicating individual free spaces in parking areas with means giving the indication of available parking spaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • Traffic Control Systems (AREA)

Abstract

The automatic parking assist apparatus according to the embodiment has the advantage that the parking space can be precisely detected using a high-resolution, low-cost laser sensor. The laser sensor swings the laser irradiation direction, moving the scanned horizontal plane up and down, and thereby detects the 3D space with a pointer laser.

Description

Technical Field [0001] The present invention relates to an automatic parking assist device and a vehicle including the same.

BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to an automatic parking assisting apparatus provided in a vehicle, a control method thereof, and a vehicle including the same.

A vehicle is a device that moves a riding user in a desired direction. A typical example is an automobile.

Meanwhile, vehicles are equipped with various sensors and electronic devices for the convenience of their users. In particular, various devices for the user's driving convenience have been developed.

Recently, as interest in autonomous vehicles increases, research on sensors mounted on autonomous vehicles is actively under way. Sensors mounted on an autonomous vehicle include cameras, infrared sensors, radar, GPS, lidar, and gyroscopes. Among them, the camera occupies an important position as a sensor that replaces the human eye.

In particular, there is growing interest in vehicles having an automatic parking function that automatically performs parking, a task many drivers currently find difficult.

Ultrasonic sensors have attracted attention as sensors for detecting parking spaces for such automatic parking. Since the ultrasonic sensor has a wide field of view (FOV), the parking space can be sensed even with a single ultrasonic sensor, so the automatic parking function can be implemented at a low price.

However, such an ultrasonic sensor has low range resolution, making precise parking-space sensing difficult. For example, even when the space between parked cars is larger than the full width of the vehicle (or the total length of the vehicle), the measured parking space may appear smaller than the actual space, and automatic parking is not performed.

In addition, a parking-space detection sensor using a camera has been proposed, but the camera is sensitive to light intensity and weather and has difficulty reliably discriminating obstacles by their shape.

Therefore, there is a demand for a sensor capable of accurately sensing a parking space at low cost, and for a vehicle capable of providing an automatic parking function that serves user convenience by using such a sensor.

 SUMMARY OF THE INVENTION The present invention has been made to solve the above-mentioned problems, and it is an object of the present invention to provide an automatic parking assist device using a laser sensor capable of detecting a precise parking space while securing vehicle safety.

An automatic parking assist device according to an embodiment of the present invention includes: a laser sensor including a laser output unit for irradiating a laser beam toward a side of the vehicle and a laser detection unit for receiving a reflection signal reflected from an object; an irradiation direction adjusting unit for adjusting the laser irradiation direction of the laser sensor; and a processor for calculating the distance between the vehicle and the object from the reflection signal, collecting the detected distances, and determining a parking space for the vehicle.

The automatic parking assist apparatus according to the embodiment has the advantage that the parking space can be precisely detected using a high-resolution, low-cost laser sensor.

In detail, the laser sensor according to the embodiment can be realized using a small, low-power, low-cost semiconductor laser diode, and has the merit that the parking space can be precisely measured because of its high range resolution.

The laser sensor according to the embodiment swings the laser irradiation direction, moving the scanned horizontal plane up and down, and thereby detects the 3D space with a pointer laser. In other words, the laser sensor according to the embodiment can accurately detect only the parking obstacles that obstruct parking by irradiating the laser in the horizontal direction and swinging it in the vertical direction within a certain range.

Further, the direction adjusting unit of the laser sensor according to the embodiment can further improve the resolution by moving the laser irradiation direction in an oblique direction.

In addition, the direction adjusting unit of the laser sensor according to another embodiment can guide the laser sensor to swing by transmitting the natural vibration generated in the vehicle to the laser sensor through the link unit, so that no separate power driver is needed and the device can be implemented at low cost.

Meanwhile, the automatic parking assist device according to the embodiment can detect the parking space by considering both the space between vehicles and the parking lines of the vehicle through the around-view camera.

In addition, the automatic parking assist device according to the embodiment enables the driver to check whether the automatic parking is safe through the display unit and to control the automatic parking assist device through the input unit, thereby further enhancing the safety of the vehicle.

Further, in the automatic parking assist device according to another embodiment, the outermost area is detected by a laser sensor with good resolution, and the remaining space is detected by an ultrasonic sensor with a good viewing angle, thereby securing the safety of the vehicle.

Further, the automatic parking assist apparatus according to the embodiment can precisely detect the parking space by correcting the laser irradiation direction in consideration of the vehicle tilt information.

BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a perspective view showing the appearance of a vehicle equipped with an automatic parking assist device according to an embodiment of the present invention.
FIG. 2 is a plan view of a vehicle equipped with an automatic parking assist device according to an embodiment of the present invention.
FIG. 3 shows a front view of a vehicle equipped with an automatic parking assist device according to an embodiment of the present invention.
FIG. 4 shows an internal block diagram of the automatic parking assist apparatus 100 according to the embodiment of the present invention.
FIG. 5 is a front view of a vehicle showing a change in laser irradiation direction according to an embodiment of the present invention.
FIG. 6 is a side view of a vehicle showing a change in laser irradiation direction according to an embodiment of the present invention.
FIG. 7 is a side view of a vehicle showing a change in laser irradiation direction according to another embodiment of the present invention.
FIG. 8 is a view showing a laser sensor according to another embodiment of the present invention.
FIG. 9 shows a situation where the vehicle senses the parking space using the laser sensor.
FIG. 10 shows the parking space scanned by the laser sensor in FIG. 9.
FIG. 11 is a diagram illustrating parking space detection using an around-view camera of an automatic parking assist apparatus according to an embodiment of the present invention.
FIG. 12 is a view showing an automatic parking assist device according to another embodiment of the present invention detecting a parking space.
FIG. 13 is a view showing laser irradiation direction control according to vehicle tilt information according to an embodiment of the present invention.
FIG. 14 is an example of an internal block diagram of the vehicle of FIG. 1.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements, and redundant description thereof will be omitted. The suffixes "module" and "unit" for the components used in the following description are given or used interchangeably only in consideration of ease of drafting the specification, and do not by themselves have distinct meanings or roles. In describing the embodiments disclosed herein, a detailed description of related known art will be omitted when it is determined that it may obscure the gist of the embodiments disclosed herein. The accompanying drawings are only intended to facilitate understanding of the embodiments disclosed herein, and the technical idea disclosed herein is not limited by them; the embodiments should be understood to include all modifications, equivalents, and alternatives falling within their spirit and scope.

Terms including ordinals, such as first and second, may be used to describe various elements, but the elements are not limited by these terms. The terms are used only for the purpose of distinguishing one component from another.

It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.

The singular expressions include plural expressions unless the context clearly dictates otherwise.

In the present application, terms such as "comprises" and "having" are intended to specify the presence of stated features, numbers, steps, operations, elements, components, or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

The vehicle described herein may be a concept including both an automobile and a motorcycle. Hereinafter, the description will be given mainly with respect to an automobile.

The vehicle described in the present specification may be a concept including both an internal combustion engine vehicle having an engine as a power source, a hybrid vehicle having an engine and an electric motor as a power source, and an electric vehicle having an electric motor as a power source.

In the following description, the left side of the vehicle means the left side in the running direction of the vehicle, and the right side of the vehicle means the right side in the running direction of the vehicle.

Unless otherwise mentioned in the following description, the LHD (Left Hand Drive) vehicle will be mainly described.

FIG. 1 is a perspective view showing the appearance of a vehicle equipped with an automatic parking assist device according to an embodiment of the present invention, FIG. 2 is a plan view of a vehicle equipped with an automatic parking assist device according to an embodiment of the present invention, and FIG. 3 shows a front view of a vehicle equipped with an automatic parking assist device according to an embodiment of the present invention.

Referring to FIG. 1, the vehicle 700 includes wheels 13FL and 13FR that are rotated by a power source, steering input means for adjusting the traveling direction of the vehicle 700, and an automatic parking assist device 100.

The automatic parking assist device 100 according to the embodiment of the present invention may include a laser sensor for detecting the parking space by measuring the distance between the vehicle 700 and the object.

The laser sensor can detect the parking space by irradiating the laser onto an object and detecting the reflected laser to measure the distance between the vehicle 700 and the object.

Referring to FIG. 2, the automatic parking assist device 100L is disposed on the left side of the vehicle 700 and irradiates a laser toward the left of the vehicle 700; by measuring the distance between the side of the vehicle 700 and objects disposed on the left, it can detect the parking space.

The automatic parking assist device 100R is disposed on the right side of the vehicle 700 and irradiates a laser toward the right of the vehicle 700; by measuring the distance between the side of the vehicle 700 and objects disposed on the right, it can detect the parking space.

For convenience of explanation, the automatic parking assist apparatus 100R including the laser sensor disposed on the right side of the vehicle 700 will be described.

Referring to FIG. 3, the laser sensor of the automatic parking assist device 100 is disposed on the side of the vehicle 700, irradiates the laser outward from the side, and can receive the signal reflected from an object. The distance d between the side of the vehicle 700 and the object O can then be measured by analyzing the output laser signal information and the reflected signal information.

The distance d between the vehicle 700 and the object O can be measured according to the laser signal modulation method, for example, a time-of-flight (TOF) method and/or a phase-shift method.

In detail, in a time-delay manner, it is possible to measure the distance d to the object O by emitting a pulsed laser signal and measuring the time the pulse signals reflected from objects O within the measurement range take to arrive at the receiver.
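As a rough numerical sketch of this time-delay scheme (an illustration, not the patent's implementation; the helper name `tof_distance` is hypothetical), the distance is half the measured round-trip time multiplied by the speed of light:

```python
# Time-of-flight (TOF) distance: emit a laser pulse, measure the round-trip
# time of the reflected pulse, and halve the round-trip path length.
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to the reflecting object (hypothetical helper)."""
    return C * round_trip_time_s / 2.0

# A pulse returning after 100 ns corresponds to roughly 15 m.
d = tof_distance(100e-9)  # ~14.99 m
```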

Alternatively, in a phase-modulation manner, the distance d can be calculated by emitting a laser continuously modulated at a specific frequency and measuring the amount of phase change of the signal reflected back from the object O within the measurement range.
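The phase-modulation scheme can be sketched similarly. Assuming a sinusoidal modulation at frequency f, the reflected signal lags by Δφ = 4πf·d/c, so d = c·Δφ/(4πf); the helper below is a hypothetical illustration and ignores phase wrapping beyond the unambiguous range c/(2f):

```python
import math

C = 299_792_458.0  # speed of light in m/s

def phase_shift_distance(delta_phi_rad: float, mod_freq_hz: float) -> float:
    """Distance from the phase lag of a continuously modulated laser
    (hypothetical helper; valid only within the unambiguous range c/(2f))."""
    return C * delta_phi_rad / (4.0 * math.pi * mod_freq_hz)

# With 10 MHz modulation, a phase lag of pi/2 corresponds to about 3.75 m.
d = phase_shift_distance(math.pi / 2, 10e6)  # ~3.75 m
```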

Such a laser sensor can be implemented using a small, low-power, low-cost semiconductor laser diode, yet has a high range resolution, which allows the parking space to be measured precisely. For example, the laser sensor can accurately distinguish the space in which a vehicle is disposed from the space in which no vehicle is disposed, based on whether the laser is reflected or not, so that the parking space can be precisely recognized.

In an embodiment, a 1D laser may be used as the laser sensor, or a 2D laser may be used.

For example, the laser sensor may be a pointer laser: it scans a one-dimensional line when the vehicle 700 is stopped by irradiating the laser outward, and as the vehicle 700 moves, a two-dimensional plane can be scanned to detect the parking space of the vehicle 700. That is, the laser sensor can scan the horizontal plane defined by the laser irradiation direction L0 and the moving direction of the vehicle 700 to detect the parking space.

Such a laser sensor can only scan objects placed on the horizontal plane being scanned, and thus may not detect an obstacle that does not lie on that plane. For example, when the laser sensor's irradiation direction is fixed with respect to the height of the vehicle 700, there is a disadvantage that an obstacle at a relatively low height (e.g., a curb) cannot be detected.

In addition, when the laser sensor uses a 2D or higher-dimensional laser, there is a disadvantage that the volume increases and the unit cost rises.

In order to overcome such a problem, the laser sensor of the automatic parking assist apparatus 100 according to the embodiment can change the irradiation direction L0 of the laser. More specifically, the laser sensor can detect the 3D space with the pointer laser by repeatedly changing the laser irradiation direction L0 in the vertical direction and moving the horizontal plane to be scanned in the vertical direction.

Hereinafter, the automatic parking assist apparatus 100 according to the embodiment will be described in more detail with reference to FIGS. 4 to 13.

FIG. 4 shows an internal block diagram of the automatic parking assisting apparatus 100 according to the embodiment of the present invention.

Referring to FIG. 4, the automatic parking assist apparatus 100 according to the embodiment may include a laser sensor 110 and a processor 170. The automatic parking assist apparatus 100 may further include an input unit 120, a camera 150, an ultrasonic sensor 160, a memory 140, an audio output unit 185, a display unit 180, a power supply unit 190, and an interface unit 130.

First, the laser sensor 110 may include a laser output unit 111 for irradiating a laser, a laser detection unit 113 for receiving the laser reflected by an object as a reflection signal, and a direction adjustment unit 200 for changing the laser irradiation direction.

More specifically, the laser output section 111 can output the laser from the vehicle 700 toward the outer object of the vehicle 700 in a specific irradiation direction.

Specifically, the operating range of the laser irradiated by the laser output unit 111 may be long-range. That is, the operating range may at least exceed the total length of the vehicle 700. For example, the operating range may be between 30 m and 300 m.

Further, the laser output unit 111 can output the laser in a reference laser irradiation direction. For example, the reference laser irradiation direction may be a direction inclined slightly downward from the horizontal, so that the laser is irradiated toward the body of another vehicle.

The laser irradiated by the laser output unit 111 may be reflected by an object placed outside the vehicle 700 to generate a reflected signal and the returned reflected signal may be detected by the laser detecting unit 113.

In an embodiment, the processor 170 may emit a laser signal and measure the time it takes for the reflected signals from objects within the measurement range to arrive at the receiver to measure the distance to the object.

More specifically, as the vehicle 700 moves forward (or rearward), the laser sensor 110 can detect the parking space by scanning a 2D planar space defined by the laser irradiation direction toward the side of the vehicle 700 and the travel direction of the vehicle 700.

At this time, the direction adjusting unit 200 can scan the 3D space by changing the laser irradiation direction and controlling the 2D plane to be detected.

For example, the direction adjusting unit 200 can repeatedly move the laser irradiation direction in the vertical direction, thereby repeatedly moving the detected 2D plane up and down. The scanned 2D planes can then be combined to scan the 3D space. That is, since the movement of the vehicle 700 covers the front-rear direction, the laser irradiation direction covers the left-right direction, and the irradiation-direction control of the direction adjusting unit 200 covers the up-down direction, the three together scan the 3D space.
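The stacking of 2D scan lines into a 3D point cloud described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the helper `scan_point`, the step sizes, and the swing angles are hypothetical, and the geometry assumes the swing angle is measured from the horizontal irradiation direction:

```python
import math

def scan_point(x_travel_m: float, swing_angle_rad: float, range_m: float):
    """Convert one pointer-laser sample into a 3D point (hypothetical geometry):
    x = vehicle travel, y = lateral distance, z = height offset from the swing."""
    y = range_m * math.cos(swing_angle_rad)
    z = range_m * math.sin(swing_angle_rad)
    return (x_travel_m, y, z)

# Accumulate samples while the vehicle moves forward and the irradiation
# direction swings up and down between the lower and upper limits (LL..LH).
cloud = [scan_point(step * 0.1, math.radians(angle_deg), 3.0)
         for step in range(5)                    # forward motion, 0.1 m steps
         for angle_deg in (-10, -5, 0, 5, 10)]  # vertical swing
```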

On the other hand, when detecting a parking space, accurately sensing how far an object protrudes in the horizontal direction is more important than its vertical position. The laser sensor 110 according to the embodiment can accurately sense only the parking obstacles that obstruct parking by irradiating the laser in the horizontal direction and swinging it in the vertical direction within a certain range.

FIG. 5 is a front view of a vehicle showing a change in laser irradiation direction according to an embodiment of the present invention. FIG. 6 is a side view of a vehicle showing a change in laser irradiation direction according to an embodiment of the present invention.

Referring to FIGS. 5 and 6, the direction adjusting unit 200 can move the laser irradiation direction upward from the reference laser irradiation direction L0 to the upper-limit laser irradiation direction LH, and downward to the lower-limit laser irradiation direction LL. That is, the direction adjusting unit 200 can swing the laser irradiation direction between the upper-limit direction LH and the lower-limit direction LL.

More specifically, referring to FIG. 5, the direction adjusting unit 200 may move the laser irradiation direction upward from the reference laser irradiation direction L0, adjusting it through the third direction L3, the second direction L2, and the first direction L1 so as to irradiate the laser in the upper-limit laser irradiation direction LH; it may then move the direction downward from the upper-limit direction LH through the first direction L1, the second direction L2, the third direction L3, and the fourth direction L4 so as to irradiate the laser in the lower-limit laser irradiation direction LL.

The laser detection unit 113 can receive reflection signals corresponding to each change in irradiation direction. For example, referring to FIG. 6, a reference reflection signal of the laser irradiated in the reference direction can be received. Likewise, the lasers irradiated in the first to fourth directions L1, L2, L3, and L4 can be sensed as the first to fourth reflected signals, respectively. The reflected-signal detection frequency can be controlled by the processor 170; the higher the detection frequency, the better the resolution.
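The point that a higher detection frequency yields better resolution can be made concrete with a small sketch (the function name and example numbers are hypothetical, not from the patent): as the vehicle moves, the distance travelled between consecutive reflected-signal samples sets the spatial sampling step along the path.

```python
def sample_spacing_m(vehicle_speed_mps: float, detection_freq_hz: float) -> float:
    """Distance the vehicle travels between consecutive reflected-signal
    samples; a smaller spacing means finer resolution along the path."""
    return vehicle_speed_mps / detection_freq_hz

# At 2 m/s (~7 km/h), raising the detection frequency from 10 Hz to 100 Hz
# refines the along-path sampling from 20 cm to 2 cm.
coarse = sample_spacing_m(2.0, 10.0)   # 0.2 m
fine = sample_spacing_m(2.0, 100.0)    # 0.02 m
```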

The processor 170 can process the received first to fourth reflected signals to calculate the positions of the objects arranged in the respective laser irradiation directions.

FIG. 7 is a side view of a vehicle showing a change in laser irradiation direction according to another embodiment of the present invention.

Referring to FIG. 7, the direction adjusting unit 200 can further improve the resolution by moving the laser irradiation direction in an oblique direction. That is, the direction adjusting unit 200 can further improve the angular resolution in the vertical direction by swinging the laser irradiation direction in the oblique direction between the up-down direction and the forward-backward direction.

The direction adjusting unit 200 may control the laser irradiation direction by moving the laser output unit 111 itself or controlling the lens included in the laser output unit 111.

For example, the direction adjusting unit 200 includes a power driving unit, and can directly apply power to move the laser output unit 111 itself or move the lens.

Alternatively, the direction adjusting unit 200 of another embodiment of the present invention may use the natural vibration generated in the vehicle 700 to move the irradiation direction of the laser.

8 is a diagram illustrating a laser sensor 110 according to another embodiment of the present invention.

Referring to FIG. 8, the laser sensor 110 according to another embodiment includes a laser output unit 111, a laser detection unit 113, and a direction adjustment unit 200. The direction adjustment unit 200 may include a support part 220 supporting the laser sensor 110, a coupling part 210 coupling the support part 220 and the laser sensor 110, and a link part 230 transmitting the vibration of the vehicle 700 to the laser sensor 110.

More specifically, the support part 220 may be fixed to one side of the vehicle 700 to support the laser sensor 110 body. The support portion 220 may be coupled to the laser sensor 110 through the coupling portion 210.

The coupling part 210 may couple the body of the laser sensor 110 with the supporting part 220 to be movable. For example, the coupling portion 210 may include a hinge to couple the laser sensor 110 body with the support portion 220 so as to be rotatable within a predetermined angle. That is, when the force is applied, the body of the laser sensor 110 can rotate upward or downward with respect to the coupling portion 210 according to the direction of the force.

According to the movement of the laser sensor 110 body, the laser irradiation direction of the laser output section 111 can be swung up and down.

A link portion 230 may be disposed between the vehicle 700 and the laser sensor 110 body to transmit the natural vibration force of the vehicle 700 to the laser sensor 110 body.

The link unit 230 may be an elastic member connecting the vehicle 700 and the other end of the laser sensor 110 body, at a position opposite the output end of the laser sensor 110.

The link unit 230 may transmit the natural vibration force generated from the vehicle 700 to the other end of the laser sensor 110 to guide the laser sensor 110 to swing with respect to the coupling unit 210.

Such a laser sensor 110 can swing the laser irradiation direction without a separate power drive unit, which is advantageous in that it can be implemented at low cost.

On the other hand, the processor 170 controls the overall operation of each unit in the automatic parking assist device 100.

In detail, the processor 170 may analyze the reflected signal received from the laser sensor 110 to estimate the parking space.

In addition, the processor 170 may change the laser irradiation direction through the direction adjusting unit 200.

In addition, the processor 170 may process the vehicle front image or the vehicle surroundings image obtained by the camera 150. In particular, the processor 170 performs signal processing based on computer vision. Accordingly, the processor 170 can acquire images of the front or surroundings of the vehicle 700 from the camera 150 and perform object detection and object tracking based on those images. In particular, when detecting objects, the processor 170 may perform lane detection (LD), vehicle detection (VD), pedestrian detection (PD), bright-spot detection, traffic sign recognition (TSR), road surface detection, and the like.

The processor 170 may detect information in the vehicle front image or the vehicle surroundings image obtained by the camera 150.

The information may be information on the driving situation of the vehicle. For example, the information may be a concept including road information, traffic regulation information, surrounding vehicle information, vehicle or pedestrian traffic light information, construction information, traffic situation information, parking lot information, and the like.

The processor 170 may compare the detected information with the information stored in the memory 140 to verify the information.

For example, the processor 170 detects a graphic or text indicating the vehicle 700 or the parking lot among the objects included in the acquired image. Here, the object may be a traffic sign or a parking line. The processor 170 may compare the traffic information stored in the memory 140 with the detected pattern or text to identify the parking lot sign and the parking line.

Hereinafter, the process by which the laser sensor 110 and the processor 170 detect the parking space will be described in more detail with reference to FIGS. 9 and 10.

FIG. 9 shows the vehicle sensing the parking space using the laser sensor, and FIG. 10 shows the parking space scanned by the laser sensor in FIG. 9.

Referring to FIG. 9, first to fifth obstacles O1, O2, O3, O4, and O5 having different heights are arranged on the left side of the vehicle 700 along the traveling direction of the vehicle 700.

More specifically, since the first obstacle O1 and the fifth obstacle O5 are vehicles having at least a predetermined height, they can be detected by the laser sensor 110 without difficulty.

That is, when the laser sensor 110 irradiates the laser in the reference laser irradiation direction L0 at the height of the bumper of the vehicle 700, the laser is reflected at the first obstacle O1 and the fifth obstacle O5, and by sensing the reflected signals, the positions of the first obstacle O1 and the fifth obstacle O5 can be accurately detected.

On the other hand, since the second obstacle O2 and the fourth obstacle O4 are curbs having a low height, when the laser is emitted in the reference laser irradiation direction L0, the laser may not be reflected and these obstacles may not be detected.

Also, the third obstacle O3 is a thin tree, and if the resolution is not sufficiently high, its size may not be accurately detected. That is, there is a disadvantage in that the ultrasonic sensor 160 senses the third obstacle O3 as larger than its actual size.

The laser sensor 110 according to the embodiment can swing the laser irradiation direction in the vertical direction and acquire a reflection signal reflected from the curb when irradiating the laser toward the lower limit laser irradiation direction LL. Therefore, the second obstacle O2 and the fourth obstacle O4 can be scanned without difficulty.

In addition, because the laser provides a high resolution, the laser sensor 110 according to the embodiment can accurately measure the size of the third obstacle O3, and the sizes and positions of the first obstacle O1 and the fifth obstacle O5 can also be accurately detected.

Referring to FIG. 10, the processor 170 may scan the 3D space using the laser sensor 110, convert the scan into a 2D image, and check the parking space. That is, the processor 170 can detect the parking space by checking the horizontal positions of objects whose height interferes with parking and setting a parking limit line.

Accordingly, the processor 170 can obtain the parking limit line according to the first to fifth obstacles O1, O2, O3, O4, and O5 disposed at positions where they obstruct parking.

The processor 170 measures the distances between the vehicle 700 and the first to fifth obstacles O1, O2, O3, O4, and O5, and when such a distance is greater than the full width of the vehicle 700, the corresponding space can be estimated as a parking space.

That is, when distances such as the distance c between the vehicle 700 and the second obstacle O2 and the distance b between the third obstacle O3 and the fourth obstacle O4 are each greater than the full width of the vehicle 700, the corresponding space can be estimated as a parking space.

In the case of parallel parking, the processor 170 may determine such a space as a parking space when the front-and-rear length d of the space, which is wider than the full width of the vehicle 700, is longer than the total length of the vehicle 700. That is, automatic parking of the vehicle 700 can be performed only when the longitudinal length d of the space wider than the full width of the vehicle 700 is longer than the total length of the vehicle 700.

Referring to FIG. 10, when the distances a, b, and c between the vehicle 700 and the obstacles are all larger than the full width of the vehicle 700 and the length d spanning these distances is larger than the total length of the vehicle 700, the vehicle 700 can be automatically parked in that space.
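The decision rule above can be sketched as follows. This is a minimal illustration rather than the embodiment's implementation; the function and parameter names are ours, and the gap values a, b, c, and d are assumed to have already been measured by the laser sensor:

```python
def is_parking_space(lateral_gaps, longitudinal_length, full_width, total_length):
    """Parallel-parking feasibility check.

    Sketch of the rule described above (names are illustrative, not from
    the patent): every lateral distance a, b, c between the vehicle and the
    obstacles must exceed the vehicle's full width, and the longitudinal
    length d of that region must exceed the vehicle's total length.
    """
    wide_enough = all(gap > full_width for gap in lateral_gaps)   # a, b, c
    long_enough = longitudinal_length > total_length              # d
    return wide_enough and long_enough
```

For example, `is_parking_space([2.1, 2.3, 2.0], 5.5, full_width=1.9, total_length=4.8)` would report the space of FIG. 10 as suitable, while shrinking any single lateral gap below the full width rejects it.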

Meanwhile, the automatic parking assist apparatus 100 according to the embodiment may further include at least one camera 150.

FIG. 11 is a view showing parking space detection using the around view cameras of the automatic parking assist apparatus according to the embodiment.

Referring to FIG. 11, the automatic parking assist apparatus may include a plurality of cameras 150a, 150b, 150c, and 150d. In this case, the cameras 150 can be referred to as around view cameras 150a, 150b, 150c, and 150d.

A plurality of cameras 150 may be disposed on the left, rear, right, and front of the vehicle 700, respectively.

The left camera 150b may be disposed in a case surrounding the left side mirror. Alternatively, the left camera 150b may be disposed outside the case surrounding the left side mirror. Alternatively, the left camera 150b may be disposed in one area outside the left front door, the left rear door, or the left fender.

The right camera 150c may be disposed in a case surrounding the right side mirror. Alternatively, the right camera 150c may be disposed outside the case surrounding the right side mirror. Alternatively, the right camera 150c may be disposed in one area outside the right front door, the right rear door, or the right fender.

On the other hand, the rear camera 150d may be disposed in the vicinity of a rear license plate or a trunk switch.

The front camera 150a may be disposed in the vicinity of the emblem or in the vicinity of the radiator grill.

Each image photographed by the plurality of cameras 150a, 150b, 150c, and 150d is transmitted to the processor 170, and the processor 170 may synthesize the respective images to generate a vehicle periphery image.

The vehicle peripheral image may include a first image area photographed by the left camera 150b, a second image area photographed by the rear camera 150d, a third image area photographed by the right camera 150c, and a fourth image area photographed by the front camera 150a.

On the other hand, when a surround view image is generated from the plurality of cameras 150, a boundary portion between the respective image regions occurs. These boundary portions can be naturally displayed by image blending processing.
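The boundary blending mentioned above can be illustrated with a simple linear (alpha) blend over the overlapping columns of two adjacent views. This is only a sketch of the blending step under our own assumptions; a real around-view pipeline first warps each camera image into a common top-down coordinate frame, and all names here are illustrative:

```python
import numpy as np

def blend_boundary(left_img, right_img, overlap):
    """Alpha-blend the overlapping columns of two adjacent grayscale views.

    left_img, right_img: 2D arrays of shape (H, W); `overlap` columns on the
    right edge of left_img and the left edge of right_img cover the same area.
    """
    h, w = left_img.shape
    alpha = np.linspace(1.0, 0.0, overlap)  # fades the left image out
    blended = left_img[:, w - overlap:] * alpha + right_img[:, :overlap] * (1 - alpha)
    # keep the non-overlapping parts as-is, stitch the blended seam between
    return np.concatenate(
        [left_img[:, :w - overlap], blended, right_img[:, overlap:]], axis=1
    )
```

A linear ramp like this hides the hard seam between image regions; production systems often use multi-band blending for the same purpose.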

On the other hand, a boundary line can be displayed at the boundary of each of the plurality of images.

Meanwhile, the vehicle surroundings image may include the image of the vehicle 700. Where the image of the vehicle 700 may be an image generated by the processor 170.

The processor 170 processes the plurality of images to identify nearby vehicles, parking lines, roads, signs, hazardous areas, tunnels, and the like located around the vehicle 700.

In detail, the processor 170 may receive an image from the camera 150 and perform preprocessing. In particular, the processor 170 may perform various operations on the image, such as noise reduction, rectification, calibration, color enhancement, color space conversion (CSC), interpolation, camera 150 gain control, and the like. Accordingly, an image clearer than the image photographed by the camera 150 can be obtained.

Specifically, the processor 170 may separate the background and the foreground for at least one of the images.

Next, the processor 170 may detect an object based on the image segmentation.

Specifically, the processor 170 may detect an object for at least one of the images. For example, an object can be detected from the foreground separated by the image segmentation.

Next, the processor 170 classifies and verifies the separated object.

For this, the processor 170 may use a neural network identification method, a support vector machine (SVM) method, an AdaBoost identification method using Haar-like features, or a histograms of oriented gradients (HOG) method.

Finally, the processor 170 can compare the detected object with the objects stored in the memory 140 to identify the object.

For example, the processor 170 can identify nearby vehicles, parking lines, roads, signs, hazardous areas, tunnels, etc., located around the vehicle 700.
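As a rough illustration of the last of the listed methods, a histogram-of-oriented-gradients feature for an image patch can be computed as below. This NumPy-only sketch omits the block normalization and sliding-window search of production HOG detectors and is not the embodiment's own implementation:

```python
import numpy as np

def hog_descriptor(gray, cell=8, bins=9):
    """Minimal HOG-style feature for a grayscale patch (H, W multiples of cell).

    Per cell, gradient orientations (unsigned, 0-180 deg) are histogrammed,
    weighted by gradient magnitude; the cell histograms are concatenated.
    """
    gray = gray.astype(np.float64)
    gy, gx = np.gradient(gray)                    # image gradients
    mag = np.hypot(gx, gy)                        # gradient magnitude
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0  # unsigned orientation

    h, w = gray.shape
    ch, cw = h // cell, w // cell
    feat = np.zeros((ch, cw, bins))
    for i in range(ch):
        for j in range(cw):
            a = ang[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell]
            m = mag[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell]
            hist, _ = np.histogram(a, bins=bins, range=(0.0, 180.0), weights=m)
            feat[i, j] = hist
    return feat.ravel()
```

A feature vector like this would then be fed to a classifier (for example, the SVM mentioned above) trained on the object classes stored in the memory 140.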

The processor 170 can estimate the parking space in consideration of the objects thus confirmed. That is, the processor 170 can estimate the parking space in consideration of both the parking space measured by the laser sensor 110 and the parking line.

Referring to FIG. 11, it can be seen that the parking space S is calculated in consideration of the space between vehicles and the parking lines around the vehicle 700.

On the other hand, the vehicle surroundings image may be displayed through the display unit 180 of the vehicle 700 or the display unit 180 of the automatic parking assist apparatus 100.

In the embodiment, the automatic parking assist device 100 may further include a display unit 180 for displaying an around view image.

For example, the display unit 180 may display the surround view image synthesized by the processor 170. In addition, the display unit 180 may further display an image related to the operation of the automatic parking assist device 100. For example, the display unit 180 may display the space identified as a parking space in the surround view image, and may indicate the automatic parking path of the vehicle 700.

The display unit 180 may be a display device installed inside the vehicle 700. In addition, the display unit 180 may include a cluster or a head up display (HUD) inside the vehicle 700. Meanwhile, when the display unit 180 is the HUD, it may include a projection module that projects an image on the glass of the vehicle 700.

The automatic parking assist device 100 may further include an input unit 120 receiving user input related to automatic parking control.

The input unit 120 may be a plurality of buttons disposed within the vehicle 700 or a touch screen of the display unit 180. It is possible to turn on and operate the automatic parking assist device 100 through a plurality of buttons or a touch screen.

Further, the driver can input automatic parking execution and stop commands through the input unit 120. For example, the driver can confirm the parking space through the display unit 180 and execute the automatic parking through the input unit 120 when there is an indication that automatic parking is possible.

In addition, the driver can stop the automatic parking operation while the automatic parking path is displayed on the display unit 180 and the automatic parking is being executed.

That is, the automatic parking assist device 100 allows the driver to confirm through the display unit 180 whether the automatic parking is proceeding safely and to control the automatic parking operation, thereby further enhancing the safety of the vehicle 700.

FIG. 12 is a view showing a state in which the automatic parking assist apparatus according to another embodiment detects a parking space.

Referring to FIG. 12, the automatic parking assist apparatus 100 may further include an ultrasonic sensor 160 for detecting the parking space together with the laser sensor 110. At this time, the laser sensor 110 may not include the direction adjusting unit.

More specifically, the automatic parking assist apparatus 100 may further include an ultrasonic sensor 160 that emits ultrasonic waves toward the side of the vehicle 700 and receives ultrasonic waves reflected from an object. The processor 170 calculates the distance between the vehicle 700 and the object from the received reflected wave, and collects the calculated distances to identify the parking space of the vehicle 700.

The ultrasonic sensor 160 has a wide viewing angle, while the resolution is low. On the other hand, the laser sensor 110 has a narrow viewing angle, while the resolution is high.

In consideration of the above characteristics, the processor 170 can estimate the parking space in consideration of both the distance measured by the ultrasonic sensor 160 and the distance measured through the laser sensor 110.

Referring to FIG. 12, it can be seen that the space between the first obstacle O1 and the fifth obstacle O5 is wider than the total length of the vehicle 700, which is a sufficient space for parking.

However, when the processor 170 detects the parking space using only the ultrasonic sensor 160, only the first parking space K1 is detected, and if the first parking space K1 is smaller than the vehicle 700, automatic parking may be determined to be impossible.

When the processor 170 detects the parking space with the laser sensor 110, the space where no vehicle is present can be sensed precisely, so that the second parking space K2 is detected. At this time, it can be understood that the second parking space K2 is the first parking space K1 with the outermost spaces E measured by the laser sensor 110 added.

That is, the processor 170 detects the outermost regions E with the high-resolution laser sensor 110 and detects the remaining space with the wide-viewing-angle ultrasonic sensor 160, so that precision can be ensured in parking space detection while the safety of the vehicle 700 is secured.
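The fusion rule above can be sketched as follows; the function names, the safety margin, and the example distances are illustrative assumptions, not values from the embodiment:

```python
def fused_gap_length(k1_length, laser_margins):
    """Length of the fused parking space K2.

    Sketch of the idea in the text: the ultrasonic sensor yields a coarse
    gap K1, and the laser sensor refines the outermost regions E at both
    ends, whose lengths are added back to K1.
    """
    return k1_length + sum(laser_margins)

def can_park(gap_length, total_length, margin=0.3):
    """Parking is possible only if the fused gap exceeds the vehicle's
    total length plus a safety margin (the margin value is an assumption)."""
    return gap_length >= total_length + margin
```

With illustrative numbers, a 4.2 m ultrasonic gap that the laser extends by 0.4 m and 0.35 m at its ends becomes a 4.95 m fused gap, enough for a 4.5 m vehicle that the coarse gap alone would have rejected.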

FIG. 13 is a view showing laser irradiation direction control according to vehicle tilt information according to an embodiment of the present invention.

Referring to FIG. 13, the automatic parking assist device 100 may further include an interface unit 130.

The interface unit 130 may receive data related to the vehicle 700 or may transmit a signal processed or generated by the processor 170 to the outside. In the embodiment, the sensing unit is described as being included in the vehicle 700 itself, but a separate sensing unit may be included in the automatic parking assist device 100.

In detail, the interface unit 130 may transmit the parking space estimated by the processor 170 to the vehicle control unit 770 so that the vehicle control unit 770 may perform the automatic parking.

Also, the interface unit 130 can receive sensor information from the control unit 770 or the sensing unit 760.

On the other hand, among the sensor information, the vehicle direction information, the vehicle position information, the vehicle angle information, the vehicle speed information, the vehicle tilt information, and the like relating to the vehicle running can be referred to as vehicle running information.

Such vehicle driving information can be utilized as an element for controlling the laser sensor 110.

For example, the processor 170 may receive the vehicle tilt information through the interface unit 130, and may control the laser irradiation direction through the vehicle tilt information.

Referring to FIG. 13, it can be seen that the vehicle is inclined by a first angle θ1 with respect to the horizontal direction.

At this time, if the laser sensor 110 irradiates the laser in the original reference laser irradiation direction L0, the laser is directed toward the ground surface or into the air rather than toward a bumper, and it may be difficult to detect an accurate parking space.

To prevent this, the processor 170 may correct the laser irradiation direction through the direction adjusting unit 200 in consideration of the situation in which the vehicle is inclined by the first angle θ1. For example, the processor 170 may correct the laser irradiation direction by a second angle θ2 corresponding to the first angle θ1.
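The tilt compensation can be sketched geometrically as below. The embodiment only states that the irradiation direction is corrected by a second angle θ2 corresponding to θ1; here we assume the simplest form, subtracting the tilt from the level-ground aiming angle, and all names and the geometry are illustrative:

```python
import math

def irradiation_angle(target_height, sensor_height, lateral_distance,
                      vehicle_tilt_deg=0.0):
    """Laser aiming angle in degrees (negative = downward), tilt-compensated.

    The level-ground angle aims the beam at a bumper-height target at the
    given lateral distance; the vehicle tilt theta1 is then subtracted as
    the compensating theta2 (an assumed, simplest-case correction).
    """
    level = math.degrees(math.atan2(target_height - sensor_height,
                                    lateral_distance))
    return level - vehicle_tilt_deg
```

On level ground a sensor already at bumper height fires horizontally (0 degrees); if the vehicle pitches up by θ1 = 5 degrees, the direction adjusting unit would rotate the beam down by the same 5 degrees to keep it on the bumper line.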

In addition, the automatic parking assist apparatus 100 may further include an audio output unit 185, a power supply unit 190, a memory 140, and the like.

The memory 140 may store data for object identification. For example, when a predetermined object is detected in an image acquired through the camera 150, the memory 140 may store data for confirming, by a predetermined algorithm, what the object corresponds to.

The memory 140 may store data on traffic information. For example, when predetermined traffic information is detected in an image obtained through the camera 150, the memory 140 may store data for confirming, by a predetermined algorithm, what the traffic information corresponds to.

Meanwhile, the memory 140 may be various storage devices such as a ROM, a RAM, an EPROM, a flash drive, a hard drive, and the like in hardware.

The audio output unit 185 can output sound to the outside based on the audio signal processed by the processor 170. To this end, the audio output unit 185 may include at least one speaker.

An audio input unit (not shown) can receive a user's voice. For this purpose, a microphone may be provided. The received voice may be converted to an electrical signal and transmitted to the processor 170.

The power supply unit 190 can supply the power necessary for the operation of each component under the control of the processor 170. In particular, the power supply unit 190 can receive power from a battery or the like inside the vehicle.

FIG. 14 is an example of an internal block diagram of the vehicle.

The vehicle 700 may include a communication unit 710, an input unit 720, a sensing unit 760, an output unit 740, a vehicle driving unit 750, a memory 730, an interface unit 780, a control unit 770, a power supply unit 790, an automatic parking assist device 100, and an AVN device 400.

The communication unit 710 may include one or more modules that enable wireless communication between the vehicle 700 and the mobile terminal 600, between the vehicle 700 and the external server 510, or between the vehicle 700 and another vehicle 520. In addition, the communication unit 710 may include one or more modules that connect the vehicle 700 to one or more networks.

The communication unit 710 may include a broadcast receiving module 711, a wireless Internet module 712, a local area communication module 713, a location information module 714, and an optical communication module 715.

The broadcast receiving module 711 receives broadcast signals or broadcast-related information from an external broadcast management server through a broadcast channel. Here, the broadcast includes a radio broadcast or a TV broadcast.

The wireless Internet module 712 refers to a module for wireless Internet access, and may be embedded in the vehicle 700 or externally. The wireless Internet module 712 is configured to transmit and receive wireless signals in a communication network according to wireless Internet technologies.

Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced), and the wireless Internet module 712 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above. For example, the wireless Internet module 712 can exchange data with the external server 510 wirelessly. The wireless Internet module 712 can receive weather information and road traffic situation information (for example, TPEG (Transport Protocol Expert Group) information) from the external server 510.

The short-range communication module 713 is for short-range communication, and can support short-range communication using at least one of Bluetooth™, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), NFC (Near Field Communication), Wi-Fi (Wireless-Fidelity), Wi-Fi Direct, and Wireless USB (Wireless Universal Serial Bus) technologies.

The short-range communication module 713 may form short-range wireless communication networks to perform short-range communication between the vehicle 700 and at least one external device. For example, the short-range communication module 713 can exchange data with the mobile terminal 600 wirelessly. The short distance communication module 713 can receive weather information and traffic situation information of the road (for example, TPEG (Transport Protocol Expert Group)) from the mobile terminal 600. For example, when the user has boarded the vehicle 700, the user's mobile terminal 600 and the vehicle 700 can perform pairing with each other automatically or by execution of the user's application.

The position information module 714 is a module for obtaining the position of the vehicle 700, and a representative example thereof is a Global Positioning System (GPS) module. For example, when the vehicle 700 utilizes a GPS module, it can acquire the position of the vehicle 700 using a signal sent from the GPS satellite.

The optical communication module 715 may include a light emitting portion and a light receiving portion.

The light receiving section can convert a light signal into an electric signal and receive information. The light receiving unit may include a photodiode (PD) for receiving light. The photodiode converts light into an electrical signal. For example, the light receiving section can receive information of a preceding vehicle through light emitted from a light source included in the preceding vehicle.

The light emitting unit may include at least one light emitting element for converting an electric signal into an optical signal. Here, the light emitting element is preferably an LED (Light Emitting Diode). The optical transmitter converts the electrical signal into an optical signal and transmits it to the outside. For example, the optical transmitter can emit the optical signal to the outside through the blinking of the light emitting element corresponding to the predetermined frequency. According to an embodiment, the light emitting portion may include a plurality of light emitting element arrays. According to the embodiment, the light emitting portion can be integrated with the lamp provided in the vehicle 700. [ For example, the light emitting portion may be at least one of a headlight, a tail light, a brake light, a turn signal lamp, and a car light. For example, the optical communication module 715 can exchange data with another vehicle 520 via optical communication.

The input unit 720 may include a driving operation unit 721, a camera 150, a microphone 723, and a user input unit 724.

The driving operation means 721 receives a user input for driving the vehicle. The driving operation means 721 may include a steering input means 721, a shift input means 721, an acceleration input means 721 and a brake input means 721.

The steering input means 721 receives the input of the traveling direction of the vehicle 700 from the user. The steering input means 721 is preferably formed in a wheel shape so that steering input is possible by rotation. According to an embodiment, the steering input means 721 may be formed of a touch screen, a touch pad or a button.

The shift input means 721 receives inputs of parking (P), forward (D), neutral (N), and reverse (R) of the vehicle 700 from the user. The shift input means 721 is preferably formed in a lever shape. According to an embodiment, the shift input means 721 may be formed of a touch screen, a touch pad or a button.

The acceleration input means 721 receives an input for acceleration of the vehicle 700 from the user. The brake input means 721 receives an input for decelerating the vehicle 700 from the user. The acceleration input means 721 and the brake input means 721 are preferably formed in the form of a pedal. According to the embodiment, the acceleration input means 721 or the brake input means 721 may be formed of a touch screen, a touch pad, or a button.

The camera 150 may include an image sensor and an image processing module. The camera 150 may process still images or moving images obtained by an image sensor (e.g., CMOS or CCD). The image processing module processes the still image or moving image obtained through the image sensor, extracts necessary information, and transmits the extracted information to the control unit 770. Meanwhile, the vehicle 700 may include a camera 150 that photographs a vehicle front image or a vehicle peripheral image, and a monitoring unit 150 that photographs an in-vehicle image.

The monitoring unit 150 may acquire an image of the passenger. The monitoring unit 150 may obtain an image for biometrics of the passenger.

In FIG. 14, the monitoring unit 150 and the camera 150 are illustrated as being included in the input unit 720; however, as described above, the camera 150 may also be described as a component included in the automatic parking assist device 100.

The microphone 723 can process an external sound signal as electrical data. The processed data can be utilized variously according to functions performed in the vehicle 700. The microphone 723 can convert the voice command of the user into electrical data. The converted electrical data can be transmitted to the control unit 770.

Meanwhile, according to the embodiment, the camera 150 or the microphone 723 may be a component included in the sensing unit 760, not a component included in the input unit 720.

The user input unit 724 is for receiving information from a user. When information is inputted through the user input unit 724, the control unit 770 can control the operation of the vehicle 700 to correspond to the inputted information. The user input unit 724 may include touch input means or mechanical input means. According to an embodiment, the user input 724 may be located in one area of the steering wheel. In this case, the driver can operate the user input portion 724 with his / her finger while holding the steering wheel.

The sensing unit 760 senses a signal relating to the running of the vehicle 700 and the like. To this end, the sensing unit 760 may include a collision sensor, a wheel sensor, a speed sensor, a tilt sensor, a weight sensor, a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/reverse sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor by steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, an ultrasonic sensor, a radar, a lidar, and the like.

Thereby, the sensing unit 760 can acquire sensing signals for vehicle collision information, vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/reverse information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, vehicle interior humidity information, steering wheel rotation angle, and the like.

In addition, the sensing unit 760 may include an accelerator pedal sensor, a pressure sensor, an engine speed sensor, an air flow sensor (AFS), an intake air temperature sensor (ATS), a water temperature sensor (WTS), a throttle position sensor (TPS), a TDC sensor, a crank angle sensor (CAS), and the like.

The sensing unit 760 may include a biometric information sensing unit. The biometric information sensing unit senses and acquires the biometric information of the passenger. The biometric information may include fingerprint information, iris-scan information, retina-scan information, hand geometry information, facial recognition information, and voice recognition information. The biometric information sensing unit may include a sensor that senses the passenger's biometric information. Here, the monitoring unit 150 and the microphone 723 may operate as sensors. The biometric information sensing unit can acquire hand geometry information and face recognition information through the monitoring unit 150.

The output unit 740 is for outputting information processed by the control unit 770, and may include a display unit 741, a sound output unit 742, and a haptic output unit 743.

The display unit 741 can display information processed in the control unit 770. For example, the display unit 741 can display the vehicle-related information. Here, the vehicle-related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information for a driving guide to the vehicle driver. Further, the vehicle-related information may include vehicle state information indicating the current state of the vehicle or vehicle driving information related to the driving of the vehicle.

The display unit 741 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a 3D display, and an e-ink display.

The display unit 741 may have a mutual layer structure with the touch sensor or may be integrally formed to realize a touch screen. This touch screen may function as a user input 724 that provides an input interface between the vehicle 700 and the user and may provide an output interface between the vehicle 700 and the user. In this case, the display unit 741 may include a touch sensor that senses a touch with respect to the display unit 741 so that a control command can be received by a touch method. When a touch is made to the display unit 741, the touch sensor senses the touch, and the control unit 770 generates a control command corresponding to the touch based on the touch. The content input by the touch method may be a letter or a number, an instruction in various modes, a menu item which can be designated, and the like.

Meanwhile, the display unit 741 may include a cluster so that the driver can check the state information of the vehicle 700 or the driving information of the vehicle 700 while driving the vehicle. Clusters can be located on the dashboard. In this case, the driver can confirm the information displayed in the cluster while keeping the line of sight ahead of the vehicle 700.

Meanwhile, according to the embodiment, the display unit 741 may be implemented as a Head Up Display (HUD). When the display unit 741 is implemented as a HUD, information can be output through a transparent display provided in the windshield. Alternatively, the display unit 741 may include a projection module to output information through an image projected on the windshield.

The sound output unit 742 converts an electric signal from the control unit 770 into an audio signal and outputs the audio signal. For this purpose, the sound output unit 742 may include a speaker or the like. It is also possible for the sound output unit 742 to output a sound corresponding to the operation of the user input unit 724.

The haptic output unit 743 generates a tactile output. For example, the haptic output section 743 may operate to vibrate the steering wheel, the seat belt, and the seat so that the user can recognize the output.

The vehicle drive unit 750 can control the operation of various devices of the vehicle 700. The vehicle driving unit 750 may include a power source driving unit 751, a steering driving unit 752, a brake driving unit 753, a lamp driving unit 754, an air conditioning driving unit 755, a window driving unit 756, an airbag driving unit 757, a sunroof driving unit 758, and a suspension driving unit 759.

The power source driving unit 751 can perform electronic control on the power source in the vehicle 700.

For example, when a fossil fuel-based engine (not shown) is the power source, the power source drive unit 751 can perform electronic control on the engine. Thus, the output torque of the engine and the like can be controlled. When the power source is an engine, the speed of the vehicle 700 can be limited by limiting the engine output torque under the control of the control unit 770.

As another example, when the electric motor (not shown) is a power source, the power source driving unit 751 can perform control on the motor. Thus, the rotation speed, torque, etc. of the motor can be controlled.

The steering driver 752 may perform electronic control of the steering apparatus in the vehicle 700. Thereby, the traveling direction of the vehicle 700 can be changed.

The brake driver 753 can perform electronic control of a brake apparatus (not shown) in the vehicle 700. For example, it is possible to reduce the speed of the vehicle 700 by controlling the operation of the brakes disposed on the wheels. As another example, it is possible to adjust the traveling direction of the vehicle 700 to the left or right by differently operating the brakes respectively disposed on the left wheel and the right wheel.

The lamp driving unit 754 can control the turn-on / turn-off of the lamp disposed in the vehicle 700 or outside. Also, the intensity, direction, etc. of the light of the lamp can be controlled. For example, it is possible to perform control on a direction indicating lamp, a brake lamp, and the like.

The air conditioning driving unit 755 can perform electronic control of an air conditioner (not shown) in the vehicle 700. For example, when the temperature inside the vehicle 700 is high, the air conditioner can be operated so that cool air is supplied into the vehicle 700.

The window driver 756 may perform electronic control of the window apparatus in the vehicle 700. For example, it is possible to control the opening or closing of the left and right windows on the sides of the vehicle 700.

The airbag driving unit 757 can perform electronic control of the airbag apparatus in the vehicle 700. For example, in case of danger, the airbag can be controlled to deploy.

The sunroof driving unit 758 may perform electronic control of a sunroof apparatus (not shown) in the vehicle 700. For example, the opening or closing of the sunroof can be controlled.

The suspension driving unit 759 can perform electronic control of a suspension apparatus (not shown) in the vehicle 700. For example, when the road surface is uneven, it can control the suspension apparatus so as to reduce the vibration of the vehicle 700.

The memory 730 is electrically connected to the control unit 770. The memory 730 may store basic data for each unit, control data for controlling the operation of each unit, and input/output data. In hardware, the memory 730 may be any of various storage devices such as a ROM, a RAM, an EPROM, a flash drive, and a hard drive. The memory 730 may store various data for the overall operation of the vehicle 700, such as programs for the processing or control of the control unit 770.

The interface unit 780 may serve as a pathway to various kinds of external devices connected to the vehicle 700. For example, the interface unit 780 may include a port that can be connected to the mobile terminal 600, and may be connected to the mobile terminal 600 through the port. In this case, the interface unit 780 can exchange data with the mobile terminal 600.

Meanwhile, the interface unit 780 may serve as a channel for supplying electrical energy to the connected mobile terminal 600. When the mobile terminal 600 is electrically connected to the interface unit 780, the interface unit 780 provides electrical energy supplied from the power supply unit 790 to the mobile terminal 600 under the control of the control unit 770.

The control unit 770 can control the overall operation of each unit in the vehicle 700. The control unit 770 may be referred to as an ECU (Electronic Control Unit).

In hardware, the controller 770 may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions.

The control unit 770 can take over the role of the processor 170 described above. That is, the processor 170 of the automatic parking assist device 100 may be implemented directly by the control unit 770 of the vehicle 700. In such an embodiment, the automatic parking assist device 100 may be understood to refer collectively to some configurations of the vehicle 700.

Alternatively, the control unit 770 may control the above configurations so as to transmit the information requested by the processor 170.

The power supply unit 790 can supply the power necessary for the operation of each component under the control of the control unit 770. In particular, the power supply unit 790 can receive power from a battery (not shown) or the like inside the vehicle 700.

The AVN (Audio Video Navigation) device 400 can exchange data with the control unit 770. The control unit 770 can receive navigation information from the AVN device 400 or from a separate navigation device (not shown). Here, the navigation information may include set destination information, route information according to the destination, map information related to vehicle driving, and position information of the vehicle 700.

The features, structures, effects, and the like described in the foregoing embodiments are included in at least one embodiment of the present invention and are not necessarily limited to one embodiment. Further, the features, structures, effects, and the like illustrated in each embodiment may be combined or modified for other embodiments by those skilled in the art to which the embodiments belong. Accordingly, such combinations and modifications should be construed as falling within the scope of the present invention.

While the present invention has been shown and described with reference to exemplary embodiments thereof, these embodiments are illustrative only and do not limit the present invention; those skilled in the art will appreciate that various modifications and applications not illustrated above are possible without departing from the essential characteristics of the embodiments. For example, each component specifically shown in the embodiments may be modified when implemented, and differences related to such modifications and applications should be construed as falling within the scope of the present invention.
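Claim 1 below recites a processor that calculates vehicle-to-object distances from laser reflection signals, collects them, and determines a parking space. A minimal sketch of one common way to realize this, assuming time-of-flight ranging and gap detection over the collected lateral distances; the function names, the sample format, and the thresholds are illustrative assumptions, not the patent's implementation:

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance to the reflecting object from the laser round-trip time."""
    return C * round_trip_s / 2.0

def find_gaps(samples, min_width, range_threshold):
    """Find candidate parking spaces from collected distance measurements.

    samples: list of (longitudinal_position_m, lateral_distance_m) pairs
    collected while driving past parked vehicles. Returns (start, end)
    spans where the lateral distance exceeds range_threshold (i.e. no
    obstacle beside the vehicle) for at least min_width metres.
    """
    gaps, start = [], None
    for pos, dist in samples:
        if dist > range_threshold:
            if start is None:
                start = pos  # gap opens here
        else:
            if start is not None and pos - start >= min_width:
                gaps.append((start, pos))
            start = None
    # Handle a gap that is still open at the last sample.
    if start is not None and samples and samples[-1][0] - start >= min_width:
        gaps.append((start, samples[-1][0]))
    return gaps
```

For example, sweeping past two parked cars with a 4 m opening between them would yield one span wide enough for the vehicle, which could then be offered as the parking space.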

Claims (17)

An automatic parking assist device for controlling a vehicle to be automatically parked, the device comprising:
a laser sensor including a laser output section for irradiating a laser signal toward a side of the vehicle, and a laser detecting section for receiving a reflection signal reflected from an object;
an irradiation direction control unit for adjusting the laser signal irradiation direction of the laser sensor; and
a processor for calculating a distance between the vehicle and the object from the reflection signal, collecting the calculated distances, and determining a parking space for the vehicle.
The automatic parking assist device according to claim 1,
wherein the irradiation direction control unit adjusts the laser signal irradiation direction by applying power to a driving unit so that the laser output section is repeatedly moved in the vertical direction.
The automatic parking assist device according to claim 1,
wherein the irradiation direction control unit adjusts the laser signal irradiation direction by applying power to a driving unit so that the laser output section is repeatedly moved in the diagonal direction.
The automatic parking assist device according to claim 2 or 3,
wherein the processor scans a 3D space from the reflection signals of the repeatedly irradiated laser signals to identify the parking space of the vehicle.
The automatic parking assist device according to claim 1, wherein the irradiation direction control unit comprises:
a support portion for supporting the laser sensor;
an engaging portion for coupling the laser sensor to the support portion so as to be movable with respect thereto; and
a link portion for transmitting the vibration of the vehicle to the laser sensor to move the laser sensor.
The automatic parking assist device according to claim 5,
wherein the engaging portion includes a hinge shaft for vertically rotating the laser sensor within a predetermined angle, and
the link portion includes an elastic member connecting one end of the laser sensor and one surface of the vehicle.
The automatic parking assist device according to claim 1,
further comprising an ultrasonic sensor for emitting an ultrasonic wave toward a side of the vehicle and receiving a reflected wave reflected from the object,
wherein the processor calculates a distance between the vehicle and the object from the reflected wave, collects the calculated distances, and identifies the parking space of the vehicle.
The automatic parking assist device according to claim 7,
wherein the processor identifies the parking space of the vehicle by giving priority to the space calculated from the laser sensor.
The automatic parking assist device according to claim 1,
further comprising an around view camera for photographing the front, rear, left, and right directions of the vehicle,
wherein the processor synthesizes the images photographed in the front, rear, left, and right directions to generate a planar image of the vehicle viewed from above (a top view).
The automatic parking assist device according to claim 9,
further comprising a display unit for outputting the planar image,
wherein the processor controls the display unit to further display, on the planar image, the parking space identified from the sensing of the laser sensor.
The automatic parking assist device according to claim 10,
wherein the processor controls the display unit to further display the automatic parking path of the vehicle on the planar image,
the device further comprising an input section for receiving an automatic parking stop input from a user.
The automatic parking assist device according to claim 1,
further comprising a camera for photographing the front or the surroundings of the vehicle to acquire an image,
wherein the processor detects a parking lot marking and a parking line from the image.
The automatic parking assist device according to claim 12,
wherein, when a parking line is detected in the parking space, the processor performs control so that the vehicle is automatically parked along the parking line.
The automatic parking assist device according to claim 2,
further comprising an interface unit for receiving tilt information of the vehicle from a gyro sensor of the vehicle.
The automatic parking assist device according to claim 14,
wherein the processor adjusts the irradiation direction of the laser signal of the laser sensor through the irradiation direction control unit according to the tilt information of the vehicle.
The automatic parking assist device according to claim 14,
wherein the processor identifies the parking space of the vehicle through the reflection signal and the vehicle tilt information.
A vehicle comprising the automatic parking assist device according to any one of claims 1 to 16.
KR1020150083334A 2015-06-12 2015-06-12 Driver Assistance Apparatus and Vehicle Having The Same KR20160146280A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150083334A KR20160146280A (en) 2015-06-12 2015-06-12 Driver Assistance Apparatus and Vehicle Having The Same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150083334A KR20160146280A (en) 2015-06-12 2015-06-12 Driver Assistance Apparatus and Vehicle Having The Same

Publications (1)

Publication Number Publication Date
KR20160146280A true KR20160146280A (en) 2016-12-21

Family

ID=57735122

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150083334A KR20160146280A (en) 2015-06-12 2015-06-12 Driver Assistance Apparatus and Vehicle Having The Same

Country Status (1)

Country Link
KR (1) KR20160146280A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180078984A * 2016-12-30 2018-07-10 기아자동차주식회사 Automatically parking system and automatically parking method
CN110095769A * 2018-01-29 2019-08-06 杭州海康汽车技术有限公司 A kind of method for detecting parking stalls, device and electronic equipment
CN110095769B * 2018-01-29 2020-07-31 杭州海康汽车技术有限公司 Parking space detection method and device and electronic equipment
US10392009B2 2015-08-12 2019-08-27 Hyundai Motor Company Automatic parking system and automatic parking method
US11691619B2 2015-08-12 2023-07-04 Hyundai Motor Company Automatic parking system and automatic parking method


Similar Documents

Publication Publication Date Title
US10131347B2 (en) Parking assistance apparatus and vehicle having the same
KR101832466B1 (en) Parking Assistance Apparatus and Vehicle Having The Same
KR101750178B1 (en) Warning Method Outside Vehicle, Driver Assistance Apparatus For Executing Method Thereof and Vehicle Having The Same
US10200656B2 (en) Display apparatus and vehicle including the same
KR101916993B1 (en) Display apparatus for vehicle and control method thereof
KR101838187B1 (en) Display Apparatus and Vehicle Having The Same
KR101838967B1 (en) Convenience Apparatus for Vehicle and Vehicle
US20180093619A1 (en) Vehicle display apparatus and vehicle having the same
CN107380056A (en) Vehicular illumination device and vehicle
KR20170058188A (en) Driver Assistance Apparatus and Vehicle Having The Same
KR101790426B1 (en) Apparatus for automatic parking and vehicle having the same
KR101962348B1 (en) Driver Assistance Apparatus and Vehicle Having The Same
KR102470298B1 (en) A method of correcting cameras and device thereof
KR20160147557A (en) Automatic parking apparatus for vehicle and Vehicle
KR102077575B1 (en) Vehicle Driving Aids and Vehicles
KR20160146280A (en) Driver Assistance Apparatus and Vehicle Having The Same
KR101929294B1 (en) Parking Assistance Apparatus and Vehicle Having The Same
KR101897350B1 (en) Driver Assistance Apparatus
KR20180069646A (en) Driver assistance apparatus
US20210333869A1 (en) Vehicle control device and vehicle control method
KR20170110800A (en) Navigation Apparutaus and Driver Assistance Apparatus Having The Same
KR20170069096A (en) Driver Assistance Apparatus and Vehicle Having The Same
KR101737236B1 (en) Driver Assistance Apparatus and Vehicle Having The Same
KR101888259B1 (en) Vehicle Assistance Apparatus and Vehicle Having The Same
US20210323469A1 (en) Vehicular around view image providing apparatus and vehicle