KR20170055334A - Apparatus and method for assisting a driver based on difficulty level of parking - Google Patents

Apparatus and method for assisting a driver based on difficulty level of parking

Info

Publication number
KR20170055334A
KR20170055334A (application number KR1020150158377A)
Authority
KR
South Korea
Prior art keywords
parking
vehicle
candidate
zone
parking zone
Prior art date
Application number
KR1020150158377A
Other languages
Korean (ko)
Other versions
KR101832224B1 (en)
Inventor
백일주
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc. (엘지전자 주식회사)
Priority to KR1020150158377A
Publication of KR20170055334A
Application granted
Publication of KR101832224B1

Classifications

    • B60W 30/06: Automatic manoeuvring for parking
    • B60R 21/0134: Electrical circuits for triggering passive safety arrangements (e.g. airbags, safety belt tighteners) including means for detecting imminent contact with an obstacle, e.g. using radar systems
    • B60W 30/08: Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W 40/02: Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • G08G 1/168: Driving aids for parking, e.g. acoustic or visual feedback on parking space
    • B60W 2050/0082: Automatic parameter input, automatic initialising or calibrating means for initialising the control system
    • B60W 2420/42: Image sensing, e.g. optical camera
    • B60W 2420/52: Radar, Lidar
    • B60W 2420/54: Audio sensitive means, e.g. ultrasound
    • B60Y 2300/06: Automatic manoeuvring for parking

Abstract

The present invention relates to an apparatus and a method for assisting a driver when parking a vehicle. According to one embodiment of the present invention, the driver assistance apparatus includes: a sensing unit configured to generate sensing information corresponding to the space around the vehicle; and a control unit connected to the sensing unit. In a parking assist mode, the control unit detects candidate parking zones present in the surrounding space based on the sensing information provided by the sensing unit, calculates a score indicating the parking difficulty of each candidate parking zone, and sets one of the candidate parking zones as the target parking zone based on the scores. Thus, when there are several parking zones around the vehicle in which it could be parked, the present invention can find and guide the driver to the zone that is easiest to park in, based on the parking difficulty of each zone.

Description

[0001] APPARATUS AND METHOD FOR ASSISTING A DRIVER BASED ON DIFFICULTY LEVEL OF PARKING

BACKGROUND OF THE INVENTION

1. Field of the Invention

[0002] The present invention relates to an apparatus and a method for assisting a driver, and more particularly, to an apparatus and a method for assisting a driver in parking a vehicle based on the parking difficulty of each parking zone.

A vehicle is a device that drives wheels to transport people or cargo from one place to another. For example, two-wheeled vehicles such as motorcycles, four-wheeled vehicles such as sedans, and even trains fall under this definition.

To increase the safety and convenience of vehicle users, the development of technologies connecting various sensors and electronic devices to the vehicle has accelerated. In particular, systems providing various functions developed for driving convenience (e.g., smart cruise control, lane keeping assistance) are installed in vehicles. This has made possible so-called autonomous driving, in which the vehicle travels on the road by itself, taking the external environment into account, without the driver's operation.

On the other hand, parking is one of the maneuvers a driver performs most frequently, yet it is also one of the most difficult. In particular, when the parking space is narrow or obstacles are nearby, parking can take a long time. To alleviate this problem, driver assistance devices providing an AVM (Around View Monitoring) image as well as a parking assistance function are installed in vehicles.

However, conventional driver assistance apparatuses providing a parking assistance function have a limitation: even when there are multiple parking spaces around the vehicle, information on only one parking zone, selected according to a predetermined standard, is provided to the driver. For example, a conventional parking assist function performs automatic parking only for the first parking zone found, without regard to the parking difficulty of the various parking zones.

SUMMARY OF THE INVENTION

It is an object of the present invention to provide an apparatus and method for finding, and guiding the driver to, the parking zone that is easiest to park in, based on the parking difficulty of each of a plurality of parking zones.

The problems addressed by the present invention are not limited to those mentioned above; other problems not mentioned will be clearly understood by those skilled in the art from the following description.

According to an aspect of the present invention, there is provided an apparatus for assisting a driver in parking, comprising: a sensing unit for generating sensing information corresponding to the space around a vehicle; and a control unit connected to the sensing unit. In the parking assist mode, the control unit detects candidate parking zones in the surrounding space based on the sensing information provided by the sensing unit, calculates a score indicating the parking difficulty of each candidate parking zone, and, based on the scores, sets one of the candidate parking zones as the target parking zone.

Also, the sensing unit may include at least one of a camera, a radar, a lidar, and an ultrasonic sensor, and the sensing information may include at least one of an external image of the vehicle provided by the camera and obstacle information detected by the radar, the lidar, or the ultrasonic sensor.

Also, the control unit may filter, from among all the parking zones in the surrounding space, those satisfying at least one predetermined condition, and detect the candidate parking zones from among the filtered parking zones.

The at least one condition may include at least one of (i) a first condition that no other vehicle is already parked in the zone, and (ii) a second condition that entry of the vehicle into the zone is not blocked by an obstacle.
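For illustration only, the candidate-zone filtering described above can be sketched as follows; the `ParkingZone` type and its field names are assumptions made for this example and do not appear in the disclosure.

```python
from dataclasses import dataclass

# Hypothetical data model: the disclosure does not name these fields.
@dataclass
class ParkingZone:
    zone_id: int
    occupied: bool       # fails the first condition if True (another vehicle parked)
    entry_blocked: bool  # fails the second condition if True (entry blocked by obstacle)

def detect_candidate_zones(zones):
    """Keep only zones satisfying both predetermined conditions:
    (i) no other parked vehicle, (ii) entry not blocked by an obstacle."""
    return [z for z in zones if not z.occupied and not z.entry_blocked]
```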

The control unit may calculate a parking locus for a candidate parking zone based on at least one of the size, shape, and position of the candidate parking zone and the size, shape, and position of obstacles in the surrounding space, and may calculate the score based on the parking locus.

The control unit may determine the parking mode of the vehicle for a candidate parking zone based on at least one of the size, shape, and position of the candidate parking zone and the size, shape, and position of obstacles in the surrounding space, and may calculate the parking locus for the candidate parking zone based on the parking mode.
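As a sketch of how a parking mode might be chosen from zone geometry (the disclosure gives no concrete rules, so the thresholds and mode names below are assumptions):

```python
def choose_parking_mode(zone_width_m, zone_depth_m, vehicle_length_m, vehicle_width_m):
    """Pick a parking mode from the candidate zone's size and shape.
    Thresholds are illustrative assumptions, not values from the disclosure."""
    if zone_depth_m >= vehicle_length_m and zone_width_m >= vehicle_width_m:
        return "perpendicular"   # deep bay: drive or back straight in
    if zone_width_m >= vehicle_length_m:
        return "parallel"        # long, shallow bay along the lane
    return "infeasible"          # the vehicle does not fit either way
```

The parking locus for the zone would then be planned according to the returned mode.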

Further, the control unit may calculate the score based on the length of the parking locus, the number of left-right steering reversals, and the number of forward-backward switches.
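A minimal sketch of such a score, combining the three factors named above; the weights are purely illustrative assumptions, not disclosed values.

```python
def parking_difficulty_score(locus_length_m, steering_reversals, gear_reversals,
                             w_len=1.0, w_steer=2.0, w_gear=3.0):
    """Weighted sum of parking-locus length, left-right steering reversals,
    and forward-backward gear switches; a lower score means easier parking."""
    return (w_len * locus_length_m
            + w_steer * steering_reversals
            + w_gear * gear_reversals)
```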

Also, the control unit may set the candidate parking zone with the lowest calculated score as the target parking zone.

When two or more candidate parking zones share the lowest score, the control unit may set the candidate parking zone corresponding to a user input as the target parking zone.
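Taken together, the two selection rules above (lowest score wins; ties defer to user input) can be sketched as:

```python
def select_target_zone(scores, user_choice=None):
    """scores: mapping of candidate zone id -> difficulty score (lower is easier).
    A unique lowest-scoring zone becomes the target; when two or more zones
    share the lowest score, the zone indicated by user input is used.
    Falling back to the first tied zone when no user input is given is an
    assumption made for this sketch, not part of the disclosure."""
    best = min(scores.values())
    tied = [zid for zid, s in scores.items() if s == best]
    if len(tied) == 1:
        return tied[0]
    return user_choice if user_choice in tied else tied[0]
```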

The apparatus may further include a display unit for displaying images, and the control unit may display an image guiding the candidate parking zones and the target parking zone on the display unit.

In addition, the control unit may select the parking assist mode when a predetermined event occurs.

In addition, the predetermined event may include at least one of: reception of a user input instructing selection of the parking assist mode, entry of the vehicle into a parking lot, and arrival of the vehicle at a predetermined destination.
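The event-triggered entry into the parking assist mode amounts to a simple disjunction over the listed events; a sketch, with parameter names assumed for the example:

```python
def should_enter_parking_assist(user_requested: bool,
                                entered_parking_lot: bool,
                                arrived_at_destination: bool) -> bool:
    """The parking assist mode is selected when any predetermined event
    occurs: a user input instructing its selection, entry of the vehicle
    into a parking lot, or arrival at a predetermined destination."""
    return user_requested or entered_parking_lot or arrived_at_destination
```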

According to another aspect of the present invention, there is provided a method of assisting a driver in parking a vehicle, the method comprising: entering a parking assist mode; receiving sensing information corresponding to the space around the vehicle; detecting candidate parking zones present in the surrounding space; calculating a score indicating the parking difficulty of each candidate parking zone; and setting, based on the scores, one of the candidate parking zones as the target parking zone.

The step of calculating the score may include calculating a parking locus for each candidate parking zone based on at least one of the size, shape, and position of the candidate parking zone and the size, shape, and position of obstacles in the surrounding space, and calculating the score based on the length of the parking locus, the number of left-right steering reversals, and the number of forward-backward switches.

In addition, the step of setting the target parking zone may set the candidate parking zone with the lowest calculated score as the target parking zone.

The details of other embodiments are included in the detailed description and drawings.

Effects of the apparatus and method for assisting the driver based on parking difficulty according to the present invention are as follows.

According to at least one embodiment of the present invention, parking difficulty information can be provided for a plurality of available parking zones, helping the driver select where to park the vehicle.

Further, according to at least one embodiment of the present invention, the parking zone can be determined in consideration of roads or markings of facilities (e.g., an exit direction) included in the external image of the vehicle.

According to at least one embodiment of the present invention, when a parking zone is detected, the parking mode and the parking locus are calculated or changed based on surrounding obstacle information as well as information about the vehicle and the parking zone, which can improve the convenience of getting into and out of the vehicle.

The effects of the present invention are not limited to those mentioned above; other effects not mentioned will be clearly understood by those skilled in the art from the claims.

FIG. 1 shows a block diagram of a vehicle according to an embodiment of the present invention.
FIGS. 2 and 3 are drawings referred to in describing the external camera of FIG. 1.
FIG. 4 shows an exemplary top view of the vehicle described above with reference to FIG. 1.
FIG. 5 shows an example of an internal block diagram of the control unit shown in FIG. 1.
FIGS. 6A and 6B are views referred to in describing the operation of the control unit shown in FIG. 5.
FIG. 7 shows a flowchart of an exemplary process performed by the driver assistance apparatus according to an embodiment of the present invention to assist the driver in parking the vehicle.
FIG. 8 shows an exemplary interior view of a vehicle in accordance with an embodiment of the present invention.
FIGS. 9A to 9D show an exemplary operation in which a driver assistance apparatus according to an embodiment of the present invention selects a target parking zone from among a plurality of parking zones around the vehicle.
FIG. 10 shows an exemplary operation in which a driver assistance apparatus according to an embodiment of the present invention calculates a score for each of a plurality of candidate parking zones around the vehicle.
FIG. 11 shows an exemplary operation in which a driver assistance apparatus according to an embodiment of the present invention calculates a score for each of a plurality of candidate parking spaces around the vehicle.
FIGS. 12A and 12B illustrate an exemplary operation in which a driver assistance apparatus according to an embodiment of the present invention selects a target parking zone from among a plurality of parking zones around the vehicle.
FIGS. 13A to 13C show an exemplary operation in which a driver assistance apparatus according to an embodiment of the present invention selects a target parking zone from among a plurality of parking zones around the vehicle.
FIGS. 14A and 14B show an exemplary operation in which a driver assistance apparatus according to an embodiment of the present invention selects a target parking zone from among a plurality of parking zones around the vehicle.
FIGS. 15A to 15D illustrate an exemplary operation in which a driver assistance apparatus according to an embodiment of the present invention selects a target parking zone from among a plurality of parking zones around the vehicle.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals designate identical or similar elements and redundant description is omitted. The suffixes "module" and "unit" for components used in the following description are given or used interchangeably merely for ease of drafting the specification and do not by themselves have distinct meanings or roles. In describing the embodiments, detailed description of related art is omitted where it might obscure the gist of the embodiments disclosed herein. The accompanying drawings are intended only to aid understanding of the embodiments disclosed herein; the technical idea disclosed in this specification is not limited by them and covers all modifications, equivalents, and alternatives falling within its spirit and scope.

Terms including ordinals, such as "first" and "second", may be used to describe various elements, but the elements are not limited by these terms. These terms are used only to distinguish one component from another.

When an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to that element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, there are no intervening elements. Similarly, one component "controlling" another encompasses not only the component directly controlling the other but also controlling it through the mediation of a third component, and one element "providing" information or signals to another encompasses both providing them directly and providing them through the mediation of a third element.

The singular expressions include plural expressions unless the context clearly dictates otherwise.

In the present application, terms such as "comprises" and "having" specify the presence of stated features, numbers, steps, operations, elements, components, or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

The vehicle described in this specification may include an internal combustion engine vehicle having an engine as a power source, a hybrid vehicle having both an engine and an electric motor as power sources, and an electric vehicle having an electric motor as a power source.

FIG. 1 shows a block diagram of a vehicle 100 according to an embodiment of the present invention.

The vehicle 100 includes a communication unit 110, an input unit 120, a memory 130, an output unit 140, a vehicle driving unit 150, a sensing unit 160, a control unit 170, and an interface unit 180.

The communication unit 110 may include one or more modules that enable wireless communication between the vehicle 100 and an external device (e.g., portable terminal, external server, other vehicle). In addition, the communication unit 110 may include one or more modules that connect the vehicle 100 to one or more networks.

The communication unit 110 may include a broadcast receiving module 111, a wireless Internet module 112, a local area communication module 113, a location information module 114, and an optical communication module 115.

The broadcast receiving module 111 receives broadcast signals or broadcast-related information from an external broadcast management server through a broadcast channel. Here, the broadcast includes a radio broadcast or a TV broadcast.

The wireless Internet module 112 refers to a module for wireless Internet access, and may be built in or externally mounted on the vehicle 100. The wireless Internet module 112 is configured to transmit and receive wireless signals in a communication network according to wireless Internet technologies.

Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (Worldwide Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced); the wireless Internet module 112 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above. For example, the wireless Internet module 112 may exchange data wirelessly with an external server, and can receive weather information and road traffic situation information (for example, TPEG (Transport Protocol Expert Group)) from an external server.

The short-range communication module 113 is for short-range communication and may support it using at least one of Bluetooth™, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), NFC (Near Field Communication), Wi-Fi (Wireless-Fidelity), Wi-Fi Direct, and Wireless USB (Wireless Universal Serial Bus) technologies.

The short-range communication module 113 may form short-range wireless communication networks to perform short-range communication between the vehicle 100 and at least one external device. For example, the short-range communication module 113 can wirelessly exchange data with an occupant's portable terminal, and can receive weather information and road traffic situation information (for example, TPEG (Transport Protocol Expert Group)) from a portable terminal or an external server. For example, when a user boards the vehicle 100, the user's portable terminal and the vehicle 100 can pair with each other automatically or upon execution of an application by the user.

The position information module 114 is a module for acquiring the position of the vehicle 100; a representative example is the GPS (Global Positioning System) module. For example, when the vehicle uses a GPS module, it can acquire its position using signals sent from GPS satellites.

The optical communication module 115 may include a light emitting portion and a light receiving portion.

The light receiving section converts a light signal into an electric signal to receive information. It may include a photodiode (PD) for receiving light; the photodiode converts light into an electrical signal. For example, the light receiving section can receive information from a preceding vehicle through light emitted from a light source included in that vehicle.

The light emitting unit may include at least one light emitting element for converting an electric signal into an optical signal; the light emitting element is preferably an LED (Light Emitting Diode). The light emitting unit converts the electrical signal into an optical signal and transmits it to the outside, for example by blinking the light emitting element at a predetermined frequency. According to an embodiment, the light emitting unit may include a plurality of light emitting element arrays, or may be integrated with a lamp provided in the vehicle 100. For example, the light emitting unit may be at least one of a headlight, a tail light, a brake light, a turn signal lamp, and a car light. For example, the optical communication module 115 can exchange data with other vehicles through optical communication.

The input unit 120 may include a driving operation means 121, a camera 122, a microphone 123, and a user input unit 124.

The driving operation means 121 receives a user input for driving the vehicle 100. The driving operation means 121 may include a steering input means 121a, a shift input means 121b, an acceleration input means 121c and a brake input means 121d.

The steering input means 121a receives a travel-direction input of the vehicle 100 from the user. The steering input means 121a may take the form of a steering wheel. According to an embodiment, the steering input means 121a may instead be formed as a touch screen, a touch pad, or a button.

The shift input means 121b receives inputs of parking (P), forward (D), neutral (N), and reverse (R) of the vehicle 100 from the user. The shift input means 121b is preferably formed in a lever shape. According to an embodiment, the shift input means 121b may be formed of a touch screen, a touch pad, or a button.

The acceleration input means 121c receives an input for acceleration of the vehicle 100 from the user. The brake input means 121d receives an input for decelerating the vehicle 100 from the user. The acceleration input means 121c and the brake input means 121d are preferably formed in the form of a pedal. According to the embodiment, the acceleration input means 121c or the brake input means 121d may be formed of a touch screen, a touch pad, or a button.

The camera 122 is disposed at one side of the interior of the vehicle 100 to generate an indoor image of the vehicle 100. For example, the camera 122 may be disposed at various positions in the vehicle 100, such as the dashboard surface, the roof surface, or the rear view mirror, to photograph occupants of the vehicle 100. In this case, the camera 122 may generate an indoor image of an area including the driver's seat of the vehicle 100, or of an area including both the driver's seat and the front passenger seat. The indoor image generated by the camera 122 may be a two-dimensional and/or three-dimensional image. To generate a three-dimensional image, the camera 122 may include at least one of a stereo camera, a depth camera, and a three-dimensional laser scanner. The camera 122 can provide the indoor image it generates to the control unit 170, which is functionally coupled to it.

The control unit 170 analyzes the indoor image provided by the camera 122 and can detect various objects. For example, the control unit 170 can detect the gaze and/or gesture of the driver from the portion of the indoor image corresponding to the driver's seat area. As another example, it can detect the gaze and/or gesture of a passenger from the portion corresponding to the indoor area excluding the driver's seat. Of course, the gazes and/or gestures of the driver and passengers may be detected simultaneously.

The microphone 123 can process an external acoustic signal into electrical data. The processed data can be utilized variously according to functions performed in the vehicle 100. The microphone 123 can convert the voice command of the user into electrical data. The converted electrical data may be transmitted to the control unit 170.

The camera 122 or the microphone 123 may be a component included in the sensing unit 160 rather than a component included in the input unit 120.

The user input unit 124 receives information from the user. When information is input through the user input unit 124, the control unit 170 may control the operation of the vehicle 100 to correspond to the input information. The user input unit 124 may include touch input means or mechanical input means. According to an embodiment, the user input unit 124 may be located in one area of the steering wheel, in which case the driver can operate it with a finger while holding the steering wheel.

The input unit 120 may include a plurality of buttons or a touch sensor. It is also possible to perform various input operations through a plurality of buttons or touch sensors.

The sensing unit 160 senses signals related to the driving of the vehicle 100 and the like. To this end, the sensing unit 160 may include a collision sensor, a steering sensor, a speed sensor, a tilt sensor, a weight sensor, a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/backward sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, an interior humidity sensor, an ultrasonic sensor, an infrared sensor, a radar, and a lidar.

Accordingly, the sensing unit 160 can acquire sensing signals for vehicle collision information, vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, vehicle interior humidity information, steering wheel rotation angle, and the like. In addition, the control unit 170 can generate control signals for acceleration, deceleration, direction change, and the like of the vehicle 100 based on external environment information obtained by at least one of the camera, the ultrasonic sensor, the infrared sensor, the radar, and the lidar. Here, the external environment information may be information related to various objects located within a predetermined distance from the moving vehicle 100. For example, the external environment information may include the number of obstacles located within 100 m of the vehicle 100, the distance to each obstacle, the size of each obstacle, the type of each obstacle, and the like.
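A sketch of how such external environment information might be summarized from detected obstacles; the `Obstacle` record and its field names are assumptions made for this illustration.

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float  # distance from the vehicle
    size_m: float      # characteristic size of the obstacle
    kind: str          # e.g. "vehicle", "pillar", "pedestrian"

def external_environment_info(obstacles, max_range_m=100.0):
    """Summarize obstacles within the predetermined distance (100 m in the
    text above): their number, the nearest distance, and the types present."""
    in_range = [o for o in obstacles if o.distance_m <= max_range_m]
    return {
        "count": len(in_range),
        "nearest_m": min((o.distance_m for o in in_range), default=None),
        "kinds": sorted({o.kind for o in in_range}),
    }
```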

The sensing unit 160 may further include an accelerator pedal sensor, a pressure sensor, an engine speed sensor, an air flow sensor (AFS), an intake air temperature sensor (ATS), a water temperature sensor (WTS), a throttle position sensor (TPS), a TDC sensor, a crank angle sensor (CAS), and the like.

The sensing unit 160 may include a biometric information sensing unit. The biometric information sensing unit senses and acquires biometric information of a passenger. The biometric information may include fingerprint information, iris-scan information, retina-scan information, hand geometry information, facial recognition information, and voice recognition information. The biometric information sensing unit may include a sensor that senses the passenger's biometric information. Here, the camera 122 and the microphone 123 can operate as such sensors. The biometric information sensing unit can acquire hand geometry information and facial recognition information through the camera 122.

The sensing unit 160 may include at least one camera 161 for photographing the outside of the vehicle 100. The camera 161 may be referred to as an external camera. For example, the sensing unit 160 may include a plurality of cameras 161 disposed at different positions on the vehicle exterior. The camera 161 may include an image sensor and an image processing module. The camera 161 can process still images or moving images obtained by the image sensor (e.g., CMOS or CCD). The image processing module may process the still image or moving image obtained through the image sensor, extract necessary information, and transmit the extracted information to the control unit 170.

The camera 161 may include an image sensor (e.g., CMOS or CCD) and an image processing module. In addition, the camera 161 can process still images or moving images obtained by the image sensor. The image processing module can process the still image or moving image obtained through the image sensor. In addition, the camera 161 may acquire an image including at least one of a traffic light, a traffic sign, a pedestrian, another vehicle, and a road surface.

The output unit 140 may include a display unit 141, an acoustic output unit 142, and a haptic output unit 143 for outputting information processed by the control unit 170.

The display unit 141 may display information processed by the controller 170. For example, the display unit 141 can display vehicle-related information. Here, the vehicle-related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information for a driving guide to the vehicle driver. Further, the vehicle-related information may include vehicle state information indicating the current state of the vehicle or vehicle driving information related to the driving of the vehicle.

The display unit 141 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a 3D display, and an e-ink display.

The display unit 141 may form an inter-layer structure with a touch sensor or may be formed integrally with it to realize a touch screen. Such a touch screen may function as the user input unit 124 that provides an input interface between the vehicle 100 and a user, and may also provide an output interface between the vehicle 100 and the user. In this case, the display unit 141 may include a touch sensor that senses a touch on the display unit 141 so as to receive a control command by a touch method. When the display unit 141 is touched, the touch sensor senses the touch, and the control unit 170 generates a control command corresponding to the touch. The content input by the touch method may be a letter, a number, an instruction in various modes, a designatable menu item, and the like.

Meanwhile, the display unit 141 may include a cluster so that the driver can check the vehicle state information or the vehicle driving information while driving. The cluster can be located on the dashboard. In this case, the driver can check the information displayed in the cluster while keeping his or her gaze ahead of the vehicle.

Meanwhile, according to the embodiment, the display unit 141 may be implemented as a Head Up Display (HUD). When the display unit 141 is implemented as a HUD, information can be output through a transparent display provided in the windshield. Alternatively, the display unit 141 may include a projection module to output information through an image projected on the windshield.

The sound output unit 142 converts an electric signal from the control unit 170 into an audio signal and outputs the audio signal. For this purpose, the sound output unit 142 may include a speaker or the like. The sound output unit 142 may also output a sound corresponding to the operation of the user input unit 124.

The haptic output unit 143 generates a tactile output. For example, the haptic output unit 143 may vibrate the steering wheel, the seat belt, or the seat so that the user can recognize the output.

The vehicle driving unit 150 can control the operation of various devices of the vehicle. The vehicle driving unit 150 may include a power source driving unit 151, a steering driving unit 152, a brake driving unit 153, a lamp driving unit 154, an air conditioning driving unit 155, a window driving unit 156, an airbag driving unit 157, a sunroof driving unit 158, and a wiper driving unit 159.

The power source drive unit 151 may perform electronic control of the power source in the vehicle 100. The power source drive unit 151 may include an accelerator for increasing the speed of the vehicle 100 and a decelerator for decreasing the speed of the vehicle 100.

For example, when a fossil fuel-based engine (not shown) is the power source, the power source drive unit 151 can perform electronic control of the engine, so that the output torque of the engine and the like can be controlled. When the power source is an engine, the speed of the vehicle can be limited by limiting the engine output torque under the control of the control unit 170.

In another example, when the electric motor (not shown) is a power source, the power source drive unit 151 can perform control on the motor. Thus, the rotation speed, torque, etc. of the motor can be controlled.

The steering driver 152 may include a steering apparatus. Accordingly, the steering driver 152 can perform electronic control of the steering apparatus in the vehicle 100. For example, the steering driver 152 may be provided with a steering torque sensor, a steering angle sensor, and a steering motor, and the steering torque applied by the driver to the steering wheel may be sensed by the steering torque sensor. The steering driver 152 can control the steering force and the steering angle by changing the magnitude and direction of the current applied to the steering motor based on the speed of the vehicle 100 and the steering torque. In addition, the steering driver 152 can determine whether the running direction of the vehicle 100 is being properly adjusted based on the steering angle information obtained by the steering angle sensor. Thereby, the running direction of the vehicle can be changed. In addition, the steering driver 152 can make the steering wheel feel lighter by increasing the steering force of the steering motor when the vehicle 100 is running at a low speed, and make the steering wheel feel heavier by reducing the steering force of the steering motor when the vehicle 100 is running at a high speed. When the autonomous running function of the vehicle 100 is executed, the steering driver 152 can control the steering motor to generate an appropriate steering force based on a sensing signal or a control signal provided by the control unit 170, even in a state in which the driver is not operating the steering wheel (e.g., a situation in which no steering torque is detected).
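The speed-dependent assist behavior described above can be sketched as follows. This is an illustrative model only, not the patent's implementation: the linear gain curve, the 120 km/h reference speed, and the 10 A current limit are all assumptions chosen for the example.

```python
def steering_assist_current(vehicle_speed_kmh, driver_torque_nm, max_current_a=10.0):
    """Speed-sensitive power-steering assist (illustrative sketch).

    At low speed the assist current is large, so the steering wheel feels
    light; at high speed it is reduced, so the wheel feels heavier and the
    vehicle is more stable. The assist scales with the driver's torque.
    """
    # Assist gain falls linearly from 1.0 at standstill to 0.2 at 120 km/h.
    speed_gain = max(0.2, 1.0 - 0.8 * min(vehicle_speed_kmh, 120.0) / 120.0)
    current = speed_gain * abs(driver_torque_nm)
    sign = 1.0 if driver_torque_nm >= 0 else -1.0
    return sign * min(current, max_current_a)  # cap at the motor's limit
```

With this model, 3 N·m of driver torque at 30 km/h yields more assist current than the same torque at 100 km/h, which is the light-at-low-speed, heavy-at-high-speed feel described in the text.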

The brake driver 153 may perform electronic control of a brake apparatus (not shown) in the vehicle 100. For example, it is possible to reduce the speed of the vehicle 100 by controlling the operation of the brakes disposed on the wheels. As another example, it is possible to adjust the traveling direction of the vehicle 100 to the left or right by differently operating the brakes respectively disposed on the left wheel and the right wheel.

The lamp driving unit 154 may control the turn-on/turn-off of at least one lamp disposed inside or outside the vehicle. The lamp driver 154 may include a lighting device. Further, the lamp driving unit 154 can control the intensity, direction, etc. of the light output from each of the lamps included in the lighting apparatus. For example, it is possible to perform control of a turn signal lamp, a head lamp, a brake lamp, and the like.

The air conditioning driving unit 155 may perform electronic control on an air conditioner (not shown) in the vehicle 100. For example, when the temperature inside the vehicle is high, the air conditioner can be operated to control the cool air to be supplied to the inside of the vehicle.

The window driving unit 156 may perform electronic control of a window apparatus in the vehicle 100. For example, it is possible to control the opening or closing of the left and right windows on the sides of the vehicle.

The airbag drive 157 may perform electronic control of the airbag apparatus in the vehicle 100. For example, in case of danger, the airbag can be controlled to deploy.

The sunroof driving unit 158 may perform electronic control of a sunroof apparatus (not shown) in the vehicle 100. For example, the opening or closing of the sunroof can be controlled.

The wiper driving unit 159 may control the wipers 14a and 14b provided on the vehicle 100. For example, upon receiving a user input instructing to drive the wipers through the user input unit 124, the wiper driving unit 159 can perform electronic control of the number of drive cycles, the drive speed, and the like of the wipers 14a and 14b in response to the user input. As another example, the wiper driving unit 159 may determine the amount or intensity of rainwater based on a sensing signal of a rain sensor included in the sensing unit 160, and automatically drive the wipers 14a and 14b without user input.

Meanwhile, the vehicle driving unit 150 may further include a suspension driving unit (not shown). The suspension driving unit may perform electronic control of a suspension apparatus (not shown) in the vehicle 100. For example, when the road surface is uneven, it is possible to control the suspension apparatus so as to reduce the vibration of the vehicle 100.

The memory 130 is electrically connected to the controller 170. The memory 130 may store basic data for each unit, control data for controlling the operation of each unit, and input/output data. In hardware, the memory 130 may be one of various storage devices such as a ROM, a RAM, an EPROM, a flash drive, and a hard drive. The memory 130 may store various data for the overall operation of the vehicle 100, such as a program for processing or control by the controller 170.

The interface unit 180 may serve as a path to various kinds of external devices connected to the vehicle 100. For example, the interface unit 180 may include a port connectable to the portable terminal, and may be connected to the portable terminal through the port. In this case, the interface unit 180 can exchange data with the portable terminal.

The interface unit 180 may receive turn signal information. Here, the turn signal information may be a turn-on signal of the turn signal lamp for a left turn or a right turn input by the user. When a left or right turn signal turn-on input is received through the user input unit (124 in FIG. 1) of the vehicle 100, the interface unit 180 may receive the left or right turn signal information.

The interface unit 180 may receive vehicle speed information, rotation angle information of the steering wheel, or gear shift information. The interface unit 180 may receive the sensed vehicle speed information, steering wheel rotation angle information, or gear shift information through the sensing unit 160 of the vehicle. Alternatively, the interface unit 180 may receive the vehicle speed information, the steering wheel rotation angle information, or the gear shift information from the control unit 170 of the vehicle. Here, the gear shift information may be information on which state the shift lever of the vehicle is in. For example, the gear shift information may be information on whether the shift lever is in park (P), reverse (R), neutral (N), or drive (D).

The interface unit 180 may receive a user input received via the user input unit 124 of the vehicle 100. The interface unit 180 may receive the user input from the input unit 120 of the vehicle 100 or may receive it through the control unit 170.

The interface unit 180 can receive information obtained from an external device. For example, when traffic light change information is received from an external server through the communication unit 110 of the vehicle 100, the interface unit 180 can receive the traffic light change information from the control unit 170.

The control unit 170 can control the overall operation of each unit in the vehicle 100. The control unit 170 may be referred to as an ECU (Electronic Control Unit). The controller 170 may include at least one processor, and each processor may control the operation of other components included in the vehicle 100.

The control unit 170 may be implemented in hardware using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, micro-controllers, microprocessors, and other electronic units for performing functions.

The power supply unit 190 can supply power necessary for the operation of each component under the control of the controller 170. In particular, the power supply unit 190 can receive power from a battery (not shown) or the like inside the vehicle.

The AVN (Audio Video Navigation) device can exchange data with the controller 170. The control unit 170 may receive navigation information from the AVN apparatus or a separate navigation apparatus (not shown). Here, the navigation information may include set destination information, route information according to the destination, map information about vehicle driving, or vehicle location information.

On the other hand, some of the components shown in FIG. 1 may not be essential to the implementation of the vehicle 100. Thus, the vehicle 100 described herein may have more or fewer components than those listed above.

On the other hand, the driver assistance device to be described later may be an apparatus including some or all of the components of the vehicle 100 shown in FIG. 1. For example, the driver assistance device may include the sensing unit 160 and the control unit 170, among the components of the vehicle 100. As another example, the driver assistance device may be the same as the vehicle 100.

FIGS. 2 and 3 are drawings referred to in describing the external camera 161 described above with reference to FIG. 1.

Referring to FIGS. 2 and 3, four cameras 161a, 161b, 161c, and 161d may be mounted at different positions on the exterior of the vehicle 100. Each of the four cameras 161a, 161b, 161c, and 161d may be the same as the camera 161 described above.

Referring to FIG. 2, the plurality of cameras 161a, 161b, 161c, and 161d may be disposed at the front, left, right, and rear of the vehicle 100, respectively. Each of the plurality of cameras 161a, 161b, 161c, and 161d may be included in the camera 161 shown in FIG. 1.

The front camera 161a may be disposed near the windshield, near the emblem, or near the radiator grill.

The left camera 161b may be disposed in a case surrounding the left side mirror. Alternatively, the left camera 161b may be disposed outside the case surrounding the left side mirror. Alternatively, the left camera 161b may be disposed in one area outside the left front door, the left rear door, or the left fender.

The right camera 161c may be disposed in a case surrounding the right side mirror. Or the right camera 161c may be disposed outside the case surrounding the right side mirror. Alternatively, the right camera 161c may be disposed in one area outside the right front door, the right rear door, or the right fender.

On the other hand, the rear camera 161d may be disposed in the vicinity of a rear license plate or a trunk switch.

The respective images photographed by the plurality of cameras 161a, 161b, 161c, and 161d are transmitted to the control unit 170, and the control unit 170 may synthesize the respective images to generate a peripheral image of the vehicle.

Although FIG. 2 shows four cameras mounted on the exterior of the vehicle 100, the present invention is not limited as to the number of cameras, and fewer or more cameras may be mounted at positions different from those shown in FIG. 2.

Referring to FIG. 3, the composite image 400 may include a first image area 401 corresponding to an external image photographed by the front camera 161a, a second image area 402 corresponding to an external image photographed by the left camera 161b, a third image area 403 corresponding to an external image photographed by the right camera 161c, and a fourth image area 404 corresponding to an external image photographed by the rear camera 161d. The composite image 400 may be named an around view monitoring (AVM) image.

At the time of generating the composite image 400, boundary portions are generated between any two external images included in the composite image 400. These boundary portions can be displayed naturally by image blending processing.

On the other hand, boundary lines 411, 412, 413, and 414 may be displayed at the boundaries between the plurality of images. In addition, a predetermined image may be included in the center of the composite image 400 to indicate the vehicle 100.

Further, the composite image 400 may be displayed on a display device mounted in the interior of the vehicle 100.
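The image blending at the boundary portions mentioned above can be illustrated with a minimal sketch. The linear alpha ramp and the fixed overlap width are assumptions chosen for the example; a production around-view system would also warp each camera image into the top-view plane before stitching.

```python
import numpy as np

def blend_boundary(img_a, img_b, overlap):
    """Stitch two adjacent camera regions, cross-fading the shared columns
    with a linear alpha ramp so the seam is not visible as a hard line."""
    alpha = np.linspace(1.0, 0.0, overlap)  # weight for img_a across the seam
    seam = img_a[:, -overlap:] * alpha + img_b[:, :overlap] * (1.0 - alpha)
    return np.hstack([img_a[:, :-overlap], seam, img_b[:, overlap:]])

region_a = np.full((4, 6), 100.0)  # e.g. pixels from the front camera
region_b = np.full((4, 6), 200.0)  # e.g. pixels from the left camera
composite = blend_boundary(region_a, region_b, overlap=2)
```

The result transitions from the intensity of one region to the other across the overlap instead of jumping abruptly at the boundary line.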

FIG. 4 shows an exemplary top view of the vehicle 100 described above with reference to FIG. 1. For convenience of explanation, it is assumed that the vehicle 100 is a four-wheeled vehicle.

Referring to FIG. 4, the vehicle 100 may include at least one radar 162, at least one lidar 163, and at least one ultrasonic sensor 164.

The radar 162 may be mounted on one side of the vehicle 100 to emit electromagnetic waves toward the periphery of the vehicle 100 and receive the electromagnetic waves reflected from various objects existing around the vehicle 100. For example, the radar 162 measures the time of flight of an electromagnetic wave reflected by an object and acquires information related to the distance, direction, altitude, and the like of the object.
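The time-of-flight relation underlying the radar's distance measurement can be written out directly: the emitted wave travels to the object and back, so the one-way range is half the round trip.

```python
def radar_range_m(round_trip_time_s):
    """Range from radar time of flight: distance = c * t / 2, because the
    emitted electromagnetic wave travels to the object and back."""
    c = 299_792_458.0  # speed of light in m/s
    return c * round_trip_time_s / 2.0
```

For example, a 1 microsecond round trip corresponds to an object roughly 150 m away.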

The lidar 163 is mounted on one side of the vehicle 100 and can emit laser light toward the periphery of the vehicle 100. The laser emitted by the lidar 163 may be scattered or reflected back to the vehicle 100, and based on changes in the return time, intensity, frequency, and polarization of the laser, the lidar 163 can obtain information on physical characteristics such as the distance, speed, and shape of a target located in the periphery of the vehicle 100.

The ultrasonic sensor 164 is mounted on one side of the vehicle 100 to generate ultrasonic waves toward the periphery of the vehicle 100. The ultrasonic waves generated by the ultrasonic sensor 164 have a high frequency (about 20 kHz or more) and a short wavelength. The ultrasonic sensor 164 can be used mainly to recognize obstacles close to the vehicle 100.

The radar 162, the lidar 163, and the ultrasonic sensor 164 shown in FIG. 4 may be sensors included in the sensing unit 160 shown in FIG. 1. It is also apparent to those skilled in the art that, depending on the embodiment, the radar 162, the lidar 163, and the ultrasonic sensor 164 may be mounted in different numbers and at positions different from those shown in FIG. 4.

FIG. 5 shows an example of an internal block diagram of the controller 170 shown in FIG.

Referring to FIG. 5, the control unit 170 may include an image preprocessing unit 510, a disparity calculating unit 520, a segmentation unit 532, an object detecting unit 534, an object verification unit 536, an object tracking unit 540, and an application unit 550.

The image preprocessor 510 receives an image provided from the cameras 161 and 122 shown in FIG. 1 and can perform preprocessing.

In particular, the image preprocessing unit 510 may perform noise reduction, rectification, calibration, color enhancement, color space conversion (CSC), interpolation, camera gain control, and the like on the image. Thus, an image clearer than the stereo image photographed by the cameras 161 and 122 can be obtained.

The disparity calculator 520 receives the image signals processed by the image preprocessing unit 510, performs stereo matching on the received images, and obtains a disparity map based on the stereo matching. That is, it is possible to obtain disparity information about the stereo images of the scene in front of the vehicle.

At this time, the stereo matching may be performed on a pixel-by-pixel basis of the stereo images or on a predetermined block basis. Meanwhile, the disparity map may mean a map in which the binocular parallax information of the stereo images, i.e., the left and right images, is expressed numerically.
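A toy version of block-based stereo matching, producing a disparity map like the one described, might look as follows. The SAD cost, block size, and search range are illustrative assumptions; real systems use optimized matchers rather than this brute-force loop.

```python
import numpy as np

def disparity_map(left, right, max_disp=4, block=3):
    """Toy block matching: for each left-image pixel, choose the horizontal
    shift d (0..max_disp) minimising the sum of absolute differences (SAD)
    between the left block and the right block shifted by d."""
    h, w = left.shape
    pad = block // 2
    disp = np.zeros((h, w), dtype=int)
    L = np.pad(left.astype(float), pad, mode='edge')
    R = np.pad(right.astype(float), pad, mode='edge')
    for y in range(h):
        for x in range(w):
            best_cost, best_d = float('inf'), 0
            left_block = L[y:y + block, x:x + block]
            for d in range(min(max_disp, x) + 1):
                right_block = R[y:y + block, x - d:x - d + block]
                cost = np.abs(left_block - right_block).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp

# A bright vertical stripe appears 2 px further left in the right image,
# so its disparity should come out as 2.
left = np.zeros((8, 10))
left[:, 6] = 255
right = np.zeros((8, 10))
right[:, 4] = 255
disp = disparity_map(left, right)
```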

The segmentation unit 532 may perform segmentation and clustering on at least one of the images based on the disparity information from the disparity calculating unit 520.

Specifically, the segmentation unit 532 can separate the background and the foreground for at least one of the stereo images based on the disparity information.

For example, an area in the disparity map whose disparity information is equal to or less than a predetermined value can be classified as the background, and the corresponding part can be excluded. Thereby, the foreground can be relatively separated.

As another example, an area in the disparity map whose disparity information is equal to or greater than a predetermined value can be classified as the foreground, and the corresponding part can be extracted. Thereby, the foreground can be separated.

Thus, by separating the foreground and the background based on the disparity information extracted from the stereo images, it becomes possible to reduce the signal processing time, the amount of signal processing, and the like during subsequent object detection.
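The threshold-based foreground/background separation described above reduces, in the simplest case, to masking the disparity map:

```python
import numpy as np

def separate_foreground(disparity, threshold):
    """Areas with disparity >= threshold are near the camera (foreground);
    areas below the threshold are far away (background)."""
    foreground = disparity >= threshold
    return foreground, ~foreground

disp = np.array([[1, 1, 8],
                 [1, 9, 8],
                 [1, 1, 1]])
fg, bg = separate_foreground(disp, threshold=5)
```

Subsequent object detection can then be restricted to the pixels of the foreground mask, which is the speed-up the paragraph refers to.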

Next, the object detector 534 can detect objects based on the image segments from the segmentation unit 532.

That is, the object detecting unit 534 can detect an object for at least one of the images based on the disparity information.

Specifically, the object detecting unit 534 can detect an object for at least one of the images. For example, an object can be detected from a foreground separated by an image segment.

Next, the object verification unit 536 classifies and verifies the separated object.

For this purpose, the object verification unit 536 may use an identification method using a neural network, a support vector machine (SVM) method, an AdaBoost method using Haar-like features, a histograms of oriented gradients (HOG) method, or the like.

On the other hand, the object verification unit 536 can verify objects by comparing the detected objects with objects stored in the memory 130.

For example, the object verification unit 536 can identify nearby vehicles, lanes, road surfaces, signs, dangerous areas, tunnels, and the like located in the vicinity of the vehicle.
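A heavily reduced sketch of HOG-style verification against objects stored in memory might look like this. The whole-patch orientation histogram and the nearest-template rule are simplifications of real HOG + SVM pipelines, chosen only to illustrate the idea of comparing a detected patch against stored descriptors.

```python
import numpy as np

def hog_feature(patch, bins=4):
    """Reduced HOG-style descriptor: one histogram of gradient orientations
    over the whole patch, weighted by gradient magnitude and L2-normalised.
    (Real HOG uses cells and block normalisation.)"""
    gy, gx = np.gradient(patch.astype(float))
    magnitude = np.hypot(gx, gy)
    orientation = np.mod(np.arctan2(gy, gx), np.pi)  # unsigned, 0..pi
    hist, _ = np.histogram(orientation, bins=bins, range=(0.0, np.pi),
                           weights=magnitude)
    norm = np.linalg.norm(hist)
    return hist / norm if norm > 0 else hist

def verify_object(patch, templates):
    """Return the label of the stored template whose descriptor is closest
    to the descriptor of the detected patch."""
    feature = hog_feature(patch)
    return min(templates,
               key=lambda label: np.linalg.norm(feature - hog_feature(templates[label])))

vertical_edge = np.zeros((8, 8))
vertical_edge[:, 4:] = 255.0          # dominant horizontal gradient
horizontal_edge = vertical_edge.T     # dominant vertical gradient
templates = {'vertical': vertical_edge, 'horizontal': horizontal_edge}
```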

The object tracking unit 540 may perform tracking on the verified objects. For example, it sequentially identifies an object in the acquired stereo images, calculates the motion or motion vector of the identified object, and tracks the movement of the object based on the calculated motion or motion vector. Accordingly, it is possible to track nearby vehicles, lanes, road surfaces, signs, dangerous areas, tunnels, and the like located in the vicinity of the vehicle.
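The motion-vector tracking step can be illustrated with bounding-box centroids, with a constant-velocity assumption predicting the object's position in the next frame. Both the centroid representation and the constant-velocity model are assumptions made for this sketch, not the patent's method.

```python
def track_centroid(prev_box, curr_box):
    """Motion vector of a detected object between two frames, taken as the
    displacement of its bounding-box centre. Boxes are (x, y, w, h)."""
    (px, py, pw, ph), (cx, cy, cw, ch) = prev_box, curr_box
    prev_c = (px + pw / 2, py + ph / 2)
    curr_c = (cx + cw / 2, cy + ch / 2)
    return (curr_c[0] - prev_c[0], curr_c[1] - prev_c[1])

def predict_next(curr_box, motion):
    """Constant-velocity prediction of the box position in the next frame."""
    x, y, w, h = curr_box
    return (x + motion[0], y + motion[1], w, h)

# The same object box observed in two consecutive frames:
motion = track_centroid((0, 0, 10, 10), (4, 2, 10, 10))
predicted = predict_next((4, 2, 10, 10), motion)
```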

Next, the application unit 550 can calculate the risk level of the vehicle 100 and the like based on various objects (e.g., other vehicles, lanes, road surfaces, signs, etc.) located around the vehicle 100. It is also possible to calculate the possibility of a collision with a preceding vehicle, whether the vehicle is slipping, and the like.

Then, based on the calculated risk level, collision possibility, slip, or the like, the application unit 550 can output a message or the like for notifying the user as vehicle driving assistance information. Alternatively, a control signal for attitude control or running control of the vehicle 100 may be generated as vehicle control information.

Depending on the embodiment, the controller 170 may include only some of the image preprocessing unit 510, the disparity calculating unit 520, the segmentation unit 532, the object detection unit 534, the object verification unit 536, the object tracking unit 540, and the application unit 550 shown in FIG. 5. For example, if the cameras 161 and 122 are cameras providing only two-dimensional images, the disparity calculating unit 520 may be omitted.

FIGS. 6A and 6B are diagrams referred to in the description of the operation of the controller 170 shown in FIG. 5.

FIGS. 6A and 6B are diagrams for explaining the operation method of the controller 170 of FIG. 5, based on the stereo images obtained in the first and second frame periods, respectively.

First, referring to FIG. 6A, when the camera 161 is a stereo camera, the camera 161 acquires a stereo image during a first frame period.

The disparity calculating unit 520 in the control unit 170 receives the stereo images FR1a and FR1b signal-processed by the image preprocessing unit 510, performs stereo matching on the received stereo images FR1a and FR1b, and obtains a disparity map 620.

The disparity map 620 expresses the parallax between the stereo images FR1a and FR1b as levels. The higher the disparity level, the closer the distance to the vehicle can be calculated to be, and the lower the disparity level, the farther the distance can be calculated to be.

On the other hand, when such a disparity map is displayed, it may be displayed with higher luminance as the disparity level becomes larger and with lower luminance as the disparity level becomes smaller.
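For a calibrated stereo pair, the disparity-to-distance relation stated above is the standard Z = f·B/d, where f is the focal length in pixels, B the camera baseline, and d the disparity. The numeric values below are illustrative, not taken from the patent.

```python
def distance_from_disparity(disparity_px, focal_px, baseline_m):
    """Z = f * B / d: a larger disparity means a closer object, a smaller
    disparity a farther one, matching the disparity-level description."""
    if disparity_px <= 0:
        return float('inf')  # no measurable parallax: effectively at infinity
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 800 px focal length, 0.4 m stereo baseline.
near = distance_from_disparity(64, 800, 0.4)  # high disparity -> close
far = distance_from_disparity(8, 800, 0.4)    # low disparity -> far
```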

In the figure, the first to fourth lanes 628a, 628b, 628c, and 628d have corresponding disparity levels in the disparity map 620, and the construction area 622, the first preceding vehicle 624, and the second preceding vehicle 626 each have corresponding disparity levels.

Based on the disparity map 620, the segmentation unit 532, the object detection unit 534, and the object verification unit 536 perform segmentation, object detection, and object verification for at least one of the stereo images FR1a and FR1b.

In the figure, object detection and verification for the second stereo image FR1b is performed using the disparity map 620.

That is, object detection and verification may be performed for the first to fourth lanes 638a, 638b, 638c, and 638d, the construction area 632, the first preceding vehicle 634, and the second preceding vehicle 636 in the image 630.

Next, referring to FIG. 6B, during the second frame period, the stereo camera 161 acquires a stereo image.

The disparity calculating unit 520 in the control unit 170 receives the stereo images FR2a and FR2b signal-processed by the image preprocessing unit 510, performs stereo matching on the received stereo images FR2a and FR2b, and obtains a disparity map 640.

In the figure, the first to fourth lanes 648a, 648b, 648c, and 648d have corresponding disparity levels in the disparity map 640, and the construction area 642, the first preceding vehicle 644, and the second preceding vehicle 646 each have corresponding disparity levels.

Based on the disparity map 640, the segmentation unit 532, the object detection unit 534, and the object verification unit 536 perform segmentation, object detection, and object verification for at least one of the stereo images FR2a and FR2b.

In the figure, using the disparity map 640, object detection and confirmation for the second stereo image FR2b is performed.

That is, object detection and verification can be performed for the first to fourth lanes 658a, 658b, 658c, and 658d, the construction area 652, the first preceding vehicle 654, and the second preceding vehicle 656 in the image 650.

Meanwhile, the object tracking unit 540 may compare FIGS. 6A and 6B to perform tracking on the verified objects.

Specifically, the object tracking unit 540 can track the movement of each object based on the motion or motion vector of the object identified in FIGS. 6A and 6B. Accordingly, it is possible to perform tracking on the lanes, the construction area, the first preceding vehicle, the second preceding vehicle, and the like located in the vicinity of the vehicle.

FIG. 7 shows a flowchart of an exemplary process (S700) performed by the driver assistance device according to an embodiment of the present invention to assist the driver in parking the vehicle 100. Hereinafter, for convenience of explanation, it is assumed that the driver assistance device includes the control unit 170 of the vehicle 100.

In step S710, the driver assistance device can determine whether a predetermined event has occurred. Specifically, the event may be data or input information related to a specific situation predetermined by the controller 170 to enter the parking support mode.

For example, the predetermined event may be (i) a reception event of a user input (e.g., voice, touch, click, or gesture) instructing selection of the parking support mode, (ii) an entry event of the vehicle 100 into a parking lot, or (iii) an arrival event at a destination preset in the vehicle 100.

Here, the control unit 170 may determine whether the event (i) has occurred based on the user input received by the input unit 120 of the vehicle 100. For example, a voice input of the user may be received by the microphone 123, a touch input may be received by a touch sensor, and a click input may be received by a button provided on the steering wheel.

The control unit 170 may determine whether the event (ii) or the event (iii) has occurred based on the GPS information of the vehicle 100 received by the position information module 115. For example, the control unit 170 may access the electronic map data (e.g., roads, nearby facilities, speed limits) stored in the memory 130, and when the position corresponding to the GPS information of the vehicle 100 is within a parking lot, it may determine that the event (ii) has occurred. As another example, when the position corresponding to the GPS information of the vehicle 100 is within a predetermined distance from the destination preset by the driver, the control unit 170 may determine that the event (iii) has occurred.
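The GPS-based proximity check for the destination-arrival event can be sketched with a haversine distance; the 100 m trigger radius and the coordinate values below are assumptions for the example, not taken from the patent.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6_371_000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def destination_arrival_event(vehicle_gps, destination_gps, radius_m=100.0):
    """True when the vehicle's GPS fix is within radius_m of the preset
    destination (the 100 m radius is an assumed threshold)."""
    return haversine_m(*vehicle_gps, *destination_gps) <= radius_m
```

The parking-lot entry event would work analogously, testing the vehicle's GPS position against parking-lot polygons in the electronic map data.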

However, it is needless to say that the types of events determined in advance for entry into the parking support mode are not limited to the above-mentioned examples, and other types of events can be predetermined.

If a predetermined event occurs, the controller 170 may perform step S720. On the other hand, if the occurrence of the predetermined event is not detected, the control unit 170 may repeat step S710 periodically.

In step S720, the driver assistance device can enter the parking support mode. In the present invention, the parking support mode is a mode that recognizes at least one parking zone available for parking the vehicle 100 in the peripheral space of the vehicle 100, considers the parking difficulty of each recognized parking zone, and provides the driver with information on the parking zone that is easiest to park in.

In step S730, the driver assistance device can receive the sensing information corresponding to the peripheral space of the vehicle 100 when entering the parking support mode. Here, the peripheral space of the vehicle 100 may be a space within a predetermined distance from the present position of the vehicle 100.

At least one of the camera 161, the radar 162, the lidar 163, and the ultrasonic sensor 164 included in the sensing unit 160 may sense the peripheral space (e.g., a parking lot) in which the vehicle 100 is located, and provide the control unit 170 with sensing information including the sensed data.

For example, the sensing information provided from the camera 161 may include an external image (e.g., a front image, a left image, a right image, and a rear image) of the vehicle 100. In addition, the sensing information provided from at least one of the radar 162, the lidar 163, and the ultrasonic sensor 164 may be obstacle information. Here, an obstacle may mean an object existing in the peripheral space of the vehicle 100, such as a curb, a wall, a pillar, a street tree, a pedestrian, and the like.

In step S740, the driver assistance device can detect the candidate parking zone based on the sensing information. At this time, the number of candidate parking zones detected may be one or more.

In one embodiment, the control unit 170 may synthesize the external images provided from the camera 161 to generate an AVM image as shown in FIG. 3. In addition, the control unit 170 can recognize a parking line from the AVM image using a parking line recognition technique (e.g., an edge detection technique). Accordingly, the control unit 170 can detect the parking zones existing in the peripheral space of the vehicle 100 based on the recognized parking lines. For example, the control unit 170 may determine the size (e.g., width and length), the shape (e.g., for right-angle parking or parallel parking), the position, and the direction of each detected parking zone, and the like.

In one embodiment, the control unit 170 may filter all detected parking zones according to at least one predetermined condition, and detect, as candidate parking zones, the parking zones that pass the filtering.

For example, a condition preset for detection of a candidate parking zone may include at least one of a first condition and a second condition. The first condition may indicate that no other vehicle is parked in the zone. The second condition may indicate that entry of the vehicle 100 into the zone is not blocked by an obstacle. Of course, various other conditions may also be set for the filtering.
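The two filtering conditions above can be sketched in code. This is an illustrative sketch only: the `ParkingZone` fields and the use of a clearance width to express the second condition are assumptions for demonstration, not the patent's actual data structures.

```python
from dataclasses import dataclass

@dataclass
class ParkingZone:
    zone_id: str
    occupied: bool            # first condition: is another vehicle parked here?
    allowable_width_m: float  # minimum clearance between adjacent obstacles

def candidate_zones(zones, vehicle_width_m):
    """Keep zones with no parked vehicle (first condition) and enough
    clearance for the vehicle to enter (second condition)."""
    return [z for z in zones
            if not z.occupied and z.allowable_width_m > vehicle_width_m]
```

For example, an empty zone whose clearance is narrower than the vehicle would be filtered out even though it satisfies the first condition, mirroring the P12 case described later with reference to FIG. 12A.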

On the other hand, the control unit 170 can acquire three-dimensional spatial information about the peripheral space of the vehicle 100, before, during, or after the detection of the candidate parking zones. In one embodiment, the control unit 170 may use the SLAM (Simultaneous Localization And Mapping) technique to obtain three-dimensional spatial information about the peripheral space of the vehicle 100 based on the sensing information provided from the sensing unit 160.

For example, the control unit 170 may acquire three-dimensional spatial information about the surrounding space within a predetermined distance from the vehicle 100, based on the external images of the vehicle 100 provided from the sensing unit 160. Specifically, the control unit 170 may detect three-dimensional objects (e.g., walls, curbstones, other vehicles, electric poles, street trees, falling objects, etc.) present in the peripheral space, and obtain the type, shape, size, and position relative to the vehicle 100 of each detected object, and the like.

In step S750, the driver assistance device can determine whether two or more candidate parking zones have been detected through step S740. If two or more candidate parking zones have been detected, the control unit 170 may perform step S760. On the other hand, if only one candidate parking zone has been detected, the control unit 170 may perform step S755. In step S755, the driver assistance device may set the single detected candidate parking zone as the target parking zone.

In step S760, the driver assistance device may calculate the score representing the parking difficulty for each of the candidate parking areas detected through step S740.

Specifically, the control unit 170 can calculate the parking locus for each of the detected candidate parking zones. For example, the control unit 170 may calculate the parking locus for a particular candidate parking zone based on (i) the size, shape, and location of that candidate parking zone and (ii) the size, shape, and location of obstacles present in the surrounding space. For example, if there is an obstacle adjacent to a particular candidate parking zone, a longer or more complex parking locus may be generated than when there are no obstacles adjacent to that zone.

Meanwhile, the controller 170 may determine the parking mode for each candidate parking zone before calculating the parking locus for each candidate parking zone. The parking mode for each candidate parking zone can be selected by the user or predetermined in the memory 130. Alternatively, the control unit 170 may determine a parking mode for each candidate parking zone based on at least one of the size, shape, and position of the candidate parking zone and the size, shape, and position of obstacles existing in the surrounding space.

For example, if a candidate parking zone is for right angle parking, the control unit 170 may determine whether to park backward in the candidate parking zone or forward. Next, the controller 170 may calculate the parking locus for each candidate parking zone based on the parking mode determined for each candidate parking zone.

Also, the control unit 170 can calculate the score for each candidate parking zone based on the parking locus calculated for that zone. Here, the score is an index indicating the difficulty of parking the vehicle 100 in each candidate parking zone; the higher the calculated score, the more difficult it is to park the vehicle 100. The control unit 170 can calculate the score based on at least one of (i) the length of the parking locus, (ii) the number of times the parking locus switches in the left-right direction, and (iii) the number of times the parking locus switches between forward and backward movement. Here, the length of the parking locus may be the total moving distance of the vehicle 100 required to move from the current position of the vehicle 100 to the target parking position of the specific candidate parking zone. The number of left-right switches of the parking locus may be the total number of times the steering direction of the vehicle 100 must be reversed while moving from the current position of the vehicle 100 to the target parking position of the specific candidate parking zone. The number of forward-backward switches of the parking locus may be the total number of times the vehicle 100 must change between advancing and reversing while moving from the current position of the vehicle 100 to the target parking position of the specific candidate parking zone.
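The three locus properties above (total length, left-right switches, forward-backward switches) can be extracted from a parking locus represented as a sequence of motion segments. The segment encoding below is an assumption introduced for illustration; the patent does not specify how a locus is represented internally.

```python
def locus_features(segments):
    """segments: list of (length_m, steer, gear), where steer is
    -1/0/+1 (left / straight / right) and gear is +1/-1 (forward / reverse).
    Returns (total length, left-right switches, forward-backward switches)."""
    total = sum(s[0] for s in segments)
    lr = fb = 0
    prev_steer = prev_gear = None
    for _, steer, gear in segments:
        # A left-right switch is a sign change between consecutive turning segments.
        if prev_steer is not None and steer != 0 and steer != prev_steer:
            lr += 1
        # A forward-backward switch is any gear change between segments.
        if prev_gear is not None and gear != prev_gear:
            fb += 1
        if steer != 0:
            prev_steer = steer
        prev_gear = gear
    return total, lr, fb
```

For instance, a locus that turns right while advancing and then turns left while reversing has one switch of each kind.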

For example, if the length of the first parking locus for the first candidate parking zone is a first length and the length of the second parking locus for the second candidate parking zone is a second length greater than the first length, the control unit 170 may calculate a higher score for the second candidate parking zone than for the first candidate parking zone.

For example, even if the length of the first parking locus for the first candidate parking zone equals the length of the second parking locus for the second candidate parking zone, when the number of left-right switches of the first parking locus is greater than that of the second parking locus, the control unit 170 may calculate a higher score for the first candidate parking zone than for the second candidate parking zone.

In step S770, the driver assistance apparatus can set any one of the plurality of candidate parking zones as the target parking zone based on the score calculated for each candidate parking zone in step S760.

In one embodiment, the driver assistance device may set, as the target parking zone, the candidate parking zone corresponding to the lowest of the plurality of calculated scores. That is, regardless of the intention of the driver of the vehicle 100, the control unit 170 can automatically set the one candidate parking zone having the lowest parking difficulty as the target parking zone.

In one embodiment, the driver assistance device may arrange the plurality of scores in order and determine whether there is a candidate parking zone corresponding to a score below a predetermined threshold value. Here, the threshold value may be a value set by the driver. For example, when entering the above-described parking support mode, the control unit 170 may provide the driver with a plurality of sub-modes differentiated according to parking difficulty level. For example, the sub-modes may include a 'beginner mode' and an 'expert mode', and when the 'beginner mode' is selected by the driver, the threshold value may be smaller than when the 'expert mode' is selected. Accordingly, a target parking zone that takes the driver's driving skill into account can be searched for and set.

If there is no candidate parking zone corresponding to a score below the threshold value, the control unit 170 may request the driver to move the vehicle 100 to another place through the output unit 140.

If there is one candidate parking zone corresponding to the score below the threshold value, the controller 170 can automatically set the candidate parking zone to the target parking zone.

If there are two or more candidate parking zones corresponding to a score below the threshold value, the controller 170 may request the driver to select one of the two or more candidate parking zones through the output unit 140. The control unit 170 may set a candidate parking zone selected by the driver as the target parking zone out of the two or more candidate parking zones. Alternatively, when there are two or more candidate parking zones corresponding to a score below the threshold value, the controller 170 may automatically set any one candidate parking zone to the target parking zone in accordance with a predetermined rule.

In step S780, the driver assistance device may initiate automatic parking for the target parking zone. The control unit 170 can control the acceleration, deceleration, and steering of the vehicle 100 so as to follow the parking locus of the target parking zone calculated in step S760 described above. Specifically, the control unit 170 compares the position of the vehicle 100 with the parking locus of the target parking zone in real time or periodically, and provides the driving unit 150 with a control signal indicating a change of at least one of an acceleration parameter, a deceleration parameter, and a steering parameter required to correct the difference between the two. The power source driving unit 151 and the steering driving unit 152 of the driving unit 150 may vary the speed and moving direction of the vehicle 100 according to the control signal provided from the control unit 170, so that the vehicle 100 moves along the parking locus. On the other hand, the driver assistance device may perform only steering control according to the parking locus of the target parking zone, while acceleration and deceleration are changed in accordance with the driver's operation of the accelerator pedal and the brake pedal.
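The position-versus-locus correction described above can be illustrated with a minimal steering-correction step. This is a generic pursuit-style sketch under stated assumptions (a single waypoint target, a proportional gain `k_steer`, and a flat 2D coordinate frame); it is not the patent's control law, which is left unspecified.

```python
import math

def control_signal(vehicle_pos, heading_rad, waypoint, k_steer=0.5):
    """Return a steering correction (radians) that turns the vehicle's
    heading toward the next waypoint of the parking locus."""
    desired = math.atan2(waypoint[1] - vehicle_pos[1],
                         waypoint[0] - vehicle_pos[0])
    # Wrap the heading error to [-pi, pi] before applying the gain.
    error = math.atan2(math.sin(desired - heading_rad),
                       math.cos(desired - heading_rad))
    return k_steer * error
```

A real implementation would also issue acceleration and deceleration commands and account for the vehicle's kinematics; only the steering comparison step is shown here.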

In step S790, the driver assistance device can determine whether parking for the target parking zone is completed. For example, the control unit 170 may confirm the parking line of the target parking zone based on the AVM image as shown in FIG. 3, and judge that parking has been completed when the vehicle 100 is located within the parking line of the target parking zone.

If it is determined that parking for the target parking zone is in progress, the controller 170 may repeat step S780.

FIG. 8 shows an exemplary interior view of a vehicle 100 according to one embodiment of the present invention.

FIG. 8 illustrates a case where the vehicle 100 is close to a parking lot entrance. The driver of the vehicle 100 can confirm the forward situation of the vehicle 100 through the windshield.

At least one input means included in the input unit 120 may be disposed in the interior of the vehicle 100. For example, as shown in the figure, the microphone 123 is disposed on one side of the A-pillar, and the button 124a for on / off of the parking assist mode can be disposed on one side of the steering wheel.

The display unit 141 is disposed at the center of the center fascia of the vehicle 100, and can visually output various information processed, generated, and received by the vehicle 100.

On the other hand, the driver assistance device can enter the parking support mode in response to the occurrence of a predetermined event.

For example, when a voice command V (e.g., "find a parking spot") of the driver instructing entry into the parking support mode is received by the microphone 123, the control unit 170 can enter the parking support mode.

As another example, when the button 124a is clicked by the driver, the control unit 170 can enter the parking support mode.

For example, when the position of the vehicle 100 provided from the position information module 115 is within a predetermined distance from a parking lot position in the electronic map data previously stored in the memory 130, the control unit 170 can enter the parking support mode.

For example, when the parking lot sign 810 is detected from the forward image provided from the front camera 161a using an image processing technique (e.g., TSR (Traffic Sign Recognition)), the control unit 170 can enter the parking support mode.

On the other hand, when the predetermined event occurs, the control unit 170 can let the driver select whether to enter the parking support mode. For example, when a predetermined event occurs, the control unit 170 displays buttons 821 and 822 for selecting whether to enter the parking support mode on the display unit 141, and when the button 821 is touched, the parking support mode can be entered.

FIGS. 9A to 9D illustrate an exemplary operation in which a driver assistance device according to an embodiment of the present invention selects a target parking zone among a plurality of parking zones around the vehicle 100.

FIG. 9A illustrates a top view of a situation in which four parking zones are respectively present in the left and right areas of the peripheral space, with respect to the vehicle 100.

The left parking zones P1-P4 and the right parking zones P5-P8 may all be right-angle parking zones. As shown in the figure, the first to sixth other vehicles 901 to 906 may be parked, in order, in the first parking zone P1, the fourth parking zone P4, the fifth parking zone P5, the sixth parking zone P6, the seventh parking zone P7, and the eighth parking zone P8, respectively.

The control unit 170 can detect the parking zones P1-P8 existing in the surrounding space within the sensing range of the sensing unit 160 based on the sensing information provided from the sensing unit 160.

Then, the control unit 170 may filter the parking zones P1-P8 according to the predetermined conditions. That is, the control unit 170 can select, through filtering, only the parking zones in which the vehicle 100 can be parked among the parking zones P1-P8. For example, in accordance with the first condition described above with reference to FIG. 7, the control unit 170 may detect, as candidate parking zones, the second parking zone P2 and the third parking zone P3, in which none of the other vehicles 901-906 are parked among the parking zones P1-P8. Hereinafter, for convenience of explanation, the third parking zone P3 will be referred to as a first candidate parking zone, and the second parking zone P2 will be referred to as a second candidate parking zone.

FIG. 9B illustrates a guidance screen displayed on the display unit 141 in the situation shown in FIG. 9A.

Referring to FIG. 9B, the controller 170 may display a guidance screen including information corresponding to the situation shown in FIG. 9A on the display unit 141. For example, the guidance screen shown in FIG. 9B may be a screen for guiding information on the candidate parking zones P3 and P2 detected by the control unit 170 through step S740.

Specifically, an indicator M for guiding the current position of the vehicle 100 may be displayed on the guidance screen displayed on the display unit 141. The guidance screen may also include indicators 911-916 for guiding the positions of the parking zones P1, P4-P8 occupied by the other vehicles 901-906, i.e., the parking zones in which parking of the vehicle 100 is not possible. For example, the first indicator 911 is a graphical object for notifying that parking in the first parking zone P1 is impossible, and the fifth indicator 915 may be a graphical object for notifying that parking in the seventh parking zone P7 is impossible. At this time, the indicators 911-916 may be displayed in areas corresponding to the actual positions of the parking zones P1, P4-P8.

The guidance screen displayed on the display unit 141 may also include an indicator 921 for guiding the position of the first candidate parking zone P3 and an indicator 922 for guiding the position of the second candidate parking zone P2. At this time, the indicators 921 and 922 may be displayed in areas corresponding to the actual positions of the candidate parking zones P3 and P2.

The control unit 170 may display the indicators 921 and 922 for guiding the candidate parking zones P3 and P2 so as to be distinguished from the indicators 911-916 for guiding the remaining parking zones P1 and P4-P8. For example, the indicators 921 and 922 may differ from the indicators 911-916 in color, border thickness, border line type, brightness, texture, and the like.

On the other hand, the controller 170 may display a guide window 930 on the display unit 141 for selecting whether to perform the score calculation process (S760) for the candidate parking areas P3 and P2. As shown, the guide window 930 may include messages and buttons 931 and 932 to guide information (e.g., number) about the detected candidate parking zones P3 and P2.

For example, when the driver touches the first button 931, the controller 170 may calculate the score representing the parking difficulty of the two candidate parking areas P3 and P2 through step S760. On the other hand, when the driver touches the second button 932, the controller 170 can release the parking assist mode.

FIG. 9C illustrates another guidance screen displayed on the display unit 141. For example, the guidance screen shown in FIG. 9C may be a screen displayed when the first button 931 shown in FIG. 9B is touched by the driver.

Referring to FIG. 9C, in response to the touch of the first button 931, the control unit 170 may calculate scores representing the parking difficulty levels of the two candidate parking zones P3 and P2. A method of calculating the score for a candidate parking zone will be described in detail below with reference to FIGS. 10 and 11.

For example, when the score of the first candidate parking zone P3 is lower than the score of the second candidate parking zone P2 (i.e., when the parking difficulty of the first candidate parking zone P3 is the lowest), the control unit 170 may display on the display unit 141 a guide window 940 including a message recommending that the first candidate parking zone P3 be set as the target parking zone (e.g., "Parking zone ① is easier to park in than parking zone ②"). If the driver touches the first button 941 of the guide window 940, the control unit 170 may set the first candidate parking zone P3 as the target parking zone. On the other hand, if the driver touches the second button 942 of the guide window 940, the control unit 170 may set the second candidate parking zone P2 as the target parking zone.

Alternatively, the control unit 170 may set, as the target parking zone, the candidate parking zone guided by whichever of the indicators 921 and 922 is touched by the user.

Of course, the control unit 170 may automatically set the first candidate parking zone P3 having the lowest score (i.e., the lowest parking difficulty) as the target parking zone without displaying the guide window 940.

FIG. 9D illustrates that the vehicle 100 follows the parking locus 950 of the first candidate parking zone P3 set as the target parking zone. For convenience of explanation, it is assumed that the parking locus 950 is calculated based on the forward parking mode.

Referring to FIG. 9D, when the first candidate parking zone P3 is set as the target parking zone, the control unit 170 may control the driving unit 150 so that the vehicle 100 moves along the parking locus pre-calculated for the target parking zone P3.

On the other hand, the control unit 170 can determine the final parking point within the target parking zone P3 in consideration of the obstacles present around the target parking zone P3. For example, when the other vehicle 902 is parked in the parking zone P4 adjacent to the lower side of the target parking zone P3 while the parking zone P2 adjacent to the upper side is empty, the control unit 170 may park the vehicle 100 relatively closer to the parking zone P2 within the target parking zone P3.

FIG. 10 shows an exemplary operation in which a driver assistance device according to an embodiment of the present invention calculates a score for each of a plurality of candidate parking zones around the vehicle 100.

FIG. 10 illustrates two candidate parking zones 1001 and 1002, detected by the driver assistance device, that are perpendicular to the traveling direction of the vehicle 100. That is, the two candidate parking zones 1001 and 1002 may be right-angle parking zones. For ease of understanding, it is assumed that forward parking in a right-angle parking manner is selected for both candidate parking zones 1001 and 1002.

The control unit 170 may calculate, simultaneously or sequentially, the first parking locus C1 for forward parking into the first candidate parking zone 1001 and the second parking locus C2 for forward parking into the second candidate parking zone 1002. For example, the first parking locus C1 may be a curve connecting the current position P10 of the vehicle 100 to the final parking position P21 of the first candidate parking zone 1001. The second parking locus C2 may be a curve connecting the current position P10 of the vehicle 100 to the final parking position P22 of the second candidate parking zone 1002.

When the calculation of the first parking locus C1 and the second parking locus C2 is completed, the control unit 170 may calculate a first score indicating the parking difficulty of the first parking locus C1 and a second score indicating the parking difficulty of the second parking locus C2.

Since the numbers of left-right switches and forward-backward switches of the first parking locus C1 and the second parking locus C2 are the same, the control unit 170 may calculate the first score and the second score based on the lengths of the first parking locus C1 and the second parking locus C2. For example, the length of a parking locus and the corresponding score may have a proportional relationship. That is, the relation 'score = k × length of the parking locus' can be used, where the length of the parking locus is in meters and k is a predetermined proportionality constant.

As shown, when the first candidate parking zone 1001 is closer to the vehicle 100 than the second candidate parking zone 1002, the length of the first parking locus C1 is shorter than that of the second parking locus C2, so the first score calculated by the control unit 170 may have a smaller value than the second score. For example, if the proportionality constant k is 0.2, the length of the first parking locus C1 is 3 m, and the length of the second parking locus C2 is 4.5 m, the first score is calculated as 0.6 points and the second score as 0.9 points.
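The proportional relation 'score = k × length' and the numeric example with k = 0.2 can be checked with a one-line function (a sketch of this single-term case only; FIG. 11 below uses more terms):

```python
def length_score(length_m, k=0.2):
    # Score proportional to parking-locus length: a longer locus is harder.
    return k * length_m
```

With the values in the text, a 3 m locus scores 0.6 points and a 4.5 m locus scores 0.9 points, so the closer zone 1001 wins.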

FIG. 11 shows an exemplary operation in which a driver assistance device according to an embodiment of the present invention calculates a score for each of a plurality of candidate parking zones around the vehicle 100.

Figure 11 illustrates two candidate parking zones 1101 and 1102 detected by the driver assistance device.

Unlike FIG. 10, the first candidate parking zone 1101 is a right-angle parking zone perpendicular to the traveling direction of the vehicle 100, while the second candidate parking zone 1102 is a parallel parking zone parallel to the traveling direction of the vehicle 100. For ease of understanding, it is assumed that backward parking in a right-angle parking manner is selected for the first candidate parking zone 1101.

The control unit 170 may calculate, simultaneously or sequentially, the first parking locus C10 for backward parking into the first candidate parking zone 1101 and the second parking locus C20 for parking into the second candidate parking zone 1102.

For example, the first parking locus C10 may include a 1-1 parking locus C11 and a 1-2 parking locus C12 connected to each other. The 1-1 parking locus C11 is a curve connecting the current position P30 of the vehicle 100 to the forward-backward switching point P41, and the 1-2 parking locus C12 is a curve connecting the forward-backward switching point P41 to the final parking position P51 of the first candidate parking zone 1101. That is, the vehicle 100 moves forward from the current position P30 to the forward-backward switching point P41, and may then move backward from the forward-backward switching point P41 to the final parking position P51.

The second parking locus C20 may include a 2-1 parking locus C21 and a 2-2 parking locus C22 connected to each other. The 2-1 parking locus C21 is a curve connecting the current position P30 of the vehicle 100 to the left-right switching point P42, and the 2-2 parking locus C22 is a curve connecting the left-right switching point P42 to the final parking position P52 of the second candidate parking zone 1102. That is, the vehicle 100 may move backward with the steering wheel rotated clockwise from the current position P30 to the left-right switching point P42, and may then move backward with the steering wheel rotated counterclockwise from the left-right switching point P42 to the final parking position P52.

When the calculation of the first parking locus C10 and the second parking locus C20 is completed, the control unit 170 may calculate a first score indicating the parking difficulty of the first parking locus C10 and a second score indicating the parking difficulty of the second parking locus C20.

As shown in the figure, the lengths, the numbers of left-right switches, and the numbers of forward-backward switches of the first parking locus C10 and the second parking locus C20 may not all be the same. In this case, the control unit 170 may calculate the first score and the second score in consideration of the lengths, the numbers of left-right switches, and the numbers of forward-backward switches of the first parking locus C10 and the second parking locus C20 together.

In one embodiment, the control unit 170 may calculate a first subscore by multiplying the length of the parking locus by a first proportionality constant (k1 = 0.2), a second subscore by multiplying the number of left-right switches by a second proportionality constant (k2 = 0.4), and a third subscore by multiplying the number of forward-backward switches by a third proportionality constant (k3 = 0.5), and may then sum the first to third subscores to calculate the final score.

For example, assuming that the total length of the first parking locus C10 is 7 m, since the number of left-right switches of the first parking locus C10 is 1 and the number of forward-backward switches is 1, the control unit 170 may calculate the first subscore for the first candidate parking zone 1101 as 1.4, the second subscore as 0.4, and the third subscore as 0.5, yielding a final score of 2.3.

Similarly, assuming that the total length of the second parking locus C20 is 8 m, since the number of left-right switches of the second parking locus C20 is 1 and the number of forward-backward switches is 0, the control unit 170 may calculate the first subscore for the second candidate parking zone 1102 as 1.6, the second subscore as 0.4, and the third subscore as 0, yielding a final score of 2.0.
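The subscore sum described above can be written out directly, using the constants from the text (k1 = 0.2 per meter, k2 = 0.4 per left-right switch, k3 = 0.5 per forward-backward switch). This is a sketch of one embodiment only, since the text notes that other combining methods are possible.

```python
def parking_score(length_m, lr_switches, fb_switches,
                  k1=0.2, k2=0.4, k3=0.5):
    # Final score = sum of the three subscores; higher means harder to park.
    return k1 * length_m + k2 * lr_switches + k3 * fb_switches
```

Plugging in the figures from the example: locus C10 (7 m, 1, 1) scores 2.3, locus C20 (8 m, 1, 0) scores 2.0, so zone 1102 is selected despite its longer locus.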

As a result, since the score for the second candidate parking zone 1102 is lower than the score for the first candidate parking zone 1101, the control unit 170 may set the second candidate parking zone 1102 as the target parking zone.

On the other hand, the values of the above-described first to third proportional constants are illustrative and may be changed to other values according to the embodiment. It is also apparent to those skilled in the art that the score of the candidate parking section can be calculated using a method other than the summing method of the subscores.

On the other hand, as shown in FIG. 11, a method of calculating the final parking loci C10 and C20 by connecting two or more sub-loci having curvatures in different directions may be referred to as a two-circle method.

FIGS. 12A and 12B illustrate an exemplary operation in which the driver assistance device according to an embodiment of the present invention selects a target parking zone among a plurality of parking zones around the vehicle 100.

FIG. 12A illustrates a top view of a situation in which there are four parking zones in each of the left and right areas of the peripheral space, with respect to the vehicle 100, similar to FIG. 9A.

The control unit 170 can detect the parking zones P11-P18 existing in the surrounding space within the sensing range of the sensing unit 160 based on the sensing information provided from the sensing unit 160.

The left parking zones P11-P14 and the right parking zones P15-P18 may all be right-angle parking zones. As shown in the figure, the first to sixth other vehicles 1201 to 1206 may be parked, in order, in the first parking zone P11, the third parking zone P13, the fourth parking zone P14, the sixth parking zone P16, the seventh parking zone P17, and the eighth parking zone P18, respectively.

Then, the control unit 170 may filter the parking zones P11 to P18 according to predetermined conditions. That is, the control unit 170 can select only the parking areas in which the vehicle 100 can be parked, among the parking areas P11 to P18, through filtering.

For example, according to the first condition described above with reference to FIG. 7, the control unit 170 may primarily detect the second parking zone P12 and the fifth parking zone P15, in which none of the other vehicles 1201-1206 are parked among the parking zones P11-P18.

In addition, according to the second condition described above with reference to FIG. 7, the control unit 170 may exclude the second parking zone P12 from the primarily detected second parking zone P12 and fifth parking zone P15, and detect only the fifth parking zone P15 as the candidate parking zone. That is, since the second parking zone P12 is currently empty but entry of the vehicle 100 is blocked by the other vehicle 1207 (i.e., the allowable width W is smaller than the width O of the vehicle 100), the control unit 170 may exclude the second parking zone P12 from the candidate parking zones. Here, the allowable width W may be the minimum distance between the obstacles 1202 and 1207 adjacent to the second parking zone P12. The control unit 170 can calculate the allowable width W based on the sensing information and compare it with the exterior specification information of the vehicle 100 to determine whether or not the second condition is satisfied.
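The allowable-width comparison for the second condition can be sketched as below. The optional safety margin is an illustrative assumption not mentioned in the text; the gap measurements would in practice come from the sensing information.

```python
def entry_possible(vehicle_width_o, gaps_between_obstacles, margin_m=0.0):
    """Second-condition check: the allowable width W is the minimum gap
    between obstacles adjacent to the zone, and entry is possible only
    if it exceeds the vehicle's exterior width O (plus any margin)."""
    allowable_width_w = min(gaps_between_obstacles)
    return vehicle_width_o + margin_m < allowable_width_w
```

In the FIG. 12A scenario, zone P12 fails this check because the gap between vehicles 1202 and 1207 is narrower than the vehicle 100, while zone P15 passes.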

FIG. 12B illustrates a guidance screen displayed on the display unit 141 in the situation shown in FIG. 12A.

Referring to FIG. 12B, the control unit 170 may display, on the display unit 141, a guidance screen including information corresponding to the situation shown in FIG. 12A. For example, the guidance screen shown in FIG. 12B may be a screen for guiding information on the candidate parking zone P15 detected by the control unit 170 through step S740.

Specifically, an indicator M guiding the current position of the vehicle 100 may be displayed on the guidance screen displayed on the display unit 141. The guidance screen may also include indicators 1211-1216 guiding the positions of the parking zones P11, P13, P14, and P16-P18, which are occupied by the other vehicles 1201-1206 and therefore do not satisfy the first condition. Further, the guidance screen may include an indicator 1231 guiding the position of the parking zone P12, which does not satisfy the second condition. At this time, the indicators 1211-1216 and 1231 can be displayed in areas corresponding to the actual positions of the parking zones P11-P14 and P16-P18.

Meanwhile, the guide screen displayed on the display unit 141 may include an indicator 1221 for guiding the position of the candidate parking zone P15. At this time, the indicator 1221 may also be displayed in an area corresponding to the actual position of the candidate parking zone P15.

The control unit 170 can display the indicator 1221 guiding the candidate parking zone P15 so that it is distinguished from the indicators 1211-1216 and 1231 guiding the remaining parking zones P11-P14 and P16-P18. For example, the indicator 1221 may differ from the indicators 1211-1216 and 1231 in color, border thickness, border line type, brightness, texture, and the like.

Meanwhile, the control unit 170 may display, on the display unit 141, a guidance window 1240 for selecting whether to perform the score calculation process (S760) for the candidate parking zone P15. As shown, the guidance window 1240 may include a message guiding information (e.g., the number) about the detected candidate parking zone P15, together with buttons 1241 and 1242.

For example, when the driver touches the first button 1241, the control unit 170 can calculate the parking trajectory to the candidate parking zone P15. As in FIG. 9B, when only one candidate parking zone P15 is detected, the control unit 170 may omit the calculation of the score representing the parking difficulty level for the candidate parking zone P15 and automatically set the candidate parking zone P15 as the target parking zone. On the other hand, when the driver touches the second button 1242, the control unit 170 can release the parking assist mode.
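The single-candidate shortcut described above can be expressed as a small selection routine. This is a hedged sketch; the function name and the fallback to a lowest-score comparison (used later for multiple candidates) are illustrative assumptions.

```python
# Sketch of the target-zone selection behavior: with exactly one
# candidate, the score step (S760) is skipped and the candidate becomes
# the target directly. Names here are assumptions for illustration.

def select_target_zone(candidates, score_fn):
    if not candidates:
        return None            # no zone available for parking assist
    if len(candidates) == 1:
        return candidates[0]   # score calculation omitted
    return min(candidates, key=score_fn)  # lowest score = easiest to park

# FIG. 12A case: only P15 was detected, so it is set automatically.
print(select_target_zone(["P15"], score_fn=lambda z: 0))  # P15
```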

FIGS. 13A to 13C illustrate an exemplary operation in which the driver assistance apparatus according to an embodiment of the present invention selects a target parking zone from among a plurality of parking zones around the vehicle 100.

FIG. 13A illustrates a top view of a situation in which, with respect to the vehicle 100, there are four parking zones P21 to P24 in the left area of the surrounding space and three parking zones P25 to P27 in the right area.

The control unit 170 can detect the parking zones P21 to P27 existing in the surrounding space within the sensing range of the sensing unit 160, based on the sensing information provided from the sensing unit 160.

The parking zones P21 to P24 on the left side are all perpendicular parking zones, and the parking zones P25 to P27 on the right side are all zones for parallel parking.

As shown in the figure, the first to fifth vehicles 1301 to 1305 may be parked, respectively, in the first parking zone P21, the third parking zone P23, the fourth parking zone P24, the fifth parking zone P25, and the seventh parking zone P27.

Next, the control unit 170 may filter the parking zones P21 to P27 according to predetermined conditions. That is, the control unit 170 can select only the parking area in which the vehicle 100 can be parked among the parking areas P21-P27 through filtering.

For example, according to the first condition described above with reference to FIG. 7, the control unit 170 can detect, from among the parking zones P21 to P27, the second parking zone P22 and the sixth parking zone P26, in which none of the other vehicles 1301 to 1305 is parked.

FIG. 13B illustrates a guidance screen displayed on the display unit 141 in the situation shown in FIG. 13A.

Referring to FIG. 13B, the control unit 170 may display, on the display unit 141, a guidance screen including information corresponding to the situation shown in FIG. 13A. For example, the guidance screen shown in FIG. 13B may be a screen for guiding information on the candidate parking zones P22 and P26 detected by the control unit 170 through step S740.

Specifically, an indicator M guiding the current position of the vehicle 100 may be displayed on the guidance screen displayed on the display unit 141. The guidance screen may also include indicators 1311 to 1315 guiding the positions of the parking zones P21, P23-P25, and P27 occupied by the other vehicles 1301-1305. At this time, the indicators 1311 to 1315 may be displayed in areas corresponding to the actual positions of the parking zones P21, P23-P25, and P27.

The guide screen displayed on the display unit 141 may include indicators 1321 and 1322 for guiding the positions of the candidate parking areas P22 and P26. At this time, the indicators 1321 and 1322 may also be displayed in an area corresponding to the actual position of the candidate parking areas P22 and P26.

The control unit 170 can display the indicators 1321 and 1322 guiding the candidate parking zones P22 and P26 so that they are distinguished from the indicators 1311 to 1315 guiding the remaining parking zones P21, P23-P25, and P27. For example, the indicators 1321 and 1322 may differ from the indicators 1311 to 1315 in color, border thickness, border line type, brightness, and texture.

Meanwhile, when both forward parking and backward parking into the first candidate parking zone P22 are possible, the control unit 170 can display, on the display unit 141, a guidance window 1330 for selecting the desired parking method for the first candidate parking zone P22. For example, when the driver touches the first button 1331, the control unit 170 can calculate the driving trajectory for forward parking into the first candidate parking zone P22. Conversely, when the driver touches the second button 1332, the control unit 170 can calculate the driving trajectory for backward parking into the first candidate parking zone P22.

FIG. 13C illustrates recommendation information for the target parking zone displayed on the display unit 141.

When the first button 1331 shown in FIG. 13B is touched, the control unit 170 can calculate the score of the forward parking trajectory for the first candidate parking zone P22 and compare it with the score calculated for the second candidate parking zone P26. If the score of the first candidate parking zone P22 is lower than the score of the second candidate parking zone P26, the control unit 170 may display, on the display unit 141, a guidance window 1340 including a message recommending that the first candidate parking zone P22 be set as the target parking zone (e.g., "Parking zone ① is easier to park in than parking zone ②. Park in parking zone ①?").

If the driver touches the first button 1341 of the guidance window 1340, the control unit 170 may set the first candidate parking zone P22 as the target parking zone. On the other hand, when the driver touches the second button 1342 of the guidance window 1340, the control unit 170 may set the second candidate parking zone P26 as the target parking zone. Of course, the control unit 170 may automatically set the first candidate parking zone P22, which has the lowest score (i.e., the lowest parking difficulty level), as the target parking zone without displaying the guidance window 1340.

Meanwhile, although not shown, when the second button 1332 shown in FIG. 13B is touched, the control unit 170 can calculate the score of the backward parking trajectory for the first candidate parking zone P22 and compare it with the score for the second candidate parking zone P26. If the score of the second candidate parking zone P26 is lower than the score of the first candidate parking zone P22, the control unit 170 may display, on the display unit 141, a message recommending that the second candidate parking zone P26 be set as the target parking zone.
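A difficulty score of the kind compared above can be sketched from the factors the description and claims name: the length of the parking trajectory, the number of left/right direction changes, and the number of forward/backward switches. The weights and the example values below are illustrative assumptions, not the patent's.

```python
# Sketch of a parking-difficulty score consistent with the factors named
# in the patent (trajectory length, left/right direction changes,
# forward/backward switches). Weights are illustrative assumptions.

def parking_score(trajectory_length_m, direction_changes, gear_switches,
                  w_len=1.0, w_dir=2.0, w_gear=3.0):
    """Lower score = lower parking difficulty."""
    return (w_len * trajectory_length_m
            + w_dir * direction_changes
            + w_gear * gear_switches)

# Plausible FIG. 13 comparison: a short forward entry into P22 versus a
# parallel maneuver into P26 requiring more reversals.
score_p22 = parking_score(12.0, direction_changes=1, gear_switches=0)
score_p26 = parking_score(15.0, direction_changes=2, gear_switches=2)
print(score_p22 < score_p26)  # True, so P22 would be recommended
```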

FIGS. 14A and 14B show an exemplary operation in which the driver assistance apparatus according to an embodiment of the present invention selects a target parking zone from among a plurality of parking zones around the vehicle 100.

FIG. 14A illustrates a top view of a situation in which, with respect to the vehicle 100, there are four parking zones P31 to P34 in the left area of the surrounding space and four parking zones P35 to P38 in the right area.

The control unit 170 can detect the parking zones P31 to P38 existing in the surrounding space within the sensing range of the sensing unit 160, based on the sensing information provided from the sensing unit 160.

The parking zones P31-P34 on the left side may all be perpendicular parking zones, and the parking zones P35-P38 on the right side may all be zones for diagonal parking. A diagonal parking zone may be a parking zone inclined at a predetermined angle (for example, about 30 to 60 degrees) to the traveling direction of the vehicle 100, being neither orthogonal nor parallel to it.

As shown in the figure, the first to sixth vehicles 1401 to 1406 may be parked, respectively, in the first parking zone P31, the third parking zone P33, the fourth parking zone P34, the fifth parking zone P35, the seventh parking zone P37, and the eighth parking zone P38.

Then, the control unit 170 may filter the parking zones P31 to P38 according to predetermined conditions. That is, the control unit 170 can select only the parking areas in which the vehicle 100 can be parked, among the parking areas P31 to P38, through filtering.

For example, according to the first condition described above with reference to FIG. 7, the control unit 170 can detect, from among the parking zones P31 to P38, the second parking zone P32 and the sixth parking zone P36, in which none of the other vehicles 1401 to 1406 is parked.

Meanwhile, the control unit 170 can detect the mark M1 drawn on the road from the forward image provided by the front camera 161a using an image processing technique (e.g., TSR (Traffic Sign Recognition)), and can identify the information of the detected mark M1 by comparing it with reference images previously stored in the memory 130. For example, as shown in the figure, when the mark M1 guides the direction of the exit, the control unit 170 can use the mark recognition information in setting the target parking zone, which will be described in more detail with reference to FIG. 14B.

Fig. 14B illustrates a guidance screen displayed on the display unit 141 in the situation shown in Fig. 14A.

Referring to FIG. 14B, the control unit 170 may display, on the display unit 141, a guidance screen including information corresponding to the situation shown in FIG. 14A. For example, the guidance screen shown in FIG. 14B may be a screen for guiding information on the candidate parking zones P32 and P36 detected by the control unit 170 through step S740.

Specifically, an indicator M guiding the current position of the vehicle 100 may be displayed on the guidance screen displayed on the display unit 141. The guidance screen may also include indicators 1411-1416 guiding the positions of the parking zones P31, P33-P35, P37, and P38 occupied by the other vehicles 1401-1406. At this time, the indicators 1411 to 1416 can be displayed in areas corresponding to the actual positions of the parking zones P31, P33-P35, P37, and P38.

The guidance screen displayed on the display unit 141 may include indicators 1421 and 1422 for guiding the positions of the candidate parking areas P32 and P36. At this time, the indicators 1421 and 1422 may also be displayed in an area corresponding to the actual position of the candidate parking areas P32 and P36.

The control unit 170 can display the indicators 1421 and 1422 guiding the candidate parking zones P32 and P36 so that they are distinguished from the indicators 1411 to 1416 guiding the remaining parking zones P31, P33-P35, P37, and P38. For example, the indicators 1421 and 1422 may differ from the indicators 1411 to 1416 in color, border thickness, border line type, brightness, texture, and the like.

Also, the control unit 170 may display an indicator 1430 guiding the exit direction in an area that does not overlap the other indicators 1411-1416, 1421, and 1422. The indicator 1430 may be a graphic object corresponding to the mark M1.

Meanwhile, the scores calculated for the plurality of candidate parking zones P32 and P36 may be equal to each other. In this case, based on the recognition result of the mark M1, the control unit 170 can display, on the display unit 141, a guidance window 1440 including a message recommending that one of the plurality of candidate parking zones P32 and P36 be set as the target parking zone. For example, even if the score of the first candidate parking zone P32 is the same as the score of the second candidate parking zone P36, when the exit is easier to reach from the first candidate parking zone P32 because it is closer to the exit than the second candidate parking zone P36, the control unit 170 may display, on the display unit 141, a message recommending that the first candidate parking zone P32 be set as the target parking zone. Here, the exit being easier to reach means that the amount of movement of the vehicle 100 required to reach the exit (for example, the length of the trajectory or the number of left-and-right direction changes) is relatively small.
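The tie-breaking rule above amounts to a two-level comparison: difficulty score first, then ease of reaching the exit. The sketch below illustrates this; the field names and numeric values are assumptions for the FIG. 14A situation, not data from the patent.

```python
# Sketch of the tie-breaking rule: equal difficulty scores are broken by
# how easily the exit can be reached (smaller required movement wins).
# Field names and values are illustrative assumptions.

def choose_target(candidates):
    # primary key: difficulty score; secondary key: movement to the exit
    return min(candidates, key=lambda z: (z["score"], z["exit_cost"]))

candidates = [
    {"id": "P32", "score": 10.0, "exit_cost": 5.0},   # closer to the exit
    {"id": "P36", "score": 10.0, "exit_cost": 12.0},
]
print(choose_target(candidates)["id"])  # P32
```

Python's tuple comparison makes the secondary key matter only when the primary keys are equal, which matches the recommendation behavior described for FIG. 14B.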

If the driver touches the first button 1441, the control unit 170 may start backward parking into the first candidate parking zone P32. On the other hand, when the driver touches the second button 1442, the control unit 170 can calculate the driving trajectory for diagonal forward parking into the second candidate parking zone P36.

FIGS. 15A to 15D illustrate an exemplary operation in which the driver assistance apparatus according to an embodiment of the present invention selects a target parking zone from among a plurality of parking zones around the vehicle 100.

FIG. 15A illustrates a top view of a situation in which, similar to FIG. 9A, there are four parking zones in each of the left and right areas of the surrounding space with respect to the vehicle 100.

The left parking zones P51-P54 and the right parking zones P55-P58 may all be perpendicular parking zones. As shown in the figure, the first to fifth vehicles 1501 to 1505 may be parked, respectively, in the first parking zone P51, the fourth parking zone P54, the fifth parking zone P55, the seventh parking zone P57, and the eighth parking zone P58.

The control unit 170 can detect the parking zones P51 to P58 existing in the surrounding space within the sensing range of the sensing unit 160, based on the sensing information provided from the sensing unit 160.

Then, the control unit 170 may filter the parking zones P51 to P58 according to predetermined conditions. That is, the control unit 170 can select only the parking areas in which the vehicle 100 can be parked among the parking areas P51-P58 through filtering.

For example, according to the first condition described above with reference to FIG. 7, the control unit 170 can detect as candidate parking zones, from among the parking zones P51 to P58, the second parking zone P52, the third parking zone P53, and the sixth parking zone P56, in which none of the other vehicles 1501 to 1505 is parked.

Meanwhile, the control unit 170 can detect the mark M2 drawn on the road from the external images provided by the front camera 161a or the left camera 161b using an image processing technique (e.g., TSR (Traffic Sign Recognition)), and can identify the information of the detected mark M2. For example, as shown in the figure, when the mark M2 indicates a parking zone for the disabled, the control unit 170 can use the mark recognition information in setting the target parking zone, which will be described in more detail with reference to FIGS. 15B to 15D.

Hereinafter, for convenience of explanation, the second parking zone P52 will be referred to as the first candidate parking zone, the third parking zone P53 as the second candidate parking zone, and the sixth parking zone P56 as the third candidate parking zone.

FIG. 15B illustrates a guidance screen displayed on the display unit 141 in the situation shown in FIG. 15A.

Referring to FIG. 15B, the control unit 170 may display, on the display unit 141, a guidance screen including information corresponding to the situation shown in FIG. 15A. For example, the guidance screen shown in FIG. 15B may be a screen for guiding information on the candidate parking zones P52, P53, and P56 detected by the control unit 170 through step S740.

Specifically, an indicator M guiding the current position of the vehicle 100 may be displayed on the guidance screen displayed on the display unit 141. The guidance screen may also include indicators 1511-1515 guiding the positions of the parking zones P51, P54, P55, P57, and P58 occupied by the other vehicles 1501 to 1505. For example, the fourth indicator 1514 may be a graphic object indicating that parking in the seventh parking zone P57 is not possible. At this time, the indicators 1511-1515 may be displayed in areas corresponding to the actual positions of the parking zones P51, P54, P55, P57, and P58.

The guidance screen displayed on the display unit 141 may include indicators 1521-1523 guiding the positions of the first candidate parking zone P52, the second candidate parking zone P53, and the third candidate parking zone P56. At this time, the indicators 1521-1523 may be displayed in areas corresponding to the actual positions of the candidate parking zones P52, P53, and P56. The control unit 170 may also display an indicator 1530 corresponding to the mark M2 within the indicator 1522 guiding the second candidate parking zone P53.

The control unit 170 can display the indicators 1521-1523 guiding the candidate parking zones P52, P53, and P56 so that they are distinguished from the indicators 1511-1515 guiding the remaining parking zones P51, P54, P55, P57, and P58. For example, the indicators 1521-1523 may differ from the indicators 1511-1515 in color, border thickness, border line type, brightness, texture, and the like.

Meanwhile, the control unit 170 may display, on the display unit 141, a guidance window 1540 for selecting whether to perform the score calculation process (S760) for the candidate parking zones P52, P53, and P56. As shown, the guidance window 1540 may include a message guiding information (e.g., the number) about the detected candidate parking zones P52, P53, and P56, together with buttons 1541 and 1542.

For example, when the driver touches the first button 1541, the controller 170 may calculate the score representing the parking difficulty of each of the plurality of candidate parking areas P52, P53, and P56 through step S760. On the other hand, when the driver touches the second button 1542, the controller 170 can release the parking assist mode.

FIG. 15C illustrates another guidance screen displayed on the display unit 141. For example, the guidance screen shown in FIG. 15C may be a screen displayed when the driver touches the first button 1541 shown in FIG. 15B.

Referring to FIG. 15C, in response to the touch of the first button 1541, the control unit 170 may calculate a score representing the parking difficulty of each of the plurality of candidate parking zones P52, P53, and P56.

If the score of the second candidate parking zone P53 is lower than the scores of the first and third candidate parking zones P52 and P56 (i.e., the parking difficulty of the second candidate parking zone P53 is the lowest), the control unit 170 may display, on the display unit 141, a guidance window 1550 including a message recommending that the second candidate parking zone P53, which is a zone for the disabled, be set as the target parking zone (e.g., "Parking zone ② (for the disabled) is the easiest to park in. Do you want to park in parking zone ②?").

If the driver touches the first button 1551 of the guidance window 1550, the control unit 170 may initiate a process for setting the second candidate parking zone P53 as the target parking zone. On the other hand, when the driver touches the second button 1552 of the guidance window 1550, the control unit 170 may set either the first candidate parking zone P52 or the third candidate parking zone P56 as the target parking zone.

FIG. 15D illustrates the operation performed by the vehicle 100 when the user selects the second candidate parking zone P53 in FIG. 15C.

For example, when the first button 1551 is touched, or when the indicator 1522 guiding the second candidate parking zone P53 is touched on the guidance screen shown in FIG. 15C, the control unit 170 can display new objects 1560 and 1570 on the display unit 141.

Specifically, the control unit 170 may display, on the display unit 141, a message 1560 requesting disabled-person authentication and an input window 1570 for receiving the user input for that authentication.

For example, when the password for the disabled-person authentication is numeric, an input window 1570 for numeric input can be displayed in one area of the display unit 141.

When the password is input through the input window 1570, the control unit 170 can access the disabled-person authentication server through the communication unit 110, request authentication of the input password, and receive the authentication result from the server. If the authentication succeeds, the control unit 170 may set the second candidate parking zone P53 as the target parking zone and provide the driving unit 150 with a control signal for following the parking trajectory to the target parking zone. On the other hand, if the authentication fails, the control unit 170 may set either the first candidate parking zone P52 or the third candidate parking zone P56 as the target parking zone.
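The authentication-gated selection just described can be sketched as follows. This is a hedged illustration only: the `authenticate` callback stands in for the exchange with the remote server through the communication unit 110, and every name and password value here is an assumption, not part of the patent.

```python
# Sketch of the FIG. 15D flow: the reserved zone becomes the target only
# if disabled-person authentication succeeds; otherwise an ordinary
# candidate is used. All names and values are illustrative assumptions.

def resolve_target(reserved_zone, ordinary_candidates, password, authenticate):
    if authenticate(password):
        return reserved_zone
    # authentication failed: fall back to an ordinary candidate zone
    return ordinary_candidates[0] if ordinary_candidates else None

stub_server = lambda pw: pw == "1234"  # stand-in for the remote check
print(resolve_target("P53", ["P52", "P56"], "1234", stub_server))  # P53
print(resolve_target("P53", ["P52", "P56"], "0000", stub_server))  # P52
```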

The embodiments of the present invention described above may be implemented not only through the apparatus and method but also through a program realizing the functions corresponding to the configurations of the embodiments, or through a recording medium on which such a program is recorded; such implementations can easily be carried out by those skilled in the art from the description of the embodiments above.

While the present invention has been described with reference to exemplary embodiments and the accompanying drawings, the present invention is not limited thereto; all or some of the embodiments may be selectively combined, and various modifications may be made without departing from the spirit and scope of the invention.

100: vehicle

Claims (15)

A device for supporting a driver in parking,
A sensing unit for generating sensing information corresponding to a peripheral space of the vehicle; And
And a control unit connected to the sensing unit,
Wherein,
In a parking assisting mode, a candidate parking zone existing in the peripheral space is detected based on the sensing information provided from the sensing unit,
Calculating a score representing a parking difficulty level for each of the candidate parking zones,
And sets one of the candidate parking zones as a target parking zone based on the score.
The method according to claim 1,
The sensing unit includes:
At least one of a camera, a radar, a lidar, and an ultrasonic sensor,
And the sensing information includes:
At least one of an external image of the vehicle provided from the camera and obstacle information detected by at least one of the radar, the lidar, and the ultrasonic sensor.
The method according to claim 1,
Wherein,
Filtering a parking zone satisfying at least one predetermined condition among all parking zones in the peripheral space,
And detects the candidate parking zone from among the filtered parking zones.
The method of claim 3,
Wherein the at least one condition comprises:
(i) a first condition indicating that no other parked vehicle is present, and (ii) a second condition indicating that entry of the vehicle is not blocked by an obstacle.
The method according to claim 1,
Wherein,
Calculating a parking locus for the candidate parking area based on at least one of size, shape and position of the candidate parking area and size, shape and position of an obstacle present in the surrounding space,
And calculates the score based on the parking locus.
6. The method of claim 5,
Wherein,
Determining a parking mode of the vehicle for the candidate parking zone based on at least one of size, shape and position of the candidate parking zone and size, shape, and position of an obstacle present in the surrounding space,
And calculates a parking locus for the candidate parking area based on the parking mode.
6. The method of claim 5,
Wherein,
And calculates the score based on the length of the parking locus, the number of left-and-right direction changes, and the number of forward-and-backward switches.
The method according to claim 1,
Wherein,
And sets, as the target parking zone, the candidate parking zone for which the lowest score among the candidate parking zones is calculated.
9. The method of claim 8,
Wherein,
Sets a candidate parking zone corresponding to a user input as the target parking zone when there are two or more candidate parking zones for which the lowest score is calculated.
The method according to claim 1,
Further comprising a display unit for displaying an image,
Wherein,
And displays an image guiding the candidate parking zone and the target parking zone on the display unit.
The method according to claim 1,
Wherein,
And enters the parking support mode when a predetermined event occurs.
12. The method of claim 11,
The predetermined event may include:
Wherein the predetermined event includes at least one of: reception of a user input instructing selection of the parking support mode, entry of the vehicle into a parking lot, and arrival of the vehicle at a predetermined destination.
A method for assisting a driver in parking a vehicle,
Entering a parking support mode;
Receiving sensing information corresponding to a peripheral space of the vehicle;
Detecting a candidate parking zone in the peripheral space based on the sensing information;
Calculating a score representing a degree of parking difficulty for each of the candidate parking zones; And
And setting one of the candidate parking zones as a target parking zone based on the score.
14. The method of claim 13,
Wherein the step of calculating the score comprises:
Calculating a parking locus for the candidate parking zone based on at least one of size, shape and position of the candidate parking zone and size, shape and position of an obstacle present in the surrounding space; And
And calculating the score based on the length of the parking locus, the number of left-and-right direction changes, and the number of forward-and-backward switches.
14. The method of claim 13,
Wherein setting the target parking zone comprises:
And setting, as the target parking zone, the candidate parking zone for which the lowest score among the candidate parking zones is calculated.
KR1020150158377A 2015-11-11 2015-11-11 Appratus and method for assisting a driver based on difficulty level of parking KR101832224B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150158377A KR101832224B1 (en) 2015-11-11 2015-11-11 Appratus and method for assisting a driver based on difficulty level of parking


Publications (2)

Publication Number Publication Date
KR20170055334A true KR20170055334A (en) 2017-05-19
KR101832224B1 KR101832224B1 (en) 2018-02-26

Family

ID=59049702

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150158377A KR101832224B1 (en) 2015-11-11 2015-11-11 Appratus and method for assisting a driver based on difficulty level of parking

Country Status (1)

Country Link
KR (1) KR101832224B1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108364480A (en) * 2018-04-19 2018-08-03 智慧互通科技有限公司 System is managed based on the united Roadside Parking of more ball machines
WO2019210599A1 (en) * 2018-04-29 2019-11-07 惠州市德赛西威汽车电子股份有限公司 Parking space identification method and parking method
US10565875B2 (en) 2017-10-27 2020-02-18 Mando Corporation Apparatus and method for parking assist
US10586448B2 (en) 2018-06-05 2020-03-10 Ford Global Technologies, Llc Hazard mitigation for access to passenger vehicles
US20220326039A1 (en) * 2021-04-08 2022-10-13 Toyota Jidosha Kabushiki Kaisha Method, information processing device, and non-transitory storage medium

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4465773B2 (en) * 2000-01-19 2010-05-19 株式会社エクォス・リサーチ Computer-readable recording medium on which parking assistance device and parking assistance program are recorded
KR101104609B1 (en) * 2007-10-26 2012-01-12 주식회사 만도 Method and System for Recognizing Target Parking Location
JP2010208358A (en) * 2009-03-06 2010-09-24 Toyota Industries Corp Parking assist apparatus
JP5321267B2 (en) * 2009-06-16 2013-10-23 日産自動車株式会社 Vehicular image display device and overhead image display method
JP5440867B2 (en) * 2010-06-18 2014-03-12 アイシン精機株式会社 Parking assistance device
JP5640511B2 (en) * 2010-07-12 2014-12-17 マツダ株式会社 Driving skill training device for vehicles
JP2012126193A (en) * 2010-12-14 2012-07-05 Denso Corp Automatic parking system for parking lot
KR101560984B1 (en) * 2014-08-25 2015-10-27 성균관대학교산학협력단 System for automatic parking of vehicle


Also Published As

Publication number Publication date
KR101832224B1 (en) 2018-02-26


Legal Events

Date Code Title Description
A201 Request for examination
E701 Decision to grant or registration of patent right
GRNT Written decision to grant