KR101767074B1 - Vehicle and controlling method for the same - Google Patents

Vehicle and controlling method for the same

Info

Publication number
KR101767074B1
KR101767074B1 (granted patent) · KR1020150179911A (application)
Authority
KR
South Korea
Prior art keywords
parking
vehicle
input
touch gesture
image
Prior art date
Application number
KR1020150179911A
Other languages
Korean (ko)
Other versions
KR20170071796A (en)
Inventor
손지호
윤대중
Original Assignee
현대자동차주식회사 (Hyundai Motor Company)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 현대자동차주식회사 filed Critical 현대자동차주식회사
Priority to KR1020150179911A priority Critical patent/KR101767074B1/en
Publication of KR20170071796A publication Critical patent/KR20170071796A/en
Application granted granted Critical
Publication of KR101767074B1 publication Critical patent/KR101767074B1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/06 Automatic manoeuvring for parking
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146 Display means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Y INDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
    • B60Y2300/00 Purposes or special features of road vehicle drive control systems
    • B60Y2300/06 Automatic manoeuvring for parking

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

The disclosed embodiment provides a vehicle, and a method of controlling the same, that performs parking automatically in response to a predetermined touch gesture. According to one embodiment, a vehicle includes an image sensor that acquires images of the surroundings of the vehicle; a display unit that displays a top view image generated by synthesizing the images acquired by the image sensor around the subject vehicle and that, when a parking command is input, displays the position of the subject vehicle and the available parking spaces based on the images acquired by the image sensor and receives a predetermined touch gesture for parking the subject vehicle in any one of the available parking spaces; and a controller that, when the touch gesture is input, parks the subject vehicle according to the input touch gesture.

Description

VEHICLE AND CONTROLLING METHOD FOR THE SAME

The disclosed embodiment relates to a vehicle.

Generally, the driver of a vehicle moves the vehicle while visually checking rear or side obstacles using the side mirrors or the rear-view mirror mounted on the vehicle.

However, blind spots at the rear corners of the vehicle may prevent the driver from noticing an obstacle, and even when the driver does notice it, he or she may misjudge the length or width of the vehicle or the distance between the vehicle and the obstacle, so the vehicle may come into contact with the obstacle.

To address these problems, the Parking Assist System (PAS) was introduced, which mounts sensors on the front and rear of the vehicle to help the driver recognize the distance to an obstacle by means of an alarm sound. More recently, the Smart Parking Assist System (SPAS) was introduced, which recognizes the space to be parked in, automatically generates a parking path, and automatically controls the steering so that the vehicle is parked without the driver operating the steering wheel. Because it controls the steering of the vehicle, SPAS is also called a parking steering assist system.

The disclosed embodiment provides a vehicle and a method of controlling the same that automatically perform parking through a predetermined touch gesture.

According to one embodiment, a vehicle includes an image sensor that acquires images of the surroundings of the vehicle; a display unit that displays a top view image generated by synthesizing the images acquired by the image sensor around the subject vehicle and that, when a parking command is input, displays the position of the subject vehicle and the available parking spaces based on the images acquired by the image sensor and receives a predetermined touch gesture for parking the subject vehicle in any one of the available parking spaces; and a controller that, when the touch gesture is input, parks the subject vehicle according to the input touch gesture.

Further, the touch gesture may include a touch gesture indicating a parking progress path of the subject vehicle.

Also, the display unit may display the parking progress path when the touch gesture indicating the parking progress path is input, and the control unit may control the subject vehicle to travel along the input parking progress path.

The display unit may display at least one expected position of the vehicle on the parking progress path when the touch gesture indicating the parking progress path is input.

Also, the touch gesture may include at least one touch input on an available parking space.

In addition, the touch input may indicate a different parking method depending on the number of touches.

The display unit may display an object provided to receive an input for a parking command, and when a parking command is input by touching the object, may display the position of the vehicle and the available parking spaces based on the images acquired by the image sensor.

In addition, when the parking command is input, the display unit may display the position of the vehicle and the available parking spaces as a top view image based on the images acquired by the image sensor and previously stored map data.

According to one embodiment, a vehicle includes an image sensor that acquires images of the surroundings of the vehicle; and a display unit including a first area that displays a top view image generated by synthesizing the images acquired by the image sensor around the subject vehicle, a second area that displays an object adapted to receive an input for a parking command, and a third area that displays the position of the vehicle and the available parking spaces based on the images acquired by the image sensor and receives a predetermined touch gesture for parking the subject vehicle in any one of the available parking spaces.

According to another aspect, a method of controlling a vehicle includes: when a parking command is input, displaying the position of the vehicle and the available parking spaces based on the images acquired by the image sensor; receiving a predetermined touch gesture for parking the subject vehicle in any one of the available parking spaces; and parking the subject vehicle according to the input touch gesture.

Receiving the predetermined touch gesture for parking the subject vehicle in any one of the available parking spaces may include receiving a touch gesture indicating a parking progress path for parking the subject vehicle in any one of the available parking spaces, and displaying the parking progress path according to the input touch gesture.

The method may further include displaying at least one expected position of the vehicle on the displayed parking progress path.

In addition, parking the subject vehicle according to the input touch gesture may include controlling the subject vehicle to travel along the input parking progress path.

Receiving the predetermined touch gesture for parking the subject vehicle in any one of the available parking spaces may include receiving at least one touch on any one of the available parking spaces.

In addition, parking the subject vehicle according to the input touch gesture may include parking the subject vehicle using a different parking method according to the number of touches on any one of the available parking spaces.

In addition, when the parking command is input, displaying the position of the vehicle and the available parking spaces based on the images acquired by the image sensor may include displaying an object provided to receive an input for the parking command, and displaying the position of the vehicle and the available parking spaces based on the images acquired by the image sensor when a parking command is input by touching the object.

In addition, when the parking command is input, displaying the position of the vehicle and the available parking spaces based on the images acquired by the image sensor may include displaying the position of the vehicle and the available parking spaces as a top view image based on the images acquired by the image sensor and previously stored map data.

According to the disclosed embodiment, the automatic parking function can be used more easily through an intuitive user interface.

FIG. 1 is an external view of a vehicle according to an embodiment.
FIG. 2 is a view showing the internal configuration of a vehicle according to an embodiment.
FIG. 3 is a control block diagram of a vehicle according to the disclosed embodiment.
FIG. 4 is a diagram illustrating a user interface for automatic parking displayed on a display unit of a vehicle according to an embodiment.
FIG. 5 is a diagram illustrating the user interface for automatic parking displayed on the display unit when a parking command is input.
FIGS. 6A, 6B, and 6C are views illustrating the user interface for automatic parking displayed on the display unit when a touch gesture for rear parking is input.
FIGS. 7A, 7B, and 7C are diagrams illustrating the user interface for automatic parking displayed on the display unit when a touch gesture for front parking is input.
FIG. 8 is a flowchart showing a control method of a vehicle according to an embodiment.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings.

Referring to FIG. 1, a vehicle according to an embodiment of the present invention includes a main body 1 forming the outer appearance of the vehicle, wheels 51 and 52 for moving the vehicle, a driving device 80 for rotating the wheels 51 and 52, doors 71 for shielding the inside of the vehicle from the outside, a windshield 30 for providing the driver inside the vehicle with a front view, and side mirrors 81 and 82 for providing the driver with a rear view of the vehicle.

The wheels 51 and 52 include a front wheel 51 provided at the front of the vehicle and a rear wheel 52 provided at the rear of the vehicle.

The driving device 80 provides rotational force to the front wheels 51 or the rear wheels 52 so that the main body 1 moves forward or backward. The driving device 80 may include an engine that generates rotational force by burning fossil fuel, or a motor that generates rotational force by receiving power from a battery (not shown).

The doors 71 are rotatably provided on the left and right sides of the main body 1 so that the driver can enter the vehicle when a door is open; when closed, they shield the inside of the vehicle from the outside.

A front glass 30, also called windshield glass, is provided on the front upper side of the main body 1. The driver inside the vehicle can see ahead of the vehicle through the windshield 30.

The side mirrors 81 and 82 include a left side mirror 81 provided on the left side of the main body 1 and a right side mirror 82 provided on the right side. The driver inside the vehicle can visually check the situation beside and behind the vehicle through the side mirrors 81 and 82.

In addition, the vehicle may include various sensors for detecting obstacles around the vehicle and helping the driver to recognize the situation around the vehicle. For example, the vehicle may include a plurality of cameras capable of acquiring front, rear, left, and right images of the vehicle.

As shown in FIG. 2, the interior of the vehicle may include a gear box 120, a center fascia 130, a steering wheel 140, and an instrument panel 150 arranged on and around the dashboard.

The gear box 120 may be provided with a gear lever 121 for shifting the vehicle. As shown in the figure, the gear box may also include an input device 110 comprising a dial control unit 111 and various buttons, which allow the user to control the functions of the multimedia devices, including the navigation device 10 and the audio device 133.

An air conditioner, the audio device 133, and the navigation device 10 may be installed in the center fascia 130.

The air conditioner keeps the interior of the vehicle comfortable by controlling the temperature, humidity, air cleanliness, and air flow inside the vehicle. The air conditioner may include at least one discharge port 132 installed in the center fascia 130 to discharge air. The center fascia 130 may be provided with buttons or dials for controlling the air conditioner and other devices, and a user such as the driver can control the air conditioner of the vehicle using them. Of course, the air conditioner may also be controlled through the buttons of the input device 110 installed in the gear box 120 or through the dial control unit 111.

According to the embodiment, the navigation device 10 may be installed in the center fascia 130 and may be embedded in it. According to one embodiment, an input unit for controlling the navigation device 10 may be provided in the center fascia. Depending on the embodiment, the input unit of the navigation device 10 may be located elsewhere; for example, it may be formed around the display unit 300 of the navigation device 10, or it may be installed in the gear box 120.

The steering wheel 140 is a device for adjusting the running direction of the vehicle, and includes a rim 141 held by the driver and spokes 142 connected to the steering mechanism of the vehicle, which connect the rim 141 to the hub of the rotary shaft for steering. According to the embodiment, the spokes 142 may be provided with operating devices 142a and 142b for controlling various devices in the vehicle, for example the audio device. The steering wheel 140 may also perform a function of calling the driver's attention so that the driver drives safely; for example, it may warn a drowsy driver with tactile vibration.

In addition, various gauges indicating the running speed of the vehicle, the engine RPM, or the remaining fuel may be installed on the instrument panel 150. The instrument panel 150 may include a display unit 151 that shows the vehicle status, information related to driving, information related to the operation of the multimedia devices, and the like.

The display unit 300 of the vehicle, for example the display unit 300 of the navigation system, may display images of the outside of the vehicle obtained by an image sensor provided on the outside of the vehicle. The image sensor is an AVM (Around View Monitoring) image sensor that includes a plurality of cameras capable of acquiring front, rear, left, and right images of the vehicle. The display unit can display the image synthesized from these camera images (hereinafter referred to as the AVM image) as a top view image. The driver can view the images of the outside of the vehicle displayed on the display unit 300 and thus recognize surroundings of the vehicle that are difficult to confirm from the driver's seat. On the other hand, parking assist systems that help park the vehicle may be somewhat difficult to use in everyday situations without prior learning and practice by the user. The disclosed embodiment provides an intuitive user interface for automatic parking, designed using the above-described AVM image, so that a user can easily use the automatic parking function without any learning, together with a control method thereof. This is described in detail below.

FIG. 4 is a diagram illustrating a user interface for automatic parking displayed on a display unit of a vehicle according to an embodiment of the present invention. FIG. 5 is a diagram illustrating the user interface when a parking command is input. FIGS. 6A, 6B, and 6C are diagrams illustrating the user interface when a touch gesture for rear parking is input, and FIGS. 7A, 7B, and 7C are diagrams illustrating the user interface when a touch gesture for front parking is input.

Referring to FIG. 3, the vehicle according to the disclosed embodiment includes an image sensor 220 for acquiring images of the surroundings of the vehicle, an ultrasonic sensor for sensing obstacles around the vehicle, a display unit 300 for providing a user interface for automatic parking, a controller 317 for controlling the parking of the vehicle according to the input touch gesture when a parking command is input through a predetermined touch gesture, and a driving device for driving the vehicle under the control of the controller 317.

The image sensor 220 is the AVM image sensor described above. The image sensor 220 includes a front camera 221 for acquiring an image of the front of the vehicle, a left camera 225 and a right camera 227 for acquiring left and right images of the vehicle, and a rear camera 223 for acquiring an image of the rear of the vehicle. As long as the front, rear, left, and right images of the vehicle can be acquired, there is no restriction on the installation positions or the number of cameras. Each camera may comprise a CCD or CMOS sensor.

The control unit 317 processes the images obtained by the cameras, and the processed images can be displayed on the display unit 300. The control unit 317 may be integrated into at least one system on chip (SoC) built into the vehicle.

The images obtained from the cameras may also be displayed directly on the display unit 300 without the image processing of the control unit 317. This is useful when the driver needs to check the image acquired from the individual viewpoint of a particular camera. For example, when performing rear parking, it may be necessary to check the image taken from the viewpoint of the rear camera; in this case, the image processing of the control unit 317 may be skipped.

However, when the images acquired from the front, rear, left, and right cameras need to be synthesized around the vehicle as shown in FIG. 4, the surrounding images must be matched with respect to the vehicle. In this case, the control unit 317 may perform predetermined image processing on the images of the surroundings of the vehicle acquired by the image sensor 220. The images obtained by each camera may have distortion due to the lenses and the like, and if the images are matched without removing this distortion, it may be difficult for the driver to accurately recognize the surrounding situation. Accordingly, the control unit 317 generates a matched image in which the distortion is eliminated, as shown in the first area R1 of FIG. 4, using distance-to-vehicle information that is matched for each pixel and stored in advance. From the distortion-free matched image, the driver can recognize the situation around the vehicle as if looking down on the vehicle from above.
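The per-pixel matching described above can be sketched as a lookup-table remap. This is a minimal illustration under stated assumptions, not the patent's implementation: it assumes the distortion removal and ground-plane projection for each camera have already been baked into precomputed index tables, and all names here are hypothetical.

```python
import numpy as np

def build_top_view(camera_images, lookup_tables, canvas_shape):
    """Compose a top-view canvas from several camera images.

    camera_images: dict name -> HxW grayscale array (uint8).
    lookup_tables: dict name -> (canvas_ys, canvas_xs, src_ys, src_xs),
        precomputed per-pixel correspondences that already account for
        lens-distortion removal and the ground-plane projection.
    canvas_shape: (H, W) of the top-view image; unmapped pixels stay 0.
    """
    canvas = np.zeros(canvas_shape, dtype=np.uint8)
    for name, img in camera_images.items():
        cy, cx, sy, sx = lookup_tables[name]
        canvas[cy, cx] = img[sy, sx]  # copy each mapped source pixel
    return canvas
```

In a real AVM pipeline the tables would come from calibrating each camera's intrinsics and extrinsics once, so that only the cheap lookup runs per frame.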

The ultrasonic sensor can detect obstacles adjacent to the vehicle and output the distance between an obstacle and the vehicle. Ultrasonic sensors can be mounted on the front, rear, or sides of the vehicle to detect not only obstacles but also parking spaces.

The display unit 300 may be located in the center fascia 130, the central area of the dashboard. The display unit 300 may adopt a light emitting diode (LED), organic light emitting diode (OLED), or liquid crystal display (LCD) panel. The display unit 300 may also employ a touch screen panel (TSP) that receives control commands from the user through touch gestures and displays operation information corresponding to the received control commands. The touch screen panel includes a display for showing operation information and the control commands that can be input by the user, a touch panel for detecting the coordinates at which a part of the user's body makes contact, and a touch screen controller for determining the input control command. The touch screen controller can recognize the control command input by the user by comparing the touch coordinates detected through the touch panel with the coordinates of the control commands displayed on the display.
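The coordinate comparison performed by the touch screen controller amounts to a hit test. A minimal sketch, with hypothetical rectangle regions standing in for the displayed commands:

```python
def hit_test(touch_x, touch_y, command_regions):
    """Return the command whose on-screen rectangle contains the touch,
    mirroring how a touch screen controller compares detected touch
    coordinates with the coordinates of displayed control commands.

    command_regions: dict command -> (x0, y0, x1, y1) rectangle.
    Returns None when the touch lands outside every command.
    """
    for command, (x0, y0, x1, y1) in command_regions.items():
        if x0 <= touch_x <= x1 and y0 <= touch_y <= y1:
            return command
    return None
```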

The display unit provides a user interface for automatic parking, as shown in FIG. 4. The user interface includes a first area R1 that displays the top view image obtained by combining the images acquired by the image sensor with the vehicle at the center, a second area R2 that displays a button-shaped object B for inputting a parking command, and a third area R3 that displays an image for automatic parking. It goes without saying that the positions and sizes of the areas shown in the drawings are only examples and that they can be arranged at different positions and in different sizes. The areas are divided for convenience of explanation, and the number of areas is not limited to three; the top view image, the object, and the image for automatic parking may all be displayed in one area, or may be displayed in two areas.

As shown in FIG. 5, when the object in the second area is touched, the display unit displays in the third area the current position of the subject vehicle and the available parking spaces PL1 and PL2 of the parking lot, determined with reference to the current position of the subject vehicle. The image displayed in the third area may be generated using the images obtained by the image sensor and previously stored map data. An available parking space is marked so as to distinguish it from spaces where vehicles are already parked; for example, it may be displayed in a different color. Various other methods may also be used to distinguish available spaces from occupied ones.

When the position of the subject vehicle and the current state of the parking lot are displayed in the third area, the user can touch the third area of the display unit with a predetermined touch gesture to perform automatic parking of the vehicle.

Typical parking methods include rear parking, front parking, and parallel parking. The disclosed embodiment describes predetermined touch gestures using rear parking and front parking as examples.

FIGS. 6A to 6C show a touch gesture for rear parking.

As shown in FIG. 6A, the user can input a parking progress path T1 for rear parking, with the current position of the vehicle as the starting point and the parking space as the destination, by touching the third area. The user touches the third area so that the parking progress path is drawn in it. The display unit may display the parking progress path indicated by the input touch gesture in the third area, as shown in FIG. 6A.

The memory may store various predefined touch gestures and the parking mode indicated by each touch gesture. The control unit 317 recognizes the input touch gesture and determines which of the previously stored touch gestures corresponds to it. It can then generate a control signal for parking the vehicle in accordance with the parking mode indicated by the determined touch gesture, thereby controlling the driving device.

If the parking progress path corresponding to the input touch gesture does not match any touch gesture stored in the memory, or if the vehicle cannot be parked along the parking progress path indicated by the input touch gesture, the control unit 317 may output a message through the display unit informing the user that parking is impossible.

When a parking progress path as shown in FIG. 6A is input through touch, the control unit 317 can generate a control command for rear parking and control the driving device.
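One plausible way to match an input stroke against the touch gestures stored in the memory is to resample both strokes to a fixed number of points and compare them point by point, in the style of template-based gesture recognizers. The sketch below is an assumption about how such matching could work, not the patented method; the point count and threshold are arbitrary tuning values. A return value of None corresponds to the "parking impossible" message case above.

```python
import math

def resample(points, n=16):
    """Resample a stroke to n points evenly spaced along its length."""
    dists = [0.0]  # cumulative arc length at each input point
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1] or 1.0
    out, j = [], 0
    for i in range(n):
        target = total * i / (n - 1)
        while j < len(dists) - 2 and dists[j + 1] < target:
            j += 1
        span = dists[j + 1] - dists[j] or 1.0
        t = (target - dists[j]) / span
        x = points[j][0] + t * (points[j + 1][0] - points[j][0])
        y = points[j][1] + t * (points[j + 1][1] - points[j][1])
        out.append((x, y))
    return out

def match_gesture(stroke, templates, threshold=25.0):
    """Return the parking mode of the closest stored template, or None
    (parking impossible) when no template is close enough."""
    pts = resample(stroke)
    best_mode, best_cost = None, float("inf")
    for mode, template in templates.items():
        cost = sum(math.hypot(px - qx, py - qy)
                   for (px, py), (qx, qy) in zip(pts, resample(template)))
        cost /= len(pts)  # average point-to-point distance
        if cost < best_cost:
            best_mode, best_cost = mode, cost
    return best_mode if best_cost <= threshold else None
```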

Further, as shown in FIG. 6B, the display unit displays at least one expected position of the vehicle on the parking progress path as the vehicle moves along the path indicated in the third area. For example, as shown in FIG. 6B, the expected positions E1, E2, and E3 may be displayed in box form at three positions, including the destination.

Through the expected positions displayed in the third area, the user can intuitively recognize the positions through which the vehicle will travel to the destination.
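Picking the expected positions E1, E2, E3 from a drawn path can be as simple as sampling the path at evenly spaced indices, ending at the destination. A minimal sketch; the function name and sampling rule are illustrative assumptions, not taken from the patent:

```python
def expected_positions(path, count=3):
    """Pick `count` preview positions spaced along a drawn parking
    path, the last one always being the destination (final point)."""
    if count < 1 or not path:
        return []
    last = len(path) - 1
    return [path[round(last * (i + 1) / count)] for i in range(count)]
```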

As described above, the user may input the parking progress path directly through touch; alternatively, as shown in FIG. 6C, the user may have the vehicle parked at a given space in the rear parking mode by touching that space once.

When the space where the vehicle is to be parked is touched once, the display unit displays at least one expected position of the vehicle as it moves along the rear parking path, as shown in FIG. 6C. For example, the expected positions can be displayed in box form at three positions, including the destination.

FIGS. 7A to 7C show a touch gesture for front parking.

As shown in FIG. 7A, the user can input a parking progress path T2 for front parking, with the current position of the vehicle as the starting point and the parking space as the destination, by touching the third area. The user touches the third area so that the parking progress path is drawn in it. The display unit may display the parking progress path indicated by the input touch gesture in the third area, as shown in FIG. 7A.

The control unit 317 recognizes the input touch gesture and determines which of the previously stored touch gestures corresponds to it. It can then generate a control signal for parking the vehicle in accordance with the parking mode indicated by the determined touch gesture, thereby controlling the driving device.

When a parking progress path as shown in FIG. 7A is input through touch, the control unit 317 can generate a control command for front parking and control the driving device.

Further, as shown in FIG. 7B, the display unit displays at least one expected position E1, E2 of the vehicle on the parking progress path as the vehicle moves along the path indicated in the third area. For example, as shown in FIG. 7B, the expected positions can be displayed in box form at two positions, including the destination.

Through the expected positions displayed in the third area, the user can intuitively recognize the positions through which the vehicle will travel to the destination.

As described above, the user may input the parking progress path directly through touch; alternatively, as shown in FIG. 7C, the user may have the vehicle parked in the front parking mode by touching the space where the vehicle is to be parked twice. That is, the vehicle can be parked in a predetermined parking mode depending on how many times the target space is touched. For example, as described above, when the target space is touched once, the parking mode is determined to be rear parking, and when it is touched twice, the parking mode is determined to be front parking.
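Distinguishing one touch (rear parking) from two touches (front parking) requires grouping taps that arrive within a short time window. A sketch of such a dispatcher; the 0.4 s double-tap window is a hypothetical UI tuning value, not specified by the patent:

```python
DOUBLE_TAP_WINDOW = 0.4  # seconds; hypothetical tuning value

def parking_mode_for_taps(tap_times, window=DOUBLE_TAP_WINDOW):
    """Map the number of taps on a parking space to a parking mode:
    one tap -> rear parking, two taps within the window -> front parking.

    tap_times: ascending timestamps of taps on the same space.
    Returns None for tap counts with no assigned mode.
    """
    if not tap_times:
        return None
    taps = 1
    for earlier, later in zip(tap_times, tap_times[1:]):
        if later - earlier <= window:
            taps += 1       # tap continues the current sequence
        else:
            taps = 1        # too slow: start a new sequence
    return {1: "rear", 2: "front"}.get(taps)
```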

When the space where the vehicle is to be parked is touched twice, the display unit displays at least one expected position of the vehicle as it moves along the front parking path, as shown in FIG. 7C. For example, the expected positions can be displayed in box form at two positions, including the destination.

The control unit 317 determines the parking mode indicated by the touch gesture by matching the touch gesture input on the display unit against the touch gestures stored in memory, and controls the driving device according to the determined parking mode to park the vehicle.
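One generic way such matching against stored gestures could work is template matching: resample both strokes to a fixed number of points and pick the stored gesture with the smallest total point-wise distance. This is a hedged sketch of a common technique, not the patent's specific algorithm; all names are illustrative.

```python
import math

def resample(points, n=16):
    """Linearly resample a stroke (list of (x, y)) to n evenly spaced points
    along its arc length, so strokes of different lengths are comparable."""
    dists = [0.0]  # cumulative arc length at each input point
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1] or 1.0
    out, j = [], 0
    for i in range(n):
        target = total * i / (n - 1)
        while j < len(dists) - 2 and dists[j + 1] < target:
            j += 1
        span = dists[j + 1] - dists[j] or 1.0
        t = (target - dists[j]) / span
        (x0, y0), (x1, y1) = points[j], points[j + 1]
        out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

def match_gesture(stroke, templates):
    """Return the name of the stored template stroke closest to the input."""
    s = resample(stroke)
    def cost(name):
        t = resample(templates[name])
        return sum(math.hypot(a - c, b - d) for (a, b), (c, d) in zip(s, t))
    return min(templates, key=cost)
```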

The automatic parking method according to the disclosed embodiment may perform both steering and gear shifting automatically when the user inputs a touch gesture. Alternatively, only the steering may be performed automatically, with gear shifting requested of the driver.
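The distinction between full automation and steering-only assistance can be expressed as a small decision step: when a gear change is needed, the full mode shifts automatically while the steering-only mode asks the driver. All names here are illustrative assumptions.

```python
def next_action(mode, needed_gear, current_gear):
    """Decide the next parking action. `mode` is "full" (steering and
    shifting automated) or "steering_only" (driver shifts on request)."""
    if needed_gear == current_gear:
        return ("steer", None)             # keep steering along the path
    if mode == "full":
        return ("shift", needed_gear)      # shift gears automatically
    return ("prompt_driver", needed_gear)  # ask the driver to shift
```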

FIG. 8 is a flowchart showing a control method of a vehicle according to an embodiment.

Referring to FIG. 8, when a parking command is input through the display unit (800), the display unit displays the position of the vehicle and the available parking spaces (810). When a touch gesture for parking the vehicle is input (820), the estimated parking route and the expected positions are displayed (830). The control unit 317 then controls parking of the subject vehicle according to the input touch gesture (840).
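The sequence of FIG. 8 (steps 800 to 840) can be sketched as a small function. The callback-based structure and all names are assumptions made for illustration, not part of the patent.

```python
def auto_parking_flow(parking_command, touch_gesture, display, park):
    """Walk the FIG. 8 sequence: parking command (800), show vehicle and
    available spaces (810), touch gesture (820), show estimated route and
    expected positions (830), park the vehicle (840)."""
    if not parking_command:
        return "idle"                          # 800: no command yet
    display("vehicle_and_spaces")              # 810
    if touch_gesture is None:
        return "waiting_for_gesture"           # 820: no gesture yet
    display("path_and_expected_positions")     # 830
    park(touch_gesture)                        # 840
    return "parking"
```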

Referring to FIG. 5, when a parking command is input by touching the object displayed in the second area, the display unit displays, in the third area, the current position of the subject vehicle and the parking state of the parking lot around it. The image displayed in the third area may be generated using an image obtained by the image sensor and previously stored map data. An available parking space is marked so as to distinguish it from spaces where vehicles are already parked.

When the position of the subject vehicle and the current state of the parking lot are displayed in the third area, the user can apply a predetermined touch gesture to the third area of the display unit to perform automatic parking of the vehicle.

As shown in FIG. 6A, the user can enter the progress path of rear parking, with the current vehicle position as the starting point and the parking space as the destination, by touching the third area. The user can touch the third area so that the parking progress path is drawn in the third area. The display unit may display the parking progress path indicated by the input touch gesture in the third area, as shown in FIG. 6A.

The control unit 317 can generate a control command for rear parking and control the driving apparatus when a parking progress path such as that shown in FIG. 6A is input by touch.

Further, as shown in FIG. 6B, the display unit displays at least one expected vehicle position on the parking progress path when the vehicle moves along the path indicated in the third area. For example, as shown in FIG. 6B, the expected positions may be displayed in box form at three positions, including the destination. Through the expected positions displayed in the third area, the user can intuitively recognize which positions the vehicle will pass through on its way to the destination.

As described above, the user can input the parking progress path directly by touch, or can park the vehicle at the corresponding position in the rear parking mode by touching the target position once, as shown in FIG. 6C. When the target position is touched once, the display unit displays at least one expected vehicle position as the vehicle moves along the rear parking path, as shown in FIG. 6C. For example, the expected positions can be displayed in box form at three positions, including the destination. A repeated description of the touch gesture for front parking shown in FIGS. 7A to 7C is omitted.

The control unit 317 determines the parking mode indicated by the touch gesture by matching the touch gesture input on the display unit against the touch gestures stored in memory, and controls the driving device according to the determined parking mode to park the vehicle.

200: Image sensor
316: Ultrasonic sensor
317: Control unit
300:

Claims (19)

An image sensor for acquiring images of the surroundings of the vehicle;
A display unit for displaying a top view image generated by synthesizing images acquired by the image sensor around the subject vehicle, for displaying, when a parking command is input, the position of the subject vehicle and available parking spaces based on the image acquired by the image sensor, and for receiving a predetermined touch gesture for parking the subject vehicle in any one of the available parking spaces;
And a controller for parking the subject vehicle according to the input touch gesture when the touch gesture is input,
Wherein the touch gesture includes a touch gesture indicating a parking progress path of the subject vehicle and at least one touch input on an available parking space, the touch input indicating a front parking mode or a rear parking mode according to the number of touches.
delete
The vehicle according to claim 1,
Wherein the display unit displays the parking progress path when a touch gesture indicating the parking progress path is input,
Wherein the control unit controls the subject vehicle to travel along the parking progress path.
The vehicle according to claim 1,
Wherein the display unit displays at least one expected position of the vehicle on the parking progress path when the touch gesture indicating the parking progress path is input.
delete
delete
The vehicle according to claim 1,
Wherein the display unit displays the position of the subject vehicle and available parking spaces based on the image acquired by the image sensor when a parking command is input through a touch on the object.
The vehicle according to claim 1,
Wherein the display unit displays the position of the vehicle and available parking spaces as a top view image based on the image acquired by the image sensor and previously stored map data when the parking command is input.
The vehicle according to claim 1,
Wherein the display unit outputs a parking-unavailable message when the vehicle cannot be parked according to the touch gesture.
An image sensor for acquiring images of the surroundings of the vehicle; And
A display unit including a first area for displaying a top view image generated by synthesizing images acquired by the image sensor around the subject vehicle, a second area for displaying an object adapted to receive an input for a parking command, and a third area for displaying the position of the subject vehicle and available parking spaces based on the image acquired by the image sensor and for receiving a predetermined touch gesture for parking the subject vehicle in any one of the available parking spaces,
Wherein the touch gesture includes a touch gesture indicating a parking progress path of the vehicle and at least one touch input on an available parking space, the touch input indicating a front parking mode or a rear parking mode according to the number of touches.
A control method of a vehicle, the method comprising:
When a parking command is input, displaying the position of the vehicle and available parking spaces based on the image acquired by the image sensor;
Receiving a predetermined touch gesture for parking the subject vehicle in any one of the available parking spaces;
And parking the subject vehicle according to the input touch gesture,
Wherein receiving the predetermined touch gesture for parking the subject vehicle in any one of the available parking spaces includes
receiving at least one touch on any one of the available parking spaces,
And parking the subject vehicle according to the input touch gesture includes
parking the subject vehicle in a front parking mode or a rear parking mode according to the number of touches on any one of the available parking spaces.
12. The method of claim 11,
Wherein receiving the predetermined touch gesture for parking the subject vehicle in any one of the available parking spaces includes:
Receiving a touch gesture indicating a parking progress path for parking the subject vehicle in any one of the available parking spaces; and
Displaying the parking progress path in accordance with the input touch gesture.
13. The method of claim 12,
Further comprising: displaying at least one expected position of the vehicle on the displayed parking progress path.
13. The method of claim 12,
Wherein parking the subject vehicle according to the input touch gesture includes
Controlling the subject vehicle to travel along the parking progress path.
delete
delete
12. The method of claim 11,
Wherein displaying, when the parking command is input, the position of the vehicle and available parking spaces based on the image acquired by the image sensor includes:
Displaying an object adapted to receive input for a parking command; and
Displaying the position of the subject vehicle and available parking spaces based on the image acquired by the image sensor when a parking command is input through a touch on the object.
12. The method of claim 11,
Wherein displaying, when the parking command is input, the position of the vehicle and available parking spaces based on the image acquired by the image sensor includes
Displaying the location of the vehicle and available parking spaces as a top view image based on the image acquired by the image sensor and previously stored map data.
12. The method of claim 11,
Determining whether the vehicle can be parked according to the input touch gesture; and
Outputting a parking-unavailable message when the vehicle cannot be parked according to the input touch gesture.
KR1020150179911A 2015-12-16 2015-12-16 Vehicle and controlling method for the same KR101767074B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150179911A KR101767074B1 (en) 2015-12-16 2015-12-16 Vehicle and controlling method for the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150179911A KR101767074B1 (en) 2015-12-16 2015-12-16 Vehicle and controlling method for the same

Publications (2)

Publication Number Publication Date
KR20170071796A KR20170071796A (en) 2017-06-26
KR101767074B1 true KR101767074B1 (en) 2017-08-23

Family

ID=59282630

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150179911A KR101767074B1 (en) 2015-12-16 2015-12-16 Vehicle and controlling method for the same

Country Status (1)

Country Link
KR (1) KR101767074B1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002240662A (en) * 2000-12-15 2002-08-28 Honda Motor Co Ltd Parking support device
JP2002362271A (en) * 2001-06-07 2002-12-18 Denso Corp Equipment, program, and recording medium for vehicle parking guide
JP2005041433A (en) * 2003-07-25 2005-02-17 Denso Corp Vehicle guiding device and route judging program
JP2007183877A (en) * 2006-01-10 2007-07-19 Nissan Motor Co Ltd Driving support device for vehicle and display method for bird's-eye video
JP2008213791A (en) * 2007-03-07 2008-09-18 Aisin Aw Co Ltd Parking assist method and parking assist system
JP2011002884A (en) * 2009-06-16 2011-01-06 Nissan Motor Co Ltd Image display device for vehicle and method for displaying bird's-eye view image
JP2012076483A (en) * 2010-09-30 2012-04-19 Aisin Seiki Co Ltd Parking support device
JP2013043510A (en) * 2011-08-23 2013-03-04 Nissan Motor Co Ltd Parking assist apparatus


Also Published As

Publication number Publication date
KR20170071796A (en) 2017-06-26

Similar Documents

Publication Publication Date Title
JP6340969B2 (en) Perimeter monitoring apparatus and program
JP6275007B2 (en) Parking assistance device
CN107219915B (en) Vehicle and method for controlling the same
EP2981077B1 (en) Periphery monitoring device and program
CN105539287B (en) Periphery monitoring device
US10337881B2 (en) Navigation device, vehicle, and method for controlling the vehicle
JP6100222B2 (en) Parking assistance device
JP4952765B2 (en) Vehicle night vision support device
JP6413207B2 (en) Vehicle display device
JP6281289B2 (en) Perimeter monitoring apparatus and program
US20090009314A1 (en) Display system and program
US20170305345A1 (en) Image display control apparatus and image display system
JP5605606B2 (en) Parking assistance device
JP2016060225A (en) Parking support device, parking support method and control program
CN109278844B (en) Steering wheel, vehicle with steering wheel and method for controlling vehicle
US20190244324A1 (en) Display control apparatus
WO2018150642A1 (en) Surroundings monitoring device
US10864866B2 (en) Vehicle and control method thereof
JP2017162015A (en) Vehicle peripheral image display device
US11858424B2 (en) Electronic device for displaying image by using camera monitoring system (CMS) side display mounted in vehicle, and operation method thereof
JP4753735B2 (en) Car electronics
JP2009129251A (en) Operation input apparatus
CN112141083A (en) Parking control apparatus for vehicle and method thereof
KR20170070459A (en) Vehicle and method for controlling vehicle
KR101882188B1 (en) Vehicle and control method for the vehicle

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant