KR101767074B1 - Vehicle and controlling method for the same - Google Patents
- Publication number
- KR101767074B1
- Authority
- KR
- South Korea
- Prior art keywords
- parking
- vehicle
- input
- touch gesture
- image
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/06—Automatic manoeuvring for parking
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Y—INDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
- B60Y2300/00—Purposes or special features of road vehicle drive control systems
- B60Y2300/06—Automatic manoeuvring for parking
Landscapes
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Navigation (AREA)
- Traffic Control Systems (AREA)
Abstract
The disclosed embodiment provides a vehicle, and a method of controlling the same, that automatically performs parking in response to a predetermined touch gesture. According to one embodiment, a vehicle includes an image sensor for acquiring images of the surroundings of the vehicle; a display unit that displays a top-view image generated by synthesizing the images acquired by the image sensor around the subject vehicle and that, when a parking command is input, displays the position of the subject vehicle and the available parking spaces based on the acquired images and receives a predetermined touch gesture for parking the subject vehicle in any one of the available parking spaces; and a controller that, when the touch gesture is input, parks the subject vehicle according to the input touch gesture.
Description
The disclosed embodiment relates to a vehicle.
Generally, the driver of a vehicle moves the vehicle while visually checking for obstacles behind or beside it using the side mirrors or the rear-view mirror mounted on the vehicle.
However, there are blind spots at the rear corners of the vehicle that the driver cannot see, and even when the driver does recognize an obstacle, he or she may fail to judge the length or width of the vehicle, or the distance between the vehicle and the obstacle, accurately enough, so the vehicle may come into contact with the obstacle.
To address these problems, the Parking Assist System (PAS), which mounts sensors on the front and rear of the vehicle and helps the driver recognize the distance to an obstacle by means of an alarm sound, was introduced. More recently, the Smart Parking Assist System (SPAS) was introduced; it recognizes the space to be parked in, automatically generates a parking path, and automatically controls the steering so that the vehicle is parked without the driver operating the steering wheel. Because it controls the steering of the vehicle, SPAS is also called a parking steering assist system.
The disclosed embodiment provides a vehicle and a method of controlling the same that automatically perform parking through a predetermined touch gesture.
According to one embodiment, a vehicle includes an image sensor for acquiring images of the surroundings of the vehicle; a display unit that displays a top-view image generated by synthesizing the images acquired by the image sensor around the subject vehicle and that, when a parking command is input, displays the position of the subject vehicle and the available parking spaces based on the acquired images and receives a predetermined touch gesture for parking the subject vehicle in any one of the available parking spaces; and a controller that, when the touch gesture is input, parks the subject vehicle according to the input touch gesture.
Further, the touch gesture includes a touch gesture indicating a parking progress path of the subject vehicle.
Also, the display unit may display the parking progress path when the touch gesture indicating the parking progress path is input, and the control unit may control the subject vehicle to travel along the input parking progress path.
The display unit may display at least one predicted position of the vehicle on the parking progress path when the touch gesture indicating the parking progress path is input.
Also, the touch gesture may include at least one touch input on an available parking space.
In addition, the touch input may indicate a different parking method depending on the number of touches.
The display unit may display an object provided to receive an input of a parking command; when a parking command is input by touching the object, the display unit displays the position of the subject vehicle and the available parking spaces based on the image acquired by the image sensor.
In addition, when the parking command is input, the display unit may display the position of the subject vehicle and the available parking spaces as a top-view image based on the image acquired by the image sensor and previously stored map data.
According to one embodiment, a vehicle includes an image sensor for acquiring images of the surroundings of the vehicle; and a display unit including a first area for displaying a top-view image generated by synthesizing the images acquired by the image sensor around the subject vehicle, a second area for displaying an object adapted to receive an input of a parking command, and a third area for displaying the position of the subject vehicle and the available parking spaces based on the images acquired by the image sensor and for receiving a predetermined touch gesture for parking the subject vehicle in any one of the available parking spaces.
According to another aspect of the present invention, there is provided a method of controlling a vehicle, the method comprising: when a parking command is input, displaying the position of the subject vehicle and the available parking spaces based on the image acquired by the image sensor; receiving a predetermined touch gesture for parking the subject vehicle in any one of the available parking spaces; and parking the subject vehicle according to the input touch gesture.
Receiving the predetermined touch gesture for parking the subject vehicle in any one of the available parking spaces may include: receiving a touch gesture indicating a parking progress path for parking the subject vehicle in any one of the available parking spaces; and displaying the parking progress path according to the input touch gesture.
The method may further include displaying at least one predicted position of the vehicle on the displayed parking progress path.
In addition, parking the subject vehicle according to the input touch gesture may include controlling the subject vehicle to travel along the input parking progress path.
Receiving the predetermined touch gesture for parking the subject vehicle in any one of the available parking spaces may include receiving at least one touch on any one of the available parking spaces.
In addition, parking the subject vehicle according to the input touch gesture may include parking the subject vehicle using a different parking method according to the number of touches on the selected parking space.
In addition, when the parking command is input, displaying the position of the vehicle and the available parking spaces based on the image acquired by the image sensor may include: displaying an object provided to receive an input of a parking command; and, when a parking command is input by touching the object, displaying the position of the subject vehicle and the available parking spaces based on the image acquired by the image sensor.
In addition, when the parking command is input, displaying the position of the vehicle and the available parking spaces based on the image acquired by the image sensor may include displaying the position of the vehicle and the available parking spaces as a top-view image based on the image acquired by the image sensor and previously stored map data.
According to the disclosed embodiment, the automatic parking function can be more easily used through an intuitive user interface.
FIG. 1 is an external view of a vehicle according to an embodiment.
FIG. 2 is a view showing the internal configuration of a vehicle according to an embodiment.
FIG. 3 is a control block diagram of a vehicle according to the disclosed embodiment.
FIG. 4 is a diagram illustrating a user interface for automatic parking displayed on a display unit of a vehicle according to an exemplary embodiment of the present invention.
FIG. 5 is a diagram illustrating a user interface for automatic parking displayed on a display unit of a vehicle according to an embodiment when a parking command is input.
FIGS. 6A, 6B, and 6C are views illustrating a user interface for automatic parking displayed on a display unit of a vehicle according to an exemplary embodiment when a touch gesture for rear parking is input.
FIGS. 7A, 7B, and 7C are diagrams illustrating a user interface for automatic parking displayed on a display unit of a vehicle according to an exemplary embodiment when a touch gesture for front parking is input.
FIG. 8 is a flowchart showing a control method of a vehicle according to an embodiment.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
Referring to FIG. 1, a vehicle according to an embodiment of the present invention includes a
The
The
The
A
The
In addition, the vehicle may include various sensors for detecting obstacles around the vehicle and helping the driver to recognize the situation around the vehicle. For example, the vehicle may include a plurality of cameras capable of acquiring front, rear, left, and right images of the vehicle.
The vehicle may include a dashboard provided therein with a
The
The
The air conditioner controls the temperature, humidity, air cleanliness and air flow inside the vehicle to keep the interior of the vehicle comfortable. The air conditioner may include at least one
According to the embodiment, the
The
Also,
The
FIG. 4 is a diagram illustrating a user interface for automatic parking displayed on a display unit of a vehicle according to an embodiment of the present invention. FIGS. 6A, 6B, and 6C are diagrams illustrating a user interface for automatic parking displayed on a display unit of a vehicle according to an exemplary embodiment when a touch gesture for rear parking is input. FIGS. 7A, 7B, and 7C are diagrams illustrating a user interface for automatic parking displayed on the display unit of the vehicle according to an exemplary embodiment when a touch gesture for front parking is input.
Referring to FIG. 3, the vehicle according to the disclosed embodiment includes an
The
The
The images obtained from the cameras may be displayed directly on the
However, when the images acquired from the front, rear, left, and right cameras need to be synthesized around the vehicle as shown in FIG. 4, the surrounding images must be aligned with respect to the vehicle. In this case, the
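As a rough illustration of this synthesis step, the sketch below composes four camera tiles that are assumed to be already warped to a bird's-eye (ground-plane) view into one top-view canvas centered on the subject vehicle. Real systems apply per-camera homographies and blend the overlapping regions; the function and argument names here are illustrative, not from the patent.

```python
import numpy as np

def synthesize_top_view(front, rear, left, right, vehicle_mask):
    """Compose four pre-warped bird's-eye tiles around the subject
    vehicle into a single top-view canvas (simplified sketch).

    Each input is an HxW grayscale tile, already projected to the
    ground plane and the same shape as `vehicle_mask`.
    """
    h, w = vehicle_mask.shape
    canvas = np.zeros((3 * h, 3 * w), dtype=front.dtype)
    canvas[0:h, w:2 * w] = front            # ahead of the vehicle
    canvas[2 * h:3 * h, w:2 * w] = rear     # behind the vehicle
    canvas[h:2 * h, 0:w] = left             # left of the vehicle
    canvas[h:2 * h, 2 * w:3 * w] = right    # right of the vehicle
    canvas[h:2 * h, w:2 * w] = vehicle_mask # subject vehicle at center
    return canvas
```

The untouched corner cells stay zero, matching the blank corners typically seen in around-view displays where no camera coverage exists.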
The ultrasonic sensor can detect an obstacle adjacent to the vehicle and output the distance between the obstacle and the vehicle. Ultrasonic sensors can be mounted on the front, rear, or sides of the vehicle to detect not only obstacles but also parking spaces.
The
The display unit provides a user interface for automatic parking, as shown in FIG. 4. The user interface includes a first area R1 for displaying a top-view image obtained by synthesizing the images acquired by the image sensor around the vehicle, a second area R2 for displaying a button-shaped object B for inputting a parking command, and a third area R3 for displaying an image for automatic parking. It is needless to say that the positions and sizes of the respective regions shown in the drawings are only examples, and the regions can be arranged at different positions and in different sizes. The areas are divided for convenience of explanation, and the number of areas is not limited to three. The top-view image, the object, and the image for automatic parking may all be displayed in one area, or may be displayed in two areas.
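Routing a touch event to the correct region can be sketched as simple rectangle hit-testing. The layout coordinates below are purely illustrative assumptions, since the patent itself notes that the region positions and sizes are only examples.

```python
def hit_region(x, y, regions):
    """Return which UI region a touch at (x, y) falls in, or None.

    `regions` maps a region name ("R1", "R2", "R3") to its rectangle
    (x0, y0, x1, y1), with x1/y1 exclusive.
    """
    for name, (x0, y0, x1, y1) in regions.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

# Hypothetical layout: top-view image on the left, parking-command
# button at the top right, automatic-parking view at the bottom right.
LAYOUT = {
    "R1": (0, 0, 400, 480),
    "R2": (400, 0, 800, 100),
    "R3": (400, 100, 800, 480),
}
```

A touch dispatched to R2 would trigger the parking command, while touches in R3 would be collected as gesture input.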
Referring to FIG. 5, when the object in the second area is touched, the display unit displays, in the third area, the current position of the subject vehicle and the available parking spaces PL1 and PL2 of the parking lot based on the current position of the subject vehicle. The image displayed in the third area may be generated using the images acquired by the image sensor and previously stored map data. An available parking space is marked so as to be distinguished from spaces where vehicles are already parked; for example, it may be displayed in a different color. Various other methods may be used to distinguish the available spaces from the occupied ones.
When the position of the subject vehicle and the current state of the parking lot are displayed in the third area, the user can touch the third area of the display unit with a predetermined touch gesture to perform automatic parking of the vehicle.
Typical parking methods include rear parking, front parking, and parallel parking. The disclosed embodiment describes a predetermined touch gesture as an example of rear parking and front parking.
Figures 6A-6C show a touch gesture for rear parking.
As shown in FIG. 6A, the user can input a parking progress path T1 for rear parking, with the current position of the vehicle as the starting point and the target parking space as the destination, by touching the third area. The user touches the third area so that the parking progress path is drawn in it. The display unit may display the parking progress path indicated by the input touch gesture in the third area, as shown in FIG. 6A.
The memory may store various predefined touch gestures and the parking mode indicated by each touch gesture. The
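The comparison between a drawn path and the predefined gestures stored in memory can be sketched as a simple template match: resample both paths to a fixed number of points and take the mean point-to-point distance. This is only an illustrative sketch; the 16-point resampling, the distance threshold, and all names are assumptions rather than details from the patent, and a production system would use a more robust recognizer.

```python
import math

def resample(path, n=16):
    """Resample a drawn touch path to n evenly spaced points."""
    dists = [0.0]  # cumulative arc length at each input point
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1]
    out, j = [], 0
    for i in range(n):
        target = total * i / (n - 1)
        while j < len(path) - 2 and dists[j + 1] < target:
            j += 1
        seg = dists[j + 1] - dists[j] or 1.0
        t = (target - dists[j]) / seg
        x = path[j][0] + t * (path[j + 1][0] - path[j][0])
        y = path[j][1] + t * (path[j + 1][1] - path[j][1])
        out.append((x, y))
    return out

def match_gesture(drawn, templates, threshold=40.0):
    """Return the parking mode of the stored template closest to the
    drawn path, or None if nothing is close enough (i.e. the gesture
    does not correspond to any stored touch gesture)."""
    drawn = resample(drawn)
    best, best_d = None, float("inf")
    for mode, tpl in templates.items():
        tpl = resample(tpl)
        d = sum(math.hypot(a[0] - b[0], a[1] - b[1])
                for a, b in zip(drawn, tpl)) / len(drawn)
        if d < best_d:
            best, best_d = mode, d
    return best if best_d <= threshold else None
```

Returning None here corresponds to the case where the vehicle cannot be parked according to the input gesture and a message would be shown instead.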
If the input touch gesture does not correspond to any touch gesture stored in the memory, or if the vehicle cannot be parked along the parking progress path indicated by the input touch gesture, the
The
Further, as shown in FIG. 6B, the display unit displays at least one predicted position of the vehicle on the parking progress path indicated in the third area, showing where the vehicle will be as it moves along the path. For example, as shown in FIG. 6B, the predicted positions E1, E2, and E3 may be displayed as boxes at three positions including the destination.
Through the predicted positions displayed in the third area, the user can intuitively recognize the positions through which the vehicle will travel to the destination.
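The display of expected positions such as E1, E2, and E3 can be sketched as picking evenly spaced points along the input path, always ending at the destination. This is a minimal illustration with assumed names; a real system would place vehicle outlines using the planned trajectory and vehicle kinematics.

```python
def predicted_positions(path, count=3):
    """Pick `count` evenly spaced expected vehicle positions along a
    parking progress path, always ending at the destination.

    `path` is the ordered list of (x, y) points of the drawn path.
    """
    positions = []
    for i in range(1, count + 1):
        idx = round(i * (len(path) - 1) / count)
        positions.append(path[idx])
    return positions
```

With count=3 this reproduces the three boxes of the rear-parking example; count=2 matches the front-parking example.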
As described above, the parking progress path may be input directly by touch; alternatively, as shown in FIG. 6C, the vehicle may be parked in the corresponding space in the rear parking mode by touching the space where the vehicle is to be parked once.
When the space where the vehicle is to be parked is touched once, the display unit displays at least one predicted position of the vehicle as it moves along the rear parking path, as shown in FIG. 6C. For example, the predicted positions can be displayed as boxes at three positions including the destination.
FIGS. 7A to 7C show a touch gesture for front parking.
As shown in FIG. 7A, the user can input a parking progress path T2 for front parking, with the current vehicle position as the starting point and the target parking space as the destination, by touching the third area. The user touches the third area so that the parking progress path is drawn in it. The display unit may display the parking progress path indicated by the input touch gesture in the third area, as shown in FIG. 7A.
The
When the parking progress path as shown in FIG. 7A is input through the touch, the
Further, as shown in FIG. 7B, the display unit displays at least one predicted position (E1, E2) of the vehicle on the parking progress path indicated in the third area, showing where the vehicle will be as it moves along the path. For example, the predicted positions can be displayed as boxes at two positions including the destination.
Through the predicted positions displayed in the third area, the user can intuitively recognize the positions through which the vehicle will travel to the destination.
As described above, the parking progress path may be input directly by touch; alternatively, as shown in FIG. 7C, the vehicle may be parked in the front parking mode by touching the space where the vehicle is to be parked twice. That is, the vehicle can be parked in a predetermined parking mode depending on how many times the target parking space is touched. For example, as described above, when the target space is touched once, the parking mode is determined as rear parking, and when it is touched twice, the parking mode is determined as front parking.
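The touch-count convention described above (one touch for rear parking, two touches for front parking) can be sketched as a small lookup plus a tap counter with a time window. The 0.5 s window and all names are illustrative assumptions, not values from the patent.

```python
PARKING_MODES = {1: "rear", 2: "front"}  # per the example in the text

def mode_from_taps(tap_count):
    """Map the number of touches on an available parking space to the
    predetermined parking mode; None for an unmapped count."""
    return PARKING_MODES.get(tap_count)

def count_taps(timestamps, window=0.5):
    """Count touches that arrive within `window` seconds of the
    previous one, treating a longer pause as starting a new gesture."""
    count = 1
    for prev, cur in zip(timestamps, timestamps[1:]):
        count = count + 1 if cur - prev <= window else 1
    return count
```

Two quick taps on a space would thus select front parking, while a single tap selects rear parking.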
When the space where the vehicle is to be parked is touched twice, the display unit displays at least one predicted position of the vehicle as it moves along the front parking path, as shown in FIG. 7C. For example, the predicted positions can be displayed as boxes at two positions including the destination.
The
The automatic parking method according to the disclosed embodiment includes a method of automatically performing both steering and gear shifting when the user inputs a touch gesture. Alternatively, only the steering may be performed automatically, and gear shifting may be requested of the driver.
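The two automation levels described here, full automatic control of steering and gear shifting versus steering-only control with driver-performed gear shifts, can be sketched as follows. The segment representation and all names are assumptions for illustration only.

```python
def plan_actions(segments, full_auto=True):
    """Produce the action list for executing a parking path.

    `segments` is a list of (steering_angle, gear) pairs, with gear
    "R" or "D". In full-auto mode both steering and gear shifts are
    commanded; in steering-only mode each gear change becomes a
    prompt asking the driver to shift instead.
    """
    actions, current_gear = [], None
    for angle, gear in segments:
        if gear != current_gear:
            if full_auto:
                actions.append(("shift", gear))
            else:
                actions.append(("prompt_driver", f"shift to {gear}"))
            current_gear = gear
        actions.append(("steer", angle))
    return actions
```

A rear-parking maneuver that backs in and then pulls forward would, in steering-only mode, interleave driver prompts with automatic steering commands.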
FIG. 8 is a flowchart showing a control method of a vehicle according to an embodiment.
Referring to FIG. 8, when a parking command is input through the
As shown in FIG. 5, when a parking command is input by touching the object displayed in the second area, the display unit displays, in the third area, the current position of the subject vehicle and the available parking spaces of the parking lot based on the current position of the subject vehicle. The image displayed in the third area may be generated using the images acquired by the image sensor and previously stored map data. An available parking space is marked so as to be distinguished from spaces where vehicles are already parked.
When the position of the subject vehicle and the current state of the parking lot are displayed in the third area, the user can touch the third area of the display unit with a predetermined touch gesture to perform automatic parking of the vehicle.
As shown in FIG. 6A, the user can input a parking progress path for rear parking, with the current vehicle position as the starting point and the target parking space as the destination, by touching the third area. The user touches the third area so that the parking progress path is drawn in it. The display unit may display the parking progress path indicated by the input touch gesture in the third area, as shown in FIG. 6A.
The
Further, as shown in FIG. 6B, the display unit displays at least one predicted position of the vehicle on the parking progress path indicated in the third area, showing where the vehicle will be as it moves along the path. For example, the predicted positions may be displayed as boxes at three positions including the destination. Through the predicted positions displayed in the third area, the user can intuitively recognize the positions through which the vehicle will travel to the destination.
As described above, the parking progress path may be input directly by touch; alternatively, as shown in FIG. 6C, the vehicle may be parked in the corresponding space in the rear parking mode by touching the space where the vehicle is to be parked once. When the space is touched once, the display unit displays at least one predicted position of the vehicle as it moves along the rear parking path, as shown in FIG. 6C; for example, the predicted positions can be displayed as boxes at three positions including the destination. The description of the touch gesture for front parking shown in FIGS. 7A to 7C is omitted here, as it is analogous.
The
200: Image sensor
316: Ultrasonic sensor
317:
300:
Claims (19)
A display unit that displays a top-view image generated by synthesizing the images acquired by the image sensor around the subject vehicle and that, when a parking command is input, displays the position of the subject vehicle and the available parking spaces based on the image acquired by the image sensor and receives a predetermined touch gesture for parking the subject vehicle in any one of the available parking spaces;
And a controller for parking the subject vehicle according to the input touch gesture when the touch gesture is input,
Wherein the touch gesture includes a touch gesture indicating a parking progress path of the subject vehicle and at least one touch input on an available parking space, and the touch input indicates a front parking mode or a rear parking mode according to the number of touches.
Wherein the display unit displays the parking progress path when a touch gesture indicating the parking progress path is input,
Wherein the control unit controls the subject vehicle to travel along the parking progress path.
Wherein the display unit displays at least one predicted position of the vehicle on the parking progress path when the touch gesture indicating the parking progress path is input.
The display unit includes:
And displays the position of the subject vehicle and the available parking spaces based on the image acquired by the image sensor when a parking command is input through a touch on the object.
The display unit includes:
And displays the position of the vehicle and the available parking spaces as a top-view image based on the image acquired by the image sensor and previously stored map data when the parking command is input.
Wherein the display unit outputs a parking-disabled message when the vehicle cannot be parked according to the touch gesture.
A display unit including a first area for displaying a top-view image generated by synthesizing the images acquired by the image sensor around the subject vehicle, a second area for displaying an object adapted to receive an input of a parking command, and a third area for displaying the position of the subject vehicle and the available parking spaces based on the images acquired by the image sensor and for receiving a predetermined touch gesture for parking the subject vehicle in any one of the available parking spaces,
Wherein the touch gesture includes a touch gesture indicating a parking progress path of the vehicle and at least one touch input on an available parking space, and the touch input indicates a front parking mode or a rear parking mode according to the number of touches.
Receiving a predetermined touch gesture for parking the subject vehicle at any one of the available parking positions;
And parking the subject vehicle according to the input touch gesture,
Wherein receiving the predetermined touch gesture for parking the subject vehicle in any one of the available parking spaces includes
Receiving at least one touch on any one of the available parking spaces,
And parking the subject vehicle according to the input touch gesture includes
Parking the subject vehicle in a front parking mode or a rear parking mode according to the number of touches on the selected parking space.
Wherein receiving the predetermined touch gesture for parking the subject vehicle in any one of the available parking spaces includes
Receiving a touch gesture indicating a parking progress path for parking the subject vehicle in any one of the available parking spaces;
And displaying the parking progress path in accordance with the input touch gesture.
Further comprising: displaying a predicted position of at least one vehicle in the indicated parking progress path.
Parking the subject vehicle according to the input touch gesture,
And controlling the subject vehicle to travel along the parking progress path.
Wherein, when the parking command is input, displaying the position of the vehicle and the available parking spaces based on the image acquired by the image sensor includes
Displaying an object adapted to receive an input of a parking command;
And displaying the position of the subject vehicle and the available parking spaces based on the image acquired by the image sensor when a parking command is input by touching the object.
Wherein, when the parking command is input, displaying the position of the vehicle and the available parking spaces based on the image acquired by the image sensor includes
Displaying the position of the vehicle and the available parking spaces as a top-view image based on the image acquired by the image sensor and previously stored map data.
Determining whether the vehicle can be parked according to the input touch gesture;
And outputting a parking-disabled message when the vehicle cannot be parked according to the input touch gesture.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150179911A KR101767074B1 (en) | 2015-12-16 | 2015-12-16 | Vehicle and controlling method for the same |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150179911A KR101767074B1 (en) | 2015-12-16 | 2015-12-16 | Vehicle and controlling method for the same |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170071796A KR20170071796A (en) | 2017-06-26 |
KR101767074B1 true KR101767074B1 (en) | 2017-08-23 |
Family
ID=59282630
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150179911A KR101767074B1 (en) | 2015-12-16 | 2015-12-16 | Vehicle and controlling method for the same |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101767074B1 (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002240662A (en) * | 2000-12-15 | 2002-08-28 | Honda Motor Co Ltd | Parking support device |
JP2002362271A (en) * | 2001-06-07 | 2002-12-18 | Denso Corp | Equipment, program, and recording medium for vehicle parking guide |
JP2005041433A (en) * | 2003-07-25 | 2005-02-17 | Denso Corp | Vehicle guiding device and route judging program |
JP2007183877A (en) * | 2006-01-10 | 2007-07-19 | Nissan Motor Co Ltd | Driving support device for vehicle and display method for bird's-eye video |
JP2008213791A (en) * | 2007-03-07 | 2008-09-18 | Aisin Aw Co Ltd | Parking assist method and parking assist system |
JP2011002884A (en) * | 2009-06-16 | 2011-01-06 | Nissan Motor Co Ltd | Image display device for vehicle and method for displaying bird's-eye view image |
JP2012076483A (en) * | 2010-09-30 | 2012-04-19 | Aisin Seiki Co Ltd | Parking support device |
JP2013043510A (en) * | 2011-08-23 | 2013-03-04 | Nissan Motor Co Ltd | Parking assist apparatus |
-
2015
- 2015-12-16 KR KR1020150179911A patent/KR101767074B1/en active IP Right Grant
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002240662A (en) * | 2000-12-15 | 2002-08-28 | Honda Motor Co Ltd | Parking support device |
JP2002362271A (en) * | 2001-06-07 | 2002-12-18 | Denso Corp | Equipment, program, and recording medium for vehicle parking guide |
JP2005041433A (en) * | 2003-07-25 | 2005-02-17 | Denso Corp | Vehicle guiding device and route judging program |
JP2007183877A (en) * | 2006-01-10 | 2007-07-19 | Nissan Motor Co Ltd | Driving support device for vehicle and display method for bird's-eye video |
JP2008213791A (en) * | 2007-03-07 | 2008-09-18 | Aisin Aw Co Ltd | Parking assist method and parking assist system |
JP2011002884A (en) * | 2009-06-16 | 2011-01-06 | Nissan Motor Co Ltd | Image display device for vehicle and method for displaying bird's-eye view image |
JP2012076483A (en) * | 2010-09-30 | 2012-04-19 | Aisin Seiki Co Ltd | Parking support device |
JP2013043510A (en) * | 2011-08-23 | 2013-03-04 | Nissan Motor Co Ltd | Parking assist apparatus |
Also Published As
Publication number | Publication date |
---|---|
KR20170071796A (en) | 2017-06-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6340969B2 (en) | Perimeter monitoring apparatus and program | |
JP6275007B2 (en) | Parking assistance device | |
CN107219915B (en) | Vehicle and method for controlling the same | |
EP2981077B1 (en) | Periphery monitoring device and program | |
CN105539287B (en) | Periphery monitoring device | |
US10337881B2 (en) | Navigation device, vehicle, and method for controlling the vehicle | |
JP6100222B2 (en) | Parking assistance device | |
JP4952765B2 (en) | Vehicle night vision support device | |
JP6413207B2 (en) | Vehicle display device | |
JP6281289B2 (en) | Perimeter monitoring apparatus and program | |
US20090009314A1 (en) | Display system and program | |
US20170305345A1 (en) | Image display control apparatus and image display system | |
JP5605606B2 (en) | Parking assistance device | |
JP2016060225A (en) | Parking support device, parking support method and control program | |
CN109278844B (en) | Steering wheel, vehicle with steering wheel and method for controlling vehicle | |
US20190244324A1 (en) | Display control apparatus | |
WO2018150642A1 (en) | Surroundings monitoring device | |
US10864866B2 (en) | Vehicle and control method thereof | |
JP2017162015A (en) | Vehicle peripheral image display device | |
US11858424B2 (en) | Electronic device for displaying image by using camera monitoring system (CMS) side display mounted in vehicle, and operation method thereof | |
JP4753735B2 (en) | Car electronics | |
JP2009129251A (en) | Operation input apparatus | |
CN112141083A (en) | Parking control apparatus for vehicle and method thereof | |
KR20170070459A (en) | Vehicle and method for controlling vehicle | |
KR101882188B1 (en) | Vehicle and control method for the vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |