EP3113000B1 - Vehicle and control method for the vehicle - Google Patents
Vehicle and control method for the vehicle
- Publication number
- EP3113000B1 (application EP16177496.3A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- character
- area
- touch
- input
- gesture
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06F3/04842—Selection of displayed objects or displayed text elements
- G06F3/03547—Touch pads, in which fingers can move on a surface
- G06F3/04883—Input of data by handwriting on a touch-screen or digitiser, e.g. gesture or text
- G06F3/04886—Input via a touch-screen or digitiser by partitioning the display area into independently controllable areas, e.g. virtual keyboards or menus
- G06F40/274—Converting codes to words; guess-ahead of partial word inputs
- B60K35/00—Instruments specially adapted for vehicles; arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
- B60K2360/11—Instrument graphical user interfaces or menu aspects
- B60K2360/113—Scrolling through menu items
- B60K2360/115—Selection of menu items
- B60K2360/141—Activation of instrument input devices by approaching fingers or pens
- B60K2360/1438—Touch screens
- B60K2360/1446—Touch switches
Definitions
- Embodiments of the present disclosure relate to a vehicle including a touch input apparatus having a concave area, and to a method of controlling the vehicle.
- Vehicles often have various functions for improving a passenger's convenience, in addition to driving functions.
- The more functions a vehicle has, the greater the operating load imposed on the driver. An excessive operating load deteriorates the driver's concentration on driving.
- A driver's difficulty in operating the vehicle may increase accordingly, such that the driver may be unable to properly use all of the functions the vehicle can perform.
- A representative example of such a vehicle-mounted input device is a touch input device capable of detecting a driver's touch inputs. If a vehicle has such a touch input device, the driver can easily control the vehicle by touching the touch input device, without having to perform any complicated manipulations.
- Document EP 2 881 878 A2 is directed to a vehicle control method.
- Document DE 10 2009 023887 A1 relates to a further input device for a vehicle.
- According to the present disclosure, a vehicle includes: a display unit for displaying a first character input User Interface (UI) in which a plurality of characters are arranged to surround a predetermined reference point; a touch input apparatus having a concave area for detecting a touch gesture of selecting one character of the plurality of characters; and a controller for controlling the display unit to display, if the number of pre-stored completed character combinations corresponding to an input character combination, configured with at least one character sequentially selected according to the touch gesture detected by the touch input apparatus, is smaller than or equal to a threshold value, a second character input UI in which the completed character combinations corresponding to the input character combination are arranged to surround the reference point.
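The claimed controller behaviour can be illustrated with a short sketch: switch from the per-character first UI to the second UI once the number of matching pre-stored completed character combinations drops to a threshold. All names, the threshold value, and the stored combinations below are hypothetical; the claim does not prescribe any particular implementation.

```python
# Hedged sketch of the claimed controller behaviour. THRESHOLD and the
# stored data are illustrative assumptions, not values from the patent.

THRESHOLD = 5

def matching_combinations(input_combination, stored_combinations):
    """Completed character combinations that contain the input combination in full."""
    return [c for c in stored_combinations if input_combination in c]

def select_ui(input_combination, stored_combinations):
    """Decide which character input UI the controller would display."""
    matches = matching_combinations(input_combination, stored_combinations)
    if len(matches) <= THRESHOLD:
        # Few enough candidates: arrange them around the reference point (second UI).
        return ("second_ui", matches)
    # Otherwise keep showing the per-character first UI.
    return ("first_ui", [])

stored = ["NAVIGATION", "NAVY", "NATION", "RADIO", "MEDIA", "PHONE"]
print(select_ui("NAV", stored))  # few matches, so the second UI is shown
```

With an empty input combination every stored entry matches, so the sketch keeps showing the first UI until the candidate set shrinks below the threshold.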
- The touch input apparatus may detect a touch gesture of selecting a completed character combination from among the arranged completed character combinations.
- The controller may control the display unit to inform that the selected completed character combination was selected.
- The touch input apparatus includes a concave area divided into a first area and a second area, wherein the second area may correspond to the central part of the concave area and have a circular shape, and the first area may surround the circumference of the second area.
- If a gesture of moving a touch location from the first area to the second area is detected, the controller controls the display unit to inform that a character or a completed character combination corresponding to the touch location of the first area was selected.
- If, successively, a gesture of moving the touch location back to the first area is detected, the controller may control the display unit to inform that the selection of the selected character or the selected completed character combination was cancelled.
- If the number of completed character combinations including the entirety of the input character combination is smaller than or equal to the threshold value, the controller may control the display unit to display the second character input UI.
- If the number of completed character combinations including at least one character of the characters configuring the input character combination is smaller than or equal to the threshold value, the controller may control the display unit to display the second character input UI.
- The controller may control the display unit to display a second character input UI in which the completed character combinations corresponding to the input character combination are adjusted to have a predetermined length.
- The controller may reduce the sizes of the completed character combinations so that they have the predetermined length.
- Alternatively, the controller may omit at least one character of the characters configuring each completed character combination so that the completed character combinations have the predetermined length.
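One way to realise the character-omission strategy is sketched below. The predetermined length, the choice of omitting characters from the middle, and the ellipsis marker are all assumptions for illustration; the alternative strategy of reducing the display size of the combinations is a rendering concern and is not modelled here.

```python
# Sketch of trimming a recommended completed character combination to a
# predetermined display length by omitting characters. MAX_LENGTH, the
# middle-omission choice, and the "…" marker are illustrative assumptions.

MAX_LENGTH = 8

def omit_characters(combination, max_length=MAX_LENGTH):
    """Omit characters from the middle so the string fits max_length."""
    if len(combination) <= max_length:
        return combination
    keep = max_length - 1          # reserve one slot for the ellipsis marker
    head = keep - keep // 2        # characters kept from the start
    tail = keep // 2               # characters kept from the end
    return combination[:head] + "…" + combination[-tail:]

print(omit_characters("NAVIGATION SYSTEM"))  # trimmed to 8 display slots
print(omit_characters("RADIO"))              # short enough, unchanged
```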
- A method of controlling a vehicle includes: displaying a first character input User Interface (UI) in which a plurality of characters are arranged to surround a predetermined reference point; detecting a touch gesture of selecting a character of the plurality of characters; and displaying, if the number of pre-stored completed character combinations corresponding to an input character combination, configured with at least one character sequentially selected according to the touch gesture detected by the touch input apparatus, is smaller than or equal to a threshold value, a second character input UI in which the completed character combinations corresponding to the input character combination are arranged to surround the reference point.
- The method may further include: detecting a touch gesture of selecting a completed character combination from among the arranged completed character combinations; and informing that the selected completed character combination was selected.
- The method further includes dividing the concave area into a first area and a second area, wherein the second area corresponds to the central part of the concave area and has a circular shape, and the first area surrounds the circumference of the second area.
- The method further includes, if a gesture of moving a touch location from the first area to the second area is detected while the first character input UI is displayed, informing that a character corresponding to the touch location of the first area was selected.
- The method may further include, if the gesture of moving the touch location from the first area to the second area is detected, and successively a gesture of moving the touch location back to the first area is detected, informing that the selection of the selected character was cancelled.
- The method includes, if a gesture of moving a touch location from the first area to the second area is detected while the second character input UI is displayed, informing that a completed character combination corresponding to the touch location of the first area was selected.
- The method may further include, if the gesture of moving the touch location from the first area to the second area is detected, and successively a gesture of moving the touch location back to the first area is detected, informing that the selection of the selected completed character combination was cancelled.
- The displaying of the second character input UI may include displaying the second character input UI if the number of completed character combinations including the entirety of the input character combination is smaller than or equal to the threshold value.
- The displaying of the second character input UI may include displaying the second character input UI if the number of completed character combinations including at least one character of the characters configuring the input character combination is smaller than or equal to the threshold value.
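The two recommendation conditions differ only in the membership test applied to the stored combinations; a minimal sketch (with hypothetical data) is:

```python
# Sketch of the two recommendation modes: combinations containing the
# entire input combination versus combinations containing at least one of
# its characters. The stored data is a hypothetical example.

def matches_entire(input_combination, stored):
    """Completed combinations that include the entirety of the input combination."""
    return [c for c in stored if input_combination in c]

def matches_any_character(input_combination, stored):
    """Completed combinations that include at least one character of the input."""
    return [c for c in stored if any(ch in c for ch in input_combination)]

stored = ["NAVY", "RADIO", "MEDIA"]
print(matches_entire("NAV", stored))         # only 'NAVY' contains "NAV"
print(matches_any_character("NAV", stored))  # all three contain an 'A'
```

The second mode is the more permissive of the two, so it generally yields a larger candidate set and crosses the threshold later.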
- The displaying of the second character input UI may include displaying a second character input UI in which the completed character combinations corresponding to the input character combination are adjusted to have a predetermined length.
- The displaying of the second character input UI may include reducing the sizes of the completed character combinations so that they have the predetermined length.
- The displaying of the second character input UI may include omitting at least one character of the characters configuring each completed character combination so that the completed character combinations have the predetermined length.
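The select/cancel gestures between the first (annular) and second (central circular) areas can be modelled as a small state machine. The radius, coordinates, and event names below are assumptions for illustration:

```python
import math

# Sketch of the concave-area gesture handling: moving from the surrounding
# first area into the central second area selects the item under the
# first-area touch location; moving back out cancels the selection.
# SECOND_AREA_RADIUS and all coordinates are illustrative assumptions.

SECOND_AREA_RADIUS = 0.5  # central circular part, centred at the origin

def area_of(x, y):
    """Classify a touch location as the central 'second' or surrounding 'first' area."""
    return "second" if math.hypot(x, y) <= SECOND_AREA_RADIUS else "first"

def interpret(touch_path):
    """Turn a sequence of touch locations into select/cancel events."""
    events = []
    prev = None
    for point in touch_path:
        cur = area_of(*point)
        if prev == "first" and cur == "second":
            events.append("selected")
        elif prev == "second" and cur == "first":
            events.append("cancelled")
        prev = cur
    return events

print(interpret([(0.9, 0.0), (0.2, 0.0)]))              # a selection
print(interpret([(0.9, 0.0), (0.2, 0.0), (0.9, 0.0)]))  # selection, then cancelled
```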
- FIG. 1 shows an outer appearance of a vehicle according to an embodiment of the present disclosure.
- The vehicle 1 may include a main body 10 forming the outer appearance of the vehicle 1, wheels 21 and 22 to move the vehicle 1, doors 14 to shield the interior of the vehicle 1 from the outside, a front glass 17 to provide a driver inside the vehicle 1 with a front view, and side-view mirrors 18 and 19 to provide the driver with rear and side views.
- The wheels 21 and 22 may include front wheels 21 provided in the front part of the vehicle 1, and rear wheels 22 provided in the rear part of the vehicle 1.
- The front wheels 21 or the rear wheels 22 may receive rotatory power from a driving apparatus, which will be described later, to move the main body 10 forward or backward.
- The doors 14 may be rotatably provided on the left and right sides of the main body 10 to allow the driver to open one of them and get into the vehicle 1. The doors 14 may also shield the interior of the vehicle 1 from the outside when all of them are closed.
- The front glass 17 may be provided in the upper front part of the main body 10 to provide the driver inside the vehicle 1 with a front view of the vehicle 1.
- The front glass 17 is also called windshield glass.
- The side-view mirrors 18 and 19 may include a left side-view mirror 18 provided on the left side of the main body 10, and a right side-view mirror 19 provided on the right side of the main body 10, to provide the driver inside the vehicle 1 with rear and side views of the vehicle 1.
- FIG. 2 shows an interior of the vehicle 1 according to an embodiment of the present disclosure.
- The vehicle 1 may include: a plurality of seats 11 in which a driver and passengers sit; a dashboard 50 on which a gear box 20, a center fascia 30, and a steering wheel 40 are provided; and a plurality of speakers 60.
- The gear box 20 may include a transmission lever 24 for shifting gears, and a touch input apparatus 100 or 200 for controlling function execution of the vehicle 1.
- The touch input apparatus 100 or 200 will be described in detail later.
- The steering wheel 40, attached to the dashboard 50, may be used to change the driving direction of the vehicle 1.
- The steering wheel 40 may include a rim 41 that can be gripped by a driver, and a spoke 42 connected to a steering apparatus of the vehicle 1 and connecting the rim 41 to a hub of a rotation axis for steering.
- The spoke 42 may include a plurality of manipulation units 42a and 42b for controlling various devices (for example, an audio system) of the vehicle 1.
- An air conditioner 31, a clock 32, an audio system 33, a display, etc. may be installed in the center fascia 30 provided on the dashboard 50.
- The air conditioner 31 may adjust the temperature, humidity, air quality, and flow of air inside the vehicle 1 to maintain a pleasant interior environment.
- The air conditioner 31 may be installed in the center fascia 30, and may include at least one vent 31a for discharging air.
- At least one button or dial for controlling the air conditioner 31, etc. may be provided in the center fascia 30. A driver or passenger may use the button provided on the center fascia 30 to control the air conditioner 31.
- The clock 32 may be positioned around the button or dial for controlling the air conditioner 31.
- The audio system 33 may include an operating panel on which a plurality of buttons for executing its functions are arranged.
- The audio system 33 may provide a radio mode to provide a radio function, and a media mode to reproduce audio files stored in a storage medium.
- The display unit 34 may display a User Interface (UI) to provide the driver with information related to the vehicle 1 in the form of an image or text.
- The display unit 34 may be embedded in the center fascia 30. However, the display unit 34 may be installed in another fashion; for example, it may be separated from the center fascia 30 of the vehicle 1. Details about the operation of the display unit 34 will be described later.
- The display unit 34 may be a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, a Plasma Display Panel (PDP) display, an Organic Light Emitting Diode (OLED) display, or a Cathode Ray Tube (CRT) display, although the display unit 34 is not limited to these.
- The dashboard 50 may further include an instrument panel to display the speed, Revolutions Per Minute (RPM), and fuel gauge of the vehicle 1, and a glove box to store various items.
- The speakers 60 may be provided to output sound.
- The speakers 60 may output information related to the vehicle 1 in the form of sound. For example, if the vehicle 1 receives an output signal for controlling the outputting of a recommended driving method, the speakers 60 may output the recommended driving method corresponding to the output signal in the form of sound for the driver.
- The vehicle 1 is arranged to provide the driver with a character input UI through the display unit 34, so that the driver or a passenger can input a character through the touch input apparatus 100 or 200.
- Hereinafter, a method of manipulating the character input UI of the vehicle 1 according to an embodiment of the present disclosure will be described with reference to FIG. 3.
- FIG. 3 is a control block diagram of the vehicle 1 according to an embodiment of the present disclosure.
- The vehicle 1 includes: the touch input apparatus 100 or 200 to detect a touch gesture; the display unit 34 to display the character input UI; storage to store control commands corresponding to touch gestures; and a controller 400 to control the display unit 34 so as to manipulate the character input UI displayed according to the touch gesture.
- The control command corresponding to the touch gesture detected by the touch input apparatus 100 or 200 may be stored in advance in the storage.
- The storage may provide the control command to the controller 400, which will be described later.
- The storage also stores completed character combinations, and information corresponding to the completed character combinations.
- Examples of the information corresponding to the completed character combinations stored in the storage may include phone numbers, addresses, music, and video.
- The display unit 34 receives the control command from the controller 400, and informs, through the character input UI, that a character selected by a passenger has been selected.
- In a character input UI displayed on the display unit 34, a plurality of characters may be arranged in the form of a circle or oval.
- Hereinafter, an embodiment of a first character input UI displayed on the display unit 34 will be described with reference to FIG. 4.
- FIG. 4 shows an embodiment of a first character input UI displayed on the display unit 34.
- The display unit 34 displays a first character input UI in which a plurality of characters are arranged to surround a predetermined reference point P.
- The characters arranged in the first character input UI may include at least one of consonants, vowels, and special characters of a predetermined language.
- The characters arranged to surround the predetermined reference point P may be displayed in the left area of the first character input UI.
- The first character input UI may display characters selected by the passenger in a box located in the upper right area, sequentially in the order in which they are selected, and may also display completed character combinations recommended for the passenger below the box.
- The completed character combinations recommended for the passenger may be those completed character combinations, among the plurality stored in the storage, that correspond to the input character combination configured with the characters sequentially input by the passenger.
- The first character input UI may recommend completed character combinations including the entirety of the input character combination, or completed character combinations including at least one character of the plurality of characters configuring the input character combination.
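Arranging characters to surround a reference point P can be done by placing them at equal angular steps on a circle (or, with two radii, an oval). The radius, start angle, and coordinate convention below are assumptions for illustration:

```python
import math

# Sketch of arranging a plurality of characters to surround a reference
# point P, as in the first character input UI. Radius, start angle, and
# direction of travel are illustrative assumptions.

def layout(characters, reference=(0.0, 0.0), radius=1.0):
    """Place each character at an equal angular step around the reference point."""
    cx, cy = reference
    step = 2 * math.pi / len(characters)
    return {
        ch: (cx + radius * math.cos(i * step), cy + radius * math.sin(i * step))
        for i, ch in enumerate(characters)
    }

positions = layout("ABCD")
print(positions["A"])  # the first character sits on the positive x-axis
```

The same function serves both UIs: the first UI lays out individual characters, while the second lays out the recommended completed character combinations around the same reference point.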
- The first character input UI shown in FIG. 4 is only one embodiment; the first character input UI can be configured in any other format within the technical concept of arranging a plurality of characters to surround a predetermined reference point P.
- The touch input apparatus 100 or 200 may detect a touch operation by a passenger, including the driver.
- The touch input apparatus 100 or 200 may have a concave area capable of detecting a touch operation.
- Hereinafter, various embodiments of the touch input apparatus 100 or 200 will be described with reference to FIGS. 5A to 5C and 6A to 6C.
- FIG. 5A is a perspective view of the touch input apparatus 100 according to an embodiment of the present disclosure.
- FIG. 5B is a plan view of the touch input apparatus 100 of FIG. 5A.
- FIG. 5C is a cross-sectional view of the touch input apparatus 100 taken along line A-A of FIG. 5B.
- The touch input apparatus 100 shown in FIGS. 5A to 5C may include a touch section 110 to detect a passenger's touch operation, and a border section 120 located along the circumference of the touch section 110.
- The touch section 110 may be a touch pad that generates a signal when a passenger contacts or approaches it with a pointer, such as a finger or a touch pen.
- The passenger may make a predetermined touch gesture on the touch section 110 to input a desired control command.
- Regardless of its name, the touch pad may be a touch film or a touch sheet including a touch sensor. The touch pad may also be a touch panel, i.e., a display unit capable of detecting a touch operation on its screen.
- A "proximity touch" is a touch operation in which a pointer approaches the touch pad and hovers near it, without contacting it, so that the location of the pointer is recognized.
- A "contact touch" is a touch operation in which a pointer contacts the touch pad so that the location of the pointer is recognized.
- The location at which a proximity touch is recognized may be the location at which the pointer is vertically above the touch pad as the pointer approaches it.
- The touch pad may be a resistive type, an optical type, a capacitive type, an ultrasonic type, or a pressure type touch pad. That is, the touch pad may be one of various kinds of touch pads well known in the art.
- the border section 120 surrounding the touch section 110 may be provided as a separate member from the touch section 110.
- one or more key buttons or touch buttons 121 may be arranged in the border section 120. Accordingly, the passenger may input a control command by touching the touch section 110 or using the buttons 121 arranged in the border section 120 around the touch section 110.
- the touch input apparatus 100 may further include a wrist supporting part 130 to support the passenger's wrist.
- the wrist supporting part 130 may be positioned higher than the touch section 110. Since the wrist supporting part 130 is positioned higher than the touch section 110, it may prevent the passenger's wrist from bending when he/she touches the touch section 110 with a finger while resting the wrist on the wrist supporting part 130. Accordingly, the wrist supporting part 130 may help protect the passenger from musculoskeletal disorders while offering a comfortable operating feel.
- the touch section 110 may include an area that is lower than the boundary line with the border section 120. That is, the touch surface of the touch section 110 may be lower than the boundary line with the border section 120. For example, the touch surface of the touch section 110 may be inclined downward from the boundary line with the border section 120, or the touch surface of the touch section 110 may have a step with respect to the boundary line with the border section 120. As shown in FIGS. 5A to 5C , the touch section 110 includes a concave, curved area.
- the passenger can recognize the area of the touch section 110 and the boundary line with his/her tactile impression.
- the center area of the touch section 110 may have a high detection rate with respect to touch operations. Also, when the passenger inputs a touch operation, the passenger can intuitively recognize the touch area and the boundary line so that he/she can apply a touch operation to an exact location, resulting in an improvement in accuracy of touch inputs.
- the touch section 110 includes a concave area, as described above.
- concave means a hollow or depressed shape, and may also include an inclined or stepped shape, as well as a round depressed shape.
- the touch section 110 includes a concave, curved area.
- the concave, curved surface of the touch section 110 may have different curvatures according to area.
- the center area of the concave, curved surface may have a relatively small curvature (that is, a large radius of curvature)
- the outer area of the concave, curved surface may have a relatively great curvature (that is, a small radius of curvature).
- the passenger can feel improved touch sensation when applying a touch input to the touch section 110.
- the curved surface of the touch section 110 may approximately follow the trajectory drawn by a user's fingertip when he/she moves a finger while keeping the wrist fixed, or when he/she rotates or twists the wrist with the fingers spread out.
- the touch section 110 has a circular shape. If the touch section 110 has a circular shape, it may be easy to form a concave, curved area in the touch section 110. Also, if the touch section 110 has a circular shape, the passenger can easily recognize the touch area of the touch section 110 by tactile feeling, so as to easily input rolling or spin operations.
- the passenger can intuitively recognize at which location of the touch section 110 his/her finger is positioned. Also, if the touch section 110 is curved, all points of the touch section 110 may have different gradients. Accordingly, the passenger can intuitively recognize which location of the touch section 110 his/her finger touches through a sense of gradient felt by the finger. That is, the curved shape of the touch section 110 can provide the passenger with feedback about which location of the touch section 110 his/her finger is located when he/she makes a gesture on the touch section 110 while fixing his/her eyes at some other place instead of the touch section 110, thereby helping the passenger make his/her desired gesture and improving the accuracy of gesture inputs.
- the concave area of the touch input apparatus is divided into two areas of a center area and an outer area.
- the concave area of the touch input apparatus is divided into a gesture input section corresponding to the center portion of the concave area and a swiping input section corresponding to the outer portion of the concave area.
- FIG. 6A is a perspective view of a touch input apparatus according to another embodiment of the present disclosure
- FIG. 6B is a plan view of the touch input apparatus of FIG. 6A
- FIG. 6C is a cross-sectional view of the touch input apparatus cut along a line B-B of FIG. 6B .
- a touch input apparatus 200 may include touch sections 210 and 220 to detect a passenger's touch input, and a border section 230 surrounding the touch sections 210 and 220.
- a method in which the touch sections 210 and 220 detect a touch input is the same as the method described above in the embodiment of FIGS. 5A to 5C .
- the border section 230 surrounding the touch sections 210 and 220 may be provided as a separate member from the touch sections 210 and 220.
- one or more key buttons 232a and 232b or one or more touch buttons 231a, 231b, and 231c may be arranged in such a way to surround the touch sections 210 and 220.
- the passenger may make a gesture on the touch sections 210 and 220, or may input a signal using any one of the buttons 231a, 231b, 231c, 232a, and 232b arranged in the border section 230 around the touch sections 210 and 220.
- the touch input apparatus 200 may further include a wrist supporting part 240 located in the lower portion and supporting the passenger's wrist.
- the touch sections 210 and 220 may include an area that is lower than the boundary line with the border section 230. That is, the touch surfaces of the touch sections 210 and 220 may be lower than the boundary line with the border section 230. For example, the touch surfaces of the touch sections 210 and 220 may be inclined downward from the boundary line with the border section 230, or the touch surfaces of the touch sections 210 and 220 may have a step with respect to the boundary line with the border section 230. As shown in FIG. 6C , the touch sections 210 and 220 include a gesture input section 210 that is a concave, curved area.
- the shape of the touch sections 210 and 220 having a concave area is the same as that of the touch section 110 described in the embodiment of FIGS. 5A to 5C .
- the touch sections 210 and 220 may include a swiping input section 220 inclined downward along the circumference of the gesture input section 210. If the touch sections 210 and 220 have a circular shape, the gesture input section 210 may be a part of a spherical surface, and the swiping input section 220 may surround the circumference of the gesture input section 210.
- the swiping input section 220 may detect a swiping gesture. For example, the passenger may make a swiping gesture along the swiping input section 220 provided in the form of a circle. At this time, the passenger may make the swiping gesture clockwise or counterclockwise along the swiping input section 220.
- the swiping input section 220 may include gradations 221.
- the gradations 221 may visually or tactilely inform a passenger of a relative location.
- the gradations 221 may be embossed or engraved.
- the gradations 221 may be arranged at regular intervals. Accordingly, the passenger can intuitively recognize the number of gradations through which his/her finger passes while making a swiping operation so as to accurately adjust a duration of the swiping gesture.
- a cursor displayed on the display unit 34 may move according to the number of gradations 221 through which a finger passes when a swiping gesture is made. For example, while selectable characters are displayed successively on the display unit 34, the currently selectable character may move to the right by one space whenever the passenger's finger passes through a gradation 221.
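The gradation-based cursor movement described above can be sketched as follows; the gradation count, the angle convention, and the clockwise-only handling are illustrative assumptions, not values from the disclosure:

```python
GRADATION_COUNT = 24  # hypothetical number of gradations 221 on the ring

def gradations_passed(start_angle_deg, end_angle_deg):
    """Count gradations crossed by a clockwise swipe (angles in degrees)."""
    spacing = 360.0 / GRADATION_COUNT
    delta = (end_angle_deg - start_angle_deg) % 360.0
    return int(delta // spacing)

def move_cursor(cursor_index, start_angle_deg, end_angle_deg, n_items):
    """Advance the selectable character one space per gradation crossed."""
    steps = gradations_passed(start_angle_deg, end_angle_deg)
    return (cursor_index + steps) % n_items
```

Because the gradations are equally spaced, the passenger's swipe distance maps linearly to cursor steps, which is what makes the duration of the gesture easy to judge by feel.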
- the gradient of the swiping input section 220 shown in FIGS. 6A, 6B, and 6C may be greater than the tangential gradient of the gesture input section 210 at the boundary line between the swiping input section 220 and the gesture input section 210. Since the swiping input section 220 is more steeply inclined than the gesture input section 210, the passenger may intuitively recognize the extent of the gesture input section 210 when making a gesture on it. Meanwhile, while a gesture is being made on the gesture input section 210, touch inputs applied to the swiping input section 220 may not be recognized.
- the gesture input made on the gesture input section 210 may not overlap with any swiping gesture input made on the swiping input section 220.
- the swiping input section 220 may be integrated into the gesture input section 210.
- touch sensors may be installed separately in the gesture input section 210 and the swiping input section 220, or a single touch sensor may cover both sections. If the gesture input section 210 and the swiping input section 220 share a touch sensor, the controller 400 may distinguish the touch area of the gesture input section 210 from that of the swiping input section 220, and thereby distinguish a signal generated by a touch input applied on the gesture input section 210 from a signal generated by a touch input applied on the swiping input section 220.
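With a single shared sensor, the controller's area discrimination could be as simple as a radius test on the touch coordinate; the boundary radius and the coordinate origin here are assumptions for illustration:

```python
import math

GESTURE_RADIUS = 40.0  # hypothetical boundary radius between the sections

def classify_touch(x, y):
    """Map a raw touch coordinate (origin at the pad centre) to a section."""
    r = math.hypot(x, y)
    return "gesture" if r <= GESTURE_RADIUS else "swiping"
```

A touch near the centre would then be routed to the gesture input section 210, and a touch on the outer ring to the swiping input section 220.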
- the touch input apparatus 200 may further include one or more buttons 231 and 232, as described above.
- the buttons 231 and 232 may be arranged around the touch sections 210 and 220.
- the buttons 231 and 232 may include one or more touch buttons 231a, 231b, and 231c that perform a predetermined function according to the passenger's touch input, and one or more pressure buttons 232a and 232b that change position according to pressure applied by the passenger to perform a predetermined function.
- the controller 400 controls the display unit 34 to inform that one of the plurality of characters included in the currently displayed first character input UI was selected, based on a touch input sensed in the concave area of the touch input apparatus 100 or 200.
- the controller 400 may divide the concave area of the touch input apparatus 100 or 200.
- FIG. 7 shows a case in which a concave area of the touch input apparatus 100 shown in FIGS. 5A to 5C is divided.
- the controller 400 divides a touch section 110, which is a concave area, into a first area S1 and a second area S2, wherein the second area S2 corresponds to the central portion of the touch section 110 and has a circular shape, and the first area S1 surrounds the circumference of the second area S2.
- the controller 400 may divide the concave area into a first area S1 and a second area S2 in a different way. More specifically, the controller 400 may set the swiping input section 220 to a first area S1 and the gesture input section 210 to a second area S2.
- the controller 400 may control the display unit 34 to manipulate the first character input UI that is displayed according to a touch gesture detected through the touch input apparatus 100 or 200.
- FIGS. 8A and 8B are views for describing a method of making a touch gesture for selecting a character on the touch input apparatus 200 shown in FIGS. 6A to 6C
- FIGS. 9A , 9B , and 9C are views for describing a method in which the controller 400 controls the first character input UI according to the touch gesture shown in FIGS. 8A and 8B .
- a passenger may touch a second location, which is an arbitrary location on the first area S1 including the swiping input section 220 of the touch input apparatus 200. Then, as shown in FIG. 9A, the controller 400 may control the display unit 34 to enlarge the character G corresponding to the second location relative to the other characters. Thereby, the display unit 34 may inform the passenger that the character G can be selected.
- the swiping input section 220 can recognize a swiping gesture, and the passenger may make a gesture of moving his/her finger clockwise or counterclockwise from the second location of the first area S1.
- FIG. 8A shows a case in which a gesture of moving a finger clockwise from the second location of the first area S1 is made.
- the display unit 34 may enlarge characters sequentially along a direction in which the swiping gesture is made. If a swiping gesture made clockwise is detected as shown in FIG. 8A , the controller 400 may control the display unit 34 to sequentially enlarge characters clockwise from the character G.
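One way to map the swipe position to the enlarged character is to divide the ring into equal angular sectors. A minimal sketch, assuming a 26-letter ring with "A" at 0 degrees running clockwise (the actual layout is not specified in the disclosure):

```python
CHARACTERS = list("ABCDEFGHIJKLMNOPQRSTUVWXYZ")  # assumed ring layout

def character_at(angle_deg):
    """Return the character whose angular sector contains the touch point."""
    sector = 360.0 / len(CHARACTERS)
    return CHARACTERS[int((angle_deg % 360.0) // sector)]
```

As the finger sweeps clockwise, successive angles fall into successive sectors, so the enlarged character advances through the arrangement in the same direction as the swipe.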
- the passenger may visually check, through the display unit 34, the characters that are sequentially enlarged in correspondence to the swiping gesture, and stop swiping when the character he/she wants to select is enlarged. That is, the controller 400 may control the display unit 34 to enlarge the character the passenger wants to select at the first location, i.e., the location at which the finger, having moved from the second location of the first area S1, stops moving.
- FIG. 9B shows a case in which the character the passenger wants to select is "N".
- the passenger may make a gesture of moving his/her finger from the first location of the first area S1 at which swiping is stopped to the second area S2.
- the gesture may correspond to a command for selecting the character corresponding to the first location.
- the controller 400 may control the display unit 34 to inform that the character corresponding to the first location was selected.
- the display unit 34 may display the selected character N in a central area surrounded by a character arrangement configured with a plurality of characters, thereby informing the passenger that the character N was selected.
- if a selection cancellation gesture is detected, the controller 400 may control the display unit 34 to inform that the selection of the character was cancelled. For example, as shown in FIG. 9C, if a selection cancellation gesture is detected after the display unit 34 informs that the character N was selected, the display unit 34 may stop displaying the character N in the central area surrounded by the character arrangement.
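The select/cancel behaviour of the two areas can be summarised as a small transition rule; the area names and the event tuples are illustrative, not from the disclosure:

```python
def interpret_transition(prev_area, new_area, highlighted):
    """Interpret a move between the outer area S1 and the central area S2.

    S1 -> S2 selects the currently highlighted character;
    S2 -> S1 cancels that selection (the FIG. 9C behaviour)."""
    if prev_area == "S1" and new_area == "S2":
        return ("select", highlighted)
    if prev_area == "S2" and new_area == "S1":
        return ("cancel", highlighted)
    return ("none", highlighted)
```

Keeping selection and cancellation as mirror-image moves between the same two areas is what lets the passenger operate the UI without looking at the pad.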
- if the touch input apparatus 200 is configured to sense, through the concave area, pressure applied while a touch gesture is made, the first character input UI may be manipulated according to both the touch gesture and the sensed pressure.
- when pressure is sensed at the first location and then a gesture of moving a finger from the first location to the second area S2 is detected, the controller 400 may recognize the gesture as a character selection command, and the display unit 34 may inform that the character corresponding to the first location was selected.
- when a gesture of moving a finger from the first location to the second area S2 is detected and then pressure is sensed in the second area S2, the controller 400 may likewise recognize a character selection command, and the display unit 34 may inform that the character corresponding to the first location was selected.
- when pressure is sensed at the first location, a gesture of moving the finger from the first location to the second area S2 is detected, and then pressure is sensed in the second area S2, the controller 400 may again recognize a character selection command, and the display unit 34 may inform that the character corresponding to the first location was selected.
- the first character input UI may display completed character combinations "Adendorf", "Affalterbach", "Ahlen", and "Augsburg" corresponding to the input character combination "Amberg", together with the input character combination "Amberg".
- the controller 400 determines whether the number of pre-stored completed character combinations corresponding to the input character combination configured with at least one character selected sequentially by a touch gesture detected by the touch input apparatus 200 is smaller than or equal to a threshold value.
- the threshold value may correspond to the maximum number of completed character combinations which is enough to display a second character input UI which will be described below, and the threshold value may have been decided when the vehicle 1 (see FIG. 1 ) is manufactured, or may be decided by a passenger's input.
- the controller 400 controls the display unit 34 to display a second character input UI in which the completed character combinations corresponding to the input character combination are arranged in the form of a circle or oval.
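The threshold test above might look like the following sketch; the matching criterion (a shared first character), the stored names, and the threshold value are assumptions for illustration:

```python
THRESHOLD = 5  # hypothetical maximum that still fits the circular layout

STORED_NAMES = ["Adendorf", "Affalterbach", "Ahlen", "Augsburg", "Berlin"]

def combos_for(input_chars, names=STORED_NAMES):
    """Return stored names sharing a first character with the input."""
    return [n for n in names if n[:1].upper() == input_chars[:1].upper()]

def should_show_second_ui(input_chars):
    """True when the candidates are few enough for the second input UI."""
    matches = combos_for(input_chars)
    return 0 < len(matches) <= THRESHOLD, matches
```

Once the candidate count falls to the threshold, the controller can switch from the character ring to a ring of whole candidate names.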
- FIG. 10 shows an embodiment of a second character input UI displayed on the display unit 34.
- FIG. 10 shows a second character input UI in which an input character combination "Amberg" and the completed character combinations "Adendorf", "Affalterbach", "Ahlen", and "Augsburg" corresponding to it are arranged to surround a predetermined reference point P.
- a passenger may visually check the second character input UI, and then make a gesture of selecting any one of the arranged completed character combinations on the touch input apparatus 200.
- a gesture of selecting a completed character combination with respect to the second character input UI may be the same as a gesture of selecting a completed character combination with respect to the first character input UI, and a gesture of cancelling a selection of a completed character combination with respect to the second character input UI may also be the same as a gesture of cancelling a selection of a completed character combination with respect to the first character input UI. Accordingly, detailed descriptions thereof will be omitted.
- the controller 400 may search for information corresponding to the selected completed character combination in the storage, and control the vehicle 1 based on the found information. Simultaneously, the controller 400 controls the display unit 34 to inform that the completed character combination was selected. For example, the display unit 34 may display the selected completed character combination in a central area surrounded by the plurality of completed character combinations, thereby informing the passenger that the displayed completed character combination was selected.
- the passenger can select a completed character combination without having to input all characters individually.
- FIG. 11 is a flowchart illustrating a vehicle control method according to an embodiment of the present disclosure.
- the display unit 34 displays a first character input UI in which a plurality of characters are arranged to surround a reference point, in operation 700.
- the characters arranged in the first character input UI may include at least one of consonants, vowels, and special characters of a predetermined language. Since the plurality of characters is arranged in this way, a passenger can easily manipulate the first character input UI through the concave area of the touch input apparatus 100 or 200.
- the touch input apparatus 100 or 200 determines whether a gesture of selecting a character is detected, in operation 710. More specifically, the touch input apparatus 100 or 200 determines whether a gesture of moving a finger from a first location of a first area S1 of the concave area to a second area S2 is detected.
- the second area S2 is a circular area in the center of a touch section which is a concave area
- the first area S1 is a border area surrounding the circumference of the second area S2.
- the first location may be a location at which a touch input is detected that puts the character the passenger wants to select, among the plurality of characters, into a selectable state.
- the touch input apparatus 100 or 200 may continue to determine whether a gesture of selecting a character is detected.
- the display unit 34 displays the selected character, in operation 720. If there is another character selected earlier than the currently selected character, the currently selected character may be displayed following the previously selected character.
- the controller determines whether the number of completed character combinations corresponding to an input character combination configured with the sequentially selected characters is smaller than or equal to a threshold value, in operation 730.
- the threshold value may correspond to the maximum number of completed character combinations which is enough to display a second character input UI which will be described below, and the threshold value may have been decided when the vehicle 1 (see FIG. 1 ) is manufactured, or may be decided by a passenger's input.
- the vehicle 1 may return to operation 710 to determine whether a gesture of selecting a character is detected.
- the display unit 34 displays a second character input UI in which the completed character combinations corresponding to the input character combination are arranged to surround a reference point, in operation 740.
- the completed character combinations arranged in the second character input UI may be completed character combinations including the entirety of the input character combination, or completed character combinations including at least one of a plurality of characters configuring the input character combination.
- the touch input apparatus 100 or 200 determines whether a gesture of selecting a completed character combination from among the completed character combinations is detected, in operation 750. More specifically, the touch input apparatus 100 or 200 may determine whether a gesture of moving a finger from a third location of the first area S1 of the concave area to the second area S2 is detected.
- the third location may be a location at which a touch input is detected that puts the completed character combination the passenger wants to select, among the plurality of completed character combinations, into a selectable state.
- the touch input apparatus 100 or 200 may continue to determine whether a gesture of selecting a completed character combination is detected.
- the display unit 34 displays the selected completed character combination, in operation 760.
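Operations 710 through 740 above can be sketched as a loop that accumulates selected characters until the candidate list is small enough; the prefix-matching criterion and the sample data are assumptions, not fixed by the disclosure:

```python
def run_input_loop(selected_chars, names, threshold):
    """Accumulate characters (operation 720) until the number of matching
    completed combinations drops to the threshold (operations 730-740)."""
    combo = ""
    matches = list(names)
    for c in selected_chars:
        combo += c  # the new character follows the previously selected ones
        matches = [n for n in names if n.startswith(combo)]  # assumed criterion
        if len(matches) <= threshold:
            return combo, matches  # the second character input UI is shown
    return combo, matches
```

For example, with the city names used earlier, entering "A" still leaves four candidates, but adding "f" narrows the list enough to jump straight to the second UI.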
Description
- Embodiments of the present disclosure relate to a vehicle including a touch input apparatus having a concave area, and to a method of controlling the vehicle.
- Vehicles often have various functions for improving a passenger's convenience, in addition to driving functions. However, the more functions a vehicle has, the greater the operating load a driver experiences. Excessive operating load deteriorates a driver's concentration on driving. Also, as a vehicle has more functions, a driver's difficulty in operating the vehicle may increase accordingly, such that the driver may not be able to properly use all the functions the vehicle can perform.
- In order to overcome the problem, studies into a vehicle-mounted input device for reducing a driver's operating load and difficulties have been conducted. A representative example of such a vehicle-mounted input device is a touch input device capable of detecting a driver's touch inputs. If a vehicle has such a touch input device, a driver would be able to easily control the vehicle by touching the touch input device without having to perform any complicated manipulations.
- Related technical background may be found in
EP 2 881 878 A2. Document DE 10 2009 023887 A1 relates to a further input device for a vehicle.
- Therefore, it is an aspect of the present disclosure to provide a vehicle capable of displaying a character input User Interface (UI) to show pre-stored completed character combinations corresponding to an input character combination, and a method of controlling the vehicle.
- Additional aspects of the disclosure will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the disclosure.
- In accordance with one aspect of the present disclosure, a vehicle includes: a display unit for displaying a first character input User Interface (UI) in which a plurality of characters are arranged to surround a predetermined reference point; a touch input apparatus having a concave area for detecting a touch gesture of selecting one character of the plurality of characters; and a controller for controlling the display unit to display, if the number of pre-stored completed character combinations corresponding to an input character combination configured with at least one character sequentially selected according to the touch gesture detected by the touch input apparatus is smaller than or equal to a threshold value, a second character input UI in which the completed character combinations corresponding to the input character combination are arranged to surround the reference point.
- The touch input apparatus may detect a touch gesture of selecting a completed character combination from among the arranged completed character combinations.
- The controller may control the display unit to inform that the selected completed character combination was selected.
- The touch input apparatus includes a concave area divided into a first area and a second area, wherein the second area may correspond to the central part of the concave area and has a circular shape, and the first area may surround the circumference of the second area.
- If a gesture of moving a touch location from the first area to the second area is detected, the controller controls the display unit to inform that a character or a completed character combination corresponding to the touch location of the first area was selected.
- If the gesture of moving a touch location from the first area to the second area is detected, and successively a gesture of moving the touch location to the first area is detected, the controller may control the display unit to inform that the selection of the selected character or the selected completed character combination was cancelled.
- If the number of completed character combinations including the entirety of the input character combination is smaller than or equal to the threshold value, the controller may control the display unit to display the second character input UI.
- If the number of completed character combinations including at least one character of characters configuring the input character combination is smaller than or equal to the threshold value, the controller may control the display unit to display the second character input UI.
- The controller may control the display unit to display a second character input UI in which the completed character combinations corresponding to the input character combination are adjusted to have a predetermined length.
- The controller may reduce the sizes of the completed character combinations so that the completed character combinations have the predetermined length.
- The controller may omit at least one character of characters configuring each completed character combination so that the completed character combinations have the predetermined length.
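The omission described above might be implemented as a simple truncation; the predetermined display length and the ellipsis marker are illustrative assumptions:

```python
MAX_CHARS = 10  # hypothetical predetermined display length

def fit_to_length(name, max_chars=MAX_CHARS):
    """Omit trailing characters so a completed combination fits the layout."""
    if len(name) <= max_chars:
        return name
    return name[:max_chars - 1] + "…"  # mark the omitted characters
```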
- In accordance with another aspect of the present disclosure, a method of controlling a vehicle, the vehicle including a touch input apparatus having a concave area configured to detect a touch gesture, includes: displaying a first character input User Interface (UI) in which a plurality of characters are arranged to surround a predetermined reference point; detecting a touch gesture of selecting a character of the plurality of characters; and displaying, if the number of pre-stored completed character combinations corresponding to an input character combination configured with at least one character sequentially selected according to the touch gesture detected by the touch input apparatus is smaller than or equal to a threshold value, a second character input UI in which the completed character combinations corresponding to the input character combination are arranged to surround the reference point.
- The method may further include: detecting a touch gesture of selecting a completed character combination from among the arranged completed character combinations; and informing that the selected completed character combination was selected.
- The method further includes dividing a concave area into a first area and a second area, wherein the second area corresponds to the central part of the concave area and has a circular shape, and the first area surrounds the circumference of the second area.
- The method further includes, if a gesture of moving a touch location from the first area to the second area is detected when the first character input UI is displayed, informing that a character corresponding to the touch location of the first area was selected.
- The method may further include, if the gesture of moving the touch location from the first area to the second area is detected, and successively a gesture of moving the touch location to the first area is detected, informing that the selection of the selected character was cancelled.
- The method includes, if a gesture of moving a touch location from the first area to the second area is detected when the second character input UI is displayed, informing that a completed character combination corresponding to the touch location of the first area was selected.
- The method may further include, if the gesture of moving the touch location from the first area to the second area is detected, and successively a gesture of moving the touch location to the first area is detected, informing that the selection of the selected completed character combination was cancelled.
- The displaying of the second character input UI may include displaying the second character input UI, if the number of completed character combinations including the entirety of the input character combination is smaller than or equal to the threshold value.
- The displaying of the second character input UI may include displaying the second character input UI, if the number of completed character combinations including at least one character of characters configuring the input character combination is smaller than or equal to the threshold value.
- The displaying of the second character input UI may include displaying a second character input UI in which the completed character combinations corresponding to the input character combination are adjusted to have a predetermined length.
- The displaying of the second character input UI may include reducing the sizes of the completed character combinations so that the completed character combinations have the predetermined length.
- The displaying of the second character input UI may include omitting at least one character of characters configuring each completed character combination so that the completed character combinations have the predetermined length.
- These and/or other aspects of the disclosure will become apparent and more readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
-
FIG. 1 shows an outer appearance of a vehicle according to an embodiment of the present disclosure; -
FIG. 2 shows an interior of a vehicle according to an embodiment of the present disclosure; -
FIG. 3 is a control block diagram of a vehicle according to an embodiment of the present disclosure; -
FIG. 4 shows an embodiment of a first character input User Interface (UI) displayed on a display unit; -
FIG. 5A is a perspective view of a touch input apparatus according to an embodiment of the present disclosure, FIG. 5B is a plan view of the touch input apparatus of FIG. 5A, and FIG. 5C is a cross-sectional view of the touch input apparatus taken along line A-A of FIG. 5B; -
FIG. 6A is a perspective view of a touch input apparatus according to another embodiment of the present disclosure, FIG. 6B is a plan view of the touch input apparatus of FIG. 6A, and FIG. 6C is a cross-sectional view of the touch input apparatus taken along line B-B of FIG. 6B; -
FIG. 7 shows a case in which a concave area of the touch input apparatus shown in FIGS. 5A to 5C is divided; -
FIGS. 8A and 8B are views for describing a method of making a touch gesture for selecting a character on the touch input apparatus shown in FIGS. 6A to 6C; -
FIGS. 9A, 9B, and 9C are views for describing a method in which a controller controls a first character input UI according to the touch gesture shown in FIGS. 8A and 8B; -
FIG. 10 shows an embodiment of a second character input UI displayed on a display unit; and -
FIG. 11 is a flowchart illustrating a vehicle control method according to an embodiment of the present disclosure. - Reference will now be made in detail to the embodiments of the present disclosure, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout.
- Hereinafter, a vehicle, and a control method thereof will be described in detail, with reference to the accompanying drawings.
-
FIG. 1 shows an outer appearance of a vehicle according to an embodiment of the present disclosure. - As shown in
FIG. 1, a vehicle 1 according to an embodiment of the present disclosure may include a main body 10 forming the outer appearance of the vehicle 1, wheels 21 and 22 to move the vehicle 1, doors 14 to shield the interior of the vehicle 1 from the outside, a front glass 17 to provide a driver inside the vehicle 1 with a front view of the vehicle 1, and side-view mirrors 18 and 19 to provide the driver with rear and side views of the vehicle 1. - The wheels 21 and 22 may include front wheels 21 provided in the front part of the vehicle 1, and rear wheels 22 provided in the rear part of the vehicle 1. The front wheels 21 or the rear wheels 22 may receive rotatory power from a driving apparatus, which will be described later, to move the main body 10 forward or backward. - The doors 14 may be rotatably provided on the left and right sides of the main body 10 to allow the driver to open one of them and get into the vehicle 1. The doors 14 may also shield the interior of the vehicle 1 from the outside when all of them are closed. - The front glass 17 may be provided in the upper front part of the main body 10 to provide the driver inside the vehicle 1 with a front view of the vehicle 1. The front glass 17 is also called windshield glass. - The side-view mirrors 18 and 19 may include a left side-view mirror 18 provided on the left side of the main body 10, and a right side-view mirror 19 provided on the right side of the main body 10 to provide the driver inside the vehicle 1 with rear and side views of the vehicle 1. -
FIG. 2 shows an interior of the vehicle 1 according to an embodiment of the present disclosure. - Referring to
FIG. 2, the vehicle 1 may include: a plurality of seats 11 in which a driver and a passenger sit; a dashboard 50 on which a gear box 20, a center fascia 30, and a steering wheel 40 are provided; and a plurality of speakers 60. - The gear box 20 may include a transmission lever 24 for shifting gears, and a touch input apparatus 100, 200 to receive a passenger's touch input. The touch input apparatus 100, 200 will be described in detail later. - The steering wheel 40 attached to the dashboard 50 may be used to change the driving direction of the vehicle 1. The steering wheel 40 may include a rim 41 that can be gripped by a driver, and a spoke 42 connected to a steering apparatus of the vehicle 1 and connecting the rim 41 to a hub of a rotation axis for steering. According to an embodiment, the spoke 42 may include a plurality of manipulation units. - In the center fascia 30 provided on the dashboard 50, an air conditioner 31, a clock 32, an audio system 33, a display, etc. may be installed. - The air conditioner 31 may adjust the temperature, humidity, air quality, and flow of air inside the vehicle 1 to maintain a pleasant environment inside the vehicle 1. The air conditioner 31 may be installed in the center fascia 30, and may include at least one vent 31a for discharging air. In the center fascia 30, at least one button or dial for controlling the air conditioner 31, etc. may be provided. A driver or passenger may use the button provided on the center fascia 30 to control the air conditioner 31. - The clock 32 may be positioned around the button or dial for controlling the air conditioner 31. - The audio system 33 may include an operating panel on which a plurality of buttons for executing functions of the audio system 33 are arranged. The audio system 33 may provide a radio mode for radio functions, and a media mode to reproduce audio files stored in a storage medium. - The
display unit 34 may display a User Interface (UI) to provide a driver with information related to the vehicle 1 in the form of an image or text. The display unit 34 may be embedded in the center fascia 30. However, the display unit 34 may also be installed in another fashion; for example, the display unit 34 may be separate from the center fascia 30 of the vehicle 1. Details about the operations of the display unit 34 will be described later. - The display unit 34 may be a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, a Plasma Display Panel (PDP) display, an Organic Light Emitting Diode (OLED) display, or a Cathode Ray Tube (CRT) display, although the display unit 34 is not limited to these. - Also, the dashboard 50 may further include an instrument panel to display the speed, Revolutions Per Minute (RPM), and fuel gauge of the vehicle 1, and a glove box to store various things. - Inside the vehicle 1, the speakers 60 may be provided to output sound. The speakers 60 may output information related to the vehicle 1 in the form of sound. For example, if the vehicle 1 receives an output signal for controlling the output of a recommended driving method, the speakers 60 may output the recommended driving method corresponding to the output signal in the form of sound for the driver. - Meanwhile, the vehicle 1 is arranged to provide the driver with a character input UI through the display unit 34, so that the driver or a passenger can input a character through the touch input apparatus 100, 200, as described below with reference to FIG. 3. -
FIG. 3 is a control block diagram of the vehicle 1 according to an embodiment of the present disclosure. - In order to manipulate the character input UI, the vehicle 1 includes: the touch input apparatus 100, 200 to detect a touch gesture; the display unit 34 to display the character input UI; storage to store a control command corresponding to the touch gesture; and a controller 400 to control the display unit 34 so as to manipulate the character input UI that is displayed according to the touch gesture. - The control command corresponding to the touch gesture detected by the touch input apparatus 100, 200 may be stored in the storage and used by the controller 400, which will be described later. - Also, the storage has already stored completed character combinations, and information corresponding to the completed character combinations. Examples of the information corresponding to the completed character combinations stored in the storage may include phone numbers, addresses, music, and video. - The display unit 34 receives the control command from the controller 400 and informs the passenger, through the character input UI, that the character he/she selected has been selected. For this operation, in the character input UI displayed on the display unit 34, a plurality of characters may be arranged in the form of a circle or oval. Hereinafter, an embodiment of a first character input UI that is displayed on the display unit 34 will be described with reference to FIG. 4. -
FIG. 4 shows an embodiment of a first character input UI displayed on the display unit 34. - In order to allow a passenger to select a character through the touch input apparatus 100, 200, the display unit 34 displays a first character input UI in which a plurality of characters are arranged to surround a predetermined reference point P. The characters arranged in the first character input UI may include at least one of consonants, vowels, and special characters of a predetermined language. - Referring to
FIG. 4 , the characters arranged to surround the predetermined reference point P may be displayed in the left area of the first character input UI. The first character input UI may display characters selected by the passenger in a box located in the upper, right area, sequentially in the order in which they are selected, and also display completed character combinations that are recommended for the passenger, below the box. - The completed character combinations that are recommended for the passenger may be completed character combinations corresponding to an input character combination configured with the characters sequentially input by the passenger among a plurality of completed character combinations stored in the storage. For example, the first character input UI may recommend completed character combinations including the entirety of the input character combination, or completed character combinations including at least one character of the plurality of characters configuring the input character combination.
- However, the first character input UI shown in
FIG. 4 is only one embodiment of the first character input UI; the first character input UI can be configured in any other format within the technical concept of arranging a plurality of characters to surround a predetermined reference point P. - Referring again to
FIG. 3, the touch input apparatus 100, 200 detects a passenger's touch gesture. The touch input apparatus 100, 200 may include a concave area to detect the touch gesture. - Hereinafter, various embodiments of the touch input apparatus 100, 200 will be described with reference to FIGS. 5A to 5C and 6A to 6C. -
FIG. 5A is a perspective view of the touch input apparatus 100 according to an embodiment of the present disclosure, FIG. 5B is a plan view of the touch input apparatus 100 of FIG. 5A, and FIG. 5C is a cross-sectional view of the touch input apparatus 100 taken along line A-A of FIG. 5B. - The touch input apparatus 100 shown in FIGS. 5A to 5C may include a touch section 110 to detect a passenger's touch operation, and a border section 120 located along the circumference of the touch section 110. - The touch section 110 may be a touch pad to generate a signal when a passenger contacts or approaches it using a pointer, such as his/her finger or a touch pen. The passenger may make a predetermined touch gesture on the touch section 110 to input a desired control command.
- Meanwhile, a touch operation of making a pointer approach a touch pad so as for the pointer to be at the proximity of the touch pad without being in contact with the touch pad in order to recognize the location of the pointer is called "proximity touch", and a touch operation of making a pointer contact a touch pad in order to recognize the location of the pointer is called "contact touch". The location of a pointer at which proximity touch is recognized may be a location at which the pointer approaches a touch pad to be vertical to the touch pad.
- The touch pad may be a resistive type touch pad, an optical type touch pad, a capacitive type touch pad, an ultrasonic type touch pad, or a pressure type touch pad. That is, the touch pad may be one of various kinds of touch pads well-known in the art.
- The
border section 120 surrounding the touch section 110 may be provided as a separate member from the touch section 110. In the border section 120, one or more key buttons or touch buttons 121 may be arranged. Accordingly, the passenger may input a control command by touching the touch section 110 or by using the buttons 121 arranged in the border section 120 around the touch section 110. - The touch input apparatus 100 may further include a wrist supporting part 130 to support the passenger's wrist. The wrist supporting part 130 may be positioned higher than the touch section 110. Since the wrist supporting part 130 is positioned higher than the touch section 110, the wrist supporting part 130 may prevent a passenger's wrist from being bent when he/she touches the touch section 110 with his/her finger while resting his/her wrist on the wrist supporting part 130. Accordingly, the wrist supporting part 130 may protect the passenger from musculoskeletal disorders, while offering a good operation feeling. - The touch section 110 may include an area that is lower than the boundary line with the border section 120. That is, the touch surface of the touch section 110 may be lower than the boundary line with the border section 120. For example, the touch surface of the touch section 110 may be inclined downward from the boundary line with the border section 120, or the touch surface of the touch section 110 may have a step with respect to the boundary line with the border section 120. As shown in FIGS. 5A to 5C, the touch section 110 includes a concave, curved area. - Since the touch section 110 includes an area lower than the boundary line with the border section 120, the passenger can recognize the area of the touch section 110 and the boundary line by touch. In the touch input apparatus 100, the center area of the touch section 110 may have a high detection rate with respect to touch operations. Also, when the passenger inputs a touch operation, the passenger can intuitively recognize the touch area and the boundary line, so that he/she can apply the touch operation at an exact location, resulting in improved accuracy of touch inputs. - The
touch section 110 includes a concave area, as described above. Herein, the term "concave" means a hollow or depressed shape, and may include an inclined or stepped shape as well as a round depressed shape. - Referring to FIG. 5C, the touch section 110 includes a concave, curved area. The concave, curved surface of the touch section 110 may have different curvatures according to area. For example, the center area of the concave, curved surface may have a relatively small curvature (a large radius of curvature), and the outer area of the concave, curved surface may have a relatively great curvature (a small radius of curvature). - Since the touch section 110 includes a curved surface, the passenger can feel an improved touch sensation when applying a touch input to the touch section 110. The curved surface of the touch section 110 may closely correspond to the trajectory drawn by the fingertip when a user moves a finger while keeping the wrist fixed, or rotates or twists the wrist while spreading out the fingers. - Also, the touch section 110 has a circular shape. If the touch section 110 has a circular shape, it may be easy to form a concave, curved area in the touch section 110, and the passenger can easily recognize the touch area of the touch section 110 by touch, so as to easily input rolling or spinning operations. - Also, since the touch section 110 is curved, the passenger can intuitively recognize at which location of the touch section 110 his/her finger is positioned. Because the touch section 110 is curved, all points of the touch section 110 have different gradients, so the passenger can intuitively recognize which location of the touch section 110 his/her finger is touching through the sense of gradient felt by the finger. That is, the curved shape of the touch section 110 can provide the passenger with feedback about where on the touch section 110 his/her finger is located when he/she makes a gesture while looking somewhere other than the touch section 110, thereby helping the passenger make the desired gesture and improving the accuracy of gesture inputs. - However, unlike the embodiment shown in
FIGS. 5A to 5C, the concave area of the touch input apparatus may be divided into two areas: a center area and an outer area. Hereinafter, a case in which the concave area of the touch input apparatus is divided into a gesture input section corresponding to the center portion of the concave area and a swiping input section corresponding to the outer portion of the concave area will be described in detail with reference to FIGS. 6A, 6B, and 6C. -
FIG. 6A is a perspective view of a touch input apparatus according to another embodiment of the present disclosure,FIG. 6B is a plan view of the touch input apparatus ofFIG. 6A , andFIG. 6C is a cross-sectional view of the touch input apparatus cut along a line B-B ofFIG. 6B . - Referring to
FIGS. 6A, 6B, and 6C, a touch input apparatus 200 may include touch sections 210 and 220, and a border section 230 surrounding the touch sections 210 and 220. - The method by which the touch sections 210 and 220 detect a passenger's touch operation is the same as that described above with reference to FIGS. 5A to 5C. - The border section 230 surrounding the touch sections 210 and 220 may be provided as a separate member from the touch sections 210 and 220. In the border section 230, one or more key buttons or one or more touch buttons may be arranged. The passenger may make a gesture on the touch sections 210 and 220, or input a signal using the buttons arranged in the border section 230 around the touch sections 210 and 220. - Also, as shown in
FIGS. 6A to 6C, the touch input apparatus 200 may further include a wrist supporting part 240 located in the lower portion to support the passenger's wrist. - Referring to FIG. 6C, the touch sections 210 and 220 may include an area lower than the boundary line with the border section 230. That is, the touch surfaces of the touch sections 210 and 220 may be lower than the boundary line with the border section 230. For example, the touch surfaces of the touch sections 210 and 220 may be inclined downward from the boundary line with the border section 230, or may have a step with respect to it. As shown in FIG. 6C, the touch sections 210 and 220 include a gesture input section 210 that is a concave, curved area. - The shape of the touch sections 210 and 220 may correspond to the shape of the touch section 110 described in the embodiment of FIGS. 5A to 5C. - The touch sections 210 and 220 may include a swiping input section 220 inclined downward along the circumference of the gesture input section 210. If the touch sections 210 and 220 have a circular shape, the gesture input section 210 may be a part of a spherical surface, and the swiping input section 220 may surround the circumference of the gesture input section 210. - The swiping
input section 220 may detect a swiping gesture. For example, the passenger may make a swiping gesture along the swiping input section 220 provided in the form of a circle. At this time, the passenger may make the swiping gesture clockwise or counterclockwise along the swiping input section 220. - The swiping input section 220 may include gradations 221. The gradations 221 may visually or tactilely inform a passenger of a relative location. For example, the gradations 221 may be embossed or engraved, and may be arranged at regular intervals. Accordingly, the passenger can intuitively recognize the number of gradations his/her finger passes while making a swiping operation, so as to accurately adjust the duration of the swiping gesture. - According to an embodiment, a cursor that is displayed on the display unit 34 (see FIG. 2) may move according to the number of gradations 221 through which a finger passes when a swiping gesture is made. If the passenger makes a swiping gesture while various selected characters are displayed in succession on the display unit 34, the selected character may move to the right by one space whenever the passenger's finger passes a gradation 221. - The gradient of the swiping input section 220 shown in FIGS. 6A, 6B, and 6C may be greater than the tangential gradient of the gesture input section 210 at the boundary line between the swiping input section 220 and the gesture input section 210. Since the swiping input section 220 is more steeply inclined than the gesture input section 210, the passenger may intuitively recognize the gesture input section 210 when making a gesture on it. Meanwhile, while a gesture is being made on the gesture input section 210, touch inputs applied to the swiping input section 220 may not be recognized. Accordingly, even when the passenger makes a gesture on the gesture input section 210 that reaches the boundary line with the swiping input section 220, the gesture input made on the gesture input section 210 does not overlap with any swiping gesture input made on the swiping input section 220. - Meanwhile, the swiping input section 220 may be integrated with the gesture input section 210. A plurality of touch sensors may be respectively installed in the gesture input section 210 and the swiping input section 220, or a single touch sensor may cover both. If the gesture input section 210 and the swiping input section 220 share a touch sensor, the controller 400 may distinguish the touch area of the gesture input section 210 from the touch area of the swiping input section 220, and thereby distinguish a signal generated by a touch input applied to the gesture input section 210 from a signal generated by a touch input applied to the swiping input section 220. - The
touch input apparatus 200 may further include one or more buttons 231 and 232, as described above. The buttons 231 and 232 may be arranged around the touch sections 210 and 220, and may include one or more touch buttons 231 and one or more pressure buttons 232. - Referring again to
FIG. 3, the controller 400 controls the display unit 34 to inform the passenger that one of the plurality of characters included in the currently displayed first character input UI has been selected, based on a touch input sensed in the concave area of the touch input apparatus 100, 200. - Before controlling the display unit 34 to manipulate the first character input UI, the controller 400 may divide the concave area of the touch input apparatus 100, 200 into a first area S1 and a second area S2. -
FIG. 7 shows a case in which the concave area of the touch input apparatus 100 shown in FIGS. 5A to 5C is divided. - In the touch input apparatus 100 shown in FIGS. 5A to 5C, the controller 400 divides the touch section 110, which is a concave area, into a first area S1 and a second area S2, wherein the second area S2 corresponds to the central portion of the touch section 110 and has a circular shape, and the first area S1 surrounds the circumference of the second area S2. - In the touch input apparatus 200 shown in FIGS. 6A to 6C, however, the controller 400 may divide the concave area into a first area S1 and a second area S2 in a different way; more specifically, the controller 400 may set the swiping input section 220 as the first area S1 and the gesture input section 210 as the second area S2. - If the concave area of the
touch input apparatus 100, 200 is divided into the first area S1 and the second area S2 as described above, the controller 400 may control the display unit 34 so as to manipulate the first character input UI that is displayed according to a touch gesture detected through the touch input apparatus 100, 200. - Hereinafter, an embodiment of a method in which the first character input UI that is displayed on the display unit 34 is manipulated according to a touch gesture will be described. For convenience of description, it is assumed that the touch gesture is made on the touch input apparatus 200 shown in FIGS. 6A to 6C. -
FIGS. 8A and 8B are views for describing a method of making a touch gesture for selecting a character on the touch input apparatus 200 shown in FIGS. 6A to 6C, and FIGS. 9A, 9B, and 9C are views for describing a method in which the controller 400 controls the first character input UI according to the touch gesture shown in FIGS. 8A and 8B. - If the first character input UI as shown in
FIG. 4 is displayed, a passenger may touch a second location, which is an arbitrary location in the first area S1 including the swiping input section 220 of the touch input apparatus 200. Then, as shown in FIG. 9A, the controller 400 may control the display unit 34 to enlarge a character G corresponding to the second location relative to the other characters. Thereby, the display unit 34 may inform the passenger that the character G can be selected. - As described above, the swiping input section 220 can recognize a swiping gesture, and the passenger may make a gesture of moving his/her finger clockwise or counterclockwise from the second location of the first area S1. FIG. 8A shows a case in which a gesture of moving a finger clockwise from the second location of the first area S1 is made. - If a swiping gesture is detected from the second location of the first area S1, the display unit 34 may enlarge characters sequentially along the direction in which the swiping gesture is made. If a swiping gesture made clockwise is detected, as shown in FIG. 8A, the controller 400 may control the display unit 34 to sequentially enlarge characters clockwise from the character G. - The passenger may visually check, through the display unit 34, the characters that are sequentially enlarged in correspondence to the swiping gesture, and stop swiping when the character he/she wants to select is enlarged. That is, the controller 400 may control the display unit 34 to enlarge the character the passenger wants to select at a first location, i.e., the location at which the finger that moved from the second location of the first area S1 stops moving. FIG. 9B shows a case in which the character the passenger wants to select is "N". - Then, as shown in
FIG. 8B, the passenger may make a gesture of moving his/her finger from the first location of the first area S1, at which swiping stopped, to the second area S2. This gesture may correspond to a command for selecting the character corresponding to the first location. - As a result, the controller 400 may control the display unit 34 to inform the passenger that the character corresponding to the first location has been selected. Referring to FIG. 9C, the display unit 34 may display the selected character N in a central area surrounded by the character arrangement configured with the plurality of characters, thereby informing the passenger that the character N has been selected. - Meanwhile, if a gesture of moving the finger back to the first area S1 is detected following the gesture of moving the finger from the first location to the second area S2, the controller 400 may control the display unit 34 to inform the passenger that the selection of the character has been cancelled. For example, as shown in FIG. 9C, if a selection cancellation gesture is detected after the passenger has been informed that the character N was selected, the display unit 34 may no longer display the character N in the central area surrounded by the character arrangement. - So far, a case has been described in which, when a gesture of moving a finger from the first location of the first area S1 to the second area S2 is detected, the display unit 34 informs the passenger that the character corresponding to the first location has been selected. - However, if the
touch input apparatus 200 is configured to sense, through the concave area, pressure applied while a touch gesture is made, the first character input UI may be manipulated according to both the touch gesture and the sensed pressure. - For example, when pressure is sensed at the first location and then a gesture of moving a finger from the first location to the second area S2 is detected, the controller 400 may recognize the detected gesture as a character selection command. Accordingly, when pressure is sensed at the first location and then a gesture of moving a finger from the first location to the second area S2 is detected, the display unit 34 may inform the passenger that the character corresponding to the first location has been selected. - When a gesture of moving a finger from the first location to the second area S2 is detected and then pressure is sensed in the second area S2, the controller 400 may likewise recognize the detected gesture as a character selection command. Accordingly, when a gesture of moving a finger from the first location to the second area S2 is detected and then pressure is sensed in the second area S2, the display unit 34 may inform the passenger that the character corresponding to the first location has been selected. - Also, when pressure is sensed at the first location, a gesture of moving a finger from the first location to the second area S2 is detected, and then pressure is sensed in the second area S2, the controller 400 may recognize the detected gesture as a character selection command. Accordingly, when pressure is sensed at the first location, a gesture of moving the finger from the first location to the second area S2 is detected, and then pressure is sensed in the second area S2, the display unit 34 may inform the passenger that the character corresponding to the first location has been selected. - Meanwhile, as shown in
FIG. 4, if the input character combination configured with sequentially input characters is "Amberg", the first character input UI may display the completed character combinations "Adendorf", "Affalterbach", "Ahlen", and "Augsburg" corresponding to the input character combination "Amberg", together with the input character combination "Amberg". - Then, the controller 400 determines whether the number of pre-stored completed character combinations corresponding to the input character combination, configured with at least one character selected sequentially by touch gestures detected by the touch input apparatus 200, is smaller than or equal to a threshold value. Herein, the threshold value may correspond to the maximum number of completed character combinations that can suitably be displayed in the second character input UI, which will be described below; the threshold value may have been decided when the vehicle 1 (see FIG. 1) was manufactured, or may be decided by a passenger's input. - If the number of the pre-stored completed character combinations corresponding to the input character combination is smaller than or equal to the threshold value, the controller 400 controls the display unit 34 to display a second character input UI in which the completed character combinations corresponding to the input character combination are arranged in the form of a circle or oval. -
FIG. 10 shows an embodiment of a second character input UI displayed on the display unit 34. - In FIG. 10, a second character input UI is shown in which the input character combination "Amberg" and the corresponding completed character combinations "Adendorf", "Affalterbach", "Ahlen", and "Augsburg" are arranged to surround a predetermined reference point P. - A passenger may visually check the second character input UI, and then make a gesture of selecting any one of the arranged completed character combinations on the
touch input apparatus 200. - A gesture of selecting a completed character combination with respect to the second character input UI may be the same as a gesture of selecting a completed character combination with respect to the first character input UI, and a gesture of cancelling a selection of a completed character combination with respect to the second character input UI may also be the same as a gesture of cancelling a selection of a completed character combination with respect to the first character input UI. Accordingly, detailed descriptions thereof will be omitted.
- If the gesture of selecting the completed character combination is detected, the
controller 400 may search the storage for information corresponding to the selected completed character combination, and control the vehicle 1 based on the found information. At the same time, the controller 400 controls the display unit 34 to inform the passenger that the completed character combination has been selected. For example, the display unit 34 may display the selected completed character combination in a central area surrounded by the plurality of completed character combinations, thereby informing the passenger that the displayed completed character combination has been selected.
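- The search described above amounts to mapping a completed character combination to its stored information. The following Python sketch is purely illustrative (the dictionary contents and the function name are assumptions; the disclosure only states that phone numbers, addresses, music, and video may be stored):

```python
# Hypothetical storage: completed character combinations mapped to
# stored information such as addresses (illustrative values only).
STORAGE = {
    "Amberg": {"type": "address", "value": "Amberg, Bavaria"},
    "Augsburg": {"type": "address", "value": "Augsburg, Bavaria"},
}

def on_combination_selected(combination):
    """Search the storage for information corresponding to the
    selected completed character combination."""
    info = STORAGE.get(combination)
    if info is None:
        return None  # nothing stored for this combination
    # A real controller would now act on the vehicle (e.g. start
    # navigation); here we simply return the found information.
    return info["value"]

print(on_combination_selected("Amberg"))  # Amberg, Bavaria
```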
-
FIG. 11 is a flowchart illustrating a vehicle control method according to an embodiment of the present disclosure. - Referring to
FIGS. 2 and 11, the display unit 34 displays a first character input UI in which a plurality of characters are arranged to surround a reference point, in operation 700. The characters arranged in the first character input UI may include at least one of consonants, vowels, and special characters of a predetermined language. Since the plurality of characters is arranged in this way, a passenger can easily manipulate the first character input UI through the concave area of the touch input apparatus 100, 200. - Then, the
touch input apparatus 100, 200 determines whether a gesture of selecting a character is detected, in operation 710. More specifically, the touch input apparatus 100, 200 may detect a gesture of moving a touch location from a first location of the first area S1 to the second area S2.
- The second area S2 is a circular area in the center of a touch section which is a concave area, and the first area S1 is a border area surrounding the circumference of the second area S2. The first location may be a location at which a touch input of causing a character a passenger wants to select among the plurality of characters to be in a selectable state can be detected.
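The division of the concave area into the border area S1 and the central circular area S2 lends itself to a simple radius check. The following is a minimal sketch assuming a flat coordinate model with the reference point at the origin; the radii and function names are illustrative assumptions, not values from the patent:

```python
import math

# Assumed geometry: S2 is a central circle, S1 the ring surrounding it.
S2_RADIUS = 1.0   # outer radius of the central circular area S2 (illustrative)
S1_RADIUS = 2.0   # outer radius of the border area S1 (illustrative)

def classify_touch(x, y):
    """Return 'S2', 'S1', or None for a touch at (x, y) from the center."""
    r = math.hypot(x, y)  # distance from the reference point
    if r <= S2_RADIUS:
        return "S2"
    if r <= S1_RADIUS:
        return "S1"
    return None  # outside the concave touch section

def character_at(x, y, characters):
    """Map an S1 touch to the character arranged at that angular sector."""
    if classify_touch(x, y) != "S1":
        return None
    angle = math.atan2(y, x) % (2 * math.pi)
    sector = 2 * math.pi / len(characters)  # equal sector per character
    return characters[int(angle // sector)]
```

With characters arranged around the reference point, a touch in S1 resolves to the character whose sector contains the touch angle, which then becomes the selectable candidate.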
- If no gesture of selecting a character is detected, the touch input apparatus 100, 200 may continue to determine whether a gesture of selecting a character is detected.
- Meanwhile, if a gesture of selecting a character is detected, the
display unit 34 displays the selected character, in operation 720. If there is another character selected earlier than the currently selected character, the currently selected character may be displayed following the previously selected character. - Then, the controller determines whether the number of completed character combinations corresponding to an input character combination configured with the sequentially selected characters is smaller than or equal to a threshold value, in
operation 730. Herein, the threshold value may correspond to the maximum number of completed character combinations that can be displayed in the second character input UI, which will be described below; the threshold value may have been decided when the vehicle 1 (see FIG. 1) was manufactured, or may be decided by a passenger's input. - If the controller determines that the number of the completed character combinations corresponding to the input character combination is greater than the threshold value, the vehicle 1 may return to
operation 710 to determine whether a gesture of selecting a character is detected. - Meanwhile, if the controller determines that the number of the completed character combinations corresponding to the input character combination is smaller than or equal to the threshold value, the
display unit 34 displays a second character input UI in which the completed character combinations corresponding to the input character combination are arranged to surround a reference point, in operation 740. - The completed character combinations arranged in the second character input UI may be completed character combinations including the entirety of the input character combination, or completed character combinations including at least one of a plurality of characters configuring the input character combination.
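The two matching rules just described (combinations containing the entire input combination, or containing at least one of its characters) and the threshold test of operation 730 can be sketched as follows. Function names, the containment test, and the `mode` flag are illustrative assumptions:

```python
def matching_completions(input_chars, stored, mode="full"):
    """Return pre-stored completed character combinations matching the input.

    mode="full": the combination must contain the entire input combination;
    mode="any" : it must contain at least one of the input characters.
    """
    text = "".join(input_chars).lower()
    if mode == "full":
        return [w for w in stored if text in w.lower()]
    return [w for w in stored
            if any(c.lower() in w.lower() for c in input_chars)]

def ui_to_display(input_chars, stored, threshold):
    """Keep the first UI until few enough matches remain, then switch."""
    matches = matching_completions(input_chars, stored)
    return ("second", matches) if len(matches) <= threshold else ("first", matches)
```

With the place names from FIG. 10 as the stored combinations, a single "A" still matches too many entries to switch UIs, while "Am" narrows the list enough for the second character input UI.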
- Then, the touch input apparatus 100, 200 determines whether a gesture of selecting a completed character combination is detected, in operation 750. More specifically, the touch input apparatus 100, 200 may detect a gesture of moving a touch location from a third location of the first area S1 to the second area S2.
- The third location may be a location at which a touch input of causing a completed character combination a passenger wants to select among the plurality of completed character combinations to be in a selectable state can be detected.
- If no gesture of selecting a completed character combination is detected, the touch input apparatus 100, 200 may continue to determine whether a gesture of selecting a completed character combination is detected.
- Meanwhile, if a gesture of selecting a completed character combination is detected, the
display unit 34 displays the selected completed character combination, in operation 760. - According to the vehicle and the control method thereof as described above, by displaying a character input UI that shows pre-stored completed character combinations corresponding to an input character combination configured with input characters, a passenger is provided with an environment in which he/she can easily input characters.
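Putting operations 700 to 760 together, the flow of FIG. 11 can be sketched as a loop. The two detector callbacks are hypothetical stand-ins for the touch input apparatus and the two UIs, not part of the patent:

```python
# Compact sketch of the flow of operations 700-760 in FIG. 11.
# detect_character_gesture() stands in for operations 700-720 (it blocks
# until a character-selection gesture occurs and returns the character);
# detect_combination_gesture() stands in for operations 740-760 (it shows
# the second character input UI and returns the selected combination).
def character_input_flow(stored, threshold, detect_character_gesture,
                         detect_combination_gesture):
    input_chars = []
    while True:
        input_chars.append(detect_character_gesture())
        text = "".join(input_chars).lower()
        # Operation 730: count completed combinations matching the input.
        matches = [w for w in stored if text in w.lower()]
        if len(matches) <= threshold:
            return detect_combination_gesture(matches)
```

Each selected character narrows the match list; once the count drops to the threshold or below, control passes to the second character input UI instead of asking for more characters.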
- As a result, the time taken to input characters and the vehicle operating load can be reduced, which leads to improved driving safety.
- Although a few embodiments of the present disclosure have been shown and described, it would be appreciated by those skilled in the art that changes may be made in these embodiments without departing from the scope of the disclosure, which is defined in the claims.
Claims (15)
- A vehicle (1) comprising:
a display unit (34) configured to display a first character input User Interface (UI) in which a plurality of characters are arranged to surround a predetermined reference point;
a touch input apparatus (100, 200) having a concave area (110, 210, 220) configured to detect a touch gesture of selecting one character of the plurality of characters; and
a controller (400) configured to control the display unit (34) to display, if the number of pre-stored completed character combinations corresponding to an input character combination configured with at least one character sequentially selected according to the touch gesture detected by the touch input apparatus (100, 200) is smaller than or equal to a threshold value, a second character input UI in which the completed character combinations corresponding to the input character combination are arranged to surround the reference point,
wherein the touch input apparatus includes a concave area divided into a first area and a second area, wherein the second area corresponds to the central part of the concave area and has a circular shape, and the first area surrounds the circumference of the second area,
wherein if a gesture of moving a touch location from the first area to the second area is detected, the controller controls the display unit to inform that a character or a completed character combination corresponding to the touch location of the first area was selected.
- The vehicle (1) according to claim 1, wherein the touch input apparatus (100, 200) detects a touch gesture of selecting a completed character combination from among the arranged completed character combinations.
- The vehicle (1) according to claim 1 or 2, wherein the controller (400) controls the display unit (34) to inform that the selected completed character combination was selected.
- The vehicle (1) according to any one of the preceding claims
wherein if the gesture of moving the touch location from the first area (S1) to the second area (S2) is detected, and successively a gesture of moving the touch location to the first area is detected, the controller (400) controls the display unit (34) to inform that the selection of the selected character or the selected completed character combination was cancelled. - The vehicle (1) according to any one of the preceding claims, wherein, if the number of completed character combinations including the entirety of the input character combination is smaller than or equal to the threshold value, the controller controls the display unit (34) to display the second character input UI.
- The vehicle (1) according to any one of the preceding claims, wherein if the number of completed character combinations including at least one character of characters configuring the input character combination is smaller than or equal to the threshold value, the controller controls the display unit to display the second character input UI.
- The vehicle (1) according to any one of the preceding claims, wherein the controller (400) controls the display unit (34) to display a second character input UI in which the completed character combinations corresponding to the input character combination are adjusted to have a predetermined length and arranged.
- The vehicle (1) according to claim 7, wherein the controller (400) reduces the sizes of the completed character combinations so that the completed character combinations have the predetermined length.
- The vehicle (1) according to claim 7, wherein the controller (400) omits at least one character of characters configuring each completed character combination so that the completed character combinations have the predetermined length.
- A method of controlling a vehicle (1), the vehicle (1) including a touch input apparatus (100, 200) having a concave area (110, 210, 220) configured to detect a touch gesture, the method comprising:
displaying a first character input User Interface (UI) in which a plurality of characters are arranged to surround a predetermined reference point;
detecting a touch gesture of selecting a character of the plurality of characters; and
displaying, if the number of pre-stored completed character combinations corresponding to an input character combination configured with at least one character sequentially selected according to the touch gesture detected by the touch input apparatus is smaller than or equal to a threshold value, a second character input UI in which the completed character combinations corresponding to the input character combination are arranged to surround the reference point,
wherein the touch input apparatus includes a concave area divided into a first area and a second area, wherein the second area corresponds to the central part of the concave area and has a circular shape, and the first area surrounds the circumference of the second area; and
if a gesture of moving a touch location from the first area to the second area is detected when the first character input UI is displayed, informing that a character corresponding to the touch location of the first area was selected.
- The method according to claim 10, further comprising:
detecting a touch gesture of selecting a completed character combination from among the arranged completed character combinations;
informing that the selected completed character combination was selected; and
dividing a concave area (110, 210, 220) into a first area (S1) and a second area (S2), wherein the second area corresponds to the central part of the concave area and has a circular shape, and the first area (S1) surrounds the circumference of the second area (S2).
- The method according to claim 11, further comprising, if the gesture of moving the touch location from the first area to the second area is detected, and successively a gesture of moving the touch location to the first area is detected, informing that the selection of the selected character was cancelled.
- The method according to claim 11, further comprising, if a gesture of moving a touch location from the first area to the second area is detected when the second character input UI is displayed, informing that a completed character combination corresponding to the touch location of the first area was selected, and
if the gesture of moving the touch location from the first area to the second area is detected, and successively a gesture of moving the touch location to the first area is detected, informing that the selection of the selected completed character combination was cancelled. - The method according to any one of claims 10 to 13, wherein the displaying of the second character input UI comprises displaying the second character input UI, if the number of completed character combinations including the entirety of the input character combination is smaller than or equal to the threshold value.
- The method according to any one of claims 10 to 14, wherein the displaying of the second character input UI comprises displaying the second character input UI, if the number of completed character combinations including at least one character of characters configuring the input character combination is smaller than or equal to the threshold value.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150095404A KR101678094B1 (en) | 2015-07-03 | 2015-07-03 | Vehicle, and control method for the same |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3113000A1 (en) | 2017-01-04 |
EP3113000B1 (en) | 2021-05-26 |
Family
ID=56363740
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP16177496.3A Active EP3113000B1 (en) | 2015-07-03 | 2016-07-01 | Vehicle and control method for the vehicle |
Country Status (4)
Country | Link |
---|---|
US (1) | US10268675B2 (en) |
EP (1) | EP3113000B1 (en) |
KR (1) | KR101678094B1 (en) |
CN (1) | CN106314151B (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101901194B1 (en) | 2016-12-15 | 2018-09-27 | 현대자동차주식회사 | Vehicle, and control method for the same |
KR102263593B1 (en) * | 2017-02-08 | 2021-06-10 | 현대자동차주식회사 | Vehicle, and control method for the same |
CN106873899A (en) * | 2017-03-21 | 2017-06-20 | 网易(杭州)网络有限公司 | The acquisition methods and device of input information, storage medium and processor |
USD931870S1 (en) * | 2018-12-26 | 2021-09-28 | Leica Biosystems Melbourne Pty Ltd | Display screen with graphical user interface for an automated staining apparatus or portion thereof |
USD926197S1 (en) * | 2018-12-26 | 2021-07-27 | Leica Biosystems Melbourne Pty Ltd | Display screen with graphical user interface for an automated staining apparatus or portion thereof |
US11113933B1 (en) * | 2020-02-28 | 2021-09-07 | Therm-Omega-Tech, Inc. | Visual indication system for feedback controller |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102007026282A1 (en) * | 2007-06-05 | 2009-01-02 | Volkswagen Ag | Method for controlling device of motor vehicle such as navigation device, audio device or video device, involves detecting movement of finger on surface of control element, which has form of cavity |
DE102009023887A1 (en) * | 2009-06-04 | 2011-03-10 | Volkswagen Ag | Operating device for motor vehicle i.e. land vehicle, has operating element spatially separated from display and comprising hollow channel for controlling continuous function of motor vehicle and ring-shaped operating region |
GB2490485A (en) * | 2011-04-26 | 2012-11-07 | Sound Infinity Ltd | User Interface with a concave surface for an Electronic Device |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1509832B1 (en) * | 2002-05-21 | 2009-07-08 | Koninklijke Philips Electronics N.V. | Object entry into an electronic device |
US20060027955A1 (en) * | 2004-08-04 | 2006-02-09 | Barnes Group Inc., A Corporation Of Delaware | Non-linear spring system |
KR100811160B1 (en) * | 2005-06-02 | 2008-03-07 | 삼성전자주식회사 | Electronic device for inputting command 3-dimensionally |
US7443316B2 (en) * | 2005-09-01 | 2008-10-28 | Motorola, Inc. | Entering a character into an electronic device |
KR101136367B1 (en) * | 2005-11-10 | 2012-04-18 | 오의진 | Devices and method for inputting alphabet |
JP5054336B2 (en) | 2006-07-19 | 2012-10-24 | クラリオン株式会社 | Display device and navigation device |
US7509348B2 (en) * | 2006-08-31 | 2009-03-24 | Microsoft Corporation | Radially expanding and context-dependent navigation dial |
KR100867817B1 (en) * | 2006-10-13 | 2008-11-10 | 현대자동차주식회사 | a graphical user interface system for drive information device |
JP5171364B2 (en) | 2008-04-08 | 2013-03-27 | アルパイン株式会社 | Navigation device, search method, and search program |
JP2011141725A (en) | 2010-01-07 | 2011-07-21 | Denso Corp | Navigation device |
JP2013003802A (en) * | 2011-06-15 | 2013-01-07 | Sharp Corp | Character input device, control method for character input device, control program and recording medium |
US8886407B2 (en) | 2011-07-22 | 2014-11-11 | American Megatrends, Inc. | Steering wheel input device having gesture recognition and angle compensation capabilities |
JP5452566B2 (en) | 2011-10-31 | 2014-03-26 | 本田技研工業株式会社 | Vehicle input device |
KR101323281B1 (en) * | 2012-04-06 | 2013-10-29 | 고려대학교 산학협력단 | Input device and method for inputting character |
JP5910345B2 (en) | 2012-06-21 | 2016-04-27 | 富士通株式会社 | Character input program, information processing apparatus, and character input method |
CN103294222B (en) | 2013-05-22 | 2017-06-16 | 小米科技有限责任公司 | A kind of input method and system |
KR102091161B1 (en) | 2013-12-05 | 2020-03-19 | 엘지전자 주식회사 | Mobile terminal and control method for the mobile terminal |
-
2015
- 2015-07-03 KR KR1020150095404A patent/KR101678094B1/en active IP Right Grant
- 2015-11-12 US US14/939,034 patent/US10268675B2/en active Active
- 2015-12-08 CN CN201510897116.4A patent/CN106314151B/en active Active
-
2016
- 2016-07-01 EP EP16177496.3A patent/EP3113000B1/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102007026282A1 (en) * | 2007-06-05 | 2009-01-02 | Volkswagen Ag | Method for controlling device of motor vehicle such as navigation device, audio device or video device, involves detecting movement of finger on surface of control element, which has form of cavity |
DE102009023887A1 (en) * | 2009-06-04 | 2011-03-10 | Volkswagen Ag | Operating device for motor vehicle i.e. land vehicle, has operating element spatially separated from display and comprising hollow channel for controlling continuous function of motor vehicle and ring-shaped operating region |
GB2490485A (en) * | 2011-04-26 | 2012-11-07 | Sound Infinity Ltd | User Interface with a concave surface for an Electronic Device |
Also Published As
Publication number | Publication date |
---|---|
CN106314151B (en) | 2020-10-27 |
EP3113000A1 (en) | 2017-01-04 |
CN106314151A (en) | 2017-01-11 |
US20170004127A1 (en) | 2017-01-05 |
KR101678094B1 (en) | 2016-11-23 |
US10268675B2 (en) | 2019-04-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3113000B1 (en) | Vehicle and control method for the vehicle | |
JP5948711B2 (en) | Deformable pad for tactile control | |
KR101721967B1 (en) | Input apparatus, vehicle comprising the same and control method for the input apparatus | |
KR101685891B1 (en) | Controlling apparatus using touch input and controlling method of the same | |
US10802701B2 (en) | Vehicle including touch input device and control method of the vehicle | |
KR101974372B1 (en) | Control apparatus using touch and vehicle comprising the same | |
US20160137064A1 (en) | Touch input device and vehicle including the same | |
KR102674463B1 (en) | Vehicle, and control method for the same | |
KR101696596B1 (en) | Vehicle, and control method for the same | |
US10514784B2 (en) | Input device for electronic device and vehicle including the same | |
US10732713B2 (en) | Vehicle and control method thereof | |
US10732824B2 (en) | Vehicle and control method thereof | |
CN107305460B (en) | Vehicle and control method thereof | |
KR102671661B1 (en) | Vehicle, and control method for the same | |
KR101889039B1 (en) | Vehicle, and control method for the same | |
KR101665549B1 (en) | Vehicle, and control method for the same | |
KR20180069297A (en) | Vehicle, and control method for the same | |
KR101764610B1 (en) | Vehicle, and control method for the same | |
KR101744736B1 (en) | Controlling apparatus using touch input and controlling method of the same | |
KR20190078708A (en) | Vehicle, and control method for the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20170623 |
|
RBV | Designated contracting states (corrected) |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20200422 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R079 Ref document number: 602016058322 Country of ref document: DE Free format text: PREVIOUS MAIN CLASS: G06F0003048000 Ipc: G06F0003048800 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G06F 3/048 20130101ALI20201210BHEP Ipc: G06F 3/0488 20130101AFI20201210BHEP Ipc: B60K 37/06 20060101ALI20201210BHEP Ipc: G06F 3/023 20060101ALI20201210BHEP |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
INTG | Intention to grant announced |
Effective date: 20210114 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 1396885 Country of ref document: AT Kind code of ref document: T Effective date: 20210615 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602016058322 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG9D |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1396885 Country of ref document: AT Kind code of ref document: T Effective date: 20210526 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210826 Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20210526 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210826 Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210927 Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210926 Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210827 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602016058322 Country of ref document: DE |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20210731 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210731 Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210731 |
|
26N | No opposition filed |
Effective date: 20220301 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210926 Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210701 Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210701 Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210731 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20160701 |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230428 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20240620 Year of fee payment: 9 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: FR Payment date: 20240624 Year of fee payment: 9 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210526 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20240620 Year of fee payment: 9 |