KR101741665B1 - User interface apparatus and vehicle comprising the same, control method for the user interface apparatus - Google Patents

User interface apparatus and vehicle comprising the same, control method for the user interface apparatus Download PDF

Info

Publication number
KR101741665B1
Authority
KR
South Korea
Prior art keywords
index
finger
touch
scroll
touch screen
Prior art date
Application number
KR1020150168616A
Other languages
Korean (ko)
Inventor
이용호
박재석
임상연
진재화
김연지
이정원
양현승
조성태
Original Assignee
현대자동차주식회사 (Hyundai Motor Company)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 현대자동차주식회사 (Hyundai Motor Company)
Priority to KR1020150168616A
Application granted granted Critical
Publication of KR101741665B1

Links

Images

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Abstract

The present invention proposes a user interface device that minimizes the number of user operations, a vehicle including the same, and a control method for the user interface device.
To this end, the user interface device according to an aspect of the present invention, the vehicle, and the control method fix one of two fingers on the touch screen and scroll with the other to provide a second index scroll. This enables efficient list searching, reduces the number of user operations, and allows simpler and quicker touch interaction between the AVN device and the user, thereby enhancing user convenience.

Description

USER INTERFACE APPARATUS AND VEHICLE COMPRISING THE SAME, CONTROL METHOD FOR THE USER INTERFACE APPARATUS

The present invention relates to a user interface device for minimizing the number of operations of a user, a vehicle including the same, and a control method for the user interface device.

In addition to basic driving functions, a vehicle includes additional functions for user convenience, such as an audio function, a video function, a navigation function, air-conditioning control, seat control, and lighting control.

In order to perform such functions, audio, multimedia, and navigation devices are integrated into one system. A device that provides a radio service, an audio service such as CD (Compact Disc) playback, a video service such as DVD (Digital Versatile Disc) playback, and a navigation service such as destination guidance (hereinafter referred to as an AVN device) is provided in the vehicle.

The AVN apparatus outputs a screen for providing a radio service, a screen for providing an audio service, a screen for providing a video service, or a screen for providing a navigation service according to a user's operation. In addition, even when the navigation service is provided, the AVN device can output various screens according to the user's operation such as a destination search screen.

In recent years, the development of vehicle IT technology has increased the number of operations the user (specifically, the driver) performs on the AVN device, even as the AVN device provides more functions. This increase in manipulations reduces the user's concentration on driving, thereby hindering safe driving. Also, as the number of functions in the AVN device increases, the amount of index information to be provided increases, which makes it difficult to search lists efficiently.

One aspect of the present invention provides a user interface device that enables efficient list searching by minimizing the number of user operations through an index scroll method using a plurality of fingers, a vehicle including the same, and a control method for the user interface device.

A user interface device according to one aspect of the present invention includes: a touch screen unit for displaying a plurality of index scrolls; a detection unit for detecting a multi-touch operation signal generated by touch interaction using a plurality of fingers; and a control unit for controlling the touch screen unit to display the plurality of index scrolls according to the detected multi-touch operation signal. The touch screen unit comprises a list display area for displaying a content item list and an index display area for displaying index information serving as an index of the content item list, wherein the index display area includes a first index scroll area in which the beginnings of the content items are displayed and a second index scroll area in which a first-character set is displayed. The detection unit includes a first detection unit for detecting a movement operation and a fixing operation of a first finger, and a second detection unit for detecting a movement operation of a second finger, and calculates multi-touch position values from the operation signals detected through the first detection unit and the second detection unit. After the fixing of the first finger is detected through the first detection unit, the control unit detects the movement of the second finger through the second detection unit, and controls the touch screen unit so that the second index scroll is output through the pressing operation and the fixing operation of the first finger.

The control unit searches the first index scroll displayed in the first index scroll area according to the movement operation of the first finger, and searches the second index scroll displayed in the second index scroll area according to the movement operation of the second finger.

Further, the control unit selects the first character of the content item in accordance with the movement operation of the second finger.

According to another aspect of the present invention, a method of controlling a user interface device includes: detecting a multi-touch operation signal generated by touch interaction using a plurality of fingers; and displaying a plurality of index scrolls through a touch screen unit according to the detected multi-touch operation signal. Detecting the multi-touch operation signal includes detecting a movement operation and a fixing operation of a first finger, and detecting a movement operation of a second finger. Displaying the plurality of index scrolls includes controlling the touch screen unit so that a second index scroll is displayed in accordance with the pressing operation and the fixing operation of the first finger.

The plurality of index scrolls may include a first index scroll providing the beginnings of the content items, and a second index scroll providing a first-character set of the content items.

Displaying the plurality of index scrolls includes displaying the first index scroll in a first index scroll area of the touch screen unit, and displaying the second index scroll in a second index scroll area of the touch screen unit.

Displaying the plurality of index scrolls may include searching the first index scroll displayed in the first index scroll area according to the movement operation of the first finger, and searching the second index scroll displayed in the second index scroll area according to the movement operation of the second finger.

According to one aspect of the present invention, the user interface device, the vehicle including the same, and the control method of the user interface device fix one of two fingers and scroll with the other to provide a second index scroll. This enables efficient list searching, reduces the number of user operations, and facilitates quick and easy touch interaction between the AVN device and the user, thereby enhancing user convenience.

FIG. 1 is a view showing the appearance of a vehicle according to an embodiment of the present invention.
FIG. 2 is a diagram showing the internal configuration of a vehicle according to an embodiment of the present invention.
FIG. 3 is a control block diagram of a user interface device according to an embodiment of the present invention.
FIG. 4 is a view schematically showing a search screen of a user interface device according to an embodiment of the present invention.
FIG. 5 is an operation flowchart illustrating a first algorithm of a search screen provided through multi-touch interaction in a user interface device according to an embodiment of the present invention.
FIGS. 6A to 6E are views showing an example of the search screen provided in FIG. 5.
FIGS. 7A to 7E are views showing another example of the search screen provided in FIG. 5.
FIGS. 8A to 8E are views showing still another example of the search screen provided in FIG. 5.

The embodiments described in this specification and the configurations shown in the drawings are preferred examples of the disclosed invention, and various modifications that can replace the embodiments and drawings of this specification may exist at the time of filing of the present application.

In addition, the same reference numerals or signs shown in the respective figures of this specification indicate components or elements performing substantially the same function.

Also, the terms used herein are used to describe the embodiments and are not intended to restrict and/or limit the disclosed invention. Singular expressions include plural expressions unless the context clearly dictates otherwise. In this specification, terms such as "comprise" or "have" designate the presence of stated features, numbers, steps, operations, components, parts, or combinations thereof, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.

It is also to be understood that terms including ordinals such as "first" and "second" used herein may be used to describe various components, but the components are not limited by these terms; the terms are used only to distinguish one component from another. For example, without departing from the scope of the present invention, a first component may be referred to as a second component, and similarly a second component may be referred to as a first component. The term "and/or" includes any combination of a plurality of related listed items or any one of a plurality of related listed items.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS. Hereinafter, embodiments of a user interface apparatus, a vehicle including the same, and a control method of the user interface apparatus according to the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a view showing the appearance of a vehicle according to an embodiment of the present invention.

Referring to FIG. 1, a vehicle 1 according to an embodiment of the present invention includes a main body 10 forming the outer appearance of the vehicle 1, wheels 21 and 22 for moving the vehicle 1, a drive device 24 for rotating the wheels 21 and 22, doors 14 for shielding the interior of the vehicle 1 from the outside, a windshield 17 for providing a driver inside the vehicle 1 with a view ahead of the vehicle 1, and side mirrors 18 and 19 for providing the driver with a view behind the vehicle 1.

The wheels 21 and 22 include front wheels 21 provided at the front of the vehicle and rear wheels 22 provided at the rear, and the drive device 24 provides rotational force to the front wheels 21 or the rear wheels 22. The drive device 24 may employ an engine that generates rotational force by burning fossil fuel, or a motor that generates rotational force by receiving power from a capacitor (not shown).

The doors 14 are rotatably provided on the left and right sides of the main body 10 so that the driver can enter the vehicle 1 when they are open, and they shield the interior of the vehicle 1 from the outside when closed.

The windshield 17 is provided on the front upper side of the main body 10 so that a driver inside the vehicle 1 can obtain visual information ahead of the vehicle 1; it is also referred to as windshield glass.

The side mirrors 18 and 19 include a left side mirror 18 provided on the left side of the main body 10 and a right side mirror 19 provided on the right side, and allow the driver inside the vehicle 1 to obtain visual information of the sides and rear of the vehicle 1.

In addition, the vehicle 1 may include a proximity sensor for detecting obstacles or other vehicles behind it, and a rain sensor for detecting whether it is raining and the amount of rainfall.

As one example, the proximity sensor transmits a sensing signal toward the side or rear of the vehicle and receives a reflection signal reflected from an obstacle such as another vehicle. Based on the waveform of the received reflection signal, it can detect the presence of an obstacle behind the vehicle 1 and detect the position of the obstacle. Such a proximity sensor may employ a method of transmitting ultrasonic waves and detecting the distance to an obstacle using the ultrasonic waves reflected from the obstacle.
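The distance estimate described above follows from the round-trip time of the ultrasonic echo. The sketch below is illustrative, not taken from the patent; the speed-of-sound constant and function name are assumptions.

```python
# Illustrative ultrasonic ranging: distance from echo round-trip time.
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C (assumed constant)

def obstacle_distance(echo_delay_s):
    """Distance = (time of flight * speed of sound) / 2, since the
    ultrasonic pulse travels out to the obstacle and back."""
    return echo_delay_s * SPEED_OF_SOUND / 2

print(obstacle_distance(0.01))  # a 10 ms round trip -> 1.715 m
```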

FIG. 2 is a diagram showing the internal configuration of a vehicle according to an embodiment of the present invention.

Referring to FIG. 2, the interior of the vehicle 1 may be provided with seats DS and PS on which occupants sit, a dashboard 30 on which various instruments for controlling the operation of the vehicle 1 and displaying its running information are arranged, and a steering wheel 60 for operating the direction of the vehicle 1.

The seats DS and PS may include a driver's seat DS on which the driver sits, a passenger seat PS on which the passenger sits, and a rear seat (not shown) located in the rear of the vehicle 1.

The dashboard 30 may be provided with an instrument panel 31, including a speedometer, fuel gauge, automatic shift selector lever indicator, tachometer, and odometer for indicating information related to driving, as well as a gear box 40 and a center fascia 50.

The gear box 40 is provided with a speed change gear 41 for shifting the vehicle. In addition, as shown in the figure, an input unit 110 is provided in the gear box 40 for the user to input commands for controlling main functions of the AVN apparatus 51 or the vehicle.

The center fascia 50 may be provided with an air conditioner, a clock, the AVN device 51, and the like. The air conditioner controls the temperature, humidity, air cleanliness, and air flow inside the vehicle 1 to keep the interior comfortable, and may include at least one discharge port installed in the center fascia 50 that discharges air. The center fascia 50 may also be provided with buttons or dials for controlling the air conditioner and other devices, so that a user such as the driver can control the air conditioner of the vehicle 1 using a button or dial disposed on the center fascia 50.

The AVN device 51 is implemented as a single system in which the audio, multimedia, and navigation devices of the vehicle 1 are integrated. It can provide a radio service, an audio service for playing a CD or the like, a video service for playing a DVD (Digital Versatile Disc) or the like, a navigation service for guiding the user along a route to a destination, and a telephone service for controlling call reception of a mobile terminal connected to the vehicle 1. In addition, the AVN device 51 can provide a voice recognition service that receives the user's voice, rather than a manual operation, for providing the above-described radio, audio, video, navigation, and telephone services.

The AVN device 51 is equipped with a USB (Universal Serial Bus) port or the like, and can be connected to portable multimedia devices such as a PMP (Portable Multimedia Player), an MP3 (MPEG Audio Layer-3) player, or a PDA (Personal Digital Assistant) to play audio and video files.

The AVN apparatus 51 may be installed on the dashboard 30 in a mounted (stand-alone) manner, or embedded in the center fascia 50.

The user can receive a radio service, an audio service, a video service, and a navigation service through the AVN device 51.

Here, the AVN device 51 may be referred to as a navigation terminal or a display device, and may be referred to as various other terms used by those skilled in the art.

The steering wheel 60 is a device for adjusting the traveling direction of the vehicle, and includes a rim 61 gripped by the driver and spokes 62 connected to the steering device of the vehicle and connecting the rim 61 to a hub of the rotary shaft for steering. According to an embodiment, the spokes 62 may be provided with operating devices for controlling various devices in the vehicle 1, for example the AVN device 51.

The AVN apparatus 51 can display at least one of a radio screen, an audio screen, a video screen, a navigation screen, and a telephone screen through the touch screen unit 120, and can also display a control screen related to the vehicle 1 or a screen related to an additional function executable by the AVN apparatus 51.

According to one embodiment, the AVN apparatus 51 can display various control screens related to the control of the air conditioner through the touch screen unit 120 in cooperation with the above-described air conditioner. In addition, the AVN apparatus 51 can control the air conditioning environment in the vehicle 1 by controlling the operation state of the air conditioner. In addition, the AVN apparatus 51 may display a map indicating a path to a destination to the driver through the touch screen unit 120.

The touch screen unit 120 may be implemented as a liquid crystal display (LCD) panel, a light emitting diode (LED) panel, or an organic light emitting diode (OLED) panel, and can perform both a display function and an input function for instructions or commands.

The touch screen unit 120 displays a screen including a predetermined image according to the operating system (OS) for driving and controlling the AVN apparatus 51 and the applications running on the AVN apparatus 51, and receives instructions or commands as input.

The touch screen unit 120 may display a basic screen according to an application being executed. The basic screen means a screen displayed by the touch screen unit 120 when the touch operation is not performed.

The touch screen unit 120 may display a touch operation screen according to the situation. The touch operation screen means a screen capable of receiving a touch operation from a user.

The input method of the touch screen unit 120 may be a resistive touch screen method for sensing the user's touch operation, a capacitive touch screen method for sensing the touch operation using a capacitive coupling effect, an optical touch screen method using infrared rays, or an ultrasonic touch screen method using ultrasonic waves. Various input methods may be used, and the present invention is not limited thereto.

The touch screen unit 120 constitutes the user interface apparatus 100 together with the input unit 110 provided in the gear box 40.

The user interface device 100 is a device that allows the user to interact with the AVN device 51 provided in the vehicle 1, and receives user commands through a keypad, a remote control, a jog dial (knob), or touch interaction by selecting a character or menu displayed on the touch screen unit 120. As a representative example of the user interface device 100 according to an embodiment of the present invention, a method of operating the various functions provided by the AVN device 51 more simply and quickly using touch interaction will be described.

FIG. 3 is a control block diagram of a user interface device according to an embodiment of the present invention.

Referring to FIG. 3, the user interface device 100 according to an exemplary embodiment of the present invention includes a touch screen unit 120, a detection unit 130, a control unit 140, a storage unit 150, and a communication unit 160.

The touch screen unit 120 may employ a touch screen panel (TSP) that receives a control command from a user and displays operation information corresponding to the received control command.

The touch screen panel includes a display for displaying operation information and the control commands that the user can input, a touch panel for detecting the coordinates at which a part of the user's body makes contact, and a touch screen controller for determining the control command input by the user based on the detected contact coordinates.

The touch screen controller can recognize the control command input by the user by comparing the touch coordinates detected through the touch panel with the coordinates of the control commands displayed through the touch screen unit 120.
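The coordinate comparison performed by the touch screen controller can be sketched as a simple hit test. This is an illustrative sketch, not the patent's implementation: the region names, geometry, and function names below are assumptions, chosen only to mirror the list display area 121 and index display areas 122a/122b described later.

```python
# Hypothetical hit test: map a touch coordinate to the displayed control.
from dataclasses import dataclass

@dataclass
class Region:
    name: str        # command the region represents (illustrative names)
    x: int; y: int   # top-left corner in screen pixels
    w: int; h: int   # width and height

    def contains(self, tx, ty):
        return self.x <= tx < self.x + self.w and self.y <= ty < self.y + self.h

def resolve_command(regions, tx, ty):
    """Compare detected touch coordinates with the coordinates of the
    displayed controls and return the matching command, if any."""
    for r in regions:
        if r.contains(tx, ty):
            return r.name
    return None

regions = [
    Region("list_display", 0, 0, 500, 480),          # like area 121 (left)
    Region("first_index_scroll", 500, 0, 70, 480),   # like area 122a
    Region("second_index_scroll", 570, 0, 70, 480),  # like area 122b
]
print(resolve_command(regions, 530, 100))  # -> first_index_scroll
```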

The touch screen unit 120 senses a touch interaction applied to a touch panel and can generate and output an electrical signal based on the sensed touch interaction. The touch interaction may be performed by a touch means, for example, a finger, a hand or a touch pen. Accordingly, the touch screen unit 120 can receive an instruction or command from the user. In this case, the touch screen unit 120 may output the screen to the outside or may not output the screen.

Also, the touch screen unit 120 can display index items as well as images and pictures; it can display operation information of the navigation function, such as a map for route guidance and road environment information, can display music information for the audio function, and can display a search screen for searching the Internet. Here, an index item means a set of characters. The characters displayed in an index item can include symbols, numbers, special characters, and the like, and can be in Hangul (consonants and vowels), English, Japanese, and many other languages.

The detection unit 130 detects multi-touch position values using the electrical signals generated according to touch interactions on the touch screen unit 120, and includes a first detection unit 131 for detecting the movement direction, movement distance, and fixing of a first touching finger (hereinafter, a first finger) and a second detection unit 132 for detecting the movement direction and movement distance of a second touching finger (hereinafter, a second finger).

Accordingly, the detection unit 130 detects the movement and fixing signals of the first finger and the movement signal of the second finger, calculates the multi-touch position values, and transmits the detected signals to the control unit 140.

The control unit 140 is a microprocessor that controls the overall operation of the user interface device 100. The control unit 140 controls the first index information displayed through the touch screen unit 120 to be searched according to the movement signal of the first finger detected through the first detection unit 131, controls the second index information to be displayed through the touch screen unit 120 according to the fixing signal of the first finger detected through the first detection unit 131, and controls the content item list displayed through the touch screen unit 120 to be changed according to the movement signal of the second finger detected through the second detection unit 132.
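The controller behavior described above amounts to a small state machine: a first-finger move searches the first index, a first-finger fix reveals the second index scroll, and a second-finger move then searches the second index. A minimal sketch, with class, method, and event names assumed for illustration:

```python
# Illustrative state machine for the two-finger index scroll interaction.
class IndexScrollController:
    def __init__(self):
        self.second_index_visible = False
        self.events = []  # recorded actions, for demonstration only

    def on_first_finger_move(self, dy):
        # Moving the first finger searches the first index scroll.
        self.events.append(("search_first_index", dy))

    def on_first_finger_fixed(self):
        # Pressing and holding (fixing) the first finger causes the
        # second index scroll to be displayed.
        self.second_index_visible = True
        self.events.append(("show_second_index", None))

    def on_second_finger_move(self, dy):
        # With the first finger fixed, moving the second finger searches
        # the second index scroll and updates the content item list.
        if self.second_index_visible:
            self.events.append(("search_second_index", dy))

c = IndexScrollController()
c.on_first_finger_move(-40)   # scrub the beginnings with finger 1
c.on_first_finger_fixed()     # hold finger 1: second index appears
c.on_second_finger_move(25)   # scroll the first-character set with finger 2
```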

The controller 140 manages the content provided to the user and controls the touch screen unit 120 to display a predetermined content according to a user's input. In addition, the control unit 140 may control the touch screen unit 120 to display a search screen for the user to search for a predetermined item of content.

In addition, the control unit 140 may classify each content item according to the prefix, and arrange each content item according to the prefix to generate the content item list.
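Classifying and arranging content items by their beginnings, as described above, can be sketched as follows. The function name and the ASCII-only grouping key are assumptions for illustration, not the patent's method:

```python
# Illustrative grouping of a content item list under each item's beginning.
from itertools import groupby

def build_item_list(items):
    """Sort items, then group them under the initial character of each item."""
    ordered = sorted(items, key=str.casefold)
    return {k.upper(): list(g)
            for k, g in groupby(ordered, key=lambda s: s[0].casefold())}

songs = ["James Blunt", "Adele", "Arcade Fire", "Joy Division"]
print(build_item_list(songs))
# {'A': ['Adele', 'Arcade Fire'], 'J': ['James Blunt', 'Joy Division']}
```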

The storage unit 150 stores various data necessary for the operation of the user interface device 100. That is, the storage unit 150 may store various applications necessary for providing an operating system or information required for interface operation.

More specifically, the storage unit 150 stores a control program 151 for controlling the user interface device 100, control data 153 for controlling the operation of the user interface device 100, map data and road data for route guidance, and destination information 157 related to destinations selected by the user.

Here, the map data is the map used by the AVN apparatus 51 to guide the user along a route, and the road data is information related to the roads included in the map that the AVN apparatus 51 displays to the user for route guidance.

The road data may include road information necessary for driving and route guidance of the vehicle, such as the location of each road, its length, and its speed limit. In addition, the roads included in the map are divided into a plurality of road sections on the basis of intersections with other roads, and the road data may include road information for each divided road section.

The destination information 157 is information related to destinations for which the user has searched routes through the AVN apparatus 51. The destination information 157 may include the date on which the user searched for the destination, and the name, address, and longitude and latitude of the destination, among other details.

Specifically, when the user inputs a destination to the AVN apparatus 51 and receives route guidance to the destination through the AVN apparatus 51, the AVN apparatus 51 automatically stores information related to the destination in the storage unit 150. The destination-related information stored in the storage unit 150 becomes the destination information 157.

In addition, the storage unit 150 may store operation data or the like generated while the user interface device 100 performs a predetermined operation.

The storage unit 150 may include at least one storage medium among a flash memory, a hard disk, a card-type memory (for example, SD or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optical disk.

The communication unit 160 may include a wireless fidelity (WiFi) communication module 161 that connects to a local area network (LAN) through a wireless access point or the like, a Bluetooth communication module 163 for one-to-one communication with an external device or one-to-many communication with a small number of external devices, a broadcast signal receiving module 165 for receiving digital broadcast signals, a location information receiving module 167, and the like.

The communication unit 160 may be connected to other devices using a GSM/3GPP-family communication system (GSM, HSDPA, LTE Advanced), a 3GPP2-family communication system (CDMA, etc.), or a wireless communication protocol such as WiMAX.

In addition, the communication unit 160 may exchange data with GPS satellites to receive the current position information of the vehicle 1, or receive map information from a server located at a remote location. The position information and map information of the vehicle 1 may be used to provide a route to a destination set by the user.

In addition, the communication unit 160 may be connected to other devices to transmit and receive multimedia data. Specifically, the communication unit 160 may be connected to a mobile terminal located near the vehicle 1 or a server located at a remote location, and may receive multimedia data from the mobile terminal or the server. For example, the communication unit 160 may be connected to a user's smartphone and receive multimedia stored on the smartphone.

Meanwhile, the user interface device 100 according to an embodiment of the present invention may further include an acoustic input unit 170 for receiving sounds and an acoustic output unit 180 for outputting sounds.

The sound input unit 170 includes a microphone 171 that receives an acoustic signal from outside the AVN apparatus 51 and converts it into an electrical signal. In addition, the sound input unit 170 may include an amplifier for amplifying the electrical signal converted by the microphone 171, and an analog-to-digital converter (ADC) for digitizing the electrical signal.

The sound output unit 180 includes a speaker 181 that converts an electrical signal into an acoustic signal and outputs it to the outside of the AVN apparatus 51. In addition, the sound output unit 180 may further include a digital-to-analog converter (DAC) for converting a digitized electrical signal into an analog signal, and an amplifier for amplifying the analog signal converted by the DAC.

FIG. 4 is a view schematically showing a search screen of a user interface device according to an embodiment of the present invention.

Referring to FIG. 4, the touch screen unit 120 includes a list display area 121 in which the content item list is displayed, and an index display area 122 in which index information serving as an index of the content item list is displayed.

The list display area 121 is provided on the left side of the touch screen unit 120 and the index display area 122 is provided on the right side of the touch screen unit 120.

The index display area 122 includes a first index scroll area 122a for displaying the beginnings of the items to be searched, and a second index scroll area 122b for displaying the first-character set of the items to be searched.

Hereinafter, the operation of the user interface device, the vehicle, and the control method of the user interface device according to an embodiment of the present invention will be described.

FIG. 5 is a flow chart showing a first algorithm of a search screen provided through a multi-touch interaction in a user interface device according to an embodiment of the present invention, and FIGS. 6A to 6E are views showing an example of the search screen provided in FIG. 5.

Referring to FIG. 5, the control unit 140 controls the touch screen unit 120 to display the content item list in the list display area 121 and to display, in the first index scroll area 122a of the index display area 122, first index information that serves as an index of the list (200, see FIG. 6A).

The first index information displayed in the first index scroll area 122a of the index display area 122 is the initial character of the item to be searched through the first index scroll.

The initial character is the first character of an item's first syllable. For example, if the content item is an English name such as "James Blunt", the initial character is the first letter "J" of "James Blunt"; if the content item is a Korean name such as "Hong Gil-dong", the initial character can be the first consonant of "Hong Gil-dong".
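The initial-character extraction described above can be sketched in code. This is not part of the patent: the function name, the upper-casing fallback for non-Hangul text, and the use of standard Unicode arithmetic for decomposing precomposed Hangul syllables are illustrative assumptions.

```python
# The 19 Hangul initial consonants (choseong) in Unicode order.
CHOSEONG = [
    "ㄱ", "ㄲ", "ㄴ", "ㄷ", "ㄸ", "ㄹ", "ㅁ", "ㅂ", "ㅃ", "ㅅ",
    "ㅆ", "ㅇ", "ㅈ", "ㅉ", "ㅊ", "ㅋ", "ㅌ", "ㅍ", "ㅎ",
]

def initial_index_char(item: str) -> str:
    """Return the first-index character for a content item title."""
    first = item.strip()[0]
    code = ord(first)
    if 0xAC00 <= code <= 0xD7A3:
        # Precomposed Hangul syllable: choseong index is the code point
        # offset divided by 21 vowels x 28 final consonants.
        return CHOSEONG[(code - 0xAC00) // (21 * 28)]
    return first.upper()  # e.g. "James Blunt" -> "J"
```

With this sketch, `initial_index_char("James Blunt")` yields `"J"` and a Korean title starting with "홍" yields "ㅎ".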

At this time, the control unit 140 can arrange the content items in ascending order by initial character to generate the content item list, and can control the touch screen unit 120 to display the content item list in the list display area 121.

As shown in FIG. 6A, in a state in which the item list is displayed in the list display area 121 and the first index information is displayed in the first index scroll area 122a of the index display area 122, the user moves the first index scroll up and down by touching the first index scroll area 122a with the first finger 191 (202).

When the first index scroll is moved up and down using the first finger 191, the initial character displayed in the first index scroll area 122a changes, moving toward the initial character of the item to be searched.

In this way, the user can find the initial character of the item to be searched through the first index scroll (204, see FIG. 6B).
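Step 202 above amounts to mapping the first finger's vertical touch coordinate within the index scroll area to one index entry. A minimal sketch, assuming a simple linear layout (the area geometry and function name are illustrative, not from the patent):

```python
def index_at(touch_y: float, area_top: float, area_height: float, entries: list):
    """Return the index entry under a touch at touch_y within the scroll area."""
    frac = (touch_y - area_top) / area_height
    # Clamp so touches slightly outside the area still resolve to an entry.
    i = min(max(int(frac * len(entries)), 0), len(entries) - 1)
    return entries[i]
```

As the finger drags up or down, repeated calls return the entry currently under the finger, which is what the first index scroll displays.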

Accordingly, the control unit 140 determines whether the initial character of the item to be searched has been found and selected (206). The user can press the found initial character to select it.

As a result of the determination in step 206, if the initial character is not selected, the control unit 140 returns to step 202 and proceeds with the subsequent operation.

On the other hand, if it is determined in step 206 that the initial character is selected, the control unit 140 determines whether the first finger 191 is fixed (208). Whether the first finger 191 is fixed can be detected through the first detection unit 131 by determining whether the user holds the first finger in place for a predetermined period of time.

If it is determined in step 208 that the first finger 191 is not fixed, the process returns to step 202 to proceed with the subsequent operation.

If it is determined in step 208 that the first finger 191 is fixed, the control unit 140 controls the touch screen unit 120 to display the second index scroll in the second index scroll area 122b of the index display area 122 (210, see FIG. 6C).

The second index information displayed in the second index scroll area 122b of the index display area 122 is the first-letter vowel of the item to be searched through the second index scroll.

The first-letter vowel is the vowel that follows the initial consonant in the first character of the item. For example, the first-letter vowel of one content item can be "ㅏ", while that of another can be "ㅣ".
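As with the initial consonant, the first-letter vowel of a precomposed Hangul syllable can be decomposed arithmetically. This sketch is not from the patent; the function name and the `None` fallback for non-Hangul input are assumptions.

```python
# The 21 Hangul medial vowels (jungseong) in Unicode order.
JUNGSEONG = [
    "ㅏ", "ㅐ", "ㅑ", "ㅒ", "ㅓ", "ㅔ", "ㅕ", "ㅖ", "ㅗ", "ㅘ", "ㅙ",
    "ㅚ", "ㅛ", "ㅜ", "ㅝ", "ㅞ", "ㅟ", "ㅠ", "ㅡ", "ㅢ", "ㅣ",
]

def first_letter_vowel(item: str):
    """Return the vowel of the item's first character, or None for non-Hangul."""
    code = ord(item.strip()[0])
    if not 0xAC00 <= code <= 0xD7A3:
        return None
    # Jungseong index: remainder within a choseong block, divided by the
    # 28 possible final consonants.
    return JUNGSEONG[(code - 0xAC00) % (21 * 28) // 28]
```

For instance, a title beginning with "사" yields "ㅏ", and one beginning with "이" yields "ㅣ".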

At this time, the control unit 140 can arrange the content items in ascending order by first-letter vowel to generate the content item list, and can control the touch screen unit 120 to display the content item list in the list display area 121.

In a state in which the content item list is displayed in the list display area 121 and the first-letter vowels of the content items are displayed in the second index scroll area 122b of the index display area 122, the user moves the second index scroll up and down by touching the second index scroll area 122b with the second finger 192 (212).

When the second index scroll is moved up and down using the second finger 192, the first-letter vowel displayed in the second index scroll area 122b changes, moving toward the vowel of the item to be searched.

In this way, the user can find the first-letter vowel of the item to be searched through the second index scroll (214, see FIGS. 6C and 6D).

Accordingly, the control unit 140 determines whether the first-letter vowel of the item to be searched has been found and selected (216). The user can press the found first-letter vowel to select it.

If it is determined in step 216 that the first-letter vowel is not selected, the control unit 140 returns to step 212 and proceeds with the subsequent operation.

If the first-letter vowel is selected in step 216, the control unit 140 controls the touch screen unit 120 to display, in the list display area 121, the list of content items corresponding to the selected initial character and first-letter vowel (218). Whether the second finger 192 is moved can be detected through the second detection unit 132 by determining whether the user moves the second finger 192 while the first finger 191 remains fixed.
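Step 218 amounts to keeping only the items whose first syllable matches both the selected initial consonant and the selected vowel. A minimal sketch under the same Hangul-decomposition assumption as above (function names are illustrative):

```python
def matches(item: str, initial_idx: int, vowel_idx: int) -> bool:
    """True if the item's first syllable has the given choseong and jungseong."""
    code = ord(item[0])
    if not 0xAC00 <= code <= 0xD7A3:
        return False
    offset = code - 0xAC00
    return (offset // (21 * 28) == initial_idx
            and offset % (21 * 28) // 28 == vowel_idx)

def filter_items(items, initial_idx, vowel_idx):
    """Content items whose first character matches the selected indices."""
    return [it for it in items if matches(it, initial_idx, vowel_idx)]
```

For example, with the initial consonant "ㅅ" (index 9) and vowel "ㅏ" (index 0) selected, only titles beginning with syllables such as "사" survive the filter.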

When the initial character and the first-letter vowel of the item to be searched are selected, the list is displayed with the first character of the item to be searched positioned at the center of the list display area 121.
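The centering behavior described above can be sketched as computing the scroll offset that places the matched row mid-viewport, clamped to the list bounds. Row height, viewport height, and the function name are illustrative assumptions.

```python
def center_scroll_offset(items, match_index, row_height, viewport_height):
    """Top scroll offset (clamped) that puts items[match_index] in the middle."""
    target = match_index * row_height - (viewport_height - row_height) // 2
    max_offset = max(0, len(items) * row_height - viewport_height)
    return min(max(0, target), max_offset)
```

Rows near either end of the list clamp to the top or bottom rather than leaving blank space in the viewport.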

On the other hand, when the two fingers 191 and 192 are lifted from the search screen after moving to the first character of the content item to be searched, the second index scroll area 122b of the index display area 122 disappears together with the second index scroll (see FIG. 6E).

In the above description, the index display area 122 is provided on the right side of the touch screen unit 120. However, the present invention is not limited thereto, and the index display area 122 may instead be provided on the left side of the touch screen unit 120. An index display area 122 provided on the left side of the touch screen unit 120 will be described with reference to FIGS. 7A to 7E.

FIGS. 7A to 7E are views showing another example of the search screen provided in FIG. 5.

Referring to FIGS. 7A to 7E, the list display area 121 is provided on the right side of the touch screen unit 120, and the index display area 122 is provided on the left side of the touch screen unit 120.

The index display area 122 includes a first index scroll area 122a in which the initial character of the item to be searched is displayed and a second index scroll area 122b in which the first-letter vowel of the item to be searched is displayed.

The search screen shown in FIGS. 7A to 7E is outputted in the same manner as the search screen shown in FIGS. 6A to 6E, and a duplicate description will be omitted.

Although the second index scroll area 122b is provided in the form of a bar like the first index scroll area 122a in the embodiment of the present invention, the present invention is not limited thereto; the second index scroll area 122b may be provided in a circular or semicircular shape. A second index scroll area 122b provided in a semicircular shape will be described with reference to FIGS. 8A to 8E.

FIGS. 8A to 8E are views showing still another example of the search screen provided in FIG. 5.

Referring to FIGS. 8A to 8E, the list display area 121 is provided on the right side of the touch screen unit 120, and the index display area 122 is provided on the left side of the touch screen unit 120.

In the index display area 122, the first index scroll area 122a, in which the initial character of the item to be searched is displayed, is provided in the form of a bar, and the second index scroll area 122b, in which the first-letter vowel of the item to be searched is displayed, is provided in the form of a semicircle.

The search screen shown in FIGS. 8A to 8E is outputted in the same manner as the search screen shown in FIGS. 6A to 6E and FIGS. 7A to 7E, and thus a duplicated description will be omitted.

In the embodiment of the present invention, the index display area 122 is provided on the right or left side of the touch screen unit 120. However, the present invention is not limited thereto, and the index display area 122 may also be provided on the upper or lower side of the touch screen unit 120.

It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the disclosed methods should be considered in an illustrative rather than a restrictive sense. It is intended that the present invention cover the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.

1: Vehicle 30: Dashboard
40: gear box 51: AVN device
100: user interface device 110: input unit
120: Touch screen unit 121: List display area
122: index display area 122a: first index scroll area
122b: second index scroll region 130: detecting section
131, 132: first and second detection units 140: control unit
160: communication unit 191, 192: first and second fingers

Claims (20)

A touch screen unit displaying a plurality of index scrolls;
A detection unit for detecting a multi-touch operation signal generated by a touch interaction using a plurality of fingers;
And a controller for controlling the touch screen unit to display the plurality of index scrolls according to the detected multi-touch operation signal,
Wherein the touch screen unit comprises:
A list display area for displaying the content item list and an index display area for displaying index information which is an index of the content item list,
Wherein the index display area includes:
A first index scroll region in which a first character of the content item is displayed and a second index scroll region in which a first character collection of the content item is displayed,
Wherein the detection unit comprises:
A first detecting unit detecting a moving operation and a fixing operation of the first finger;
And a second detecting unit that detects a moving operation of the second finger,
And detects a touch position value in accordance with the operation signal detected through the first detection unit and the second detection unit,
Wherein,
Detecting whether or not the first finger is fixed through the first detection unit and detecting whether the second finger is moved through the second detection unit after fixing the first finger;
And controls the touch screen unit so that the second index scroll is output through a pressing operation and a fixing operation of the first finger.
delete delete delete delete delete The method according to claim 1,
Wherein,
Wherein a first index scroll displayed in the first index scroll region is searched according to a movement operation of the first finger, and a second index scroll displayed in the second index scroll region is searched in accordance with a movement operation of the second finger.
delete 8. The method of claim 7,
Wherein,
And selects the first letter of the item of content according to the movement of the second finger.
delete delete Detecting a multi-touch operation signal by touch interaction using a plurality of fingers;
And displaying the plurality of index scrolls through the touch screen unit by receiving the detected multi-touch operation signal,
To detect the multi-touch operation signal,
Detecting a movement operation and a fixing operation of the first finger;
Detecting a movement operation of the second finger;
Detecting whether the second finger is moved after the fixing of the first finger is determined,
Displaying the plurality of index scrolls,
And controls the touch screen unit to output a second index scroll in accordance with a pressing operation and a fixing operation of the first finger.
delete delete 13. The method of claim 12,
Wherein the plurality of index scrolls include:
A first index scroll providing a leading character of an item of content;
And a second index scroll providing a first letter collection of the item of content.
16. The method of claim 15,
Displaying the plurality of index scrolls,
Displaying the first index scroll in a first index scroll region of the touch screen unit;
And displaying the second index scroll in a second index scroll region of the touch screen unit.
17. The method of claim 16,
Displaying the plurality of index scrolls,
Wherein the first index scroll is searched in the first index scroll region according to the movement operation of the first finger, and the second index scroll is searched in the second index scroll region in accordance with the movement operation of the second finger.
delete delete delete
KR1020150168616A 2015-11-30 2015-11-30 User interface apparatus and vehicle comprising the same, control method for the user interface apparatus KR101741665B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150168616A KR101741665B1 (en) 2015-11-30 2015-11-30 User interface apparatus and vehicle comprising the same, control method for the user interface apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150168616A KR101741665B1 (en) 2015-11-30 2015-11-30 User interface apparatus and vehicle comprising the same, control method for the user interface apparatus

Publications (1)

Publication Number Publication Date
KR101741665B1 true KR101741665B1 (en) 2017-06-15

Family

ID=59217391

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150168616A KR101741665B1 (en) 2015-11-30 2015-11-30 User interface apparatus and vehicle comprising the same, control method for the user interface apparatus

Country Status (1)

Country Link
KR (1) KR101741665B1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022114334A1 (en) * 2020-11-27 2022-06-02 한국전자기술연구원 Method and device for controlling content in vehicle for plurality of users

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101241904B1 (en) * 2006-09-20 2013-03-12 엘지전자 주식회사 Method for Display in Mobile Terminal


Similar Documents

Publication Publication Date Title
US10328950B2 (en) Vehicle equipment control device and method of searching for control content
KR101678087B1 (en) Vehicle and method of controlling the same
US9292093B2 (en) Interface method and apparatus for inputting information with air finger gesture
KR101575648B1 (en) User interface apparatus, Vehicle having the same and method for controlling the same
US10949886B2 (en) System and method for providing content to a user based on a predicted route identified from audio or images
KR101700714B1 (en) User interface apparatus, Vehicle having the same and method for controlling the same
WO2016084360A1 (en) Display control device for vehicle
JP2011232270A (en) Navigation device and help presentation method thereof
JP6009583B2 (en) Electronics
US10071685B2 (en) Audio video navigation (AVN) head unit, vehicle having the same, and method for controlling the vehicle having the AVN head unit
KR101767088B1 (en) Multimedia apparatus and method for user app display method of the same
KR101741665B1 (en) User interface apparatus and vehicle comprising the same, control method for the user interface apparatus
KR20160021509A (en) User interface apparatus, Vehicle having the same and method for controlling the same
US20220415321A1 (en) Electronic device mounted in vehicle, and method of operating the same
US10248382B2 (en) User interface and method for assisting a user with the operation of an operating unit
KR101804769B1 (en) Multimedia apparatus and vehicle comprising the same, destination search method of the multimedia apparatus
KR20230090510A (en) Terminal device and Vehicle
JP2021174237A (en) Search control device, computer program, and search system
US10618407B2 (en) Terminal apparatus, vehicle, and method of controlling the terminal apparatus
JP2021174236A (en) Control device, computer program, and control system
KR102559681B1 (en) Avn apparatus and vehicle comprising the same, control method for the avn apparatus
KR101623858B1 (en) User interface apparatus and vehicle comprising the same, control method for the user interface apparatus
KR101626427B1 (en) Vehicle, multimedia apparatus and controlling method thereof
KR20230001056A (en) The electronic device mounted on vehicle and the method operating the same
JP2010185686A (en) Information providing device

Legal Events

Date Code Title Description
AMND Amendment
AMND Amendment
X701 Decision to grant (after re-examination)
GRNT Written decision to grant