WO2013077686A1 - System and method for providing information in stages - Google Patents
System and method for providing information in stages
- Publication number
- WO2013077686A1 (PCT/KR2012/010021)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- display
- displayed
- operation command
- screen
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04812—Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0489—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using dedicated keyboard keys or combinations thereof
- G06F3/04895—Guidance during keyboard input operation, e.g. prompting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- the present invention relates to providing information in stages: a system in which a stepwise operation command is performed on the display through any of a variety of input devices, and the content of the information is provided step by step according to that command, and a corresponding method.
- a touch screen or touch panel detects the touched position when a finger or other manipulation means directly touches a character or a specific position displayed on the screen, without using a keyboard, and processes the touched position through pre-stored software.
- touch panel types include resistive overlay, surface acoustic wave, capacitive overlay, and infrared beam.
- the present invention has been made to solve the above problems; its purpose is to provide a system and method that, in accordance with a stepwise operation command input through the input device on top of the display, displays or provides information step by step, or connects to information that exists in a separate storage location.
- the object of the present invention is achieved in a terminal equipped with a display, a central processing unit, and an input device: when a stepwise operation command (an operation command that distinguishes steps) is input on the display, the input device outputs that command, and the central processing unit recognizes the step of the command and outputs the information corresponding to that step to the display.
- when a point selected on the display is moved, the CPU converts the moving distance into steps.
- when the selected point is rotated, the CPU converts the angle of the rotational movement into steps.
- there is a + movement step,
- and there is a − movement step.
- alternatively, two points are selected through the input device, and the step is determined by how far the distance between the two points decreases or increases.
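As an illustration of the two-point rule above, the change in distance between the two touch points can be mapped to a signed step count. This is a minimal sketch under assumed values (the 10 mm per-step distance follows the example given later in the description; the function name is invented):

```python
import math

STEP_DISTANCE_MM = 10.0  # assumed per-step distance, as in the 10 mm example

def pinch_step(p1_start, p2_start, p1_end, p2_end):
    """Signed step count: positive if the two points moved apart, negative if closer."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    delta = dist(p1_end, p2_end) - dist(p1_start, p2_start)
    return int(delta / STEP_DISTANCE_MM)  # truncate partial steps toward zero
```

Spreading the points apart by 25 mm, for instance, would register as a two-step + movement.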
- in a terminal equipped with a display, a central processing unit, and an input device, the terminal is connected to a server equipped with a database and a control unit, and an operation command for distinguishing steps is input on the display.
- the input device outputs the operation command of the step
- the central processing unit recognizes the step of the stepwise operation command and transmits the step information of the command to the server, or forwards the output of the input device to the server.
- the server outputs information on the step from the database and transmits the information to the terminal.
- the CPU converts the moving distance into steps; and when a point selected on the display is rotated so that rotation-direction information is output, the CPU converts the angle of the rotational movement into steps.
- there is a + movement step and a − movement step.
- a step is likewise determined as the distance between two points becomes closer or farther apart.
- for a + movement of J steps from the current step N, the final selection step is N + J.
- for a − movement of I steps, the final selection step is N − I.
- when the stepwise operation command is executed by a finger or other manipulation means through the input device on top of the display, not only is the information provided step by step, but the information of the various steps can be provided without changing the screen; access to information stored on another Internet site or in another storage location may also be allowed.
- FIG. 1 is a diagram illustrating a configuration of a terminal connected to a server through a wired or wireless internet.
- FIG. 2 is a block diagram illustrating the server in more detail.
- FIG. 3 is a block diagram of a terminal.
- FIG. 4 is a diagram schematically illustrating an input device.
- FIG. 5 is a diagram of an embodiment of performing an operation command of a step through an input device.
- FIG. 6 is a diagram of an embodiment for explaining an operation command of a step.
- FIG. 7 is a diagram of another embodiment of an operation command via an input device.
- FIG. 8 and 9 are diagrams illustrating steps of an operation command in the embodiment of FIG. 7.
- FIG. 10 is a diagram of an embodiment in which the display information of the entire screen is changed by the operation command of the step.
- FIG. 11 is a diagram of an embodiment in which the display information of a part of the screen is changed by the operation command of the step.
- FIGS. 12 to 14 are flowcharts of embodiments of the present invention.
- FIGS. 15 to 17 are diagrams of yet another embodiment for executing an operation command.
- FIG. 19 is a diagram of another embodiment of a stepwise input method.
- FIGS. 20 to 28 are diagrams of embodiments in which the size of the selection region changes.
- FIGS. 29 and 31 are diagrams of embodiments in which the content and size of information vary in various ways.
- FIGS. 32 to 34 are diagrams of an embodiment in which two or more selection areas exist in a full screen.
- FIG. 35 is a diagram of another embodiment changed by the stepwise operation command.
- FIG. 36 is a flowchart of an embodiment according to the stepwise operation command of the present invention.
- FIG. 37 is a flowchart of an embodiment in which the server transmits the information associated with each step.
- the drawings show embodiments of performing the "stepwise operation command" through the input device on top of the display; when the terminal user of the present invention performs the "stepwise operation command", job execution proceeds step by step.
- FIG. 1 is a diagram illustrating a configuration of a terminal connected to a server through a wired or wireless internet.
- the server 100 in the communication system is a device that constitutes a system for operating various information providing services through wired and wireless Internet.
- the server 100 includes an input unit 103 with which an administrator or operator of the server 100 manages or inputs information, an output unit 105 for outputting or displaying information, a database unit 104 for storing various information and the information concerning service operation, and an interface unit 102 capable of transmitting and receiving data with the visitor via the Internet or a communication network.
- here, information means all kinds of information, such as images, video, and text.
- the terminal (or computer) 110 is a terminal capable of transmitting and receiving a variety of information through a wired or wireless Internet (or a communication network).
- the terminal 110 comprises a central processing unit (CPU) 20, a display unit 30 for displaying various kinds of information, a memory unit 21 for storing various kinds of information, an input device 28 for inputting information, and a data input/output unit 10 capable of inputting and outputting information.
- FIG. 2 is a block diagram illustrating the server in more detail.
- the control unit 101 is configured in the server 100 and includes a data retrieval unit 111 for retrieving data, a data processing unit 112, and a site operation unit 113 for managing and operating Internet visitors and members.
- a database 104 is further configured; the database 104 includes an operation database 141 storing information related to site operation, an information database 142 storing the data for each item of information, and a database 143 storing a plurality of pieces of information.
- this division into the control unit 101 and the database 104 is just one example; a general control unit that performs all the algorithms of server operation and a general database that stores all the information are also included in the embodiments of the present invention.
- the control unit 101 of the server 100 determines information about the visitor (or terminal), such as membership status and content usage, through the data search unit 111; the data search unit 111 searches the database 104 for information matching the information transmitted from the visitor (or terminal), and the data processing unit 112 transmits the found data to the visitor through the interface.
- FIG. 3 is a block diagram of a terminal.
- the central processing unit 20 is a control means for controlling the overall operation of the terminal (generally a portable display device, a smart phone, or a computer) used in the embodiment of the present invention.
- the ROM 21a in the memory unit 21 stores the execution program of the display device, the RAM 21b stores data generated when the program is executed, and the EEPROM 21c stores the data required by the user and the data needed to process it.
- the R/F (radio frequency) unit 24 tunes to an RF channel, amplifies various input signals, and converts the RF signal received from the antenna into the required frequency signal.
- the input / output unit 10 includes an input unit and an output unit, and the input unit includes various information input devices, numeric keys, menu keys, and selection keys, and the output unit also includes a speaker or a vibration device.
- a display driving circuit 25 receives the signal output from the central processing unit 20 and in turn outputs a signal capable of driving the display 30.
- the CPU controls the input device 28 through the input device driver 27; that is, when information is input through the input device, the input device driver transmits the input information to the central processing unit.
- the terminal of the present invention may include a portable display device, a smart phone, a tablet PC or a computer.
- FIG. 4 is a diagram schematically illustrating an input device.
- the figure shows simplified cross-sectional views of the input device:
- (A) shows a capacitive type,
- (B) shows a resistive-film type.
- in (A), the electrode plate 29a consists of one or two sheets of film coated with a transparent electrode.
- in (B), two films 29a and 29b coated with transparent electrodes are provided above the protective plate 28a so as to face each other with a constant gap maintained between them.
- an outer protective plate (or decorative plate) 28b may further be provided on top of the input device 28, and a coating of the desired pattern is applied to the protective plate 28b.
- FIG. 4 shows an example of an input device 28 that is commonly used, and the present invention is not an invention of the input device 28.
- a conventional input device 28 capable of inputting information on the display is applicable to the present invention.
- depending on the display, information may also be input without applying pressure, or even without direct contact, to the display surface.
- the embodiment of the present invention can be applied to a device in which the input device element and the display are integrated.
- FIG. 5 is a diagram of an embodiment of performing an operation command of a step through an input device.
- a stepwise operation command is performed on the display 30 using a small bar (stylus) or a finger.
- the operation command of the step is input to the information displayed on the display through the input device 28.
- for example, the stepwise operation command may be executed by two fingers reducing or increasing the distance between the selected points.
- FIG. 6 is a diagram of an embodiment for explaining an operation command of a step.
- an operation command in which two fingers (or two bars) define two points, the moving distance of a point is divided into steps, and the command is recognized at each step, is a "stepwise operation command".
- when the stepwise operation command is executed, the information corresponding to each step is displayed on the screen of the terminal display 30.
- the number of steps is a finite number N of at least two; too many steps is impractical, so about five, and at most ten, is appropriate.
- the maximum number of steps that can be recognized is predetermined.
- the maximum distance that can be recognized is also predetermined.
- the stepwise operation command has a + direction and a − direction.
- as shown in FIG. 6, each step is indicated on the guide lines: "50a, 50b, 50c, 50d, 50e" in (A),
- and "51a, 51b, 51c, 51d, 51e" in (B).
- if the distance for one step of movement is set to 10 mm in advance, then moving 20 mm under the stepwise operation command moves two steps; and since there are at most five steps, the maximum moving distance is 50 mm.
- the distances in each step need not necessarily be the same.
- the distance from step 1 to step 2 may be 10 mm, and the distance from step 2 to step 3 may be 12 mm. Only the step distance is predetermined.
- the step distance may also vary dynamically as the algorithm and program run; for example, although the distance from step 1 to step 2 is normally 10 mm, an algorithm applied automatically before the stepwise operation command may change it to 12 mm.
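The unequal step distances described above can be handled by accumulating the predetermined per-step distances until the travelled distance is exhausted. A sketch, with the distance table assumed purely for illustration:

```python
# Assumed per-step distances (step 1->2, step 2->3, ...); the patent's example
# uses 10 mm and 12 mm, the rest are invented for illustration.
STEP_DISTANCES_MM = [10.0, 12.0, 10.0, 11.0]

def steps_moved(distance_mm):
    """Number of whole steps covered by a movement of distance_mm."""
    steps, travelled = 0, 0.0
    for d in STEP_DISTANCES_MM:
        travelled += d
        if distance_mm < travelled:
            break  # the next step boundary was not reached
        steps += 1
    return steps
```

With this table, a 22 mm movement covers the 10 mm and 12 mm boundaries and therefore counts as two steps.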
- the core of the present invention is that a stepwise operation command executed through the input device is recognized in steps; the key point is that the movement of the command is recognized as a movement of N steps.
- the stepwise operation command has a + direction and a − direction.
- guide lines 50 and 51 with the step scale displayed, as in the embodiment of FIG. 6, can be shown on the display 30 screen.
- the guide lines allow the user to perform the correct step input.
- not only the number of steps may be displayed on the guide lines 50 and 51, but also a simple item for information corresponding to each step may be displayed.
- in FIG. 6, the correspondence information for each step is "(50a-1) (50b-1) (50c-1) (50d-1) (50e-1)" and "(51a-1) (51b-1) (51c-1) (51d-1) (51e-1)".
- the information corresponding to the first step "50e” in FIG. (A) is “50e-1”
- the information corresponding to the first step "51a” in FIG. (B) is "51a-1”.
- step 1 "50e” in Fig. (A) is "the dealership location of the mobile phone", in the screen of Fig. (A), the word “mobile dealership location” appears in the box designated by "50e-1" Can be an image.) Is displayed.
- step 1 "51a" in FIG. Can be an image.
- step 1 may be "50a” and "51a”
- step 5 may be "50e” and "51e”.
- step information "50a-1” in step (A) and the step information "51a-1” in step (B) become the same information.
- step information "51e-1" which is five-step information in FIG.
- FIG. 7 is a diagram of another embodiment of an operation command via an input device.
- FIG. 7A is a diagram showing an example in which a point (point) selected by a finger or a bar is moved through an input device
- (B) is a diagram illustrating an embodiment showing a moved angle.
- the + direction may be taken as the point moving up or rotating counterclockwise, and the − direction as the opposite.
- FIG. 8 and 9 are diagrams illustrating steps of an operation command in the embodiment of FIG. 7.
- FIG. 8 corresponds to the embodiment of FIG. 7A: it shows that the distance travelled can be divided into steps and that the characteristic (content) of the information for each step can be displayed on the screen. For example, step 1 is "52a" and step 2 is "52b"; the step-1 correspondence information is "52a-1" and the step-2 correspondence information is "52b-1". The display 30 can therefore show both the number indicating the step and the characteristic of each step's corresponding information.
- FIG. 9 corresponds to the embodiment of FIG. 7B: it shows that the moved angle can likewise be divided into steps, and that each step and the characteristic (content) of its information can be displayed on the screen.
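A sketch of the angle-to-step conversion shown in FIG. 9. The 30-degrees-per-step value, the counterclockwise-positive convention, and the function name are assumptions for illustration, not taken from the patent:

```python
import math

DEGREES_PER_STEP = 30.0  # assumed angular distance of one step

def rotation_step(center, start, end):
    """Signed step count; counterclockwise rotation is taken as the + direction."""
    a0 = math.atan2(start[1] - center[1], start[0] - center[0])
    a1 = math.atan2(end[1] - center[1], end[0] - center[0])
    swept = math.degrees(a1 - a0)
    # normalise the swept angle to the range (-180, 180]
    while swept <= -180:
        swept += 360
    while swept > 180:
        swept -= 360
    return int(swept / DEGREES_PER_STEP)  # truncate partial steps toward zero
```

Rotating a point a quarter turn counterclockwise about the center would then register as a three-step + movement.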
- when a point is selected through the input device 28 of the terminal (a finger or a bar can select it), the input device driver 27 outputs the position information (coordinates) of the selected point; when the point is moved, the input device driver 27 outputs the position information (coordinates) of the moving point.
- the central processing unit 20 evaluates the output position information, determines the movement step by a predetermined algorithm, and applies that movement to the current step, up to the maximum step (adding steps for a + movement and subtracting steps for a − movement), to determine the final selection step.
- the central processing unit 20 then selects the information corresponding to the final selection step from the memory unit 21 and outputs a signal so that the information is rendered on the display 30.
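The driver-to-CPU-to-display flow just described can be sketched end to end as follows. The class and method names are invented; the 10 mm step distance and the five-step maximum follow the earlier examples:

```python
STEP_MM, MAX_STEP = 10.0, 5  # assumed step distance and maximum step

class Terminal:
    def __init__(self, memory):
        self.memory = memory       # step -> information, standing in for unit 21
        self.current_step = 1
        self.shown = None

    def on_drag(self, start_x, end_x):
        """Convert a drag reported by the input device driver into a step change."""
        moved = int((end_x - start_x) / STEP_MM)            # + or - steps moved
        self.current_step = max(1, min(MAX_STEP, self.current_step + moved))
        self.shown = self.memory[self.current_step]         # drive the display
        return self.shown
```

A 25 mm drag from step 1, for example, moves two whole steps and causes the step-3 information to be shown.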
- the present invention has been described with embodiments in which the distance or angle through which a selected point is moved by the stepwise operation command is divided into steps.
- the form of the stepwise operation command can, of course, vary.
- a step can also be distinguished by selecting and holding a point.
- movement above the display can be recognized by an imaging element, and the signal output from the imaging element can be divided into degrees of movement by an algorithm; steps may likewise be distinguished by the shape of a finger or of an image.
- any input method of the terminal can be used, so long as movement can be divided into steps as in the stepwise operation command and the information corresponding to the final step after the movement can be displayed on the screen.
- FIG. 10 is a diagram of an embodiment in which the display information of the entire screen is changed by the operation command of the step.
- FIG. 10 is a diagram of an embodiment displayed when the level of detail of the information is divided into steps.
- if the left side of FIG. 10 shows the state in which step-1 detail information is displayed, the screen display can be switched to another step by the stepwise operation command described in the foregoing embodiment.
- the information on the left side is displayed on the display 30 screen with three levels of detail.
- the detail of the information may increase as the step increases, but conversely, the detail of the information may decrease as the step increases.
- the change in the detail of the information by the step means that the same kind of information is made by dividing the detail according to the step.
- for example, the first step is a summary, and the higher the step, the more detailed the explanation.
- FIG. 11 is a diagram of an embodiment in which the display information of a part of the screen is changed by the operation command of the step.
- a selection area 31 in which the stepwise operation command can be executed exists within the full screen; the terminal user therefore first selects the selection area.
- the user selects it, for example, by lightly touching the location of the selection area on the display.
- various selection methods may exist depending on the type of terminal or program.
- the CPU determines a signal for the position information output through the input device, and determines that the selection area 31 is selected.
- the CPU displays the information corresponding to the final selection step on the display screen according to the method of the embodiment of the present invention.
- the size of the selection area may also vary according to the displayed information.
- the size of the selection area may increase in proportion to the step as the step increases.
- the size of the selection area may vary depending on the content of the information.
- FIGS. 12 to 14 are flowcharts of embodiments of the present invention.
- FIG. 12 is a flowchart of an embodiment illustrating the method of providing information step by step: as in the previous embodiments, when the stepwise operation command is performed through the input device 28, the corresponding information is displayed on the display.
- the algorithm function may be performed by the controller 101 of the server 100 or may be performed by the controller 20 of the terminal.
- the terminal 110 may be connected to a server through a wired / wireless internet or a communication network to perform the above process. That is, when a step of an operation command is input through the input device 28 of the terminal 110, the input information is transmitted to the server through a communication network, and the server selects new information corresponding to the step of the operation command from the database. The selected information is transmitted to the terminal, and the terminal displays the received new information on the display 30 screen.
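A hedged sketch of that terminal/server exchange (the class names and request format are assumptions, not taken from the patent): the terminal transmits the step of the operation command, and the server returns the information selected from its database for that step.

```python
class Server:
    """Stands in for server 100 with its step-keyed database."""
    def __init__(self, database):
        self.database = database          # step -> stored information

    def handle(self, request):
        step = request["step"]
        return {"step": step, "info": self.database.get(step)}

def terminal_request(server, step):
    # Stands in for transmission over the wired/wireless Internet or
    # communication network; the returned info is shown on the display 30.
    response = server.handle({"step": step})
    return response["info"]
```

In a real deployment the `handle` call would of course be a network round trip rather than a direct method call.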
- using only the central processing unit 20 and the memory unit 21 in the terminal, it is also possible to display the information corresponding to the stepwise operation command on the display screen.
- whether the server is consulted or the terminal handles the command by itself can be determined at the time according to the type of program and information, and this decision can be made programmatically or by user choice.
- for example, suppose information related to King Gwanggaeto is divided into five stages according to the degree of detail and stored.
- step 1 stores the summary content of "the birth of King Gwanggaeto",
- step 2 stores more detail than step 1,
- and so on, so that stored information exists for step 5, identified for example as "HiKaKi001-5".
- a screen change command is input (S102).
- the screen change command means that the user makes a preliminary selection in order to perform the stepwise operation command.
- the central processing unit of the terminal is in a state capable of recognizing the operation command of the step.
- the selection may be made with a menu button or by pressing the screen for a predetermined time.
- the step of the screen switching command may be omitted.
- alternatively, the central processing unit (or the control unit) may be put into a state capable of recognizing the stepwise operation command merely by the fact that information to which the stepwise operation command applies is displayed on the display.
- an operation command of a step (an operation command for classifying a step) through an input device is executed (S104).
- it is determined whether the current step is step 1 (S106); if it is not step 1, the process goes to step S130.
- stage moves from stage 1 to stage K.
- the current screen step is displayed as it is (S110).
- the steps of the stepwise operation command run from step 1 to step K; the lowest step is step 1 and the highest is step K.
- let J be the step input through the input device,
- and let the current screen display step be step N.
- the screen display steps are likewise divided from step 1 to step K, the lowest being step 1 and the highest step K.
- the input step is step J (S132).
- FIG. 14 is a flowchart for the case where the input direction in step S130 of FIG. 13 is the − direction.
- the current display step is step N.
- the screen display steps are divided from step 1 to step K, the lowest being step 1 and the highest step K.
- the step input through the input device is step J.
- if N − J is less than 1, the screen of step 1 is displayed; otherwise the screen of step "N − J" is displayed.
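The flowchart logic of FIGS. 13 and 14 can be summarized in one function, assuming (as the flowcharts suggest) that results outside the range 1..K are clamped to step 1 or step K:

```python
def next_display_step(n, j, direction, k):
    """Step to display after a stepwise operation command.

    n: current display step, j: input step count,
    direction: "+" or "-", k: highest step.
    """
    if direction == "+":
        return min(k, n + j)   # + movement: N + J, capped at K
    return max(1, n - j)       # - movement: N - J, floored at step 1
```

For example, a two-step − movement from step 3 with K = 5 would land on step 1, since 3 − 2 = 1.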
- the screen may be terminated according to the end command.
- the algorithm is executed by the central processing unit (CPU) 20 of the terminal or the controller 101 of the server.
- FIGS. 15 to 17 are diagrams of yet another embodiment for executing an operation command.
- the operation command guide line 51 appears on the screen of the display 30.
- the operation command guide line 51 displays each step ("51a" to "51e", numbered 1st, 2nd, and so on) like a scale, and movement can follow the displayed scale, so the stepwise operation command can be performed effectively; for example, to move exactly two steps, the user simply moves two scale marks while looking at the scale displayed on the screen.
- a feature or information on information corresponding to each step is also displayed. That is, the information display boxes "51a-1" to “51e-1” appear in connection with each step, and each information display box “51a-1” to “51e-1” corresponds to each step.
- the content of the information is displayed as text, images or linked sites.
- the display box indicating the current step can be distinguished from other display boxes (in various ways such as color or darkness) so that the operation command of the step can be easily displayed on the screen.
- when the guide line 51 appears, the display box indicating the current step (if the current step is step 1, the display box "51a-1") can be shown darker than the other display boxes.
- when the step is then moved (if the current step is step 1 and three steps are moved, the step-4 display box "51d-1"), the destination display box can be shown darker than, or otherwise distinguished from, the other display boxes; even when the step is moved in the opposite direction, the current and destination display boxes may be distinguished on the screen.
- the current step and the moving step can be distinguished and displayed through a scale indicating the step on the guide line. That is, the scale of the corresponding step in the guide line can be displayed separately from the scale of other steps.
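As a small illustration of this guide-line idea, the scale can be rendered with the current and destination steps distinguished from the others; the bracket markup below stands in for the darker display described above, and the function name is invented:

```python
def render_guide(k, current, destination):
    """Render a k-step guide line, marking the current and destination steps."""
    marks = []
    for step in range(1, k + 1):
        if step in (current, destination):
            marks.append(f"[{step}]")   # distinguished (e.g. darker) scale mark
        else:
            marks.append(f" {step} ")
    return "".join(marks)
```

Moving three steps from step 1 on a five-step guide line would render as `[1] 2  3 [4] 5 `.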
- FIGS. 16 and 17 are diagrams of an embodiment in which a separate selection area 31 exists within the full screen: when the screen change command is given (for example, by selecting the displayed portion 30a) and the stepwise operation command is input, the guide line 51 appears; the steps marked on its scale ("51a" to "51e") are displayed, the display boxes ("51a-1" to "51e-1") connected to each step are also displayed, and the characteristics of the information appear in the display boxes.
- the selection area 31 of FIG. 16 is referred to as one-step information
- the information displayed in the new selection area 32 of FIG. 17 is referred to as four-step information
- the selection area 31 on the screen of FIG. If the operation command of the step is performed and moved three steps, the screen of FIG. 17 is displayed.
- the display box 51a-1 of the current step and the display box 51d-1 corresponding to the step movement after the step are displayed differently from other display boxes.
- the content such as “mobile phone image” is displayed on the first-level display box 51a-1, and the content such as “mobile phone specifications” is displayed on the corresponding display box 51d-1 after being moved. It can be.
- step 5 is a step of connecting to another Internet site
- the content of “000 site connection” may be displayed on the step 5 display box 51e-1.
- step 5 information on the newly connected Internet site is displayed on the screen of the display 30. Therefore, a newly connected Internet site can proceed with the following work, such as purchasing a product.
- the change of the information displayed on the display according to the change of step is as follows.
- as the step increases, the amount of information displayed on the display increases, and the level of detail of the information increases.
- the program executed to display the information on the display may change.
- image information or video information may be displayed as the step changes.
- the hierarchy of the information displayed on the display may change.
- for example, the first step could show the exterior of a car, the second step a more detailed view of the exterior, and the third step the interior.
- that is, the information displayed on the display may change from the exterior to the interior as the step changes.
- in step 5, information stored in a separate storage device or storage location may be displayed on the display.
- the size of the selection area may also increase as the step increases; of course, depending on the displayed information, the size of the selection area may instead decrease or remain unchanged.
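The step-dependent changes listed above (more information, different media types, other storage locations) can be sketched as a simple lookup table; the step numbers and content descriptors below are hypothetical illustrations, not data from the embodiment:

```python
# Hypothetical mapping from a step number to a content descriptor: each step
# selects more detailed content, possibly of a different media type or from a
# different storage location (e.g. an external site at the last step).
STEP_CONTENT = {
    1: {"type": "text",  "summary": "mobile phone image"},
    2: {"type": "image", "summary": "exterior photos"},
    3: {"type": "video", "summary": "product demo"},
    4: {"type": "text",  "summary": "mobile phone specifications"},
    5: {"type": "link",  "summary": "000 site connection"},
}

def content_for_step(step: int) -> dict:
    """Return the content descriptor for a step, clamped to the valid range."""
    clamped = max(min(STEP_CONTENT), min(step, max(STEP_CONTENT)))
    return STEP_CONTENT[clamped]
```

A request for a step outside the stored range simply yields the first or last step's content.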
- FIG. 18 is a flowchart for performing an operation command using the guide line.
- FIG. 18 shows an embodiment thereof.
- after executing the program, the terminal user inputs a screen change command (S164) and then executes the stepwise operation command through the input device (S166).
- the screen change command (YES in FIG. 6) is not limited to the embodiment of the present invention. Naturally, the screen change command can be given in various ways, such as holding a selection for a certain period of time, a special movement gesture, a separate menu displayed on the screen, a specific keyboard or button key, a voice command, or a vibration command.
- the terminal CPU displays the guide line 50, 51, on which the steps are marked, on the display screen.
- alternatively, the server may transmit the guide line display information to the terminal, and the terminal CPU may display the received information on the display.
- the design or display information of the guide line is stored in the terminal memory unit or in a database of the server.
- the shape in which the guide line is displayed on the display screen is not limited to the guide lines 50 and 51 presented in the embodiment of the present invention. Any form that can display a step and a summary of the information corresponding to that step can be applied to an embodiment of the present invention.
- the display shows the information of the connected Internet site, and various functions are performed on the connected site.
- performing various functions on a connected Internet site means any function that can be performed on the real Internet; for example, it can be linked to an ordinary payment system to select and pay for goods.
- FIG. 19 is a diagram of another embodiment of the stepwise input method.
- (A) is an embodiment showing that guide lines 50 and 51, on which the scale marks 50a, 50b, 50c, 50d, and 50e are displayed, may be shown on the screen of the display 30. The distance "L" between the scale marks (the distance actually displayed on the display 30) can be equal to the distance actually traveled when performing the stepwise operation command.
- for example, if "L" is 10 mm, the distance of one step in the operation command is 10 mm. Therefore, the user performing the stepwise operation command moves the selected point (through the input device) by the actual size of the scale of the guide lines 50 and 51 displayed on the display, thereby displaying the information of the desired step on the display.
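The relation between the on-screen scale distance "L" and the step of the operation command can be sketched by converting the pixel distance reported by the input device driver into millimetres; the DPI value below is an assumed example, not taken from the embodiment:

```python
MM_PER_INCH = 25.4

def steps_moved(drag_px: float, dpi: float = 160.0, step_mm: float = 10.0) -> int:
    """Convert a drag distance in pixels into whole steps of the guide-line scale.

    dpi is the display density (an assumed example value) and step_mm is the
    on-screen distance "L" between scale marks (10 mm in the example above).
    The sign of the result gives the direction (+ or -) of the movement.
    """
    drag_mm = drag_px / dpi * MM_PER_INCH
    # int() truncates toward zero, so partial steps are ignored in both directions.
    return int(drag_mm / step_mm)
```

With these assumed values, a 630-pixel drag corresponds to roughly 100 mm, i.e. a ten-step movement.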
- the central processing unit of the terminal or the controller of the server then selects the information corresponding to the output position and displays it on the display.
- (B) shows that the form in which the steps 50a, 50b, 50c, 50d, and 50e are displayed on the display 30 may differ. Any shape or form that can distinguish the steps can be applied to embodiments of the present invention.
- in (B), if the current state is step 1 (50a) and the information of step 5 is to be displayed on the screen, the display bar 55 corresponding to step 5 (50e) may be selected.
- (C) is an embodiment showing the steps in yet another form.
- each of the steps 50a, 50b, 50c, 50d, and 50e is in the form of a box.
- the shape representing a step is not necessarily limited to the shapes of the embodiments of the present invention.
- the steps can be displayed in various shapes.
- FIGS. 20 to 28 are diagrams of embodiments in which the size of the selection area changes.
- the selection area 32 is assumed to occupy only a part of the screen of the display 30.
- the size of the selection area 32 is changed by the operation command: moving in the + direction increases the size and the amount of information, and moving in the - direction decreases the size and the amount of information.
- FIG. 21 is a diagram of an embodiment showing the steps of the information shown in FIG. 20. Although FIG. 21 shows three steps for convenience, the information can be divided into more granular steps.
- in FIGS. 20 and 21 the size of the selection area 32 changes only vertically; naturally, the size may also change horizontally.
- FIGS. 22 and 23 are diagrams illustrating an embodiment in which the size of the selection area 32 is changed up, down, left, and right. After the selection area 32 is selected as shown in the figure, if a stepwise operation command changing the size is executed, the selection area 32 is changed to a new selection area 32a of the changed size, and the size of the selection area changes in steps.
- FIG. 23 illustrates three size levels for convenience, but the sizes may be further subdivided. Information corresponding to the size of each selection area is displayed on the display. FIG. 23 also shows that text information and image information may be displayed together depending on the step.
- FIG. 24 is a diagram of an embodiment showing how the information is stored: if the size is divided into N steps, information corresponding to each size is also stored. Each type corresponds to a step, and when a step is selected, the corresponding information is displayed on the display.
- the information may be stored in the memory unit 21 of the terminal or in the database 104 of the server.
- the information may also be stored in a separate storage device or a separate server.
- the displayed information does not necessarily exist only as a text file; it may also be an image file 32c or a video file. Naturally, if the area is clicked, link information 31d connecting to another site may be displayed.
- FIG. 25 is a view showing another embodiment.
- in Fig. (A), the size of the selection areas 32 and 32a changes with each step, and the information displayed in the selection areas 32 and 32a changes accordingly.
- Figs. (B) and (C) are diagrams illustrating an embodiment in which a step is selected based on the size of the selection area. That is, when the size of the selection area is adjusted by a stepwise operation command, if the resulting size is not exactly that of step 2 but is close to it, as in Fig. (B), the step-2 information is displayed in the selection area 32a'. Likewise, if the resulting size is not exactly that of step 3 but is close to it, as in Fig. (C), the step-3 information is displayed in the selection area 32a'.
- the selection area 32 is selected on the screen, and when a stepwise operation command is executed through the input device 28, the resized selection area 32a (32a') and the corresponding information are displayed on the display screen (S198-S200). Execution may then be terminated by a termination command (S202).
- FIG. 27 is a diagram of an embodiment for smoothly performing a stepwise operation command.
- the types range from 1 to N
- a size of the selection target 32 is stored for each type from 1 to N (S214). These sizes are stored in the memory unit 21 or the database 104.
- the person issuing the stepwise operation command may not adjust the size of the selection target 32 precisely to the predetermined size of a type; in reality, this is usually the case.
- that is, the size adjusted by the operation commander may be N + a.
- the value N + a is a size between type N and type N + 1.
- in this case, the selection target 32a is displayed on the display at the size of type N, and the type-N information is displayed (S216-S218).
- that is, as a result of the operator's stepwise operation command, the selection objects 32a and 32a' are displayed on the display 30 at the resulting size.
- the selection target 32a is displayed on the display 30 at the sizes determined in S218 and S220. Execution then ends with a termination command.
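The behaviour of S216-S220, where an adjusted size of N + a is shown as type N, can be sketched as a nearest-stored-size lookup (this reads "close to" as nearest-match, which is one possible reading of the embodiment; the stored sizes are hypothetical):

```python
def snap_to_type(adjusted_size: float, type_sizes: dict) -> int:
    """Return the stored type whose size is closest to the user's adjusted size.

    type_sizes maps each type (1..N) to its stored selection-area size, as in
    S214. A user rarely hits a stored size exactly (size N + a), so the nearest
    stored type is chosen and that type's information is displayed (S216-S220).
    """
    return min(type_sizes, key=lambda t: abs(type_sizes[t] - adjusted_size))

# hypothetical stored sizes (e.g. heights in pixels) for types 1..3
sizes = {1: 100.0, 2: 200.0, 3: 300.0}
```

For example, with the sizes above, an adjusted size of 215 would snap to type 2 and display the type-2 information.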
- the process shown in FIG. 27 may be performed by the terminal 110 or the server 100.
- FIG. 28 is a diagram of still another embodiment, in which two or more selection areas 32 capable of receiving a stepwise operation command may be displayed on the display. One of the selection areas 32 can then be selected to perform the stepwise operation command.
- FIGS. 29 to 31 are diagrams of embodiments in which the content and size of the information vary in various ways.
- the size of the selection area need not increase in proportion to the step; the size of the selection area may instead be determined by the size or type of the data displayed in it.
- when the information to be displayed is an image, the image may be displayed according to the size of the selection area 32 as changed by the stepwise operation command.
- FIG. 31 is a diagram of an embodiment in which a stepwise operation command that changes the size of the selection area and a stepwise operation command that does not change the size are performed together.
- FIGS. 32 to 34 are diagrams of an embodiment in which two or more selection areas exist in the full screen.
- FIG. 32 is a diagram of an embodiment showing a general table of contents of a digital textbook.
- the menu selection bar 30a is displayed at the top of the display.
- the items I to VII of the table of contents can all receive stepwise operation commands. Therefore, when one of the items is selected and a stepwise operation command is executed, information corresponding to the result of the execution is displayed on the display screen.
- the CPU 33 selects one of the table-of-contents items and activates the selected item.
- the color of the selected letters is changed, or the letters flicker, so that they are displayed on the display in a way that distinguishes them from the other letters.
- the CPU displays the selected letters differently according to a predetermined algorithm.
- when the table-of-contents item is selected and then moved one step in the + direction, as in the embodiment of the present invention, the information expanded by one step in connection with "the establishment and development of Koryo" is displayed on the display window 40.
- if the information has three steps, then when "the establishment and development of Koryo" is selected and moved three steps in the + direction, the step-3 information is displayed on the display window 40.
- likewise, if step 2 is moved one step in the + direction, the step-3 information is displayed.
- the display window may be closed immediately by selecting the "x" mark 40e displayed at the top of the display window 40. That is, whichever step of information is displayed on the display window, when the "x" mark 40e is selected, the display window closes and the display returns to the initial display step (step 1 or step 0).
- FIG. 34 is a diagram of an embodiment showing that stepwise operation commands may be hierarchical. As shown in FIG. 34, when the table-of-contents item "the establishment and development of Koryo" is selected and moved one step in the + direction, the display window 40 displays the step-1 information connected with "the establishment and development of Koryo".
- if one of the items in the step-1 information list is then selected again (for example, "characteristic of Koryo culture") and held for a predetermined time, only "characteristic of Koryo culture" is activated, as shown in FIG. 34.
- when a stepwise operation command is then performed on the activated "characteristic of Koryo culture",
- the information of the corresponding step related to "characteristic of Koryo culture" is displayed on an additional display window 40a.
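The hierarchical commands of FIG. 34 amount to a tree in which each node has its own list of steps and its own child items; a minimal sketch with hypothetical step content:

```python
# Hypothetical table-of-contents tree: an item displayed inside one step's list
# can itself be activated and given its own stepwise command, whose result is
# shown in an additional display window (40a in the embodiment).
TOC = {
    "establishment and development of Koryo": {
        "steps": ["outline", "expanded outline", "full text"],
        "children": {
            "characteristic of Koryo culture": {
                "steps": ["summary", "details"],
                "children": {},
            },
        },
    },
}

def lookup(path, step):
    """Follow a path of activated items, then return that item's step content.

    Steps are 1-indexed; a step past the end is clamped to the last step.
    """
    node = {"children": TOC}
    for name in path:
        node = node["children"][name]
    steps = node["steps"]
    return steps[min(step, len(steps)) - 1]
```

For example, activating the top item and moving one step yields its outline, while activating the nested item and moving two steps yields that item's own step-2 content.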
- the selected page is displayed on the display.
- FIG. 35 is a diagram of another embodiment in which the display is changed by a stepwise operation command.
- the additional information 35a appears in columns. If one of the items of additional information 35a is selected and a stepwise operation command is executed, further additional information 35b is displayed as a column. A movement in the - direction returns the display to its original state.
- more additional information 35a appears in proportion to the distance moved in the + direction.
- FIG. 36 is a flowchart of an embodiment according to the stepwise operation command of the present invention.
- when the terminal user of the present invention selects two specific points (there may be one point, or two or more) through the input device 28, the input device driver 27 outputs the positions of the selected points (S310), and the CPU 20 recognizes the positions of the selected points.
- when a selected point moves across the input device 28, which can receive input on top of the display 30, the input device driver 27 outputs the position of the moved point. The CPU 20 then recognizes the distance the point has moved, determines the step of the movement, and as a result determines the step of the operation command (S315-S320).
- the execution process can be divided as follows.
- the terminal central processing unit transmits the position and movement information of the point output from the input device driver 27 to the server, and the control unit 101 of the server determines the step of the operation command.
- alternatively, the terminal central processing unit 20 recognizes the position and movement information of the point output from the input device driver 27, determines the step of the operation command itself, and sends the step to the server.
- the data transmission and reception between the terminal and the server is performed over a communication network by the interface unit 102 of the server and the R/F unit 24 of the terminal.
- the central processing unit 20 adds the steps moved by the operation command to, or subtracts them from, the step of the information currently displayed on the display 30 to determine the new final step.
- the information associated with the new final step is then shown on the display. That is, the central processing unit 20 outputs a display driving signal for displaying the newly selected information, and the newly selected information is displayed on the display 30 (S325 to S335).
- for example, if the step of the information currently displayed on the display is step 1 and the step of the operation command is +2 steps, the finally selected step is step 3, and the step-3 information is displayed on the display screen.
- if the step of the information currently displayed on the display is step 4 and the step of the operation command is -2 steps, the finally selected step is step 2, and the step-2 information is displayed on the display screen.
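The determination of the new final step (S325-S335) is signed addition of the commanded movement to the current step; a minimal sketch, in which clamping to a valid range is an assumption, since the text does not say what happens when a command would move past the first or last step:

```python
def final_step(current: int, moved: int, max_step: int) -> int:
    """Add the signed step movement of the operation command to the current
    step, clamping the result to the valid range 1..max_step (assumed)."""
    return max(1, min(current + moved, max_step))
```

With this, a current step of 4 and a command of -2 steps yields final step 2, matching the example above.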
- the CPU 20 determines the final step, selects the information connected with the final step from the memory unit 21, and displays the selected information on the display.
- alternatively, the control unit 101 of the server selects the information suitable for the final step from the database 104 of the server and transmits the selected information to the terminal, and the central processing unit of the terminal displays the information received from the server on the display.
- this process is performed by the CPU selecting information stored in the memory unit.
- an algorithm capable of performing this process is stored in the memory unit.
- information corresponding to each step is stored in the memory unit 21.
- alternatively, the controller of the server performs the above process;
- in that case, the information corresponding to each step is stored in the database 104 of the server, and the execution algorithm is also stored in the database.
- FIG. 37 is a flowchart of an embodiment in which the server transmits the information associated with each step.
- the control unit 101 of the server 100 selects the information to be displayed on the display 30 of the terminal 110 from the database 104 and transmits it to the terminal via wired or wireless communication (or the Internet).
- as a result, the display 30 displays the information received from the server.
- if the step-1 information is displayed on the current display,
- the server transmits the information of types 2 to N to the terminal.
- that is, if the screen transmitted by the server for display on the display of the terminal enables stepwise operation commands, the server also transmits the information of the other steps connected with that screen information to the terminal.
- the central processing unit 20 of the terminal stores the received information in the memory unit 21; when a stepwise operation command is performed through the input device 28, the terminal determines the final step in accordance with an embodiment of the present invention, selects the information associated with the final step, and displays the selected information on the display. Meanwhile, if the program is terminated and the end switch is operated, execution on the terminal ends (S285-S290).
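The prefetch-and-cache flow of FIG. 37 can be sketched as below; the class name, method names, and the strings standing in for the step information are illustrative assumptions:

```python
# Sketch of the prefetching scheme: when the server sends a screen that
# supports stepwise commands, it also sends the other steps' information,
# which the terminal caches (memory unit 21) so that later stepwise commands
# need no further network round trip.
class StepCache:
    def __init__(self, prefetched: dict):
        # all steps received from the server, keyed by step number
        self._info = dict(prefetched)

    def show(self, current: int, moved: int) -> str:
        """Determine the final step from the cached range and return its
        information, clamping to the first/last cached step (assumed)."""
        valid = sorted(self._info)
        final = max(valid[0], min(current + moved, valid[-1]))
        return self._info[final]
```

A terminal holding steps 1 to 3 can then answer any stepwise command locally, only contacting the server again for content it does not yet hold.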
- as described above, when a stepwise operation command is executed by a finger or another operating means through an input device on top of the display, not only is the information provided step by step, but information of the various steps can be provided without changing the screen, and access to information stored on another Internet site or in another storage location is also possible.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims (14)
- In a terminal equipped with a display, a central processing unit, and an input device: a stepwise information providing system and method characterized in that, when a stepwise operation command distinguishing steps is input on the display, the input device outputs the stepwise operation command, the central processing unit recognizes the step of the operation command, and the information corresponding to that step is output on the display.
- The stepwise information providing system and method of claim 1, characterized in that, when a point is selected on the display and the point is moved so that information on the movement distance is output, the central processing unit determines the step from the movement distance.
- The stepwise information providing system and method of claim 1, characterized in that, when a point is selected on the display and the point is rotated so that information on the rotation direction is output, the central processing unit determines the step from the angle of the rotational movement.
- The stepwise information providing system and method of claim 1, characterized in that the steps include a + movement step.
- The stepwise information providing system and method of claim 1, characterized in that the steps include a - movement step.
- The stepwise information providing system and method of claim 1, characterized in that two points are selected through the input device and the step is determined as the distance between the two points decreases or increases.
- In a terminal equipped with a display, a central processing unit, and an input device, the terminal being connected to a server equipped with a database and a control unit: a stepwise information providing system and method characterized in that, when a stepwise operation command distinguishing steps is input on the display, the input device outputs the stepwise operation command, the central processing unit recognizes the step of the operation command, the central processing unit transmits its determination of the step of the operation command to the server or transmits the stepwise operation command signal of the input device to the server, and the server outputs the information for that step from the database and transmits it to the terminal.
- The stepwise information providing system and method of claim 1, characterized in that, when a point is selected on the display and the point is moved so that information on the movement distance is output, the central processing unit determines the step from the movement distance.
- The stepwise information providing system and method of claim 1, characterized in that, when a point is selected on the display and the point is rotated so that information on the rotation direction is output, the central processing unit determines the step from the angle of the rotational movement.
- The stepwise information providing system and method of claim 1, characterized in that the steps include a + movement step.
- The stepwise information providing system and method of claim 1, characterized in that the steps include a - movement step.
- The stepwise information providing system and method of claim 1, characterized in that two points are selected through the input device and the step is determined as the distance between the two points decreases or increases.
- The stepwise information providing system and method of claim 1 or claim 7, characterized in that, if the current step is N and the command moves J steps in the + direction, the finally selected step is N + J.
- The stepwise information providing system and method of claim 1 or claim 7, characterized in that, if the current step is N and the command moves I steps in the - direction, the finally selected step is N - I.
Priority Applications (15)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020167018963A KR101783279B1 (ko) | 2011-11-24 | 2012-11-26 | 단계적 정보 제공 시스템 |
KR1020167018973A KR101799838B1 (ko) | 2011-11-24 | 2012-11-26 | 단계적 정보 제공 시스템 |
US14/360,554 US20150160746A1 (en) | 2011-11-24 | 2012-11-26 | Phased information providing system and method |
KR1020167018978A KR101922426B1 (ko) | 2011-11-24 | 2012-11-26 | 단계적 정보 제공 시스템 |
KR1020177025091A KR20170104674A (ko) | 2011-11-24 | 2012-11-26 | 단계적 정보 제공 시스템 |
KR1020217011189A KR102351898B1 (ko) | 2011-11-24 | 2012-11-26 | 스마트폰에서의 정보 제공 방법 |
KR1020207020413A KR20200087882A (ko) | 2011-11-24 | 2012-11-26 | 단계적 정보 제공 시스템 |
KR1020237020041A KR20230091202A (ko) | 2011-11-24 | 2012-11-26 | 단계적 정보 제공 시스템 |
KR1020187033460A KR102148228B1 (ko) | 2011-11-24 | 2012-11-26 | 단계적 정보 제공 시스템 |
KR1020217024009A KR20210096325A (ko) | 2011-11-24 | 2012-11-26 | 단계적 정보 제공 시스템 |
KR1020147014275A KR101647057B1 (ko) | 2011-11-24 | 2012-11-26 | 단계적 정보 제공 시스템 |
KR1020177025461A KR101922423B1 (ko) | 2011-11-24 | 2012-11-26 | 단계적 정보 제공 시스템 |
KR1020227004664A KR20220025222A (ko) | 2011-11-24 | 2012-11-26 | 단계적 정보 제공 시스템 |
KR1020227028238A KR20220123128A (ko) | 2011-11-24 | 2012-11-26 | 단계적 정보 제공 시스템 |
KR1020227004665A KR20220025223A (ko) | 2011-11-24 | 2012-11-26 | 단계적 정보 제공 시스템 |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR20110123360 | 2011-11-24 | ||
KR10-2011-0123360 | 2011-11-24 | ||
KR10-2012-0010491 | 2012-02-01 | ||
KR20120010491 | 2012-02-01 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013077686A1 true WO2013077686A1 (ko) | 2013-05-30 |
Family
ID=48470061
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2012/010021 WO2013077686A1 (ko) | 2011-11-24 | 2012-11-26 | 단계적 정보 제공 시스템 및 방법 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20150160746A1 (ko) |
KR (13) | KR20230091202A (ko) |
WO (1) | WO2013077686A1 (ko) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20200067956A (ko) * | 2012-03-21 | 2020-06-12 | 김시환 | 스마트폰 디스플레이에서 정보 표시 방법 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20100125635A (ko) * | 2009-05-21 | 2010-12-01 | 엘지전자 주식회사 | 이동 통신 단말기에서의 메뉴 실행 방법 및 이를 적용한 이동 통신 단말기 |
KR20100134948A (ko) * | 2009-06-16 | 2010-12-24 | 삼성전자주식회사 | 터치스크린을 구비하는 장치의 메뉴 표시 방법 |
KR20110006547A (ko) * | 2009-07-14 | 2011-01-20 | 주식회사 팬택 | 터치 궤적에 따라 메뉴 정보를 표시하는 이동 단말기 |
KR20110030893A (ko) * | 2009-09-18 | 2011-03-24 | (주)빅트론닉스 | 멀티터치 기반 터치 패널 제어장치, 방법 및 이를 이용한 모바일 기기 |
KR101060175B1 (ko) * | 2010-07-08 | 2011-08-29 | 한국과학기술원 | 터치스크린 제어방법, 이를 위한 기록매체 및 이를 이용하는 클라우드 컴퓨팅 제어방법 |
KR20110107069A (ko) * | 2010-03-24 | 2011-09-30 | 글로벌테크링크(주) | 터치 장치 제어방법, 이를 이용하는 터치 장치 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6639584B1 (en) | 1999-07-06 | 2003-10-28 | Chuang Li | Methods and apparatus for controlling a portable electronic device using a touchpad |
US7657849B2 (en) | 2005-12-23 | 2010-02-02 | Apple Inc. | Unlocking a device by performing gestures on an unlock image |
US8519965B2 (en) * | 2008-04-23 | 2013-08-27 | Motorola Mobility Llc | Multi-touch detection panel with disambiguation of touch coordinates |
JP4791565B2 (ja) | 2009-06-15 | 2011-10-12 | インターナショナル・ビジネス・マシーンズ・コーポレーション | 評価システム、マーカー表示の制御方法およびプログラム |
-
2012
- 2012-11-26 KR KR1020237020041A patent/KR20230091202A/ko not_active Application Discontinuation
- 2012-11-26 KR KR1020227004664A patent/KR20220025222A/ko not_active Application Discontinuation
- 2012-11-26 KR KR1020227004665A patent/KR20220025223A/ko active Application Filing
- 2012-11-26 KR KR1020167018973A patent/KR101799838B1/ko active Application Filing
- 2012-11-26 KR KR1020217011189A patent/KR102351898B1/ko active IP Right Grant
- 2012-11-26 KR KR1020177025091A patent/KR20170104674A/ko not_active Application Discontinuation
- 2012-11-26 WO PCT/KR2012/010021 patent/WO2013077686A1/ko active Application Filing
- 2012-11-26 KR KR1020177025461A patent/KR101922423B1/ko active IP Right Grant
- 2012-11-26 KR KR1020187033460A patent/KR102148228B1/ko active IP Right Grant
- 2012-11-26 KR KR1020147014275A patent/KR101647057B1/ko active IP Right Grant
- 2012-11-26 US US14/360,554 patent/US20150160746A1/en not_active Abandoned
- 2012-11-26 KR KR1020207020413A patent/KR20200087882A/ko not_active Application Discontinuation
- 2012-11-26 KR KR1020167018963A patent/KR101783279B1/ko active IP Right Grant
- 2012-11-26 KR KR1020217024009A patent/KR20210096325A/ko not_active Application Discontinuation
- 2012-11-26 KR KR1020167018978A patent/KR101922426B1/ko active IP Right Grant
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20100125635A (ko) * | 2009-05-21 | 2010-12-01 | 엘지전자 주식회사 | 이동 통신 단말기에서의 메뉴 실행 방법 및 이를 적용한 이동 통신 단말기 |
KR20100134948A (ko) * | 2009-06-16 | 2010-12-24 | 삼성전자주식회사 | 터치스크린을 구비하는 장치의 메뉴 표시 방법 |
KR20110006547A (ko) * | 2009-07-14 | 2011-01-20 | 주식회사 팬택 | 터치 궤적에 따라 메뉴 정보를 표시하는 이동 단말기 |
KR20110030893A (ko) * | 2009-09-18 | 2011-03-24 | (주)빅트론닉스 | 멀티터치 기반 터치 패널 제어장치, 방법 및 이를 이용한 모바일 기기 |
KR20110107069A (ko) * | 2010-03-24 | 2011-09-30 | 글로벌테크링크(주) | 터치 장치 제어방법, 이를 이용하는 터치 장치 |
KR101060175B1 (ko) * | 2010-07-08 | 2011-08-29 | 한국과학기술원 | 터치스크린 제어방법, 이를 위한 기록매체 및 이를 이용하는 클라우드 컴퓨팅 제어방법 |
Also Published As
Publication number | Publication date |
---|---|
KR20230091202A (ko) | 2023-06-22 |
KR20140097233A (ko) | 2014-08-06 |
KR20220025222A (ko) | 2022-03-03 |
KR101922423B1 (ko) | 2018-11-27 |
KR20210045511A (ko) | 2021-04-26 |
KR20160089534A (ko) | 2016-07-27 |
KR20180126624A (ko) | 2018-11-27 |
KR101647057B1 (ko) | 2016-08-09 |
US20150160746A1 (en) | 2015-06-11 |
KR20210096325A (ko) | 2021-08-04 |
KR20200087882A (ko) | 2020-07-21 |
KR20170104674A (ko) | 2017-09-15 |
KR101783279B1 (ko) | 2017-09-29 |
KR20220025223A (ko) | 2022-03-03 |
KR20160088944A (ko) | 2016-07-26 |
KR101922426B1 (ko) | 2019-02-20 |
KR20170105649A (ko) | 2017-09-19 |
KR102351898B1 (ko) | 2022-01-18 |
KR102148228B1 (ko) | 2020-10-14 |
KR101799838B1 (ko) | 2017-12-20 |
KR20160088945A (ko) | 2016-07-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2016137167A1 (en) | Terminal | |
WO2014088375A1 (en) | Display device and method of controlling the same | |
WO2013151347A1 (ko) | 입력 장치 및 문자 입력 방법 | |
WO2016104922A1 (ko) | 웨어러블 전자기기 | |
WO2013022223A2 (en) | Method for controlling electronic apparatus based on voice recognition and motion recognition, and electronic apparatus applying the same | |
WO2015122571A1 (en) | Mobile terminal and method for controlling the same | |
WO2013022218A2 (en) | Electronic apparatus and method for providing user interface thereof | |
WO2016018062A1 (en) | Method and device for providing content | |
WO2016017956A1 (en) | Wearable device and method of operating the same | |
WO2016085173A1 (en) | Device and method of providing handwritten content in the same | |
WO2014112678A1 (ko) | 이동 단말기를 이용한 쇼핑정보 제공방법 및 이동 단말기를 이용하여 쇼핑정보를 제공하는 사용자 인터페이스 | |
WO2015030445A1 (en) | Method and apparatus for executing application using multiple input tools on touchscreen device | |
WO2022131521A1 (ko) | 터치스크린을 포함하는 입력 장치와 이의 동작 방법 | |
WO2016208920A1 (en) | Input device, electronic apparatus for receiving signal from input device and controlling method thereof | |
WO2017183804A1 (en) | Touch screen device, input device, and control method thereof and method thereof | |
WO2022025450A1 (ko) | 슬라이딩 가능한 전자 장치 및 이의 제어 방법 | |
WO2017159931A1 (en) | Electronic device including touch panel and method of controlling the electronic device | |
WO2020149600A1 (en) | Electronic device and operation method thereof | |
WO2015030555A1 (en) | Method and apparatus for providing information about image painting and recording medium thereof | |
WO2016068645A1 (en) | Display apparatus, system, and controlling method thereof | |
WO2013077686A1 (ko) | 단계적 정보 제공 시스템 및 방법 | |
WO2015093848A1 (ko) | 데이터 실행 장치 | |
WO2022025451A1 (ko) | 슬라이딩 가능한 전자 장치 및 이의 제어 방법 | |
WO2013015553A2 (ko) | 문자 입력을 제공하는 방법, 단말기 및 기록매체 | |
WO2020213941A1 (ko) | 스마트폰 내에서 컨텐츠 제공 시스템 및 방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12851431 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 20147014275 Country of ref document: KR Kind code of ref document: A |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12851431 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14360554 Country of ref document: US |