WO2012064128A2 - Touch screen apparatus and method for controlling the same - Google Patents

Touch screen apparatus and method for controlling the same

Info

Publication number
WO2012064128A2
Authority
WO
WIPO (PCT)
Prior art keywords
touch
pointer
point
display
finger
Prior art date
Application number
PCT/KR2011/008568
Other languages
English (en)
Korean (ko)
Other versions
WO2012064128A3 (fr)
Inventor
채상우
Original Assignee
Chae Sang-Woo
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chae Sang-Woo filed Critical Chae Sang-Woo
Publication of WO2012064128A2 publication Critical patent/WO2012064128A2/fr
Publication of WO2012064128A3 publication Critical patent/WO2012064128A3/fr

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present invention relates to a touch screen device and a control method thereof, and more particularly, to an apparatus and method for displaying a user interface graphic object at a position different from a touch position in response to a finger touch and controlling the displayed GUI object.
  • Touch screens based on touch sensing devices have come into wide use for controlling the graphical user interface of wired and wireless electronic devices, from mobile devices to ubiquitous-computing environments.
  • They make the user's multi-network environment richer and more diverse, and allow a more convenient graphical user interface to be provided.
  • In one known technique, a cursor is first displayed on the screen, and when a touch is detected within a predetermined area around the cursor display position, the touch position and the cursor display position are interlocked.
  • As the touch position moves, the cursor display position follows it while maintaining the same distance, and when a tap occurs at the touch position, the tapping operation is recognized as occurring at the cursor display position.
  • In Patent Publication No. 2008-0070226 (Samsung), a first touch and a second touch are detected, a pointer is displayed at the first touch position, and movement of the second touch controls the position of the pointer displayed at the first touch position.
  • In that scheme, the cursor display direction is set clockwise around the touch position to facilitate selection within the rectangular area.
  • In Patent Publication No. 2009-0094360 (Microsoft), while keeping a finger in contact with the screen, the user guides the pointer position based on visual feedback provided by a callout, and can precisely move and fine-tune the pointer position by moving the finger on the surface of the screen until the pointer is over a copy of the desired target displayed in a non-occluded area of the screen.
  • a pointer is displayed at a position spaced a predetermined distance from a touch position in a predetermined direction.
  • These devices typically move the pointer to the desired selection position by dragging while the touch is maintained and then, with the pointer held at that position, release the touch to perform the selection command at the selection position.
  • In simpler schemes, the selection command is performed at the touched location itself, which can cause an unwanted object to be selected and executed.
  • The existing touch screen method uses the finger itself as the pointer, which causes many mistaken clicks and frequent errors, making it difficult for users to search the Internet or perform tasks; to select a small link area displayed in a GUI object, the user must first zoom in and out with two fingers and only then make the selection.
  • This approach makes it difficult to select GUI objects with a natural touch method in the existing web environment and, more seriously, requires new apps to be built with controls sized to the area of an average fingertip for gesture-based control to operate, which causes difficulties.
  • An object of the present invention is to provide a touch screen device and a method of controlling the same, which can achieve convenience of a touch input in order to solve the problems of the related art.
  • Another object of the present invention is to provide a touch screen device and a control method thereof capable of accurately controlling an object.
  • Another object of the present invention is to provide a touch screen device and a control method capable of controlling a display position of a pointer adaptively to a physical environment of a screen when a touch is moved.
  • Still another object of the present invention is to provide a touch screen device and a control method having a zoom in and zoom out function associated with a pointer.
  • Another object of the present invention is to provide a touch screen device and a control method having a virtual indicator bar function associated with the pointer.
  • Another object of the present invention is to provide a touch screen device and a control method having a virtual touch ball function associated with a pointer.
  • Another object of the present invention is to provide a touch screen device and a control method capable of multi-touch multi-pointer control.
  • The control method of the present invention applies to a touch screen device having a display panel and a touch panel. The touch position corresponding to a finger touch (or touch-down) on the touch panel is recognized,
  • a pointer is displayed in an area of the display panel that is a certain distance away from the touch position and is not visually obscured by the finger, and when the finger moves while touching, the pointer is moved and displayed while maintaining that distance in conjunction with the finger movement.
  • The selection wait step is limited to a certain time after touch release; if no selection command is input before the predetermined time elapses, it is preferable to erase the displayed pointer.
  • The selection command may be generated by another touch operation following the pointer-displaying touch (a touch-down, touch click, or double touch click), or by a key input.
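  • As an illustration of this control flow, the following Python sketch models the pointer-offset state machine described above; all names, the offset values, and the wait duration are assumptions for illustration, not details taken from the patent.

```python
import math
import time

class OffsetPointer:
    """Illustrative sketch of the pointer-offset selection flow.

    The pointer is drawn at distance `d` and azimuth `theta` (degrees,
    counterclockwise from horizontal) away from the finger, so the finger
    never hides it. All names and the 3-second wait are assumptions.
    """

    def __init__(self, d=80.0, theta=90.0, select_wait_s=3.0):
        self.d = d
        self.theta = math.radians(theta)
        self.select_wait_s = select_wait_s
        self.pointer = None          # current pointer position, or None
        self.wait_started = None     # time when the selection wait began

    def display_point(self, tx, ty):
        # Pointer sits at a fixed offset from the touch point (12 o'clock by default).
        return (tx + self.d * math.cos(self.theta),
                ty - self.d * math.sin(self.theta))   # screen y grows downward

    def on_touch_down(self, tx, ty):
        self.pointer = self.display_point(tx, ty)

    def on_touch_move(self, tx, ty):
        # The pointer follows the finger at the same distance and azimuth.
        self.pointer = self.display_point(tx, ty)

    def on_touch_up(self):
        # Selection happens at the pointer position, not the touch position;
        # the object under the pointer now waits for a selection command.
        self.wait_started = time.monotonic()
        return self.pointer

    def on_select_command(self):
        # A later tap or key press activates the waiting object, unless the
        # wait window has expired, in which case the pointer is erased.
        if self.wait_started is None:
            return None
        if time.monotonic() - self.wait_started > self.select_wait_s:
            self.pointer = None
            return None
        return self.pointer
```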
  • the pointer operation mode is activated by touching the pointer mode selection button displayed on the touch screen, which is convenient for use in parallel with the existing gesture operation mode.
  • the pointer movement display step may variably control any one of a distance or a direction between the touch point and the pointer display point in conjunction with the distance at which the touch movement enters the rectangular area set near the edge of the touch screen.
  • the pointer movement display step may variably control any one of a distance or a direction between the touch point and the pointer display point corresponding to the distance between the lower edge of the touch screen and the touch point.
  • the pointer movement display step may variably control the distance between the touch point and the pointer display point corresponding to the distance between the center point and the touch point of the touch screen.
  • In a virtual indicator bar mode, the distance between a reference point and the touch point within the indicator bar control zone is calculated, and at least one of the distance or the direction between the touch point and the display point of the indicator bar is variably controlled in correspondence with the calculated reference point distance.
  • In another aspect, while one hand maintains a touch at a first touch point, a touch by the other hand is detected and the pointer is moved and displayed in association with that touch's movement; upon touch-up of the other hand, the distance and direction between the second touch point where the touch-up occurred and the first touch point are stored as the new setting position of the pointer. This makes setting the display direction and distance of the pointer very easy.
  • In another aspect, in a touch screen device that displays a pointer at a setting position spaced a set distance in a set direction from the touch point, an enlargement symbol or a reduction symbol is displayed in the display area of the pointer, and a touch on the pointer's display area is detected.
  • In response, the enlargement or reduction corresponding to the displayed symbol is executed, the symbol in the pointer's display area is inverted to the opposite (reduction or enlargement) symbol, and the touch detection step is repeated. Iterating this detect, execute, and invert cycle makes the zoom-in and zoom-out interface very convenient.
  • In the virtual touch ball mode, a virtual touch ball is displayed on the touch screen; the direction and speed with which the ball is rotated by touch within its display area are detected, and the image is moved and displayed in response to the detected rotation direction and rotation speed.
  • When a touch-up is detected during the image movement display step, the image keeps moving in the direction it had at the moment of touch-up while gradually decelerating to a stop, which makes moving the image on the touch screen very convenient.
  • For multi-touch, the present invention detects the multiple touches, infers the longitudinal direction of each touched finger from the detected touch points, and displays a pointer a certain distance beyond each touch point along its inferred finger direction.
  • The display screen can then be conveniently controlled by moving the multiple pointers, linked to the multiple touch points, together as a group.
  • The inferring step preferably calculates the bisecting point of the straight line between the two farthest of the detected touch points, extends a vertical line of a certain length from that point in one direction until it meets a horizontal line to obtain a reference point, calculates the azimuth angle of each touch point counterclockwise from the horizontal line around the reference point, and takes the extension line from the reference point through each touch point as the longitudinal direction of the finger corresponding to that touch point.
  • In another embodiment, when the pointer mode is selected, the pointer is displayed at an arbitrary first position on the touch screen. When the touch position corresponding to a finger touch (or touch-down) on the touch panel is recognized, the pointer displayed at the first position jumps to a second position in the display panel region that is a predetermined distance away from the recognized touch position and is not visually covered by the finger. While the finger remains touched at the touch position,
  • the pointer is moved and displayed while maintaining the set distance in conjunction with the movement of the finger, and the object waits for selection when the finger is released (or touched up) while the display position of the pointer is on the object to be selected.
  • If the selection command is entered during the selection wait phase, the object waiting for selection is activated.
  • The apparatus of the present invention includes a touch panel with a sensor unit, a display panel disposed below the touch panel, an input/output unit for inputting user key commands, a memory in which a touch screen control program is stored, and a control unit that executes the touch screen control program stored in the memory.
  • The touch screen control program recognizes the touch position corresponding to a finger touch (or touch-down) on the touch panel and displays a pointer in an area of the display panel that is separated from the recognized touch position by a set distance and is not visually obscured by the finger.
  • When the finger moves while touching, the pointer is moved and displayed while maintaining that distance in conjunction with the finger movement; when the finger is released (or touched up) while the display position of the pointer is on an object to be selected, the program waits for selection of the object and activates the waiting object when a selection command is input during the selection waiting step.
  • the apparatus of the present invention may be implemented in any one of software, hardware or a combination thereof.
  • By using an electronic device employing the apparatus and method of the present invention, target image objects can easily be clicked and confirmed with the pointer in data and web environments created for existing computing environments. The invention can serve as a familiar means of connecting the web environment and the app environment, and the information provided on the screen can be checked more quickly and easily through the zoom-in and zoom-out functions provided with the pointer.
  • In addition, automatic direction setting of the pointers from the touch positions of up to five of the user's fingers enables the user to select and click with individual fingers, each assisted by its own pointer, even on a large touch sensing device.
  • FIG. 1 is a block diagram of a touch screen control device according to an embodiment of the present invention.
  • FIG. 2 is a screen state diagram for explaining the configuration of a single pointer according to the present invention.
  • FIG. 3 is a flowchart illustrating a pointer mode selection process according to the present invention.
  • FIG. 4 is a flowchart illustrating a pointer display and object selection process linked to a touch operation in the pointer mode according to the present invention.
  • FIG. 5 is a view for explaining a blind spot on the touch screen.
  • FIG. 6 is a view for explaining the operation principle of the display position control of the pointer 150 in each area of the rectangular zone according to the present invention.
  • FIG. 7 is a flowchart illustrating a process of controlling the pointer display position in the blind spot of the embodiment of FIG. 6.
  • FIGS. 8A and 8B are photographs showing how the distance between the touch position and the display position changes as the lower edge of the touch screen is approached.
  • FIG. 9 is a view for explaining an embodiment for variably controlling the display position of the pointer 150 over the entire screen area.
  • FIG. 10 is a flowchart for explaining the pointer display position control process over the entire region in the embodiment of FIG. 9.
  • FIG. 11 is a view for explaining an embodiment of a virtual indicator bar using the variable display position control technique of the present invention.
  • FIGS. 12A to 12H are screen state diagrams for explaining the environment setting of a pointer according to the present invention.
  • FIG. 13 is a view for explaining a method of setting the direction and distance of the pointer of the present invention in the touch state.
  • FIG. 14 is a screen state diagram showing a state in which the pointer display area is touched according to the present invention.
  • FIG. 15 is a flowchart for explaining an enlargement/reduction processing program according to the present invention.
  • FIG. 16 is a screen state diagram showing the virtual controller 170 according to the present invention.
  • FIG. 17 is a screen state diagram for explaining a virtual touch ball according to the present invention.
  • FIG. 18 is a flowchart illustrating a virtual touch ball control operation according to the present invention.
  • FIG. 19 is a diagram for explaining a two-touch two-pointer according to the present invention.
  • FIG. 20 is a view for explaining a three-touch three-pointer according to the present invention.
  • FIG. 21 is a view for explaining a five-touch five-pointer according to the present invention.
  • FIG. 22 is a flowchart illustrating a pointer display and object selection process associated with a touch operation in a multi-touch multi-pointer mode according to the present invention.
  • FIG. 23 is a view for explaining a modified embodiment of the multi-touch multi-pointer mode according to the present invention.
  • FIG. 24 is a diagram for explaining the principle of calculating the display point azimuth angle in the modified embodiment.
  • FIG. 25 is a screen state diagram of a modified embodiment for explaining the display operation of the pointer according to the present invention.
  • FIG. 26 is a flowchart illustrating a pointer display and object selection process linked to a touch operation according to the modified embodiment.
  • The touch screen controller is an electronic device having a touch screen control function, for example a smartphone with a touch function, a digital camera or camcorder monitor with a touch function, a navigation device with a touch function, a portable multimedia player with a touch function, a tablet computer such as the iPad or Galaxy Tab, a touch screen monitor for a personal computer such as a laptop or desktop, a smart television with a touch function, or a large wall-mounted display with a touch function.
  • Touch-down: the operation state in which a finger contacts the touch panel. The controller recognizes the touch-down state when the touch is held for a predetermined time, for example 300 ms to 500 ms, without moving from the first touch position.
  • Touch release (touch-up): the operation state in which the finger is lifted off the touch panel.
  • Touch movement: the operation state of moving across the touch panel while keeping the finger in the touch-down state.
  • Touch tapping: the operation state in which a touch-down and a touch-up are performed in sequence, with the time from touch-down to touch-up kept shorter than the touch-down recognition time described above, for example within 100 ms to 400 ms.
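  • As an illustration of how these timing windows separate the operation states, the following Python sketch classifies a completed press; the function name, signature, and default thresholds are assumptions rather than anything specified by the patent.

```python
def classify_touch(press_duration_ms, moved,
                   hold_threshold_ms=300, tap_max_ms=400):
    """Classify a completed press using the timing windows above.

    `hold_threshold_ms` (300-500 ms in the text) and `tap_max_ms`
    (100-400 ms in the text) are configuration values; this function
    shape is an illustrative assumption.
    """
    if moved:
        return "touch movement"
    if press_duration_ms <= tap_max_ms:
        return "touch tapping"        # quick down-then-up
    if press_duration_ms >= hold_threshold_ms:
        return "touch-down"           # held in place long enough
    return "undetermined"             # falls between the two windows
```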
  • FIG. 1 is a block diagram of a touch screen control device according to the present invention.
  • the touch screen controller 100 includes a touch screen 110, a controller 120, an input / output unit 130, and a memory unit 140.
  • the touch screen 110 includes a touch panel 114 on the display panel 112.
  • The display panel 112 is a flat panel display such as an OLED or LED panel and displays images, the pointer image, objects, and the like for the user interface.
  • The touch panel 114 detects a finger touch through a pressure-sensitive or capacitive sensor unit to recognize the touch position or touch coordinates.
  • The control unit 120 is composed of a microprocessor or microcomputer such as a central processing unit (CPU) or a digital signal processor (DSP); it executes the touch screen control program, generates display data corresponding to the recognized touch position in response to a given command, and provides the data to the display panel, thereby controlling the operation of the entire touch screen.
  • the input / output unit 130 includes a key input unit, a serial input / output unit, a parallel input / output unit, a wired / wireless communication module, etc., and receives a key command signal or an external data signal, etc., and provides the input to the controller 120 or data generated from the controller 120. Output to the outside.
  • the memory unit 140 includes a cache memory, a DRAM, an SRAM, a flash memory, a magnetic disk storage device, an optical disk storage device, and the like, stores a touch screen control program, and stores data generated by the controller 120.
  • the controller 120 is operated by a graphical user interface (GUI) system to display and control GUI objects on the display panel 112.
  • a GUI object or an image object refers to a unit in which displacement and deformation of an image may be involved and image processing may be performed.
  • The GUI object may be an icon, a desktop, or a window for an application (e.g., Word, Excel, PowerPoint, Internet Explorer, etc.) and may be displayed in a full or partial area on the screen of the touch screen 110.
  • Objects selected by the user include images, videos, and music.
  • UI elements include icons, button boxes, pointers, virtual touch balls, and virtual indicator bars provided by GUI objects.
  • FIG. 2 is a screen state diagram for explaining the structure of a pointer according to the present invention.
  • the pointer 150 of the present invention has a "+” plus sign in a translucent circle, and the intersection point of the "+" plus sign is presented as the display position 152 (Px, Py).
  • The display position 152 of the pointer 150 is spaced apart from the touch position 154 (Tx, Ty) by a distance d, and its azimuth angle θ from the horizontal line is 90°, i.e., the 12 o'clock direction.
  • the display position 152 of the pointer 150 is sufficient if the display area is not covered by the finger.
  • The shape of the pointer 150 is not limited to the illustrated circle; it may be rendered in various shapes and colors, translucent or transparent, but it includes the "+" plus sign inside.
  • the pointer mode selection button 160 is displayed on the lower left of the display panel in a translucent or transparent manner.
  • the pointer mode selection button 160 is marked with a letter "P" in a circle and is displayed in a different color in the normal mode and the pointer mode to indicate the selected mode.
  • the pointer mode selection button 160 is not fixed to the lower left side and may be positioned in an arbitrary area of the display panel by setting the environment.
  • the pointer mode selection button 160 is for distinguishing the pointer mode to be used in parallel with the existing gesture control method.
  • When the button is touched, the controller 120 switches from the normal mode to the pointer mode. In the pointer mode, other control schemes, such as gesture control, are disabled.
  • a menu bar 162 may be displayed at the bottom of the screen.
  • the menu bar 162 includes a single / multi mode button 162a, an environment setting button 162b, a virtual controller button 162c, a virtual indicator bar button 162d, and a virtual touch ball button 162e.
  • FIG. 3 is a flowchart illustrating a pointer mode selection process according to the present invention.
  • The control unit 120 checks whether display of the pointer mode selection button is enabled and, if so, displays the pointer mode selection button 160 (S102 to S106).
  • The controller 120 then changes the color of the letter "P" to indicate that the pointer mode is active (S110).
  • It is then checked whether an erase command is input or whether the set time has elapsed (S112 and S114).
  • If so, the pointer mode is released and the color of the letter "P" is restored to its original color to indicate the normal mode (S116).
  • FIG. 4 is a flowchart illustrating a pointer display and an object selection process linked to a touch operation in the pointer mode according to the present invention.
  • When the pointer mode is selected, the controller 120 fetches the pointer setting data stored in the memory unit 140 (distance value, azimuth, size, transparency, shape, and color data), composes the pointer graphic image, and switches the system to the pointer standby mode (S122). Subsequently, when a finger touch-down is detected on the touch panel 114 (S124), the controller 120 applies the set distance value and azimuth angle to the touch position 154 (Tx0, Ty0), calculates the display position 152 (Px0, Py0), and displays the prepared pointer graphic image at the calculated display position (S126).
  • When the touch moves, the controller 120 recalculates the display position value in real time as the touch position changes, controlling the display so that the set distance and set azimuth angle are maintained.
  • When the finger movement stops and the finger leaves the touch panel, the touch-up operation state is detected (S128).
  • the controller 120 calculates the current position 156 (Px1, Py1) where the pointer 150 is located as the selection position, not the current touch position.
  • the controller 120 waits for selection of the object 157 corresponding to the selection position (S130).
  • the controller 120 checks whether a selection command such as a touchdown or a touch click is input in the selection standby state (S132), and checks whether the set selection waiting time has elapsed (S134).
  • If a selection command such as a touch-down or touch click is input in step S132, the object 157 waiting for selection is activated (S136) and the process returns to step S122.
  • If no selection command is input before the set time elapses, the controller 120 automatically returns to step S122.
  • The pointer mode selection button 160 may automatically disappear when the selected object is activated. This is because, outside of text-writing modes, selection is most convenient with the pointer while general screen operations are most convenient with direct touch control.
  • A specific touch control operation command (for example, vertical movement control) may still be provided.
  • Also, the touch position and the pointer display position are not fixed to the set values in the pointer display and movement process of step S126, but may be controlled to change automatically in specific regions.
  • FIG. 5 is a view for explaining a blind spot on the touch screen.
  • FIG. 6 is a view for explaining a preferred embodiment of the display position control of the pointer 150 in each area of the blind spot.
  • The rectangular zone 180 is set with a width D extending inward from the upper, lower, left, and right boundary lines, and is divided into the upper side region 182, the lower side region 184, the left side region 186, the right side region 188, the upper left corner region 181, the upper right corner region 183, the lower left corner region 185, and the lower right corner region 187.
  • The vertical distance d1 of the pointer 150 in the upper side region 182 is determined by the following equation.
  • Here C1 is the coefficient of the upper side region and S1 is the entry distance from the zone boundary into the upper side region. In the upper side region 182, therefore, the azimuth angle keeps its set value while the vertical distance decreases in inverse proportion to the entry distance S1.
  • The vertical distance d1 of the pointer 150 in the lower side region 184 is determined by the following equation.
  • Here C2 is the coefficient of the lower side region and S2 is the entry distance from the zone boundary into the lower side region. As in the upper side region 182, the vertical distance in the lower side region 184 decreases in inverse proportion to the entry distance S2.
  • The horizontal distance dL of the pointer 150 in the left side region 186 is determined by the following equation.
  • Here C3 is the coefficient of the left side region and S3 is the entry distance from the zone boundary into the left side region.
  • In this case the vertical distance is maintained at d0 while the horizontal distance dL is varied.
  • The azimuth angle θL is determined by the following equation.
  • Both the distance and the azimuth angle are thus variably controlled.
  • In the lower left corner region, the vertical distance and the horizontal distance are reduced in inverse proportion to the vertical and horizontal entry components, respectively, so that both the distance and the azimuth angle are variably controlled.
  • The horizontal distance dR of the pointer 150 in the right side region 188 is determined by the following equation.
  • Here C4 is the coefficient of the right side region and S4 is the entry distance from the zone boundary into the right side region.
  • In this case the vertical distance is maintained at d0 while the horizontal distance dR is varied.
  • The azimuth angle θR is determined by the following equation.
  • Both the distance and the azimuth angle are thus variably controlled.
  • In the lower right corner region, the vertical distance and the horizontal distance are reduced in inverse proportion to the vertical and horizontal entry components, respectively, so that both the distance and the azimuth angle are variably controlled.
  • FIG. 7 is a flowchart illustrating a process of controlling a pointer display position in the rectangular area of the embodiment of FIG. 6.
  • The controller 120 reads the touch position value (S142), and when the input touch position value enters the range of the set rectangular zone 180 (S144), it checks whether the touch lies in the upper or lower side area (S146), in the left or right side area (S150), or in one of the four corner areas (S154). If it is in the upper or lower side area in step S146, the distance value is varied in inverse proportion to the entry distance from the boundary line of the blind spot (S148). If it is in the left or right side area in step S150, the azimuth angle is changed in inverse proportion to the entry distance from the boundary line of the blind spot (S152).
  • If the touch lies in any one of the four corner regions in step S154, the distance value and the azimuth angle are both changed in inverse proportion to the entry distance from the boundary line of the blind spot (S156).
  • the distance value or azimuth data calculated in each of steps S148 to S156 is transmitted from the control unit 120 to the pointer display position control program and reflected in the screen coordinate value of the pointer image.
  • the distance and azimuth between the display position and the touch position of the pointer are automatically controlled adaptively to the entry distance, thereby making it easy to select an object in the blind spot.
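  • The per-region formulas themselves appear only as figures in the source, so the following Python sketch substitutes an assumed linear falloff consistent with the description that the offset shrinks as the touch enters an edge zone; every name and constant here is illustrative.

```python
def edge_zone_offset(tx, ty, w, h, d0=80.0, zone=60.0, d_min=20.0):
    """Shrink the pointer offset near screen edges (illustrative sketch).

    The patent's per-region formulas (FIG. 6) are not reproduced in the
    text, so this linear falloff is an assumed stand-in: the deeper the
    touch enters an edge zone of width `zone`, the smaller the offset
    component toward that edge, floored at `d_min` so the pointer stays
    clear of the fingertip.
    """
    def shrink(entry):
        # entry: how far the touch has penetrated the zone (0..zone)
        frac = max(0.0, min(entry / zone, 1.0))
        return d0 - (d0 - d_min) * frac

    dx, dy = 0.0, -d0              # default offset: straight up (12 o'clock)
    if ty < zone:                  # upper side region: pull the pointer closer
        dy = -shrink(zone - ty)
    if ty > h - zone:              # lower side region: same vertical shrink
        dy = -shrink(ty - (h - zone))
    if tx < zone:                  # left side region: push the pointer rightward
        dx = d0 - shrink(zone - tx)
    if tx > w - zone:              # right side region: push the pointer leftward
        dx = -(d0 - shrink(tx - (w - zone)))
    # In corner regions both dx and dy vary, changing distance and azimuth.
    return (tx + dx, ty + dy)
```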
  • the rectangular area is divided into four sides and four corners to control the distance value and the azimuth angle, but the present invention is not limited thereto, and any one or more of them may be used in combination.
  • For example, only the lower side area may be set as the rectangular zone so that only the vertical distance is variably controlled.
  • the pointer display position may not be variably controlled in the rectangular area but may be extended in a variable control method over the entire screen area.
  • FIG. 9 is a view for explaining an embodiment for variably controlling the display position of the pointer 150 in the entire screen area.
  • The X-marked dotted circles in the drawing are the touch points T1, T2, T3, and T4, and the plus-marked red circles are the pointer display points P1, P2, P3, and P4. The counterclockwise azimuth angles of the touch points T1 to T4 from the horizontal line are denoted θT1, θT2, θT3, and θT4, respectively.
  • As the touch point moves away from the center point, the distance of the display point along the touch point's azimuth direction increases to a value greater than the set distance d0, and as the touch point approaches the center point the distance converges to the set distance d0.
  • For example, the display point of the touch point T3 approaches the touch point T3 from the azimuth direction (θT3 - 90°) relative to the touch point, and as the touch point approaches the center point (0,0) the distance converges to the set distance d0.
  • In this scheme, the touch point and the pointer display point come closer together as the touch approaches the lower edge of the touch screen 110.
  • Even then, a minimum distance from the finger touch position should be maintained so that the pointer mark is not hidden by the finger.
  • FIG. 10 is a flowchart illustrating a pointer display position control process in the entire area of the embodiment of FIG. 9.
  • The controller 120 reads the touch position value (S162) and calculates the distance q and the azimuth angle between the input touch point and the center point (S164). It then checks in which of the first to fourth quadrants the calculated azimuth angle of the touch point lies (S166 to S172).
  • If it lies in the first quadrant, the distance and azimuth angle of the display point P1 are calculated from the first-quadrant display point distance formula of Table 1 and the touch point T1 (S167); if it lies in the second quadrant in step S168, those of the display point P2 are calculated from the second-quadrant formula of Table 1 and the touch point T2 (S169).
  • If it lies in the third quadrant, the distance and azimuth angle of the display point P3 are calculated from the third-quadrant formula of Table 1 and the touch point T3 (S171); if it lies in the fourth quadrant in step S172, those of the display point P4 are calculated from the fourth-quadrant formula of Table 1 and the touch point T4 (S173).
  • The distance and azimuth data of the display points P1 to P4 calculated in steps S167 to S173 are passed from the controller 120 to the pointer display position control program and reflected in the screen coordinate values of the pointer image.
  • On the Internet, objects with a high frequency of use, such as selection menus and icon buttons, are mainly placed at the top and at the left and right sides of the screen, while objects with a low frequency of use are placed at the bottom.
  • Since objects of high importance are arranged at the top and left in line with ergonomic visual and behavioral habits, it is preferable to select object areas with a pointer approaching from below, so that the important parts are not covered by the finger.
  • FIG. 11 is a view for explaining an embodiment of a virtual indicator bar using the variable display position control technique of the present invention.
  • The indicator bar zone 192 is set near the right edge or the lower right corner of the touch screen 110, and each touch position within the indicator bar zone 192 is mapped onto the touch screen 110 as a whole.
  • To this end, the distance from the touch point T to the display point P is variably controlled so as to cover the entire area, and the pointer image is displayed as the indicator image 194.
  • The variable control is performed according to the ratio of the screen's horizontal distance 110b to the zone's extent, so that the display point P sweeps the whole screen as the touch moves within the zone.
  • the azimuth angle of the display point based on the touch point may be fixed at a specific azimuth angle or variably controlled in a manner similar to the above-described rectangular area or the entire area.
  • the controller 120 sets the distance variable control mode to the virtual indicator bar control mode.
  • the virtual indicator bar control technology of the present invention is very useful in the technical fields such as wall-mounted large touch screen device, touch television, touch screen electronic blackboard, touch screen large monitor.
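  • A minimal sketch of the indicator bar idea, assuming a linear absolute mapping from the small control zone to the full screen; the patent only specifies that the control scales with a ratio of screen distance to zone distance, so the function, its name, and the mapping form are illustrative assumptions.

```python
def indicator_bar_map(tx, ty, zone_rect, screen_w, screen_h):
    """Map a touch inside the indicator bar zone to a display point
    anywhere on the screen (illustrative sketch).

    zone_rect = (zx, zy, zw, zh): the small control zone near the
    right edge or lower right corner of the screen.
    """
    zx, zy, zw, zh = zone_rect
    u = (tx - zx) / zw           # 0..1 across the zone horizontally
    v = (ty - zy) / zh           # 0..1 across the zone vertically
    return (u * screen_w, v * screen_h)  # same fractions across the screen
```

  • With this mapping, a touch at the center of the zone points at the center of the screen, so the zone behaves like a small absolute-mode touchpad, which is what makes the technique attractive for wall-mounted large screens.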
  • FIGS. 12A to 12H are screen state diagrams for explaining the environment setting of a pointer according to the present invention.
  • the controller 120 executes the pointer configuration module to set various parameters of the pointer.
  • A screen showing a pull-down menu for setting the direction and distance of the pointer is displayed (FIG. 12A), and in response to the pull-down menu selection the direction setting screen (FIG. 12B) and the distance setting screen (FIG. 12C) are displayed.
  • Likewise, a screen showing a pull-down menu for setting the pointer's size, transparency, color, and shape is displayed (FIG. 12D), and in response to the pull-down menu selection the color setting screen (FIG. 12E), the size setting screen (FIG. 12F), the transparency setting screen (FIG. 12G), and the appearance setting screen (FIG. 12H) are displayed.
  • FIG. 13 is a view for explaining a method of setting the direction and distance of the pointer of the present invention in the touch state.
  • The pointer 150 is first displayed at the distance and direction values set in the environment settings.
  • When another finger touches, the controller 120 recognizes the additional touch position.
  • the controller 120 automatically switches to the pointer distance and direction configuration mode to execute the pointer configuration module.
  • The touch movement of the finger 172 is then tracked so that the pointer 151 can be placed at the desired position and direction; when the finger 172 is touched up (released), the distance and direction between the current pointer position and the touch position of the finger 170 are calculated, and the setting values are updated to these new distance and direction values.
  • Since the display position of the pointer can thus be changed to an arbitrary distance and direction from the touch position while touching, without opening the environment setting menu, convenience in use is increased.
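  • A minimal sketch of this two-hand setting gesture: the new distance and direction are taken from the vector between the first hand's anchor touch and the point where the second hand lifted off. The function name and coordinates are illustrative assumptions.

```python
import math

def update_pointer_setting(anchor, release):
    """Derive a new pointer distance/direction pair from a two-hand
    gesture (illustrative sketch). `anchor` is the first hand's touch
    point and `release` is where the second hand was lifted; the stored
    setting becomes the vector between them.
    """
    ax, ay = anchor
    rx, ry = release
    d = math.hypot(rx - ax, ry - ay)
    # Screen y grows downward, so negate dy to get a counterclockwise-
    # from-horizontal azimuth angle in degrees.
    theta = math.degrees(math.atan2(-(ry - ay), rx - ax)) % 360
    return d, theta

# Example: releasing 80 px above the anchor stores d=80, theta=90 (12 o'clock).
print(update_pointer_setting((100, 500), (100, 420)))  # (80.0, 90.0)
```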
  • FIG. 14 is a view showing a state in which the pointer display area is touched according to the present invention.
  • When the pointer display area is touched, the controller 120 executes an enlargement/reduction processing program that toggles the symbol inside the pointer 150 between "+" and "-".
  • In the "+" state, it performs the enlargement function in response to the touch and inverts the "+" sign in the circle to a "-" sign; conversely, in the "-" state, it performs the reduction function in response to the touch and inverts the "-" sign in the circle to a "+" sign.
  • FIG. 15 is a flowchart for explaining an enlargement/reduction processing program according to the present invention.
  • In the pointer standby state (S182), the controller 120 checks whether a touch-down 154 is detected (S184).
  • When it is, the pointer image 150 containing the enlargement (+) or reduction (-) symbol is displayed at the display position 152, spaced the set distance from the touch position 154 as described above (S186).
  • Next, a touch-up is awaited (S188). When the touch-up is detected, it is checked whether a re-touch is input and whether the re-touch position 158 is a touch-down on the pointer display area (S190) or a touch click (S198).
  • If a touch-down on the pointer display area is detected in step S190, the screen is enlarged or reduced continuously at a predetermined rate according to the displayed symbol (S192).
  • With the + symbol, the screen image is enlarged in 10% steps.
  • With the - symbol, the screen image is reduced in 10% steps. The process continues until a touch-up is detected.
  • When the touch-up is detected in step S194, the enlargement or reduction symbol is inverted (S196) and the process returns to step S190; that is, if the symbol was + in step S190, it is inverted to - after step S196.
  • If a touch click is detected in step S198, a single-step zoom is executed at the set enlargement or reduction ratio, for example 200% (S199). After this one-step enlargement, step S196 inverts the pointer symbol from + to - and the process returns to step S190; a further touch click in step S198 then restores the screen image, already enlarged to 200%, to the original 100%.
  • By distinguishing a touch-down from a touch click on the pointer display area, continuous enlargement or reduction is available while the touch-down is held, while touch clicks toggle quickly between the two stages of 200% enlargement and 100% original size.
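  • The following sketch models the S182-S199 behavior; the 10% step and the 200%/100% two-stage toggle come from the text, while the class shape and the tick-based hold loop are illustrative assumptions.

```python
class ZoomPointer:
    """Sketch of the enlarge/reduce pointer behavior (S182-S199).

    "+" means the next action enlarges; "-" means it reduces. Each
    completed action inverts the symbol, as the flowchart describes.
    """

    def __init__(self):
        self.sign = "+"
        self.scale = 1.0

    def _invert(self):
        self.sign = "-" if self.sign == "+" else "+"

    def hold_tick(self):
        # While the pointer area is held down, step 10% per tick (S192).
        self.scale *= 1.10 if self.sign == "+" else 1 / 1.10
        return self.scale

    def hold_release(self):
        # Touch-up ends the continuous zoom and inverts the symbol (S196).
        self._invert()

    def click(self):
        # A touch click jumps one step: to 200% if "+", back to 100% if "-" (S199).
        self.scale = 2.0 if self.sign == "+" else 1.0
        self._invert()
        return self.scale
```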
  • FIG. 16 is a screen state diagram in which the virtual controller 170 according to the present invention is displayed.
  • When the virtual controller mode is selected, the controller 120 sets the system to the virtual controller operation mode and displays the virtual controller 170 together with the pointer 150 relative to the touch point T.
  • The virtual controller 170 may perform the right-button function of a conventional mouse. Although the virtual controller 170 is shown only on the right side of the touch point T, another virtual controller corresponding to the left mouse button may be displayed on the left side of the touch point T.
  • The virtual controller 170 may also be assigned various other functions, such as menu selection buttons, in addition to the left or right mouse button functions.
  • FIG. 17 is a screen state diagram illustrating a virtual touch ball according to the present invention.
  • FIG. 18 is a flowchart illustrating a virtual touch ball control operation according to the present invention.
  • the virtual touch ball 170 is displayed in a form similar to the existing track ball on the upper left of the touch screen 110.
  • The virtual touch ball 170 is preferably displayed transparently or translucently.
  • The virtual touch ball 170 includes an activation area 170a and, around it, a deactivation area 170b.
  • The activation area 170a is the area in which touch detection yields the movement speed and direction used to control screen movement or scrolling in response to the touch operation; the deactivation area 170b is the area for detecting whether a touch movement has begun.
  • When the ball is rotated by a touch in the activation area 170a, the screen or object is moved in the corresponding direction.
  • If a touch starts outside the ball, the function of the virtual touch ball 170 is suspended so that control by the touch movement of the pointer 150 is maintained.
  • If a touch starts inside the ball, operation of the virtual touch ball 170 is recognized and a screen movement or screen scroll operation is performed.
  • When the virtual touch ball mode is selected, the controller 120 displays the virtual touch ball 170 at the upper left of the screen (S206).
  • A touch-down and touch movement are then detected in the activation area 170a and deactivation area 170b of the virtual touch ball 170 (S208).
  • The control unit 120 moves the screen or object in the corresponding direction in response to the direction and speed at which the virtual touch ball is rolled (S210).
  • A touch-up in the activation area 170a is then awaited (S212).
  • When the touch-up is detected, the screen is brought to a stop while the screen movement speed is gradually decreased (S214).
  • If no touch is detected in S216, it is checked whether the set time has elapsed (S224); if it has not, the virtual touch ball mode is maintained, and if it has, the system returns to the normal mode standby state.
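  • The deceleration after touch-up can be sketched as an inertial glide; the exponential friction model below is an assumption, since the patent only states that the movement speed gradually decreases to a stop.

```python
def glide(vx, vy, dt=1 / 60, friction=0.95, stop_speed=5.0):
    """Generator yielding per-frame displacements after touch-up
    (illustrative sketch of S212-S214). The velocity at the moment of
    touch-up decays by an assumed per-frame friction factor until the
    speed drops below `stop_speed` pixels per second.
    """
    while (vx * vx + vy * vy) ** 0.5 > stop_speed:
        yield vx * dt, vy * dt      # move the image this much this frame
        vx *= friction
        vy *= friction

# Example: a flick at 600 px/s rightward coasts smoothly to a stop.
total_dx = sum(dx for dx, _ in glide(600.0, 0.0))
```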
  • FIG. 19 is a view for explaining the two-touch two-pointer according to the present invention.
  • For the two-touch two-pointer, the azimuth angles of the two touch points T1 and T2 are found as follows.
  • The bisecting point 202 of the straight-line segment of length S between the two touch points T1 and T2 is calculated on the touch screen 110.
  • A line perpendicular to the segment S is drawn from the bisecting point 202 in one direction (usually downward on the screen), and the point 210 where it meets a horizontal line 208 (parallel to the screen horizontal) set at a certain distance from the bisecting point 202 is taken as the reference point.
  • The predetermined distance 212 between the bisecting point 202 and the reference point 210 is preferably proportional to the length from the fingertip to the point where the palm meets the wrist; in particular, it is limited to a value between the palm length (fingers raised as far as possible on the touch screen) and the full hand length (hand stretched flat on the screen).
  • Extension lines 214 and 216 are drawn from the reference point 210 through the touch points T1 and T2, and their azimuth angles θT1 and θT2 are measured counterclockwise from the horizontal line 208.
  • The pointer display points P1 and P2 are calculated as positions spaced the set distance d beyond the touch points T1 and T2 along these lines from the reference point 210, and the pointer images are rendered at the calculated display points P1 and P2, respectively.
  • With this display control, each pointer is displayed on an extension line along the longitudinal direction of its finger.
  • Reference numeral 206 denotes a virtual circle of radius RK; the two touch points T1 and T2 lie at the same distance from the circle's center point 204, that is, on its circumference.
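  • A sketch of this construction, generalized to any number of touches as in FIGS. 20 and 21: find the farthest pair, take its midpoint, drop a perpendicular toward the palm to obtain the reference point, and place each pointer beyond its touch on the ray from the reference point. The palm-length and offset constants, and the assumption that the palm lies toward the bottom of the screen, are illustrative.

```python
import math
from itertools import combinations

def multi_touch_pointers(touches, palm_len=120.0, d=60.0):
    """Place one pointer beyond each touch point along the inferred
    finger direction (illustrative sketch of FIGS. 19-21).

    `palm_len` stands in for the fingertip-to-wrist distance discussed
    in the text; `d` is the set touch-to-pointer distance.
    """
    # Farthest pair of touch points and the midpoint of their segment.
    (x1, y1), (x2, y2) = max(combinations(touches, 2),
                             key=lambda p: math.dist(p[0], p[1]))
    mx, my = (x1 + x2) / 2, (y1 + y2) / 2

    # Unit perpendicular to that segment, oriented downward (screen y
    # grows downward, so "toward the palm" is assumed to mean larger y).
    sx, sy = x2 - x1, y2 - y1
    norm = math.hypot(sx, sy)
    px, py = -sy / norm, sx / norm
    if py < 0:
        px, py = -px, -py
    ref = (mx + palm_len * px, my + palm_len * py)   # reference point 210

    pointers = []
    for tx, ty in touches:
        ux, uy = tx - ref[0], ty - ref[1]
        r = math.hypot(ux, uy)
        pointers.append((tx + d * ux / r, ty + d * uy / r))
    return ref, pointers

# Two-finger example: both pointers land beyond the fingertips,
# on the far side from the palm-side reference point.
ref, pts = multi_touch_pointers([(200, 300), (320, 300)])
```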
  • FIG. 20 is a view for explaining the three-touch three-pointer according to the present invention.
  • The three-touch three-pointer finds the two farthest touch points, T1 and T2, among the touch points T1, T2, and T3, and handles these two touch points in the same way as the two-touch two-pointer described above.
  • The azimuth angles θT1, θT2, and θT3 of the respective touch points T1, T2, and T3 are then calculated to obtain the pointer display points P1, P2, and P3.
  • The difference from the two-touch method is that, when drawing the line perpendicular to the segment S from the bisecting point 202, the direction reference is set in the direction opposite to the touch point T3.
  • FIG. 21 is a view for explaining the five-touch five-pointer according to the present invention.
  • The controller 120 recognizes that five fingers are touching, namely the thumb F1, index finger F2, middle finger F3, ring finger F4, and little finger F5, and in response to the multi-touch calculates the coordinate values of the touch points T1, T2, T3, T4, and T5.
  • the controller 120 generates a multi-pointer by inputting coordinate values of the detected multi-touch positions.
  • The reference point 210 is the point where the vertical extension line 212, drawn in the direction away from the touch positions T2 to T4, meets the horizontal line 208.
  • Extension lines passing through the respective touch points T1, T2, T3, T4, and T5 are drawn from the reference point 210, and their azimuth angles θT1, θT2, θT3, θT4, and θT5 are obtained counterclockwise from the virtual horizontal line 208. This is summarized by the following formula.
  • Here n is a positive integer and θn is the angle between adjacent touch points.
  • The azimuth of any missing touch point is calculated so that the distances, angles, and azimuths between the touch points are maintained even if the multi-touch moves and the coordinates change frequently.
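  • The formula itself is reproduced as an image in the source and is not available here; a plausible reconstruction, consistent with the surrounding description in which each successive azimuth adds the inter-touch angle, is:

```latex
% Assumed reconstruction; the original formula is an image not present in this text.
\theta_{T(n+1)} = \theta_{Tn} + \theta_{n}, \qquad n = 1, 2, 3, 4
```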
  • the longitudinal direction of each finger may be inferred from the azimuth of each touch point.
  • a pointer display point corresponding to each finger touch is calculated on an extension line that is spaced a predetermined distance from each touch point and has an azimuth angle of each touch point.
  • FIG. 22 is a flowchart illustrating a pointer display and an object selection process linked to a touch operation in a multi-touch multi-pointer mode according to the present invention.
  • When the multi-pointer mode is selected, the control unit 120 fetches the setting data stored in the memory unit 140 (vertical extension line length, distance from touch point to display point, size, transparency, shape, and color data), composes the pointer graphic images, and switches the system to the pointer standby mode (S232). Subsequently, when a multi-touch-down is sensed on the touch panel 114 (S234), the controller 120 applies the algorithm described above to the touch positions T1 to T5 to calculate the azimuth angle of each touch point, thereby inferring the display direction of each of the multiple pointers (S236).
  • the pointer display points P1 to P5 are respectively calculated at positions spaced by a predetermined distance set in the direction inferred from the touch point, and the pointer images are rendered at the calculated display points P1 to P5 to display the multi-pointers on the screen.
  • When the touches move, the controller 120 recalculates the display position values in real time as the touch position values change, controlling the display so that the set distances and set azimuth angles are maintained. When the finger movement stops and the fingers leave the touch panel, the touch-up operation state is detected (S242).
  • the controller 120 calculates the current position where the pointer display point P is located as the selection position, not the current touch position.
  • the controller 120 waits for selection of an object corresponding to the selection position (S244).
  • the controller 120 checks whether a selection command such as a touchdown or a touch click is input in the selection waiting state (S246), and checks whether the set selection waiting time has elapsed (S250). If a selection command such as touchdown or touchclick is input in step S246, the object waiting for selection is activated (S248) and the process returns to step S232.
  • In step S250, the controller 120 automatically returns to step S232 if no selection command is input before the set time elapses.
  • FIG. 23 is a view for explaining a modified embodiment of the multi-touch multi-pointer mode according to the present invention.
  • FIG. 24 is a view for explaining the principle of calculating the display point azimuth angle in the modified embodiment.
  • The thumb's touch point T1 and the reference point 210 are connected through three joint points J11, J12, and J13.
  • The index finger's touch point T2 and the reference point 210 are connected through three joint points J21, J22, and J23.
  • The middle finger's touch point T3 and the reference point 210 are connected through three joint points J31, J32, and J33.
  • The ring finger's touch point T4 and the reference point 210 are connected through three joint points J41, J42, and J43.
  • The little finger's touch point T5 and the reference point 210 are connected through three joint points J51, J52, and J53.
  • The pointers P1 to P5 corresponding to the fingers are disposed at a predetermined distance beyond each touch point, on the extension line drawn from the first joint points J11, J21, J31, J41, and J51 through the respective touch points.
  • For each finger, the controller 120 manages as parameters of that finger the distance Di0 between the touch point Ti and the joint point Ji1, the distance Di1 between the joint points Ji1 and Ji2, the distance Di2 between the joint points Ji2 and Ji3, the distance Di3 between the joint point Ji3 and the reference point 210, and the bend angles at the joints.
  • When the finger is fully extended, all the joint points lie on the straight line connecting the reference point and the touch point, and the distances between the points are at their maximum.
  • As the finger bends, the azimuth angles of the joint points and the distances between the points vary with the degree of bending.
  • Each joint point is located as a ratio of the distance between the touch point and the reference point, and the azimuth angle between the reference point 210 and the third joint point J33 is obtained by the following equation.
  • θT3 = θ1 + θ2 + θ3
  • the azimuth angle of the second joint point (J32) is obtained by the following equation.
  • the azimuth angle of the first joint point J31 is obtained by the following equation.
  • the azimuth angle of the display point P3 is calculated at the same angle as the azimuth angle of the first joint point J31.
  • In this modified embodiment, the touch point and the reference point 210 are not simply connected by a straight line as described above; instead, the joint points of each finger are connected so that the virtual hand structure is modeled as a whole, and the pointer direction is taken along the line passing from the first joint point through the touch point.
  • the direction of the finger and the arrangement direction of the pointer can be controlled more precisely.
  • the direction of the pointer can be controlled according to the degree of bending the finger.
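  • The joint-chain model can be sketched as 2D forward kinematics: each segment's azimuth is the running sum of the bend angles, mirroring the relation θT3 = θ1 + θ2 + θ3 above, and the pointer direction follows the last segment from the first joint point through the touch point. The segment lengths and angles below are illustrative assumptions.

```python
import math

def finger_chain(ref, seg_lens, bend_degs):
    """2D forward kinematics for one finger (illustrative sketch of the
    joint-point model in FIGS. 23-24). Starting at the reference point,
    each segment's azimuth is the accumulated sum of the bend angles.
    Returns the chain of points (J3, J2, J1, T) for a four-segment finger.
    """
    x, y = ref
    azimuth = 0.0
    points = []
    for seg, bend in zip(seg_lens, bend_degs):
        azimuth += math.radians(bend)
        x += seg * math.cos(azimuth)
        y -= seg * math.sin(azimuth)   # screen y grows downward
        points.append((x, y))
    return points

def pointer_direction(chain):
    """Pointer lies on the line from the first joint point (nearest the
    touch) through the touch point, per the modified embodiment."""
    (j1x, j1y), (tx, ty) = chain[-2], chain[-1]
    return math.degrees(math.atan2(-(ty - j1y), tx - j1x)) % 360

# A straight finger keeps all joints on one line; bending the joints
# rotates the pointer direction by the accumulated bend angles.
chain = finger_chain((0, 0), [40, 30, 25, 20], [80, 5, 5, 5])
print(pointer_direction(chain))   # about 95 degrees
```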
  • FIG. 25 is a screen state diagram of a modified embodiment for explaining the display operation of the pointer according to the present invention.
  • FIG. 26 is a flowchart illustrating a pointer display and an object selection process linked to a touch operation according to a modified embodiment.
  • In this embodiment, a pointer is first displayed and waits at an arbitrary position 152 on the screen (S262).
  • When an arbitrary position 154 of the touch screen 110 is touched (S264), the pointer jumps from the position 152 to the position 156, as shown in FIG. 25 (S266).
  • When the touch moves, the controller 120 recalculates the display position value in real time as the touch position changes, and the pointer 150 is controlled so as to be displayed while maintaining the set distance and set azimuth (S270).
  • When the finger movement stops and the finger leaves the touch panel, the touch-up operation state is detected (S272).
  • the controller 120 calculates the current position 156 (Px1, Py1) where the pointer 150 is located as the selection position, not the current touch position.
  • the controller 120 waits for selection of the object 157 corresponding to the selection position (S274).
  • the controller 120 checks whether a selection command such as touchdown or touchclick is input in the selection standby state (S276).
  • If a selection command such as a touch-down or touch click is input in step S276, the object 157 waiting for selection is activated (S278), and the process returns to step S262 with the pointer kept displayed.
  • the present invention may be implemented in software such as an application form of a smart phone, hardware in an electronic device having a touch screen, or a combination of firmware and hardware.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention relates to a method for controlling a touch screen. The method of the present invention comprises: displaying, on a display panel, a pointer that is spaced a preset distance from the contact position of a finger on the touch panel and is not visually covered by the finger; moving and displaying the pointer in accordance with the finger's movement, the preset distance between the pointer and the finger being maintained, while the finger moves in contact with the touch position; placing an object to be selected in a selection-waiting state when the finger is removed from the touch panel while the display position of the pointer is on the object to be selected; and selecting the waiting object when a selection command is entered before the pointer display is erased. Accordingly, erroneous selection caused by erroneous operation during touch-dragging on a touch screen can be prevented.
PCT/KR2011/008568 2010-11-10 2011-11-10 Touch screen apparatus and method for controlling same WO2012064128A2 (fr)

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
KR10-2010-0111287 2010-11-10
KR20100111287 2010-11-10
KR20110020671 2011-03-09
KR10-2011-0020671 2011-03-09
KR20110024697 2011-03-21
KR10-2011-0024697 2011-03-21
KR20110041742 2011-05-03
KR10-2011-0041742 2011-05-03
KR10-2011-0070855 2011-07-18
KR1020110070855A KR101095851B1 (ko) 2010-11-10 2011-07-18 Touch screen device and control method thereof

Publications (2)

Publication Number Publication Date
WO2012064128A2 (fr)
WO2012064128A3 (fr) 2012-07-05

Family

ID=45506555

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2011/008568 2010-11-10 2011-11-10 Touch screen apparatus and method for controlling same WO2012064128A2 (fr)

Country Status (2)

Country Link
KR (1) KR101095851B1 (fr)
WO (1) WO2012064128A2 (fr)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101902006B1 (ko) * 2012-01-26 2018-10-01 삼성디스플레이 주식회사 Display device with integrated touch screen panel
CN103425311A (zh) * 2012-05-25 2013-12-04 捷达世软件(深圳)有限公司 Method and system for point-selecting and positioning a moving object
JP5977132B2 (ja) * 2012-09-28 2016-08-24 富士ゼロックス株式会社 Display control device, image display device, and program
WO2014129681A1 (fr) * 2013-02-21 2014-08-28 엘지전자 주식회사 Display device and pointing method for display device
US11096668B2 2013-03-13 2021-08-24 Samsung Electronics Co., Ltd. Method and ultrasound apparatus for displaying an object
WO2014142468A1 2013-03-13 2014-09-18 Samsung Electronics Co., Ltd. Method for providing an image copy and ultrasound apparatus therefor
KR102378503B1 (ko) * 2020-12-29 2022-03-24 울산대학교 산학협력단 Method for inputting and outputting information via an intangible mouse, and non-transitory computer-readable recording medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1040010A (ja) * 1996-07-19 1998-02-13 Ricoh Co Ltd Information processing device with touch panel
JPH1124841A (ja) * 1997-07-07 1999-01-29 Canon Inc Information processing device, processing method, and storage medium
US20090288043A1 (en) * 2007-12-20 2009-11-19 Purple Labs Method and system for moving a cursor and selecting objects on a touchscreen using a finger pointer
KR20100104884A (ko) * 2009-03-19 2010-09-29 김연수 Touch screen capable of displaying a pointer

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014168431A1 (fr) * 2013-04-10 2014-10-16 주식회사 지니틱스 Touch event processing method and apparatus therefor
KR20140122683A (ko) * 2013-04-10 2014-10-20 주식회사 지니틱스 Method for processing a touch event in which touch points rotate relative to one another
KR101661606B1 (ko) 2013-04-10 2016-09-30 주식회사 지니틱스 Method for processing a touch event in which touch points rotate relative to one another
WO2016076519A1 (fr) * 2014-11-12 2016-05-19 주식회사 트레이스 Three-dimensional hovering digitizer
WO2018048050A1 (fr) * 2016-09-12 2018-03-15 에스케이텔레콤 주식회사 Multi-touch display apparatus and touch recognition method thereof
US11237621B2 2016-09-12 2022-02-01 Sk Telecom Co., Ltd. Multi-touch display apparatus and touch recognition method thereof

Also Published As

Publication number Publication date
WO2012064128A3 (fr) 2012-07-05
KR101095851B1 (ko) 2011-12-21

Similar Documents

Publication Publication Date Title
WO2012064128A2 (fr) Touch screen apparatus and method for controlling same
WO2016064137A1 (fr) Apparatus and method for drawing and solving figure content
US8866776B2 (en) Information processing device adapted to receiving an input for user control using a touch pad and information processing method thereof
WO2013089392A1 (fr) Foldable display device and display method thereof
KR100901106B1 (ko) Touch screen control method, touch screen device, and portable compact electronic device
US20090027354A1 (en) Automatic switching for a dual mode digitizer
US20090160805A1 (en) Information processing apparatus and display control method
WO2014084633A1 (fr) Method for displaying applications and electronic device thereof
US20100177053A2 (en) Method and apparatus for control of multiple degrees of freedom of a display
WO2011046270A1 (fr) Multi-touch input control system
WO2013141464A1 (fr) Touch input control method
KR20140038568A (ko) Computer-implemented method and computer-readable medium for performing an operation in response to input and gestures received from a user of a touch screen device
EP2852882A1 (fr) Method and apparatus for controlling a user interface using a touch screen
JPH0778120A (ja) Hand-held computing device and method of processing input signals in a hand-held computing device
WO2012030194A1 (fr) Interfacing method and apparatus
WO2014090116A1 (fr) Method for displaying an icon, microprocessor and mobile terminal
JP6162299B1 (ja) Information processing device, input switching method, and program
WO2014054861A1 (fr) Terminal and method for processing multi-point input
WO2016085186A1 (fr) Electronic apparatus and method for displaying a graphic object thereof
WO2016129923A1 (fr) Display device, display method and computer-readable recording medium
TW201520876A (zh) Operating method of user interface and electronic device
WO2014104727A1 (fr) Method for providing a user interface using a multi-touch system and apparatus therefor
WO2010095783A1 (fr) Method for controlling a touch screen and touch screen device using same
WO2020027417A1 (fr) Electronic device and method for providing a virtual input tool
CN111142775A (zh) Gesture interaction method and device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11839685

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11839685

Country of ref document: EP

Kind code of ref document: A2