WO2012064128A2 - Touch screen apparatus and method for controlling same - Google Patents


Info

Publication number
WO2012064128A2
WO2012064128A2 (PCT/KR2011/008568)
Authority
WO
WIPO (PCT)
Prior art keywords
touch
pointer
point
display
finger
Prior art date
Application number
PCT/KR2011/008568
Other languages
French (fr)
Korean (ko)
Other versions
WO2012064128A3 (en)
Inventor
Chae Sang-Woo (채상우)
Original Assignee
Chae Sang-Woo
Priority date
Filing date
Publication date
Application filed by Chae Sang-Woo filed Critical Chae Sang-Woo
Publication of WO2012064128A2 publication Critical patent/WO2012064128A2/en
Publication of WO2012064128A3 publication Critical patent/WO2012064128A3/en

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present invention relates to a touch screen device and a control method thereof, and more particularly, to an apparatus and method for displaying a user interface graphic object at a position different from a touch position in response to a finger touch and controlling the displayed GUI object.
  • A touch screen using a touch sensing device has been widely used as a graphical user interface control method in wired and wireless electronic devices, from mobile devices to ubiquitous-computing environments.
  • It can enrich and diversify the user's multi-network environment and provide a more convenient graphical user interface.
  • In one prior technique, a cursor is first displayed on the screen, and when a touch is detected within a predetermined area around the cursor display position, the touch position and the cursor display position are interlocked: as the touch position moves, the cursor display position follows it while maintaining the same distance, and when a tap is detected at the touch position, the tap is recognized as occurring at the cursor display position.
  • In Patent Publication No. 2008-0070226 (Samsung), a first touch and a second touch are detected, a pointer is displayed at the first touch position, and movement of the second touch controls the positional movement of the pointer displayed at the first touch position.
  • the cursor display direction is set clockwise around the touch position to facilitate selection in the rectangular area.
  • In Patent Publication No. 2009-0094360 (Microsoft), while keeping a finger in contact with the screen, the user guides the pointer position based on visual feedback provided by a callout, and can precisely move and fine-tune the pointer position by moving the finger on the surface of the screen until the pointer is over a copy of the desired target displayed in a non-occluded area of the screen.
  • a pointer is displayed at a position spaced a predetermined distance from a touch position in a predetermined direction.
  • These devices typically perform a selection command by dragging the pointer to the desired selection position while maintaining the touch, and then releasing the touch while the pointer is held at that position.
  • In that case, however, the selection command is performed at the touched location, which can cause an unwanted object to be selected and executed.
  • The existing touch screen method uses the finger itself as the pointer, which causes frequent mis-clicks and errors and makes it difficult for users to browse the Internet or perform tasks; to select a small link area displayed in a GUI object, the user must first zoom in using two fingers.
  • This method makes it hard to select GUI objects with a natural touch in the existing web environment and, more seriously, new apps must be built with control areas sized to the average finger in order to support gesture-based control, which causes difficulties.
  • An object of the present invention is to provide a touch screen device and a method of controlling the same, which can achieve convenience of a touch input in order to solve the problems of the related art.
  • Another object of the present invention is to provide a touch screen device and a control method thereof capable of accurately controlling an object.
  • Another object of the present invention is to provide a touch screen device and a control method capable of controlling a display position of a pointer adaptively to a physical environment of a screen when a touch is moved.
  • Still another object of the present invention is to provide a touch screen device and a control method having a zoom in and zoom out function associated with a pointer.
  • Another object of the present invention is to provide a touch screen device and a control method having a virtual indicator bar function associated with the pointer.
  • Another object of the present invention is to provide a touch screen device and a control method having a virtual touch ball function associated with a pointer.
  • Another object of the present invention is to provide a touch screen device and a control method capable of multi-touch multi-pointer control.
  • In the control method of the present invention, for a touch screen device having a display panel and a touch panel, the touch position corresponding to a finger touch (or touch-down) on the touch panel is recognized,
  • the pointer is displayed in an area of the display panel that is a certain distance away from the touch position and is not visually obscured by the finger, and when the finger moves while touching, the pointer moves with it while maintaining that distance.
  • The selection wait step is limited to a certain time after touch release; if no selection command is input before the set time elapses, the displayed pointer is preferably erased.
  • the selection command may be generated by at least one of a touch operation for displaying a pointer and another touch operation (touch down, touch click, double touch click), or key input.
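The touch-down, move, touch-up, and selection-wait flow described above can be sketched as a small state machine. This is an illustrative reading of the patent text, not code from the patent; the class, method names, default distance, and timeout value are all assumptions:

```python
import math
import time

class PointerController:
    """Sketch of the pointer-mode flow: touch-down shows the pointer at an
    offset, touch-move drags it along, and touch-up starts a selection wait
    that expires after a timeout. All names and values are illustrative."""

    def __init__(self, distance=80.0, azimuth_deg=90.0, wait_s=2.0):
        self.d = distance                          # set distance from touch to pointer
        self.azimuth = math.radians(azimuth_deg)   # set direction (90 deg = 12 o'clock)
        self.wait_s = wait_s                       # selection-wait timeout after touch-up
        self.pointer = None                        # (Px, Py) when displayed
        self.wait_until = None                     # deadline of the selection wait

    def _display_point(self, tx, ty):
        # Pointer sits d away from the touch so the finger does not hide it.
        return (tx + self.d * math.cos(self.azimuth),
                ty - self.d * math.sin(self.azimuth))  # screen y grows downward

    def touch_down(self, tx, ty):
        self.pointer = self._display_point(tx, ty)

    def touch_move(self, tx, ty):
        # Keep the set distance and direction while the finger moves.
        self.pointer = self._display_point(tx, ty)

    def touch_up(self):
        # The pointer's position, not the touch position, becomes the
        # selection position; then wait for a confirming command.
        self.wait_until = time.monotonic() + self.wait_s
        return self.pointer

    def select_command(self):
        # A second touch operation (or key input) inside the window
        # activates the object under the pointer; otherwise the pointer
        # display is erased when the window expires.
        if self.wait_until and time.monotonic() <= self.wait_until:
            chosen, self.pointer, self.wait_until = self.pointer, None, None
            return chosen
        self.pointer = self.wait_until = None
        return None
```

A touch at (200, 300) with a 100-pixel, 12 o'clock offset displays the pointer at roughly (200, 200), directly above the finger.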
  • the pointer operation mode is activated by touching the pointer mode selection button displayed on the touch screen, which is convenient for use in parallel with the existing gesture operation mode.
  • the pointer movement display step may variably control any one of a distance or a direction between the touch point and the pointer display point in conjunction with the distance at which the touch movement enters the rectangular area set near the edge of the touch screen.
  • the pointer movement display step may variably control any one of a distance or a direction between the touch point and the pointer display point corresponding to the distance between the lower edge of the touch screen and the touch point.
  • the pointer movement display step may variably control the distance between the touch point and the pointer display point corresponding to the distance between the center point and the touch point of the touch screen.
  • In the virtual indicator bar mode, the distance between a reference point and the touch point within the indicator bar control zone is calculated, and at least one of the distance or direction between the touch point and the display point of the indicator bar is variably controlled according to the calculated distance.
  • While one hand touches a first touch point, the pointer display area is shown; a touch by the other hand is detected, the pointer moves in association with that hand's touch movement, and when the other hand touches up, the distance and direction between the second touch point (where the touch-up occurred) and the first touch point are updated as the pointer's new setting position, which makes setting the pointer's display direction and distance very easy.
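The two-hand setting update above amounts to taking the vector from the held touch point to the other hand's lift-off point as the new offset. A minimal sketch, assuming screen coordinates with y growing downward (the function name is illustrative, not from the patent):

```python
import math

def update_pointer_setting(first_touch, second_touch_up):
    """Derive a new pointer offset from the two-hand gesture: the vector
    from the first (held) touch point to the point where the other hand
    lifted becomes the new set distance and azimuth angle."""
    dx = second_touch_up[0] - first_touch[0]
    dy = first_touch[1] - second_touch_up[1]   # flip sign: screen y grows downward
    distance = math.hypot(dx, dy)
    azimuth_deg = math.degrees(math.atan2(dy, dx)) % 360.0
    return distance, azimuth_deg
```

Lifting the second hand 100 pixels straight above the first touch would set a 100-pixel distance at a 90° (12 o'clock) azimuth.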
  • In a touch screen device that displays a pointer at a setting position spaced a set distance in a set direction from the touch point, an enlarge or reduce symbol is displayed in the pointer's display area, and a touch of that display area is detected.
  • When the touch is detected, the enlargement or reduction corresponding to the symbol is executed and the symbol in the pointer's display area is inverted (an enlarge symbol becomes a reduce symbol, and vice versa); iterating these detect, execute, and invert steps makes the zoom in and zoom out interface very convenient.
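The detect, execute, and invert loop above can be sketched as a simple toggle. This is an illustrative reading only; the scale factor and the "+"/"-" symbol characters are assumptions, not values from the patent:

```python
class ZoomPointer:
    """Sketch of the zoom toggle: the pointer carries an enlarge symbol, a
    tap on its display area executes the zoom and flips the symbol, so
    repeated taps alternate between zoom in and zoom out."""

    def __init__(self, factor=2.0):
        self.factor = factor   # assumed zoom step per tap
        self.symbol = "+"      # "+" = enlarge next, "-" = reduce next
        self.scale = 1.0       # current display scale

    def tap_pointer_area(self):
        if self.symbol == "+":
            self.scale *= self.factor   # execute the enlargement
            self.symbol = "-"           # invert to the reduce symbol
        else:
            self.scale /= self.factor   # execute the reduction
            self.symbol = "+"           # invert back to the enlarge symbol
        return self.scale
```

Two successive taps therefore zoom in and then restore the original scale.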
  • A virtual touch ball is displayed on the touch screen; the direction and speed with which the touch ball is rotated by touch within its display area are detected, and the image is moved and displayed in response to the detected rotation direction and rotation speed.
  • When a touch-up is detected during the image movement display step, the image continues moving in the direction it had at the touch-up moment while gradually decelerating to a stop, which makes moving images on the touch screen very convenient.
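The gradual deceleration after touch-up is essentially inertial scrolling. A minimal sketch, where the friction factor and stop threshold are assumed tuning values not given in the patent:

```python
def glide(position, velocity, friction=0.9, min_speed=0.5):
    """Sketch of the touch-ball glide: after touch-up the image keeps
    moving in the direction of the touch-up moment, decelerating each
    frame until its speed falls below a threshold and it stops."""
    x, y = position
    vx, vy = velocity
    path = []
    while (vx * vx + vy * vy) ** 0.5 >= min_speed:
        x, y = x + vx, y + vy                    # advance one frame
        vx, vy = vx * friction, vy * friction    # gradual deceleration
        path.append((x, y))
    return path  # frames of the image's post-release movement
```

With a strong friction of 0.5, an 8-pixel-per-frame release glides a short distance and stops.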
  • The present invention also detects a multi-touch, infers the longitudinal direction of the touched fingers from the detected touch points, and displays a pointer extended a certain distance from each touch point in its inferred finger direction;
  • the display screen is then conveniently controlled by interlocking the multiple pointers linked to the multiple touch points as a group.
  • In the inferring step, it is preferable to calculate the point bisecting the straight line between the two farthest of the detected touch points, extend a vertical line of a certain length in one direction from that point to where it meets the horizontal line to obtain a reference point, calculate the azimuth angle of each touch point counterclockwise from the horizontal line around the reference point, and take the extension lines passing from the reference point through each touch point as the finger's longitudinal direction for that touch point.
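The inference step above is a small geometry computation. The sketch below is one plausible reading of the text: the `drop` length of the perpendicular and the `extend` distance of each pointer are assumed values, and the reference point is placed toward the palm (larger y in screen coordinates):

```python
import math

def infer_pointer_points(touches, drop=200.0, extend=80.0):
    """Sketch of the multi-touch inference: take the midpoint of the two
    farthest touch points, drop a perpendicular of length `drop` toward
    the palm to get a reference point, then place each pointer `extend`
    further along the ray from the reference point through its touch
    point, which approximates that finger's longitudinal direction."""
    # 1. Two farthest touch points and their midpoint (the bisecting point).
    a, b = max(
        ((p, q) for p in touches for q in touches),
        key=lambda pq: math.dist(pq[0], pq[1]),
    )
    mid = ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)
    # 2. Reference point: vertical extension below the midpoint.
    ref = (mid[0], mid[1] + drop)
    pointers = []
    for tx, ty in touches:
        # 3. Finger direction = ray from the reference point through the touch.
        ux, uy = tx - ref[0], ty - ref[1]
        norm = math.hypot(ux, uy) or 1.0
        pointers.append((tx + extend * ux / norm, ty + extend * uy / norm))
    return ref, pointers
```

Each resulting pointer lies beyond its touch point, away from the inferred palm position, so none of them is covered by a finger.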
  • In a modified embodiment, when the pointer mode is selected, the pointer is displayed at an arbitrary first position on the touch screen; the touch position corresponding to a finger touch (or touch-down) on the touch panel is recognized, and the pointer jumps from the first position to a second position in the display panel region that is separated from the recognized touch position by a predetermined distance and is not visually covered by the finger.
  • While the finger moves in the touched state, the pointer moves with it while maintaining that distance, and when the finger is released (or touched up) while the pointer's display position is on an object to be selected, the object enters a selection wait.
  • If a selection command is entered during the selection wait phase, the object waiting for selection is activated.
  • The apparatus of the present invention includes a touch panel with a sensor unit, a display panel disposed below the touch panel, an input/output unit for entering user key commands, a memory in which a touch screen control program is stored, and a control unit that executes the touch screen control program stored in the memory.
  • The touch screen control program recognizes the touch position corresponding to a finger touch (or touch-down) on the touch panel and displays a pointer in an area of the display panel that is separated from the recognized touch position by a set distance and is not visually obscured by the finger.
  • When the finger moves while touching, the pointer moves with it while maintaining that distance; when the finger is released (or touched up) while the pointer's display position is on an object to be selected, the program waits for selection of the object and activates the waiting object when a selection command is input during the selection wait.
  • the apparatus of the present invention may be implemented in any one of software, hardware or a combination thereof.
  • By using an electronic device employing the method of the present invention, a user can easily click and confirm a target image object with the pointer, even for data and web pages created in the existing computing environment; the invention can thus serve as a familiar means of connecting the web environment and the app environment, and the zoom in and zoom out functions provided by the pointer let on-screen information be checked more quickly and easily.
  • The automatic direction setting of pointers using a touch position group of up to five of a user's fingers lets a user select and click with a single finger even on a large touch sensing device, using multiple pointers on a large touch screen.
  • FIG. 1 is a block diagram of a touch screen control device according to an embodiment of the present invention.
  • FIG. 2 is a screen state diagram for explaining the configuration of a single pointer according to the present invention.
  • FIG. 3 is a flowchart illustrating a pointer mode selection process according to the present invention.
  • FIG. 4 is a flowchart illustrating a pointer display and an object selection process linked to a touch operation in the pointer mode according to the present invention.
  • FIG. 5 is a view for explaining a blind spot on the touch screen.
  • FIG. 6 is a view for explaining the operation principle of the display position control of the pointer 150 in each area of the rectangular zone according to the present invention.
  • FIG. 7 is a flowchart illustrating a process of controlling a pointer display position in the blind spot of the embodiment of FIG. 6.
  • FIGS. 8A and 8B are photographs showing a state in which a distance between a touch position and a display position is changed as the lower edge of the touch screen is approached.
  • FIG. 9 is a view for explaining an embodiment for variably controlling the display position of the pointer 150 in the entire screen area.
  • FIG. 10 is a flowchart for explaining a pointer display position control process over the entire region in the embodiment of FIG. 9.
  • FIG. 11 is a view for explaining an embodiment of a virtual indicator rod using a variable control technique of the display position of the present invention.
  • FIGS. 12A to 12H are screen state diagrams for explaining the environment setting of a pointer according to the present invention.
  • FIG. 13 is a view for explaining a method of setting the direction and distance of the pointer of the present invention in the touch state.
  • FIG. 14 is a screen state diagram showing a state in which a point display area is touched according to the present invention.
  • FIG. 15 is a flowchart for explaining an enlargement/reduction processing program according to the present invention.
  • FIG. 16 is a screen state diagram showing the virtual controller 170 according to the present invention.
  • FIG. 17 is a screen state diagram for explaining a virtual touch ball according to the present invention.
  • FIG. 18 is a flowchart illustrating a virtual touch ball control operation according to the present invention.
  • FIG. 19 is a view for explaining a two-touch two-pointer according to the present invention.
  • FIG. 20 is a view for explaining a three-touch three-pointer according to the present invention.
  • FIG. 21 is a view for explaining a five-touch five-pointer according to the present invention.
  • FIG. 22 is a flowchart illustrating a pointer display and an object selection process associated with a touch operation in a multi-touch multi-pointer mode according to the present invention
  • FIG. 23 is a view for explaining a modified embodiment of the multi-touch multi-pointer mode according to the present invention.
  • FIG. 24 is a diagram for explaining the principle of calculating the display point azimuth angle in the modified embodiment.
  • FIG. 25 is a screen state diagram of the modified embodiment for explaining the display operation of the pointer according to the present invention.
  • FIG. 26 is a flowchart illustrating a pointer display and an object selection process linked to a touch operation according to a modified embodiment
  • The touch screen controller is an electronic device having a touch screen control function, for example a smart phone with a touch function, a digital camera or camcorder with a touch-function monitor, a navigation device with a touch function, a portable multimedia player with a touch function, a tablet computer such as an iPad or Galaxy Tab, a touchscreen monitor for a personal computer such as a laptop or desktop, a smart television with touch capability, or a large wall-mounted display with touch capability.
  • Touch-down: an operation state in which a finger touches the touch panel; the controller recognizes the touch-down state when the touch is maintained for a predetermined time, for example 300 ms to 500 ms, without moving from the first touch position.
  • Touch release (or touch-up): an operation state in which the finger is lifted from the touch panel.
  • Touch movement: an operation state of moving on the touch panel while keeping the finger in the touch-down state.
  • Touch-tapping: an operation state in which a touch-down and a touch-up are performed in sequence, with the time from touch-down to touch-up kept shorter than the touch-down recognition time described above, for example within 100 ms to 400 ms.
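The four operation states above can be classified from a touch's duration and whether it moved or lifted. A minimal sketch using the threshold ranges given in the text (the function name and the exact cutoff values chosen within those ranges are assumptions):

```python
def classify_touch(duration_ms, moved, lifted,
                   touchdown_ms=300, tap_max_ms=400):
    """Sketch of the event definitions: a press held ~300 ms without moving
    is a touch-down; a quick press-and-release within ~400 ms is a
    touch-tap; moving while held is a touch-move; a longer press that then
    lifts is a touch-up."""
    if moved:
        return "touch-move"
    if lifted:
        return "touch-tap" if duration_ms <= tap_max_ms else "touch-up"
    return "touch-down" if duration_ms >= touchdown_ms else "pending"
```

For example, a stationary 350 ms press is recognized as a touch-down, while a 150 ms press-and-release is a touch-tap.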
  • FIG. 1 is a block diagram of a touch screen control device according to the present invention.
  • the touch screen controller 100 includes a touch screen 110, a controller 120, an input / output unit 130, and a memory unit 140.
  • the touch screen 110 includes a touch panel 114 on the display panel 112.
  • The display panel 112 is a flat panel display such as an OLED or LED panel and displays images, the pointer image, objects, etc. for the user interface.
  • the touch panel 114 detects a touch of a finger by a pressure-sensitive or electrostatic method by a sensor unit to recognize a touch position or a touch coordinate.
  • The control unit 120 is a microprocessor or microcomputer such as a central processing unit (CPU) or digital signal processor (DSP); it executes the touch screen control program, generates display data corresponding to the recognized touch position in response to a given command, provides it to the display panel, and controls the operation of the entire touch screen.
  • The input/output unit 130 includes a key input unit, serial and parallel input/output units, a wired/wireless communication module, etc.; it receives key command signals or external data signals and provides them to the controller 120, and outputs data generated by the controller 120 to the outside.
  • the memory unit 140 includes a cache memory, a DRAM, an SRAM, a flash memory, a magnetic disk storage device, an optical disk storage device, and the like, stores a touch screen control program, and stores data generated by the controller 120.
  • the controller 120 is operated by a graphical user interface (GUI) system to display and control GUI objects on the display panel 112.
  • a GUI object or an image object refers to a unit in which displacement and deformation of an image may be involved and image processing may be performed.
  • the GUI object may be an icon, a desktop, or a window for an application (eg, Word, Excel, power point, internet explorer, etc.) and may be displayed in full or partial area on the screen of the touch screen 110.
  • Objects selected by the user include images, videos, and music.
  • UI elements include icons, button boxes, pointers, virtual touch balls, and virtual indicator bars provided by GUI objects.
  • FIG. 2 is a screen state diagram for explaining the structure of a pointer according to the present invention.
  • the pointer 150 of the present invention has a "+” plus sign in a translucent circle, and the intersection point of the "+" plus sign is presented as the display position 152 (Px, Py).
  • The display position 152 of the pointer 150 is spaced apart from the touch position 154 (Tx, Ty) by a distance d, at an azimuth angle θ of 90° from the horizontal line, i.e. the 12 o'clock direction.
  • The display position 152 of the pointer 150 may be anywhere, as long as its display area is not covered by the finger.
  • The shape of the pointer 150 is not limited to the illustrated circle; it can take various shapes and colors, translucent or transparent, but includes the "+" plus sign inside.
  • the pointer mode selection button 160 is displayed on the lower left of the display panel in a translucent or transparent manner.
  • the pointer mode selection button 160 is marked with a letter "P" in a circle and is displayed in a different color in the normal mode and the pointer mode to indicate the selected mode.
  • the pointer mode selection button 160 is not fixed to the lower left side and may be positioned in an arbitrary area of the display panel by setting the environment.
  • the pointer mode selection button 160 is for distinguishing the pointer mode to be used in parallel with the existing gesture control method.
  • The controller 120 switches from the normal mode to the pointer mode. In the pointer mode, other control schemes, such as gesture control schemes, are disabled.
  • a menu bar 162 may be displayed at the bottom of the screen.
  • the menu bar 162 includes a single / multi mode button 162a, an environment setting button 162b, a virtual controller button 162c, a virtual indicator bar button 162d, and a virtual touch ball button 162e.
  • FIG. 3 is a flowchart illustrating a pointer mode selection process according to the present invention.
  • The control unit 120 checks whether the pointer mode selection button display is set and, if so, displays the pointer mode selection button 160 (S102 to S106).
  • the controller 120 changes the color of the letter “P” to indicate that the pointer mode is present (S110).
  • Whether the pointer mode is released or the set time has elapsed is then checked (S112 and S114).
  • The pointer mode is released by restoring the color of the letter "P" to its original color, indicating the normal mode (S116).
  • FIG. 4 is a flowchart illustrating a pointer display and an object selection process linked to a touch operation in the pointer mode according to the present invention.
  • The controller 120 retrieves the pointer's setting data (distance value, azimuth, size, transparency, shape, and color data) stored in the memory unit 140, configures a pointer graphic image, and switches the system to the pointer standby mode (S122). Subsequently, when a finger touch-down is detected on the touch panel 114 (S124), the controller 120 applies the set distance value and azimuth angle to the touch position 154 (Tx0, Ty0) to calculate the display position 152 (Px0, Py0), and displays the prepared pointer graphic image at the calculated display position (S126).
  • When the touch moves, the controller 120 calculates the display position value in real time in response to the changing touch position value, controlling the display so that the set distance and set azimuth angle are maintained.
  • When the finger movement stops and a touch-up, in which the finger leaves the touch panel, is detected (S128),
  • the controller 120 takes the current position 156 (Px1, Py1) of the pointer 150, not the current touch position, as the selection position.
  • the controller 120 waits for selection of the object 157 corresponding to the selection position (S130).
  • the controller 120 checks whether a selection command such as a touchdown or a touch click is input in the selection standby state (S132), and checks whether the set selection waiting time has elapsed (S134).
  • a selection command such as a touchdown or a touch click is input in step S132
  • the object 157 that is waiting for selection is activated (S136) and the process returns to step S122.
  • the controller 120 automatically returns to step S122 when there is no input of a selection command until the set time elapses.
  • The pointer mode selection button 160 may automatically disappear when the selected object is activated, because unless a text-entry mode is in use, the pointer is convenient for selection while touch control (a specific touch control operation command, for example, vertical movement control) is convenient for screen operation.
  • In the pointer display and movement process of step S126, the touch position and the pointer display position need not be fixed to the set values; they may be controlled to change automatically in specific regions.
  • FIG. 5 is a view for explaining a blind spot on the touch screen, and FIG. 6 is a view for explaining a preferred embodiment of display position control of the pointer 150 in each area of the blind spot.
  • The total rectangular region 180 is set as a band of width D inward from the upper, lower, left, and right boundary lines, and is divided into the upper side region 182, the lower side region 184, the left side region 186, the right side region 188, the upper-left corner region 181, the upper-right corner region 183, the lower-left corner region 185, and the lower-right corner region 187.
  • the vertical distance d1 of the pointer 150 in the upper side region 182 is determined by the following equation.
  • C1 is the coefficient of the upper edge region and S1 represents the vertical distance from the rectangular boundary to the upper edge region. Therefore, in the upper side region 182, the azimuth angle maintains the set value and the vertical distance decreases in inverse proportion to the entry distance S1.
  • the vertical distance d1 of the pointer 150 in the lower side region 184 is determined by the following equation.
  • C2 is the coefficient of the lower edge region and S2 is the vertical distance from the rectangular boundary to the lower edge region. Therefore, in the same manner as the upper side region 182, in the lower side region 184, the vertical distance is reduced in inverse proportion to the entry distance S2.
  • the horizontal distance dL of the pointer 150 in the left side region 186 is determined by the following equation.
  • C3 is the coefficient of the left side region and S3 represents the horizontal distance from the rectangular boundary to the left side region.
  • the vertical distance is maintained at d0, but the horizontal distance dL is variable.
  • the azimuth angle ⁇ L is determined by the following equation.
  • both the distance and the azimuth angle are variably controlled.
  • In the corner region between the lower side region and the left side region, the vertical distance and the horizontal distance are reduced in inverse proportion to the vertical and horizontal entry components, respectively, so that both the distance and the azimuth angle are variably controlled.
  • the horizontal distance dR of the pointer 150 in the right side region 188 is determined by the following equation.
  • C4 is the coefficient of the right side region and S4 represents the horizontal distance from the rectangular boundary to the right side region.
  • the vertical distance is maintained at d0, but the horizontal distance dR is variable.
  • the azimuth angle ⁇ R is determined by the following equation.
  • both the distance and the azimuth angle are variably controlled.
  • In the corner region between the lower side region and the right side region, the vertical distance and the horizontal distance are reduced in inverse proportion to the vertical and horizontal entry components, respectively, so that both the distance and the azimuth angle are variably controlled.
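The patent's equations for d1, d2, dL, dR, θL, and θR are given only in its figures and are not reproduced in this text. The sketch below therefore illustrates the described behavior, the vertical offset shrinking as the touch enters the top or bottom band and the horizontal offset (hence the azimuth) shifting in the left or right band, using an assumed linear falloff; the band width D and the falloff form are assumptions, not the patent's formulas:

```python
def adjust_offset(d0, entry_top, entry_bottom, entry_left, entry_right, D=60.0):
    """Illustrative edge-zone control: returns the (horizontal, vertical)
    pointer offset for a touch whose entry distances into the top, bottom,
    left, and right bands are given (0 means outside that band)."""
    dv, dh = d0, 0.0
    if entry_top > 0:      # upper band: shrink the vertical distance
        dv = d0 * max(0.0, 1.0 - entry_top / D)
    if entry_bottom > 0:   # lower band: shrink the vertical distance
        dv = d0 * max(0.0, 1.0 - entry_bottom / D)
    if entry_left > 0:     # left band: push the pointer toward the right
        dh = d0 * (entry_left / D)
    if entry_right > 0:    # right band: push the pointer toward the left
        dh = -d0 * (entry_right / D)
    return dh, dv          # corner touches vary both, changing the azimuth
```

A touch halfway into the top band, for instance, halves the vertical offset while leaving the azimuth unchanged.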
  • FIG. 7 is a flowchart illustrating a process of controlling a pointer display position in the rectangular area of the embodiment of FIG. 6.
  • The controller 120 inputs a touch position value (S142) and, when the input touch position value enters the range of the set rectangular area 180 (S144), checks whether it is in the upper or lower side area (S146), the left or right side area (S150), or one of the corner areas (S154). In step S146, if it is in the upper or lower side area, the distance value is varied in inverse proportion to the entry distance from the boundary line of the blind spot (S148). In step S150, if it is in the left or right side area, the azimuth angle is changed in inverse proportion to the entry distance from the boundary line of the blind spot (S152).
  • In step S154, if it is in any one of the corner regions, the distance value and the azimuth angle are changed simultaneously, in inverse proportion to the entry distance from the boundary line of the blind spot (S156).
  • The distance value or azimuth data calculated in each of steps S148 to S156 is passed by the control unit 120 to the pointer display position control program and reflected in the screen coordinates of the pointer image.
  • The distance and azimuth between the display position of the pointer and the touch position are thus automatically controlled adaptively to the entry distance, making it easy to select an object in the blind spot.
  • In the above, the rectangular area is divided into four sides and four corners to control the distance value and the azimuth angle, but the present invention is not limited thereto, and any one or more of these regions may be used in combination.
  • For example, only the vertical distance may be variably controlled by setting only the lower side area as the rectangular area.
  • Alternatively, the pointer display position may be variably controlled not only within the rectangular area but by extending the variable control method over the entire screen area.
  • FIG. 9 is a view for explaining an embodiment for variably controlling the display position of the pointer 150 in the entire screen area.
  • The X-marked circles indicated by dotted lines in the drawing are the touch points T1, T2, T3, and T4, and the plus-marked red circles indicate the pointer display points P1, P2, P3, and P4. The counterclockwise azimuth of each touch point (T1, T2, T3, T4) from the horizontal line is denoted θT1, θT2, θT3, and θT4, respectively.
  • As the touch point moves away from the center point, the distance of the display point in the azimuth direction of the touch point increases to a value greater than the set distance d0, and it converges to the set distance d0 as the touch point approaches the center point.
  • For example, the display point of the touch point T3 approaches the touch point T3 at the azimuth angle (θT3 − 90°) with respect to the touch point, and the distance converges to the set distance d0 as the touch point approaches the center point (0, 0).
  • The touch point and the pointer display point therefore become closer to each other as the touch point approaches the lower edge of the touch screen 110.
  • However, the shortest distance should be kept to at least a minimum distance from the finger touch position so that the pointer display is not hidden.
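One plausible reading of this full-screen behaviour is a radial model: the pointer sits on the ray from the screen centre through the touch point, at a distance that converges to d0 at the centre and grows outward. The following Python sketch uses assumed coefficients; the patent's actual per-quadrant formulas are in Table 1, which is not reproduced in this text.

```python
import math

# Assumed coefficients for an illustrative radial model (not Table 1).
D0 = 40.0    # base offset distance at the screen centre
K = 0.25     # growth coefficient toward the screen edge

def display_point(tx, ty, width, height):
    """Place the pointer on the centre->touch ray, beyond the touch.

    The offset distance is D0 at the centre and grows with the touch
    point's distance from the centre, so edge objects stay reachable.
    """
    cx, cy = width / 2.0, height / 2.0
    qx, qy = tx - cx, ty - cy
    q = math.hypot(qx, qy)          # distance of the touch from the centre
    theta = math.atan2(qy, qx)      # direction of the centre->touch ray
    d = D0 + K * q                  # converges to D0 at the centre point
    return tx + d * math.cos(theta), ty + d * math.sin(theta)
```

The quadrant check of steps S166 to S172 would select which concrete formula replaces the single `d = D0 + K * q` line here.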
  • FIG. 10 is a flowchart illustrating a pointer display position control process in the entire area of the embodiment of FIG. 9.
  • The controller 120 reads the touch position value (S162) and calculates the distance value q and the azimuth angle between the input touch point and the center point (S164). It then checks in which of the first to fourth quadrants the calculated azimuth angle of the touch point lies (S166 to S172).
  • If it is the first quadrant (S166), the distance and azimuth angle of the display point P1 are calculated based on the first-quadrant display point distance formula of Table 1 and the touch point T1 (S167). If it is the second quadrant (S168), the display point P2 is calculated based on the second-quadrant display point distance formula of Table 1 and the touch point T2 (S169).
  • If it is the third quadrant (S170), the display point P3 is calculated based on the third-quadrant display point distance formula of Table 1 and the touch point T3 (S171). If it is the fourth quadrant (S172), the display point P4 is calculated based on the fourth-quadrant display point distance formula of Table 1 and the touch point T4 (S173).
  • The distance values and azimuth data of the display points P1 to P4 calculated in steps S167 to S173 are transmitted from the controller 120 to the pointer display position control program and reflected in the screen coordinate values of the pointer image.
  • On the Internet, objects with a high frequency of use, such as selection menus or icon buttons, are mainly placed at the top and on the left and right sides of the screen layout, and objects with a low frequency of use are placed at the bottom of the screen.
  • Since objects of high importance are arranged at the top and on the left side in accordance with ergonomic visual and behavioral habits, it is preferable to select object areas with the pointer while moving upward from the bottom, in a direction in which the important parts are not covered by the fingers.
  • FIG. 11 is a view for explaining an embodiment of a virtual indicator rod using a variable control technique of the display position of the present invention.
  • The indicator rod zone 192 is set near the right edge of the touch screen 110 or near its lower right corner, and each touch position within the indicator rod zone 192 is made to correspond to the entire area of the touch screen 110 by variably controlling the distance from the touch point T to the display point P; the pointer image is displayed as the instruction image 194.
  • The variable control is performed according to the ratio of the horizontal screen distance 110b to the distance from the touch point T to the display point P.
  • the azimuth angle of the display point based on the touch point may be fixed at a specific azimuth angle or variably controlled in a manner similar to the above-described rectangular area or the entire area.
  • the controller 120 sets the distance variable control mode to the virtual indicator bar control mode.
  • The virtual indicator bar control technology of the present invention is very useful in technical fields such as wall-mounted large touch screen devices, touch televisions, touch screen electronic blackboards, and large touch screen monitors.
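The core of the indicator rod is a proportional mapping from a small touch zone to the whole screen. A minimal Python sketch, with an assumed zone geometry (the patent does not give concrete coordinates):

```python
# Assumed rod-zone rectangle near the right edge: (x1, y1, x2, y2).
ROD_ZONE = (300, 0, 360, 480)

def rod_to_screen(tx, ty, width=360, height=480):
    """Map a touch inside the indicator-rod zone to a display point
    anywhere on the full screen, in proportion to the touch's relative
    position within the zone (zone geometry is illustrative)."""
    x1, y1, x2, y2 = ROD_ZONE
    u = (tx - x1) / float(x2 - x1)   # 0..1 across the zone
    v = (ty - y1) / float(y2 - y1)   # 0..1 down the zone
    return u * width, v * height
```

With this mapping, small finger movements inside the narrow zone sweep the pointer across the whole display, which is what makes the technique attractive for wall-mounted large screens.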
  • 12A to 12H are screen state diagrams for explaining an environment setting of a pointer according to the present invention.
  • the controller 120 executes the pointer configuration module to set various parameters of the pointer.
  • A screen showing a pull-down menu for setting the direction and distance of the pointer is displayed (Fig. 12A), and in response to the pull-down menu selection of Fig. 12A, the direction setting screen (Fig. 12B) and the distance setting screen (Fig. 12C) are displayed.
  • A screen showing a pull-down menu for setting the pointer size, transparency, color, shape, and the like is displayed (Fig. 12D), and in response to the pull-down menu selection, the color setting screen (Fig. 12E), the size setting screen (Fig. 12F), the transparency setting screen (Fig. 12G), and the appearance setting screen (Fig. 12H) are displayed.
  • FIG. 13 is a view for explaining a method of setting the direction and distance of the pointer of the present invention in the touch state.
  • the pointer 150 is displayed by the distance and direction values set by the environment setting.
  • the controller 120 recognizes another touch position.
  • the controller 120 automatically switches to the pointer distance and direction configuration mode to execute the pointer configuration module.
  • The touch movement of the finger 172 is tracked so that the pointer 151 can be positioned at the desired position and in the desired direction; when the finger 172 is touched up (released), the distance and direction values between the position of the currently placed pointer and the touch position of the finger 170 are calculated, and the set values are changed to the new distance value and direction value.
  • Since the display position of the pointer can be changed to an arbitrary distance and direction from the touch position while in the touch state, without selecting the environment setting menu, convenience of use is increased.
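The new setting is just the polar offset from the first (anchor) touch to where the second finger left the pointer. A hedged Python sketch (function and argument names are illustrative):

```python
import math

def new_pointer_setting(anchor, pointer_pos):
    """When the second finger is released, the distance and direction
    from the anchor touch to the pointer's current position become the
    new configuration values.

    Direction is measured counterclockwise from the horizontal, in
    degrees, with screen y growing downward.
    """
    ax, ay = anchor
    px, py = pointer_pos
    distance = math.hypot(px - ax, py - ay)
    direction = math.degrees(math.atan2(ay - py, px - ax)) % 360
    return distance, direction
```

For a pointer placed 40 pixels straight above the anchor touch, this returns a distance of 40 and a direction of 90 degrees.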
  • FIG. 14 is a view showing a state in which a point display area is touched according to the present invention.
  • When the pointer display area is touched, the controller 120 executes an enlargement/reduction processing program for the inside of the pointer 150.
  • The "+" sign is replaced by the "-" sign, and vice versa. In the "+" state, an enlargement function is performed in response to the touch action, and the "+" sign in the circle is inverted to the "-" sign; conversely, in the "-" state, a reduction function is performed in response to the touch action, and the "-" sign in the circle is inverted to the "+" sign.
  • 15 is a flowchart for explaining an enlargement / reduction processing program according to the present invention.
  • The controller 120 checks whether a touchdown 154 is detected (S184) in the pointer standby state (S182).
  • If the touchdown is detected, the pointer image 150 including the enlargement ("+") or reduction ("-") sign is displayed at the display position 152, separated from the touch position 154 by the set distance as described above (S186).
  • It is then checked whether a touch-up is detected (S188). If the touch-up is detected, it is checked whether a retouch is input and whether the retouch position 158 is a touchdown in the pointer display area (S190) or a touch click (S198).
  • In response to the touchdown checked in step S190, the screen is continuously enlarged or reduced at a predetermined ratio according to the displayed enlargement or reduction symbol (S192).
  • The "+" sign enlarges the screen image in steps of 10% from the original size.
  • The "-" sign reduces the screen image in steps of 10% from the original size. This enlargement or reduction process is maintained until a touch-up is detected.
  • If the touch-up is detected in step S194, the enlargement or reduction symbol is inverted and the process returns to step S190 (S196). That is, if the symbol was "+" in step S190, it is inverted to "-" after step S196.
  • If a touch click is checked in step S198, a one-step enlargement is executed at the set enlargement or reduction ratio, for example 200% (S199). After the basic enlargement in step S199, step S196 is performed to invert the pointer symbol from "+" to "-", and the process returns to step S190. Therefore, if there is another touch click in step S198, the screen image that has already been enlarged to 200% is restored to the original 100% screen image.
  • In this way, touches on the pointer display area are distinguished as touchdowns or touch clicks, enabling continuous enlargement or reduction control during a touchdown and quick two-stage control (200% enlargement, then restoration to 100%) by touch clicks.
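The enlargement/reduction flow of Fig. 15 can be modelled as a small state machine. The 10% step and the 200%/100% ratios are the examples given in the text; the class itself is an illustrative sketch, not the patent's implementation:

```python
class ZoomPointer:
    """Sketch of the +/- pointer: a touch click toggles between a
    one-step zoom (200%) and restoration (100%), while a held touchdown
    repeats 10% steps; the symbol inverts after each action (S196)."""

    def __init__(self):
        self.sign = '+'      # symbol currently shown inside the pointer
        self.scale = 100     # current magnification in percent

    def touch_click(self):
        # One-step zoom (S199), then invert the symbol (S196).
        self.scale = 200 if self.sign == '+' else 100
        self.sign = '-' if self.sign == '+' else '+'
        return self.scale

    def touchdown_step(self):
        # Repeated 10% step while the touchdown is held (S192).
        self.scale += 10 if self.sign == '+' else -10
        return self.scale

    def touch_up(self):
        # Releasing a held touchdown also inverts the symbol (S196).
        self.sign = '-' if self.sign == '+' else '+'
```

Two successive touch clicks thus go 100% -> 200% -> 100%, matching the two-stage control described above.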
  • 16 is a screen state diagram in which the virtual controller 170 according to the present invention is displayed.
  • The controller 120 sets the system to the virtual controller operation mode and displays the virtual controller 170 together with the pointer 150 near the touch point T.
  • The virtual controller 170 may perform the right-button function of a conventional mouse. Although the virtual controller 170 is shown only on the right side of the touch point T, another virtual controller corresponding to the left mouse button may be displayed on the left side of the touch point T.
  • Various functions, such as a menu selection button as well as the left or right mouse button function, may be assigned to the virtual controller 170.
  • FIG. 17 is a screen state diagram illustrating a virtual touch ball according to the present invention
  • FIG. 18 is a flowchart illustrating a virtual touch ball control operation according to the present invention.
  • the virtual touch ball 170 is displayed in a form similar to the existing track ball on the upper left of the touch screen 110.
  • The virtual touch ball 170 is preferably displayed in a transparent or translucent form.
  • The virtual touch ball 170 includes an activation area 170a and a deactivation area 170b surrounding it.
  • The activation area 170a is an area in which the moving speed and the moving direction are detected from the touch in order to control screen movement or scrolling in response to the touch operation; the deactivation area 170b is an area for detecting whether a touch movement has started.
  • In the activation area 170a, the screen or the object is controlled to move in the direction in which the touch ball is rolled in response to the touch movement.
  • When a touch occurs outside the virtual touch ball 170, the function of the virtual touch ball 170 is suspended so that control by the touch movement of the pointer 150 is maintained.
  • When a touch occurs inside the virtual touch ball 170, its operation is recognized and a screen movement or screen scroll operation is performed.
  • the controller 120 displays the virtual touch ball 170 on the upper left of the screen. (S206).
  • A touchdown and a touch movement are detected in the activation area 170a and the deactivation area 170b of the virtual touch ball 170 (S208).
  • In response to the direction and speed at which the virtual touch ball is rolled, the control unit 120 moves the screen or the object in the corresponding direction (S210).
  • It is checked whether a touch-up is detected in the activation area 170a (S212).
  • If the touch-up is detected, the screen is brought to a stop while the screen moving speed is gradually decreased (S214).
  • If no touch is detected in step S216, it is checked whether the set time has elapsed (S224); if it has not elapsed, the virtual touch ball mode is maintained, and if it has elapsed, the system returns to the normal-mode standby state.
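The "gradually decreasing" stop of step S214 is essentially a friction decay. A minimal sketch, where the friction factor and stop threshold are assumptions (the patent does not specify them):

```python
def decelerate(speed, friction=0.8, stop_below=1.0):
    """Return the successive scroll speeds after touch-up (step S214):
    the speed decays by a friction factor each frame until it drops
    below a threshold, at which point the screen stops.

    friction and stop_below are illustrative values.
    """
    speeds = []
    while speed >= stop_below:
        speeds.append(speed)
        speed *= friction
    return speeds
```

Each returned value would be applied for one display frame, giving the touch ball its trackball-like coasting feel.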
  • 19 is a view for explaining the two touch 2 pointer according to the present invention.
  • To generate two pointers for the two touch points T1 and T2, their azimuth angles are found as follows.
  • The bisecting point 202 of the straight-line distance S between the two touch points T1 and T2 is calculated on the touch screen 110 in order to obtain the azimuth angles of the touch points.
  • A vertical line perpendicular to the distance S is drawn from the bisecting point 202 in one direction (usually toward the lower part of the screen), and the point 210 at which it meets a horizontal line 208 (coinciding with the screen horizontal) set at a predetermined distance from the bisecting point 202 is calculated as the reference point.
  • The predetermined distance 212 between the bisecting point 202 and the reference point 210 is preferably proportional to the length from the tip of the middle finger to the point where the palm meets the wrist. In particular, it is limited to a value between the vertical palm length (when the fingers are raised as far as possible on the touch screen) and the full hand length (when the hand is stretched out flat on the touch).
  • Extension lines 214 and 216 passing through the touch points T1 and T2 are drawn from the reference point 210, and their counterclockwise azimuth angles θT1 and θT2 with respect to the horizontal line 208 are obtained.
  • The pointer display points P1 and P2 are calculated as positions spaced the set distance d beyond the touch points T1 and T2 on the lines from the reference point 210 through the touch points, and pointer images are rendered at the calculated pointer display points P1 and P2, respectively.
  • With the pointer display control described above, the pointer can be displayed on an extension line in the lengthwise direction of the finger.
  • Reference numeral 206 denotes a virtual circle having a radius RK; the two touch points T1 and T2 are located at the same distance from the center point 204 of the circle, that is, on its circumference.
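The geometry above can be sketched directly in Python. The offset distance d and the bisector-to-reference distance (standing in for the palm-length value 212) are assumed constants:

```python
import math

D = 40.0     # set pointer offset distance beyond each touch (assumed)
HAND = 120.0 # bisecting-point-to-reference-point distance 212 (assumed)

def two_touch_pointers(t1, t2):
    """Compute the reference point below the bisecting point of T1T2
    and the two pointer display points, each D pixels beyond its touch
    on the line from the reference point through that touch.

    Screen coordinates: y grows downward; azimuths are measured
    counterclockwise from the horizontal as in the figures.
    """
    mx, my = (t1[0] + t2[0]) / 2.0, (t1[1] + t2[1]) / 2.0  # bisecting pt
    rx, ry = mx, my + HAND    # reference point, toward the screen bottom
    points = []
    for tx, ty in (t1, t2):
        theta = math.atan2(ry - ty, tx - rx)  # azimuth of this touch
        px = tx + D * math.cos(theta)         # extend past the touch
        py = ty - D * math.sin(theta)         # y-down screen coordinates
        points.append((px, py))
    return (rx, ry), points
```

Each display point lies on the finger-direction ray, at the reference-to-touch distance plus D, so the pointer appears just beyond the fingertip.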
  • 20 is a view for explaining a three touch 3 pointer according to the present invention.
  • The three-touch three-pointer finds the two farthest-apart touch points T1 and T2 among the touch points T1, T2, and T3 and processes these two touch points T1 and T2 in the same way as the two-touch two-pointer described above.
  • The azimuth angles θT1, θT2, and θT3 of the respective touch points T1, T2, and T3 are calculated to obtain the pointer display points P1, P2, and P3.
  • The point that differs from the two-touch method is that, when the vertical line perpendicular to the distance S is drawn from the bisecting point 202, the direction reference is set in the direction opposite to the touch point T3.
  • 21 is a view for explaining a five touch 5 pointer according to the present invention.
  • The controller 120 recognizes that the five fingers of the thumb F1, the index finger F2, the middle finger F3, the ring finger F4, and the little finger F5 are touched, and calculates the coordinate values of the touch positions (T1, T2, T3, T4, and T5) in response to the multi-touch.
  • the controller 120 generates a multi-pointer by inputting coordinate values of the detected multi-touch positions.
  • The reference point 210 is the point where the vertical extension line 212, drawn in the direction opposite to the touch positions T2 to T4, meets the horizontal line 208.
  • Extension lines passing through the respective touch points T1, T2, T3, T4, and T5 are drawn from the reference point 210, and the counterclockwise azimuth angles θT1, θT2, θT3, θT4, and θT5 from the virtual horizontal line 208 are obtained, respectively. This is summarized by the following formula.
  • Here, n is a positive integer and Δθn is the angle between adjacent touch points.
  • The azimuths of missing touch points are calculated so that the distances, angles, and azimuths between the touch points are maintained even when the multi-touch moves and the coordinates change frequently.
  • the longitudinal direction of each finger may be inferred from the azimuth of each touch point.
  • a pointer display point corresponding to each finger touch is calculated on an extension line that is spaced a predetermined distance from each touch point and has an azimuth angle of each touch point.
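A sketch of the azimuth computation for the multi-touch case. The Δθn relation in the text is not reproduced verbatim (its formula appears only as an image), so this illustrative Python version simply measures each touch's azimuth from the reference point and the differences between neighbouring fingers:

```python
import math

def touch_azimuths(ref, touches):
    """Azimuth (degrees, counterclockwise from horizontal, y-down
    screen coordinates) of each touch point as seen from the reference
    point, plus the angles between adjacent touch points.

    Caching the adjacent-finger differences is one way a missing
    finger's direction could be inferred, as the text describes.
    """
    azis = [math.degrees(math.atan2(ref[1] - ty, tx - ref[0])) % 360
            for tx, ty in touches]
    deltas = [b - a for a, b in zip(azis, azis[1:])]
    return azis, deltas
```

For a symmetric hand pose the middle finger sits at 90 degrees and the neighbouring fingers at equal angular offsets on either side.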
  • FIG. 22 is a flowchart illustrating a pointer display and an object selection process linked to a touch operation in a multi-touch multi-pointer mode according to the present invention.
  • The control unit 120 loads the setting data (vertical extension line length value; distance setting value from the touch point to the display point; size, transparency, shape, and color data) stored in the memory unit 140, configures the pointer graphic image, and switches the system to the pointer standby mode (S232). Subsequently, when a multi-touchdown is sensed by the touch panel 114 (S234), the controller 120 performs the above-described algorithm on the touch positions T1 to T5 to calculate the azimuth angle of each touch point, thereby inferring the display direction of each of the multi-pointers (S236).
  • the pointer display points P1 to P5 are respectively calculated at positions spaced by a predetermined distance set in the direction inferred from the touch point, and the pointer images are rendered at the calculated display points P1 to P5 to display the multi-pointers on the screen.
  • While the fingers move, the controller 120 calculates the display position values in real time in response to the changes in the touch position values according to the movement, and controls the pointers to be displayed while maintaining the set distance and the set azimuth angle. The movement of the fingers then stops, and the touch-up operation state, in which the finger contact leaves the touch panel, is detected (S242).
  • When the touch-up is detected, the controller 120 calculates the current position of the pointer display point P, not the current touch position, as the selection position.
  • the controller 120 waits for selection of an object corresponding to the selection position (S244).
  • the controller 120 checks whether a selection command such as a touchdown or a touch click is input in the selection waiting state (S246), and checks whether the set selection waiting time has elapsed (S250). If a selection command such as touchdown or touchclick is input in step S246, the object waiting for selection is activated (S248) and the process returns to step S232.
  • In step S250, the controller 120 automatically returns to step S232 if no selection command has been input by the time the set time elapses.
  • FIG. 23 is a view for explaining a modified embodiment of the multi-touch multi-pointer mode according to the present invention
  • FIG. 24 is a view for explaining the principle of calculating the display point azimuth angle in the modified embodiment.
  • The touch point T1 of the thumb and the reference point 210 are connected to each other through three joint points J11, J12, and J13.
  • The touch point T2 of the index finger and the reference point 210 are connected to each other through three joint points J21, J22, and J23.
  • The touch point T3 of the middle finger and the reference point 210 are connected to each other through three joint points J31, J32, and J33.
  • The touch point T4 of the ring finger and the reference point 210 are connected to each other through three joint points J41, J42, and J43.
  • The touch point T5 of the little finger and the reference point 210 are connected to each other through three joint points J51, J52, and J53.
  • the pointers P1 to P5 corresponding to each finger are disposed at a predetermined distance from the touch point on the extension line extending through the touch point from the joint points J11, J21, J31, J41, and J51, respectively.
  • The controller 120 manages, as parameters of the corresponding finger, the distance Di0 between the touch point Ti and the joint point Ji1, the distance Di1 between the joint point Ji1 and the joint point Ji2, the bend angle at the joint point Ji2, the distance between the joint point Ji2 and the joint point Ji3, and the angle between the joint point Ji3 and the reference point 210.
  • When the finger is fully extended, all joint points are positioned on a straight line connecting the reference point and the touch point, and the distance between the points is at its maximum.
  • the azimuth angles of the joint points and the distance between the points vary according to the degree of bending the finger.
  • each joint point is calculated and arranged as a ratio of the distance between the touch point and the reference point, and the azimuth angle between the reference point 210 and the third joint point J33 is obtained by the following equation.
  • θT3 ≒ Δ1 + Δ2 + Δ3
  • the azimuth angle of the second joint point (J32) is obtained by the following equation.
  • the azimuth angle of the first joint point J31 is obtained by the following equation.
  • the azimuth angle of the display point P3 is calculated at the same angle as the azimuth angle of the first joint point J31.
  • Since the touch point and the reference point 210 are not simply connected by a straight line as described above, but the joint points of each finger are connected so as to represent a virtual hand structure as a whole, and the pointer is placed on the extension line passing from the first joint point through the touch point, the direction of the finger and the arrangement direction of the pointer can be controlled more precisely.
  • the direction of the pointer can be controlled according to the degree of bending the finger.
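The joint-chain idea can be sketched as two small helpers; this is an illustrative interpretation (the patent's per-joint equations appear only as images, so the angle-summing form and the parameter names here are assumptions):

```python
import math

def fingertip_direction(base_azimuth, joint_bends):
    """Direction of the last finger segment: the base azimuth from the
    reference point plus the accumulated bend at each joint, echoing
    the theta ~ delta1 + delta2 + delta3 relation in the text."""
    return (base_azimuth + sum(joint_bends)) % 360

def pointer_from_joint(touch, j1, d=40.0):
    """Place the pointer d pixels beyond the touch point on the line
    from the first joint point J1 through the touch point."""
    length = math.hypot(touch[0] - j1[0], touch[1] - j1[1])
    ux = (touch[0] - j1[0]) / length   # unit vector J1 -> touch
    uy = (touch[1] - j1[1]) / length
    return touch[0] + d * ux, touch[1] + d * uy
```

Because the bend angles enter the direction sum, curling a finger rotates its pointer without moving the reference point, which is the finer control the text describes.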
  • FIG. 25 is a screen state diagram of a modified embodiment for explaining a display operation of a pointer according to the present invention
  • FIG. 26 is a flowchart illustrating a pointer display and an object selection process linked to a touch operation according to a modified embodiment.
  • A pointer is displayed and waits at an arbitrary position 152 on the screen (S262).
  • When an arbitrary position 154 of the touch screen 110 is touched (S264), the pointer jumps from the position 152 to the position 156, as shown in FIG. 25, and is displayed there (S266).
  • When the finger moves while touching, the controller 120 calculates the display position value in real time in response to the change in the touch position value according to the movement, and controls the pointer 150 to be displayed while maintaining the set distance and the set azimuth angle (S270).
  • The finger movement then stops, and the touch-up operation state, in which the finger contact leaves the touch panel, is detected (S272).
  • When the touch-up is detected, the controller 120 calculates the current position 156 (Px1, Py1) of the pointer 150, not the current touch position, as the selection position.
  • the controller 120 waits for selection of the object 157 corresponding to the selection position (S274).
  • the controller 120 checks whether a selection command such as touchdown or touchclick is input in the selection standby state (S276).
  • If a selection command such as a touchdown or a touch click is input in step S276, the object 157 waiting for selection is activated (S278), and the process returns to step S262 with the pointer kept displayed.
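The selection flow of Fig. 26 (touch-up freezes the pointer position as the selection position; a follow-up command activates the object; a timeout cancels) can be modelled as a small state machine. The timeout value is illustrative, as the patent only refers to "the set time":

```python
class PointerSelect:
    """Sketch of the select-on-release flow: touch-up enters a waiting
    state with the pointer position as the selection position (S272 to
    S274); a touchdown or touch click within the set time activates the
    object there (S276, S278); otherwise the wait expires (S280)."""

    TIMEOUT = 3.0   # seconds; illustrative value for "the set time"

    def __init__(self):
        self.state = 'idle'
        self.selection = None

    def touch_up(self, pointer_pos, now):
        # The pointer position, not the touch position, becomes the
        # candidate selection position.
        self.state, self.selection, self.since = 'waiting', pointer_pos, now

    def select_command(self, now):
        """Return the position to activate, or None if not waiting or
        if the selection wait has timed out."""
        if self.state != 'waiting':
            return None
        self.state = 'idle'
        if now - self.since > self.TIMEOUT:
            return None
        return self.selection
```

Because activation requires a second, deliberate command after the release, an accidental mid-drag touch-up no longer selects whatever happens to sit under the finger.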
  • The present invention may be implemented as software, such as a smartphone application, as hardware in an electronic device having a touch screen, or as a combination of firmware and hardware.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

Disclosed is a method for controlling a touch screen. The method of the present invention involves displaying a pointer on a display panel which is spaced apart from a finger touch position on a touch panel by a predetermined distance and which is not visually covered by a finger, moving and displaying the pointer in accordance with the movement of the finger while maintaining a predetermined distance between the pointer and the finger when the finger moves while touching the touch position, enabling an object to be on standby for selection when the finger is removed from the touch panel while the display position of the pointer is set on the object to be selected, and selecting the standby object when a selection command is inputted before the display of the pointer is suppressed. Accordingly, an erroneous selection caused by an erroneous operation can be prevented during touch dragging on a touch screen.

Description

터치스크린장치 및 그 제어방법Touch screen device and control method
본 발명은 터치스크린 장치 및 그 제어방법에 관한 것으로 특히 손가락 터치에 감응하여 터치 위치와 다른 위치에 유저 인터페이스 그래픽 객체를 표시하고 표시된 GUI 객체를 통하여 제어하는 장치 및 방법에 관한 것이다.The present invention relates to a touch screen device and a control method thereof, and more particularly, to an apparatus and method for displaying a user interface graphic object at a position different from a touch position in response to a finger touch and controlling the displayed GUI object.
최근에 무선 통신 기술의 발달로 휴대 장치의 모바일 및 유비쿼터스 환경의 유선 및 무선 전자장치에서 그래픽사용자 인터페이스의 제어방법으로 터치감지장치를 이용한 터치스크린이 널리 사용되고 있다.Recently, due to the development of wireless communication technology, a touch screen using a touch sensing device has been widely used as a control method of a graphic user interface in wired and wireless electronic devices of mobile devices and ubiquitous environments.
터치스크린을 사용하는 전자장치에서 소프트웨어를 활용하여 전자기기를 작동함에 있어서 사용자의 멀티네트워크 환경을 보다 풍부하고 다양하게 수행하고 보다 편리한 그래픽 사용자 인터페이스를 제공할 수 있다는 장점을 가진다.When operating electronic devices using software in electronic devices using a touch screen, the multi-network environment of the user may be richer and more diverse, and a more convenient graphical user interface may be provided.
사용자가 터치스크린이 장착되는 전자기기에서 프로그램을 잘 활용하고 GUI객체를 선택하여 이용하기 위해서는 그에 맞는 보조장치 수단을 활용하여 표시된 GUI객체를 용이하게 선택하고 확대 또는 축소하여 검색할 수 있어야 한다. 이에 관련된 종래 기술로서 미국공개번호 제2008/0122796호 등에 개시된 다중 터치에 관한 멀티 터치 방식이 있다.In order for a user to utilize a program well and to select and use a GUI object in an electronic device equipped with a touch screen, the user should be able to easily select, enlarge or reduce the displayed GUI object by using an auxiliary device corresponding thereto. As a related art, there is a multi-touch method for a multi-touch disclosed in US Patent Publication No. 2008/0122796.
유저 그래픽 인터페이스 사용 환경에서 화면상에 디스플레이 되는 작은 객체들을 뭉뚝한 손가락으로 직접 타켓팅 하기 곤란하였다.It was difficult to directly target small objects displayed on the screen with a blunt finger in an environment using a user graphic interface.
그러므로 사용자가 터치 감지형 디스플레이 상에서 자신의 손가락만을 사용하여 커서를 호버링(유영:hovering)할 수 있게 하는 방법을 제공할 필요가 있다.Therefore, there is a need to provide a method that enables a user to hover the cursor using only their finger on a touch-sensitive display.
공개특허 제2006-0072082호(마이크로소프트), 미국공개특허 제2009/0288043호(Purple Labs)에서는 화면상에 먼저 커서를 표시하고 커서 표시 위치를 중심으로 일정 영역 이내에 터치가 감지되면 터치위치와 커서 표시위치가 연동되어 터치위치를 이동하면 동일 간격을 유지한 상태에서 커서 표시위치가 따라 다니며 터치위치에서 탭핑하면 그 탭핑 동작인식은 커서 표시위치에서 발생한 것으로 인식하는 커서 제어 기술을 개시한다.In Patent Publication Nos. 2006-0072082 (Microsoft) and U.S. Patent Publication No. 2009/0288043 (Purple Labs), a cursor is first displayed on a screen, and when a touch is detected within a predetermined area around the cursor display position, the touch position and the cursor are displayed. When the display position is interlocked and the touch position is moved, the cursor display position follows the same interval while maintaining the same distance, and when the touch position is touched, the tapping operation recognition initiates a cursor control technique that recognizes that the tapping operation occurs at the cursor display position.
공개특허 제2008-0070226호(삼성)에서는 제1터치와 제2터치를 감지하고 제1터치위치에 포인터를 표시하고 제2터치의 이동을 제1터치위치에 표시된 포인터의 위치이동으로 제어한다.In Patent Publication No. 2008-0070226 (Samsung), a first touch and a second touch are detected, a pointer is displayed at a first touch position, and the movement of the second touch is controlled by the position movement of the pointer displayed at the first touch position.
등록특허 제10-897806호(엘지) 터치위치 근처에 커서를 표시하고 두드리는 것에 의해 터치위치를 중심으로 시계방향으로 커서 표시 방향을 설정하여 사각영역에서 선택을 용이하게 한다.By displaying and tapping the cursor near the touch position, the cursor display direction is set clockwise around the touch position to facilitate selection in the rectangular area.
공개특허 제2009-0094360호(마이크로소프트)에서는 손가락을 스크린과 접촉한 상태로 유지하면서, 사용자는 콜아웃(Callout)에 의해 제공된 시각적 피드백에 기초하여 포인터의 위치를 안내하고, 사용자는 포인터가 스크린의 가려지지 않은 스크린 영역(non-occluded area)에 표시된 원하는 타겟의 사본 위에 있을 때까지 자신의 손가락을 스크린의 표면상에서 움직임으로써 정확하게 이동하고 포인터 위치를 미세하게 조정할 수 있다.In Patent Publication No. 2009-0094360 (Microsoft), while maintaining a finger in contact with the screen, the user guides the position of the pointer based on visual feedback provided by the callout, and the user guides the screen to the screen. You can precisely move and fine-tune the pointer position by moving your finger on the surface of the screen until it is over a copy of the desired target displayed in the non-occluded area of the screen.
일본공개특허 제1994-51908호(소니)에서는 터치위치로부터 소정 방향에 소정거리를 두는 위치에 포인터를 표시한다.In Japanese Patent Laid-Open No. 1944-51908 (Sony), a pointer is displayed at a position spaced a predetermined distance from a touch position in a predetermined direction.
즉 종래 기술들은 터치위치와 선택위치가 동일할 경우 손가락으로 가려져 정확한 선택위치가 선택되었는지를 확인할 수 없으므로 이를 확인하기 위해 터치위치와 다른 위치에 포인터 또는 선택위치를 디스플레이하는 것에 의해 정확한 선택을 한다.That is, in the prior arts, when the touch position and the selection position are the same, it is impossible to determine whether the correct selection position is selected by a finger, so that the correct selection is made by displaying a pointer or a selection position at a position different from the touch position to confirm this.
이들 장치는 통상적으로 터치스크린 장치에서 터치 후 터치상태를 유지하면서 원하는 선택위치로 포인터를 이동시켜(드래그) 포인터를 선택위치에 고정한 상태에서 터치를 해제하는 동작에 의해 선택위치의 선택명령이 수행된다.These devices typically perform a selection command of a selection position by moving the pointer to a desired selection position (drag) and then releasing the touch while holding the pointer at the selection position while maintaining the touch state after the touch screen device. .
그러나 이동 중에 원치 않게 터치가 떨어지면 터치가 떨어진 위치에서 선택명령이 수행되므로 원하지 않는 객체가 선택되어 실행되는 문제가 발생되었다.However, if the touch is undesirably dropped during the movement, the selection command is performed at the touched location, which causes an unwanted object to be selected and executed.
다시 말하면 기존에 사용되는 멀티스크린 방식은 손가락을 그 자체로 포인터로 사용하기 때문에 많은 클릭확인의 실수를 가져오고 잦은 오류를 발생시켜서 이용자의 인터넷 검색이나 작업수행이 어려움을 가져오고 있으며 손가락이 굵거나 섬세하지 않다면 GUI객체에 표시되는 작은 영역을 선택함에 있어서 두 손가락을 이용하여 줌과 줌 아웃을 실행한 이후에 작은 링크영역을 선택하게 된다. 이런 방식은 기존에 만들어진 웹 환경에서 더욱 자연스럽게 터치 방식으로 GUI객체의 선택이 용이하지 않다는 문제가 있고 심각하게는 제스처방식의 제어가 동작하도록 보통사람의 손가락이 터치하는 면적에 맞추어 새로운 앱을 만들어야 하는 어려움을 초래한다.In other words, the existing multi-screen method uses a finger as a pointer itself, which causes a lot of mistake in clicking confirmation and frequent errors, making it difficult for users to search the Internet or perform tasks. If not, you can select a small link area after zooming out and zooming out using two fingers in selecting a small area displayed on the GUI object. This method has a problem that it is not easy to select GUI objects with a more natural touch method in the existing web environment, and seriously, a new app needs to be made to fit the area of the average finger's finger to operate the gesture-based control. It causes difficulties.
SUMMARY OF THE INVENTION: An object of the present invention is to provide a touch screen device and a control method thereof that improve the convenience of touch input, thereby solving the above problems of the related art.
Another object of the present invention is to provide a touch screen device and a control method thereof capable of accurately selecting and controlling an object.
Another object of the present invention is to provide a touch screen device and a control method capable of controlling the display position of the pointer adaptively to the physical environment of the screen during touch movement.
Still another object of the present invention is to provide a touch screen device and a control method having zoom-in and zoom-out functions linked to the pointer.
Another object of the present invention is to provide a touch screen device and a control method having a virtual indicator bar function linked to the pointer.
Another object of the present invention is to provide a touch screen device and a control method having a virtual touch ball function linked to the pointer.
Another object of the present invention is to provide a touch screen device and a control method capable of multi-touch, multi-pointer control.
To achieve the above objects, the control method of the present invention, in a touch screen device having a display panel and a touch panel, recognizes a touch position corresponding to a finger touch (or touch-down) on the touch panel; displays a pointer in a region of the display panel that is a predetermined distance away from the recognized touch position and is not visually obscured by the finger; when the finger moves while remaining in contact at the touch position, moves the pointer in conjunction with the finger movement while maintaining the predetermined distance; when the finger is released (touch-up) with the pointer positioned over the object to be selected, places that object in a selection-wait state; and, when a selection command is input during the selection-wait step, activates the waiting object.
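The sequence above can be sketched as a small controller. This is a minimal illustration, not the claimed implementation: the class and method names, the offset value, and the timeout are hypothetical, and the pointer is placed straight above the finger for simplicity.

```python
import time

class OffsetPointerController:
    """Sketch of the claimed method: on touch-down a pointer appears a
    fixed distance d above the finger so the finger never hides it; it
    follows the finger during touch-move; touch-up arms the object under
    the POINTER (not under the finger); a selection command input within
    a timeout activates the armed object."""

    def __init__(self, d=60, wait_s=2.0):
        self.d = d              # touch-to-pointer offset in pixels (assumed)
        self.wait_s = wait_s    # selection-wait timeout (assumed)
        self.pointer = None     # (Px, Py) while the finger is down
        self.armed_at = None    # (position, arm time) after touch-up

    def touch_down(self, tx, ty):
        # Screen y grows downward, so "above the finger" is ty - d.
        self.pointer = (tx, ty - self.d)

    def touch_move(self, tx, ty):
        self.pointer = (tx, ty - self.d)   # pointer tracks at a fixed offset

    def touch_up(self):
        # Arm whatever lies under the pointer's final position.
        self.armed_at = (self.pointer, time.monotonic())
        self.pointer = None

    def selection_command(self):
        # A later tap or key input activates the armed object, but only
        # inside the selection-wait window; afterwards the arm is cleared.
        if self.armed_at is None:
            return None
        pos, t0 = self.armed_at
        self.armed_at = None
        if time.monotonic() - t0 < self.wait_s:
            return pos          # caller activates the object at pos
        return None             # window expired: pointer is simply erased
```

The essential point of the method survives even in this sketch: `touch_up` records the pointer position, not the finger position, so an accidental release never selects the object under the finger.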
In the present invention, the selection-wait step is preferably limited to a fixed period after the touch release, and if no selection command is input before that period elapses, the displayed pointer is erased.
The selection command may be generated by at least one of a key input or a touch action different from the touch action used to display the pointer (a touch-down, touch click, or double touch click).
In the present invention, the pointer operation mode is conveniently activated by touching a pointer mode selection button displayed on the touch screen, which allows it to be used alongside the existing gesture operation mode.
In the present invention, the pointer movement display step preferably varies either the distance or the direction between the touch point and the pointer display point according to how far the touch movement has entered a blind zone set near an edge of the touch screen.
The pointer movement display step may also vary either the distance or the direction between the touch point and the pointer display point according to the distance between the touch point and the lower edge of the touch screen, or vary the distance between the touch point and the pointer display point according to the distance between the touch point and the center point of the touch screen.
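One plausible reading of the lower-edge rule can be sketched as follows. The scaling rule, the margin, and the default offset are all assumptions for illustration; the patent only states that distance or direction varies with proximity to the edge.

```python
def edge_adaptive_offset(ty, screen_h, d_max=60.0, margin=80.0):
    """Sketch (constants hypothetical) of an edge-adaptive offset: the
    pointer normally floats d_max px from the finger, but as the touch
    nears the bottom edge the offset shrinks linearly so the pointer can
    still reach objects at the very bottom of the screen."""
    dist_to_bottom = screen_h - ty          # px between touch and bottom edge
    if dist_to_bottom >= margin:
        return d_max                        # outside the blind zone: fixed offset
    # Inside the blind zone: scale the offset with the remaining distance.
    return d_max * max(dist_to_bottom, 0.0) / margin
```

A center-point variant would simply substitute the distance to the screen center for `dist_to_bottom`.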
In the present invention, a virtual indicator bar mode may be performed in which, within an indicator bar control zone, the distance between a reference point and the touch point is calculated, and at least one of the distance or the direction between the touch point and the display point of the indicator bar is varied according to the calculated reference-point distance.
According to the present invention, in a touch screen device that displays the pointer at a set position spaced a predetermined distance from a first touch point in a set direction, detecting a touch of the pointer display area by the other hand while one hand remains touching the first touch point, moving the pointer in conjunction with the touch movement of the other hand, and, upon touch-up of the other hand, updating the distance and direction between the first touch point and the second touch point where the touch-up occurred as the new set position of the pointer makes setting the display direction and distance of the pointer very easy.
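The two-hand setting step reduces to one vector computation at touch-up. A sketch, with the angle convention (counterclockwise from the horizontal, screen y growing downward) assumed from the pointer geometry described elsewhere in this document:

```python
import math

def update_pointer_offset(first_touch, second_touch):
    """Sketch of the two-hand offset setting: while one hand holds the
    first touch point, the other hand drags the pointer and lifts at the
    second touch point; the vector between the two points becomes the
    pointer's new offset, returned as (distance d, azimuth theta in
    degrees counterclockwise from the horizontal)."""
    (x1, y1), (x2, y2) = first_touch, second_touch
    d = math.hypot(x2 - x1, y2 - y1)
    # Screen y grows downward, so "up" is y1 - y2.
    theta = math.degrees(math.atan2(y1 - y2, x2 - x1)) % 360
    return d, theta
```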
According to the present invention, in a touch screen device that displays the pointer at a set position spaced a set distance from the touch point in a set direction, displaying a magnify or reduce symbol in the display area of the pointer, detecting a touch on the display area of the pointer, executing the magnification or reduction corresponding to the displayed symbol in response to the touch, toggling the symbol in the pointer display area between the magnify and reduce symbols, and repeating from the touch-detection step makes the zoom-in/zoom-out interface very convenient.
In the present invention, displaying a virtual touch ball on the touch screen, detecting the direction and speed at which the touch ball is rotated by touch within the virtual touch ball display area, moving the displayed image according to the detected rotation direction and speed, and, when a touch-up is detected during the image movement, gradually decelerating the image along the direction of movement at the moment of touch-up until it stops, makes moving images on the touch screen very convenient.
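The gradual deceleration after touch-up can be sketched as a per-frame friction loop. The friction factor, frame rate, and stop threshold are hypothetical values, not taken from the document:

```python
def decelerate(vx, vy, friction=0.9, dt=1.0 / 60, eps=1.0):
    """Sketch (constants hypothetical) of the touch-ball fling: after
    touch-up the image keeps the velocity it had at release (vx, vy in
    px/s) and is slowed by a constant friction factor each frame until
    its speed drops below eps. Yields per-frame image displacements."""
    while (vx * vx + vy * vy) ** 0.5 > eps:
        yield (vx * dt, vy * dt)   # move the image by v*dt this frame
        vx *= friction             # exponential decay of the fling speed
        vy *= friction
```

The direction of the yielded displacements never changes, matching the requirement that the image decelerates along the direction of movement at the moment of touch-up.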
In the present invention, detecting multiple touches, inferring the lengthwise direction of the touching fingers from the detected multiple touch points, displaying a pointer for each touch point at a predetermined distance extended from that touch point along the inferred finger direction, and displaying and controlling the multiple pointers linked to the multiple touch points as one interlocked group makes touch screen control convenient.
Here, the inference step preferably calculates the bisection point that bisects the straight line between the two most widely separated of the detected touch points, extends a vertical line of a certain length from the bisection point in one direction to obtain a reference point where it meets the horizontal line, calculates the azimuth of each touch point counterclockwise from the horizontal about the reference point, and determines the extension line from the reference point through each touch point as the lengthwise direction of the finger corresponding to that touch point.
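The inference step above is a short geometric computation. In this sketch the drop length of the vertical line is a hypothetical constant, and "one direction" is assumed to mean toward the palm (downward in screen coordinates):

```python
import math

def infer_finger_directions(points, drop=120.0):
    """Sketch of the claimed inference: bisect the segment between the
    two most widely separated touch points, move the bisection point a
    fixed length (drop, hypothetical) toward the palm to get a reference
    point, and treat the ray from that reference point through each
    touch point as that finger's lengthwise direction. Returns one
    azimuth per touch point, in degrees counterclockwise from the
    horizontal about the reference point."""
    # Find the two farthest-apart touch points.
    (ax, ay), (bx, by) = max(
        ((p, q) for i, p in enumerate(points) for q in points[i + 1:]),
        key=lambda pq: math.dist(*pq))
    mx, my = (ax + bx) / 2, (ay + by) / 2     # bisection point
    rx, ry = mx, my + drop                    # reference point (y grows down)
    # Azimuth of each touch point about the reference point.
    return [math.degrees(math.atan2(ry - py, px - rx)) % 360
            for px, py in points]
```

Each pointer is then displayed a set distance beyond its touch point along the returned azimuth, so all fingers of one hand get pointers that fan out in the direction the fingers actually point.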
In the present invention, in a touch screen device having a display panel and a touch panel, it is also possible that, when the pointer mode is selected, a pointer is displayed at an arbitrary first position on the touch screen; a touch position corresponding to a finger touch (or touch-down) on the touch panel is recognized; the pointer displayed at the first position jumps to a specific second position in a region of the display panel that is a predetermined distance away from the recognized touch position and is not visually obscured by the finger; when the finger moves while remaining in contact at the touch position, the pointer moves with it while maintaining the predetermined distance; releasing the finger (touch-up) with the pointer positioned over the object to be selected places the object in a selection-wait state; and a selection command input during the selection-wait step activates the waiting object.
The apparatus of the present invention includes a touch panel or sensor unit; a display panel installed to overlap beneath the touch panel; an input/output unit for inputting user key commands; a memory storing a touch screen control program; and a control unit that executes the touch screen control program stored in the memory. The touch screen control program recognizes a touch position corresponding to a finger touch (or touch-down) on the touch panel, displays a pointer in a region of the display panel a predetermined distance from the recognized touch position and not visually obscured by the finger, moves the pointer in conjunction with finger movement while maintaining the predetermined distance when the finger moves in contact with the screen, places an object in the selection-wait state when the finger is released (touch-up) with the pointer positioned over that object, and activates the waiting object when a selection command is input during the selection-wait step.
The apparatus of the present invention may be implemented in software, hardware, or a combination thereof.
By using an electronic device employing the method of the present invention, a user can accurately and easily click and confirm a target image object, by means of the pointer, in materials and web environments created for the conventional computing environment. The pointer, available in various shapes and colors, serves as a familiar means of bridging the web environment and the app environment, and its magnification/reduction (zoom-in/zoom-out) functions let the user check the information displayed on screen more quickly and easily. With the automatic orientation of the pointers based on a group of up to five finger touch positions, even on a large touch-sensing device users can easily select GUI objects within reach of their finger(s) on a large touch screen, which was previously difficult when selecting and clicking with a single finger, and multiple users can work jointly using the pointer orientation program. Fast Internet browsing and operation together with the pointer's various functions give the user a sense of familiarity, and this convenience and efficiency will improve the user environment for multi-network communication and electronic information and serve as a means of connecting the existing computer network infrastructure more efficiently.
FIG. 1 is a block diagram of a touch screen control apparatus according to a preferred embodiment of the present invention.
FIG. 2 is a screen state diagram illustrating the configuration of a single pointer according to the present invention.
FIG. 3 is a flowchart illustrating a pointer mode selection process according to the present invention.
FIG. 4 is a flowchart illustrating pointer display and object selection linked to touch actions in the pointer mode according to the present invention.
FIG. 5 is a diagram illustrating the blind zones on the touch screen.
FIG. 6 is a diagram illustrating the operating principle of the display position control of the pointer 150 in each region of the blind zones according to the present invention.
FIG. 7 is a flowchart illustrating the pointer display position control process in the blind zones of the embodiment of FIG. 6.
FIGS. 8A and 8B are photographs showing how the distance between the touch position and the display position changes as the touch approaches the lower edge of the touch screen.
FIG. 9 is a diagram illustrating an embodiment for variably controlling the display position of the pointer 150 over the entire screen area.
FIG. 10 is a flowchart illustrating the pointer display position control process over the entire area in the embodiment of FIG. 9.
FIG. 11 is a diagram illustrating a virtual indicator bar embodiment using the variable display position control technique of the present invention.
FIGS. 12A to 12H are screen state diagrams illustrating the environment settings of the pointer according to the present invention.
FIG. 13 is a diagram illustrating how the direction and distance of the pointer of the present invention are set while touching.
FIG. 14 is a screen state diagram showing a state in which the pointer display area is touched according to the present invention.
FIG. 15 is a flowchart illustrating the magnification/reduction processing program according to the present invention.
FIG. 16 is a screen state diagram showing the virtual controller 170 according to the present invention.
FIG. 17 is a screen state diagram illustrating the virtual touch ball according to the present invention.
FIG. 18 is a flowchart illustrating the virtual touch ball control operation according to the present invention.
FIG. 19 is a diagram illustrating a two-touch, two-pointer configuration according to the present invention.
FIG. 20 is a diagram illustrating a three-touch, three-pointer configuration according to the present invention.
FIG. 21 is a diagram illustrating a five-touch, five-pointer configuration according to the present invention.
FIG. 22 is a flowchart illustrating pointer display and object selection linked to touch actions in the multi-touch, multi-pointer mode according to the present invention.
FIG. 23 is a diagram illustrating a modified embodiment of the multi-touch, multi-pointer mode according to the present invention.
FIG. 24 is a diagram illustrating the principle of calculating the display point azimuth in the modified embodiment.
FIG. 25 is a screen state diagram of a modified embodiment illustrating the display operation of the pointer according to the present invention.
FIG. 26 is a flowchart illustrating pointer display and object selection linked to touch actions according to the modified embodiment.
To achieve the above objects, the present invention will be described in detail with reference to the drawings. The embodiments are provided as examples so that the spirit of the invention can be fully conveyed; the present invention is not limited to the embodiments described below and may be embodied in other forms.
In the present invention, touch screen control apparatuses include electronic devices having a touch screen control function, for example a smart phone with a touch function, a touch-capable monitor of a digital camera or camcorder, a navigation device with a touch function, a portable multimedia player with a touch function, a tablet computer such as an iPad or Galaxy Tab, a touch screen monitor of a personal computer such as a notebook or desktop computer, a smart television with a touch function, and a large wall-mounted display with a touch function.
The terms used in connection with touch actions in the present invention are defined as follows.
Touch-down: the operating state in which a finger contacts the touch panel; the control unit recognizes a touch-down when the touch is held without moving from the initial touch position for a fixed period, for example 300 ms to 500 ms.
Touch release or touch-up: the operating state in which the finger breaks contact with the touch panel.
Touch move: the operating state of moving across the touch panel while the finger remains in contact after a touch-down.
Touch click (tapping): the operating state in which a touch-down and touch-up occur in immediate succession, with the time from touch-down to touch-up kept within a predetermined period shorter than the touch-down recognition time described above, for example 100 ms to 400 ms.
Double touch click (double tapping): the operating state in which two touch clicks are performed in succession.
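The timing definitions above can be expressed as a small classifier. This is an illustrative sketch: the gap allowed between the two taps of a double click is a hypothetical value, since the document does not specify it, and the 300 ms / 400 ms thresholds are taken from the example ranges above.

```python
def classify_touch(down_ms, up_ms, prev_click_end_ms=None,
                   hold_threshold=300, click_threshold=400, double_gap=300):
    """Sketch of the touch-action definitions: classify one down/up pair
    by its timing. Contact shorter than click_threshold is a 'click'
    (or 'double-click' if it starts soon after a previous click ended);
    contact held at least hold_threshold is a 'touch-down'."""
    held = up_ms - down_ms
    if held <= click_threshold:
        if prev_click_end_ms is not None and down_ms - prev_click_end_ms <= double_gap:
            return "double-click"
        return "click"
    if held >= hold_threshold:
        return "touch-down"
    return "none"
```

Note that the example ranges in the definitions overlap (a 350 ms contact falls in both); the sketch resolves this by testing the click threshold first, which is an assumption rather than something the document states.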
FIG. 1 shows a block diagram of the touch screen control apparatus according to the present invention.
Referring to FIG. 1, the touch screen control apparatus 100 includes a touch screen 110, a control unit 120, an input/output unit 130, and a memory unit 140.
The touch screen 110 includes a touch panel 114 on a display panel 112. The display panel 112 is composed of a flat panel display such as an OLED or LED display and displays images for the user interface, the pointer image, objects, and so on. The touch panel 114, as a sensor unit, senses finger touches by a resistive or capacitive method and recognizes the touch position or touch coordinates. The control unit 120 is composed of a microprocessor or microcomputer such as a central processing unit (CPU) and a digital signal processor (DSP); it executes the touch screen control program, generates display data corresponding to the recognized touch position in response to a given command, provides the data to the display panel, and controls the operation of the touch screen as a whole. The input/output unit 130 includes a key input unit, a serial input/output unit, a parallel input/output unit, or a wired/wireless communication module; it receives key command signals or external data signals and provides them to the control unit 120, or outputs data generated by the control unit 120 to the outside. The memory unit 140 includes a cache memory, DRAM, SRAM, flash memory, magnetic disk storage, or optical disk storage; it stores the touch screen control program and the data generated by the control unit 120.
The control unit 120 operates under a graphical user interface (hereinafter, GUI) system and displays and controls GUI objects on the display panel 112. A GUI object or image object means a unit that can undergo image displacement and deformation and on which image processing can be performed. For example, a GUI object may be an icon, a desktop, or a window for an application program (e.g., Word, Excel, PowerPoint, Internet Explorer) and may be displayed over the whole or part of the screen of the touch screen 110; objects selected by the user include images, videos, and music. UI elements include the icons, button boxes, pointers, virtual touch ball, and virtual indicator bar provided by GUI objects.
FIG. 2 is a screen state diagram illustrating the configuration of the pointer according to the present invention.
Referring to the figure, the pointer 150 of the present invention is a translucent circle containing a "+" (plus) sign, and the intersection of the plus sign marks the display position 152 (Px, Py).
The display position 152 of the pointer 150 is spaced a distance d from the touch position 154 (Tx, Ty) and is displayed at an azimuth θ of 90° from the horizontal, i.e., in the 12 o'clock direction.
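The geometry of FIG. 2 amounts to one polar-to-Cartesian conversion. A sketch, assuming the usual screen convention that y grows downward (which is why the sine term is subtracted):

```python
import math

def display_position(tx, ty, d, theta_deg=90.0):
    """Sketch of the FIG. 2 geometry: the display position (Px, Py) lies
    at distance d from the touch position (Tx, Ty), at azimuth theta
    measured counterclockwise from the horizontal. With screen y growing
    downward, theta = 90 deg places the pointer directly above the
    finger (the 12 o'clock direction)."""
    theta = math.radians(theta_deg)
    return tx + d * math.cos(theta), ty - d * math.sin(theta)
```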
Any display area not obscured by the finger suffices for the display position 152 of the pointer 150. The shape of the pointer 150 is not limited to the illustrated circle and may be varied in form, color, translucency, or transparency, but it contains a "+" plus sign inside.
A pointer mode selection button 160 is displayed translucently or transparently at the lower left of the display panel. The pointer mode selection button 160 is a circle containing the letter "P" and is displayed in different colors in the normal mode and the pointer mode to indicate the selected mode.
The pointer mode selection button 160 is not fixed to the lower left and may be placed in any area of the display panel via the environment settings. The pointer mode selection button 160 serves to distinguish the pointer mode so that it can be used alongside the existing gesture control scheme. When the pointer mode selection button 160 is touched, the control unit 120 switches from the normal mode to the pointer mode. In the pointer mode, other control schemes, such as gesture control, are suspended.
A menu bar 162 may also be displayed at the bottom of the screen. The menu bar 162 includes a single/multi mode button 162a, an environment settings button 162b, a virtual controller button 162c, a virtual indicator bar button 162d, and a virtual touch ball button 162e.
FIG. 3 shows a flowchart illustrating the pointer mode selection process according to the present invention.
Referring to FIG. 3, in the normal mode standby state (S102), the control unit 120 checks whether display of the pointer mode selection button is enabled and, if so, displays the pointer mode selection button 160 at the lower left of the screen as shown in FIG. 2 (S102 to S106). When the pointer mode selection button 160 is selected (S108), the control unit 120 switches the color of the letter "P" to indicate the pointer mode (S110). In the pointer mode, it checks for erasure of the pointer 150 or expiration of the set time (S112, S114). If the pointer is erased from the screen in step S112 or the set time elapses in step S114, the color of the letter "P" is restored to its original color, indicating that the pointer mode is released and the system is back in the normal mode (S116).
FIG. 4 is a flowchart illustrating pointer display and object selection linked to touch actions in the pointer mode according to the present invention.
Referring to FIG. 4, when the user touches the pointer mode selection button 160, the controller 120 first retrieves the pointer setting data stored in the memory unit 140 (distance value, azimuth, size, transparency, shape, and color data), composes the pointer graphic image, and switches the system to the pointer standby mode (S122). When a finger touch-down is then detected on the touch panel 114 (S124), the controller 120 applies the set distance value and azimuth to the touch position 154 (Tx0, Ty0) to calculate the display position 152 (Px0, Py0), and displays the prepared pointer graphic image at the calculated display position (S126). If the finger is moved while the touch-down state is maintained (S104), the controller 120 calculates the display position value in real time in response to the change of the touch position value, so that the pointer 150 is displayed while keeping the set distance and set azimuth. The controller then detects the touch-up operation state in which the finger movement stops and the finger contact leaves the touch panel (S128). When a touch-up is detected in step S128, the controller 120 calculates as the selection position not the current touch position but the current position 156 (Px1, Py1) of the pointer 150. The controller 120 places the object 157 corresponding to the selection position in the selection standby state (S130). The controller 120 then checks whether a selection command such as a touch-down or touch click is input in the selection standby state (S132), and checks whether the set selection waiting time has elapsed (S134). When a selection command such as a touch-down or touch click is input in step S132, the object 157 waiting for selection is activated (S136) and the process returns to step S122. In step S134, if no selection command is input before the set time elapses, the controller 120 automatically returns to step S122.
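As a hedged sketch (not the patent's actual firmware), the display position calculation of steps S124 to S126 — offsetting the pointer from the touch point by a set distance and azimuth — can be written as follows; the function name and the convention that the azimuth is measured counterclockwise from the screen horizontal, with screen y growing downward, are assumptions for illustration:

```python
import math

def display_position(tx, ty, d0, azimuth_deg=90.0):
    """Offset the pointer display point from the touch point (tx, ty)
    by distance d0 at the given azimuth, measured counterclockwise
    from the screen horizontal (screen y grows downward)."""
    theta = math.radians(azimuth_deg)
    px = tx + d0 * math.cos(theta)
    py = ty - d0 * math.sin(theta)  # minus sign: screen y axis points down
    return px, py

# With the default azimuth of 90 degrees the pointer sits d0 pixels above the finger.
```

With (Tx0, Ty0) = (200, 300) and d0 = 60 this places the pointer at (200, 240), directly above the touch point, so the finger never covers it.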
Here, the pointer mode selection button 160 may also operate so that it disappears automatically once the selected object is activated. This is because, outside of a word composition mode, selection operations are more convenient with the pointer while screen manipulation is more convenient with direct touch control.
In addition, when the pointer standby mode is executed, it is preferable that specific touch control commands (for example, up/down/left/right scrolling control) be temporarily suspended so that the entire screen or the entire image area remains fixed.
In the embodiment described above, the touch position and the pointer display position need not be fixed at the set values during the pointer display and movement process of step S126; they can be controlled to vary automatically within specific predefined regions.
FIG. 5 is a view for explaining the blind zone on the touch screen, and FIG. 6 is a view for explaining a preferred embodiment of the display position control of the pointer 150 in each area of the blind zone.
In FIG. 5, because of the thickness of the finger, the overall blind zone 180 contains areas along the edges of the touch screen 110 in which objects smaller than the finger width cannot be touched directly.
The overall blind zone 180 is set to a width D measured from the top, bottom, left, and right boundary lines to the screen edges, and is divided into an upper-edge region 182, a lower-edge region 184, a left-edge region 186, a right-edge region 188, an upper-left corner region 181, an upper-right corner region 183, a lower-left corner region 185, and a lower-right corner region 187.
When the touch position enters the blind zone 180 through a touch movement, the distance d0 to the display position and the azimuth θ = 90° are controlled as follows.
In the upper-edge region 182, the vertical distance d1 of the pointer 150 is determined by the following equation.
d1 = d0 − (C1 × S1)
Here, C1 is the coefficient of the upper-edge region and S1 is the vertical distance by which the touch has entered the upper-edge region from the blind zone boundary. Thus, in the upper-edge region 182 the azimuth keeps its set value while the vertical distance shrinks as the entry distance S1 grows.
In the lower-edge region 184, the vertical distance d1 of the pointer 150 is determined by the following equation.
d1 = d0 − (C2 × S2)
Here, C2 is the coefficient of the lower-edge region and S2 is the vertical distance by which the touch has entered the lower-edge region from the blind zone boundary. Thus, as in the upper-edge region 182, the vertical distance in the lower-edge region 184 shrinks as the entry distance S2 grows.
In the left-edge region 186, the horizontal distance dL of the pointer 150 is determined by the following equation.
dL = C3 × S3
Here, C3 is the coefficient of the left-edge region and S3 is the horizontal distance by which the touch has entered the left-edge region from the blind zone boundary. The vertical distance is kept at d0, but the horizontal distance dL varies.
The azimuth θL is therefore determined by the following equation.
θL = 90° + arctan(dL / d0)
In the upper-left corner region 181, the vertical and horizontal distances described above for the upper-edge and left-edge regions each shrink with the corresponding vertical and horizontal entry components, so both the distance and the azimuth are variably controlled.
In the same manner, in the lower-left corner region 185, the vertical and horizontal distances described above for the lower-edge and left-edge regions each shrink with the corresponding vertical and horizontal entry components, so both the distance and the azimuth are variably controlled.
In the right-edge region 188, the horizontal distance dR of the pointer 150 is determined by the following equation.
dR = C4 × S4
Here, C4 is the coefficient of the right-edge region and S4 is the horizontal distance by which the touch has entered the right-edge region from the blind zone boundary. The vertical distance is kept at d0, but the horizontal distance dR varies.
The azimuth θR is therefore determined by the following equation.
θR = 90° − arctan(dR / d0)
In the upper-right corner region 183, the vertical and horizontal distances described above for the upper-edge and right-edge regions each shrink with the corresponding vertical and horizontal entry components, so both the distance and the azimuth are variably controlled.
In the same manner, in the lower-right corner region 187, the vertical and horizontal distances described above for the lower-edge and right-edge regions each shrink with the corresponding vertical and horizontal entry components, so both the distance and the azimuth are variably controlled.
FIG. 7 is a flowchart illustrating the pointer display position control process in the blind zone of the embodiment of FIG. 6.
Referring to FIG. 7, the controller 120 receives the touch position value (S142), and when the input touch position value enters the range of the set blind zone 180 (S144), checks whether it lies in the upper- or lower-edge region (S146), in the left- or right-edge region (S150), or in one of the four corner regions (S154). If it is the upper- or lower-edge region in step S146, the distance value is varied according to the entry distance from the blind zone boundary (S148). If it is the left- or right-edge region in step S150, the azimuth is varied according to the entry distance from the blind zone boundary (S152). If it is one of the four corner regions in step S154, the distance value and the azimuth are varied simultaneously according to the entry distance from the blind zone boundary (S156). The distance or azimuth data calculated in each of steps S148 to S156 is passed by the controller 120 to the pointer display position control program and reflected in the screen coordinate values of the pointer image.
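A minimal Python sketch of the blind zone adjustment of FIGS. 6 and 7, combining the edge formulas d1 = d0 − C·S and dL/dR = C·S above (the corner cases fall out naturally when both a vertical band and a horizontal band apply); the function name, the signed-offset formulation, and the default coefficients are illustrative assumptions:

```python
import math

def blind_zone_offset(tx, ty, width, height, D, d0,
                      c1=0.8, c2=0.8, c3=0.8, c4=0.8):
    """Return (distance, azimuth_deg) of the pointer offset near the edges.
    The vertical component shrinks inside the top/bottom bands of width D;
    a horizontal component dL/dR grows inside the left/right bands, tilting
    the azimuth away from the default 90 degrees (FIGS. 6 and 7)."""
    d1 = d0
    if ty < D:                        # upper-edge band: S1 = D - ty
        d1 = d0 - c1 * (D - ty)
    elif ty > height - D:             # lower-edge band: S2 = ty - (height - D)
        d1 = d0 - c2 * (ty - (height - D))
    dh = 0.0                          # signed horizontal component
    if tx < D:                        # left band: dL = C3 * S3 (pointer shifts right)
        dh = c3 * (D - tx)
    elif tx > width - D:              # right band: dR = C4 * S4 (pointer shifts left)
        dh = -c4 * (tx - (width - D))
    distance = math.hypot(dh, d1)
    azimuth = 90.0 + math.degrees(math.atan2(dh, d1))
    return distance, azimuth
```

Outside the blind zone this reduces to the set pair (d0, 90°); in the left band the azimuth grows past 90° exactly as θL = 90° + arctan(dL/d0) prescribes, with d1 standing in for d0 inside a corner.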
In this manner, in the blind zone 180 the distance and azimuth between the pointer display position and the touch position are automatically and adaptively controlled according to the entry distance, so objects in the blind zone can be selected easily.
In the embodiment described above, the blind zone was partitioned into four edge regions and four corner regions to control the distance value and the azimuth, but the invention is not limited to this; any one or more of these regions may be used in combination. For example, in the simplest control scheme only the lower-edge region may be set as the blind zone and only the vertical distance variably controlled. FIGS. 8A to 8C are photographs showing how the distance between the touch position and the display position according to the present invention changes as the touch approaches the lower edge.
The embodiment described above covers only a single touch; when two or more touches are present at the same time, the blind zone variable control of the display point is applied identically and independently to each touch point.
Moreover, in the present invention the pointer display position need not be variably controlled only in the blind zone; the variable control scheme can be extended to the entire screen area.
FIG. 9 is a view for explaining an embodiment for variably controlling the display position of the pointer 150 over the entire screen area.
In FIG. 9, the pointer display point corresponding to the touch point T0 (0, -T0), located the set distance d0 from the center point TS0 of the touch screen 110 in the -y direction, is taken as the reference display point (0, 0). Let r be the radius of the virtual circle 190 in which the touch screen 110 is inscribed, let q be the distance between the origin (0, 0) and a touch point T1, and let δ = q / r be the ratio of the touch point distance to the radius; then, as summarized in Table 1 below, the distance and direction of the pointer display point from the touch point can be variably controlled according to the ratio of the distance from the origin (0, 0) to the touch point. In the figure, the dotted circles marked with an X are the touch points (T1, T2, T3, T4), the red circles marked with a plus sign are the pointer display points (P1, P2, P3, P4), and the counterclockwise azimuths of the touch points T1, T2, T3, and T4 measured from the horizontal are denoted θT1, θT2, θT3, and θT4, respectively.
<Table 1>
Figure PCTKR2011008568-appb-I000001
That is, in the first and second quadrants, as the touch point moves away from the center point, the distance of the display point in the direction of the touch point azimuth increases beyond the set distance d0, and as the touch point approaches the center point the distance converges to the set distance d0.
Conversely, in the third quadrant, as the touch point moves away from the center point, the display point approaches the touch point T3 from the azimuth direction (θT3 − 90°) relative to the touch point, at a distance smaller than the set distance d0; as the touch point approaches the center point (0, 0), the distance converges to the set distance d0.
In the fourth quadrant, as the touch point moves away from the center point, the display point approaches the touch point T4 from the azimuth direction (θT4 − 270°) relative to the touch point T4, at a distance smaller than the set distance d0; as the touch point approaches the center point (0, 0), the distance converges to the set distance d0.
Accordingly, in the third and fourth quadrants, the closer the touch point comes to the lower edge of the touch screen 110, the closer the touch point and the pointer display point approach their shortest separation. This shortest distance must remain at least the minimum distance at which the pointer display point is not hidden by the finger at the touch position.
FIG. 10 is a flowchart illustrating the pointer display position control process over the entire screen area in the embodiment of FIG. 9.
Referring to FIG. 10, the controller 120 receives the touch position value (S162) and calculates the distance q and the azimuth between the input touch point and the center point (S164). Using the calculated azimuth of the touch point, it checks in which of the first to fourth quadrants the touch point is located (S166 to S172). If the touch point is in the first quadrant in step S166, the azimuth of the display point P1 is calculated from the first-quadrant display point distance formula of Table 1, relative to the touch point T1 (S167). If it is in the second quadrant in step S168, the azimuth of the display point P2 is calculated from the second-quadrant display point distance formula of Table 1, relative to the touch point T2 (S169). If it is in the third quadrant in step S170, the azimuth of the display point P3 is calculated from the third-quadrant display point distance formula of Table 1, relative to the touch point T3 (S171). If it is in the fourth quadrant in step S172, the azimuth of the display point P4 is calculated from the fourth-quadrant display point distance formula of Table 1, relative to the touch point T4 (S173).
The distance or azimuth data of the display points P1 to P4 calculated in steps S167 to S173 is passed by the controller 120 to the pointer display position control program and reflected in the screen coordinate values of the pointer image.
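Table 1 survives in this text only as an image placeholder, so its exact formulas cannot be reproduced here. The sketch below is therefore only a stand-in under an assumed linear form, d = d0·(1 + k·δ) in the first and second quadrants and d = d0·(1 − k·δ) in the third and fourth, which matches the qualitative behavior described above: growth away from the center in quadrants 1 and 2, shrinkage toward the touch point in quadrants 3 and 4, and convergence to d0 at the center:

```python
def whole_screen_distance(q, r, d0, quadrant, k=0.5):
    """Scale the pointer offset by the normalized center distance
    delta = q / r (FIGS. 9 and 10). The linear form and the factor k
    are assumptions; the patent's Table 1 is not legible in this text."""
    delta = q / r
    if quadrant in (1, 2):              # offset grows as the touch moves outward
        return d0 * (1.0 + k * delta)
    return d0 * (1.0 - k * delta)       # quadrants 3 and 4: offset shrinks
```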
In typical Internet screen layouts, frequently used objects such as selection menus and icon buttons tend to be placed at the top and along the left and right sides, while less frequently used objects tend to be placed at the bottom. Furthermore, since ergonomic, visual, and behavioral habits place the more important objects at the top and on the left, it is preferable to select object areas with the pointer while moving from bottom to top, the direction of motion in which the important parts are not covered by the finger.
FIG. 11 is a view for explaining a virtual indicator rod embodiment that uses the variable display position control technique of the present invention.
Referring to FIG. 11, in the virtual indicator rod embodiment an indicator rod zone 192 is set near the right edge or the lower-right corner of the touch screen 110, the distance from the touch point T to the display point P is variably controlled so that the touch positions in the indicator rod zone 192 correspond to positions over the entire area of the touch screen 110, and the pointer image is displayed as a pointing image 194. For example, the control is based on the ratio between the horizontal distance 192b from the right edge 192a of the indicator rod control zone 192 to the touch point T and the horizontal distance 110b from the right edge 110a of the touch screen 110 to the display point P of the pointing image 194. That is, the farther the touch point moves from the right edge, the larger the distance between the touch point T and the display point P; the closer it approaches the right edge, the smaller that distance. The azimuth of the display point relative to the touch point may be fixed at a specific value, or may be variably controlled in a manner similar to the blind zone or whole-screen schemes described above.
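The proportional mapping of the indicator rod zone onto the full screen width can be sketched as below; only the horizontal coordinate is mapped, as in the example around FIG. 11, and the parameter names are illustrative assumptions:

```python
def rod_display_x(touch_x, zone_right, zone_width, screen_right, screen_width):
    """Map a touch x inside the indicator rod zone 192 to a display x on the
    full screen so that (distance from the zone's right edge : zone width)
    equals (distance from the screen's right edge : screen width)."""
    s = (zone_right - touch_x) / zone_width  # 0.0 at the zone's right edge, 1.0 at its left
    return screen_right - s * screen_width
```

A touch at the zone's right edge lands the pointing image at the screen's right edge, and a touch at the zone's left edge reaches the screen's far left, so a small corner zone sweeps the entire screen width.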
When the configuration button 162d of the menu bar 162 is selected, the controller 120 sets the distance variable control mode to the virtual indicator rod control mode.
The virtual indicator rod control technique of the present invention is very useful in technical fields such as wall-mounted large touch screen devices, touch televisions, touch screen electronic blackboards, and large touch screen monitors.
FIGS. 12A to 12H are screen state diagrams for explaining the configuration of the pointer according to the present invention.
When the configuration button 162b of the menu bar 162 is selected, the controller 120 executes the pointer configuration module so that the various parameters of the pointer can be set.
When the pointer configuration menu is selected, the display panel of the touch screen 110 shows a screen with a pull-down menu for setting the direction and distance of the pointer (FIG. 12A), and in response to the pull-down menu selection of FIG. 12A displays a direction setting screen (FIG. 12B) and a distance setting screen (FIG. 12C). The desired direction and distance are set on each setting screen. The initial configuration screen also provides a pull-down menu for setting the pointer size, transparency, color, shape, and so on (FIG. 12D); in response to the pull-down menu selection, a color setting screen (FIG. 12E), a size setting screen (FIG. 12F), a transparency setting screen (FIG. 12G), and a shape setting screen (FIG. 12H) are displayed. The desired pointer size, transparency, color, and shape are set on each setting screen.
FIG. 13 is a view for explaining a method of setting the direction and distance of the pointer of the present invention directly in the touch state.
Referring to FIG. 13, when the touch screen 110 is touched with the right index finger FR1, the pointer 150 is displayed at the distance and in the direction set in the configuration. If, while the pointer 150 is displayed in this way, the pointer display area is touched with the left index finger FL1, the controller 120 recognizes the additional touch position. When another touch is recognized in the pointer display area while the pointer is displayed, the controller 120 automatically switches to the pointer distance and direction configuration mode and executes the pointer configuration module. The pointer configuration module tracks the touch movement of the finger 172; after the pointer 151 has been placed at the desired position and in the desired direction, when the finger 172 is touched up (released), the module calculates the distance between the position of the newly placed pointer and the touch position of the finger 170, and replaces the set values with the new distance and direction values.
Accordingly, the display position of the pointer can be changed to an arbitrary distance and direction around the touch position directly in the touch state, without selecting the configuration menu, which increases convenience of use.
FIG. 14 is a view showing a state in which the pointer display area is touched according to the present invention.
As shown in FIG. 14, when the display area of the pointer 150 is touched, the coordinates of the touch position 158 are provided to the controller 120, which executes the enlargement/reduction processing program and displays the "+" symbol inside the pointer 150 alternately with a "-" symbol. In the "+" symbol state, the enlargement function is performed in response to the touch operation and the "+" symbol inside the circle is inverted to a "-" symbol. Conversely, in the "-" symbol state, the reduction function is performed in response to the touch operation and the "-" symbol inside the circle is inverted to a "+" symbol.
FIG. 15 is a flowchart for explaining the enlargement/reduction processing program according to the present invention.
Referring to FIG. 15, in the pointer standby state (S182) the controller 120 checks whether a touch-down 154 is detected (S184). When a touch-down is detected in step S184, the pointer image 150 including the enlargement ("+") or reduction ("-") symbol is displayed, as described above, at the display position 152 separated from the touch position 154 by the set distance (S186). A touch-up is then detected (S188); once the touch-up is detected, the controller checks whether a re-touch is input and whether the re-touch position 158 is a touch-down in the pointer display area (S190) or a touch click (S198). If it is a touch-down in step S190, enlargement or reduction is executed continuously at a fixed rate according to the displayed symbol (S192). For example, with the "+" symbol the screen image is enlarged in 10% steps from its original size; conversely, with the "-" symbol the screen image is reduced in 10% steps from its original size. This process continues until a touch-up is detected. When a touch-up is detected in step S194, the enlargement or reduction symbol is inverted and the process returns to step S190 (S196). That is, a "+" symbol present in step S190 is inverted to a "-" symbol after step S196.
When a touch click is detected in step S198, only a single-step enlargement to the set enlargement or reduction ratio, for example 200%, is executed (S199). After this basic enlargement in step S199, step S196 is performed to invert the pointer symbol from "+" to "-", and the process returns to step S190. Consequently, if there is another touch click in step S198, the screen image, already enlarged to 200%, is restored to the original 100% screen image.
Thus, in the present invention, touches in the pointer display area are distinguished as touch-downs or touch clicks: a touch-down enables continuous enlargement or reduction control, while a touch click enables rapid two-step control between 200% enlargement and restoration to 100%.
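The touch-down versus touch click behavior of FIG. 15 amounts to a small state machine; a hedged sketch (the class and method names are assumptions, with the 10% step and the 200%/100% pair taken from the example values above):

```python
class ZoomPointer:
    """'+'/'-' toggle of FIGS. 14 and 15: touch-and-hold zooms continuously
    in 10% steps; a quick touch click jumps between 200% and 100%."""

    def __init__(self):
        self.sign = +1     # +1: '+' (enlarge) symbol shown, -1: '-' (reduce)
        self.scale = 100   # current zoom, percent

    def hold_step(self):
        """One 10% enlargement or reduction step while the finger stays down (S192)."""
        self.scale += self.sign * 10

    def release(self):
        """Touch-up inverts the displayed symbol (S196)."""
        self.sign = -self.sign

    def click(self):
        """Touch click: one-step zoom to 200%, or restore 100% (S199), then invert."""
        self.scale = 200 if self.sign > 0 else 100
        self.sign = -self.sign
```

Two consecutive clicks then cycle 100% → 200% → 100%, matching the two-step behavior described above.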
FIG. 16 is a screen state diagram in which the virtual controller 170 according to the present invention is displayed.
Referring to FIG. 16, when the virtual controller button 162c of the menu bar 162 is selected, the controller 120 sets the system to the virtual controller operation mode and displays the virtual controller 170 together with the pointer 150 near the touch point T. The virtual controller 170 can perform the function of the right button of a conventional mouse. In the figure the virtual controller 170 is displayed only to the right of the touch point, but another virtual controller corresponding to the left mouse button may also be displayed to the left of the touch point T. The virtual controller 170 may be assigned various functions, such as menu selection buttons, in addition to the left or right mouse button functions.
FIG. 17 is a screen state diagram for explaining the virtual touch ball according to the present invention, and FIG. 18 is a flowchart for explaining the virtual touch ball control operation according to the present invention.
Referring to FIG. 17, the virtual touch ball 170 is displayed at the upper left of the touch screen 110 in a form similar to a conventional trackball. The virtual touch ball 170 is preferably displayed transparently or semi-transparently. The virtual touch ball 170 includes an activation area 170a and a surrounding deactivation area 170b. The activation area 170a is the area in which the movement speed and direction are detected from the touch input in order to control screen movement or scrolling in response to touch operations; the deactivation area 170b serves only to detect whether a touch movement enters from outside the area or starts inside it.
The activation area 170a is a sphere-shaped image; when the sphere image is rolled in a desired direction with the finger FL1, the controller 120 responds by moving the screen or object in the direction of the roll. When a touch-down occurs outside the deactivation area 170b and the touch then moves inside, the function of the virtual touch ball 170 is suspended and control remains with the touch movement of the pointer 150. When a touch-down occurs inside the deactivation area 170b and the touch then moves into the activation area 170a, it is recognized as a virtual touch ball operation and a screen movement or screen scroll operation is performed.
도 18을 참조하면, 제어부(120)는 일반모드 대기상태에서(S202) 메뉴바(160)의 가상 터치볼 버튼(160d)이 선택되면(S204) 가상 터치볼(170)을 화면 좌상단에 표시한다(S206). 가상 터치볼(170)의 활성화 영역(170a) 및 비활성화 영역(170b) 내에서 터치다운 및 터치이동을 검출한다(S208). S208단계에서 터치볼 작동이 검출되면 제어부(120)는 가상 터치볼이 굴러가는 방향 및 속도에 응답하여 화면 또는 객체를 해당 방향으로 이동시킨다(S210). 활성화 영역(170a)에서 터치업을 검출하고(S212) 터치업이 검출되면 화면 이동속도를 점차 줄여가면서 정지한 다음에 S206단계로 복귀한다(S214). S208단계에서 가상 터치볼(170)의 비활성화 영역(170b) 밖에서부터 터치다운되어 영역(170b) 안으로 터치 이동을 검출되면(S216) 가상 터치볼의 모드를 일시 중단시킨다(S218). 터치이동이 영역 밖으로 빠져나가거나 터치업되는 것이 검출되면 일시중단모드를 해제하고(S220) S206단계로 복귀한다.Referring to FIG. 18, when the virtual touch ball button 160d of the menu bar 160 is selected (S204) in the normal mode standby state (S202), the controller 120 displays the virtual touch ball 170 on the upper left of the screen. (S206). The touchdown and the touch movement are detected in the active area 170a and the deactivated area 170b of the virtual touch ball 170 (S208). When the touch ball operation is detected in step S208, the control unit 120 moves the screen or object in the corresponding direction in response to the direction and speed at which the virtual touch ball rolls (S210). When the touch-up is detected in the activation area 170a (S212), when the touch-up is detected, the screen is stopped while gradually decreasing the screen moving speed (S214). In operation S208, when a touch movement is detected from outside the inactive area 170b of the virtual touch ball 170 and a touch movement is detected in the area 170b (S216), the mode of the virtual touch ball is suspended (S218). If it is detected that the touch movement is moved out of the area or touched up, the suspend mode is released (S220) and the process returns to step S206.
If no touch is detected in step S216, it is checked whether a set time has elapsed (S224); if it has not elapsed, the virtual touch ball mode is maintained, and if it has elapsed, the process returns to the normal-mode standby state (S224).
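By way of a non-limiting illustration, the mode transitions of FIG. 18 may be sketched as a small state machine. All identifiers (`BallState`, `handle_event`, the event strings) are illustrative and not part of the disclosed embodiment; only the transitions mirror the numbered steps.

```python
from enum import Enum, auto

class BallState(Enum):
    NORMAL_STANDBY = auto()   # S202: normal-mode standby
    BALL_ACTIVE = auto()      # S206: virtual touch ball displayed
    SCROLLING = auto()        # S210: screen follows the rolling ball
    SUSPENDED = auto()        # S218: ball suspended, pointer keeps control

def handle_event(state, event):
    """Return the next state for a (state, event) pair, per FIG. 18."""
    if state == BallState.NORMAL_STANDBY and event == "ball_button":        # S204
        return BallState.BALL_ACTIVE
    if state == BallState.BALL_ACTIVE:
        if event == "drag_inside_active_area":                              # S208
            return BallState.SCROLLING
        if event == "drag_in_from_outside":                                 # S216
            return BallState.SUSPENDED
        if event == "timeout":                                              # S224
            return BallState.NORMAL_STANDBY
    if state == BallState.SCROLLING and event == "touch_up":                # S212/S214
        return BallState.BALL_ACTIVE   # scrolling decelerates, then back to S206
    if state == BallState.SUSPENDED and event in ("drag_out", "touch_up"):  # S220
        return BallState.BALL_ACTIVE
    return state   # unrecognized events leave the state unchanged
```

In use, the touch-event loop would feed classified events into `handle_event` and act on the returned state.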
FIG. 19 is a diagram for explaining a two-touch, two-pointer operation according to the present invention.
Referring to FIG. 19, in order to display the pointers P1 and P2 corresponding to two touch points T1 and T2 on the touch screen 110 at a fixed distance and direction, the azimuth angles of the two touch points T1 and T2 are first obtained.
First, to obtain the azimuth angles of the touch points, the bisecting point 202 of the straight-line distance S between the two touch points T1 and T2 on the touch screen 110 is calculated. A line perpendicular to the segment S is drawn from the bisecting point 202 in one direction (usually toward the bottom of the screen), and the point 210 where it meets a horizontal line 208 (parallel to the screen horizontal) set at a fixed distance from the bisecting point 202 is taken as the reference point. Here, the fixed distance 212 between the bisecting point 202 and the reference point 210 is preferably proportional to the length from the tip of the middle finger to the point where the palm meets the wrist. In particular, it is bounded between the vertical palm length (with the fingers raised as far as possible from the screen during a touch) and the full hand length (with the hand fully spread during a touch).
Extension lines 214 and 216 are drawn from the reference point 210 through the respective touch points T1 and T2, and the azimuth angles θT1 and θT2 of the extension lines 214 and 216 are obtained, measured counterclockwise from the horizontal line 208.
Accordingly, the pointer display points P1 and P2 are calculated as the positions reached by passing from the reference point 210 through the touch points T1 and T2 and continuing a set distance d beyond them, and a pointer image is rendered and displayed at each calculated pointer display point P1, P2. With this pointer display control, each pointer is displayed approximately on the extension line of the corresponding finger's lengthwise direction. In the drawing, 206 is a virtual circle of radius RK; the two touch points T1 and T2 lie at the same distance from the circle's center point 204, that is, on its circumference.
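By way of a non-limiting illustration, the two-touch construction above (bisecting point 202, reference point 210, pointer points a distance d beyond each touch) may be sketched as follows. The function name and parameters `offset` (standing in for the palm-length distance 212) and `d` are illustrative; screen coordinates are assumed with y growing downward.

```python
import math

def two_touch_pointers(t1, t2, offset, d):
    """Place pointers P1, P2 for touches T1, T2 per the FIG. 19 construction."""
    mx, my = (t1[0] + t2[0]) / 2.0, (t1[1] + t2[1]) / 2.0  # bisecting point 202
    sx, sy = t2[0] - t1[0], t2[1] - t1[1]
    px, py = -sy, sx                       # perpendicular to the segment S
    if py < 0:
        px, py = -px, -py                  # point toward the screen bottom
    if py == 0:                            # degenerate: touches vertically aligned
        ref = (mx, my + offset)
    else:
        k = offset / py                    # meet horizontal line 208 at y = my + offset
        ref = (mx + k * px, my + offset)   # reference point 210
    pointers = []
    for t in (t1, t2):
        dx, dy = t[0] - ref[0], t[1] - ref[1]
        n = math.hypot(dx, dy)
        # pointer display point: d beyond the touch, on the ray ref -> touch
        pointers.append((t[0] + d * dx / n, t[1] + d * dy / n))
    return ref, pointers
```

For two horizontally separated touches the pointers land above the touches (negative y), i.e. on the side away from the hand, as described.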
FIG. 20 is a diagram for explaining a three-touch, three-pointer operation according to the present invention.
For three-touch, three-pointer operation, the two most widely separated touch points T1 and T2 among the touch points T1, T2 and T3 are found, and with these two touch points the azimuth angles θT1, θT2 and θT3 of the touch points T1, T2 and T3 are obtained in the same manner as in the two-touch, two-pointer case described above, from which the pointer display points P1, P2 and P3 are calculated. The difference from the two-touch method is that, when drawing the line perpendicular to the segment S from the bisecting point 202, its direction is taken opposite to the touch point T3.
FIG. 21 is a diagram for explaining a five-touch, five-pointer operation according to the present invention.
The controller 120 recognizes that the five fingers, thumb F1, index finger F2, middle finger F3, ring finger F4 and little finger F5, are touching, and in response to the multi-touch calculates the coordinate values of the multi-touch positions T1, T2, T3, T4 and T5.
The controller 120 takes the coordinate values of the detected multi-touch positions as input and generates the multi-pointers. By comparing the straight-line distances between the touch positions, the two positions farthest apart, T1 and T5 (the thumb and little-finger touch positions), are found, and the coordinates of the point halfway along the straight-line distance S connecting them, that is, the bisecting point 202, are calculated. Alternatively, the bisecting point of the longest side of the closed polygon connecting the multi-touch positions T1, T2, T3, T4 and T5 may be used.
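The farthest-pair search and bisecting-point calculation just described can be sketched as follows; the function name is illustrative and not part of the disclosed embodiment.

```python
import math
from itertools import combinations

def farthest_pair_bisector(points):
    """Find the two most widely separated touch points (thumb and little
    finger in the five-touch case) and the bisecting point 202 of the
    segment joining them."""
    a, b = max(combinations(points, 2), key=lambda pair: math.dist(*pair))
    mid = ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)
    return (a, b), mid
```

The closed-polygon variant mentioned above would instead take the maximum over consecutive hull edges rather than over all pairs.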
Once the bisecting point 202 is obtained, a vertical extension line 212 is drawn in the direction opposite to the touch positions T2 to T4, and the point where the vertical extension line 212 meets the horizontal line 208 is taken as the reference point 210.
Extension lines are drawn from the reference point 210 through each of the touch points T1, T2, T3, T4 and T5, and the azimuth angles θT1, θT2, θT3, θT4 and θT5 of the touch positions are each measured counterclockwise from the virtual horizontal line 208. This can be summarized by the following formula:
θTn = θ1 + θ2 + … + θn

where n is a positive integer and θn is the included angle between adjacent touch-point rays (θ1 being measured from the horizontal line 208).

F1: θT1 = θ1
F2: θT2 = θ1 + θ2
F3: θT3 = θ1 + θ2 + θ3
F4: θT4 = θ1 + θ2 + θ3 + θ4
F5: θT5 = θ1 + θ2 + θ3 + θ4 + θ5
By the above relation, even if some of the multiple touch points temporarily disappear, the azimuth angle of a missing touch point can still be calculated, so that the distances, included angles and azimuth angles between the touch points are maintained continuously even as the multi-touch moves and the coordinates change from moment to moment. The lengthwise direction of each finger can be inferred from the azimuth angle of its touch point. For each finger touch, the corresponding pointer display point is calculated on the extension line that has that touch point's azimuth angle and lies a fixed distance beyond the touch point.
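The cumulative relation θTn = θ1 + … + θn reduces to running sums of the included angles; a minimal sketch (the sample degree values are illustrative only):

```python
from itertools import accumulate

def finger_azimuths(inter_angles):
    """Running sums of the included angles between adjacent touch rays,
    giving each ray's azimuth counterclockwise from the horizontal line 208.
    Because the included angles are stored, a temporarily missing touch
    point's azimuth remains recoverable from this list."""
    return list(accumulate(inter_angles))
```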
FIG. 22 is a flowchart for explaining the pointer display and object selection process linked with touch operations in the multi-touch, multi-pointer mode according to the present invention.
Referring to FIG. 22, when the user touches the single/multi pointer button 160a, either the single mode or the multi mode is selected. When the multi mode is selected, the controller 120 fetches the multi-pointer setting data stored in the memory unit 140 (vertical extension line length, touch-point-to-display-point distance, size, transparency, shape and color data), composes the pointer graphic image, and switches the system to the pointer standby mode (S232). Then, when a multi-touchdown is sensed on the touch panel 114 (S234), the controller 120 performs the algorithm described above on the touch positions T1 to T5 to calculate the azimuth angle of each touch point, thereby inferring the display direction of each of the multi-pointers (S236). Next, the pointer display points P1 to P5 are each calculated at the position a set distance away from the corresponding touch point in the inferred direction, and pointer images are rendered at the calculated display points P1 to P5 so that the multi-pointers are displayed on the screen (S238). When the fingers are moved while remaining touched down (S240), the controller 120 calculates the display position values in real time in response to the changes in the touch position values, so that each pointer 150 is displayed while maintaining its set distance and set azimuth angle. A touch-up operation, in which finger movement stops and finger contact leaves the touch panel, is then detected (S242). When a touch-up is detected in step S242, the controller 120 takes as the selection position not the last touch position but the current position of the pointer display point P. The controller 120 places the object corresponding to the selection position in a selection-pending state (S244). The controller 120 then checks whether a selection command such as a touchdown or touch click is input while in the selection-pending state (S246), and checks whether the set selection waiting time has elapsed (S250). If a selection command such as a touchdown or touch click is input in step S246, the pending object is activated (S248) and the process returns to step S232. In step S250, if no selection command has been input by the time the set time elapses, the controller 120 automatically returns to step S232.
FIG. 23 is a diagram for explaining a modified embodiment of the multi-touch, multi-pointer mode according to the present invention, and FIG. 24 is a diagram for explaining the principle of calculating the display-point azimuth angles in the modified embodiment.
In the modified embodiment of FIG. 23, the thumb's touch point T1 and the reference point 210 are connected through three joint points J11, J12 and J13. The index finger's touch point T2 and the reference point 210 are connected through three joint points J21, J22 and J23. The middle finger's touch point T3 and the reference point 210 are connected through three joint points J31, J32 and J33. The ring finger's touch point T4 and the reference point 210 are connected through three joint points J41, J42 and J43. The little finger's touch point T5 and the reference point 210 are connected through three joint points J51, J52 and J53.
The pointers P1 to P5 corresponding to the respective fingers are each placed a fixed distance beyond the touch point, on the extension line drawn from the joint point J11, J21, J31, J41 or J51 through that touch point.
Referring to FIG. 24, for each finger the controller 120 manages as finger parameters: the distance Di0 between the touch point Ti and the joint point Ji1; the distance Di1 between the joint points Ji1 and Ji2; the distance Di2 between the joint points Ji2 and Ji3; the distance Di3 between the joint point Ji3 and the reference point 210; the angle θT31 between the touch point Ti and the joint point Ji1; the angle θT32 between the joint points Ji1 and Ji2; the angle θT33 between the joint points Ji2 and Ji3; and the angle θT3 between the joint point Ji3 and the reference point 210. When the hand is fully spread, all joint points lie on the straight line connecting the reference point and the touch point, and the distances between the points take their maximum lengths. As a finger bends, the azimuth angles of its joint points and the distances between the points vary.
Therefore, the positions of the joint points are calculated and placed according to their ratios of the distance between the touch point and the reference point, and the azimuth angle from the reference point 210 to the third joint point J33 is obtained by the following equation:
θT3 = θ1 + θ2 + θ3
The azimuth angle of the second joint point J32 is obtained by the following equation:
θJ32 = θ1 + θ2 + θ3 + θT33
The azimuth angle of the first joint point J31 is obtained by the following equation:
θJ31 = θ1 + θ2 + θ3 + θT33 + θT32
Finally, the azimuth angle of the display point P3 is calculated to be the same angle as the azimuth angle of the first joint point J31.
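The azimuth chain above is a simple accumulation along the joints; a minimal sketch for the middle finger, with parameter names mirroring θ1..θ3, θT33 and θT32 in the text (the function name is illustrative):

```python
def joint_chain_azimuths(theta1, theta2, theta3, theta_t33, theta_t32):
    """Chain the FIG. 24 azimuths for one finger.  Returns the azimuths of
    J33, J32 and J31; the display point P3 takes J31's azimuth."""
    a_j33 = theta1 + theta2 + theta3   # θT3  = θ1 + θ2 + θ3
    a_j32 = a_j33 + theta_t33          # θJ32 = θT3 + θT33
    a_j31 = a_j32 + theta_t32          # θJ31 = θJ32 + θT32  (= azimuth of P3)
    return a_j33, a_j32, a_j31
```

When the finger is fully straight the joint angles θT33 and θT32 are zero and all three azimuths coincide, matching the straight-line case described above.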
In the modified embodiment, the touch points and the reference point 210 are not simply connected by straight lines as described above; instead, the joint points of each finger are connected so that a virtual hand structure is displayed as a whole, and the display point is placed on the extension line from the first joint point through the touch point, allowing the finger direction and the pointer placement direction to be controlled more finely. The pointer direction can also be controlled according to how far each finger is bent. The present invention further encompasses schemes and methods that realize the pointer direction(s) using a program or data obtained by ergonomic analysis of the average angles and lengths of users' hands and fingers; the method of the present invention may be embodied as yet another embodiment and may employ other components.
As described above, the present invention has been described in detail through various embodiments; however, a person of ordinary skill in the art to which the present invention pertains will be able to practice the present invention with various modifications.
FIG. 25 is a screen state diagram of a modified embodiment for explaining the pointer display operation according to the present invention, and FIG. 26 is a flowchart for explaining the pointer display and object selection process linked with touch operations according to the modified embodiment.
Referring to FIGS. 25 and 26, when the pointer mode selection button 160 is touched, a pointer is displayed at an arbitrary position 152 on the screen and stands by (S262). In this state, when an arbitrary position 154 of the touch screen 110 is touched (S264), the pointer jumps from position 152 to position 156, as shown in FIG. 25 (S266). When the finger is moved while remaining touched down after step S264, that is, when a touch movement is detected (S268), the controller 120 calculates the display position value in real time in response to the change in the touch position value, so that the pointer 150 is displayed while maintaining its set distance and set azimuth angle (S270). A touch-up operation, in which finger movement stops and finger contact leaves the touch panel, is then detected (S272). When a touch-up is detected in step S272, the controller 120 takes as the selection position not the last touch position but the current position 156 (Px1, Py1) of the pointer 150. The controller 120 places the object 157 corresponding to the selection position in a selection-pending state (S274). The controller 120 then checks whether a selection command such as a touchdown or touch click is input while in the selection-pending state (S276). If a selection command such as a touchdown or touch click is input in step S276, the pending object 157 is activated (S278) and the process returns to step S262, with the pointer remaining displayed.
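The core behaviour of FIGS. 25 and 26 — a pointer tracking the finger at a fixed offset, with touch-up selecting at the pointer rather than at the finger — may be sketched as follows. The class and method names are illustrative and not part of the disclosed embodiment.

```python
class OffsetPointer:
    """Pointer displayed at a fixed offset from the touch; touch-up selects
    at the pointer's position, not the finger's."""

    def __init__(self, dx, dy):
        self.offset = (dx, dy)   # set distance/direction from touch to pointer
        self.pos = None

    def touch_move(self, tx, ty):
        # S268/S270: recompute the display position in real time as the
        # touch position changes, keeping the set distance and direction
        self.pos = (tx + self.offset[0], ty + self.offset[1])
        return self.pos

    def touch_up(self):
        # S272/S274: the selection position is the pointer's current
        # position, which identifies the object to place in selection-pending
        return self.pos
```

An offset of (0, -40), for example, keeps the pointer 40 pixels above the fingertip so the target is never obscured by the finger.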
As described above, the present invention may be implemented in software, for example in the form of a smartphone application, in hardware within an electronic device having a touch screen, or as a combination of firmware and hardware.

Claims (16)

  1. A method of controlling a touch screen having a display panel and a touch panel, the method comprising:
    recognizing a touch position corresponding to a finger touch (or touchdown) on the touch panel;
    displaying a pointer in a region of the display panel that is a fixed distance away from the recognized touch position and is not visually obscured by the finger;
    when the finger is moved while remaining in contact at the touch position, moving and displaying the pointer in conjunction with the finger movement while maintaining the fixed distance;
    when the finger is released (or touched up) with the pointer's display position placed over an object to be selected, placing the object in a selection-pending state; and
    when a selection command is input during the selection-pending step, activating the pending object,
    wherein the step of moving and displaying the pointer comprises
    variably controlling either the distance or the direction between the touch point and the pointer display point in accordance with the distance by which the touch movement has entered a rectangular zone set near an edge of the touch screen.
  2. The method of claim 1, wherein the selection-pending step is limited to a fixed period after the touch release, and the displayed pointer is erased if no selection command is input before the fixed period elapses.
  3. The method of claim 1, wherein the selection command is generated by at least one of a key input or a touch operation (touchdown, touch click or double touch click) different from the touch operation used to display the pointer.
  4. The method of claim 1, wherein the pointer operation mode is activated by touching a pointer mode selection button displayed on the touch screen.
  5. The method of claim 1, wherein, in the upper or lower region of the rectangular zone, the distance between the touch point and the display point is variably controlled.
  6. The method of claim 1, wherein, in the left or right region of the rectangular zone, the direction between the touch point and the display point is variably controlled.
  7. The method of claim 1, wherein, in each of the upper-left, lower-left, upper-right and lower-right corner regions of the rectangular zone, the distance and the direction between the touch point and the display point are variably controlled simultaneously.
  8. A touch screen control method comprising:
    detecting multiple touches;
    inferring the lengthwise directions of the touched fingers from the detected multiple touch points;
    displaying pointers, each at a position extended a fixed distance from the corresponding touch point in the inferred finger lengthwise direction; and
    display-controlling the multiple pointers linked to the multiple touch points as one group, interlinking them with one another on the basis of the lengthwise-direction inference result.
  9. The method of claim 8, wherein the inferring step comprises:
    calculating a bisecting point that bisects either the straight-line distance between the two most widely separated of the detected multiple touch points or the longest side of the closed polygon connecting the multiple touch points;
    calculating a reference point where a vertical extension line of a fixed length, extended from the bisecting point in one direction, meets a horizontal line;
    calculating the azimuth angle of each touch point counterclockwise from the horizontal line about the reference point; and
    determining the extension lines passing from the reference point through the respective touch points as the lengthwise directions of the fingers corresponding to the touch points.
  10. The method of claim 9, wherein the one direction from the bisecting point is either the downward direction of the touch screen's display or the direction opposite to that in which the remaining touch points are located.
  11. The method of claim 9, wherein the fixed length of the vertical extension line is a length at least within the range from the palm length to the hand length.
  12. The method of claim 9, wherein the horizontal line is parallel to the horizontal of the touch screen's display.
  13. A touch screen apparatus comprising:
    a sensor unit for sensing touches on a touch screen; and
    a controller that executes a touch screen control program in response to touch input sensed by the sensor unit,
    wherein the touch screen control program executed by the controller performs:
    detecting multiple touches on the touch screen;
    inferring the lengthwise directions of the touched fingers from the detected multiple touch points;
    displaying pointers on the touch screen, each at a position extended a fixed distance from the corresponding touch point in the inferred finger lengthwise direction; and
    display-controlling the multiple pointers linked to the multiple touch points as one group, interlinking them with one another on the basis of the lengthwise-direction inference result.
  14. A touch screen control method comprising:
    detecting multiple touches;
    inferring the lengthwise directions of the touched fingers from the detected multiple touch points;
    calculating display points of pointers, each at a position extended a fixed distance from the corresponding touch point in the inferred finger lengthwise direction; and
    display-controlling the multiple pointers linked to the multiple touch points as a single hand-shaped image, interlinking them with one another on the basis of the lengthwise-direction inference result.
  15. The method of claim 14, wherein the inferring step comprises:
    calculating a bisecting point that bisects the straight-line distance between the two most widely separated of the detected multiple touch points;
    calculating a reference point where a vertical extension line of a fixed length, extended from the bisecting point in one direction, meets a horizontal line;
    calculating the azimuth angle of each touch point counterclockwise from the horizontal line about the reference point;
    calculating the display positions of the joint points of the fingers according to the length ratios of the lines connecting the reference point to the respective touch points; and
    determining the extension line passing from the joint point nearest each touch point through that touch point as the lengthwise direction of the finger corresponding to that touch point.
  16. The method of claim 14, wherein the hand-shaped image has a form in which the reference point and each touch point are connected by a plurality of joint points arranged in the bent shape of each finger.
PCT/KR2011/008568 2010-11-10 2011-11-10 Touch screen apparatus and method for controlling same WO2012064128A2 (en)

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
KR20100111287 2010-11-10
KR10-2010-0111287 2010-11-10
KR20110020671 2011-03-09
KR10-2011-0020671 2011-03-09
KR10-2011-0024697 2011-03-21
KR20110024697 2011-03-21
KR10-2011-0041742 2011-05-03
KR20110041742 2011-05-03
KR1020110070855A KR101095851B1 (en) 2010-11-10 2011-07-18 Touch screen apparatus and method for controlling touch screen
KR10-2011-0070855 2011-07-18

Publications (2)

Publication Number Publication Date
WO2012064128A2 true WO2012064128A2 (en) 2012-05-18
WO2012064128A3 WO2012064128A3 (en) 2012-07-05

Family

ID=45506555

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2011/008568 WO2012064128A2 (en) 2010-11-10 2011-11-10 Touch screen apparatus and method for controlling same

Country Status (2)

Country Link
KR (1) KR101095851B1 (en)
WO (1) WO2012064128A2 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014168431A1 (en) * 2013-04-10 2014-10-16 주식회사 지니틱스 Method for processing touch event and apparatus for same
KR20140122683A (en) * 2013-04-10 2014-10-20 주식회사 지니틱스 Method for processing touch event when a touch point is rotating respectively to other touch point
WO2016076519A1 (en) * 2014-11-12 2016-05-19 주식회사 트레이스 Three-dimensional hovering digitizer
WO2018048050A1 (en) * 2016-09-12 2018-03-15 에스케이텔레콤 주식회사 Multi-touch display device and touch recognition method thereof

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101902006B1 (en) * 2012-01-26 2018-10-01 삼성디스플레이 주식회사 display device integrated touch screen panel
CN103425311A (en) * 2012-05-25 2013-12-04 捷达世软件(深圳)有限公司 Positioning method and positioning system for mobile object clicking selection
JP5977132B2 (en) * 2012-09-28 2016-08-24 富士ゼロックス株式会社 Display control device, image display device, and program
WO2014129681A1 (en) * 2013-02-21 2014-08-28 엘지전자 주식회사 Display device and pointing method for display device
US11096668B2 (en) 2013-03-13 2021-08-24 Samsung Electronics Co., Ltd. Method and ultrasound apparatus for displaying an object
WO2014142468A1 (en) 2013-03-13 2014-09-18 Samsung Electronics Co., Ltd. Method of providing copy image and ultrasound apparatus therefor
KR20220017143A (en) 2020-08-04 2022-02-11 삼성전자주식회사 Electronic device and method for dispalying electronic pen pointer in the same
KR102378503B1 (en) * 2020-12-29 2022-03-24 University of Ulsan Industry-Academic Cooperation Foundation Method and non-transitory computer-readable recording medium for inputting and outputting information using virtual mouse

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1040010A (en) * 1996-07-19 1998-02-13 Ricoh Co Ltd Information processor with touch panel
JPH1124841A (en) * 1997-07-07 1999-01-29 Canon Inc Information processing device and method, and storage medium
US20090288043A1 (en) * 2007-12-20 2009-11-19 Purple Labs Method and system for moving a cursor and selecting objects on a touchscreen using a finger pointer
KR20100104884A (en) * 2009-03-19 2010-09-29 Kim Yeon-soo Touch-screen for displaying pointer

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014168431A1 (en) * 2013-04-10 2014-10-16 Zinitix Co., Ltd. Method for processing touch event and apparatus for same
KR20140122683A (en) * 2013-04-10 2014-10-20 Zinitix Co., Ltd. Method for processing touch event when a touch point is rotating respectively to other touch point
KR101661606B1 (en) 2013-04-10 2016-09-30 Zinitix Co., Ltd. Method for processing touch event when a touch point is rotating respectively to other touch point
WO2016076519A1 (en) * 2014-11-12 2016-05-19 Trace Co., Ltd. Three-dimensional hovering digitizer
WO2018048050A1 (en) * 2016-09-12 2018-03-15 SK Telecom Co., Ltd. Multi-touch display device and touch recognition method thereof
US11237621B2 (en) 2016-09-12 2022-02-01 Sk Telecom Co., Ltd. Multi-touch display apparatus and touch recognition method thereof

Also Published As

Publication number Publication date
WO2012064128A3 (en) 2012-07-05
KR101095851B1 (en) 2011-12-21

Similar Documents

Publication Publication Date Title
WO2012064128A2 (en) Touch screen apparatus and method for controlling same
WO2016064137A1 (en) Apparatus and method of drawing and solving figure content
WO2013089392A1 (en) Bendable display device and displaying method thereof
KR100901106B1 (en) Touch screen control method, touch screen apparatus and portable small electronic device
US20130057487A1 (en) Information processing device adapted to receiving an input for user control using a touch pad and information processing method thereof
US20090027354A1 (en) Automatic switching for a dual mode digitizer
US20090160805A1 (en) Information processing apparatus and display control method
WO2014084633A1 (en) Method for displaying applications and electronic device thereof
US20100177053A2 (en) Method and apparatus for control of multiple degrees of freedom of a display
WO2011046270A1 (en) Multi-touch type input controlling system
US9898184B2 (en) Operation method of operating system
WO2013141464A1 (en) Method of controlling touch-based input
KR20140038568A (en) Multi-touch uses, gestures, and implementation
EP2852882A1 (en) Method and apparatus of controlling user interface using touch screen
WO2012030194A1 (en) Method and apparatus for interfacing
JPH0778120A (en) Hand-held arithmetic unit and processing method of input signal in hand-held arithmetic unit
WO2014090116A1 (en) Method for displaying icon, microprocessor and mobile terminal
JP6162299B1 (en) Information processing apparatus, input switching method, and program
WO2014054861A1 (en) Terminal and method for processing multi-point input
WO2020130356A1 (en) System and method for multipurpose input device for two-dimensional and three-dimensional environments
WO2016085186A1 (en) Electronic apparatus and method for displaying graphical object thereof
WO2016129923A1 (en) Display device, display method and computer-readable recording medium
TW201520876A (en) Method for operating user interface and electronic device thereof
WO2014104727A1 (en) Method for providing user interface using multi-point touch and apparatus for same
WO2010095783A1 (en) Touch screen control method and touch screen device using the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11839685

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11839685

Country of ref document: EP

Kind code of ref document: A2