US20160092061A1 - Method for selecting target at touch point on touch screen of mobile device - Google Patents


Info

Publication number
US20160092061A1
Authority
US
United States
Prior art keywords
control unit
touch
mobile device
touch screen
target area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/840,907
Inventor
Yeo Min YUN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority to US14/840,907 priority Critical patent/US20160092061A1/en
Publication of US20160092061A1 publication Critical patent/US20160092061A1/en
Abandoned legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0412 Digitisers structurally integrated in a display
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 Selection of displayed objects or displayed text elements
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1407 General aspects irrespective of display type, e.g. determination of decimal point position, display with fixed or driving decimal point, suppression of non-significant zeros
    • G06F 2203/00 Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048 Indexing scheme relating to G06F 3/048
    • G06F 2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • the present invention relates generally to a method for selecting a target at a touch point on a touch screen of a mobile device and, more particularly, to a method for selecting a user's desired target from among a plurality of targets, such as hyperlinks, Points of Interest (POIs) or local information items, at a touch point on a touch screen.
  • targets such as hyperlinks, Points of Interest (POIs) or local information items
  • a touch screen has become one of the essential elements of a smart phone.
  • a touch event, i.e., a user's touch action on a touch screen made with at least one finger or a stylus.
  • a smart phone performs a particular function corresponding to a touch point.
  • because a touch point is relatively large compared to the screen resolution, there is a likelihood that an undesired target may be mistakenly touched.
  • to avoid such mistaken touches, users are often required to touch the screen with great precision.
  • although a user can first enlarge the screen and then touch it, such an enlargement may require complicated and troublesome manipulations.
  • the present invention is provided to address the above-mentioned problems and/or disadvantages and to offer at least the advantages described below.
  • An aspect of the present invention provides methods for selecting a user's desired target among a plurality of targets at a touch point on a touch screen.
  • a method for selecting a target at a touch point on a touch screen of a mobile device including displaying a web page; when there is a touch on the touch screen, checking the number of hyperlinks overlapped with the touch point in the displayed web page; if two or more hyperlinks are overlapped with the touch point, defining a target area around the touch point, enlarging the target area, and displaying the enlarged target area; and if a single one of the hyperlinks in the enlarged target area is overlapped with the touch point, selecting and visually emphasizing the single hyperlink among the hyperlinks.
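The claimed hit test above can be sketched in a few lines: model the finger contact as a rectangle around the touch point, give each hyperlink a bounding box, and count how many boxes the contact overlaps. This is an illustrative sketch only; `Rect`, `touch_rect`, and the 20-pixel contact radius are assumptions, not details from the patent.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    """Axis-aligned bounding box (hypothetical geometry helper)."""
    left: float
    top: float
    right: float
    bottom: float

    def overlaps(self, other):
        # Two axis-aligned rectangles overlap unless one lies entirely
        # to one side of the other.
        return not (self.right < other.left or other.right < self.left or
                    self.bottom < other.top or other.bottom < self.top)

def touch_rect(x, y, radius=20.0):
    """Approximate the finger contact area as a square around the touch point."""
    return Rect(x - radius, y - radius, x + radius, y + radius)

def hyperlinks_under_touch(links, x, y):
    """Return the names of all hyperlinks whose boxes overlap the contact area."""
    contact = touch_rect(x, y)
    return [name for name, box in links.items() if box.overlaps(contact)]
```

If the returned list has a single entry, the link can be selected and emphasized directly; if it has two or more, the method falls back to defining and enlarging a target area.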
  • a method for selecting a target at a touch point on a touch screen of a mobile device including displaying map information; when there is a touch on the touch screen, checking the number of POIs (points of interest) overlapped with the touch point in the displayed map information; if two or more POIs are overlapped with the touch point, enlarging and displaying the POIs overlapped with the touch point; if a single one of the enlarged POIs is overlapped with the touch point, selecting the single POI among the enlarged POIs; and setting a destination in a road guide by using the selected POI.
  • POIs: Points of Interest
  • a method for selecting a target at a touch point on a touch screen of a mobile device including displaying local information items on an image in an overlay manner; when there is a touch on the touch screen, checking the number of the local information items overlapped with the touch point; if two or more local information items are overlapped with the touch point, enlarging and displaying the local information items overlapped with the touch point; if a single one of the enlarged local information items is overlapped with the touch point, selecting the single local information item among the enlarged local information items; and displaying detail information of the selected local information item.
  • FIG. 1 is a block diagram illustrating a mobile device in accordance with an embodiment of the present invention.
  • FIGS. 2A and 2B are a flow diagram illustrating a method for selecting a target at a touch point in accordance with an embodiment of the present invention.
  • FIGS. 3 and 4 are screenshots illustrating selection of a hyperlink in the method of FIGS. 2A and 2B in accordance with an embodiment of the present invention.
  • FIGS. 5A and 5B are a flow diagram illustrating a method for selecting a target at a touch point in accordance with another embodiment of the present invention.
  • FIGS. 6 and 7 are screenshots illustrating selection of a point of interest in the method of FIGS. 5A and 5B in accordance with an embodiment of the present invention.
  • FIGS. 8A and 8B are a flow diagram illustrating a method for selecting a target at a touch point in accordance with still another embodiment of the present invention.
  • FIGS. 9 and 10 are screenshots illustrating selection of local information in the method of FIGS. 8A and 8B in accordance with an embodiment of the present invention.
  • FIG. 1 is a block diagram illustrating a mobile device in accordance with an embodiment of the present invention.
  • the mobile device may include a wireless communication unit 110 , an input unit 120 , a sensing unit 130 , a memory unit 140 , a display unit 150 , a camera 160 , a Global Positioning System (GPS) receiver 170 , and a control unit 180 that generally controls the other units of the mobile terminal.
  • GPS: Global Positioning System
  • the wireless communication unit 110 includes a mobile communication module (not shown) that wirelessly communicates with a base station (not shown) in order to provide data received from the control unit 180 to the base station or provide data received from the base station to the control unit 180. Additionally, the wireless communication unit 110 may have a WiFi module in order to access a Local Area Network (LAN).
  • LAN: Local Area Network
  • the input unit 120 includes a touch screen 121 and a plurality of buttons 122.
  • the input unit 120 outputs, to the control unit 180 , a touch event or a key event caused by user input.
  • the sensing unit 130 detects the direction, attitude angle, etc. of the mobile device and outputs detection results to the control unit 180 .
  • the sensing unit 130 may have a gyro sensor 131 , a geomagnetic sensor 132 , and/or an acceleration sensor 133 .
  • the gyro sensor 131 measures the attitude angles of the mobile device in the X-axis, Y-axis and Z-axis directions, and also detects a rotation on each axis, namely rolling on the X-axis, pitching on the Y-axis, and yawing on the Z-axis.
  • the geomagnetic sensor 132 detects the direction of the mobile device and may perform a tilt compensation function for the gyro sensor 131.
  • the acceleration sensor 133 measures the acceleration of the mobile device.
  • the memory unit 140 may be divided into program and data regions (not shown).
  • the program region may store an Operating System (OS), an augmented reality application for representing local information on an image (such as a realistic image, e.g., a digital photograph) in an overlay manner, a road guide application, a mobile web browser, and the like.
  • the data region may store a map database 141 , a local information database 142 , etc.
  • the GPS receiver 170 offers GPS satellite information such as GPS satellite location, transmission time, reception time, satellite signal strength, etc. to the control unit 180 .
  • FIGS. 2A and 2B are a flow diagram illustrating a method for selecting a target at a touch point in accordance with an embodiment of the present invention. Specifically, FIGS. 2A and 2B illustrate a method for selecting a user's desired target among a plurality of hyperlinks overlapped with a touch point on the touch screen and then displaying a web page of the selected target. The process of FIGS. 2A and 2B is performed by the control unit 180 . Additionally, FIGS. 3 and 4 are screenshots illustrating selection of a hyperlink in accordance with the method of FIGS. 2A and 2B .
  • In step 201, the control unit 180 receives a request for Internet access and then establishes an Internet connection by controlling the wireless communication unit 110.
  • the control unit 180 controls the wireless communication unit 110 and accesses a predetermined web site.
  • In step 202, the control unit 180 controls the display unit 150 to display a web page received through the wireless communication unit 110.
  • In step 203, the control unit 180 determines whether any touch event occurs. If a touch event occurs, the control unit 180 proceeds to steps 204 and 205 and then checks the number of hyperlinks overlapped with a touch point. If no hyperlink is overlapped with the touch point, the control unit 180 proceeds to step 206.
  • In step 206 of FIG. 2A, the control unit 180 determines whether the touch is released. If so, the control unit 180 returns to step 203. However, if the touch is not yet released, namely if the user keeps the finger on the touch screen 121 or takes a drag action (i.e., moves the finger across the touch screen while maintaining the touch), the touch event still continues. Therefore, the control unit 180 returns to step 204.
  • In step 207, the control unit 180 selects and visually emphasizes the touched hyperlink among the displayed hyperlinks. For example, by controlling the display unit 150, the control unit 180 makes a clear distinction in size, transparency or color between the touched hyperlink and the other hyperlinks.
  • In step 208, the control unit 180 determines whether the touch is released. If so, the control unit 180 proceeds to step 218 in FIG. 2B (as indicated by the C symbol in both FIGS. 2A and 2B), which is described in detail below.
  • If the touch is not released in step 208 of FIG. 2A, the control unit 180 returns to step 205 and checks again whether a single hyperlink is still overlapped with the touch point. The method returns to step 205 because a drag may cause a touch on several hyperlinks or on no hyperlink. If a single hyperlink is overlapped with the touch point as the result of the check in step 205, namely, if the originally touched hyperlink is still touched or if another single hyperlink is alternatively touched, the control unit 180 proceeds again to step 207. Otherwise, the control unit 180 proceeds again to step 206.
  • If at least two hyperlinks are overlapped with the touch point, the control unit 180 proceeds to step 209 and defines a target area 20 to be enlarged around the touch point 10, as shown in FIG. 3.
  • the control unit 180 defines the target area 20 that surrounds the touch point 10 and contains all hyperlinks overlapped with the touch point 10 .
  • the control unit 180 enlarges the target area 20 at a given ratio in step 210 and then, as shown in FIG. 4 , displays the enlarged target area 20 together with the touch point 10 in an overlay manner in step 211 .
  • the touch point is unchanged in size, whereas the hyperlinks in the target area are enlarged. Therefore, even though the user does not move the finger, there is a high probability that only one hyperlink will be touched.
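Under assumed rectangle geometry, steps 209 to 211 can be sketched as follows: the target area is the smallest rectangle containing the touch point and every overlapped hyperlink box, and enlargement scales each box away from the area's center at a fixed ratio while the touch point itself is left unchanged. The tuple layout and the ratio of 2.0 are illustrative assumptions, not values from the patent.

```python
def target_area(touch, boxes):
    """Smallest rectangle containing the touch point and all overlapped boxes.

    touch: (x, y); boxes: list of (left, top, right, bottom) tuples.
    """
    xs = [touch[0]] + [v for l, t, r, b in boxes for v in (l, r)]
    ys = [touch[1]] + [v for l, t, r, b in boxes for v in (t, b)]
    return (min(xs), min(ys), max(xs), max(ys))

def enlarge(boxes, area, ratio=2.0):
    """Scale every box corner away from the area's center by the given ratio."""
    cx, cy = (area[0] + area[2]) / 2, (area[1] + area[3]) / 2
    return [(cx + (l - cx) * ratio, cy + (t - cy) * ratio,
             cx + (r - cx) * ratio, cy + (b - cy) * ratio)
            for l, t, r, b in boxes]
```

Because the scaling also spreads neighboring boxes apart while the finger stays put, the unchanged touch point is likely to land on only one of the enlarged targets.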
  • The control unit 180 then proceeds to step 212 of FIG. 2B (as indicated by the B symbol in both FIGS. 2A and 2B), where it checks whether only a single hyperlink is overlapped with the touch point. If two or more hyperlinks, or no hyperlink, are overlapped with the touch point as the result of the check in step 212, the control unit 180 proceeds to step 213 and determines whether the touch is released. If so, the method returns to FIG. 2A (as shown by the A symbol in both FIGS. 2A and 2B), and the control unit 180 removes the displayed target area 20 in step 214 and then returns to step 203. If the touch is not yet released, the control unit 180 returns to step 212 in FIG. 2B. However, if the touch point 10 gets out of the target area 20 due to a drag, the control unit 180 does not return to step 212 but proceeds to step 214 in FIG. 2A.
  • If only a single hyperlink is determined to be overlapped with the touch point as the result of the check in step 212, the control unit 180 selects and visually emphasizes the touched hyperlink among the displayed hyperlinks in step 215, as earlier discussed in reference to step 207 in FIG. 2A.
  • In step 216 of FIG. 2B, the control unit 180 determines whether the touch is released.
  • If the touch is not released in step 216 of FIG. 2B, the control unit 180 returns to step 212. If the touch is released, the control unit 180 removes the displayed target area 20 in step 217 and then proceeds to step 218. In step 218, the control unit 180 controls the wireless communication unit 110 to download a web page corresponding to the hyperlink selected in step 215 or 207. Then the control unit 180 controls the display unit 150 to display the downloaded web page.
  • FIGS. 5A and 5B are a flow diagram illustrating a method for selecting a target at a touch point in accordance with another embodiment of the present invention. Specifically, FIGS. 5A and 5B illustrate a method for selecting a user's desired target from among a plurality of Points of Interest (POIs) overlapped with a touch point on the touch screen and then setting the destination in a road guide from the selected target.
  • POIs refer to major facilities on a map, such as a station, an airport, a terminal, a hotel, a building, a theater, etc. Such POIs are selectively displayed in stages according to the scale of a map.
  • FIGS. 6 and 7 are screenshots illustrating selection of a point of interest in the method of FIGS. 5A and 5B.
  • the control unit 180 receives a request for a road guide. For instance, when a user touches a navigation icon, the touch screen 121 delivers a touch event to the control unit 180 . Then the control unit 180 controls the GPS receiver 170 and receives GPS satellite information from GPS satellites. Additionally, the control unit 180 measures a user's current position, namely, the location of the mobile device, by using the received GPS satellite information and then proceeds to step 502 . Alternatively, the control unit 180 may obtain the location of the mobile device by using WiFi Positioning System (WPS) or Cellular Network Positioning System (CPS) instead of GPS. Since WPS and CPS are well known in the art, detailed descriptions of these systems are omitted for clarity and conciseness.
  • WPS: WiFi Positioning System
  • CPS: Cellular Network Positioning System
  • the control unit 180 controls the display unit 150 to display a user's current location. Specifically, the control unit 180 searches the map database 141 and checks whether there is map information corresponding to the current location. If map information corresponding to the current location is found, the control unit 180 reads such map information from the map database 141 and then displays the map information. If no map information is found, the control unit 180 controls the wireless communication unit 110 and sends a request for map information to a map provider server. Then the control unit 180 receives the requested map information, displays the received map information on the display unit 150 and also saves the received map information in the map database 141 .
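The map lookup just described follows a read-through cache pattern: consult the local map database first, and only on a miss download from the map provider server and save the result for later. A minimal sketch under the assumption of a simple key-value store; `fetch_from_server` stands in for the wireless request and is not an API defined by the patent.

```python
class MapStore:
    """Read-through cache over a local map database (cf. map database 141)."""

    def __init__(self, fetch_from_server):
        self._db = {}                    # local map database
        self._fetch = fetch_from_server  # fallback to the map provider server

    def map_for(self, tile_key):
        if tile_key in self._db:         # hit: read locally, no network traffic
            return self._db[tile_key]
        data = self._fetch(tile_key)     # miss: download from the server ...
        self._db[tile_key] = data        # ... and save for subsequent requests
        return data
```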
  • the control unit 180 receives destination information from a user. For instance, a user may enter destination information through a virtual keyboard displayed on the touch screen 121 . Additionally, in step 504 , the control unit 180 controls the display unit 150 to display the destination and a surrounding area. Namely, the control unit 180 reads map information corresponding to the destination from the map database 141 and then displays the map information on the display unit 150 . If no map information is found, the control unit 180 downloads map information from the server.
  • In step 505, the control unit 180 determines whether any touch event occurs. If a touch event occurs, the control unit 180 proceeds to steps 506 and 507 and then checks the number of POIs overlapped with a touch point. If no POI is overlapped with the touch point, the control unit 180 proceeds to step 508. In step 508, the control unit 180 determines whether the touch is released. If the touch is released, the control unit 180 returns to step 505. However, if the touch is not yet released, the control unit 180 returns to step 506.
  • In step 509, the control unit 180 selects and visually emphasizes the touched POI among the displayed POIs. For instance, by controlling the display unit 150, the control unit 180 makes a clear distinction in size, transparency or color between the touched POI and the other POIs.
  • In step 510 of FIG. 5A, the control unit 180 determines whether the touch is released. If the touch is released, the control unit 180 proceeds to step 520 of FIG. 5B (as indicated by the F symbol in both FIGS. 5A and 5B), which is described in further detail below. If the touch is not released, the control unit 180 returns to step 507 and re-checks whether a single POI is still overlapped with the touch point. This re-check is performed because a drag may cause a touch on several POIs or on no POI. If a single POI is determined to be overlapped with the touch point as the result of the check in step 507, the control unit 180 returns to step 509. Otherwise, the control unit 180 returns to step 508.
  • If at least two POIs are overlapped with the touch point, the control unit 180 proceeds to step 511 and defines a target area 40 to be enlarged around the touch point 30, as shown in FIG. 6.
  • The control unit 180 enlarges the target area 40 at a given ratio in step 512 and then, as shown in FIG. 7, displays the enlarged target area 40 together with the touch point 30 as an overlay in step 513 of FIG. 5A.
  • The control unit 180 then proceeds to step 514 of FIG. 5B (as indicated by the E symbol in both FIGS. 5A and 5B), where it checks whether only a single POI is overlapped with the touch point. If at least two POIs, or no POI, are determined to be overlapped with the touch point as the result of the check in step 514, the control unit 180 proceeds to step 515 of FIG. 5B and determines whether the touch is released. If the touch is released, the method returns to FIG. 5A (as shown by the D symbol in both FIGS. 5A and 5B), where the control unit 180 removes the displayed target area 40 in step 516 and then returns to step 505.
  • If the touch is not released in step 515 of FIG. 5B, the control unit 180 returns to step 514.
  • However, if the touch point 30 gets out of the target area 40 due to a drag, the control unit 180 does not return to step 514 of FIG. 5B, but instead proceeds to step 516 of FIG. 5A.
  • If only a single POI is determined to be overlapped with the touch point as the result of the check in step 514, the control unit 180 selects and visually emphasizes the touched POI among the displayed POIs in step 517, as previously discussed with respect to step 509 in FIG. 5A.
  • In step 518 of FIG. 5B, the control unit 180 determines whether the touch is released. If the touch is not released, the control unit 180 returns to step 514. If the touch is released, the control unit 180 removes the displayed target area 40 in step 519 and then proceeds to step 520.
  • In step 520 of FIG. 5B, the control unit 180 sets the destination in a road guide by using the POI selected in step 517 or 509.
  • Specifically, the control unit 180 controls the display unit 150 to output a query message about whether to set the destination. If a touch signal indicating "yes" is received from the touch screen 121 in response to the query message, the control unit 180 sets the destination in the road guide by using the POI selected in step 517 or 509.
  • In step 521, the control unit 180 computes an optimal route from the user's current location to the destination and then displays the optimal route on the display unit 150.
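Step 521 computes an optimal route, but the patent does not name a routing algorithm; Dijkstra's shortest-path search over a weighted road graph is one conventional choice, sketched here with illustrative node names and distances.

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm; graph maps node -> [(neighbor, distance), ...]."""
    dist = {start: 0.0}
    prev = {}
    queue = [(0.0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry, a shorter path was already found
        for nxt, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                prev[nxt] = node
                heapq.heappush(queue, (nd, nxt))
    # Walk predecessors back from the goal to recover the route.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[goal]
```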
  • FIGS. 8A and 8B are a flow diagram illustrating a method for selecting a target at a touch point in accordance with still another embodiment of the present invention. Particularly, this embodiment is related to an augmented reality application that represents, through an overlay, local information corresponding to an image taken by the camera 160 .
  • FIGS. 8A and 8B show a method for selecting a user's desired target from among a plurality of local information items overlapped with a touch point on the touch screen and then displaying the details of the selected target. The process shown in FIGS. 8A and 8B is performed by the control unit 180 .
  • FIGS. 9 and 10 are screenshots illustrating selection of local information in the method of FIGS. 8A and 8B .
  • the control unit 180 receives a request for local information. For instance, when a user touches an augmented reality icon, the touch screen 121 delivers a touch event to the control unit 180 .
  • the control unit 180 activates the camera 160 to display an image. Also, the control unit 180 activates the GPS receiver 170 and the sensors 131 , 132 and 133 and then, by using information received from them, measures the location, direction, and attitude angle of the mobile device.
  • the control unit 180 controls the display unit 150 to display local information items as an overlay on the image. Specifically, the control unit 180 retrieves local information items from the local information database 142 in consideration of the location, direction, and attitude angle. If there is no local information item corresponding to the location, direction, and attitude angle in the local information database 142 , the control unit 180 downloads necessary local information from a server.
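One plausible way to decide which local information items to overlay, given the measured location and direction, is to keep nearby items whose bearing from the device falls inside the camera's horizontal field of view. The flat-plane coordinates, 60-degree field of view, and 500 m distance limit below are illustrative assumptions, not values from the patent.

```python
import math

def visible_items(items, device_xy, heading_deg, fov_deg=60.0, max_dist=500.0):
    """Return names of items within range and inside the camera's field of view.

    items: {name: (x, y)} on a flat plane; heading_deg: 0 = north (+y), clockwise.
    """
    dx0, dy0 = device_xy
    keep = []
    for name, (x, y) in items.items():
        dx, dy = x - dx0, y - dy0
        if math.hypot(dx, dy) > max_dist:
            continue  # too far away to overlay
        bearing = math.degrees(math.atan2(dx, dy)) % 360
        diff = (bearing - heading_deg + 180) % 360 - 180  # signed angle difference
        if abs(diff) <= fov_deg / 2:
            keep.append(name)
    return keep
```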
  • In step 804 of FIG. 8A, the control unit 180 determines whether any touch event occurs. If a touch event occurs, the control unit 180 proceeds to steps 805 and 806 and then checks the number of local information items overlapped with a touch point. If no local information item is overlapped with the touch point, the control unit 180 proceeds to step 807. In step 807, the control unit 180 determines whether the touch is released. If so, the control unit 180 returns to step 804. However, if the touch is not yet released, the control unit 180 returns to step 805.
  • In step 808, the control unit 180 selects and visually emphasizes the touched local information item among the displayed local information items. For instance, by controlling the display unit 150, the control unit 180 makes a clear distinction in size, transparency or color between the touched local information item and the other local information items.
  • In step 809, the control unit 180 determines whether the touch is released. If the touch is released, the control unit 180 proceeds to step 817 in FIG. 8B (as indicated by the I symbol in both FIGS. 8A and 8B), which is described in detail below.
  • If the touch is not released, the control unit 180 returns to step 806 of FIG. 8A and re-checks whether a single local information item is still overlapped with the touch point. If a single item is determined to be overlapped with the touch point as the result of the check in step 806, the control unit 180 proceeds again to step 808. Otherwise, the control unit 180 proceeds to step 807, as also discussed above.
  • If at least two local information items are determined to be overlapped with the touch point 50 as the result of the check in step 805, as shown in FIG. 9, the control unit 180 proceeds to step 810, enlarges those local information items, and displays them in the form of icons around the touch point 50, as shown in FIG. 10.
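Step 810 displays the enlarged items "in the form of icons around the touch point"; one plausible layout, sketched here under that assumption, spaces the icons evenly on a circle centered on the touch point. The 80-pixel radius is illustrative.

```python
import math

def icon_positions(n_items, touch_xy, radius=80.0):
    """Place n_items icon centers evenly on a circle around the touch point."""
    cx, cy = touch_xy
    step = 2 * math.pi / n_items  # equal angular spacing between icons
    return [(cx + radius * math.cos(i * step),
             cy + radius * math.sin(i * step)) for i in range(n_items)]
```

Spreading the icons apart this way gives each one its own region around the finger, so a small drag of the still-held touch can land on exactly one item.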
  • In step 811 of FIG. 8B, the control unit 180 checks whether only a single local information item is overlapped with the touch point. If at least two local information items, or no local information item, are overlapped with the touch point as the result of the check in step 811, the control unit 180 proceeds to step 812 and determines whether the touch is released. If the touch is released in step 812 of FIG. 8B, the method returns to FIG. 8A (as indicated by the G symbol in both FIGS. 8A and 8B), where the control unit 180 removes the displayed icons in step 813 and then returns to step 804 of FIG. 8A. If the touch is not yet released in step 812, the control unit 180 returns to step 811 of FIG. 8B.
  • If only a single local information item is overlapped with the touch point as the result of the check in step 811, the control unit 180 selects and visually emphasizes the touched local information item among the displayed items in step 814, as earlier discussed in reference to step 808 in FIG. 8A.
  • In step 815 of FIG. 8B, the control unit 180 determines whether the touch is released. If the touch is not released, the control unit 180 returns to step 811. If the touch is released, the control unit 180 removes the displayed icons in step 816 and then proceeds to step 817.
  • In step 817, the control unit 180 displays the details of the local information item selected in step 814 or 808, such as a distance from the user's current location to the selected local information item, a phone number, etc. Specifically, the control unit 180 retrieves detailed information from the local information database 142 and outputs the detailed information to the display unit 150. If there is no detailed information in the local information database 142, the control unit 180 downloads detailed information from a server and outputs the downloaded information to the display unit 150.

Abstract

A method for selecting a user's desired target from among a plurality of targets at a touch point on a touch screen of a mobile device is provided. The method includes displaying a web page; checking, when a touch input at a touch point is received on the touch screen, a number of hyperlinks overlapped with the touch point in the displayed web page; defining, if at least two hyperlinks are overlapped with the touch point, a target area around the touch point, enlarging the target area, and displaying the enlarged target area; and selecting and visually emphasizing, if only a single one of the hyperlinks in the enlarged target area is overlapped with the touch point, the single hyperlink among the hyperlinks.

Description

    PRIORITY
  • This application is a continuation of, and claims priority under 35 U.S.C. §120 to, U.S. Pat. No. 9,122,382 which issued on Sep. 1, 2015 and claimed priority under 35 U.S.C. §119(a) to an application filed in the Korean Intellectual Property Office on Jan. 13, 2011 and assigned Serial No. 10-2011-003397, the contents of all of which are incorporated herein by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates generally to a method for selecting a target at a touch point on a touch screen of a mobile device and, more particularly, to a method for selecting a user's desired target from among a plurality of targets, such as hyperlinks, Points of Interest (POIs) or local information items, at a touch point on a touch screen.
  • 2. Description of the Related Art
  • Due to the dramatic advancement of electronic communication technologies, users have come to use a variety of functions offered in mobile phones. Unlike traditional mobile phones that allow only the use of predetermined functions, smart phones not only allow users to install and remove various applications, but also permit direct Internet access through wireless networks such as Wireless Fidelity (WiFi). Accordingly, market demands for smart phones have been rapidly increasing.
  • In particular, a touch screen has become one of the essential elements of a smart phone. When a touch event (i.e., a user's touch action on a touch screen through at least one finger or a stylus) is input, a smart phone performs a particular function corresponding to the touch point. However, since the contact area of a touch is relatively large compared with the screen resolution, an undesired target may be mistakenly touched. Avoiding such mistaken touches often requires users to provide a very precise touch. Although a user can first enlarge the screen and then touch the desired target, such an enlargement may require complicated and troublesome manipulations.
  • SUMMARY OF THE INVENTION
  • Accordingly, the present invention is provided to address the above-mentioned problems and/or disadvantages and to offer at least the advantages described below.
  • An aspect of the present invention provides methods for selecting a user's desired target among a plurality of targets at a touch point on a touch screen.
  • According to one aspect of the present invention, provided is a method for selecting a target at a touch point on a touch screen of a mobile device, the method including displaying a web page; when there is a touch on the touch screen, checking the number of hyperlinks overlapped with the touch point in the displayed web page; if two or more hyperlinks are overlapped with the touch point, defining a target area around the touch point, enlarging the target area, and displaying the enlarged target area; and if a single one of the hyperlinks in the enlarged target area is overlapped with the touch point, selecting and visually emphasizing the single hyperlink among the hyperlinks.
  • According to another aspect of the present invention, provided is a method for selecting a target at a touch point on a touch screen of a mobile device, the method including displaying map information; when there is a touch on the touch screen, checking the number of POIs (Points of Interest) overlapped with the touch point in the displayed map information; if two or more POIs are overlapped with the touch point, enlarging and displaying the POIs overlapped with the touch point; if a single one of the enlarged POIs is overlapped with the touch point, selecting the single POI among the enlarged POIs; and setting a destination in a road guide by using the selected POI.
  • According to still another aspect of the present invention, provided is a method for selecting a target at a touch point on a touch screen of a mobile device, the method including displaying local information items on an image in an overlay manner; when there is a touch on the touch screen, checking the number of the local information items overlapped with the touch point; if two or more local information items are overlapped with the touch point, enlarging and displaying the local information items overlapped with the touch point; if a single one of the enlarged local information items is overlapped with the touch point, selecting the single local information item among the enlarged local information items; and displaying detail information of the selected local information item.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other aspects, advantages, and salient features of the invention will become apparent to those skilled in the art from the following detailed description, when taken in conjunction with the accompanying drawings, in which:
  • FIG. 1 is a block diagram illustrating a mobile device in accordance with an embodiment of the present invention;
  • FIGS. 2A and 2B are a flow diagram illustrating a method for selecting a target at a touch point in accordance with an embodiment of the present invention;
  • FIGS. 3 and 4 are screenshots illustrating selection of a hyperlink in the method of FIGS. 2A and 2B in accordance with an embodiment of the present invention;
  • FIGS. 5A and 5B are a flow diagram illustrating a method for selecting a target at a touch point in accordance with another embodiment of the present invention;
  • FIGS. 6 and 7 are screenshots illustrating selection of a point of interest in the method of FIGS. 5A and 5B in accordance with an embodiment of the present invention;
  • FIGS. 8A and 8B are a flow diagram illustrating a method for selecting a target at a touch point in accordance with still another embodiment of the present invention; and
  • FIGS. 9 and 10 are screenshots illustrating selection of local information in the method of FIGS. 8A and 8B in accordance with an embodiment of the present invention.
  • DETAILED DESCRIPTION OF EMBODIMENTS OF THE PRESENT INVENTION
  • Non-limiting embodiments of the present invention are described in detail with reference to the accompanying drawings. In the following description, the matters defined in the description are provided to assist a comprehensive understanding of the present invention, and it is obvious to those of ordinary skill in the art that predetermined modifications or changes of the matters described herein can be made without departing from the scope of the invention.
  • Furthermore, well-known or widely used techniques, elements, structures, and processes may not be described or illustrated in detail to avoid obscuring the essence of the present invention. Although the drawings represent embodiments of the invention, the drawings are not necessarily to scale and certain features may be exaggerated or omitted in order to better illustrate and explain the present invention.
  • FIG. 1 is a block diagram illustrating a mobile device in accordance with an embodiment of the present invention. As shown in FIG. 1, the mobile device may include a wireless communication unit 110, an input unit 120, a sensing unit 130, a memory unit 140, a display unit 150, a camera 160, a Global Positioning System (GPS) receiver 170, and a control unit 180 that generally controls the other units of the mobile terminal.
  • The wireless communication unit 110 includes a mobile communication module (not shown) that wirelessly communicates with a base station (not shown) in order to provide data received from the control unit 180 to the base station or provide data received from the base station to the control unit 180. Additionally, the wireless communication unit 110 may have a WiFi module in order to access a Local Area Network (LAN).
  • The input unit 120 includes a touch screen 121 and a plurality of buttons 122. The input unit 120 outputs, to the control unit 180, a touch event or a key event caused by user input.
  • The sensing unit 130 detects the direction, attitude angle, etc. of the mobile device and outputs detection results to the control unit 180. The sensing unit 130 may have a gyro sensor 131, a geomagnetic sensor 132, and/or an acceleration sensor 133. As is well known in the art, the gyro sensor 131 measures the attitude angles of the mobile device in the X-axis, Y-axis and Z-axis directions, and also detects rotation on each axis, namely rolling on the X-axis, pitching on the Y-axis, and yawing on the Z-axis. The geomagnetic sensor 132 detects the direction of the mobile device and may perform a tilt compensation function for the gyro sensor 131. The acceleration sensor 133 measures the acceleration of the mobile device.
  • The memory unit 140 may be divided into program and data regions (not shown). The program region may store an Operating System (OS), an augmented reality application for representing local information on an image (such as a realistic image, e.g., a digital photograph) in an overlay manner, a road guide application, a mobile web browser, and the like. The data region may store a map database 141, a local information database 142, etc.
  • The GPS receiver 170 offers GPS satellite information such as GPS satellite location, transmission time, reception time, satellite signal strength, etc. to the control unit 180.
  • FIGS. 2A and 2B are a flow diagram illustrating a method for selecting a target at a touch point in accordance with an embodiment of the present invention. Specifically, FIGS. 2A and 2B illustrate a method for selecting a user's desired target among a plurality of hyperlinks overlapped with a touch point on the touch screen and then displaying a web page of the selected target. The process of FIGS. 2A and 2B is performed by the control unit 180. Additionally, FIGS. 3 and 4 are screenshots illustrating selection of a hyperlink in accordance with the method of FIGS. 2A and 2B.
  • In step 201, as shown on FIG. 2A, the control unit 180 receives a request for Internet access and then establishes an Internet connection by controlling the wireless communication unit 110. For example, when a user touches a web browser icon, the touch screen 121 delivers a touch event to the control unit 180. Then, the control unit 180 controls the wireless communication unit 110 and accesses a predetermined web site.
  • In step 202, the control unit 180 controls the display unit 150 to display a web page received through the wireless communication unit 110. In step 203, the control unit 180 determines whether any touch event occurs. If a touch event occurs, the control unit 180 proceeds to steps 204 and 205 and then checks the number of hyperlinks overlapped with a touch point. If no hyperlink is overlapped with a touch point, the control unit 180 proceeds to step 206.
  • In step 206 of FIG. 2A, the control unit 180 determines whether a touch is released. If so, the control unit 180 returns to step 203. However, if a touch is not yet released, namely if a user keeps the finger touched on the touch screen 121 or takes a drag action (i.e., moves the finger across the touch screen while maintaining a touch), a touch event still continues. Therefore, the control unit 180 returns to step 204.
  • If only one hyperlink is overlapped with a touch point as the result of the check performed in steps 204 and 205 of FIG. 2A, the control unit 180 proceeds to step 207. In step 207, the control unit 180 selects and visually emphasizes a touched hyperlink among displayed hyperlinks. For example, by controlling the display unit 150, the control unit 180 makes a clear distinction in size, transparency or color between the touched hyperlink and the other hyperlinks. Next, in step 208, the control unit 180 determines whether a touch is released. If so, the control unit 180 proceeds to step 218 in FIG. 2B (as indicated by the C symbol in both FIGS. 2A and 2B), which is described in detail below.
  • If a touch is not released in step 208 of FIG. 2A, the control unit 180 returns to step 205 and checks again whether a single hyperlink is still overlapped with a touch point. The method returns to step 205 because a drag may cause a touch on several hyperlinks or on no hyperlinks. If a single hyperlink is overlapped with a touch point as the result of check in step 205, namely, if a touched hyperlink only is still touched or if any other single hyperlink is alternatively touched, the control unit 180 proceeds again to step 207. Otherwise the control unit 180 proceeds again to step 206.
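For illustration only (the disclosure itself contains no source code), the overlap check of steps 204 and 205 might be sketched as below. The `Rect` type, the dictionary of hyperlink bounding boxes, and the 20-pixel touch radius are assumptions of this sketch, since the embodiment does not specify how hit testing is performed:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

def touches(rect, cx, cy, r):
    """True if a circular touch region (center cx, cy; radius r) overlaps rect."""
    # Clamp the touch center to the rectangle, then compare the distance from
    # the clamped point back to the center against the touch radius.
    nx = min(max(cx, rect.x), rect.x + rect.w)
    ny = min(max(cy, rect.y), rect.y + rect.h)
    return (nx - cx) ** 2 + (ny - cy) ** 2 <= r ** 2

def overlapped_links(links, cx, cy, r=20.0):
    """Return the names of the hyperlinks overlapped with the touch point."""
    return [name for name, rect in links.items() if touches(rect, cx, cy, r)]
```

A count of zero would send the flow to step 206, a count of one to step 207, and a count of two or more to step 209.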
  • If two or more hyperlinks are overlapped with a touch point 10 as the result of the check in step 204 of FIG. 2A, the control unit 180 proceeds to step 209 and defines a target area 20 to be enlarged around the touch point 10 as shown in FIG. 3. For instance, the control unit 180 defines the target area 20 that surrounds the touch point 10 and contains all hyperlinks overlapped with the touch point 10. Next, the control unit 180 enlarges the target area 20 at a given ratio in step 210 and then, as shown in FIG. 4, displays the enlarged target area 20 together with the touch point 10 in an overlay manner in step 211. Particularly, the touch point is unchanged in size, whereas hyperlinks in the target area are enlarged. Therefore, even though a user does not move the finger, there is a high probability that only one hyperlink will be touched.
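Steps 209 through 211 can be illustrated as follows; this is only a sketch, and the margin, the enlargement ratio, and the choice to scale about the touch point are assumptions not fixed by the disclosure:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

def define_target_area(rects, margin=10.0):
    """Bounding box enclosing every hyperlink overlapped with the touch point."""
    x0 = min(r.x for r in rects) - margin
    y0 = min(r.y for r in rects) - margin
    x1 = max(r.x + r.w for r in rects) + margin
    y1 = max(r.y + r.h for r in rects) + margin
    return Rect(x0, y0, x1 - x0, y1 - y0)

def enlarge_about_point(area, cx, cy, ratio=2.0):
    """Scale the target area about the touch point, which stays fixed in place,
    so the hyperlinks grow while the finger does not have to move."""
    return Rect(cx + (area.x - cx) * ratio,
                cy + (area.y - cy) * ratio,
                area.w * ratio,
                area.h * ratio)
```

Scaling about the touch point rather than the area's center is one way to realize the property noted above: the touch point itself is unchanged while everything around it grows.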
  • Next, the method continues in step 212 of FIG. 2B (as indicated by the B symbol in both FIGS. 2A and 2B), where the control unit 180 checks whether a single hyperlink only is overlapped with the touch point. If two or more hyperlinks are overlapped with the touch point or if no hyperlink is overlapped with the touch point as the result of check in step 212, the control unit 180 proceeds to step 213 and determines whether a touch is released. If so, the method returns to FIG. 2A (as shown by the A symbol in both FIGS. 2A and 2B) and the control unit 180 removes the displayed target area 20 in step 214 and then returns to step 203. If a touch is not yet released, the control unit 180 returns to step 212 in FIG. 2B. However, if the touch point 10 gets out of the target area 20 due to a drag, the control unit 180 does not return to step 212 but proceeds to step 214 in FIG. 2A.
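The drag-out condition at the end of the paragraph above reduces to a simple containment test, sketched here with the target area given as a (left, top, width, height) tuple (an assumption of the sketch):

```python
def inside_target(x, y, area):
    """True while the dragged touch point has not left the enlarged target area."""
    left, top, w, h = area
    return left <= x <= left + w and top <= y <= top + h
```

While this returns True the flow keeps cycling through step 212; once it returns False, the control unit removes the displayed target area as in step 214.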
  • Continuing in FIG. 2B, if only a single hyperlink is determined to be overlapped with the touch point as the result of check in step 212, the control unit 180 selects and visually emphasizes a touched hyperlink among displayed hyperlinks in step 215 as earlier discussed in reference to step 207 in FIG. 2A. Next, in step 216 of FIG. 2B, the control unit 180 determines whether a touch is released.
  • If a touch is not released in step 216 of FIG. 2B, the control unit 180 returns to step 212. If a touch is released, the control unit 180 removes the displayed target area 20 in step 217 and then proceeds to step 218. In step 218, the control unit 180 controls the wireless communication unit 110 to download a web page corresponding to the hyperlink selected in step 215 or 207. Then the control unit 180 controls the display unit 150 to display the downloaded web page.
  • FIGS. 5A and 5B are a flow diagram illustrating a method for selecting a target at a touch point in accordance with another embodiment of the present invention. Specifically, FIGS. 5A and 5B illustrate a method for selecting a user's desired target from among a plurality of Points of Interest (POIs) overlapped with a touch point on the touch screen and then setting the destination in a road guide from the selected target. Herein, POIs refer to major facilities on a map, such as a station, an airport, a terminal, a hotel, a building, a theater, etc. Such POIs are selectively displayed in stages according to the scale of a map. For example, buildings and theaters may only be shown in a map having the scale greater than 1 cm/100 m. The method illustrated in FIGS. 5A and 5B is performed by the control unit 180. Additionally, FIGS. 6 and 7 are screenshots illustrating selection of a point of interest in the method in FIGS. 5A and 5B.
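The staged display of POIs by map scale might be sketched as below. The category thresholds are hypothetical except for the 1 cm/100 m figure taken from the example above:

```python
# Hypothetical minimum scales (screen cm per ground m) at which each POI
# category becomes visible; only the 1/100 figure comes from the text above.
MIN_SCALE = {
    "airport": 0.0,        # always shown
    "station": 1 / 500,    # shown at 1 cm : 500 m or closer
    "building": 1 / 100,   # shown at 1 cm : 100 m or closer
    "theater": 1 / 100,
}

def visible_pois(pois, scale):
    """Select the POIs to draw at the current map scale."""
    return [p for p in pois if scale >= MIN_SCALE.get(p["category"], 0.0)]
```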
  • In step 501 of FIG. 5A, the control unit 180 receives a request for a road guide. For instance, when a user touches a navigation icon, the touch screen 121 delivers a touch event to the control unit 180. Then the control unit 180 controls the GPS receiver 170 and receives GPS satellite information from GPS satellites. Additionally, the control unit 180 measures a user's current position, namely, the location of the mobile device, by using the received GPS satellite information and then proceeds to step 502. Alternatively, the control unit 180 may obtain the location of the mobile device by using WiFi Positioning System (WPS) or Cellular Network Positioning System (CPS) instead of GPS. Since WPS and CPS are well known in the art, detailed descriptions of these systems are omitted for clarity and conciseness.
  • In step 502, the control unit 180 controls the display unit 150 to display a user's current location. Specifically, the control unit 180 searches the map database 141 and checks whether there is map information corresponding to the current location. If map information corresponding to the current location is found, the control unit 180 reads such map information from the map database 141 and then displays the map information. If no map information is found, the control unit 180 controls the wireless communication unit 110 and sends a request for map information to a map provider server. Then the control unit 180 receives the requested map information, displays the received map information on the display unit 150 and also saves the received map information in the map database 141.
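The lookup just described is a cache-aside pattern: consult the local map database first, and fall back to the server only on a miss. A minimal sketch (function and parameter names are assumptions, not names from the disclosure):

```python
def get_map_info(location, map_db, fetch_from_server):
    """Serve map information from the local map database when present;
    otherwise download it from the map provider and cache it for next time."""
    info = map_db.get(location)
    if info is None:
        info = fetch_from_server(location)  # stand-in for the map provider request
        map_db[location] = info             # corresponds to saving in map database 141
    return info
```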
  • Next, in step 503 in FIG. 5A, the control unit 180 receives destination information from a user. For instance, a user may enter destination information through a virtual keyboard displayed on the touch screen 121. Additionally, in step 504, the control unit 180 controls the display unit 150 to display the destination and a surrounding area. Namely, the control unit 180 reads map information corresponding to the destination from the map database 141 and then displays the map information on the display unit 150. If no map information is found, the control unit 180 downloads map information from the server.
  • Next, in step 505, the control unit 180 determines whether any touch event occurs. If a touch event occurs, the control unit 180 proceeds to steps 506 and 507 and then checks the number of POIs overlapped with a touch point. If no POI is overlapped with a touch point, the control unit 180 proceeds to step 508. In step 508, the control unit 180 determines whether a touch is released. If a touch is released, the control unit 180 returns to step 505. However, if a touch is not yet released, the control unit 180 returns to step 506.
  • If only one POI is determined as overlapping with a touch point as the result of check in steps 506 and 507, the control unit 180 proceeds to step 509. In step 509, the control unit 180 selects and visually emphasizes a touched POI among displayed POIs. For instance, by controlling the display unit 150, the control unit 180 performs a clear distinction in size, transparency or color between the touched POI and the other POIs.
  • Next, in step 510 of FIG. 5A, the control unit 180 determines whether a touch is released. If a touch is released, the control unit 180 proceeds to step 520 of FIG. 5B (as indicated by the F symbol in both FIGS. 5A and 5B), which is described in further detail below. If a touch is not released, the control unit 180 returns to step 507 and re-checks whether a single POI is still overlapped with a touch point. This recheck is performed because a drag may cause a touch on several POIs or no POIs. If a single POI is determined as overlapping with a touch point as the result of check in step 507, the control unit 180 returns to step 509. Otherwise, the control unit 180 returns to step 508.
  • If at least two POIs are determined as overlapping with a touch point 30 in step 506, the control unit 180 proceeds to step 511 and defines a target area 40 to be enlarged around the touch point 30 as shown in FIG. 6. Next, the control unit 180 enlarges the target area 40 at a given ratio in step 512 and then, as shown in FIG. 7, displays the enlarged target area 40 together with the touch point 30 as an overlay in step 513 of FIG. 5A.
  • Next, the method continues in step 514 of FIG. 5B (as indicated by the E symbol in both FIGS. 5A and 5B), where the control unit 180 checks whether only a single POI is overlapped with the touch point. If at least two POIs are determined as being overlapped with the touch point or if no POI is overlapped with the touch point, as the result of check in step 514, the control unit 180 proceeds to step 515 of FIG. 5B and determines whether a touch is released. If a touch is released, the method returns to FIG. 5A (as shown by the D symbol in both FIGS. 5A and 5B) where the control unit 180 removes the displayed target area 40 in step 516 and then returns to step 505. If a touch is not yet released in step 515 of FIG. 5B, the control unit 180 returns to step 514. However, if the touch point 30 moves out of the target area 40 due to a drag, the control unit 180 does not return to step 514 of FIG. 5B, but instead proceeds to step 516 of FIG. 5A.
  • Continuing in FIG. 5B, if only a single POI is determined as being overlapped with the touch point, as the result of check in step 514, the control unit 180 selects and visually emphasizes a touched POI among the displayed POIs in step 517, such as previously discussed with respect to step 509 in FIG. 5A. Next, in step 518 of FIG. 5B, the control unit 180 determines whether a touch is released. If a touch is not released, the control unit 180 returns to step 514. If a touch is released, the control unit 180 removes the displayed target area 40 in step 519, and then proceeds to step 520.
  • In step 520 of FIG. 5B, the control unit 180 sets the destination in a road guide by using the POI selected in step 517 or 509. Specifically, the control unit 180 controls the display unit 150 to output a query message about whether to set the destination. If any touch signal indicating a “yes” is received from the touch screen 121 in response to the query message, the control unit 180 sets the destination in a road guide by using the POI selected in step 517 or 509. Next, in step 521, the control unit 180 computes an optimal route from a user's current location to the destination and then displays the optimal route on the display unit 150.
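The disclosure does not specify how the optimal route of step 521 is computed; a common choice is Dijkstra's algorithm over a weighted road graph, sketched here under that assumption:

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm over a road graph {node: [(neighbor, distance), ...]};
    returns (total_distance, path), or (inf, []) when the goal is unreachable."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == goal:
            return dist, path
        if node in visited:
            continue
        visited.add(node)
        for nxt, weight in graph.get(node, []):
            if nxt not in visited:
                heapq.heappush(queue, (dist + weight, nxt, path + [nxt]))
    return float("inf"), []
```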
  • FIGS. 8A and 8B are a flow diagram illustrating a method for selecting a target at a touch point in accordance with still another embodiment of the present invention. Particularly, this embodiment is related to an augmented reality application that represents, through an overlay, local information corresponding to an image taken by the camera 160. Specifically, FIGS. 8A and 8B show a method for selecting a user's desired target from among a plurality of local information items overlapped with a touch point on the touch screen and then displaying the details of the selected target. The process shown in FIGS. 8A and 8B is performed by the control unit 180. Additionally, FIGS. 9 and 10 are screenshots illustrating selection of local information in the method of FIGS. 8A and 8B.
  • In step 801 of FIG. 8A, the control unit 180 receives a request for local information. For instance, when a user touches an augmented reality icon, the touch screen 121 delivers a touch event to the control unit 180. Next, in step 802, the control unit 180 activates the camera 160 to display an image. Also, the control unit 180 activates the GPS receiver 170 and the sensors 131, 132 and 133 and then, by using information received from them, measures the location, direction, and attitude angle of the mobile device.
  • Next, in step 803, the control unit 180 controls the display unit 150 to display local information items as an overlay on the image. Specifically, the control unit 180 retrieves local information items from the local information database 142 in consideration of the location, direction, and attitude angle. If there is no local information item corresponding to the location, direction, and attitude angle in the local information database 142, the control unit 180 downloads necessary local information from a server.
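One plausible way to pick the local information items for the measured location, direction, and attitude angle is to keep only items whose compass bearing from the device falls inside the camera's field of view. The disclosure does not give the selection criteria, so the 60-degree field of view and the item format here are assumptions:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from the device to an item, in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return math.degrees(math.atan2(y, x)) % 360

def items_in_view(items, device_lat, device_lon, heading, fov=60.0):
    """Keep the items whose bearing lies inside the camera's field of view."""
    def angular_diff(a, b):
        return abs((a - b + 180) % 360 - 180)
    return [it for it in items
            if angular_diff(bearing_deg(device_lat, device_lon,
                                        it["lat"], it["lon"]), heading) <= fov / 2]
```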
  • Next, in step 804 of FIG. 8A, the control unit 180 determines whether any touch event occurs. If a touch event occurs, the control unit 180 proceeds to steps 805 and 806 and then checks the number of local information items overlapped with a touch point. If no local information item is overlapped with a touch point, the control unit 180 proceeds to step 807. In step 807, the control unit 180 determines whether a touch is released. If so, the control unit 180 returns to step 804. However, if a touch is not yet released, the control unit 180 returns to step 805.
  • If a single local information item is determined to be overlapped with a touch point, as the result of check in steps 805 and 806, the control unit 180 proceeds to step 808. In step 808, the control unit 180 selects and visually emphasizes a touched local information item among displayed local information items. For instance, by controlling the display unit 150, the control unit 180 makes a clear distinction in size, transparency or color between the touched local information item and the other local information items. Next, in step 809, the control unit 180 determines whether a touch is released. If the touch is released, the control unit 180 proceeds to step 817 in FIG. 8B (as indicated by the I symbol in both FIGS. 8A and 8B), which is described in detail below. If a touch is not released, the control unit 180 returns to step 806 of FIG. 8A and re-checks whether a single local information item is still overlapped with a touch point. If a single item is determined to be overlapped with a touch point, as the result of check in step 806, the control unit 180 proceeds again to step 808. If the single item is not overlapped with the touch point, the control unit 180 proceeds to step 807 as also discussed above.
  • If at least two information items are determined to be overlapped with a touch point 50, as the result of check in step 805, as shown in FIG. 9, the control unit 180 proceeds to step 810, enlarges such local information items, and displays the local information items in the form of icons around the touch point 50 as shown in FIG. 10.
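Arranging the enlarged items as icons around the touch point, as in step 810, can be sketched as an even placement on a circle; the 80-pixel radius and the starting angle are assumptions of the sketch:

```python
import math

def icon_positions(n, cx, cy, radius=80.0):
    """Place n enlarged icons evenly on a circle centered on the touch point."""
    return [(cx + radius * math.cos(2 * math.pi * i / n),
             cy + radius * math.sin(2 * math.pi * i / n))
            for i in range(n)]
```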
  • Next, the method continues in step 811 of FIG. 8B (as indicated by the H symbol in both FIGS. 8A and 8B), where the control unit 180 checks whether only a single local information item is overlapped with the touch point. If at least two local information items are overlapped with the touch point or if no local information item is overlapped with the touch point as the result of check in step 811, the control unit 180 proceeds to step 812 and determines whether a touch is released. If the touch is released in step 812 of FIG. 8B, the method returns to FIG. 8A (as indicated by the G symbol in both FIGS. 8A and 8B), where the control unit 180 removes the displayed icons in step 813 and then returns to step 804 of FIG. 8A. If a touch is not yet released in step 812, the control unit 180 returns to step 811 of FIG. 8B.
  • Continuing in FIG. 8B, if only a single local information item is overlapped with the touch point as the result of check in step 811, the control unit 180 selects and visually emphasizes a touched local information item among displayed items in step 814 as earlier discussed in reference to step 808 in FIG. 8A. Next, in step 815 of FIG. 8B, the control unit 180 determines whether a touch is released. If a touch is not released, the control unit 180 returns to step 811. If a touch is released, the control unit 180 removes the displayed icons in step 816 and then proceeds to step 817.
  • In step 817 of FIG. 8B, the control unit 180 displays the details of the local information item selected in step 814 or 808, such as a distance from a user's current location to the selected local information, a phone number, etc. Specifically, the control unit 180 retrieves detailed information from the local information database 142 and outputs the detailed information to the display unit 150. If there is no detailed information in the local information database 142, the control unit 180 downloads detailed information from a server and outputs the downloaded information to the display unit 150.
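The distance shown in the detail view of step 817 can be derived from the user's coordinates and the item's coordinates with the haversine formula; this is a sketch, since the disclosure does not say which distance computation is used:

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance in meters between two coordinates."""
    R = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))
```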
  • While this invention has been particularly shown and described with reference to an embodiment thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (20)

What is claimed is:
1. A mobile device comprising:
a wireless communication unit;
a touchscreen; and
a control unit electronically connected to the touchscreen and the wireless communication unit,
wherein the control unit is configured to:
display a first web page on the touch screen;
receive a touch input on the touch screen;
if at least two hyperlinks are around a touch point of the received touch input, define a target area enclosing at least a portion of the at least two hyperlinks;
display an enlarged version of the defined target area;
visually emphasize a touched hyperlink in the enlarged version;
select the touched hyperlink to download a web page;
control the wireless communication unit to download a second web page corresponding to the selected hyperlink; and
display the downloaded second web page on the touchscreen.
2. The mobile device of claim 1, wherein the downloading of the second web page is performed in response to releasing the received touch input from the touch screen.
3. The mobile device of claim 1, wherein the control unit is further configured to remove, when the received touch input is released from the touch screen while no one hyperlink in the enlarged version is overlapped with the touch point, the displayed target area.
4. The mobile device of claim 1, wherein the control unit is configured to display the touched hyperlink differently than other hyperlinks in at least one of size, transparency and color.
5. The mobile device of claim 1, wherein the wireless communication unit comprises a communication module for accessing a local area network (LAN).
6. The mobile device of claim 5, wherein the communication module comprises a wireless fidelity (WiFi) module.
7. A method for operating a mobile device having a touch screen, the method comprising:
displaying a first web page on the touch screen;
receiving a touch input on the touch screen;
defining, if at least two hyperlinks are around a touch point of the received touch input, a target area enclosing at least a portion of the at least two hyperlinks;
displaying an enlarged version of the defined target area;
visually emphasizing a touched hyperlink in the enlarged version;
selecting the touched hyperlink to download a web page;
downloading a second web page corresponding to the selected hyperlink; and
displaying the downloaded second web page on the touch screen.
8. The method of claim 7, further comprising:
removing the displayed target area when the received touch input is released from the touch screen while no hyperlink in the enlarged version overlaps the touch point.
9. The method of claim 7, wherein visually emphasizing the touched hyperlink in the enlarged version comprises:
displaying the touched hyperlink differently than other hyperlinks in at least one of size, transparency and color.
10. A mobile device comprising:
a wireless communication unit;
a touch screen; and
a control unit electronically connected to the touch screen and the wireless communication unit,
wherein the control unit is configured to:
display a map on the touch screen;
receive a touch input on the touch screen;
if at least two Points of Interest (POIs) on the displayed map are around a touch point of the received touch input, define a target area enclosing at least a portion of the at least two POIs;
display an enlarged version of the defined target area;
visually emphasize and select a touched POI in the enlarged version; and
set a destination in a road guide according to the selected POI.
11. The mobile device of claim 10, wherein the setting of the destination is performed in response to releasing of the received touch input from the touch screen.
12. The mobile device of claim 10, wherein the control unit is further configured to remove the displayed target area when the received touch input is released from the touch screen while no POI in the enlarged version overlaps the touch point.
13. The mobile device of claim 10, wherein the control unit is further configured to display the touched POI differently than other POIs in at least one of size, transparency and color.
14. The mobile device of claim 10, wherein the wireless communication unit comprises a communication module for accessing a local area network (LAN).
15. The mobile device of claim 14, wherein the communication module comprises a wireless fidelity (WiFi) module.
16. The mobile device of claim 10, further comprising:
a Global Positioning System (GPS) receiver electronically connected to the control unit,
wherein the control unit is further configured to measure a location of the mobile device using information received from the GPS receiver, compute a route from the location to the destination, and display the route on the map.
17. The mobile device of claim 10, wherein the control unit is further configured to measure a location of the mobile device using a wireless fidelity (WiFi) positioning system (WPS) or a cellular network positioning system (CPS), compute a route from the location to the destination, and display the route on the map.
18. A method for operating a mobile device having a touch screen, the method comprising:
displaying a map on the touch screen;
receiving a touch input on the touch screen;
defining, if at least two Points of Interest (POIs) on the displayed map are around a touch point of the received touch input, a target area enclosing at least a portion of the at least two POIs;
displaying an enlarged version of the defined target area;
visually emphasizing and selecting a touched POI in the enlarged version; and
setting a destination in a road guide according to the selected POI.
19. The method of claim 18, further comprising:
removing the displayed target area when the received touch input is released from the touch screen while no POI in the enlarged version overlaps the touch point.
20. The method of claim 18, wherein visually emphasizing the touched POI in the enlarged version comprises:
displaying the touched POI differently than other POIs in at least one of size, transparency and color.
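The selection flow the claims describe (for hyperlinks in claims 1-9 and for POIs in claims 10-20) turns on one step: when two or more targets lie near the touch point, define a target area enclosing them before displaying the enlarged version. The sketch below illustrates that step under assumed pixel coordinates and an assumed nearness radius; `define_target_area` and every name in it are illustrative, not taken from the specification.

```python
def define_target_area(touch, targets, radius=40):
    """Return a bounding box over all targets near the touch point,
    or None when fewer than two targets are nearby (no ambiguity).

    touch:   (x, y) touch-point coordinates in pixels (assumed units)
    targets: list of (x, y) target centers (hyperlinks or POIs)
    radius:  assumed nearness threshold in pixels
    """
    near = [(x, y) for (x, y) in targets
            if (x - touch[0]) ** 2 + (y - touch[1]) ** 2 <= radius ** 2]
    if len(near) < 2:
        return None  # zero or one target: select directly, no enlargement
    xs = [x for x, _ in near]
    ys = [y for _, y in near]
    # The target area encloses at least a portion of each nearby target.
    return (min(xs), min(ys), max(xs), max(ys))
```

In the claimed flow, the device would then render this area enlarged and visually emphasize whichever target the touch point covers; per claims 3, 8, 12, and 19, releasing the touch while no target in the enlarged version overlaps the touch point removes the displayed target area, while otherwise the overlapped target is selected on release (claims 2 and 11).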
US14/840,907 2011-01-13 2015-08-31 Method for selecting target at touch point on touch screen of mobile device Abandoned US20160092061A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/840,907 US20160092061A1 (en) 2011-01-13 2015-08-31 Method for selecting target at touch point on touch screen of mobile device

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2011-0003397 2011-01-13
KR1020110003397A KR20120082102A (en) 2011-01-13 2011-01-13 Method for selecting a target in a touch point
US13/347,290 US9122382B2 (en) 2011-01-13 2012-01-10 Method for selecting target at touch point on touch screen of mobile device
US14/840,907 US20160092061A1 (en) 2011-01-13 2015-08-31 Method for selecting target at touch point on touch screen of mobile device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/347,290 Continuation US9122382B2 (en) 2011-01-13 2012-01-10 Method for selecting target at touch point on touch screen of mobile device

Publications (1)

Publication Number Publication Date
US20160092061A1 true US20160092061A1 (en) 2016-03-31

Family

ID=45607571

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/347,290 Expired - Fee Related US9122382B2 (en) 2011-01-13 2012-01-10 Method for selecting target at touch point on touch screen of mobile device
US14/840,907 Abandoned US20160092061A1 (en) 2011-01-13 2015-08-31 Method for selecting target at touch point on touch screen of mobile device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/347,290 Expired - Fee Related US9122382B2 (en) 2011-01-13 2012-01-10 Method for selecting target at touch point on touch screen of mobile device

Country Status (3)

Country Link
US (2) US9122382B2 (en)
EP (1) EP2477105A3 (en)
KR (1) KR20120082102A (en)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9092134B2 (en) * 2008-02-04 2015-07-28 Nokia Technologies Oy User touch display interface providing an expanded selection area for a user selectable object
US9251144B2 (en) * 2011-10-19 2016-02-02 Microsoft Technology Licensing, Llc Translating language characters in media content
JP6010376B2 (en) * 2012-07-25 2016-10-19 京セラ株式会社 Electronic device, selection program and method
KR101475021B1 (en) * 2012-08-21 2014-12-22 김원섭 Apparatus having touch screen and method for controlling touch screen
KR20140027690A (en) * 2012-08-27 2014-03-07 삼성전자주식회사 Method and apparatus for displaying with magnifying
CN103793164A (en) * 2012-10-31 2014-05-14 国际商业机器公司 Touch screen display processing method and device and browser
EP2755124B1 (en) * 2013-01-15 2019-03-13 BlackBerry Limited Enhanced display of interactive elements in a browser
US9575653B2 (en) 2013-01-15 2017-02-21 Blackberry Limited Enhanced display of interactive elements in a browser
JP5991538B2 (en) * 2013-02-20 2016-09-14 富士ゼロックス株式会社 Data processing apparatus, data processing system, and program
US8812995B1 (en) * 2013-04-10 2014-08-19 Google Inc. System and method for disambiguating item selection
JP6136568B2 (en) * 2013-05-23 2017-05-31 富士通株式会社 Information processing apparatus and input control program
KR20150014084A (en) * 2013-07-29 2015-02-06 삼성전자주식회사 Device based on touch screen and method for controlling object thereof
US9329692B2 (en) 2013-09-27 2016-05-03 Microsoft Technology Licensing, Llc Actionable content displayed on a touch screen
CN103646028B (en) * 2013-11-01 2017-12-08 北京奇虎科技有限公司 Highlight regions acquisition methods and equipment
WO2015089477A1 (en) * 2013-12-13 2015-06-18 AI Squared Techniques for programmatic magnification of visible content elements of markup language documents
US20150234547A1 (en) * 2014-02-18 2015-08-20 Microsoft Technology Licensing, Llc Portals for visual interfaces
US20160018951A1 (en) * 2014-07-17 2016-01-21 Microsoft Corporation Contextual view portals
CN104503660A (en) * 2014-12-18 2015-04-08 厦门美图移动科技有限公司 Icon arranging method, device and mobile terminal
JP2016192111A (en) * 2015-03-31 2016-11-10 パイオニア株式会社 Selection device, selection method, and selection device program
CN105159586B (en) * 2015-08-27 2018-01-23 广东欧珀移动通信有限公司 A kind of alarm clock prompting method and mobile terminal
CN105371850B (en) * 2015-11-17 2018-12-11 广东欧珀移动通信有限公司 A kind of route navigation method and mobile terminal
CN109144355B (en) * 2017-08-15 2022-03-04 北京仁光科技有限公司 Interaction device and interaction system based on optical signals

Citations (1)

Publication number Priority date Publication date Assignee Title
US20100302176A1 (en) * 2009-05-29 2010-12-02 Nokia Corporation Zoom-in functionality

Family Cites Families (16)

Publication number Priority date Publication date Assignee Title
US5119079A (en) * 1990-09-17 1992-06-02 Xerox Corporation Touch screen user interface with expanding touch locations for a reprographic machine
US6856901B2 (en) * 2003-06-02 2005-02-15 Alpine Electronics, Inc. Display method and apparatus for navigation system
JP2006134184A (en) * 2004-11-08 2006-05-25 Honda Access Corp Remote control switch
US7921365B2 (en) * 2005-02-15 2011-04-05 Microsoft Corporation System and method for browsing tabbed-heterogeneous windows
US8527905B2 (en) * 2006-06-07 2013-09-03 International Business Machines Corporation Providing archived web page content in place of current web page content
KR20090000137A (en) * 2007-01-11 2009-01-07 삼성전자주식회사 System and method for navigation of web browser
US8471823B2 (en) * 2007-08-16 2013-06-25 Sony Corporation Systems and methods for providing a user interface
KR20090024541A (en) * 2007-09-04 2009-03-09 삼성전자주식회사 Method for selecting hyperlink and mobile communication terminal using the same
KR20090038540A (en) * 2007-10-16 2009-04-21 주식회사 현대오토넷 Apparatus and method for changing image position on the screen, and nevigation system using the same
RU2495477C2 (en) * 2008-09-10 2013-10-10 Opera Software ASA Method and apparatus for selecting object on display screen
JP4752887B2 (en) * 2008-09-12 2011-08-17 ソニー株式会社 Information processing apparatus, information processing method, and computer program
KR101542495B1 (en) * 2008-12-02 2015-08-06 엘지전자 주식회사 Method for displaying information for mobile terminal and apparatus thereof
US9372614B2 (en) * 2009-07-09 2016-06-21 Qualcomm Incorporated Automatic enlargement of viewing area with selectable objects
US8381118B2 (en) * 2009-10-05 2013-02-19 Sony Ericsson Mobile Communications Ab Methods and devices that resize touch selection zones while selected on a touch sensitive display
US20110106439A1 (en) * 2009-11-04 2011-05-05 In-Tai Huang Method of displaying multiple points of interest on a personal navigation device
US20130047100A1 (en) * 2011-08-17 2013-02-21 Google Inc. Link Disambiguation For Touch Screens

Also Published As

Publication number Publication date
KR20120082102A (en) 2012-07-23
EP2477105A3 (en) 2017-04-05
US9122382B2 (en) 2015-09-01
US20120182237A1 (en) 2012-07-19
EP2477105A2 (en) 2012-07-18

Similar Documents

Publication Publication Date Title
US9122382B2 (en) Method for selecting target at touch point on touch screen of mobile device
US9702721B2 (en) Map service with network-based query for search
US9710554B2 (en) Methods, apparatuses and computer program products for grouping content in augmented reality
US9099056B2 (en) Method of labelling a highly curved path on a map rendered on a wireless communications device
US9273979B2 (en) Adjustable destination icon in a map navigation tool
US9074898B2 (en) Apparatus and method for providing position information service
KR101233534B1 (en) Graphical user interface for presenting location information
KR100985737B1 (en) Method, terminal device and computer-readable recording medium for providing information on an object included in visual field of the terminal device
US20110137561A1 (en) Method and apparatus for measuring geographic coordinates of a point of interest in an image
JP2017536527A (en) Providing in-navigation search results that reduce route disruption
JP2004272217A (en) Map image display controlling method, its program, storage medium for storing the program and electronic equipment
US10094681B2 (en) Controlling a map system to display off-screen points of interest
JP5952667B2 (en) Message management apparatus, message management method, and program
US20120303265A1 (en) Navigation system with assistance for making multiple turns in a short distance
US20140152562A1 (en) Display controller, display system, storage medium and method
CN110619085A (en) Information processing method and device
KR102078859B1 (en) Method and apparatus for providing street view
KR101307349B1 (en) Device and method for displaying locations on a map of mobile terminal
JP6284426B2 (en) Route output device and route output method
JP7258565B2 (en) navigation device
JP6795652B2 (en) Map display system, map display method, program, and route display device
US20210191682A1 (en) Method and system for associating and displaying content and list of contents on dual screen
KR101529398B1 (en) Digital map device and Method for operating thereof
JP2018045139A (en) Electronic device

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION