CN101393509B - Method of displaying planar image - Google Patents
Method of displaying planar image
- Publication number
- CN101393509B CN101393509B CN2008102149388A CN200810214938A CN101393509B CN 101393509 B CN101393509 B CN 101393509B CN 2008102149388 A CN2008102149388 A CN 2008102149388A CN 200810214938 A CN200810214938 A CN 200810214938A CN 101393509 B CN101393509 B CN 101393509B
- Authority
- CN
- China
- Prior art keywords
- dimensional image
- mark
- display
- zone
- input operation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3667—Display of a road map
- G01C21/367—Details, e.g. road map scale, orientation, zooming, illumination, level of detail, scrolling of road map or positioning of current position marker
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
The present invention provides a method of displaying a planar image, in which a planar image and a mark are displayed on a display having an outer periphery. A region whose boundary is spaced from the outer periphery of the display is provided on the display. While the mark is positioned inside the region, the mark is moved with respect to the planar image on the display. When the mark is positioned outside the region, the planar image is scrolled on the display in the direction from the region toward the mark. This method allows the mark to be positioned efficiently.
Description
Technical field
The present invention relates to a method of displaying a two-dimensional image together with a mark in an electronic device.
Background technology
In electronic devices such as mobile phones and on-board navigation systems, coordinate-input devices such as joysticks, touch pads, and touch screens are used to move marks such as cursors and arrows. In these moving operations, it is important that the mark can be moved easily from its current display position to a target position.
Fig. 5 is a schematic diagram of circular touch pad 501 disclosed in Japanese Laid-Open Patent Publication No. 2006-268663. Button S1, which stops the cursor shown on a display, is arranged at the center of circular touch pad 501. On touch pad 501, circular speed-definition regions A1 to A4 are arranged concentrically around button S1, in this order along the radial direction from button S1. The operator touches speed-definition regions A1 to A4 with a finger to move the cursor shown on the display. Speed-definition regions A1 to A4 determine the moving speed and direction of the cursor. The farther the touched speed-definition region is from button S1, the faster the cursor moves. The cursor moves on the display in the direction corresponding to the direction from button S1 toward the touched position in speed-definition regions A1 to A4. The operator stops the cursor by touching button S1 with the finger or by lifting the finger off touch pad 501.
Usually, fine movement of the cursor is performed just before the cursor is positioned at the target position. With conventional touch pad 501, if the cursor overshoots and must be moved back in the opposite direction, the operator has to lift the finger off touch pad 501 and move the pressing position across button S1 to the opposite side of the speed-definition regions. Thus, to position the cursor finely by moving it in the reverse direction, the finger must be lifted off touch pad 501 and placed at a quite different position.
With conventional touch pad 501, moving the cursor finely requires the speed-definition regions to be finely subdivided by moving speed and direction. On an existing PC, precise positioning can be performed with a pointing device such as a mouse or touch pad, whereas coarse movement of the cursor within the screen is performed more easily with a scroll bar than with the pointing device. Therefore, on an existing PC, several different methods must be used to move the cursor, which is inefficient.
Summary of the invention
A two-dimensional image and a mark are displayed on a display having an outer edge. A region having a boundary spaced from the outer edge of the display is set on the display. While the mark is positioned inside the region, the mark is moved with respect to the two-dimensional image. When the mark is positioned outside the region, the two-dimensional image is scrolled in the direction from the region toward the mark.
This method allows the mark to be positioned efficiently.
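As an illustration only (not part of the claimed subject matter), the decision in the summary above — move the mark while it is inside the region, scroll the image toward the mark once it leaves — can be sketched in Python. All names and the circular-region test are assumptions for the sketch:

```python
import math

def frame_action(mark, region_center, region_radius):
    """Per-frame decision of the display method.

    Returns ('move_mark', None) while the mark is inside the region
    (the image stays fixed and input moves the mark), or
    ('scroll', (ux, uy)) with the unit scroll direction from the
    region toward the mark when the mark is outside the region.
    """
    dx = mark[0] - region_center[0]
    dy = mark[1] - region_center[1]
    dist = math.hypot(dx, dy)
    if dist <= region_radius:
        return ('move_mark', None)      # inside region 7: image is fixed
    return ('scroll', (dx / dist, dy / dist))  # outside: scroll toward mark
```

A caller would apply the returned unit direction to the image offset each frame while the mark stays outside the region.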
Description of drawings
Fig. 1 is a perspective view of an electronic device according to an exemplary embodiment of the present invention.
Fig. 2A to Fig. 2F are schematic views illustrating a method of displaying a two-dimensional image according to the embodiment.
Fig. 3A to Fig. 3C are schematic views illustrating another method of displaying a two-dimensional image according to the embodiment.
Fig. 4A to Fig. 4D are schematic views illustrating still another method of displaying a two-dimensional image according to the embodiment.
Fig. 5 is a schematic diagram of a conventional touch pad.
Embodiment
Fig. 1 is a perspective view of electronic device 1001 according to the exemplary embodiment of the present invention. Electronic device 1001 is a mobile phone and includes display 1 and input operation part 2. Input operation part 2 consists of navigation key part 3 and numeric key part 4. Input operation part 2 can be implemented with various coordinate-input devices, for example an analog input device of the resistive type or the capacitive type arranged under navigation key part 3 and numeric key part 4. In electronic device 1001 according to the embodiment, input operation part 2 is implemented with a capacitive touch pad.
A capacitive sensor is arranged under numeric key part 4. When the operator touches numeric key part 4 with operating body 1002, such as a finger, which is an electric conductor, the capacitance of the sensor changes as the position of operating body 1002 changes. Controller 51 detects the touch position of operating body 1002 from this change in capacitance. Since coordinates can be input by touching with operating body 1002, the operator can operate electronic device 1001 intuitively while watching display 1.
Region 7, spaced from outer edge 1A of display 1, is provided near the center of display 1. Controller 51 may display outer edge 7A of region 7 as a boundary line on display 1 so that the operator can visually recognize region 7. Controller 51 may instead change the color tone, gradation, or brightness of the background inside and outside region 7 so that the operator can visually recognize region 7. Although region 7 is circular, it may have another shape, such as a polygon, e.g. a quadrangle or an octagon.
The operation of electronic device 1001 according to the embodiment will be described below. Fig. 2A to Fig. 2F show display area 5 of two-dimensional image 52 displayed on display 1.
As shown in Fig. 2A, while mark 6 is inside region 7, display area 5 shown on display 1 is fixed with respect to two-dimensional image 52 and display 1. At this time, controller 51 moves mark 6 according to coordinates input from input operation part 2. Controller 51 may position mark 6 inside region 7, as shown in Fig. 2A, when electronic device 1001 is started. Controller 51 may likewise position mark 6 inside region 7 in response to a key operation of navigation key part 3.
When the operator inputs coordinates by touching numeric key part 4 of input operation part 2 with a finger (operating body 1002), controller 51 moves mark 6 out of region 7 with respect to two-dimensional image 52, as shown in Fig. 2B. At the moment mark 6 leaves region 7, controller 51 scrolls two-dimensional image 52 in direction 91A from predetermined position 7B in region 7, such as the center of region 7, toward mark 6; that is, controller 51 moves two-dimensional image 52 in the opposite direction 91B on display 1. In other words, display area 5 moves on two-dimensional image 52 in the same direction as mark 6.
As shown in Fig. 2C, the larger distance L1 between mark 6 and region 7 (position 7B) is, the faster controller 51 scrolls two-dimensional image 52. Controller 51 may display target position 8 on display 1 so that the operator can search for target position 8.
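The speed relation above — faster scrolling for larger distance L1 — can be sketched as follows. The linear law and the constants are assumptions for illustration; the description only requires that a larger L1 give a larger speed:

```python
def scroll_speed(distance_l1, region_radius, base_speed=120.0, gain=3.0):
    """Scrolling speed (pixels per second) of two-dimensional image 52.

    Zero while the mark is still inside region 7; otherwise grows
    linearly with how far distance L1 exceeds the region radius.
    base_speed and gain are illustrative tuning constants.
    """
    overshoot = distance_l1 - region_radius
    if overshoot <= 0:
        return 0.0                      # mark inside region: no scrolling
    return base_speed + gain * overshoot
```

Decelerating the scroll as the mark approaches region 7 (Fig. 2D) falls out of the same formula, since L1 shrinks as the mark is brought back.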
When two-dimensional image 52 is scrolled, target position 8 moves toward region 7, as shown in Fig. 2D. When target position 8 approaches region 7, the operator operates input operation part 2 to bring mark 6 closer to region 7, and controller 51 reduces the scrolling speed of two-dimensional image 52. In this way, moving mark 6 away from predetermined position 7B in region 7 scrolls two-dimensional image 52 at high speed, allowing coarse positioning of mark 6. Predetermined position 7B is preferably the center of the circle of region 7, but is not limited to this.
From the screen shown in Fig. 2D, two-dimensional image 52 is scrolled at a lower speed, and, as shown in Fig. 2E, controller 51 places target position 8 inside region 7 in response to the operator's operation on input operation part 2. When target position 8 is placed inside region 7, the operator places mark 6 inside region 7, and controller 51 stops scrolling two-dimensional image 52.
Thereafter, as shown in Fig. 2F, the operator operates input operation part 2 to position mark 6 on target position 8. Since display area 5 stays still with respect to two-dimensional image 52 as long as mark 6 is inside region 7, the operator can position mark 6 on target position 8 easily and very finely.
Although mark 6 is an arrow-shaped pointer in the embodiment, the same effect can be obtained when the mark is a point, such as a cursor, or a shape that can substantially be regarded as a point.
Thus, bringing mark 6 shown in Fig. 2B, which has started the scrolling of two-dimensional image 52, closer to region 7 makes controller 51 scroll two-dimensional image 52 more slowly. Then, as shown in Fig. 2C, moving mark 6 away from region 7 increases the scrolling speed of two-dimensional image 52. When target position 8 has entered display area 5 and is shown on display 1, mark 6 is brought close to region 7, as shown in Fig. 2D, and the scrolling speed of two-dimensional image 52 decreases. Through these operations, the operator can intuitively scroll two-dimensional image 52 and position mark 6 on target position 8.
In electronic device 1001 according to the embodiment, display 1 has a rectangular shape including short side 101 and long side 102. Long side 102 is longer than short side 101. The diameter of circular region 7 is not less than 20% and not more than 50% of the length of short side 101 of display 1, and is more preferably 30% of short side 101. This ensures a sufficient operating range for moving mark 6 both inside and outside region 7, providing easy-to-operate electronic device 1001. When region 7 is polygonal, the same effect can be obtained by giving the polygon a width of the same size.
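The sizing rule above (region diameter between 20% and 50% of the short side, preferably 30%) can be expressed as a small helper; the function name and the pixel-based interface are assumptions:

```python
def region_diameter(short_side_px, fraction=0.30):
    """Diameter of circular region 7 in display pixels.

    The description specifies a diameter of 20%-50% of the display's
    short side, with 30% preferred; fractions outside that band are
    rejected here to make the constraint explicit.
    """
    if not 0.20 <= fraction <= 0.50:
        raise ValueError("region diameter must be 20%-50% of the short side")
    return short_side_px * fraction
```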
The operator moves the finger (operating body 1002) on input operation part 2 from the position where contact starts to a target position while keeping contact. Controller 51 detects the position where operating body 1002 starts contact and the position where operating body 1002 arrives. Controller 51 calculates these position coordinates as absolute coordinates referenced to input operation part 2. Then, controller 51 converts the absolute coordinates of the arrival position of operating body 1002 into relative coordinates referenced to the position where operating body 1002 started contact. Thus, the operator can move mark 6 in the same way regardless of where contact with input operation part 2 starts. That is, regardless of the initial contact position on input operation part 2, the operator can position mark 6 easily and finely while watching mark 6. In this way, controller 51 may move mark 6 on display 1 according to the absolute coordinates of the position of input operation part 2 touched by operating body 1002, or may convert the absolute coordinates into relative coordinates and move mark 6 on display 1 according to the relative coordinates.
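The absolute-to-relative conversion described above can be sketched as a small state holder; the class and method names are assumptions, not the patent's terminology:

```python
class RelativeTracker:
    """Converts absolute touch-pad coordinates into coordinates relative
    to the first contact point, so mark 6 moves the same way regardless
    of where operating body 1002 first touches input operation part 2.
    """

    def __init__(self):
        self.origin = None              # set on first contact

    def touch(self, abs_x, abs_y):
        """Report an absolute position; returns the relative displacement."""
        if self.origin is None:
            self.origin = (abs_x, abs_y)    # first contact becomes the origin
        return (abs_x - self.origin[0], abs_y - self.origin[1])

    def release(self):
        """Finger lifted: the next touch re-anchors the origin."""
        self.origin = None
```

A controller would add each returned displacement to the mark's current display position.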
Fig. 3A to Fig. 3C are schematic views illustrating another method of displaying two-dimensional image 52 according to the embodiment.
In Fig. 3A, mark 6 is inside region 7, and controller 51 keeps two-dimensional image 52 still with respect to display 1. When the operator touches input operation part 2 with operating body 1002 and moves it, controller 51 moves mark 6 out of region 7, as shown in Fig. 3B. At this time, controller 51 scrolls two-dimensional image 52 in direction 91A from predetermined position 7B in region 7 toward mark 6, i.e., moves two-dimensional image 52 in direction 91B opposite to direction 91A, while displaying two-dimensional image 52 at a reduced scale during the scrolling.
As shown in Fig. 3C, the larger distance L1 between region 7 (position 7B) and mark 6 is, the larger the reduction ratio of two-dimensional image 52 displayed on display 1, that is, the larger the ratio of the area of display area 5 to the area of two-dimensional image 52. This enlarges the range of two-dimensional image 52 shown on display 1, allowing the operator to move mark 6 efficiently. Since distance L1 between region 7 (position 7B) and mark 6 corresponds to the reduction ratio, the operator can move mark 6 intuitively.
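The distance-to-reduction mapping of Fig. 3A to Fig. 3C can be sketched as below. The linear mapping, the constants, and the lower scale bound are assumptions; the description only requires that a larger L1 give a larger reduction:

```python
def reduction_scale(distance_l1, region_radius, k=0.004, min_scale=0.25):
    """Display scale for two-dimensional image 52.

    1.0 (full size) while mark 6 is inside region 7; shrinks linearly
    as distance L1 grows beyond the region radius, clamped at an
    illustrative minimum scale so the map never vanishes entirely.
    """
    overshoot = max(0.0, distance_l1 - region_radius)
    return max(min_scale, 1.0 - k * overshoot)
```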
Fig. 4A to Fig. 4C are schematic views illustrating still another method of displaying two-dimensional image 52 according to the embodiment.
In Fig. 4A, mark 6 is inside region 7, and controller 51 keeps two-dimensional image 52 still with respect to display 1. Controller 51 displays solid closed-loop line 9, which coincides with outer edge 7A of region 7, on display 1.
As shown in Fig. 4B and Fig. 4C, controller 51 moves mark 6 from the inside of region 7 to the outside in response to the operator's operation on input operation part 2. When mark 6 is outside region 7, controller 51 changes the shape of closed-loop line 9. At this time, closed-loop line 9 has peripheral portion 10A, which coincides with outer edge 7A of region 7, and protruding portion 10B, which is connected to peripheral portion 10A and protrudes from outer edge 7A. Front end 110B of protruding portion 10B is located at or near mark 6. As mark 6 moves, the length of protruding portion 10B of closed-loop line 9 changes.
Fig. 4D is an enlarged view of closed-loop line 9 displayed on display 1. As shown in Fig. 4B and 4C, even when closed-loop line 9 has peripheral portion 10A and protruding portion 10B, the shape of region 7 itself, i.e., outer edge 7A, does not change. As shown in Fig. 4A, when mark 6 is inside region 7, outer edge 7A of region 7 coincides with closed-loop line 9. Thus, while two-dimensional image 52 is scrolled, the operator can operate input operation part 2 while visually confirming the scrolling speed and direction.
As shown in Fig. 4B and Fig. 4C, mark 6 is moved to a certain position P0 outside region 7 with operating body 1002 (Fig. 1), such as a finger. Then, when the operator lifts operating body 1002 off input operation part 2 while mark 6 is at position P0, controller 51 may move the mark to predetermined position 7C, such as the center of region 7. Thus, the operator can easily repeat operations on input operation part 2 without having to return mark 6 into region 7 manually.
Closed-loop line 9 shown in Fig. 4A to Fig. 4D can be applied both to the display method using scrolling of two-dimensional image 52 shown in Fig. 2A to Fig. 2D and to the display method using reduced display of two-dimensional image 52 shown in Fig. 3A to Fig. 3C. Thus, the operator can operate input operation part 2 intuitively while watching closed-loop line 9.
When controller 51 moves mark 6 to predetermined position 7C in region 7, controller 51 may also scroll two-dimensional image 52 so that the position of two-dimensional image 52 under mark 6 moves to predetermined position 7C of region 7. That is, when the operator touches input operation part 2 with operating body 1002, controller 51 moves mark 6 to a certain position P0 of two-dimensional image 52. Then, when the operator lifts operating body 1002 off input operation part 2 while mark 6 is at position P0, controller 51 scrolls two-dimensional image 52 so that position P0 of two-dimensional image 52 moves to predetermined position 7C in region 7. The movement of mark 6 to predetermined position 7C may be performed simultaneously with the scrolling of two-dimensional image 52 that moves position P0 to predetermined position 7C. The operator moves mark 6 toward target position 8. By lifting operating body 1002, such as a finger, off input operation part 2 during the movement of mark 6, mark 6 and the position of two-dimensional image 52 under mark 6 move together to predetermined position 7C of region 7, so the operator can move mark 6 to target position 8 easily. Predetermined position 7C may be the same as or different from predetermined position 7B shown in Fig. 2B.
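The release behavior described above — mark 6 and the image point under it snap together to predetermined position 7C — can be sketched as a single displacement applied to both; the function name and the choice of the region center for 7C are assumptions:

```python
def on_release(mark, image_offset, position_7c):
    """Handle the operating body leaving the input operation part.

    Moves the mark to predetermined position 7C and scrolls the image
    by the same displacement, so the point of two-dimensional image 52
    that was under the mark at P0 arrives at 7C with it.
    Returns (new_mark, new_image_offset).
    """
    dx = position_7c[0] - mark[0]       # displacement from P0 to 7C
    dy = position_7c[1] - mark[1]
    new_offset = (image_offset[0] + dx, image_offset[1] + dy)
    return position_7c, new_offset
```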
The above movement of mark 6 and two-dimensional image 52 to region 7 can be applied both to the display method using scrolling of two-dimensional image 52 shown in Fig. 2A to Fig. 2D and to the display method using reduced display of two-dimensional image 52 shown in Fig. 3A to Fig. 3C. Thus, the operator can move mark 6 to target position 8 easily.
In the above description, electronic device 1001 is a mobile phone and two-dimensional image 52 is a map. In the embodiment, electronic device 1001 may be another electronic device having display 1, such as a PC or an on-board navigation system, and two-dimensional image 52 may be another two-dimensional image, such as a page shown by a general web browser. With electronic device 1001 according to the embodiment, the operator can use scrolling of two-dimensional image 52 and movement of mark 6 simultaneously, and can easily position mark 6 on target position 8 of two-dimensional image 52.
In the above description, input operation part 2 is a capacitive touch pad, but the same effect can be obtained with another coordinate-input device, such as a resistive touch screen.
Claims (19)
1. A method of displaying a two-dimensional image, comprising:
a step of displaying a two-dimensional image and a mark in a display area on a display having an outer edge;
a step of setting, on said display, a region having a boundary spaced from said outer edge of said display;
a step of moving said mark with respect to said two-dimensional image on said display while said mark is positioned inside said region; and
a step of scrolling said two-dimensional image on said display in a direction from said region toward said mark when said mark is positioned outside said region.
2. The method of displaying a two-dimensional image according to claim 1, wherein
the step of scrolling said two-dimensional image on said display in the direction from said region toward said mark comprises: a step of scrolling said two-dimensional image in a direction from a predetermined position in said region toward said mark at a speed determined by a distance between the predetermined position in said region and said mark, and
the larger said distance between said predetermined position and said mark is, the larger said speed is.
3. The method of displaying a two-dimensional image according to claim 1, wherein
the step of scrolling said two-dimensional image on said display in the direction from said region toward said mark comprises: a step of scrolling said two-dimensional image on said display in the direction from said region toward said mark while displaying said two-dimensional image at a reduced scale on said display.
4. The method of displaying a two-dimensional image according to claim 3, wherein
the step of scrolling said two-dimensional image on said display in the direction from said region toward said mark while displaying said two-dimensional image at a reduced scale comprises: a step of displaying said two-dimensional image at a reduction ratio determined by a distance between a predetermined position in said region and said mark, and
the larger said distance between said predetermined position and said mark is, the larger said reduction ratio is.
5. The method of displaying a two-dimensional image according to claim 1, further comprising:
a step of calculating absolute coordinates of a position of an input operation part touched by an operating body,
wherein the step of moving said mark comprises a step of moving said mark on said display according to said absolute coordinates.
6. The method of displaying a two-dimensional image according to claim 5, further comprising:
a step of converting said absolute coordinates into relative coordinates,
wherein the step of moving said mark on said display according to said absolute coordinates comprises: a step of moving said mark on said display according to said relative coordinates.
7. The method of displaying a two-dimensional image according to claim 5, further comprising, after the step of moving said mark on said display according to said absolute coordinates of said position of said input operation part touched by said operating body:
a step of moving said mark to a predetermined position in said region when said operating body is released from said input operation part.
8. The method of displaying a two-dimensional image according to claim 7, wherein
the step of moving said mark on said display according to said absolute coordinates of said position of said input operation part touched by said operating body comprises: a step of positioning said mark at a certain position of said two-dimensional image by touching said input operation part with said operating body,
the method further comprising, after the step of positioning said mark at said certain position of said two-dimensional image by touching said input operation part with said operating body: a step of scrolling said two-dimensional image so that said certain position of said two-dimensional image moves to said predetermined position in said region when said operating body is released from said input operation part.
9. The method of displaying a two-dimensional image according to claim 8, wherein
the step of moving said mark to said predetermined position in said region is performed simultaneously with the step of scrolling said two-dimensional image so that said certain position of said two-dimensional image moves to said predetermined position in said region.
10. The method of displaying a two-dimensional image according to claim 1, wherein
the step of displaying said two-dimensional image and said mark on said display comprises: a step of displaying, on said display, a closed-loop line coinciding with said outer edge of said region, said two-dimensional image, and said mark.
11. The method of displaying a two-dimensional image according to claim 5, wherein
the step of moving said mark on said display according to said absolute coordinates of said position of said input operation part touched by said operating body comprises: a step of positioning said mark at a certain position of said two-dimensional image by touching said input operation part with said operating body,
the method further comprising, after the step of positioning said mark at said certain position of said two-dimensional image by touching said input operation part with said operating body: a step of scrolling said two-dimensional image so that said certain position of said two-dimensional image moves to a predetermined position in said region when said operating body is released from said input operation part.
12. The method of displaying a two-dimensional image according to claim 10, wherein
said closed-loop line is a solid line.
13. The method of displaying a two-dimensional image according to claim 1, wherein
the step of displaying said two-dimensional image and said mark on said display comprises:
a step of displaying, when said mark is positioned inside said region, a first closed-loop line coinciding with said outer edge of said region, said two-dimensional image, and said mark on said display; and
a step of displaying, when said mark is positioned outside said region, a second closed-loop line, said two-dimensional image, and said mark on said display,
wherein said second closed-loop line has:
a peripheral portion coinciding with said outer edge of said region; and
a protruding portion having a front end located near said mark, protruding from said outer edge of said region, and connected to said peripheral portion.
14. The method of displaying a two-dimensional image according to claim 13, wherein
said first closed-loop line and said second closed-loop line are solid lines.
15. The method of displaying a two-dimensional image according to claim 1, wherein
said outer edge of said display has a rectangular shape including a long side and a short side, and
the size of said region is not less than 20% and not more than 50% of the length of said short side of said display.
16. The method of displaying a two-dimensional image according to claim 15, wherein
said long side of said rectangular shape of said outer edge of said display is longer than said short side.
17. The method of displaying a two-dimensional image according to claim 1, wherein
said outer edge of said region is circular or polygonal.
18. The method of displaying a two-dimensional image according to claim 1, wherein
the step of moving said mark further comprises: a step of positioning said mark, inside said region, at a target position of said two-dimensional image.
19. The method of displaying a two-dimensional image according to claim 1, wherein
said region is positioned at the center of said display.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2007244834A JP5003377B2 (en) | 2007-09-21 | 2007-09-21 | Mark alignment method for electronic devices |
JP2007-244834 | 2007-09-21 | ||
JP2007244834 | 2007-09-21 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN101393509A CN101393509A (en) | 2009-03-25 |
CN101393509B true CN101393509B (en) | 2012-06-27 |
Family
ID=40473047
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2008102149388A Expired - Fee Related CN101393509B (en) | 2007-09-21 | 2008-08-29 | Method of displaying planar image |
Country Status (3)
Country | Link |
---|---|
US (1) | US8196060B2 (en) |
JP (1) | JP5003377B2 (en) |
CN (1) | CN101393509B (en) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2006099395A2 (en) * | 2005-03-11 | 2006-09-21 | Adobe Systems, Inc. | System and method for displaying information using a compass |
US20090037840A1 (en) * | 2007-08-03 | 2009-02-05 | Siemens Medical Solutions Usa, Inc. | Location Determination For Z-Direction Increments While Viewing Medical Images |
TWI368161B (en) * | 2007-12-21 | 2012-07-11 | Htc Corp | Electronic apparatus and input interface thereof |
JP2012093951A (en) * | 2010-10-27 | 2012-05-17 | Sony Corp | Image processing device, image processing system, image processing method, and program |
CN103020057A (en) * | 2011-09-21 | 2013-04-03 | 幻音科技(深圳)有限公司 | Method and device for displaying RSS (really simple syndication) abstract by windowing |
DE102013000272A1 (en) * | 2013-01-09 | 2014-07-10 | Daimler Ag | A method of moving an image content displayed on a display device of a vehicle, a vehicle operating and display device, and a computer program product |
US9424358B2 (en) * | 2013-08-16 | 2016-08-23 | International Business Machines Corporation | Searching and classifying information about geographic objects within a defined area of an electronic map |
CN105518415A (en) * | 2014-10-22 | 2016-04-20 | 深圳市大疆创新科技有限公司 | Flight path setting method and apparatus |
USD768702S1 (en) | 2014-12-19 | 2016-10-11 | Amazon Technologies, Inc. | Display screen or portion thereof with a graphical user interface |
JP7252729B2 (en) | 2018-10-18 | 2023-04-05 | キヤノン株式会社 | Image processing device, image processing method |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1517677A (en) * | 2003-01-06 | 2004-08-04 | | Navigation device |
CN1670680A (en) * | 2004-03-18 | 2005-09-21 | International Business Machines Corporation | Method and apparatus for two-dimensional scrolling in a graphical display window |
CN101014095A (en) * | 2006-01-31 | 2007-08-08 | Canon Inc. | Method for displaying an identified region together with an image, and image pick-up apparatus |
Family Cites Families (42)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5337402A (en) * | 1986-06-12 | 1994-08-09 | Keiji Kitagawa | Graphic data processing apparatus using displayed graphics for application program selection |
JPH02188790A (en) * | 1989-01-18 | 1990-07-24 | Canon Inc | Information display device |
JPH02287391A (en) * | 1989-04-27 | 1990-11-27 | Toshiba Corp | Graphic display device |
JPH0546349A (en) * | 1991-08-14 | 1993-02-26 | Sharp Corp | Information processor |
JPH07280577A (en) * | 1994-04-05 | 1995-10-27 | Sumitomo Electric Ind Ltd | Map scrolling method in navigation system |
US5959628A (en) * | 1994-06-28 | 1999-09-28 | Libera, Inc. | Method for providing maximum screen real estate in computer controlled display systems |
US5528260A (en) * | 1994-12-22 | 1996-06-18 | Autodesk, Inc. | Method and apparatus for proportional auto-scrolling |
US5805165A (en) * | 1995-08-31 | 1998-09-08 | Microsoft Corporation | Method of selecting a displayed control item |
JPH1031477A (en) * | 1996-07-15 | 1998-02-03 | Kobe Nippon Denki Software Kk | Method and device for image display |
US6351273B1 (en) * | 1997-04-30 | 2002-02-26 | Jerome H. Lemelson | System and methods for controlling automatic scrolling of information on a display or screen |
US6369837B1 (en) * | 1998-07-17 | 2002-04-09 | International Business Machines Corporation | GUI selector control |
US6803931B1 (en) * | 1999-11-04 | 2004-10-12 | Kendyl A. Roman | Graphical user interface including zoom control box representing image and magnification of displayed image |
JP3949862B2 (en) * | 1999-12-22 | 2007-07-25 | 富士通テン株式会社 | Navigation device |
US6577296B2 (en) * | 2000-11-14 | 2003-06-10 | Vega Vista, Inc. | Fixed cursor |
US6661409B2 (en) * | 2001-08-22 | 2003-12-09 | Motorola, Inc. | Automatically scrolling handwritten input user interface for personal digital assistants and the like |
US7154480B2 (en) * | 2002-04-30 | 2006-12-26 | Kazuho Iesaka | Computer keyboard and cursor control system with keyboard map switching system |
US7958455B2 (en) * | 2002-08-01 | 2011-06-07 | Apple Inc. | Mode activated scrolling |
JP2004271439A (en) * | 2003-03-11 | 2004-09-30 | Denso Corp | Operation system and cursor controller unit |
JP4215549B2 (en) * | 2003-04-02 | 2009-01-28 | 富士通株式会社 | Information processing device that operates in touch panel mode and pointing device mode |
JP2005044241A (en) * | 2003-07-24 | 2005-02-17 | Nec Corp | Pointing device notification system and method |
JP2005275602A (en) * | 2004-03-23 | 2005-10-06 | Hitachi Software Eng Co Ltd | Facility management mapping system |
US20050223343A1 (en) * | 2004-03-31 | 2005-10-06 | Travis Amy D | Cursor controlled shared display area |
US7178111B2 (en) * | 2004-08-03 | 2007-02-13 | Microsoft Corporation | Multi-planar three-dimensional user interface |
US7434173B2 (en) * | 2004-08-30 | 2008-10-07 | Microsoft Corporation | Scrolling web pages using direct interaction |
JP2006146556A (en) * | 2004-11-19 | 2006-06-08 | Nintendo Co Ltd | Image display processing program and image display processing device |
US7796116B2 (en) * | 2005-01-12 | 2010-09-14 | Thinkoptics, Inc. | Electronic equipment for handheld vision based absolute pointing system |
US20060164396A1 (en) * | 2005-01-27 | 2006-07-27 | Microsoft Corporation | Synthesizing mouse events from input device events |
WO2006099395A2 (en) * | 2005-03-11 | 2006-09-21 | Adobe Systems, Inc. | System and method for displaying information using a compass |
JP2006268663A (en) | 2005-03-25 | 2006-10-05 | Sharp Corp | Cursor movement control device, cursor movement control method, program and recording medium |
US7355595B2 (en) * | 2005-04-15 | 2008-04-08 | Microsoft Corporation | Tactile device for scrolling |
JP2006350535A (en) * | 2005-06-14 | 2006-12-28 | Elan Microelectronics Corp | Touch panel equipped with smart type autoscroll function, and its control method |
JP4619882B2 (en) * | 2005-07-12 | 2011-01-26 | 株式会社東芝 | Mobile phone and remote control method thereof |
US7542845B2 (en) * | 2005-07-29 | 2009-06-02 | Microsoft Corporation | Information navigation interface |
TW200717292A (en) * | 2005-10-25 | 2007-05-01 | Elan Microelectronics Corp | Window-moving method with a variable reference point |
US7782296B2 (en) * | 2005-11-08 | 2010-08-24 | Microsoft Corporation | Optical tracker for tracking surface-independent movements |
US7847754B2 (en) * | 2006-04-27 | 2010-12-07 | Kabushiki Kaisha Sega | Image processing program and image display device |
US8677257B2 (en) * | 2006-08-04 | 2014-03-18 | Apple Inc. | Granular graphical user interface element |
US8277316B2 (en) * | 2006-09-14 | 2012-10-02 | Nintendo Co., Ltd. | Method and apparatus for using a common pointing input to control 3D viewpoint and object targeting |
US7987423B2 (en) * | 2006-10-11 | 2011-07-26 | Hewlett-Packard Development Company, L.P. | Personalized slide show generation |
KR20100005152A (en) * | 2007-05-30 | 2010-01-13 | 가부시키가이샤 나비타이무쟈판 | Map display system, map display, and map display method |
US9074907B2 (en) * | 2007-07-12 | 2015-07-07 | Alpine Electronics, Inc. | Navigation method and system for selecting and visiting scenic places on selected scenic byway |
JPWO2009016693A1 (en) * | 2007-07-27 | 2010-10-07 | 株式会社ナビタイムジャパン | Map display system, map display device, and map display method |
2007
- 2007-09-21 JP JP2007244834A patent/JP5003377B2/en not_active Expired - Fee Related
2008
- 2008-08-29 CN CN2008102149388A patent/CN101393509B/en not_active Expired - Fee Related
- 2008-09-19 US US12/234,058 patent/US8196060B2/en not_active Expired - Fee Related
Non-Patent Citations (1)
Title |
---|
JP Laid-Open Publication 2004-038603 A, 2004.02.05 |
Also Published As
Publication number | Publication date |
---|---|
JP2009075909A (en) | 2009-04-09 |
CN101393509A (en) | 2009-03-25 |
JP5003377B2 (en) | 2012-08-15 |
US8196060B2 (en) | 2012-06-05 |
US20090083659A1 (en) | 2009-03-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN101393509B (en) | Method of displaying planar image | |
CN101606120B (en) | Control device, input device, control system, control method, and hand-held device | |
CN104737221B (en) | Information display device and display information operation method | |
US8199111B2 (en) | Remote input device and electronic apparatus using the same | |
US20040196267A1 (en) | Information processing apparatus operating in touch panel mode and pointing device mode | |
CN101231564A (en) | Apparatus and method for improvement of usability of touch screen | |
CN101178631A (en) | Computer system and method thereof | |
JP2005234291A (en) | Display apparatus and display method | |
JP6429886B2 (en) | Touch control system and touch control method | |
CN103003787B (en) | For the method and apparatus providing user interface | |
US20080158249A1 (en) | Method for Displaying Graphic Objects and Communications Device | |
CN107209637B (en) | Graphical interface and method for managing the same during touch selection of displayed elements | |
CN102750079A (en) | Terminal device, object control method, and program | |
CN104736969A (en) | Information display device and display information operation method | |
CN101726293B (en) | Navigation apparatus | |
CN102934067A (en) | Information processing system, operation input device, information processing device, information processing method, program and information storage medium | |
JP2015170282A (en) | Operation device for vehicle | |
CN102472626B (en) | Map display device | |
JP3559951B2 (en) | Navigation device | |
CN102810047A (en) | Apparatus and method for browsing a map displayed on a touch screen | |
CN103186285A (en) | Operation input system | |
EP2787422B1 (en) | Information processing device | |
US20150009136A1 (en) | Operation input device and input operation processing method | |
CN106020625A (en) | Interactive system and method for controlling vehicle application through same | |
EP2988194B1 (en) | Operating device for vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20120627 Termination date: 20160829 |