WO2010008088A1 - Information processing apparatus, storage medium having program recorded thereon, and object movement method - Google Patents
Information processing apparatus, storage medium having program recorded thereon, and object movement method Download PDF Info
- Publication number
- WO2010008088A1 (PCT/JP2009/063135, JP2009063135W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- detected
- mentioned
- displayed
- area
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- the present invention relates to an information processing apparatus having a contact detection unit capable of detecting a plurality of contact points, a storage medium storing a program used in such an information processing apparatus, and an object moving method.
- the information processing apparatus is provided with a so-called touch panel or touch screen.
- contact detection units capable of detecting a plurality of contact points have been developed in place of conventional units that can detect only a single contact point.
- touch displays and touch devices that can detect a plurality of contact points are described, for example, in Japanese Patent Application Laid-Open Nos. 2002-304256 and 2007-279638, respectively.
- an information processing apparatus with a touch panel can be operated intuitively by the user. While this gives the user an intuitive feel, an excessive burden is placed on the user when the settings of the device do not match the user's intuitive image of the operation (such as moving part of the screen). For example, an information processing apparatus that scrolls the display content in response to a drag (tracing the touch panel while touching it) or that reduces the display content in response to a flick offers good operability for such work.
- for detailed work, however, such as when only a desired part of the display information is to be moved, or when the work is performed using a plurality of contact points, a mismatch easily arises between the settings of the device and the user's intuitive operation.
- as a result, the user repeats the intuitive operation. For example, even if the user touches the touch panel with the intention of dragging, the information processing apparatus will not respond if it is configured for flicks. In this case, the user cannot perform the desired operation and repeats the same operation, which places an excessive burden on the user.
- it is therefore an object of the present invention to provide an information processing apparatus that can move an object by appropriately detecting the user's intuitive operation using the above-mentioned multi-contact detection capability.
- the information processing apparatus includes a display unit that displays an object and a contact detection unit capable of detecting a plurality of contact points on the display surface.
- the object displayed on the display unit is controlled so as to be movable: when a contact is detected, via the contact detection unit, within the active area on the display surface excluding the area of the object, and a second contact within the area of the object is detected together with a movement instruction for the object, the object is controlled to be movable within the active area.
- according to the present invention, an information processing apparatus is provided that can detect the user's intuitive operation by using the capability of detecting a plurality of contact points, and can move an object accordingly.
- FIG. 1 is a functional block diagram showing an information processing apparatus according to the first embodiment of the invention.
- FIG. 2 is a flowchart for explaining the operation of the information processing apparatus.
- FIG. 3 is a functional block diagram showing a mobile terminal that is the information processing apparatus according to the second embodiment. FIG. 4 is a flowchart used to explain the operation of the terminal.
- FIGS. 5A and 5B are diagrams showing the display surface of the touch panel of the terminal when touched.
- FIGS. 6A and 6B are diagrams showing a first transition of the display surface of the terminal when touched.
- FIGS. 7A and 7B are diagrams showing a second transition of the display surface when touched.
- FIGS. 8A and 8B are diagrams showing a third transition of the display surface when touched.
- FIGS. 9A and 9B are diagrams showing a fourth transition of the display surface when touched.
- FIGS. 10A and 10B are diagrams showing a fifth transition of the display surface when touched.
- FIGS. 11A and 11B are diagrams showing a sixth transition of the display surface when touched.
- FIG. 13 is a functional block diagram showing a laptop PC that is the information processing apparatus according to the third embodiment.
- FIGS. 15A and 15B are diagrams showing the display surface of the laptop PC according to the third embodiment when touched.
- the information processing apparatus according to this embodiment operates by a program, and is configured as a mobile terminal having a communication function (speech, message transmission and reception, Internet access), a calendar function, and a schedule function.
- such terminals include mobile phones, smartphones, PHS (Personal Handyphone System) terminals, and the like that have a touch panel function.
- FIG. 1 is a functional block diagram showing the configuration of the information processing apparatus 10 according to this embodiment.
- the information processing apparatus 10 shown in FIG. 1 has a control unit that performs various arithmetic processes and displays the display information it designates on the display unit.
- the display unit displays first display information and second display information; a contact detection unit detects contact (and the contacted coordinates) in the areas where the display information is displayed; and a memory composed of ROM, RAM, etc. stores the operating system, the driver for the contact detection unit, application programs, and data used by the information processing apparatus 10.
- the control unit can display the second display information in an area that is part of the area in which the first display information is displayed.
- the control unit can detect a plurality of contact points in the area where the first display information is displayed and in the area where the second display information is displayed.
- when the control unit detects a change of only one contacted coordinate, it executes a process of moving (scrolling) only the display information (object) displayed in the area where the detected coordinate whose movement was detected exists, based on that change. Here, the area within which an object can be moved by the control unit is called the active area,
- and the second display information displayed in that enclosed area is the object.
- the object can be constructed using an image, a figure, characters, or a combination of them. It can also be expressed as a symbol, a button, an input field, visually indicated information indicating a control, or a combination thereof. In addition, second display information containing characters includes information for managing characters selected by the user as a group after the selection.
- as described below, the information processing apparatus 10 can appropriately detect the user's intuitive action and can move an object by using the capability of detecting a plurality of contact points.
- contact here includes not only contact made with a stylus or the like but also contact made with a human finger or the like.
- the information processing apparatus 10 displays, on the display unit, the first display information and the second display information according to the application programs and the operating system recorded in the memory. The control unit detects, through the contact detection unit and its driver, contact with the area where the first display information is displayed and the area where the second display information is displayed, together with the contacted coordinates and their changes. It recognizes the detected touch, the release of contact, and the contacted coordinates as a down event, an up event, or a movement instruction as necessary, and determines the displayed object concerned.
- the control unit identifies a movement instruction for the object from the change of the contacted coordinates detected through the contact detection unit. When a plurality of contacts are detected, it identifies the user's movement instruction based on the change of the plurality of contacted coordinates. It is possible to identify the direction of movement from the trajectory of the coordinates of the moving finger, and it is also possible to distinguish one finger from another based on the difference between the times at which their contacts begin and end.
- by controlling in this way, different actions can be realized depending on how the contacted coordinates of the plurality of contacts change.
- the contact detection unit may be set so that only part of the display surface accepts operations, or so that only fixed operations can be performed. Further, a contact may be judged valid or invalid according to the information displayed on the display surface. Furthermore, the operation may differ depending on whether the information processing apparatus 10 is held with one hand or with both hands.
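The discrimination described above — deciding, from changes in the detected coordinates between two detection samples, which contacts are issuing a movement instruction — can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name and the threshold value are assumptions.

```python
# Minimal sketch of movement discrimination from successive contact
# coordinates. All names and the threshold value are illustrative.

MOVE_THRESHOLD = 5  # pixels a contact must travel to count as "moved"

def moved_contacts(prev, curr, threshold=MOVE_THRESHOLD):
    """Return the ids of contacts whose coordinates changed between two
    detection samples. `prev` and `curr` map contact id -> (x, y)."""
    moved = []
    for cid, (x, y) in curr.items():
        if cid in prev:
            px, py = prev[cid]
            if abs(x - px) > threshold or abs(y - py) > threshold:
                moved.append(cid)
    return moved
```

If exactly one of several contacts moved, its displacement can be treated as an object movement instruction; if all contacts moved, the gesture can be treated as a scroll of the whole display.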
- FIG. 2 is a flowchart for explaining the operation of the information processing apparatus 10. The operation described here realizes the object-moving function that the information processing apparatus 10 provides to the user.
- the control unit monitors touches on the display unit through the contact detection unit (step S201).
- when a touch is detected, the control unit determines whether there are a plurality of contact points; if a plurality are detected, the process proceeds to step S203, and if only one is detected, the process proceeds to step S205 (step S202).
- step S205 handles the case where only one contact is detected, whether in the area of the first display information or in that of the second display information; step S203 handles the case where a plurality of contacts are detected.
- that is, contact is detected in the active area on the display surface excluding the area of the object.
- the control unit monitors the contacted coordinates of the detected contacts and determines whether only one of them has moved (a movement instruction has been made). If only one has moved, the process proceeds to step S204, and if two or more have moved simultaneously, the process proceeds to step S206 (step S203). In other words, it determines whether a change of the contacted coordinates of only one of the plurality of contacts is detected; that is, it detects contact in the active area on the display surface excluding the area of the object to be moved, and determines whether a movement instruction for the object is detected.
- when an operation different from a movement instruction is performed, control is performed according to that operation.
- the control unit fixes the display information displayed in the area where the other, unmoving contact exists, and moves, on the display screen, only the display information displayed in the area where the movement was detected (step S204).
- that is, only the display information displayed in the area where the moving contact is detected is moved based on the change of the detected coordinates.
- in other words, the object is controlled to be movable within the active area, and the object instructed to move is moved.
- in step S205, the control unit monitors the contacted coordinates of the single detected contact and determines whether the user has indicated movement (step S205).
- if movement is indicated, the control unit moves (scrolls) the display as a whole (step S206). In other words, when changes of all the detected contacted coordinates are detected, the control unit moves the display based on those detected coordinates.
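The flow of steps S202 to S206 can be summarized in code. This is a sketch under the assumption that a rectangular hit test against the object's bounds is available; the function names and the rectangle convention are invented for illustration.

```python
# Sketch of the dispatch in steps S202-S206: one contact scrolls the
# whole display; with several contacts, if exactly one moved and it is
# on the object while another rests in the active area, only the
# object moves. All names and conventions are illustrative.

def in_rect(point, rect):
    """rect = (left, top, right, bottom)."""
    x, y = point
    left, top, right, bottom = rect
    return left <= x <= right and top <= y <= bottom

def handle_touches(contacts, moved_ids, object_rect):
    """contacts: {id: (x, y)}; moved_ids: ids whose coordinates changed."""
    if len(contacts) == 1:
        return "scroll_all"                    # S205 -> S206
    if len(moved_ids) == 1:
        mover = moved_ids[0]
        holders = [c for c in contacts if c != mover]
        # the moving contact is on the object and the other contacts
        # hold the active area outside it: move only the object (S204)
        if in_rect(contacts[mover], object_rect) and all(
                not in_rect(contacts[h], object_rect) for h in holders):
            return "move_object_only"
    return "scroll_all"                        # all moved together (S206)
```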
- as described above, the information processing apparatus can appropriately detect the user's intuitive operation and move the display information (object) by using the capability of detecting a plurality of contact points.
- FIG. 3 is a functional block diagram showing the configuration of a mobile terminal 200 according to the present embodiment.
- the terminal 200 is provided with a radio unit 201, a broadcast receiving unit 202, a GPS receiving unit 203, a camera 204, an acoustic processing unit 205, a touch-function-equipped display 206, a memory 207, and a control unit 208.
- the radio unit 201 transmits and receives radio signals to and from base stations via the antenna.
- the broadcast receiving unit 202 receives broadcast signals transmitted from broadcasting stations (terrestrial, satellite, etc.), demodulates the received signals, and outputs video data, audio data, information data, and so on.
- the GPS receiving unit 203 obtains the distance to each of a plurality of GPS satellites by measuring the time until the radio waves radiated from the satellites reach the terminal 200, and obtains the position of the terminal using those distances. The camera 204 captures images.
- the acoustic processing unit 205 processes sounds such as music and notification sounds that are input and output via the speaker, and voice.
- the touch-function-equipped display 206 is a touch panel (touch screen) that combines a display for images, figures, characters, symbols, etc. with a contact detection function.
- the functions of the terminal 200 include a telephone function, a mail function, an Internet (Web) function, a camera function, a TV function, a GPS function, and other functions and parts of the terminal.
- the control unit 208 controls the touch-function-equipped display 206 to display screens, and detects the user's operations using the contact detection function of the touch-function-equipped display 206.
- the control unit 208 of the terminal 200 monitors touches on the touch-function-equipped display 206 (step S401). When a touch is detected, the control unit 208 determines whether the number of contact points is one or more than one. If a plurality are detected, the process proceeds to step S403, and if the number is one, the process proceeds to step S405 (step S402).
- the control unit 208 determines whether any of the contacts is located at a predetermined position. The predetermined position will be described later.
- if a touch at the predetermined position is detected, the process proceeds to step S404; if not, the process proceeds to step S405 (step S403).
- the control unit 208 fixes the background (corresponding to the first display information) and controls the object to be movable within the active area (step S404).
- the control unit 208 then scrolls the object based on the amount of movement until the contact is released (steps S405, S406).
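The predetermined-position check of steps S402 to S404 might look like the following sketch. The hold-area rectangle, the function names, and the rectangle convention are assumptions made for illustration, not taken from the patent.

```python
# Sketch of the predetermined-position check (steps S402-S404): when
# one of several contacts lies in a hold area reachable by the hand
# holding the terminal (outside the object), the background is fixed
# and only the object is made movable. Names are illustrative.

def in_rect(point, rect):
    """rect = (left, top, right, bottom)."""
    x, y = point
    left, top, right, bottom = rect
    return left <= x <= right and top <= y <= bottom

def background_fixed(contacts, hold_area, object_rect):
    """contacts: {id: (x, y)}. True -> fix background, move object only."""
    if len(contacts) < 2:
        return False                 # a single contact scrolls normally
    return any(in_rect(p, hold_area) and not in_rect(p, object_rect)
               for p in contacts.values())
```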
- the predetermined position is within the display area excluding the object to be moved, or within the active area on the display surface excluding the object to be moved, and is a position the user can touch with the hand holding the terminal 200.
- for example, the predetermined position is where the thumb or another specific finger reaches when the terminal 200 is held in the recommended standard manner.
- the predetermined position will be explained with reference to FIGS. 5A to 11B.
- FIGS. 5A and 5B are diagrams showing the display surface of the terminal 200.
- they show the display presented on the touch-function-equipped display 206 by the control unit 208.
- FIGS. 5A, 6A, 7A, 8A, 9A, 10A, and 11A show the display surface before the object is moved.
- FIGS. 5B, 6B, 7B, 8B, 9B, 10B, and 11B show the display surface while the object is being moved.
- FIGS. 5A and 5B are diagrams showing the display surface of the touch panel when touched.
- a schedule table 501 is displayed on the display screen 500, and an object 502 is also displayed.
- suppose the user touches the area where the object 502 is displayed with one finger and moves it downward as shown by the arrows.
- FIG. 5A corresponds to the operation from step S401 shown in FIG. 4 to step S405 through "no" in step S402.
- FIG. 5B shows the display result of scrolling the whole display based on the detected coordinates.
- it corresponds to step S406 shown in FIG. 4.
- FIGS. 6A and 6B are diagrams showing the display surface when touched. They show a screen in which, when a contact on the display surface is detected within the active area 503 excluding the area of the object 502, the object 502 is controlled to be movable within the active area 503.
- this corresponds to the operation from step S401 to step S403 shown in FIG. 4.
- FIG. 6B shows the display result of moving only the object 502, based on the change of the detected coordinates, in the area where the control unit 208 detected that the coordinates changed.
- this corresponds to the operation of step S404 shown in FIG. 4. As shown in FIG. 6B, a contact is detected within the active area 503 on the display surface, and the object is controlled to be movable within the active area 503 while that contact's coordinates do not change. With this, the user can move the object while keeping the schedule table fixed.
- the control unit 208 detects a plurality of contacts in the first area and the second area, and when it detects a change of the contacted coordinates only in the second area, it moves only the object 502 based on the detected coordinates. Note that a plurality of contacts do not necessarily have to exist in the second area.
- FIGS. 7A and 7B are diagrams showing another transition of the display surface when touched. In FIGS. 7A and 7B, a designated movement-fixing position is set.
- when a touch at the movement-fixing position is detected, the screen shows the schedule table fixed, and only the corresponding object is controlled to move.
- this corresponds to step S404 shown in FIG. 4.
- when a touch at the predetermined position is detected, the object can be moved by controlling it to be movable with the schedule table fixed.
- this type of control is suitable for application programs that do not control the display at the position of the object 502.
- FIGS. 8A and 8B are diagrams showing yet another transition of the display surface when touched. In FIG. 8B, as in FIGS. 7A and 7B, when the touch is detected the schedule table is fixed and only the object is controlled to move.
- the difference from FIGS. 7A and 7B is that, whereas there the movement-fixing area was set on the time display in the schedule table, in FIGS. 8A and 8B the movement fixed position 505 is set by the application program so as to be visible within the active area.
- FIGS. 9A and 9B are diagrams showing yet another transition of the display surface when touched.
- FIG. 9B shows a screen in which, as in the other transitions, the schedule table is fixed and the object is controlled to move when a touch at the predetermined position is detected.
- whereas in FIGS. 8A and 8B the movement fixed position was set within the active area by the schedule table application program, in FIGS. 9A and 9B
- the movement object 506 is set by a program outside the schedule table application program.
- FIG. 9A shows the case where a contact is detected in the coordinate area of the movement object 506 provided in the area where the schedule table is displayed, and a change of the contacted coordinates only in the second area, where the object to be moved is displayed, is detected as a movement instruction.
- FIG. 9B shows the display result of moving, based on the change of the detected coordinates, only the object displayed in the area where the control unit 208 detected that change.
- when a touch in the coordinate area where the movement object 506 is displayed is detected by the program outside the active application program, the object is moved with the schedule table fixed.
- in this way the user can fix the schedule table and move the object.
- the movement object 506 is movable on the screen and may be displayed in the foreground.
- regarding the schedule table as a sheet of paper,
- the movement object is used like a paperweight, and the user's intuitive operation is appropriately detected so that the object instructed to move can be moved.
- FIGS. 10A and 10B are diagrams showing yet another transition of the display surface when touched. They show a screen in which, as in the other transitions, only the object is controlled to move with the schedule table fixed when a touch at the predetermined position is detected.
- whereas previously the movement fixed position 505 was set by the application program within the active area, in FIGS. 10A and 10B the base fixing area 507 is set in the display area of the touch-function-equipped display 206.
- FIG. 10A shows the case where a touch on the base fixing area 507 and a touch on the second area, where the object to be moved is displayed, are detected, and furthermore a change of the touch coordinates only in the second area is detected.
- FIG. 10B shows the display result of moving, based on the change of the detected coordinates, only the object displayed in the area where the control unit 208 detected that change.
- when a touch on the base fixing area 507 is detected, the base can be fixed and the object can be moved by controlling the object to be movable with the base fixed.
- regarding the schedule table as a sheet of paper,
- the base fixing area can be used like pressing down the edge of the paper, so the user's intuitive operation can be detected appropriately and the object instructed to move can be moved.
- FIGS. 11A and 11B are diagrams showing yet another transition of the display surface when touched.
- FIG. 11B shows a screen in which, as in the other transitions, only the object is controlled to move with the schedule table fixed when a plurality of contacts are detected at the same time.
- the touch detection area, which is the area where contact with the touch-function-equipped display 206 is detected, is provided not only on the display screen but also on the display frame. Furthermore, as shown in FIG. 12, if the contact continues from the top of the display screen 500 onto the frame, it is regarded as being within the movement area.
- the figure shows the case where a change of the contacted coordinates only in the second area is detected. As a result, only the object displayed in the area where the control unit 208 detected the change of the coordinates is moved based on that change.
- the base (the schedule table) is fixed (immovable), and the input item (the object) is made movable. In this way, only the user's intended operation is detected: the display information the user holds down stays fixed, and only the input item moves.
- by controlling in this way, when moving an input item, the user can also select the input item, display its detailed information, change the detailed information, and confirm it in a series of operations. Also, unlike the conventional case where the time information contained in an input item is edited in order to move it, the input item can be moved while checking the other scheduled items, which prevents double-booking of the schedule.
- if the display information near the edge of the active area (the edge of the background display information) is controlled to move toward the center of the active area without disturbing the movement of the object, the user's intuitive operation becomes even easier. For example, when the display information spans more than one page, the control unit detects the object being moved near the bottom of the display screen, switches the display information to the next page, and moves the object near the center of the active area.
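The edge behaviour just described — switching the background to the next page and recentring the dragged object when it nears the bottom of the active area — can be sketched as follows. The margin value and all names are illustrative assumptions.

```python
# Sketch of the page-switch behaviour: when a dragged object reaches
# the bottom margin of the active area and another page exists, switch
# to the next page and move the object back toward the vertical centre.

EDGE_MARGIN = 20  # pixels from the bottom edge that trigger the switch

def drag_step(obj_y, area_top, area_bottom, page, num_pages):
    """Return the object's new y position and the current page index."""
    if obj_y >= area_bottom - EDGE_MARGIN and page + 1 < num_pages:
        page += 1                               # show the next page
        obj_y = (area_top + area_bottom) // 2   # recentre the object
    return obj_y, page
```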
- a zone may be provided in the area where the first display information is displayed and/or the area where the second display information is displayed. If contact with such a zone is detected on one side and a change of the contacted coordinates of the second contact is detected, the second display information existing or displayed in the second area can be moved based on the detected coordinates.
- the first display information (the background) is not limited to a schedule table, and may be a map, a list of photographs, and so on.
- the application program may also be a text program: if this control is applied to a program in which entered text can be individually selected, the selected text part, in addition to photographs and figures, can be moved in accordance with the user's intuitive operation.
- when the background of a program is divided into a plurality of areas, it may be controlled so that only the area where the contact exists is fixed. In this way, other areas can be checked in parallel, separately from the predetermined area.
- next, a description will be given of a laptop PC 300 according to the third embodiment.
- FIG. 13 is a functional block diagram showing the configuration of the laptop PC 300 according to this embodiment.
- the laptop PC 300 includes a CPU that performs various arithmetic processes, a RAM that temporarily stores information, a ROM that stores basic programs, a touch panel display 320 for inputting and outputting information, a memory 330, and a communication unit for communicating with the Internet via a network.
- the touch panel display 320 has a display 321 and a contact detection unit 322; it displays information such as images, figures, characters, and symbols output from the control unit on the screen of the display 321, and detects the user's touch input through the contact detection unit 322.
- the memory 330 stores application programs 331, a driver 332 for the touch panel display 320, various data, various contents, an OS (Operating System) 333, and so on.
- the laptop PC 300 operates based on the instructions of the OS 333, the driver 332, and the application programs 331.
- software such as the OS 333 is expanded into the RAM as needed.
- the contact detection unit 322 detects touches on the touch panel display 320 (a multi-touch screen) based on one of the OS 333, the driver 332, and the application programs 331, or a combination thereof.
- the contact detection unit 322 can detect a plurality of contact points, however they are detected.
- any method may be used, and there is no particular limitation as long as contact can be detected.
- the laptop PC 300 can appropriately detect the user's intuitive operation using the contact detection unit 322 and enables the movement of an object.
- steps S401 to S403 shown in FIG. 4 are the same operations as steps S401 to S403 in the second embodiment. Likewise, the operations of steps S405 and S406 are the same as steps S405 and S406 there.
- the control unit of the laptop PC 300 monitors touches on the touch panel display 320 (step S401), and when a touch is detected, determines whether the number of contact points is one or more than one (step S402).
- the control unit detects a touch in the area of the other objects (the other cells) on the display screen, excluding the area of the object to be moved (the moving cell), and determines whether a movement instruction for the moving object can be detected (step S403).
- the control unit fixes the other objects and controls the moving object to be movable (step S404).
- FIGS. 15A and 15B are diagrams showing a screen 600 of a spreadsheet program when touched.
- FIG. 15B shows the display result of moving only the object (cell), based on the change of the detected coordinates, in the area where the control unit detected that change.
- in response to the movement instruction (toward the upper left), the cells 603 other than the designated cell are fixed, and the cell 604 instructed to move (the cell containing "7900") is controlled to be movable.
- when the moving cell 604 approaches the edge of the active area, the display information of the program is divided and displayed, and the moving cell 604 is controlled to be movable there.
- that is, the object instructed to move is detected and controlled to be movable within the active area.
- the user can therefore move the information while looking both at the moving cell (7900) and at the fixed cells above, below, and to the left and right of it.
- control 3 detects the touch of the area where 2 of the information is displayed and detects the change of the output coordinate of only one, The information is divided and displayed, and the detected coordinate change is displayed in the area where exists
- the user can recognize the work directed to the displayed object on the screen and perform dynamic control of the object based on the result.
- the load on the user can be reduced.
- It is thus possible to provide an information processing apparatus that can detect the user's intuitive action and move an object by using a contact detection unit capable of detecting a plurality of contact points, a program that realizes this, and a storage medium on which the program is recorded.
- The present invention can be applied to any information processing apparatus that can detect a plurality of contact points. It is particularly suitable for an information processing apparatus, such as a mobile terminal, that is held and operated with one hand.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010520915A JP5267827B2 (ja) | 2008-07-17 | 2009-07-15 | 情報処理装置、プログラムを記録した記憶媒体及びオブジェクト移動方法 |
CN200980128039.9A CN102099775B (zh) | 2008-07-17 | 2009-07-15 | 信息处理装置、记录有程序的存储介质以及目标移动方法 |
EP09798012.2A EP2306286A4 (en) | 2008-07-17 | 2009-07-15 | INFORMATION PROCESSING APPARATUS, STORAGE MEDIUM ON WHICH A PROGRAM HAS BEEN RECORDED, AND METHOD FOR MODIFYING OBJECT |
US13/054,698 US20110126097A1 (en) | 2008-07-17 | 2009-07-15 | Information processing apparatus, storage medium having program recorded thereon, and object movement method |
US14/338,840 US9933932B2 (en) | 2008-07-17 | 2014-07-23 | Information processing apparatus having a contact detection unit capable of detecting a plurality of contact points, storage medium having program recorded thereon, and object movement method |
US15/902,129 US10656824B2 (en) | 2008-07-17 | 2018-02-22 | Information processing apparatus having a contact detection unit capable of detecting a plurality of contact points, storage medium having program recorded thereon, and object movement method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008-185622 | 2008-07-17 | ||
JP2008185622 | 2008-07-17 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/054,698 A-371-Of-International US20110126097A1 (en) | 2008-07-17 | 2009-07-15 | Information processing apparatus, storage medium having program recorded thereon, and object movement method |
US14/338,840 Continuation US9933932B2 (en) | 2008-07-17 | 2014-07-23 | Information processing apparatus having a contact detection unit capable of detecting a plurality of contact points, storage medium having program recorded thereon, and object movement method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010008088A1 true WO2010008088A1 (ja) | 2010-01-21 |
Family
ID=41550488
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/063135 WO2010008088A1 (ja) | 2008-07-17 | 2009-07-15 | 情報処理装置、プログラムを記録した記憶媒体及びオブジェクト移動方法 |
Country Status (5)
Country | Link |
---|---|
US (3) | US20110126097A1 (ja) |
EP (1) | EP2306286A4 (ja) |
JP (6) | JP5267827B2 (ja) |
CN (2) | CN102099775B (ja) |
WO (1) | WO2010008088A1 (ja) |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011197848A (ja) * | 2010-03-18 | 2011-10-06 | Rohm Co Ltd | タッチパネル入力装置 |
JP2011227703A (ja) * | 2010-04-20 | 2011-11-10 | Rohm Co Ltd | 2点検知可能なタッチパネル入力装置 |
JP2011242820A (ja) * | 2010-05-13 | 2011-12-01 | Panasonic Corp | 電子機器、表示方法、及びプログラム |
CN102375602A (zh) * | 2010-08-13 | 2012-03-14 | 卡西欧计算机株式会社 | 输入装置以及输入方法 |
JP2012068778A (ja) * | 2010-09-22 | 2012-04-05 | Kyocera Corp | 携帯端末、入力制御プログラム及び入力制御方法 |
WO2012160829A1 (ja) * | 2011-05-25 | 2012-11-29 | パナソニック株式会社 | タッチスクリーン装置、タッチ操作入力方法及びプログラム |
JP2012234569A (ja) * | 2012-08-09 | 2012-11-29 | Panasonic Corp | 電子機器、表示方法、及びプログラム |
JP2013521547A (ja) * | 2010-02-25 | 2013-06-10 | マイクロソフト コーポレーション | マルチスクリーンのホールド及びページフリップジェスチャー |
JP2013214310A (ja) * | 2008-09-03 | 2013-10-17 | Canon Inc | 情報処理装置、その動作方法及びプログラム |
JP2014006671A (ja) * | 2012-06-22 | 2014-01-16 | Yahoo Japan Corp | 画像表示装置、画像表示方法、及び、画像表示プログラム |
JP2014112335A (ja) * | 2012-12-05 | 2014-06-19 | Fuji Xerox Co Ltd | 情報処理装置及びプログラム |
JP2014142750A (ja) * | 2013-01-23 | 2014-08-07 | Dainippon Printing Co Ltd | 入力機能及び表示機能を有するicカード |
JP2014215916A (ja) * | 2013-04-26 | 2014-11-17 | 株式会社サミーネットワークス | 表示制御方法、表示制御プログラム、および、携帯情報端末 |
WO2015107617A1 (ja) * | 2014-01-14 | 2015-07-23 | 株式会社 東芝 | 電子機器、制御方法およびプログラム |
WO2016006074A1 (ja) * | 2014-07-09 | 2016-01-14 | 株式会社東芝 | 電子機器、方法及びプログラム |
US9250800B2 (en) | 2010-02-18 | 2016-02-02 | Rohm Co., Ltd. | Touch-panel input device |
JP2016219067A (ja) * | 2016-09-28 | 2016-12-22 | 富士ゼロックス株式会社 | 情報処理装置及びプログラム |
US11055050B2 (en) | 2010-02-25 | 2021-07-06 | Microsoft Technology Licensing, Llc | Multi-device pairing and combined display |
Families Citing this family (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008275565A (ja) * | 2007-05-07 | 2008-11-13 | Toyota Motor Corp | ナビゲーション装置 |
US9483755B2 (en) | 2008-03-04 | 2016-11-01 | Apple Inc. | Portable multifunction device, method, and graphical user interface for an email client |
US8368667B2 (en) * | 2008-06-26 | 2013-02-05 | Cirque Corporation | Method for reducing latency when using multi-touch gesture on touchpad |
EP2306286A4 (en) | 2008-07-17 | 2016-05-11 | Nec Corp | INFORMATION PROCESSING APPARATUS, STORAGE MEDIUM ON WHICH A PROGRAM HAS BEEN RECORDED, AND METHOD FOR MODIFYING OBJECT |
US20110216095A1 (en) * | 2010-03-04 | 2011-09-08 | Tobias Rydenhag | Methods, Devices, and Computer Program Products Providing Multi-Touch Drag and Drop Operations for Touch-Sensitive User Interfaces |
JP5473708B2 (ja) * | 2010-03-26 | 2014-04-16 | 京セラ株式会社 | 携帯端末及び表示制御プログラム |
KR102006740B1 (ko) | 2010-10-20 | 2019-08-02 | 삼성전자 주식회사 | 휴대 단말기의 화면 표시 방법 및 장치 |
JP2013069273A (ja) * | 2011-09-07 | 2013-04-18 | Nitto Denko Corp | 入力体の動き検出方法およびそれを用いた入力デバイス |
US9501213B2 (en) * | 2011-09-16 | 2016-11-22 | Skadool, Inc. | Scheduling events on an electronic calendar utilizing fixed-positioned events and a draggable calendar grid |
EP2584441A1 (en) * | 2011-10-18 | 2013-04-24 | Research In Motion Limited | Electronic device and method of controlling same |
US8810535B2 (en) | 2011-10-18 | 2014-08-19 | Blackberry Limited | Electronic device and method of controlling same |
KR101710418B1 (ko) * | 2011-12-19 | 2017-02-28 | 삼성전자주식회사 | 휴대 단말기의 멀티 터치 인터렉션 제공 방법 및 장치 |
TWI494802B (zh) * | 2012-01-04 | 2015-08-01 | Asustek Comp Inc | 操作方法與使用其之可攜式電子裝置 |
EP2812784A4 (en) * | 2012-02-07 | 2015-11-11 | Blackberry Ltd | METHODS AND DEVICES FOR MERGING CONTACT RECORDINGS |
US8539375B1 (en) * | 2012-02-24 | 2013-09-17 | Blackberry Limited | Method and apparatus for providing a user interface on a device enabling selection of operations to be performed in relation to content |
US9223483B2 (en) | 2012-02-24 | 2015-12-29 | Blackberry Limited | Method and apparatus for providing a user interface on a device that indicates content operators |
JP5974657B2 (ja) * | 2012-06-15 | 2016-08-23 | 株式会社リコー | 情報処理装置、情報処理方法および情報処理プログラム |
JP2014032450A (ja) | 2012-08-01 | 2014-02-20 | Sony Corp | 表示制御装置、表示制御方法及びコンピュータプログラム |
JP5975794B2 (ja) * | 2012-08-29 | 2016-08-23 | キヤノン株式会社 | 表示制御装置、表示制御方法、プログラム及び記憶媒体 |
US11068128B2 (en) | 2013-09-03 | 2021-07-20 | Apple Inc. | User interface object manipulations in a user interface |
US9841815B2 (en) * | 2013-09-09 | 2017-12-12 | Samsung Electronics Co., Ltd. | Method for differentiation of touch input and visualization of pending touch input |
US20150212682A1 (en) * | 2014-01-30 | 2015-07-30 | Accompani, Inc. | Managing calendar and contact information |
JP2015172861A (ja) * | 2014-03-12 | 2015-10-01 | レノボ・シンガポール・プライベート・リミテッド | 携帯式電子機器の使用環境を切り換える方法、携帯式電子機器およびコンピュータ・プログラム |
WO2016036509A1 (en) * | 2014-09-02 | 2016-03-10 | Apple Inc. | Electronic mail user interface |
WO2016036416A1 (en) | 2014-09-02 | 2016-03-10 | Apple Inc. | Button functionality |
JP7089352B2 (ja) | 2016-09-16 | 2022-06-22 | 日東電工株式会社 | スパイラル型膜エレメント |
US11435830B2 (en) | 2018-09-11 | 2022-09-06 | Apple Inc. | Content-based tactile outputs |
JP7327368B2 (ja) * | 2020-12-02 | 2023-08-16 | 横河電機株式会社 | 装置、方法およびプログラム |
US20230230044A1 (en) * | 2021-12-30 | 2023-07-20 | Microsoft Technology Licensing, Llc | Calendar update using template selections |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5825352A (en) * | 1996-01-04 | 1998-10-20 | Logitech, Inc. | Multiple fingers contact sensing method for emulating mouse buttons and mouse operations on a touch sensor pad |
JPH1173271A (ja) * | 1997-08-28 | 1999-03-16 | Sharp Corp | 指示装置、処理装置および記憶媒体 |
JP2000163193A (ja) * | 1998-11-25 | 2000-06-16 | Seiko Epson Corp | 携帯情報機器及び情報記憶媒体 |
JP2001265481A (ja) * | 2000-03-21 | 2001-09-28 | Nec Corp | ページ情報表示方法及び装置並びにページ情報表示用プログラムを記憶した記憶媒体 |
JP2002304256A (ja) | 2001-04-06 | 2002-10-18 | Sony Corp | 情報処理装置 |
JP2005301516A (ja) * | 2004-04-08 | 2005-10-27 | Sony Corp | 情報処理装置および方法、並びにプログラム |
WO2007089766A2 (en) * | 2006-01-30 | 2007-08-09 | Apple Inc. | Gesturing with a multipoint sensing device |
JP2007279638A (ja) | 2006-04-12 | 2007-10-25 | Xanavi Informatics Corp | ナビゲーション装置 |
JP2008508600A (ja) * | 2004-07-30 | 2008-03-21 | アップル インコーポレイテッド | タッチ・センシティブ入力デバイスのためのモード・ベースのグラフィカル・ユーザ・インタフェース |
JP2008097172A (ja) * | 2006-10-10 | 2008-04-24 | Sony Corp | 表示装置および表示方法 |
JP2008185622A (ja) | 2007-01-26 | 2008-08-14 | Ricoh Co Ltd | 色ずれ検出方法、色ずれ検出装置及び画像形成装置 |
Family Cites Families (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH03180922A (ja) | 1989-12-11 | 1991-08-06 | Fujitsu Ltd | タッチパネル構造 |
JPH07175587A (ja) * | 1993-10-28 | 1995-07-14 | Hitachi Ltd | 情報処理装置 |
US5732227A (en) * | 1994-07-05 | 1998-03-24 | Hitachi, Ltd. | Interactive information processing system responsive to user manipulation of physical objects and displayed images |
JPH11102274A (ja) | 1997-09-25 | 1999-04-13 | Nec Corp | スクロール装置 |
US8479122B2 (en) * | 2004-07-30 | 2013-07-02 | Apple Inc. | Gestures for touch sensitive input devices |
US7840912B2 (en) * | 2006-01-30 | 2010-11-23 | Apple Inc. | Multi-touch gesture dictionary |
JP2000137564A (ja) * | 1998-11-02 | 2000-05-16 | Pioneer Electronic Corp | 画面操作装置および方法 |
JP4542637B2 (ja) | 1998-11-25 | 2010-09-15 | セイコーエプソン株式会社 | 携帯情報機器及び情報記憶媒体 |
JP2000163444A (ja) | 1998-11-25 | 2000-06-16 | Seiko Epson Corp | 携帯情報機器及び情報記憶媒体 |
JP2001134382A (ja) * | 1999-11-04 | 2001-05-18 | Sony Corp | 図形処理装置 |
US7003641B2 (en) * | 2000-01-31 | 2006-02-21 | Commvault Systems, Inc. | Logical view with granular access to exchange data managed by a modular data and storage management system |
JP4803883B2 (ja) * | 2000-01-31 | 2011-10-26 | キヤノン株式会社 | 位置情報処理装置及びその方法及びそのプログラム。 |
US20030117427A1 (en) * | 2001-07-13 | 2003-06-26 | Universal Electronics Inc. | System and method for interacting with a program guide displayed on a portable electronic device |
JP2003173237A (ja) | 2001-09-28 | 2003-06-20 | Ricoh Co Ltd | 情報入出力システム、プログラム及び記憶媒体 |
JP3847641B2 (ja) | 2002-02-28 | 2006-11-22 | 株式会社ソニー・コンピュータエンタテインメント | 情報処理装置、情報処理プログラム、情報処理プログラムを記録したコンピュータ読み取り可能な記録媒体、及び情報処理方法 |
US7411575B2 (en) | 2003-09-16 | 2008-08-12 | Smart Technologies Ulc | Gesture recognition method and touch system incorporating the same |
JP4045550B2 (ja) * | 2004-06-28 | 2008-02-13 | 富士フイルム株式会社 | 画像表示制御装置及び画像表示制御プログラム |
TWI248576B (en) * | 2004-07-05 | 2006-02-01 | Elan Microelectronics Corp | Method for controlling rolling of scroll bar on a touch panel |
JP2006092025A (ja) * | 2004-09-21 | 2006-04-06 | Fujitsu Ltd | 電子機器並びに表示画面の制御処理方法および表示画面の制御処理プログラム |
US7760189B2 (en) * | 2005-01-21 | 2010-07-20 | Lenovo Singapore Pte. Ltd | Touchpad diagonal scrolling |
JP2007141177A (ja) * | 2005-11-22 | 2007-06-07 | Tokai Rika Co Ltd | 操作入力装置 |
JP3970906B2 (ja) | 2006-02-03 | 2007-09-05 | 株式会社ソニー・コンピュータエンタテインメント | 情報処理装置、情報処理プログラム、情報処理プログラムを記録したコンピュータ読み取り可能な記録媒体、及び情報処理方法 |
JP4810658B2 (ja) * | 2006-03-10 | 2011-11-09 | 国立大学法人広島大学 | 接触検出装置及び接触検出方法 |
TW200805131A (en) * | 2006-05-24 | 2008-01-16 | Lg Electronics Inc | Touch screen device and method of selecting files thereon |
US7956847B2 (en) * | 2007-01-05 | 2011-06-07 | Apple Inc. | Gestures for controlling, manipulating, and editing of media files using touch sensitive devices |
US7936341B2 (en) * | 2007-05-30 | 2011-05-03 | Microsoft Corporation | Recognizing selection regions from multiple simultaneous inputs |
JP2009017286A (ja) | 2007-07-05 | 2009-01-22 | Niigata Seimitsu Kk | Am/fmラジオ受信機およびこれに用いる受信用半導体集積回路 |
WO2009049331A2 (en) * | 2007-10-08 | 2009-04-16 | Van Der Westhuizen Willem Mork | User interface |
US9513765B2 (en) * | 2007-12-07 | 2016-12-06 | Sony Corporation | Three-dimensional sliding object arrangement method and system |
US8745514B1 (en) * | 2008-04-11 | 2014-06-03 | Perceptive Pixel, Inc. | Pressure-sensitive layering of displayed objects |
JP4171770B1 (ja) * | 2008-04-24 | 2008-10-29 | 任天堂株式会社 | オブジェクト表示順変更プログラム及び装置 |
JP5500855B2 (ja) * | 2008-07-11 | 2014-05-21 | キヤノン株式会社 | 情報処理装置及びその制御方法 |
EP2306286A4 (en) | 2008-07-17 | 2016-05-11 | Nec Corp | INFORMATION PROCESSING APPARATUS, STORAGE MEDIUM ON WHICH A PROGRAM HAS BEEN RECORDED, AND METHOD FOR MODIFYING OBJECT |
US8407606B1 (en) * | 2009-01-02 | 2013-03-26 | Perceptive Pixel Inc. | Allocating control among inputs concurrently engaging an object displayed on a multi-touch device |
US20120242620A1 (en) * | 2011-03-22 | 2012-09-27 | Research In Motion Limited | Combined optical navigation and button |
US9134893B2 (en) * | 2012-12-14 | 2015-09-15 | Barnes & Noble College Booksellers, Llc | Block-based content selecting technique for touch screen UI |
US9471150B1 (en) * | 2013-09-27 | 2016-10-18 | Emc Corporation | Optimized gestures for zoom functionality on touch-based device |
-
2009
- 2009-07-15 EP EP09798012.2A patent/EP2306286A4/en not_active Ceased
- 2009-07-15 CN CN200980128039.9A patent/CN102099775B/zh active Active
- 2009-07-15 JP JP2010520915A patent/JP5267827B2/ja active Active
- 2009-07-15 US US13/054,698 patent/US20110126097A1/en not_active Abandoned
- 2009-07-15 CN CN201410425514.1A patent/CN104216655B/zh active Active
- 2009-07-15 WO PCT/JP2009/063135 patent/WO2010008088A1/ja active Application Filing
-
2013
- 2013-05-10 JP JP2013100183A patent/JP5618106B2/ja active Active
- 2013-05-10 JP JP2013100182A patent/JP5787375B2/ja active Active
-
2014
- 2014-07-23 US US14/338,840 patent/US9933932B2/en active Active
- 2014-09-18 JP JP2014189573A patent/JP5811381B2/ja active Active
-
2015
- 2015-09-18 JP JP2015185285A patent/JP6150082B2/ja active Active
-
2017
- 2017-05-24 JP JP2017102607A patent/JP6369704B2/ja active Active
-
2018
- 2018-02-22 US US15/902,129 patent/US10656824B2/en active Active
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5825352A (en) * | 1996-01-04 | 1998-10-20 | Logitech, Inc. | Multiple fingers contact sensing method for emulating mouse buttons and mouse operations on a touch sensor pad |
JPH1173271A (ja) * | 1997-08-28 | 1999-03-16 | Sharp Corp | 指示装置、処理装置および記憶媒体 |
JP2000163193A (ja) * | 1998-11-25 | 2000-06-16 | Seiko Epson Corp | 携帯情報機器及び情報記憶媒体 |
JP2001265481A (ja) * | 2000-03-21 | 2001-09-28 | Nec Corp | ページ情報表示方法及び装置並びにページ情報表示用プログラムを記憶した記憶媒体 |
JP2002304256A (ja) | 2001-04-06 | 2002-10-18 | Sony Corp | 情報処理装置 |
JP2005301516A (ja) * | 2004-04-08 | 2005-10-27 | Sony Corp | 情報処理装置および方法、並びにプログラム |
JP2008508600A (ja) * | 2004-07-30 | 2008-03-21 | アップル インコーポレイテッド | タッチ・センシティブ入力デバイスのためのモード・ベースのグラフィカル・ユーザ・インタフェース |
JP2008508601A (ja) * | 2004-07-30 | 2008-03-21 | アップル インコーポレイテッド | タッチ・センシティブ入力デバイスのためのジェスチャ |
WO2007089766A2 (en) * | 2006-01-30 | 2007-08-09 | Apple Inc. | Gesturing with a multipoint sensing device |
JP2007279638A (ja) | 2006-04-12 | 2007-10-25 | Xanavi Informatics Corp | ナビゲーション装置 |
JP2008097172A (ja) * | 2006-10-10 | 2008-04-24 | Sony Corp | 表示装置および表示方法 |
JP2008185622A (ja) | 2007-01-26 | 2008-08-14 | Ricoh Co Ltd | 色ずれ検出方法、色ずれ検出装置及び画像形成装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2306286A4 |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013214310A (ja) * | 2008-09-03 | 2013-10-17 | Canon Inc | 情報処理装置、その動作方法及びプログラム |
US9250800B2 (en) | 2010-02-18 | 2016-02-02 | Rohm Co., Ltd. | Touch-panel input device |
US9760280B2 (en) | 2010-02-18 | 2017-09-12 | Rohm Co., Ltd. | Touch-panel input device |
US11055050B2 (en) | 2010-02-25 | 2021-07-06 | Microsoft Technology Licensing, Llc | Multi-device pairing and combined display |
JP2013521547A (ja) * | 2010-02-25 | 2013-06-10 | マイクロソフト コーポレーション | マルチスクリーンのホールド及びページフリップジェスチャー |
JP2011197848A (ja) * | 2010-03-18 | 2011-10-06 | Rohm Co Ltd | タッチパネル入力装置 |
JP2011227703A (ja) * | 2010-04-20 | 2011-11-10 | Rohm Co Ltd | 2点検知可能なタッチパネル入力装置 |
JP2011242820A (ja) * | 2010-05-13 | 2011-12-01 | Panasonic Corp | 電子機器、表示方法、及びプログラム |
CN102375602A (zh) * | 2010-08-13 | 2012-03-14 | 卡西欧计算机株式会社 | 输入装置以及输入方法 |
JP2012068778A (ja) * | 2010-09-22 | 2012-04-05 | Kyocera Corp | 携帯端末、入力制御プログラム及び入力制御方法 |
WO2012160829A1 (ja) * | 2011-05-25 | 2012-11-29 | パナソニック株式会社 | タッチスクリーン装置、タッチ操作入力方法及びプログラム |
JP2014006671A (ja) * | 2012-06-22 | 2014-01-16 | Yahoo Japan Corp | 画像表示装置、画像表示方法、及び、画像表示プログラム |
JP2012234569A (ja) * | 2012-08-09 | 2012-11-29 | Panasonic Corp | 電子機器、表示方法、及びプログラム |
JP2014112335A (ja) * | 2012-12-05 | 2014-06-19 | Fuji Xerox Co Ltd | 情報処理装置及びプログラム |
JP2014142750A (ja) * | 2013-01-23 | 2014-08-07 | Dainippon Printing Co Ltd | 入力機能及び表示機能を有するicカード |
JP2014215916A (ja) * | 2013-04-26 | 2014-11-17 | 株式会社サミーネットワークス | 表示制御方法、表示制御プログラム、および、携帯情報端末 |
WO2015107617A1 (ja) * | 2014-01-14 | 2015-07-23 | 株式会社 東芝 | 電子機器、制御方法およびプログラム |
WO2016006074A1 (ja) * | 2014-07-09 | 2016-01-14 | 株式会社東芝 | 電子機器、方法及びプログラム |
JP2016219067A (ja) * | 2016-09-28 | 2016-12-22 | 富士ゼロックス株式会社 | 情報処理装置及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
CN104216655A (zh) | 2014-12-17 |
US20140333580A1 (en) | 2014-11-13 |
US9933932B2 (en) | 2018-04-03 |
US10656824B2 (en) | 2020-05-19 |
EP2306286A4 (en) | 2016-05-11 |
CN104216655B (zh) | 2018-02-16 |
JP2017168136A (ja) | 2017-09-21 |
JP5267827B2 (ja) | 2013-08-21 |
JP6150082B2 (ja) | 2017-06-21 |
JP6369704B2 (ja) | 2018-08-08 |
US20180181278A1 (en) | 2018-06-28 |
JP2015015045A (ja) | 2015-01-22 |
CN102099775A (zh) | 2011-06-15 |
JP5618106B2 (ja) | 2014-11-05 |
JPWO2010008088A1 (ja) | 2012-01-05 |
JP2013214308A (ja) | 2013-10-17 |
US20110126097A1 (en) | 2011-05-26 |
EP2306286A1 (en) | 2011-04-06 |
JP5787375B2 (ja) | 2015-09-30 |
JP2013175228A (ja) | 2013-09-05 |
JP5811381B2 (ja) | 2015-11-11 |
JP2016026352A (ja) | 2016-02-12 |
CN102099775B (zh) | 2014-09-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2010008088A1 (ja) | 情報処理装置、プログラムを記録した記憶媒体及びオブジェクト移動方法 | |
JP6151157B2 (ja) | 電子機器および制御プログラム並びに電子機器の動作方法 | |
US8174496B2 (en) | Mobile communication terminal with touch screen and information inputing method using the same | |
KR101523979B1 (ko) | 휴대 단말기 및 그 휴대 단말기에서 기능 수행 방법 | |
US20130082824A1 (en) | Feedback response | |
JP2010009321A (ja) | 入力装置 | |
US9189077B2 (en) | User character input interface with modifier support | |
US10216409B2 (en) | Display apparatus and user interface providing method thereof | |
JP5261729B2 (ja) | 情報処理装置、及びプログラム | |
JP2012505568A (ja) | 移動通信装置のためのマルチメディアモジュール | |
KR20140016454A (ko) | 터치스크린을 구비한 휴대 단말기의 오브젝트 이동을 위한 드래그 제어 방법 및 장치 | |
JP5923395B2 (ja) | 電子機器 | |
JP5449269B2 (ja) | 入力装置 | |
JP2008009856A (ja) | 入力装置 | |
JP2009146212A (ja) | 情報処理装置 | |
JP6542451B2 (ja) | 電子機器 | |
JP6408641B2 (ja) | 電子機器 | |
JP2015111369A (ja) | 電子装置 | |
JP2016181291A (ja) | 装置、制御方法、ならびに制御プログラム | |
JP2015022668A (ja) | 入力装置 | |
JP2011123571A (ja) | 指示体位置検出機能付き電子機器、入力方法およびプログラム | |
JP2015139617A (ja) | クイズゲーム制御方法及びクイズゲーム制御プログラム | |
JP2015139616A (ja) | クイズゲーム制御方法及びクイズゲーム制御プログラム | |
JP2011008624A (ja) | 情報処理装置、項目選択受付方法およびプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980128039.9 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09798012 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010520915 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13054698 Country of ref document: US Ref document number: 2009798012 Country of ref document: EP |