WO2014024948A1 - Information processing apparatus, method of controlling the same, program and storage medium - Google Patents
Information processing apparatus, method of controlling the same, program and storage medium
- Publication number
- WO2014024948A1 (PCT/JP2013/071440)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- touch
- display
- objects
- division
- display object
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0483—Interaction with page-structured environments, e.g. book metaphor
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- the present invention relates to an information processing apparatus provided with a display unit including a touch panel, a method of controlling the same, a program, and a storage medium.
- Japanese Patent Laid-Open No. 2009-294857 proposes a technology for manipulating an object by operations of a user using a multi-touch UI that can recognize touching of a plurality of fingers.
- in this technology, processing on objects is assigned to multi-touch gestures in which at least one finger is fixed on the object and another finger is moved.
- An aspect of the present invention is to eliminate the above-mentioned problems which are found in the conventional technology.
- a feature of the present invention is to provide a technique that divides a display object into two display objects by an operation of touching the display object with both hands and dividing the display object, and decides an object included in each divided display object.
- an information processing apparatus provided with a display unit including a touch panel, the information processing apparatus comprising: detection means for detecting a plurality of touch points on a display object displayed on the display unit; and division means for dividing, if a number of the plurality of touch points detected by the detection means is a predetermined number or more, the display object into a plurality of display objects in response to at least some of the plurality of touch points moving in an opposite direction on the touch panel to other of the plurality of touch points.
- a method for controlling an information processing apparatus provided with a display unit including a touch panel, the method comprising: a detection step of detecting a plurality of touch points on a display object displayed on the display unit; and a division step of dividing, if a number of the plurality of touch points detected in the detection step is a predetermined number or more, the display object into a plurality of display objects in response to at least some of the plurality of touch points moving in an opposite direction on the touch panel to other of the plurality of touch points.
- FIG. 1 is a block diagram illustrating a hardware configuration of an information processing apparatus according to an embodiment.
- FIG. 2 is a block diagram illustrating a configuration of a software module executed by a CPU of the information processing apparatus according to the embodiment.
- FIGS. 3A to 3G depict views illustrating examples of touch information generated through touch input by a user on a touch panel of the information processing apparatus according to the embodiment.
- FIG. 4 is a flowchart for describing gestural event generation processing performed by the information processing apparatus.
- FIGS. 5A to 5M depict views illustrating a list of event names that are to be generated in the flowcharts of FIGS. 6, 7, 9, 10A and 10B, and pieces of information that are to be transmitted to a gestural event processing section when the corresponding event has been generated.
- FIG. 6 is a flowchart for describing processing that is associated with touch of a new finger of the user in step S404 in FIG. 4.
- FIG. 7 is a flowchart for describing processing that is associated with movement of the finger of the user in step S406 in FIG. 4.
- FIGS. 8A and 8B depict views illustrating a situation in which the user performs a rotation operation on a touch panel in a clockwise direction.
- FIG. 9 is a flowchart for describing processing that is associated with touch release in step S408 in FIG. 4.
- FIGS. 10A and 10B are flowcharts for describing processing that is associated with timer interrupt in step S409 in FIG. 4.
- FIGS. 11A and 11B depict views illustrating division of a display object on a touch UI of the information processing apparatus using two hands.
- FIG. 12 depicts a view illustrating positions of display objects that are displayed on a touch UI of the information processing apparatus.
- FIGS. 13A and 13B depict views illustrating situations in which images of objects are displayed in display objects in an aggregated manner.
- FIGS. 14A and 14B depict views illustrating division of a display object 1305 in FIG. 13B into two display objects.
- FIGS. 15A and 15B depict views illustrating division of a display object 1306 in FIG. 13B into two display objects.
- FIGS. 16A to 16C depict views illustrating division of a display object 1307 in FIG. 13B into display objects 1605 and 1606.
- FIGS. 17A and 17B are diagrams illustrating a situation in which a user attempts to divide a display object 1308 in FIG. 13B with two hands but cannot divide the display object.
- FIG. 18 is a block diagram illustrating a configuration of a division processing module that is software of the information processing apparatus according to the embodiment of the present invention.
- FIGS. 19A to 19D depict views illustrating examples of tables that associate display objects with objects and manage various types of information on the display objects.
- FIG. 20 is a flowchart for describing a flow of processing performed by the division processing module of the information processing apparatus according to the embodiment.
- FIG. 21A is a flowchart for describing object selection processing in step S2003 in FIG. 20.
- FIG. 21B is a flowchart for describing object selection cancellation in step S2005 in FIG. 20.
- FIG. 22 is a flowchart for describing tearing processing performed by the information processing apparatus.
- FIG. 1 is a block diagram illustrating a hardware configuration of an information processing apparatus 1000 according to the present embodiment.
- the information processing apparatus 1000 is mainly provided with a main board 1100, a display unit 1200 such as a liquid crystal display unit, a touch panel 1300, and button devices 1400. Also, the display unit 1200 and the touch panel 1300 are collectively referred to as a touch UI 1500.
- the main board 1100 includes a CPU 1101, a wireless LAN module 1102, a power supply controller 1103, a display controller (DISPC) 1104, and a panel controller (PANELC) 1105.
- the main board 1100 further includes a ROM 1106, a RAM 1107, a secondary battery 1108, and a timer 1109. These modules 1101 to 1109 are connected to each other via a bus (not shown).
- the CPU 1101 controls the devices connected to the bus, and deploys and executes, in the RAM 1107, a software module (FIG. 2) according to the present embodiment stored in the ROM 1106.
- the RAM 1107 stores the programs deployed from the ROM 1106, as well as image data to be displayed on the display unit 1200.
- the display controller (DISPC) 1104 switches output of the image data deployed in the RAM 1107 at a high speed in accordance with a request of the CPU 1101, and outputs a synchronization signal to the display unit 1200.
- the image data stored in the RAM 1107 is output to the display unit 1200 in synchronization with the synchronization signal output by the DISPC 1104, and an image that corresponds to the image data is displayed on the display unit 1200.
- the panel controller (PANELC) 1105 controls the touch panel 1300 and the button devices 1400 in accordance with a request of the CPU 1101. With this control, the CPU 1101 is notified of a touch point on the touch panel 1300 at which a finger or an instruction tool such as a stylus pen touches, a key code that corresponds to a touched key on the button devices 1400, or the like.
- the touch point information includes a coordinate value (hereinafter referred to as the "x-coordinate") that indicates an absolute position in the lateral direction of the touch panel 1300, and a coordinate value (hereinafter referred to as the "y-coordinate") that indicates an absolute position in the vertical direction.
- the touch panel 1300 is capable of detecting multiple simultaneous touch points, and in this case, the CPU 1101 is notified of pieces of touch point information equal in number to the number of touch points.
- the touch panel 1300 may be any of various types of touch panel systems such as a resistive membrane system, a capacitance system, a surface acoustic wave system, an infrared ray system, an electromagnetic induction system, an image recognition system, or the like.
- the power supply controller 1103 is connected to the secondary battery 1108, and controls supply of power to the entire apparatus.
- the wireless LAN module 1102 establishes wireless communication with a wireless LAN module of another device in accordance with control of the CPU 1101, and mediates communication of data with that device. In the present embodiment, the wireless LAN module 1102 is an IEEE 802.11b wireless LAN module.
- the timer 1109 generates a timer interrupt to a gestural event generation section 2100 (FIG. 2) in accordance with control of the CPU 1101.
- the gestural event generation section 2100 will be described later with reference to FIG. 2.
- FIG. 2 is a block diagram illustrating a configuration of a software module executed by the CPU 1101 of the information processing apparatus 1000 according to the embodiment. Note that this software module is realized by the CPU 1101 deploying and executing, in the RAM 1107, a program stored in the ROM 1106.
- upon receipt of a touch input on the touch panel 1300 by a user, the gestural event generation section 2100 generates various types of gestural events shown in FIGS. 5A to 5M.
- the gestural event generation section 2100 transmits the generated gestural events to a gestural event processing section 2200.
- the gestural event processing section 2200 receives the gestural events generated in the gestural event generation section 2100, and executes processing that corresponds to each gestural event.
- a drawing section 2300 renders display objects on the display unit 1200 in accordance with the results of the processing executed by the gestural event processing section 2200.
- the gestural event generation section 2100 will be described in detail later with reference to FIGS. 3A to 10B.
- FIGS. 3A to 3G are diagrams illustrating examples of touch information generated through touch input by a user on the touch panel 1300 of the information processing apparatus 1000.
- although finger touch input is taken as an example of the touch input by the user, the touch input may be performed through a stylus pen or the like.
- the touch information includes, as constituent elements, a touch information number, the time of touch input, the number of touch points, and touch point coordinate information for each touch point.
- Touch information number indicates the order in which the corresponding touch information is generated.
- the time of touch input indicates the time when touch input on the touch panel 1300 was performed by the user.
- the number of touch points indicates the number of sets of coordinates at which the user is touching the touch panel 1300.
- the touch point coordinate information indicates information relating to each touch point, and includes an x-coordinate, a y-coordinate, a touch time, a release time, a moving flag, a single tap flag, a double tap flag, and a long tap flag.
- the touch time and the release time respectively indicate the time when this finger touches the touch panel 1300, and the time when this finger is released from the touch panel 1300.
- the moving flag indicates that the finger of the user is moving on the touch panel 1300.
- the single tap flag indicates that a single tap was made on the touch panel 1300 by the finger of the user.
- “single tap” refers to an operation in which the finger of the user is released within a predetermined period of time after touching the touch panel 1300.
- the double tap flag indicates that a double tap was made on the touch panel 1300 by the finger of the user.
- "double tap" refers to an operation in which single taps are made twice in succession within a predetermined period of time.
- the long tap flag indicates that a long tap was made on the touch panel 1300 by the finger of the user.
- long tap refers to an operation in which the finger of the user does not move while continuing to touch the touch panel for a predetermined period of time after touching the touch panel.
- the touch point coordinate information takes, as an entity, for example, the touch point 1 shown in FIG. 3A.
- the touch point 1 has the values of the x-coordinate, the y-coordinate, the touch time, the release time, the moving flag, the single tap flag, the double tap flag, and the long tap flag in accordance with the constituent elements of the touch point coordinate information, as shown in FIG. 3A.
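- as an illustration (not code from the patent itself), the touch information described above can be modeled as follows; this is a minimal Python sketch in which the class and field names are assumptions derived from the constituent elements of FIGS. 3A to 3G.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TouchPoint:
    """Touch point coordinate information (one finger)."""
    x: float                              # x-coordinate (lateral direction)
    y: float                              # y-coordinate (vertical direction)
    touch_time: float                     # time the finger touched the panel
    release_time: Optional[float] = None  # set when the finger is released
    moving: bool = False                  # moving flag
    single_tap: bool = False              # single tap flag
    double_tap: bool = False              # double tap flag
    long_tap: bool = False                # long tap flag

@dataclass
class TouchInfo:
    """One piece of touch information (cf. FIGS. 3A to 3G)."""
    number: int                           # touch information number
    input_time: float                     # time of touch input
    points: List[TouchPoint] = field(default_factory=list)

    @property
    def num_points(self) -> int:          # number of touch points
        return len(self.points)

# Every generated piece of touch information is kept (cf. the stored
# pointers described below), so all past values remain referenceable.
touch_history: List[TouchInfo] = []
```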
- FIGS. 3A to 3G each illustrate an example of the touch information.
- FIGS. 3A to 3G are arranged in time series, and each indicate a piece of touch information at a certain point of time. Also, the pieces of touch information shown in FIGS. 3A to 3G are generated in this order.
- the shaded regions denote regions that have different touch information from that of the last touch information.
- when touch information is generated, a pointer indicating the generated touch information is stored. Therefore, by referencing this pointer, it is possible to reference the values of the latest touch information. Also, all pointers of touch information generated in the past are stored, and therefore it is possible to reference all values of the held touch information. For example, the pieces of touch information P1 to P7 of FIGS. 3A to 3G can each be referenced in this way.
- FIGS. 3A to 3G will briefly be described. All the pieces of touch information are generated in step S601 in FIG. 6, step S701 in FIG. 7, step S901 in FIG. 9, or steps S1011 or S1013 in FIG. 10B.
- the touch point 1 indicates the values of touch point coordinate information of the finger of the user.
- a touch point 2 indicates the values of touch point coordinate information of the second finger of the user.
- moving flags of the touch points 1 and 2 show "TRUE", indicating that the two fingers are moving.
- the x-coordinates and the y-coordinates change but the touch time does not change.
- the touch points 1 and 2 are at the same coordinates as those in FIG. 3D, and the newly added touch point 3 indicates the values of touch point coordinate information of the third finger.
- the release time of the touch point 2 shows the time when the second finger is released.
- the x-coordinates, the y-coordinates, and the touch times of the touch points 1 to 3 are not changed, and the long tap flag of the touch point 2 is changed to "TRUE".
- FIG. 4 is a flowchart for describing gestural event generation processing performed by the information processing apparatus 1000 according to the embodiment. This flowchart shows processing from detection of touch input on the touch panel 1300 by the user until generation of event processing corresponding to each gesture operation of the user. This processing is executed by the gestural event generation section 2100 of the software module, and is described here as processing that is executed by the CPU 1101.
- step S401 the CPU 1101 executes initialization processing.
- step S402 the CPU 1101 checks whether or not touch input of the user or an interrupt of the timer 1109 has occurred, and if touch input or an interrupt has occurred, the procedure advances to step S403, and otherwise the procedure returns to step S402.
- step S403 the CPU 1101 determines whether or not touch of a new finger of the user has been detected, and if touch of a new finger has been detected, the procedure advances to step S404, and otherwise the procedure shifts to step S405.
- step S404 the CPU 1101 executes processing associated with the touch of a new finger of the user and the procedure returns to step S402. The details of step S404 will be described later with reference to the flowchart of FIG. 6.
- step S405 the CPU 1101 determines whether or not movement of the finger of the user that is touching the touch panel has been detected, and if movement has been detected, the procedure advances to step S406, and otherwise the procedure shifts to step S407.
- step S406 the CPU 1101 executes processing associated with the movement of the finger of the user, and the procedure returns to step S402. The details of step S406 will be described later with reference to the flowchart in FIG. 7.
- step S407 the CPU 1101 determines whether or not release of the finger of the user from the touch panel 1300 has been detected, and if the finger has been released, the procedure advances to step S408, in which processing associated with touch release by the user is executed, and returns to step S402. The details of this step S408 will be described later with reference to the flowchart in FIG. 9.
- step S407 if release of the finger has not been detected, the procedure shifts to step S409.
- step S409 the CPU 1101 executes processing that is performed when timer interrupt of the timer 1109 has occurred, and returns to step S402.
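- the flow of FIG. 4 is essentially an event loop that dispatches on what changed since the last touch information. The following Python sketch shows that structure only; the handler and helper names are hypothetical, not from the patent.

```python
def gestural_event_loop():
    """Sketch of FIG. 4: wait for touch input or a timer interrupt,
    then dispatch to the matching handler (steps S403 to S409)."""
    initialize()                           # S401
    while True:
        event = wait_for_touch_or_timer()  # S402: blocks until something occurs
        if is_new_finger(event):           # S403
            handle_new_touch(event)        # S404 (FIG. 6)
        elif is_finger_move(event):        # S405
            handle_move(event)             # S406 (FIG. 7)
        elif is_finger_release(event):     # S407
            handle_release(event)          # S408 (FIG. 9)
        else:                              # timer interrupt
            handle_timer_interrupt(event)  # S409 (FIGS. 10A and 10B)
```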
- FIGS. 5A to 5M are diagrams illustrating a list of event names that are generated in the flowcharts of FIGS. 6, 7, 9, and 10, and pieces of information that are transmitted to the gestural event processing section 2200 from the gestural event generation section 2100 when the corresponding event has been generated.
- FIG. 6 is a flowchart for describing processing that is associated with touch of a new finger of the user in step S404 in FIG. 4.
- step S601 the CPU 1101 generates new touch information if the last touch information does not exist. On the other hand, if the last touch information does exist, the CPU 1101 changes the touch information number, the time of touch input, and the number of touch points of this last touch information. Further, touch information incorporating a touch point of the finger that newly touched the touch panel is generated (see FIG. 3B).
- last touch information refers to touch information that was generated immediately before the touch information generated in step S601.
- latest touch information refers to the touch information generated in step S601.
- for example, the shaded regions in FIG. 3B are regions that differ from those in FIG. 3A: the touch information number is changed to "2", the time of touch input is changed to 1"21, the number of touch points is changed to "2", and a "touch point 2" that corresponds to touch input of the second finger of the user is added.
- the touch information P5 of FIG. 3E is similarly generated with reference to the touch information P4 of FIG. 3D.
- step S602 the CPU 1101 executes processing for transmitting a touch event.
- in the processing for transmitting a touch event, the coordinate values of the touch input and the number of touch points, which are the latest pieces of touch information, are transmitted to the gestural event processing section 2200 (see FIG. 5A), and the processing associated with touch of the finger of the user ends.
- FIG. 7 is a flowchart for describing processing associated with movement of the finger of the user in step S406 in FIG. 4.
- step S701 the CPU 1101 changes the touch information number and the time of touch input of the last touch information, and generates touch information whose moving flag is "TRUE" (see FIG. 3C).
- last touch information refers to touch information that was generated immediately before the touch information generated in step S701.
- latest touch information refers to the touch information generated in step S701.
- for example, the touch information P3 of FIG. 3C is newly generated with reference to the touch information P2. Specifically, the touch information number is changed to "3", the time of touch input is changed to 3"00, the values of the x-coordinates and the y-coordinates of the touch points 1 and 2 are changed to the latest values, and the moving flags are changed to "TRUE".
- step S702 the CPU 1101 determines whether or not the number of touch points of the latest touch information is "1", and if it is "1", the procedure advances to step S703, and otherwise the procedure shifts to step S704.
- step S703 the CPU 1101 executes processing for transmitting a swipe event, and the procedure returns to the flowchart of FIG. 4.
- swipe refers to an operation in which a fingertip moves (slides) in one direction while remaining in contact with the touch panel 1300.
- if a swipe event has been generated, as illustrated in FIG. 5B, the coordinate values of the latest touch information, and a moving distance obtained based on a difference in the coordinate values between the latest touch information and the last touch information, are transmitted to the gestural event processing section 2200.
- step S704 the CPU 1101 determines whether or not the number of touch points of the latest touch information is "2". If so, the procedure advances to step S705, and otherwise the procedure shifts to step S714.
- step S705 the CPU 1101 determines whether or not the length of a straight line that connects the two touch points has become shorter, and thereby determines whether or not pinch-in has been performed. If so, the procedure advances to step S706, and otherwise the procedure advances to step S707. In step S706, the CPU 1101 executes processing for transmitting a pinch-in event, and the procedure returns to the flowchart of FIG. 4.
- pinch-in refers to an operation in which two fingertips move closer to each other (in a pinching manner) while remaining in contact with the touch panel 1300. Accordingly, if a pinch-in event has been generated, as illustrated in FIG. 5C, the coordinate values of the center of the two touch points, and a reduction ratio of pinch-in that is calculated from the reduced length of the straight line connecting the two touch points are transmitted to the gestural event processing section 2200.
- step S707 the CPU 1101 determines whether or not pinch-out has been performed by determining whether or not the length of the straight line connecting the two touch points has extended, and if so, the procedure advances to step S708, and otherwise the procedure advances to step S709.
- step S708 processing for transmitting a pinch-out event is performed and the procedure returns to the flowchart of FIG. 4.
- pinch-out refers to an operation in which two fingertips move away from each other (so that the fingers spread apart) while remaining in contact with the touch panel 1300.
- if the pinch-out event has been generated, as illustrated in FIG. 5D, the coordinate values of the center of the two touch points of the latest touch information and an extension ratio of pinch-out that is calculated from an extended length of the straight line connecting the two touch points are transmitted to the gestural event processing section 2200.
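- in other words, pinch-in and pinch-out reduce to comparing the length of the segment between the two touch points across successive pieces of touch information. A minimal sketch, assuming each touch point is an (x, y) tuple; the exact definition of the ratio is an assumption:

```python
import math

def classify_pinch(last_pts, latest_pts):
    """Steps S705 to S708: classify a two-finger move as pinch-in or
    pinch-out from the change in distance between the two touch points."""
    d_last = math.dist(last_pts[0], last_pts[1])
    d_latest = math.dist(latest_pts[0], latest_pts[1])
    center = ((latest_pts[0][0] + latest_pts[1][0]) / 2,
              (latest_pts[0][1] + latest_pts[1][1]) / 2)
    if d_latest < d_last:
        return "pinch_in", center, d_latest / d_last   # reduction ratio
    if d_latest > d_last:
        return "pinch_out", center, d_latest / d_last  # extension ratio
    return None  # length unchanged: neither pinch-in nor pinch-out
```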
- step S709 the CPU 1101 determines whether or not a two point swipe has been performed by determining whether or not the two touch points are moving in the same direction, and if it is determined that a two point swipe has been performed, the procedure advances to step S710, and otherwise the procedure advances to step S711.
- step S710 the CPU 1101 executes processing for transmitting a two point swipe event and returns to the flowchart of FIG. 4. If the two point swipe event has been generated, as illustrated in FIG. 5E, the coordinate values of the two touch points of the latest touch information, and a moving distance obtained based on a difference in the values of the two touch points between the latest and the last touch information, are transmitted to the gestural event processing section 2200.
- step S711 the CPU 1101 determines whether or not rotation has been performed on the basis of rotation of the two touch points, and if rotation has been performed, the procedure advances to step S712, and otherwise the procedure advances to step S713.
- step S712 the CPU 1101 executes processing for transmitting a rotate event and returns to the flowchart of FIG. 4. If the rotate event has been generated, as illustrated in FIG. 5F, the coordinate values of the center of the two touch points and a rotation angle calculated from the latest and the last touch information (see FIGS. 8A and 8B) are transmitted to the gestural event processing section 2200.
- otherwise, the CPU 1101 advances the procedure to step S713 to perform other processing, and returns to the flowchart of FIG. 4.
- the other processing may be processing in which nothing is performed.
- step S704 if it is determined that the number of touch points of the latest touch information is not "2", the CPU 1101 advances the procedure to step S714 to perform processing for transmitting an event that is generated when three or more touch points have moved, and the procedure returns to the flowchart of FIG. 4. If, as shown in FIG. 5M, the three or more touch point move event has been generated, the following information is transmitted to the gestural event processing section 2200.
- This information includes all coordinate values of the latest touch information, the latest coordinate values of the centroid calculated from all the touch points, the number of the latest coordinates, all coordinate values of the last touch information, and the last coordinate values of the centroid.
- FIGS. 8A and 8B illustrate a situation in which the user performs a rotation operation on the touch panel 1300 in a clockwise direction.
- in FIG. 8A, the user is touching two points (x1, y1) and (x2, y2) with his or her two fingers, the coordinate values of the center of the rotation are at the center of a straight line m that connects these two points, and an angle a is the angle formed by a line parallel to the x-axis and the straight line m.
- in FIG. 8B, the user is touching two points (x1', y1') and (x2', y2') with his or her two fingers, the coordinate values of the center of rotation are at the center of a straight line n that connects these two points, and an angle b is the angle formed by a line parallel to the x-axis and the straight line n.
- the rotation angle is obtained by subtracting the angle b from the angle a.
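- the angles a and b can be computed with atan2, so the rotation-angle calculation described above reduces to a few lines; a sketch under the same coordinate naming:

```python
import math

def rotation_angle(p1, p2, q1, q2):
    """Rotation angle per FIGS. 8A and 8B: the angle a of line m (through
    the earlier touch points p1, p2) minus the angle b of line n (through
    the latest touch points q1, q2), in degrees."""
    a = math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))  # line m vs x-axis
    b = math.degrees(math.atan2(q2[1] - q1[1], q2[0] - q1[0]))  # line n vs x-axis
    center = ((q1[0] + q2[0]) / 2, (q1[1] + q2[1]) / 2)  # center of rotation
    return a - b, center
```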
- FIG. 9 is a flowchart for describing processing associated with the touch release in step S408 in FIG. 4.
- step S901 the CPU 1101 changes the touch information number, the time of touch input, and the number of touch points of the last touch information, and generates touch information in which a release time is set for the touch point from which touch has been released.
- last touch information refers to touch information that was generated immediately before the touch information generated in step S901.
- latest touch information refers to the touch information generated in step S901. For example, if the last touch information is the touch information P5 of FIG. 3E, the touch information P6 of FIG. 3F is newly generated with reference to the touch information P5. Specifically, the touch information number is changed to "6", the time of touch input is changed to 7"00, the number of touch points is changed to "2", and the release time of the touch point 2, from which touch has been released, is set to 7"00.
- step S902 the CPU 1101 determines whether or not the moving flag of the touch-released touch point in the latest touch information is "TRUE", and if so, the procedure advances to step S903 and otherwise the procedure advances to step S904.
- step S903 the CPU 1101 recognizes that the finger has been released during the movement since the moving flag of the touch-released touch point is "TRUE", executes processing for transmitting a flick event, and the procedure advances to step S909.
- flick refers to an operation in which a finger is released (in a manner in which the finger is flicked) during a swipe. If a flick event has been generated, as illustrated in FIG. 5G, the coordinate values of the latest touch information and a moving speed of the finger calculated from the coordinate values of the latest touch information and the last touch information are transmitted to the gestural event processing section 2200.
- step S904 the CPU 1101 determines whether or not the single tap flag of the touch-released touch point is "TRUE", and if so, the procedure advances to step S905, and otherwise the procedure advances to step S906.
- step S905 since a single tap has already been made on the touch-released touch point, the CPU 1101 sets a double tap flag for the touch-released touch point to on, and the procedure advances to step S909.
- step S906 the CPU 1101 determines whether or not "(release time) - (touch time) < predetermined period of time" holds for the touch-released touch point, and if so, the procedure advances to step S907, and otherwise the procedure advances to step S908.
- step S907 since touch has been released within a predetermined period of time, the CPU 1101 sets the single tap flag for the touch-released touch point to on, and the procedure advances to step S909.
- step S908 since touch has been released after the predetermined period of time has elapsed, the CPU 1101 sets the long tap flag for this touch-released touch point to on, and the procedure advances to step S909.
- step S909 the CPU 1101 executes processing for transmitting a touch release event. If the touch release event has been generated, as illustrated in FIG. 5H, the coordinate values of the latest touch information are transmitted to the gestural event processing section 2200.
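- the release-time branching of FIG. 9 can thus be summarized as a classification over the released point's flags and timing; a compact Python sketch, with the tap threshold as an assumed parameter and `TouchPoint` as sketched earlier:

```python
def on_release(pt, release_time, tap_threshold=0.3):
    """Steps S902 to S909: decide which events to transmit when a finger
    is released. `pt` is the touch-released TouchPoint."""
    events = []
    if pt.moving:                                    # S903: released mid-swipe
        events.append("flick")
    elif pt.single_tap:                              # S905: second quick release
        pt.double_tap = True
    elif release_time - pt.touch_time < tap_threshold:
        pt.single_tap = True                         # S907: quick release
    else:
        pt.long_tap = True                           # S908: long press released
    events.append("touch_release")                   # S909: always transmitted
    return events
```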
- FIGS. 10A and 10B are flowcharts for describing the processing that is associated with the timer interrupt in step S409 in FIG. 4.
- step S1001 the CPU 1101 determines whether or not, in the latest touch information, there is a touch point whose double tap flag is "TRUE", and if there is such a touch point, the procedure advances to step S1002, and otherwise the procedure advances to step S1003.
- step S1002 the CPU 1101 executes processing for transmitting a double tap event since the double tap flag of the touch point is set to on, and the procedure advances to step S1005.
- if the double tap event has been generated, as illustrated in FIG. 5I, the coordinate values of the latest touch information are transmitted to the gestural event processing section 2200.
- step S1003 the CPU 1101 determines whether or not, in the latest touch information, there is a touch point whose single tap flag is "TRUE", and if there is such a touch point, the procedure advances to step S1004, and otherwise the procedure advances to step S1005.
- step S1004 the CPU 1101 executes processing for transmitting a single tap event since the single tap flag of the touch point is set to on, and the procedure advances to step S1005. If the single tap event has been generated, as illustrated in FIG. 5J, the coordinate values of the latest touch information are transmitted to the gestural event processing section 2200.
- step S1005 the CPU 1101 determines whether or not, in the latest touch information, there is a touch point whose long tap flag is "TRUE", and if there is such a touch point, the procedure advances to step S1006, and otherwise the procedure advances to step S1007.
- step S1006 the CPU 1101 executes processing for transmitting a long tap event since the long tap flag of the touch point is set to on, and the procedure advances to step S1007. If the long tap event has been generated, as illustrated in FIG. 5K, the coordinate values of the latest touch information are transmitted to the gestural event processing section 2200.
- step S1007 the CPU 1101 determines whether or not, in the latest touch information, there is a touch point for which a predetermined period of time or more has elapsed since the touch time, and if there is such a touch point, the procedure advances to step S1008, and otherwise the procedure advances to step S1010 (FIG. 10B) .
- step S1008 the CPU 1101 searches, with respect to the touch point of the latest touch information for which a predetermined period of time or more has elapsed since touch has been made, all touch points of the past touch information, and determines whether or not there is a touch point whose moving flag has been changed to "TRUE" among the latest and the past information. If there is such a touch point, the procedure advances to step S1010, and otherwise the procedure advances to step S1009.
- step S1009 the CPU 1101 executes processing for transmitting a touch and hold event because the touch point has a moving flag that has not been set to on after the finger touched the touch panel and has been held without moving for a predetermined period of time or more, and the procedure advances to step S1010. If the touch and hold event has been generated, as illustrated in FIG. 5L, the coordinate values of the latest touch information are transmitted to the gestural event processing section 2200.
- step S1010 the CPU 1101 determines whether or not there is a touch point whose moving flag has been set to on, and if there is such a touch point, the procedure advances to step S1011, and otherwise the procedure advances to step S1012.
- step S1011 the CPU 1101 changes the touch information number and the time of touch input of the last touch information, and generates touch information in which the moving flags are set to off for all the touch points, and the procedure advances to step S1012. For example, if the last touch information is the touch information P3 of FIG. 3C, the touch information P4 of FIG. 3D is newly generated with reference to the touch information P3. Specifically, in FIG. 3D, the touch information number is changed to "4", the time of touch input is changed to 3"050, and the moving flags are set to "FALSE".
- the reason why the time of touch input shows 3"050 is that the timer interrupt is assumed to be generated at a 50 millisecond interval, for example.
- step S1012 the CPU 1101 determines whether or not there is a touch point for which a predetermined period of time has elapsed since the release time of the touch point, and if there is such a touch point, the procedure advances to step S1013, and otherwise, the processing associated with the timer interrupt ends.
- step S1013 the CPU 1101 changes the touch information number of the last touch information, and generates touch information that excludes the touch point for which the predetermined period of time has elapsed since the release time. For example, if the last touch information is the touch information P6 of FIG. 3F, the touch information P7 of FIG. 3G is newly generated with reference to the touch information P6. Specifically, the touch information number is changed to "7", and the touch point 2 in FIG. 3F is deleted.
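- putting the timer-interrupt branch together: each tick inspects the flags in the latest touch information, transmits the corresponding tap or hold events, clears the moving flags, and prunes expired points. A simplified sketch (it consults only the current moving flag rather than the full history searched in step S1008, and the thresholds are assumptions):

```python
def on_timer_interrupt(latest, now, hold_threshold=0.5, expire_threshold=0.5):
    """Simplified sketch of FIGS. 10A and 10B (steps S1001 to S1013)."""
    events = []
    if any(p.double_tap for p in latest.points):      # S1001 -> S1002
        events.append("double_tap")
    elif any(p.single_tap for p in latest.points):    # S1003 -> S1004
        events.append("single_tap")
    if any(p.long_tap for p in latest.points):        # S1005 -> S1006
        events.append("long_tap")
    for p in latest.points:                           # S1007 -> S1009
        held = p.release_time is None and now - p.touch_time >= hold_threshold
        if held and not p.moving:
            events.append("touch_and_hold")
    for p in latest.points:                           # S1011: clear moving flags
        p.moving = False
    latest.points = [p for p in latest.points         # S1013: drop expired points
                     if p.release_time is None
                     or now - p.release_time < expire_threshold]
    return events
```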
- FIGS. 11A and 11B illustrate division of a display object 1123 on the touch UI 1500 of the information processing apparatus 1000 using two hands.
- here, "object" refers to an entity of each page object of a document such as a PDF file, and "display object" refers to a preview image or the like of each page that is preview-displayed.
- the definition of an object is not particularly limited to this.
- FIG. 11A illustrates a situation in which display objects including the display object 1123 are displayed on the touch UI 1500.
- the two hands 1121 and 1122 touch the display object 1123 with three or more fingers of each hand (a total of six or more fingers), and respectively move in the directions of arrows 1128 and 1129.
- when the two hands 1121 and 1122 have moved a predetermined distance in the directions of the arrows 1128 and 1129, the display object 1123 is divided into two display objects 1125 and 1126, as illustrated in FIG. 11B.
- FIG. 12 illustrates positions of display objects that are displayed on the touch UI 1500 in the embodiment.
- the position of each display object is assumed to be the upper left vertex of a rectangle representing the display object.
- the coordinates of the upper left point of the touch UI 1500 are defined as (0, 0), the horizontal axis indicates the x-axis, and the vertical axis indicates the y-axis.
- Coordinates C1 to C3 show the positions of the respective display objects. Also, with respect to the size of each display object, X denotes the horizontal length thereof, Y denotes the vertical length thereof, and D denotes the distance between the positions of adjacent display objects.
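- as a worked example of this layout, the upper left vertex of the i-th display object can be derived from C1 and the spacing D; the single-row arrangement and the concrete values below are assumptions for illustration only:

```python
def display_object_position(i, c1=(40, 40), D=240):
    """Upper left vertex Ci of the i-th display object (i = 0, 1, 2, ...),
    assuming the objects are laid out in a single horizontal row and D is
    the distance between the positions of adjacent display objects."""
    x1, y1 = c1
    return (x1 + i * D, y1)

# C1, C2, C3 as in FIG. 12
c1, c2, c3 = (display_object_position(i) for i in range(3))
```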
- FIGS. 13A and 13B illustrate situations in which images of objects are displayed in display objects in an aggregated manner.
- Reference numerals 1301 to 1304 in FIG. 13A and reference numerals 1305 to 1308 in FIG. 13B respectively denote display objects.
- FIG. 13A illustrates display objects that are set to so-called "1-in-1", in which an image of one object is displayed in one display object.
- FIG. 13B illustrates a situation in which the display objects are set to different aggregation settings.
- a display object 1305 is set to 1-in-1 in which only an image of an object 1 is displayed.
- a display object 1306 is set to 2-in-1 in which images of objects 2 and 3 are displayed.
- a display object 1307 is set to 4-in-1 in which images of objects 4 to 7 are displayed.
- a display object 1308 is set to 6-in-1 in which images of objects 8 to 13 are displayed.
- FIG. 18 is a block diagram illustrating a configuration of a division processing module 1801 that is software of the information processing apparatus according to the embodiment of the present invention.
- This division processing module 1801 is included in the gestural event processing section 2200 of FIG. 2, and executes processing with respect to the touch event (FIG. 5A) , the touch release event (FIG. 5H) , and the three or more finger move event (FIG. 5M) .
- a touch event processing section 1802 processes the touch event.
- a touch release event processing section 1803 processes the touch release event.
- a three or more finger move event processing section 1804 processes the three or more finger move event.
- a tearing processing section 1805 included in the three or more finger move event processing section 1804 executes processing for dividing a display object when an operation using two hands as illustrated in FIG. 11B has been performed as the three or more finger move event.
- FIGS. 19A to 19D illustrate examples of tables that associate display objects with objects and manage various types of information on the display objects. These tables are data held in the RAM 1107, and can be read and written by the tearing processing section 1805. Also, defaults of the tables are prepared in advance.
- the drawing section 2300 of FIG. 2 loads the information of the tables and executes drawing
- display object number is a number for uniquely specifying a display object to be displayed on the touch UI 1500.
- Object number is an ID of an object with which the object can uniquely be identified.
- Storage address shows an address in the RAM 1107 where each object is stored. Rectangular coordinates are coordinates of the position on the touch UI 1500 at which the upper left corner of a rectangle showing the corresponding display object is located.
- Selection flag is a flag that indicates whether or not the corresponding object is selected, the selection flag being "TRUE" when the object is selected and "FALSE" otherwise. At the end of the object information, "ENDOBJ", which indicates the end of the object information, is stored in the object number. Also, in this case, all other values such as the storage address and the rectangular coordinates are not used.
- the drawing section 2300 loads the object information and displays preview images of the objects in respective rectangular regions.
- objects that have the same display object numbers are displayed in one display object in an aggregated manner.
- the numbers of objects in the horizontal and vertical directions are determined with reference to the aggregation setting.
- An object whose selection flag is set to on is displayed as being selected (for example, with a highlighted color).
- FIG. 19A illustrates object information in which, as illustrated in FIG. 13B, objects are set to 1-in-1, 2-in-1, 4-in-1, and 6-in-1 in order from the object with the object number "1".
- FIGS. 14A and 14B illustrate division of the display object 1305 in FIG. 13B into two display objects 1405 and 1406. Note that the same reference numerals are given to the elements common to FIG. 11A.
- FIG. 14A illustrates a state in which the two hands 1121 and 1122 are touching the display object 1305, and FIG. 14B illustrates a state in which, by the right and left hands respectively moving in the directions of the arrows, the display object 1305 is divided into the two display objects 1405 and 1406.
- the divided display objects 1405 and 1406 (1-1 and 1-2) have in common a landscape display format having a long horizontal length.
- each divided portrait object having a long vertical length is rotated 90 degrees in the left direction, and the vertical length is enlarged so as to be aligned with the length of the display object 1305.
- the divided portrait objects may be displayed without being rotated or enlarged.
- FIGS. 15A and 15B illustrate division of the display object 1306 in FIG. 13B into display objects 1505 and 1506. Note that the same reference numerals are given to the elements common to FIG. 11A.
- the changes in the object information at this time are shown by the changes between FIG. 19A and FIG. 19C (the changed parts are indicated by shaded regions in FIG. 19C) .
- the contents of the changes at this time will be described later in detail with reference to FIG. 22.
- FIGS. 16A to 16C illustrate division of the display object 1307 in FIG. 13B into display objects 1605 and 1606.
- FIG. 16A illustrates the display object 1307 in which the objects are arranged in 4-in-1.
- FIG. 16B illustrates a state in which the display object 1307 has been divided into the display objects 1605 and 1606 as illustrated in FIG. 11B.
- the display objects have in common a landscape display format also after the division of the display object 1307. Therefore, display is performed such that the divided portrait objects 1605 and 1606 each having a long vertical length are rotated 90 degrees in the left direction, and the display objects 1605 and 1606 are enlarged so that the sizes thereof match the original size of the display object 1307.
- the changes in the object information at this time are shown by the changes between FIG. 19A and FIG. 19D (the changed parts are indicated by shaded parts in FIG. 19D) .
- the contents of the changes at this time will be described later in detail with reference to FIG. 22.
- the change in display from FIG. 16B to FIG. 16C may be made by, for example, an operator releasing his or her fingers from the touch UI 1500.
- FIGS. 17A and 17B are diagrams illustrating a situation in which a user attempts to divide the display object 1308 in FIG. 13B with two hands but cannot divide it. This is intended to prevent the display object from being divided in the middle of an object, because the number of objects in the horizontal direction is three.
- the division processing is hereinafter likewise prohibited when the number of objects to be arranged and displayed in the horizontal direction within the display object is an odd number other than 1, but the present embodiment is not limited to this.
- a display object may be divided into two at a position other than the middle, such as an intermediate portion between objects, or at all portions that can be divided. In this case, the object information is not changed.
- next, the detailed operations for dividing a display object according to the present embodiment will be described with reference to FIGS. 14A to 17B and FIGS. 20 to 22. Note that, in the present embodiment, this processing is realized by software of the information processing apparatus 1000, but may be realized by hardware modules.
- FIG. 20 is a flowchart for describing the flow of processing performed by the division processing module 1801 of the information processing apparatus according to the present embodiment.
- a program for executing this processing is stored in the ROM 1106, and the processing is realized by the CPU 1101 executing this program.
- step S2001 the CPU 1101 checks whether or not a timer event or a gestural event has been received.
- Timer event refers to an event that is generated at a certain interval by the timer 1109.
- if a timer event or a gestural event has been received, the procedure advances to step S2002, and if a timer event or a gestural event has not been received, the procedure returns to step S2001 to check again whether or not an event has been received.
- step S2002 the CPU 1101 determines whether or not the detected event is a touch event. If the detected event is determined to be a touch event, the procedure advances to step S2003, and otherwise the procedure advances to step S2004.
- step S2003 the CPU 1101 executes object selection processing. This processing is performed by the touch event processing section 1802 of the division processing module 1801. The flowchart of the object selection processing is illustrated in FIG. 21A, and a description thereof will be given later.
- step S2003 ends, the procedure returns to step S2001.
- if the detected event is determined not to be a touch event in step S2002, the procedure advances to step S2004, in which the CPU 1101 determines whether or not the event detected in step S2001 is a touch release event. If the event is determined to be a touch release event, the procedure advances to step S2005, and otherwise the procedure advances to step S2006.
- step S2005 the CPU 1101 executes object selection cancellation processing. This processing is performed by the touch release event processing section 1803 of the division processing module 1801. This processing is illustrated in the flowchart of FIG. 21B, and a description thereof will be given later.
- step S2005 thus ends, the procedure returns to step S2001.
- step S2006 the CPU 1101 determines whether or not the detected event is a three or more finger move event. If the detected event is determined to be a three or more finger move event, the procedure advances to step S2007, and otherwise the procedure returns to step S2001 to wait to receive an event.
- step S2007 the CPU 1101 determines based on information included in the three or more finger move event whether or not the number of touch points is six or more. If the number of touch points is six or more, the procedure advances to step S2008, and otherwise the procedure returns to step S2001 to wait to receive an event.
- step S2008 the CPU 1101 executes display object tearing processing. This processing is executed by the tearing processing section 1805 included in the three or more finger move event processing section 1804 of the division processing module 1801. The flowchart of this processing is illustrated in FIG. 22, and a description thereof will be given later. When the processing in step S2008 thus ends, the procedure returns to step S2001.
- FIG. 21A is a flowchart for describing the object selection processing in step S2003 of FIG. 20 executed in the information processing apparatus according to the present embodiment. This flowchart corresponds to the processing performed by the touch event processing section 1802.
- step S3001 the CPU 1101 checks, with respect to each set of the rectangular coordinates, whether or not there are six or more touch points in an object display region.
- Object display region refers to a rectangular region whose upper left vertex is located at the above-mentioned rectangular coordinates and that has a horizontal length X and a vertical length Y.
- Touch point refers to the latest touch point that can be obtained as additional information of the touch event. If there are six or more touch points, the procedure advances to step S3002, in which the CPU 1101 sets the selection flag of the object information of an object that includes three or more touch points within the display region to "TRUE". Then, the procedure advances to step S3003 to update the display state in this display object, and the processing ends. Note that if in step S3001 there are not six or more touch points, this processing ends.
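- in essence, this selection check is a point-in-rectangle count; a Python sketch of steps S3001 to S3003, in which the table rows are modeled as dictionaries and `request_redraw` is a hypothetical hook to the drawing section 2300:

```python
def points_in_rect(touch_points, rect, X, Y):
    """Count touch points inside the rectangle whose upper left vertex is
    `rect` and whose horizontal/vertical lengths are X and Y."""
    rx, ry = rect
    return sum(1 for (px, py) in touch_points
               if rx <= px <= rx + X and ry <= py <= ry + Y)

def object_selection(display_objects, touch_points):
    """S3001 to S3003: with six or more touches on a display region, set the
    selection flag of each object that contains three or more touch points."""
    changed = False
    for disp in display_objects:
        if points_in_rect(touch_points, disp["rect"], disp["X"], disp["Y"]) >= 6:
            for obj in disp["objects"]:
                if points_in_rect(touch_points, obj["rect"], obj["X"], obj["Y"]) >= 3:
                    obj["selected"] = True        # selection flag -> "TRUE"
                    changed = True
    if changed:
        request_redraw(display_objects)           # S3003 (hypothetical)
```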
- FIG. 21B is a flowchart for describing the object selection cancellation in step S2005 of FIG. 20. This flowchart corresponds to the processing performed by the touch release event processing section 1803.
- step S3004 the CPU 1101 checks whether or not there is an object whose selection flag is "TRUE" among all the objects. If there is such an object, the procedure advances to step S3005, and if there is not such an object, the procedure ends.
- step S3005 the CPU 1101 checks, with respect to each object whose selection flag was determined to be "TRUE" in step S3004, whether or not there are six or more touch points in the rectangular region whose upper left vertex is at the rectangular coordinates. If all the objects whose selection flags are set to on include six or more touch points, the procedure ends. On the other hand, if the objects whose selection flags are set to on include at least one object in which there are not six or more touch points, the procedure advances to step S3006.
- step S3006 the CPU 1101 sets the selection flag of an object that does not include three or more touch points in the rectangular region thereof to "FALSE", that is, the CPU 1101 sets the object to be unselected. Then, the procedure advances to step S3007, and the CPU 1101 requests the drawing section 2300 to update the display image.
- FIG. 22 is a flowchart for describing the tearing processing performed by the information processing apparatus 1000.
- This processing is the processing in step S2008 of FIG. 20, and corresponds to the processing performed by the tearing processing section 1805.
- a program for executing this processing is stored in the ROM 1106, and the processing is realized by the CPU 1101 executing this program.
- This processing describes in detail the procedures, as illustrated in FIG. 11B, for dividing the display object 1123 with two hands 1121 and 1122 in a manner of tearing the display object.
- step S2201 the CPU 1101 determines whether or not the touch points of the two hands are moving in directions away from each other. This determination may be made by, for example, comparing the latest and the last coordinates of the touch points that are included in the three or more finger move event.
- step S2202 the CPU 1101 calculates an average value of distances from the latest centroid coordinates to each latest touch point.
- the latest touch point and the latest coordinates of the centroid can be obtained from the three or more finger move event.
- assuming that the coordinates of the latest touch points are denoted by Fi(t) (where i is a value from 1 to N, and N denotes the number of latest touch points), and the coordinates of the latest centroid that are calculated from these coordinates of the latest touch points are denoted by G(t), the average value av(t) of the distances is obtained by equation (1): av(t) = (1/N) × Σ (i = 1 to N) |Fi(t) - G(t)| ... (1)
- step S2203 the CPU 1101 determines whether or not the average value of the distances from the latest centroid to each touch point that is obtained by equation (1) is greater than a predetermined distance. If the average value is greater than the predetermined distance, the procedure advances to step S2204, and if the average value is equal to or less than the predetermined distance, the procedure ends. This is to determine whether or not the operator is touching the display object while spreading the fingers of his or her two hands, as illustrated in FIGS. 11A, 11B, 15A, and 15B, and if the operator is doing so, it is determined that the display object is being torn with the two hands.
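- equation (1) is a plain mean of Euclidean distances, so the check in steps S2202 and S2203 can be written directly; a sketch, with the predetermined distance as an assumed parameter:

```python
import math

def spread_wide_enough(touch_points, predetermined_distance):
    """S2202 and S2203: compute av(t), the average distance from the latest
    centroid G(t) to each latest touch point Fi(t), and compare it with the
    predetermined distance."""
    n = len(touch_points)
    gx = sum(x for x, _ in touch_points) / n   # centroid G(t)
    gy = sum(y for _, y in touch_points) / n
    av = sum(math.dist((gx, gy), p) for p in touch_points) / n  # equation (1)
    return av > predetermined_distance
```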
- step S2204 the CPU 1101 checks the object information, and determines whether or not the object whose selection flag is set to on has an aggregation setting in which the number of objects in the horizontal direction is an even number. If the number of objects in the horizontal direction is an even number, the procedure advances to step S2205, and if the number of the objects in the horizontal direction is an odd number, the procedure advances to step S2206.
- step S2205 the CPU 1101 updates the object information. At this time, different display object numbers are given to the upper half and the lower half of the selected display objects. Then, the procedure advances to step S2207, and the CPU 1101 requests the drawing section 2300 to update the display image.
- the object information is changed from FIG. 19A to FIG. 19C.
- the display object number 2 in FIG. 19A is changed to respectively different display object numbers 2-1 and 2-2 in FIG. 19C.
- the aggregation setting is changed from (2, 1) in FIG. 19A to (1, 1), and the rectangular coordinates are changed from C2 to C2-1 and C2-2.
- when the display object 1307 illustrated in FIG. 16A, which is set to 4-in-1, is divided, the display object will be as illustrated in FIG. 16C, and the object information is changed from FIG. 19A to FIG. 19D.
- the display object numbers of the object numbers 4 to 7 that are all "3" in FIG. 19A are changed such that the upper half of the objects, that is, the object numbers 4 and 5 have the display object numbers "3-1", and the lower half of the objects, that is, the object numbers 6 and 7 have the display object numbers "3-2".
- the aggregation setting is changed to (2, 1). The reason why the aggregation setting is (2, 1) instead of (1, 2) is, as described above, to display the objects in a landscape format.
- the order of allocation in the case of 4-in-1 in the present embodiment has a configuration in which the numbers 4, 5, 6, and 7 are arranged in the order of the upper left, the upper right, the lower left, and the lower right (Z order).
- the numbers are
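- The object-information update of step S2205 can be sketched as follows. This is an illustration under assumptions, not the patent's code: the dictionary-based records, the field names, and the split_aggregated helper are hypothetical. Because the objects are stored in Z order, the first half of the list is the upper half; each half receives a landscape aggregation setting (n, 1), matching the examples above (2-in-1 becomes (1, 1); 4-in-1 becomes (2, 1)).

```python
def split_aggregated(display_no, objects):
    """Divide one aggregated display object into an upper and a lower half
    (step S2205), giving each half its own display object number."""
    half = len(objects) // 2
    upper = {"display_no": f"{display_no}-1",  # e.g. "3-1"
             "objects": objects[:half],        # Z order: first half is the upper half
             "aggregation": (half, 1)}         # landscape layout, as described above
    lower = {"display_no": f"{display_no}-2",  # e.g. "3-2"
             "objects": objects[half:],
             "aggregation": (half, 1)}
    return upper, lower

# 4-in-1 example of FIGS. 16A and 16C: objects 4 and 5 go to "3-1",
# objects 6 and 7 go to "3-2", and each half is set to (2, 1).
upper, lower = split_aggregated("3", [4, 5, 6, 7])
```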
- In step S2206, the CPU 1101 checks the object information to determine whether or not the selected display objects have an aggregation setting in which the number of objects can be divided into upper and lower halves.
- In step S2208, the CPU 1101 divides one display object into two display objects, as illustrated for example in the above-described FIG. 14B, and updates the object information. At this time, a new display object row is added immediately below the selected display object.
- In step S2207, the CPU 1101 requests the drawing section 2300 to change the display image based on the object information.
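- The division of step S2208 can be sketched as follows, assuming each display object is a record with a position and a list of the objects it contains; divide_display_object, the field names, and the row-height handling are assumptions made for the illustration, not taken from the patent.

```python
def divide_display_object(display_obj, row_height):
    """Divide one display object into two (step S2208); the second display
    object forms a new row immediately below the first (cf. FIG. 14B)."""
    objs = display_obj["objects"]
    half = (len(objs) + 1) // 2  # first half keeps the extra object if odd
    x, y = display_obj["position"]
    first = {"display_no": f'{display_obj["display_no"]}-1',
             "objects": objs[:half],
             "position": (x, y)}
    second = {"display_no": f'{display_obj["display_no"]}-2',
              "objects": objs[half:],
              "position": (x, y + row_height)}  # new row below the original
    return first, second

# Example: a display object holding five stacked objects is torn into a
# display object of three objects and, below it, one of two objects.
stack = {"display_no": "5", "objects": [10, 11, 12, 13, 14], "position": (0, 0)}
first, second = divide_display_object(stack, row_height=300)
```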
- When a display object is divided, attributes of the objects in the original display object, such as for example the order of the objects having an aggregation setting, are maintained. Accordingly, if a display object that is constituted by objects each having an attribute such as a page number is divided, each divided display object can have, for example, continuous page numbers. Also, the objects included in each divided display object can be correlated with each other.
- According to the present embodiment, it is possible to divide a display object into two display objects by an operation of touching the display object with two hands and tearing it apart, and to decide which objects are included in each divided display object. Therefore, a display object for which, for example, an aggregation setting such as 2-in-1 is configured can be divided into two display objects, generating display objects that are each set to 1-in-1.
- Aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiment(s), and by a method, the steps of which are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiment(s).
- For this purpose, the program is provided to the computer for example via a network or from a recording medium of various types serving as the memory device (for example, a computer-readable medium).
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US14/009,091 US9348443B2 (en) | 2012-08-10 | 2013-08-01 | Information processing apparatus, method of controlling the same, program and storage medium |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2012178922A JP2014038383A (ja) | 2012-08-10 | 2012-08-10 | Information processing apparatus, control method therefor, and program |
| JP2012-178922 | 2012-08-10 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2014024948A1 true WO2014024948A1 (en) | 2014-02-13 |
Family
ID=50068169
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2013/071440 Ceased WO2014024948A1 (en) | 2012-08-10 | 2013-08-01 | Information processing apparatus, method of controlling the same, program and storage medium |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US9348443B2 |
| JP (1) | JP2014038383A |
| WO (1) | WO2014024948A1 |
Families Citing this family (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP5726221B2 (ja) * | 2013-02-04 | 2015-05-27 | Sharp Corporation | Data processing apparatus |
| JP6344083B2 (ja) * | 2014-06-20 | 2018-06-20 | Casio Computer Co., Ltd. | Multi-touch system, touch coordinate pair determination method, and touch coordinate pair determination program |
| JP6977661B2 (ja) * | 2018-05-14 | 2021-12-08 | Konica Minolta, Inc. | Print control apparatus, print control program, and print control method |
Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2010041826A2 (en) * | 2008-10-06 | 2010-04-15 | Samsung Electronics Co., Ltd. | Method and apparatus for managing lists using multi-touch |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7840912B2 (en) * | 2006-01-30 | 2010-11-23 | Apple Inc. | Multi-touch gesture dictionary |
| KR100646431B1 (ko) * | 2005-08-25 | 2006-11-23 | Samsung Electronics Co., Ltd. | Printing system and method for printing a plurality of documents on one page |
| JP2008201045A (ja) * | 2007-02-21 | 2008-09-04 | Brother Ind Ltd | Image forming apparatus |
| JP5164675B2 (ja) | 2008-06-04 | 2013-03-21 | Canon Inc. | User interface control method, information processing apparatus, and program |
| DE102009058145A1 (de) * | 2009-12-12 | 2011-06-16 | Volkswagen Ag | Operating method for a display device of a vehicle |
| US9104308B2 (en) * | 2010-12-17 | 2015-08-11 | The Hong Kong University Of Science And Technology | Multi-touch finger registration and its applications |
- 2012
  - 2012-08-10 JP JP2012178922A patent/JP2014038383A/ja not_active Withdrawn
- 2013
  - 2013-08-01 US US14/009,091 patent/US9348443B2/en not_active Expired - Fee Related
  - 2013-08-01 WO PCT/JP2013/071440 patent/WO2014024948A1/en not_active Ceased
Patent Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2010041826A2 (en) * | 2008-10-06 | 2010-04-15 | Samsung Electronics Co., Ltd. | Method and apparatus for managing lists using multi-touch |
Also Published As
| Publication number | Publication date |
|---|---|
| US20140062928A1 (en) | 2014-03-06 |
| JP2014038383A (ja) | 2014-02-27 |
| US9348443B2 (en) | 2016-05-24 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20130328804A1 (en) | | Information processing apparatus, method of controlling the same and storage medium |
| US9076085B2 (en) | | Image processing apparatus, image processing apparatus control method, and storage medium |
| EP2835731B1 (en) | | Image display apparatus, image display method, and image display program |
| US20200174632A1 (en) | | Thumbnail display apparatus, thumbnail display method, and computer readable medium for switching displayed images |
| US9557904B2 (en) | | Information processing apparatus, method for controlling display, and storage medium |
| US9685143B2 (en) | | Display control device, display control method, and computer-readable storage medium for changing a representation of content displayed on a display screen |
| KR20140098904A (ko) | | Multitasking operation method and terminal supporting the same |
| US10599317B2 (en) | | Information processing apparatus |
| JP6161418B2 (ja) | | Image forming apparatus, method of controlling image forming apparatus, and computer program |
| JP2020191113A (ja) | | Image processing apparatus, method of controlling image processing apparatus, and program |
| KR20120035748A (ko) | | Method and apparatus for displaying print options |
| JP2014038560A (ja) | | Information processing apparatus, information processing method, and program |
| JP2016126657A (ja) | | Information processing apparatus, method of controlling information processing apparatus, and program |
| JP6025473B2 (ja) | | Information processing apparatus, information processing method, and program |
| US20160028905A1 (en) | | Image processing apparatus, method for controlling the same, and storage medium |
| JP2014182652A (ja) | | Information processing apparatus, control method therefor, and program |
| WO2014024948A1 (en) | | Information processing apparatus, method of controlling the same, program and storage medium |
| KR102105492B1 (ko) | | Information processing apparatus, method of controlling information processing apparatus, and storage medium |
| JP7612930B2 (ja) | | Image processing apparatus, method of controlling image processing apparatus, and program |
| JP5165624B2 (ja) | | Information input device, object display method, and computer-executable program |
| JP2021028851A (ja) | | Image processing apparatus, method of controlling image processing apparatus, and program |
| JP6531425B2 (ja) | | Display device, image processing device, and program |
| JP2016053888A (ja) | | Information processing apparatus, display control method, and computer program |
| JP2019145183A (ja) | | Image processing apparatus, method of controlling image processing apparatus, and program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | WWE | Wipo information: entry into national phase | Ref document number: 14009091; Country of ref document: US |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13827952; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 13827952; Country of ref document: EP; Kind code of ref document: A1 |