WO2014148215A1 - Terminal device and object selection method - Google Patents
Terminal device and object selection method
- Publication number
- WO2014148215A1 (PCT/JP2014/054712)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- selection
- range
- selection range
- coordinate
- terminal device
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0412—Digitisers structurally integrated in a display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04104—Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- the present invention relates to a terminal device and an object selection method.
- Terminal devices equipped with a touch panel display as an input/output device, such as smartphones, tablet terminals, and personal computers, are rapidly spreading.
- An operator such as a user's finger or a stylus pen is used to select an object, such as a document or an image, displayed on the touch panel display.
- A document display device described in Patent Document 1 below is known as a technique for selecting text displayed on a touch panel display.
- In that device, a range of text displayed on the touch panel display can be selected by moving a finger touching the touch panel display in a desired direction.
- With such a technique, however, the selection operation tends to be complicated, and it is difficult to select a necessary range intuitively.
- In addition, the operation time may become longer depending on the width of the selection range, while if the operation time is set short, it is difficult to determine the selection range accurately.
- The present invention has been made in view of such problems, and aims to provide a terminal device and an object selection method capable of accurately determining a selection range by an intuitive operation when selecting an object on a touch panel display.
- A terminal device according to the invention comprises a touch panel display that displays an object and detects the approach or contact of operators; coordinate detection means for detecting coordinate values that are the approach or contact positions of a plurality of operators on the touch panel display; and object selection means for, when the coordinate values of the plurality of operators detected by the coordinate detection means are simultaneously maintained for a predetermined time, setting a predetermined selection range and selecting an object included in the selection range.
- The object selection method includes an input/output step in which the touch panel display displays an object and detects the approach or contact of operators, and a coordinate detection step in which the coordinate detection unit detects coordinate values that are the approach or contact positions of the operators on the touch panel display.
- In the terminal device or object selection method, in a state where an object is displayed on the touch panel display, the approach or contact positions of a plurality of operators on the touch panel display are detected as coordinate values, and when it is determined that the coordinate values of the plurality of operators have been maintained for a predetermined time, an object included in the selection range surrounded by those coordinate values is selected. Thereby, an object in a desired range is accurately selected by an operation lasting a predetermined time, so the selection range can be defined accurately with an intuitive operation. Furthermore, by setting the selection range based on whether the coordinate values of a plurality of operators are maintained for a certain period, conflicts with conventional input operations that use a plurality of operators are prevented.
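The selection step described above can be sketched in a few lines of Python. This is purely illustrative and not part of the patent: `objects_in_selection` and the object dictionaries are hypothetical names, and a simple bounding rectangle stands in for the "selection range surrounded by the coordinate values".

```python
def objects_in_selection(coord_a, coord_b, objects):
    """Return the objects whose display positions fall inside the
    rectangle spanned by two detected coordinate values."""
    x_lo, x_hi = sorted((coord_a[0], coord_b[0]))
    y_lo, y_hi = sorted((coord_a[1], coord_b[1]))
    return [o for o in objects
            if x_lo <= o["x"] <= x_hi and y_lo <= o["y"] <= y_hi]
```

For example, with two hovering fingers at (0, 0) and (100, 100), only objects whose coordinates fall inside that rectangle would be selected.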
- According to the invention, when selecting an object on the touch panel display, the selection range can be determined accurately by an intuitive operation.
- FIG. 1 is a schematic configuration diagram of a terminal device 1 according to an embodiment of the present invention. FIG. 2 is a diagram showing a configuration example of the data stored in the coordinate information storage unit 31. FIG. 3 is a flowchart showing the operation at the time of object selection by the terminal device 1 of FIG. 1. FIG. 4 is a flowchart showing the object omission display operation and the operation at the time of object selection by the terminal device 1 of FIG. 1. FIG. 5 is a diagram showing an output example at the time of object selection in the terminal device 1 of FIG. 1. FIG. 6 is a diagram showing an output example of the object omission display in the terminal device 1 of FIG. 1. FIG. 7 is a diagram showing a configuration example of the data stored in the coordinate information storage unit 31 in a modification of the present invention.
- FIG. 1 is a schematic configuration diagram of a terminal device 1 according to a preferred embodiment of the present invention.
- The terminal device 1 shown in FIG. 1 is an information processing terminal device, represented by a smartphone, tablet terminal, mobile phone terminal, or personal computer, that can connect to a mobile communication network or a wireless LAN (Local Area Network).
- The terminal device 1 includes a so-called touch panel display 21 as an input/output device for displaying information.
- As functional components, the terminal device 1 further includes an input/output control unit 22, a coordinate detection unit (coordinate detection means) 23, an object selection unit (object selection means) 24, an application program 25, an object storage unit 30, a coordinate information storage unit 31, and a data storage unit 32.
- the touch panel display 21 provided in the terminal device 1 is an input / output device that displays an object such as an icon, a photograph, or a document as an image, and receives an information input operation by detecting the approach or contact of an operator.
- The touch panel display 21 is configured to simultaneously detect the approach or contact of a plurality of the user's fingers as operators, and may also be configured to detect the approach or contact of a stylus pen as an operator.
- In addition, the touch panel display 21 is configured to simultaneously detect the approach distance between each of the user's fingers and the screen of the touch panel display 21.
- The touch panel display 21 is a capacitive touch panel display, and the distance between the screen and a finger (approach distance) can be calculated by detecting the amount of change in capacitance between the finger and the circuitry in the touch panel display. The touch panel display 21 can thereby distinguish between a finger touching the screen and a finger approaching the screen.
- Alternatively, the touch panel display 21 may employ a configuration, such as an in-cell type, that calculates the approach distance optically using infrared rays.
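As a minimal sketch of how a capacitive panel might map a measured capacitance change to the "contact" / "proximity" distinction described above — the threshold values and the `classify` function are hypothetical, since real controllers use device-specific calibration:

```python
# Illustrative thresholds only; a real capacitive controller reports
# raw capacitance deltas whose mapping to distance is device-specific.
TOUCH_THRESHOLD = 0.9   # normalized capacitance change at contact
HOVER_THRESHOLD = 0.2   # minimum change to register a hovering finger

def classify(cap_delta):
    """Map a normalized capacitance change to a detection type."""
    if cap_delta >= TOUCH_THRESHOLD:
        return "contact"      # finger touching the screen
    if cap_delta >= HOVER_THRESHOLD:
        return "proximity"    # finger approaching the screen
    return None               # finger too far away to detect
```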
- The input/output control unit 22 controls the display of objects on the touch panel display 21 by reading an object from the object storage unit 30, which stores the data of objects to be displayed, and delivering it to the touch panel display 21. The input/output control unit 22 also receives information input on the object through the approach or contact of the user's finger via the coordinate detection unit 23. For example, the input/output control unit 22 controls the touch panel display 21 so that an icon image is displayed, and receives input of a predetermined command corresponding to the icon image by detecting contact with it. In addition, the input/output control unit 22 passes the coordinate information of the approach or contact of the user's finger to the object selection unit 24, and when the object selection unit 24 selects a range of an object including text images or images, the input/output control unit 22 also performs display control for that range selection (details are described later).
- The input/output control unit 22 controls the display of text images such as electronic books and e-mail data, as well as photographic images and text images such as diary data posted on SNS (Social Networking Service) websites on the Internet.
- These objects may be read from the object storage unit 30, or the terminal device 1 may acquire them from the outside via a mobile communication network or wireless LAN.
- The coordinate detection unit 23 receives a signal from the touch panel display 21 and calculates, each time, a two-dimensional coordinate value (hereinafter "coordinate information") indicating the approach or contact position of a finger on the screen, together with the distance between the finger and the screen detected by the touch panel display 21 (hereinafter "approach distance"). Based on the calculated approach distance, the coordinate detection unit 23 then generates detection type information indicating whether the detected coordinate information was detected for an approaching finger or for a touching finger. The coordinate detection unit 23 may generate information indicating the approach distance itself instead of the detection type information.
- The coordinate detection unit 23 stores the two-dimensional coordinate value "(X, Y)" as coordinate information in the coordinate information storage unit 31 in association with the detection type information. When a plurality of fingers are detected simultaneously by the touch panel display 21, the coordinate detection unit 23 generates and stores a plurality of combinations of coordinate information and detection type information at the same time.
- FIG. 2 shows a configuration example of data stored in the coordinate information storage unit 31 by the coordinate detection unit 23.
- In this example, two data records, each associating detection type information with coordinate information, are stored in the coordinate information storage unit 31: detection type "proximity" with coordinate value "(X1, Y1)", and detection type "proximity" with coordinate value "(X2, Y2)".
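The record layout of FIG. 2 can be mimicked with a small in-memory store. `CoordinateInfoStore` and its methods are hypothetical names standing in for the coordinate information storage unit 31; the patent does not specify an implementation:

```python
class CoordinateInfoStore:
    """Minimal stand-in for the coordinate information storage unit 31:
    a list of records pairing detection type with a coordinate value."""

    def __init__(self):
        self.records = []

    def put(self, detection_type, coordinate):
        """Append one record, e.g. ("proximity", (X1, Y1))."""
        self.records.append(
            {"detection_type": detection_type, "coordinate": coordinate})

    def proximity_coordinates(self):
        """Coordinates of all operators detected as approaching."""
        return [r["coordinate"] for r in self.records
                if r["detection_type"] == "proximity"]
```

Storing the two records of FIG. 2 and querying for the "proximity" entries returns both coordinate values, which is the input the object selection unit works from.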
- The object selection unit 24 sets a selection range for an object, including text images and images, displayed on the touch panel display 21, selects the text data and image data included in the selection range, and stores them in the data storage unit 32. Specifically, the object selection unit 24 reads information about the plurality of operators detected by the coordinate detection unit 23 from the coordinate information storage unit 31 as needed. The object selection unit 24 then determines whether the coordinates indicated by the coordinate information of two operators included in the read information are simultaneously maintained for a preset time (for example, one second) and whether the detection type information corresponding to the two operators indicates "approaching" (hereinafter, the "range selection condition").
- In determining whether the coordinates are maintained, the object selection unit 24 determines whether the displacement of each coordinate value within the set time falls within a predetermined error range. When the range selection condition is satisfied, the object selection unit 24 sets a selection range on the touch panel display 21 determined by the coordinate information of the two operators, and selects the text data and image data in the objects included in the selection range. Thereafter, at the timing when the selection of the data is confirmed by receiving a predetermined operation from the user via the input/output control unit 22, the object selection unit 24 stores the selected data in the data storage unit 32.
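The range selection condition just described — both operators hovering, both coordinate values held within an error range for a preset time — can be written as a single predicate. This is an illustrative sketch, not the patent's implementation; the sample format, the one-second hold time, and the pixel tolerance are assumptions:

```python
def range_selection_condition(op1, op2, hold_time=1.0, tolerance=8.0):
    """True when both operators are detected as "proximity" and each
    coordinate value stays within `tolerance` pixels of its first
    sample for at least `hold_time` seconds.
    op1/op2: time-ordered lists of (x, y, t, detection_type)."""
    for samples in (op1, op2):
        if not samples or samples[-1][2] - samples[0][2] < hold_time:
            return False  # not held long enough
        x0, y0 = samples[0][0], samples[0][1]
        for x, y, _, kind in samples:
            if kind != "proximity":
                return False  # finger touched or left detection range
            if abs(x - x0) > tolerance or abs(y - y0) > tolerance:
                return False  # displacement outside the error range
    return True
```

A finger that drifts beyond the tolerance, or a hold shorter than the preset time, fails the condition, which is what distinguishes this gesture from an ordinary two-finger touch.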
- The object selection unit 24 can also deliver the confirmed data directly to the application program 25, and multiple types of data delivery operations, such as "copy", "cut", "search", and "share", may be defined.
- The object selection unit 24 confirms the selection of the object as follows after setting the selection range as described above. After determining that the range selection condition is satisfied, the object selection unit 24 determines whether the coordinates of the two operators change so as to approach each other (a so-called pinch-in operation) while the detection type information corresponding to the two operators maintains "approaching" (hereinafter, the "range selection confirmation condition"). When the range selection confirmation condition is determined to be satisfied, the object selection unit 24 confirms the selected data. The range selection confirmation condition may require that the coordinates of both operators change, or only that the coordinates of one operator change so as to approach the other.
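The pinch-in test behind the range selection confirmation condition reduces to comparing the distance between the two operators before and after movement. The sketch below is hypothetical (the `min_shrink` ratio is an assumed threshold, not from the patent); it also covers the variant where one operator stays fixed while the other approaches it:

```python
import math

def is_pinch_in(a_start, a_end, b_start, b_end, min_shrink=0.8):
    """True when the distance between the two operators shrinks by at
    least the given ratio (a pinch-in).  One operator may stay fixed
    while only the other approaches it."""
    d_before = math.dist(a_start, b_start)
    d_after = math.dist(a_end, b_end)
    return d_after < d_before * min_shrink
```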
- When setting the selection range of an object, the object selection unit 24 can also control an abbreviated (omitted) display of the object via the input/output control unit 22 as follows. Specifically, in a state where the coordinate values of two operators are detected, the object selection unit 24 determines whether the detection type information of one operator is "contact" while that of the other is "approach", the coordinate value of the former is maintained, and the position indicated by the coordinate value of the latter changes so as to approach the position of the former, that is, whether the operation is a pinch-in operation (hereinafter, the "abbreviated display condition").
- When the object selection unit 24 determines that the abbreviated display condition is satisfied, it omits the display of the range of the displayed object sandwiched between the positions indicated by the coordinate values of the two operators, and controls the input/output control unit 22 so that the part on the side of the operator for which "approach" was detected is displayed folded back onto the omitted range.
- the application program 25 is various programs for processing data delivered from the object selection unit 24.
- the application program 25 is a document creation program, an e-mail creation program, a browser for connecting to a website, an electronic book viewing program, or the like.
- the application program 25 executes processing corresponding to the data delivery operation type. For example, when the operation type “Copy” is delivered, the data is copied to document data or an e-mail document.
- When the operation type "search" is delivered, the data is used as a search key to perform a search process on the Internet.
- When the operation type "share" is delivered, the data is transmitted for posting on a website.
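The dispatch from operation type to application-program processing can be sketched as a lookup table. The handler bodies are placeholders of my own (the patent only names the operation types), so the returned strings are purely illustrative:

```python
def handle_delivery(operation_type, data):
    """Dispatch delivered data to processing per operation type.
    The handlers are placeholders for the application program's
    actual copy / search / share logic."""
    handlers = {
        "copy":   lambda d: f"copied:{d}",      # copy into a document
        "search": lambda d: f"search-key:{d}",  # use as a search key
        "share":  lambda d: f"posted:{d}",      # transmit for posting
    }
    handler = handlers.get(operation_type)
    if handler is None:
        raise ValueError(f"unsupported operation type: {operation_type}")
    return handler(data)
```

A table-driven dispatch like this makes it easy to register the "many types" of delivery operations the description mentions without changing the selection logic itself.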
- FIG. 3 is a flowchart showing an operation at the time of object selection by the terminal device 1
- FIG. 4 is a flowchart showing an operation of the object omission display by the terminal device 1 and an operation at the time of object selection.
- FIG. 5 shows an output example of the touch panel display 21 when an object is selected
- FIG. 6 shows an output example of an object omission display on the touch panel display 21.
- The approach of the user's fingers is detected at two points by the coordinate detection unit 23 while an object is displayed on the touch panel display 21 (step S101, FIGS. 5(a) and 5(b)). Thereafter, the object selection unit 24 determines whether the range selection condition is satisfied (step S102). When it is determined that the range selection condition is not satisfied (step S102; NO), the input/output control unit 22 determines whether the user's fingers are in contact at two points and those points are moving closer together or farther apart, that is, whether a so-called pinch-in/pinch-out operation is being performed (step S103).
- When it is determined that a pinch operation has been performed (step S103; YES), the input/output control unit 22 executes enlargement/reduction processing of the displayed object (step S104). When it is determined that no pinch operation has been performed (step S103; NO), the process returns to step S101 and the object range selection process resumes.
- When the range selection condition is satisfied, the object selection unit 24 highlights the selected object range G1 or surrounds it with a frame, and a button image G2 for allowing the user to select an operation type for data delivery is displayed on the touch panel display 21 (step S105, FIG. 5(c)). In addition, the object selection unit 24 performs control so that cursors G3 and G4 are displayed over the object at the start and end positions of the selected range.
- the object selection unit 24 determines whether or not the range selection confirmation condition is satisfied (step S106).
- When it is determined that the range selection confirmation condition is satisfied (step S106; YES), the object selection unit 24 confirms the selected data, sets the operation type to "copy", and stores the data in the data storage unit 32 (step S107).
- When it is determined that the range selection confirmation condition is not satisfied (step S106; NO), the object selection unit 24 determines whether the approach or contact of the two fingers is no longer detected, that is, whether an operation of releasing the fingers is detected (step S108). If the two fingers are still detected (step S108; NO), the process returns to step S105 and the display of the object selection range continues. When the operation of releasing the fingers is detected (step S108; YES), the object selection unit 24 determines whether a start point position changing process has been accepted by detecting the user's contact operation (slide operation) on the cursor G3 (step S109).
- If the start point position changing process is accepted, the object selection unit 24 changes the start point position of the object selection range (step S110); otherwise, the start point position of the selection range is maintained unchanged.
- Next, the object selection unit 24 determines whether an end point position changing process has been accepted by detecting the user's contact operation (slide operation) on the cursor G4 (step S111). If accepted, the object selection unit 24 changes the end point position of the object selection range (step S112); otherwise, the end point position of the selection range is maintained unchanged.
- Next, the object selection unit 24 determines whether a data delivery operation type has been selected by detecting the user's contact with the button image G2 (step S113). When the selection of the operation type is accepted (step S113; YES), the data of the object in the selection range is delivered from the object selection unit 24 to the application program 25 together with the operation type information (step S114), and processing corresponding to the operation type is started by the application program 25 using the delivered data.
- In the omission display operation, the approach of the user's fingers is first detected at two points by the coordinate detection unit 23 while an object is displayed on the touch panel display 21 (step S201). Thereafter, the object selection unit 24 determines whether the abbreviated display condition is satisfied (steps S202 and S203). When it is determined that the abbreviated display condition is satisfied (step S202; YES and step S203; YES), the object selection unit 24 omits the display of the object range G5 sandwiched between the positions of the two fingers and, at the same time, displays the partial object range G6 on the side of the finger for which "approach" was detected folded back toward the range G5 side (step S204, FIG. 6(a)). In this state, the object selection range is set and confirmed in the same manner as steps S101 to S114 of FIG. 3 (steps S205 to S218, FIG. 6(b)).
- When executing the abbreviated display of the object, if a single operation does not advance the object far enough, the object selection unit 24 determines the abbreviated display condition repeatedly so that the advancing display operation can be repeated. At this time, the advancing display operation is repeated when the user's finger for which "approach" had been detected is no longer detected on the touch panel display 21.
- As described above, in the terminal device 1, the approach positions of a plurality of fingers on the touch panel display 21 are detected as coordinate values, and when it is determined that the coordinate values of the plurality of fingers have been maintained for a predetermined time, an object included in the selection range surrounded by those coordinate values is selected.
- Consequently, the operation time is not affected by the width of the selection range, there is no difficulty in setting the selection range even though the operation time is short, and an object in the desired range is accurately selected by an operation lasting a fixed time.
- Since the selection range is set based on whether the coordinate values of a plurality of fingers are maintained for a certain period, conflicts with conventional multi-finger input operations are prevented. Specifically, because the selection range is set only when the "approach" coordinates of two fingers are maintained for a certain period, the operation can be clearly distinguished from conventional pinch-in and pinch-out operations, which are also performed with two fingers. Furthermore, since the object is hardly hidden by the selection operation, the user can set the selection range while visually confirming it, which further improves operability.
- Since the object selection unit 24 confirms the selected data when a pinch-in operation is detected as the range selection confirmation condition, the selection can be confirmed by a smooth operation continuous with the range selection operation.
- Since the object selection unit 24 can execute the omission display of the object, a selection range can be set more easily when the user wishes to select a wide range of the object.
- the abbreviated display operation can be realized by an intuitive operation.
- In a modification of the invention, the object selection unit 24 may determine the range selection condition, the range selection confirmation condition, and the abbreviated display condition using the detection results of four of the user's fingers.
- In this modification, the object selection unit 24 reads from the coordinate information storage unit 31, as shown in FIG. 7, detection information about the four fingers: detection type "proximity" with coordinate value "(X1, Y1)", detection type "proximity" with coordinate value "(X2, Y2)", detection type "proximity" with coordinate value "(X3, Y3)", and detection type "proximity" with coordinate value "(X4, Y4)".
- As the range selection condition, the object selection unit 24 determines whether the coordinates indicated by the coordinate information of the four operators included in the read information are simultaneously maintained for a preset time and whether the detection type information corresponding to the four operators indicates "approaching". As the range selection confirmation condition, the object selection unit 24 determines whether a pinch-in operation using two fingers of each hand is performed, that is, whether the coordinates of one pair of the four fingers change so as to approach each other while the coordinates of the other pair also change so as to approach each other (a so-called pinch-in with both hands), with the detection type information corresponding to the four fingers maintaining "approaching". The object selection unit 24 also determines the abbreviated display condition, and executes the abbreviated display of the object when the abbreviated display condition is satisfied for each of the two pairs of fingers.
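The both-hands pinch-in check extends the two-finger case by testing each hand's pair of coordinates separately. As before, this is an illustrative sketch with hypothetical names and an assumed shrink ratio, not the patent's implementation:

```python
import math

def both_hands_pinch_in(hands_start, hands_end, min_shrink=0.8):
    """Four-finger confirmation: each hand's pair of fingers moves
    closer together (a pinch-in performed with both hands at once).
    hands_start / hands_end: ((p, q), (r, s)) coordinate pairs,
    grouped per hand, before and after the movement."""
    for (p0, q0), (p1, q1) in zip(hands_start, hands_end):
        # Each hand's finger pair must shrink by at least the ratio.
        if math.dist(p1, q1) >= math.dist(p0, q0) * min_shrink:
            return False
    return True
```

If either hand's pair fails to close, the condition as a whole fails, mirroring the requirement that both pairs of coordinates change so as to approach each other.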
- FIG. 8 is a flowchart showing an operation at the time of object selection using the detection result of four fingers by the terminal device 1
- FIG. 9 is a diagram showing an output example of the touch panel display 21 at the time of object selection.
- The approach of the user's fingers is detected at four points by the coordinate detection unit 23 while an object is displayed on the touch panel display 21 (step S301, FIG. 9(a)). Thereafter, the object selection unit 24 determines whether the range selection condition is satisfied (step S302). When it is determined that the range selection condition is not satisfied (step S302; NO), the input/output control unit 22 determines whether the user's fingers are in contact at two points and those points are moving closer together or farther apart, that is, whether a so-called pinch-in/pinch-out operation is being performed (step S303).
- When it is determined that a pinch operation has been performed (step S303; YES), the input/output control unit 22 executes enlargement/reduction processing of the displayed object (step S304). When it is determined that no pinch operation has been performed (step S303; NO), the process returns to step S301 and the object range selection process resumes.
- When it is determined that the range selection condition is satisfied (step S302; YES), the object selection unit 24 highlights the selected object range G7 or surrounds it with a frame, and simultaneously displays button images G8 and G9 on the touch panel display 21 for allowing the user to select an operation type for data delivery (step S305, FIG. 9(b)). Next, the object selection unit 24 determines whether the range selection confirmation condition is satisfied (step S306).
- When it is determined that the range selection confirmation condition is satisfied (step S306; YES), the object selection unit 24 confirms the selected data, sets the operation type to "capture", and stores frame image data including the image and text in the selected range in the data storage unit 32 (step S307).
- When it is determined that the range selection confirmation condition is not satisfied (step S306; NO), the object selection unit 24 determines whether the approach or contact of the four fingers is no longer detected, that is, whether an operation of releasing the fingers has been detected (step S308). If the four fingers are still detected (step S308; NO), the process returns to step S305 and the display of the object selection range continues. On the other hand, when an operation of releasing the fingers is detected (step S308; YES), the object selection unit 24 determines whether a change of the start point position has been accepted by detecting a contact operation (slide operation) by the user on the selection range G7 (step S309).
- When the start point position change is accepted (step S309; YES), the object selection unit 24 changes the start point position of the object selection range (step S310); when it is not accepted (step S309; NO), the object selection unit 24 maintains the start point position of the object selection range unchanged.
- Next, the object selection unit 24 determines whether a change of the end point position has been accepted by detecting a contact operation (slide operation) by the user on the selection range G7 (step S311).
- When the end point position change is accepted (step S311; YES), the object selection unit 24 changes the end point position of the object selection range (step S312); when it is not accepted (step S311; NO), the object selection unit 24 maintains the end point position of the object selection range unchanged.
- The change of the start point position and the end point position of the selection range is executed by detecting a touch operation on the upper-left, upper-right, lower-right, or lower-left corner, or on the upper, right, lower, or left side, of the frame of the selection range G7.
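The corner/side hit test just described can be sketched as a hypothetical helper (not the patent's implementation; the region names and the touch-tolerance margin are assumptions):

```python
def hit_region(rect, point, margin=16):
    """Return which part of the selection frame a touch hits: one of the
    four corners, one of the four sides, 'inside', or 'outside'.
    rect = (left, top, right, bottom); margin is the touch tolerance in px."""
    left, top, right, bottom = rect
    x, y = point
    if not (left - margin <= x <= right + margin
            and top - margin <= y <= bottom + margin):
        return "outside"
    near_left = abs(x - left) <= margin
    near_right = abs(x - right) <= margin
    near_top = abs(y - top) <= margin
    near_bottom = abs(y - bottom) <= margin
    # Corners take priority over sides, since both tests hold there.
    if near_top and near_left:
        return "top-left"
    if near_top and near_right:
        return "top-right"
    if near_bottom and near_right:
        return "bottom-right"
    if near_bottom and near_left:
        return "bottom-left"
    if near_top:
        return "top"
    if near_right:
        return "right"
    if near_bottom:
        return "bottom"
    if near_left:
        return "left"
    return "inside"
```

A slide that starts on a corner would move both coordinates of the nearest endpoint; a slide on a side would move only one axis.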
- Next, the object selection unit 24 determines whether an operation type for the data transfer has been selected by detecting the user's contact with the button images G8 and G9 while the object selection range is displayed (step S313).
- When the selection of an action type is accepted (step S313; YES), the data of the objects in the selection range is delivered from the object selection unit 24 to the application program 25 together with the action type information (step S314), and the application program 25 starts processing corresponding to the operation type using the delivered data.
- In the above, the determination of the object range selection condition is triggered by the simultaneous detection of two or four operators on the touch panel display. However, the determination of the object range selection condition may also be started when the coordinates of the two points, or the four coordinates, are detected with a time shift.
- Further, the object selection unit 24 determines whether the approach of two or four points is detected as the range selection condition and the range selection confirmation condition; however, it may also determine that the range selection condition and the range selection confirmation condition are satisfied when contact is detected.
- The data transfer operation type selectable when the range selection confirmation condition is satisfied is not limited to "copy" and "capture"; various other operation types, such as "cut" and "paste into mail body", may be set.
- The object selection unit 24 of the terminal device 1 may operate, as shown in FIG. 10, to select the received mails displayed inside the selection range G10 when the range selection condition is satisfied while a received mail list screen is displayed on the touch panel display 21 by an e-mail application.
- This selection range G10 is set as a rectangular area having two sides in one direction (along the short-side direction of the touch panel display 21), based on the coordinates of the two operators.
- In this case, the object selection unit 24 sets the delivery operation type of the received mail to "delete", "move to another folder", or the like, and sets the data to be delivered as the body of the e-mail including its header.
- The object selection unit 24 may also operate to select the bookmarks displayed inside the selection range G11 when the range selection condition is satisfied while a bookmark selection screen is displayed on the touch panel display 21 by a Web browser application or the like.
- This selection range G11 is set as a rectangular area that contains the coordinates of the two operators.
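Ranges such as G11 (and G13 and G14 below) are rectangles derived from the coordinates of two operators. A minimal sketch of that computation, treating the two touch points as opposite corners (the function name and tuple layout are assumptions, not the patent's notation):

```python
def selection_rect(p1, p2):
    """Build the axis-aligned selection rectangle spanned by the two
    operator coordinates, returned as (left, top, right, bottom)."""
    (x1, y1), (x2, y2) = p1, p2
    return (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))
```

Because min/max are taken per axis, the result is the same regardless of which finger touches first or where the two points lie relative to each other.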
- In this case, the object selection unit 24 sets the data to be delivered as data including the text data and image data downloaded for the selected bookmark, or as download-destination address information such as the URL (Uniform Resource Locator) corresponding to the selected bookmark.
- When the selection includes the folder G12, the object selection unit 24 selects all the bookmarks contained in the folder G12.
- The object selection unit 24 may also operate to select the application programs displayed inside the selection range G13 when the range selection condition is satisfied while an application list screen is displayed on the touch panel display 21 by the operating system or the like.
- This selection range G13 is set as a rectangular area that contains the coordinates of the two operators inside its four corners.
- In this case, the object selection unit 24 sets the delivery operation type of the application program to "install", "uninstall", "share", "copy", or the like. The object selection unit 24 then sets the data to be delivered as the storage-location information (shortcut) of the execution program corresponding to the selected application program, text data and image data related to the execution program, or address information such as the installation address or uninstallation address corresponding to the application program. Further, when the selected objects include a folder in which a plurality of application programs is stored, the object selection unit 24 selects all the application programs included in the folder.
- The object selection unit 24 may also operate to select the image data displayed inside the selection range G14 when the range selection condition is satisfied while a photo list screen is displayed on the touch panel display 21 by a photo editing application program or the like.
- This selection range G14 is set as a rectangular area that contains the coordinates of the two operators inside its four corners. When the selected objects include a folder in which a plurality of image data items is stored, the object selection unit 24 selects all the image data included in the folder.
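Several of the cases above (bookmarks, application programs, photos) expand a selected folder into all of the items it contains. A hedged sketch of that expansion, using an assumed dict-based object model (folders carry a `children` list; leaf items do not):

```python
def expand_selection(objects):
    """Flatten a selection: whenever a selected object is a folder,
    replace it with every item stored inside it (recursively)."""
    selected = []
    for obj in objects:
        if "children" in obj:        # folder: take its contents instead
            selected.extend(expand_selection(obj["children"]))
        else:                        # leaf item: take it directly
            selected.append(obj["name"])
    return selected
```

With this model, selecting a folder of photos yields every photo inside it, matching the behavior described for G12 and G14.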
- The object selection unit 24 may also operate to select the map image displayed inside the selection range when the range selection condition is satisfied while the map selection image G is displayed on the touch panel display 21 by a map search application program or the like.
- This selection range G14 is set as a rectangular area that contains the coordinates of the two operators inside its four corners.
- In this case, the object selection unit 24 can include, as the data to be delivered, address information such as a URL corresponding to the position indicated by the map image, in addition to the image data corresponding to the selected map image.
- The object selection unit 24 of the terminal device 1 of the above-described embodiment may operate to exclude a partial range from the selection after a selection range has once been set.
- FIG. 15 is a conceptual diagram showing the range on the touch panel display 21 excluded by the selection range exclusion process of the object selection unit 24, and FIG. 16 is a diagram showing an image of the objects selected through the selection range exclusion process of the object selection unit 24.
- Specifically, the object selection unit 24 sets the selection range G16 when the range selection condition is satisfied, based on the positions L1 and L2 indicated by the coordinate information of two operators. Before the selection of the objects is confirmed, it determines whether the range selection condition is satisfied for the coordinate information of two further operators. When that range selection condition is satisfied, the object selection unit 24 sets a selection range G17 defined by the positions L3 and L4 indicated by the coordinate information of those operators. Furthermore, the object selection unit 24 recognizes whether the selection range G17 set second is contained in the selection range G16 set first, and when it recognizes that it is contained, excludes the selection range G17 from the selection range G16 used for selecting objects.
- In this way, the data of the two separated regions G18 can be selected from the objects, such as text data, displayed on the touch panel display 21 by a simple operation using four of the user's operators (FIG. 16(a), FIG. 16(b)). As a result, the operation for excluding a part of the selected objects can be made more efficient.
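The exclusion behavior (G16 minus G17) can be sketched as rectangle subtraction. The strip decomposition below is one possible implementation under assumed (left, top, right, bottom) tuples, not the patent's method:

```python
def contains(outer, inner):
    """True if rectangle `inner` lies entirely within rectangle `outer`."""
    return (outer[0] <= inner[0] and outer[1] <= inner[1]
            and inner[2] <= outer[2] and inner[3] <= outer[3])

def exclude_range(first, second):
    """If the second selection range is contained in the first, exclude it:
    the result is the first range minus the second, as a list of rects."""
    if not contains(first, second):
        return [first]                 # no exclusion: keep the first range
    l1, t1, r1, b1 = first
    l2, t2, r2, b2 = second
    # Split the remainder into up to four non-overlapping strips.
    parts = [(l1, t1, r1, t2),         # band above the hole
             (l1, b2, r1, b1),         # band below the hole
             (l1, t2, l2, b2),         # band left of the hole
             (r2, t2, r1, b2)]         # band right of the hole
    return [(l, t, r, b) for (l, t, r, b) in parts if l < r and t < b]
```

Degenerate strips (when the hole touches an edge of the outer range) are filtered out by the final width/height check.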
- Similarly, the object selection unit 24 of the terminal device 1 may operate to add a partial range after a selection range has once been set.
- FIG. 17 is a conceptual diagram showing the range on the touch panel display 21 added by the selection range addition process of the object selection unit 24, and FIG. 18 is a diagram showing an image of the objects selected through the selection range addition process of the object selection unit 24.
- Specifically, the object selection unit 24 sets the selection range G19 when the range selection condition is satisfied, based on the positions L5 and L6 indicated by the coordinate information of two operators. Before the selection of the objects is confirmed, it determines whether the range selection condition is satisfied for the coordinate information of two further operators. When that range selection condition is satisfied, the object selection unit 24 sets the selection range G20 defined by the positions L7 and L8 indicated by the coordinate information of those operators. Further, the object selection unit 24 recognizes whether the selection range G20 set second lies outside the selection range G19 set first, and when it recognizes that it lies outside, adds the selection range G20 to the selection range G19 used for selecting objects.
- The object selection unit 24 also recognizes whether a part of the selection range G22 set second overlaps the selection range G21 set first. When it recognizes that a part of the selection range G22 overlaps the selection range G21, the object selection unit 24 adds to the selection range G21, for selecting objects, the part of the selection range G22 that lies outside the selection range G21 and does not overlap it.
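The addition behavior (G19 plus G20, or the non-overlapping part of G22) can be modeled by keeping the selection as a list of rectangles; any overlapping area then simply counts once in membership tests. A sketch under that assumed representation:

```python
def add_range(ranges, new):
    """Add a new selection rectangle to an existing selection (a list of
    rectangles) when it lies at least partly outside every current one."""
    def contains(outer, inner):
        return (outer[0] <= inner[0] and outer[1] <= inner[1]
                and inner[2] <= outer[2] and inner[3] <= outer[3])
    if any(contains(r, new) for r in ranges):
        return list(ranges)            # already fully covered: nothing to add
    return list(ranges) + [new]        # union: overlap counts only once

def in_selection(ranges, point):
    """Membership test against the combined (possibly overlapping) ranges."""
    x, y = point
    return any(l <= x <= r and t <= y <= b for (l, t, r, b) in ranges)
```

Keeping the union implicit (a list plus an `any()` membership test) avoids having to compute the geometric union of partially overlapping rectangles.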
- Likewise, when the object selection unit 24 recognizes vertically written text data displayed on the touch panel display 21, it sets an octagonal selection range G25 that includes the text written between the two text positions T3 and T4 corresponding to the coordinate information of the operators.
- Further, when the object selection unit 24 recognizes list information, such as a photo data list or an application list, displayed on the touch panel display 21, it may set, as illustrated, a hexagonal selection range G26 corresponding to the coordinate information of the two operators as the boundary of the selection range. That is, the object selection unit 24 can change the method of setting the selection range according to the display direction of the objects and the display area of each object.
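For horizontally written text, the octagonal range between two operator positions corresponds to selecting characters in reading order: the tail of the first line, all full middle lines, and the head of the last line. A sketch assuming a fixed character grid (the line height and character width are illustrative parameters, not from the patent):

```python
def select_text_range(lines, start, end, line_h=20, char_w=10):
    """Return the text between two touch positions in horizontally written
    text, in reading order; the covered region is the 'octagonal' range."""
    # Order the two touch points by reading order (top-to-bottom, then
    # left-to-right), so either finger may mark the start.
    (sx, sy), (ex, ey) = sorted([start, end], key=lambda p: (p[1], p[0]))
    srow, scol = int(sy // line_h), int(sx // char_w)
    erow, ecol = int(ey // line_h), int(ex // char_w)
    if srow == erow:
        return lines[srow][scol:ecol + 1]
    picked = [lines[srow][scol:]]            # tail of the first line
    picked += lines[srow + 1:erow]           # full middle lines
    picked.append(lines[erow][:ecol + 1])    # head of the last line
    return "".join(picked)
```

A production implementation would map touch coordinates through the actual text layout (proportional fonts, wrapping) rather than a fixed grid.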
- The object selection unit 24 of the terminal device 1 may operate to recognize the type of the objects to be selected and to automatically select objects of a specific type from among the objects included in the selection range, based on a predetermined criterion. For example, the object selection unit 24 determines whether the type of each object included in the selection range is "file" or "folder", and automatically selects the objects whose type is "file". The object selection unit 24 may also operate to recognize the most numerous type among the objects included in the selection range and to automatically select the objects of that type.
- Specifically, when the types of the objects included in the selection range G27 are "folder", "DOC file", "JPEG file", and "PPT file", the object selection unit 24 displays a button image G28 on the touch panel display so that the user can select an object selection criterion. At this point, when the user inputs the criterion "select files only", the object selection unit 24 selects, from the objects included in the selection range G27, those of the types "DOC file", "JPEG file", and "PPT file".
- Alternatively, the object selection unit 24 selects the objects of the most numerous type, "JPEG file", from among the objects included in the selection range G27 (FIG. 22(b)). Such an operation allows the objects desired by the user to be selected efficiently.
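The two selection criteria just described ("files only" and "most numerous type") can be sketched as a small filter; the criterion names and the (name, type) pair format are assumptions for illustration:

```python
from collections import Counter

def auto_select(objects, criterion="majority"):
    """Auto-select objects in the range by type.
    criterion='files'    -> keep everything that is not a folder
    criterion='majority' -> keep only the most numerous type
    `objects` is a list of (name, type) pairs."""
    if criterion == "files":
        return [o for o in objects if o[1] != "folder"]
    counts = Counter(t for _, t in objects)
    top_type, _ = counts.most_common(1)[0]   # most numerous object type
    return [o for o in objects if o[1] == top_type]
```

For the G27 example, the "files" criterion would drop the folder, while the majority criterion would keep only the "JPEG file" objects.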
- The object selection means in the above embodiment preferably sets a first selection range when a plurality of coordinate values is maintained for a predetermined time, then sets a second selection range when a plurality of coordinate values is again maintained for a predetermined time, and further sets the selection range for selecting objects by excluding the second selection range from, or adding it to, the first selection range. This configuration improves the degree of freedom of object selection through continuous operation with a plurality of operators, making the operation at the time of object selection more efficient.
- The object selection means preferably excludes the second selection range from the first selection range when it recognizes that the second selection range is contained in the first selection range. This makes the operation for excluding selected objects more efficient.
- The object selection means preferably adds the second selection range to the first selection range when it recognizes that the second selection range lies outside the first selection range. In this case, the operation for adding to the object selection can be made efficient.
- The object selection means preferably recognizes the type of the objects to be selected and changes, according to that type, the boundary of the selection range set for the plurality of coordinate values. With such object selection means, a selection range appropriate to the type of object can be set, making the operation at the time of object selection still more efficient.
- The object selection means preferably recognizes the types of the objects included in the selection range and selects objects of a specific type from among them based on a predetermined criterion. This configuration allows the objects desired by the user to be selected efficiently.
- The object selection means preferably recognizes the most numerous object type among the objects included in the selection range and selects the objects of that type from among them. In this way, the objects desired by the user can be selected efficiently.
- The object selection means preferably confirms the selection of the objects included in the selection range when, after setting the selection range, it detects a change in the coordinate value of at least one of the plurality of operators. It is also preferable for the object selection means to confirm the selection when, after setting the selection range, it detects a change in the coordinate values of two or more of the plurality of operators. In this way, the selection of the selected objects can be confirmed by a continuous, smooth operation.
- The object selection means preferably operates such that, when the coordinate value of one operator detected by the coordinate detection means is maintained while the position indicated by the coordinate value of the other operator changes so as to approach the position of the one operator, part of the object in the range between the two coordinate values is displayed moved up.
- The present invention is applicable to a terminal device and an object selection method with which, when selecting an object on a touch panel display, the selection range can be determined accurately by an intuitive operation.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
In the embodiment described above, a text image is described as the object to be selected, but various other objects may be targeted. For example, as shown in FIG. 10, the object selection unit 24 of the terminal device 1 operates to select the received mails displayed inside the selection range G10 when the range selection condition is satisfied while a received mail list screen is displayed on the touch panel display 21 by an e-mail application. This selection range G10 is set as a rectangular area having two sides in one direction (along the short-side direction of the touch panel display 21), based on the coordinates of the two operators. In this case, the object selection unit 24 sets the delivery operation type of the received mail to "delete", "move to another folder", or the like, and sets the data to be delivered as the body of the e-mail including its header.
The object selection unit 24 of the terminal device 1 of the above embodiment may operate to exclude a partial range from the selection after a selection range has once been set. FIG. 15 is a conceptual diagram showing the range on the touch panel display 21 excluded by the selection range exclusion process of the object selection unit 24, and FIG. 16 is a diagram showing an image of the objects selected through the selection range exclusion process of the object selection unit 24.
The object selection unit 24 of the terminal device 1 of the above embodiment may operate to add a partial range after a selection range has once been set. FIG. 17 is a conceptual diagram showing the range on the touch panel display 21 added by the selection range addition process of the object selection unit 24, and FIG. 18 is a diagram showing an image of the objects selected through the selection range addition process of the object selection unit 24.
The object selection unit 24 of the terminal device 1 of the above embodiment may operate to recognize the type of the objects to be selected and to change, according to that type, the method of setting the selection range for the coordinate information of the operators. For example, as shown in FIG. 19(a), when the object selection unit 24 recognizes horizontally written text data displayed on the touch panel display 21, it sets, as the boundary of the selection range, an octagonal selection range G24 that includes the text written between the two text positions T1 and T2 corresponding to the coordinate information of the two operators. Similarly, as shown in FIG. 19(b), when the object selection unit 24 recognizes vertically written text data displayed on the touch panel display 21, it sets, as the boundary of the selection range, an octagonal selection range G25 that includes the text written between the two text positions T3 and T4 corresponding to the coordinate information of the operators. Thus, a selection range appropriate to the type of object can be set, making the operation at the time of object selection still more efficient.
The object selection unit 24 of the terminal device 1 of the above embodiment may operate to recognize the type of the objects to be selected and to automatically select objects of a specific type from among the objects included in the selection range, based on a predetermined criterion. For example, the object selection unit 24 determines whether the type of each object included in the selection range is "file" or "folder", and automatically selects the objects whose type is "file". The object selection unit 24 also operates to recognize the most numerous type among the objects included in the selection range and to automatically select the objects of that type.
Claims (11)
- A terminal device comprising: a touch panel display that displays objects and detects the approach or contact of operators; coordinate detection means for detecting a coordinate value that is the approach position of an operator to the touch panel display or the contact position of the operator; and object selection means for setting, when the coordinate values of a plurality of operators detected by the coordinate detection means are simultaneously maintained for a predetermined time, a selection range defined by the coordinate values of the plurality of operators, and for selecting the objects included in the selection range.
- The terminal device according to claim 1, wherein the object selection means sets a first selection range when a plurality of the coordinate values is maintained for a predetermined time, thereafter sets a second selection range when a plurality of the coordinate values is maintained for a predetermined time, and further sets the selection range for selecting the objects by excluding the second selection range from, or adding it to, the first selection range.
- The terminal device according to claim 2, wherein the object selection means excludes the second selection range from the first selection range when it is recognized that the second selection range is contained in the first selection range.
- The terminal device according to claim 2, wherein the object selection means adds the second selection range to the first selection range when it is recognized that the second selection range lies outside the first selection range.
- The terminal device according to any one of claims 1 to 4, wherein the object selection means recognizes the type of the object to be selected and changes, according to the type, the boundary of the selection range set for the plurality of coordinate values.
- The terminal device according to any one of claims 1 to 5, wherein the object selection means recognizes the types of the objects included in the selection range and selects objects of a specific type from among the objects included in the selection range based on a predetermined criterion.
- The terminal device according to claim 6, wherein the object selection means recognizes the most numerous object type among the objects included in the selection range and selects the objects of that type from among the objects included in the selection range.
- The terminal device according to any one of claims 1 to 7, wherein the object selection means confirms, after setting the selection range, the selection of the objects included in the selection range when it detects a change in the coordinate value of at least one of the plurality of operators.
- The terminal device according to any one of claims 1 to 8, wherein the object selection means confirms, after setting the selection range, the selection of the objects included in the selection range when it detects a change in the coordinate values of two or more of the plurality of operators.
- The terminal device according to any one of claims 1 to 9, wherein, when the coordinate value of one operator detected by the coordinate detection means is maintained and the position indicated by the coordinate value of the other operator detected by the coordinate detection means changes so as to approach the position indicated by the coordinate value of the one operator, the object selection means displays part of the object in the range between the coordinate value of the one operator and the coordinate value of the other operator so as to be moved up.
- An object selection method comprising: an input/output step in which a touch panel display displays objects and detects the approach or contact of operators; a coordinate detection step in which coordinate detection means detects a coordinate value that is the approach position of an operator to the touch panel display or the contact position of the operator; and an object selection step in which object selection means sets, when the coordinate values of a plurality of operators detected by the coordinate detection means are simultaneously maintained for a predetermined time, a selection range defined by the coordinate values of the plurality of operators, and selects the objects included in the selection range.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/778,940 US9690464B2 (en) | 2013-03-21 | 2014-02-26 | Terminal device and method for selecting object |
EP14768594.5A EP2977883B1 (en) | 2013-03-21 | 2014-02-26 | Terminal device and method for selecting object |
JP2015506670A JP6043423B2 (ja) | 2013-03-21 | 2014-02-26 | 端末装置及びオブジェクト選択方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-058661 | 2013-03-21 | ||
JP2013058661 | 2013-03-21 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014148215A1 true WO2014148215A1 (ja) | 2014-09-25 |
Family
ID=51579907
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/054712 WO2014148215A1 (ja) | 2013-03-21 | 2014-02-26 | 端末装置及びオブジェクト選択方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US9690464B2 (ja) |
EP (1) | EP2977883B1 (ja) |
JP (1) | JP6043423B2 (ja) |
WO (1) | WO2014148215A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016206723A (ja) * | 2015-04-15 | 2016-12-08 | キヤノン株式会社 | 表示装置及び表示方法 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10156908B2 (en) * | 2015-04-15 | 2018-12-18 | Sony Interactive Entertainment Inc. | Pinch and hold gesture navigation on a head-mounted display |
KR102330605B1 (ko) * | 2016-06-22 | 2021-11-24 | 가부시키가이샤 한도오따이 에네루기 켄큐쇼 | 반도체 장치 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10108092A (ja) * | 1996-10-03 | 1998-04-24 | Nippon Telegr & Teleph Corp <Ntt> | 映像表示装置 |
JP2010097473A (ja) * | 2008-10-17 | 2010-04-30 | Sony Corp | 表示装置、表示方法及びプログラム |
JP2012521048A (ja) * | 2009-03-16 | 2012-09-10 | アップル インコーポレイテッド | タッチスクリーンディスプレイを有する多機能デバイスでの編集の方法およびグラフィカルユーザインターフェース |
JP2012185323A (ja) * | 2011-03-04 | 2012-09-27 | Sharp Corp | 再生装置、再生方法、プログラムおよび記録媒体 |
JP2012190073A (ja) * | 2011-03-08 | 2012-10-04 | Japan Science & Technology Agency | 家事計画作成支援装置および家事計画作成支援方法 |
JP2012226531A (ja) * | 2011-04-19 | 2012-11-15 | Konica Minolta Business Technologies Inc | ファイル処理システム、管理装置、および制御プログラム |
JP2013008201A (ja) | 2011-06-24 | 2013-01-10 | Sharp Corp | 文章表示装置、文章表示方法、プログラムおよび記録媒体 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
MY119560A (en) | 1996-05-27 | 2005-06-30 | Nippon Telegraph & Telephone | Scheme for detecting captions in coded video data without decoding coded video data |
US7743348B2 (en) * | 2004-06-30 | 2010-06-22 | Microsoft Corporation | Using physical objects to adjust attributes of an interactive display application |
US7644374B2 (en) | 2005-04-14 | 2010-01-05 | Microsoft Corporation | Computer input control for specifying scope with explicit exclusions |
JP4605279B2 (ja) * | 2008-09-12 | 2011-01-05 | ソニー株式会社 | 情報処理装置、情報処理方法、及びプログラム |
KR101586627B1 (ko) | 2008-10-06 | 2016-01-19 | 삼성전자주식회사 | 멀티 터치를 이용한 리스트 관리 방법 및 장치 |
JP5217960B2 (ja) * | 2008-11-26 | 2013-06-19 | 株式会社リコー | 画像処理装置、画像処理方法、及びプログラム |
JP5333321B2 (ja) * | 2009-09-30 | 2013-11-06 | アイシン・エィ・ダブリュ株式会社 | ナビゲーション装置 |
-
2014
- 2014-02-26 JP JP2015506670A patent/JP6043423B2/ja active Active
- 2014-02-26 WO PCT/JP2014/054712 patent/WO2014148215A1/ja active Application Filing
- 2014-02-26 EP EP14768594.5A patent/EP2977883B1/en active Active
- 2014-02-26 US US14/778,940 patent/US9690464B2/en active Active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10108092A (ja) * | 1996-10-03 | 1998-04-24 | Nippon Telegr & Teleph Corp <Ntt> | 映像表示装置 |
JP2010097473A (ja) * | 2008-10-17 | 2010-04-30 | Sony Corp | 表示装置、表示方法及びプログラム |
JP2012521048A (ja) * | 2009-03-16 | 2012-09-10 | アップル インコーポレイテッド | タッチスクリーンディスプレイを有する多機能デバイスでの編集の方法およびグラフィカルユーザインターフェース |
JP2012185323A (ja) * | 2011-03-04 | 2012-09-27 | Sharp Corp | 再生装置、再生方法、プログラムおよび記録媒体 |
JP2012190073A (ja) * | 2011-03-08 | 2012-10-04 | Japan Science & Technology Agency | 家事計画作成支援装置および家事計画作成支援方法 |
JP2012226531A (ja) * | 2011-04-19 | 2012-11-15 | Konica Minolta Business Technologies Inc | ファイル処理システム、管理装置、および制御プログラム |
JP2013008201A (ja) | 2011-06-24 | 2013-01-10 | Sharp Corp | 文章表示装置、文章表示方法、プログラムおよび記録媒体 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016206723A (ja) * | 2015-04-15 | 2016-12-08 | キヤノン株式会社 | 表示装置及び表示方法 |
Also Published As
Publication number | Publication date |
---|---|
EP2977883A4 (en) | 2016-05-11 |
JPWO2014148215A1 (ja) | 2017-02-16 |
JP6043423B2 (ja) | 2016-12-14 |
US20160041726A1 (en) | 2016-02-11 |
US9690464B2 (en) | 2017-06-27 |
EP2977883B1 (en) | 2018-09-05 |
EP2977883A1 (en) | 2016-01-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7328182B2 (ja) | 画像処理装置、画像処理装置の制御方法及びプログラム | |
CN108509115B (zh) | 页操作方法及其电子装置 | |
EP2372516B1 (en) | Methods, systems and computer program products for arranging a plurality of icons on a touch sensitive display | |
JP6338318B2 (ja) | 操作装置、画像形成装置及びコンピュータプログラム | |
US9766739B2 (en) | Method and apparatus for constructing a home screen in a terminal having a touch screen | |
KR102033801B1 (ko) | 인플레이스 방식으로 값을 편집하는 사용자 인터페이스 제공 기법 | |
US20140019910A1 (en) | Touch and gesture input-based control method and terminal therefor | |
KR101328202B1 (ko) | 제스처 입력을 통한 기능 수행 명령실행 방법 및 장치 | |
EP2325740A2 (en) | User interface apparatus and method | |
KR20150119135A (ko) | 전자 장치에 표시된 콘텐츠를 관리하기 위한 시스템 및 방법 | |
US20140368875A1 (en) | Image-forming apparatus, control method for image-forming apparatus, and storage medium | |
JP2016126657A (ja) | 情報処理装置、情報処理装置の制御方法、及びプログラム | |
JP6399834B2 (ja) | 情報処理装置、情報処理装置の制御方法、及びプログラム | |
JP6601042B2 (ja) | 電子機器、電子機器の制御プログラム | |
JP6043423B2 (ja) | 端末装置及びオブジェクト選択方法 | |
US20120120021A1 (en) | Input control apparatus | |
JP6299245B2 (ja) | 表示装置、制御プログラム、スクロール表示方法、及び記録媒体 | |
JP5820414B2 (ja) | 情報処理装置及び情報処理方法 | |
US20140040827A1 (en) | Information terminal having touch screens, control method therefor, and storage medium | |
KR102301652B1 (ko) | 페이지 운용 방법 및 그 전자 장치 | |
JP2014203202A (ja) | 情報処理装置、情報処理装置の制御方法、およびプログラム | |
JP2015014888A (ja) | 操作装置、画像形成装置、操作装置の制御方法、及びプログラム | |
JP2015102946A (ja) | 情報処理装置、情報処理装置の制御方法、およびプログラム | |
JP6036123B2 (ja) | 情報表示装置及びプログラム | |
JP2017123055A (ja) | 画像処理装置、プレビュー画像の表示制御方法およびコンピュータプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14768594 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015506670 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14778940 Country of ref document: US Ref document number: 2014768594 Country of ref document: EP |