WO2014141799A1 - 電子機器、情報処理方法、及び情報処理プログラム - Google Patents
- Publication number
- WO2014141799A1 (PCT/JP2014/053226)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- operation type
- unit
- electronic device
- type determination
- operation input
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0418—Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
- G06F3/04186—Touch location disambiguation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04803—Split screen, i.e. subdividing the display area or the window area into separate subareas
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- the present invention relates to an electronic device, an information processing method, and an information processing program.
- Some electronic devices, such as multi-function mobile phones, include a display panel that displays an image together with a touch panel that overlays its surface and receives operation inputs from a user (hereinafter referred to as a "touch panel display").
- By adopting a touch panel display, miniaturization of the electronic device and improvement in operability are achieved.
- an intuitive operation can be realized by bringing an operation object such as a user's index finger, middle finger, and touch pen into contact with the display surface of the touch panel while the user holds the electronic device.
- When a user grips an electronic device, a finger may unintentionally touch a part of the touch panel.
- An insensitive area may be provided at the edge of the touch panel or the lower part of the display surface so that it is not detected as an operation input even if a finger touches it. That is, even if a finger touches the insensitive area, an operation input due to the touch is not accepted, and no output signal is generated from the touch panel.
- In the operation information acquisition device described in Patent Document 1, an operation type is acquired based on an output signal of the touch panel, a touch area on the touch panel is acquired, and an amount corresponding to the acquired touch area is determined as the operation amount.
- However, Patent Document 1 contains no description of the insensitive area of the touch panel, nor of effectively using touches on the edge of the touch panel or the lower part of the display surface.
- That is, the operation information acquisition device described in Patent Document 1 does not acquire the operation type based on an operation input made by touching the edge of the touch panel or the lower part of the display surface, and therefore could not improve operability by determining a corresponding operation amount.
- the present invention has been made in view of the above points, and provides an electronic device, an information processing method, and an information processing program that have improved operability in accordance with an operation type.
- One aspect of the present invention has been made to solve the above-described problems. That aspect is an electronic device comprising: an operation input unit that receives an operation input; an operation type determination unit that determines an operation type according to the distribution of the area that has received the operation input in a predetermined region of the operation input unit; and an operation control unit that controls processing related to the operation input according to the operation type determined by the operation type determination unit.
- operability can be improved according to the operation type.
- FIG. 1 is a schematic block diagram illustrating a configuration of an electronic device 1 according to the present embodiment.
- the electronic device 1 is, for example, a multi-function mobile phone (including a so-called “smart phone”), a tablet terminal device, a personal computer, or the like.
- the electronic device 1 may have any size as long as the user can carry it.
- the electronic device 1 has a size that can be gripped by one human hand. In that case, the electronic device 1 has any dimension in the range of, for example, a width of 55 to 85 mm, a height of 100 to 160 mm, and a thickness of 8 to 20 mm.
- the electronic device 1 includes a touch panel 101, a connection unit 102, a memory 103, and a control unit 104.
- the touch panel 101 includes a touch sensor (operation input unit) 111, an operation input processing unit 112, a display processing unit 113, and a display unit 114.
- the touch sensor 111 detects a position where an operation object (for example, a user's finger) contacts on the surface, generates position information indicating the detected position, and outputs the generated position information to the operation input processing unit 112. That is, this position information indicates a position where an operation input by the user is accepted.
- the touch sensor 111 includes, for example, a plurality of elements (for example, a capacitance sensor, a piezoelectric sensor, and the like) arranged in a grid pattern on one plane in order to detect the position where the operation article touches.
- the position information is information indicating the position of each element that has detected that the operation article has come into contact.
- the surface of the touch sensor 111 is divided into a sensitive part 111a and a peripheral part 111b described later (FIG. 2).
- the operation input processing unit 112 and the display processing unit 113 are configured with control components such as a CPU (Central Processing Unit), for example.
- the functions of the operation input processing unit 112 and the display processing unit 113 are realized by the CPU executing a control program.
- the operation input processing unit 112 realizes its functions by executing a program such as a touch sensor driver.
- the operation input processing unit 112 detects position information input from the touch sensor 111 at predetermined time intervals (for example, 20 ms), and performs a process of removing noise from the detected position information.
- the contact area where the operation article comes into contact with the touch sensor 111 is generally wide and there may be a plurality of areas.
- the operation input processing unit 112 distinguishes each contact area, and calculates the area for each contact area and its operating point (for example, the center point).
- the calculated operation point represents a position (touch position) at which the operation article touches and receives an operation input.
- the operation input processing unit 112 determines, as one contact area, a continuous region occupied by a group of mutually adjacent elements of the touch sensor 111 that have detected contact of the operation object.
- the operation input processing unit 112 generates contact information (touch information) indicating a contact region, an operation point and an area for each contact region, and outputs the generated contact information to the control unit 104 via the connection unit 102.
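The grouping of adjacent touched elements into contact areas, and the computation of each area's size and operating point, can be sketched as follows. This is a minimal illustration, assuming the sensor reports touched cells as grid coordinates; the function name and data layout are hypothetical, not from the patent:

```python
from collections import deque

def find_contact_regions(touched):
    """Group touched grid cells into contact regions by 4-neighbour
    flood fill, and return each region's cell count (area) and
    centroid, used here as the operating point."""
    touched = set(touched)
    regions = []
    while touched:
        seed = touched.pop()
        queue, cells = deque([seed]), [seed]
        while queue:
            x, y = queue.popleft()
            for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if nb in touched:
                    touched.remove(nb)
                    queue.append(nb)
                    cells.append(nb)
        cx = sum(x for x, _ in cells) / len(cells)
        cy = sum(y for _, y in cells) / len(cells)
        regions.append({"area": len(cells), "point": (cx, cy)})
    return regions
```

For example, three mutually adjacent cells and one isolated cell yield two contact regions, matching the text's observation that several distinct contact areas may exist at once.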
- the contact area indicated by the contact information is not limited to the sensitive part 111a but also includes a contact area belonging to the peripheral part 111b.
- the display processing unit 113 realizes its functions by executing a program such as a drawing driver.
- the display processing unit 113 outputs the image information input from the control unit 104 to the display unit 114.
- the surface of the display panel constituting the display unit 114 is in contact with the back surface of the touch sensor 111.
- the display unit 114 displays an image based on the image signal input from the display processing unit 113.
- the display unit 114 is, for example, a liquid crystal display panel, an organic EL (Electroluminescence) display panel, or the like. As shown in FIG. 1, the display unit 114 may be integrated with the touch sensor 111 or may be a separate body.
- the connection unit 102 electrically connects the touch panel 101 and the control unit 104, and transmits and receives signals between them.
- the memory 103 stores a program (for example, an OS (Operating System) and application software (hereinafter referred to as an application)) executed by the control unit 104.
- the memory 103 stores data used for processing executed by the control unit 104 and data generated by the processing.
- the memory 103 is, for example, a ROM (Read Only Memory) and a RAM (Random Access Memory).
- the control unit 104 controls the operation of the electronic device 1.
- the control unit 104 includes a control component such as a CPU, and realizes the various functions of the electronic device 1 by executing programs stored in the memory 103. In terms of these functions, the control unit 104 includes an operation type determination unit 141, an operation control unit 142, and a display control unit 143.
- the control unit 104 reads an image signal related to a screen component such as an icon from the memory 103, outputs the read image signal to the display processing unit 113, and displays the screen component at a predetermined position on the display unit 114.
- the screen component is an image indicating that a pointing device such as the touch sensor 111 can accept an operation input to the display area displayed on the display unit 114.
- the screen parts are also referred to as UI (User Interface) parts or GUI (Graphic User Interface) components, and include, for example, icons, buttons, links, and the like.
- the control unit 104 performs processing for exerting the functions of the electronic device 1 based on the relationship between the position of the operating point indicated by the contact information input from the operation input processing unit 112 and the position of the screen component displayed on the display unit 114. In the following description, these processes may be referred to as operation function processes. For example, when the operating point indicated by the contact information is included in the area where a screen component is displayed (a press on the screen component), the control unit 104 may execute the function corresponding to that screen component.
- the function corresponding to the screen component includes, for example, starting an application corresponding to the screen component, displaying an image based on an image signal, and the like.
- the control unit 104 may scroll the image when a flick operation is detected in the image displayed on the display unit 114.
- the flick operation is an operation for moving the operating point while pressing an area where an image is displayed on the touch panel 101. That is, the control unit 104 detects a flick operation when the operation point indicated by the contact information continuously moves. Scrolling an image means moving the position where the image is displayed in the direction in which the operating point moves.
- the operation type determination unit 141 determines an operation type (operation mode) based on the contact information input from the operation input processing unit 112.
- the operation type determination unit 141 determines, as the operation type, for example, whether the electronic device 1 is gripped with the left or right hand, and whether it is operated with one hand (one-hand operation) or with both hands (two-hand operation).
- the operation type determination unit 141 generates operation type information indicating the determined operation type, and outputs the generated operation type information to the operation control unit 142 and the display control unit 143. In this determination, a distribution of contact areas belonging to the peripheral portion 111b (FIG. 2) among the contact areas indicated by the contact information is used. An example of processing in which the operation type determination unit 141 determines the operation type will be described later.
- the operation control unit 142 controls the change amount (movement amount) of the position of the image displayed on the display unit 114 based on the operation type information input from the operation type determination unit 141. For example, the operation control unit 142 determines a scroll amount larger than the scroll amount in the case of indicating a two-handed operation as the amount of change when the input operation type information indicates a one-handed operation.
- the scroll amount is a ratio of the change amount of the display position of the image to the change amount of the operating point indicated by the contact information. When the change amount of the operating point is constant, the change amount of the position of the displayed image increases as the scroll amount increases.
- the memory 103 stores in advance a scroll amount for indicating a two-handed operation and a scroll amount for indicating a one-handed operation.
- the operation control unit 142 reads the scroll amount corresponding to the input operation type information from the memory 103.
- the operation control unit 142 uses the read scroll amount when controlling the position where the image is displayed according to the scroll operation. As a result, smooth operation can be realized even in the case of a one-handed operation that tends to limit the movement of the finger.
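The scroll-amount control described above can be sketched as follows. The numeric factors are hypothetical assumptions; the text only requires that the one-hand scroll amount exceed the two-hand scroll amount:

```python
# Hypothetical scroll factors (ratio of display movement to operating-
# point movement); the patent gives no concrete values.
SCROLL_AMOUNT = {"one_hand": 1.5, "two_hand": 1.0}

def scroll_displacement(operation_type, touch_delta):
    """Scale the movement of the operating point by the scroll amount
    selected for the determined operation type."""
    factor = SCROLL_AMOUNT[operation_type]
    dx, dy = touch_delta
    return (dx * factor, dy * factor)
```

With a constant finger movement, the one-hand factor produces a larger display movement, compensating for the limited reach of the thumb during one-hand operation.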
- the display control unit 143 controls the mode of displaying images on the display unit 114 based on the operation type information input from the operation type determination unit 141.
- For example, when the input operation type information indicates that the electronic device 1 is held with one hand (for example, the left hand), the display control unit 143 may display the screen components so that their distribution is biased toward that side (for example, the left).
- Specifically, the display control unit 143 displays the screen components to the left of a predetermined position from the left end of the display unit 114 (for example, the midpoint in the left-right direction). Thereby, since the screen components are concentrated at positions easily reached by the gripping hand (for example, the left hand), operability during one-hand operation is improved.
- the display control unit 143 may perform a process of displaying the distribution so as to be biased only when the operation type information indicates a one-hand operation (left-hand operation or right-hand operation).
- When the input operation type information indicates a two-hand operation, the display control unit 143 may display images, regardless of whether they are screen components, so that they are biased toward the side (for example, the right side) opposite to the hand (for example, the left hand) holding the electronic device 1.
- In that case, since the displayed image is not covered by the hand performing the operation, the visibility of the image can be maintained.
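The layout policy described above can be condensed into a small rule. The operation-type labels are hypothetical names chosen for illustration:

```python
def biased_side(operation_type):
    """Decide which side of the display to bias content toward:
    toward the gripping hand for one-hand operation (easy thumb
    reach), and away from the gripping hand for two-hand operation
    (so the operating hand does not cover the content)."""
    if operation_type == "left_hand":          # one-hand, left grip
        return "left"
    if operation_type == "right_hand":         # one-hand, right grip
        return "right"
    if operation_type == "two_hand_left_grip":
        return "right"                         # opposite the gripping hand
    if operation_type == "two_hand_right_grip":
        return "left"
    raise ValueError(f"unknown operation type: {operation_type}")
```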
- FIG. 2 is a plan view showing the electronic apparatus 1 according to the present embodiment.
- the electronic device 1 has a vertically long rectangular shape in which the length (height) of one side is larger than the length (width) of the other side.
- the electronic device 1 may have a horizontally long rectangular shape whose width is wider than the height.
- the X direction indicates the width direction of the electronic device 1
- the Y direction indicates the height direction of the electronic device 1.
- the X direction and Y direction shown in FIG. 2 are the same as in FIGS. 3-6 and 11. In the following description, the X direction and the Y direction may be referred to as right and down, respectively.
- the touch panel 101 covers most of the surface of the electronic device 1. A region where the operation article comes into contact with the surface of the touch sensor 111 included in the touch panel 101 is determined as a contact region in the operation input processing unit 112 (FIG. 1). In the touch sensor 111, a region inside the thick broken line indicates the sensitive portion 111a. The operating point calculated based on the contact area included in the sensitive unit 111a is used for operation function processing in the control unit 104 (FIG. 1), that is, processing according to an operation input by the user.
- a region outside the thick broken line indicates the peripheral edge 111b.
- the peripheral portion 111b is conventionally set as a non-sensitive region, but in the present embodiment, the peripheral portion 111b is not used as a non-sensitive region, and a contact region included in the region is also used.
- the peripheral portion 111b includes an edge portion 111b-1, a lower left end portion 111b-2, and a lower right end portion 111b-3.
- the edge portion 111b-1 is a region having a predetermined width (for example, 6 mm) inward from each of the right, upper, left, and lower sides of the outer periphery of the touch sensor 111.
- the lower left end 111b-2 is a fan-shaped area that extends inward by a predetermined radius (for example, 10 mm) around the vertex at the lower left of the edge portion 111b-1 and is sandwiched between the edge portion 111b-1 and the sensitive part 111a.
- the lower right end 111b-3 is likewise a fan-shaped area that extends inward by that radius (for example, 10 mm) around the vertex at the lower right of the edge portion 111b-1 and is sandwiched between the edge portion 111b-1 and the sensitive part 111a.
- the lower left end 111b-2 and the lower right end 111b-3 are collectively referred to as the lower end portions.
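The geometry above can be sketched as a classifier that assigns a touch coordinate to the sensitive part 111a, the edge portion 111b-1, or one of the fan-shaped lower corner regions. The dimensions are the example values from the text (6 mm edge band, 10 mm corner radius); the coordinate convention (millimetres, origin at the top left) is an assumption:

```python
import math

def classify_point(x, y, width, height, edge=6.0, radius=10.0):
    """Classify a touch coordinate into the regions of the touch
    sensor: edge band 111b-1, fan-shaped corners 111b-2 / 111b-3,
    or the central sensitive part 111a."""
    # Inner rectangle left after removing the edge band
    inner = edge <= x <= width - edge and edge <= y <= height - edge
    if not inner:
        return "edge_111b-1"
    # Fan centres: the lower vertices of the edge band's inner rectangle
    left_corner = (edge, height - edge)
    right_corner = (width - edge, height - edge)
    if math.hypot(x - left_corner[0], y - left_corner[1]) <= radius:
        return "lower_left_111b-2"
    if math.hypot(x - right_corner[0], y - right_corner[1]) <= radius:
        return "lower_right_111b-3"
    return "sensitive_111a"
```

For the 70 mm × 130 mm example device, a touch 3 mm from the left side falls in the edge band, while a touch near a lower vertex falls in the corresponding fan-shaped corner.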
- FIG. 3 is a conceptual diagram illustrating an example of the arrangement of the operation article in contact with the touch panel 101.
- FIG. 3 shows a state where the user performs an operation with the tip of the thumb F1a of the left hand while holding the electronic device 1 with the left hand.
- the abdomen of the thumb F1a of the left hand, the tip of the index finger F1c, and the tip of the middle finger F1d are in contact with the lower left end of the electronic device 1, the middle of the right side, and slightly below the middle of the right side, respectively.
- FIG. 4 is a conceptual diagram illustrating an example of the distribution of the contact area of the operation article that has touched the touch panel 101.
- the example shown in FIG. 4 shows the contact area acquired when handled as shown in the example of FIG.
- The four areas t1a, t1b, t1c, and t1d, each surrounded by a one-dot chain line at two places on the lower left side of the touch panel 101 and two places on the right side, are contact areas.
- a cross mark included in each contact area indicates an operating point.
- the contact regions t1a, t1b, t1c, and t1d are regions where the abdomen of the thumb F1a, the tip of the thumb F1a, the tip of the index finger F1c, and the tip of the middle finger F1d are in contact with the touch panel 101, respectively. Since the operating point of the contact area t1b is included in the sensitive unit 111a, it is used in the operation function process performed by the control unit 104. Since the operating points of the contact areas t1a, t1c, and t1d are not included in the sensitive unit 111a, they may not be used in the operation function process performed by the control unit 104.
- the contact area t1a overlaps the lower left end 111b-2, but no contact area overlaps the lower right end 111b-3. This is because when the user holds the electronic device 1 with the left hand, the abdomen of the thumb F1a of the left hand mainly contacts the lower left end 111b-2 and the finger of the left hand does not contact the lower right end 111b-3. . When the user holds the electronic device 1 with the right hand, the abdomen of the thumb of the right hand mainly contacts the lower right end 111b-3, and the finger of the right hand does not contact the lower left end 111b-2.
- When the area of the contact region included in the lower left end 111b-2 is larger than the area of the contact region included in the lower right end 111b-3, the operation type determination unit 141 determines that the electronic device 1 is held with the left hand. Conversely, when the detected area of the contact region included in the lower right end 111b-3 is larger than the area of the contact region included in the lower left end 111b-2, it determines that the electronic device 1 is held with the right hand.
- Note that the operation type determination unit 141 may determine whether the hand holding the electronic device 1 is the left hand or the right hand only when the area of the contact region included in the lower left end 111b-2 or the lower right end 111b-3 being compared is larger than a predetermined threshold, for example, 0.2 times the area of the lower left end 111b-2. As a result, it can avoid erroneously determining which hand is holding the device when an object other than the operation object, such as clothing, slightly touches the lower left end 111b-2 or the lower right end 111b-3.
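The grip-hand determination with the clothing-rejection threshold can be sketched as follows (function and label names are hypothetical; the 0.2 ratio is the example value from the text):

```python
def determine_grip_hand(area_lower_left, area_lower_right,
                        corner_area, ratio=0.2):
    """Decide which hand grips the device from the contact areas in
    the lower corner regions, ignoring touches smaller than a
    fraction of the corner region's own area (likely clothing)."""
    threshold = ratio * corner_area
    if max(area_lower_left, area_lower_right) <= threshold:
        return None                      # too small to be a gripping hand
    if area_lower_left > area_lower_right:
        return "left_hand"
    if area_lower_right > area_lower_left:
        return "right_hand"
    return None                          # equal areas: undetermined
```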
- the double-headed arrows shown in the vicinity of the respective operating points of the contact areas t1a and t1b indicate that the contact areas t1a and t1b move up and down and left and right around the respective operating points. These movements are caused by the interlocking of the abdomen of the thumb F1a when the tip of the thumb F1a of one hand (for example, the left hand) moves during operation.
- When there is a significant correlation between the operating point related to the contact region t1a occupying part or all of the lower left end 111b-2 and the operating point related to the contact region t1b included in the sensitive part 111a, the operation type determination unit 141 determines that the operation is being performed with the left hand holding the electronic device 1.
- Specifically, the operation type determination unit 141 determines that there is a significant correlation when the cross-correlation between the coordinates of the operating point related to the contact region t1a and the coordinates of the operating point related to the contact region t1b included in the sensitive part 111a is larger than a predetermined threshold (for example, 0.1).
- Similarly, when there is a significant correlation between the operating point related to a contact area occupying part or all of the lower right end 111b-3 and the operating point related to a contact area included in the sensitive part 111a, the operation type determination unit 141 determines that the operation is being performed with the right hand holding the electronic device 1.
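One way to realize this correlation test is to compute a normalized cross-correlation (at lag zero) between the per-axis displacements of the two operating-point trajectories. This is a sketch under that assumption; the patent does not fix the exact correlation measure, and the 0.1 threshold is the example value from the text:

```python
from statistics import mean, pstdev

def trajectory_correlation(track_a, track_b):
    """Normalized cross-correlation between two equally long operating-
    point trajectories, computed per axis on frame-to-frame
    displacements and averaged over the two axes."""
    def corr(xs, ys):
        sx, sy = pstdev(xs), pstdev(ys)
        if sx == 0 or sy == 0:
            return 0.0                    # a stationary track cannot correlate
        mx, my = mean(xs), mean(ys)
        return mean((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)
    ax = [b[0] - a[0] for a, b in zip(track_a, track_a[1:])]
    ay = [b[1] - a[1] for a, b in zip(track_a, track_a[1:])]
    bx = [b[0] - a[0] for a, b in zip(track_b, track_b[1:])]
    by = [b[1] - a[1] for a, b in zip(track_b, track_b[1:])]
    return (corr(ax, bx) + corr(ay, by)) / 2

def is_one_handed(track_tip, track_corner, threshold=0.1):
    """One-hand operation if the sensitive-part trajectory correlates
    significantly with the lower-corner trajectory."""
    return trajectory_correlation(track_tip, track_corner) > threshold
```

A thumb pad that moves in lockstep with the thumb tip produces a correlation near 1 (one-hand operation); a stationary gripping hand produces no correlation (two-hand operation).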
- the determined right-handed operation and left-handed operation are each one type of one-handed operation.
- When there is no significant correlation, the operation type determination unit 141 determines that the electronic device 1 is operated with both hands. That is, it determines that the hand (for example, the right hand) opposite to the hand (for example, the left hand) holding the electronic device 1 is the operating hand.
- FIG. 5 is a conceptual diagram illustrating another example of the arrangement of the operation article in contact with the touch panel 101.
- FIG. 5 shows a state in which the user operates the tip of the index finger F2 of the right hand while holding the electronic device 1 with the left hand.
- the abdomen of the thumb F1a of the left hand, the tip of the middle finger F1d, and the tip of the ring finger F1e are in contact with the lower left end of the electronic device 1, slightly above the middle of the right side, and the middle of the right side, respectively.
- FIG. 6 is a conceptual diagram illustrating another example of the distribution of the contact area of the operation article that has touched the touch panel 101.
- the contact areas t1e and t2 indicate contact areas where the tip of the left ring finger F1e and the tip of the right index finger F2 are in contact with the touch panel 101, respectively.
- a double-headed arrow shown near the operating point of the contact area t2 indicates that the contact area t2 moves up and down.
- Unlike FIG. 4, no double-headed arrow is shown in the vicinity of the operating point related to the contact region t1a, indicating that the contact region t1a is almost stationary. That is, the contact area of the hand (for example, the left hand) holding the electronic device 1 remains almost stationary regardless of the movement of the contact area of the hand (for example, the right hand) performing the operation.
- The operation type determination unit 141 determines that there is no significant correlation between the operating point related to the contact region t1a occupying part or all of the lower left end 111b-2 and the operating point related to the contact region t2 included in the sensitive part 111a. Therefore, the operation type determination unit 141 determines that the hand not holding the electronic device 1 (for example, the right hand) is operating, that is, that the electronic device 1 is operated with both hands.
- FIG. 7 is a flowchart illustrating an example of information processing according to the present embodiment.
- Step S101 The operation input processing unit 112 detects the position information input from the touch sensor 111 at predetermined time intervals, distinguishes the contact areas indicated by the detected position information, and calculates the operating point for each contact area. Thereby, contact information concerning the contact areas where the operation object contacts the touch sensor 111 is acquired. Thereafter, the process proceeds to step S102.
- Step S102 The operation type determination unit 141 compares the area of the contact area included in the lower left end 111b-2 with the area of the contact area included in the lower right end 111b-3, and holds the electronic device 1 It is determined whether the current hand is a left hand or a right hand. Note that the operation type determination unit 141 may determine whether the operation is performed with one hand or the both hands from the viewpoint of operation rather than the viewpoint of gripping (see step S205 in FIG. 8). . Thereafter, the process proceeds to step S103.
- Step S103 The display control unit 143 controls the arrangement of the screen components so that they are biased toward the hand (for example, the left hand) holding the electronic device 1.
- The display control unit 143 may also display images other than the screen components so that they are biased toward the side (for example, the right side) opposite to the hand (for example, the left hand) holding the electronic device 1. Thereafter, the processing according to this flowchart is terminated.
- FIG. 8 is a flowchart showing another example of information processing according to the present embodiment.
- the information processing according to FIG. 8 includes step S101 (FIG. 7).
- the processing in step S101 is the same as the operation in step S101 in FIG.
- the process proceeds to Step S202.
- Step S202 The operation type determination unit 141 determines whether a contact area having an operating point in the sensitive part 111a and a contact area occupying part or all of a lower end portion (the lower left end 111b-2 or the lower right end 111b-3) are each distributed on the touch panel 101. If it is determined that both are distributed (YES in step S202), the process proceeds to step S203. If not (NO in step S202), the processing according to this flowchart is terminated.
- Step S203: The operation type determination unit 141 acquires, over a predetermined time (for example, 3 seconds), the trajectory of the operating point included in the sensitive part 111a and the trajectory of the operating point of the contact region occupying part or all of the peripheral portion 111b. Thereafter, the process proceeds to step S204.
- Step S204: The operation type determination unit 141 calculates the cross-correlation between the trajectory of the operating point included in the sensitive part 111a and the trajectory of the operating point of the contact region t1a occupying part or all of the peripheral portion 111b. The operation type determination unit 141 determines whether there is a significant correlation between the two according to whether the calculated cross-correlation exceeds a predetermined threshold. Thereafter, the process proceeds to step S205.
- Step S205: When the operation type determination unit 141 determines that there is a significant correlation between the two, it determines that the electronic device 1 is operated with one hand; when it determines that there is no significant correlation, it determines that the electronic device 1 is operated with both hands. Thereafter, the process proceeds to step S206.
- Step S206: The operation control unit 142 sets a scroll amount (change amount) according to whether the electronic device 1 is operated with one hand or with both hands. The scroll amount set for one-handed operation is larger than that set for two-handed operation. Thereafter, the processing according to this flowchart ends.
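Steps S204 to S206 can be sketched as follows, assuming a normalized cross-correlation over sampled operating-point trajectories. The correlation form, the 0.5 threshold, and the doubling of the scroll amount are illustrative assumptions; only the relation "significant correlation implies one-handed" and "larger scroll amount when one-handed" come from the description.

```python
import numpy as np

def one_handed(traj_center, traj_edge, threshold=0.5):
    """Correlate the operating-point trajectory in the sensitive part with the
    trajectory of the grip contact in the peripheral part (steps S204-S205).
    In one-handed thumb operation both move together; in two-handed operation
    the grip contact stays almost stationary."""
    a = np.asarray(traj_center, dtype=float)
    b = np.asarray(traj_edge, dtype=float)
    a = a - a.mean(axis=0)
    b = b - b.mean(axis=0)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:
        return False  # stationary grip contact: treat as two-handed
    corr = float(np.sum(a * b)) / denom
    return bool(corr > threshold)  # significant correlation -> one-handed

def scroll_amount(is_one_handed, base=100):
    # Step S206: a larger scroll amount for one-handed operation
    # (the factor of 2 is an arbitrary illustrative choice).
    return base * 2 if is_one_handed else base
```

For example, a grip contact that drifts in proportion to the thumb's operating point yields a correlation near 1 and is classified as one-handed.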
- the electronic device 1 may execute the process shown in FIG. 7 and the process shown in FIG. 8 separately or in parallel.
- As described above, in the present embodiment, the operation type is determined according to the distribution of contact regions received in a predetermined partial area (for example, the peripheral portion 111b) of the operation input unit (for example, the touch sensor 111) that receives operation input, and the processing related to the operation input is controlled according to the determined operation type.
- In this way, the operation type is determined by utilizing operation input received in a partial area that has conventionally gone unused, and operability is improved by processing according to the determined operation type.
- FIG. 9 is a schematic block diagram illustrating a configuration of the electronic device 2 according to the present embodiment.
- In the electronic device 2 according to the present embodiment, the control unit 204 includes the operation input processing unit 112 and the display processing unit 113 in addition to the operation type determination unit 141, the operation control unit 142, and the display control unit 143.
- The touch panel 201 includes the touch sensor 111 and the display unit 114; the operation input processing unit 112 and the display processing unit 113 of the first embodiment (FIG. 1) are omitted from the touch panel 201. The connection unit 102 (FIG. 1) of the first embodiment is also omitted, so position information is output directly from the touch sensor 111 to the operation input processing unit 112, and the image signal is output directly from the display processing unit 113 to the display unit 114.
- Because the operation input processing unit 112 and the display processing unit 113 are integrated into the control unit 204, the operation type determination unit 141, the operation control unit 142, the display control unit 143, the operation input processing unit 112, and the display processing unit 113 can operate under a common program (for example, the OS), which increases processing speed.
- In addition, components such as a CPU and the connection unit 102 within the operation input unit (for example, the touch panel 201) can be omitted, reducing the number of components.
- FIG. 10 is a schematic block diagram showing the configuration of the electronic device 3 according to this embodiment.
- the electronic device 3 according to the present embodiment includes a touch panel 301 and a control unit 304 instead of the touch panel 101 and the control unit 104 in the electronic device 1 of FIG. 1 of the first embodiment, and further includes an acceleration sensor 305.
- the touch panel 301 includes a touch sensor 311 instead of the touch sensor 111 in the touch panel 101 (FIG. 1).
- the control unit 304 includes an operation type determination unit 341 instead of the operation type determination unit 141 in the control unit 104 (FIG. 1).
- FIG. 11 is a plan view showing the electronic device 3 according to the present embodiment.
- the touch sensor 311 has a sensitive part 111a and a peripheral part 111b, like the touch sensor 111 (FIG. 2).
- The peripheral portion 111b has the edge portion 111b-1 and four top-end portions: in addition to the lower-left end 111b-2 and the lower-right end 111b-3, it has the upper-right end 311b-4 and the upper-left end 311b-5.
- The upper-right end 311b-4 is a fan-shaped region that cuts inward by a predetermined radius (for example, 10 mm) around the upper-right vertex of the edge portion 111b-1 and is sandwiched between the edge portion 111b-1 and the sensitive part 111a.
- The upper-left end 311b-5 is a fan-shaped region that cuts inward by the same radius (for example, 10 mm) around the upper-left vertex of the edge portion 111b-1 and is sandwiched between the edge portion 111b-1 and the sensitive part 111a.
- the acceleration sensor 305 detects the gravitational acceleration and outputs an acceleration signal indicating the detected gravitational acceleration to the operation type determination unit 341.
- The acceleration sensor 305 is a triaxial acceleration sensor having sensitivity axes in three mutually orthogonal directions. In the electronic device 3, the acceleration sensor 305 is arranged so that two of the three sensitivity axes are oriented in the X and Y directions, respectively. Thereby, at least the components of gravitational acceleration in the X and Y directions, that is, the inclination of the electronic device 3 within the X-Y plane, are detected.
- The operation type determination unit 341 performs the same processing as the operation type determination unit 141 (FIG. 1). However, based on the acceleration signal input from the acceleration sensor 305, the operation type determination unit 341 determines which two of the four top-end portions are used, together with the sensitive part 111a, for operation function processing, and which remaining two are used, as part of the peripheral portion 111b, for determining the operation type. Here, the operation type determination unit 341 determines that the two top-end portions at both ends of the side judged to be the base of the touch sensor 311 are used for determining the operation type, and the remaining two for operation function processing.
- When the absolute value of the Y component of the acceleration signal is larger than the absolute value of the X component and the Y component is a positive value, the operation type determination unit 341 determines that the lower side of the touch sensor 311 is the base, and that the lower-left end 111b-2 and the lower-right end 111b-3 are used for determining the operation type.
- When the X component of the acceleration signal is a positive value larger than the absolute value of the Y component, the operation type determination unit 341 determines that the right side of the touch sensor 311 is the base, and that the lower-right end 111b-3 and the upper-right end 311b-4 are used for determining the operation type.
- When the absolute value of the Y component of the acceleration signal is larger than the absolute value of the X component and the Y component is a negative value, the operation type determination unit 341 determines that the top side of the touch sensor 311 is the base, and that the upper-right end 311b-4 and the upper-left end 311b-5 are used for determining the operation type.
- When the absolute value of the X component of the acceleration signal is larger than the absolute value of the Y component and the X component is a negative value, the operation type determination unit 341 determines that the left side of the touch sensor 311 is the base, and that the upper-left end 311b-5 and the lower-left end 111b-2 are used for determining the operation type.
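The four orientation cases above can be summarized in a small dispatch. This is a sketch under the axis convention of FIG. 2 (+X points right, +Y points down); the function name and lookup table are hypothetical.

```python
def base_side(ax, ay):
    """Map the X/Y components of the gravitational-acceleration signal to the
    side of the touch sensor 311 judged to be the base, following the four
    cases in the description (+X right, +Y down)."""
    if abs(ay) >= abs(ax):
        return "bottom" if ay > 0 else "top"
    return "right" if ax > 0 else "left"

# Top-end pair used for operation-type determination for each base side.
PAIR_FOR_BASE = {
    "bottom": ("111b-2", "111b-3"),  # lower-left, lower-right
    "right":  ("111b-3", "311b-4"),  # lower-right, upper-right
    "top":    ("311b-4", "311b-5"),  # upper-right, upper-left
    "left":   ("311b-5", "111b-2"),  # upper-left, lower-left
}
```

For example, when the device is held upright (gravity along +Y), the bottom side is the base and the two lower corners are used for operation-type determination.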
- The electronic device 3 may alternatively be configured to include the touch sensor 311 and the operation type determination unit 341 instead of the touch sensor 111 and the operation type determination unit 141 in the electronic device 2 (FIG. 9), and to further include the acceleration sensor 305.
- As described above, in the present embodiment, whether a predetermined region around each vertex of the operation input unit (for example, the touch sensor 311) is used as a region for determining the operation type according to the distribution of received contact regions is decided according to the detected gravitational acceleration. Accordingly, the operation type can be determined accurately regardless of the orientation (for example, portrait or landscape) in which the user holds the electronic device (for example, the electronic device 3) according to the present embodiment.
- An electronic device comprising: an operation input unit that receives operation input; an operation type determination unit that determines an operation type according to the distribution of regions in which operation input received in a predetermined partial area of the operation input unit was received; and an operation control unit that controls processing related to the operation input according to the operation type determined by the operation type determination unit.
- An electronic device comprising: a display control unit that controls a mode of displaying an image on a display unit according to the operation type determined by the operation type determination unit.
- An information processing method in an electronic device, comprising: an operation type determination process of determining an operation type according to the distribution of regions in which operation input received in a predetermined partial area of an operation input unit that receives operation input was received; and an operation control process of controlling processing related to the operation input according to the operation type determined in the operation type determination process.
- An information processing program for causing a computer of an electronic device to execute: an operation type determination procedure of determining an operation type according to the distribution of regions in which operation input received in a predetermined partial area of an operation input unit that receives operation input was received; and an operation control procedure of controlling processing related to the operation input according to the operation type determined by the operation type determination procedure.
- the operation type is determined by using the operation input received in a part of the area, and the operability is improved by the processing according to the determined operation type.
- operability is improved by controlling the amount of change in the position at which an image is displayed according to the determined operation type.
- operability and visibility are improved by displaying an image in a manner corresponding to the determined operation type.
- the control units 104, 204, and 304 in the above-described embodiment may be realized by a computer.
- the program for realizing the control function may be recorded on a computer-readable recording medium, and the program recorded on the recording medium may be read by the computer system and executed.
- The "computer system" here is a computer system built into the electronic devices 1, 2, and 3, including an OS and hardware such as peripheral devices.
- The "computer-readable recording medium" refers to a portable medium such as a flexible disk, a magneto-optical disk, a ROM, or a CD-ROM, or a storage device such as a hard disk incorporated in a computer system.
- The "computer-readable recording medium" may also include a medium that dynamically holds the program for a short time, such as a communication line used when the program is transmitted via a network such as the Internet or a communication line such as a telephone line, and a medium that holds the program for a certain period of time, such as a volatile memory inside a computer system serving as a server or a client.
- The program may be one for realizing a part of the functions described above, or one that realizes the functions described above in combination with a program already recorded in the computer system.
- some or all of the electronic devices 1, 2, and 3 in the above-described embodiments may be realized as an integrated circuit such as an LSI (Large Scale Integration).
- Each functional block of the electronic devices 1, 2, and 3 may be individually implemented as a processor, or some or all of them may be integrated into a single processor.
- The method of circuit integration is not limited to LSI and may be realized by a dedicated circuit or a general-purpose processor. Further, if an integrated-circuit technology replacing LSI emerges through advances in semiconductor technology, an integrated circuit based on that technology may be used.
- One embodiment of the present invention can be applied to an electronic device, an information processing method, an information processing program, and the like that require improvement in operability according to an operation type.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
Description
This application claims priority based on Japanese Patent Application No. 2013-050785 filed in Japan on March 13, 2013, the contents of which are incorporated herein by reference.
This has made electronic devices smaller and more operable. In such electronic devices, the user can operate intuitively by touching the display surface of the touch panel with an operating object such as the index finger, the middle finger, or a touch pen while holding the device.
In the operation information acquisition device described in Patent Document 1, the operation type is acquired based on the output signal of the touch panel, the touch area on the touch panel is acquired, and an amount corresponding to the acquired touch area is determined as the operation amount of the acquired operation type. However, there is no description of an insensitive region of the touch panel, nor of making effective use of touches on the edge of the touch panel or the lower part of the display surface.
The present invention has been made in view of the above points, and provides an electronic device, an information processing method, and an information processing program whose operability is improved according to the operation type.
Hereinafter, a first embodiment of the present invention will be described with reference to the drawings.
FIG. 1 is a schematic block diagram showing the configuration of the electronic device 1 according to the present embodiment.
The electronic device 1 is, for example, a multifunctional mobile phone (including a so-called "smartphone"), a tablet terminal device, a personal computer, or the like. The electronic device 1 may be of any size as long as the user can carry it; it is roughly of a size that can be held in one human hand. In that case, the electronic device 1 has, for example, a width in the range of 55 to 85 mm, a height in the range of 100 to 160 mm, and a thickness in the range of 8 to 20 mm.
The electronic device 1 includes a touch panel 101, a connection unit 102, a memory 103, and a control unit 104.
The touch sensor 111 detects the position on its surface touched by an operating object (for example, a user's finger), generates position information indicating the detected position, and outputs the generated position information to the operation input processing unit 112. That is, this position information indicates the position at which an operation input by the user was received. To detect the position touched by the operating object, the touch sensor 111 includes, for example, a plurality of elements (for example, capacitance sensors or piezoelectric sensors) arranged in a grid on one plane, and detects whether the operating object has touched each element. In other words, the position information indicates the position of each element that detected contact with the operating object. The surface of the touch sensor 111 is divided into a sensitive part 111a and a peripheral portion 111b, described later (FIG. 2).
The operation input processing unit 112 has a function realized, for example, by executing a program such as a touch sensor driver. The operation input processing unit 112 detects the position information input from the touch sensor 111 at predetermined time intervals (for example, 20 ms) and removes noise from the detected position information. A contact region where an operating object touches the touch sensor 111 generally has an extent, and there may be more than one: for example, when the user touches with multiple fingers, when one finger touches with both the fingertip and the finger pad, or when a finger and the palm both touch. The operation input processing unit 112 therefore distinguishes the contact regions and calculates the area of each contact region and its operating point (for example, its center point). The calculated operating point represents the position (touch position) at which the operating object touched and the operation input was received.
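The grouping of touched elements into contact regions and the computation of each region's area and operating point (centroid), as performed by the operation input processing unit 112, can be sketched as follows. The flood-fill approach, the grid data layout, and the function name are illustrative assumptions, not taken from the specification.

```python
def contact_regions(grid):
    """Group touched sensor elements (1s in a 2D grid) into connected contact
    regions and return each region's area (element count) and operating point
    (row/column centroid). A 4-neighbour flood fill is an illustrative choice
    for separating simultaneous contacts (multiple fingers, fingertip + pad)."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    regions = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and not seen[r][c]:
                stack, cells = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    cells.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and grid[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                area = len(cells)
                cy = sum(y for y, _ in cells) / area
                cx = sum(x for _, x in cells) / area
                regions.append({"area": area, "operating_point": (cy, cx)})
    return regions
```

A 2x2 block of touched elements, for instance, yields one region of area 4 whose operating point is its geometric center.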
The display processing unit 113 has a function realized, for example, by executing a program such as a drawing driver. The display processing unit 113 outputs image information input from the control unit 104 to the display unit 114.
The memory 103 stores programs executed by the control unit 104 (for example, an OS (Operating System) and application software (hereinafter, applications)). The memory 103 also stores data used in the processing executed by the control unit 104 and data generated by that processing. The memory 103 is, for example, a ROM (Read Only Memory) and a RAM (Random Access Memory).
The control unit 104 may, for example, execute the function corresponding to a screen component when the operating point indicated by the contact information input from the operation input processing unit 112 is included in the region where that screen component is displayed (a press of the screen component). The function corresponding to the screen component is, for example, launching the application corresponding to that screen component, displaying an image based on an image signal, or the like.
When the image to be displayed on the display unit 114 is not a screen component but an ordinary image that does not accept operation input, and the input operation type information indicates one-handed operation, the image may be displayed so that screen components are biased toward the side opposite to that hand. In that case, the displayed image is not covered by the operating hand, so visibility of the image is maintained.
FIG. 2 is a plan view showing the electronic device 1 according to the present embodiment.
In the example shown in FIG. 2, the electronic device 1 has a vertically long rectangular shape in which the length of one side (the height) is greater than that of the other side (the width). The present embodiment is not limited to this; the electronic device 1 may have a horizontally long rectangular shape whose width is greater than its height. In FIG. 2, the X direction indicates the width direction of the electronic device 1 and the Y direction indicates its height direction. The X and Y directions shown in FIG. 2 also apply to FIGS. 3 to 6 and 11. In the following description, the X and Y directions may be referred to as right and down, respectively.
The edge portion 111b-1 is a region having a predetermined width (for example, 6 mm) inward from each of the right, top, left, and bottom sides of the outer periphery of the touch sensor 111. The lower-left end 111b-2 is a fan-shaped region that cuts inward by a predetermined radius (for example, 10 mm) around the lower-left vertex of the edge portion 111b-1 and is sandwiched between the edge portion 111b-1 and the sensitive part 111a. The lower-right end 111b-3 is a fan-shaped region that cuts inward by the same radius (for example, 10 mm) around the lower-right vertex of the edge portion 111b-1 and is sandwiched between the edge portion 111b-1 and the sensitive part 111a.
In the following description, the lower-left end 111b-2 and the lower-right end 111b-3 are collectively referred to as top-end portions.
FIG. 3 shows a state in which the user operates the device with the tip of the left thumb F1a while holding the electronic device 1 in the left hand. In this example, the pad of the left thumb F1a, the tip of the index finger F1c, and the tip of the middle finger F1d touch the lower-left end of the electronic device 1, the middle of the right side, and slightly below the middle of the right side, respectively.
The example shown in FIG. 4 shows the contact regions acquired when the device is handled as in the example of FIG. 3. The regions t1a, t1b, t1c, and t1d, each enclosed by a dash-dot line (two at the lower left of the touch panel 101 and two on its right side), are contact regions, and the × mark in each contact region indicates its operating point. The contact regions t1a, t1b, t1c, and t1d are the regions where the pad of the thumb F1a, the tip of the thumb F1a, the tip of the index finger F1c, and the tip of the middle finger F1d touch the touch panel 101, respectively. Since the operating point of the contact region t1b is included in the sensitive part 111a, it is used in the operation function processing performed by the control unit 104. Since the operating points of the contact regions t1a, t1c, and t1d are not included in the sensitive part 111a, they need not be used in the operation function processing performed by the control unit 104.
Note that the operation type determination unit 141 may determine whether the hand holding the electronic device 1 is the left hand or the right hand only when the area of the contact region included in the lower-left end 111b-2 or the lower-right end 111b-3 being compared is larger than a predetermined threshold of contact area, for example, 0.2 times the area of the lower-left end 111b-2. This avoids misjudging which hand is holding the device when part of an object other than an operating object, such as clothing, slightly touches the lower-left end 111b-2 or the lower-right end 111b-3.
The operation with the right hand or with the left hand determined in this way is in each case a kind of one-handed operation.
That is, it is determined that the hand (for example, the right hand) opposite to the hand holding the electronic device 1 (for example, the left hand) is the hand performing the operation.
FIG. 5 shows a state in which the user operates the device with the tip of the right index finger F2 while holding the electronic device 1 in the left hand. In this example, the pad of the left thumb F1a, the tip of the middle finger F1d, and the tip of the ring finger F1e touch the lower-left end of the electronic device 1, slightly above the middle of the right side, and the middle of the right side, respectively.
The contact regions t1e and t2 are the regions where the tip of the left ring finger F1e and the tip of the right index finger F2 touch the touch panel 101, respectively. The double-headed arrow shown near the operating point of the contact region t2 indicates that the contact region t2 moves up and down. In contrast, the absence near the operating point of the contact region t1a of the double-headed arrow shown in FIG. 3 indicates that the contact region t1a is almost stationary. That is, regardless of the movement of the contact region of the operating hand (for example, the right hand), the contact region of the hand holding the electronic device 1 (for example, the left hand) remains almost stationary.
FIG. 7 is a flowchart showing an example of information processing according to the present embodiment.
(Step S101) The operation input processing unit 112 detects the position information input from the touch sensor 111 at predetermined time intervals, distinguishes the contact regions indicated by the detected position information, and calculates the operating point of each contact region. It thereby acquires contact information about the contact regions where an operating object touches the touch sensor 111. Thereafter, the process proceeds to step S102.
Next, a second embodiment of the present invention will be described. The same components as in the above embodiment are given the same reference numerals, and their description is incorporated by reference.
Next, a third embodiment of the present invention will be described. The same components as in the above embodiments are given the same reference numerals, and their description is incorporated by reference.
Claims (5)
- An electronic device comprising: an operation input unit that receives operation input; an operation type determination unit that determines an operation type according to the distribution of regions in which operation input received in a predetermined partial area of the operation input unit was received; and an operation control unit that controls processing related to the operation input according to the operation type determined by the operation type determination unit.
- The electronic device according to claim 1, wherein the operation control unit controls the amount of change in the position at which an image is displayed on a display unit according to the operation type determined by the operation type determination unit.
- The electronic device according to claim 1 or 2, further comprising a display control unit that controls the manner in which an image is displayed on a display unit according to the operation type determined by the operation type determination unit.
- An information processing method in an electronic device, comprising: an operation type determination step of determining an operation type according to the distribution of regions in which operation input received in a predetermined partial area of an operation input unit that receives operation input was received; and an operation control step of controlling processing related to the operation input according to the operation type determined in the operation type determination step.
- An information processing program for causing a computer of an electronic device to execute: an operation type determination procedure of determining an operation type according to the distribution of regions in which operation input received in a predetermined partial area of an operation input unit that receives operation input was received; and an operation control procedure of controlling processing related to the operation input according to the operation type determined by the operation type determination procedure.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020157018120A KR20150093780A (ko) | 2013-03-13 | 2014-02-12 | 전자 기기 및 정보 처리 방법 |
CN201480004364.5A CN104903838A (zh) | 2013-03-13 | 2014-02-12 | 电子设备、信息处理方法以及信息处理程序 |
US14/655,391 US20150363036A1 (en) | 2013-03-13 | 2014-02-12 | Electronic device, information processing method, and information processing program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013050785A JP5995171B2 (ja) | 2013-03-13 | 2013-03-13 | 電子機器、情報処理方法、及び情報処理プログラム |
JP2013-050785 | 2013-03-13 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014141799A1 true WO2014141799A1 (ja) | 2014-09-18 |
Family
ID=51536475
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/053226 WO2014141799A1 (ja) | 2013-03-13 | 2014-02-12 | 電子機器、情報処理方法、及び情報処理プログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US20150363036A1 (ja) |
JP (1) | JP5995171B2 (ja) |
KR (1) | KR20150093780A (ja) |
CN (1) | CN104903838A (ja) |
WO (1) | WO2014141799A1 (ja) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9501166B2 (en) * | 2015-03-30 | 2016-11-22 | Sony Corporation | Display method and program of a terminal device |
JP6492910B2 (ja) * | 2015-04-13 | 2019-04-03 | ブラザー工業株式会社 | 携帯端末 |
JP7353989B2 (ja) * | 2020-01-09 | 2023-10-02 | ヤフー株式会社 | 情報処理装置、情報処理方法および情報処理プログラム |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3145773U (ja) * | 2008-07-17 | 2008-10-23 | 有限会社インターネットアンドアーツ | タッチパッド入力装置 |
JP2010213169A (ja) * | 2009-03-12 | 2010-09-24 | Fujifilm Corp | 表示装置、表示処理方法及び撮像装置 |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007298694A (ja) * | 2006-04-28 | 2007-11-15 | Sharp Corp | 画像形成装置、設定入力表示方法、プログラムおよび記録媒体 |
KR20100039194A (ko) * | 2008-10-06 | 2010-04-15 | 삼성전자주식회사 | 사용자 접촉 패턴에 따른 GUI(Graphic User Interface) 표시 방법 및 이를 구비하는 장치 |
US8799827B2 (en) * | 2010-02-19 | 2014-08-05 | Microsoft Corporation | Page manipulations using on and off-screen gestures |
-
2013
- 2013-03-13 JP JP2013050785A patent/JP5995171B2/ja active Active
-
2014
- 2014-02-12 WO PCT/JP2014/053226 patent/WO2014141799A1/ja active Application Filing
- 2014-02-12 US US14/655,391 patent/US20150363036A1/en not_active Abandoned
- 2014-02-12 KR KR1020157018120A patent/KR20150093780A/ko not_active Application Discontinuation
- 2014-02-12 CN CN201480004364.5A patent/CN104903838A/zh active Pending
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3145773U (ja) * | 2008-07-17 | 2008-10-23 | 有限会社インターネットアンドアーツ | タッチパッド入力装置 |
JP2010213169A (ja) * | 2009-03-12 | 2010-09-24 | Fujifilm Corp | 表示装置、表示処理方法及び撮像装置 |
Also Published As
Publication number | Publication date |
---|---|
US20150363036A1 (en) | 2015-12-17 |
CN104903838A (zh) | 2015-09-09 |
JP5995171B2 (ja) | 2016-09-21 |
JP2014178750A (ja) | 2014-09-25 |
KR20150093780A (ko) | 2015-08-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9350841B2 (en) | Handheld device with reconfiguring touch controls | |
US9671893B2 (en) | Information processing device having touch screen with varying sensitivity regions | |
KR101150321B1 (ko) | 정보 처리 장치 및 정보 처리 장치의 표시 정보 편집 방법 | |
JP5759660B2 (ja) | タッチ・スクリーンを備える携帯式情報端末および入力方法 | |
JP5780438B2 (ja) | 電子機器、位置指定方法及びプログラム | |
JP5640486B2 (ja) | 情報表示装置 | |
KR20110063561A (ko) | 다접촉 터치 스크린 상의 그래픽 객체를 다루는 것에 의해 전자 기기를 제어하기 위한 장치 | |
TW201329835A (zh) | 顯示控制裝置、顯示控制方法及電腦程式 | |
JP5222967B2 (ja) | 携帯端末 | |
EP3550419B1 (en) | Mobile terminal device and method for controlling display of mobile terminal device | |
JP2012079279A (ja) | 情報処理装置、情報処理方法、及びプログラム | |
JP5995171B2 (ja) | 電子機器、情報処理方法、及び情報処理プログラム | |
JP2014016743A (ja) | 情報処理装置、情報処理装置の制御方法、および情報処理装置の制御プログラム | |
JP2014191560A (ja) | 入力装置、入力方法、及び記録媒体 | |
US9235338B1 (en) | Pan and zoom gesture detection in a multiple touch display | |
JP2012141650A (ja) | 携帯端末 | |
WO2013080425A1 (ja) | 入力装置、情報端末、入力制御方法、および入力制御プログラム | |
JP6411067B2 (ja) | 情報処理装置及び入力方法 | |
EP3433713B1 (en) | Selecting first digital input behavior based on presence of a second, concurrent, input | |
JP2012238128A (ja) | 背面入力機能を有する情報機器、背面入力方法、およびプログラム | |
JP6034140B2 (ja) | 表示装置、表示制御方法及びプログラム | |
JP5855481B2 (ja) | 情報処理装置、その制御方法およびその制御プログラム | |
US20150091831A1 (en) | Display device and display control method | |
JP2014215890A (ja) | 電子機器、情報処理方法及び情報処理プログラム | |
JP5624662B2 (ja) | 電子機器、表示制御方法およびプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14764127 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14655391 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 20157018120 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14764127 Country of ref document: EP Kind code of ref document: A1 |