WO2012111230A1 - Information processing terminal and control method therefor - Google Patents
Information processing terminal and control method therefor
- Publication number
- WO2012111230A1 (application PCT/JP2011/079007)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- detection
- display
- contact
- information
- detection information
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0412—Digitisers structurally integrated in a display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1615—Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function
- G06F1/1616—Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function with folding flat displays, e.g. laptop computers or notebooks having a clamshell configuration, with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
- G06F1/1643—Details related to the display arrangement, including those related to the mounting of the display in the housing the display being associated to a digitizer, e.g. laptops that can be used as penpads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
- G06F1/1647—Details related to the display arrangement, including those related to the mounting of the display in the housing including at least an additional display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/169—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0486—Drag-and-drop
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
Definitions
- the present invention relates to an information processing terminal including a plurality of touch panel displays.
- information processing terminals such as mobile phone terminals, smartphones, tablet terminals, and personal computers are provided with touch panel displays.
- Some information processing terminals of this type are provided with two or more touch panel displays and are virtually used as a large screen (see Patent Document 1).
- some capacitive touch panels have a high sensitivity mode.
- the capacitive touch panel in the high sensitivity mode can detect an operation only by the proximity of the finger even if the finger does not directly touch the touch panel surface.
- By using a capacitive touch panel in such a high sensitivity mode, it is possible to operate the touch panel display while wearing gloves.
- some high-sensitivity capacitive touch panels can distinguish whether a finger is touching the touch panel surface or merely close to it. Using this capability, it is possible to realize an information processing terminal that supports a wider variety of operations than a touch panel display that detects only contact.
- a gap between the touch panel displays is formed at a connection part between the casings such as a hinge part. Therefore, in the touch operation near the gap between the two touch panel displays, it may be difficult to identify whether the touch operation is a simultaneous touch operation on the two displays or a single touch operation between the two displays.
- Patent Document 1 describes a technique that performs special processing near the gap between two touch panel displays. According to this technique, when a user slides a finger to move an object near the edge of a touch panel display, the object is moved to one of the touch panel displays based on the position of the object's center of gravity. In this way, an object can be moved from one touch panel display to the other.
- However, because the technique of Patent Document 1 automatically moves an object according to its center of gravity, it is suited to moving an object from one touch panel display to another, but not to a simple slide operation. Therefore, when the object is to be moved further on the destination touch panel display, the destination display must be touched again and the finger slid again, which is a disadvantage.
- An object of the present invention is to provide an information processing terminal with improved operability in the vicinity of a gap between a plurality of touch panel displays.
- an information processing terminal of the present invention comprises: a first display and detection unit that performs both display and contact detection on a first display detection surface and outputs first detection information upon detecting contact of a contact object with the first display detection surface; a second display and detection unit that performs both display and contact detection on a second display detection surface and outputs second detection information upon detecting contact of a contact object with the second display detection surface; a detection unit that is disposed in the gap between the first display detection surface of the first display / detection unit and the second display detection surface of the second display / detection unit, performs contact detection, and outputs third detection information upon detecting contact of a contact object; and a processing unit that executes processing based on one, a plurality, or all of the first detection information, the second detection information, and the third detection information.
- the control method of the present invention is a method for controlling an information processing terminal comprising first and second display / detection units that perform both display and contact detection on respective display detection surfaces, and a detection unit that is disposed in the gap between the first display detection surface of the first display / detection unit and the second display detection surface of the second display / detection unit and performs contact detection. In this method, the first display and detection unit outputs first detection information when it detects contact; the second display and detection unit outputs second detection information when it detects contact; the detection unit outputs third detection information when it detects contact; and processing is executed based on one, a plurality, or all of the first detection information, the second detection information, and the third detection information.
- the operability in the vicinity of the gap between the display / detection unit can be improved.
- FIG. 1 is a front view of the information processing terminal according to the present embodiment.
- the information processing terminal 10 has a structure in which two housings 14 and 15 are connected so as to be openable and closable by, for example, a connecting part 16 of a hinge mechanism.
- FIG. 1 shows the information processing terminal 10 in an open state.
- the display / detection unit 11 is mounted on the housing 14, and the display / detection unit 12 is mounted on the housing 15.
- the display / detection units 11 and 12 are touch panel displays having both a function of displaying an image on a display detection surface, which performs display and detection, and a function of detecting contact of a contact object such as a finger with that display detection surface.
- the detection unit 13 is mounted on the connecting unit 16 located in the gap between the display detection surface of the display / detection unit 11 and the display detection surface of the display / detection unit 12.
- the detection unit 13 has a function of detecting a contact object.
- the detection unit 13 is a touch panel that does not have a display function.
- FIG. 2 is a block diagram showing a functional configuration of the information processing terminal according to the present embodiment.
- the information processing terminal 10 includes display / detection units 11 and 12, a detection unit 13, and a processing unit 17.
- the processing unit 17 is, for example, a processing unit that is realized by a processor executing a software program.
- the processing unit 17 executes various processes, including drawing, based on detection information from the display / detection units 11 and 12 and the detection unit 13.
- the display / detection unit 11 performs both display and contact detection, and outputs first detection information when detecting the contact of a contact object such as a finger.
- the display and detection unit 12 performs both display and contact detection, and outputs second detection information when detecting contact of a contact object.
- the detection unit 13 is disposed in the connecting unit 16 in the gap between the display / detection unit 11 and the display / detection unit 12, performs contact detection, and outputs third detection information when the contact of the contact object is detected.
- the first to third detection information is input to the processing unit 17.
- the processing unit 17 executes various processes based on one, a plurality, or all of the first detection information, the second detection information, and the third detection information. Depending on the state of detection by the detection unit 13, the processing unit 17 may execute processing based on one, two, or all three pieces of detection information.
- based on the relationship between the first and second detection information and the third detection information, the processing unit 17 distinguishes between a single touch operation, in which one place is touched, and a multi-touch operation, in which both the display / detection unit 11 and the display / detection unit 12 are touched.
- a single touch operation is, for example, an operation of tapping, flicking, or sliding with one finger.
- a multi-touch operation is, for example, an operation of tapping, flicking, or sliding with a plurality of fingers.
- the operability in the vicinity of the gap between the display / detection units 11 and 12 can be improved.
- the display / detection units 11 and 12 may output the contact coordinates as detection information, and the detection unit 13 may output that the contact is detected as detection information.
- when the first to third detection information are input simultaneously, the processing unit 17 may recognize that a single touch operation has been performed at the center point between the contact coordinates indicated by the first detection information and the contact coordinates indicated by the second detection information.
- when first detection information in which the contact coordinates draw a locus is input and, subsequently, second detection information in which the contact coordinates draw a locus is input, and third detection information is input between the two, the processing unit 17 may connect the locus of the first detection information and the locus of the second detection information and execute processing based on the connected locus.
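The center-point recognition described above can be sketched as follows. This is an illustrative sketch, not text from the patent; the names `Point` and `single_touch_point` are hypothetical.

```python
# Illustrative sketch (not part of the patent): when both touch panels
# report contact coordinates and the gap sensor (detection unit 13)
# also fires, the processing unit may treat the gesture as one single
# touch at the midpoint of the two reported coordinates.
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

def single_touch_point(p1: Point, p2: Point, gap_sensor_fired: bool):
    """Return the recognized single-touch point, or None when the gap
    sensor did not fire (the two contacts are then a multi-touch)."""
    if not gap_sensor_fired:
        return None
    return Point((p1.x + p2.x) / 2, (p1.y + p2.y) / 2)
```

In this sketch, a finger spanning the gap produces one logical touch centered between the two panel readings, while two independent fingers are left to multi-touch handling.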
- for the display / detection units 11 and 12, a capacitive touch sensor or a pressure-sensitive touch sensor can be employed.
- a capacitive touch sensor having a high sensitivity mode that can distinguish whether a finger is touching or merely close can be used. For example, when an operation due to the proximity of a finger is detected, it is conceivable to determine that the user is wearing gloves and to automatically change the operation mode, such as the touch-operation detection logic or the displayed size of objects on the screen.
- Various touch panels and sensors can also be employed for the detection unit 13.
- a capacitive touch sensor having a high sensitivity mode that can distinguish whether a finger is touching or merely close can be used.
- if coordinate detection is not required and only the presence or absence of contact needs to be detected, a metal contact sensor may be used.
- the display / detection units 11 and 12 may output first and second detection information capable of distinguishing contact from proximity, and the detection unit 13 may output third detection information capable of distinguishing contact from proximity.
- the processing unit 17 may determine that proximity has been detected when first detection information indicating proximity, second detection information indicating proximity, and third detection information indicating proximity are input simultaneously.
- the processing unit 17 may determine that a contact has occurred when first detection information indicating proximity, second detection information indicating proximity, and third detection information indicating contact are input simultaneously.
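The two arbitration rules above can be expressed compactly. The following is an illustrative sketch, not part of the patent; the function name `classify_gap_event` and the string labels are hypothetical.

```python
# Illustrative sketch (hypothetical names): when both panels report
# proximity near the gap, the gap sensor (detection unit 13) decides
# whether the event is an actual contact or only a proximity.

def classify_gap_event(first: str, second: str, third: str) -> str:
    """Each argument is 'contact', 'proximity', or 'none', as reported
    by display/detection units 11 and 12 and detection unit 13."""
    if first == second == "proximity":
        if third == "contact":
            return "contact"        # finger touches only the gap sensor
        if third == "proximity":
            return "proximity"      # finger hovers over the gap
    return "undetermined"           # other combinations handled elsewhere
```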
- the detection unit 13 does not necessarily have to be configured independently, and may be configured integrally with one or both of the display / detection units 11 and 12.
- for the display and detection units 11 and 12, a structure is assumed in which a transparent touch panel is overlaid on a display, such as a liquid crystal display, that displays the image.
- the transparent touch panel may be larger than the display surface of the display, and the portion of the transparent touch panel outside the display area may serve as the detection unit 13.
- with this configuration, the information processing terminal 10 can improve the operability in the vicinity of the gap between the display and detection units 11 and 12 without providing the detection unit 13 separately from the display and detection units 11 and 12.
- the detection unit 13 is provided in the connection unit 16 that connects the two housings 14 and 15 such as a hinge mechanism, but the present invention is not limited to this.
- the detection unit 13 may be provided in the gap between the two display / detection units 11 and 12, and does not necessarily have to be a connection unit.
- the present invention can be similarly applied to a slide-type information processing terminal in which two casings that overlap each other are slid to have two display surfaces arranged.
- in that case, the gap between the two display / detection units 11 and 12 is not a connecting portion, and the detection unit 13 is provided in the gap itself rather than in a connecting portion.
- FIG. 3 is a block diagram showing the configuration of the information processing terminal 10 according to the first embodiment.
- the information processing terminal according to the first embodiment includes touch panels 21 and 22 as input devices, displays 26 and 27 as output devices, touch panel display control units 28 and 29 that control the displays 26 and 27 and the touch panels 21 and 22, a contact sensor 23 disposed in the gap between the touch panels 21 and 22, a contact sensor control unit 30 that controls the contact sensor, a memory 31 that stores various information, and a processing unit 32 that executes various processes by executing a software program.
- the display 26 is physically overlapped with the touch panel 21.
- the display 27 is physically overlapped with the touch panel 22.
- the display 26 and the touch panel 21 are mounted on the casing 24, and the display 27 and the touch panel 22 are mounted on the casing 25.
- FIG. 4 is a perspective view showing an appearance of the information processing terminal according to the first embodiment.
- the housing 24 and the housing 25 are connected by a hinge portion 33; with the hinge portion 33 as an axis, the housings 24 and 25 can be closed as shown in FIG. 4A or opened as shown in FIG. 4B.
- the touch panel 21 appears on the housing 24 and the touch panel 22 appears on the housing 25.
- a contact sensor 23 is disposed between the touch panels 21 and 22. This contact sensor 23 detects the contact of a contact object such as a finger.
- the touch panels 21 and 22 detect the contact of a contact object and, at that time, also detect the contact position on the touch panels 21 and 22 and output a signal.
- the touch panel display control units 28 and 29 convert signals from the touch panels 21 and 22 for processing and send them to the processing unit 32.
- the contact sensor 23 detects that the contact object has come into contact and outputs a signal.
- the contact sensor control unit 30 converts the signal from the contact sensor 23 for processing and sends it to the processing unit 32.
- the processing unit 32 executes processing based on programs, data, and the like stored in the memory 31, using information input by signals from the touch panel display control units 28 and 29 and the contact sensor control unit 30. As a result of executing the processing, the processing unit 32 displays objects, images, message texts, and the like on the displays 26 and 27, for example as a response to a touch operation with a finger.
- FIG. 5 is a flowchart showing an example of the operation of the information processing terminal according to the first embodiment.
- the processing unit 32 first determines whether or not contact coordinate information is input from the touch panel 21 (step 101). If the information on the touch coordinates of the touch panel 21 has been input, the processing unit 32 next determines whether or not the information on the touch coordinates has been input from the touch panel 22 (step 102). If the information on the contact coordinates of the touch panel 22 is not input, the processing unit 32 determines that the touch is a single touch on the touch panel 21 (step 103) and executes the process (step 104).
- if the information on the contact coordinates of the touch panel 22 has also been input, the processing unit 32 next determines whether or not information indicating contact with the contact sensor 23 has been input (step 105). If information indicating contact is input from the contact sensor 23, the processing unit 32 determines that the touch is a single touch spanning the touch panel 21 and the touch panel 22 (step 106), and executes the process (step 104).
- if information indicating contact with the contact sensor 23 is not input in step 105, the processing unit 32 determines that the touch is a multi-touch touching both the touch panel 21 and the touch panel 22 (step 107), and executes the process (step 104).
- if the information on the contact coordinates of the touch panel 21 is not input in step 101, the processing unit 32 next determines whether or not contact coordinate information is input from the touch panel 22 (step 108). If the information on the contact coordinates of the touch panel 22 has been input, the processing unit 32 determines that the touch is a single touch on the touch panel 22 (step 109) and executes the process (step 104). If the information on the contact coordinates of the touch panel 22 is not input, the processing unit 32 determines that no touch operation has been performed (step 110) and does not execute the process.
- by using the contact sensor 23 disposed between the touch panel 21 and the touch panel 22, it is possible to identify whether contact by a single contact object (single touch) spans the touch panel 21 and the touch panel 22, for example when the contact object is large or the touch panels 21 and 22 are highly sensitive, or whether separate contact objects touch the touch panel 21 and the touch panel 22 (multi-touch).
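The decision flow of FIG. 5 (steps 101-110) can be sketched as follows. This is an illustrative sketch, not part of the patent; the function name, argument conventions, and result labels are hypothetical.

```python
# Illustrative sketch of the Fig. 5 decision flow (steps 101-110).
# panel1 and panel2 are contact coordinates, or None when no contact
# is reported; sensor indicates contact with the gap sensor 23.

def classify_touch(panel1, panel2, sensor: bool) -> str:
    if panel1 is not None:                          # step 101
        if panel2 is None:                          # step 102
            return "single touch on panel 21"       # step 103
        if sensor:                                  # step 105
            return "single touch spanning the gap"  # step 106
        return "multi-touch on panels 21 and 22"    # step 107
    if panel2 is not None:                          # step 108
        return "single touch on panel 22"           # step 109
    return "no touch"                               # step 110
```

The key case is the third branch: coordinates on both panels plus a gap-sensor hit are treated as one large contact straddling the gap rather than two fingers.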
- FIG. 6 is a diagram illustrating a state in which an operation for drawing a trajectory spanning two touch panel displays is performed in the second embodiment.
- FIG. 7 is a flowchart showing an example of the operation of the information processing terminal according to the second embodiment.
- an operation of drawing a trajectory across the touch panel 21 and the touch panel 22 may be performed.
- an operation is assumed in which a finger is slid from the touch panel 21 to the touch panel 22 to draw a trajectory that continuously spans the two touch panels 21 and 22.
- an object displayed on the display 26 is selected at the start point of the locus, moved from the display 26 to the display 27, and then the movement is completed at the end point of the locus.
- an operation of drawing a trajectory across the touch panel 21 and the touch panel 22 can be used as an operation for performing a gesture command for scrolling the screen left and right.
- after a locus is drawn on the touch panel 21, the processing unit 32 determines whether or not there has been contact with the contact sensor 23 (step 201).
- if there has been contact with the contact sensor 23, the processing unit 32 determines whether or not the start point of the next locus exists on the other touch panel (here, the touch panel 22) (step 202). If the start point of the next locus exists on the touch panel 22, the processing unit 32 connects the loci of the two touch panels 21 and 22 (step 203), and executes the process (step 204).
- when there is no contact with the contact sensor 23 in step 201, or when the start point of the next locus does not exist on the other touch panel in step 202, the processing unit 32 determines that the operation is a flick or slide operation within one touch panel (step 205), and executes the process (step 204).
- in this way, the information processing terminal 10 can correctly recognize whether a trajectory such as that of a flick operation traverses the touch panels 21 and 22 or consists of separate trajectories on the touch panels 21 and 22, and can execute processing accordingly.
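The trajectory-connection flow of FIG. 7 (steps 201-205) can be sketched as follows. This is an illustrative sketch, not part of the patent; the function name and argument conventions are hypothetical.

```python
# Illustrative sketch of the Fig. 7 flow (steps 201-205): deciding
# whether loci on the two panels form one continuous gesture.
# locus_on_21 / locus_on_22 are lists of (x, y) points, empty if absent.

def merge_trajectories(locus_on_21, sensor_touched: bool, locus_on_22):
    if sensor_touched and locus_on_22:       # steps 201-202
        return locus_on_21 + locus_on_22     # step 203: connect the loci
    # step 205: flick or slide confined to a single panel
    return locus_on_21 or locus_on_22
```

A connected result lets a drag begun on display 26 finish on display 27 without the user having to lift and re-touch the destination panel.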
- in the third embodiment, two high-sensitivity touch panels are used.
- a high-sensitivity touch panel can distinguish whether a contact object is touching or merely close, and can output the coordinates of the contact or proximity. Proximity means that a finger or the like is not touching but has approached within a predetermined range.
- the basic configuration of the information processing terminal according to the third embodiment is the same as that according to the first embodiment shown in FIGS.
- FIG. 8 is a flowchart showing an example of the operation of the information processing terminal according to the third embodiment.
- the processing unit 32 first determines whether or not information on coordinates of contact with, or proximity to, the touch panel 21 has been input (step 301). If information on the contact coordinates of the touch panel 21 has been input, the processing unit 32 next determines whether or not information on proximity coordinates has been input from the touch panel 22 (step 302).
- if information on the proximity coordinates of the touch panel 22 has been input, the processing unit 32 next determines whether or not information indicating contact has been input from the contact sensor 23 (step 303). If information indicating contact with the contact sensor 23 is input, the processing unit 32 determines that the touch is a single touch spanning the touch panel 21 and the touch panel 22 (step 304), and executes the process (step 305).
- if information indicating contact with the contact sensor 23 is not input in step 303, the processing unit 32 determines, depending on the determination result in step 301, that the operation is a single-touch contact on, or a proximity to, the touch panel 21, and executes the process (step 305).
- if the information on the proximity coordinates of the touch panel 22 is not input in step 302, the processing unit 32 next determines whether or not information indicating contact has been input from the contact sensor 23 (step 308). If information indicating contact with the contact sensor 23 is not input, the processing unit 32 determines, depending on the determination result in step 301, that the operation is a single-touch contact (step 306) or a proximity (step 307) on the touch panel 21, and executes the process (step 305). If information indicating contact with the contact sensor 23 is input in step 308, the processing unit 32 determines that the touch is a single touch on the touch panel 21 (step 306) and executes the process (step 305).
- if it is determined in step 301 that information on proximity coordinates, rather than contact coordinates, has been input from the touch panel 21, the processing unit 32 next determines whether or not information on proximity coordinates has been input from the touch panel 22 (step 309). If information on the proximity coordinates of the touch panel 22 has been input, the processing unit 32 further determines whether or not information indicating contact with the contact sensor 23 has been input (step 310).
- if information indicating contact with the contact sensor 23 has been input, the processing unit 32 determines that the touch is a single touch on the touch panel 22 (step 311), and executes the process (step 305). If information indicating contact with the contact sensor 23 is not input, the processing unit 32 determines that the operation is a proximity to the touch panel 22 (step 307).
- the processing unit 32 determines that the touch operation is not performed (step 312) and does not execute the process.
- the information processing terminal determines that the contact object is in contact even when the contact object is not in contact with either the touch panel 21 or 22 but is in contact with only the contact sensor 23, and executes the processing. Therefore, the touch operation in the gap between the touch panels 21 and 22 can be correctly recognized.
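The decision flow of steps 301-312 above can be condensed into a small classifier. The following is a simplified sketch, not the patent's exact flowchart: the function and argument names are hypothetical, and panel 21's state is collapsed into a single value for brevity.

```python
def classify(panel21, panel22_proximity, sensor23_contact):
    """Simplified sketch of the step 301-312 decision flow.

    panel21: "contact", "proximity", or None -- what touch panel 21 reports
    panel22_proximity: True if touch panel 22 reports proximity coordinates
    sensor23_contact: True if the gap sensor 23 reports contact
    """
    if panel21 is None:
        # no input from panel 21 at all -> no touch operation (step 312)
        return "no touch operation"
    if panel22_proximity and sensor23_contact:
        # the contact object bridges the gap between the two panels
        return "single touch in the gap"          # steps 304 / 311
    if sensor23_contact:
        return "single touch on panel 21"         # step 306
    # only panel 21 responded: contact or proximity per step 301's result
    return f"single-touch {panel21} on panel 21"  # steps 306 / 307
```

The key point the sketch illustrates is that the gap sensor's contact signal is what disambiguates "one finger spanning both panels" from "activity on panel 21 alone".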
- FIG. 9 is a front view of an information processing terminal according to the fourth embodiment.
- In the fourth embodiment, the touch panels 21 and 22 and the portion corresponding to the contact sensor 23 are formed integrally.
- Because a display device such as a liquid crystal panel has thickness and an outer peripheral frame, it is difficult to bring the display areas of the display 26 and the display 27 close together even when the casings 24 and 25 are brought close to each other by a mechanism.
- In contrast, the distance between the touch panel 21 and the touch panel 22 can be made shorter than the distance between the display 26 and the display 27.
- In this embodiment, the touch panels 21 and 22 are larger than the displays 26 and 27, so each touch panel has a portion that lies in the gap between the displays 26 and 27 and cannot display anything but can still detect a touch operation. That portion is used as the contact sensor 23. As a result, in this embodiment there is almost no gap between the touch panel 21 and the touch panel 22.
- FIG. 10 is a flowchart showing an example of the operation of the information processing terminal according to the fourth embodiment.
- The processing unit 32 first determines whether contact-coordinate information within the area of the display 26 has been input from the touch panel 21 (step 401). If it has, the processing unit 32 next determines whether contact-coordinate information within the area of the display 27 has been input from the touch panel 22 (step 402). If contact-coordinate information within the area of the display 27 has not been input from the touch panel 22, the processing unit 32 determines that the operation is a single touch on the touch panel 21 (step 403) and executes processing (step 404).
- If contact-coordinate information within the area of the display 27 has been input from the touch panel 22 in step 402, the processing unit 32 then determines whether contact-coordinate information outside the areas of the displays 26 and 27 has been input from the touch panels 21 and 22 (step 405). If information indicating contact has been input from the contact sensor 23, the processing unit 32 determines that a single touch has been made in the gap between the touch panel 21 and the touch panel 22 (step 406) and executes processing (step 404).
- If information indicating contact has not been input from the contact sensor 23 in step 405, the processing unit 32 determines that the operation is a multi-touch touching both the touch panel 21 and the touch panel 22 (step 407) and executes processing (step 404).
- If contact-coordinate information within the area of the display 26 has not been input from the touch panel 21 in step 401, the processing unit 32 next determines whether contact-coordinate information within the area of the display 27 has been input from the touch panel 22 (step 408). If it has, the processing unit 32 determines that the operation is a single touch on the touch panel 22 (step 409) and executes processing (step 404). If it has not, the processing unit 32 determines that no touch operation has been performed (step 410) and does not execute any processing.
- As described above, the information processing terminal of the fourth embodiment obtains the same effect as the first embodiment without providing the contact sensor 23 as a physically separate component.
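The steps 401-410 above map naturally onto a short decision function. This is an illustrative sketch under the assumption that each coordinate input is either a tuple or absent (None); the names are hypothetical.

```python
def classify_touch_e4(in26, in27, out_of_range_contact):
    """Sketch of the step 401-410 decision flow of the fourth embodiment.

    in26: contact coordinates within display 26 on panel 21, or None
    in27: contact coordinates within display 27 on panel 22, or None
    out_of_range_contact: True if contact was detected outside both
        display areas (the integrated equivalent of contact sensor 23)
    """
    if in26 is not None:                              # step 401
        if in27 is None:                              # step 402
            return "single touch on panel 21"         # step 403
        if out_of_range_contact:                      # step 405
            # one finger also covers the non-display strip -> one touch
            return "single touch between the panels"  # step 406
        return "multi-touch on both panels"           # step 407
    if in27 is not None:                              # step 408
        return "single touch on panel 22"             # step 409
    return "no touch operation"                       # step 410
```

Note the design choice the sketch makes explicit: with no physically separate gap sensor, the out-of-display portion of the integrated panels plays the role of contact sensor 23 in distinguishing a gap-spanning single touch from a genuine two-finger multi-touch.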
- Since the contact sensor 23 installed in the gap between the displays 26 and 27 does not need to output contact coordinates, a typical example is a metal-plate contact sensor.
- Alternatively, a pressure sensor or a capacitive touch sensor may be used.
- A pressure sensor is preferable when operation while wearing gloves is assumed, and a metal contact sensor is preferable when bare-hand operation is assumed.
- In that case, the processing unit 32 may execute processing based on the coordinate information from the touch panels 21 and 22 and the coordinate information from the contact sensor 23.
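When the gap sensor also reports coordinates, one plausible way to combine the three coordinate sources is to map each onto a single continuous virtual axis spanning panel 21, the gap, and panel 22. This is an assumption-level sketch, not something the text specifies; the function name, source labels, and widths are hypothetical.

```python
def to_virtual_x(source, x, panel21_width, gap_width):
    """Map an x-coordinate from one of three detection areas onto a
    shared virtual x-axis: [panel 21][gap sensor 23][panel 22].

    panel21_width and gap_width are device-specific constants.
    """
    if source == "panel21":
        return x                                # panel 21 starts the axis
    if source == "sensor23":
        return panel21_width + x                # gap sits after panel 21
    if source == "panel22":
        return panel21_width + gap_width + x    # panel 22 ends the axis
    raise ValueError(f"unknown source: {source}")
```

With such a mapping, a drag that crosses from panel 21 through the gap to panel 22 produces a monotonically increasing virtual coordinate, which simplifies downstream gesture processing.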
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
Description
a first display and detection unit that performs both display and contact detection on a first display-detection surface and outputs first detection information upon detecting contact of a contact object with the first display-detection surface;
a second display and detection unit that performs both display and contact detection on a second display-detection surface and outputs second detection information upon detecting contact of a contact object with the second display-detection surface;
a detection unit that is disposed in the gap between the first display-detection surface of the first display and detection unit and the second display-detection surface of the second display and detection unit, performs contact detection, and outputs third detection information upon detecting contact of a contact object;
and a processing unit that executes processing based on one, some, or all of the first detection information, the second detection information, and the third detection information.
The information processing terminal has these components.
The control method outputs first detection information when the first display and detection unit detects contact,
outputs second detection information when the second display and detection unit detects contact,
outputs third detection information when the detection unit detects contact,
and executes processing based on one, some, or all of the first detection information, the second detection information, and the third detection information.
FIG. 3 is a block diagram showing the configuration of the information processing terminal 10 according to the first embodiment. The information processing terminal according to the first embodiment includes touch panels 21 and 22 as input devices, displays 26 and 27 as output devices, touch panel display control units 28 and 29 that control the displays 26 and 27 and the touch panels 21 and 22, a contact sensor 23 disposed in the gap between the touch panels 21 and 22, a contact sensor control unit 30 that controls the contact sensor, a memory 31 that stores various kinds of information, and a processing unit 32 that executes software programs to perform various kinds of processing. The display 26 is physically overlaid with the touch panel 21. Likewise, the display 27 is physically overlaid with the touch panel 22. The display 26 and the touch panel 21 are mounted in the casing 24, and the display 27 and the touch panel 22 are mounted in the casing 25.
The second embodiment enables an operation in which a contact object such as a finger draws a trajectory spanning the two touch panels.
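The gap-crossing stroke described here can be sketched as follows. This is a minimal illustration under the assumption that strokes are lists of (x, y) points and that the gap sensor's report is reduced to a boolean; the function name is hypothetical.

```python
def link_strokes(stroke_on_panel21, gap_contact_seen, stroke_on_panel22):
    """If the gap sensor reported contact between the end of a stroke on
    panel 21 and the start of a stroke on panel 22, join the two strokes
    into one continuous trajectory; otherwise treat them as separate.
    """
    if gap_contact_seen:
        # the finger never left the surface -> one continuous stroke
        return stroke_on_panel21 + stroke_on_panel22
    return None  # two independent strokes; do not join
```

The gap sensor thus acts as the evidence that the finger stayed down while crossing between the panels, which is what lets the two partial trajectories be treated as one gesture.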
In the third embodiment, high-sensitivity touch panels are used as the two touch panels. A high-sensitivity touch panel can distinguish between contact of a contact object and proximity of a contact object, and can output the coordinates of the contact or proximity. Proximity means that a finger or the like is not in contact but has approached to within a predetermined distance.
FIG. 9 is a front view of the information processing terminal according to the fourth embodiment. In the fourth embodiment, the touch panels 21 and 22 and the portion corresponding to the contact sensor 23 are formed integrally.
11 display and detection unit
12 display and detection unit
13 detection unit
14, 15 casing
16 connecting part
17 processing unit
21, 22 touch panel
23 contact sensor
24, 25 casing
26, 27 display
28 touch panel display control unit
30 contact sensor control unit
32 processing unit
33 hinge unit
Claims (12)
- An information processing terminal comprising:
a first display and detection unit that performs both display and contact detection on a first display-detection surface and outputs first detection information upon detecting contact of a contact object with the first display-detection surface;
a second display and detection unit that performs both display and contact detection on a second display-detection surface and outputs second detection information upon detecting contact of a contact object with the second display-detection surface;
a detection unit that is disposed in the gap between the first display-detection surface of the first display and detection unit and the second display-detection surface of the second display and detection unit, performs contact detection, and outputs third detection information upon detecting contact of a contact object; and
a processing unit that executes processing based on one, some, or all of the first detection information, the second detection information, and the third detection information. - The information processing terminal according to claim 1, wherein the processing unit determines whether an operation is a single-touch operation or a multi-touch operation based on the relationship between the first and second detection information and the third detection information.
- The information processing terminal according to claim 1, wherein the first display and detection unit outputs contact coordinates within the first display-detection surface as the first detection information, the second display and detection unit outputs contact coordinates within the second display-detection surface as the second detection information, and the detection unit outputs an indication that contact has been detected as the third detection information.
- The information processing terminal according to claim 3, wherein, when the first detection information, the second detection information, and the third detection information are input simultaneously, the processing unit recognizes that a single-touch operation has been performed at the midpoint between the contact coordinates indicated by the first detection information and the contact coordinates indicated by the second detection information.
- The information processing terminal according to claim 3, wherein, when first detection information whose contact coordinates draw a trajectory is input, followed by third detection information, followed by second detection information whose contact coordinates draw a trajectory, the processing unit connects the trajectory of the first detection information and the trajectory of the second detection information.
- The information processing terminal according to claim 1, wherein the first display and detection unit outputs the first detection information, which can distinguish between contact and proximity of the contact object with respect to the first display-detection surface,
the second display and detection unit outputs the second detection information, which can distinguish between contact and proximity of the contact object with respect to the second display-detection surface,
the detection unit outputs the third detection information, which can distinguish between contact and proximity of the contact object, and
the processing unit determines proximity when first detection information indicating proximity, second detection information indicating proximity, and third detection information indicating proximity are input simultaneously, and determines contact when first detection information indicating proximity, second detection information indicating proximity, and third detection information indicating contact are input simultaneously. - The information processing terminal according to any one of claims 1 to 6, wherein at least one of the first display and detection unit and the second display and detection unit has a structure in which a transparent touch panel is overlaid on a display,
the transparent touch panel is larger than the display surface of the display, and the portion of the transparent touch panel outside the area of the display panel serves as the detection unit. - A method for controlling an information processing terminal that includes first and second display and detection units, each of which performs both display and contact detection on a display-detection surface and detects contact with that display-detection surface, and a detection unit that is disposed in the gap between the first display-detection surface of the first display and detection unit and the second display-detection surface of the second display and detection unit and performs contact detection, the method comprising:
outputting first detection information when the first display and detection unit detects contact;
outputting second detection information when the second display and detection unit detects contact;
outputting third detection information when the detection unit detects contact; and
executing processing based on one, some, or all of the first detection information, the second detection information, and the third detection information. - The method for controlling an information processing terminal according to claim 8, wherein whether an operation is a single-touch operation or a multi-touch operation is determined based on the relationship between the first and second detection information and the third detection information.
- The method for controlling an information processing terminal according to claim 8, wherein the first display and detection unit outputs contact coordinates within the first display-detection surface as the first detection information, the second display and detection unit outputs contact coordinates within the second display-detection surface as the second detection information, and the detection unit outputs an indication that contact has been detected as the third detection information,
and when the first detection information, the second detection information, and the third detection information are output simultaneously, it is recognized that a single-touch operation has been performed at the midpoint between the contact coordinates indicated by the first detection information and the contact coordinates indicated by the second detection information. - The method for controlling an information processing terminal according to claim 8, wherein the first display and detection unit outputs contact coordinates within the first display-detection surface as the first detection information, the second display and detection unit outputs contact coordinates within the second display-detection surface as the second detection information, and the detection unit outputs an indication that contact has been detected as the third detection information,
and when first detection information whose contact coordinates draw a trajectory is output, followed by second detection information whose contact coordinates draw a trajectory, and third detection information was output between the first detection information and the second detection information, the trajectory of the first detection information and the trajectory of the second detection information are connected. - The method for controlling an information processing terminal according to claim 8, wherein the first display and detection unit outputs the first detection information, which can distinguish between contact and proximity of the contact object with respect to the first display-detection surface, the second display and detection unit outputs the second detection information, which can distinguish between contact and proximity of the contact object with respect to the second display-detection surface, and the detection unit outputs the third detection information, which can distinguish between contact and proximity of the contact object,
proximity is determined when first detection information indicating proximity, second detection information indicating proximity, and third detection information indicating proximity are output simultaneously,
and contact is determined when first detection information indicating proximity, second detection information indicating proximity, and third detection information indicating contact are output simultaneously.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP11858732.8A EP2677403A4 (en) | 2011-02-16 | 2011-12-15 | INFORMATION PROCESSING DEVICE AND METHOD FOR ITS CONTROL |
US13/979,800 US9122337B2 (en) | 2011-02-16 | 2011-12-15 | Information processing terminal, and method for controlling same |
JP2012557795A JP5846129B2 (ja) | 2011-02-16 | 2011-12-15 | 情報処理端末およびその制御方法 |
CN201180067743.5A CN103384862B (zh) | 2011-02-16 | 2011-12-15 | 信息处理终端及其控制方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011-030970 | 2011-02-16 | ||
JP2011030970 | 2011-02-16 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012111230A1 true WO2012111230A1 (ja) | 2012-08-23 |
Family
ID=46672179
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/079007 WO2012111230A1 (ja) | 2011-02-16 | 2011-12-15 | 情報処理端末およびその制御方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9122337B2 (ja) |
EP (1) | EP2677403A4 (ja) |
JP (1) | JP5846129B2 (ja) |
CN (1) | CN103384862B (ja) |
WO (1) | WO2012111230A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015114717A (ja) * | 2013-12-09 | 2015-06-22 | シャープ株式会社 | 情報表示制御装置、情報表示制御方法及びプログラム |
JP2015194983A (ja) * | 2014-03-25 | 2015-11-05 | パナソニックIpマネジメント株式会社 | 入力装置および表示装置 |
KR20150140365A (ko) | 2013-05-08 | 2015-12-15 | 알프스 덴키 가부시키가이샤 | 입력 장치 |
JP2018032432A (ja) * | 2017-10-30 | 2018-03-01 | シャープ株式会社 | 情報表示制御装置、情報表示制御方法及びプログラム |
JP2019109802A (ja) * | 2017-12-20 | 2019-07-04 | コニカミノルタ株式会社 | 表示装置、画像処理装置及びプログラム |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150185839A1 (en) * | 2013-12-28 | 2015-07-02 | Aleksander Magi | Multi-screen wearable electronic device for wireless communication |
KR102408828B1 (ko) * | 2015-09-03 | 2022-06-15 | 삼성디스플레이 주식회사 | 전자장치 및 그 구동방법 |
KR102510395B1 (ko) * | 2015-12-01 | 2023-03-16 | 삼성디스플레이 주식회사 | 표시 장치 시스템 |
USD891427S1 (en) * | 2017-11-13 | 2020-07-28 | Samsung Display Co., Ltd. | Display device |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010176332A (ja) | 2009-01-28 | 2010-08-12 | Sony Corp | 情報処理装置、情報処理方法およびプログラム |
JP2010250463A (ja) * | 2009-04-14 | 2010-11-04 | Sony Corp | 情報処理装置、情報処理方法及びプログラム |
JP2010250465A (ja) * | 2009-04-14 | 2010-11-04 | Sony Corp | 情報処理装置、情報処理方法及びプログラム |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5644469A (en) * | 1991-03-06 | 1997-07-01 | Canon Kabushiki Kaisha | Information processing apparatus support system with support arms which are capable of closing toward a keyboard unit or away from a keyboard unit |
JPH0926832A (ja) * | 1995-07-07 | 1997-01-28 | Seiko Epson Corp | 情報処理装置および処理方法 |
US20090322689A1 (en) * | 2008-06-30 | 2009-12-31 | Wah Yiu Kwong | Touch input across touch-sensitive display devices |
2011
- 2011-12-15 JP JP2012557795A patent/JP5846129B2/ja active Active
- 2011-12-15 CN CN201180067743.5A patent/CN103384862B/zh active Active
- 2011-12-15 WO PCT/JP2011/079007 patent/WO2012111230A1/ja active Application Filing
- 2011-12-15 US US13/979,800 patent/US9122337B2/en active Active
- 2011-12-15 EP EP11858732.8A patent/EP2677403A4/en not_active Ceased
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010176332A (ja) | 2009-01-28 | 2010-08-12 | Sony Corp | 情報処理装置、情報処理方法およびプログラム |
JP2010250463A (ja) * | 2009-04-14 | 2010-11-04 | Sony Corp | 情報処理装置、情報処理方法及びプログラム |
JP2010250465A (ja) * | 2009-04-14 | 2010-11-04 | Sony Corp | 情報処理装置、情報処理方法及びプログラム |
Non-Patent Citations (1)
Title |
---|
See also references of EP2677403A4 |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20150140365A (ko) | 2013-05-08 | 2015-12-15 | 알프스 덴키 가부시키가이샤 | 입력 장치 |
JP2015114717A (ja) * | 2013-12-09 | 2015-06-22 | シャープ株式会社 | 情報表示制御装置、情報表示制御方法及びプログラム |
JP2015194983A (ja) * | 2014-03-25 | 2015-11-05 | パナソニックIpマネジメント株式会社 | 入力装置および表示装置 |
JP2018032432A (ja) * | 2017-10-30 | 2018-03-01 | シャープ株式会社 | 情報表示制御装置、情報表示制御方法及びプログラム |
JP2019109802A (ja) * | 2017-12-20 | 2019-07-04 | コニカミノルタ株式会社 | 表示装置、画像処理装置及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
JP5846129B2 (ja) | 2016-01-20 |
CN103384862A (zh) | 2013-11-06 |
US20130335359A1 (en) | 2013-12-19 |
CN103384862B (zh) | 2017-04-12 |
JPWO2012111230A1 (ja) | 2014-07-03 |
EP2677403A4 (en) | 2014-10-01 |
US9122337B2 (en) | 2015-09-01 |
EP2677403A1 (en) | 2013-12-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5846129B2 (ja) | 情報処理端末およびその制御方法 | |
US8289292B2 (en) | Electronic device with touch input function and touch input method thereof | |
KR200450989Y1 (ko) | 양면 터치스크린을 구비한 플랫 패널 형상의 모바일 장치 | |
EP2241963B1 (en) | Information processing apparatus, information processing method and program | |
JP4372188B2 (ja) | 情報処理装置および表示制御方法 | |
US20180067638A1 (en) | Gesture Language for a Device with Multiple Touch Surfaces | |
JP6319298B2 (ja) | 情報端末、表示制御方法及びそのプログラム | |
EP2657811A1 (en) | Touch input processing device, information processing device, and touch input control method | |
JP2013546110A (ja) | コンピューティング装置の動きを利用するコンピューティング装置と相互作用するときに発生する入力イベントの解釈の強化 | |
TWI659353B (zh) | 電子設備以及電子設備的工作方法 | |
JP2012212230A (ja) | 電子機器 | |
US20140195935A1 (en) | Information processing device, information processing method, and information processing program | |
TW201741814A (zh) | 視窗控制方法及行動終端 | |
JP6304232B2 (ja) | 携帯電子機器、その制御方法及びプログラム | |
JP5515951B2 (ja) | 情報処理装置、入力制御方法、プログラム及び記録媒体 | |
JP2013114645A (ja) | 小型情報機器 | |
CN105183353B (zh) | 用于触控设备的多点触控输入方法 | |
KR20160000534U (ko) | 터치패드를 구비한 스마트폰 | |
CN204270284U (zh) | 一种显示设备 | |
TWI493433B (zh) | 被遮蔽畫面投影方法及應用該方法之可攜式電子裝置 | |
JP2013228786A (ja) | 情報端末、情報表示方法及びプログラム | |
KR20130032598A (ko) | 휴대용 단말기에서 화면 크기를 조절하기 위한 장치 및 방법 | |
JP2013020344A (ja) | 入力装置及び入力検出方法 | |
JP2016130976A (ja) | 情報処理装置、情報処理装置の制御方法、および制御プログラム | |
TWI466007B (zh) | 多重觸控輸入方法及其電子裝置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11858732 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 13979800 Country of ref document: US |
ENP | Entry into the national phase |
Ref document number: 2012557795 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 2011858732 Country of ref document: EP |
NENP | Non-entry into the national phase |
Ref country code: DE |