WO2016114269A1 - Information processing apparatus and control method thereof - Google Patents
情報処理装置およびその制御方法 (Information processing apparatus and control method thereof)
- Publication number
- WO2016114269A1 (PCT/JP2016/050734)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- mode
- control unit
- unit
- threshold
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
Definitions
- The present invention relates to an information processing apparatus and a control method thereof, and more particularly to an information processing apparatus including a display unit and an object detection unit, and a control method of such an apparatus.
- Patent Document 1 describes a pointer display device that has a scroll mode, in which the display contents can be scrolled via a touch panel provided on the display screen, and a pointer mode, in which a selection target displayed on the display screen can be selected via the touch panel, and that switches between the two modes when the touch panel is long-pressed.
- Patent Document 1: Japanese Patent Laid-Open No. 2011-170901 (published September 1, 2011)
- The present inventors have developed an information processing apparatus that includes a display unit and an object detection unit having a planar detection area, located at a position different from the display area of the display unit, for detecting an approaching or contacting object, and have intensively studied how to realize simpler operation in such an information processing apparatus.
- The present invention has been made in view of the above, and its main object is to provide a technique for realizing simpler operation in an information processing apparatus that includes a display unit and an object detection unit.
- To solve the above problem, an information processing apparatus according to one aspect of the present invention includes: a display unit; an object detection unit having a planar detection region, located at a position different from the display region of the display unit, for detecting an approaching or contacting object; a position information acquisition unit that acquires position information indicating the position at which the object detection unit detected the object; a display control unit that, in a first mode, determines the position of a pointer on the display region according to the position information and, in a second mode, scrolls at least a part of the display region according to changes in the position information; and an operation control unit that switches the operation of the display control unit from the first mode to the second mode when the duration for which the position information continues to indicate the same position exceeds a first threshold, and that executes an action according to the position of the pointer when the duration exceeds a second threshold.
- Similarly, a control method according to one aspect of the present invention is a method of controlling an information processing apparatus that includes a display unit, an object detection unit having a planar detection region, located at a position different from the display region of the display unit, for detecting an approaching or contacting object, and a position information acquisition unit that acquires position information indicating the position at which the object detection unit detected the object, the method including: a display control step of, in a first mode, determining the position of a pointer on the display region according to the position information and, in a second mode, scrolling at least a part of the display region according to changes in the position information; and an operation control step of switching the operation of the display control step from the first mode to the second mode when the duration for which the position information continues to indicate the same position exceeds a first threshold, and executing an action according to the position of the pointer when the duration exceeds a second threshold.
- According to the above configurations, the user can continuously perform the switch from the first mode to the second mode and a further action without once releasing the operating object (for example, a finger) from the object detection unit. A simpler operation can thereby be realized in an information processing apparatus that includes a display unit and an object detection unit.
- FIG. 2 is a diagram illustrating an appearance of the mobile phone 10 according to the present embodiment.
- the mobile phone 10 is a so-called foldable mobile phone.
- the first housing 1 and the second housing 2 are connected via a hinge 3 and are rotatable about the axis of the hinge 3.
- The first housing 1 and the second housing 2 each have, for example, a substantially flat plate shape.
- a display unit 14 is disposed on one surface of the first housing 1.
- Hard keys 15 are arranged on one surface of the second housing 2, and a sensor of the touch pad (object detection unit) 16 is arranged below the hard keys 15 (inside the second housing 2) so as to overlap them.
- The mobile phone 10 can take an open state (the form shown in FIG. 2) in which the first housing 1 and the second housing 2 are unfolded, and a closed state (not shown) in which the surface (display surface) on which the display unit 14 of the first housing 1 is disposed faces the surface (operation surface) on which the hard keys 15 of the second housing 2 are disposed.
- the display unit 14 displays an image.
- An LCD (liquid crystal display), an organic EL display, or the like can be used as the display unit 14.
- the hard key 15 is for the user to operate the mobile phone 10.
- the hard key 15 is a physical key that outputs a signal corresponding to the key pressed by the user.
- the hard key 15 is a menu key, a numeric keypad, a cross key, a center key, an on-hook key, an off-hook key, or the like.
- the touch pad 16 is for operating the mobile phone 10.
- The touch pad 16 includes the sensor, detects, at predetermined time intervals, an object (such as a user's finger or a stylus) that approaches or touches the touch pad 16, and outputs position information indicating the detected position (for example, two-dimensional coordinates on the touch pad 16).
- the detection area where the touch pad 16 detects an object is the entire surface (operation surface) on which the hard keys 15 of the second housing 2 are arranged. That is, the operation surface of the second housing 2 is the detection surface of the touch pad 16. Therefore, the key top surface of the hard key 15 is a part of the detection surface and is included in the detection region.
- the sensor included in the touch pad 16 is a capacitance sensor or the like.
- the touch pad 16 detects whether or not the user's finger is in contact with the operation surface of the second housing 2. As shown in FIG. 2, the operation surface (detection region of the touch pad 16) of the second housing 2 is located at a position different from the display surface (display region) of the display unit 14.
- the mobile phone 10 is not limited to this, and may be a mobile phone of straight type, slide type, biaxial hinge type, or the like. In the present embodiment, a mobile phone is illustrated, but the present invention is not limited to this.
- The present invention can be applied to any information processing apparatus that includes a display unit and an object detection unit having a planar detection area, located at a position different from the display area of the display unit, for detecting an approaching or contacting object.
- the present invention is applicable to, for example, notebook PCs, portable game machines, digital cameras, digital video cameras, portable music players, and the like.
- the mobile phone 10 has the two operation units (input units) of the hard key 15 and the touch pad 16.
- The mobile phone 10 has three modes: a “key operation mode”, a “pointer mode” (first mode), and a “scroll mode” (second mode).
- the key operation mode is a mode that can be operated only with the hard key 15. That is, in the key operation mode, the operation using the touch pad 16 is invalid.
- In the key operation mode, for example, the user can operate the cross key to move the focus (select an item in a list), press the center key to confirm a selection, operate the numeric keypad to enter numbers or characters, press the off-hook key to start a call, or press the on-hook key to end a call or an application.
- From the key operation mode, the mobile phone 10 shifts to the pointer mode when the off-hook key is pressed and held or when a specific application is started; otherwise, the key operation mode remains in effect.
- the pointer mode is a mode in which an arrow mark cursor is displayed on the screen, and a cursor moving operation and a determination operation using the touch pad 16 are enabled.
- the pointer mode includes a cursor non-display state and a cursor display state.
- In the cursor non-display state, when the user touches the touch pad 16 (the operation surface of the second housing 2) and swipes slightly, the cursor is displayed; that is, the state transitions to the cursor display state.
- the operation by the hard key 15 is possible similarly to the key operation mode.
- In the cursor display state, the user can move the cursor by swiping or flicking the touch pad 16, and can input a confirmation at the cursor position by single-tapping the touch pad 16 within a predetermined time (for example, 1.5 seconds) after the cursor moves, or by double-tapping the touch pad 16.
- The cursor is erased, that is, the state transitions to the cursor non-display state, for example when a predetermined time elapses without the touch pad 16 being operated.
- the key operation mode is entered by terminating the specific application.
- In the cursor display state, a long tap on the touch pad 16 causes a transition to the scroll mode.
- Scroll mode is a mode that enables the screen to be scrolled by the touch pad 16.
- the screen can be scrolled by swiping or flicking the touch pad 16, or predetermined processing corresponding to the running application can be executed by long-tapping the touch pad 16.
- In the scroll mode, a single tap of the touch pad 16 changes the mode back to the pointer mode, and terminating the specific application changes it to the key operation mode.
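- As an informal illustration of the three modes and the transitions just described, the following Kotlin sketch models them as a small state machine. The event names and the code structure are assumptions made for illustration only, not the patent's implementation.

```kotlin
// Illustrative sketch of the three operation modes and the transitions described above.
// Event names and the state-machine structure are assumptions, not the patent's implementation.
enum class OperationMode { KEY_OPERATION, POINTER, SCROLL }

enum class UiEvent {
    LONG_PRESS_OFF_HOOK_KEY,   // key operation mode -> pointer mode
    START_SPECIFIC_APP,        // key operation mode -> pointer mode
    END_SPECIFIC_APP,          // pointer/scroll mode -> key operation mode
    LONG_TAP_TOUCH_PAD,        // pointer mode (cursor shown) -> scroll mode
    SINGLE_TAP_TOUCH_PAD       // scroll mode -> pointer mode
}

fun nextMode(current: OperationMode, event: UiEvent): OperationMode = when (current) {
    OperationMode.KEY_OPERATION -> when (event) {
        UiEvent.LONG_PRESS_OFF_HOOK_KEY, UiEvent.START_SPECIFIC_APP -> OperationMode.POINTER
        else -> current
    }
    OperationMode.POINTER -> when (event) {
        UiEvent.LONG_TAP_TOUCH_PAD -> OperationMode.SCROLL
        UiEvent.END_SPECIFIC_APP -> OperationMode.KEY_OPERATION
        else -> current
    }
    OperationMode.SCROLL -> when (event) {
        UiEvent.SINGLE_TAP_TOUCH_PAD -> OperationMode.POINTER
        UiEvent.END_SPECIFIC_APP -> OperationMode.KEY_OPERATION
        else -> current
    }
}
```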
- FIG. 1 is a block diagram showing a main configuration of a mobile phone 10 according to the present embodiment.
- the mobile phone 10 includes a main control unit 11, a storage unit 12, a work memory 13, a display unit 14, hard keys 15, and a touch pad 16.
- the mobile phone 10 may include members such as a communication unit, a voice input unit, and a voice output unit, but these members are not shown because they are not related to the features of the invention.
- The main control unit 11 is, for example, a CPU (Central Processing Unit), the storage unit 12 is, for example, a ROM (Read Only Memory), and the work memory 13 is, for example, a RAM (Random Access Memory). By executing a program read from the storage unit 12 into the work memory 13, the main control unit 11 performs various computations and comprehensively controls the units included in the mobile phone 10.
- the storage unit 12 also stores a time threshold (first threshold) T1 and a time threshold (second threshold) T2.
- The main control unit 11 includes, as functional blocks, a touch pad control unit (position information acquisition unit) 21, a touch event processing unit (operation control unit) 22, an application control unit (operation control unit) 23, and a display control unit 24.
- the touch pad control unit 21 acquires position information from the touch pad 16, and specifies a touch event based on the acquired position information.
- the touch pad control unit 21 generates event information indicating the identified touch event, and notifies the generated event information to the touch event processing unit 22.
- the touch pad control unit 21 determines the type of touch event at each detection interval of the touch pad 16, and specifies the touch event.
- When the touch pad control unit 21 acquires position information from the touch pad 16 in a state where no position information was received from the touch pad 16 last time, it determines that a touch-down has occurred at the position indicated by the acquired position information. Conversely, when no position information is sent from the touch pad 16 in a state where position information was received from the touch pad 16 last time, the touch pad control unit 21 determines that a touch-up has occurred at the position indicated by the previously acquired position information.
- When the touch pad control unit 21 acquires position information from the touch pad 16 in a state where position information was received from the touch pad 16 last time, and the position indicated by the newly acquired position information differs from the position indicated by the previously acquired position information, it determines that a move to the position indicated by the newly acquired position information has occurred. In this way, the touch pad control unit 21 specifies one of touch-down, touch-up, and move as the touch event.
- When the touch pad control unit 21 has specified a touch event, it generates event information including type information indicating the type of the touch event (touch-down, touch-up, or move) and position information. Specifically, when the touch event is a touch-down, the event information includes position information indicating the touch-down position; when the touch event is a touch-up, it includes position information indicating the touch-up position (the previous touch-down or move position); and when the touch event is a move, it includes position information indicating the move position.
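- As a rough sketch of the touch-event classification above, the following Kotlin fragment compares the position sample received in the current detection interval with the previous one. The class and function names are hypothetical and not taken from the patent.

```kotlin
// Minimal sketch of the touch-event classification described above.
// All names are illustrative assumptions, not taken from the patent.
data class Point(val x: Int, val y: Int)

sealed class TouchEvent {
    data class TouchDown(val pos: Point) : TouchEvent()
    data class TouchUp(val pos: Point) : TouchEvent()
    data class Move(val pos: Point) : TouchEvent()
}

class TouchPadController {
    private var lastPos: Point? = null  // position received in the previous detection interval

    /** Called once per detection interval; `current` is null when no object is detected. */
    fun classify(current: Point?): TouchEvent? {
        val previous = lastPos
        lastPos = current
        return when {
            previous == null && current != null -> TouchEvent.TouchDown(current) // nothing -> contact
            previous != null && current == null -> TouchEvent.TouchUp(previous)  // contact -> released
            previous != null && current != null && current != previous ->
                TouchEvent.Move(current)                                          // contact position changed
            else -> null                                                          // no change this interval
        }
    }
}
```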
- the touch event processing unit 22 acquires event information from the touch pad control unit 21, processes the event information, and notifies the application control unit 23 or the display control unit 24 of the result.
- User operations on the touch pad 16 include a touch operation, a single tap, a double tap, a long tap, a swipe, and a flick.
- swipe and flick are not particularly distinguished, and these are collectively referred to as a scroll operation.
- the touch event processing unit 22 determines the type of user operation based on one or more touch events specified by the touch pad control unit 21.
- the touch operation is an operation in which the user brings an object into contact with the touch pad 16.
- When a touch-down is detected, the touch event processing unit 22 determines that the user has performed a touch operation.
- the single tap is an operation in which the user releases the object from the touch pad 16 immediately after contacting the object with the touch pad 16.
- When the touch event processing unit 22 detects a touch-down and then detects a touch-up at the next detection or within a predetermined period, it determines that the user has performed a single tap.
- The double tap is an operation in which the user brings the object into contact with the touch pad 16 and releases it twice in quick succession.
- the touch event processing unit 22 determines that the user has double-tapped when it is determined that there has been a single tap twice consecutively within a predetermined period.
- the long tap is an operation in which the user brings the object into contact with the touch pad 16 and releases the object from the touch pad 16 after a predetermined period. The long tap determination by the touch event processing unit 22 will be described later.
- The scroll operation is an operation in which the user moves the object across the touch pad 16 while keeping it in contact with the touch pad 16. When the touch event processing unit 22 detects a touch-down (or a move) and then detects a move of more than a predetermined distance, it determines that the user has performed a scroll operation.
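- The gesture determination described above could be sketched roughly as follows. This is a simplified, hypothetical classifier: unlike the flow described later, it classifies the long tap on release rather than while the touch continues, and all thresholds and names are assumptions.

```kotlin
// Simplified sketch of the gesture determination in the touch event processing unit.
// Thresholds and names are illustrative assumptions.
class GestureClassifier(
    private val tapTimeoutMs: Long = 300,       // max touch-down-to-touch-up time for a tap
    private val doubleTapWindowMs: Long = 500,  // max gap between two taps for a double tap
    private val longTapMs: Long = 800,          // min hold time for a long tap
    private val moveThresholdPx: Int = 10       // min travel distance for a scroll operation
) {
    enum class Gesture { SINGLE_TAP, DOUBLE_TAP, LONG_TAP, SCROLL }

    private var downTime = 0L
    private var downX = 0; private var downY = 0
    private var touching = false
    private var lastTapTime = Long.MIN_VALUE / 2

    fun onTouchDown(x: Int, y: Int, now: Long) { downX = x; downY = y; downTime = now; touching = true }

    fun onMove(x: Int, y: Int): Gesture? {
        if (!touching) return null
        val travelled = kotlin.math.abs(x - downX) + kotlin.math.abs(y - downY)
        return if (travelled > moveThresholdPx) Gesture.SCROLL else null
    }

    fun onTouchUp(now: Long): Gesture? {
        if (!touching) return null
        touching = false
        val held = now - downTime
        return when {
            held >= longTapMs -> Gesture.LONG_TAP
            held <= tapTimeoutMs && now - lastTapTime <= doubleTapWindowMs -> {
                lastTapTime = Long.MIN_VALUE / 2; Gesture.DOUBLE_TAP
            }
            held <= tapTimeoutMs -> { lastTapTime = now; Gesture.SINGLE_TAP }
            else -> null
        }
    }
}
```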
- the touch event processing unit 22 notifies the application control unit 23 of the detected single tap, double tap, and long tap.
- the touch event processing unit 22 notifies the display control unit 24 of the position information and the detected scroll operation. Note that the touch event processing unit 22 may further notify the application control unit 23 of the touch event itself, the scroll operation, and the like.
- the application control unit 23 controls applications that can be executed by the mobile phone 10. Specifically, the application control unit 23 acquires a user operation from the touch event processing unit 22. The application control unit 23 executes a predetermined process in the running application according to the acquired user operation. The application control unit 23 can control any application that can be executed by the mobile phone 10, such as a call, mail, image display, video playback, document creation, and the like.
- the display control unit 24 controls the display content of the display unit 14 based on the output from the application control unit 23.
- the display control unit 24 also controls various cursors displayed on the display unit 14 and controls scrolling in the display area of the display unit 14.
- the display control unit 24 determines the position of the pointer on the display area according to the position information acquired from the touch event processing unit 22, and causes the display unit 14 to display the cursor 6 indicating the pointer.
- the display control unit 24 scrolls (moves and displays) at least a part of the display area according to the scroll operation acquired from the touch event processing unit 22. As described above, the scroll operation is detected according to the change in the position information. In the scroll mode, the display control unit 24 causes the display unit 14 to display the cursor 7 indicating the scroll mode. Moreover, in one embodiment, the display control unit 24 may move at least a part of the display area and acquire the display content to be displayed in the moved area from the application control unit 23.
- the touch event processing unit 22 controls whether the display control unit 24 operates in the pointer mode or the scroll mode.
- When the user presses and holds the touch pad 16, the touch event processing unit 22 determines whether the touch duration (the duration for which the position information continues to indicate the same position) has exceeded each of the time thresholds T1 and T2.
- For example, when the touch pad control unit 21 has not detected a touch-up or a move by the time each time threshold elapses after it detected the touch-down, the touch event processing unit 22 can determine that the touch duration has exceeded that time threshold.
- Alternatively, the touch event processing unit 22 may measure the touch duration during which the touch pad control unit 21 detects neither a touch-up nor a move after detecting the touch-down, and compare it with each time threshold.
- The touch event processing unit 22 switches the operation of the display control unit 24 from the pointer mode to the scroll mode when the touch duration exceeds the time threshold T1, and notifies the application control unit 23 of a long tap when the touch duration exceeds the time threshold T2. The application control unit 23 that has received the long tap notification executes an action (for example, display of the list image 8) according to the position of the pointer.
- The application control unit 23 instructs the display control unit 24 to display, on the display unit 14, display content including one or more selection targets 5, and may determine the action to be executed depending on whether the pointer is placed on a selection target and on which selection target 5 is selected.
- In this way, simply by adjusting how long the long press lasts, the user can continuously switch from the pointer mode to the scroll mode and perform a further action without once releasing the finger 4 or the like from the touch pad 16.
- In the present embodiment, the time threshold T1 is shorter than the time threshold T2. The display control unit 24 switches to the scroll mode when the touch duration exceeds the time threshold T1; when the touch duration exceeds the time threshold T2, the touch event processing unit 22 and the application control unit 23 cause the list image 8 to be superimposed on the display area of the display unit 14, and the display control unit 24 scrolls the list image 8 according to the scroll operation detected from changes in the position information. The user can thereby have the list image 8 displayed and scroll it without releasing the finger 4 from the touch pad 16.
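- A compact sketch of this T1/T2 handling might look like the following. The class, callbacks, and polling structure are assumptions for illustration only; in the patent, the corresponding roles are played by functional blocks of the main control unit 11.

```kotlin
// Sketch of the long-press handling around the thresholds T1 < T2 described above.
// The class, callbacks, and polling structure are illustrative assumptions.
enum class Mode { POINTER, SCROLL }

class LongPressController(
    private val t1Ms: Long,                                   // first threshold T1: switch to scroll mode
    private val t2Ms: Long,                                   // second threshold T2: execute pointer action
    private val onModeChange: (Mode) -> Unit,                 // e.g. display control unit switches modes
    private val onLongPressAction: (x: Int, y: Int) -> Unit   // e.g. application control unit shows list image 8
) {
    init { require(t1Ms < t2Ms) { "T1 must be shorter than T2 in this embodiment" } }

    private var downTime = 0L
    private var x = 0; private var y = 0
    private var stationary = false
    private var switchedToScroll = false
    private var actionFired = false

    fun onTouchDown(px: Int, py: Int, now: Long) {
        downTime = now; x = px; y = py
        stationary = true; switchedToScroll = false; actionFired = false
    }

    fun onMoveOrTouchUp() { stationary = false }  // any move or touch-up ends the long press

    /** Polled every detection interval while the touch continues to indicate the same position. */
    fun onTick(now: Long) {
        if (!stationary) return
        val held = now - downTime
        if (!switchedToScroll && held > t1Ms) { switchedToScroll = true; onModeChange(Mode.SCROLL) }
        if (!actionFired && held > t2Ms) { actionFired = true; onLongPressAction(x, y) }
    }
}
```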
- FIG. 3 is a flowchart for explaining an example of processing when the touch pad is pressed and held in the present embodiment
- FIG. 4 is a diagram for explaining the processing example.
- Assume that, while the display control unit 24 is operating in the pointer mode and the cursor 6 indicating the pointer is placed on the selection target 5, the user starts a long press with the finger 4. At this time, the touch pad control unit 21 detects a touch-down (step S1).
- When the touch event processing unit 22 acquires the touch-down from the touch pad control unit 21, it starts determining whether a touch-up is detected before the time threshold T1 elapses (step S2). When a touch-up is detected before the time threshold T1 elapses, a single tap is detected as the user operation and notified to the application control unit 23.
- The application control unit 23 notified of the single tap determines, based on the position information at the time of the touch-up, whether the pointer is placed on the selection target 5 (step S3); if so, it executes the action associated with selecting the selection target 5 (step S4). The processing then returns to step S1.
- On the other hand, when no touch-up is detected before the time threshold T1 elapses, the touch event processing unit 22 changes the operation of the display control unit 24 from the pointer mode to the scroll mode (step S5).
- the display control unit 24 causes the display unit 14 to display the cursor 7 indicating the scroll mode.
- The touch event processing unit 22 acquires from the application control unit 23 whether the selection target 5 exists at the touch position indicated by the position information (step S6); when it does not exist, the long-press handling processing ends.
- When a move is detected before the time threshold T2 elapses (step S8), the touch event processing unit 22 notifies the display control unit 24 of the scroll operation, and the display control unit 24 executes scrolling of the display area according to the scroll operation (step S9).
- When the time threshold T2 elapses without a touch-up or a move being detected, the touch event processing unit 22 detects a long tap and notifies the application control unit 23 of it.
- The application control unit 23 notified of the long tap executes an action corresponding to the selection target 5 present at the touch position (step S10). For example, as shown in FIG. 4C, a list image (superimposed display image) 8 containing a list of next actions related to the selection target 5 may be superimposed on the display unit 14 for the user to select from.
- The action to be executed is not limited to the display of the list image; any type of image (superimposed display image) related to the selection target 5 may be superimposed on the display unit 14, and other actions, for example starting a drag operation of the selection target 5, may also be executed.
- the touch event processing unit 22 determines whether or not the touch pad control unit 21 detects a move (step S11).
- When the touch pad control unit 21 detects a move, the touch event processing unit 22 notifies the display control unit 24 of a scroll operation.
- the display control unit 24 notified of the scroll operation executes scrolling of the list image 8 (step S12).
- The touch event processing unit 22 determines whether the condition that the touch pad control unit 21 has detected a touch-up and no touch-down is detected for a predetermined time thereafter is satisfied (step S13); unless the condition is satisfied, the processing returns to step S11 and the scrolling of the list image 8 continues. The condition that no touch-down is detected for a predetermined time is included to prevent malfunction during the scroll mode.
- When the condition is satisfied, the touch event processing unit 22 changes the operation of the display control unit 24 from the scroll mode to the pointer mode (step S14). The user can thereby point to and select an item in the list image 8.
- The division of functions between the touch event processing unit 22 and the application control unit 23 is not limited to the above; a function of one may be performed by the other. Further, the touch event processing unit 22 and the application control unit 23 may be configured integrally as an operation control unit.
- In the present embodiment, the time threshold T1 is shorter than the time threshold T2. The display control unit 24 switches to the scroll mode when the touch duration exceeds the time threshold T1, and the touch event processing unit 22 and the application control unit 23 switch the operation of the display control unit 24 from the scroll mode back to the pointer mode according to the type of action executed by the application control unit 23.
- Accordingly, the display control unit operates in the pointer mode after the action, and the user can execute the action and switch to the pointer mode without releasing the finger 4 from the touch pad 16.
- FIG. 5 is a flowchart for explaining an example of processing when the touch pad is pressed long in this embodiment
- FIG. 6 is a diagram for explaining the processing example.
- steps S1 to S10 are executed as in the first embodiment (see (a), (b), (c), and (e) of FIG. 6). Subsequently, the touch event processing unit 22 acquires whether or not the executed action is a display of the list image 8 from the application control unit 23 (step S20).
- When the executed action is the display of the list image 8, the touch event processing unit 22 acquires the size of the list image 8 from the application control unit 23 and determines whether the size exceeds the size of the display area (screen size) of the display unit 14 (step S21).
- If the size exceeds the screen size, steps S11 to S14 are executed as in the first embodiment.
- Otherwise, the touch event processing unit 22 shifts the operation of the display control unit 24 to the pointer mode (step S22). As a result, as shown in FIG. 6D, the user can select an item 8a of the list image 8 without once releasing the finger 4 from the touch pad 16, realizing a simpler operation.
- Even when the action executed by the application control unit 23 is not the display of the list image 8 but, for example, the start of a drag operation of the selection target 5 as shown in FIG. 6, the touch event processing unit 22 shifts the operation of the display control unit 24 to the pointer mode (step S23). Thereby, as shown in FIG. 6F, the user can perform the drag operation of the selection target 5 without once releasing the finger 4 from the touch pad 16, again realizing a simpler operation. When the user releases the finger 4 from the touch pad 16 and the touch pad control unit 21 detects a touch-up, the application control unit 23 may perform a drop operation.
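- The post-action decision of this embodiment (steps S20 to S23) could be summarized in code roughly as follows, with hypothetical types standing in for the application control unit's report of the executed action.

```kotlin
// Sketch of the post-action decision in Embodiment 2 (steps S20 to S23).
// The types and names are illustrative assumptions.
enum class Mode { POINTER, SCROLL }

sealed class ExecutedAction {
    data class ShowListImage(val listHeightPx: Int) : ExecutedAction()
    object StartDrag : ExecutedAction()
}

/** Decide which mode the display control unit should operate in after the long-tap action. */
fun modeAfterAction(action: ExecutedAction, screenHeightPx: Int): Mode = when (action) {
    is ExecutedAction.ShowListImage ->
        // Stay in the scroll mode only if the list image does not fit on the screen (steps S21, S11 to S14).
        if (action.listHeightPx > screenHeightPx) Mode.SCROLL else Mode.POINTER  // step S22
    ExecutedAction.StartDrag -> Mode.POINTER                                      // step S23
}
```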
- In the present embodiment, a time threshold T3 is additionally set, and the time threshold T2 is shorter than the time threshold T3. When the touch duration exceeds the time threshold T2, the touch event processing unit 22 and the application control unit 23 cause the list image 8 to be superimposed on the display area of the display unit 14; when the touch duration exceeds the time threshold T3, the display control unit 24 switches to the scroll mode and scrolls the list image 8 according to changes in the position information. The user can thus have the list image 8 displayed and scroll it without releasing the finger 4 from the touch pad 16.
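- A sketch of this variant, in which the action fires first (at T2) and the switch to the scroll mode follows (at T3), could reuse the same structure as before; the names and callbacks are again illustrative assumptions.

```kotlin
// Sketch of the Embodiment 3 ordering: T2 (superimpose the list image) < T3 (switch to scroll mode).
// Names and callbacks are illustrative assumptions.
class LongPressControllerT2T3(
    private val t2Ms: Long,                                   // display the superimposed list image
    private val t3Ms: Long,                                   // then switch to scroll mode
    private val onShowListImage: (x: Int, y: Int) -> Unit,
    private val onSwitchToScroll: () -> Unit
) {
    init { require(t2Ms < t3Ms) { "T2 must be shorter than T3 in this embodiment" } }

    private var downTime = 0L
    private var x = 0; private var y = 0
    private var stationary = false
    private var listShown = false
    private var scrollMode = false

    fun onTouchDown(px: Int, py: Int, now: Long) {
        downTime = now; x = px; y = py
        stationary = true; listShown = false; scrollMode = false
    }

    fun onMoveOrTouchUp() { stationary = false }

    /** Polled every detection interval while the touch continues to indicate the same position. */
    fun onTick(now: Long) {
        if (!stationary) return
        val held = now - downTime
        if (!listShown && held > t2Ms) { listShown = true; onShowListImage(x, y) }
        if (!scrollMode && held > t3Ms) { scrollMode = true; onSwitchToScroll() }
    }
}
```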
- FIG. 7 is a block diagram showing a main configuration of the mobile phone according to the present embodiment.
- the storage unit 12 stores a time threshold (first threshold) T3 in addition to the time threshold T1 and the time threshold (second threshold) T2.
- FIG. 8 is a flowchart for explaining an example of processing when the touch pad is pressed and held in the present embodiment
- FIG. 9 is a diagram for explaining the processing example.
- First, steps S1 to S4 are executed as in the first embodiment (see FIG. 9A). Subsequently, the touch event processing unit 22 acquires from the display control unit 24 whether the display area (screen) of the display unit 14 is scrollable (step S30). If the display area of the display unit 14 is scrollable, steps S5 to S10 are subsequently executed as in the first embodiment, and then step S22 and the subsequent processing are executed as in the second embodiment.
- If the display area of the display unit 14 is not scrollable, steps S6 to S8 and S10 are executed in the pointer mode without executing step S5 (see FIGS. 9B and 9C).
- When a move is detected before the time threshold T2 elapses in step S8, the display control unit 24 is not operating in the scroll mode, so the process does not proceed to step S9 but returns to step S1 via step S31.
- Subsequently, the touch event processing unit 22 acquires the size of the list image 8 from the application control unit 23 and determines whether the size exceeds the size of the display area (screen size) of the display unit 14 (step S32). At this time, as shown in FIG. 9B, when the size of the list image 8 does not exceed the screen size, the pointer mode is already in effect, so the long-press handling processing is terminated as it is.
- When the size of the list image 8 exceeds the screen size, the touch event processing unit 22 determines whether a touch-up is detected before the time threshold T3 elapses (step S33). If a touch-up is detected before the time threshold T3 elapses, the long-press handling processing is terminated as it is. On the other hand, if no touch-up is detected before the time threshold T3 elapses, the touch event processing unit 22 changes the operation of the display control unit 24 to the scroll mode (step S34), and steps S11 to S14 are executed as in the first embodiment. Thereby, as shown in FIG. 9D, the user can scroll the list image 8 without once releasing the finger 4 from the touch pad 16, realizing a simpler operation.
- The control blocks of the mobile phone 10 may be realized by logic circuits (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software using a CPU (Central Processing Unit).
- In the latter case, the mobile phone 10 includes a CPU that executes the instructions of a program, which is software implementing each function; a ROM (Read Only Memory) or storage device (referred to as a "recording medium") in which the program and various data are recorded so as to be readable by a computer (or the CPU); and a RAM (Random Access Memory) into which the program is loaded.
- As the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used.
- the program may be supplied to the computer via an arbitrary transmission medium (such as a communication network or a broadcast wave) that can transmit the program.
- the present invention can also be realized in the form of a data signal embedded in a carrier wave in which the program is embodied by electronic transmission.
- An information processing apparatus (mobile phone 10) according to Aspect 1 of the present invention includes: a display unit 14; an object detection unit (touch pad 16) having a planar detection region, located at a position different from the display region of the display unit, for detecting an approaching or contacting object; a position information acquisition unit (touch pad control unit 21) that acquires position information indicating the position at which the object detection unit detected the object; a display control unit 24 that, in a first mode (pointer mode), determines the position of a pointer on the display region according to the position information and, in a second mode (scroll mode), scrolls at least a part of the display region according to changes in the position information; and an operation control unit (touch event processing unit 22 and application control unit 23) that switches the operation of the display control unit from the first mode to the second mode when the duration for which the position information continues to indicate the same position exceeds a first threshold (time threshold T1, time threshold T3), and that executes an action according to the position of the pointer when the duration exceeds a second threshold (time threshold T2).
- According to the above configuration, the operation of the display control unit is switched from the first mode to the second mode when the duration exceeds the first threshold, and an action corresponding to the position of the pointer is executed when the duration exceeds the second threshold. Therefore, the user can continuously perform the switch from the first mode to the second mode and a further action without once releasing the object from the object detection unit. A simpler operation can thereby be realized.
- In Aspect 2 of the present invention, in the information processing apparatus according to Aspect 1, the first threshold (time threshold T1) may be shorter than the second threshold (time threshold T2); the operation control unit may display a superimposed display image (list image 8) in the display area when the duration exceeds the second threshold; and the display control unit may scroll the superimposed display image according to changes in the position information when the superimposed display image is displayed in the second mode, to which it switched when the duration exceeded the first threshold.
- According to the above configuration, after the operation of the display control unit is switched from the first mode to the second mode, when the duration exceeds the second threshold, the superimposed display image is displayed in the display area and is scrolled according to changes in the position information.
- In Aspect 3 of the present invention, in the information processing apparatus according to Aspect 1 or 2, the first threshold (time threshold T1) may be shorter than the second threshold (time threshold T2), and the operation control unit may switch the operation of the display control unit from the second mode to the first mode when the duration exceeds the second threshold.
- According to the above configuration, the operation of the display control unit is automatically switched from the second mode back to the first mode, so that the display control unit operates in the first mode after the action. The user can thus continuously execute the action and switch from the second mode to the first mode without once releasing the object from the object detection unit. A simpler operation can thereby be realized.
- In Aspect 4 of the present invention, in the information processing apparatus according to Aspect 1, the second threshold (time threshold T2) may be shorter than the first threshold (time threshold T3); the operation control unit may display the superimposed display image (list image 8) in the display area when the duration exceeds the second threshold; and the display control unit may switch to the second mode when the duration exceeds the first threshold and scroll the superimposed display image according to changes in the position information.
- According to the above configuration, the superimposed display image is displayed, the operation of the display control unit is switched from the first mode to the second mode, and the superimposed display image is scrolled according to changes in the position information. Accordingly, the user can continuously perform the display of the superimposed display image, the switch from the first mode to the second mode, and the scrolling of the superimposed display image without once releasing the object from the object detection unit. A simpler operation can thereby be realized.
- A method according to Aspect 5 of the present invention is a method of controlling an information processing apparatus that includes a display unit 14, an object detection unit (touch pad 16) having a planar detection region, located at a position different from the display region of the display unit, for detecting an approaching or contacting object, and a position information acquisition unit (touch pad control unit 21) that acquires position information indicating the position at which the object detection unit detected the object, the method including: a display control step of, in a first mode (pointer mode), determining the position of a pointer on the display region according to the position information and, in a second mode (scroll mode), scrolling at least a part of the display region according to changes in the position information; and an operation control step of switching the operation of the display control step from the first mode to the second mode when the duration for which the position information continues to indicate the same position exceeds a first threshold, and executing an action according to the position of the pointer when the duration exceeds a second threshold.
- The information processing apparatus according to each aspect of the present invention may be realized by a computer. In this case, a control program for the information processing apparatus that causes the computer to operate as each unit (software element: the display control unit and the operation control unit) of the information processing apparatus, thereby realizing the information processing apparatus on the computer, and a computer-readable recording medium on which the program is recorded also fall within the scope of the present invention.
- the present invention can be generally used for an information processing apparatus including a display unit and an object detection unit.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Hereinafter, an embodiment of the present invention will be described in detail.
A mobile phone (information processing apparatus) according to the present embodiment will be described with reference to FIG. 2. FIG. 2 is a diagram showing the appearance of the mobile phone 10 according to the present embodiment.
As described above, the mobile phone 10 has two operation units (input units): the hard keys 15 and the touch pad 16. To facilitate operation via the touch pad 16 (and to prevent erroneous operation), the mobile phone 10 has three modes: a "key operation mode", a "pointer mode" (first mode), and a "scroll mode" (second mode).
Next, the configuration and functions of the mobile phone 10 will be described in detail with reference to FIG. 1. FIG. 1 is a block diagram showing the main configuration of the mobile phone 10 according to the present embodiment.
Next, the processing performed when the user presses and holds the touch pad 16 (long-press handling processing) will be described.
Another embodiment of the present invention is described below. For convenience of description, members having the same functions as those described in the above embodiment, and steps performing the same processing, are given the same reference signs, and their description is omitted.
Still another embodiment of the present invention is described below. For convenience of description, members having the same functions as those described in the above embodiments, and steps performing the same processing, are given the same reference signs, and their description is omitted.
The control blocks of the mobile phone 10 (in particular, the touch pad control unit 21, the touch event processing unit 22, the application control unit 23, and the display control unit 24 of the main control unit 11) may be realized by logic circuits (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software using a CPU (Central Processing Unit).
An information processing apparatus (mobile phone 10) according to Aspect 1 of the present invention includes: a display unit 14; an object detection unit (touch pad 16) having a planar detection region, located at a position different from the display region of the display unit, for detecting an approaching or contacting object; a position information acquisition unit (touch pad control unit 21) that acquires position information indicating the position at which the object detection unit detected the object; a display control unit 24 that, in a first mode (pointer mode), determines the position of a pointer on the display region according to the position information and, in a second mode (scroll mode), scrolls at least a part of the display region according to changes in the position information; and an operation control unit (touch event processing unit 22 and application control unit 23) that switches the operation of the display control unit from the first mode to the second mode when the duration for which the position information continues to indicate the same position exceeds a first threshold (time threshold T1, time threshold T3), and that executes an action according to the position of the pointer when the duration exceeds a second threshold (time threshold T2).
5 Selection target
6, 7 Cursor
8 List image (superimposed display image)
10 Mobile phone (information processing apparatus)
14 Display unit
16 Touch pad (object detection unit)
21 Touch pad control unit (position information acquisition unit)
22 Touch event processing unit (operation control unit)
23 Application control unit (operation control unit)
24 Display control unit
T1 Time threshold (first threshold)
T2 Time threshold (second threshold)
T3 Time threshold (first threshold)
Claims (5)
- 1. An information processing apparatus comprising: a display unit; an object detection unit having a planar detection region, located at a position different from a display region of the display unit, for detecting an approaching or contacting object; a position information acquisition unit that acquires position information indicating a position at which the object detection unit detected the object; a display control unit that, in a first mode, determines a position of a pointer on the display region according to the position information and, in a second mode, scrolls at least a part of the display region according to changes in the position information; and an operation control unit that switches the operation of the display control unit from the first mode to the second mode when a duration for which the position information continues to indicate the same position exceeds a first threshold, and that executes an action according to the position of the pointer when the duration exceeds a second threshold.
- 2. The information processing apparatus according to claim 1, wherein the first threshold is shorter than the second threshold, the operation control unit causes a superimposed display image to be displayed in the display region when the duration exceeds the second threshold, and the display control unit scrolls the superimposed display image according to changes in the position information when the superimposed display image is displayed in the second mode, to which the operation was switched when the duration exceeded the first threshold.
- 3. The information processing apparatus according to claim 1 or 2, wherein the first threshold is shorter than the second threshold, and the operation control unit switches the operation of the display control unit from the second mode to the first mode when the duration exceeds the second threshold.
- 4. The information processing apparatus according to claim 1, wherein the second threshold is shorter than the first threshold, the operation control unit causes a superimposed display image to be displayed in the display region when the duration exceeds the second threshold, and the display control unit switches to the second mode when the duration exceeds the first threshold and scrolls the superimposed display image according to changes in the position information.
- 5. A method of controlling an information processing apparatus that includes a display unit, an object detection unit having a planar detection region, located at a position different from a display region of the display unit, for detecting an approaching or contacting object, and a position information acquisition unit that acquires position information indicating a position at which the object detection unit detected the object, the method comprising: a display control step of, in a first mode, determining a position of a pointer on the display region according to the position information and, in a second mode, scrolling at least a part of the display region according to changes in the position information; and an operation control step of switching the operation of the display control step from the first mode to the second mode when a duration for which the position information continues to indicate the same position exceeds a first threshold, and executing an action according to the position of the pointer when the duration exceeds a second threshold.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/535,418 US20170351394A1 (en) | 2015-01-15 | 2016-01-12 | Information processing apparatus and method of controlling same |
JP2016569364A JPWO2016114269A1 (ja) | 2015-01-15 | 2016-01-12 | 情報処理装置およびその制御方法 |
CN201680004507.1A CN107111440A (zh) | 2015-01-15 | 2016-01-12 | 信息处理装置及其控制方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015006037 | 2015-01-15 | ||
JP2015-006037 | 2015-01-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016114269A1 true WO2016114269A1 (ja) | 2016-07-21 |
Family
ID=56405812
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/050734 WO2016114269A1 (ja) | 2015-01-15 | 2016-01-12 | 情報処理装置およびその制御方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20170351394A1 (ja) |
JP (1) | JPWO2016114269A1 (ja) |
CN (1) | CN107111440A (ja) |
WO (1) | WO2016114269A1 (ja) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017195472A1 (ja) * | 2016-05-11 | 2017-11-16 | シャープ株式会社 | 情報処理装置、情報処理装置の制御方法、および制御プログラム |
US11686493B2 (en) * | 2019-12-04 | 2023-06-27 | Ademco Inc. | Digital HVAC controller for navigating information based on two or more inputs |
US11280512B2 (en) | 2019-12-04 | 2022-03-22 | Ademco Inc. | Digital HVAC controller with carousel screens |
JP2023184295A (ja) * | 2022-06-17 | 2023-12-28 | カシオ計算機株式会社 | 電子楽器、電子楽器の操作状況報知方法及びプログラム |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7673111B2 (en) * | 2005-12-23 | 2010-03-02 | Intel Corporation | Memory system with both single and consolidated commands |
JP4840413B2 (ja) * | 2008-07-02 | 2011-12-21 | ソニー株式会社 | 情報表示方法、情報処理装置および情報表示用プログラム |
JP5136675B2 (ja) * | 2011-06-09 | 2013-02-06 | ソニー株式会社 | ポインタ表示装置、ポインタ表示検出方法及び情報機器 |
JP2013257641A (ja) * | 2012-06-11 | 2013-12-26 | Fujitsu Ltd | 情報端末装置及び表示制御方法 |
TW201409425A (zh) * | 2012-08-22 | 2014-03-01 | Fu Zhi Technology Co Ltd | 教學系統 |
-
2016
- 2016-01-12 JP JP2016569364A patent/JPWO2016114269A1/ja active Pending
- 2016-01-12 US US15/535,418 patent/US20170351394A1/en not_active Abandoned
- 2016-01-12 CN CN201680004507.1A patent/CN107111440A/zh active Pending
- 2016-01-12 WO PCT/JP2016/050734 patent/WO2016114269A1/ja active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009245239A (ja) * | 2008-03-31 | 2009-10-22 | Sony Corp | ポインタ表示装置、ポインタ表示検出方法、ポインタ表示検出プログラム及び情報機器 |
JP2012203484A (ja) * | 2011-03-24 | 2012-10-22 | Sony Corp | 情報処理装置、情報処理方法及びプログラム |
JP2013222214A (ja) * | 2012-04-12 | 2013-10-28 | Denso Corp | 表示操作装置および表示システム |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11654353B2 (en) | 2016-03-04 | 2023-05-23 | Sony Interactive Entertainment Inc. | Control apparatus and control program |
US11198060B2 (en) | 2016-03-04 | 2021-12-14 | Sony Interactive Entertainment Inc. | Control apparatus and control program |
US11596858B2 (en) | 2016-07-21 | 2023-03-07 | Sony Interactive Entertainment Inc. | Operating device and control system |
US11344797B2 (en) | 2016-07-26 | 2022-05-31 | Sony Interactive Entertainment Inc. | Information processing system, operation device, and operation device control method with multi-mode haptic feedback |
US11524226B2 (en) | 2016-07-26 | 2022-12-13 | Sony Interactive Entertainment Inc. | Operation device and method for controlling the same |
JP2020166886A (ja) * | 2016-07-26 | 2020-10-08 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理システム、操作デバイス、及び操作デバイスの制御方法 |
US11980810B2 (en) | 2016-07-26 | 2024-05-14 | Sony Interactive Entertainment Inc. | Information processing system, operation device, and operation device control method |
US12109487B2 (en) | 2016-07-26 | 2024-10-08 | Sony Interactive Entertainment Inc. | Operation device and method for controlling the same |
JP7001368B2 (ja) | 2017-02-23 | 2022-01-19 | 株式会社東海理化電機製作所 | 操作装置 |
US11173393B2 (en) | 2017-09-29 | 2021-11-16 | Sony Interactive Entertainment Inc. | Operation device and control apparatus therefor |
US11511185B2 (en) | 2017-10-27 | 2022-11-29 | Sony Interactive Entertainment Inc. | Operation device |
KR20210118938A (ko) * | 2019-04-10 | 2021-10-01 | 광저우 스위엔 일렉트로닉스 코., 엘티디. | 터치 조작 모드의 제어 방법, 장치, 설비 및 저장매체 |
KR102684576B1 (ko) | 2019-04-10 | 2024-07-11 | 광저우 스위엔 일렉트로닉스 코., 엘티디. | 터치 조작 모드의 제어 방법, 장치, 설비 및 저장매체 |
Also Published As
Publication number | Publication date |
---|---|
US20170351394A1 (en) | 2017-12-07 |
CN107111440A (zh) | 2017-08-29 |
JPWO2016114269A1 (ja) | 2017-09-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2016114269A1 (ja) | 情報処理装置およびその制御方法 | |
JP5970086B2 (ja) | タッチスクリーンホバリング入力処理 | |
AU2007342094B2 (en) | Back-side interface for hand-held devices | |
AU2011282997B2 (en) | Motion continuation of touch input | |
US20170038906A1 (en) | Information processing device, operation input method and operation input program | |
US20100182264A1 (en) | Mobile Device Equipped With Touch Screen | |
EP2416233A1 (en) | Information processing apparatus, information processing method, and computer program | |
WO2012089921A1 (en) | Method and apparatus for controlling a zoom function | |
CN101379461A (zh) | 具有多重触摸输入的便携式电子设备 | |
WO2013093205A1 (en) | Apparatus and method for providing transitions between screens | |
WO2014002633A1 (ja) | 処理装置、動作制御方法及びプログラム | |
JPWO2014141548A1 (ja) | 表示制御 | |
JP6027703B1 (ja) | 情報処理装置および制御プログラム | |
JP6114886B2 (ja) | 情報処理装置、情報処理装置の制御方法および制御プログラム | |
JP6077573B2 (ja) | 情報処理装置、その制御方法、および制御プログラム | |
JP6367720B2 (ja) | 情報処理装置およびプログラム | |
JP6134748B2 (ja) | 情報処理装置、その制御方法、および制御プログラム | |
JP6118005B1 (ja) | 情報処理装置および制御プログラム | |
JP2016130976A (ja) | 情報処理装置、情報処理装置の制御方法、および制御プログラム | |
JP7196246B2 (ja) | ユーザインターフェース処理プログラム、記録媒体、ユーザインターフェース処理方法 | |
WO2015114938A1 (ja) | 情報処理装置、情報処理方法、及びプログラム | |
JP2016218859A (ja) | 情報処理装置、制御方法、および制御プログラム | |
JP6603797B2 (ja) | 情報処理装置、情報処理装置の制御方法、および制御プログラム | |
WO2015015731A1 (ja) | 画像表示装置、画像表示方法及び画像表示プログラム製品 | |
JP2018180917A (ja) | 電子機器、電子機器の制御方法、および電子機器の制御プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16737340 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016569364 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15535418 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16737340 Country of ref document: EP Kind code of ref document: A1 |