WO2011024521A1 - Information processing apparatus, information processing method and program - Google Patents
Information processing apparatus, information processing method and program
- Publication number
- WO2011024521A1 (PCT/JP2010/058405, JP2010058405W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- display screen
- pressing force
- operating body
- information processing
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/044—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3664—Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1637—Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04105—Pressure sensors for measuring the pressure or force exerted on the touch surface without providing the touch position
Definitions
- the present invention relates to an information processing apparatus, an information processing method, and a program.
- The display device includes a touch screen (touch panel) in which a capacitive or resistive touch sensor is provided on the surface of a display unit such as a liquid crystal display. With the touch screen, input can be given to the information processing apparatus by touching the display screen, which allows the user to handle the information processing apparatus easily.
- A technique for improving the usability of input via a touch screen has been disclosed (for example, Patent Document 1).
- The pressing force of the operating body contacting the touch screen is detected, and the detected pressure data are associated with processes related to operations such as cursor movement and double-clicking. This makes it possible to increase the number of operation-related processes available in an information processing apparatus including a touch screen.
- The music data of the portable information terminal has a four-level hierarchical structure: artist, album, music, and playback screen.
- When music data having such a hierarchical structure is reproduced by an information processing apparatus having a touch screen, desired music is selected by touching objects on the display screen in the order of artist, album, and music, after which the reproduction screen is displayed.
- In order to return to a higher hierarchy and reselect a target in that hierarchy, it is necessary to touch “return” or the like on the display screen a plurality of times.
- The present invention has been made in view of the above problems, and an object of the present invention is to provide a new and improved information processing apparatus, information processing method, and program capable of shortening operation steps by switching the display screen and selecting a target in accordance with the pressing force of the operating body.
- According to an aspect of the present invention, there is provided an information processing apparatus including: a detection unit that detects a contact operation on a contact operation surface by an operating body and the pressing force of the operating body; a display switching unit that switches the display of the display screen according to the pressing force of the operating body detected by the detection unit; and an execution unit that executes a predetermined process according to a predetermined operation when the detection unit detects the predetermined operation on the contact operation surface of the operating body while the display switching unit is switching the display screen.
- The display switching unit may switch the display of the display screen to the display corresponding to each of a plurality of hierarchies according to the pressing force of the operating body detected by the detection unit, and the execution unit may, while the display switching unit has switched the display screen to the display of one of the plurality of hierarchies, execute processing in that hierarchy according to another operation of the operating body when the other operation is detected by the detection unit.
- The display switching unit may switch the display to the display screen of the hierarchy below the one hierarchy displayed on the display screen when the pressing force of the operating body detected by the detection unit increases, and may switch the display to the display screen of the hierarchy above the one hierarchy displayed on the display screen when the detected pressing force decreases.
- The display switching unit may display on the display screen a map display including a hierarchical display indicating one or more hierarchy levels, with the content of the hierarchical display determined according to the contact operation of the operating tool detected by the detection unit.
- the display switching unit may switch the display content of the hierarchical display included in the map display displayed on the display screen according to the pressing force of the operating body detected by the detection unit.
- the display switching unit may switch the display while increasing the transparency of the display screen before switching according to the pressing force of the operating tool detected by the detection unit.
- the display switching unit may switch the display while reducing the display size of the display screen before switching the display screen according to the pressing force of the operating tool detected by the detection unit.
- a predetermined process may be executed according to the contact operation of the operating body.
- The execution unit may execute a predetermined process in accordance with the contact operation of the operating body when the pressing force of the operating body detected by the detection unit exceeds a predetermined threshold value within a predetermined time while the display switching unit is switching the display screen.
- The detection unit may detect the tilt of the casing, and the execution unit may execute a predetermined process according to the tilt of the casing when the detection unit detects the tilt while the display switching unit is switching the display screen.
- the display switching unit may change the display color of the position corresponding to the contact location on the contact operation surface according to the pressing force of the contact operation surface by the operating body detected by the detection unit.
- According to another aspect of the present invention, there is provided an information processing method including the steps of: detecting a contact operation on a contact operation surface by an operating tool and the pressing force of the operating tool; switching the display of the display screen according to the detected pressing force; and executing a predetermined process according to a predetermined operation when the predetermined operation on the contact operation surface is detected while the display of the display screen is being switched.
- According to another aspect of the present invention, there is provided a program for causing a computer to function as an information processing apparatus including: a detection unit that detects a contact operation on a contact operation surface by an operating body and the pressing force of the operating body; a display switching unit that switches the display screen according to the pressing force detected by the detection unit; and an execution unit that executes a predetermined process according to a predetermined operation when the detection unit detects the predetermined operation on the contact operation surface while the display screen is being switched by the display switching unit.
- The accompanying drawings include: an explanatory diagram illustrating the hardware configuration of the information processing apparatus according to the embodiment; a block diagram showing the functional configuration of the information processing apparatus according to the embodiment; an explanatory diagram explaining switching of the display screen according to the embodiment; explanatory diagrams explaining the relationship between the pressing force and time according to the present embodiment; and an explanatory diagram explaining processing according to the inclination of the housing.
- Display devices include a touch screen (touch panel) in which a capacitive or resistive touch sensor is provided on the surface of a display unit such as a liquid crystal display. With the touch screen, input can be given to the information processing apparatus by touching the display screen, which allows the user to handle the information processing apparatus easily.
- The pressing force of the operating body that touches the touch screen is detected, and the detected pressure data are associated with processes related to operations such as cursor movement and double-clicking. This makes it possible to increase the number of operation-related processes available in an information processing apparatus including a touch screen.
- The music data of the portable information terminal has a four-level hierarchical structure: an artist selection screen 501, an album selection screen 502, a music selection screen 503, and a playback screen 504.
- The information processing apparatus 100 according to the embodiment of the present invention has been created in view of the above circumstances. According to the information processing apparatus 100 of the present embodiment, operation steps can be shortened by switching the display screen and selecting a target according to the pressing force of the operating body.
- The information processing apparatus 100 will be described by exemplifying a small audio player, media player, PDA (personal digital assistant), mobile phone, or the like as shown in FIG. 1, but the apparatus is not limited to these examples; it can also be applied to a personal computer or the like.
- Although the information processing apparatus 100 will be described as a device integrated with a display apparatus such as a display, it is not limited to this example; the information processing apparatus 100 and the display apparatus may be configured as separate devices.
- FIG. 1 is an explanatory diagram illustrating a configuration of a display device of the information processing apparatus 100.
- the information processing apparatus 100 according to the present embodiment is an apparatus that can input information by touching or pressing a display screen of a display device that displays information with an operating tool.
- A user of the information processing apparatus 100 can select or determine a target, such as an icon or a character key displayed on the display device, by bringing the operating tool into contact with that target.
- The input display unit of the information processing apparatus 100 is configured by stacking a sheet-like pressure-sensitive sensor 106 and an electrostatic touch panel 105 on the display screen side of the display device 104.
- the electrostatic touch panel 105 has a function of detecting contact of the operating body with the display screen.
- The electrostatic touch panel 105 includes electrostatic sensors arranged in a lattice pattern, and the value of each sensor changes continuously according to changes in capacitance. When a finger serving as the operating body approaches or touches an electrostatic sensor, the capacitance detected by that sensor increases. The capacitances of all the electrostatic sensors can be acquired simultaneously, and by detecting and interpolating the capacitance changes of all the sensors at the same time, the shape of a finger that is in proximity or in contact can be detected.
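The lattice-scan logic described above can be sketched as follows. The contact threshold and the use of a weighted centroid for interpolation are illustrative assumptions; the specification gives no concrete values or algorithm.

```python
# Sketch of contact detection on a lattice of capacitance sensors.
# CONTACT_THRESHOLD and the centroid interpolation are assumptions,
# not values from the patent text.

CONTACT_THRESHOLD = 50  # capacitance units above which a cell counts as touched


def detect_contact(grid):
    """Return the interpolated (row, col) centroid of touched cells, or None."""
    touched = [
        (r, c, value)
        for r, row in enumerate(grid)
        for c, value in enumerate(row)
        if value >= CONTACT_THRESHOLD
    ]
    if not touched:
        return None
    total = sum(v for _, _, v in touched)
    row = sum(r * v for r, _, v in touched) / total
    col = sum(c * v for _, c, v in touched) / total
    return (row, col)
```

For example, two equally pressed neighboring cells yield a centroid midway between them, which is how a sub-cell touch position can be recovered from a coarse sensor grid.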
- the electrostatic touch panel 105 outputs the detected capacitance value to a CPU (Central Processing Unit; reference numeral 101 in FIG. 2).
- the pressure-sensitive sensor 106 has a function of detecting pressure for pressing the display screen.
- the pressure-sensitive sensor 106 for example, a resistive film pressure-sensitive sensor that forms an electrode surface with two sheet panels and detects the position by detecting energization of the pressed portion can be used.
- the pressure-sensitive sensor 106 is also provided with a plurality of detection points in the sheet for detecting the pressed position. The energization at each detection point can be detected simultaneously.
- the pressure-sensitive sensor 106 outputs the pressure that presses the display screen, detected at each detection point, to the CPU.
- The CPU associates the various information input from the electrostatic touch panel 105 and the pressure-sensitive sensor 106 with the display positions of the display content displayed on the display device 104, and analyzes the movement of the operating body. The CPU then recognizes the input information given to the information processing apparatus 100 from the analyzed movement of the operating tool, and executes processing corresponding to the input information. Thus, the user can enter input information by operating the contents displayed on the display screen. Note that when the operating tool is brought into contact with or pressed against the display screen of the display device 104, the operating tool actually touches the surface of the electrostatic touch panel 105, not the display screen of the display device 104 itself. Even so, the following description may refer to this as "bringing the operating body into contact with the display screen of the display device 104".
- The information processing apparatus 100 includes a CPU 101, a RAM (Random Access Memory) 102, a nonvolatile memory 103, a display device 104, an electrostatic touch panel 105, and a pressure-sensitive sensor 106.
- the CPU 101 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the information processing apparatus 100 according to various programs. Further, the CPU 101 may be a microprocessor.
- the RAM 102 temporarily stores programs used in the execution of the CPU 101, parameters that change as appropriate during the execution, and the like. These are connected to each other by a host bus including a CPU bus.
- the nonvolatile memory 103 stores programs used by the CPU 101, calculation parameters, and the like.
- A ROM (Read Only Memory), a flash memory, or the like can be used as the nonvolatile memory 103.
- the display device 104 is an example of an output device that outputs information.
- For example, a liquid crystal display (LCD) device, an OLED (Organic Light Emitting Diode) device, or the like can be used.
- The electrostatic touch panel 105 is an example of an input device through which a user inputs information, and includes an input control circuit that generates an input signal based on the user's input and outputs it to the CPU 101.
- the pressure sensor 106 is an example of an input device through which a user inputs information.
- the electrostatic touch panel 105 and the pressure sensor 106 can be configured as described above.
- the user can input various data or instruct a processing operation to the information processing apparatus 100 by operating the electrostatic touch panel 105 and the pressure sensor 106.
- the contact operation on the contact operation surface is detected by the electrostatic touch panel 105, and the pressing force on the contact operation surface is detected by the pressure sensor 106.
- The CPU 101 includes a detection unit 112, a display switching unit 114, an execution unit 116, and the like.
- The detection unit 112 has a function of detecting a contact operation on the contact operation surface (display screen) by the operating body, input via the electrostatic touch panel 105. As described above, when the operating body contacts the display screen, the capacitance detected by the electrostatic touch panel 105 increases. When the capacitance value detected by the electrostatic touch panel 105 exceeds a predetermined value, the detection unit 112 can detect that the operating tool has touched the display screen. The detection unit 112 determines whether or not the operating tool has touched the display screen, and if it determines that the operating tool has touched, provides the detected capacitance value as a detection result to the display switching unit 114 and the execution unit 116.
- the detection unit 112 has a function of detecting the pressing force of the contact operation surface by the operating body. As described above, the pressure sensor 106 detects an electrical signal corresponding to the magnitude of pressure. The detection unit 112 determines whether or not the operating body has pressed the display screen based on the electrical signal detected by the pressure sensor 106, and when it is determined that the operating body has pressed, the detected pressing force is detected. The detection result is provided to the display switching unit 114 and the execution unit 116.
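Taken together, the two detection paths above amount to thresholding each sensor stream and forwarding the results to the display switching unit 114 and the execution unit 116. A minimal sketch, in which the threshold values and the listener interface are assumptions made for illustration:

```python
# Sketch of the detection unit gating raw sensor values before notifying
# the display switching unit and execution unit. The thresholds and the
# on_contact/on_press listener interface are illustrative assumptions.

TOUCH_CAPACITANCE_THRESHOLD = 50  # assumed capacitance cutoff
PRESS_FORCE_THRESHOLD = 0.5       # assumed force cutoff, newtons


class DetectionUnit:
    def __init__(self):
        self.listeners = []  # e.g. the display switching unit and execution unit

    def add_listener(self, listener):
        self.listeners.append(listener)

    def on_sensor_input(self, capacitance, force):
        # Contact is reported only when capacitance exceeds the cutoff.
        if capacitance >= TOUCH_CAPACITANCE_THRESHOLD:
            for listener in self.listeners:
                listener.on_contact(capacitance)
        # Pressing is reported only when force exceeds the cutoff.
        if force >= PRESS_FORCE_THRESHOLD:
            for listener in self.listeners:
                listener.on_press(force)
```

The point of the design is that downstream units never see raw sensor noise, only discrete "touched" and "pressed" events with their measured values attached.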
- the detection unit 112 may acquire the inclination of the housing of the information processing apparatus 100 detected by an inclination sensor (not shown).
- the tilt sensor may be any sensor that can detect the tilt of the housing, such as a gyro sensor, a mechanical type, or an optical four-direction detection sensor.
- the gyro sensor can detect the inclination of the housing by measuring the angular velocity.
- the mechanical and optical four-direction detection sensors are devices capable of detecting four directions using infrared LEDs and transistors, and can detect the inclination and orientation of the housing.
- the display switching unit 114 has a function of switching the display screen according to the pressing force of the operating body detected by the detection unit 112.
- switching of the display screen by the display switching unit 114 will be described with reference to FIG.
- FIG. 4 is an explanatory diagram for explaining the relationship between the screen example of the display screen switched by the display switching unit 114 and the pressing force.
- music data will be described as an example of data displayed on the display screen.
- the music data has a hierarchical structure, and has a three-layer structure from an upper hierarchy to a lower hierarchy.
- The highest hierarchy is the artist, and the hierarchy below the artist is the album. Further, the hierarchy below the album is the music.
- the display screen for playing music data has a four-layer structure.
- the display at the highest level is the artist list
- the display at the lower level of the artist list is the album list.
- the lower-level display of the album list is a music list.
- the list in the lower hierarchy of the music list is a reproduction display of the music. Normally, when reproducing music data having a hierarchical structure, first, an artist to be reproduced is selected from the artist list, and then an album to be reproduced is selected from the album list of the selected artist. Further, when a song to be reproduced is selected from the song list of the selected album, a reproduction screen for the selected song is displayed.
- the graph 210 in FIG. 4 shows that the pressing force of the operating body increases with time.
- the pressing force of the operating body increases from 0 to F1 and F2, and further increases to F2 or more.
- While the pressing force of the operating body on the display screen is between 0 and F1, the screen example 201 is displayed on the display screen.
- the screen example 201 is, for example, a music data playback screen.
- While the pressing force of the operating body on the display screen is F1 or more and less than F2, the screen example 202 is displayed on the display screen.
- the screen example 202 is, for example, a music list screen of music data.
- When the pressing force is F2 or more, the screen example 203 is displayed on the display screen.
- the screen example 203 is, for example, a music data album list screen.
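The screen selection of FIG. 4 reduces to a threshold lookup on the current pressing force. A sketch, with F1 and F2 as placeholder values, since the specification does not state concrete forces:

```python
# Sketch of FIG. 4: choosing which hierarchy screen to display from the
# current pressing force. F1 and F2 are placeholder thresholds.

F1 = 2.0  # newtons (placeholder)
F2 = 5.0  # newtons (placeholder)

# Screens ordered from the lowest hierarchy (playback) upward,
# matching screen examples 201-203 in the text.
SCREENS = ["playback screen (201)", "music list (202)", "album list (203)"]


def screen_for_force(force):
    if force < F1:
        return SCREENS[0]
    if force < F2:
        return SCREENS[1]
    return SCREENS[2]
```

Pressing harder thus walks upward through the hierarchy, and relaxing the press walks back down, without any explicit "return" touches.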
- According to the display switching unit 114, when the display screen is pressed by the operating body, it becomes possible to move between hierarchies according to the degree of pressing. For example, when the pressing force increases, the display switching unit 114 switches to the upper hierarchy, and when the pressing force decreases, it switches to the lower hierarchy.
- When the pressing force of the operating body increases, the display switching unit 114 transitions the screen from the music playback screen to the upper-level display screens: the music list, the album list, and then the artist list.
- the display switching unit 114 transitions the screen from the artist list to the original music reproduction screen.
- the display switching unit 114 increases the transparency of the display indicating the current hierarchy according to the pressing force, and displays the upper hierarchy or the lower hierarchy of the current hierarchy.
- the display switching unit 114 may change the display size indicating the current hierarchy in accordance with the pressing force when the screen is changed. For example, the display indicating the current hierarchy may be gradually reduced to display the upper hierarchy or the lower hierarchy of the current hierarchy.
- the display switching unit 114 may change the display color of the position corresponding to the contact location on the contact operation surface in accordance with the pressing force of the contact operation surface detected by the detection unit 112. For example, as illustrated in FIG. 4, when the display screen is pressed by the operating body, the contact portion 204 of the operating body is displayed in a circular shape on the display screen. The contact location 204 changes according to the pressing force of the operating body. For example, when the pressing force is 3N or less, the contact location 204 is displayed in blue, and when the pressing force is greater than 3N and 7N or less, the contact location 204 is displayed in yellow. When the pressing force is greater than 7N, the contact location 204 is displayed in red.
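The color feedback just described uses the concrete thresholds given in the text (3 N and 7 N), so the mapping can be written down directly; the function name itself is of course an assumption:

```python
# Color of the contact location 204 as a function of pressing force,
# using the thresholds stated in the text: blue up to 3 N, yellow up
# to 7 N, red above 7 N.

def contact_color(force_newtons):
    if force_newtons <= 3.0:
        return "blue"
    if force_newtons <= 7.0:
        return "yellow"
    return "red"
```

Repainting the contact location with this color on every pressure sample gives the user continuous visual feedback on how far the current press will move through the hierarchy.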
- the user can visually recognize how much the layer changes when the display screen is pressed. Therefore, the user can more intuitively change the display screen to the upper hierarchy or the lower hierarchy or select a target.
- whether or not the display screen is switched by pressing the display screen may be displayed. For example, when the operating tool touches the display screen, a message indicating that the display screen can be switched by pressing force may be displayed superimposed on the displayed display screen.
- the execution unit 116 has a function of executing a predetermined process in accordance with a predetermined operation when the detection unit 112 detects the predetermined operation on the contact operation surface by the operating body while the display switching unit 114 is switching the display screen.
- the predetermined operation on the contact surface is, for example, another operation different from the pressing of the operating body, and includes, for example, contact (touch) on the display screen.
- the user presses the display screen with the right hand to change the level of the display screen, and touches the selection target with the left hand while the level to be selected is displayed.
- the predetermined process executed by the execution unit 116 is a hierarchy determination operation, selection of a target in the determined hierarchy, or the like.
- the execution unit 116 executes a process instructed by the user according to a predetermined operation performed while the hierarchy of the display screen is changed according to the pressing force of the user.
- the execution unit 116 may execute a predetermined process in accordance with the contact operation of the operating body when the pressing force of the operating body detected by the detection unit 112 exceeds a predetermined threshold within a predetermined time while the display of the display screen is being switched by the display switching unit 114.
- FIG. 5 is an explanatory diagram for explaining the relationship between the pressing force F [N] and the time t [sec].
- the pressing force increases with time and changes abruptly within the time t1. That is, the user performs an operation of strongly pressing the display screen in a short time within the time t1.
- the execution unit 116 fixes the hierarchy displayed within the time t1 and selects the target of the hierarchy touched by the operating body.
- the pressing force of the operating body increases with time, and then changes abruptly within the time t2.
- the execution unit 116 may execute a predetermined process in this case as well. The user gradually presses the display screen and then suddenly stops pressing it within the time t2; in other words, the operating body that had been pressing is rapidly removed from the display screen. In this case, the execution unit 116 fixes the hierarchy displayed within the time t2 and selects the target of the hierarchy touched by the operating body.
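Both gestures of FIGS. 5 and 6 amount to detecting an abrupt change of pressing force inside a short time window: a sudden press within t1, or a sudden release within t2. A sketch under assumed thresholds (the 0.2 s window and 4 N delta are illustrative values, not from the specification):

```python
def detect_abrupt_change(samples, window_s=0.2, threshold_n=4.0):
    """samples: list of (timestamp_s, force_n) pairs ordered by time.
    Return True if the pressing force changes by more than threshold_n
    within any window_s interval, covering both a sudden press and a
    sudden release of the operating body."""
    for i, (t0, f0) in enumerate(samples):
        for t1, f1 in samples[i + 1:]:
            if t1 - t0 > window_s:
                break  # later samples are outside this window
            if abs(f1 - f0) > threshold_n:
                return True
    return False
```

The execution unit would fix the currently displayed hierarchy at the moment this returns True and then carry out the selection.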
- the execution unit 116 may execute a predetermined process in accordance with the inclination of the housing when the detection unit 112 detects the inclination of the housing while the display switching unit 114 is switching the display screen.
- a case where a predetermined process is executed by the execution unit 116 in accordance with the inclination of the housing will now be described.
- 7 and 8 are explanatory diagrams for explaining a case where predetermined processing is executed by the execution unit 116 in accordance with the inclination of the housing.
- the user holds the information processing apparatus 100 as shown in the explanatory diagram 220 of FIG.
- the user's thumb, which is the operating body, is in contact with the display screen.
- the user presses the display screen to make a transition to an upper hierarchy, or gradually moves the operating tool away from the display screen to make a transition to a lower hierarchy.
- the music data described above will be described as an example.
- the user presses the display screen on which the music playback screen is displayed, and transitions to the album list or artist list screen.
- the execution unit 116 may fix the level displayed when the housing is tilted and scroll the display screen in the direction in which the housing is tilted (rightward).
- the execution unit 116 may fix the level displayed when the housing is tilted and further scroll the display screen in the direction in which the housing is tilted (leftward). Further, it is assumed that the housing is tilted upward (or downward) as shown in the explanatory diagram 223 while the display screen is being switched. In this case, the execution unit 116 may fix the level displayed when the housing is tilted and further scroll the display screen in the tilted direction (upward or downward).
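While the hierarchy is fixed, the tilt reported by the detection unit only needs to be turned into a scroll direction. A minimal sketch; the axis convention and the 5° dead zone are assumptions for illustration:

```python
def scroll_from_tilt(tilt_x_deg: float, tilt_y_deg: float,
                     deadzone_deg: float = 5.0):
    """Convert the housing tilt (degrees) into a scroll direction.
    Tilts smaller than the dead zone produce no scrolling, so that
    ordinary hand tremor does not move the fixed-hierarchy view."""
    dx = 0 if abs(tilt_x_deg) < deadzone_deg else (1 if tilt_x_deg > 0 else -1)
    dy = 0 if abs(tilt_y_deg) < deadzone_deg else (1 if tilt_y_deg > 0 else -1)
    return dx, dy  # +1 = right/up, -1 = left/down, 0 = no scroll
```

The execution unit would scroll the display screen by these signed steps for as long as the housing stays tilted.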
- the execution unit 116 may change the selection target according to the vibration of the casing.
- the music data is described as an example.
- the present invention is not limited to this example, and data having a hierarchical structure can be used.
- it may be video or still image data, schedule data, or the like.
- the contents of the hierarchical display included in the map display may be displayed according to the pressing force of the operating tool.
- FIG. 9 is an explanatory diagram illustrating a display example of hierarchical display on a map.
- the map 301 includes a hierarchy display indicating a plurality of hierarchies.
- the hierarchy display indicating a plurality of levels is, for example, a building or a department store in which a plurality of floors exist in one spot.
- the user touches and presses the spot to be hierarchically displayed with the operating tool.
- the display switching unit 114 displays the contents of the floor of the designated spot according to the pressing force of the operating tool. For example, if the spot touched by the operating body is an eight-story building, the contents of the upper floor are displayed as the pressing force increases.
- while the content display of the floor of the designated spot is being switched as described above, the execution unit 116 may fix the hierarchical display in response to a contact other than the pressing of the operating body and display further detailed information. For example, when a plurality of stores are located on each floor of the designated building, detailed information such as the store name, opening hours, and closed days of a designated store may be displayed by a contact other than the pressing of the operating body. When switching the content display of the floor as described above, the display color of the contact location 303 of the operating body may be changed according to the pressing force of the operating body, as shown in FIG. 9.
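The floor-selection behaviour (the harder the press, the higher the floor shown) can be sketched as a mapping from force to floor number. The maximum force of 10 N and the linear scale below are assumptions, not values from the text:

```python
def floor_for_force(force_n: float, num_floors: int,
                    max_force_n: float = 10.0) -> int:
    """Map the pressing force to the floor whose contents are shown.
    Force 0 shows the first floor; max_force_n shows the top floor,
    with floors spread linearly in between."""
    force_n = max(0.0, min(force_n, max_force_n))  # clamp to sensor range
    floor = 1 + int(force_n / max_force_n * (num_floors - 1))
    return min(floor, num_floors)
```

For the eight-story building of the example, pressing harder steps the display from floor 1 up to floor 8.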
- FIG. 10 is an explanatory diagram for explaining an example of display switching when a plurality of spots are displayed on a map. For example, as illustrated in FIG. 10, it is assumed that “convenience store” and “toilet” are displayed on the map as in display example 311 as a result of the user searching using the search function of the information processing apparatus 100.
- the display example 311 has a problem that, when the display of the "convenience store" and the display of the "toilet" overlap or are too close, the target display cannot be selected well by touching it with the operating body. In this case, the user has to enlarge the map or change the display range.
- the search results can be classified into groups, and the search results can be displayed for each group. For example, when "toilet" and "convenience store" are displayed on the same map, only "toilet" or only "convenience store" can be displayed depending on the pressing force of the contact.
- in the display example 312 of FIG. 10, when the pressing force of the operating body is 2 N or more and 5 N or less, only "toilet" is displayed. Further, in the display example 314 of FIG. 11, only "convenience store" is displayed when the pressing force of the operating body is greater than 5 N. The user can confirm only the position of the toilets or only the position of the convenience stores on the map simply by changing the pressing force of the operating body.
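This group filtering reduces to selecting a result group from force ranges. A sketch using the 2 N–5 N and >5 N ranges from the example; the behaviour below 2 N (showing both groups, as in display example 311) is an assumption:

```python
def visible_groups(force_n: float):
    """Return the search-result groups drawn on the map for a given
    pressing force: both groups under 2 N, only 'toilet' from 2 N to
    5 N, only 'convenience store' above 5 N."""
    if force_n < 2.0:
        return ["toilet", "convenience store"]
    if force_n <= 5.0:
        return ["toilet"]
    return ["convenience store"]
```

The map renderer would redraw only the markers belonging to the returned groups each time the force changes.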
- the color display of the contact portion of the operating body may be changed according to the pressing force of the operating body.
- FIG. 12 is a flowchart showing details of the operation of the information processing apparatus 100.
- the detection unit 112 detects the pressing force of the operating tool and determines whether or not the pressing force of the operating tool is larger than a certain value (S102).
- in step S102, when it is determined that the pressing force of the operating body is not larger than the certain value, the operation is recognized as a normal operation (S104).
- in step S102, when the detection unit 112 determines that the pressing force of the operating body is larger than the certain value, it is determined whether or not the detected pressing force has changed by a certain value or more compared with the previously detected pressing force (S106).
- in step S106, when the pressing force has changed by a certain value or more from the previously detected value, it indicates that the display screen has been further pressed by the operating body. On the other hand, in step S106, if the pressing force has not changed by a certain value or more from the previously detected value, it indicates that no further pressing is performed by the operating body.
- in step S106, when the pressing force has not changed by a certain value or more, the display switching processing in step S110 and subsequent steps is not performed, so that display switching not intended by the user can be prevented.
- if it is determined in step S106 that the pressing force has changed by a certain value or more from the previously detected value, the display switching unit 114 calculates the α value of the image displayed on the display screen (S110). In step S110, not only the α value of the image but also the transparency and scale of the image may be calculated. Then, the display switching unit 114 updates the drawing displayed on the display screen according to the pressing force of the operating body detected in step S106 (S112).
- the drawing updated in step S112 is, for example, drawing of the upper layer of the currently displayed layer or drawing of the lower layer as described above.
- a layer to be operated is calculated (S114).
- the layer to be operated in step S114 means, for example, a hierarchy to be operated when displaying data having a hierarchical structure.
- in the case of the map display described above, it means the display of the group to be operated.
- the execution unit 116 determines whether or not to change the operation target layer based on the value of the operation target layer calculated in step S114 (S116). As described above, the execution unit 116 executes a predetermined process when a predetermined operation is performed while the display switching unit 114 is switching the display screen. If a predetermined operation is performed before the display switching unit 114 has completely switched the display screen from the lower hierarchy to the upper hierarchy, it is necessary to determine whether the process should be executed at the lower hierarchy or at the upper hierarchy. For example, the execution unit 116 may execute the lower-hierarchy process when a predetermined operation is performed before the display screen is completely switched from the lower hierarchy to the upper hierarchy.
- when it is determined in step S116 that the operation layer is to be changed, the operation layer is updated and the designated process is executed (S118). If it is determined in step S116 that the operation layer is not to be changed, the processing from step S102 is executed again.
- if it is determined in step S106 that the pressing force has not changed by a certain value or more from the previously detected value, the execution unit 116 recognizes the operation on the displayed operation layer and executes the instructed process (S108). The details of the operation of the information processing apparatus 100 have been described above.
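One pass of the flowchart of FIG. 12 can be sketched as follows. The detector, display, and executor objects, their method names, and the 1 N / 0.5 N thresholds are all hypothetical stand-ins for the detection unit 112, display switching unit 114, and execution unit 116:

```python
def process_input(detector, display, executor,
                  threshold_n=1.0, delta_n=0.5):
    """One pass of the branch structure of steps S102-S118."""
    force = detector.read_force()
    if force <= threshold_n:                            # S102 -> S104
        executor.handle_normal_operation()
        return
    if abs(force - detector.previous_force) < delta_n:  # S106 -> S108
        executor.execute_on_current_layer()
        return
    display.update_alpha(force)                         # S110
    display.redraw(force)                               # S112
    layer = display.layer_for_force(force)              # S114
    if executor.should_change_layer(layer):             # S116
        executor.update_layer_and_execute(layer)        # S118
```

Calling this once per pressure sample, in a loop, reproduces the flowchart's return to step S102 after each decision.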
- the contact operation and the pressing force of the operating body that operates the display screen are detected, and the display of the display screen is switched according to the detected pressing force of the operating body. Then, when an operation such as a touch on the contact operation surface by the operating body, a sudden change in pressing force, or a tilt of the housing is detected while the display of the display screen is being switched, a predetermined process is executed. As a result, the display of the display screen can be easily switched with few operation steps, and the instructed process can be executed.
- the display screen can be switched and processing can be executed by a combination of the pressing force by the operating body and the contact operation, the operability can be expanded without interfering with the existing operation. Furthermore, an intuitive operation can be realized by returning feedback such as changing the display color according to the pressing force.
- each step in the processing of the information processing apparatus 100 of the present specification does not necessarily have to be processed in time series in the order described as a flowchart. That is, each step in the process of the information processing apparatus 100 may be executed in parallel even if it is a different process.
Abstract
Description
〔1〕 Purpose of the present embodiment
〔2〕 Hardware configuration of the information processing apparatus
〔3〕 Functional configuration of the information processing apparatus
〔4〕 Details of the operation of the information processing apparatus
First, the purpose of the embodiment of the present invention will be described. Display devices include touch screens (touch panels) in which a capacitive or resistive touch sensor is provided on the surface of a display unit such as a liquid crystal display. With a touch screen, input to the information processing apparatus becomes possible by touching the display screen, so that the user can handle the information processing apparatus easily.
The purpose of the embodiment of the present invention has been described above. Next, the configuration of the display device of the information processing apparatus 100 according to the present embodiment will be described with reference to FIG. 1. FIG. 1 is an explanatory diagram showing the configuration of the display device of the information processing apparatus 100. The information processing apparatus 100 according to the present embodiment is an apparatus to which information can be input by touching or pressing, with an operating body, the display screen of a display device that displays information. A user of the information processing apparatus 100 can select or determine a target indicated by an icon, character key, or the like displayed on the display device by bringing the operating body into contact with that target.
The hardware configuration of the information processing apparatus 100 according to the present embodiment has been described above. Next, the functional configuration of the information processing apparatus 100 will be described with reference to FIG. 3. FIG. 3 particularly illustrates the display switching control in the CPU 101. In describing the functional configuration of the information processing apparatus 100 shown in FIG. 3, reference will also be made to FIGS. 4 to 11 as appropriate. As shown in FIG. 3, the CPU 101 includes the detection unit 112, the display switching unit 114, the execution unit 116, and the like.
The functional configuration of the information processing apparatus 100 has been described above. Next, the details of the operation of the information processing apparatus 100 will be described with reference to FIG. 12. FIG. 12 is a flowchart showing the details of the operation of the information processing apparatus 100. As shown in FIG. 12, first, the detection unit 112 detects the pressing force of the operating body and determines whether or not the pressing force of the operating body is larger than a certain value (S102). In step S102, when it is determined that the pressing force of the operating body is not larger than the certain value, the operation is recognized as a normal operation (S104).
104 display device
105 capacitive touch panel
106 pressure-sensitive sensor
112 detection unit
114 display switching unit
116 execution unit
Claims (13)
- a detection unit that detects a contact operation and a pressing force on a contact operation surface by an operating body;
a display switching unit that switches the display of a display screen in accordance with the pressing force of the operating body detected by the detection unit; and
an execution unit that executes a predetermined process in accordance with a predetermined operation when the predetermined operation on the contact operation surface by the operating body is detected by the detection unit while the display of the display screen is being switched by the display switching unit,
An information processing apparatus comprising the above. - The display switching unit switches, in accordance with the pressing force of the operating body detected by the detection unit, the display of the display screen corresponding to each of a plurality of hierarchies for executing processes of the plurality of hierarchies, and
the execution unit executes a process in the one hierarchy in accordance with the other operation when the other operation of the operating body is detected by the detection unit while the display screen is switched by the display switching unit to the display of one hierarchy among the plurality of hierarchies. The information processing apparatus according to claim 1. - The display switching unit:
switches the display to the display screen of a hierarchy lower than the one hierarchy displayed on the display screen when the pressing force of the operating body detected by the detection unit increases, and
switches the display to the display screen of a hierarchy higher than the one hierarchy displayed on the display screen when the pressing force of the operating body detected by the detection unit decreases. The information processing apparatus according to claim 2. - The display switching unit causes a map display including a hierarchy display indicating one or more hierarchies to be displayed on the display screen, and displays on the display screen the map display including the contents of the hierarchy display in accordance with the contact operation of the operating body detected by the detection unit. The information processing apparatus according to claim 1.
- The display switching unit switches the display contents of the hierarchy display included in the map display displayed on the display screen, in accordance with the pressing force of the operating body detected by the detection unit. The information processing apparatus according to claim 4.
- The display switching unit switches the display while increasing the transparency of the display of the display screen before switching, in accordance with the pressing force of the operating body detected by the detection unit. The information processing apparatus according to claim 1.
- The display switching unit switches the display while reducing the display size of the display screen before switching, in accordance with the pressing force of the operating body detected by the detection unit. The information processing apparatus according to claim 1.
- The execution unit executes a predetermined process in accordance with a contact operation of another operating body different from the operating body pressing the contact surface, when the contact operation of the other operating body is detected by the detection unit while the display of the display screen is being switched by the display switching unit. The information processing apparatus according to claim 1.
- The execution unit executes a predetermined process in accordance with the contact operation of the operating body when the pressing force of the operating body detected by the detection unit exceeds a predetermined threshold within a predetermined time while the display of the display screen is being switched by the display switching unit. The information processing apparatus according to claim 1.
- The detection unit detects an inclination of a housing, and
the execution unit executes a predetermined process in accordance with the inclination of the housing when the inclination of the housing is detected by the detection unit while the display of the display screen is being switched by the display switching unit. The information processing apparatus according to claim 1. - The display switching unit changes the display color of a position corresponding to a contact location on the contact operation surface, in accordance with the pressing force applied to the contact operation surface by the operating body detected by the detection unit. The information processing apparatus according to claim 1.
- a step of detecting a contact operation and a pressing force on a contact operation surface by an operating body;
a step of switching the display of a display screen in accordance with the detected pressing force of the operating body; and
a step of executing a predetermined process in accordance with a predetermined operation when the predetermined operation on the contact operation surface by the operating body is detected while the display of the display screen is being switched,
An information processing method including the above. - A program for causing a computer to function as an information processing apparatus comprising:
a detection unit that detects a contact operation and a pressing force on a contact operation surface by an operating body;
a display switching unit that switches the display of a display screen in accordance with the pressing force of the operating body detected by the detection unit; and
an execution unit that executes a predetermined process in accordance with a predetermined operation when the predetermined operation on the contact operation surface by the operating body is detected by the detection unit while the display of the display screen is being switched by the display switching unit.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2010800373666A CN102483677A (zh) | 2009-08-31 | 2010-05-19 | 信息处理设备、信息处理方法以及程序 |
EP10811574.2A EP2474891A4 (en) | 2009-08-31 | 2010-05-19 | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING PROCESS AND PROGRAM |
BR112012003863A BR112012003863A2 (pt) | 2009-08-31 | 2010-05-19 | aparelho e método de processamento de informação, e, programa |
US13/391,782 US10241626B2 (en) | 2009-08-31 | 2010-05-19 | Information processing apparatus, information processing method, and program |
RU2012106498/08A RU2012106498A (ru) | 2009-08-31 | 2010-05-19 | Устройство обработки информации, способ обработки информации и программа |
US15/206,867 US10216342B2 (en) | 2009-08-31 | 2016-07-11 | Information processing apparatus, information processing method, and program |
US16/251,833 US10642432B2 (en) | 2009-08-31 | 2019-01-18 | Information processing apparatus, information processing method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009200870A JP5593655B2 (ja) | 2009-08-31 | 2009-08-31 | 情報処理装置、情報処理方法およびプログラム |
JP2009-200870 | 2009-08-31 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/391,782 A-371-Of-International US10241626B2 (en) | 2009-08-31 | 2010-05-19 | Information processing apparatus, information processing method, and program |
US15/206,867 Continuation US10216342B2 (en) | 2009-08-31 | 2016-07-11 | Information processing apparatus, information processing method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011024521A1 true WO2011024521A1 (ja) | 2011-03-03 |
Family
ID=43627634
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/058405 WO2011024521A1 (ja) | 2009-08-31 | 2010-05-19 | 情報処理装置、情報処理方法およびプログラム |
Country Status (7)
Country | Link |
---|---|
US (3) | US10241626B2 (ja) |
EP (1) | EP2474891A4 (ja) |
JP (1) | JP5593655B2 (ja) |
CN (2) | CN102483677A (ja) |
BR (1) | BR112012003863A2 (ja) |
RU (1) | RU2012106498A (ja) |
WO (1) | WO2011024521A1 (ja) |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016006694A (ja) * | 2015-09-30 | 2016-01-14 | パイオニア株式会社 | 表示装置、表示方法及び表示用プログラム |
JP2016029575A (ja) * | 2015-09-30 | 2016-03-03 | パイオニア株式会社 | 表示装置、表示方法及び表示用プログラム |
JP2017191463A (ja) * | 2016-04-13 | 2017-10-19 | キヤノン株式会社 | 電子機器およびその制御方法 |
CN108897420A (zh) * | 2012-05-09 | 2018-11-27 | 苹果公司 | 用于响应于手势在显示状态之间过渡的设备、方法和图形用户界面 |
US11010027B2 (en) | 2012-05-09 | 2021-05-18 | Apple Inc. | Device, method, and graphical user interface for manipulating framed graphical objects |
US11023116B2 (en) | 2012-05-09 | 2021-06-01 | Apple Inc. | Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input |
US11054990B2 (en) | 2015-03-19 | 2021-07-06 | Apple Inc. | Touch input cursor manipulation |
US11112957B2 (en) | 2015-03-08 | 2021-09-07 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
US11182017B2 (en) | 2015-08-10 | 2021-11-23 | Apple Inc. | Devices and methods for processing touch inputs based on their intensities |
US11221675B2 (en) | 2012-05-09 | 2022-01-11 | Apple Inc. | Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface |
US11231831B2 (en) | 2015-06-07 | 2022-01-25 | Apple Inc. | Devices and methods for content preview based on touch input intensity |
US11240424B2 (en) | 2015-06-07 | 2022-02-01 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US11314407B2 (en) | 2012-05-09 | 2022-04-26 | Apple Inc. | Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object |
US11327648B2 (en) | 2015-08-10 | 2022-05-10 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US11354033B2 (en) | 2012-05-09 | 2022-06-07 | Apple Inc. | Device, method, and graphical user interface for managing icons in a user interface region |
US11921975B2 (en) | 2015-03-08 | 2024-03-05 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
Families Citing this family (76)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8587422B2 (en) | 2010-03-31 | 2013-11-19 | Tk Holdings, Inc. | Occupant sensing system |
US9007190B2 (en) | 2010-03-31 | 2015-04-14 | Tk Holdings Inc. | Steering wheel sensors |
JP5759230B2 (ja) | 2010-04-02 | 2015-08-05 | ティーケー ホールディングス,インコーポレーテッド | 手センサを有するステアリング・ホイール |
JP2011221640A (ja) * | 2010-04-06 | 2011-11-04 | Sony Corp | 情報処理装置、情報処理方法およびプログラム |
CN101968729A (zh) * | 2010-10-19 | 2011-02-09 | 中兴通讯股份有限公司 | 屏幕显示内容的切换方法和装置 |
JP5774931B2 (ja) * | 2011-07-25 | 2015-09-09 | 京セラ株式会社 | 携帯端末装置およびプログラム |
US9417754B2 (en) | 2011-08-05 | 2016-08-16 | P4tents1, LLC | User interface system, method, and computer program product |
JP5978610B2 (ja) * | 2011-12-09 | 2016-08-24 | ソニー株式会社 | 情報処理装置、情報処理方法及びプログラム |
WO2013154720A1 (en) | 2012-04-13 | 2013-10-17 | Tk Holdings Inc. | Pressure sensor including a pressure sensitive material for use with control systems and methods of using the same |
JP5720851B2 (ja) | 2012-04-27 | 2015-05-20 | 株式会社村田製作所 | 操作入力装置および情報表示装置 |
WO2013169842A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for selecting object within a group of objects |
WO2013169849A2 (en) | 2012-05-09 | 2013-11-14 | Industries Llc Yknots | Device, method, and graphical user interface for displaying user interface objects corresponding to an application |
JP2015519656A (ja) | 2012-05-09 | 2015-07-09 | アップル インコーポレイテッド | ユーザインタフェースオブジェクトを移動し、ドロップするためのデバイス、方法及びグラフィカルユーザインタフェース |
WO2013169875A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for displaying content associated with a corresponding affordance |
WO2013169877A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for selecting user interface objects |
WO2013169845A1 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for scrolling nested regions |
WO2013169851A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for facilitating user interaction with controls in a user interface |
JP2013257657A (ja) * | 2012-06-11 | 2013-12-26 | Fujitsu Ltd | 情報端末装置及び表示制御方法 |
US9493342B2 (en) | 2012-06-21 | 2016-11-15 | Nextinput, Inc. | Wafer level MEMS force dies |
WO2014008377A1 (en) | 2012-07-05 | 2014-01-09 | Ian Campbell | Microelectromechanical load sensor and methods of manufacturing the same |
CN103544688B (zh) * | 2012-07-11 | 2018-06-29 | 东芝医疗系统株式会社 | 医用图像融合装置和方法 |
JP5812054B2 (ja) | 2012-08-23 | 2015-11-11 | 株式会社デンソー | 操作デバイス |
US9696223B2 (en) | 2012-09-17 | 2017-07-04 | Tk Holdings Inc. | Single layer force sensor |
CN103019593A (zh) * | 2012-11-30 | 2013-04-03 | 珠海市魅族科技有限公司 | 一种用户界面的控制方法及终端 |
WO2014105279A1 (en) | 2012-12-29 | 2014-07-03 | Yknots Industries Llc | Device, method, and graphical user interface for switching between user interfaces |
EP3467634B1 (en) | 2012-12-29 | 2020-09-23 | Apple Inc. | Device, method, and graphical user interface for navigating user interface hierarchies |
EP2939098B1 (en) | 2012-12-29 | 2018-10-10 | Apple Inc. | Device, method, and graphical user interface for transitioning between touch input to display output relationships |
EP2912542B1 (en) | 2012-12-29 | 2022-07-13 | Apple Inc. | Device and method for forgoing generation of tactile output for a multi-contact gesture |
JP6097843B2 (ja) | 2012-12-29 | 2017-03-15 | アップル インコーポレイテッド | コンテンツをスクロールするか選択するかを判定するためのデバイス、方法、及びグラフィカルユーザインタフェース |
CN105144057B (zh) | 2012-12-29 | 2019-05-17 | 苹果公司 | 用于根据具有模拟三维特征的控制图标的外观变化来移动光标的设备、方法和图形用户界面 |
KR101452302B1 (ko) | 2013-07-29 | 2014-10-22 | 주식회사 하이딥 | 터치 센서 패널 |
JP6222434B2 (ja) * | 2013-08-30 | 2017-11-01 | コニカミノルタ株式会社 | 表示装置 |
US9665206B1 (en) | 2013-09-18 | 2017-05-30 | Apple Inc. | Dynamic user interface adaptable to multiple input tools |
KR101712346B1 (ko) | 2014-09-19 | 2017-03-22 | 주식회사 하이딥 | 터치 입력 장치 |
US9268484B2 (en) * | 2014-01-07 | 2016-02-23 | Adobe Systems Incorporated | Push-pull type gestures |
WO2015106246A1 (en) | 2014-01-13 | 2015-07-16 | Nextinput, Inc. | Miniaturized and ruggedized wafer level mems force sensors |
JP2015185173A (ja) * | 2014-03-24 | 2015-10-22 | 株式会社 ハイヂィープ | タッチ圧力及びタッチ面積による動作対象の臨時操作方法及び端末機 |
JP6527343B2 (ja) | 2014-08-01 | 2019-06-05 | 株式会社 ハイディープHiDeep Inc. | タッチ入力装置 |
KR102282003B1 (ko) * | 2014-08-07 | 2021-07-27 | 삼성전자 주식회사 | 전자 장치 및 이의 표시 제어 방법 |
US9990107B2 (en) | 2015-03-08 | 2018-06-05 | Apple Inc. | Devices, methods, and graphical user interfaces for displaying and using menus |
US10048757B2 (en) | 2015-03-08 | 2018-08-14 | Apple Inc. | Devices and methods for controlling media presentation |
US9645732B2 (en) | 2015-03-08 | 2017-05-09 | Apple Inc. | Devices, methods, and graphical user interfaces for displaying and using menus |
US9785305B2 (en) | 2015-03-19 | 2017-10-10 | Apple Inc. | Touch input cursor manipulation |
US10067653B2 (en) | 2015-04-01 | 2018-09-04 | Apple Inc. | Devices and methods for processing touch inputs based on their intensities |
US9891811B2 (en) | 2015-06-07 | 2018-02-13 | Apple Inc. | Devices and methods for navigating between user interfaces |
US9674426B2 (en) | 2015-06-07 | 2017-06-06 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US10200598B2 (en) | 2015-06-07 | 2019-02-05 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US10346030B2 (en) | 2015-06-07 | 2019-07-09 | Apple Inc. | Devices and methods for navigating between user interfaces |
CN117486166A (zh) | 2015-06-10 | 2024-02-02 | 触控解决方案股份有限公司 | 具有容差沟槽的加固的晶圆级mems力传感器 |
US10248308B2 (en) | 2015-08-10 | 2019-04-02 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures |
US10235035B2 (en) | 2015-08-10 | 2019-03-19 | Apple Inc. | Devices, methods, and graphical user interfaces for content navigation and manipulation |
US10416800B2 (en) | 2015-08-10 | 2019-09-17 | Apple Inc. | Devices, methods, and graphical user interfaces for adjusting user interface objects |
CN105159494A (zh) * | 2015-08-27 | 2015-12-16 | 广东欧珀移动通信有限公司 | 一种信息的显示方法及装置 |
CN105094647A (zh) * | 2015-08-27 | 2015-11-25 | 广东欧珀移动通信有限公司 | 一种界面控制方法及移动终端 |
WO2017035796A1 (zh) * | 2015-09-01 | 2017-03-09 | 华为技术有限公司 | 一种转场帧的显示方法及终端 |
US20170068374A1 (en) * | 2015-09-09 | 2017-03-09 | Microsoft Technology Licensing, Llc | Changing an interaction layer on a graphical user interface |
US20170068413A1 (en) * | 2015-09-09 | 2017-03-09 | Microsoft Technology Licensing, Llc | Providing an information set relating to a graphical user interface element on a graphical user interface |
JP6044912B2 (ja) * | 2015-09-24 | 2016-12-14 | 国立大学法人東京海洋大学 | 移動体運行情報システム |
CN105760046A (zh) * | 2016-01-29 | 2016-07-13 | 深圳天珑无线科技有限公司 | 支持二次按压应用程序图标的操作方法及设备 |
JP6758921B2 (ja) * | 2016-06-01 | 2020-09-23 | キヤノン株式会社 | 電子機器及びその制御方法 |
JP2018106574A (ja) * | 2016-12-28 | 2018-07-05 | デクセリアルズ株式会社 | ユーザーインターフェイス装置及び電子機器 |
EP3580539A4 (en) | 2017-02-09 | 2020-11-25 | Nextinput, Inc. | INTEGRATED DIGITAL FORCE SENSORS AND RELATED METHOD OF MANUFACTURING |
US11243125B2 (en) | 2017-02-09 | 2022-02-08 | Nextinput, Inc. | Integrated piezoresistive and piezoelectric fusion force sensor |
EP3385831A1 (en) | 2017-04-04 | 2018-10-10 | Lg Electronics Inc. | Mobile terminal |
CN107450797A (zh) * | 2017-07-07 | 2017-12-08 | 天脉聚源(北京)科技有限公司 | 一种信息显示方法及装置 |
CN107426306A (zh) * | 2017-07-07 | 2017-12-01 | 天脉聚源(北京)科技有限公司 | 一种信息显示方法及系统 |
WO2019018641A1 (en) | 2017-07-19 | 2019-01-24 | Nextinput, Inc. | STACK OF STRAIN TRANSFER IN A MEMS FORCE SENSOR |
US11423686B2 (en) | 2017-07-25 | 2022-08-23 | Qorvo Us, Inc. | Integrated fingerprint and force sensor |
US11243126B2 (en) | 2017-07-27 | 2022-02-08 | Nextinput, Inc. | Wafer bonded piezoresistive and piezoelectric force sensor and related methods of manufacture |
US11579028B2 (en) | 2017-10-17 | 2023-02-14 | Nextinput, Inc. | Temperature coefficient of offset compensation for force sensor and strain gauge |
US11385108B2 (en) | 2017-11-02 | 2022-07-12 | Nextinput, Inc. | Sealed force sensor with etch stop layer |
WO2019099821A1 (en) | 2017-11-16 | 2019-05-23 | Nextinput, Inc. | Force attenuator for force sensor |
CN109710162B (zh) * | 2018-09-03 | 2021-06-04 | 天翼电子商务有限公司 | 一种利用压力感应触摸屏快速操作选择器组的方法 |
US10962427B2 (en) | 2019-01-10 | 2021-03-30 | Nextinput, Inc. | Slotted MEMS force sensor |
CN113986070B (zh) * | 2020-07-10 | 2023-01-06 | 荣耀终端有限公司 | 一种应用卡片的快速查看方法及电子设备 |
CN113190306B (zh) * | 2021-04-12 | 2024-02-20 | 沈阳中科创达软件有限公司 | 显示层级的切换方法、装置、设备及存储介质 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1040055A (ja) * | 1996-07-18 | 1998-02-13 | Koonet:Kk | 情報提供装置及び記憶媒体 |
JP2004192241A (ja) * | 2002-12-10 | 2004-07-08 | Sony Corp | ユーザ・インタフェース装置および携帯情報装置 |
WO2008016387A1 (en) * | 2006-07-31 | 2008-02-07 | Sony Ericsson Mobile Communications Ab | Three-dimensional touch pad input device |
JP2008192092A (ja) | 2007-02-08 | 2008-08-21 | Fuji Xerox Co Ltd | タッチパネル装置、情報処理装置及びプログラム |
Family Cites Families (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2671393B2 (ja) * | 1988-06-21 | 1997-10-29 | ソニー株式会社 | 地図情報の表示装置 |
JPH0764754A (ja) | 1993-08-24 | 1995-03-10 | Hitachi Ltd | 小型情報処理装置 |
JP3718245B2 (ja) * | 1994-11-02 | 2005-11-24 | 株式会社東芝 | プロセス監視用画面選択装置 |
JPH09269883A (ja) * | 1996-03-29 | 1997-10-14 | Seiko Epson Corp | 情報処理装置および情報処理方法 |
JP2001265481A (ja) * | 2000-03-21 | 2001-09-28 | Nec Corp | ページ情報表示方法及び装置並びにページ情報表示用プログラムを記憶した記憶媒体 |
US7688306B2 (en) * | 2000-10-02 | 2010-03-30 | Apple Inc. | Methods and apparatuses for operating a portable device based on an accelerometer |
JP4043259B2 (ja) * | 2002-03-14 | 2008-02-06 | 富士通テン株式会社 | 情報処理装置 |
US7456823B2 (en) | 2002-06-14 | 2008-11-25 | Sony Corporation | User interface apparatus and portable information apparatus |
US11275405B2 (en) * | 2005-03-04 | 2022-03-15 | Apple Inc. | Multi-functional hand-held device |
US7158123B2 (en) | 2003-01-31 | 2007-01-02 | Xerox Corporation | Secondary touch contextual sub-menu navigation for touch screen interface |
EP1510911A3 (en) * | 2003-08-28 | 2006-03-22 | Sony Corporation | Information processing apparatus, information processing method, information processing program and storage medium containing information processing program |
JP2005321972A (ja) * | 2004-05-07 | 2005-11-17 | Sony Corp | 情報処理装置、情報処理装置における処理方法及び情報処理装置における処理プログラム |
JP2006039745A (ja) * | 2004-07-23 | 2006-02-09 | Denso Corp | タッチパネル式入力装置 |
US7724242B2 (en) * | 2004-08-06 | 2010-05-25 | Touchtable, Inc. | Touch driven method and apparatus to integrate and display multiple image layers forming alternate depictions of same subject matter |
US7619616B2 (en) * | 2004-12-21 | 2009-11-17 | Microsoft Corporation | Pressure sensitive controls |
JP2006209258A (ja) * | 2005-01-25 | 2006-08-10 | Kenwood Corp | Av処理装置、av処理方法及びプログラム |
JP2006345209A (ja) * | 2005-06-08 | 2006-12-21 | Sony Corp | 入力装置、情報処理装置、情報処理方法、及びプログラム |
US8049731B2 (en) * | 2005-07-29 | 2011-11-01 | Interlink Electronics, Inc. | System and method for implementing a control function via a sensor having a touch sensitive control input surface |
JP2007079729A (ja) * | 2005-09-12 | 2007-03-29 | Denso Corp | タッチパネル入力装置 |
KR101171055B1 (ko) * | 2006-02-02 | 2012-08-03 | 삼성전자주식회사 | 메뉴 리스트 항목의 이동 속도 제어 장치 및 방법 |
US20070214434A1 (en) * | 2006-03-03 | 2007-09-13 | Tobias Rydenhag | User interface and navigation for portable electronic devices |
KR20070113018A (ko) * | 2006-05-24 | 2007-11-28 | LG Electronics Inc | Touch screen device and operating method thereof |
JP4360381B2 (ja) * | 2006-06-05 | 2009-11-11 | Sony Corp | Information processing apparatus, information processing method, and computer program |
JP2008011220A (ja) * | 2006-06-29 | 2008-01-17 | Ricoh Co Ltd | Image forming apparatus |
US7956849B2 (en) * | 2006-09-06 | 2011-06-07 | Apple Inc. | Video manager for portable multifunction device |
JP2008146453A (ja) * | 2006-12-12 | 2008-06-26 | Sony Corp | Video signal output device and operation input processing method |
JP4850133B2 (ja) * | 2007-06-06 | 2012-01-11 | Alpine Electronics Inc | Map display device and map display method |
US20090046110A1 (en) * | 2007-08-16 | 2009-02-19 | Motorola, Inc. | Method and apparatus for manipulating a displayed image |
US20090128507A1 (en) * | 2007-09-27 | 2009-05-21 | Takeshi Hoshino | Display method of information display device |
US8223130B2 (en) * | 2007-11-28 | 2012-07-17 | Sony Corporation | Touch-sensitive sheet member, input device and electronic apparatus |
JP4605214B2 (ja) * | 2007-12-19 | 2011-01-05 | Sony Corp | Information processing apparatus, information processing method, and program |
US8584048B2 (en) * | 2008-05-29 | 2013-11-12 | Telcordia Technologies, Inc. | Method and system for multi-touch-based browsing of media summarizations on a handheld device |
US10983665B2 (en) * | 2008-08-01 | 2021-04-20 | Samsung Electronics Co., Ltd. | Electronic apparatus and method for implementing user interface |
US8604364B2 (en) * | 2008-08-15 | 2013-12-10 | Lester F. Ludwig | Sensors, algorithms and applications for a high dimensional touchpad |
US8788977B2 (en) * | 2008-11-20 | 2014-07-22 | Amazon Technologies, Inc. | Movement recognition as input mechanism |
KR100920162B1 (ko) * | 2009-02-04 | 2009-10-06 | Shim Jae-woo | Information processing device |
US20100271312A1 (en) * | 2009-04-22 | 2010-10-28 | Rachid Alameh | Menu Configuration System and Method for Display on an Electronic Device |
- 2009
- 2009-08-31 JP JP2009200870A patent/JP5593655B2/ja not_active Expired - Fee Related
- 2010
- 2010-05-19 CN CN2010800373666A patent/CN102483677A/zh active Pending
- 2010-05-19 WO PCT/JP2010/058405 patent/WO2011024521A1/ja active Application Filing
- 2010-05-19 CN CN201710257891.2A patent/CN107256097A/zh active Pending
- 2010-05-19 EP EP10811574.2A patent/EP2474891A4/en not_active Ceased
- 2010-05-19 BR BR112012003863A patent/BR112012003863A2/pt not_active IP Right Cessation
- 2010-05-19 US US13/391,782 patent/US10241626B2/en not_active Expired - Fee Related
- 2010-05-19 RU RU2012106498/08A patent/RU2012106498A/ru not_active Application Discontinuation
- 2016
- 2016-07-11 US US15/206,867 patent/US10216342B2/en active Active
- 2019
- 2019-01-18 US US16/251,833 patent/US10642432B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1040055A (ja) * | 1996-07-18 | 1998-02-13 | Koonet:Kk | Information providing device and storage medium |
JP2004192241A (ja) * | 2002-12-10 | 2004-07-08 | Sony Corp | User interface device and portable information device |
WO2008016387A1 (en) * | 2006-07-31 | 2008-02-07 | Sony Ericsson Mobile Communications Ab | Three-dimensional touch pad input device |
JP2008192092A (ja) | 2007-02-08 | 2008-08-21 | Fuji Xerox Co Ltd | Touch panel device, information processing apparatus, and program
Non-Patent Citations (1)
Title |
---|
See also references of EP2474891A4 |
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108897420B (zh) * | 2012-05-09 | 2021-10-22 | Apple Inc | Device, method, and graphical user interface for transitioning between display states in response to a gesture |
US11947724B2 (en) | 2012-05-09 | 2024-04-02 | Apple Inc. | Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface |
US11354033B2 (en) | 2012-05-09 | 2022-06-07 | Apple Inc. | Device, method, and graphical user interface for managing icons in a user interface region |
CN108897420A (zh) * | 2012-05-09 | 2018-11-27 | Apple Inc | Device, method, and graphical user interface for transitioning between display states in response to a gesture |
US10996788B2 (en) | 2012-05-09 | 2021-05-04 | Apple Inc. | Device, method, and graphical user interface for transitioning between display states in response to a gesture |
US11010027B2 (en) | 2012-05-09 | 2021-05-18 | Apple Inc. | Device, method, and graphical user interface for manipulating framed graphical objects |
US11023116B2 (en) | 2012-05-09 | 2021-06-01 | Apple Inc. | Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input |
US11314407B2 (en) | 2012-05-09 | 2022-04-26 | Apple Inc. | Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object |
US11221675B2 (en) | 2012-05-09 | 2022-01-11 | Apple Inc. | Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface |
US11112957B2 (en) | 2015-03-08 | 2021-09-07 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
US11921975B2 (en) | 2015-03-08 | 2024-03-05 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US11550471B2 (en) | 2015-03-19 | 2023-01-10 | Apple Inc. | Touch input cursor manipulation |
US11054990B2 (en) | 2015-03-19 | 2021-07-06 | Apple Inc. | Touch input cursor manipulation |
US11835985B2 (en) | 2015-06-07 | 2023-12-05 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US11231831B2 (en) | 2015-06-07 | 2022-01-25 | Apple Inc. | Devices and methods for content preview based on touch input intensity |
US11240424B2 (en) | 2015-06-07 | 2022-02-01 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US11681429B2 (en) | 2015-06-07 | 2023-06-20 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US11182017B2 (en) | 2015-08-10 | 2021-11-23 | Apple Inc. | Devices and methods for processing touch inputs based on their intensities |
US11327648B2 (en) | 2015-08-10 | 2022-05-10 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US11740785B2 (en) | 2015-08-10 | 2023-08-29 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
JP2016006694A (ja) * | 2015-09-30 | 2016-01-14 | Pioneer Corp | Display device, display method, and display program |
JP2016029575A (ja) * | 2015-09-30 | 2016-03-03 | Pioneer Corp | Display device, display method, and display program |
JP2017191463A (ja) * | 2016-04-13 | 2017-10-19 | Canon Inc | Electronic device and control method thereof |
Also Published As
Publication number | Publication date |
---|---|
BR112012003863A2 (pt) | 2016-03-22 |
CN102483677A (zh) | 2012-05-30 |
US20160320880A1 (en) | 2016-11-03 |
JP5593655B2 (ja) | 2014-09-24 |
EP2474891A4 (en) | 2014-07-09 |
EP2474891A1 (en) | 2012-07-11 |
US20190155420A1 (en) | 2019-05-23 |
US10241626B2 (en) | 2019-03-26 |
US10216342B2 (en) | 2019-02-26 |
US20120146945A1 (en) | 2012-06-14 |
CN107256097A (zh) | 2017-10-17 |
RU2012106498A (ru) | 2013-08-27 |
JP2011053831A (ja) | 2011-03-17 |
US10642432B2 (en) | 2020-05-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5593655B2 (ja) | Information processing apparatus, information processing method, and program | |
KR100801089B1 (ko) | Mobile device controllable by touch and drag, and operating method thereof | |
RU2533646C2 (ru) | Information processing device, information processing method, and program |
US7256770B2 (en) | Method for displaying information responsive to sensing a physical presence proximate to a computer input device | |
EP2494697B1 (en) | Mobile device and method for providing user interface (ui) thereof | |
JP5533165B2 (ja) | Information processing apparatus, information processing method, and program |
US7358956B2 (en) | Method for providing feedback responsive to sensing a physical presence proximate to a control of an electronic device | |
JP5267388B2 (ja) | Information processing apparatus, information processing method, and program |
TWI381305B (zh) | Display and operation method of user interface, and electronic device |
EP1870800B1 (en) | Touchpad including non-overlapping sensors | |
JP3143462U (ja) | Electronic device having switchable user interface and electronic device having convenient touch operation function |
US20070263015A1 (en) | Multi-function key with scrolling | |
US8351992B2 (en) | Portable electronic apparatus, and a method of controlling a user interface thereof | |
US20070273669A1 (en) | Touch screen device and operating method thereof | |
US20110138275A1 (en) | Method for selecting functional icons on touch screen | |
US20070229472A1 (en) | Circular scrolling touchpad functionality determined by starting position of pointing object on touchpad surface | |
JP2010140321A (ja) | Information processing apparatus, information processing method, and program |
JP2013101465A (ja) | Information processing apparatus, information processing method, and computer program |
JP2011053973A (ja) | Operation control device, operation control method, and computer program |
KR101479769B1 (ко) | Touch screen device and file search method thereof |
JP2012141868A (ja) | Information processing apparatus, information processing method, and computer program |
JP2011134272A (ja) | Information processing apparatus, information processing method, and program |
KR101446141B1 (ко) | Method and apparatus for browsing a menu in a tree structure |
US20150143295A1 (en) | Method, apparatus, and computer-readable recording medium for displaying and executing functions of portable device | |
JP2010211323A (ja) | Input system, portable terminal, input/output device, input system control program, computer-readable recording medium, and input system control method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 201080037366.6; Country of ref document: CN |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10811574; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 2010811574; Country of ref document: EP |
| WWE | Wipo information: entry into national phase | Ref document number: 2012106498; Country of ref document: RU; Ref document number: 13391782; Country of ref document: US; Ref document number: 1591/DELNP/2012; Country of ref document: IN |
| NENP | Non-entry into the national phase | Ref country code: DE |
| REG | Reference to national code | Ref country code: BR; Ref legal event code: B01A; Ref document number: 112012003863; Country of ref document: BR |
| ENP | Entry into the national phase | Ref document number: 112012003863; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20120222 |