WO2012114760A1 - Electronic device with a touch sensor - Google Patents
- Publication number
- WO2012114760A1 (PCT/JP2012/001248)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- application
- pressure
- touch sensor
- predetermined
- electronic device
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04105—Pressure sensors for measuring the pressure or force exerted on the touch surface without providing the touch position
Definitions
- the present invention relates to an electronic device including a touch sensor, and more particularly to an electronic device that controls application software (hereinafter, abbreviated as “application” as appropriate) in response to an operation on the touch sensor.
- In recent years, input devices including a touch sensor, such as a touch panel or a touch switch, have come into use as operation units or switches that receive an operator's input.
- Input devices equipped with touch sensors are widely used in information devices such as calculators and ticket machines, in household appliances such as microwave ovens, televisions, and lighting equipment, and in industrial (FA) equipment.
- For such touch sensors, various systems are known, including resistive-film, capacitive, and optical systems.
- A small electronic device such as a mobile phone is often provided with a touch panel that displays, for example, keys such as a numeric keypad or icons as objects on a display unit and detects the operator's contact with the objects using a touch sensor.
- An electronic device equipped with such a touch panel can be configured with various user interfaces by displaying objects according to the application to be activated. Since the resulting operation is easy for the operator to understand and use, electronic devices equipped with a touch panel are spreading rapidly.
- Patent Document 1 proposes a technique for preventing malfunction of a touch panel caused by erroneous pressing.
- Patent Documents 2 and 3 propose techniques for adjusting the sensitivity of a touch panel, and suitable materials for it.
- Patent Document 4 proposes a technique for saving a job when switching applications using a touch panel.
- Patent Document 5 proposes a technique for displaying shortcut keys on a touch panel.
- an electronic device as described above can usually install a plurality of various applications.
- Such an electronic device can use one electronic device for various purposes by installing a plurality of applications.
- FIGS. 10A and 10B schematically show screens displayed on the display unit of the electronic device when these applications are executed.
- FIG. 10A illustrates a schedule management application being executed in the foreground in the electronic device.
- FIG. 10B illustrates an electronic mail application being executed in the foreground in the electronic device.
- As these figures show, most applications constitute a user interface in which one application occupies one screen.
- For example, the following scene is assumed: while the schedule management application is being executed in the foreground as in FIG. 10A, the operator wants to use the e-mail function to confirm another person's availability for the schedule.
- Conversely, a case is also envisaged in which, after executing the e-mail application in the foreground as in FIG. 10B and confirming the availability by e-mail, the operator wants to use the schedule management function again.
- In a multi-task electronic device, the schedule management application and the e-mail application can be executed simultaneously; that is, when switching between applications, the operator does not need to end an application that is already being executed.
- Even so, when the operator wishes to use another application, the operator must, for example, press a menu key (MENU), select the desired application, and run it in the foreground.
- The electronic device illustrated in FIG. 10C is assumed to be one including keys and buttons, such as a numeric keypad, formed using mechanical switches.
- Also in this case, the operator must perform an operation of selecting the application desired to be activated.
- FIG. 10D shows a state in which icons for starting the respective applications are displayed on the screen of a mobile terminal including a touch panel.
- In order to switch applications, the operator performs an operation during execution of an application to display the selection screen shown in FIG. 10D, and then selects an icon to switch to the desired application.
- In this way, the operation for switching applications is generally time-consuming for the operator, in both single-task and multi-task electronic devices.
- Such operations can be quite stressful, especially when applications must be switched frequently.
- a situation in which an operation of alternately switching between applications as shown in FIGS. 10A and 10B must be frequently performed is also assumed. In such a case, it is very troublesome for the operator to repeatedly perform the same operation of switching applications.
- an object of the present invention made in view of such circumstances is to provide an electronic device including a touch sensor that can control application software being executed in the background.
- In order to achieve the above object, the electronic device according to a first aspect of the invention includes: a touch sensor; a pressure detection unit that detects pressure on the touch sensor; and a control unit that, upon detecting data based on a pressure satisfying a predetermined criterion, performs control to switch one of the application software running in the background to run in the foreground.
- The invention according to a second aspect is the electronic device according to the first aspect, wherein each time the control unit detects data based on a pressure satisfying a predetermined criterion, the control unit performs control to switch the application software running in the background, sequentially in a predetermined order, to run in the foreground.
- The invention according to a third aspect is the electronic device according to the first aspect, wherein when the control unit detects data based on a pressure satisfying a criterion of a predetermined stage, the control unit performs control to switch, among the application software running in the background, the one corresponding to that stage criterion to run in the foreground.
- The electronic device according to a fourth aspect of the invention includes: a touch sensor; a pressure detection unit that detects pressure on the touch sensor; and a control unit that, upon detecting data based on a pressure satisfying a predetermined criterion, performs control so that one of the application software running in the background executes a predetermined process.
- The invention according to a fifth aspect is the electronic device according to the fourth aspect, wherein when the control unit detects data based on a pressure satisfying a criterion of a predetermined stage, the control unit performs control so that, among the application software running in the background, the one corresponding to that stage criterion executes a predetermined process.
- To achieve the above object, the electronic device according to a sixth aspect of the invention includes: a touch sensor; a pressure detection unit that detects pressure on the touch sensor; and a control unit that, upon detecting data based on a pressure satisfying a predetermined criterion, performs control to activate one of the inactive application software.
- The invention according to a seventh aspect is the electronic device according to the sixth aspect, wherein each time the control unit detects data based on a pressure satisfying a predetermined criterion, the control unit performs control to activate the inactive application software sequentially in a predetermined order.
- The invention according to an eighth aspect is the electronic device according to the sixth aspect, wherein when the control unit detects data based on a pressure satisfying a criterion of a predetermined stage, the control unit performs control to activate, among the inactive application software, the one corresponding to that stage criterion.
- The electronic device according to a ninth aspect of the invention includes: a touch sensor; a pressure detection unit that detects pressure on the touch sensor; and a control unit that, upon detecting data based on a pressure satisfying a predetermined criterion, performs control so that one of the inactive application software executes a predetermined process.
- The invention according to a tenth aspect is the electronic device according to the ninth aspect, wherein when the control unit detects data based on a pressure satisfying a criterion of a predetermined stage, the control unit performs control so that, among the inactive application software, the one corresponding to that stage criterion executes a predetermined process.
- According to the present invention, it is possible to provide an electronic device equipped with a touch sensor that can control application software running in the background. The electronic device of the present invention can therefore greatly improve convenience for the operator.
- FIG. 1 is a functional block diagram showing a schematic configuration of an electronic device including a touch sensor according to the first embodiment of the present invention.
- The electronic device according to the present embodiment can be, for example, a portable terminal such as a mobile phone, a PDA (Personal Digital Assistant), a smartphone, a tablet, or a game machine, or a terminal device used for an ATM installed in a bank or a ticket vending machine installed in a station. That is, the present embodiment can be applied to any electronic device that accepts an operator's touch operation via a touch sensor.
- the electronic device 1 includes a touch sensor 11, a press detection unit 12, a tactile sensation presentation unit 13, a display unit 14, a control unit 15, and a storage unit 16.
- the touch sensor 11 is a sensor having a touch surface, and detects that an operator's finger or the like has touched (touched) the touch surface.
- The touch sensor 11 is configured by, for example, a resistive-film, capacitive, or optical system. Further, the touch sensor 11 is of a transmissive type and is disposed on the front surface of the display unit 14, so that it also serves as a sensor for detecting that the operator touches an object such as a key or button displayed on the display unit 14 (a so-called touch panel).
- The pressure detection unit 12 detects pressure on the touch sensor 11, and is configured using, for example, an element such as a strain gauge sensor or a piezoelectric element whose physical or electrical characteristics (strain, resistance, voltage, etc.) change according to the pressure.
- When the press detection unit 12 is configured using, for example, a piezoelectric element, the magnitude of the voltage (voltage value), which is an electrical characteristic of the piezoelectric element, changes according to the magnitude of the load (force) of the press on the touch sensor 11, or to the speed (acceleration) at which that magnitude changes.
- The press detection unit 12 notifies the control unit 15 of this voltage value (hereinafter simply referred to as “data”).
- The control unit 15 acquires (detects) the data when the press detection unit 12 notifies it of the data, or by monitoring the data related to the piezoelectric element of the press detection unit 12. That is, the control unit 15 acquires (detects) data based on the pressure on the touch sensor 11 from the press detection unit 12.
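The data path described above can be sketched as follows. This is a minimal illustrative model, not the patent's implementation: the class names and the transducer sensitivity are assumptions.

```python
# Hypothetical sketch of the press detection unit notifying the control unit
# of "data based on the pressure". Names and constants are illustrative only.

class PressDetectionUnit:
    """Converts a physical press into a voltage value (the "data")."""
    def __init__(self):
        self._volts_per_newton = 0.5  # assumed transducer sensitivity

    def read(self, load_newtons: float) -> float:
        # The element's voltage changes according to the magnitude of the load.
        return load_newtons * self._volts_per_newton

class ControlUnit:
    def __init__(self, detector: PressDetectionUnit):
        self._detector = detector

    def acquire(self, load_newtons: float) -> float:
        # Acquire (detect) data based on the pressure from the detection unit.
        return self._detector.read(load_newtons)

ctrl = ControlUnit(PressDetectionUnit())
print(ctrl.acquire(2.0))  # → 1.0
```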
- the tactile sensation providing unit 13 vibrates the touch sensor 11.
- the tactile sensation providing unit 13 is configured using, for example, a piezoelectric vibrator.
- The display unit 14 draws and displays objects such as input buttons, for example push-button switches.
- the display unit 14 is configured using, for example, a liquid crystal display panel, an organic EL display panel, or the like.
- the display unit 14 not only displays such a user interface, but also displays various information to be presented to the operator (that is, a display screen configuring the application) according to each application.
- the control unit 15 controls the entire electronic device 1.
- the control unit 15 performs various controls including control for each function unit in accordance with various installed applications. For example, when a web browser application is started, various processes related to the function as a web browser are performed. Note that the control unit of the electronic device performs various processes according to the application.
- the storage unit 16 can be composed of a memory or the like, and stores various applications and various input information, and also functions as a work memory. In particular, in the present embodiment, the storage unit 16 stores (installs) various applications to be executed in the electronic device 1 in advance.
- FIG. 2 is a diagram showing an example of a mounting structure of the electronic device 1 shown in FIG. 1. FIG. 2A is a cross-sectional view of the main part.
- FIG. 2B is a plan view of the main part.
- the display unit 14 is housed and held in the housing 21.
- the touch sensor 11 is held on the display unit 14 via an insulator 22 made of an elastic member.
- display unit 14 and touch sensor 11 are rectangular in plan view.
- The touch sensor 11 is held on the display unit 14 via the insulators 22 arranged at the four corners outside the display area A of the display unit 14, indicated by virtual lines in the figure.
- The casing 21 is provided with an upper cover 23 so as to cover the surface area of the touch sensor 11 outside the display area of the display unit 14, and an insulator 24 made of an elastic member is provided between the upper cover 23 and the touch sensor 11.
- In the touch sensor 11, the surface member having the touch surface 11a is made of, for example, a transparent film or glass, and the back surface member is made of glass or acrylic.
- The touch sensor 11 has a structure in which, when the touch surface 11a is pressed, the pressed portion bends (strains) slightly according to the pressing force, or the structure itself bends slightly.
- a strain gauge sensor 31 for detecting a pressure (pressing force) applied to the touch sensor 11 is provided in the vicinity of each side covered with the upper cover 23 by adhesion or the like.
- A piezoelectric vibrator 32 for vibrating the touch sensor 11 is provided in the vicinity of each of two opposing sides, by adhesion or the like. That is, in the electronic device 1 shown in FIG. 2, the press detection unit 12 shown in FIG. 1 is configured using four strain gauge sensors 31, and the tactile sensation providing unit 13 is configured using two piezoelectric vibrators 32. The tactile sensation providing unit 13 vibrates the touch sensor 11, thereby vibrating the touch surface 11a.
- In FIG. 2B, the casing 21, the upper cover 23, and the insulator 24 shown in FIG. 2A are omitted.
- FIGS. 3 and 4 are flowcharts for explaining application control in accordance with an operation on the electronic device 1 according to the first embodiment.
- While accepting normal touch operations on the touch sensor 11, the electronic device 1 according to the present embodiment switches one of the applications running in the background to run in the foreground when it detects data based on a pressure on the touch sensor 11 that satisfies a predetermined criterion.
- The following description also covers the normal processing in which, when the position where the touch operation is detected on the touch sensor 11 corresponds to an icon or other object for starting an application, the electronic device 1 starts that application. At the start of the process shown in FIG. 3, it is assumed that the electronic device 1 has already started some application in the foreground.
- the application to be activated here may be arbitrary, such as the above-described schedule management application or e-mail function application.
- On the display unit 14 of the electronic device 1, objects such as icons for starting applications and keys are displayed.
- the object in the present embodiment can be an image that suggests to the operator the position (area) at which a touch operation is accepted. For example, an image representing a state in which numbers and characters are drawn on the key top is displayed on the display unit 14 as an object image.
- the object in the present embodiment can be an image that suggests to the operator the position (region) for detecting contact for performing an operation on the application being executed in the foreground.
- an icon image indicating an application to be activated is displayed on the display unit 14 as an object image. At this time, what kind of object is displayed on the display unit 14 depends on the application being executed in the foreground at that time.
- When the processing of the electronic device 1 according to the present embodiment starts, the control unit 15 monitors the presence or absence of contact with the touch sensor 11 and also monitors the data based on the pressure on the touch sensor 11 (step S11).
- When the touch sensor 11 detects contact (a touch operation) by a pressing object such as the operator's finger or a stylus pen in step S11, the process proceeds to step S12.
- In step S12, the control unit 15 determines whether the data based on the pressure on the touch sensor 11, which increases as the operator presses the touch sensor 11, satisfies a predetermined criterion.
- The control unit 15 can use, for example, the average value of the outputs of the four strain gauge sensors 31 of the press detection unit 12 as the data based on the pressure.
- It is preferable that the predetermined criterion for the data based on the pressure is set in advance to a value slightly higher than the pressure of the operator's normal pressing operation, and that this setting can be changed afterwards.
- To prevent data based on an unintended light press from being determined to satisfy the predetermined criterion, the criterion is not set to an excessively low value. Further, in order to give the operator the pressure sensation needed for the realistic tactile sensation described later, the criterion is not set excessively low relative to a normal intentional press (for example, its average value).
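The averaging and criterion check above can be sketched as follows. The numeric values are assumptions chosen to illustrate a criterion "slightly higher" than a normal press, not values from the patent.

```python
# Illustrative sketch: data based on pressure is the mean of four strain
# gauge outputs, compared against a criterion set slightly above a normal
# press. NORMAL_PRESS and the 1.5x factor are assumptions.

NORMAL_PRESS = 1.0              # assumed data value of a normal press
CRITERION = NORMAL_PRESS * 1.5  # "slightly higher" predetermined criterion

def pressure_data(gauge_outputs):
    """Data based on pressure: the mean of the four strain gauge sensors."""
    return sum(gauge_outputs) / len(gauge_outputs)

def satisfies_criterion(gauge_outputs):
    return pressure_data(gauge_outputs) >= CRITERION

print(satisfies_criterion([1.0, 1.1, 0.9, 1.0]))  # normal touch → False
print(satisfies_criterion([1.6, 1.7, 1.5, 1.6]))  # deliberate press → True
```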
- Even if the touch sensor 11 detects contact in step S11, if the control unit 15 has not detected data based on a pressure satisfying the predetermined criterion in step S12, the process proceeds to step S13.
- In step S13, the control unit 15 determines whether the contact position detected by the touch sensor 11 corresponds to the position of an object displayed on the display unit 14. If it does not, the operator has not touched a position at which predetermined control by the application is performed, and the control unit 15 returns to step S11 and continues the process. If the contact position does correspond to the position of an object, the control unit 15 determines whether the touch sensor 11 has stopped detecting the contact (step S14).
- If the contact has not yet been released in step S14 (that is, if the operator is still touching the touch sensor 11), the control unit 15 returns to step S11 and continues the process.
- When the touch sensor 11 stops detecting the contact (that is, when the contact is released), the control unit 15 performs the process corresponding to the object at the position where the contact was detected (step S15).
- The process at this time is defined by the application being executed in the foreground. For example, when release of contact is detected at the position of an object that accepts input of numbers, characters, and the like during execution of an e-mail application, a process for displaying those numbers or characters on the display unit 14 is performed. When release of contact is detected at the position of an icon object that starts an application, a process for starting that application is performed.
- In this way, the electronic device 1 can perform the process corresponding to an object by detecting release of contact with the touch sensor 11 at the position of the object displayed on the display unit 14 (that is, by detecting a tap).
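The flow of steps S11 to S15 can be sketched as a single decision function. This is a simplified illustrative model; the function name, the position/object representation, and the returned labels are all assumptions.

```python
# Minimal sketch of steps S11–S15: contact is monitored, and if no
# criterion-satisfying pressure occurs, releasing contact on an object
# triggers that object's process (a "tap"). Names are illustrative.

def handle_touch(contact_pos, objects, pressure_ok, released):
    """Return the action the control unit would take.

    objects: dict mapping position -> process name
    pressure_ok: data based on pressure satisfied the criterion (step S12)
    released: the contact was released (step S14)
    """
    if pressure_ok:
        return "application_control"          # proceed to steps S16–S19
    if contact_pos not in objects:
        return "ignore"                       # step S13: not on an object
    if not released:
        return "keep_monitoring"              # step S14: still touching
    return objects[contact_pos]               # step S15: object's process

objs = {(3, 5): "launch_mail_app"}
print(handle_touch((3, 5), objs, pressure_ok=False, released=True))
# → launch_mail_app
```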
- On the other hand, when the touch sensor 11 detects contact in step S11 and the control unit 15 detects data based on a pressure satisfying the predetermined criterion in step S12, the process proceeds to step S16.
- In step S16, in order to notify the operator that a deliberate pressing operation, as distinct from a normal touch operation, has been detected, the control unit 15 drives the tactile sensation providing unit 13 with a predetermined drive signal and vibrates the touch sensor 11 with a preset vibration pattern to present a first tactile sensation.
- That is, the control unit 15 controls the drive of the tactile sensation providing unit 13 so as to present a tactile sensation to the pressing object pressing the touch surface.
- For this purpose, the tactile sensation providing unit 13 drives the two piezoelectric vibrators 32 in the same phase, for example.
- In step S16, in order to present a realistic tactile sensation to the operator, the electronic device 1 stimulates the tactile sense while the operator's pressure sense is being stimulated. That is, the electronic device 1 stimulates the pressure sense until the data based on the pressure on the touch sensor 11 satisfies the criterion for presenting the tactile sensation, and at that point drives the tactile sensation providing unit 13 with a predetermined drive signal to vibrate the touch surface 11a and stimulate the tactile sense. As a result, the electronic device 1 can present to the operator a tactile sensation similar to that obtained when a button switch such as a push-button switch is pressed.
- Accordingly, the operator can perform input operations on the touch sensor 11 while obtaining the same realistic tactile sensation as when operating an actual push-button switch. Further, since the operation is performed in conjunction with the consciousness of "pressing" the touch sensor 11, operation errors can be prevented.
- The drive signal for presenting the tactile sensation described above, that is, the constant frequency, period (wavelength), waveform, and amplitude used to stimulate the tactile sense, can be set appropriately according to the tactile sensation to be presented.
- For example, the tactile sensation providing unit 13 is driven by a drive signal of one cycle of a sine wave having a constant frequency of 170 Hz.
- When the tactile sensation providing unit 13 is driven by such a drive signal and the touch surface 11a is vibrated by about 15 μm while the reference pressure Pth is applied, a realistic tactile sensation can be presented to the operator as if an actual key had been pressed.
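The one-cycle 170 Hz sine drive signal described above can be generated as follows. The sample rate and the direct micrometre scaling of the amplitude are assumptions for illustration; an actual device would map the signal through its driver electronics.

```python
import math

# Sketch of a one-cycle, 170 Hz sine drive signal sampled for a DAC.
# SAMPLE_RATE and the amplitude scaling are assumptions, not patent values.

FREQ_HZ = 170.0
SAMPLE_RATE = 48_000
AMPLITUDE_UM = 15.0  # approximate peak displacement of the touch surface

def one_cycle_drive_signal():
    n_samples = int(SAMPLE_RATE / FREQ_HZ)  # samples in one 170 Hz period
    return [AMPLITUDE_UM * math.sin(2 * math.pi * FREQ_HZ * i / SAMPLE_RATE)
            for i in range(n_samples)]

sig = one_cycle_drive_signal()
print(len(sig))                  # ≈ 282 samples for one period at 48 kHz
print(max(sig) <= AMPLITUDE_UM)  # peak never exceeds the target amplitude
```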
- FIG. 4 is a flowchart illustrating details of the application control process in step S17 in FIG.
- In the application control process, the control unit 15 first determines whether there is an application running in the background in addition to the application running in the foreground (step S21).
- If there is no application running in the background in step S21, there is no application to be switched to foreground execution. In this case, the control unit 15 notifies the operator to that effect, for example by displaying on the display unit 14 an indication that no application is running in the background (step S22). At this time, the control unit 15 may also drive the tactile sensation providing unit 13 to present a tactile sensation different from the button-press sensation described above, thereby notifying the operator. After notifying the operator in step S22, the control unit 15 ends the application control process of step S17.
- If there is an application running in the background in step S21, the control unit 15 shifts the application currently running in the foreground to background execution (step S23). Next, the control unit 15 switches an application running in the background to foreground execution (step S24), and then ends the application control process of step S17.
- In this way, in step S24, the control unit 15 performs control to switch one of the applications running in the background to run in the foreground.
- If only one application is running in the background, the application to be switched to foreground execution is naturally identified. However, if a plurality of applications are running in the background, the control unit 15 must specify which of them should be switched to foreground execution.
- various methods can be envisaged as a method for identifying the application to be executed in the foreground among the plurality of applications being executed in the background.
- For example, while a specific application is running in the foreground, one application that is likely to be used in conjunction with it (or simultaneously) can be specified in advance, and in the process of step S17 the specified application is run in the foreground.
- Alternatively, when a plurality of applications are running in the background, the order in which they shift to foreground execution can be defined in advance as a predetermined order. In this case, each time the process of step S17 occurs, the control unit 15 performs control to switch them sequentially to foreground execution according to this order.
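The sequential switching in a predetermined order can be sketched as a simple rotation: each criterion-satisfying press sends the foreground application to the back of the background queue and brings the next background application forward. The class and method names are illustrative.

```python
from collections import deque

# Sketch of the "predetermined order" switching: each criterion-satisfying
# press rotates the next background application into the foreground.

class AppSwitcher:
    def __init__(self, foreground, background_order):
        self.foreground = foreground
        self.background = deque(background_order)  # predetermined order

    def on_pressure_event(self):
        if not self.background:
            return "no background application"       # step S22
        self.background.append(self.foreground)      # step S23: fg -> bg
        self.foreground = self.background.popleft()  # step S24: next bg -> fg
        return self.foreground

sw = AppSwitcher("schedule", ["mail", "music"])
print(sw.on_pressure_event())  # → mail
print(sw.on_pressure_event())  # → music
print(sw.on_pressure_event())  # → schedule
```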
- Alternatively, when the control unit 15 detects data based on a pressure satisfying a predetermined criterion in step S12, the data based on the pressure can be detected in stages. That is, for the data based on the pressure, criteria of predetermined stages set progressively higher, such as a first criterion, a second criterion, and a third criterion, are specified in advance. In this case, a measure can be taken in which the level of the data based on the pressure is confirmed after a predetermined short period (for example, one second) has elapsed from when the data first satisfies a criterion.
- A corresponding application is defined in advance for each of these stage criteria. For example, the schedule management application corresponds to the first-stage criterion for the data based on the pressure, the e-mail application to the second-stage criterion, and the music player application to the third-stage criterion.
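The stage-to-application mapping can be sketched as follows, selecting the application of the highest stage criterion the data satisfies. The threshold values themselves are assumptions; only the three-application example comes from the text above.

```python
# Sketch of stage criteria mapped to applications, as in the example above.
# Threshold values are illustrative assumptions.

STAGE_CRITERIA = [                  # (minimum data value, application)
    (1.5, "schedule management"),   # first-stage criterion
    (2.5, "e-mail function"),       # second-stage criterion
    (3.5, "music player"),          # third-stage criterion
]

def app_for_pressure(data):
    """Return the application of the highest stage criterion satisfied."""
    selected = None
    for threshold, app in STAGE_CRITERIA:  # thresholds ascend
        if data >= threshold:
            selected = app
    return selected

print(app_for_pressure(1.6))  # → schedule management
print(app_for_pressure(2.8))  # → e-mail function
print(app_for_pressure(4.0))  # → music player
print(app_for_pressure(1.0))  # → None (no stage criterion satisfied)
```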
- In this way, when the control unit 15 detects data based on a pressure satisfying a criterion of a predetermined stage in step S12, in step S17 the control unit 15 can perform control to switch, among the applications running in the background, the one corresponding to that stage criterion to run in the foreground.
- After performing the application control process in step S17, the control unit 15 determines whether the data based on the pressure on the touch sensor 11 has fallen below a predetermined criterion (step S18). That is, in step S18 it is determined whether the operator has weakened the pressing force on the touch sensor 11.
- When the data based on the pressure on the touch sensor 11 falls below the predetermined criterion, the control unit 15 drives the tactile sensation providing unit 13 with a predetermined drive signal and vibrates the touch sensor 11 with a preset vibration pattern to present a second tactile sensation (step S19).
- the tactile sensation presented here is for informing the operator that the application control process has been properly executed.
- This second tactile sensation can be the same as the first tactile sensation presented in step S16.
- However, if the first tactile sensation is the button-press sensation described above, the second tactile sensation may differ from the first and may instead be the sensation of a finger or the like being released from the button.
- the frequency of the drive signal when presenting the first tactile sensation can be 170 Hz
- the frequency of the drive signal when presenting the second tactile sensation can be 125 Hz.
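The two drive signals mentioned above (170 Hz for the press sensation, 125 Hz for the release sensation) can be sketched as sinusoidal bursts. This is an illustrative sketch only: the amplitude, burst duration, and sample rate are assumptions not stated in the text.

```python
import math

def drive_signal(freq_hz, duration_s=0.03, sample_rate=8000, amplitude=1.0):
    """Return one burst of a sinusoidal drive signal as a list of samples.
    duration_s, sample_rate, and amplitude are illustrative assumptions."""
    n = int(duration_s * sample_rate)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]

press_signal = drive_signal(170)    # first tactile sensation (button press)
release_signal = drive_signal(125)  # second tactile sensation (button release)
```

Driving the piezoelectric vibrator with the 170 Hz burst on press and the 125 Hz burst on release approximates the two distinct sensations the text describes.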
- by driving the tactile sensation providing unit 13 with a predetermined drive signal in the same manner as for a press, and vibrating the touch sensor 11 with a preset vibration pattern, it is possible to present the tactile sensation of the press being released after a button has been pressed. Combined with the button-press sensation, this presents the operator with a realistic release sensation, closer to that of a physical push-button switch.
- FIG. 5 is a diagram for explaining processing by the electronic apparatus 1 according to the first embodiment.
- referring to FIG. 5(A), an example in which the electronic device 1 is executing a schedule management application in the foreground will be described.
- the electronic device 1 is executing an electronic mail function application in the background.
- since the electronic device 1 is executing the schedule management application in the foreground, it performs processing corresponding to the operator's touch operations within that application. That is, in the state shown in FIG. 5(A), the operator performs a touch operation on the touch sensor 11 of the electronic device 1 (while objects are displayed on the display unit 14), thereby operating the schedule management application.
- by touching the location of a date, the operator can display the details of that date's schedule or enter a schedule item, such as an event, for that date.
- when the touch surface of the touch sensor 11 is pressed and data based on pressure satisfying the predetermined criterion is detected, the electronic device 1 executes the e-mail application, previously running in the background, in the foreground, as shown in FIG. 5(B). That is, when the operator wants to operate the e-mail application running in the background while the schedule management application runs in the foreground as in FIG. 5(A), it is only necessary to press any position of the touch sensor 11 hard enough that data based on pressure satisfying the predetermined criterion is detected.
- the e-mail application is switched to the foreground execution as shown in FIG. 5B, so that the operator can immediately perform an operation on the application being executed in the background.
- the electronic device 1 then performs processing corresponding to the operator's touch operations within the e-mail application. For example, upon receiving a character input operation from the operator, the electronic device 1 can display that character.
- when the touch surface of the touch sensor 11 is pressed again in the state shown in FIG. 5(B) and the control unit 15 detects data based on pressure satisfying the predetermined criterion, the electronic device 1 returns the schedule management application, now running in the background, to foreground execution, as shown in FIG. 5(A).
- the order in which applications running in the background are shifted to the foreground can be defined in advance as a predetermined order.
- each time the control unit 15 detects data based on pressure satisfying the predetermined criterion, the electronic device 1 switches the next application, according to the predetermined order, to foreground execution. The operator can thus quickly cycle the applications running in the background into the foreground one after another.
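The sequential switching just described can be sketched as a simple rotation: each qualifying press swaps the foreground application with the next background application in the predetermined order. The application names are illustrative assumptions.

```python
from collections import deque

class AppSwitcher:
    """Sketch of cycling background apps into the foreground in a
    predetermined order. App names are illustrative, not from the patent."""

    def __init__(self, foreground, background_order):
        self.foreground = foreground
        self.background = deque(background_order)  # predetermined order

    def on_press_detected(self):
        """Called when data based on pressure satisfying the criterion is
        detected: swap the foreground app with the next background app."""
        if not self.background:
            return self.foreground  # nothing to switch to
        next_app = self.background.popleft()
        self.background.append(self.foreground)
        self.foreground = next_app
        return self.foreground
```

Starting with the schedule application in the foreground and the e-mail and music applications in the background, three successive presses would cycle through e-mail, music, and back to schedule.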
- when the control unit 15 detects data based on pressure satisfying a predetermined criterion, the data can be classified into stages, such as a first stage, a second stage, and so on, and
- a corresponding application can be defined in advance for each of these staged criteria.
- when the electronic device 1 detects data based on pressure satisfying one of the staged criteria as the touch surface of the touch sensor 11 is pressed, it switches the background application corresponding to that staged criterion with the one running in the foreground.
- if the schedule management application corresponds to the first-stage criterion for data based on the press,
- the electronic device 1 switches the schedule management application to foreground execution; and
- if the e-mail function application corresponds to the second-stage criterion,
- the electronic device 1 switches the e-mail function application to foreground execution.
- when the electronic device 1 according to the first embodiment described above detects data based on pressure satisfying a predetermined criterion, it switches one of the applications running in the background with the one running in the foreground. In contrast, when the electronic device 1 according to the second embodiment detects such data, it leaves the application running in the foreground as it is and causes one of the applications running in the background to execute a predetermined process.
- FIG. 6 is a flowchart for explaining the details of the application control process in step S17 in FIG. 3 in the present embodiment.
- when the application control process in step S17 starts, the control unit 15 first determines whether any application is running in the background in addition to the application running in the foreground (step S31).
- if no application is running in the background in step S31, there is no application to be controlled. In this case, the control unit 15 notifies the operator of this, for example, by showing a message such as "no application is running in the background" on the display unit 14 (step S32). At this time, the control unit 15 may also drive the tactile sensation providing unit 13 to present a tactile sensation different from the button-press sensation, so as to notify the operator. After notifying the operator in step S32, the control unit 15 ends the application control process of step S17.
- if an application is running in the background, the control unit 15 controls it to execute a predetermined process while keeping the application currently running in the foreground unchanged (step S33). The control unit 15 then ends the application control process of step S17.
- in step S33, the control unit 15 controls one of the applications running in the background to execute a predetermined process.
- if only one application is running in the background, the application to be controlled is naturally identified.
- the predetermined process to be executed by the application running in the background is specified in advance.
- for example, if the application running in the background is an alarm application, the predetermined process described above can be a process of stopping the alarm.
- if the application running in the background is a music player,
- the predetermined process described above can be a process of stopping the music being played, starting playback when it is stopped, or skipping to the song after the one being played.
- for example, a predetermined process performed when data based on pressure satisfying the criterion is detected only briefly can be skipping to the song after the one being played, while
- a predetermined process performed when such data is detected for a relatively long time can be stopping the music being played by the music player.
- it is also possible to assign a different predetermined process depending on the number of times data based on the pressure is detected within a predetermined short period, such as two quick detections (a double tap) or
- three quick detections (a triple tap).
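The press-pattern variations just described (short versus long detection, double versus triple tap) can be sketched as a small dispatcher. The window length and the processes assigned to the double and triple taps are illustrative assumptions; only the short-press/long-press assignments come from the examples in the text.

```python
def select_process(press_duration_s, tap_count, long_press_s=0.8):
    """Map a qualifying press pattern to a music-player process.
    long_press_s and the tap-count assignments are illustrative assumptions."""
    if tap_count >= 3:
        return "previous_track"   # triple tap (illustrative assignment)
    if tap_count == 2:
        return "toggle_playback"  # double tap (illustrative assignment)
    if press_duration_s >= long_press_s:
        return "stop"             # relatively long detection, per the text
    return "next_track"           # short detection, per the text
```

A device could feed this function the measured duration and the count of qualifying detections within the predetermined short period.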
- when there are a plurality of applications running in the background, the control unit 15 must identify which of them is to execute the predetermined process.
- as in the first embodiment described above, various methods can be envisaged for identifying which of a plurality of background applications is to execute the predetermined process.
- in step S17, the control unit 15 controls the identified application to execute the predetermined process while it remains running in the background.
- the order in which background applications perform the predetermined process can also be defined in advance as a predetermined order. In this case, each time the process of step S17 occurs, the control unit 15 controls the next background application, according to this order, to perform the predetermined process, without changing the application running in the foreground.
- when the control unit 15 detects data based on pressure satisfying a predetermined criterion in step S12, the data can be classified into stages. That is, for data based on pressure satisfying the predetermined criterion, a plurality of staged criteria set progressively higher are defined in advance, and an application corresponding to each staged criterion can also be specified in advance.
- for example, the schedule management application can correspond to the first-stage criterion for data based on the pressure,
- the e-mail function application to the second-stage criterion, and
- the music player application to the third-stage criterion.
- when the control unit 15 detects data based on pressure satisfying one of the staged criteria in step S12, it controls, in step S17, the background application corresponding to that staged criterion to execute a predetermined process.
- FIG. 7 is a diagram for explaining processing by the electronic apparatus 1 according to the second embodiment.
- FIG. 7A shows a state in which an operator executes an application of a web browser by the electronic device 1 and browses a news article distribution site on the Internet.
- the electronic device 1 is executing a music player application in the background.
- the electronic device 1 is executing a music player application in the background, and the music of the song A is output from the earphone 50.
- since the electronic device 1 is executing the web browser application in the foreground, it performs processing corresponding to the operator's touch operations within the web browser application. That is, in the state shown in FIG. 7(A), the operator performs a touch operation on the touch sensor 11 of the electronic device 1 (while objects are displayed on the display unit 14), thereby causing the web browser application to execute a process. For example, the operator can view the details of a news article by touching the object displaying its headline.
- when the touch surface of the touch sensor 11 is pressed and data based on pressure satisfying the predetermined criterion is detected, the music player application running in the background is caused to execute the predetermined process.
- here, skipping to the song after the one being played is defined as the predetermined process. Accordingly, when the music player application running in the background performs this process, song B, the song following song A, is output from the earphone 50, as shown in FIG. 7(B).
- the music player application can thus be operated as shown in FIG. 7(B), so the operator can immediately operate an application running in the background.
- in FIG. 7, the example in which the process of skipping to the next song is executed while the music player application runs in the background has been described.
- however, various other processes can be executed as the specified predetermined process, and
- which predetermined processes are plausible varies depending on the application running in the background.
- in the third embodiment, control of the application running in the foreground is not triggered merely by the touch sensor 11 detecting contact;
- instead, such processing is executed upon detecting data based on pressure satisfying a first criterion, and
- control of the application running in the background is performed upon detecting data based on pressure satisfying a second criterion higher than the first.
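The two-criterion scheme can be sketched as a classifier over the detected pressure data: light contact is ignored, a press satisfying the first criterion operates the foreground application, and a harder press satisfying the second criterion triggers background-application control. The numeric thresholds are illustrative assumptions.

```python
# Illustrative thresholds in arbitrary pressure-data units (assumptions,
# not values from the patent).
FIRST_CRITERION = 1.0   # normal, intentional pressing operation
SECOND_CRITERION = 2.0  # deliberately harder press

def classify_press(pressure_data):
    """Classify detected pressure data per the third embodiment's scheme."""
    if pressure_data >= SECOND_CRITERION:
        return "control_background_app"   # steps S57-S58
    if pressure_data >= FIRST_CRITERION:
        return "operate_foreground_app"   # steps S55, S62
    return "ignore"  # mere light contact is not a substantive operation
```

Because light contact falls below both criteria, accidental touches produce no action, which is the operation-mistake reduction the embodiment aims at.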
- FIG. 8 is a flowchart illustrating application control according to an operation on the electronic device 1 according to the third embodiment.
- the electronic device 1 according to the present embodiment accepts normal operations based on data based on pressure satisfying the first criterion on the touch sensor 11, and, when it detects data based on pressure satisfying the second criterion, switches one of the applications running in the background with the one running in the foreground. Note that the electronic device 1 according to the present embodiment does not execute any specific process for the foreground or background applications at the stage where mere contact with the touch sensor 11 is detected.
- when the processing shown in FIG. 8 starts, it is assumed that the electronic device 1 has already started some application in the foreground through a normal process such as launching the application.
- the application to be activated here may be any application such as a schedule management application or an electronic mail function application.
- before the processing shown in the flowchart of FIG. 8 starts, that is, before a pressing operation by the operator is detected as data based on pressure satisfying the first criterion on the touch surface of the touch sensor 11, the display unit 14 of the electronic device 1 displays objects such as icons for launching applications and keys for performing operations.
- the object in the present embodiment can be an image that suggests to the operator the position (region) at which the pressing operation is received. For example, an image representing a state in which numbers and characters are drawn on the key top is displayed on the display unit 14 as an object image.
- the object in the present embodiment can be an image that suggests to the operator the position (region) for detecting contact for performing an operation on the application being executed in the foreground.
- an icon image indicating an application to be activated is displayed on the display unit 14 as an object image. At this time, what kind of object is displayed on the display unit 14 depends on the application being executed in the foreground at that time.
- when the processing of the electronic device 1 according to the present embodiment starts, the control unit 15 monitors for contact with the touch sensor 11 and monitors the data based on the pressure on the touch sensor 11 (step S51).
- when the touch sensor 11 detects contact (a touch operation) by a pressing object such as the operator's finger or a stylus pen in step S51, the process proceeds to step S52.
- in step S52, the control unit 15 determines whether the data based on the pressure on the touch sensor 11 has increased due to the operator's press and satisfies the first criterion.
- the data based on the pressure satisfying the first standard is set in advance based on the pressure when the operator performs a normal pressing operation, and the setting can be changed thereafter.
- so that data based on an unintended light press is not determined to satisfy the first criterion, the first criterion is not set to an excessively low value; rather, it is set in consideration of the normal load (for example, an average value) of a press made with the operator's intention.
- even if the touch sensor 11 detects contact in step S51, if the control unit 15 does not detect data based on pressure satisfying the first criterion in step S52, the process returns to step S51.
- when the control unit 15 detects data based on pressure satisfying the first criterion in step S52, it determines whether the contact position detected by the touch sensor 11 corresponds to the position of an object displayed on the display unit 14 (step S53).
- if the contact position detected by the touch sensor 11 does not correspond to the position of an object in step S53, the operator has not pressed a position at which predetermined control by the foreground application is performed.
- the control unit 15 determines whether or not the data based on the pressure on the touch sensor 11 is further increased by the press of the touch sensor 11 by the operator and satisfies the second standard (step S54).
- it is preferable that the data based on pressure satisfying the second criterion be set in advance to a somewhat higher level than the pressure of a normal pressing operation by the operator, and that this setting can be changed thereafter.
- if the second criterion is not satisfied, the control unit 15 returns to step S52 and continues processing.
- when the contact position detected by the touch sensor 11 corresponds to the position of an object in step S53, the control unit 15 informs the operator that an operation based on data based on pressure satisfying the first criterion, made with the operator's intention, has been detected. To do so, the control unit 15 drives the tactile sensation providing unit 13 with a predetermined drive signal to vibrate the touch sensor 11 with a preset vibration pattern, presenting a first tactile sensation (step S55).
- the first tactile sensation presented in step S55 can be, for example, a tactile sensation when the button described in the first embodiment is pressed.
- next, the control unit 15 determines whether the data based on the pressure on the touch sensor 11 has increased further due to the operator's press and satisfies the second criterion (step S56). If it is determined in step S56 that it does, the control unit 15 drives the tactile sensation providing unit 13 with a predetermined drive signal to present a tactile sensation, informing the operator that an operation based on data based on pressure satisfying the second criterion has been detected (step S57).
- the tactile sensation presented in step S57 may be, for example, the button-press sensation as in step S55; however, to inform the operator that an operation different from one based on data satisfying the first criterion has been accepted, it is preferably a sensation different from both the button-press sensation and the sensation of a finger or the like being released from the button.
- next, the control unit 15 performs an application control process (step S58).
- the application control process performed in step S58 is the same as the application control process in step S17 described in FIG. 3, and more specifically, the same process as described in FIG. 4 or FIG. 6 is performed. That is, in an aspect in which any application running in the background is switched to foreground execution, the application control process shown in FIG. 4 is performed. Further, in a mode in which a predetermined process is executed in the background for any application being executed in the background, the application control process illustrated in FIG. 6 is performed.
- control unit 15 determines whether or not the data based on the pressure on the touch sensor 11 falls below the first reference (step S59). That is, in step S59, it is determined whether or not the operator has weakened the pressing force on the touch sensor 11.
- when the data based on the pressure falls below the first criterion, the control unit 15 drives the tactile sensation providing unit 13 with a predetermined drive signal to vibrate the touch sensor 11 with a preset vibration pattern, presenting a second tactile sensation (step S60).
- the tactile sensation presented here is for informing the operator that the application control process has been properly executed.
- the second tactile sensation can be the same as the first tactile sensation presented in step S55, as in the first embodiment.
- the first tactile sensation is a tactile sensation of pressing a button
- the second tactile sensation may be different from the first tactile sensation and may be a tactile sensation when a finger or the like is released from the button.
- when it is determined in step S56 that the data based on the pressure on the touch sensor 11 does not satisfy the second criterion, the control unit 15 determines whether the data has fallen below the first criterion (step S61). That is, in step S61, it is determined whether the operator has weakened the pressing force on the touch sensor 11.
- when it has, the control unit 15 performs the process corresponding to the object at the position where the pressing operation was detected (step S62).
- the processing corresponding to the object performed in step S62 is the same as the processing in step S15 described in FIG. In other words, the processing at this time is defined by the application being executed in the foreground at that time. If the process corresponding to the object is performed in step S62, the control unit 15 proceeds to step S60, and thereafter performs the process when the pressing operation of the operator is released.
- when it is determined in step S54 that the data based on the pressure satisfies the second criterion, the application control process of step S58 is likewise performed.
- as described above, the electronic device 1 according to the present embodiment can execute the application control process upon receiving a pressing operation, as data based on pressure satisfying the second criterion, at an arbitrary position of the touch sensor 11. Therefore, when an operator wants to control an application running in the background, it is only necessary to press any position of the touch sensor 11 hard enough that data based on pressure satisfying the second criterion is detected.
- the same effect as that of the first or second embodiment described above can be obtained as control for the application being executed in the background.
- an operation in which something merely touches the sensor lightly is not treated as a substantive operation. The electronic device 1 according to the present embodiment can therefore reduce operation mistakes by the operator.
- in the fourth embodiment, processing is performed according to the flowchart described in FIG. 3 or FIG. 8, and the application control process shown in FIG. 9 is executed in step S17 of FIG. 3 or step S58 of FIG. 8.
- a single-task electronic device cannot execute a plurality of applications simultaneously. Therefore, in the case of a single-task electronic device, for each specific application, an application highly likely to be linked with it (that is, executed by switching) is defined in advance, and the processing in step S17 or step S58 causes that specified application to be executed.
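The single-task variant can be sketched as a lookup in a predefined link table followed by terminate-then-launch, mirroring steps S71 through S74 below. The link table and application names are illustrative assumptions, and the log stands in for the device's notifications.

```python
# Predefined "likely to be linked" applications (illustrative assumption).
LINKED_APP = {
    "schedule": "email",
    "email": "schedule",
}

def switch_single_task(running_app, log):
    """Sketch of FIG. 9: check for a linked app (S71), notify if none (S72),
    otherwise terminate the current app (S73) and start the linked one (S74)."""
    linked = LINKED_APP.get(running_app)
    if linked is None:
        log.append("no application to switch to")  # S72 notification
        return running_app
    log.append(f"terminate {running_app}")  # S73
    log.append(f"start {linked}")           # S74
    return linked
```

Unlike the multitask embodiments, nothing keeps running in the background here: the switch is a full terminate-and-restart of the linked application.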
- when the control unit 15 detects data based on pressure satisfying a predetermined criterion,
- the data can be classified into stages, and an application corresponding to each staged criterion may also be specified in advance.
- FIG. 9 is a flowchart illustrating application control processing performed in step S17 of FIG. 3 or step S58 of FIG. 8 when the single-task electronic device performs processing according to the flowchart described in FIG. 3 or FIG.
- when the application control process according to the present embodiment starts, the control unit 15 first determines whether, in addition to the application being executed, an application defined to be executed by switching to it exists (step S71).
- if there is no application to switch to in step S71, the control unit 15 notifies the operator of this, for example, by showing a message such as "no application to switch to" on the display unit 14 (step S72). At this time, the control unit 15 may also drive the tactile sensation providing unit 13 to present a tactile sensation different from the button-press sensation described above, so as to notify the operator that there is no application to switch to. After notifying the operator in step S72, the control unit 15 ends the application control process according to the present embodiment.
- when there is an application to switch to in step S71, the control unit 15 first terminates the currently executing application (step S73), then starts the application defined to be executed by switching (step S74), and ends the application control process according to the present embodiment.
- the present invention is not limited to the above embodiment, and many variations or modifications are possible.
- the control of an application being executed in the background in the multitasking electronic device has been described.
- the present invention need not rely on the distinction between foreground and background execution; applications can instead be switched based on whether their execution is active or inactive.
- that is, for example, in the first embodiment described above, when the control unit 15 detects data based on pressure satisfying a predetermined criterion, it can control the device to activate one of the inactive applications. In this case, each time such data is detected, the control unit 15 can also sequentially activate the inactive applications according to a predetermined order. Further, when the control unit 15 detects data based on pressure satisfying a staged criterion, it can control the device to activate the inactive application corresponding to that staged criterion.
- likewise, inactive applications can be controlled based on the active/inactive distinction rather than the foreground/background distinction. That is, for example, in the second embodiment described above, when the control unit 15 detects data based on pressure satisfying a predetermined criterion, it can control one of the inactive applications to execute a predetermined process. Further, when the control unit 15 detects data based on pressure satisfying a staged criterion, it can control the inactive application corresponding to that staged criterion to execute a predetermined process.
- the press detection unit in each of the above-described embodiments can be configured using an arbitrary number of strain gauge sensors.
- the press detection unit can take various structures depending on the contact detection method of the touch panel.
- in the case of the resistive film method, the press detection unit can be configured without using a strain gauge sensor, a piezoelectric element, or the like, by associating the magnitude of the resistance, which corresponds to the size of the contact area, with the pressing load (force) on the touch surface of the touch panel.
- in the case of the capacitance method, the press detection unit can likewise be configured without a strain gauge sensor or a piezoelectric element, by associating the magnitude of the capacitance with the pressing load (force) on the touch surface of the touch panel.
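The sensor-less load estimation described in the two items above can be sketched as a calibration-table lookup: as a fingertip is pressed harder its contact area grows, so the measured resistance or capacitance can be mapped to an estimated load. The calibration values below are illustrative assumptions.

```python
# Illustrative calibration pairs (measured capacitance in pF, estimated
# load in N) — assumptions for the sketch, not values from the patent.
CALIBRATION = [
    (10.0, 0.0),
    (20.0, 1.0),
    (30.0, 2.0),
]

def load_from_capacitance(cap_pf):
    """Estimate pressing load by piecewise-linear interpolation over the
    calibration table, clamping outside its range."""
    if cap_pf <= CALIBRATION[0][0]:
        return CALIBRATION[0][1]
    for (c0, l0), (c1, l1) in zip(CALIBRATION, CALIBRATION[1:]):
        if cap_pf <= c1:
            return l0 + (l1 - l0) * (cap_pf - c0) / (c1 - c0)
    return CALIBRATION[-1][1]
```

The same structure works for the resistive film method by replacing the capacitance column with measured resistance.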
- the tactile sensation providing unit may be configured using an arbitrary number of piezoelectric vibrators, by providing a transparent piezoelectric element over the entire surface of the touch sensor, or, as long as vibration presenting a tactile sensation can be expressed, using an eccentric motor that rotates once per cycle of the drive signal.
- when both the press detection unit and the tactile sensation providing unit are configured using piezoelectric elements,
- they can share the same piezoelectric element, forming a combined press detection/tactile sensation providing unit.
- the electronic device drives the tactile sensation providing unit when the data based on the pressure applied to the touch sensor 11 satisfies the standard for presenting the tactile sensation.
- the data based on the pressure on the touch sensor 11 may be considered to satisfy the criterion for presenting a tactile sensation either when it reaches the reference value or when it exceeds the reference value.
- the case where the data based on the pressure on the touch sensor 11 satisfies the standard for presenting the tactile sensation may be when the reference value for presenting the tactile sensation is detected by the press detection unit.
- instead of, or together with, presenting a tactile sensation, the operator can be notified by sound.
Abstract
Description
The electronic device comprises:
a touch sensor;
a press detection unit that detects a press on the touch sensor; and
a control unit that, upon detecting data based on a press satisfying a predetermined criterion, performs control so that one of the application software running in the background is switched with one running in the foreground.
Each time it detects data based on a press satisfying the predetermined criterion, the control unit performs control so as to sequentially switch the application software running in the background, according to a predetermined order, with the one running in the foreground.
Upon detecting data based on a press satisfying a predetermined staged criterion, the control unit performs control so that, of the application software running in the background, the one corresponding to the predetermined staged criterion is switched with the one running in the foreground.
Another electronic device comprises:
a touch sensor;
a press detection unit that detects a press on the touch sensor; and
a control unit that, upon detecting data based on a press satisfying a predetermined criterion, performs control so that one of the application software running in the background executes a predetermined process.
Upon detecting data based on a press satisfying a predetermined staged criterion, the control unit performs control so that, of the application software running in the background, the one corresponding to the predetermined staged criterion executes the predetermined process.
Another electronic device comprises:
a touch sensor;
a press detection unit that detects a press on the touch sensor; and
a control unit that, upon detecting data based on a press satisfying a predetermined criterion, performs control to activate one of the inactive application software.
Each time it detects data based on a press satisfying the predetermined criterion, the control unit performs control so as to sequentially activate the inactive application software according to a predetermined order.
Upon detecting data based on a press satisfying a predetermined staged criterion, the control unit performs control to activate, of the inactive application software, the one corresponding to the predetermined staged criterion.
Another electronic device comprises:
a touch sensor;
a press detection unit that detects a press on the touch sensor; and
a control unit that, upon detecting data based on a press satisfying a predetermined criterion, performs control so that one of the inactive application software executes a predetermined process.
Upon detecting data based on a press satisfying a predetermined staged criterion, the control unit performs control so that, of the inactive application software, the one corresponding to the predetermined staged criterion executes the predetermined process.
FIG. 1 is a functional block diagram showing the schematic configuration of an electronic device provided with a touch sensor according to a first embodiment of the present invention. The electronic device according to this embodiment can be, for example, a portable terminal such as a mobile phone, a PDA (Personal Digital Assistant), a smartphone, a tablet-type electronic device, or a game machine, or a terminal device such as an ATM installed at a bank or a ticket vending machine installed at a station. That is, this embodiment can be applied to any electronic device that accepts an operator's touch operations via a touch sensor.
Next, a second embodiment of the present invention will be described. The second embodiment modifies the application control process of step S17 in FIG. 3 of the first embodiment described above. Since the electronic device according to the second embodiment can be implemented with the same configuration as the electronic device 1 described in the first embodiment, descriptions identical to those of the first embodiment are omitted as appropriate.
Next, a third embodiment of the present invention will be described. The third embodiment is based on the same technical idea as the first embodiment described above, with additional measures to prevent erroneous operations on the touch sensor 11. Since the electronic device according to the third embodiment can also be implemented with the same configuration as the electronic device 1 described in the first embodiment, descriptions identical to those of the first embodiment are omitted as appropriate.
The electronic devices 1 used in the first to third embodiments described above were all assumed to perform multitask processing. However, the present invention can also be applied to single-task electronic devices, not only multitask ones. Since the electronic device according to the fourth embodiment can be implemented by a single-task electronic device with the same configuration as the electronic device 1 described in the first embodiment, descriptions identical to those of the first embodiment are omitted as appropriate.
11a Touch surface
12 Press detection unit
13 Tactile sensation providing unit
14 Display unit
15 Control unit
21 Housing
22 Insulator
23 Upper cover
24 Insulator
31 Strain gauge sensor
32 Piezoelectric vibrator
50 Earphone
Claims (10)
- a touch sensor;
a press detection unit that detects a press on the touch sensor; and
a control unit that, upon detecting data based on a press satisfying a predetermined criterion, performs control so that one of the application software running in the background is switched with one running in the foreground;
an electronic device comprising the above. - The electronic device according to claim 1, wherein, each time it detects data based on a press satisfying the predetermined criterion, the control unit performs control so as to sequentially switch the application software running in the background, according to a predetermined order, with the one running in the foreground.
- The electronic device according to claim 1, wherein, upon detecting data based on a press satisfying a predetermined staged criterion, the control unit performs control so that, of the application software running in the background, the one corresponding to the predetermined staged criterion is switched with the one running in the foreground.
- a touch sensor;
a press detection unit that detects a press on the touch sensor; and
a control unit that, upon detecting data based on a press satisfying a predetermined criterion, performs control so that one of the application software running in the background executes a predetermined process;
an electronic device comprising the above. - The electronic device according to claim 4, wherein, upon detecting data based on a press satisfying a predetermined staged criterion, the control unit performs control so that, of the application software running in the background, the one corresponding to the predetermined staged criterion executes the predetermined process.
- a touch sensor;
a press detection unit that detects a press on the touch sensor; and
a control unit that, upon detecting data based on a press satisfying a predetermined criterion, performs control to activate one of the inactive application software;
an electronic device comprising the above. - The electronic device according to claim 6, wherein, each time it detects data based on a press satisfying the predetermined criterion, the control unit performs control so as to sequentially activate the inactive application software according to a predetermined order.
- The electronic device according to claim 6, wherein, upon detecting data based on a press satisfying a predetermined staged criterion, the control unit performs control to activate, of the inactive application software, the one corresponding to the predetermined staged criterion.
- a touch sensor;
a press detection unit that detects a press on the touch sensor; and
a control unit that, upon detecting data based on a press satisfying a predetermined criterion, performs control so that one of the inactive application software executes a predetermined process;
an electronic device comprising the above. - The electronic device according to claim 9, wherein, upon detecting data based on a press satisfying a predetermined staged criterion, the control unit performs control so that, of the inactive application software, the one corresponding to the predetermined staged criterion executes the predetermined process.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/001,152 US9727177B2 (en) | 2011-02-23 | 2012-02-23 | Electronic device with a touch sensor |
JP2013500904A JP5654114B2 (ja) | 2011-02-23 | 2012-02-23 | Electronic device with a touch sensor |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011-037117 | 2011-02-23 | ||
JP2011037117 | 2011-02-23 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012114760A1 true WO2012114760A1 (ja) | 2012-08-30 |
Family
ID=46720551
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/001248 WO2012114760A1 (ja) | Electronic device with a touch sensor | 2011-02-23 | 2012-02-23 |
Country Status (3)
Country | Link |
---|---|
US (1) | US9727177B2 (ja) |
JP (1) | JP5654114B2 (ja) |
WO (1) | WO2012114760A1 (ja) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6576245B2 (ja) * | 2013-05-08 | 2019-09-18 | Square Enix Holdings Co., Ltd. | Information processing apparatus, control method, and program |
KR102148948B1 (ko) * | 2013-12-06 | 2020-08-27 | Samsung Electronics Co., Ltd. | Multitasking method of electronic device and the electronic device |
US20150212610A1 (en) * | 2014-01-30 | 2015-07-30 | Samsung Display Co., Ltd. | Touch-in-touch display apparatus |
US9606633B2 (en) * | 2015-05-08 | 2017-03-28 | John W. Downey | Method and apparatus for input to electronic devices |
CN105045454B (zh) | 2015-08-27 | 2017-10-17 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Terminal anti-mistouch method and terminal |
EP3136215A1 (en) * | 2015-08-28 | 2017-03-01 | Nokia Technologies Oy | Responding to user input |
KR20170049991A (ko) * | 2015-10-29 | 2017-05-11 | Samsung Electronics Co., Ltd. | Method for providing user interaction using pressure touch and electronic device using the same |
CN107526506A (zh) * | 2017-08-28 | 2017-12-29 | Shanghai Transsion Information Technology Co., Ltd. | Player control method, control device, and mobile terminal |
GB2572611B (en) * | 2018-04-05 | 2020-08-19 | Sony Interactive Entertainment Europe Ltd | Computer game processing |
JP7218567B2 (ja) * | 2018-12-21 | 2023-02-07 | Kyocera Document Solutions Inc. | Information input device |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010067126A (ja) * | 2008-09-12 | 2010-03-25 | Sony Corp | Information processing apparatus and information processing method |
JP2010109587A (ja) * | 2008-10-29 | 2010-05-13 | Kyocera Corp | Portable electronic device |
JP2010118042A (ja) * | 2008-11-11 | 2010-05-27 | Pantech Co Ltd | Application control system and application control method for mobile terminal using gestures |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05181606A (ja) | 1991-12-27 | 1993-07-23 | Nec Corp | Touch panel device |
JP2001202192A (ja) | 2000-01-18 | 2001-07-27 | Sony Corp | Information processing apparatus and method, and program storage medium |
JP2002366277A (ja) * | 2001-06-11 | 2002-12-20 | Fujikura Ltd | Character input switch |
JP4434609B2 (ja) | 2002-03-29 | 2010-03-17 | Toshiba Corp | Display input system |
US8483765B2 (en) * | 2005-08-26 | 2013-07-09 | Kt Corporation | User terminal for performing a plurality of applications simultaneously |
US9575655B2 (en) * | 2006-12-29 | 2017-02-21 | Nokia Technologies Oy | Transparent layer application |
KR100881186B1 (ko) | 2007-01-04 | 2009-02-05 | Samsung Electronics Co., Ltd. | Touch screen display device |
JP5220352B2 (ja) | 2007-06-20 | 2013-06-26 | Kyocera Corp | Input terminal device and display control method therefor |
KR20090019161A (ko) * | 2007-08-20 | 2009-02-25 | Samsung Electronics Co., Ltd. | Electronic device and method of operating the same |
US8745514B1 (en) * | 2008-04-11 | 2014-06-03 | Perceptive Pixel, Inc. | Pressure-sensitive layering of displayed objects |
US8780054B2 (en) * | 2008-09-26 | 2014-07-15 | Lg Electronics Inc. | Mobile terminal and control method thereof |
JP5228755B2 (ja) * | 2008-09-29 | 2013-07-03 | Fujitsu Ltd | Mobile terminal device, display control method, and display control program |
JP5157969B2 (ja) * | 2009-03-09 | 2013-03-06 | Sony Corp | Information processing apparatus, threshold setting method, and program therefor |
KR101601040B1 (ko) * | 2009-05-19 | 2016-03-09 | Samsung Electronics Co., Ltd. | Screen display method of mobile terminal and mobile terminal supporting the same |
JP2011053974A (ja) * | 2009-09-02 | 2011-03-17 | Sony Corp | Operation control device, operation control method, and computer program |
US9052926B2 (en) * | 2010-04-07 | 2015-06-09 | Apple Inc. | Device, method, and graphical user interface for managing concurrently open software applications |
2012
- 2012-02-23 US US14/001,152 patent/US9727177B2/en active Active
- 2012-02-23 JP JP2013500904A patent/JP5654114B2/ja active Active
- 2012-02-23 WO PCT/JP2012/001248 patent/WO2012114760A1/ja active Application Filing
Cited By (116)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10664097B1 (en) | 2011-08-05 | 2020-05-26 | P4tents1, LLC | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10338736B1 (en) | 2011-08-05 | 2019-07-02 | P4tents1, LLC | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10275087B1 (en) | 2011-08-05 | 2019-04-30 | P4tents1, LLC | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10345961B1 (en) | 2011-08-05 | 2019-07-09 | P4tents1, LLC | Devices and methods for navigating between user interfaces |
US10365758B1 (en) | 2011-08-05 | 2019-07-30 | P4tents1, LLC | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10386960B1 (en) | 2011-08-05 | 2019-08-20 | P4tents1, LLC | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10540039B1 (en) | 2011-08-05 | 2020-01-21 | P4tents1, LLC | Devices and methods for navigating between user interface |
US10649571B1 (en) | 2011-08-05 | 2020-05-12 | P4tents1, LLC | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10656752B1 (en) | 2011-08-05 | 2020-05-19 | P4tents1, LLC | Gesture-equipped touch screen system, method, and computer program product |
US10042542B2 (en) | 2012-05-09 | 2018-08-07 | Apple Inc. | Device, method, and graphical user interface for moving and dropping a user interface object |
US10481690B2 (en) | 2012-05-09 | 2019-11-19 | Apple Inc. | Device, method, and graphical user interface for providing tactile feedback for media adjustment operations performed in a user interface |
US9823839B2 (en) | 2012-05-09 | 2017-11-21 | Apple Inc. | Device, method, and graphical user interface for displaying additional information in response to a user contact |
US11010027B2 (en) | 2012-05-09 | 2021-05-18 | Apple Inc. | Device, method, and graphical user interface for manipulating framed graphical objects |
US10996788B2 (en) | 2012-05-09 | 2021-05-04 | Apple Inc. | Device, method, and graphical user interface for transitioning between display states in response to a gesture |
US11947724B2 (en) | 2012-05-09 | 2024-04-02 | Apple Inc. | Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface |
US10969945B2 (en) | 2012-05-09 | 2021-04-06 | Apple Inc. | Device, method, and graphical user interface for selecting user interface objects |
US9886184B2 (en) | 2012-05-09 | 2018-02-06 | Apple Inc. | Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object |
US10942570B2 (en) | 2012-05-09 | 2021-03-09 | Apple Inc. | Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface |
US11354033B2 (en) | 2012-05-09 | 2022-06-07 | Apple Inc. | Device, method, and graphical user interface for managing icons in a user interface region |
US10908808B2 (en) | 2012-05-09 | 2021-02-02 | Apple Inc. | Device, method, and graphical user interface for displaying additional information in response to a user contact |
US10884591B2 (en) | 2012-05-09 | 2021-01-05 | Apple Inc. | Device, method, and graphical user interface for selecting object within a group of objects |
US11314407B2 (en) | 2012-05-09 | 2022-04-26 | Apple Inc. | Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object |
US11023116B2 (en) | 2012-05-09 | 2021-06-01 | Apple Inc. | Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input |
US9971499B2 (en) | 2012-05-09 | 2018-05-15 | Apple Inc. | Device, method, and graphical user interface for displaying content associated with a corresponding affordance |
US10782871B2 (en) | 2012-05-09 | 2020-09-22 | Apple Inc. | Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object |
US9990121B2 (en) | 2012-05-09 | 2018-06-05 | Apple Inc. | Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input |
US10175864B2 (en) | 2012-05-09 | 2019-01-08 | Apple Inc. | Device, method, and graphical user interface for selecting object within a group of objects in accordance with contact intensity |
US9996231B2 (en) | 2012-05-09 | 2018-06-12 | Apple Inc. | Device, method, and graphical user interface for manipulating framed graphical objects |
US10496260B2 (en) | 2012-05-09 | 2019-12-03 | Apple Inc. | Device, method, and graphical user interface for pressure-based alteration of controls in a user interface |
US10168826B2 (en) | 2012-05-09 | 2019-01-01 | Apple Inc. | Device, method, and graphical user interface for transitioning between display states in response to a gesture |
US10775999B2 (en) | 2012-05-09 | 2020-09-15 | Apple Inc. | Device, method, and graphical user interface for displaying user interface objects corresponding to an application |
US10775994B2 (en) | 2012-05-09 | 2020-09-15 | Apple Inc. | Device, method, and graphical user interface for moving and dropping a user interface object |
US11221675B2 (en) | 2012-05-09 | 2022-01-11 | Apple Inc. | Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface |
US10073615B2 (en) | 2012-05-09 | 2018-09-11 | Apple Inc. | Device, method, and graphical user interface for displaying user interface objects corresponding to an application |
US10191627B2 (en) | 2012-05-09 | 2019-01-29 | Apple Inc. | Device, method, and graphical user interface for manipulating framed graphical objects |
US11068153B2 (en) | 2012-05-09 | 2021-07-20 | Apple Inc. | Device, method, and graphical user interface for displaying user interface objects corresponding to an application |
US10095391B2 (en) | 2012-05-09 | 2018-10-09 | Apple Inc. | Device, method, and graphical user interface for selecting user interface objects |
US10592041B2 (en) | 2012-05-09 | 2020-03-17 | Apple Inc. | Device, method, and graphical user interface for transitioning between display states in response to a gesture |
US10114546B2 (en) | 2012-05-09 | 2018-10-30 | Apple Inc. | Device, method, and graphical user interface for displaying user interface objects corresponding to an application |
US10126930B2 (en) | 2012-05-09 | 2018-11-13 | Apple Inc. | Device, method, and graphical user interface for scrolling nested regions |
US9753639B2 (en) | 2012-05-09 | 2017-09-05 | Apple Inc. | Device, method, and graphical user interface for displaying content associated with a corresponding affordance |
US10175757B2 (en) | 2012-05-09 | 2019-01-08 | Apple Inc. | Device, method, and graphical user interface for providing tactile feedback for touch-based operations performed and reversed in a user interface |
CN102981756A (zh) * | 2012-11-01 | 2013-03-20 | Shandong University | Method for quickly switching applications on a touch-screen mobile terminal |
US9965074B2 (en) | 2012-12-29 | 2018-05-08 | Apple Inc. | Device, method, and graphical user interface for transitioning between touch input to display output relationships |
US10037138B2 (en) | 2012-12-29 | 2018-07-31 | Apple Inc. | Device, method, and graphical user interface for switching between user interfaces |
US10620781B2 (en) | 2012-12-29 | 2020-04-14 | Apple Inc. | Device, method, and graphical user interface for moving a cursor according to a change in an appearance of a control icon with simulated three-dimensional characteristics |
US10101887B2 (en) | 2012-12-29 | 2018-10-16 | Apple Inc. | Device, method, and graphical user interface for navigating user interface hierarchies |
US10185491B2 (en) | 2012-12-29 | 2019-01-22 | Apple Inc. | Device, method, and graphical user interface for determining whether to scroll or enlarge content |
US10078442B2 (en) | 2012-12-29 | 2018-09-18 | Apple Inc. | Device, method, and graphical user interface for determining whether to scroll or select content based on an intensity theshold |
US9778771B2 (en) | 2012-12-29 | 2017-10-03 | Apple Inc. | Device, method, and graphical user interface for transitioning between touch input to display output relationships |
US10175879B2 (en) | 2012-12-29 | 2019-01-08 | Apple Inc. | Device, method, and graphical user interface for zooming a user interface while performing a drag operation |
US9996233B2 (en) | 2012-12-29 | 2018-06-12 | Apple Inc. | Device, method, and graphical user interface for navigating user interface hierarchies |
US10437333B2 (en) | 2012-12-29 | 2019-10-08 | Apple Inc. | Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture |
US9959025B2 (en) | 2012-12-29 | 2018-05-01 | Apple Inc. | Device, method, and graphical user interface for navigating user interface hierarchies |
US10915243B2 (en) | 2012-12-29 | 2021-02-09 | Apple Inc. | Device, method, and graphical user interface for adjusting content selection |
US9857897B2 (en) | 2012-12-29 | 2018-01-02 | Apple Inc. | Device and method for assigning respective portions of an aggregate intensity to a plurality of contacts |
US20140240276A1 (en) * | 2013-02-25 | 2014-08-28 | Hyun-Gyu Choi | Portable terminal |
JP2015095696A (ja) * | 2013-11-11 | 2015-05-18 | Murata Manufacturing Co., Ltd. | Display device |
US10338772B2 (en) | 2015-03-08 | 2019-07-02 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US11112957B2 (en) | 2015-03-08 | 2021-09-07 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
US10268341B2 (en) | 2015-03-08 | 2019-04-23 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US11977726B2 (en) | 2015-03-08 | 2024-05-07 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
US10268342B2 (en) | 2015-03-08 | 2019-04-23 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10095396B2 (en) | 2015-03-08 | 2018-10-09 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
US10387029B2 (en) | 2015-03-08 | 2019-08-20 | Apple Inc. | Devices, methods, and graphical user interfaces for displaying and using menus |
US10067645B2 (en) | 2015-03-08 | 2018-09-04 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10402073B2 (en) | 2015-03-08 | 2019-09-03 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
US10613634B2 (en) | 2015-03-08 | 2020-04-07 | Apple Inc. | Devices and methods for controlling media presentation |
US10180772B2 (en) | 2015-03-08 | 2019-01-15 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10860177B2 (en) | 2015-03-08 | 2020-12-08 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US9990107B2 (en) | 2015-03-08 | 2018-06-05 | Apple Inc. | Devices, methods, and graphical user interfaces for displaying and using menus |
US10048757B2 (en) | 2015-03-08 | 2018-08-14 | Apple Inc. | Devices and methods for controlling media presentation |
US11550471B2 (en) | 2015-03-19 | 2023-01-10 | Apple Inc. | Touch input cursor manipulation |
US9785305B2 (en) | 2015-03-19 | 2017-10-10 | Apple Inc. | Touch input cursor manipulation |
US10599331B2 (en) | 2015-03-19 | 2020-03-24 | Apple Inc. | Touch input cursor manipulation |
US11054990B2 (en) | 2015-03-19 | 2021-07-06 | Apple Inc. | Touch input cursor manipulation |
US10067653B2 (en) | 2015-04-01 | 2018-09-04 | Apple Inc. | Devices and methods for processing touch inputs based on their intensities |
US10152208B2 (en) | 2015-04-01 | 2018-12-11 | Apple Inc. | Devices and methods for processing touch inputs based on their intensities |
US10841484B2 (en) | 2015-06-07 | 2020-11-17 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US10455146B2 (en) | 2015-06-07 | 2019-10-22 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US10346030B2 (en) | 2015-06-07 | 2019-07-09 | Apple Inc. | Devices and methods for navigating between user interfaces |
US10705718B2 (en) | 2015-06-07 | 2020-07-07 | Apple Inc. | Devices and methods for navigating between user interfaces |
US9860451B2 (en) | 2015-06-07 | 2018-01-02 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US10200598B2 (en) | 2015-06-07 | 2019-02-05 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US11835985B2 (en) | 2015-06-07 | 2023-12-05 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US11681429B2 (en) | 2015-06-07 | 2023-06-20 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US10303354B2 (en) | 2015-06-07 | 2019-05-28 | Apple Inc. | Devices and methods for navigating between user interfaces |
US9830048B2 (en) | 2015-06-07 | 2017-11-28 | Apple Inc. | Devices and methods for processing touch inputs with instructions in a web page |
JP2017528776A (ja) * | 2015-06-07 | 2017-09-28 | Apple Inc. | Devices and methods for navigating between user interfaces |
US11231831B2 (en) | 2015-06-07 | 2022-01-25 | Apple Inc. | Devices and methods for content preview based on touch input intensity |
US9916080B2 (en) | 2015-06-07 | 2018-03-13 | Apple Inc. | Devices and methods for navigating between user interfaces |
US11240424B2 (en) | 2015-06-07 | 2022-02-01 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US9891811B2 (en) | 2015-06-07 | 2018-02-13 | Apple Inc. | Devices and methods for navigating between user interfaces |
JPWO2016199309A1 (ja) * | 2015-06-12 | 2018-03-29 | Pioneer Corp | Electronic device |
US11269438B2 (en) | 2015-06-12 | 2022-03-08 | Pioneer Corporation | Electronic device |
WO2016199309A1 (ja) * | 2015-06-12 | 2016-12-15 | Pioneer Corp | Electronic device |
US10884608B2 (en) | 2015-08-10 | 2021-01-05 | Apple Inc. | Devices, methods, and graphical user interfaces for content navigation and manipulation |
US10235035B2 (en) | 2015-08-10 | 2019-03-19 | Apple Inc. | Devices, methods, and graphical user interfaces for content navigation and manipulation |
US10416800B2 (en) | 2015-08-10 | 2019-09-17 | Apple Inc. | Devices, methods, and graphical user interfaces for adjusting user interface objects |
US9880735B2 (en) | 2015-08-10 | 2018-01-30 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10698598B2 (en) | 2015-08-10 | 2020-06-30 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US11182017B2 (en) | 2015-08-10 | 2021-11-23 | Apple Inc. | Devices and methods for processing touch inputs based on their intensities |
US11740785B2 (en) | 2015-08-10 | 2023-08-29 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10209884B2 (en) | 2015-08-10 | 2019-02-19 | Apple Inc. | Devices, Methods, and Graphical User Interfaces for Manipulating User Interface Objects with Visual and/or Haptic Feedback |
US10963158B2 (en) | 2015-08-10 | 2021-03-30 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10203868B2 (en) | 2015-08-10 | 2019-02-12 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10162452B2 (en) | 2015-08-10 | 2018-12-25 | Apple Inc. | Devices and methods for processing touch inputs based on their intensities |
US10248308B2 (en) | 2015-08-10 | 2019-04-02 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures |
US11327648B2 (en) | 2015-08-10 | 2022-05-10 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10754542B2 (en) | 2015-08-10 | 2020-08-25 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
JP2017149225A (ja) * | 2016-02-23 | 2017-08-31 | Kyocera Corp | Vehicular control unit |
US11221735B2 (en) | 2016-02-23 | 2022-01-11 | Kyocera Corporation | Vehicular control unit |
WO2017145746A1 (ja) * | 2016-02-23 | 2017-08-31 | Kyocera Corp | Vehicular control unit |
JP2018026128A (ja) * | 2016-08-11 | 2018-02-15 | HiDeep Inc. | Pressure touch method of touch input device |
JP2017091554A (ja) * | 2016-11-28 | 2017-05-25 | KDDI Corp | Display control method, electronic device, display control program, and display control system |
JP2019145157A (ja) * | 2019-04-24 | 2019-08-29 | Pioneer Corp | Electronic device |
Also Published As
Publication number | Publication date |
---|---|
JP5654114B2 (ja) | 2015-01-14 |
JPWO2012114760A1 (ja) | 2014-07-07 |
US20130335373A1 (en) | 2013-12-19 |
US9727177B2 (en) | 2017-08-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5654114B2 (ja) | Electronic device with a touch sensor | |
JP5596348B2 (ja) | Multi-mode haptic feedback system | |
CN104881175B (zh) | Multi-touch device with dynamic haptic effects | |
JP5635274B2 (ja) | Tactile sensation providing device and tactile sensation providing method | |
JP5529663B2 (ja) | Input device | |
JP5718475B2 (ja) | Tactile sensation providing device | |
JP5437786B2 (ja) | Tactile sensation providing device | |
JP5555612B2 (ja) | Tactile sensation providing device | |
JP5519020B2 (ja) | Input device and control method of input device | |
JP2011048686A (ja) | Input device | |
JP2011048832A (ja) | Input device | |
JP5539788B2 (ja) | Tactile sensation providing device | |
JP2016212605A (ja) | Electronic device and control method of electronic device | |
JP5969279B2 (ja) | Electronic device | |
JP5587596B2 (ja) | Tactile sensation providing device | |
WO2011077687A1 (ja) | Tactile sensation providing device and control method of tactile sensation providing device | |
JP5292244B2 (ja) | Input device | |
JP5763579B2 (ja) | Electronic device | |
JP5725899B2 (ja) | Character string search device | |
JP5591646B2 (ja) | Electronic information device | |
JP5706676B2 (ja) | Tactile sensation providing device | |
JP2011095925A (ja) | Input device | |
JP2011048833A (ja) | Input device | |
JP2011095928A (ja) | Input device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12749073 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2013500904 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14001152 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12749073 Country of ref document: EP Kind code of ref document: A1 |