JP2011123773A - Device having touch sensor, tactile feeling presentation method, and tactile feeling presentation program - Google Patents

Info

Publication number
JP2011123773A
JP2011123773A (application number JP2009282259A)
Authority
JP
Japan
Prior art keywords
tactile sensation
touch sensor
storage unit
data
storage area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2009282259A
Other languages
Japanese (ja)
Other versions
JP5490508B2 (en)
Inventor
Takeshi Ishizuka (石塚 雄志)
Original Assignee
Kyocera Corp
京セラ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kyocera Corp (京セラ株式会社)
Priority to JP2009282259A
Publication of JP2011123773A
Application granted
Publication of JP5490508B2
Application status: Active
Anticipated expiration

Abstract

PROBLEM TO BE SOLVED: To provide a device that allows a user to recognize, during a file-moving operation, the free space of the destination folder after the file is moved.

SOLUTION: The device 101 having a touch sensor includes: a storage unit 104 having a plurality of storage areas; the touch sensor 103; a tactile sensation providing unit 105 that presents a tactile sensation to a touch object touching the touch sensor 103; and a control unit 106 that, when data stored in a prescribed storage area is to be stored in a storage area different from the prescribed storage area, controls the tactile sensation providing unit 105 to present to the touch object a tactile sensation determined based on the size of the data and the free or used capacity of the different storage area.

COPYRIGHT: (C) 2011, JPO & INPIT

Description

  The present invention relates to a device having a touch sensor, a tactile sensation presentation method, and a tactile sensation presentation program.

  In devices such as personal computers (PCs) and workstations, file systems having a graphical user interface (GUI) are widely used. With recent increases in CPU speed and decreases in memory cost, file systems equipped with a GUI are increasingly used in portable devices as well, such as mobile phones, PDAs (Personal Digital Assistants), and portable game machines. In such a file system, files and folders are displayed on the screen as icons (objects), and files and folders can be moved by dragging and dropping their icons.

  Conventionally, various inventions related to drag and drop have been made (see, for example, Patent Document 1). The invention of Patent Document 1 will be described with reference to FIGS. 13 to 15. FIG. 13 shows a state where the user is dragging the icon 311 from the source window 312 (source folder window) to the target window 313 (destination folder window). The drag operation is usually performed by placing the cursor on the icon to be moved, pressing the mouse button, and moving the cursor while keeping the mouse button pressed. As shown in FIG. 14, when the icon 311 hits the inner boundary 314 of the target window 313, the target window 313 is scrolled. By scrolling, the user brings the desired drop position 315 into view in the target window 313 and releases the mouse button there, completing the drop operation. The icon 311 is then displayed in the target window 313 as shown in FIG. 15.

Japanese Unexamined Patent Publication No. 07-200266

  When a file is moved by drag and drop, the free space of the destination folder is reduced by the size of the moved file. For this reason, if the free space of the destination folder is smaller than the size of the file to be moved, the file cannot be moved. Even when the move succeeds, if little free space remains in the destination folder afterward, the operation of an application (such as an operating system) stored in the destination folder may be affected. Such capacity-shortage problems are particularly noticeable in portable devices, which lack large-capacity internal memory, and in small-capacity external memories connected through an external interface.

  To prevent such problems, it is necessary to check the size of the file and the capacity of the destination folder before moving the file, or to check the capacity of the destination folder after the move. Checking the file size and the free space of the destination folder is a separate task from the file-moving operation itself, which is troublesome for the user.

  Accordingly, an object of the present invention, made in view of the above problems, is to provide an apparatus that allows a user to recognize, during a file-moving operation, the free capacity of the destination folder after the file is moved.

In order to solve the above-described problems, an apparatus having a touch sensor according to the first aspect includes:
a storage unit having a plurality of storage areas;
a touch sensor;
a tactile sensation providing unit that presents a tactile sensation to a touch target touching the touch sensor; and
a control unit that, when data stored in a predetermined storage area is to be stored in a storage area different from the predetermined storage area, controls the tactile sensation providing unit so that a tactile sensation determined based on the size of the data and the free or used capacity of the different storage area is presented to the touch target.

Preferably, the apparatus further includes a display unit, and when an icon of the data displayed on the display unit is being moved by an input to the touch sensor, the control unit calculates a motion vector of the data and identifies the different storage area based on the motion vector.

An apparatus having a touch sensor according to the second aspect includes:
a first storage unit;
a second storage unit different from the first storage unit;
a touch sensor;
a tactile sensation providing unit that presents a tactile sensation to a touch target touching the touch sensor; and
a control unit that, when data stored in the first storage unit is to be stored in the second storage unit, controls the tactile sensation providing unit so that a tactile sensation determined based on the size of the data and the free or used capacity of the second storage unit is presented to the touch target.

Preferably, the apparatus further includes a display unit, and when an icon of the data displayed on the display unit is being moved by an input to the touch sensor, the control unit calculates a motion vector of the data and identifies the second storage unit based on the motion vector.

  As described above, the solution of the present invention has been described as an apparatus. However, the present invention can also be realized as a method, a program, and a storage medium storing the program that substantially correspond to the apparatus, and it should be understood that these are also included in the scope of the present invention.

For example, a tactile sensation presentation method that realizes the first aspect of the present invention includes:
identifying data stored in a predetermined storage area of the storage unit and a storage area different from the predetermined storage area; and
presenting a tactile sensation determined based on the size of the identified data and the free or used capacity of the identified different storage area to a touch target touching the touch sensor.

Preferably, the tactile sensation presentation method further includes calculating a motion vector of the data when an icon of the data displayed on the display unit is being moved by an input to the touch sensor, and identifying the different storage area based on the motion vector.

Similarly, a tactile sensation presentation method that realizes the second aspect of the present invention includes:
identifying data stored in the first storage unit and a second storage unit different from the first storage unit; and
presenting a tactile sensation determined based on the size of the identified data and the free or used capacity of the identified second storage unit to a touch target touching the touch sensor.

Preferably, the tactile sensation presentation method further includes calculating a motion vector of the data when an icon of the data displayed on the display unit is being moved by an input to the touch sensor, and identifying the second storage unit based on the motion vector.

Further, a tactile sensation presentation program that realizes the first aspect of the present invention causes a computer mounted on a device having a touch sensor to function as:
means for identifying data stored in a predetermined storage area of the storage unit and a storage area different from the predetermined storage area; and
means for controlling the tactile sensation providing unit so that a tactile sensation determined based on the size of the identified data and the free or used capacity of the identified different storage area is presented to a touch target touching the touch sensor.

Preferably, the tactile sensation presentation program further causes the computer to function as means for calculating a motion vector of the data when an icon of the data displayed on the display unit is being moved by an input to the touch sensor, and identifying the different storage area based on the motion vector.

Similarly, a tactile sensation presentation program that realizes the second aspect of the present invention causes a computer mounted on a device having a touch sensor to function as:
means for identifying data stored in the first storage unit and a second storage unit different from the first storage unit; and
means for controlling the tactile sensation providing unit so that a tactile sensation determined based on the size of the identified data and the free or used capacity of the identified second storage unit is presented to a touch target touching the touch sensor.

Preferably, the tactile sensation presentation program further causes the computer to function as means for calculating a motion vector of the data when an icon of the data displayed on the display unit is being moved by an input to the touch sensor, and identifying the second storage unit based on the motion vector.

  According to the device having a touch sensor of the present invention configured as described above, a tactile sensation determined based on the size of the data (file) to be moved and the free or used capacity of the destination storage area or storage unit (hereinafter, the destination storage area) is presented to the touch target. The user can therefore recognize, by tactile sensation, the free space that will remain in the destination storage area after the data move. In other words, the user learns the post-move free space of the destination without checking the size of the data and the free space of the destination separately from the move operation itself, and can thus judge whether the data will fit in the destination storage area.

FIG. 1 is a functional block diagram showing a schematic configuration of an apparatus according to the first embodiment of the present invention.
FIG. 2 is a display screen example of the display unit of FIG. 1.
FIG. 3 is a flowchart showing the operation of the apparatus of FIG. 1.
FIG. 4 is a diagram showing file movement according to the first embodiment of the present invention.
FIG. 5 is a diagram showing file movement according to the first embodiment of the present invention.
FIG. 6 is a diagram showing file movement according to the first embodiment of the present invention.
FIG. 7 is a diagram showing file movement according to the first embodiment of the present invention.
FIG. 8 is a functional block diagram showing a schematic configuration of an apparatus according to the second embodiment of the present invention.
FIG. 9 is a display screen example of the display unit of FIG. 8.
FIG. 10 is a flowchart showing the operation of the apparatus of FIG. 8.
FIG. 11 is a diagram showing file movement according to the second embodiment of the present invention.
FIG. 12 is a diagram showing file movement according to the second embodiment of the present invention.
FIG. 13 is a diagram showing conventional file movement.
FIG. 14 is a diagram showing conventional file movement.
FIG. 15 is a diagram showing conventional file movement.

  Hereinafter, embodiments according to the present invention will be described with reference to the drawings.

(First embodiment)
FIG. 1 is a functional block diagram showing a schematic configuration of a device having a touch sensor according to the first embodiment of the present invention. Examples of the device having the touch sensor of the present invention (hereinafter, the device 101) include a PDA, a PC, a mobile phone, a portable game machine, a portable music player, a portable TV, and the like. The apparatus 101 includes a display unit 102, a touch sensor 103, a storage unit 104, a tactile sensation providing unit 105, and a control unit 106.

  The display unit 102 displays icons, windows, and the like representing files and folders, and is configured using, for example, an LCD (Liquid Crystal Display) or an organic EL display. The "data" in the claims includes files, folders, and the like; hereinafter, file movement is described for convenience of explanation.

  The touch sensor 103 detects contact by the user's finger or the like as an input to the apparatus 101. The touch sensor 103 may be, for example, the touch sensor of a touch panel, or the touchpad, mouse button, pointing stick, or trackball often mounted on a laptop personal computer. Hereinafter, for convenience of explanation, the touch sensor 103 is assumed to be the touch sensor of a touch panel. The touch sensor 103 detects input by a touch target such as the user's finger or a stylus pen (hereinafter referred to as the user's finger), and is configured using a resistive, capacitive, optical, or other method. Note that it is not essential for the user's finger to physically press the touch sensor 103 for an input to be detected. For example, when the touch sensor 103 is optical, it detects the position where infrared light over the sensor is blocked by the user's finger, so the finger need not press the sensor at all.

  The storage unit 104 stores various input information and data such as files and also functions as a work memory; it is, for example, a hard disk drive (HDD), an SD memory card, a USB memory, or SmartMedia. The storage unit 104 can have a plurality of storage areas. The plurality of storage areas refers to, for example, a plurality of drives divided by partitions (for example, a C drive and a D drive), or a plurality of folders with limited capacity on the same drive. The storage unit 104 is not limited to a hard disk drive built into the apparatus 101 and can be an external memory such as an SD memory card, a USB memory, or SmartMedia. Furthermore, the storage unit 104 is not limited to a single piece of hardware and may comprise several pieces of hardware, for example a hard disk drive and an SD memory card. In that case, to distinguish the storage units, reference numerals 104a, 104b, and 104c are assigned, and the units are called the first storage unit, the second storage unit, and the third storage unit, respectively.
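The free and used capacity of a storage unit that such a device would consult can be queried on a modern system as in the following sketch (Python's standard library is used purely for illustration; the path `/` and the function name `capacities` are assumptions, not part of the patent):

```python
import shutil

def capacities(path):
    """Return (total, used, free) capacity in bytes of the storage
    unit (e.g. an internal HDD or a mounted SD memory card) that
    contains `path`."""
    usage = shutil.disk_usage(path)
    return usage.total, usage.used, usage.free

total, used, free = capacities("/")
# Used and free capacity are inseparable: together they account for
# the unit's total capacity (some filesystems reserve a few blocks,
# so the sum may fall slightly short of the total).
assert used + free <= total
```

The same query would be run against each candidate destination (each drive, capacity-limited folder, or external memory) before deciding which tactile sensation to present.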

  The tactile sensation providing unit 105 vibrates the touch sensor 103 to present a tactile sensation to the user's finger (touch target) touching the touch sensor 103, and is configured using, for example, a vibration element such as a piezoelectric element. Various tactile sensations can be presented by appropriately setting the frequency, period (wavelength), amplitude, and waveform of the vibration.

  The control unit 106 controls and manages the entire apparatus 101, including each of its functional blocks. The control unit 106 can be configured as software executed on any suitable processor such as a CPU (Central Processing Unit), or with a dedicated processor specialized for each process (for example, a DSP (Digital Signal Processor)). A display controller (for example, an LCD controller) for the display unit 102, a touch sensor controller for the touch sensor 103, and a tactile sensation providing unit driver for the tactile sensation providing unit 105 can also be provided independently of the control unit 106.

  The control unit 106 will now be described in more detail. When moving a file from a predetermined storage area of the storage unit 104a to a different storage area of the storage unit 104a, or from the first storage unit 104a to the second storage unit 104b, the control unit 106 determines a tactile sensation based on the size of the file to be moved (the moved file) and the free or used capacity of the destination storage area or second storage unit 104b (hereinafter abbreviated as the destination storage area). That is, the tactile sensation is determined based on the free or used capacity that the destination storage area will have after the file is moved. Note that the free capacity and the used capacity always add up to the total capacity in which the storage unit or storage area can store data; as the used capacity increases, the free capacity decreases accordingly, so the two are inseparable. For convenience, only the free capacity is discussed below. For example, the control unit 106 determines different tactile sensations for the case where the post-move free space of the destination would be negative (that is, the moved file cannot be stored in the destination), the case where the post-move free space leaves no room, and the case where it leaves room. The capacity used as the threshold for deciding that there is no room can be set arbitrarily. The control unit 106 then controls the tactile sensation providing unit 105 to present the determined tactile sensation to the user's finger (touch target). From the tactile sensation transmitted to the finger, the user can thus recognize whether the file will fit in the destination storage area and, if it fits, how much free space the destination will have left.
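The determination just described can be summarized in a small sketch (hypothetical Python; the return labels and the 1 MB "no room" margin are assumptions, since the patent leaves the threshold arbitrary):

```python
def choose_tactile_pattern(file_size, dest_free, margin=1024 * 1024):
    """Select a tactile sensation from the moved file's size and the
    destination's free capacity. `margin` is the arbitrary threshold
    below which the remaining free space counts as having no room."""
    remaining = dest_free - file_size  # free space after the move
    if remaining < 0:
        return "move impossible"       # file larger than free space
    elif remaining < margin:
        return "no room"               # fits, but little space left
    else:
        return "room to spare"         # ample space remains

assert choose_tactile_pattern(file_size=5_000_000,
                              dest_free=3_000_000) == "move impossible"
```

Each of the three outcomes would be mapped to a distinct vibration pattern by the tactile sensation providing unit 105.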

  When the apparatus 101 of FIG. 1 is a PDA having a touch panel, a display example of the display unit 102 is as shown in FIG. 2. Although not shown here, it is assumed that the storage unit 104 of the apparatus 101 in FIG. 1 includes a first storage unit 104a and a second storage unit 104b. The first storage unit 104a is, for example, an internal HDD, and the second storage unit 104b is, for example, an SD memory card connected through an external interface. Note that the external memory connected through the external interface is not limited to an SD memory card and may be, for example, a USB memory or SmartMedia. The display unit 102 displays windows for folder 1 and folder 2, which relate to storage areas of the first storage unit 104a inside the apparatus 101, and a window (SD memory card) relating to the second storage unit 104b. Folder 1 stores an image file (data) "image 1".

  The operation of moving the image file from folder 1 (a predetermined storage area of the first storage unit 104a) to folder 2 (a storage area of the storage unit 104a different from folder 1) or to the SD memory card (the second storage unit 104b) will be described with reference to the flowchart of FIG. 3.

  First, as shown in FIG. 4, the user makes an input with a finger on the icon of the file to be moved. The touch sensor 103 detects this input (step S101). At this time, the control unit 106 can change the display of the icon corresponding to the detected input, as shown in FIG. 5. This lets the user confirm that the input was made on the intended file icon without error. The display change may be anything the user can recognize as acknowledging the finger input, for example inverting the color of the file icon or changing its display intensity. The display around the icon corresponding to the detected input can also be changed to make the user aware that an input by the finger has occurred.

  When moving the image file to folder 2 (the destination folder), which is a storage area different from folder 1 (the source folder), the user moves the image file icon 108 toward folder 2 without lifting the finger from it, as shown in FIG. 6. This corresponds to a so-called drag operation. Because the touch sensor 103 continues to detect the finger's input, the movement of the finger is known, and the control unit 106 displays the image file icon 108 on the display unit 102 at the position corresponding to the finger's movement.

  The control unit 106 determines from the detection result of the touch sensor 103 whether the user's finger is positioned on the window of folder 2 (the destination window) (step S102). When the file icon is dragged as shown in FIG. 6, the control unit 106 determines that the user's finger is positioned on the window of folder 2 (Yes in step S102).
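The test in step S102, whether the finger is positioned on a destination window, can be sketched as follows (the rectangle representation and window names are illustrative assumptions; the patent does not prescribe a data structure):

```python
def window_under_finger(finger_xy, windows):
    """Return the name of the window whose rectangle contains the
    finger position, or None if the finger is over no window.
    `windows` maps a window name to an (x, y, width, height) tuple."""
    fx, fy = finger_xy
    for name, (x, y, w, h) in windows.items():
        if x <= fx < x + w and y <= fy < y + h:
            return name
    return None

# Hypothetical layout: folder 2's window and the SD card's window.
windows = {"folder2": (0, 0, 100, 80), "sd_card": (120, 0, 100, 80)}
assert window_under_finger((130, 40), windows) == "sd_card"
assert window_under_finger((110, 40), windows) is None
```

Each time the finger moves onto a new window, the control unit would rerun the tactile determination of steps S103 and S104 for that destination.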

  The control unit 106 determines that the user wants to move the icon 108 of the image file to the folder 2 by the drag operation described above. Then, the control unit 106 determines the tactile sensation presented to the finger of the user who is touching the touch sensor 103 based on the size of the image file and the free space of the folder 2 (step S103). Then, the control unit 106 controls the tactile sensation providing unit 105 to present the determined tactile sensation to the user's finger touching the touch sensor 103 (step S104).

  Specifically, the tactile sensation is determined either by calculation or from a vibration pattern table such as Table 1 or Table 2. Determination by calculation means, for example, deriving the frequency, amplitude, and so on from the ratio of the size of the file to the pre-move free space of the destination folder, combined with addition, subtraction, multiplication, and division by constants. An example follows.

  When the frequency of the presented tactile sensation is determined as in Equation 1, a higher-frequency tactile sensation is presented to the user's finger as the post-move free space of the destination folder decreases.
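Equation 1 itself is not reproduced in this text, but the stated behavior, a frequency that rises as the post-move free space shrinks, could take a form like the following (a hypothetical formula with invented constants, not the patent's actual Equation 1):

```python
def presentation_frequency(file_size, dest_free,
                           base_hz=200.0, span_hz=300.0):
    """Map the fraction of the destination's free space consumed by
    the move to a vibration frequency: the more of the free space the
    file uses up, the higher the frequency (invented constants)."""
    consumed = min(1.0, file_size / dest_free)  # fraction used by the move
    return base_hz + span_hz * consumed

# A file that nearly fills the destination yields a higher frequency
# than one that leaves plenty of room.
assert presentation_frequency(90, 100) > presentation_frequency(10, 100)
```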

  Hereinafter, the case where the tactile sensation is determined from a vibration pattern table is described in detail. Table 1 shows the correspondence between the post-move free space of the destination folder and the tactile sensation to be presented; three vibration patterns, 1 to 3, are prepared. Vibration pattern 1 is presented when the free space of folder 2, assuming the image file were stored there, would be negative, that is, when the size of the image file is larger than the free space of folder 2 and the move is impossible. Vibration pattern 2 lets the user recognize that little free space would remain in the destination after the move; in Table 1 it is presented when the post-move free space of folder 2 is less than 1 MB (megabyte). Vibration pattern 3 lets the user recognize that ample free space would remain; in Table 1 it is presented when the post-move free space is 1 MB or more. Although 1 MB is used here as the threshold for changing the vibration pattern, this value is merely an example and can be set arbitrarily.

  To distinguish vibration patterns 1 to 3, the amplitude can, for example, be reduced in order from pattern 1 to pattern 3. The patterns can also be distinguished by changing the period (wavelength) or the waveform. From the difference in vibration pattern, the user can recognize the free capacity the destination folder will have after the move.

  Furthermore, as shown in Table 2, the vibration pattern can also be varied with the size of the file being moved. In Table 2, vibration patterns 1a, 2a, and 3a correspond to file sizes of less than 128 kB (kilobytes); patterns 1b, 2b, and 3b to sizes of 128 kB to less than 256 kB; and patterns 1c, 2c, and 3c to sizes of 256 kB or more. The thresholds of 128 kB and 256 kB are again merely examples and can be set arbitrarily. In this way, the user can recognize from the vibration pattern not only the post-move free capacity of the destination folder but also the size of the file being moved.
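The two-dimensional selection of Table 2 can be sketched as follows (the pattern names and the 128 kB, 256 kB, and 1 MB thresholds come from the description above; the concrete table layout is an assumption):

```python
KB, MB = 1024, 1024 * 1024

def size_band(file_size):
    """'a' for < 128 kB, 'b' for 128 kB to < 256 kB, 'c' for >= 256 kB."""
    if file_size < 128 * KB:
        return "a"
    if file_size < 256 * KB:
        return "b"
    return "c"

def vibration_pattern(file_size, dest_free):
    """Combine the free-space row (1: move impossible, 2: no room,
    3: ample room) with the file-size column (a/b/c), as in Table 2."""
    remaining = dest_free - file_size
    if remaining < 0:
        row = 1
    elif remaining < 1 * MB:
        row = 2
    else:
        row = 3
    return f"{row}{size_band(file_size)}"

assert vibration_pattern(64 * KB, 10 * MB) == "3a"
assert vibration_pattern(300 * KB, 100 * KB) == "1c"
```

The returned key ("3a", "1c", ...) would index the drive parameters (frequency, amplitude, waveform) that distinguish the patterns.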

  When, based on the tactile sensation presented to the finger, the user decides to store the image file in folder 2, the user proceeds to a drop operation. That is, the user ends the drag (stops moving the finger) while the finger is positioned on the window of folder 2 as shown in FIG. 6, and proceeds to drop (No in step S105). To realize the drop operation, the apparatus 101 further includes, for example, a load detection unit (not shown) that detects the pressure load on the touch sensor; when the pressure load detected by the load detection unit satisfies a preset load criterion, the control unit 106 stores the image file in folder 2. This load criterion defines the pressing load required to store the moved file in the destination storage area or storage unit. The load detection unit is configured using an element that responds linearly to load, such as a strain gauge sensor or a piezoelectric element. When both the load detection unit and the tactile sensation providing unit 105 are configured using piezoelectric elements, they can share the same elements, because a piezoelectric element generates electricity when pressure is applied and deforms when electricity is applied.

  In a configuration in which the apparatus 101 includes a load detection unit, the user performs the drop operation from the dragged state (the state in which input to the image file icon 108 continues) by pressing the icon 108 with a load that satisfies the load criterion. The control unit 106 then stores the image file in folder 2 (step S106), completing the move of the file.
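The drag-to-drop transition governed by the load criterion (steps S105 and S106) can be sketched as follows (the 1 N criterion and the function names are assumptions for illustration; a real criterion would be tuned to the load detection element):

```python
LOAD_CRITERION_N = 1.0  # assumed pressing load required for a drop

def handle_press(dragging, pressure_load_n, store_file):
    """While dragging, a press whose load satisfies the criterion is
    treated as a drop and the file is stored in the destination
    (step S106); a lighter press leaves the drag in progress."""
    if dragging and pressure_load_n >= LOAD_CRITERION_N:
        store_file()
        return "stored"
    return "dragging"

stored = []
assert handle_press(True, 1.5, lambda: stored.append("image1")) == "stored"
assert stored == ["image1"]
assert handle_press(True, 0.3, lambda: stored.append("image1")) == "dragging"
```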

  If, in step S104, a tactile sensation indicating that the image file cannot be moved to folder 2 (vibration pattern 1, 1a, 1b, or 1c) is presented, the user will change the destination of the file. When a tactile sensation indicating that little free space would remain in folder 2 after the move (vibration pattern 2, 2a, 2b, or 2c) is presented, the user may also change the destination, because storing data to the limit of a storage area may, for example, affect the operation of an application (such as an operating system) stored in that storage area. When changing the destination, the image file icon 108 is still being dragged at step S104, so the user can simply drag it from folder 2 to another storage area of the storage unit 104a or to another storage unit 104b (for example, the SD memory card); there is no need to start dragging again from folder 1. When the touch sensor 103 detects that the finger has moved again by dragging (Yes in step S105), steps S102 to S104 are repeated each time the finger moves onto a new storage area or storage unit.

  When the user drags the icon 108 of the image file onto the window of the SD memory card (Yes in step S102), suppose the control unit 106 controls the tactile sensation providing unit 105 to present a tactile sensation indicating that the SD memory card has ample free space (vibration pattern 3, 3a, 3b, or 3c) (steps S103 and S104). Then, as a drop operation, the user presses the icon 108 on the window of the SD memory card, and when the pressing load satisfies the load criterion (step S106), the control unit 106 stores the image file in the SD memory card (step S107). The move of the image file from folder 1 to the SD memory card is thus completed, as shown in FIG. 7. In FIG. 7 the image file is deleted from folder 1, but file movement according to the present embodiment is not limited to such a "cut and paste" operation; it can also be a "copy and paste" operation in which the moved file (image file) remains in the source after the move.

  As described above, in the present embodiment, when data (the image file) stored in a predetermined storage area (folder 1) of the storage unit 104a is to be stored in a storage area (folder 2) of the storage unit 104a different from the predetermined storage area, the control unit 106 determines the tactile sensation based on the size of the data and the free capacity (or used capacity) of that different storage area. The control unit 106 then controls the tactile sensation providing unit 105 to present the determined tactile sensation to the touch target (the user's finger). Because the presented tactile sensation is determined from the free capacity the destination storage area will have after the move, the user can recognize from the sensation at the finger whether the data will fit in the destination and, if it fits, how much free space will remain.

  In the present embodiment, when the control unit 106 stores the data (image file) stored in the first storage unit 104a (HDD) in a second storage unit 104b (SD memory card) different from the first storage unit 104a, the tactile sensation is determined based on the data size and the free capacity (or used capacity) of the second storage unit 104b. The control unit 106 then controls the tactile sensation providing unit 105 so as to present the determined tactile sensation to the touch target (user's finger). That is, the apparatus 101 according to the present embodiment can handle not only data movement from one area to another within a single storage unit (one piece of hardware) but also data movement between a plurality of storage units. In data movement from the storage unit 104a inside the apparatus 101 to the external storage unit 104b, the user can recognize from the presented tactile sensation, by the data transfer operation alone, whether or not the data fits in the second storage unit 104b and, if it does, how much free space the second storage unit 104b will have.

  In addition, when the apparatus 101 according to the present embodiment includes a load detection unit that enables a drop operation, the control unit 106 can control the tactile sensation providing unit 105 so as to present to the touch target not a simple vibration but a realistic tactile sensation, like the crisp, hard feel obtained when a mechanical key is pressed (a realistic click feel). To present such a realistic click feel, for example, in step S104 the tactile sensation is set to be presented when the pressing load exceeds a certain value (for example, 1 N [newton]). Thus, the user's pressure sense is stimulated until the pressing load exceeds the certain value, and once it does, the tactile sensation providing unit 105 vibrates the touch surface to stimulate the user's tactile sense. By stimulating the user's pressure sense and tactile sense in this way, a crisp, hard click feel can be presented to the user. The touch sensor 103 itself is not physically displaced like a mechanical key even when the touch surface is pressed; nevertheless, by presenting the tactile sensation as described above to the touch target, the user can obtain a realistic click feel similar to that of operating a mechanical key. As a result, the user can perform input operations without a sense of incongruity on the touch sensor 103, which inherently provides no feedback to pressing. Note that a crisp, hard tactile sensation can be realized by, for example, presenting one cycle of a 200 Hz to 500 Hz sine wave, or one cycle of a rectangular wave. A softer, squishier tactile sensation can be realized by, for example, presenting two or three cycles of a 200 Hz to 500 Hz sine wave, and a sensation recognized as a vibration (buzzing) can be realized by, for example, presenting four or more cycles of a sine wave.
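The waveform recipe above (one cycle of a 200 Hz to 500 Hz sine wave for a hard click, two or three cycles for a softer feel, four or more cycles for a perceptible vibration) can be sketched as follows. The helper name and sample rate are illustrative; the patent does not specify how the drive signal is synthesized:

```python
import math

def tactile_waveform(freq_hz, cycles, sample_rate=8000):
    """Generate drive samples for the tactile sensation providing unit:
    `cycles` cycles of a sine wave at `freq_hz`. Per the description,
    1 cycle feels like a hard click, 2-3 cycles feel softer, and 4 or
    more cycles are perceived as vibration."""
    n = int(sample_rate * cycles / freq_hz)
    return [math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]

click = tactile_waveform(300, 1)   # crisp, hard click feel
soft  = tactile_waveform(300, 3)   # softer sensation
buzz  = tactile_waveform(300, 5)   # recognizable vibration
```

A rectangular wave for the click variant would simply replace the sine with a sign function over the same single cycle.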

(Second Embodiment)
In the first embodiment, a case has been described in which a tactile sensation is presented to the user's finger in a state where the finger is positioned, by the drag operation, on the window of the destination storage area or storage unit. In the second embodiment, a case will be described in which a tactile sensation is presented in the middle of the drag operation, that is, in a state where the user's finger is not yet positioned on the window of the destination storage area or storage unit. FIG. 8 is a functional block diagram showing a schematic configuration of the device according to the second embodiment of the present invention. Examples of the device 201 having the touch sensor of the present invention (hereinafter abbreviated as the device 201) include a PDA, a PC, a mobile phone, a mobile game machine, a mobile music player, and a mobile TV. The device 201 includes a display unit 202, a touch sensor 203, a storage unit 204, a tactile sensation providing unit 205, and a control unit 206. The functional units 202 to 205 other than the control unit 206 have the same functions as the corresponding functional units 102 to 105 of the first embodiment.

  When moving a file (data) from a predetermined storage area of the storage unit 204a to a storage area of the storage unit 204a different from the predetermined storage area, or from the first storage unit 204a to a second storage unit 204b different from the first storage unit 204a, the control unit 206 obtains the motion vector of the data based on the input detected by the touch sensor 203. The motion vector is calculated from the trajectory of the icon of the data being moved by a drag operation or the like, and represents how the data icon is moving. The control unit 206 then specifies, from the obtained motion vector, the storage area or storage unit (hereinafter, storage area or the like) that will be the movement destination. That is, the control unit 206 predicts a storage area or the like existing in the direction of the motion vector as the file movement destination, determines the tactile sensation based on the free capacity of the predicted destination storage area or the like and the size of the file to be moved, and controls the tactile sensation providing unit 205 so as to present this tactile sensation to the user's finger. The user can therefore recognize the estimated free space of the destination storage area or the like from the presented tactile sensation before the file is actually moved there by the drag operation or the like.

  As in the description of the first embodiment, it is assumed that the device 201 in FIG. 8 is a PDA having a touch panel. A display example of the display unit 202 in this case is shown in FIG. 9. An image file icon 208 is a data icon similar to the image file icon 108 in FIG. 2, and the folder 1, the folder 2, and the SD memory card are common to FIGS. 9 and 2; therefore, a detailed description of FIG. 9 is omitted.

  The operation of moving an image file in the folder 1 (a predetermined storage area of the first storage unit 204a) to the folder 2 (a storage area of the storage unit 204a different from the folder 1) or to the SD memory card (the second storage unit 204b) will be described with reference to the flowchart of FIG. 10. In FIG. 10, the processes of steps S201 and S204 to S207 are the same as the processes of steps S101 and S104 to S107 of the first embodiment. In that case, among the descriptions relating to each step, the descriptions relating to the functional units of the first embodiment (the display unit 102, the touch sensor 103, the storage unit 104, and the tactile sensation providing unit 105) are to be read, as appropriate, as referring to the corresponding functional units of the second embodiment (the display unit 202, the touch sensor 203, the storage unit 204, and the tactile sensation providing unit 205).

  When the user performs an input with a finger on the icon of the file to be moved (the move file), the touch sensor 203 detects this input, as in step S101 (step S201).

  When moving the image file (move file) to the folder 2 (destination folder), which is a storage area different from the folder 1 (source folder), the user performs a drag operation as shown in FIG. 11, moving the image file icon 208 toward the folder 2.

  During the drag operation, the touch sensor 203 detects the movement (trajectory) of the image file icon 208, and the control unit 206 calculates the motion vector of the image file icon 208 from the detected trajectory (step S211). The motion vector can be obtained, for example, by connecting two points: the current position of the image file icon 208 and a past position of the image file icon 208 (for example, its position before the drag). Alternatively, a plurality of points through which the image file icon 208 has passed during the drag operation can be sampled and fitted by the least-squares method. The motion vector can be obtained with an arbitrary approximation, chosen in consideration of the processing capability of the device 201 and the required accuracy.
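The least-squares variant of step S211 might be sketched as below. The function name, the unit-vector output, and the fallbacks for short or vertical trails are illustrative assumptions; the patent leaves the approximation method open:

```python
def motion_vector(trail):
    """Estimate the icon's motion direction from drag trajectory points
    (x, y): fit a line through the samples by least squares, then orient
    it from the oldest toward the newest sample. Returns a unit vector,
    or (0, 0) when fewer than two points are available."""
    if len(trail) < 2:
        return (0.0, 0.0)
    n = len(trail)
    mx = sum(p[0] for p in trail) / n
    my = sum(p[1] for p in trail) / n
    sxx = sum((p[0] - mx) ** 2 for p in trail)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in trail)
    if sxx == 0:                                  # purely vertical drag
        dy = trail[-1][1] - trail[0][1]
        return (0.0, 1.0 if dy >= 0 else -1.0)
    slope = sxy / sxx
    sign = 1.0 if trail[-1][0] - trail[0][0] >= 0 else -1.0
    norm = (1 + slope * slope) ** 0.5
    return (sign / norm, sign * slope / norm)
```

The simpler two-point method in the text is the special case of passing only the pre-drag position and the current position.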

  Subsequently, the control unit 206 determines from the calculated motion vector whether there is a storage area or storage unit that can be predicted as the movement destination of the image file (step S212), that is, whether a window related to a storage area or storage unit exists on the calculated motion vector. If one exists (Yes in step S212), the control unit 206 predicts that the storage area or storage unit related to the window on the calculated motion vector is the user's desired destination. In FIG. 11, since the folder 2 exists on the calculated motion vector (Yes in step S212), the folder 2 is predicted to be the user's desired destination folder.
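The check in step S212, whether a window lies on the calculated motion vector, can be modeled as a ray/rectangle intersection test. This is one plausible sketch, assuming axis-aligned windows given as (left, top, right, bottom); the patent does not prescribe the geometry routine:

```python
def window_on_vector(origin, direction, window):
    """Return True if the ray from `origin` along `direction` passes
    through the axis-aligned window rectangle (left, top, right, bottom).
    Uses the standard slab method: intersect the ray's parameter range
    with each axis interval."""
    ox, oy = origin
    dx, dy = direction
    left, top, right, bottom = window
    t_min, t_max = 0.0, float("inf")
    for o, d, lo, hi in ((ox, dx, left, right), (oy, dy, top, bottom)):
        if abs(d) < 1e-12:
            if not (lo <= o <= hi):      # ray parallel to this slab
                return False
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            t_min = max(t_min, min(t1, t2))
            t_max = min(t_max, max(t1, t2))
    return t_min <= t_max
```

The control unit would run this test against each displayed folder or storage-unit window and take the first hit in the drag direction as the predicted destination.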

  The control unit 206 then determines the tactile sensation to be presented to the finger of the user touching the touch sensor 203, based on the size of the image file and the free capacity or used capacity of the predicted folder 2 (step S213). As in the first embodiment, this tactile sensation is determined by calculation or by a vibration pattern table such as Table 1 or Table 2. The control unit 206 controls the tactile sensation providing unit 205 so as to present the determined tactile sensation to the user's finger touching the touch sensor 203 (step S204). That is, before the image file icon 208 is positioned on the window of the destination folder 2, the control unit 206 predicts the folder 2 as the user's desired destination and presents to the user's finger a tactile sensation corresponding to the free space that the destination folder will have after the file movement.

  If, in step S204, a tactile sensation indicating that the image file cannot be moved to the folder 2 is presented to the user's finger, the user changes the destination of the move file. When changing the destination, the user can continue dragging from the state of FIG. 11 to another storage area of the storage unit 204a or to another storage unit 204b (for example, the SD memory card); that is, the user does not need to start the drag again from the folder 1.

  When the user finally moves the image file icon 208 to the SD memory card, the result is as shown in FIG. Since the destination of the image file icon 208 is changed to the SD memory card before the icon is positioned on the window of the destination folder 2, the movement trajectory of the image file icon 208 is shorter than in FIG. 7 of the first embodiment.

  Thus, in the present embodiment, when data (an image file) stored in a predetermined storage area (the folder 1) of the storage unit 204a is stored in a storage area (the folder 2) of the storage unit 204a different from the predetermined storage area, the control unit 206 specifies the storage area to which the data is to be moved from the motion vector of the data icon. Specifically, when the data icon is moved by a drag operation, the control unit 206 calculates a motion vector from the trajectory of the data icon and specifies a storage area positioned in the direction of the motion vector as the movement destination. That is, the control unit 206 predicts the folder 2 as the user's desired destination before the image file icon 208 is positioned on the window of the destination folder 2. Therefore, before the image file icon 208 has been dragged all the way to the destination folder 2, the control unit 206 can present to the user's finger a tactile sensation corresponding to the free space the folder 2 will have after the move. As a result, in the middle of the drag operation of the image file icon 208, the user can recognize whether or not the image file fits in the folder 2 and, if it does, how much free space the folder 2 will have. In particular, when the image file does not fit in the folder 2, the user can change the destination during the drag operation, so the file can be moved efficiently.

  Also in the present embodiment, when the data (image file) stored in the first storage unit 204a (the HDD related to the folder 1) is stored in the second storage unit 204b (SD memory card) different from the first storage unit, the second storage unit 204b that is the data movement destination is specified from the motion vector of the data icon. In other words, the device 201 according to the present embodiment can specify the destination from the motion vector of the data icon not only for data movement from one area to another within a single storage unit (one piece of hardware) but also for data movement between a plurality of storage units. Thus, in the middle of the drag operation of the image file icon 208 (before the icon is positioned on the window of the SD memory card that is the movement destination), the user can recognize whether or not the image file fits in the second storage unit 204b and, if it does, how much free space the second storage unit 204b will have.

  Although the present invention has been described based on the drawings and embodiments, it should be noted that those skilled in the art can easily make various modifications and alterations based on the present disclosure. Therefore, it should be noted that such modifications and alterations are included in the scope of the present invention.

  For example, the functions included in each member, each means, each step, and the like can be rearranged in any logically consistent way, and a plurality of means, steps, and the like can be combined into one or divided.

  In the first embodiment described above, a drag operation was described in which an input is made with a finger on the icon of the file to be moved, and the finger is moved to the destination folder while the input is continued. However, as long as the file to be moved and the destination folder can be specified, the operation is not limited to a drag operation. For example, the user makes an input with a finger on the icon of the file to be moved, and the touch sensor detects the input. The control unit then determines that the detected input is the user's designation of the move file, and stores information on which file has been designated in the internal memory of the apparatus according to the present invention. This eliminates the need to keep designating the file to be moved by continuing the input through a drag operation. With such a setting, the finger is released from the touch sensor once, and an input is then made on the window of the destination folder. When the touch sensor detects this input, the control unit can specify the destination folder. With the move file and the destination folder both specified, the control unit can determine the tactile sensation based on the free space that the destination folder will have after the file movement.
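The non-drag variant described above (a first input designates the move file and is remembered in internal memory; a later input on a folder window designates the destination) could be sketched as a small state machine. All names and the dictionary-based targets here are illustrative, not from the patent:

```python
class MoveByTaps:
    """Two-tap file move: the first tap on a file icon designates the
    move file (remembered, standing in for the apparatus's internal
    memory); a later tap on a folder window designates the destination
    and completes the selection, at which point the tactile sensation
    can be determined from the destination's free space."""
    def __init__(self):
        self.pending_file = None          # the remembered move file
    def on_tap(self, target):
        if self.pending_file is None:
            if target["kind"] == "file":
                self.pending_file = target
            return None                   # nothing to move yet
        if target["kind"] == "folder":
            move = (self.pending_file, target)
            self.pending_file = None      # selection consumed
            return move                   # (move file, destination)
        return None
```

The point of the design is that lifting the finger between the two inputs no longer cancels the designation, since the file identity lives in memory rather than in the continued touch.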

  In the second embodiment, the calculation of the file icon's motion vector from the trajectory of the file icon during a drag operation was described. However, as long as the trajectory of the file icon can be obtained, the operation is not limited to a drag operation. For example, the user makes a first input with a finger on the icon of the file to be moved, and the touch sensor detects the input; the control unit then determines that the detected input is the user's designation of the move file. The user then makes a second input at a position other than a window related to a storage area or storage unit, and the touch sensor detects that input. By using the line connecting the position of the file icon and the position of the second input on the display unit as the trajectory of the file icon, the motion vector of the file icon can be calculated without a drag operation.

  In the first and second embodiments described above, data movement between a plurality of storage units was described as movement from a storage unit inside the apparatus to an external storage unit. Also in data movement from an external storage unit to a storage unit inside the apparatus, the control unit can, by changing the tactile sensation to be presented, make the user recognize whether or not the data will fit in the destination folder and, if it does, how much free space the destination folder will have after the file movement.

  In the first and second embodiments described above, the case where the storage area of the storage unit outside the apparatus is not divided into a plurality of partitions, folders, or the like was described. However, the apparatus of the present invention can be applied in the same manner even when the storage unit outside the apparatus has a plurality of storage areas. In data movement between a plurality of storage areas of the external storage unit, or between a storage area of the external storage unit and the internal storage unit or its storage areas, the control unit can, by changing the tactile sensation to be presented, make the user recognize whether or not the data fits in the destination folder and, if it does, how much free space the destination folder will have after the file movement.

101, 201 Device having a touch sensor
102, 202 Display unit
103, 203 Touch sensor
104, 204 Storage unit
105, 205 Tactile sensation providing unit
106, 206 Control unit
108, 208 Image file icon
311 Icon
312 Source window
313 Target window
314 Inner boundary
315 Desired position

Claims (12)

  1. A storage unit having a plurality of storage areas;
    A touch sensor;
    A tactile sensation providing unit for presenting a tactile sensation to a touch target touching the touch sensor;
    A control unit that, when storing data stored in a predetermined storage area in a storage area different from the predetermined storage area, controls the tactile sensation providing unit so as to present to the touch target a tactile sensation determined based on the size of the data and the free capacity or used capacity of the different storage area;
    A device having a touch sensor comprising:
  2. The apparatus having the touch sensor according to claim 1, further comprising:
    With a display,
    Wherein, when an icon of the data displayed on the display unit is moved by an input to the touch sensor, the control unit calculates a motion vector of the data and specifies the different storage area based on the motion vector,
    A device having a touch sensor.
  3. A first storage unit;
    A second storage unit different from the first storage unit;
    A touch sensor;
    A tactile sensation providing unit for presenting a tactile sensation to a touch target touching the touch sensor;
    A control unit that, when the data stored in the first storage unit is stored in the second storage unit, controls the tactile sensation providing unit so as to present to the touch target a tactile sensation determined based on the size of the data and the free capacity or used capacity of the second storage unit;
    A device having a touch sensor comprising:
  4. The apparatus having a touch sensor according to claim 3, further comprising:
    With a display,
    Wherein, when an icon of the data displayed on the display unit is being moved by an input to the touch sensor, the control unit calculates a motion vector of the data and specifies the second storage unit based on the motion vector,
    A device having a touch sensor.
  5. Identifying each of data stored in a predetermined storage area of the storage unit and a storage area different from the predetermined storage area;
    Presenting a tactile sensation determined based on the specified data size and the free capacity or used capacity of the specified different storage areas to a touch target touching the touch sensor;
    A tactile sensation presentation method including:
  6. The tactile sensation presentation method according to claim 5, further comprising:
    A step of calculating a motion vector of the data when the icon of the data displayed on the display unit is moving due to an input to the touch sensor, and specifying the different storage area based on the motion vector;
    A tactile sensation presentation method including:
  7. Identifying each of the data stored in the first storage unit and a second storage unit different from the first storage unit;
    Presenting a tactile sensation determined based on the size of the specified data and the free or used capacity of the specified second storage unit to a touch target touching the touch sensor;
    A tactile sensation presentation method including:
  8. The tactile sensation presentation method according to claim 7, further comprising:
    A step of calculating, when an icon of the data displayed on the display unit is moved by an input to the touch sensor, a motion vector of the data, and specifying the second storage unit based on the motion vector,
    A tactile sensation presentation method including:
  9. A computer mounted on a device having a touch sensor,
    Means for respectively specifying data stored in a predetermined storage area of the storage unit and a storage area different from the predetermined storage area;
    Means for controlling the tactile sensation determined based on the size of the specified data and the free capacity or the used capacity of the specified different storage areas so as to be presented to a touch target touching the touch sensor;
    Tactile sensation presentation program to function as
  10. The tactile sensation providing program according to claim 9, further comprising:
    Means for calculating a motion vector of the data when the icon of the data displayed on the display unit is moved by an input to the touch sensor, and specifying the different storage area based on the motion vector;
    Tactile sensation presentation program to function as
  11. A computer mounted on a device having a touch sensor,
    Means for respectively specifying data stored in the first storage unit and a second storage unit different from the first storage unit;
    Means for controlling the tactile sensation determined based on the size of the specified data and the free capacity or used capacity of the specified second storage unit so as to be presented to the touch target touching the touch sensor,
    Tactile sensation presentation program to function as
  12. The tactile sensation providing program according to claim 11, further comprising:
    Means for calculating, when an icon of the data displayed on the display unit is moved by an input to the touch sensor, a motion vector of the data, and specifying the second storage unit based on the motion vector,
    Tactile sensation presentation program to function as
JP2009282259A 2009-12-11 2009-12-11 Device having touch sensor, tactile sensation presentation method, and tactile sensation presentation program Active JP5490508B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2009282259A JP5490508B2 (en) 2009-12-11 2009-12-11 Device having touch sensor, tactile sensation presentation method, and tactile sensation presentation program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2009282259A JP5490508B2 (en) 2009-12-11 2009-12-11 Device having touch sensor, tactile sensation presentation method, and tactile sensation presentation program

Publications (2)

Publication Number Publication Date
JP2011123773A true JP2011123773A (en) 2011-06-23
JP5490508B2 JP5490508B2 (en) 2014-05-14

Family

ID=44287585

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2009282259A Active JP5490508B2 (en) 2009-12-11 2009-12-11 Device having touch sensor, tactile sensation presentation method, and tactile sensation presentation program

Country Status (1)

Country Link
JP (1) JP5490508B2 (en)

Cited By (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014102829A (en) * 2012-11-20 2014-06-05 Immersion Corp System and method for feedforward and feedback with haptic effects
JP2014519128A (en) * 2011-10-27 2014-08-07 騰訊科技(深▲セン▼)有限公司 Method and apparatus for uploading and downloading files
CN104423922A (en) * 2013-08-26 2015-03-18 夏普株式会社 Image display apparatus and data transfer method
JP2015521316A (en) * 2012-05-09 2015-07-27 アップル インコーポレイテッド Device, method, and graphical user interface for providing tactile feedback of actions performed within a user interface
CN104866179A (en) * 2015-05-29 2015-08-26 小米科技有限责任公司 Method and apparatus for managing terminal application program
CN104994419A (en) * 2015-06-29 2015-10-21 天脉聚源(北京)科技有限公司 Method and device for realizing synchronous display of plurality of virtual players
US9602729B2 (en) 2015-06-07 2017-03-21 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9612741B2 (en) 2012-05-09 2017-04-04 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
US9619076B2 (en) 2012-05-09 2017-04-11 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US9632664B2 (en) 2015-03-08 2017-04-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9639184B2 (en) 2015-03-19 2017-05-02 Apple Inc. Touch input cursor manipulation
US9645732B2 (en) 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US9674426B2 (en) 2015-06-07 2017-06-06 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9753639B2 (en) 2012-05-09 2017-09-05 Apple Inc. Device, method, and graphical user interface for displaying content associated with a corresponding affordance
US9778771B2 (en) 2012-12-29 2017-10-03 Apple Inc. Device, method, and graphical user interface for transitioning between touch input to display output relationships
US9785305B2 (en) 2015-03-19 2017-10-10 Apple Inc. Touch input cursor manipulation
US9830048B2 (en) 2015-06-07 2017-11-28 Apple Inc. Devices and methods for processing touch inputs with instructions in a web page
US9880735B2 (en) 2015-08-10 2018-01-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9886184B2 (en) 2012-05-09 2018-02-06 Apple Inc. Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
US9891811B2 (en) 2015-06-07 2018-02-13 Apple Inc. Devices and methods for navigating between user interfaces
US9959025B2 (en) 2012-12-29 2018-05-01 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
US9990107B2 (en) 2015-03-08 2018-06-05 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US9990121B2 (en) 2012-05-09 2018-06-05 Apple Inc. Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
US9996231B2 (en) 2012-05-09 2018-06-12 Apple Inc. Device, method, and graphical user interface for manipulating framed graphical objects
US10037138B2 (en) 2012-12-29 2018-07-31 Apple Inc. Device, method, and graphical user interface for switching between user interfaces
US10042542B2 (en) 2012-05-09 2018-08-07 Apple Inc. Device, method, and graphical user interface for moving and dropping a user interface object
US10048757B2 (en) 2015-03-08 2018-08-14 Apple Inc. Devices and methods for controlling media presentation
US10067653B2 (en) 2015-04-01 2018-09-04 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10073615B2 (en) 2012-05-09 2018-09-11 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US10078442B2 (en) 2012-12-29 2018-09-18 Apple Inc. Device, method, and graphical user interface for determining whether to scroll or select content based on an intensity theshold
US10095391B2 (en) 2012-05-09 2018-10-09 Apple Inc. Device, method, and graphical user interface for selecting user interface objects
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US10126930B2 (en) 2012-05-09 2018-11-13 Apple Inc. Device, method, and graphical user interface for scrolling nested regions
US10162452B2 (en) 2015-08-10 2018-12-25 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10175864B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for selecting object within a group of objects in accordance with contact intensity
US10200598B2 (en) 2015-06-07 2019-02-05 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10235035B2 (en) 2015-08-10 2019-03-19 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US10248308B2 (en) 2015-08-10 2019-04-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures
US10275087B1 (en) 2011-08-05 2019-04-30 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10346030B2 (en) 2015-06-07 2019-07-09 Apple Inc. Devices and methods for navigating between user interfaces
US10416800B2 (en) 2015-08-10 2019-09-17 Apple Inc. Devices, methods, and graphical user interfaces for adjusting user interface objects
US10437333B2 (en) 2012-12-29 2019-10-08 Apple Inc. Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture
US10496260B2 (en) 2012-05-09 2019-12-03 Apple Inc. Device, method, and graphical user interface for pressure-based alteration of controls in a user interface
US10540039B1 (en) 2018-10-06 2020-01-21 P4tents1, LLC Devices and methods for navigating between user interface

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005267058A (en) * 2004-03-17 2005-09-29 Seiko Epson Corp Touch panel device and terminal device
JP2005286476A (en) * 2004-03-29 2005-10-13 Nec Saitama Ltd Electronic equipment having memory free space notification function, and notification method thereof
JP2005332063A (en) * 2004-05-18 2005-12-02 Sony Corp Input device with tactile function, information inputting method, and electronic device
JP2006011914A (en) * 2004-06-28 2006-01-12 Fuji Photo Film Co Ltd Image display controller and image display control program
JP2006018582A (en) * 2004-07-01 2006-01-19 Ricoh Co Ltd Operation display device
JP2006024039A (en) * 2004-07-08 2006-01-26 Sony Corp Information processor and program to be used for the same
JP2006252550A (en) * 2005-02-14 2006-09-21 Seiko Epson Corp File operation limiting system, file operation limiting program, file operation limiting method, electronic equipment and printer
JP2007179095A (en) * 2005-12-26 2007-07-12 Fujifilm Corp Display control method, information processor, and display control program
JP2008033739A (en) * 2006-07-31 2008-02-14 Sony Corp Touch screen interaction method and apparatus based on tactile force feedback and pressure measurement
US20090295739A1 (en) * 2008-05-27 2009-12-03 Wes Albert Nagara Haptic tactile precision selection

Cited By (80)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10345961B1 (en) 2011-08-05 2019-07-09 P4tents1, LLC Devices and methods for navigating between user interfaces
US10338736B1 (en) 2011-08-05 2019-07-02 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10365758B1 (en) 2011-08-05 2019-07-30 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10275087B1 (en) 2011-08-05 2019-04-30 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10386960B1 (en) 2011-08-05 2019-08-20 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
JP2014519128A (en) * 2011-10-27 2014-08-07 Tencent Technology (Shenzhen) Co., Ltd. Method and apparatus for uploading and downloading files
US10095391B2 (en) 2012-05-09 2018-10-09 Apple Inc. Device, method, and graphical user interface for selecting user interface objects
US9612741B2 (en) 2012-05-09 2017-04-04 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
US9619076B2 (en) 2012-05-09 2017-04-11 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US10175864B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for selecting object within a group of objects in accordance with contact intensity
US9990121B2 (en) 2012-05-09 2018-06-05 Apple Inc. Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
US10168826B2 (en) 2012-05-09 2019-01-01 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US10191627B2 (en) 2012-05-09 2019-01-29 Apple Inc. Device, method, and graphical user interface for manipulating framed graphical objects
US10126930B2 (en) 2012-05-09 2018-11-13 Apple Inc. Device, method, and graphical user interface for scrolling nested regions
US10114546B2 (en) 2012-05-09 2018-10-30 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US10481690B2 (en) 2012-05-09 2019-11-19 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for media adjustment operations performed in a user interface
US9753639B2 (en) 2012-05-09 2017-09-05 Apple Inc. Device, method, and graphical user interface for displaying content associated with a corresponding affordance
US10073615B2 (en) 2012-05-09 2018-09-11 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US10496260B2 (en) 2012-05-09 2019-12-03 Apple Inc. Device, method, and graphical user interface for pressure-based alteration of controls in a user interface
US9823839B2 (en) 2012-05-09 2017-11-21 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
JP2015521316A (en) * 2012-05-09 2015-07-27 Apple Inc. Device, method, and graphical user interface for providing tactile feedback of actions performed within a user interface
US9996231B2 (en) 2012-05-09 2018-06-12 Apple Inc. Device, method, and graphical user interface for manipulating framed graphical objects
US10175757B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for touch-based operations performed and reversed in a user interface
US9886184B2 (en) 2012-05-09 2018-02-06 Apple Inc. Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
US10042542B2 (en) 2012-05-09 2018-08-07 Apple Inc. Device, method, and graphical user interface for moving and dropping a user interface object
JP2014102829A (en) * 2012-11-20 2014-06-05 Immersion Corp System and method for feedforward and feedback with haptic effects
US9836150B2 (en) 2012-11-20 2017-12-05 Immersion Corporation System and method for feedforward and feedback with haptic effects
JP2018110000A (en) * 2012-11-20 Immersion Corporation System and method for feed-forward and feed-back by haptic effect
US9857897B2 (en) 2012-12-29 2018-01-02 Apple Inc. Device and method for assigning respective portions of an aggregate intensity to a plurality of contacts
US9965074B2 (en) 2012-12-29 2018-05-08 Apple Inc. Device, method, and graphical user interface for transitioning between touch input to display output relationships
US9778771B2 (en) 2012-12-29 2017-10-03 Apple Inc. Device, method, and graphical user interface for transitioning between touch input to display output relationships
US10078442B2 (en) 2012-12-29 2018-09-18 Apple Inc. Device, method, and graphical user interface for determining whether to scroll or select content based on an intensity threshold
US9996233B2 (en) 2012-12-29 2018-06-12 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
US10175879B2 (en) 2012-12-29 2019-01-08 Apple Inc. Device, method, and graphical user interface for zooming a user interface while performing a drag operation
US9959025B2 (en) 2012-12-29 2018-05-01 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
US10037138B2 (en) 2012-12-29 2018-07-31 Apple Inc. Device, method, and graphical user interface for switching between user interfaces
US10101887B2 (en) 2012-12-29 2018-10-16 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
US10437333B2 (en) 2012-12-29 2019-10-08 Apple Inc. Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture
US10185491B2 (en) 2012-12-29 2019-01-22 Apple Inc. Device, method, and graphical user interface for determining whether to scroll or enlarge content
CN104423922A (en) * 2013-08-26 2015-03-18 夏普株式会社 Image display apparatus and data transfer method
US10402073B2 (en) 2015-03-08 2019-09-03 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US10180772B2 (en) 2015-03-08 2019-01-15 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10268342B2 (en) 2015-03-08 2019-04-23 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US10338772B2 (en) 2015-03-08 2019-07-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10387029B2 (en) 2015-03-08 2019-08-20 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US10048757B2 (en) 2015-03-08 2018-08-14 Apple Inc. Devices and methods for controlling media presentation
US10268341B2 (en) 2015-03-08 2019-04-23 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9645709B2 (en) 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9645732B2 (en) 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US9990107B2 (en) 2015-03-08 2018-06-05 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US9632664B2 (en) 2015-03-08 2017-04-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10067645B2 (en) 2015-03-08 2018-09-04 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9785305B2 (en) 2015-03-19 2017-10-10 Apple Inc. Touch input cursor manipulation
US10222980B2 (en) 2015-03-19 2019-03-05 Apple Inc. Touch input cursor manipulation
US9639184B2 (en) 2015-03-19 2017-05-02 Apple Inc. Touch input cursor manipulation
US10067653B2 (en) 2015-04-01 2018-09-04 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10152208B2 (en) 2015-04-01 2018-12-11 Apple Inc. Devices and methods for processing touch inputs based on their intensities
JP2017523484A (en) * 2015-05-29 2017-08-17 Xiaomi Inc. Method and apparatus for managing terminal applications
CN104866179A (en) * 2015-05-29 2015-08-26 小米科技有限责任公司 Method and apparatus for managing terminal application program
US9830048B2 (en) 2015-06-07 2017-11-28 Apple Inc. Devices and methods for processing touch inputs with instructions in a web page
US9706127B2 (en) 2015-06-07 2017-07-11 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9674426B2 (en) 2015-06-07 2017-06-06 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9602729B2 (en) 2015-06-07 2017-03-21 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10455146B2 (en) 2015-06-07 2019-10-22 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10303354B2 (en) 2015-06-07 2019-05-28 Apple Inc. Devices and methods for navigating between user interfaces
US9916080B2 (en) 2015-06-07 2018-03-13 Apple Inc. Devices and methods for navigating between user interfaces
US9891811B2 (en) 2015-06-07 2018-02-13 Apple Inc. Devices and methods for navigating between user interfaces
US10200598B2 (en) 2015-06-07 2019-02-05 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9860451B2 (en) 2015-06-07 2018-01-02 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10346030B2 (en) 2015-06-07 2019-07-09 Apple Inc. Devices and methods for navigating between user interfaces
CN104994419A (en) * 2015-06-29 2015-10-21 天脉聚源(北京)科技有限公司 Method and device for realizing synchronous display of plurality of virtual players
US10248308B2 (en) 2015-08-10 2019-04-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures
US10235035B2 (en) 2015-08-10 2019-03-19 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US10416800B2 (en) 2015-08-10 2019-09-17 Apple Inc. Devices, methods, and graphical user interfaces for adjusting user interface objects
US10209884B2 (en) 2015-08-10 2019-02-19 Apple Inc. Devices, Methods, and Graphical User Interfaces for Manipulating User Interface Objects with Visual and/or Haptic Feedback
US10162452B2 (en) 2015-08-10 2018-12-25 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10203868B2 (en) 2015-08-10 2019-02-12 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9880735B2 (en) 2015-08-10 2018-01-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10540039B1 (en) 2018-10-06 2020-01-21 P4tents1, LLC Devices and methods for navigating between user interface

Also Published As

Publication number Publication date
JP5490508B2 (en) 2014-05-14

Similar Documents

Publication Publication Date Title
US9965074B2 (en) Device, method, and graphical user interface for transitioning between touch input to display output relationships
US6657615B2 (en) Input processing method and input processing device for implementing same
EP1979804B1 (en) Gesturing with a multipoint sensing device
US9477370B2 (en) Method and terminal for displaying a plurality of pages, method and terminal for displaying a plurality of applications being executed on terminal, and method of executing a plurality of applications
EP2724217B1 (en) Active stylus
US8854317B2 (en) Information processing apparatus, information processing method and program for executing processing based on detected drag operation
US7924271B2 (en) Detecting gestures on multi-event sensitive devices
TWI479369B (en) Computer-storage media and method for virtual touchpad
EP3028123B1 (en) Electronic device and method of recognizing input in electronic device
KR101019900B1 (en) Operation of a computer with touch screen interface
EP0996052A2 (en) Input processing method and input control apparatus
US7181697B2 (en) Method of implementing a plurality of system tray areas
CN100587657C (en) Apparatus and method for touch screen interaction based on tactile feedback and pressure measurement
TWI493394B (en) Bimodal touch sensitive digital notebook
US20090243998A1 (en) Apparatus, method and computer program product for providing an input gesture indicator
WO2013008649A1 (en) User interface device capable of execution of input by finger contact in plurality of modes, input operation assessment method, and program
US9041658B2 (en) Touch screen device and operating method thereof
US9250783B2 (en) Toggle gesture during drag gesture
US20130155018A1 (en) Device and method for emulating a touch screen using force information
KR101803948B1 (en) Touch-sensitive button with two levels
RU2523169C2 (en) Panning content using drag operation
CN101226443B (en) Mobile electronic apparatus with touch input device and display method using the same
US20080297484A1 (en) Method and apparatus for providing gesture information based on touchscreen and information terminal device having the apparatus
US9350841B2 (en) Handheld device with reconfiguring touch controls
CN101490643B (en) By touching the touch panel at a predetermined position recognition function for controlling the scrolling gesture is activated scrolling method

Legal Events

Date       Code Title                                                                          Description
2012-11-15 A621 Written request for application examination                                    JAPANESE INTERMEDIATE CODE: A621
2013-07-17 A977 Report on retrieval                                                            JAPANESE INTERMEDIATE CODE: A971007
2013-07-23 A131 Notification of reasons for refusal                                            JAPANESE INTERMEDIATE CODE: A131
2013-09-02 A977 Report on retrieval                                                            JAPANESE INTERMEDIATE CODE: A971007
           TRDD Decision of grant or rejection written
2014-01-28 A01  Written decision to grant a patent or to grant a registration (utility model)  JAPANESE INTERMEDIATE CODE: A01
2014-02-26 A61  First payment of annual fees (during grant procedure)                          JAPANESE INTERMEDIATE CODE: A61
           R150 Certificate of patent (grant) or registration of utility model                 Ref document number: 5490508; Country of ref document: JP; JAPANESE INTERMEDIATE CODE: R150