WO2018123320A1 - User interface device and electronic apparatus

User interface device and electronic apparatus

Info

Publication number
WO2018123320A1
Authority
WO
WIPO (PCT)
Prior art keywords
control unit
touch panel
user interface
display
interface device
Prior art date
Application number
PCT/JP2017/041151
Other languages
French (fr)
Japanese (ja)
Inventor
佐藤 克則
兼平 浩紀
赤間 博
優子 千田
森田 修身
佳 共
光春 菅原
Original Assignee
デクセリアルズ株式会社
Application filed by デクセリアルズ株式会社
Publication of WO2018123320A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F 3/023 Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • H ELECTRICITY
    • H03 ELECTRONIC CIRCUITRY
    • H03M CODING; DECODING; CODE CONVERSION IN GENERAL
    • H03M 11/00 Coding in connection with keyboards or like devices, i.e. coding of the position of operated keys
    • H03M 11/02 Details
    • H03M 11/04 Coding of multifunction keys

Definitions

  • the present invention relates to a user interface device and an electronic device.
  • touch panels that can be easily and sensibly operated with a finger or the like have become widespread, and various studies and developments have been made on downsizing, thinning, lightening, power saving, and cost reduction of touch panels.
  • A position where an instruction medium such as a finger touches the touch panel is referred to as a "touch position".
  • Known methods for detecting the touch position include a resistive film type that detects a change in electrical resistance, a surface acoustic wave type that uses ultrasonic waves, and a capacitive type that detects a change in capacitance.
  • a capacitive touch panel is attracting attention in that it can detect a plurality of touch positions.
  • the capacitive touch panel includes a transparent electrode that generates a capacitance and an external circuit that detects a change in the capacitance.
  • In recent years, touch panels including a pressure-sensitive sensor capable of detecting the pressing force applied when the touch panel is pressed with a stylus pen, a finger, or the like have been provided.
  • Patent Document 1 discloses a technique in which an input made by pressing on the input operation surface is determined based on a change in the capacitance between the upper electrode and the lower electrode of a pressure-sensitive sensor, which are displaced when the top plate and the touch panel move in the pressing direction.
  • Patent Document 2 discloses a sensor device including a pressure-sensitive sensor that does not use an elastic member.
  • Patent Document 1: JP 2011-134000 A; Patent Document 2: JP 2011-100364 A
  • Touch panels are used in various electronic devices.
  • an electronic device using a touch panel for example, there is a navigation device that is mounted on a car and displays a map, directions, and the like from the current position of the vehicle to a destination on a display panel.
  • a navigation device requires a simple operation.
  • However, the touch panel attached to a conventional navigation device has only a function of detecting the XY coordinates of the touch position, which is inconvenient because the user must perform complicated operations to obtain a desired result.
  • In the techniques of Patent Documents 1 and 2, the pressure-sensitive sensor is merely used in an auxiliary manner to reliably detect that a finger or the like has touched the touch panel. For this reason, even with a touch panel to which the techniques disclosed in Patent Documents 1 and 2 are applied, the operation itself is the same as that of a conventional touch panel without a pressure-sensitive function, and the inconvenience of operation is not eliminated. Moreover, operation using an external input device or the like may also be required.
  • the present invention has been made in view of such a situation, and an object thereof is to make it possible to easily obtain a result requested by a user.
  • A user interface device according to the present invention includes a display panel that displays an object, a display control unit that controls the display of the object on the display panel according to control information, a touch panel on which a touch operation is performed with an instruction medium, a coordinate detection unit that detects the coordinates of the touch position, a pressure-sensitive detection unit that detects that the touch panel has been pressed, and a control unit that outputs control information to the display control unit based on a gesture operation combining the touch operation and the pressing operation.
  • An electronic apparatus includes the above-described user interface device and an electronic apparatus main body.
  • the electronic device main body performs a predetermined process based on content instructed by a touch operation or a gesture operation input from the control unit, and outputs an object on which the predetermined process has been performed to the control unit of the user interface device.
  • According to the present invention, the user can easily obtain a desired result by performing an intuitive gesture operation that combines a touch operation and a pressing operation.
  • FIG. 1 is an explanatory diagram showing a state when a driver seat and a passenger seat of a vehicle on which the navigation device 1 is mounted are viewed from the rear in the traveling direction toward the front.
  • the navigation device 1 (an example of an electronic device) is installed at a position sandwiched between the dashboard 5 and the meter panel 6 and visible to a user who holds the steering wheel 7.
  • a shift lever 8 is provided below the navigation device 1.
  • the navigation device 1 includes a navigation device body 2 that performs processing necessary for navigation, and a user interface device 3.
  • the navigation device body 2 is fitted in a recess formed in the dashboard 5 and cannot be directly recognized by the user.
  • the user interface device 3 is disposed at a position where the user can visually recognize, and outputs an instruction input by the user to the navigation device main body 2 and indicates a processing result by the navigation device main body 2 to the user.
  • the user interface device 3 is configured to have a sense of unity with the interior shape from the dashboard 5, the meter panel 6 and the shift lever 8 to the base of the windshield 4.
  • the user interface device 3 displays information (a map, various icons, etc.) necessary for navigation processing and the like output from the navigation device body 2.
  • the instruction medium touching the touch panel 20 is a user's finger.
  • An operation performed by the user moving a finger while touching the touch panel 20 is referred to as a "touch operation".
  • An operation in which the user presses the touch panel 20 with a finger is referred to as a "pressing operation".
  • An operation performed by combining the touch operation and the pressing operation is referred to as a "gesture operation".
  • the conventional gesture operation is performed only by a touch operation.
  • FIG. 2 is a block diagram illustrating an internal configuration example of the user interface device 3.
  • the navigation device main body 2 (an example of the electronic device main body) performs predetermined processing based on the instructed content by a touch operation or a gesture operation input from the control unit 10 of the user interface device 3. Then, the navigation device body 2 outputs the object after performing the predetermined processing based on the instruction to the control unit 10 of the user interface device 3. Further, the control unit 10 determines a touch operation (finger moving direction, touch position, etc.) performed on the object.
  • Objects that are output from the navigation device body 2 to the user interface device 3 include, for example, maps, character strings, icons, images, and the like.
  • For example, the navigation device body 2 performs navigation processing based on the gesture operation performed on the user interface device 3, outputs a map to be displayed by the user interface device 3, and edits a character string input from the user interface device 3.
  • the user interface device 3 includes a control unit 10, a storage medium control unit 14, a storage medium 15, a communication control unit 16, a coordinate detection unit 17, a pressure sensitive detection unit 18, and a display control unit 19 connected by a bus B.
  • the user interface device 3 also includes a touch panel 20 connected to the coordinate detection unit 17, a pressure sensor 30 connected to the pressure detection unit 18, and a display panel 40 connected to the display control unit 19.
  • the control unit 10 controls the operation of each unit in the user interface device 3.
  • the control unit 10 determines that a touch operation has been performed based on the coordinate detection information input from the coordinate detection unit 17, and outputs information input from the touch position to the navigation device body 2. Further, the control unit 10 determines that the pressing operation in which the touch panel 20 is pressed with a finger is performed based on the pressure-sensitive detection information input from the pressure-sensitive detection unit 18. In addition, the control unit 10 outputs control information for causing the display panel 40 to display the object output from the navigation device body 2 to the user interface device 3 to the display control unit 19.
  • control unit 10 outputs control information for changing the display form of the object displayed on the display panel 40 to the display control unit 19 based on a gesture operation that combines the touch operation and the push-in operation.
  • changing the display form of an object means, for example, displaying a map on an enlarged or reduced scale.
  • the control unit 10 includes a CPU 11, a RAM 12, and a ROM 13, and the CPU 11, the RAM 12, and the ROM 13 work together to realize the function of the control unit 10.
  • the CPU (Central Processing Unit) 11 is an example of a computer that controls the operation of each unit in the user interface device 3. For example, the CPU 11 executes a program read from the ROM 13 and performs processing related to the gesture operation according to the present embodiment.
  • a RAM (Random Access Memory) 12 stores temporary data such as a program executed by the CPU 11.
  • ROM (Read Only Memory) 13 stores a program read by the CPU 11 and the like.
  • the ROM 13 is used as an example of a computer-readable non-transitory recording medium that stores a program executed by the CPU 11. For this reason, this program is permanently stored in the ROM 13.
  • the computer-readable non-transitory recording medium storing the program executed by the CPU 11 may be a recording medium such as a CD-ROM or a DVD-ROM.
  • the storage medium control unit 14 controls the storage medium 15.
  • the storage medium control unit 14 writes data input from the control unit 10 to the storage medium 15, or reads data stored in the storage medium 15 according to an instruction from the control unit 10 and outputs the data to the control unit 10.
  • The storage medium 15 is inserted into a slot or the like provided in the user interface device 3; data is written to it by the storage medium control unit 14, and data stored in it is read by the storage medium control unit 14.
  • the storage medium 15 stores a navigation program that operates in the navigation apparatus main body 2, upgrade data of a character editing program, and the like.
  • the communication control unit 16 controls data communication processing performed through the network N between the navigation device body 2 and the user interface device 3.
  • the coordinate detection unit 17 detects the coordinates of the touch position of the touch panel 20 where the touch operation is performed.
  • The touch panel 20 is formed in a planar rectangular shape; the positions of the intersections of the mutually crossing X electrodes and Y electrodes serve as coordinate information, and a value corresponding to the change in capacitance at each position is output to the coordinate detection unit 17.
  • the coordinate detection unit 17 detects the coordinate of the location where the coordinate information input from the touch panel 20 has changed as a finger touch position, and outputs coordinate detection information including the coordinate of the touch position to the control unit 10.
  • The pressure-sensitive detection unit 18 detects, based on the output of the pressure-sensitive sensor 30, that the touch panel 20 has been pressed with a finger.
  • The pressure-sensitive sensor 30 is provided on the back surface of the touch panel 20 and outputs to the pressure-sensitive detection unit 18 a sensor value that changes according to the pressing force applied to the touch panel 20 by the finger. Based on the sensor value input from the pressure-sensitive sensor 30, the pressure-sensitive detection unit 18 detects that a pressing operation of pressing the touch panel 20 with a finger has been performed, and outputs pressure-sensitive detection information to the control unit 10.
  • the display control unit 19 performs control to display objects such as icons and maps necessary for navigation on the display panel 40 formed in a planar rectangular shape according to the control information input from the control unit 10.
  • FIG. 3 is a schematic configuration diagram of the user interface device 3.
  • the position of the pressure-sensitive sensor 30 installed on the touch panel 20 as viewed from above and the position of the frame of the housing 55 are indicated by broken lines.
  • Six pressure-sensitive sensors 30 are provided on the back surface of the touch panel 20 on the frame of the housing 55 of the user interface device 3 including the touch panel 20. For this reason, the user does not see the pressure sensor 30 directly.
  • a plurality of pressure-sensitive sensors 30 are usually provided, but a plurality of pressure-sensitive sensors 30 are not necessarily provided, and may be one.
  • the top plate 50 protects the surface of the touch panel 20.
  • a transparent glass substrate, a film, or the like is used for the top plate 50.
  • the surface of the top plate 50 is an input operation surface 51 on which a user touches a finger for performing a touch operation.
  • the touch panel 20 is configured, for example, by laminating a transparent X electrode substrate 21, an adhesive layer 22, and a Y electrode substrate 23 in this order.
  • the top plate 50 and the X electrode substrate 21 are bonded and fixed by an adhesive layer 52.
  • Each of the X electrode substrate 21 and the Y electrode substrate 23 has a rectangular shape.
  • the X electrode substrate 21 and the Y electrode substrate 23 are bonded by the adhesive layer 22.
  • An area where the X direction detection electrode (not shown) formed on the X electrode substrate 21 and the Y direction detection electrode (not shown) formed on the Y electrode substrate 23 overlap each other in a plane is a coordinate detection area on the XY plane.
  • the pressure sensitive sensor 30 is disposed in a peripheral area (frame) outside the coordinate detection area on the XY plane of the touch panel 20.
  • The pressure-sensitive sensor 30 includes an elastic body 33 made of a dielectric material disposed between the touch panel 20 and the housing 55, and an upper electrode 31 and a lower electrode 35 that are disposed so as to sandwich the elastic body 33 and form a capacitor.
  • the pressure-sensitive sensor 30 further includes an adhesive layer 32 that bonds and fixes the elastic body 33 and the upper electrode 31, and an adhesive layer 34 that bonds and fixes the elastic body 33 and the lower electrode 35.
  • The elastic bodies constituting the six pressure-sensitive sensors 30 are connected to form one frame-shaped elastic body 33, so the six pressure-sensitive sensors 30 share the single elastic body 33.
  • By providing the elastic body 33 in a frame shape, it is possible to prevent dust and other foreign matter from entering the gap 41 between the touch panel 20 and the housing 55, that is, between the touch panel 20 and the display panel 40.
  • the lower electrodes 35 constituting the six pressure sensors 30 are connected to constitute one frame-like lower electrode 35, and the six pressure sensors 30 share one lower electrode 35.
  • the upper electrode 31 may also be formed in a frame shape like the lower electrode 35.
  • For the elastic body 33, for example, a material having a small residual strain and a high restoration rate (restoration speed) is used.
  • Examples of the material used for the elastic body 33 include silicone rubber and urethane rubber.
  • the elastic body 33 may be displaced about 10% at the maximum with respect to the height of the original elastic body 33, for example.
  • the elastic body 33 having a thickness of 0.5 mm used for the pressure-sensitive sensor 30 may be displaced by about 10 μm.
  • When the input operation surface 51 is pressed with a finger, the top plate 50 and the touch panel 20, to which the pressure-sensitive sensors 30 are bonded and fixed, move in the pressing direction, and the elastic body 33 of each pressure-sensitive sensor 30 is distorted.
  • When the pressure-sensitive sensor 30 is pressed, its thickness is displaced in the pressing direction.
  • At this time, the back surface of the touch panel 20 approaches the front surface of the display panel 40 by the amount of displacement of the pressure-sensitive sensor 30; therefore, a gap 41 is provided between the touch panel 20 and the display panel 40 in consideration of this movement of the touch panel 20.
  • the gap 41 is not provided when the touch panel 20 and the display panel 40 are bonded.
  • the pressure-sensitive sensor 30 outputs a sensor value corresponding to the capacitance of the capacitor formed by the upper electrode 31 and the lower electrode 35 to the pressure-sensitive detection unit 18.
  • FIG. 4 is a diagram for explaining the operating principle of the pressure-sensitive sensor 30.
  • the description of the touch panel 20 is omitted.
  • An example of the pressure-sensitive sensor 30 that is not pressed with a finger is shown in the upper left of FIG. 4, and an example of the pressure-sensitive sensor 30 that is pressed with a finger is shown in the upper right of FIG.
  • the elastic body 33 is distorted so that the thickness decreases.
  • the electrostatic capacitance of the pressure sensor 30 changes because the thickness of the pressed pressure sensor 30 changes with respect to the thickness of the pressure sensor 30 which is not pressed.
  • Using the rate of change in the capacitance between the upper electrode 31 and the lower electrode 35 caused by the displacement d of the elastic body 33, the pressure-sensitive detection unit 18 detects that the touch panel 20 has been pressed and that the pressure-sensitive sensor 30 has sensed pressure.
  • The capacitance change rate is obtained by the pressure-sensitive detection unit 18 based on the sensor values output from the upper electrode 31 and the lower electrode 35.
  • The sensor value is a voltage determined by the capacitance between the upper electrode 31 and the lower electrode 35. That is, the rate of change in capacitance is obtained as the ratio of the capacitance between the upper electrode 31 and the lower electrode 35 of the pressed pressure-sensitive sensor 30 to the capacitance between the upper electrode 31 and the lower electrode 35 of the pressure-sensitive sensor 30 that is not pressed.
  • Therefore, the pressure-sensitive detection unit 18 can obtain the change in capacitance between the upper electrode 31 and the lower electrode 35, that is, the capacitance change rate, based on the sensor value input from the pressure-sensitive sensor 30.
  • The pressure-sensitive detection unit 18 converts the sensor value detected by each pressure-sensitive sensor 30 disposed on the back surface of the touch panel 20 into the pressing force applied when the touch panel 20 is pressed, based on the graph at the bottom of FIG. 4 showing the relationship between the capacitance change rate and the pressing force. Then, when the pressing force exceeds a pressing threshold, the pressure-sensitive detection unit 18 determines that the user has consciously pressed the touch panel 20 and outputs pressure-sensitive detection information to the control unit 10.
  • The pressure-sensitive detection unit 18 may obtain the pressing force based on the total of the capacitance change rates detected by the respective pressure-sensitive sensors 30. This makes it possible to detect the pressing force with high accuracy regardless of the touch position on the input operation surface 51. Note that the pressure-sensitive detection unit 18 may also obtain the pressing force from an average value obtained by dividing the total of the capacitance change rates by the number of pressure-sensitive sensors 30.
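  • As a rough illustration of the aggregation described above, the following sketch computes each sensor's capacitance change rate against its unpressed baseline and converts the total into a pressing force. The function names, the example values, and the linear conversion factor are assumptions for illustration; the document describes only the behavior, not an implementation.

```python
def capacitance_change_rates(baseline_caps, pressed_caps):
    """Per-sensor capacitance change rate, in percent, relative to the
    unpressed baseline (illustrative only)."""
    return [(pressed - base) / base * 100.0
            for base, pressed in zip(baseline_caps, pressed_caps)]


def pressing_force(change_rates, percent_to_force=0.5):
    """Convert the total change rate over all pressure-sensitive sensors into
    a pressing force. The linear factor stands in for the graph at the bottom
    of FIG. 4 and is an assumed value."""
    return sum(change_rates) * percent_to_force


# Example: six sensors, unpressed vs. pressed capacitances (arbitrary units).
baseline = [10.0] * 6
pressed = [10.3, 10.4, 10.2, 10.3, 10.5, 10.3]
force = pressing_force(capacitance_change_rates(baseline, pressed))
```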
  • The pressure-sensitive detection unit 18 may output pressure-sensitive detection information that varies depending on the magnitude of the pressing force to the control unit 10. For example, in the lower graph of FIG. 4, the pressing force at which the capacitance change rate is 2.0% is defined as a threshold th1 (an example of a first pressing threshold), and the pressing force at which the capacitance change rate is 6.0% is defined as a threshold th2 (an example of a second pressing threshold). If the pressing force is less than the threshold th1, it is considered that the capacitance has merely changed due to vibration applied to the user interface device 3 through the housing 55; that is, it is assumed that the user has not intentionally pressed the finger into the touch panel 20.
  • the pressure-sensitive detection unit 18 does not determine that the user has pressed the finger into the touch panel 20, and therefore does not output pressure-sensitive detection information. However, if the pressing force is equal to or greater than the threshold th1, the pressure-sensitive detection unit 18 determines that the user has pressed the finger into the touch panel 20, and outputs first pressure-sensitive detection information. Furthermore, if the pressing force is greater than or equal to the threshold th2, the pressure-sensitive detection unit 18 determines that the user has pressed the finger strongly into the touch panel 20, and outputs second pressure-sensitive detection information.
  • In this way, the pressure-sensitive detection unit 18 can output the first pressure-sensitive detection information or the second pressure-sensitive detection information to the control unit 10 based on the pressing force obtained from the capacitance change rate, which changes according to how strongly the user presses the finger into the touch panel 20. As a result, the control unit 10 can perform different processes according to whether the first or the second pressure-sensitive detection information is input from the pressure-sensitive detection unit 18. Note that only one of the first pressure-sensitive detection information and the second pressure-sensitive detection information may be used. When the first and second pressure-sensitive detection information are not distinguished, they are simply referred to as "pressure-sensitive detection information".
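  • A minimal sketch of the two-level decision above, assuming the pressing force has already been obtained as in the previous example; the threshold values and return labels are illustrative, not taken from the document.

```python
TH1 = 1.0  # pressing force at a 2.0% capacitance change rate (assumed value)
TH2 = 3.0  # pressing force at a 6.0% capacitance change rate (assumed value)


def classify_press(force):
    """Return which pressure-sensitive detection information, if any, would be
    output to the control unit 10 for a given pressing force."""
    if force < TH1:
        return None                        # treated as vibration; no output
    if force < TH2:
        return "first_pressure_detection"  # finger pressed into the panel
    return "second_pressure_detection"     # finger pressed in strongly
```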
  • When pressure-sensitive detection information is input from the pressure-sensitive detection unit 18, the control unit 10 determines that the finger has been pressed into the touch panel 20 and that a determination operation has been performed by the user. At this time, in order to inform the user of the pressed state, the control unit 10 may vibrate the housing 55 or display a message on the display panel 40, and a voice guide may be emitted from a speaker (not shown). Alternatively, the control unit 10 can display an icon such as a circle or a square at the pressed touch position; in this case, the icon may blink or its display color may be changed.
  • The housing 55 may also generate a click sound or change the tactile feel of the finger when the touch panel 20 is pressed.
  • In order to explain the specific contents of the gesture operation, each step of the conventional flowchart shown in FIG. 5 and of the flowchart according to the first embodiment of the present invention shown in FIG. 6 is accompanied by a display example of the screen shown on the display panel of the user interface device 100 or 3, to which the position of the user's finger on the touch panel is added.
  • The screen added to each step of FIG. 5 and FIG. 6 shows, within a rectangular broken-line range, the map screen of the user interface device 100 or 3 at a certain moment together with a front view, a bottom view, and a left side view of the finger.
  • a thin white arrow in the screen indicates the direction of finger movement.
  • FIG. 5 is a flowchart showing an example of a conventional enlargement or reduction operation.
  • a control unit (not shown) provided in the conventional user interface device 100 detects that a finger touches the touch panel provided in the user interface device 100 (S1).
  • the control unit of the conventional user interface device 100 is simply referred to as a “control unit” without reference numeral.
  • the user selects an operation that changes depending on the number of fingers on the touch panel (S2).
  • That is, an operation performed with one finger (for example, a swipe) or an operation performed with two fingers is selected.
  • the control unit detects the number of touches of the finger touching the touch panel (S3).
  • When the number of finger touches is one (one in S3), the control unit outputs to a display control unit (not shown) control information for moving and displaying the map in accordance with the direction in which the finger touching the touch panel moves (S4).
  • the display control unit of the conventional user interface device 100 is simply referred to as a “display control unit” without reference numerals.
  • The screen added in step S4 shows a state in which the map is moved and displayed in accordance with the moving direction of the one finger that the user moves from the lower left to the upper right of the screen while touching it.
  • The screen added in step S5 shows a state in which the user touches the vicinity of the center of the screen with two fingers.
  • When a pinch-out that increases the distance between the touch positions is performed (enlargement in S5), the control unit outputs control information for enlarging the map to the display control unit (S6).
  • the screen added in step S6 shows a state in which the user enlarges the map by widening the interval between the two fingers (pinch out) while touching the two fingers on the screen.
  • On the other hand, when a pinch-in that reduces the distance between the touch positions is performed in step S5 (reduction in S5), the control unit outputs control information for reducing the map to the display control unit (S7).
  • the screen added in step S7 shows how the map is reduced and displayed by narrowing the interval between the two fingers (pinch in) while the user touches the screen with two fingers.
  • the display control unit displays the screen requested by the user on the display panel in accordance with the control information input from the control unit (S8), and ends this process.
  • As described above, conventionally, the map has been enlarged or reduced using only "touch panel touch detection" and "XY coordinate movement detection". In order to enlarge or reduce the map, the user had no choice but to pinch out or pinch in with two fingers, combine two different gestures, or press a display magnification change button.
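  • For contrast with the embodiments that follow, the conventional two-finger flow of FIG. 5 can be pictured roughly as in the sketch below. The touch-event representation and the map-view interface (pan, zoom) are assumptions, not part of the document.

```python
import math


def handle_conventional_gesture(touches, prev_touches, view):
    """touches / prev_touches: lists of (x, y) positions for the current and
    previous sample; view: a hypothetical map view with pan(dx, dy) and
    zoom(factor) methods."""
    if len(touches) == 1 and len(prev_touches) == 1:
        # S4: one finger -> move the map with the finger
        dx = touches[0][0] - prev_touches[0][0]
        dy = touches[0][1] - prev_touches[0][1]
        view.pan(dx, dy)
    elif len(touches) == 2 and len(prev_touches) == 2:
        # S5-S7: two fingers -> pinch out enlarges, pinch in reduces the map
        dist = math.hypot(touches[1][0] - touches[0][0],
                          touches[1][1] - touches[0][1])
        prev = math.hypot(prev_touches[1][0] - prev_touches[0][0],
                          prev_touches[1][1] - prev_touches[0][1])
        if prev > 0:
            view.zoom(dist / prev)
```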
  • FIG. 6 is a flowchart illustrating an example of the enlargement or reduction operation according to the first embodiment.
  • In the first embodiment, the map is enlarged or reduced by a simple operation using one finger. That is, the control unit 10 according to the first embodiment can output to the display control unit 19 control information for enlarging or reducing the map in accordance with the direction in which the finger moves while pressing the touch panel 20.
  • the gesture operation is, for example, an operation of moving a finger pressed by the user into the touch panel 20 upward or downward.
  • the coordinate detection unit 17 detects that the touch has been made (S11), and detects the coordinates of the touch position.
  • the screen added in step S11 shows a state in which the user touches the lower left of the screen with one finger.
  • The screen added in step S12 shows a state in which the user touches the upper left of the screen with one finger and further presses the touch panel 20. The fact that the touch panel 20 is pressed by the finger is indicated by the downward arrow representing the moving direction of the finger with respect to the touch panel 20.
  • The pressure-sensitive detection unit 18 determines whether or not the pressure-sensitive sensor 30 has detected pressure (S13).
  • If no pressure is detected (NO in S13), the control unit 10 outputs to the display control unit 19 control information for moving and displaying the map in accordance with the direction in which the finger touching the touch panel 20 moves (S14).
  • The screen added in step S14 shows a state in which, as in the conventional case, the map is moved and displayed in accordance with the moving direction of the one finger that the user moves from the lower left to the upper right of the screen while touching it.
  • If pressure is detected (YES in S13), the control unit 10 determines, based on the coordinate detection information input from the coordinate detection unit 17, whether the moving direction of the finger moving while pressed into the touch panel 20 is up or down (S15).
  • When the direction in which the finger moves on the touch panel 20 is upward (an example of a first direction) (up in S15), the control unit 10 outputs to the display control unit 19 control information for enlarging the map displayed on the display panel 40 (S16). The map is continuously enlarged as the moving distance of the finger pressed into the touch panel 20 increases.
  • the screen added in step S16 shows a state in which the map is enlarged by the user moving his / her finger up while pressing the finger on the touch panel 20.
  • On the other hand, when the direction in which the finger moves on the touch panel 20 is downward (an example of a second direction) (down in S15), the control unit 10 outputs to the display control unit 19 control information for reducing the map displayed on the display panel 40 (S17). The map is continuously reduced as the moving distance of the finger pressed into the touch panel 20 increases.
  • The screen added in step S17 shows a state in which the map is reduced by the user moving the finger down while pressing it into the touch panel 20.
  • the display control unit 19 displays the screen requested by the user on the display panel 40 in accordance with the control information input from the control unit 10 (S18), and ends this process.
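  • The branching of FIG. 6 might look roughly like the following sketch. The event fields, the zoom step, and the map-view interface are assumptions for illustration only.

```python
def handle_gesture_first_embodiment(event, view, zoom_step=0.02):
    """event: object with .pressed (bool, from the pressure-sensitive
    detection unit) and .dx/.dy (finger movement since the last sample,
    screen coordinates with +y pointing down). view: hypothetical map view
    with pan(dx, dy) and zoom(factor)."""
    if not event.pressed:
        # S14: plain touch -> move the map with the finger
        view.pan(event.dx, event.dy)
    elif event.dy < 0:
        # S16: pressed and moving up -> enlarge; more movement, more zoom
        view.zoom(1.0 + zoom_step * abs(event.dy))
    elif event.dy > 0:
        # S17: pressed and moving down -> reduce
        view.zoom(1.0 / (1.0 + zoom_step * abs(event.dy)))
```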
  • In this way, according to the first embodiment, a map displayed on the display panel 40 can be enlarged or reduced to an arbitrary magnification by a simple gesture operation using one finger. This simplifies an operation method that has conventionally been realized by combining two fingers or a plurality of gestures.
  • Further, the user can automatically shift from a location search mode operation that changes the coordinates of the map (an operation of moving the finger in the XY directions after designating one coordinate) to a magnification change mode operation for enlarging or reducing the map. That is, the user can enlarge or reduce the map simply by moving the finger pressed into the touch panel 20 up or down, and can therefore realize the intended function with only one finger.
  • In addition, the user can continuously perform, with one finger, an operation of moving and displaying an object such as a map or a photograph and an operation of changing its display magnification.
  • the navigation device body 2 may output control information to the user interface device 3 so that the user cannot operate using the user interface device 3 while the vehicle is traveling. Conversely, control information that allows the user to operate using the user interface device 3 while the vehicle is stopped may be output to the user interface device 3.
  • The map may instead be set to be reduced when the moving direction of the pressed finger is up and enlarged when it is down. Also, for example, the map may be enlarged when the moving direction of the pressed finger is to the right and reduced when it is to the left, or conversely, enlarged when the moving direction is to the left and reduced when it is to the right.
  • In order to return an enlarged or reduced map to its original magnification, for example, an operation of immediately releasing the finger that the user has pressed into the touch panel 20, or an operation of drawing an arbitrary mark (circle, square, etc.) on the touch panel 20 with the finger, may be used as a reset operation.
  • In image editing software, image browsing software, and the like, enlargement or reduction of an image may also be realized using the user interface device 3 according to the first embodiment.
  • the gesture operation according to the second embodiment is also a gesture operation performed to enlarge or reduce the map.
  • Conventionally, a display magnification change button for instructing enlargement or reduction may be displayed at a predetermined position in the screen. In that case, in order to enlarge or reduce the map, the user must always check the position where the display magnification change button is displayed.
  • FIG. 7 is a flowchart illustrating an example of an enlargement or reduction operation according to the second embodiment.
  • the control unit 10 according to the second embodiment outputs control information for enlarging or reducing the map to the display control unit 19 according to the time during which the state in which the touch panel 20 is pressed by the finger continues.
  • the gesture operation is, for example, an operation in which the user pushes the touch panel 20 with one finger.
  • the coordinate detection unit 17 detects that the touch has been made (S21), and detects the coordinates of the touch position.
  • In the added screen, the map is moved and displayed in accordance with the moving direction of the one finger moved from the lower left to the upper right of the screen.
  • Next, the user selects an operation of pressing a finger into the touch panel 20 (S22).
  • the screen added in step S22 shows a state in which the user touches the left side of the screen with one finger and further presses the touch panel 20.
  • the pressure-sensitive detection unit 18 determines whether or not the pressure-sensitive sensor 30 has detected pressure based on the sensor value output from the pressure-sensitive sensor 30 (S23).
  • When the pressure-sensitive detection unit 18 determines that pressure has been detected based on the sensor value (YES in S23), the touch operation is being performed with the finger pressed in. In this case, the control unit 10 determines whether the pressure-sensitive state is maintained (S24).
  • the state in which the pressure-sensitive detection unit 18 continues to output pressure-sensitive detection information to the control unit 10 when the touch panel 20 is pushed with a finger is expressed as “the pressure-sensitive state is maintained”.
  • the control unit 10 determines whether or not the pressure-sensitive state is maintained based on the time that the pressure-sensitive detection information output from the pressure-sensitive detection unit 18 continues. At this time, the control unit 10 determines whether or not the time during which the state in which the touch panel 20 is pressed by the finger continues is equal to or longer than a predetermined time (for example, 1 second).
  • If the time during which the touch panel 20 remains pressed by the finger is equal to or longer than the predetermined time (YES in S24), the control unit 10 determines that the pressure-sensitive state is maintained. Furthermore, the control unit 10 determines whether or not a determination time (for example, 2 seconds) has elapsed while the pressure-sensitive state is maintained (S25). The determination in step S25 is performed by the control unit 10 every determination time.
  • When the determination time has elapsed while the pressure-sensitive state is maintained (YES in S25), the control unit 10 outputs to the display control unit 19 control information for enlarging the map displayed on the display panel 40 (S26).
  • the screen added in step S26 shows a state where the map is enlarged and displayed as the display magnification increases as the time when the user presses the finger on the touch panel 20 becomes longer.
  • the display control unit 19 displays the screen requested by the user on the display panel 40 in accordance with the control information input from the control unit 10 (S27).
  • After step S27, the control unit 10 returns to step S25 and continues processing. Since steps S25 to S27 are repeated while the user keeps pressing the finger into the touch panel 20, the map is gradually enlarged and displayed every determination time (for example, every 2 seconds).
  • If the determination time has not elapsed while the pressure-sensitive state is maintained in step S25 (NO in S25), the control unit 10 ends this process. Therefore, if the control unit 10 has passed through steps S26 and S27, the map remains displayed in its enlarged state.
  • In step S24, if the time during which the touch panel 20 remains pressed by the finger is less than the predetermined time (NO in S24), the control unit 10 determines that the pressure-sensitive state is not maintained, and outputs to the display control unit 19 control information for reducing the map displayed on the display panel 40 (S28).
  • the screen added to step S28 shows a state where the map is reduced and displayed at a reduced display magnification.
  • When it is determined in step S23 that the pressure-sensitive detection unit 18 has not detected pressure based on the sensor value (NO in S23), the control unit 10 outputs to the display control unit 19 control information for moving and displaying the map in accordance with the moving direction of the finger touching the touch panel 20 (S29).
  • the display control unit 19 displays the screen requested by the user on the display panel 40 in accordance with the control information input from the control unit 10 (S30), and ends this process.
  • In this way, according to the second embodiment, the map is enlarged and displayed when the touch panel 20 is kept pressed with a finger, and the map is reduced and displayed when the finger pressed into the touch panel 20 is released immediately. For example, in order to display the map on the display panel 40 in more detail or over a wider area, pressing and holding a point on the screen enlarges the map at a predetermined magnification every 2 seconds, while pressing a point on the screen and releasing it immediately reduces the map at a predetermined magnification. Since this gesture operation is performed with one finger, the conventional gesture operation using two fingers becomes unnecessary.
  • the pressure-sensitive detection unit 18 can detect that the touch panel 20 has been pushed at any position on the touch panel 20. Therefore, the user does not need to search for a conventional magnification change button, and can easily enlarge or reduce the map.
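  • The timing logic of FIG. 7 could be sketched as follows. The 1-second and 2-second values come from the example times in the text; the polling loop, the zoom factors, and the view interface are assumptions.

```python
import time

HOLD_TIME = 1.0       # press must be held this long to count as "maintained"
DECISION_TIME = 2.0   # map is enlarged once per elapsed decision interval
ZOOM_IN = 1.2         # assumed per-step magnification
ZOOM_OUT = 1 / 1.2


def handle_press_duration(view, is_pressed, poll=0.1):
    """Enlarge the map step by step while the press is maintained; reduce it
    once if the press is released early. is_pressed: callable returning the
    current pressure-sensitive state (hypothetical). view: hypothetical map
    view with zoom(factor)."""
    start = time.monotonic()
    last_zoom = start
    while is_pressed():
        now = time.monotonic()
        if now - start >= HOLD_TIME and now - last_zoom >= DECISION_TIME:
            view.zoom(ZOOM_IN)      # S26/S27: enlarge every decision time
            last_zoom = now
        time.sleep(poll)
    if time.monotonic() - start < HOLD_TIME:
        view.zoom(ZOOM_OUT)         # S28: short press -> reduce once
```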
  • Conversely, the map may be reduced and displayed when the touch panel 20 is pressed with a finger and the pressure-sensitive state is maintained, and enlarged and displayed when the finger is released immediately.
  • the map may be moved and displayed in accordance with the moving direction of the finger while enlarging or reducing the map.
  • a plurality of layers may be switched and displayed.
  • the gesture operation according to the third embodiment is performed in order to enlarge or reduce the map according to the number of fingers that perform the touch operation with the finger pressed.
  • Conventionally, the map has had to be enlarged or reduced by a gesture operation generally called pinch-out or pinch-in, which widens or narrows the interval between two fingers touching the screen. In the third embodiment, the map is enlarged or reduced without performing pinch-out or pinch-in operations with two fingers.
  • FIG. 8 is a flowchart illustrating an example of the enlargement or reduction operation according to the third embodiment.
  • the map can be enlarged or reduced by a simple operation using one or two fingers. That is, the control unit 10 according to the third embodiment outputs control information for enlarging or reducing the map to the display control unit 19 in accordance with the number of fingers that press the touch panel 20.
  • the gesture operation is, for example, an operation in which the user pushes the touch panel 20 with one or two fingers. Note that the processing in steps S31 to S34 in FIG. 8 is the same as the processing in steps S21 to S23 and S29 in FIG. 7 in the second embodiment described above, and detailed description thereof is omitted.
  • When it is determined in step S33 that the pressure-sensitive detection unit 18 has detected pressure based on the sensor value output from the pressure-sensitive sensor 30 (YES in S33), the control unit 10 detects the number of fingers touching the touch panel 20 based on the coordinate detection information input from the coordinate detection unit 17 (S35). When it is determined that the number of fingers pressing the touch panel 20 is one (an example of a first number) (one in S35), the control unit 10 outputs to the display control unit 19 control information for enlarging the map displayed on the display panel 40 (S36). The screen added in step S36 shows a state in which the map is enlarged by the user pressing the touch panel 20 with one finger.
  • When it is determined in step S35 that the number of fingers pressing the touch panel 20 is two (an example of a second number) (two in S35), the control unit 10 outputs to the display control unit 19 control information for reducing the map displayed on the display panel 40 (S37).
  • The screen added in step S37 shows a state in which the map is reduced by the user pressing the touch panel 20 with two fingers.
  • the display control unit 19 displays the screen requested by the user on the display panel 40 in accordance with the control information input from the control unit 10 (S38), and ends this process.
  • In this way, according to the third embodiment, the map can be enlarged or reduced according to the number of fingers pressing the touch panel 20, so that a simpler and more intuitive operation than before can be realized. This eliminates the need for the conventional gesture operation of widening or narrowing two fingers using "touch panel touch detection" and "XY coordinate movement detection".
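  • A minimal sketch of the branch in FIG. 8, assuming a hypothetical map-view object; the scale factors are made-up values.

```python
def handle_press_finger_count(view, pressed, finger_count,
                              zoom_in=1.2, zoom_out=1 / 1.2):
    """One pressed finger enlarges the map, two pressed fingers reduce it
    (S35-S37); a plain touch falls back to moving the map (S34). The view
    object and the scale factors are assumptions."""
    if not pressed:
        return "move"              # S34: move the map with the finger instead
    if finger_count == 1:
        view.zoom(zoom_in)         # S36: one pressed finger -> enlarge
    elif finger_count == 2:
        view.zoom(zoom_out)        # S37: two pressed fingers -> reduce
    return "zoom"
```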
  • the map may be reduced when the number of touches of the finger pressing the touch panel 20 is one, and the map may be enlarged when the number of finger touches is two. Further, even when the number of touched fingers is three or more, various operations may be selected according to the number of touched fingers. For this reason, for example, the first number may be the number of touches of two fingers, and the second number may be the number of touches of one or three fingers. Further, when the user moves the finger pressed into the touch panel 20 in the same direction, the map may be moved and displayed in accordance with the moving direction of the finger while enlarging or reducing the map.
  • the volume of the speaker may be increased when the number of finger touches that press the touch panel 20 is 1, and the volume of the speaker may be decreased when the number of finger touches is two.
  • the speaker volume may be decreased when the number of finger touches is one and the speaker volume may be increased when the number of finger touches is two.
  • the contrast of the screen may be increased or decreased by a gesture operation in the third embodiment. Further, the head-up (displaying the traveling direction upward) or the north-up (displaying north upward) of the own vehicle icon displayed during car navigation may be switched.
  • the user can input characters from the software keyboard displayed on the display panel using a conventional touch panel.
  • the conventional touch panel is a touch panel that can detect only the XY coordinates of the touch position.
  • With the conventional touch panel, when the user wants to switch from lowercase to uppercase while entering characters, it is necessary to switch the character type by pressing the shift key or a character type switching key displayed on the software keyboard before inputting the character. For this reason, the character type sometimes has to be switched frequently, and the shift key or the character type switching key has to be pressed many times, resulting in poor operability.
  • FIG. 9 is a flowchart showing an example of a conventional character type switching operation.
  • a user inputs a lowercase or uppercase alphabet using a software keyboard displayed on a conventional display panel.
  • the user selects the character type of the software keyboard displayed on the display panel of the conventional user interface device 100 (S41). Since the input mode of a character input key (hereinafter abbreviated as “key”) provided in the software keyboard is a lowercase input mode, lowercase characters are displayed on the key. Therefore, in this state, the user can input a lowercase letter by touching the displayed key.
  • the user selects either uppercase or lowercase character types (S42).
  • If the character type selected by the user is not uppercase (lowercase in S42), the user touches the key of the character to be input without switching the character type (S43), and the lowercase character of that key is input.
  • When the user lifts the finger from the key, the lowercase letter at the touch position is confirmed (S44). The confirmed lowercase letter is then displayed on the display panel, and this process ends.
  • step S42 if the character type selected by the user in step S42 is uppercase (uppercase in S42), the user touches the shift key on the software keyboard (S45).
  • the screen added in step S45 shows a state in which the user touches the shift key (key represented by the upward arrow) at the lower left of the software keyboard.
  • step S46 the user removes his / her finger from the shift key to determine the capital letter input mode (S46).
  • the screen added in step S46 shows a state where the user lifts his / her finger from the shift key of the software keyboard.
  • the key input mode is changed from the lowercase input mode to the uppercase input mode.
  • Uppercase letters are displayed on the keys, and the user can input uppercase letters.
  • The screen added in step S47 shows a state in which the user touches the key of the character to be input on the software keyboard (S47).
  • When the user lifts the finger from the key, the uppercase letter at the touch position is confirmed (S48).
  • the screen added in step S48 shows a state where the user lifts his / her finger from the software keyboard. Then, this fixed uppercase letter is displayed on the display panel, and the present process ends.
  • FIG. 10 is a flowchart illustrating an example of a character type switching operation according to the fourth embodiment.
  • In the fourth embodiment, the character type can be switched by a simple operation using one finger. That is, when the touch panel 20 is pressed at a position where a key (an example of an object) of the software keyboard displayed on the display panel 40 is displayed, the control unit 10 according to the fourth embodiment outputs to the display control unit 19 control information for changing the input mode that defines the character type of the character input from that key.
  • the gesture operation is an operation in which the user presses the finger on the touch panel 20 to switch the key input mode and releases the finger to confirm the character.
  • Changing the display form of the object means, for example, displaying the key in accordance with the changed input mode.
  • the user selects the character type of the software keyboard displayed on the display panel 40 provided in the user interface device 3 (S51).
  • the key input mode is a lowercase input mode in which the key characters can be input in lowercase letters (an example of the first character type), and lowercase letters are displayed on the keys of the software keyboard.
  • the screen added to step S52 shows a state in which the user touches a key on the software keyboard.
  • The pressure-sensitive detection unit 18 determines whether pressure is detected based on the sensor value input from the pressure-sensitive sensor 30 (S53).
  • If the pressure-sensitive detection unit 18 does not detect pressure (NO in S53), pressure-sensitive detection information is not output to the control unit 10. In this case, when the user lifts the finger from the key, the lowercase letter of the touched key is confirmed and displayed on the display panel 40 (S54), and this process ends.
  • If pressure is detected (YES in S53), the key input mode is switched to the uppercase input mode. The screen added in step S55 shows a state in which the user touches an uppercase key on the software keyboard displayed after switching to the uppercase input mode.
  • When the user lifts the finger from the software keyboard, the uppercase letter is confirmed and displayed (S56); the screen added in step S56 shows this state. Thereafter, the control unit 10 returns the key input mode from the uppercase input mode to the lowercase input mode.
  • In this way, according to the fourth embodiment, the character type of the character to be input can be switched simply by pressing the finger into the touch panel 20 while inputting characters with the software keyboard. Conventionally, the user had to move the finger to the shift key and touch it in order to switch the character type; with the gesture operation of the fourth embodiment, the user can change the character type from lowercase to uppercase while keeping the finger on the key to be input. For this reason, the user can easily switch the character type by an intuitive gesture operation.
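  • The character-type switching of FIG. 10 reduces to a very small decision, sketched below with hypothetical names; only the lowercase/uppercase case described in the text is covered.

```python
def input_character(key, pressed):
    """'key' is the lowercase letter assigned to the touched software-keyboard
    key; 'pressed' is True when the pressure-sensitive detection unit reports
    that the key position was pressed into the touch panel. Returns the
    character confirmed when the finger is lifted (illustrative only)."""
    if pressed:
        return key.upper()   # S55/S56: pressed -> uppercase input mode
    return key               # S54: plain touch -> lowercase as displayed


# Example: typing "Abc" by pressing only while on the first key.
text = "".join(input_character(k, p)
               for k, p in [("a", True), ("b", False), ("c", False)])
assert text == "Abc"
```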
  • Conversely, the key input mode may initially be set to the uppercase input mode; when the pressure-sensitive detection unit 18 detects pressure, the mode may be temporarily switched to the lowercase input mode, and after the lowercase character is confirmed and input, the mode may return to the uppercase input mode.
  • Similarly, the key input mode may be set to a hiragana input mode; when the pressure-sensitive detection unit 18 detects pressure, the mode may be temporarily switched to a katakana input mode, and after the katakana character is confirmed and input, the mode may return to the hiragana input mode.
  • the first input mode may be the katakana input mode, and the input mode that is switched when pressure is detected may be the hiragana input mode.
  • the key input mode may be switched from the full-width input mode to the half-width input mode, or from the half-width input mode to the full-width input mode. Further, the control unit 10 may maintain the switched input mode in the next character input, and may return to the original input mode when the user presses the finger on the touch panel 20 again.
  • Further, when the pressure-sensitive detection unit 18 detects pressure, a voiced sound (gi, gu, etc.), a semi-voiced sound (pa, pi, pu, etc.), a geminate consonant (small tsu), small characters (nya, yu, yo, etc.), a symbol (+, -, ×, etc.), an umlaut, or the like corresponding to the character at the position touched by the user's finger may be input.
  • gesture operation of the user interface device 3 according to the fifth embodiment of the present invention is performed in order to select an edit item (an example of a menu item) at a position where a certain operation is performed.
  • FIG. 11 is a flowchart showing an example of a conventional menu selection operation.
  • the control unit included in the conventional user interface apparatus 100 detects the position indicated by the black arrow instruction icon by operating the mouse or the like (S61).
  • the screen added to step S61 shows a state in which the instruction icon indicates the character string in the screen.
  • the selected character string is highlighted in bold.
  • Next, the user decides whether to select a menu by moving the instruction icon (S62).
  • If the user does not select a menu (NO in S62), the instruction icon simply moves according to the locus of the mouse moved by the user (S63), and this process ends.
  • When the user selects a menu (YES in S62), for example, the edit items of the edit menu are displayed (S64). That is, when the user selects the edit menu from the menu bar at the top of the screen, the edit items (copy, paste, cut, etc.) included in the edit menu are displayed in a list.
  • In the screen added to step S64, the character string selected by the user is indicated by a dashed ellipse, and the edit menu and a list of its edit items are shown in the menu bar at the top of the screen.
  • FIG. 11 shows an example in which the user clicks the edit menu on the menu bar.
  • A list of edit items similar to those in the edit menu is displayed, and an edit item can be selected from the list.
  • In this way, in order to perform an editing operation on the character string selected by the user, the instruction icon must be moved to the menu bar, or a list of edit items must be displayed by right-clicking the mouse, and then an edit item must be selected.
  • However, if a mouse is not connected to the conventional user interface device 100, it is difficult to perform these operations.
  • FIG. 12 is a flowchart illustrating an example of a menu selection operation according to the fifth embodiment.
  • The control unit 10 according to the fifth embodiment displays a menu item related to the object at the position where the touch panel 20 is pressed.
  • In the fifth embodiment, the gesture operation is an operation in which the user presses a finger into the touch panel 20 in order to select an edit item from the edit menu, and changing the object display form means, for example, displaying the character string edited through an edit item of the edit menu.
  • First, when the user touches the touch panel 20 with a finger, the coordinate detection unit 17 detects that the touch has been made (S71) and detects the coordinates of the touch position.
  • Next, the user selects a character string (an example of an object) in a predetermined area by a touch operation and then performs an operation of pushing the finger into the touch panel 20 (S72).
  • The pressure-sensitive detection unit 18 then determines whether or not the pressure-sensitive sensor 30 has detected pressure, based on the sensor value output from the pressure-sensitive sensor 30 (S73).
  • If pressure is not detected (NO in S73), the control unit 10 moves and displays the instruction icon in accordance with the direction in which the finger touching the touch panel 20 moves (S74), and this process is terminated.
  • If it is determined in step S73 that the pressure-sensitive detection unit 18 has detected pressure based on the sensor value (YES in S73), the touch panel 20 has been pressed in a state where the object in the predetermined area is selected.
  • In this case, the control unit 10 outputs, to the display control unit 19, control information for displaying the edit items of the edit menu at the position where the touch panel 20 is pressed.
  • The edit items of the edit menu are an example of menu items related to the object when the object is a character string.
  • Based on this control information, the display control unit 19 displays the edit items of the edit menu on the display panel 40 (S75).
  • In the screen added to step S75, the area selected by the user (for example, a character string) and a list of the edit items are shown.
  • Finally, the user selects an arbitrary edit item from the displayed edit items (S76), and this process ends.
  • In this way, when the touch panel 20 is pressed, the edit items of the edit menu for the area selected through the touch panel 20 are displayed while the predetermined area remains selected, in the same manner as with a right-click operation of a conventional mouse. The user can therefore select an edit item without moving his or her finger largely.
  • Note that the menu bar includes a file menu, a display menu, a help menu, and the like.
  • The menu displayed when the touch panel 20 is pressed may therefore be a menu other than the edit menu.
  • The displayed menu may also be switched according to the pressing amount of the touch panel 20. For example, if the pressing amount is small (the pressing force is greater than or equal to the threshold th1 and less than the threshold th2), the edit items of the edit menu may be displayed, and if the pressing amount is large (the pressing force is greater than or equal to the threshold th2), the items of the display menu may be displayed.
  • Alternatively, the displayed menu may be switched, for example, every two seconds according to the time during which the pressure-sensitive detection unit 18 continues to detect pressure. In this way, the user can select an item from a menu other than the edit menu, as sketched below.
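As a rough illustration of the threshold-dependent and duration-dependent menu switching described above, the sketch below chooses which menu to display from the pressing force or from how long the press is held. The numeric force values and function names are assumptions; only the 2-second cycle and the ordering relative to th1 and th2 come from the text.

```python
# Illustrative menu selection from a press (threshold values are assumed units).
TH1 = 1.0   # first pressing threshold th1
TH2 = 3.0   # second pressing threshold th2

def menu_for_press(pressing_force):
    """Pick the menu to display at the pressed position from the pressing force."""
    if pressing_force < TH1:
        return None          # below th1: treated as vibration, no menu is shown
    if pressing_force < TH2:
        return "edit"        # th1 <= force < th2: edit items (copy, paste, cut, ...)
    return "display"         # force >= th2: items of the display menu

def menu_for_hold(press_duration_s):
    """Variant: cycle through menus every 2 seconds while the press continues."""
    menus = ["edit", "file", "display", "help"]
    return menus[int(press_duration_s // 2) % len(menus)]
```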
  • FIG. 13 is a flowchart showing an example of a conventional cancel operation.
  • First, the control unit included in the conventional user interface device 100 detects that a finger has touched the touch panel included in the user interface device 100 (S81).
  • The screen added to step S81 shows a state in which the user touches the lower left of the screen with one finger.
  • The screen added to step S82 shows a state in which a character string input by the user is displayed.
  • If the user determines in step S82 that the character input is correct (YES in S82), the process is terminated. On the other hand, if the character input is determined to be incorrect (NO in S82), the user performs a character input cancel operation.
  • This cancel operation is performed by either an input operation performed through an external input device (S83) or a touch operation on a cancel icon displayed at the lower right of the screen (S84).
  • The operations in steps S83 and S84 are both called cancel operations.
  • After the cancel operation, the user returns to step S82, performs character input again, and determines whether or not the new character input is correct.
  • Since the cancel operation was conventionally realized by using an external input device or by touching a cancel icon, it was difficult to perform the cancel operation in applications where the installation space is limited.
  • In the sixth embodiment, on the other hand, the immediately preceding operation can be canceled by a pressing operation.
  • FIG. 14 is a flowchart illustrating an example of a cancel operation according to the sixth embodiment.
  • In the sixth embodiment, the cancel operation can be performed by a simple operation using one finger. That is, the control unit 10 according to the sixth embodiment outputs, to the display control unit 19, control information for performing an editing operation on the operation performed immediately before, according to the direction in which the finger moves while the touch panel 20 is pressed by the finger.
  • In the sixth embodiment, the gesture operation is an operation of moving the finger in the direction opposite to the direction in which the character string was input while the user presses the finger into the touch panel 20, and changing the object display form means, for example, displaying the edited character string.
  • First, when the user touches the touch panel 20 with a finger, the coordinate detection unit 17 detects that the touch has been made (S91) and detects the coordinates of the touch position.
  • The screen added to step S91 shows a state where the user touches the lower left of the screen with one finger.
  • The screen added to step S92 shows a state in which a character string input by the user is displayed.
  • If the user determines in step S92 that the character input is correct (YES in S92), the process is terminated. On the other hand, if the character input is determined to be incorrect (NO in S92), the user performs a character input cancel operation. In this cancel operation, the user first performs an operation of pushing the finger into the touch panel 20 (S93). The screen added to step S93 shows a state where the user presses the touch panel 20.
  • The screen added to step S94 shows how all the input operations for the character string input immediately before are canceled (S94).
  • A series of operations performed in the order of steps S93 and S94 is referred to as the cancel operation.
  • With the user interface device 3, since the immediately preceding operation can be canceled by combining the pressing operation with a touch operation that moves the finger in one direction, the user can perform the cancel operation intuitively and simply. Conventionally, an icon had to be displayed on the display panel, or an external input device had to be provided, in order to perform the cancel operation; in the present embodiment there are no such restrictions on the installation space or on the applications to which the device can be applied. As described above, the user can easily perform the cancel operation by using the user interface device 3, which can therefore be applied not only to the navigation device 1 but also to devices for various other purposes.
  • A plurality of operations may be canceled together according to the pressing amount when the finger pressed into the touch panel 20 is pushed further in the pressing direction.
  • The moving direction of the finger is not limited to the left direction and may be another direction.
  • For example, the canceled character string may be redisplayed as a redo operation.
  • Such a redo may be performed when the moving direction of the finger is upward.
  • Other operations may also be assigned to gesture operations performed using the user interface device 3.
  • For example, in the navigation device 1, when the user specifies a place and inputs it as a destination and then specifies another place and inputs it as a destination, pressing the finger into the touch panel 20 and moving it to the left may redisplay the location for which the destination input was previously performed.
  • The user interface device 3 may also be used to cancel or redo an operation on an image.
  • The user may perform a plurality of cancel operations by changing the moving direction of the pressed finger halfway.
  • Alternatively, the operation may cancel the two consecutive input operations performed immediately before. A sketch of this press-and-move dispatch follows.
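The cancel and redo gestures of the sixth embodiment amount to dispatching on the direction in which the finger moves while the touch panel stays pressed, with a deeper press widening the scope to more than one operation. The sketch below is a hypothetical rendering of that dispatch; the direction-to-action mapping beyond "left cancels" and the operation objects are assumptions.

```python
# Sketch of press-and-move undo/redo dispatch (names and mapping are assumptions).
def classify_pressed_move(dx, dy, pressing_force, th_deep=3.0):
    """Map a press-and-move gesture to an editing action."""
    if abs(dx) >= abs(dy) and dx < 0:
        # Moving left while pressing cancels; a deeper press cancels two operations.
        return ("cancel", 2 if pressing_force >= th_deep else 1)
    if abs(dy) > abs(dx) and dy < 0:
        # Moving up while pressing is one possible assignment for redo.
        return ("redo", 1)
    return ("none", 0)

def apply_action(history, redo_stack, action, count):
    """history / redo_stack: lists of operation objects providing undo() and redo()."""
    if action == "cancel":
        for _ in range(min(count, len(history))):
            op = history.pop()
            op.undo()
            redo_stack.append(op)
    elif action == "redo":
        for _ in range(min(count, len(redo_stack))):
            op = redo_stack.pop()
            op.redo()
            history.append(op)
```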
  • FIG. 15 is an explanatory diagram showing an installation example of the conventional user interface device 110.
  • Conventionally, a user interface device 110 that does not have a touch panel has been used in a navigation device mounted on a vehicle in order to allow the user to operate the navigation device at hand.
  • This navigation device includes a navigation device body and the user interface device 110, which can display only a navigation screen.
  • The display panel 111 (an example of the second display panel) provided in the user interface device 110 is not provided with a touch panel. For this reason, the user cannot perform a touch operation through the user interface device 110.
  • An external input device is therefore installed in the vehicle so that the user can operate the navigation device.
  • Examples of the external input device include a button 101 and a joystick 102 that are installed on the center console in the vehicle and connected to the navigation device.
  • A knob-type controller may be installed instead of the joystick 102.
  • However, the center console has only limited space in which an external input device can be installed, and the use of the external input device is limited to selection and determination operations. For this reason, in order to open a function that the user wants to use, selection and determination must be repeated using the external input device, which takes time and effort.
  • FIG. 16 is a flowchart showing an operation example of a conventional navigation device.
  • In the conventional example, a navigation operation is performed using the joystick 102 and the button 101 as external input devices.
  • First, the control unit (not shown) of the navigation device detects an operation of the joystick 102 (S101).
  • The screen added to step S101 shows a state in which the instruction icon is displayed on the display panel 111 of the user interface device 110.
  • The screen added to step S102 shows how the instruction icon moves as the user moves the joystick 102 (S102).
  • The instruction icon at the destination of the movement is represented by a white arrow in the screen.
  • The screen added to step S103 shows how the information input by the user is determined when the user presses the button 101 (S103).
  • In this way, with the conventional navigation device, the selection operation must be performed with the joystick 102 and the determination operation must be performed with the button 101.
  • Since the joystick 102 and the button 101 are arranged at different positions, erroneous operations may occur. Further, it is impossible to assign operation functions other than the selection and determination operations to the joystick 102 and the button 101.
  • FIG. 17 is an explanatory diagram showing a configuration example of a navigation device 1A according to the seventh embodiment.
  • As shown in FIG. 17, the navigation device 1A includes another user interface device 110 in addition to the navigation device body 2 and the user interface device 3.
  • The user interface device 110 is a conventional user interface device and does not include the touch panel 20; only objects used for navigation can be displayed on its display panel 111. The navigation device body 2 performs a predetermined process based on the content instructed by the gesture operation input from the control unit 10, and an object whose display form has been changed by the predetermined process is displayed on the display panel 111.
  • The navigation device body 2 and the user interface device 3 can communicate with each other wirelessly or by wire. Further, objects output from the navigation device body 2 are displayed on the display panel 111 of the user interface device 110.
  • In the seventh embodiment, the user interface device 3 is called a "pressure-sensitive controller" to distinguish it from the conventional user interface device 110.
  • FIG. 17 (1) shows a state in which the navigation device body 2 including the user interface device 110 is installed on the instrument panel of the vehicle and the pressure-sensitive controller is installed on the center console of the vehicle.
  • A map is displayed on the user interface device 110, and three types of command icon objects (for example, current location, AV (Audio Visual), and menu) are displayed on the display panel 40 (an example of the first display panel) of the pressure-sensitive controller. For this reason, the user can intuitively select a required command by pressing a command icon through the pressure-sensitive controller.
  • Other command icons (for example, setting and adjustment) may also be displayed by operating a switch button (not shown).
  • When the user pushes a finger into the touch panel 20 of the pressure-sensitive controller, an instruction icon is displayed on the user interface device 110.
  • The instruction icon is then moved and displayed in accordance with the moving direction of the finger; while the user is merely moving the finger, it is not necessary to push the finger into the touch panel 20.
  • For example, the user operates the touch panel 20 of the pressure-sensitive controller to select, with the instruction icon, an enlargement icon displayed on the user interface device 110. Thereafter, when the user presses the touch panel 20 of the pressure-sensitive controller with a finger, the selection of the enlargement icon is determined, and the map is enlarged and displayed on the user interface device 110.
  • FIG. 18 is a flowchart illustrating an operation example of the navigation device 1 according to the seventh embodiment.
  • In the seventh embodiment, a desired object can be displayed on the user interface device 110 by a simple operation using one finger. That is, when the touch panel 20 is pushed by a finger, the control unit 10 according to the seventh embodiment outputs, to the navigation device body 2, control information for displaying on the user interface device 110 an instruction icon for instructing selection of a menu icon displayed on the display panel 111 of the user interface device 110. Then, when the touch panel 20 is pressed again with the finger in a state where the instruction icon is displayed superimposed on a menu icon, the control unit 10 outputs, to the navigation device body 2, control information for determining the selection indicated by the instruction icon. The navigation device body 2 performs a predetermined process based on the content instructed by the touch operation or gesture operation input from the control unit 10, changes the display form of the object on which the predetermined process has been performed, and displays the object on the display panel 111.
  • In the seventh embodiment, the gesture operation is, for example, an operation in which the user pushes a finger into the touch panel 20 to display an instruction icon on the user interface device 110, moves the instruction icon in accordance with the direction in which the finger is moved, and then determines the icon selected on the user interface device 110 by pushing the finger into the touch panel 20 again.
  • Changing the display form of the object in the seventh embodiment means, for example, displaying the map in an enlarged or reduced manner.
  • First, the control unit 10 detects that the user's finger has touched the touch panel 20 of the pressure-sensitive controller (S111).
  • The screen added to step S111 shows a state in which the user's finger touches the touch panel 20 of the pressure-sensitive controller and a map is displayed on the display panel 111 of the user interface device 110.
  • Next, the control unit 10 determines that the touch panel 20 has been pressed with the finger, based on the pressure-sensitive detection information that the pressure-sensitive detection unit 18 outputs when it detects pressure from the sensor value.
  • When the touch panel 20 is pushed in by the finger, the control unit 10 outputs, through the navigation device body 2, control information for displaying an instruction icon for selecting a menu icon displayed on the display panel 111 of the user interface device 110 to the display control unit of the user interface device 110.
  • As a result, an instruction icon is displayed on the display panel 111 of the user interface device 110 (S112).
  • The screen added to step S112 shows that, when the user presses the touch panel 20 with the finger, the selection icon is displayed on the display panel 40 and the instruction icon is displayed on the user interface device 110.
  • The control unit 10 then outputs, through the navigation device body 2 to the user interface device 110, control information for moving the instruction icon displayed on the display panel 111 of the user interface device 110 in accordance with the direction in which the finger touching the touch panel 20 moves. The display control unit of the user interface device 110 moves and displays the instruction icon on the display panel 111 according to the operation of the pressure-sensitive controller.
  • The added screen shows a state in which the user moves his or her finger on the touch panel 20 to input information and the instruction icon displayed on the user interface device 110 moves according to the movement of the finger.
  • The moving instruction icon is represented by a white arrow in the screen.
  • Next, the user selects a selection icon displayed on the display panel 40 of the pressure-sensitive controller and presses the selection icon to determine it (S114).
  • At this time, the instruction icon displayed on the display panel 111 of the user interface device 110 is displayed superimposed on the menu icon.
  • The pressure-sensitive detection unit 18 outputs pressure-sensitive detection information when it detects pressure based on the sensor value.
  • The control unit 10 can therefore determine that the touch panel 20 has been pressed with the finger, based on the pressure-sensitive detection information input from the pressure-sensitive detection unit 18, and can determine the menu icon selected by the instruction icon.
  • The screen added to step S114 shows a state in which the user presses the touch panel 20 with the finger.
  • The control unit 10 then outputs, to the display control unit of the user interface device 110, control information for changing and displaying the map displayed on the display panel 111 of the user interface device 110 based on the information determined with the instruction icon.
  • As a result, a map reflecting the operation performed by the user through the pressure-sensitive controller is displayed on the display panel 111 of the user interface device 110. A sketch of this select-and-confirm flow follows.
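The select-then-confirm flow of steps S111 to S114 can be summarized as a small event handler on the pressure-sensitive controller side: the first press shows the instruction icon on the remote display panel 111, finger movement moves the icon, and a second press while the icon overlaps a menu icon confirms the selection. The class below is a hypothetical sketch; the message names and the link object stand in for whatever wired or wireless protocol actually connects the controller to the navigation device body.

```python
# Hypothetical sketch of the pressure-sensitive controller flow (assumed API).
class PressureController:
    def __init__(self, link_to_body):
        self.link = link_to_body      # wired or wireless link to the navigation body
        self.icon_visible = False

    def on_press(self):
        if not self.icon_visible:
            # First press: ask the body to show the instruction icon on panel 111.
            self.link.send({"cmd": "show_instruction_icon"})
            self.icon_visible = True
        else:
            # Second press: confirm the menu icon under the instruction icon.
            self.link.send({"cmd": "confirm_selection"})

    def on_move(self, dx, dy):
        # Moving the finger (no press needed) moves the instruction icon remotely.
        if self.icon_visible:
            self.link.send({"cmd": "move_instruction_icon", "dx": dx, "dy": dy})
```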
  • In this way, the user interface device 3 is used as a pressure-sensitive controller that allows the user to perform navigation operations at hand.
  • The user can perform the selection operation and the determination operation with one finger, and can perform the determination operation for the instruction icon at the position where the instruction icon was selected. For this reason, the movement of the finger can be minimized, and the operation can be performed simply and intuitively.
  • Since the pressure-sensitive controller is disposed at a position away from the navigation device body 2, the user can easily operate it at hand.
  • Various command icons (for example, volume buttons) can be displayed on the display panel 40 of the pressure-sensitive controller, and the display magnification can be changed arbitrarily.
  • Conventionally, a joystick 102 or the like had to be used to perform complicated and time-consuming selection and determination operations, but with the pressure-sensitive controller according to the seventh embodiment, the user can quickly select menus and make determinations by operating the pressure-sensitive controller. Even if the number of command icons that can be displayed on one screen is limited, many command icons can be displayed and selected by switching screens.
  • The command icons may be displayed by dividing the display panel 40 of the user interface device 3 into, for example, four cells arranged in two rows and two columns. This can prevent the user from pressing the wrong icon.
  • In this way, the user interface device 110 can also be operated through the pressure-sensitive controller. Further, the pressure-sensitive controller and a personal computer device may be connected wirelessly or by wire, and it is also possible to cause the personal computer device to execute a predetermined function by operating the pressure-sensitive controller at a location away from the personal computer device.
  • The user interface device 3 according to each of the embodiments described above may also be applied to devices other than the navigation device. For example, operability can be improved by applying the user interface device 3 to the touch panel portion of a mobile terminal, a tablet terminal, or the like. In this way, the user interface device 3 according to the present embodiment can be combined with various electronic apparatuses that require a touch panel.
  • In the embodiments described above, the user interface device 3 includes the control unit 10, but the navigation device body 2 may instead include the control unit 10.
  • The touch panel 20 may also be configured to detect a touch operation by a method other than the capacitance method. Further, the pushing-in of the touch panel 20 may be detected by a press switch or the like provided below the touch panel 20 instead of the pressure-sensitive sensor 30.
  • The operations combined with the pressure-sensitive function of the user interface device 3 may be used, for example, for navigation for a pedestrian or a bicycle. Further, the user interface device 3 may be used for operating image editing software as described above, or for operating other application software.
  • In the embodiments described above, the navigation device body 2 and the user interface device 3 are combined. However, if the user interface device 3 itself has a navigation function, the user interface device 3 alone may be used as the navigation device.
  • The pressure-sensitive sensor 30 may be configured without the elastic body 33. For example, even if the elastic body 33 is removed from the pressure-sensitive sensor 30 and the upper electrode 31 and the lower electrode 35 are kept separated by a constant distance, applying a pressing force to the pressure-sensitive sensor 30 brings the upper electrode 31 closer to the lower electrode 35, so that the capacitance between the upper electrode 31 and the lower electrode 35 changes. For this reason, the pressure-sensitive detection unit 18 can still obtain the capacitance change rate based on the sensor values output from the upper electrode 31 and the lower electrode 35.
  • The present invention is not limited to the embodiments described above, and various other applications and modifications can of course be made without departing from the gist of the present invention described in the claims.
  • In the embodiments described above, the configuration of the device is described in detail and concretely in order to explain the present invention in an easy-to-understand manner, and the present invention is not necessarily limited to configurations including all of the described elements.
  • A part of the configuration of one embodiment described here can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
  • The control lines and information lines shown are those considered necessary for the explanation, and not all the control lines and information lines in the product are necessarily shown. In practice, almost all the components may be considered to be connected to each other.
  • DESCRIPTION OF REFERENCE SIGNS: 1 ... navigation device, 2 ... navigation device body, 3 ... user interface device, 10 ... control unit, 17 ... coordinate detection unit, 18 ... pressure-sensitive detection unit, 19 ... display control unit, 20 ... touch panel, 30 ... pressure-sensitive sensor, 40 ... display panel

Abstract

A control unit provided to this user interface device determines a touch operation on the basis of coordinate detection information, which a coordinate detection unit detects and outputs. Furthermore, the control unit determines a pressing operation on the basis of pressure sensitive detection information, which a pressure sensitive detection unit outputs when the pressure sensitive detection unit detects that the pressing operation has been performed. Then, on the basis of a gesture operation comprising the combination of the touch operation and the pressing operation, the control unit outputs, to a display control unit, control information for changing the display mode of an object.

Description

User interface device and electronic apparatus
The present invention relates to a user interface device and an electronic apparatus.
In recent years, touch panels that can be operated easily and intuitively with a finger or the like have become widespread, and various studies and developments have been carried out on making touch panels smaller, thinner, lighter, more power-efficient, and less expensive. Known methods for detecting the position at which an instruction medium such as a finger touches the touch panel (referred to as the "touch position") include a resistive film method that detects a change in electrical resistance, a surface acoustic wave method that uses ultrasonic waves or the like, and a capacitance method that detects a change in electrostatic capacitance. Among these, capacitive touch panels are attracting attention because they can detect a plurality of touch positions.
A capacitive touch panel includes transparent electrodes that generate capacitance and an external circuit that detects changes in the capacitance. In recent years, touch panels have begun to be provided with a pressure-sensitive sensor capable of detecting the pressing force applied when the touch panel is pressed with a stylus pen, a finger, or the like.
For example, Patent Document 1 discloses a technique for determining that an input determination operation has been performed by pressing the input operation surface, based on a change in the capacitance between the upper electrode and the lower electrode of a pressure-sensitive sensor displaced when the top plate and the touch panel move in the pressing direction.
Patent Document 2 discloses a sensor device including a pressure-sensitive sensor that does not use an elastic member.
Patent Document 1: JP 2011-134000 A
Patent Document 2: JP 2011-100364 A
Touch panels are used in various electronic apparatuses. One example of an electronic apparatus using a touch panel is a navigation device that is mounted on an automobile and displays, on a display panel, a map, directions, and the like from the current position of the vehicle to a destination. A navigation device requires simple operation. However, a touch panel attached to a conventional navigation device has only the function of detecting the XY coordinates of the touch position, so the user has to perform complicated operations before obtaining the desired result, which is inconvenient.
Furthermore, in the touch panels provided with the pressure-sensitive sensors disclosed in Patent Documents 1 and 2, the pressure-sensitive sensor is used only in an auxiliary manner in order to reliably detect that a finger or the like has touched the touch panel. Therefore, even in a touch panel to which the techniques disclosed in Patent Documents 1 and 2 are applied, the operation itself is the same as that of a conventional touch panel without a pressure-sensitive function, and the inconvenience of operation is not eliminated. In addition, a touch panel without a pressure-sensitive sensor requires operations using an external input device or the like, so it takes time for the user to become familiar with the operation.
The present invention has been made in view of such circumstances, and an object thereof is to enable the user to obtain the requested result easily.
A user interface device according to the present invention includes: a display panel that displays an object; a display control unit that performs control to display the object on the display panel in accordance with control information; a touch panel on which a touch operation is performed with an instruction medium; a coordinate detection unit that detects the coordinates of the touch position on the touch panel where the touch operation is performed and outputs coordinate detection information; a pressure-sensitive sensor that outputs a sensor value that changes in accordance with the pressing force applied to the touch panel by the instruction medium; a pressure-sensitive detection unit that, based on the sensor value, detects that a pressing operation in which the touch panel is pushed in by the instruction medium has been performed and outputs pressure-sensitive detection information; and a control unit that outputs, to the display control unit, control information for changing the display form of the object based on a gesture operation combining the touch operation determined from the coordinate detection information and the pressing operation determined from the pressure-sensitive detection information.
An electronic apparatus according to the present invention includes the above-described user interface device and an electronic apparatus body. The electronic apparatus body performs a predetermined process based on the content instructed by the touch operation or the gesture operation input from the control unit, and outputs the object on which the predetermined process has been performed to the control unit of the user interface device.
According to the present invention, the user can easily obtain the requested result by performing an intuitive gesture operation that combines a touch operation and a pressing operation.
Problems, configurations, and effects other than those described above will be clarified by the following description of the embodiments.
FIG. 1 is an explanatory diagram showing the driver's seat and passenger seat of a vehicle on which a navigation device according to a first embodiment of the present invention is mounted, viewed from the rear toward the front in the traveling direction. FIG. 2 is a block diagram showing an example of the internal configuration of a user interface device according to the first embodiment of the present invention. FIG. 3 is a schematic configuration diagram of the user interface device according to the first embodiment of the present invention. FIG. 4 is an explanatory diagram of the operating principle of a pressure-sensitive sensor according to the first embodiment of the present invention. FIG. 5 is a flowchart showing an example of a conventional enlargement or reduction operation. FIG. 6 is a flowchart showing an example of an enlargement or reduction operation according to the first embodiment of the present invention. FIG. 7 is a flowchart showing an example of an enlargement or reduction operation according to a second embodiment of the present invention. FIG. 8 is a flowchart showing an example of an enlargement or reduction operation according to a third embodiment of the present invention. FIG. 9 is a flowchart showing an example of a conventional character type switching operation. FIG. 10 is a flowchart showing an example of a character type switching operation according to a fourth embodiment of the present invention. FIG. 11 is a flowchart showing an example of a conventional menu selection operation. FIG. 12 is a flowchart showing an example of a menu selection operation according to a fifth embodiment of the present invention. FIG. 13 is a flowchart showing an example of a conventional cancel operation. FIG. 14 is a flowchart showing an example of a cancel operation according to a sixth embodiment of the present invention. FIG. 15 is an explanatory diagram showing an installation example of a conventional user interface device. FIG. 16 is a flowchart showing an operation example of a conventional navigation device. FIG. 17 is an explanatory diagram showing a configuration example of a navigation device according to a seventh embodiment of the present invention. FIG. 18 is a flowchart showing an operation example of the navigation device according to the seventh embodiment of the present invention.
Hereinafter, embodiments for carrying out the present invention will be described with reference to the accompanying drawings. In this specification and the drawings, components having substantially the same function or configuration are denoted by the same reference signs, and redundant description is omitted.
[First Embodiment]
FIG. 1 is an explanatory diagram showing the driver's seat and passenger seat of a vehicle on which the navigation device 1 is mounted, viewed from the rear toward the front in the traveling direction.
The navigation device 1 (an example of an electronic apparatus) is installed in a position between the dashboard 5 and the meter panel 6, where it is visible to a user holding the steering wheel 7. A shift lever 8 is provided below the navigation device 1.
The navigation device 1 includes a navigation device body 2 that performs the processing necessary for navigation, and a user interface device 3. The navigation device body 2 is fitted into a recess formed in the dashboard 5 and cannot be seen directly by the user. The user interface device 3 is arranged at a position visible to the user; it outputs instructions input by the user to the navigation device body 2 and presents the processing results of the navigation device body 2 to the user.
The user interface device 3 is configured to blend with the shape of the interior from the dashboard 5, the meter panel 6, and the shift lever 8 to the base of the windshield 4. The user interface device 3 displays information necessary for navigation processing and the like (a map, various icons, etc.) output from the navigation device body 2. The user can touch the touch panel 20 (see FIG. 2 described later) with an instruction medium such as a finger or a stylus pen and perform gesture operations such as moving an icon displayed on the user interface device 3 or enlarging or reducing a map.
In the following description, the instruction medium that touches the touch panel 20 is assumed to be the user's finger. An operation performed by the user moving the finger while touching the touch panel 20 is called a "touch operation", and an operation in which the user pushes the finger into the touch panel 20 is called a "pressing operation". An operation performed by combining a touch operation and a pressing operation is called a "gesture operation". Note that a conventional gesture operation is performed by a touch operation alone.
FIG. 2 is a block diagram showing an example of the internal configuration of the user interface device 3.
Before describing the internal configuration of the user interface device 3, the operation of the navigation device body 2 will be described. The navigation device body 2 (an example of an electronic apparatus body) performs a predetermined process based on the content instructed by a touch operation or gesture operation input from the control unit 10 of the user interface device 3. The navigation device body 2 then outputs the object resulting from the predetermined process to the control unit 10 of the user interface device 3. The control unit 10 also determines the touch operation performed on the object (finger moving direction, touch position, etc.).
The objects output from the navigation device body 2 to the user interface device 3 include, for example, maps, character strings, icons, and images. The navigation device body 2 performs navigation processing based on the gesture operations performed on the user interface device 3, for example outputting a map to be displayed by the user interface device 3 or editing a character string input from the user interface device 3.
The user interface device 3 includes a control unit 10, a storage medium control unit 14, a storage medium 15, a communication control unit 16, a coordinate detection unit 17, a pressure-sensitive detection unit 18, and a display control unit 19, which are connected by a bus B. The user interface device 3 also includes the touch panel 20 connected to the coordinate detection unit 17, the pressure-sensitive sensor 30 connected to the pressure-sensitive detection unit 18, and the display panel 40 connected to the display control unit 19.
The control unit 10 controls the operation of each unit in the user interface device 3. The control unit 10 determines that a touch operation has been performed based on the coordinate detection information input from the coordinate detection unit 17, and outputs the information input at the touch position to the navigation device body 2. The control unit 10 also determines that a pressing operation in which the touch panel 20 is pushed in with a finger has been performed, based on the pressure-sensitive detection information input from the pressure-sensitive detection unit 18. Further, the control unit 10 outputs, to the display control unit 19, control information for displaying on the display panel 40 the object that the navigation device body 2 has output to the user interface device 3. The control unit 10 then outputs, to the display control unit 19, control information for changing the display form of the object displayed on the display panel 40, based on a gesture operation that combines the touch operation and the pressing operation. In the first to third embodiments, changing the display form of the object means, for example, displaying the map in an enlarged or reduced manner.
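Put differently, the control unit treats a gesture operation as the combination of a touch trajectory reported by the coordinate detection unit and a press event reported by the pressure-sensitive detection unit. The sketch below condenses that decision logic into a single function; the sample structure, the 5-unit movement tolerance, and the gesture labels are assumptions for illustration.

```python
# Condensed sketch of gesture classification in the control unit (assumed types).
from dataclasses import dataclass

@dataclass
class TouchSample:
    x: float
    y: float
    pressed: bool     # True while pressure-sensitive detection information is active

def classify_gesture(samples):
    """samples: chronological TouchSample list for one finger contact."""
    if not samples:
        return "none"
    if not any(s.pressed for s in samples):
        return "touch_operation"              # plain touch or drag
    dx = samples[-1].x - samples[0].x
    dy = samples[-1].y - samples[0].y
    if abs(dx) < 5 and abs(dy) < 5:
        return "press_at_point"               # e.g. show a menu, switch input mode
    return "press_and_move"                   # e.g. zoom, cancel, move an icon
```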
The control unit 10 is composed of a CPU 11, a RAM 12, and a ROM 13, which work together to realize the functions of the control unit 10.
The CPU (Central Processing Unit) 11 is an example of a computer that controls the operation of each unit in the user interface device 3. The CPU 11 executes, for example, a program read from the ROM 13 and performs the processing related to the gesture operations according to the present embodiment.
The RAM (Random Access Memory) 12 stores temporary data such as the programs executed by the CPU 11.
The ROM (Read Only Memory) 13 stores the programs and the like read by the CPU 11. The ROM 13 is used as an example of a computer-readable non-transitory recording medium storing a program executed by the CPU 11, and the program is therefore stored permanently in the ROM 13. The computer-readable non-transitory recording medium storing the program executed by the CPU 11 may instead be another recording medium such as a CD-ROM or a DVD-ROM.
The storage medium control unit 14 controls the storage medium 15. For example, the storage medium control unit 14 writes data input from the control unit 10 to the storage medium 15, and reads data stored in the storage medium 15 in accordance with an instruction from the control unit 10 and outputs the data to the control unit 10.
The storage medium 15 is inserted into, for example, a slot provided in the user interface device 3, and stores data written by the storage medium control unit 14 or has its data read by the storage medium control unit 14. For example, the storage medium 15 stores a navigation program that operates on the navigation device body 2, upgrade data for a character editing program, and the like.
The communication control unit 16 controls the data communication processing performed between the navigation device body 2 and the user interface device 3 through the network N.
The coordinate detection unit 17 detects the coordinates of the touch position on the touch panel 20 where a touch operation has been performed. The touch panel 20 is formed in a planar rectangular shape; the positions of the intersections of the mutually crossing X electrodes and Y electrodes serve as coordinate information, and values corresponding to the change in capacitance at these positions are output to the coordinate detection unit 17. The coordinate detection unit 17 detects the coordinates of the location where the coordinate information input from the touch panel 20 has changed as the touch position of the finger, and outputs coordinate detection information including the coordinates of the touch position to the control unit 10.
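As an illustration of what the coordinate detection unit does with the electrode grid, the sketch below scans a matrix of capacitance changes at the X/Y electrode crossings and reports the crossing with the largest change above a threshold as the touch position. The matrix layout and threshold are assumptions, not values from the embodiment.

```python
# Sketch of touch-coordinate extraction from a capacitance-change matrix.
def detect_touch_position(delta_c, threshold=0.5):
    """delta_c[yi][xi]: capacitance change at the crossing of X electrode xi and
    Y electrode yi. Returns (xi, yi) of the largest change above the threshold."""
    best = None
    best_value = threshold
    for yi, row in enumerate(delta_c):
        for xi, value in enumerate(row):
            if value > best_value:
                best_value = value
                best = (xi, yi)
    return best    # None when no electrode crossing exceeds the threshold
```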
The pressure-sensitive detection unit 18 detects that the touch panel 20 has been pressed with a finger and that the pressure-sensitive sensor 30 has sensed pressure. The pressure-sensitive sensor 30 is provided on the back surface of the touch panel 20 and outputs, to the pressure-sensitive detection unit 18, a sensor value that changes in accordance with the pressing force applied to the touch panel 20 by the finger. Based on the sensor value input from the pressure-sensitive sensor 30, the pressure-sensitive detection unit 18 detects that a pressing operation in which the touch panel 20 is pushed in with the finger has been performed, and outputs pressure-sensitive detection information to the control unit 10.
The display control unit 19 performs control to display objects such as icons and maps necessary for navigation on the display panel 40, which is formed in a planar rectangular shape, in accordance with the control information input from the control unit 10.
FIG. 3 is a schematic configuration diagram of the user interface device 3.
The upper part of FIG. 3 shows, with broken lines, the positions of the pressure-sensitive sensors 30 installed on the touch panel 20 as viewed from above and the position of the frame of the housing 55. Six pressure-sensitive sensors 30 are provided on the back surface of the touch panel 20, on the frame of the housing 55 of the user interface device 3 that encloses the touch panel 20. For this reason, the user does not see the pressure-sensitive sensors 30 directly. As shown in FIG. 3, a plurality of pressure-sensitive sensors 30 are usually provided, but a plurality is not essential and a single sensor may be used.
The lower part of FIG. 3 shows a cross-sectional view of the touch panel 20 and the pressure-sensitive sensors 30.
The top plate 50 protects the surface of the touch panel 20. A transparent glass substrate, a film, or the like is used for the top plate 50. The surface of the top plate 50 is the input operation surface 51 that the user touches with a finger to perform a touch operation.
The touch panel 20 is configured, for example, by laminating a transparent X electrode substrate 21, an adhesive layer 22, and a Y electrode substrate 23 in this order. The top plate 50 and the X electrode substrate 21 are bonded and fixed by an adhesive layer 52. The X electrode substrate 21 and the Y electrode substrate 23 each have a rectangular shape and are bonded by the adhesive layer 22. The region where the X-direction detection electrodes (not shown) formed on the X electrode substrate 21 and the Y-direction detection electrodes (not shown) formed on the Y electrode substrate 23 overlap each other in plan view is the coordinate detection region of the XY plane. The pressure-sensitive sensors 30 are arranged in the peripheral region (frame) outside the coordinate detection region of the XY plane of the touch panel 20.
Each pressure-sensitive sensor 30 includes an elastic body 33 made of a dielectric material arranged between the touch panel 20 and the housing 55, and an upper electrode 31 and a lower electrode 35 that are arranged so as to sandwich the elastic body 33 and form a capacitor. The pressure-sensitive sensor 30 further includes an adhesive layer 32 that bonds and fixes the elastic body 33 and the upper electrode 31, and an adhesive layer 34 that bonds and fixes the elastic body 33 and the lower electrode 35. In the present embodiment, the elastic bodies of the six pressure-sensitive sensors 30 are connected to form one frame-shaped elastic body 33, so that the six pressure-sensitive sensors 30 share one elastic body 33. Providing the elastic body 33 in a ring shape prevents dust and the like from entering from the outside into the gap 41 between the touch panel 20 and the housing 55, that is, between the touch panel 20 and the display panel 40. In addition, the lower electrodes of the six pressure-sensitive sensors 30 are connected to form one frame-shaped lower electrode 35, so that the six pressure-sensitive sensors 30 share one lower electrode 35. The upper electrode 31 may also be formed in a frame shape like the lower electrode 35.
For the elastic body 33, a material with little residual strain and a high restoration rate (restoration speed) is used, such as silicone rubber or urethane rubber. The elastic body 33 only needs to be displaced by at most about 10% of its original height; for example, an elastic body 33 with a thickness of 0.5 mm used in the pressure-sensitive sensor 30 only needs to be displaced by about 10 μm.
When the input operation surface 51 is pressed with a finger in the pressing direction (z-axis direction) perpendicular to the input operation surface 51, the elastic body 33 of the pressure-sensitive sensor 30 deforms so as to be compressed, and the top plate 50 and the touch panel 20, to which the pressure-sensitive sensor 30 is bonded and fixed, move in the pressing direction. In this way, when the pressure-sensitive sensor 30 is pressed, its thickness is displaced in the pressing direction. The back surface of the touch panel 20 approaches the front surface of the display panel 40 by the amount of displacement of the pressure-sensitive sensor 30, so the gap 41 between the touch panel 20 and the display panel 40 is provided in consideration of this movement of the touch panel 20. However, when the touch panel 20 and the display panel 40 are bonded together, the gap 41 is not provided. The pressure-sensitive sensor 30 outputs, to the pressure-sensitive detection unit 18, a sensor value corresponding to the capacitance of the capacitor formed by the upper electrode 31 and the lower electrode 35.
 図4は、感圧センサー30の動作原理を説明するための図である。ここでは、タッチパネル20の記載を省略している。
 図4の左上には、指で押圧されていない感圧センサー30の例を示し、図4の右上には、指で押圧された感圧センサー30の例を示している。
FIG. 4 is a diagram for explaining the operating principle of the pressure-sensitive sensor 30. Here, the description of the touch panel 20 is omitted.
An example of the pressure-sensitive sensor 30 that is not pressed with a finger is shown in the upper left of FIG. 4, and an example of the pressure-sensitive sensor 30 that is pressed with a finger is shown in the upper right of FIG.
 上述したようにタッチパネル20が指で押され、感圧センサー30に押圧力が加わると、弾性体33は厚みが減少するように歪む。そして、押圧されていない感圧センサー30の厚みに対して、押圧された感圧センサー30の厚みが変わることで、感圧センサー30の静電容量が変化する。例えば、感圧センサー30が指で押圧されて弾性体33の厚みが変位dだけ減少すると、上部電極31と下部電極35との間の静電容量が変化する。このため、感圧検知部18は、弾性体33の変位dによる上部電極31と下部電極35間の静電容量変化率を利用して、タッチパネル20が押圧され、感圧センサー30が感圧したことを検知している(以下、「感圧を検知」と呼ぶ)。なお、静電容量変化率は、感圧検知部18が上部電極31と下部電極35から出力されるセンサー値に基づいて求められる。ここでセンサー値とは、上部電極31と下部電極35間の静電容量によって決定される電圧である。すなわち、静電容量変化率は、押圧されていない感圧センサー30の上部電極31と下部電極35間の静電容量に対する、押圧された感圧センサー30の上部電極31と下部電極35間の静電容量の割合で求められる。 As described above, when the touch panel 20 is pressed with a finger and a pressing force is applied to the pressure sensor 30, the elastic body 33 is distorted so that the thickness decreases. And the electrostatic capacitance of the pressure sensor 30 changes because the thickness of the pressed pressure sensor 30 changes with respect to the thickness of the pressure sensor 30 which is not pressed. For example, when the pressure sensor 30 is pressed with a finger and the thickness of the elastic body 33 is reduced by the displacement d, the capacitance between the upper electrode 31 and the lower electrode 35 changes. For this reason, the pressure-sensitive detection unit 18 uses the capacitance change rate between the upper electrode 31 and the lower electrode 35 due to the displacement d of the elastic body 33, the touch panel 20 is pressed, and the pressure-sensitive sensor 30 senses pressure. (Hereinafter referred to as “pressure detection”). The capacitance change rate is obtained based on sensor values output from the upper electrode 31 and the lower electrode 35 by the pressure-sensitive detection unit 18. Here, the sensor value is a voltage determined by the capacitance between the upper electrode 31 and the lower electrode 35. That is, the rate of change in capacitance is the static between the upper electrode 31 and the lower electrode 35 of the pressed pressure sensor 30 with respect to the capacitance between the upper electrode 31 and the lower electrode 35 of the pressure sensor 30 that is not pressed. It is obtained as a percentage of electric capacity.
 The graph at the bottom of FIG. 4 shows the relationship between the capacitance change rate of the pressure-sensitive sensor 30 and the pressing force applied to the pressure-sensitive sensor 30. The graph shows that the capacitance change rate of the pressure-sensitive sensor 30 has a linear characteristic that is substantially proportional to the pressing force applied to the sensor. The pressure-sensitive detection unit 18 according to the present embodiment can determine, from the sensor value input from the pressure-sensitive sensor 30, that the capacitance between the upper electrode 31 and the lower electrode 35 has changed, that is, it can obtain the capacitance change rate.
 Based on the graph at the bottom of FIG. 4 showing the relationship between pressing force and capacitance change rate, the pressure-sensitive detection unit 18 converts the sensor value detected by each pressure-sensitive sensor 30 arranged on the back surface of the touch panel 20 into the pressing force applied when the touch panel 20 is pushed in. When this pressing force exceeds a pressing threshold, the pressure-sensitive detection unit 18 determines that the user has deliberately pressed the touch panel 20 and outputs pressure detection information to the control unit 10.
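 The conversion described above can be illustrated with a minimal Python sketch. It assumes a hypothetical linear calibration between capacitance change rate and pressing force; the function names, the constant FORCE_PER_PERCENT, and the threshold value are illustrative assumptions and are not taken from the disclosure.

FORCE_PER_PERCENT = 0.5   # assumed slope of the linear force/change-rate characteristic [N per %]
PRESS_THRESHOLD_N = 1.0   # assumed pressing threshold [N]

def capacitance_change_rate(c_unpressed: float, c_pressed: float) -> float:
    """Change rate in percent, derived from the ratio of pressed to unpressed capacitance."""
    return (c_pressed / c_unpressed - 1.0) * 100.0

def pressing_force(change_rate_percent: float) -> float:
    """Convert a change rate to a pressing force using the assumed linear characteristic."""
    return change_rate_percent * FORCE_PER_PERCENT

def is_deliberate_press(c_unpressed: float, c_pressed: float) -> bool:
    """True when the converted pressing force exceeds the pressing threshold."""
    rate = capacitance_change_rate(c_unpressed, c_pressed)
    return pressing_force(rate) > PRESS_THRESHOLD_N

if __name__ == "__main__":
    print(is_deliberate_press(100.0, 103.0))  # a 3 % change maps to 1.5 N here, so True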
 The pressure-sensitive detection unit 18 may obtain the pressing force from the sum of the capacitance change rates detected by the individual pressure-sensitive sensors 30. This enables highly accurate detection of the pressing force that does not depend solely on the touch position on the input operation surface 51. The pressure-sensitive detection unit 18 may also obtain the pressing force from, for example, the average value obtained by dividing the sum of the capacitance change rates by the number of pressure-sensitive sensors 30.
 The pressure-sensitive detection unit 18 may also output different pressure detection information to the control unit 10 depending on the magnitude of the pressing force. For example, in the graph at the bottom of FIG. 4, the pressing force at which the capacitance change rate is 2.0% is set as a threshold th1 (an example of a first pressing threshold), and the pressing force at which the capacitance change rate is 6.0% is set as a threshold th2 (an example of a second pressing threshold). If the pressing force is less than the threshold th1, the capacitance is considered to have changed merely because of vibration or the like transmitted to the user interface device 3 through the housing 55; that is, the user is not regarded as having intentionally pushed a finger into the touch panel 20. In this case, the pressure-sensitive detection unit 18 does not determine that the user has pushed a finger into the touch panel 20 and therefore does not output pressure detection information. If the pressing force is equal to or greater than the threshold th1, the pressure-sensitive detection unit 18 determines that the user has pushed a finger into the touch panel 20 and outputs first pressure detection information. Furthermore, if the pressing force is equal to or greater than the threshold th2, the pressure-sensitive detection unit 18 determines that the user has pushed a finger strongly into the touch panel 20 and outputs second pressure detection information.
 In this way, the pressure-sensitive detection unit 18 can output either first pressure detection information or second pressure detection information to the control unit 10 based on the pressing force obtained from the capacitance change rate, which varies with how far the user pushes a finger into the touch panel 20. As a result, the control unit 10 can perform different processing depending on whether the first or the second pressure detection information is input from the pressure-sensitive detection unit 18. Note that only one of the first and second pressure detection information may be used. When the first and second pressure detection information need not be distinguished, they are simply referred to as "pressure detection information".
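 A sketch of the two-threshold classification described above is given below. Only the 2.0% and 6.0% figures come from the text; the enum names and the choice of averaging the per-sensor change rates (one of the aggregation options mentioned above) are assumptions for illustration.

from enum import Enum

class PressEvent(Enum):
    NONE = 0     # below th1: treated as vibration, no detection information is output
    FIRST = 1    # at or above th1: first pressure detection information
    SECOND = 2   # at or above th2: second pressure detection information

TH1_PERCENT = 2.0
TH2_PERCENT = 6.0

def classify_press(change_rates_percent: list[float]) -> PressEvent:
    """Classify a press from the capacitance change rates reported by all sensors."""
    if not change_rates_percent:
        return PressEvent.NONE
    average = sum(change_rates_percent) / len(change_rates_percent)
    if average >= TH2_PERCENT:
        return PressEvent.SECOND
    if average >= TH1_PERCENT:
        return PressEvent.FIRST
    return PressEvent.NONE

if __name__ == "__main__":
    print(classify_press([2.5, 3.0, 1.8, 2.7]))  # -> PressEvent.FIRST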
 When pressure detection information is input from the pressure-sensitive detection unit 18, the control unit 10 determines that a finger has been pushed into the touch panel 20 and that the user has performed a confirmation operation. At this time, the control unit 10 vibrates the housing 55 or displays a message on the display panel 40 to inform the user of the pressed state. A voice guide may also be emitted from a speaker (not shown). Alternatively, the control unit 10 can display an icon such as a circle or a square at the touch position being pressed; in this case, the icon may blink or its display color may be changed. The housing 55 can also generate a click sound or change the feel of the finger when the touch panel 20 is pushed in.
 Hereinafter, an operation example using the conventional user interface device 100 and an operation example using the user interface device 3 according to the first embodiment are compared and described with reference to FIGS. 5 and 6.
 Each step of the conventional flowchart shown in FIG. 5 and the flowchart according to the first embodiment of the present invention shown in FIG. 6 is accompanied by a display example of the screen shown on the display panel of the user interface device 3 or 100, in order to explain the specific content of the gesture operation. The position of the user's finger on the touch panel is added to each display example.
 In the screens attached to the steps of FIGS. 5 and 6, a map screen of the user interface device 3 or 100 at a certain moment and a front view, bottom view, and left side view of the finger are shown together within a rectangular broken-line frame. A thin white arrow in the screen indicates the direction of finger movement.
<Problems of the conventional enlargement or reduction operation>
 When a user enlarges or reduces a map displayed on the display panel using a conventional touch panel that can only detect the XY coordinates of touch positions, as shown in FIG. 5, the user has had to perform the operations generally called pinch-out and pinch-in, which widen or narrow the distance between the coordinates of two touch positions. Because the distance between two touch positions must be widened or narrowed, the user must use two fingers. There is also a method in which the user enlarges or reduces the map by double-tapping with one finger, but this operation can only change the magnification to a preset value and cannot change it to an arbitrary value. There are also examples in which enlargement or reduction is achieved with a single finger by double-tapping and then panning (dragging) without lifting the finger. However, such a touch operation is a combination of two gestures, a double tap and a drag, which not only makes the touch operation cumbersome but also makes it difficult for the user to operate intuitively.
<Conventional gesture operation>
 Before describing the first embodiment of the present invention shown in FIG. 6, an example of a conventional enlargement or reduction operation is described with reference to FIG. 5.
 FIG. 5 is a flowchart showing an example of a conventional enlargement or reduction operation.
 First, a control unit (not shown) provided in the conventional user interface device 100 detects that a finger has touched the touch panel of the user interface device 100 (S1). Hereinafter, the control unit of the conventional user interface device 100 is simply referred to as the "control unit" without a reference numeral. Next, the user selects an operation that depends on the number of fingers touching the touch panel (S2). Here, an operation performed with one finger (swipe) or an operation performed with two fingers (pinch-in, pinch-out) is selected.
 The control unit detects the number of fingers touching the touch panel (S3). When one finger is touching (one finger in S3), the control unit outputs, to a display control unit (not shown), control information that moves and displays the map in accordance with the direction in which the finger touching the touch panel moves across it (S4). Hereinafter, the display control unit of the conventional user interface device 100 is simply referred to as the "display control unit" without a reference numeral. The screen attached to step S4 shows the user moving one finger from the lower left to the upper right of the screen while keeping it on the screen, and the map moving and being displayed in accordance with the direction of the finger's movement.
 On the other hand, when two fingers are touching (two fingers in S3), the control unit determines whether the distance between the two touch positions is increasing or decreasing (S5). The screen attached to step S5 shows the user touching the vicinity of the center of the screen with two fingers.
 When a pinch-out that increases the distance between the touch positions is performed (enlarge in S5), the control unit outputs control information for enlarging the map to the display control unit (S6). The screen attached to step S6 shows the user widening the interval between the two fingers while keeping them on the screen (pinch-out), so that the map is displayed enlarged.
 On the other hand, when a pinch-in that decreases the distance between the touch positions is performed in step S5 (reduce in S5), the control unit outputs control information for reducing the map to the display control unit (S7). The screen attached to step S7 shows the user narrowing the interval between the two fingers while keeping them on the screen (pinch-in), so that the map is displayed reduced. After any of steps S4, S6, and S7, the display control unit displays the screen requested by the user on the display panel in accordance with the control information input from the control unit (S8), and this processing ends.
 In this way, the map has conventionally been enlarged or reduced using touch detection on the touch panel and detection of XY-coordinate movement. To enlarge or reduce the map, the user had no choice but to pinch out or pinch in with two fingers, combine two different gestures, or press a display magnification change button.
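 For comparison with the embodiments that follow, the conventional dispatch of FIG. 5 (S1 to S8) can be summarized in a small Python sketch: one touching finger pans the map and two fingers pinch to zoom. The touch data type, the map state (offset, scale), and the scale arithmetic are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Touch:
    x: float
    y: float

def dispatch_conventional(prev: list[Touch], curr: list[Touch],
                          offset: tuple[float, float], scale: float):
    """Return the updated (offset, scale) of the map for one frame of touch input."""
    if len(curr) == 1 and len(prev) == 1:                       # S3: one finger -> pan (S4)
        dx, dy = curr[0].x - prev[0].x, curr[0].y - prev[0].y
        return (offset[0] + dx, offset[1] + dy), scale
    if len(curr) == 2 and len(prev) == 2:                       # S3: two fingers -> pinch (S5)
        def dist(a, b):
            return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5
        ratio = dist(curr[0], curr[1]) / max(dist(prev[0], prev[1]), 1e-6)
        return offset, scale * ratio                            # S6 enlarge / S7 reduce
    return offset, scale                                        # nothing to do; S8 then redraws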
<Gesture operation of the first embodiment>
 FIG. 6 is a flowchart showing an example of the enlargement or reduction operation according to the first embodiment. In the first embodiment, the map is enlarged or reduced by a simple operation using one finger. That is, the control unit 10 according to the first embodiment can output, to the display control unit 19, control information for enlarging or reducing the map in accordance with the direction in which the finger moves while the touch panel 20 is being pushed in by the finger. In the first embodiment, the gesture operation is, for example, an operation in which the user moves a finger that is pushed into the touch panel 20 upward or downward.
 First, when the user's finger touches the touch panel 20 of the user interface device 3, the coordinate detection unit 17 detects the touch (S11) and detects the coordinates of the touch position. The screen attached to step S11 shows the user touching the lower left of the screen with one finger.
 Next, the user selects the operation of pushing the finger into the touch panel 20 (S12). The screen attached to step S12 shows the user touching the upper left of the screen with one finger and further pushing in the touch panel 20. The fact that the touch panel 20 is being pushed in by the finger is indicated by the finger's movement direction, represented by a downward arrow toward the touch panel 20.
 Next, the pressure-sensitive detection unit 18 determines, based on the sensor value output from the pressure-sensitive sensor 30, whether the pressure-sensitive sensor 30 has detected pressure (S13). When the pressure-sensitive detection unit 18 determines from the sensor value that no pressure has been detected (NO in S13), the control unit 10 outputs, to the display control unit 19, control information that moves and displays the map in accordance with the direction in which the finger touching the touch panel 20 moves (S14). The screen attached to step S14 shows the map moving and being displayed, as before, in accordance with the movement of one finger that the user moves from the lower left to the upper right of the screen while keeping it on the screen.
 On the other hand, when the pressure-sensitive detection unit 18 determines from the sensor value that pressure has been detected (YES in S13), the touch operation is performed with the finger pushed in. The control unit 10 therefore determines, based on the coordinate detection information input from the coordinate detection unit 17, whether the finger moving while pushed in is moving upward or downward on the touch panel 20 (S15).
 When the direction in which the finger moves while pushing in the touch panel 20 is upward (an example of a first direction) (up in S15), the control unit 10 outputs, to the display control unit 19, control information for enlarging the map displayed on the display panel 40 (S16). The map is enlarged continuously as the movement distance of the pushed-in finger on the touch panel 20 increases. The screen attached to step S16 shows the map enlarged as a result of the user moving the finger upward while keeping it pushed into the touch panel 20.
 On the other hand, when the direction in which the finger moves while pushing in the touch panel 20 is downward (an example of a second direction) (down in S15), the control unit 10 outputs, to the display control unit 19, control information for reducing the map displayed on the display panel 40 (S17). The map is reduced continuously as the movement distance of the pushed-in finger on the touch panel 20 increases. The screen attached to step S17 shows the map reduced as a result of the user moving the finger downward while keeping it pushed into the touch panel 20.
 After any of steps S14, S16, and S17, the display control unit 19 displays the screen requested by the user on the display panel 40 in accordance with the control information input from the control unit 10 (S18), and this processing ends.
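 The flow of FIG. 6 (S11 to S18) can be sketched as follows: with the finger pushed in, moving it upward zooms in and moving it downward zooms out, while without a press the map simply pans. The handler name, the state representation, and the zoom step per pixel are illustrative assumptions.

ZOOM_PER_PIXEL = 0.005  # assumed continuous zoom factor per pixel of pushed-in movement

def handle_move(pressed: bool, dx: float, dy: float,
                offset: tuple[float, float], scale: float):
    """Update (offset, scale) for one finger-move event; dy < 0 means moving upward."""
    if not pressed:                                   # S13 NO -> S14: pan with the finger
        return (offset[0] + dx, offset[1] + dy), scale
    if dy < 0:                                        # S15 up -> S16: enlarge continuously
        return offset, scale * (1.0 + ZOOM_PER_PIXEL * -dy)
    if dy > 0:                                        # S15 down -> S17: reduce continuously
        return offset, scale / (1.0 + ZOOM_PER_PIXEL * dy)
    return offset, scale                              # S18: the requested screen is then redrawn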
 In the user interface device 3 according to the first embodiment described above, the map displayed on the display panel 40 can be enlarged or reduced at an arbitrary magnification by a simple gesture operation using one finger. This makes it possible to simplify operation methods that have conventionally required two fingers or a combination of multiple gestures.
 For example, when a map is displayed on the display panel 40 of the navigation device 1, the user may switch from a location search mode that involves changing the map coordinates (an operation of designating one coordinate and then moving the finger in the XY directions) to a magnification change mode for checking the detailed position of the user's vehicle (an operation of enlarging the map). In this case, after pushing in the finger that was touching the touch panel 20 to enter the magnification change mode, the user can enlarge or reduce the map simply by moving the still pushed-in finger up or down. The user can therefore achieve the intended function with only one finger. In addition, because the user can perform the operation of moving and displaying content such as a map or a photograph and the operation of changing the display magnification in succession with a single finger, a more sensory and intuitive operation than before can be achieved.
 In addition, while the user's finger is touching the user interface device 3, the finger is unlikely to move away from the touch position even if the running vehicle is shaking. The device can therefore be operated more accurately than a conventional user interface device using a proximity touch panel. Note that the navigation device body 2 may output, to the user interface device 3, control information that prevents the user from operating the user interface device 3 while the vehicle is traveling. Conversely, it may output control information that allows the user to operate the user interface device 3 while the vehicle is stopped.
 The device may also be configured so that the map is reduced when the pushed-in finger moves upward and enlarged when it moves downward. Likewise, for example, the map may be enlarged when the pushed-in finger moves to the right and reduced when it moves to the left, or conversely, enlarged when the finger moves to the left and reduced when it moves to the right.
 When a map that has been enlarged or reduced is returned to its original magnification and displayed, a reset operation may be performed by, for example, immediately lifting the finger that the user has pushed into the touch panel 20, or by drawing some mark (a circle, a square, or the like) on the touch panel 20 with a finger.
 Enlargement or reduction of an image may also be realized using the user interface device 3 according to the first embodiment in image editing software, image viewing software, and the like.
[Second Embodiment]
 Next, an example of the gesture operation of the user interface device 3 according to the second embodiment of the present invention is described with reference to FIG. 7. The gesture operation according to the second embodiment is also a gesture operation performed to enlarge or reduce the map.
<Problems of the conventional enlargement or reduction operation>
 In a conventional user interface device with a touch panel that can detect only the XY coordinates of the touch position, the map has been enlarged or reduced by the operations called pinch-out and pinch-in, as described above. Pinch-out and pinch-in require a gesture operation that widens or narrows the interval between two fingers touching the screen. Conventionally, a display magnification change button for instructing enlargement or reduction has also sometimes been displayed at a predetermined position on the screen. However, to enlarge or reduce the map, the user always had to check the position where the display magnification change button was displayed.
<Gesture operation of the second embodiment>
 FIG. 7 is a flowchart showing an example of the enlargement or reduction operation according to the second embodiment. In the second embodiment shown in FIG. 7, the map can be not only enlarged but also reduced by a simple operation using one finger. That is, the control unit 10 according to the second embodiment outputs, to the display control unit 19, control information for enlarging or reducing the map in accordance with the length of time for which the state in which the touch panel 20 is pushed in by the finger continues. In the second embodiment, the gesture operation is, for example, an operation in which the user pushes the touch panel 20 in with one finger.
 First, when the user's finger touches the touch panel 20 of the user interface device 3, the coordinate detection unit 17 detects the touch (S21) and detects the coordinates of the touch position. The screen attached to step S21 shows the map moving and being displayed in accordance with the movement of one finger after the user touches the lower left of the screen with the finger and moves it from the lower left to the upper right of the screen.
 Next, the user selects the operation of pushing the finger into the touch panel 20 (S22). The screen attached to step S22 shows the user touching the left side of the screen with one finger and further pushing in the touch panel 20.
 Next, the pressure-sensitive detection unit 18 determines, based on the sensor value output from the pressure-sensitive sensor 30, whether the pressure-sensitive sensor 30 has detected pressure (S23). When the pressure-sensitive detection unit 18 determines from the sensor value that pressure has been detected (YES in S23), the touch operation is performed with the finger pushed in.
 The control unit 10 then determines whether the pressure-sensitive state is being maintained (S24). In the following description, the state in which the pressure-sensitive detection unit 18 keeps outputting pressure detection information to the control unit 10 because the touch panel 20 is pushed in by the finger is expressed as "the pressure-sensitive state is maintained". The control unit 10 determines whether the pressure-sensitive state is maintained based on, for example, how long the pressure detection information output by the pressure-sensitive detection unit 18 continues. At this time, the control unit 10 determines whether the time for which the state in which the touch panel 20 is pushed in by the finger continues is equal to or longer than a predetermined time (for example, one second).
 If the time for which the state in which the touch panel 20 is pushed in by the finger continues is equal to or longer than the predetermined time, the control unit 10 determines that the pressure-sensitive state is maintained (YES in S24). The control unit 10 further determines whether a determination time (for example, two seconds) has elapsed while the pressure-sensitive state is maintained (S25). The determination in step S25 is made by the control unit 10 at every determination time.
 If the determination time has elapsed while the pressure-sensitive state is maintained (YES in S25), the control unit 10 outputs, to the display control unit 19, control information for enlarging the map displayed on the display panel 40 (S26). The screen attached to step S26 shows the map being displayed at a larger magnification, and thus enlarged, as the time for which the user keeps the finger pushed into the touch panel 20 becomes longer. After step S26, the display control unit 19 displays the screen requested by the user on the display panel 40 in accordance with the control information input from the control unit 10 (S27). After step S27, the control unit 10 returns to step S25 and continues the processing. Because steps S25 to S27 are repeated while the user keeps pushing the finger into the touch panel 20, the map is gradually enlarged and displayed at every determination time (for example, every two seconds).
 On the other hand, if the determination time has not elapsed while the pressure-sensitive state is maintained in step S25 (NO in S25), the control unit 10 ends this processing. Accordingly, when the control unit 10 has passed through steps S26 and S27, the map is displayed in its enlarged state.
 In step S24, if the time for which the state in which the touch panel 20 is pushed in by the finger continues is less than the predetermined time, the control unit 10 determines that the pressure-sensitive state is not maintained (NO in S24) and outputs, to the display control unit 19, control information for reducing the map displayed on the display panel 40 (S28). The screen attached to step S28 shows the map being displayed reduced at a smaller magnification.
 If the pressure-sensitive detection unit 18 determines in step S23 that no pressure has been detected from the sensor value (NO in S23), the control unit 10 outputs, to the display control unit 19, control information that moves and displays the map in accordance with the direction in which the finger touching the touch panel 20 moves (S29). After either step S28 or S29, the display control unit 19 displays the screen requested by the user on the display panel 40 in accordance with the control information input from the control unit 10 (S30), and this processing ends.
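 The timing logic of FIG. 7 can be sketched as follows: a press held for at least the predetermined time keeps enlarging the map once per determination time, while a press released sooner reduces it one step. The one-second and two-second values follow the examples in the text; the zoom step, the polling loop, and the is_pressed callback are illustrative assumptions.

import time

HOLD_TIME = 1.0            # predetermined time for the pressure-sensitive state to count as maintained [s]
DETERMINATION_TIME = 2.0   # determination interval for each enlargement step [s]
ZOOM_STEP = 1.25           # assumed magnification applied per step

def run_press(scale: float, is_pressed, now=time.monotonic) -> float:
    """Track one press; is_pressed() reports whether the panel is still pushed in."""
    start = now()
    last_step = start
    while is_pressed():
        t = now()
        if t - start >= HOLD_TIME and t - last_step >= DETERMINATION_TIME:
            scale *= ZOOM_STEP          # S25 YES -> S26/S27: enlarge and redraw
            last_step = t
        time.sleep(0.01)
    if now() - start < HOLD_TIME:       # S24 NO: released quickly -> S28: reduce
        scale /= ZOOM_STEP
    return scale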
 In the user interface device 3 according to the second embodiment described above, the map is enlarged and displayed when the touch panel 20 is pushed in with a finger, and the map is reduced and displayed when the finger pushed into the touch panel 20 is lifted immediately. For example, to display the map on the display panel 40 in detail or over a wide area, the user can press and hold one point on the screen so that the map is enlarged step by step at a predetermined magnification every two seconds, or push in one point on the screen and release it immediately so that the map is reduced at a predetermined magnification. Because this gesture operation is performed with one finger, the conventional gesture operation using two fingers becomes unnecessary.
 In addition, the pressure-sensitive detection unit 18 can detect that the touch panel 20 has been pushed in at any position on the touch panel 20. The user therefore does not need to look for a conventional magnification change button and can easily enlarge or reduce the map.
 Note that the map may instead be reduced and displayed when the touch panel 20 is pushed in with a finger and the pressure-sensitive state is maintained, and enlarged and displayed when the finger is lifted from the touch panel 20 immediately after the touch panel 20 is pushed in.
 When the user moves the finger, while it remains pushed into the touch panel 20, in the same direction, the map may be moved and displayed in accordance with the direction of the finger's movement while being enlarged or reduced.
 Also, for example, in the processing of image editing software, a plurality of layers may be switched and displayed when the pressure-sensitive state is maintained and the touch panel 20 is pushed in for two seconds or more.
[Third Embodiment]
 Next, an example of the gesture operation of the user interface device 3 according to the third embodiment of the present invention is described with reference to FIG. 8. The gesture operation according to the third embodiment is performed to enlarge or reduce the map in accordance with the number of fingers performing the touch operation with a finger pushed in.
<Problems of the conventional enlargement or reduction operation>
 In a conventional user interface device, the map has had to be enlarged or reduced by the gesture operations generally called pinch-out and pinch-in, which widen or narrow the interval between two fingers touching the screen. In the third embodiment, by contrast, the map is enlarged or reduced without performing two-finger pinch-out or pinch-in operations.
<Gesture operation of the third embodiment>
 FIG. 8 is a flowchart showing an example of the enlargement or reduction operation according to the third embodiment. In the third embodiment shown in FIG. 8, the map can be enlarged or reduced by a simple operation using one or two fingers. That is, the control unit 10 according to the third embodiment outputs, to the display control unit 19, control information for enlarging or reducing the map in accordance with the number of fingers pushing in the touch panel 20. In the third embodiment, the gesture operation is, for example, an operation in which the user pushes the touch panel 20 in with one or two fingers.
 Note that the processing in steps S31 to S34 of FIG. 8 is the same as the processing in steps S21 to S23 and S29 of FIG. 7 in the second embodiment described above, and a detailed description is therefore omitted.
 When the pressure-sensitive detection unit 18 determines in step S33 that pressure has been detected from the sensor value output by the pressure-sensitive sensor 30 (YES in S33), the control unit 10 detects, based on the coordinate detection information input from the coordinate detection unit 17, the number of fingers touching the touch panel 20 (S35). When it is determined that the number of fingers pushing in the touch panel 20 is one (an example of a first number) (one finger in S35), the control unit 10 outputs, to the display control unit 19, control information for enlarging the map displayed on the display panel 40 (S36). The screen attached to step S36 shows the map enlarged as a result of the user pushing in the touch panel 20 with one finger.
 On the other hand, when it is determined in step S35 that the number of fingers pushing in the touch panel 20 is two (an example of a second number) (two fingers in S35), the control unit 10 outputs, to the display control unit 19, control information for reducing the map displayed on the display panel 40 (S37). The screen attached to step S37 shows the map reduced as a result of the user pushing in the touch panel 20 with two fingers.
 After any of steps S34, S36, and S37, the display control unit 19 displays the screen requested by the user on the display panel 40 in accordance with the control information input from the control unit 10 (S38), and this processing ends.
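 The dispatch of FIG. 8 (S33 to S38) can be sketched as follows: when a press is detected, one pushing finger enlarges the map and two fingers reduce it; without a press the map pans as before. The function name, the state representation, and the zoom factor are illustrative assumptions.

ZOOM_FACTOR = 1.25  # assumed magnification step per press

def handle_press(pressed: bool, touch_count: int, dx: float, dy: float,
                 offset: tuple[float, float], scale: float):
    """Update (offset, scale) for one event according to press state and finger count."""
    if not pressed:                       # S33 NO -> S34: pan with the finger
        return (offset[0] + dx, offset[1] + dy), scale
    if touch_count == 1:                  # S35: first number of fingers -> S36: enlarge
        return offset, scale * ZOOM_FACTOR
    if touch_count == 2:                  # S35: second number of fingers -> S37: reduce
        return offset, scale / ZOOM_FACTOR
    return offset, scale                  # other counts could select further actions; S38 then redraws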
 In the user interface device 3 according to the third embodiment described above, the map can be enlarged or reduced according to the number of fingers pushing in the touch panel 20, which realizes a more sensory and intuitive operation than before. This eliminates the conventional gesture operation of spreading or narrowing two fingers using touch detection on the touch panel and detection of XY-coordinate movement.
 Note that the map may instead be reduced when the number of fingers pushing in the touch panel 20 is one and enlarged when the number of fingers is two. Even when three or more fingers are touching, various actions may be made selectable according to the number of touching fingers. For example, the first number may be a touch count of two fingers and the second number a touch count of one or three fingers. When the user moves the pushed-in fingers in the same direction, the map may be moved and displayed in accordance with the direction of the fingers' movement while being enlarged or reduced.
 The speaker volume may also be increased when the number of fingers pushing in the touch panel 20 is one and decreased when it is two, or conversely decreased when the number of fingers is one and increased when it is two. The screen contrast may also be raised or lowered by the gesture operation of the third embodiment. It is also possible to switch the vehicle icon displayed during car navigation between heading-up (displaying the traveling direction at the top) and north-up (displaying north at the top).
[Fourth Embodiment]
 Next, an example of the gesture operation of the user interface device 3 according to the fourth embodiment of the present invention is described with reference to FIG. 10. The gesture operation according to the fourth embodiment is performed to switch the character type.
<Problems of the conventional character type switching operation>
 As will be described later with reference to FIG. 9, the user has been able to input characters from a software keyboard displayed on the display panel using a conventional touch panel. Here, the conventional touch panel is a touch panel that can detect only the XY coordinates of the touch position. With a conventional touch panel, however, when the user switches from lowercase to uppercase input, the user must press the shift key or a character type switching key displayed on the software keyboard and input the character after the character type has been switched. The character type therefore sometimes has to be switched frequently, and the shift key or character type switching key has to be pressed many times, resulting in poor operability.
<Conventional gesture operation>
 Before describing the fourth embodiment of the present invention shown in FIG. 10, an example of the conventional character type switching operation is described with reference to FIG. 9.
 FIG. 9 is a flowchart showing an example of the conventional character type switching operation. Here, an example is described in which the user inputs lowercase or uppercase alphabetic characters using a software keyboard displayed on a conventional display panel.
 First, the user selects the character type of the software keyboard displayed on the display panel of the conventional user interface device 100 (S41). Because the input mode of the character input keys (hereinafter abbreviated as "keys") of this software keyboard is the lowercase input mode, lowercase letters are displayed on the keys. In this state, the user can therefore input lowercase letters by touching the displayed keys. Next, the user selects either the uppercase or the lowercase character type (S42).
 If the character type selected by the user is not uppercase (lowercase in S42), the user touches the key of the character to be input without switching the character type (S43), and the lowercase letter of that key is input. When the user lifts the finger from the touch panel, the lowercase letter at the touch position is confirmed (S44). The confirmed lowercase letter is then displayed on the display panel, and this processing ends.
 On the other hand, if the character type selected by the user in step S42 is uppercase (uppercase in S42), the user touches the shift key of the software keyboard (S45). The screen attached to step S45 shows the user touching the shift key (the key represented by an upward arrow) at the lower left of the software keyboard.
 The user then lifts the finger from the shift key to confirm the uppercase input mode (S46). The screen attached to step S46 shows the user lifting the finger from the shift key of the software keyboard. At this time, the key input mode is changed from the lowercase input mode to the uppercase input mode. Uppercase letters are then displayed on the keys, and the user can input uppercase letters.
 When the user touches the key of the character to be input (S47), the uppercase letter of that key is input. The screen attached to step S47 shows the user touching a key on the software keyboard.
 When the user lifts the finger from the key, the uppercase letter at the touch position is confirmed (S48). The screen attached to step S48 shows the user lifting the finger from the software keyboard. The confirmed uppercase letter is then displayed on the display panel, and this processing ends.
 In conventional character input, therefore, unless the user presses the shift key or a character type switching key displayed on the software keyboard, the character type does not change and the key input mode cannot be changed. For example, when entering English text in which the first letter of a sentence must be capitalized, the shift key or character type switching key has to be pressed frequently, which is inconvenient.
<Gesture operation of the fourth embodiment>
 FIG. 10 is a flowchart showing an example of the character type switching operation according to the fourth embodiment. In the fourth embodiment shown in FIG. 10, the character type can be switched by a simple operation using one finger.
 That is, when the touch panel 20 is pushed in at a position where a key (an example of an object) of the software keyboard displayed on the display panel 40 is displayed, the control unit 10 according to the fourth embodiment outputs, to the display control unit 19, control information for changing the input mode that defines the character type of the characters input from the keys. In the fourth embodiment, the gesture operation is an operation in which the user pushes a finger into the touch panel 20 to switch the key input mode and lifts the finger to confirm a character; changing the display form of the object means, for example, displaying the keys in the changed input mode.
 First, the user selects the character type of the software keyboard displayed on the display panel 40 of the user interface device 3 (S51). At this time, the key input mode is the lowercase input mode, in which the key characters can be input as lowercase letters (an example of a first character type), and lowercase letters are displayed on the keys of the software keyboard. The user then touches a key (S52). The screen attached to step S52 shows the user touching a key on the software keyboard.
 Next, the pressure-sensitive detection unit 18 detects pressure based on the sensor value input from the pressure-sensitive sensor 30 (S53). When the pressure-sensitive detection unit 18 does not detect pressure (NO in S53), no pressure detection information is output to the control unit 10. In this case, when the user lifts the finger from the key, the lowercase letter of the key the user was touching is confirmed and displayed on the display panel 40 (S54), and this processing ends.
 On the other hand, when the pressure-sensitive detection unit 18 detects pressure (YES in S53), pressure detection information is output to the control unit 10. When the touch panel 20 is pushed in at the position where a key is displayed, the control unit 10 changes the input mode from the mode in which the key characters are input as lowercase letters to the mode in which they are input as uppercase letters (S55). As a result, the key input mode changes to the uppercase input mode, in which the key characters can be input as uppercase letters (an example of a second character type), and uppercase letters are displayed on the keys of the software keyboard. The screen attached to step S55 shows the user touching an uppercase key on the software keyboard, which is displayed with the keys switched to the uppercase input mode.
 When the user lifts the finger from the key, the uppercase letter of the key the user was touching is confirmed and displayed on the display panel 40 (S56), and this processing ends. The screen attached to step S56 shows the uppercase letter being confirmed and displayed when the user lifts the finger from the software keyboard. The control unit 10 then returns the key input mode from the uppercase input mode to the lowercase input mode.
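 The key handling of FIG. 10 (S51 to S56) can be sketched as follows: a plain touch-and-release enters the lowercase letter, while pushing the key in switches the keyboard to the uppercase mode for that keystroke and then switches back. The class structure and method names are assumptions for illustration.

class SoftwareKeyboard:
    def __init__(self) -> None:
        self.uppercase_mode = False   # S51: start in the lowercase input mode

    def on_press_detected(self) -> None:
        """S53 YES / S55: a push-in on a key temporarily selects the uppercase mode."""
        self.uppercase_mode = True    # the displayed key labels would change here

    def on_release(self, key_char: str) -> str:
        """S54 / S56: releasing the key confirms the character and restores the mode."""
        confirmed = key_char.upper() if self.uppercase_mode else key_char.lower()
        self.uppercase_mode = False   # return to the lowercase input mode afterwards
        return confirmed

if __name__ == "__main__":
    kb = SoftwareKeyboard()
    print(kb.on_release("a"))         # plain tap -> "a"
    kb.on_press_detected()
    print(kb.on_release("a"))         # pushed-in tap -> "A"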
 In the user interface device 3 according to the fourth embodiment described above, when characters are input with the software keyboard, the character type of the input character can easily be switched simply by pushing a finger into the touch panel 20. Conventionally, the user had to move the finger to the shift key and touch it to switch the character type; with the gesture operation of the fourth embodiment, the user can change the character type from lowercase to uppercase while keeping the finger at the position of the key the user wants to input. The user can therefore switch the character type easily with an intuitive gesture operation.
 Alternatively, the key input mode may initially be the uppercase input mode, and when the pressure-sensitive detection unit 18 detects pressure, the mode may be switched temporarily to the lowercase input mode and returned to the uppercase input mode after the lowercase letter is confirmed and input.
 Similarly, the key input mode may initially be the hiragana input mode, and when the pressure-sensitive detection unit 18 detects pressure, the mode may be switched temporarily to the katakana input mode and returned to the hiragana input mode after the katakana character is confirmed and input. The initial input mode may instead be the katakana input mode, with the hiragana input mode selected when pressure is detected. The key input mode may also be switched from the full-width input mode to the half-width input mode, or from the half-width input mode to the full-width input mode. The control unit 10 may also keep the switched input mode for the next character input and return to the original input mode when the user pushes a finger into the touch panel 20 again.
 When the key input mode is the hiragana or katakana input mode and the pressure-sensitive detection unit 18 detects pressure, it may also be made possible to input, according to the character at the position touched by the user's finger, voiced sounds (ga, gi, gu, and so on), semi-voiced sounds (pa, pi, pu, and so on), the geminate consonant (small tsu), small kana (ya, yu, yo, a, i, u, and so on), symbols (+, -, ×, ÷, and so on), umlauts, and the like.
[Fifth Embodiment]
 Next, an example of the gesture operation of the user interface device 3 according to the fifth embodiment of the present invention is described with reference to FIG. 12. The gesture operation according to the fifth embodiment is performed to select an edit item (an example of a menu item) at the position where a certain operation is performed.
<Problems of the conventional menu selection operation>
 Conventionally, when a user creates and edits a document using document editing software, a mouse and a keyboard have mainly been used. With a mouse, right-clicking while a certain area (for example, a character string) is selected displays a list of edit items such as copy, paste, and cut, from which an arbitrary edit item can be selected. However, the function of displaying a list of edit items by right-clicking could not be realized using only a conventional touch panel.
<Conventional gesture operation>
Before describing the fifth embodiment of the present invention shown in FIG. 12, an example of a conventional menu selection operation will be described with reference to FIG. 11.
FIG. 11 is a flowchart showing an example of a conventional menu selection operation.
First, the control unit of the conventional user interface device 100 detects, through operation of a mouse or the like, the position indicated by the black-arrow instruction icon (S61). The screen attached to step S61 shows the instruction icon pointing at a character string on the screen. When the user operates the instruction icon and selects an arbitrary character string from the character strings on the screen, the selected character string is highlighted in bold.
Next, the user moves the instruction icon to select a menu (S62). If the user does not select a menu (NO in S62), the instruction icon simply moves along the path of the mouse moved by the user (S63), and this processing ends.
If the user selects a menu in step S62 (YES in S62), the edit items of the edit menu, for example, are displayed (S64). That is, when the user selects the edit menu from the menu bar at the top of the screen, the edit items included in that edit menu (copy, paste, cut, and the like) are listed. In the screen attached to step S64, the character string selected by the user is indicated by a dashed ellipse, and the edit menu and the list of edit items are shown in the menu bar at the top of the screen.
Next, the user selects an edit item (S65), and this processing ends. The selected character string can then be edited. Although FIG. 11 shows an example in which the user clicks the edit menu on the menu bar, the same edit items as in the edit menu are also listed, and an edit item can be selected, when the user right-clicks a mouse serving as an external input device, for example, while an arbitrary character string is selected.
As described above, in order to edit a character string selected by the user, it has conventionally been necessary to move the instruction icon to the menu bar, or to right-click the mouse, to display the list of edit items and then select one. If no mouse is connected to the conventional user interface device 100, however, these operations are difficult to perform. In the fifth embodiment, on the other hand, the edit menu can be displayed and an edit item can be selected by performing a push-in operation.
<Gesture operation of the fifth embodiment>
FIG. 12 is a flowchart showing an example of the menu selection operation according to the fifth embodiment.
In the fifth embodiment shown in FIG. 12, a character string can be edited with a simple operation using one finger. That is, when the touch panel 20 is pushed in while an object in a predetermined area is selected, the control unit 10 according to the fifth embodiment outputs, to the display control unit 19, control information for displaying menu items related to the object at the position where the touch panel 20 was pushed in. In the fifth embodiment, the gesture operation is an operation in which the user pushes a finger into the touch panel 20 and selects an edit item from the edit menu displayed as a result, and changing the display form of the object means, for example, displaying the character string edited through an edit item of the edit menu.
First, when the user's finger touches the touch panel 20 of the user interface device 3, the coordinate detection unit 17 detects the touch (S71) and detects the coordinates of the touch position.
Next, the user selects a character string (an example of an object) in a predetermined area by a touch operation and performs an operation of pushing the finger into the touch panel 20 (S72). The pressure-sensitive detection unit 18 then determines, based on the sensor value output from the pressure-sensitive sensor 30, whether the pressure-sensitive sensor 30 has detected a press (S73). If the pressure-sensitive detection unit 18 determines from the sensor value that no press has been detected (NO in S73), the control unit 10 moves and displays the instruction icon in accordance with the direction in which the finger touching the touch panel 20 moves (S74), and this processing ends.
If, on the other hand, the pressure-sensitive detection unit 18 determines in step S73 that a press has been detected from the sensor value (YES in S73), the touch panel 20 has been pushed in while the object in the predetermined area is selected. In this case, the control unit 10 outputs, to the display control unit 19, control information for displaying the edit items of the edit menu at the position where the touch panel 20 was pushed in. The edit items of the edit menu are an example of menu items related to the object when the object is a character string.
The display control unit 19 then displays the edit items of the edit menu on the display panel 40 (S75). In the screen attached to step S75, the area selected by the user (for example, a character string) is indicated by a dashed ellipse, and the list of edit items is shown. The user then selects an arbitrary edit item from the displayed edit items (S76), and this processing ends.
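The following is a minimal sketch, under assumed names and thresholds, of the flow in FIG. 12: a push-in while a text selection is active opens the edit menu at the pressed position, while an ordinary touch only moves the instruction icon. The menu contents and the DemoUI facade are illustrative, not taken from the disclosure.

```python
PRESS_THRESHOLD = 0.5  # assumed normalized pressure value for a "push-in"

def handle_touch(x, y, pressure, selection, ui):
    """selection: currently selected text range or None; ui: display facade."""
    if pressure < PRESS_THRESHOLD:
        ui.move_pointer(x, y)            # S74: an ordinary touch just moves the pointer
        return
    if selection is not None:
        # S75: push-in with an active selection -> context menu at the touch point
        ui.show_menu(["copy", "paste", "cut"], at=(x, y))

class DemoUI:
    def move_pointer(self, x, y):
        print(f"pointer -> ({x}, {y})")
    def show_menu(self, items, at):
        print(f"menu {items} shown at {at}")

handle_touch(120, 80, 0.2, selection=None, ui=DemoUI())      # pointer moves
handle_touch(120, 80, 0.9, selection=(10, 24), ui=DemoUI())  # edit menu appears
```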
In the user interface device 3 according to the fifth embodiment described above, when the touch panel 20 is pushed in, the edit items of the edit menu for the area selected through the touch panel 20 are displayed. Because the edit items are displayed while the predetermined area remains selected, much like a right-click with a conventional mouse, the user can select an edit item without moving the finger a large distance.
Note that the menu bar contains, in addition to the edit menu, a file menu, a view menu, a help menu, and the like. The menu displayed when the touch panel 20 is pushed in may therefore be a menu other than the edit menu. The displayed menu may also be switched according to the push-in amount of the touch panel 20. For example, if the push-in amount is small (the pressing force is greater than or equal to a threshold th1 and less than a threshold th2), the edit items of the edit menu may be displayed, and if the push-in amount is large (the pressing force is greater than or equal to the threshold th2), the items of the view menu may be displayed.
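The following is a minimal sketch of the two-threshold variant just described. The numeric values of th1 and th2 and the menu names are assumptions for illustration only.

```python
TH1 = 0.3   # assumed threshold for a light push-in
TH2 = 0.7   # assumed threshold for a deep push-in

def menu_for_pressure(pressure):
    """Pick which menu to open based on how hard the panel is pressed."""
    if pressure >= TH2:
        return "view menu"    # deep push-in
    if pressure >= TH1:
        return "edit menu"    # light push-in
    return None               # plain touch: no menu

for p in (0.1, 0.5, 0.9):
    print(p, "->", menu_for_pressure(p))
```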
Alternatively, the displayed menu may be switched, for example every two seconds, according to the length of time during which the pressure-sensitive detection unit 18 continues to detect the press. This allows the user to select an item from a menu other than the edit menu.
[Sixth Embodiment]
Next, an example of the gesture operation of the user interface device 3 according to the sixth embodiment of the present invention will be described with reference to FIG. 14. The gesture operation according to the sixth embodiment is performed as a cancel operation that undoes the operation performed immediately before.
<Problems of conventional cancel operations>
With a user interface device having a conventional touch panel that can detect only the XY coordinates of a touch position, canceling the operation performed immediately before requires clicking a cancel icon displayed on the display panel. If no cancel icon is displayed, the cancel operation must be performed with an external input device, such as a keyboard, connected to the user interface device. However, the installation space in a vehicle is limited, it is difficult to enlarge the touch panel and the display panel, and there is no space for installing an external input device. In addition, because the displayable area of the display panel of a navigation device is small, there is no space for displaying a cancel icon. For these reasons, an operation for returning to the immediately preceding state or for canceling an input could not be performed intuitively and simply with a conventional user interface device.
<Conventional gesture operation>
Before describing the sixth embodiment of the present invention shown in FIG. 14, an example of a conventional cancel operation will be described with reference to FIG. 13.
FIG. 13 is a flowchart showing an example of a conventional cancel operation.
The control unit of the conventional user interface device 100 detects that a finger has touched the touch panel of the user interface device 100 (S81). The screen attached to step S81 shows the user touching the lower left of the screen with one finger.
The user then inputs a character string with a software keyboard (not shown). Next, the user determines whether the input character string is correct (S82). The screen attached to step S82 shows the character string input by the user being displayed.
If the user determines in step S82 that the character input is correct (YES in S82), this processing ends. If the user determines that the character input is incorrect (NO in S82), the user performs a cancel operation. This cancel operation is performed either as an input operation through an external input device (S83) or as a touch operation on a cancel icon displayed at the lower right of the screen (S84). Here, both the operations of steps S83 and S84 are referred to as cancel operations. After performing the cancel operation, the user returns to step S82, inputs characters again, and determines whether the character input is correct.
As described above, the cancel operation has conventionally been realized by using an external input device or by touching a cancel icon, so it has been difficult to perform a cancel operation in applications where installation space is limited. In the sixth embodiment, on the other hand, the immediately preceding operation can be canceled by a push-in operation.
<Gesture operation of the sixth embodiment>
FIG. 14 is a flowchart showing an example of the cancel operation according to the sixth embodiment. In the sixth embodiment shown in FIG. 14, the cancel operation can be performed with a simple operation using one finger. That is, the control unit 10 according to the sixth embodiment outputs, to the display control unit 19, control information for performing an editing operation on the operation performed immediately before, according to the direction in which the finger moves while the touch panel 20 is pushed in by the finger. In the sixth embodiment, the gesture operation is an operation of moving the finger, while it is pushed into the touch panel 20, in the direction opposite to the direction in which the character string was input, and changing the display form of the object means, for example, displaying the edited character string.
First, when the user's finger touches the touch panel 20 of the user interface device 3, the coordinate detection unit 17 detects the touch (S91) and detects the coordinates of the touch position. The screen attached to step S91 shows the user touching the lower left of the screen with one finger.
The user then inputs a character string with a software keyboard (not shown). Next, the user determines whether the input character string is correct (S92). The screen attached to step S92 shows the character string input by the user being displayed.
If the user determines in step S92 that the character input is correct (YES in S92), this processing ends. If the user determines that the character input is incorrect (NO in S92), the user performs a cancel operation. In this cancel operation, the user first pushes a finger into the touch panel 20 (S93). The screen attached to step S93 shows the user pushing into the touch panel 20.
Next, the user moves the finger to the left while keeping it pushed into the touch panel 20 (S94). At this time, the control unit 10 outputs, to the display control unit 19, control information for performing an editing operation that cancels or redoes the operation performed immediately before, according to the direction (for example, to the left) in which the finger moves while the touch panel 20 is pushed in. The immediately preceding operation is thereby canceled by an amount corresponding to the finger movement, and the input character string is deleted. The screen attached to step S94 shows all of the input operations for the character string entered immediately before being canceled. Here, the series of operations performed in the order of steps S93 and S94 is referred to as the cancel operation. After performing the cancel operation, the user returns to step S92, inputs characters again, determines whether the character input is correct, and ends this processing when determining that it is correct.
In the user interface device 3 according to the sixth embodiment described above, the immediately preceding operation can be canceled by combining a push-in operation with a touch operation that moves the finger in one direction, so the user can perform the cancel operation intuitively and simply. Whereas an icon conventionally had to be displayed on the display panel, or an external input device provided, in order to perform the cancel operation, the present embodiment is not subject to constraints such as installation space or supported applications. Because the user can thus perform the cancel operation easily with the user interface device 3, it can be applied not only to the navigation device 1 but also to devices for various other purposes.
In the case of character input, a plurality of operations may be canceled together according to the push-in amount when the finger pushed into the touch panel 20 moves further in the pressing direction. When the user moves the finger pushed into the touch panel 20 to perform the cancel operation, the movement direction of the finger is not limited to the left and may be another direction.
Further, if the user moves the finger to the right after performing the cancel operation with the pushed-in finger, the character string may be displayed again as a redo of the operation. Alternatively, moving the finger upward may repeat the operation. Other operations may also be assigned to gesture operations performed with the user interface device 3, as illustrated in the sketch below.
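The following is a minimal sketch of mapping the drag direction of a pushed-in finger to an edit action. Only "left = undo" is the main example in the text; the right and upward assignments are the optional variants mentioned above, and the function and parameter names are hypothetical.

```python
def gesture_action(dx, dy, pressed):
    """dx, dy: finger movement while pushed in; returns the edit action name."""
    if not pressed or (dx == 0 and dy == 0):
        return None
    if abs(dx) >= abs(dy):
        return "undo" if dx < 0 else "redo"   # left = undo, right = redo (variant)
    return "repeat" if dy < 0 else None       # up = repeat (variant); screen y grows downward

print(gesture_action(-30, 5, pressed=True))   # undo
print(gesture_action(25, -2, pressed=True))   # redo
print(gesture_action(3, -40, pressed=True))   # repeat
```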
In the navigation device 1, when the user has specified one location and input it as a destination and then specifies another location and inputs it as a destination, the location for which the destination input was previously performed may be displayed if the user pushes a finger into the touch panel 20 and moves it to the left.
The user interface device 3 according to the sixth embodiment may also be used in image editing software to cancel or redo an operation on an image. The user may also perform a plurality of cancel operations by changing the movement direction of the pushed-in finger partway through. For example, changing the finger movement direction from left to up may be treated as an operation that cancels the two consecutive input operations performed immediately before.
[Seventh Embodiment]
Next, an example of the gesture operation of the user interface device 3 according to the seventh embodiment of the present invention will be described with reference to FIGS. 17 and 18. The gesture operation according to this embodiment is performed with the user interface device 3 placed at a location away from the navigation device.
<Problems of conventional navigation devices>
First, before describing an example of the gesture operation of the user interface device 3 according to the seventh embodiment of the present invention with reference to FIGS. 17 and 18, a conventional gesture operation will be described with reference to FIGS. 15 and 16.
FIG. 15 is an explanatory diagram showing an installation example of a conventional user interface device 110.
In recent years, a user interface device 110 without a touch panel has sometimes been used in navigation devices mounted in vehicles so that the user can operate the navigation device at hand. Such a navigation device includes a navigation device body and the user interface device 110, which can only display a navigation screen. The display panel 111 (an example of a second display panel) of this user interface device 110 is not provided with a touch panel, so the user cannot perform touch operations through the user interface device 110.
To allow the user to operate the navigation device, an external input device is installed in the vehicle. Examples of the external input device include a button 101 and a joystick 102 installed on the center console of the vehicle and connected to the navigation device. A knob-type controller may also be installed instead of the joystick 102.
However, when the navigation device is operated with such external input devices, selection operations must be performed with the joystick 102 and confirmation operations with the button 101. The operation of the navigation device itself therefore becomes very cumbersome, and long practice is needed to become accustomed to it. Moreover, not only is the space on the center console where external input devices can be installed limited, but the use of the external input devices is also limited to selection and confirmation operations. As a result, to open a function the user wants to use, selection and confirmation must be repeated with the external input devices, which takes time and effort.
<Conventional gesture operation>
FIG. 16 is a flowchart showing an operation example of a conventional navigation device. Here, an example in which a navigation operation is performed with the joystick 102 and the button 101 as external input devices will be described.
First, the control unit (not shown) of the navigation device detects operation of the joystick 102 (S101). The screen attached to step S101 shows the instruction icon being displayed on the display panel 111 of the user interface device 110.
Next, the user operates the joystick 102 to input predetermined information (S102). The screen attached to step S102 shows the instruction icon moving as the user moves the joystick 102. The instruction icon at its destination is represented by a white arrow on the screen.
Next, the user moves the instruction icon onto a button icon and confirms the input information by pressing the button icon (S103). The screen attached to step S103 shows the information input by the user being confirmed when the user presses the button 101.
Conventionally, as described above, the selection operation had to be performed with the joystick 102 and the confirmation operation with the button 101. However, because the joystick 102 and the button 101 are placed at different positions, erroneous operations can occur. Moreover, no operation functions other than selection and confirmation could be assigned to the joystick 102 and the button 101.
<Configuration and operation of the navigation device of the seventh embodiment>
FIG. 17 is an explanatory diagram showing a configuration example of a navigation device 1A according to the seventh embodiment. In the seventh embodiment shown in FIG. 17, input information can be confirmed by using a push-in operation in combination.
That is, the navigation device 1A includes another user interface device 110 in addition to the navigation device body 2 and the user interface device 3 shown in FIG. 2 described above.
The user interface device 110 is a conventional user interface device and does not include the touch panel 20. Only objects used for navigation can be displayed on its display panel 111. The display panel 111 displays objects whose display form has been changed by the navigation device body 2 performing predetermined processing based on the content instructed by gesture operations input from the control unit 10.
The navigation device body 2 and the user interface device 3 can communicate with each other wirelessly or by wire. Objects output from the navigation device body 2 are displayed on the display panel 111 of the user interface device 110. In the following description, the user interface device 3 is called the "pressure-sensitive controller" to distinguish it from the conventional user interface device 110.
For example, (1) of FIG. 17 shows the navigation device body 2 including the user interface device 110 installed on the instrument panel of the vehicle and the pressure-sensitive controller installed on the center console of the vehicle. A map is displayed on the user interface device 110, and objects for three types of command icons (for example, current location, AV (Audio Visual), and menu) are displayed on the display panel 40 (an example of a first display panel) of the pressure-sensitive controller. The user can therefore press a command icon through the pressure-sensitive controller and intuitively select the required command. By pressing a switching button (not shown) displayed in a predetermined area of the display panel 40, other command icons (for example, settings and adjustment) can be displayed instead. The user can thus intuitively and easily select the desired command icon from among many command icons.
As shown in (2) of FIG. 17, when the user pushes a finger into the touch panel 20 of the pressure-sensitive controller, an instruction icon is displayed on the user interface device 110. Then, as shown in (3) of FIG. 17, when the user moves the finger, the instruction icon moves and is displayed in accordance with the movement direction of the finger. While moving the finger, the user does not need to keep pushing it into the touch panel 20.
Then, as shown in (4) of FIG. 17, the user operates the touch panel 20 of the pressure-sensitive controller to select, with the instruction icon, an enlarge icon displayed on the user interface device 110. When the user then pushes a finger into the touch panel 20 of the pressure-sensitive controller, the selection of the enlarge icon is confirmed, and the map is displayed enlarged on the user interface device 110.
<Gesture operation of the seventh embodiment>
FIG. 18 is a flowchart showing an operation example of the navigation device 1A according to the seventh embodiment. In the seventh embodiment shown in FIG. 18, a desired object can be displayed on the user interface device 110 with a simple operation using one finger. That is, when the touch panel 20 is pushed in by a finger, the control unit 10 according to the seventh embodiment outputs, to the navigation device body 2, control information for causing the user interface device 110 to display an instruction icon for instructing selection of a menu icon displayed on the display panel 111 of the user interface device 110. Then, when the touch panel 20 is pushed in again by the finger while the instruction icon is displayed superimposed on a menu icon, the control unit 10 outputs, to the display control unit 19 and the navigation device body 2, control information that confirms the selection by the instruction icon. The navigation device body 2 performs predetermined processing based on the content instructed by the touch operation or gesture operation input from the control unit 10, changes the display form of the object subjected to the predetermined processing, and displays the object on the display panel 111.
In the seventh embodiment, the gesture operation is, for example, an operation in which the user pushes a finger into the touch panel 20 to display the instruction icon on the user interface device 110, moves the instruction icon in accordance with the direction in which the finger moves, and then pushes the finger into the touch panel 20 again to confirm the selection icon displayed on the user interface device 110. Changing the display form of the object in the seventh embodiment means, for example, displaying the map enlarged or reduced.
First, the control unit 10 detects that the user's finger has touched the touch panel 20 of the pressure-sensitive controller (S111). The screen attached to step S111 shows the user's finger touching the touch panel 20 of the pressure-sensitive controller and a map being displayed on the display panel 111 of the user interface device 110.
Next, the user pushes the touch panel 20 in with a finger (S112). At this time, the control unit 10 determines that the touch panel 20 has been pushed in by the finger based on the pressure-sensitive detection information that the pressure-sensitive detection unit 18 outputs upon detecting the press from the sensor value. When the touch panel 20 is pushed in by the finger, the control unit 10 outputs, through the navigation device body 2 to the display control unit of the user interface device 110, control information for displaying an instruction icon for selecting a menu icon displayed on the display panel 111 of the user interface device 110. The instruction icon is thereby displayed on the display panel 111 of the user interface device 110. The screen attached to step S112 shows that, when the user pushes the touch panel 20 in with a finger, a selection icon is displayed on the display panel 40 and the instruction icon is displayed on the user interface device 110.
Next, the user moves the finger touching the touch panel 20 to input information (S113). At this time, the control unit 10 outputs, through the navigation device body 2 to the user interface device 110, control information for moving and displaying the instruction icon displayed on the display panel 111 of the user interface device 110 in accordance with the direction in which the finger touching the touch panel 20 moves. The display control unit of the user interface device 110 then moves and displays the instruction icon on the display panel 111 according to the operation of the pressure-sensitive controller. The screen attached to step S113 shows the user moving the finger on the touch panel 20 to input information, and the instruction icon displayed on the user interface device 110 moving in accordance with the movement of the user's finger. The moving instruction icon is represented by a white arrow on the screen.
Next, the user selects the selection icon displayed on the display panel 40 of the pressure-sensitive controller and pushes it in to confirm the selection (S114). At this time, the instruction icon displayed on the display panel 111 of the user interface device 110 is displayed superimposed on a menu icon. In this state, when the touch panel 20 of the pressure-sensitive controller is pushed in again by the finger, the pressure-sensitive detection unit 18 detects the press from the sensor value and outputs pressure-sensitive detection information. The control unit 10 then determines, based on the pressure-sensitive detection information input from the pressure-sensitive detection unit 18, that the touch panel 20 has been pushed in by the finger, and can confirm the menu icon selected by the instruction icon. The screen attached to step S114 shows the user pushing the touch panel 20 in with a finger.
The control unit 10 then outputs, through the navigation device body 2 to the display control unit of the user interface device 110, control information for changing and displaying the map displayed on the display panel 111 of the user interface device 110 based on the information confirmed with the instruction icon. A map reflecting the result of the operation performed by the user through the pressure-sensitive controller is thereby displayed on the display panel 111 of the user interface device 110.
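The following is a minimal sketch of the remote controller interaction in FIG. 18: the first push-in shows the instruction icon on the remote display, an ordinary touch moves it, and a second push-in over a menu icon confirms the selection. The class and method names are hypothetical, events are assumed to arrive as discrete updates, and the pressure value is assumed to be normalized.

```python
PRESS_THRESHOLD = 0.5  # assumed normalized pressure value for a "push-in"

class RemoteController:
    def __init__(self, remote_display):
        self.remote = remote_display
        self.pointer_shown = False

    def on_event(self, x, y, pressure):
        if pressure >= PRESS_THRESHOLD:
            if not self.pointer_shown:
                self.pointer_shown = True
                self.remote.show_pointer(x, y)      # S112: first push-in shows the pointer
            else:
                self.remote.confirm_at(x, y)        # S114: second push-in confirms the icon
        elif self.pointer_shown:
            self.remote.move_pointer(x, y)          # S113: ordinary touch moves the pointer

class RemoteDisplay:
    def show_pointer(self, x, y): print(f"pointer shown at ({x}, {y})")
    def move_pointer(self, x, y): print(f"pointer -> ({x}, {y})")
    def confirm_at(self, x, y):   print(f"icon at ({x}, {y}) confirmed")

rc = RemoteController(RemoteDisplay())
rc.on_event(10, 10, 0.9)   # show pointer
rc.on_event(40, 25, 0.2)   # move pointer
rc.on_event(40, 25, 0.9)   # confirm selection
```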
The user interface device 3 according to the seventh embodiment described above is used as a pressure-sensitive controller with which the user can perform navigation operations at hand. The user can perform the selection operation and the confirmation operation with one finger, and can perform the confirmation operation for the instruction icon at the position where the instruction icon was selected. Finger movement can therefore be minimized, and operation is simple and intuitive. In addition, because the pressure-sensitive controller is placed away from the navigation device body 2, it is easy for the user to operate at hand.
Various command icons (for example, volume buttons) can be displayed on the display panel 40 of the pressure-sensitive controller, and the display magnification can be changed arbitrarily. Whereas complicated and time-consuming selection and confirmation operations conventionally had to be performed with the joystick 102 and the like, with the pressure-sensitive controller according to the seventh embodiment the user can quickly select and confirm a menu by operating the pressure-sensitive controller. Even if the number of command icons that can be displayed on one screen is limited, many command icons can be displayed and selected by switching screens. The command icons may also be displayed in cells obtained by dividing the display panel 40 of the user interface device 3 into four, for example two rows by two columns. This can prevent the user from pressing the wrong icon.
A mobile terminal or the like owned by the user may also be used as the pressure-sensitive controller. Even if the conventional user interface device 110 has no function for detecting the coordinates of a touch position, the user interface device 110 can perform ad hoc communication with the pressure-sensitive controller, so the user can operate the user interface device 110 using the mobile terminal as the pressure-sensitive controller.
The pressure-sensitive controller may also be connected to a personal computer wirelessly or by wire. The pressure-sensitive controller can then be operated at a location away from the personal computer to cause the personal computer to execute predetermined functions.
[Modification]
The user interface device 3 according to each of the embodiments described above may be applied to devices other than navigation devices. For example, operability can be improved by applying the user interface device 3 to the touch panel portion of a mobile terminal, a tablet terminal, or the like. The user interface device 3 according to the present embodiments can thus be combined with various electronic apparatuses that require a touch panel.
Although the user interface device 3 includes the control unit 10 in the block diagram shown in FIG. 2, the navigation device body 2 may include the control unit 10 instead.
The touch panel 20 may also be configured to detect touch operations by a method other than the capacitive method. The pressure-sensitive sensor 30 may also detect that the touch panel 20 has been pushed in by means of a push switch or the like provided under the touch panel 20.
Operations combining the pressure-sensitive function of the user interface device 3 may also be used for navigation for pedestrians or bicycles, in addition to car navigation. The user interface device 3 may also be used for operating image editing software, as described above, or for operating other application software.
In each of the embodiments described above, the navigation device body 2 and the user interface device 3 are combined. However, the user interface device 3 itself may have a navigation function, in which case the user interface device 3 alone may be used as the navigation device.
The pressure-sensitive sensor 30 may also be configured without the elastic body 33. For example, even if the elastic body 33 is removed from the pressure-sensitive sensor 30, when a pressing force is applied to the pressure-sensitive sensor 30 while the upper electrode 31 and the lower electrode 35 are kept apart at a constant distance, the upper electrode 31 and the lower electrode 35 approach each other and the capacitance between the upper electrode 31 and the lower electrode 35 decreases. The pressure-sensitive detection unit 18 can therefore obtain the capacitance change rate based on the sensor values output from the upper electrode 31 and the lower electrode 35.
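The following is a minimal sketch of deriving a capacitance change rate from raw sensor readings and treating it as a press when it exceeds a threshold. The reference value, the exact formula, and the threshold are assumptions for illustration; they are not specified in the passage above.

```python
def capacitance_change_rate(c_now, c_ref):
    """Relative change of the measured capacitance from its no-load reference."""
    return (c_ref - c_now) / c_ref

PRESS_RATE_THRESHOLD = 0.05  # assumed: a 5% change counts as a push-in

c_ref = 100.0   # arbitrary reference reading with no press applied
for c_now in (99.5, 92.0):
    rate = capacitance_change_rate(c_now, c_ref)
    print(f"rate={rate:.3f} pressed={rate >= PRESS_RATE_THRESHOLD}")
```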
The present invention is not limited to the embodiments described above, and various other applications and modifications can of course be made without departing from the gist of the present invention as set forth in the claims.
For example, the embodiments described above explain the configuration of the device in detail and concretely in order to make the present invention easy to understand, and the invention is not necessarily limited to configurations including all of the described elements. Part of the configuration of one embodiment described here can be replaced with the configuration of another embodiment, and the configuration of another embodiment can also be added to the configuration of one embodiment. It is also possible to add, delete, or replace other elements for part of the configuration of each embodiment.
The control lines and information lines shown are those considered necessary for explanation, and not all control lines and information lines of a product are necessarily shown. In practice, almost all components may be considered to be connected to one another.
DESCRIPTION OF REFERENCE SIGNS  1: navigation device, 2: navigation device body, 3: user interface device, 10: control unit, 17: coordinate detection unit, 18: pressure-sensitive detection unit, 19: display control unit, 20: touch panel, 30: pressure-sensitive sensor, 40: display panel

Claims (16)

  1.  A user interface device comprising:
     a display panel that displays an object;
     a display control unit that performs control to display the object on the display panel in accordance with control information;
     a touch panel on which a touch operation is performed with an indicating medium;
     a coordinate detection unit that detects coordinates of a touch position on the touch panel at which the touch operation has been performed and outputs coordinate detection information;
     a pressure-sensitive sensor that outputs a sensor value that changes in accordance with a pressing force applied to the touch panel by the indicating medium;
     a pressure-sensitive detection unit that detects, based on the sensor value, that a push-in operation in which the touch panel is pushed in by the indicating medium has been performed, and outputs pressure-sensitive detection information; and
     a control unit that outputs, to the display control unit, the control information for changing a display form of the object based on a gesture operation combining the touch operation determined based on the coordinate detection information and the push-in operation determined based on the pressure-sensitive detection information.
  2.  The user interface device according to claim 1, wherein the control unit outputs, to the display control unit, the control information for enlarging or reducing the object in accordance with a direction in which the indicating medium moves while the touch panel is pushed in by the indicating medium.
  3.  The user interface device according to claim 2, wherein the control unit outputs, to the display control unit, the control information for enlarging the object when the direction of movement while the touch panel is pushed in by the indicating medium is a first direction, and for reducing the object when the direction of movement while the touch panel is pushed in by the indicating medium is a second direction.
  4.  The user interface device according to any one of claims 1 to 3, wherein the control unit outputs, to the display control unit, the control information for enlarging or reducing the object in accordance with a length of time during which the state in which the touch panel is pushed in by the indicating medium continues.
  5.  The user interface device according to claim 4, wherein the control unit outputs, to the display control unit, the control information for enlarging the object when the length of time during which the state in which the touch panel is pushed in by the indicating medium continues is equal to or longer than a predetermined time, and for reducing the object when the length of time during which that state continues is shorter than the predetermined time.
  6.  The user interface device according to any one of claims 1 to 5, wherein the control unit outputs, to the display control unit, the control information for enlarging or reducing the object in accordance with a number of touches of the indicating medium pushing in the touch panel.
  7.  The user interface device according to claim 6, wherein the control unit outputs, to the display control unit, the control information for enlarging the object when the number of touches of the indicating medium pushing in the touch panel is a first number, and for reducing the object when the number of touches of the indicating medium pushing in the touch panel is a second number.
  8.  The user interface device according to claim 1, wherein the control unit outputs, to the display control unit, the control information for changing an input mode that defines a character type of a character input from a key of a software keyboard displayed on the display panel, when the touch panel is pushed in at a position where the key is displayed.
  9.  The user interface device according to claim 8, wherein, in the input mode in which the character of the key can be input as a first character type, the control unit changes the input mode to an input mode in which the character of the key can be input as a second character type when the touch panel is pushed in at the position where the key is displayed.
  10.  The user interface device according to any one of claims 1, 8, and 9, wherein the control unit outputs, to the display control unit, the control information for displaying a menu item related to the object at a position where the touch panel is pushed in, when the touch panel is pushed in while the object in a predetermined area is selected.
  11.  The user interface device according to claim 10, wherein, when the object is a character string, the control unit outputs, to the display control unit, the control information for displaying, as the menu item, an edit item for editing the character string.
  12.  The user interface device according to any one of claims 1 and 8 to 10, wherein the control unit outputs, to the display control unit, the control information for performing an editing operation on an operation performed immediately before, in accordance with a direction in which the indicating medium moves while the touch panel is pushed in by the indicating medium.
  13.  The user interface device according to claim 12, wherein the control unit outputs, to the display control unit, the control information for performing the editing operation of canceling or redoing the operation performed immediately before, in accordance with the direction in which the indicating medium moves while the touch panel is pushed in by the indicating medium.
  14.  An electronic apparatus comprising a user interface device and an electronic apparatus body, wherein
     the user interface device comprises:
     a display panel that displays an object;
     a display control unit that controls the display panel to display the object in accordance with control information;
     a touch panel on which a touch operation is performed by a pointing medium;
     a coordinate detection unit that detects the coordinates of the touch position on the touch panel at which the touch operation was performed and outputs coordinate detection information;
     a pressure-sensitive sensor that outputs a sensor value that changes according to the pressing force applied to the touch panel by the pointing medium;
     a pressure-sensitive detection unit that detects, based on the sensor value, that a push-in operation in which the touch panel is pushed in by the pointing medium has been performed, and outputs pressure-sensitive detection information; and
     a control unit that outputs to the display control unit the control information for changing the display form of the object, based on a gesture operation combining the touch operation determined from the coordinate detection information and the push-in operation determined from the pressure-sensitive detection information, and
     the electronic apparatus body performs a predetermined process based on the content instructed by the touch operation or the gesture operation input from the control unit, and outputs the object on which the predetermined process was performed to the control unit.
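A minimal sketch of the detection pipeline recited in claim 14: the coordinate detection and pressure-sensitive detection results are combined by the control unit into control information for the display control unit. The threshold value, gesture names, and class shape are illustrative assumptions.

```typescript
// Hypothetical sketch of the claim 14 pipeline; threshold and names are assumptions.
interface TouchSample {
  x: number;
  y: number;
  sensorValue: number; // raw output of the pressure-sensitive sensor
}

type Gesture = "touch" | "push-in";

interface ControlInfo {
  gesture: Gesture;
  x: number;
  y: number;
}

class ControlUnit {
  constructor(private pushThreshold: number) {}

  // Combines the coordinate-detection and pressure-detection results into
  // control information handed to the display control unit.
  classify(sample: TouchSample): ControlInfo {
    const gesture: Gesture =
      sample.sensorValue >= this.pushThreshold ? "push-in" : "touch";
    return { gesture, x: sample.x, y: sample.y };
  }
}

// Example: with an assumed threshold of 100, a sensor value of 140 is classified as a push-in.
const unit = new ControlUnit(100);
console.log(unit.classify({ x: 10, y: 20, sensorValue: 140 })); // push-in at (10, 20)
```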
  15.  An electronic apparatus comprising a user interface device and an electronic apparatus body, wherein
     the user interface device comprises:
     a first display panel that displays an object;
     a display control unit that controls the first display panel to display the object in accordance with control information;
     a touch panel on which a touch operation is performed by a pointing medium;
     a coordinate detection unit that detects the coordinates of the touch position on the touch panel at which the touch operation was performed;
     a pressure-sensitive sensor that outputs a sensor value that changes according to the pressing force applied to the touch panel by the pointing medium;
     a pressure-sensitive detection unit that detects, based on the sensor value, that a push-in operation in which the touch panel is pushed in by the pointing medium has been performed; and
     a control unit that outputs the control information for changing the display form of the object to the display control unit and to the electronic apparatus body, based on a gesture operation combining the touch operation and the push-in operation, and
     the electronic apparatus body comprises a second display panel that displays the object whose display form has been changed based on the control information input from the control unit of the user interface device, and
     performs a predetermined process based on the content instructed by the touch operation or the gesture operation input from the control unit, changes the display form of the object on which the predetermined process was performed, and displays the object on the second display panel.
  16.  The electronic apparatus according to claim 15, wherein, when the touch panel is pushed in by the pointing medium, the control unit outputs to the electronic apparatus body the control information for displaying an instruction icon for instructing selection of a menu icon displayed on the display panel, and, when the touch panel is pushed in again by the pointing medium while the instruction icon is displayed superimposed on the menu icon, the control unit outputs to the electronic apparatus body the control information for determining the selection of the instruction icon.
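A minimal sketch of the two-step selection in claim 16: a first push-in requests display of the instruction icon, and a second push-in while that icon overlaps a menu icon confirms the selection. The state handling, icon shapes, and rectangle hit test are assumptions made for illustration.

```typescript
// Hypothetical sketch of the claim 16 two-step menu selection; names are assumptions.
interface MenuIcon {
  id: string;
  x: number; y: number; w: number; h: number;
}

type SelectionResult =
  | { action: "show-icon" }
  | { action: "select"; iconId: string }
  | { action: "none" };

class MenuIconSelector {
  private iconVisible = false;

  pushIn(cursorX: number, cursorY: number, icons: MenuIcon[]): SelectionResult {
    if (!this.iconVisible) {
      // First push-in: ask the apparatus body to display the instruction icon.
      this.iconVisible = true;
      return { action: "show-icon" };
    }
    // Second push-in: if the instruction icon overlaps a menu icon, confirm that icon.
    const hit = icons.find(i =>
      cursorX >= i.x && cursorX <= i.x + i.w &&
      cursorY >= i.y && cursorY <= i.y + i.h);
    return hit ? { action: "select", iconId: hit.id } : { action: "none" };
  }
}

// Example: the first push-in displays the instruction icon; a second push-in over
// a "settings" menu icon confirms its selection.
const sel = new MenuIconSelector();
sel.pushIn(50, 50, []); // show-icon
console.log(sel.pushIn(55, 52, [{ id: "settings", x: 40, y: 40, w: 32, h: 32 }])); // select
```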
PCT/JP2017/041151 2016-12-27 2017-11-15 User interface device and electronic apparatus WO2018123320A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016252318A JP2018106434A (en) 2016-12-27 2016-12-27 User interface apparatus and electronic device
JP2016-252318 2016-12-27

Publications (1)

Publication Number Publication Date
WO2018123320A1 (en) 2018-07-05

Family

ID=62707252

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/041151 WO2018123320A1 (en) 2016-12-27 2017-11-15 User interface device and electronic apparatus

Country Status (2)

Country Link
JP (1) JP2018106434A (en)
WO (1) WO2018123320A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6576399B2 (en) * 2017-07-20 2019-09-18 ヤフー株式会社 Information display program, information display method, information display device, and distribution device
JP2020042417A (en) * 2018-09-07 2020-03-19 アイシン精機株式会社 Display controller

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010134895A (en) * 2008-12-08 2010-06-17 Apple Inc Selective input signal rejection and modification
JP2012527034A (en) * 2009-05-15 2012-11-01 サムスン エレクトロニクス カンパニー リミテッド Image processing method for portable terminals
JP2011022851A (en) * 2009-07-16 2011-02-03 Docomo Technology Inc Display terminal, image processing system, and image processing method
JP2012185710A (en) * 2011-03-07 2012-09-27 Kyocera Corp Electronic apparatus, control method of electronic apparatus, and program
JP2013105410A (en) * 2011-11-16 2013-05-30 Fuji Soft Inc Touch panel operation method and program

Also Published As

Publication number Publication date
JP2018106434A (en) 2018-07-05

Similar Documents

Publication Publication Date Title
US20220100368A1 (en) User interfaces for improving single-handed operation of devices
JP5295328B2 (en) User interface device capable of input by screen pad, input processing method and program
US8775966B2 (en) Electronic device and method with dual mode rear TouchPad
KR101424294B1 (en) Multi-touch uses, gestures, and implementation
US20150317054A1 (en) Method and apparatus for gesture recognition
CN106687905B (en) Tactile sensation control system and tactile sensation control method
KR20140098904A (en) Operating Method of Multi-Tasking and Electronic Device supporting the same
JP5003377B2 (en) Mark alignment method for electronic devices
WO2012160829A1 (en) Touchscreen device, touch operation input method, and program
KR20100018883A (en) Method and system for user interface on electronic device
WO2011010411A1 (en) Input control apparatus
WO2018123320A1 (en) User interface device and electronic apparatus
JP2012252652A (en) Touch panel input device
JP5461030B2 (en) Input device
CN114764304A (en) Screen display method
CN114690889A (en) Processing method of virtual keyboard and related equipment
US20110119579A1 (en) Method of turning over three-dimensional graphic object by use of touch sensitive input device
KR101678213B1 (en) An apparatus for user interface by detecting increase or decrease of touch area and method thereof
JP2018128968A (en) Input device for vehicle and control method for input device for vehicle
JP2014006748A (en) Electronic apparatus, apparatus, and method
KR101480775B1 (en) Information display apparatus and method for vehicle using touch-pad, and information input module thereof
WO2018123355A1 (en) User interface device and electronic device
KR20090106312A (en) Apparatus having two displays and user interface method thereof
US20140085340A1 (en) Method and electronic device for manipulating scale or rotation of graphic on display
KR102090443B1 (en) touch control method, apparatus, program and computer readable recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17885686

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17885686

Country of ref document: EP

Kind code of ref document: A1