WO2018123320A1 - User interface device and electronic apparatus - Google Patents

User interface device and electronic apparatus

Info

Publication number
WO2018123320A1
Authority
WO
WIPO (PCT)
Prior art keywords
control unit
touch panel
user interface
display
interface device
Prior art date
Application number
PCT/JP2017/041151
Other languages
English (en)
Japanese (ja)
Inventor
佐藤 克則
兼平 浩紀
赤間 博
優子 千田
森田 修身
佳 共
光春 菅原
Original Assignee
デクセリアルズ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by デクセリアルズ株式会社 filed Critical デクセリアルズ株式会社
Publication of WO2018123320A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023 Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • H ELECTRICITY
    • H03 ELECTRONIC CIRCUITRY
    • H03M CODING; DECODING; CODE CONVERSION IN GENERAL
    • H03M11/00 Coding in connection with keyboards or like devices, i.e. coding of the position of operated keys
    • H03M11/02 Details
    • H03M11/04 Coding of multifunction keys

Definitions

  • the present invention relates to a user interface device and an electronic device.
  • Touch panels that can be operated easily and intuitively with a finger or the like have become widespread, and various studies and developments have been made on the downsizing, thinning, weight reduction, power saving, and cost reduction of touch panels.
  • As methods for detecting the position at which an instruction medium such as a finger touches the touch panel (referred to as the "touch position"), a resistive film method that detects a change in electrical resistance, a surface acoustic wave method that uses ultrasonic waves, and a capacitance method that detects a change in capacitance are known.
  • Among these, the capacitive touch panel is attracting attention because it can detect a plurality of touch positions.
  • the capacitive touch panel includes a transparent electrode that generates a capacitance and an external circuit that detects a change in the capacitance.
  • Touch panels including a pressure-sensitive sensor capable of detecting the pressing force applied when the touch panel is pressed with a stylus pen, a finger, or the like have also come to be provided.
  • Patent Document 1 discloses a technique for determining that a pressing operation has been performed, in which input determination by pressing on the input operation surface is made based on a change in the capacitance between the upper electrode and the lower electrode of a pressure-sensitive sensor, which are displaced when the top plate and the touch panel move in the pressing direction.
  • Patent Document 2 discloses a sensor device including a pressure-sensitive sensor that does not use an elastic member.
  • Patent Document 1: JP 2011-134000 A; Patent Document 2: JP 2011-100364 A
  • Touch panels are used in various electronic devices.
  • As an electronic device using a touch panel, there is, for example, a navigation device that is mounted in a car and displays on a display panel a map, directions from the current position of the vehicle to a destination, and the like.
  • Such a navigation device is required to be simple to operate.
  • However, the touch panel attached to a conventional navigation device has only a function of detecting the XY coordinates of the touch position, so the user must perform complicated operations to obtain a desired result, which is inconvenient.
  • In the techniques disclosed in Patent Documents 1 and 2, the pressure-sensitive sensor is merely used in an auxiliary manner to reliably detect that a finger or the like has touched the touch panel. For this reason, even with a touch panel to which these techniques are applied, the operation itself is the same as that of a conventional touch panel without a pressure-sensitive function, and the inconvenience of operation is not eliminated. Moreover, operation using an external input device or the like may be required.
  • the present invention has been made in view of such a situation, and an object thereof is to make it possible to easily obtain a result requested by a user.
  • A user interface device according to the present invention includes a display panel that displays an object, a display control unit that performs control to display the object on the display panel according to control information, a touch panel on which a touch operation is performed with an instruction medium, and a control unit that determines the touch operation that has been performed.
  • An electronic apparatus includes the above-described user interface device and an electronic apparatus main body.
  • the electronic device main body performs a predetermined process based on content instructed by a touch operation or a gesture operation input from the control unit, and outputs an object on which the predetermined process has been performed to the control unit of the user interface device.
  • the user can easily obtain the result requested by the user by performing an intuitive gesture operation that combines the touch operation and the push-in operation.
  • FIG. 1 is an explanatory diagram showing a state when a driver seat and a passenger seat of a vehicle on which the navigation device 1 is mounted are viewed from the rear in the traveling direction toward the front.
  • the navigation device 1 (an example of an electronic device) is installed at a position sandwiched between the dashboard 5 and the meter panel 6 and visible to a user who holds the steering wheel 7.
  • a shift lever 8 is provided below the navigation device 1.
  • the navigation device 1 includes a navigation device body 2 that performs processing necessary for navigation, and a user interface device 3.
  • the navigation device body 2 is fitted in a recess formed in the dashboard 5 and cannot be directly recognized by the user.
  • the user interface device 3 is disposed at a position where the user can visually recognize, and outputs an instruction input by the user to the navigation device main body 2 and indicates a processing result by the navigation device main body 2 to the user.
  • the user interface device 3 is configured to have a sense of unity with the interior shape from the dashboard 5, the meter panel 6 and the shift lever 8 to the base of the windshield 4.
  • the user interface device 3 displays information (a map, various icons, etc.) necessary for navigation processing and the like output from the navigation device body 2.
  • In the following description, the instruction medium touching the touch panel 20 is assumed to be the user's finger.
  • An operation performed by the user moving a finger while touching the touch panel 20 is referred to as a "touch operation", an operation in which the user presses the touch panel 20 with a finger is referred to as a "pressing operation" (push-in operation), and an operation performed by combining the touch operation and the pressing operation is referred to as a "gesture operation".
  • By contrast, the conventional gesture operation is performed only by a touch operation.
  • FIG. 2 is a block diagram illustrating an internal configuration example of the user interface device 3.
  • The navigation device main body 2 (an example of the electronic device main body) performs predetermined processing based on the content instructed by a touch operation or a gesture operation input from the control unit 10 of the user interface device 3. Then, the navigation device body 2 outputs the object resulting from the predetermined processing to the control unit 10 of the user interface device 3. Further, the control unit 10 determines the touch operation (finger moving direction, touch position, etc.) performed on the object.
  • Objects that are output from the navigation device body 2 to the user interface device 3 include, for example, maps, character strings, icons, images, and the like.
  • For example, the navigation device body 2 performs navigation processing based on a gesture operation performed on the user interface device 3, outputs a map to be displayed by the user interface device 3, and edits a character string input from the user interface device 3.
  • the user interface device 3 includes a control unit 10, a storage medium control unit 14, a storage medium 15, a communication control unit 16, a coordinate detection unit 17, a pressure sensitive detection unit 18, and a display control unit 19 connected by a bus B.
  • the user interface device 3 also includes a touch panel 20 connected to the coordinate detection unit 17, a pressure sensor 30 connected to the pressure detection unit 18, and a display panel 40 connected to the display control unit 19.
  • the control unit 10 controls the operation of each unit in the user interface device 3.
  • the control unit 10 determines that a touch operation has been performed based on the coordinate detection information input from the coordinate detection unit 17, and outputs information input from the touch position to the navigation device body 2. Further, the control unit 10 determines that the pressing operation in which the touch panel 20 is pressed with a finger is performed based on the pressure-sensitive detection information input from the pressure-sensitive detection unit 18. In addition, the control unit 10 outputs control information for causing the display panel 40 to display the object output from the navigation device body 2 to the user interface device 3 to the display control unit 19.
  • control unit 10 outputs control information for changing the display form of the object displayed on the display panel 40 to the display control unit 19 based on a gesture operation that combines the touch operation and the push-in operation.
  • changing the display form of an object means, for example, displaying a map on an enlarged or reduced scale.
  • the control unit 10 includes a CPU 11, a RAM 12, and a ROM 13, and the CPU 11, the RAM 12, and the ROM 13 work together to realize the function of the control unit 10.
  • the CPU (Central Processing Unit) 11 is an example of a computer that controls the operation of each unit in the user interface device 3. For example, the CPU 11 executes a program read from the ROM 13 and performs processing related to the gesture operation according to the present embodiment.
  • a RAM (Random Access Memory) 12 stores temporary data such as a program executed by the CPU 11.
  • ROM (Read Only Memory) 13 stores a program read by the CPU 11 and the like.
  • the ROM 13 is used as an example of a computer-readable non-transitory recording medium that stores a program executed by the CPU 11. For this reason, this program is permanently stored in the ROM 13.
  • the computer-readable non-transitory recording medium storing the program executed by the CPU 11 may be a recording medium such as a CD-ROM or a DVD-ROM.
  • the storage medium control unit 14 controls the storage medium 15.
  • the storage medium control unit 14 writes data input from the control unit 10 to the storage medium 15, or reads data stored in the storage medium 15 according to an instruction from the control unit 10 and outputs the data to the control unit 10.
  • the storage medium 15 is inserted into a slot or the like provided in the user interface device 3 and stores data written by the storage medium control unit 14 or data is read by the storage medium control unit 14.
  • the storage medium 15 stores a navigation program that operates in the navigation apparatus main body 2, upgrade data of a character editing program, and the like.
  • the communication control unit 16 controls data communication processing performed through the network N between the navigation device body 2 and the user interface device 3.
  • the coordinate detection unit 17 detects the coordinates of the touch position of the touch panel 20 where the touch operation is performed.
  • The touch panel 20 is formed in a planar rectangular shape; the position of each intersection of the mutually crossing X electrodes and Y electrodes serves as coordinate information, and a value corresponding to the change in capacitance at that position is output to the coordinate detection unit 17.
  • the coordinate detection unit 17 detects the coordinate of the location where the coordinate information input from the touch panel 20 has changed as a finger touch position, and outputs coordinate detection information including the coordinate of the touch position to the control unit 10.
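  • The following is a minimal Python sketch of how a coordinate detection unit such as the coordinate detection unit 17 might derive touch positions from per-intersection capacitance changes. The grid layout, the threshold value, and the function name are illustrative assumptions; the text only states that the coordinate of the location where the capacitance changed is detected as the touch position.

    # Sketch only: the detection threshold and data layout are assumptions.
    def detect_touch_positions(delta_c, threshold=0.5):
        """Return (x, y) intersections whose capacitance change exceeds the threshold.

        delta_c: 2D list indexed as delta_c[y][x], holding the change in
        capacitance at each intersection of an X electrode and a Y electrode.
        """
        positions = []
        for y, row in enumerate(delta_c):
            for x, change in enumerate(row):
                if change >= threshold:
                    positions.append((x, y))
        return positions

    # Example: a single touch near the intersection x=2, y=1.
    grid = [
        [0.0, 0.1, 0.2, 0.0],
        [0.0, 0.2, 0.9, 0.1],
        [0.0, 0.0, 0.1, 0.0],
    ]
    print(detect_touch_positions(grid))  # [(2, 1)]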
  • the pressure sensitive detection unit 18 detects that the touch panel 20 is pressed with a finger and the pressure sensitive sensor 30 senses pressure.
  • The pressure-sensitive sensor 30 is provided on the back surface of the touch panel 20 and outputs to the pressure-sensitive detection unit 18 a sensor value that changes according to the pressing force applied to the touch panel 20 by a finger. Based on the sensor value input from the pressure-sensitive sensor 30, the pressure-sensitive detection unit 18 detects that a pressing operation pressing the touch panel 20 with a finger has been performed, and outputs pressure-sensitive detection information to the control unit 10.
  • the display control unit 19 performs control to display objects such as icons and maps necessary for navigation on the display panel 40 formed in a planar rectangular shape according to the control information input from the control unit 10.
  • FIG. 3 is a schematic configuration diagram of the user interface device 3.
  • the position of the pressure-sensitive sensor 30 installed on the touch panel 20 as viewed from above and the position of the frame of the housing 55 are indicated by broken lines.
  • Six pressure-sensitive sensors 30 are provided on the back surface of the touch panel 20, along the frame of the housing 55 of the user interface device 3 that contains the touch panel 20. Therefore, the pressure-sensitive sensors 30 are not directly visible to the user.
  • A plurality of pressure-sensitive sensors 30 are usually provided in this way, but a single pressure-sensitive sensor 30 may be used instead.
  • the top plate 50 protects the surface of the touch panel 20.
  • a transparent glass substrate, a film, or the like is used for the top plate 50.
  • the surface of the top plate 50 is an input operation surface 51 on which a user touches a finger for performing a touch operation.
  • the touch panel 20 is configured, for example, by laminating a transparent X electrode substrate 21, an adhesive layer 22, and a Y electrode substrate 23 in this order.
  • the top plate 50 and the X electrode substrate 21 are bonded and fixed by an adhesive layer 52.
  • Each of the X electrode substrate 21 and the Y electrode substrate 23 has a rectangular shape.
  • the X electrode substrate 21 and the Y electrode substrate 23 are bonded by the adhesive layer 22.
  • An area where the X direction detection electrode (not shown) formed on the X electrode substrate 21 and the Y direction detection electrode (not shown) formed on the Y electrode substrate 23 overlap each other in a plane is a coordinate detection area on the XY plane.
  • the pressure sensitive sensor 30 is disposed in a peripheral area (frame) outside the coordinate detection area on the XY plane of the touch panel 20.
  • The pressure-sensitive sensor 30 includes an elastic body 33 made of a dielectric material disposed between the touch panel 20 and the housing 55, and an upper electrode 31 and a lower electrode 35 that are disposed so as to sandwich the elastic body 33 and form a capacitor.
  • the pressure-sensitive sensor 30 further includes an adhesive layer 32 that bonds and fixes the elastic body 33 and the upper electrode 31, and an adhesive layer 34 that bonds and fixes the elastic body 33 and the lower electrode 35.
  • The elastic bodies constituting the six pressure-sensitive sensors 30 are connected to form one frame-shaped elastic body 33, so that the six pressure-sensitive sensors 30 share one elastic body 33.
  • By providing the elastic body 33 in a frame shape, foreign matter such as dust can be prevented from entering the gap 41 between the touch panel 20 and the housing 55, that is, between the touch panel 20 and the display panel 40.
  • the lower electrodes 35 constituting the six pressure sensors 30 are connected to constitute one frame-like lower electrode 35, and the six pressure sensors 30 share one lower electrode 35.
  • the upper electrode 31 may also be formed in a frame shape like the lower electrode 35.
  • For the elastic body 33, a material having small residual strain and a high restoration rate (restoration speed) is used, for example.
  • Examples of the material used for the elastic body 33 include silicone rubber and urethane rubber.
  • The elastic body 33 may be displaced by at most about 10% of its original height, for example.
  • For example, the elastic body 33 having a thickness of 0.5 mm used for the pressure-sensitive sensor 30 may be displaced by about 10 μm.
  • When the input operation surface 51 is pressed with a finger, the top plate 50 and the touch panel 20, to which the pressure-sensitive sensor 30 is bonded and fixed, move in the pressing direction, and the elastic body 33 of the pressure-sensitive sensor 30 is distorted.
  • That is, when the pressure-sensitive sensor 30 is pressed, its thickness is displaced in the pressing direction.
  • Since the back surface of the touch panel 20 approaches the front surface of the display panel 40 by the amount of displacement of the pressure-sensitive sensor 30, the gap 41 is provided between the touch panel 20 and the display panel 40 in consideration of this movement of the touch panel 20.
  • the gap 41 is not provided when the touch panel 20 and the display panel 40 are bonded.
  • the pressure-sensitive sensor 30 outputs a sensor value corresponding to the capacitance of the capacitor formed by the upper electrode 31 and the lower electrode 35 to the pressure-sensitive detection unit 18.
  • FIG. 4 is a diagram for explaining the operating principle of the pressure-sensitive sensor 30.
  • the description of the touch panel 20 is omitted.
  • An example of the pressure-sensitive sensor 30 that is not pressed with a finger is shown in the upper left of FIG. 4, and an example of the pressure-sensitive sensor 30 that is pressed with a finger is shown in the upper right of FIG.
  • When the touch panel 20 is pressed, the elastic body 33 is distorted so that its thickness decreases.
  • Because the thickness of the pressed pressure-sensitive sensor 30 changes relative to the thickness of the unpressed pressure-sensitive sensor 30, the capacitance of the pressure-sensitive sensor 30 changes.
  • Using the rate of change in the capacitance between the upper electrode 31 and the lower electrode 35 caused by the displacement d of the elastic body 33, the pressure-sensitive detection unit 18 detects that the touch panel 20 has been pressed and that the pressure-sensitive sensor 30 has sensed pressure.
  • The capacitance change rate is obtained by the pressure-sensitive detection unit 18 based on the sensor values output from the upper electrode 31 and the lower electrode 35.
  • The sensor value is a voltage determined by the capacitance between the upper electrode 31 and the lower electrode 35. That is, the capacitance change rate is obtained as the percentage change of the capacitance between the upper electrode 31 and the lower electrode 35 of the pressed pressure-sensitive sensor 30 relative to the capacitance between those electrodes when the pressure-sensitive sensor 30 is not pressed.
  • Thus, the pressure-sensitive detection unit 18 can obtain the change in capacitance between the upper electrode 31 and the lower electrode 35, that is, the capacitance change rate, based on the sensor value input from the pressure-sensitive sensor 30.
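  • As a rough worked example (under a parallel-plate approximation, which is an assumption; the text above only states that the capacitance between the upper electrode 31 and the lower electrode 35 changes with the displacement of the elastic body 33): with electrode area A, permittivity ε, unpressed elastic body thickness t, and displacement d, the capacitance is C = εA/t when not pressed and C' = εA/(t - d) when pressed, so the capacitance change rate is (C' - C)/C × 100% = d/(t - d) × 100%. Under this approximation, a change rate of 2.0% corresponds to a displacement of roughly 2% of the elastic body's thickness, on the order of 10 μm for the 0.5 mm thick elastic body mentioned above.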
  • The pressure-sensitive detection unit 18 converts the sensor value detected by each pressure-sensitive sensor 30 disposed on the back surface of the touch panel 20 into the pressing force applied when the touch panel 20 is pressed, based on the graph at the bottom of FIG. 4 showing the relationship between the capacitance change rate and the pressing force. Then, when the pressing force exceeds a pressing threshold, the pressure-sensitive detection unit 18 determines that the user has consciously pressed the touch panel 20 and outputs pressure-sensitive detection information to the control unit 10.
  • The pressure-sensitive detection unit 18 may obtain the pressing force based on the total of the capacitance change rates detected by the individual pressure-sensitive sensors 30. This makes it possible to detect the pressing force with high accuracy regardless of the touch position on the input operation surface 51. Note that the pressure-sensitive detection unit 18 may instead obtain the pressing force from the average value obtained by dividing the total of the capacitance change rates by the number of pressure-sensitive sensors 30.
  • The pressure-sensitive detection unit 18 may output pressure-sensitive detection information that differs depending on the magnitude of the pressing force to the control unit 10. For example, in the lower graph of FIG. 4, the pressing force at which the capacitance change rate is 2.0% is defined as a threshold th1 (an example of the first pressing threshold), and the pressing force at which the capacitance change rate is 6.0% is defined as a threshold th2 (an example of the second pressing threshold). If the pressing force is less than the threshold th1, it is considered that the capacitance has changed only because of vibration applied to the user interface device 3 through the housing 55, that is, that the user has not intentionally pushed a finger into the touch panel 20.
  • In this case, the pressure-sensitive detection unit 18 does not determine that the user has pushed a finger into the touch panel 20 and therefore does not output pressure-sensitive detection information. If the pressing force is equal to or greater than the threshold th1, the pressure-sensitive detection unit 18 determines that the user has pushed a finger into the touch panel 20 and outputs first pressure-sensitive detection information. Furthermore, if the pressing force is equal to or greater than the threshold th2, the pressure-sensitive detection unit 18 determines that the user has pushed the finger strongly into the touch panel 20 and outputs second pressure-sensitive detection information.
  • In this way, based on the pressing force obtained from the capacitance change rate, which changes according to how far the user pushes a finger into the touch panel 20, the pressure-sensitive detection unit 18 can output the first or the second pressure-sensitive detection information to the control unit 10. As a result, the control unit 10 can perform different processes depending on whether the first or the second pressure-sensitive detection information is input from the pressure-sensitive detection unit 18. Note that only one of the first and second pressure-sensitive detection information may be used. When the first and second pressure-sensitive detection information are not distinguished, they are simply referred to as "pressure-sensitive detection information".
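  • A minimal Python sketch of the two-threshold decision described above. The 2.0% and 6.0% values are taken from the text; applying the thresholds directly to the averaged capacitance change rate (rather than to a converted pressing force), and the function name and return labels, are illustrative assumptions.

    # Sketch of the pressure-sensitive detection described above (assumptions noted).
    TH1 = 2.0  # first pressing threshold, expressed here as % capacitance change
    TH2 = 6.0  # second pressing threshold, expressed here as % capacitance change

    def classify_press(change_rates_percent):
        """change_rates_percent: one capacitance change rate per pressure-sensitive sensor."""
        avg = sum(change_rates_percent) / len(change_rates_percent)
        if avg < TH1:
            return None                          # vibration etc.: no detection information output
        if avg < TH2:
            return "first_pressure_detection"    # finger pushed into the touch panel
        return "second_pressure_detection"       # finger pushed in strongly

    print(classify_press([1.0, 1.5, 0.8, 1.2, 0.9, 1.1]))  # None
    print(classify_press([2.5, 3.0, 2.2, 2.8, 2.4, 2.6]))  # first_pressure_detection
    print(classify_press([7.0, 6.5, 6.8, 7.2, 6.9, 7.1]))  # second_pressure_detection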
  • When the pressure-sensitive detection information is input from the pressure-sensitive detection unit 18, the control unit 10 determines that the finger has been pushed into the touch panel 20 and that a determination operation by the user has been performed. At this time, the control unit 10 vibrates the housing 55 or displays a message on the display panel 40 in order to inform the user of the pushed-in state. A voice guide may also be emitted from a speaker (not shown). Alternatively, the control unit 10 can display an icon such as a circle or a square at the pressed touch position. In this case, the icon may be blinked, or its display color may be changed.
  • The housing 55 can also generate a click sound or change the feel of the finger when the touch panel 20 is pushed in.
  • In order to explain the specific contents of the gesture operations, each step of the conventional flowchart shown in FIG. 5 and of the flowchart according to the first embodiment of the present invention shown in FIG. 6 is accompanied by a display example of the screen shown on the display panel of the user interface device 100 or 3.
  • In these display examples, the position of the user's finger on the touch panel is also shown.
  • The screen added to each step of FIG. 5 and FIG. 6 shows, within a rectangular broken-line frame, the map screen of the user interface device 100 or 3 at a certain moment together with a front view, a bottom view, and a left side view of the finger.
  • a thin white arrow in the screen indicates the direction of finger movement.
  • FIG. 5 is a flowchart showing an example of a conventional enlargement or reduction operation.
  • a control unit (not shown) provided in the conventional user interface device 100 detects that a finger touches the touch panel provided in the user interface device 100 (S1).
  • the control unit of the conventional user interface device 100 is simply referred to as a “control unit” without reference numeral.
  • the user selects an operation that changes depending on the number of fingers on the touch panel (S2).
  • For example, an operation performed with one finger (such as a swipe) or an operation performed with two fingers (such as a pinch) is selected.
  • the control unit detects the number of touches of the finger touching the touch panel (S3).
  • When the number of touching fingers is one (one in S3), the control unit outputs to a display control unit (not shown) control information for moving and displaying the map in accordance with the direction in which the finger touching the touch panel moves (S4).
  • the display control unit of the conventional user interface device 100 is simply referred to as a “display control unit” without reference numerals.
  • The screen added in step S4 shows a state in which the user moves one finger from the lower left to the upper right while touching the screen, and the map is moved and displayed in accordance with the moving direction of that finger.
  • When the number of touching fingers is two, the screen added in step S5 shows a state in which the user touches the vicinity of the center of the screen with two fingers.
  • When a pinch-out that increases the distance between the touch positions is performed ("enlarge" in S5), the control unit outputs control information for enlarging the map to the display control unit (S6).
  • the screen added in step S6 shows a state in which the user enlarges the map by widening the interval between the two fingers (pinch out) while touching the two fingers on the screen.
  • step S5 when a pinch-in that reduces the distance between touch positions is performed in step S5 (reduction in S5), the control unit outputs control information for reducing the map to the display control unit (S7).
  • the screen added in step S7 shows how the map is reduced and displayed by narrowing the interval between the two fingers (pinch in) while the user touches the screen with two fingers.
  • the display control unit displays the screen requested by the user on the display panel in accordance with the control information input from the control unit (S8), and ends this process.
  • As described above, the map has conventionally been enlarged or reduced using "touch panel touch detection" and "XY coordinate movement detection".
  • In order to enlarge or reduce the map, the user has had no choice but to pinch out or pinch in with two fingers, combine two different gestures, or press a display magnification change button.
  • FIG. 6 is a flowchart illustrating an example of the enlargement or reduction operation according to the first embodiment.
  • In the first embodiment, the map is enlarged or reduced by a simple operation using one finger. That is, the control unit 10 according to the first embodiment can output to the display control unit 19 control information for enlarging or reducing the map in accordance with the direction in which the finger moves while it is pushed into the touch panel 20.
  • the gesture operation is, for example, an operation of moving a finger pressed by the user into the touch panel 20 upward or downward.
  • When the user touches the touch panel 20 with a finger, the coordinate detection unit 17 detects that the touch has been made (S11) and detects the coordinates of the touch position.
  • the screen added in step S11 shows a state in which the user touches the lower left of the screen with one finger.
  • The screen added in step S12 shows a state in which the user touches the upper left of the screen with one finger and further pushes the touch panel 20 in. The pushing of the touch panel 20 by the finger is indicated by a downward arrow representing the movement direction of the finger with respect to the touch panel 20.
  • Next, the pressure-sensitive detection unit 18 determines whether or not the pressure-sensitive sensor 30 has detected pressure (S13).
  • If pressure is not detected (NO in S13), the control unit 10 outputs to the display control unit 19 control information for moving and displaying the map in accordance with the direction in which the finger touching the touch panel 20 moves (S14).
  • The screen added in step S14 shows a state in which the map is moved and displayed in accordance with the moving direction of one finger moved, as in the conventional operation, from the lower left to the upper right of the screen while touching it.
  • If pressure is detected (YES in S13), the control unit 10 determines, based on the coordinate detection information input from the coordinate detection unit 17, whether the moving direction of the finger moving while pushed in is up or down with respect to the touch panel 20 (S15).
  • When the direction in which the finger moves on the touch panel 20 is up (an example of the first direction) ("up" in S15), the control unit 10 outputs to the display control unit 19 control information for enlarging the map displayed on the display panel 40 (S16). The map is continuously enlarged as the moving distance of the finger pushed into the touch panel 20 increases.
  • the screen added in step S16 shows a state in which the map is enlarged by the user moving his / her finger up while pressing the finger on the touch panel 20.
  • When the direction in which the finger moves on the touch panel 20 is down (an example of the second direction) ("down" in S15), the control unit 10 outputs to the display control unit 19 control information for reducing the map displayed on the display panel 40 (S17).
  • The map is continuously reduced as the moving distance of the finger pushed into the touch panel 20 increases.
  • The screen added in step S17 shows a state in which the map is reduced by the user moving the finger down while pushing it into the touch panel 20.
  • the display control unit 19 displays the screen requested by the user on the display panel 40 in accordance with the control information input from the control unit 10 (S18), and ends this process.
  • In the first embodiment described above, the map displayed on the display panel 40 can be enlarged or reduced to an arbitrary magnification by a simple gesture operation using one finger. For this reason, an operation that conventionally required two fingers or a combination of a plurality of gestures can be simplified.
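  • A minimal Python sketch of the branching in FIG. 6 (S13 to S17). The event representation, the zoom step per unit of finger movement, and the function name are illustrative assumptions; only the branching itself (scroll the map when the panel is not pushed in, enlarge on upward movement and reduce on downward movement while pushed in) follows the flowchart.

    # Sketch only: dy is the finger movement, taken here as positive when the
    # finger moves up the screen; the 1% scale change per unit moved is an
    # assumed value, not something specified in the text.
    def handle_finger_move(pressed_in, dx, dy, state):
        """state holds 'scale' (display magnification) and 'center' (map center)."""
        if not pressed_in:
            # S14: ordinary touch operation -> scroll the map with the finger.
            cx, cy = state["center"]
            state["center"] = (cx - dx, cy - dy)
        elif dy > 0:
            # S16: finger moved up while pushed in -> enlarge continuously
            # as the moving distance grows.
            state["scale"] *= 1.0 + 0.01 * dy
        elif dy < 0:
            # S17: finger moved down while pushed in -> reduce continuously.
            state["scale"] /= 1.0 + 0.01 * (-dy)
        return state

    state = {"scale": 1.0, "center": (0, 0)}
    state = handle_finger_move(False, 5, 3, state)    # scroll the map
    state = handle_finger_move(True, 0, 40, state)    # push in and move up: enlarge
    state = handle_finger_move(True, 0, -40, state)   # push in and move down: reduce
    print(state)                                      # scale returns to 1.0, center (-5, -3)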
  • In addition, the user can shift automatically from a location search mode operation (an operation of moving the finger in the XY directions after designating one coordinate), which changes the coordinates of the map, to a magnification change mode operation that enlarges or reduces the map. That is, the user can enlarge or reduce the map simply by moving the finger pushed into the touch panel 20 up or down, and can therefore realize the intended function with only one finger.
  • In this way, the user can continuously perform, with one finger, an operation of moving and displaying an object such as a map or a photograph and an operation of changing its display magnification.
  • the navigation device body 2 may output control information to the user interface device 3 so that the user cannot operate using the user interface device 3 while the vehicle is traveling. Conversely, control information that allows the user to operate using the user interface device 3 while the vehicle is stopped may be output to the user interface device 3.
  • Note that the map may instead be reduced when the moving direction of the pushed-in finger is up and enlarged when it is down. Also, for example, the map may be enlarged when the pushed-in finger moves right and reduced when it moves left, or conversely enlarged when the finger moves left and reduced when it moves right.
  • In order to return a map that has been enlarged or reduced to its original magnification, for example, an operation of immediately releasing the finger that the user has pushed into the touch panel 20, or an operation of drawing an arbitrary mark (a circle, a square, etc.) on the touch panel 20 with the finger, may be used as a reset operation.
  • In image editing software, image browsing software, and the like, enlargement or reduction of an image may also be realized using the user interface device 3 according to the first embodiment.
  • the gesture operation according to the second embodiment is also a gesture operation performed to enlarge or reduce the map.
  • Conventionally, a display magnification change button for instructing enlargement or reduction is displayed at a predetermined position on the screen.
  • In that case, in order to enlarge or reduce the map, the user must always check the position where the display magnification change button is displayed.
  • FIG. 7 is a flowchart illustrating an example of an enlargement or reduction operation according to the second embodiment.
  • the control unit 10 according to the second embodiment outputs control information for enlarging or reducing the map to the display control unit 19 according to the time during which the state in which the touch panel 20 is pressed by the finger continues.
  • the gesture operation is, for example, an operation in which the user pushes the touch panel 20 with one finger.
  • When the user touches the touch panel 20 with a finger, the coordinate detection unit 17 detects that the touch has been made (S21) and detects the coordinates of the touch position.
  • The screen added here shows a state in which the map is moved and displayed in accordance with the moving direction of one finger moved from the lower left to the upper right of the screen.
  • Next, the user selects an operation of pushing a finger into the touch panel 20 (S22).
  • the screen added in step S22 shows a state in which the user touches the left side of the screen with one finger and further presses the touch panel 20.
  • the pressure-sensitive detection unit 18 determines whether or not the pressure-sensitive sensor 30 has detected pressure based on the sensor value output from the pressure-sensitive sensor 30 (S23).
  • When the pressure-sensitive detection unit 18 determines that pressure has been detected based on the sensor value (YES in S23), the touch operation is being performed with the finger pushed in.
  • In this case, the control unit 10 determines whether the pressure-sensitive state is maintained (S24).
  • the state in which the pressure-sensitive detection unit 18 continues to output pressure-sensitive detection information to the control unit 10 when the touch panel 20 is pushed with a finger is expressed as “the pressure-sensitive state is maintained”.
  • the control unit 10 determines whether or not the pressure-sensitive state is maintained based on the time that the pressure-sensitive detection information output from the pressure-sensitive detection unit 18 continues. At this time, the control unit 10 determines whether or not the time during which the state in which the touch panel 20 is pressed by the finger continues is equal to or longer than a predetermined time (for example, 1 second).
  • The control unit 10 determines that the pressure-sensitive state is maintained if the time during which the touch panel 20 remains pushed in with the finger is equal to or longer than the predetermined time (YES in S24). Furthermore, the control unit 10 determines whether or not a determination time (for example, 2 seconds) has elapsed while the pressure-sensitive state is maintained (S25). The determination in step S25 is made by the control unit 10 once per determination time.
  • If the determination time has elapsed (YES in S25), the control unit 10 outputs control information for enlarging the map displayed on the display panel 40 to the display control unit 19 (S26).
  • the screen added in step S26 shows a state where the map is enlarged and displayed as the display magnification increases as the time when the user presses the finger on the touch panel 20 becomes longer.
  • the display control unit 19 displays the screen requested by the user on the display panel 40 in accordance with the control information input from the control unit 10 (S27).
  • After step S27, the control unit 10 returns to step S25 and continues processing. In this way, since the processing of steps S25 to S27 is repeated while the user keeps the finger pushed into the touch panel 20, the map is enlarged step by step every determination time (for example, every 2 seconds).
  • If the determination time has not elapsed while the pressure-sensitive state is maintained (NO in S25), the control unit 10 ends this process. Therefore, if the control unit 10 has passed through steps S26 and S27, the map remains displayed in its enlarged state.
  • On the other hand, the control unit 10 determines that the pressure-sensitive state is not maintained if the time during which the touch panel 20 remains pushed in with the finger is shorter than the predetermined time (NO in S24), and outputs control information for reducing the map displayed on the display panel 40 to the display control unit 19 (S28).
  • the screen added to step S28 shows a state where the map is reduced and displayed at a reduced display magnification.
  • When it is determined in step S23 that the pressure-sensitive detection unit 18 has not detected pressure based on the sensor value (NO in S23), the control unit 10 outputs to the display control unit 19 control information for moving and displaying the map in accordance with the moving direction of the finger touching the touch panel 20 (S29).
  • the display control unit 19 displays the screen requested by the user on the display panel 40 in accordance with the control information input from the control unit 10 (S30), and ends this process.
  • In the second embodiment described above, the map is enlarged and displayed while the touch panel 20 is kept pushed in with a finger, and the map is reduced and displayed when the finger pushed into the touch panel 20 is released immediately.
  • For example, in order to display the map shown on the display panel 40 in more detail or over a wider area, pressing and holding a point on the screen enlarges the map at a predetermined magnification every 2 seconds, while pressing a point on the screen and releasing it immediately reduces the map at a predetermined magnification. Since this gesture operation is performed with one finger, the conventional gesture operation using two fingers becomes unnecessary.
  • the pressure-sensitive detection unit 18 can detect that the touch panel 20 has been pushed at any position on the touch panel 20. Therefore, the user does not need to search for a conventional magnification change button, and can easily enlarge or reduce the map.
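  • A minimal Python sketch of the second-embodiment behaviour (FIG. 7): a push that is held enlarges the map once per determination time, while a push that is released before the predetermined time reduces it. The 1-second and 2-second values come from the text; the magnification step and the function name are illustrative assumptions.

    HOLD_TIME = 1.0           # s: predetermined time for the pressure-sensitive state to count as maintained
    DETERMINATION_TIME = 2.0  # s: the map is enlarged once per this interval while held (S25-S27)
    STEP = 1.25               # assumed magnification step per interval

    def zoom_after_press(press_duration_s, scale):
        """Return the display magnification after a push lasting press_duration_s seconds."""
        if press_duration_s < HOLD_TIME:
            # S28: pressure-sensitive state not maintained -> reduce the map once.
            return scale / STEP
        # S25-S27: enlarge once each time the determination time elapses while held.
        elapsed_intervals = int(press_duration_s // DETERMINATION_TIME)
        return scale * (STEP ** elapsed_intervals)

    print(zoom_after_press(0.4, 1.0))  # brief push: reduced to 0.8
    print(zoom_after_press(6.5, 1.0))  # held for 6.5 s: enlarged three times, about 1.95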
  • Note that, conversely, the map may be reduced and displayed while the touch panel 20 is kept pushed in with a finger and the pressure-sensitive state is maintained, and the map may be enlarged and displayed when the finger is released immediately.
  • Further, when the user moves the finger pushed into the touch panel 20, the map may be moved and displayed in accordance with the moving direction of the finger while being enlarged or reduced.
  • Alternatively, a plurality of layers may be switched and displayed.
  • the gesture operation according to the third embodiment is performed in order to enlarge or reduce the map according to the number of fingers that perform the touch operation with the finger pressed.
  • Conventionally, the map has had to be enlarged or reduced by a gesture operation generally called pinch-out or pinch-in, which widens or narrows the interval between two fingers touching the screen.
  • In the third embodiment, the map can be enlarged or reduced without performing pinch-out or pinch-in operations with two fingers.
  • FIG. 8 is a flowchart illustrating an example of the enlargement or reduction operation according to the third embodiment.
  • the map can be enlarged or reduced by a simple operation using one or two fingers. That is, the control unit 10 according to the third embodiment outputs control information for enlarging or reducing the map to the display control unit 19 in accordance with the number of fingers that press the touch panel 20.
  • the gesture operation is, for example, an operation in which the user pushes the touch panel 20 with one or two fingers. Note that the processing in steps S31 to S34 in FIG. 8 is the same as the processing in steps S21 to S23 and S29 in FIG. 7 in the second embodiment described above, and detailed description thereof is omitted.
  • When it is determined in step S33 that the pressure-sensitive detection unit 18 has detected pressure based on the sensor value output from the pressure-sensitive sensor 30 (YES in S33), the control unit 10 detects the number of fingers touching the touch panel 20 based on the coordinate detection information input from the coordinate detection unit 17 (S35). When it is determined that the number of fingers pushing the touch panel 20 in is one (an example of the first number) ("one" in S35), the control unit 10 outputs control information for enlarging the map displayed on the display panel 40 to the display control unit 19 (S36). The screen added in step S36 shows a state in which the map is enlarged by the user pushing the touch panel 20 in with one finger.
  • When it is determined in step S35 that the number of fingers pushing the touch panel 20 in is two (an example of the second number) ("two" in S35), the control unit 10 outputs control information for reducing the map displayed on the display panel 40 to the display control unit 19 (S37).
  • The screen added in step S37 shows a state in which the map is reduced by the user pushing the touch panel 20 in with two fingers.
  • the display control unit 19 displays the screen requested by the user on the display panel 40 in accordance with the control information input from the control unit 10 (S38), and ends this process.
  • In the third embodiment described above, the map can be enlarged or reduced according to the number of fingers pushing the touch panel 20 in, which realizes a simpler and more intuitive operation than before. This eliminates the need for the conventional gesture operation of widening or narrowing two fingers using "touch panel touch detection" and "XY coordinate movement detection".
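  • A minimal Python sketch of the branching in FIG. 8 (S33, S35 to S37). The magnification factor and the function name are illustrative assumptions; only the branching itself (one pushing finger enlarges, two pushing fingers reduce, no push leaves the magnification unchanged) follows the flowchart.

    def zoom_by_finger_count(pressed_in, finger_count, scale, factor=1.5):
        if not pressed_in:
            return scale          # NO in S33: ordinary touch, magnification unchanged
        if finger_count == 1:     # an example of the first number -> enlarge (S36)
            return scale * factor
        if finger_count == 2:     # an example of the second number -> reduce (S37)
            return scale / factor
        return scale              # three or more fingers could select other operations

    print(zoom_by_finger_count(True, 1, 1.0))   # 1.5
    print(zoom_by_finger_count(True, 2, 1.0))   # about 0.67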
  • the map may be reduced when the number of touches of the finger pressing the touch panel 20 is one, and the map may be enlarged when the number of finger touches is two. Further, even when the number of touched fingers is three or more, various operations may be selected according to the number of touched fingers. For this reason, for example, the first number may be the number of touches of two fingers, and the second number may be the number of touches of one or three fingers. Further, when the user moves the finger pressed into the touch panel 20 in the same direction, the map may be moved and displayed in accordance with the moving direction of the finger while enlarging or reducing the map.
  • the volume of the speaker may be increased when the number of finger touches that press the touch panel 20 is 1, and the volume of the speaker may be decreased when the number of finger touches is two.
  • the speaker volume may be decreased when the number of finger touches is one and the speaker volume may be increased when the number of finger touches is two.
  • the contrast of the screen may be increased or decreased by a gesture operation in the third embodiment. Further, the head-up (displaying the traveling direction upward) or the north-up (displaying north upward) of the own vehicle icon displayed during car navigation may be switched.
  • the user can input characters from the software keyboard displayed on the display panel using a conventional touch panel.
  • the conventional touch panel is a touch panel that can detect only the XY coordinates of the touch position.
  • With the conventional touch panel, when the user switches from lowercase to uppercase while entering characters, it is necessary to switch the character type by pressing the shift key or a character type switching key displayed on the software keyboard before inputting the character. For this reason, the character type sometimes has to be switched frequently, and the shift key or the character type switching key has to be pressed many times, resulting in poor operability.
  • FIG. 9 is a flowchart showing an example of a conventional character type switching operation.
  • a user inputs a lowercase or uppercase alphabet using a software keyboard displayed on a conventional display panel.
  • the user selects the character type of the software keyboard displayed on the display panel of the conventional user interface device 100 (S41). Since the input mode of a character input key (hereinafter abbreviated as “key”) provided in the software keyboard is a lowercase input mode, lowercase characters are displayed on the key. Therefore, in this state, the user can input a lowercase letter by touching the displayed key.
  • the user selects either uppercase or lowercase character types (S42).
  • If the character type selected by the user is not uppercase (lowercase in S42), the user touches the key of the character to be input without switching the character type (S43), and the lowercase character of that key is input.
  • When the user lifts the finger from the key, the lowercase letter at the touch position is confirmed (S44). Then, the confirmed lowercase letter is displayed on the display panel, and this process ends.
  • step S42 if the character type selected by the user in step S42 is uppercase (uppercase in S42), the user touches the shift key on the software keyboard (S45).
  • the screen added in step S45 shows a state in which the user touches the shift key (key represented by the upward arrow) at the lower left of the software keyboard.
  • step S46 the user removes his / her finger from the shift key to determine the capital letter input mode (S46).
  • the screen added in step S46 shows a state where the user lifts his / her finger from the shift key of the software keyboard.
  • the key input mode is changed from the lowercase input mode to the uppercase input mode.
  • Uppercase letters are displayed on the keys, and the user can input uppercase letters.
  • The screen added in step S47 shows a state in which the user touches a key on the software keyboard (S47).
  • When the user lifts the finger from the key, the capital letter at the touch position is confirmed (S48).
  • the screen added in step S48 shows a state where the user lifts his / her finger from the software keyboard. Then, this fixed uppercase letter is displayed on the display panel, and the present process ends.
  • FIG. 10 is a flowchart illustrating an example of a character type switching operation according to the fourth embodiment.
  • In the fourth embodiment, the character type can be switched by a simple operation using one finger. That is, when the touch panel 20 is pushed in at the position where a key (an example of an object) of the software keyboard displayed on the display panel 40 is shown, the control unit 10 according to the fourth embodiment outputs to the display control unit 19 control information for changing the input mode that defines the character type of the character input from that key.
  • the gesture operation is an operation in which the user presses the finger on the touch panel 20 to switch the key input mode and releases the finger to confirm the character.
  • Changing the display form of the object means, for example, displaying the key in the changed input mode.
  • the user selects the character type of the software keyboard displayed on the display panel 40 provided in the user interface device 3 (S51).
  • the key input mode is a lowercase input mode in which the key characters can be input in lowercase letters (an example of the first character type), and lowercase letters are displayed on the keys of the software keyboard.
  • the screen added to step S52 shows a state in which the user touches a key on the software keyboard.
  • Next, the pressure-sensitive detection unit 18 determines whether pressure is detected based on the sensor value input from the pressure-sensitive sensor 30 (S53).
  • If the pressure-sensitive detection unit 18 does not detect pressure (NO in S53), pressure-sensitive detection information is not output to the control unit 10. In this case, when the user lifts the finger from the key, the lowercase letter of the key touched by the user is confirmed and displayed on the display panel 40 (S54), and this process ends.
  • If pressure is detected (YES in S53), the key input mode is switched to the uppercase input mode, and the screen added in step S55 shows a state in which the user touches an uppercase key on the software keyboard displayed after this switch (S55).
  • The screen added in step S56 shows how the capital letter is confirmed and displayed when the user lifts the finger from the software keyboard (S56). Thereafter, the control unit 10 returns the key input mode from the uppercase input mode to the lowercase input mode.
  • In the fourth embodiment described above, the character type of the character to be input can be easily switched simply by pushing the finger into the touch panel 20 when inputting characters with the software keyboard. Conventionally, the finger had to be moved to the shift key and the shift key touched in order to switch the character type.
  • With the gesture operation of the fourth embodiment, the user can change the character type from lowercase to uppercase while keeping the finger on the key to be input. For this reason, the user can easily switch the character type with an intuitive gesture operation.
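  • A minimal Python sketch of the fourth-embodiment input-mode switching: while a key is touched, pushing the panel in temporarily toggles the input mode for that one character, and the mode then returns. The class and method names are illustrative assumptions; the text only requires that the input mode change while the pushed key remains touched and return afterwards.

    class SoftwareKeyboardSketch:
        """Sketch only: handles the lowercase/uppercase case described above."""
        def __init__(self, base_mode="lowercase"):
            self.base_mode = base_mode

        def input_character(self, key, pressed_in):
            """key: the touched character key; pressed_in: True if a push was detected while touching."""
            if pressed_in:
                # Temporarily toggle the input mode for this character only.
                mode = "uppercase" if self.base_mode == "lowercase" else "lowercase"
            else:
                mode = self.base_mode
            char = key.upper() if mode == "uppercase" else key.lower()
            # After the character is confirmed, the input mode returns to the base mode.
            return char

    kb = SoftwareKeyboardSketch()
    print(kb.input_character("a", pressed_in=False))  # 'a'
    print(kb.input_character("a", pressed_in=True))   # 'A'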
  • Conversely, the key input mode may initially be set to the uppercase input mode; when the pressure-sensitive detection unit 18 detects pressure, the mode may be switched temporarily to the lowercase input mode, and after the lowercase character is confirmed and input, the mode may return to the uppercase input mode.
  • Similarly, the key input mode may initially be set to the hiragana input mode; when the pressure-sensitive detection unit 18 detects pressure, the mode may be switched temporarily to the katakana input mode, and after the katakana character is confirmed and input, the mode may return to the hiragana input mode.
  • the first input mode may be the katakana input mode, and the input mode that is switched when pressure is detected may be the hiragana input mode.
  • the key input mode may be switched from the full-width input mode to the half-width input mode, or from the half-width input mode to the full-width input mode. Further, the control unit 10 may maintain the switched input mode in the next character input, and may return to the original input mode when the user presses the finger on the touch panel 20 again.
  • Furthermore, when the pressure-sensitive detection unit 18 detects pressure, a voiced sound (gi, gu, etc.), a semi-voiced sound (pa, pi, pu, etc.), a geminate consonant (small tsu), small kana (small ya, yu, yo, etc.), a symbol (+, -, ×, etc.), an umlaut, or the like corresponding to the character at the position touched by the user's finger may be input.
  • gesture operation of the user interface device 3 according to the fifth embodiment of the present invention is performed in order to select an edit item (an example of a menu item) at a position where a certain operation is performed.
  • FIG. 11 is a flowchart showing an example of a conventional menu selection operation.
  • the control unit included in the conventional user interface apparatus 100 detects the position indicated by the black arrow instruction icon by operating the mouse or the like (S61).
  • the screen added to step S61 shows a state in which the instruction icon indicates the character string in the screen.
  • the selected character string is highlighted in bold.
  • Next, the user decides whether to select a menu (S62).
  • If the user does not select a menu (NO in S62), the instruction icon simply moves according to the locus of the mouse moved by the user (S63), and this process ends.
  • When the user selects a menu (YES in S62), for example, the edit items of the edit menu are displayed (S64). That is, when the user selects the edit menu from the menu bar at the top of the screen, the edit items (copy, paste, cut, etc.) included in the edit menu are displayed in a list.
  • In the screen added to step S64, the character string selected by the user is enclosed by a dashed ellipse, and the edit menu and the list of edit items are shown in the menu bar at the top of the screen.
  • FIG. 11 shows an example in which the user clicks the edit menu on the menu bar.
  • Edit items similar to those in the edit menu are displayed in a list, and an edit item can be selected from it.
  • In other words, in order to perform an editing operation on a character string selected by the user, the instruction icon must be moved to the menu bar, or a list of edit items must be displayed by right-clicking the mouse, and an edit item must then be selected.
  • When a mouse is not connected to the conventional user interface device 100, it is difficult to perform these operations.
  • FIG. 12 is a flowchart illustrating an example of a menu selection operation according to the fifth embodiment.
  • the control unit 10 according to the fifth embodiment displays a menu item related to the object at the position where the touch panel 20 is pressed.
  • Here, the gesture operation is an operation in which the user presses a finger into the touch panel 20 to select an edit item from the edit menu, and changing the display form of the object means, for example, displaying the character string after it has been edited through the selected edit item.
  • First, when the user touches the touch panel 20, the coordinate detection unit 17 detects that the touch has been made (S71) and detects the coordinates of the touch position.
  • Next, the user selects a character string (an example of an object) in a predetermined area by a touch operation and then performs an operation of pushing the finger into the touch panel 20 (S72).
  • the pressure-sensitive detection unit 18 determines whether or not the pressure-sensitive sensor 30 has detected pressure based on the sensor value output from the pressure-sensitive sensor 30 (S73).
  • If pressure is not detected (NO in S73), the control unit 10 simply moves and displays the instruction icon in accordance with the direction in which the finger touching the touch panel 20 moves (S74), and this process ends.
  • On the other hand, if it is determined in step S73 that the pressure-sensitive detection unit 18 has detected pressure based on the sensor value (YES in S73), the touch panel 20 has been pressed while the object in the predetermined area is selected.
  • In this case, the control unit 10 outputs, to the display control unit 19, control information for displaying the edit items of the edit menu at the position where the touch panel 20 is pressed.
  • the edit item of the edit menu is an example of a menu item related to the character string when the object is a character string.
  • the display control unit 19 displays the edit items of the edit menu on the display panel 40 (S75).
  • In the screen added to step S75, the area (for example, a character string) selected by the user and a list of edit items are shown.
  • Then, the user selects an arbitrary edit item from the displayed edit items (S76), and this process ends.
  • As described above, in the fifth embodiment, when the user presses the touch panel 20, the edit items of the edit menu for the area selected through the touch panel 20 are displayed at the pressed position. Since the edit items are displayed while the predetermined area remains selected, in the same manner as with a right-click of a conventional mouse, the user can select an edit item without moving the finger far.
  • the menu bar includes a file menu, a display menu, a help menu, and the like.
  • the menu displayed when the touch panel 20 is pressed may be a menu other than the edit menu.
  • The displayed menu may also be switched according to the pressing amount on the touch panel 20. For example, if the pressing amount is small (the pressing force is greater than or equal to the threshold th1 and less than the threshold th2), the edit items of the edit menu are displayed, and if the pressing amount is large (the pressing force is greater than or equal to the threshold th2), the items of the display menu may be displayed.
  • Alternatively, the displayed menu may be switched every 2 seconds, for example, according to the time during which the pressure-sensitive detection unit 18 continues to detect pressure. In this way, the user can select an item from a menu other than the edit menu. A minimal sketch of this menu selection is given below.
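  • The following is a minimal sketch of how the menu shown at the pressed position could be chosen. The threshold names TH1 and TH2 mirror the thresholds th1 and th2 mentioned above, while the concrete values, the menu contents, and the 2-second cycling check are illustrative assumptions, not the patent's implementation.

```python
# Minimal sketch of the fifth embodiment's menu display at the pressed position.
# Threshold values, menu contents, and the cycling interval are assumptions.

TH1 = 0.3   # assumed pressing force at which a press is recognized
TH2 = 0.7   # assumed pressing force separating a "light" press from a "strong" press

EDIT_MENU = ["copy", "paste", "cut"]
DISPLAY_MENU = ["zoom in", "zoom out", "rotate"]


def menu_for_press(pressure, press_duration_s=0.0):
    """Return the menu items to show at the pressed position, or None if not pressed."""
    if pressure < TH1:
        return None                       # ordinary touch: no menu, the icon just moves (S74)
    # Switch menus by pressing amount: th1 <= force < th2 -> edit menu, force >= th2 -> display menu.
    menu = EDIT_MENU if pressure < TH2 else DISPLAY_MENU
    # Alternatively, cycle to another menu for every 2 seconds the press is held.
    if press_duration_s >= 2.0:
        menu = DISPLAY_MENU if menu is EDIT_MENU else EDIT_MENU
    return menu


def show_menu_at(position, menu):
    # Stand-in for the display control unit 19 drawing the list at the pressed coordinates (S75).
    print(f"menu {menu} displayed at {position}")


if __name__ == "__main__":
    pos = (120, 340)                      # coordinates reported by the coordinate detection unit 17
    items = menu_for_press(pressure=0.5)  # light press while a character string is selected
    if items:
        show_menu_at(pos, items)          # -> edit items shown next to the selection
```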
  • FIG. 13 is a flowchart showing an example of a conventional cancel operation.
  • the control unit included in the conventional user interface device 100 detects that a finger has touched the touch panel included in the user interface device 100 (S81).
  • the screen added in step S81 shows a state in which the user touches the lower left of the screen with one finger.
  • The screen added to step S82 shows a state in which the character string input by the user is displayed.
  • If the user determines in step S82 that the character input is correct (YES in S82), the process ends. On the other hand, if the user determines that the character input is incorrect (NO in S82), the user performs a character-input cancel operation.
  • This canceling operation is performed by either an input operation (S83) performed through an external input device or a touch operation (S84) for a cancel icon displayed at the lower right of the screen.
  • the operations in steps S83 and S84 are both called cancel operations.
  • After the cancel operation, the user returns to step S82, performs character input again, and determines whether the character input is correct.
  • Since the conventional cancel operation is realized by using an external input device or by touching a cancel icon, it has been difficult to perform the cancel operation in applications where installation space is limited.
  • In the sixth embodiment described below, by contrast, the operation performed immediately before can be canceled by the pushing operation.
  • FIG. 14 is a flowchart illustrating an example of a cancel operation according to the sixth embodiment.
  • In the sixth embodiment, the cancel operation can be performed by a simple operation using one finger. That is, the control unit 10 according to the sixth embodiment outputs, to the display control unit 19, control information for performing an editing operation on the operation performed immediately before, according to the direction in which the finger moves while the touch panel 20 is pressed by the finger.
  • Here, the gesture operation is an operation of moving the finger in the direction opposite to the direction in which the character string was input while the user presses the finger into the touch panel 20, and changing the display form of the object means, for example, displaying the character string after the edit.
  • First, when the user touches the touch panel 20, the coordinate detection unit 17 detects that the touch has been made (S91) and detects the coordinates of the touch position.
  • the screen added in step S91 shows a state where the user touches the lower left of the screen with one finger.
  • The screen added to step S92 shows a state in which the character string input by the user is displayed.
  • If the user determines in step S92 that the character input is correct (YES in S92), the process ends. On the other hand, if the user determines that the character input is incorrect (NO in S92), the user performs a character-input cancel operation. In this cancel operation, the user first pushes a finger into the touch panel 20 (S93). The screen added to step S93 shows a state in which the user presses the touch panel 20.
  • The screen added to step S94 shows how all the input operations for the character string input immediately before are canceled when the user moves the pressed finger (S94).
  • a series of operations performed in the order of steps S93 and S94 is referred to as a cancel operation.
  • With the user interface device 3, the last operation can be canceled by combining the pressing operation with a touch operation that moves the finger in one direction, so the user can perform the cancel operation intuitively and simply. Conventionally, an icon had to be displayed on the display panel or an external input device had to be provided in order to perform the cancel operation; in the present embodiment there are no such restrictions on installation space or on the applications that can be supported. As described above, the user can easily perform the cancel operation by using the user interface device 3, which can therefore be applied not only to the navigation device 1 but also to devices for various other purposes.
  • A plurality of operations may be canceled together according to the pressing amount when the finger pressed into the touch panel 20 is moved.
  • the moving direction of the finger is not limited to the left direction but may be another direction.
  • The character string may also be redisplayed as a redo operation.
  • the operation may be repeated when the moving direction of the finger is upward.
  • other operations may be assigned to gesture operations performed using the user interface device 3.
  • For example, in the navigation device 1, when the user specifies a place and inputs it as a destination and then specifies another place and inputs it as a destination, pressing the finger into the touch panel 20 and moving it to the left may redisplay the location that was previously input as the destination.
  • the user interface device 3 may be used in order to cancel or redo an operation on an image.
  • the user may perform a plurality of cancel operations by changing the moving direction of the pressed finger halfway.
  • For example, the operation may cancel the two consecutive input operations performed immediately before. A minimal sketch of this cancel-and-redo gesture handling is given below.
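  • The following is a minimal sketch of the press-and-slide cancel gesture. The direction mapping (left for cancel, right or up for redo) follows the variations described above, while the class name UndoGestureHandler, the PRESS_THRESHOLD value, and the MIN_DRAG distance are assumptions introduced only for illustration.

```python
# Minimal sketch of the sixth embodiment's press-and-slide cancel gesture.
# Names, threshold, and drag distance are assumptions used only for illustration.

PRESS_THRESHOLD = 0.5   # assumed pressure above which the touch counts as a press
MIN_DRAG = 30           # assumed minimum movement in pixels before the gesture is recognized


class UndoGestureHandler:
    def __init__(self):
        self.history = []     # operations performed so far (e.g. entered strings)
        self.redo_stack = []

    def record(self, operation):
        self.history.append(operation)
        self.redo_stack.clear()

    def on_gesture(self, pressure, dx, dy):
        """Interpret a press followed by a finger movement of (dx, dy) in screen coordinates."""
        if pressure < PRESS_THRESHOLD:
            return None                         # ordinary touch: nothing to cancel
        if dx <= -MIN_DRAG and self.history:    # move left: cancel the previous operation
            op = self.history.pop()
            self.redo_stack.append(op)
            return ("undo", op)
        if (dx >= MIN_DRAG or dy <= -MIN_DRAG) and self.redo_stack:
            op = self.redo_stack.pop()          # move right or up: redo the canceled operation
            self.history.append(op)
            return ("redo", op)
        return None


if __name__ == "__main__":
    handler = UndoGestureHandler()
    handler.record("input 'hello'")                        # character string entered just before
    print(handler.on_gesture(pressure=0.8, dx=-60, dy=0))  # -> ('undo', "input 'hello'")
    print(handler.on_gesture(pressure=0.8, dx=60, dy=0))   # -> ('redo', "input 'hello'")
```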
  • FIG. 15 is an explanatory diagram showing an installation example of the conventional user interface device 110.
  • a user interface device 110 that does not have a touch panel has been used in a navigation device mounted on a vehicle in order to allow a user to operate the navigation device at hand.
  • This navigation device includes a navigation device body and a user interface device 110 that can display only a navigation screen.
  • the display panel 111 (an example of the second display panel) provided in the user interface device 110 is not provided with a touch panel. For this reason, the user cannot perform a touch operation through the user interface device 110.
  • An external input device is installed in the vehicle so that the user can operate the navigation device.
  • Examples of the external input device include a button 101 and a joystick 102 that are installed on a center console in a vehicle and connected to a navigation device.
  • a knob type controller may be installed instead of the joystick 102.
  • However, the center console has only limited space in which an external input device can be installed, and the use of the external input device is limited to selection and determination operations. For this reason, in order to reach a function that the user wants to use, selection and determination must be repeated with the external input device, which takes time and effort.
  • FIG. 16 is a flowchart showing an operation example of a conventional navigation device.
  • a navigation operation is performed using a joystick 102 and a button 101 as an external input device.
  • the control unit (not shown) of the navigation device detects the operation of the joystick 102 (S101).
  • the screen added in step S101 shows a state in which the instruction icon is displayed on the display panel 111 of the user interface device 110.
  • Step S102 shows how the instruction icon moves as the user moves the joystick 102.
  • The instruction icon after the movement is represented by a white arrow in the screen.
  • step S103 shows how the information input by the user is determined when the user presses the button 101.
  • the selection operation must be performed by the joystick 102 and the determination operation must be performed by the button 101.
  • Since the joystick 102 and the button 101 are arranged at different positions, erroneous operations may occur. Further, no operation functions other than the selection operation and the determination operation can be assigned to the joystick 102 and the button 101.
  • FIG. 17 is an explanatory diagram showing a configuration example of a navigation device 1A according to the seventh embodiment.
  • the navigation device 1A includes another user interface device 110 in addition to the navigation device body 2 and the user interface device 3 as shown in FIG.
  • The user interface device 110 is a conventional user interface device and does not include the touch panel 20; only objects used for navigation can be displayed on its display panel 111. The navigation device body 2 performs a predetermined process based on the content instructed by the gesture operation input from the control unit 10, and an object whose display form has been changed by this process is displayed on the display panel 111.
  • the navigation device body 2 and the user interface device 3 can communicate with each other wirelessly or by wire. Further, an object output from the navigation device body 2 is displayed on the display panel 111 of the user interface device 110.
  • Hereinafter, the user interface device 3 is called a “pressure-sensitive controller” to distinguish it from the conventional user interface device 110.
  • FIG. 17 (1) shows a state in which the navigation device body 2 including the user interface device 110 is installed on the instrument panel of the vehicle and the pressure-sensitive controller is installed on the center console of the vehicle.
  • A map is displayed on the user interface device 110, and three command-icon objects (for example, current location, AV (Audio Visual), and menu) are displayed on the display panel 40 (an example of the first display panel) of the pressure-sensitive controller. The user can therefore intuitively select a required command by pressing a command icon through the pressure-sensitive controller.
  • When the user operates a switch button (not shown), command icons other than those displayed (for example, setting and adjustment) may be displayed.
  • an instruction icon is displayed on the user interface device 110.
  • the instruction icon is moved and displayed in accordance with the moving direction of the finger. While the user moves the finger, it is not necessary to push the finger into the touch panel 20.
  • The user operates the touch panel 20 of the pressure-sensitive controller to place the instruction icon on an enlarge icon displayed on the user interface device 110. Thereafter, when the user presses the touch panel 20 of the pressure-sensitive controller with the finger, the selection of the enlarge icon is determined, and the map is enlarged and displayed on the user interface device 110.
  • FIG. 18 is a flowchart illustrating an operation example of the navigation device 1 according to the seventh embodiment.
  • In the seventh embodiment, a desired object can be displayed on the user interface device 110 by a simple operation using one finger. That is, when the touch panel 20 is pushed by a finger, the control unit 10 according to the seventh embodiment outputs, to the navigation device body 2, control information for displaying on the user interface device 110 an instruction icon for instructing selection of a menu icon displayed on the display panel 111 of the user interface device 110. Then, when the touch panel 20 is pressed again with the finger while the instruction icon is displayed superimposed on the menu icon, the control unit 10 outputs, to the navigation device body 2, control information for determining the selection indicated by the instruction icon. The navigation device body 2 performs a predetermined process based on the content instructed by the touch operation or the gesture operation input from the control unit 10, changes the display form of the object subjected to the predetermined process, and displays the object on the display panel 111.
  • Here, the gesture operation is, for example, an operation in which the user pushes a finger into the touch panel 20 to display an instruction icon on the user interface device 110, moves the instruction icon in accordance with the direction in which the finger is moved, and then pushes the finger into the touch panel 20 again to determine the selection indicated on the user interface device 110.
  • changing the display form of the object in the seventh embodiment means, for example, displaying the map in an enlarged or reduced manner.
  • the control unit 10 detects that a user's finger has touched the touch panel 20 of the pressure-sensitive controller (S111).
  • the screen added in step S111 shows a state in which a user's finger touches the touch panel 20 of the pressure-sensitive controller and a state in which a map is displayed on the display panel 111 of the user interface device 110.
  • Next, the control unit 10 determines that the touch panel 20 has been pressed with a finger based on the pressure-sensitive detection information that the pressure-sensitive detection unit 18 outputs when it detects pressure from the sensor value.
  • When the touch panel 20 is pushed by the finger, the control unit 10 outputs, through the navigation device body 2 to the display control unit of the user interface device 110, control information for displaying an instruction icon for selecting a menu icon displayed on the display panel 111 of the user interface device 110.
  • an instruction icon is displayed on the display panel 111 of the user interface device 110.
  • The screen added to step S112 shows that, when the user presses the touch panel 20 with a finger, the selection icon is displayed on the display panel 40 and the instruction icon is displayed on the user interface device 110.
  • Next, the control unit 10 outputs, through the navigation device body 2 to the user interface device 110, control information for moving the instruction icon displayed on the display panel 111 of the user interface device 110 in accordance with the direction in which the finger touching the touch panel 20 moves. The display control unit of the user interface device 110 then moves and displays the instruction icon displayed on the display panel 111 according to the operation of the pressure-sensitive controller.
  • The added screen shows how, as the user moves the finger on the touch panel 20 to input information, the instruction icon displayed on the user interface device 110 moves according to the movement of the user's finger.
  • a moving instruction icon is represented by a white arrow in the screen.
  • the user selects a selection icon displayed on the display panel 40 of the pressure-sensitive controller, and presses this selection icon to determine the selection icon (S114).
  • the instruction icon displayed on the display panel 111 of the user interface device 110 is displayed superimposed on the menu icon.
  • the pressure-sensitive detection unit 18 outputs pressure-sensitive detection information when pressure is detected based on the sensor value.
  • the control unit 10 can determine that the touch panel 20 has been pressed with a finger based on the pressure-sensitive detection information input from the pressure-sensitive detection unit 18, and can determine the menu icon selected by the instruction icon.
  • the screen added in step S114 shows a state in which the user presses the touch panel 20 with a finger.
  • Then, the control unit 10 outputs, to the display control unit of the user interface device 110, control information for changing and displaying the map displayed on the display panel 111 of the user interface device 110 based on the information determined with the instruction icon.
  • As a result, a map reflecting the operation performed by the user through the pressure-sensitive controller is displayed on the display panel 111 of the user interface device 110. A minimal event-flow sketch of this controller-to-display interaction is given below.
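  • The following is a minimal event-flow sketch of the pressure-sensitive controller driving the instruction icon on the remote display panel 111. The class names, the PRESS_THRESHOLD value, and the zoom-on-confirm behavior are assumptions; in the actual device the commands pass through the navigation device body 2 rather than direct method calls.

```python
# Minimal sketch of the seventh embodiment: press to show the instruction icon,
# slide to move it, press again to determine the selection. All names are hypothetical.

PRESS_THRESHOLD = 0.5  # assumed pressure separating a touch from a press


class RemoteDisplay:
    """Stand-in for the user interface device 110 and its display panel 111."""

    def __init__(self):
        self.cursor = None   # position of the instruction icon, None while hidden
        self.zoom = 1.0

    def show_cursor(self, pos):
        self.cursor = pos

    def move_cursor(self, dx, dy):
        if self.cursor:
            self.cursor = (self.cursor[0] + dx, self.cursor[1] + dy)

    def confirm_selection(self):
        # For illustration only: confirming over an enlarge icon zooms the map in.
        self.zoom *= 2.0


class PressureController:
    """Stand-in for the control unit 10 of the pressure-sensitive controller."""

    def __init__(self, display):
        self.display = display
        self.cursor_shown = False

    def on_touch_event(self, pressure, dx=0, dy=0):
        if pressure >= PRESS_THRESHOLD and not self.cursor_shown:
            self.cursor_shown = True
            self.display.show_cursor((0, 0))   # first press: show the instruction icon (S112)
        elif pressure >= PRESS_THRESHOLD and self.cursor_shown:
            self.display.confirm_selection()   # second press: determine the selection (S114)
        elif self.cursor_shown:
            self.display.move_cursor(dx, dy)   # ordinary touch movement moves the icon


if __name__ == "__main__":
    panel_111 = RemoteDisplay()
    controller = PressureController(panel_111)
    controller.on_touch_event(pressure=0.9)                 # press: instruction icon appears
    controller.on_touch_event(pressure=0.1, dx=40, dy=25)   # slide: icon moves over a menu icon
    controller.on_touch_event(pressure=0.9)                 # press again: selection determined
    print(panel_111.cursor, panel_111.zoom)                 # -> (40, 25) 2.0
```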
  • the user interface device 3 is used as a pressure-sensitive controller that allows the user to perform navigation operations at hand.
  • the user can perform the selection operation and the determination operation with one finger, and can perform the determination operation for the instruction icon at the position where the instruction icon is selected. For this reason, the movement of the finger can be minimized, and the operation can be performed simply and intuitively.
  • Since the pressure-sensitive controller can be disposed at a position away from the navigation device body 2, the user can easily operate it at hand.
  • Various command icons (for example, volume buttons) can be displayed on the display panel 40 of the pressure-sensitive controller, and the display magnification can be changed arbitrarily.
  • Conventionally, the joystick 102 or the like has been used, which makes the selection and determination operations complicated and time-consuming; with the pressure-sensitive controller according to the seventh embodiment, the user can quickly select menus and make determinations. Even if the number of command icons that can be displayed on one screen is limited, many command icons can be displayed and selected by switching screens.
  • The command icons may be displayed by dividing the display panel 40 of the user interface device 3 into, for example, four cells arranged in two rows and two columns. This can prevent the user from pressing the wrong icon. A minimal hit-testing sketch for such a grid is given below.
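  • The following is a minimal hit-testing sketch for a display panel 40 divided into a 2 x 2 grid of command-icon cells. The panel dimensions and the assignment of icon labels to cells are illustrative assumptions.

```python
# Minimal sketch of resolving a press on the display panel 40 to one of four command icons.
# Panel size and icon labels are assumptions chosen only for illustration.

PANEL_WIDTH = 320
PANEL_HEIGHT = 240

# One command icon per cell: rows top-to-bottom, columns left-to-right.
CELL_ICONS = [
    ["current location", "AV"],
    ["menu", "volume"],
]


def icon_at(x, y):
    """Map a touch coordinate on the panel to the command icon of the containing cell."""
    col = 0 if x < PANEL_WIDTH / 2 else 1
    row = 0 if y < PANEL_HEIGHT / 2 else 1
    return CELL_ICONS[row][col]


if __name__ == "__main__":
    print(icon_at(50, 40))    # -> "current location" (top-left cell)
    print(icon_at(300, 200))  # -> "volume" (bottom-right cell)
```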
  • In this way, the user interface device 110 can be operated. Further, the pressure-sensitive controller and a personal computer device may be connected wirelessly or by wire, and it is also possible to cause the personal computer device to execute a predetermined function by operating the pressure-sensitive controller at a location away from the personal computer device.
  • the user interface device 3 according to each embodiment described above may be applied to devices other than the navigation device. For example, operability can be improved by applying the user interface device 3 to a touch panel portion of a mobile terminal, a tablet terminal, or the like. In this way, the user interface device 3 according to the present embodiment can be combined with various electronic devices that require a touch panel.
  • the user interface device 3 includes the control unit 10, but the navigation device body 2 may include the control unit 10.
  • the touch panel 20 may be configured to detect a touch operation by a method other than the electrostatic capacitance method. Further, the pressure sensor 30 may detect that the touch panel 20 is pushed in by a press switch or the like provided below the touch panel 20.
  • the operation combined with the pressure sensitive function in the user interface device 3 may be used for, for example, navigation of a person or a bicycle. Further, the user interface device 3 may be used for the operation of the image editing software as described above, or may be used for the operation of other application software.
  • In the embodiments described above, the navigation device body 2 and the user interface device 3 are combined; however, the user interface device 3 itself may be given a navigation function so that only the user interface device 3 is used as the navigation device.
  • The pressure-sensitive sensor 30 may also be configured without the elastic body 33. For example, even if the elastic body 33 is removed from the pressure-sensitive sensor 30, as long as the upper electrode 31 and the lower electrode 35 are kept apart at a constant distance, applying a pressing force to the pressure-sensitive sensor 30 brings the upper electrode 31 closer to the lower electrode 35 and changes the electrostatic capacitance between the upper electrode 31 and the lower electrode 35. For this reason, the pressure-sensitive detection unit 18 can obtain the capacitance change rate based on the sensor values output from the upper electrode 31 and the lower electrode 35. A minimal sketch of this change-rate calculation is given below.
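  • The following is a minimal sketch of deriving a capacitance change rate from the electrode capacitance and comparing it against a detection threshold, as the pressure-sensitive detection unit 18 is described as doing. The reference value, the threshold, and the function names are assumptions introduced only for illustration.

```python
# Minimal sketch of turning a capacitance change rate into pressure-sensitive detection
# information. The reference value and DETECT_THRESHOLD are illustrative assumptions.

DETECT_THRESHOLD = 0.05   # assumed change-rate magnitude treated as "pressed"


def capacitance_change_rate(current_capacitance, reference_capacitance):
    """Relative change of the electrode-to-electrode capacitance from its unpressed value."""
    return (current_capacitance - reference_capacitance) / reference_capacitance


def pressure_detected(current_capacitance, reference_capacitance):
    """Stand-in for the pressure-sensitive detection unit 18 deciding whether to output
    pressure-sensitive detection information."""
    rate = capacitance_change_rate(current_capacitance, reference_capacitance)
    return abs(rate) >= DETECT_THRESHOLD


if __name__ == "__main__":
    c_ref = 10.0   # capacitance (arbitrary units) with no pressing force applied
    c_now = 10.8   # capacitance after the electrodes have moved closer under pressure
    print(capacitance_change_rate(c_now, c_ref))  # -> 0.08
    print(pressure_detected(c_now, c_ref))        # -> True
```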
  • the present invention is not limited to the embodiment described above, and various other application examples and modifications can of course be taken without departing from the gist of the present invention described in the claims.
  • In the embodiments described above, the configuration of the apparatus is described in detail and concretely in order to explain the present invention in an easy-to-understand manner; the invention is not necessarily limited to a configuration including all of the components described.
  • Further, a part of the configuration of one embodiment described here can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
  • the control lines and information lines indicate what is considered necessary for the explanation, and not all the control lines and information lines on the product are necessarily shown. Actually, it may be considered that almost all the components are connected to each other.
  • REFERENCE SIGNS LIST: 1 ... navigation device, 2 ... navigation device body, 3 ... user interface device, 10 ... control unit, 17 ... coordinate detection unit, 18 ... pressure-sensitive detection unit, 19 ... display control unit, 20 ... touch panel, 30 ... pressure-sensitive sensor, 40 ... display panel

Abstract

The present invention relates to a control unit, provided in this user interface device, that determines a touch operation on the basis of coordinate detection information detected and output by a detection unit. In addition, the control unit determines a pressing operation on the basis of pressure-sensitive detection information output by a pressure-sensitive detection unit when the pressure-sensitive detection unit detects that the pressing operation has been performed. Then, on the basis of a gesture operation combining the touch operation and the pressing operation, the control unit outputs, to a display control unit, control information for changing the display form of an object.
PCT/JP2017/041151 2016-12-27 2017-11-15 Dispositif d'interface utilisateur et appareil électronique WO2018123320A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-252318 2016-12-27
JP2016252318A JP2018106434A (ja) 2016-12-27 2016-12-27 ユーザーインターフェイス装置及び電子機器

Publications (1)

Publication Number Publication Date
WO2018123320A1 true WO2018123320A1 (fr) 2018-07-05

Family

ID=62707252

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/041151 WO2018123320A1 (fr) 2016-12-27 2017-11-15 Dispositif d'interface utilisateur et appareil électronique

Country Status (2)

Country Link
JP (1) JP2018106434A (fr)
WO (1) WO2018123320A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6576399B2 (ja) * 2017-07-20 2019-09-18 ヤフー株式会社 情報表示プログラム、情報表示方法、情報表示装置、及び配信装置
JP2020042417A (ja) * 2018-09-07 2020-03-19 アイシン精機株式会社 表示制御装置

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010134895A (ja) * 2008-12-08 2010-06-17 Apple Inc 選択的入力信号拒否及び修正
JP2011022851A (ja) * 2009-07-16 2011-02-03 Docomo Technology Inc ディスプレイ端末、画像処理システム、及び画像処理方法
JP2012185710A (ja) * 2011-03-07 2012-09-27 Kyocera Corp 電子機器、電子機器の制御方法及びプログラム
JP2012527034A (ja) * 2009-05-15 2012-11-01 サムスン エレクトロニクス カンパニー リミテッド 携帯端末機のイメージ処理方法
JP2013105410A (ja) * 2011-11-16 2013-05-30 Fuji Soft Inc タッチパネル操作方法及びプログラム


Also Published As

Publication number Publication date
JP2018106434A (ja) 2018-07-05

Similar Documents

Publication Publication Date Title
US20220100368A1 (en) User interfaces for improving single-handed operation of devices
JP5295328B2 (ja) スクリーンパッドによる入力が可能なユーザインタフェース装置、入力処理方法及びプログラム
US8775966B2 (en) Electronic device and method with dual mode rear TouchPad
KR101424294B1 (ko) 터치스크린 장치의 사용자로부터 수신된 입력 및 제스쳐에 응답하여 동작을 수행하는 컴퓨터로 구현된 방법 및 컴퓨터판독가능 매체
US20150317054A1 (en) Method and apparatus for gesture recognition
CN106687905B (zh) 触感控制系统及触感控制方法
KR20140098904A (ko) 멀티태스킹 운용 방법 및 이를 지원하는 단말기
JP5003377B2 (ja) 電子機器におけるマークの位置あわせ方法
WO2012160829A1 (fr) Dispositif à écran tactile, procédé d'entrée d'opération tactile et programme
KR20100018883A (ko) 전자기기의 사용자 인터페이스 방법 및 장치
CN114764304A (zh) 一种屏幕显示方法
WO2011010411A1 (fr) Appareil de contrôle d’entrée
WO2018123320A1 (fr) Dispositif d'interface utilisateur et appareil électronique
JP2012252652A (ja) タッチパネル入力装置
JP5461030B2 (ja) 入力装置
CN114690889A (zh) 一种虚拟键盘的处理方法以及相关设备
US20110119579A1 (en) Method of turning over three-dimensional graphic object by use of touch sensitive input device
CN114690887A (zh) 一种反馈方法以及相关设备
JP2018128968A (ja) 車両用入力装置、及び、車両用入力装置の制御方法
JP2014006748A (ja) 電子機器、機器及び方法
KR101480775B1 (ko) 필기 인식을 이용한 차량용 정보 표시 장치 및 방법, 그 정보 입력 모듈
WO2018123355A1 (fr) Dispositif d'interface utilisateur et dispositif electronique
KR101678213B1 (ko) 터치 영역 증감 검출에 의한 사용자 인터페이스 장치 및 그 제어 방법
KR20090106312A (ko) 2개의 디스플레이를 포함하는 장치 및 그 사용자인터페이스 방법
US20140085340A1 (en) Method and electronic device for manipulating scale or rotation of graphic on display

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17885686

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17885686

Country of ref document: EP

Kind code of ref document: A1