WO2010064389A1 - Display input device
- Publication number
- WO2010064389A1 (PCT/JP2009/006393; JP2009006393W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vibration
- touch panel
- input device
- sensor
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3664—Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0421—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/044—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
- G06F3/0446—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means using a grid-like structure of electrodes in at least two directions, e.g. using row and column electrodes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04101—2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04805—Virtual magnifying lens, i.e. window or frame movable on top of displayed information to enlarge it for better reading or selection
Definitions
- the present invention particularly relates to a display input device suitable for use in an in-vehicle information device such as a navigation system.
- a touch panel is an electronic component that combines a display device such as a liquid crystal panel with a coordinate position input device such as a touch pad; it is a display input device that, simply by a finger touching an image area such as an icon displayed on the liquid crystal panel, senses the position information of the touched image and allows the device to be operated. It is often incorporated in devices that above all need to be handled intuitively, such as in-vehicle navigation systems.
- in Patent Literature 1, when a finger is brought closer, a nearby icon is enlarged, so that erroneous operation is prevented and the selection operation is facilitated.
- as in FIG. 10, for example, assuming that a software keyboard is displayed on the touch panel, the contents of the enlarged keys also change with the vibration, making the touch operation difficult, so operability is inevitably lowered; and if the image is instead always displayed in an enlarged state even when there is no vibration, operability may likewise be impaired.
- the present invention has been made to solve the above-described problems, and an object of the present invention is to provide a display input device that facilitates an input operation even during vibration and improves operability.
- a display input device of the present invention includes: a touch panel that displays and inputs an image; a vibration sensor that detects vibration of a detection target based on a change in position of the detection target facing the touch panel; a vibration correction unit that, when the vibration of the detection target detected by the vibration sensor exceeds a predetermined amount, performs vibration correction to remove the vibration component from the position change of the detection target; and a control unit that transforms an image in a display area within a certain range displayed on the touch panel, based on the vibration-corrected position of the detection target.
- FIG. 1 is a block diagram showing a configuration of a display input device according to Embodiment 1 of the present invention.
- the display input device according to Embodiment 1 of the present invention includes a touch panel display device (hereinafter abbreviated as a touch panel) 1, an external sensor 2, and a control unit 3.
- the touch panel 1 displays and inputs information.
- a touch sensor 11 that inputs information is stacked on an LCD panel 10 that displays information.
- around the touch sensor 11, a plurality of proximity sensors 12 are further mounted in units of cells; these detect, in a non-contact, two-dimensional manner, the movement of a detection target such as a finger or a pen positioned facing the touch panel 1.
- the proximity sensor 12 will be described as a vibration sensor.
- the proximity sensor 12 uses, for example, infrared rays for its detection cells: infrared light-emitting LEDs (Light Emitting Diodes) and light-receiving transistors are arranged facing each other in a grid on the outer periphery of the touch sensor 11, and when the detection target approaches, its approach and its coordinate position are detected from the blocked or reflected light.
- the detection cells are not limited to the infrared type described above; a capacitance type may be used instead, in which the proximity sensor 12 detects the approach of the detection target from a change in the capacitance generated between two flat plates arranged in parallel, as in a capacitor.
- in that case, one side of the flat plate, facing the detection target, serves as a ground surface and the other side as a sensor detection surface, and the approach of the detection target and its coordinate position can be detected from the change in the capacitance formed between the two poles.
- the external sensor 2 is mounted at various locations in the vehicle and includes at least a GPS (Global Positioning System) sensor 21, a vehicle speed sensor 22, and an acceleration sensor 23.
- the GPS sensor 21 receives radio waves from GPS satellites, generates a signal for positioning the latitude and longitude, and outputs the signal to the control unit 3.
- the vehicle speed sensor 22 measures a vehicle speed pulse for determining whether or not the vehicle is traveling and outputs it to the control unit 3.
- the acceleration sensor 23 is a sensor that estimates, for example, the acceleration applied to a weight attached to a spring by measuring the amount of displacement of the weight.
- in the case of a triaxial acceleration sensor, for example, it follows acceleration components from 0 Hz (gravitational acceleration only) up to several hundred Hz, and the direction (attitude) with respect to the ground is measured from the sum of the acceleration vectors in the X and Y directions and output to the control unit 3.
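As an illustration only (not code from the patent), the attitude estimate described above can be sketched as follows; the helper name and the assumption that gravity is the only sustained acceleration are hypothetical:

```python
import math

def attitude_from_accel(ax, ay, az):
    """Estimate the tilt angle (radians) between the sensor's Z axis and
    the vertical from one triaxial accelerometer reading, assuming the
    only sustained acceleration is gravity (hypothetical helper)."""
    # Magnitude of the combined X/Y (horizontal-plane) acceleration vectors.
    horizontal = math.hypot(ax, ay)
    # Angle of the total gravity vector relative to the Z axis.
    return math.atan2(horizontal, az)

# A sensor lying flat sees gravity only on the Z axis, so the tilt is zero.
print(attitude_from_accel(0.0, 0.0, 9.8))  # → 0.0
```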
- the control unit 3 performs vibration correction when it determines that the finger vibration detected by the proximity sensor 12 has reached a predetermined amount, and has a function of enlarging and displaying an image in a display area within a certain range displayed on the touch panel 1. For this purpose, the control unit 3 includes a CPU that mainly controls the touch panel 1 for navigation processing (hereinafter referred to as the navigation CPU 30), a drawing circuit 31, a memory 32, and a map DB (Data Base) 33.
- the navigation CPU 30 performs navigation processing according to a menu selected by the user from those displayed on the touch panel 1, such as route search.
- the navigation CPU 30 refers to the map information stored in the map DB 33 and performs navigation such as route search or destination guidance based on various sensor signals acquired from the external sensor 2. Further, the navigation CPU 30 performs vibration correction when it is determined that there is a predetermined amount of finger vibration detected by the proximity sensor 12, and enlarges and displays an image in a certain range of display area displayed on the touch panel 1.
- in doing so, the navigation CPU 30 generates image information according to a program stored in the memory 32 and controls the drawing circuit 31.
- the structure of the program executed by the navigation CPU 30 in that case is shown in FIG. 2, and details thereof will be described later.
- the drawing circuit 31 develops the image information generated by the navigation CPU 30 on a built-in or external bitmap memory unit at a constant speed, and a built-in display control unit reads the image information developed on the bitmap memory unit in synchronization with the display timing of the touch panel 1 (LCD panel 10) and displays it on the touch panel 1.
- the above-described bitmap memory unit and display control unit are shown in FIG. 3 and will be described in detail later.
- the memory 32 includes, in addition to a program area, a work area to which an image information storage area and the like are assigned. The map DB 33 stores the maps and facility information necessary for navigation such as route search and guidance.
- FIG. 2 is a block diagram showing a functional development of the structure of the program executed by the navigation CPU 30 of FIG. 1, which is included in the control unit 3 of the display input device according to Embodiment 1 of the present invention.
- the navigation CPU 30 includes a main control unit 300, a proximity coordinate position calculation unit 301, a touch coordinate position calculation unit 302, a vibration determination unit 303, a vibration correction unit 304, an image information generation unit 305, an image information transfer unit 306, and an operation information processing unit 307.
- when the proximity sensor 12 detects the approach of a finger to the touch panel 1, the proximity coordinate position calculation unit 301 has the function of calculating the XY coordinate position of the finger and passing it to the main control unit 300 and the vibration determination unit 303.
- the XY coordinates calculated by the proximity coordinate position calculation unit 301 are output continuously at 0.01-second intervals for 0.1 seconds, and the vibration determination unit 303 determines vibration from the XY coordinate values received during that 0.1 seconds.
- here, the vibration determination unit determines vibration within 0.1 seconds; however, if a continuous past history is retained and the determination is based on 0.1 seconds × the number of stored samples, the accuracy can be further improved.
- the touch coordinate position calculation unit 302 has a function of calculating the XY coordinate position and delivering it to the main control unit 300 when a touch on the touch panel 1 with a finger is detected by the touch sensor 11.
- under the control of the main control unit 300, the vibration determination unit 303 measures the amount of finger vibration from the fluctuation of the XY coordinates of the finger output from the proximity coordinate position calculation unit 301, and determines whether or not there is a predetermined amount of vibration.
- the vibration determination unit 303 accumulates the XY coordinates of the finger output from the proximity coordinate position calculation unit 301 as a time series for a predetermined time, and removes the low-frequency components by passing the time-series data through an HPF (high-pass filter), thereby obtaining time-series data representing the vibration of the finger. If the variance of this time-series data is equal to or greater than a predetermined value, it determines that there is vibration, and controls the vibration correction unit 304 based on that result.
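As a rough sketch of this determination (not code from the patent), the low-frequency removal can be approximated by subtracting the mean of the short coordinate history, a crude high-pass step; the variance threshold and the sample traces below are invented for illustration:

```python
from statistics import pvariance

def detect_vibration(xs, threshold=0.05):
    """Declare vibration if the high-frequency residual of a short
    coordinate history has variance >= threshold. Subtracting the mean
    is a crude stand-in for the HPF in the patent; the threshold value
    is an arbitrary assumption."""
    mean = sum(xs) / len(xs)
    residual = [x - mean for x in xs]   # high-frequency component
    return pvariance(residual) >= threshold

steady = [1.00, 1.01, 1.00, 1.01, 1.00]       # finger held still
shaky = [1.0, 1.6, 0.4, 1.7, 0.3, 1.5, 0.5]   # finger shaking
print(detect_vibration(steady), detect_vibration(shaky))  # → False True
```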
- when the vibration determination unit 303 determines that there is a predetermined amount of vibration, the vibration correction unit 304 applies a filter with a predetermined cutoff frequency to the time-series data representing the finger vibration obtained by the vibration determination unit 303, thereby correcting the vibration. Time-series data representing the XY coordinates of the finger from which the vibration component has been removed is thus obtained and output to the image information generation unit 305. When the vibration determination unit 303 determines that there is no predetermined amount of vibration, the time-series data representing the XY coordinates of the finger is output to the image information generation unit 305 as it is.
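The cutoff-frequency filtering can be illustrated with a first-order low-pass filter (exponential smoothing); this is a stand-in sketch, not the patent's actual filter, and the smoothing factor is arbitrary (for a 3 Hz cutoff at a 0.01 s sample interval it would be roughly dt / (dt + 1/(2πf)) ≈ 0.16):

```python
def low_pass(samples, alpha=0.2):
    """First-order IIR low-pass filter over a coordinate time series.
    Smaller alpha -> lower cutoff -> heavier smoothing. Illustrative
    stand-in for the cutoff-frequency filter in unit 304."""
    out = []
    y = samples[0]
    for x in samples:
        y = y + alpha * (x - y)   # exponential smoothing step
        out.append(y)
    return out

wobbly = [10.0, 12.0, 9.0, 13.0, 8.0, 12.0]   # shaky X coordinates (cm)
smooth = low_pass(wobbly)
# The smoothed track varies far less than the raw samples,
# leaving the slow (intended) finger motion.
```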
- the image information generation unit 305 has a function of generating image information to be displayed on the touch panel 1 (LCD panel 10) under the control of the main control unit 300 and outputting the image information to the image information transfer unit 306.
- the image information generation unit 305 refers to the time-series data representing the XY coordinates of the finger input from the vibration determination unit 303 and enlarges and displays an image of a certain range area near the finger.
- to enlarge and display the image of a display area in a certain range displayed on the touch panel 1, the image information generation unit 305, for example, reads out the already generated image of that display area at a certain ratio in units of pixels, interpolates it, and renders it as an updated image.
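A minimal sketch of such pixel-ratio enlargement, using nearest-neighbour interpolation on a toy pixel grid (a real implementation would more likely interpolate bilinearly; everything here is illustrative):

```python
def enlarge(pixels, scale):
    """Enlarge a 2D pixel grid by an integer factor using
    nearest-neighbour interpolation: each source pixel is read out
    'at a certain ratio' and replicated into a scale x scale block."""
    h, w = len(pixels), len(pixels[0])
    return [
        [pixels[int(y / scale)][int(x / scale)] for x in range(w * scale)]
        for y in range(h * scale)
    ]

tile = [[1, 2],
        [3, 4]]
for row in enlarge(tile, 2):
    print(row)
# → [1, 1, 2, 2]
#   [1, 1, 2, 2]
#   [3, 3, 4, 4]
#   [3, 3, 4, 4]
```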
- the image information transfer unit 306 has a function of transferring the image information generated by the image information generation unit 305 to the drawing circuit 31 together with a drawing command based on timing control by the main control unit 300.
- under the control of the main control unit 300, the operation information processing unit 307 has the function of executing the operation defined for the information in a certain display area, based on the touch coordinate position calculated by the touch coordinate position calculation unit 302. For example, in the case of a soft keyboard, it generates image information based on the touched key and outputs it to the image information transfer unit 306; in the case of an icon button, it executes navigation processing such as a destination search defined for that icon button, generates the resulting image information, outputs it to the image information transfer unit 306, and displays each on the touch panel 1 (LCD panel 10).
- in addition to the program area 321 described above, in which the program is stored, a predetermined amount of work area is allocated in the memory 32; this work area includes an image information storage area 322 in which the image information generated by the image information generation unit 305 is temporarily stored.
- FIG. 3 is a block diagram showing an internal configuration of the drawing circuit 31 shown in FIG.
- the drawing circuit 31 includes a drawing control unit 310, an image buffer unit 311, a drawing unit 312, a bitmap memory unit 313, and a display control unit 314, all of which are commonly connected via a local bus 315 composed of a plurality of address, data, and control lines.
- image information transferred from the image information transfer unit 306 of the navigation CPU 30 shown in FIG. 2 is held in the image buffer unit 311 under the control of the drawing control unit 310; the drawing control unit 310 decodes drawing commands, such as straight-line or rectangle drawing, transmitted from the image information transfer unit 306 of the navigation CPU 30, and performs drawing preprocessing such as computing the slope of a straight line.
- the drawing unit 312 activated by the drawing control unit 310 draws the image information decoded by the drawing control unit 310 in the bitmap memory unit 313 at high speed.
- the display control unit 314 reads and displays the image information held in the bitmap memory unit 313 via the local bus 315 in synchronization with the display timing of the LCD panel 10 of the touch panel 1.
- FIG. 4 is a flowchart showing the operation of the display input apparatus according to Embodiment 1 of the present invention.
- FIG. 5A shows a soft keyboard image displayed on the touch panel 1 at that time
- FIG. 5B shows an example of a cursor image.
- next, the operation of the display input device according to Embodiment 1 of the present invention shown in FIGS. 1 to 3 will be described in detail with reference to FIG. 4 and FIGS. 5(a) and 5(b).
- first, a soft keyboard image used for facility search by the Japanese 50-sound syllabary, as shown in FIG. 5A, is displayed on the touch panel 1 (step ST41).
- when the proximity sensor 12 detects the approach of the finger (step ST42 "YES"), the proximity coordinate position calculation unit 301 of the navigation CPU 30 is activated.
- the proximity coordinate position calculation unit 301 calculates the XY coordinates of the finger on the panel surface of the touch panel 1 and outputs the calculated coordinate values to the vibration determination unit 303 (step ST43).
- the proximity coordinate position calculation unit 301 outputs the acquired finger coordinate value to the vibration determination unit 303 for, for example, 0.1 seconds every 0.01 seconds.
- the vibration determination unit 303 determines the presence or absence of finger vibration by continuously receiving the input finger coordinate values for 0.1 seconds (step ST44).
- when it is determined that there is a predetermined amount of vibration, the vibration correction unit 304 performs vibration correction, obtains the XY coordinates of the finger from which the vibration component has been removed, and outputs them to the image information generation unit 305 (step ST45).
- when it is determined that there is no predetermined amount of vibration, vibration correction is not performed, and the time-series data of the XY coordinates of the finger is output to the image information generation unit 305 as it is.
- based on the time-series data of the XY coordinates of the finger output from the vibration correction unit 304, the image information generation unit 305 enlarges and displays the image of the display area in a certain range near the finger (step ST46). That is, when it is determined that the finger vibration has reached a predetermined amount, an image in a certain range centered on the XY coordinates of the finger from which the vibration component has been removed is enlarged; when it is determined that there is no predetermined amount of vibration, an image in a certain range centered on the uncorrected XY coordinates of the finger is enlarged.
- FIG. 6 is a waveform diagram showing an outline of vibration correction by the vibration correction unit 304.
- FIG. 6A shows time-series data of the shake (cm) of the finger in the X direction over the past 3 seconds, generated by the proximity coordinate position calculation unit 301, and FIG. 6B shows the same time-series data after LPF (Low Pass Filter) processing with a cutoff frequency of 3 Hz.
- to enlarge the image of the display area in a certain range displayed on the touch panel 1, the image information generation unit 305 activated by the main control unit 300 reads the already generated image of a partial area of the soft keyboard of FIG. 5A, for example, from the image information storage area 322 of the memory 32 at a certain rate, interpolates it, combines it with the peripheral image information (not enlarged), and updates it as a soft keyboard image with new image information.
- the updated image information is stored in the image information storage area 322 of the memory 32 and is output to the image information transfer unit 306.
- the image information transfer unit 306 receives the updated image information and transfers the image information to the drawing circuit 31.
- in the drawing circuit 31, the drawing control unit 310 expands the transferred image information, and the drawing unit 312 draws the image on the bitmap memory unit 313 at high speed.
- the display control unit 314 reads an image drawn in the bitmap memory unit 313 and performs a desired display on the LCD panel 10 of the touch panel 1.
- when the touch panel 1 is touched with the finger, the touch coordinate position calculation unit 302 calculates the touch coordinate position and starts the operation information processing unit 307.
- the operation information processing unit 307 executes an operation process based on a key corresponding to the touch coordinates calculated by the touch coordinate position calculation unit 302 (step ST48).
- in the operation processing based on the key corresponding to the touch coordinates, image information based on the touched key is generated and output to the image information transfer unit 306; in the case of an icon button, navigation processing such as a destination search defined for the touched icon button is executed, and the resulting image information is generated, output to the image information transfer unit 306, and displayed on the touch panel 1 (LCD panel 10).
- as described above, the control unit 3 performs vibration correction when it determines that the finger vibration detected by the proximity sensor 12 has reached a predetermined amount, and enlarges and displays the image in the display area within a certain range displayed on the touch panel 1; therefore, the vibration of the enlarged image on the screen is reduced even during vibration, and a display input device with improved operability can be provided.
- in the above description, the LPF is used for vibration correction; however, the vibration correction processing by the LPF may also be disabled.
- in the above description, the 50-sound search screen presented by the software keyboard is exemplified as the image displayed in the display area of a certain range; however, the same effect can be obtained even when the invention is applied to, for example, a cross-hair cursor display as shown in FIG. 5B, or to a menu screen.
- for simplicity, the explanation has focused only on fluctuation along the X axis.
- the LPF has been described as the blur correction filter, but the filter is not limited to the LPF; a Kalman filter, or a moving average filter that averages the positions over the past one second or the like, may be used instead. The Kalman filter has better performance than the LPF, but increases the CPU processing load.
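The moving-average alternative mentioned here could look like the following sketch (the window length and samples are illustrative; at a 0.01 s sample interval, a one-second window would be 100 samples):

```python
from collections import deque

def moving_average(samples, window=5):
    """Moving-average filter: each output is the mean of the last
    `window` positions. Cheaper than a Kalman filter, at the cost of
    added lag. Window length is an illustrative assumption."""
    buf = deque(maxlen=window)
    out = []
    for x in samples:
        buf.append(x)
        out.append(sum(buf) / len(buf))
    return out

# An oscillating track settles to its average once the window fills.
print(moving_average([0, 10, 0, 10, 0, 10], window=4))
```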
- the above description has dealt with image stabilization for enlarged display, but the form of image control is not limited to this; the invention may also be applied when the proximity of a finger is detected and a detailed explanation such as help is displayed in a balloon, or when a cursor position is displayed.
- FIG. 7 is a block diagram showing a functional development of the program structure of the navigation CPU 30, which is a component of the control unit 3 of the display input device according to Embodiment 2 of the present invention.
- the difference from the program structure of the first embodiment shown in FIG. 2 is that a vehicle information acquisition unit 308 is added to the program structure of the navigation CPU 30 of the first embodiment.
- the vehicle information acquisition unit 308 has the function of acquiring a vehicle speed signal or an acceleration signal from the external sensor 2, which includes the vehicle speed sensor 22 and the acceleration sensor 23, supplying it to the main control unit 300 and the vibration correction unit 304, and thereby controlling the vibration correction processing performed by the vibration correction unit 304.
- FIG. 8 is a flowchart showing the operation of the display input apparatus according to Embodiment 2 of the present invention.
- the operation of the navigation CPU 30 according to the second embodiment will be described in detail with reference to the flowchart of FIG.
- the processing from the normal search display to the calculation of the finger coordinates (steps ST81 to ST83) is the same as the processing of steps ST41 to ST43 of the first embodiment shown in the flowchart of FIG. 4, and its description is therefore omitted.
- the vehicle information acquisition unit 308 acquires a vehicle speed signal measured by the vehicle speed sensor 22 of the external sensor 2 and supplies it to the main control unit 300 and the vibration correction unit 304.
- the main control unit 300 determines whether or not the vehicle is traveling based on the vehicle speed signal (step ST84). If it determines that the vehicle is stopped (step ST84 "NO"), vibration correction is not performed, and the image information generation unit 305 performs the image enlargement processing of the display area in a certain range near the finger based on the time-series data of the XY coordinates of the finger (step ST85).
- if it is determined that the vehicle is traveling (step ST84 "YES"), the vibration correction unit 304 is controlled to perform the finger vibration correction processing (step ST88), and the image information generation unit 305 performs the image enlargement processing of the display area in a certain range near the finger based on the time-series data of the XY coordinates of the finger from which the vibration component has been removed (step ST85).
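The gating of steps ST84/ST88 might be sketched as follows; the simple mean stands in for the actual correction filter, and all names and values are hypothetical:

```python
def finger_position(samples, vehicle_moving):
    """Return the finger position to use for display control.
    While the vehicle is judged to be travelling (step ST84 'YES'),
    smooth the recent coordinate history (stand-in for step ST88);
    while stopped, pass the latest raw coordinate through so the
    operator's intent is not over-corrected."""
    if not vehicle_moving:
        return samples[-1]               # stopped: raw position, no lag
    return sum(samples) / len(samples)   # travelling: smoothed position

history = [10.0, 12.0, 8.0, 12.0, 8.0]   # shaky X coordinates (cm)
print(finger_position(history, vehicle_moving=False))  # → 8.0
print(finger_position(history, vehicle_moving=True))   # → 10.0
```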
- when executing the image enlargement processing of the display area in a certain range, the image information generation unit 305 reads out the image information of a partial area of the already generated soft keyboard from the image information storage area 322 of the memory 32 at a constant rate, interpolates it, and combines it with the surrounding image information (not enlarged) to update it as a soft keyboard image with new image information.
- the updated image information is stored in the image information storage area 322 of the memory 32 and is output to the image information transfer unit 306.
- the image information transfer unit 306 receives the updated image information and transfers the image information to the drawing circuit 31.
- in the drawing circuit 31, the drawing control unit 310 expands the transferred image information, and the drawing unit 312 draws the image on the bitmap memory unit 313 at high speed.
- the display control unit 314 reads an image drawn in the bitmap memory unit 313 and performs an enlarged display on the LCD panel 10 of the touch panel 1.
- in the finger vibration correction process, the vibration correction unit 304 operates when the vibration determination unit 303 determines that the finger vibration exceeds a predetermined amount, and performs vibration correction by applying a filter process with a predetermined cutoff frequency to the past time series of finger coordinates, removing the frequency components of the finger vibration. The image information generation unit 305 is then controlled via the main control unit 300; it performs the image enlargement process for the fixed-range display area described above and transfers the generated image information to the drawing circuit 31 via the image information transfer unit 306, obtaining the desired display on the LCD panel 10 of the touch panel 1 (step ST85).
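The cutoff-frequency filtering described above can be sketched as a first-order low-pass filter over the sampled finger coordinates. This is a minimal illustration under stated assumptions, not the patented implementation; the 0.01 s sampling interval matches Embodiment 1, while the tremor frequency and amplitude are chosen only for the demonstration:

```python
import math

def lowpass(samples, fc, dt=0.01):
    """First-order IIR low-pass filter over a finger-coordinate time series.

    samples: list of (x, y) coordinates sampled every dt seconds
    fc: cutoff frequency in Hz (e.g. 3 or 5, as in the embodiments)
    Components above fc (the tremor) are attenuated; the slow,
    intentional movement passes through.
    """
    rc = 1.0 / (2.0 * math.pi * fc)
    alpha = dt / (rc + dt)  # smoothing coefficient derived from the cutoff
    out = [samples[0]]
    for x, y in samples[1:]:
        px, py = out[-1]
        out.append((px + alpha * (x - px), py + alpha * (y - py)))
    return out

# A finger hovering at x=100 with a 20 Hz, +/-5 px tremor: after the
# 3 Hz LPF the output stays far closer to the true position.
raw = [(100 + 5 * math.sin(2 * math.pi * 20 * i * 0.01), 50.0)
       for i in range(100)]
smoothed = lowpass(raw, fc=3.0)
```

Lowering `fc` (as the embodiments do at higher vehicle speeds) strengthens the smoothing at the cost of responsiveness to deliberate movement.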
- when the touch sensor 11 of the touch panel 1 detects that an icon is touched with the finger (step ST86 “YES”), the touch coordinate position calculation unit 302 calculates the touch coordinate position and activates the operation information processing unit 307, which executes an operation process based on the key corresponding to the touch coordinates calculated by the touch coordinate position calculation unit 302 (step ST87). This process is the same as in Embodiment 1.
- as described above, according to the display input device of Embodiment 2, the control unit 3 performs vibration correction when the external sensor 2 indicates that the vehicle is traveling and skips it when the vehicle is judged to be stopped. Since there is almost no finger vibration while the vehicle is stopped, the operator's intention is reflected without forced correction, unnecessary vibration correction processing is avoided, and the response speed increases, so the enlarged display of image information in the fixed-range display area becomes faster.
- alternatively, an acceleration sensor may be mounted in the touch panel 1 to detect its vibration, or the vibration of the touch panel 1 may be detected by the acceleration sensor 23 mounted on the vehicle, with vibration correction performed when the vibration exceeds a predetermined amount.
- vibration correction optimal for the running state also becomes possible by changing the strength of vibration correction according to the vehicle's traveling speed. Since vibration of the touch panel 1 itself implies an environment in which the finger shakes easily, correction can be applied only when it is needed, providing a user-friendly display input device without adversely affecting response speed.
- FIG. 9 is a flowchart showing the operation of the display input device according to Embodiment 3 of the present invention, and shows the operation when the intensity of vibration correction is changed according to the conditions.
- the vibration correction unit 304 changes the strength of vibration correction according to the traveling speed.
- specifically, when the speed information from the vehicle speed sensor 22 of the external sensor 2, acquired by the vehicle information acquisition unit 308, satisfies 0 km/h < vehicle speed < 10 km/h, the vibration correction unit 304 either performs LPF processing with a cutoff frequency of 5 Hz or performs no vibration correction at all, emphasizing response speed; when the vehicle speed exceeds 10 km/h, it performs stronger vibration correction by LPF processing with a cutoff frequency of 3 Hz to suppress the vibration.
- the vibration correction unit 304 may also change the strength of vibration correction depending on the type of image in the fixed-range display area on the touch panel 1, regardless of the vehicle speed described above. For example, for an icon display including soft-keyboard keys, strong vibration correction by LPF processing with a 3 Hz cutoff is performed; for a cursor display, which requires rapid movement and must reflect the user's intention, response speed is emphasized and the correction is weakened by LPF processing with a 5 Hz cutoff. To follow a user's quick finger movement, the vibration correction process may be skipped altogether when, for example, movement in the same direction exceeding a predetermined amount such as 3 cm is detected.
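The speed- and image-dependent rules above can be condensed into a small selection function. The sketch below is a hypothetical rendering of that policy: the thresholds (10 km/h, 3 cm) and cutoffs (3 Hz, 5 Hz) come from the text, while the function name and image-kind labels are illustrative assumptions:

```python
def choose_cutoff_hz(vehicle_speed_kmh, image_kind, move_cm=0.0):
    """Return the LPF cutoff (Hz) to use, or None to skip correction.

    - a deliberate same-direction move of 3 cm or more bypasses correction
    - a cursor display favors response speed (weak 5 Hz correction)
    - above 10 km/h, icon/soft-keyboard display gets strong 3 Hz correction
    - otherwise the lighter 5 Hz correction is used
    """
    if move_cm >= 3.0:
        return None            # quick intentional movement: do not correct
    if image_kind == "cursor":
        return 5.0             # responsiveness matters for a cursor
    if vehicle_speed_kmh > 10.0:
        return 3.0             # rough-road speeds: stronger smoothing
    return 5.0                 # low speed: light correction

print(choose_cutoff_hz(60.0, "icon"))               # strong correction
print(choose_cutoff_hz(60.0, "cursor"))             # response-speed priority
print(choose_cutoff_hz(60.0, "icon", move_cm=4.0))  # intentional move: none
```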
- further, a touch panel capable of three-dimensional position detection, which can measure the vertical distance (Z-axis) of the finger from the panel surface, may be used in place of the touch panel 1 shown in FIG. 1, and the strength of vibration correction may be changed according to that distance. For example, making the correction stronger as the finger comes closer makes the input operation easier.
- here, the technique disclosed in Patent Document 2 is used to measure the vertical distance (Z-axis) from the panel surface, but the vertical distance of the finger from the panel surface may instead be measured by image processing using a monitoring camera.
- when the finger is touching the panel, the vibration correction unit 304 does not perform vibration correction by LPF processing; when 0 < distance ≤ 1 cm, it performs LPF processing with a cutoff frequency of 3 Hz; when 1 cm < distance ≤ 5 cm, it performs LPF processing with a cutoff frequency of 5 Hz; and when the distance exceeds 5 cm, the main control unit 300 does not judge that the finger has approached, so the displayed image of the fixed-range display area is not enlarged and, of course, no vibration correction by the vibration correction unit 304 is performed.
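The distance bands above map naturally onto a small helper. This sketch assumes, as one reading of the text, that no correction is applied at touch (distance 0) or beyond the 5 cm approach threshold; the function name is illustrative:

```python
def cutoff_for_distance(z_cm):
    """LPF cutoff (Hz) chosen from the finger's vertical (Z-axis) distance,
    or None when no vibration correction is performed."""
    if z_cm <= 0.0:
        return None   # touching: the touch coordinate is used directly
    if z_cm <= 1.0:
        return 3.0    # very close: strong correction
    if z_cm <= 5.0:
        return 5.0    # approaching: light correction
    return None       # not judged as approaching; no enlargement either
```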
- the series of processes from the normal search display through the travel determination process (steps ST91 to ST94), and the processes of enlarging the image of the fixed-range display area, detecting a touch, and executing the operation based on the key corresponding to the touched coordinates (steps ST95 to ST97), are the same as steps ST81 to ST84 and ST85 to ST87 of Embodiment 2 shown in FIG. 8, so their description is omitted here to avoid duplication.
- as described above, according to the display input device of Embodiment 3, the control unit 3 changes the strength of vibration correction by a cutoff frequency determined according to the vehicle speed detected by the vehicle speed sensor 22 or the amount of vehicle vibration, by a cutoff frequency determined according to the type of image in the fixed-range display area displayed on the touch panel 1, or by a cutoff frequency determined according to the vertical distance between the panel surface and the finger. It is therefore possible to select between vibration correction that emphasizes ease of input operation and vibration correction that emphasizes response speed, so correction reflecting the user's intention becomes possible and usability improves.
- the conditions for setting the vibration correction strength (vehicle speed, type of displayed image, vertical distance between the panel surface and the finger) can be applied independently or in combination, which further improves usability.
- for convenience of explanation, the LPF has been described as the tremor correction filter, with the correction tightened by lowering the cutoff frequency; however, the strength of tremor correction may instead be varied by changing the filter coefficients of another filter such as a Kalman filter.
- in general, when the smoothing action is strong, the response to a rapid change is slow; when the smoothing action is weak, the response is good.
- Embodiment 3 can also be realized by preparing and switching among a plurality of filters having different filter coefficients, as described above.
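The smoothing-versus-response trade-off, and the idea of switching among filters with different coefficients, can be seen with something as simple as a causal moving average of selectable window length (a stand-in here for swapping LPF or Kalman coefficients, not the patented filter itself):

```python
def moving_average(xs, window):
    """Causal moving average: a longer window smooths more strongly but
    reacts more slowly to a step change (a deliberate finger move)."""
    out = []
    for i in range(len(xs)):
        lo = max(0, i - window + 1)
        out.append(sum(xs[lo:i + 1]) / (i + 1 - lo))
    return out

step = [0.0] * 10 + [10.0] * 10         # sudden intentional move at sample 10
fast = moving_average(step, window=2)   # weak smoothing: quick response
slow = moving_average(step, window=8)   # strong smoothing: sluggish response
# One sample after the move, the weakly smoothed output has already reached
# the new position, while the strongly smoothed one still lags behind.
```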
- as described above, the display input device according to Embodiments 1 to 3 includes the touch panel 1 for displaying and inputting images, a vibration sensor (proximity sensor 12) that detects the vibration of a finger facing the touch panel 1, and the control unit 3 that, when the finger vibration detected by the vibration sensor is determined to exceed a predetermined amount, performs a predetermined amount of vibration correction and enlarges the image in the fixed-range display area displayed on the touch panel 1. It can therefore provide a display input device that makes the input operation easy even under vibration and improves operability.
- in Embodiments 1 to 3, soft-keyboard keys were illustrated as the image in the fixed-range display area, but the image may also be an icon operated for navigation input or a specific image such as a cursor. Likewise, although only a finger was illustrated as the vibration detection target, the same effect is obtained with another detection object such as a pen.
- although the acceleration sensor 23 mounted on the vehicle has been described for detecting vibration of the touch sensor 11, an acceleration sensor may also be mounted on the touch panel 1 or the control unit 3; in that case, the vehicle's vibration can be measured by the acceleration sensor 23 of the external sensor 2 and the vibration of the touch panel 1 by the acceleration sensor mounted on the touch panel 1 or the control unit 3, enabling more accurate vibration detection.
- the functions of the navigation CPU 30 of the control unit 3 shown in FIGS. 2 and 7 may be realized entirely by hardware, or at least a part thereof may be realized by software.
- for example, the data processing in which the control unit 3, upon determining that the finger vibration detected by the vibration sensor (proximity sensor 12) exceeds a predetermined amount, performs a predetermined amount of vibration correction and enlarges the image in the fixed-range display area displayed on the touch panel 1 may be realized on a computer by one or more programs, or at least a part of it may be realized by hardware.
- as described above, the display input device according to the present invention makes the input operation easy even under vibration and improves operability, and is therefore suitable for use in an in-vehicle information device of a navigation system.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
For example, known devices include a display input device that, when a finger approaches, enlarges and displays the key switch located near the finger to facilitate the selection operation (see, for example, Patent Document 1); a CRT device that detects the vertical distance and displays information at a magnification corresponding to that distance (see, for example, Patent Document 2); and an input device that senses touch pressure and realizes enlarged display with a light touch and a predetermined key operation with a strong touch (see, for example, Patent Document 3).
Further, according to the technique disclosed in Patent Document 2, when attempting to control enlargement and reduction, the finger may be too far from the touch panel surface, and the magnification may fluctuate with the finger's movement in the Z-axis direction, making control difficult. Furthermore, the technique disclosed in Patent Document 3 has the disadvantage that a user's quick finger operation may be misrecognized as a touch, so that the user's intention is not sufficiently reflected in the operation.
Embodiment 1.
FIG. 1 is a block diagram showing the configuration of a display input device according to Embodiment 1 of the present invention. As shown in FIG. 1, the display input device according to Embodiment 1 of the present invention comprises a touch-panel display device (hereinafter abbreviated as touch panel) 1, an external sensor 2, and a control unit 3.
The detection cells of the proximity sensor 12 are not limited to the infrared type described above; for example, a capacitive type may be substituted, which detects the approach of a detection object by the change in capacitance arising between two plates arranged in parallel like a capacitor. In this case, one side of the plates serves as the ground plane facing the detection object and the other side as the sensor detection plane, and the approach of the detection object, together with its coordinate position, can be detected from the change in the capacitance formed between these two poles.
The GPS sensor 21 receives radio waves from GPS satellites, generates a signal with which the control unit 3 measures latitude and longitude, and outputs it to the control unit 3. The vehicle speed sensor 22 measures the vehicle speed pulses used to determine whether the vehicle is traveling and outputs them to the control unit 3. The acceleration sensor 23 is a sensor that estimates the acceleration applied to a weight attached to a spring by measuring the amount of the weight's displacement; a three-axis acceleration sensor, for example, follows acceleration fluctuations from 0 (gravitational acceleration only) up to several hundred Hz, measures the orientation (attitude) with respect to the ground from the sum of the acceleration vectors in the X and Y directions, and outputs it to the control unit 3.
For this purpose, the control unit 3 comprises a CPU that mainly performs navigation processing and controls the touch panel 1 (hereinafter referred to as the navigation CPU 30), a drawing circuit 31, a memory 32, and a map DB (Data Base) 33.
When it is determined that the finger vibration detected by the proximity sensor 12 exceeds a predetermined amount, the navigation CPU 30 performs vibration correction; to realize the function of the control unit 3 that enlarges and displays the image in the fixed-range display area on the touch panel 1, it generates image information and controls the drawing circuit 31 according to a program stored in the memory 32. The structure of the program executed by the navigation CPU 30 in that case is shown in FIG. 2 and described in detail later.
The map DB 33 stores maps, facility information, and other data required for navigation such as route search and guidance.
As shown in FIG. 2, the navigation CPU 30 includes a main control unit 300, a proximity coordinate position calculation unit 301, a touch coordinate position calculation unit 302, a vibration determination unit 303, a vibration correction unit 304, an image information generation unit 305, an image information transfer unit 306, and an operation information processing unit 307.
The XY coordinates calculated by the proximity coordinate position calculation unit 301 are output continuously, for example, at 0.01-second intervals for 0.1 second. The vibration determination unit 303 determines that there is no vibration when the XY coordinate values hardly change during that 0.1 second and that there is vibration when they change by a predetermined amount, and controls the image information generation unit 305 via the main control unit 300. For convenience of explanation, the vibration determination unit judges vibration over a single 0.1-second window, but accuracy improves further if a continuous past history is retained and vibration is judged from several such 0.1-second windows of information.
The vibration determination unit 303 accumulates the finger XY coordinates output from the proximity coordinate position calculation unit 301 in time series for a predetermined time and obtains time-series data representing the finger vibration by passing this data through an HPF (high-pass filter) to remove the low-frequency components. When the variance of this time-series data representing the finger vibration is equal to or greater than a predetermined value, it determines that vibration is present and controls the vibration correction unit 304 based on that result.
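The determination scheme just described, high-pass filtering the accumulated coordinate series and thresholding the variance of the residue, can be sketched as follows. A first difference stands in for the HPF, and the threshold value is an illustrative assumption, not a figure from the patent:

```python
import statistics

def has_tremor(xs, threshold=1.0):
    """Report 'vibration present' for a 1-D finger-coordinate series.

    Differencing removes the slow (intentional) motion, leaving the
    high-frequency component; tremor is declared when the variance of
    that residue reaches the threshold.
    """
    highpass = [b - a for a, b in zip(xs, xs[1:])]  # crude HPF: first difference
    return statistics.pvariance(highpass) >= threshold

steady_slide = [0.5 * i for i in range(20)]    # smooth intentional motion
jitter = [(-1) ** i * 2.0 for i in range(20)]  # rapid back-and-forth tremor
```

A steady slide yields a constant difference sequence (zero variance, no tremor), while rapid back-and-forth motion yields a large variance and triggers the correction path.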
The display control unit 314 then reads out the image information held in the bitmap memory unit 313 via the local bus 315, in synchronization with the display timing of the LCD panel 10 of the touch panel 1, and displays it.
The operation of the display input device according to Embodiment 1 of the present invention shown in FIGS. 1 to 3 will now be described in detail with reference to FIG. 4 and FIGS. 5(a) and 5(b).
In this state, when the user first brings a finger close to the touch panel 1, the proximity sensor 12 detects the approach of the finger (step ST42 "YES"), and the proximity coordinate position calculation unit 301 of the navigation CPU 30 operates. The proximity coordinate position calculation unit 301 calculates the X and Y coordinates of the finger on the panel surface of the touch panel 1, and the calculated coordinate values are output to the vibration determination unit 303 (step ST43). The proximity coordinate position calculation unit 301 outputs the acquired finger coordinate values to the vibration determination unit 303, for example, every 0.01 second for 0.1 second.
As understood from FIG. 6, the LPF processing removes the finger's vibration component, so that when the finger vibration exceeds a predetermined amount, the enlarged image of the fixed range near the finger can be prevented from fluctuating with the finger vibration.
The updated image information is stored in the image information storage area 322 of the memory 32 and output to the image information transfer unit 306. The image information transfer unit 306 then transfers the updated image information to the drawing circuit 31; in the drawing circuit 31, the drawing control unit 310 expands the transferred image information and the drawing unit 312 draws it into the bitmap memory unit 313 at high speed. Finally, the display control unit 314 reads out the image drawn in the bitmap memory unit 313 and performs the desired display on the LCD panel 10 of the touch panel 1.
In response, the operation information processing unit 307 executes an operation process based on the key corresponding to the touch coordinates calculated by the touch coordinate position calculation unit 302 (step ST48). Here, the operation process based on the key corresponding to the touch coordinates means: in the case of the soft keyboard, generating image information based on the key defined for the touched icon button and outputting it to the image information transfer unit 306; and in the case of an icon button, executing the navigation processing such as destination search defined for the touched icon button, generating image information, and outputting it to the image information transfer unit 306, in each case for display on the touch panel 1 (LCD monitor 10).
In this embodiment, an application of tremor correction to enlarged display has been shown, but the form of image control is not limited to this; it may also be applied to displaying a balloon with a detailed explanation such as help upon detecting the approach of a finger, or to displaying a cursor position.
FIG. 7 is a block diagram showing, with its functions expanded, the structure of the program held by the navigation CPU 30 that is a component of the control unit 3 of the display input device according to Embodiment 2 of the present invention. The difference from Embodiment 1 shown in FIG. 2 is that a vehicle information acquisition unit 308 is added to the program structure of the navigation CPU 30 of Embodiment 1.
The operation of the navigation CPU 30 according to Embodiment 2 will now be described in detail with reference to the flowchart of FIG. 8.
After the finger coordinate calculation by the proximity coordinate position calculation unit 301, the vehicle information acquisition unit 308 acquires the vehicle speed signal measured by the vehicle speed sensor 22 of the external sensor 2 and supplies it to the main control unit 300 and the vibration correction unit 304.
The updated image information is stored in the image information storage area 322 of the memory 32 and output to the image information transfer unit 306. The image information transfer unit 306 then transfers the updated image information to the drawing circuit 31; in the drawing circuit 31, the drawing control unit 310 expands the transferred image information and the drawing unit 312 draws it into the bitmap memory unit 313 at high speed. Finally, the display control unit 314 reads out the image drawn in the bitmap memory unit 313 and performs the enlarged display on the LCD panel 10 of the touch panel 1.
Then, the image information generation unit 305 is controlled via the main control unit 300; it performs the image enlargement process for the fixed-range display area described above and transfers the generated image information to the drawing circuit 31 via the image information transfer unit 306, obtaining the desired display on the LCD panel 10 of the touch panel 1 (step ST85).
As described later, changing the strength of vibration correction according to the vehicle's traveling speed enables vibration correction optimal for the running state. Since vibration of the touch panel 1 itself implies an environment in which the finger shakes easily, correction can be performed only when vibration correction is needed, and a user-friendly display input device can be provided without adversely affecting the response speed.
FIG. 9 is a flowchart showing the operation of the display input device according to Embodiment 3 of the present invention, showing the operation when the strength of vibration correction is changed according to conditions.
Here, in the vibration correction process of step ST98, the vibration correction unit 304 changes the strength of vibration correction according to the traveling speed. Specifically, when the speed information from the vehicle speed sensor 22 of the external sensor 2, acquired by the vehicle information acquisition unit 308, satisfies 0 km/h < vehicle speed < 10 km/h, the vibration correction unit 304 either performs LPF processing with a cutoff frequency of 5 Hz or performs no vibration correction at all, emphasizing response speed; when the vehicle speed exceeds 10 km/h, it performs stronger vibration correction by LPF processing with a cutoff frequency of 3 Hz to suppress the vibration.
For example, for an icon display including soft-keyboard keys, strong vibration correction by LPF processing with a 3 Hz cutoff frequency is performed; for a cursor display, on the other hand, rapid movement is required and the user's intention must be reflected, so a measure is needed to weaken the vibration correction by LPF processing with a 5 Hz cutoff frequency, emphasizing response speed. To respond to a user's quick finger movement, it is conceivable, for example, not to perform the vibration correction process when movement in the same direction exceeding a predetermined amount of 3 cm or more is detected.
For example, by setting the vibration correction strength to increase as the distance decreases, the input operation becomes easier. Here, the technique disclosed in Patent Document 2 is used to measure the vertical distance (Z-axis) from the panel surface, but the vertical distance of the finger from the panel surface may instead be measured by image processing using a monitoring camera.
The conditions for setting the vibration correction strength described above (vehicle speed, type of displayed image, vertical distance between the panel surface and the finger) can be set independently or in combination, which further improves usability. For convenience of explanation, the LPF was described as the tremor correction filter and the correction was tightened by lowering the cutoff frequency, but the strength of tremor correction may instead be varied by changing the filter coefficients of another filter such as a Kalman filter. In general, when the smoothing action is strong, the response to a rapid change is slow; when the smoothing action is weak, the response is good. Embodiment 3 can also be realized by preparing and switching among a plurality of filters having different filter coefficients, as here.
Further, although the acceleration sensor 23 mounted on the vehicle was used in the description for detecting vibration of the touch sensor 11, an acceleration sensor may also be mounted on the touch panel 1 or the control unit 3; in that case, the vehicle's vibration can be measured by the acceleration sensor 23 of the external sensor 2 and the vibration of the touch panel 1 by the acceleration sensor mounted on the touch panel 1 or the control unit 3, enabling more accurate vibration detection.
For example, the data processing in which the control unit 3, upon determining that the finger vibration detected by the vibration sensor (proximity sensor 12) exceeds a predetermined amount, performs a predetermined amount of vibration correction and enlarges the image in the fixed-range display area displayed on the touch panel 1 may be realized on a computer by one or more programs, or at least a part of it may be realized by hardware.
Claims (11)
- A display input device comprising: a touch panel for displaying and inputting an image; a vibration sensor that detects the vibration of a detection object facing the touch panel based on changes in the position of the detection object; a vibration correction unit that performs vibration correction to remove a vibration component from the position change of the detection object when the vibration detected by the vibration sensor exceeds a predetermined amount; and a control unit that performs deformation processing on an image in a fixed-range display area displayed on the touch panel, based on the vibration-corrected position of the detection object.
- The display input device according to claim 1, wherein the vibration sensor is a proximity sensor that detects the vibration of the detection object facing the touch panel without contact.
- The display input device according to claim 1, wherein the control unit performs the vibration correction by applying a filter process with a predetermined cutoff frequency to the vibration waveform of the detection object.
- The display input device according to claim 1, wherein the control unit performs the vibration correction by applying a predetermined filter process to the vibration waveform of the detection object.
- A display input device comprising: a touch panel for displaying and inputting an image; a vibration sensor that detects the vibration of a detection object facing the touch panel based on changes in the position of the detection object; an external sensor that detects the state of a vehicle; a vibration correction unit that performs vibration correction to remove a vibration component from the position change of the detection object when the external sensor determines that the vehicle is traveling; and a control unit that performs deformation processing on an image in a fixed-range display area displayed on the touch panel, based on the vibration-corrected position of the detection object.
- The display input device according to claim 5, wherein the external sensor includes a vehicle speed sensor that detects the speed of the vehicle and an acceleration sensor that detects vibration while the vehicle is traveling.
- The display input device according to claim 6, wherein the control unit changes the strength of vibration correction by a filter characteristic determined according to the vehicle speed detected by the vehicle speed sensor or the vibration detected by the acceleration sensor.
- The display input device according to claim 3, wherein the control unit changes the strength of vibration correction by a filter characteristic predetermined according to the type of image in the fixed-range display area displayed on the touch panel.
- The display input device according to claim 4, wherein the control unit changes the strength of vibration correction by a filter characteristic predetermined according to the type of image in the fixed-range display area displayed on the touch panel.
- The display input device according to claim 3, wherein the control unit detects, by means of the vibration sensor, the vertical distance of the detection object facing the touch panel and changes the strength of vibration correction by a filter characteristic determined according to that vertical distance.
- The display input device according to claim 4, wherein the control unit detects, by means of the vibration sensor, the vertical distance of the detection object facing the touch panel and changes the strength of vibration correction by a filter characteristic determined according to that vertical distance.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010541214A JP5052677B2 (ja) | 2008-12-04 | 2009-11-26 | 表示入力装置 |
DE112009002612.5T DE112009002612B4 (de) | 2008-12-04 | 2009-11-26 | Anzeige-Eingabevorrichtung, Navigationssystem mit einer Anzeige-Eingabevorrichtung sowie Fahrzeuginformationssystem mit einer Anzeige-Eingabevorrichtung |
US13/059,937 US9069453B2 (en) | 2008-12-04 | 2009-11-26 | Display input device |
CN2009801490359A CN102239068B (zh) | 2008-12-04 | 2009-11-26 | 显示输入装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008309805 | 2008-12-04 | ||
JP2008-309805 | 2008-12-04 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010064389A1 true WO2010064389A1 (ja) | 2010-06-10 |
Family
ID=42233048
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/006393 WO2010064389A1 (ja) | 2008-12-04 | 2009-11-26 | 表示入力装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9069453B2 (ja) |
JP (1) | JP5052677B2 (ja) |
CN (1) | CN102239068B (ja) |
DE (1) | DE112009002612B4 (ja) |
WO (1) | WO2010064389A1 (ja) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011065589A (ja) * | 2009-09-18 | 2011-03-31 | Fujitsu Toshiba Mobile Communications Ltd | ユーザインタフェース装置 |
US20120105613A1 (en) * | 2010-11-01 | 2012-05-03 | Robert Bosch Gmbh | Robust video-based handwriting and gesture recognition for in-car applications |
WO2012080094A1 (de) * | 2010-12-14 | 2012-06-21 | Preh Gmbh | Touchpad mit beschleunigungsentzerrung |
JP2012181703A (ja) * | 2011-03-01 | 2012-09-20 | Fujitsu Ten Ltd | 表示装置 |
JP2013015461A (ja) * | 2011-07-05 | 2013-01-24 | Clarion Co Ltd | ナビゲーション装置とその入力受付方法 |
JP5166630B1 (ja) * | 2012-08-23 | 2013-03-21 | 春佳 西守 | コンピュータプログラム |
JP2013539113A (ja) * | 2010-08-24 | 2013-10-17 | クアルコム,インコーポレイテッド | 電子デバイスディスプレイの上方の空気中で物体を移動させることによって電子デバイスアプリケーションと相互作用するための方法および装置 |
JP2014502070A (ja) * | 2010-10-20 | 2014-01-23 | ヨタ デバイセズ アイピーアール リミテッド | 入力手段を有する携帯機器 |
JP2015174648A (ja) * | 2014-03-18 | 2015-10-05 | 株式会社日本自動車部品総合研究所 | 操作入力装置、および操作入力方法 |
JP2019008634A (ja) * | 2017-06-27 | 2019-01-17 | 三菱電機株式会社 | タッチパネル制御装置、車載機器、およびタッチパネル制御プログラム |
JP2019012490A (ja) * | 2017-07-03 | 2019-01-24 | 株式会社ミツトヨ | 端末装置およびプログラム |
WO2021166801A1 (ja) * | 2020-02-20 | 2021-08-26 | ソニーグループ株式会社 | 情報処理装置、情報処理方法、及び、プログラム |
Families Citing this family (46)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB201020282D0 (en) * | 2010-11-30 | 2011-01-12 | Dev Ltd | An improved input device and associated method |
US8797284B2 (en) * | 2011-01-05 | 2014-08-05 | Motorola Mobility Llc | User interface and method for locating an interactive element associated with a touch sensitive interface |
US8319746B1 (en) * | 2011-07-22 | 2012-11-27 | Google Inc. | Systems and methods for removing electrical noise from a touchpad signal |
US8976135B2 (en) * | 2011-10-13 | 2015-03-10 | Autodesk, Inc. | Proximity-aware multi-touch tabletop |
US10684768B2 (en) * | 2011-10-14 | 2020-06-16 | Autodesk, Inc. | Enhanced target selection for a touch-based input enabled user interface |
CN108762577A (zh) | 2011-10-18 | 2018-11-06 | 卡内基梅隆大学 | 用于分类触敏表面上的触摸事件的方法和设备 |
US8791913B2 (en) * | 2012-01-26 | 2014-07-29 | Honeywell International Inc. | Adaptive gesture recognition system and method for unstable work environments |
US9182233B2 (en) * | 2012-05-17 | 2015-11-10 | Robert Bosch Gmbh | System and method for autocompletion and alignment of user gestures |
CN103491462B (zh) * | 2012-06-12 | 2016-08-03 | 深圳富泰宏精密工业有限公司 | 音量调节结构及应用该音量调节结构的电子装置 |
JP6202345B2 (ja) * | 2012-11-02 | 2017-09-27 | ソニー株式会社 | 表示制御装置、表示制御方法、およびプログラム |
KR20140066378A (ko) * | 2012-11-23 | 2014-06-02 | 삼성전자주식회사 | 디스플레이장치와 그 제어방법 |
US20140172557A1 (en) * | 2012-12-19 | 2014-06-19 | FootTrafficeker LLC | Interactive display system |
JP2014123293A (ja) * | 2012-12-21 | 2014-07-03 | Panasonic Corp | 振動制御装置、電子機器、および振動制御方法 |
US9202351B2 (en) * | 2013-03-11 | 2015-12-01 | Immersion Corporation | Systems and methods for haptics in vibrating environments and devices |
US20140267082A1 (en) * | 2013-03-15 | 2014-09-18 | Lenovo (Singapore) Pte, Ltd. | Enlarging touch screen portions |
KR20140114766A (ko) | 2013-03-19 | 2014-09-29 | 퀵소 코 | 터치 입력을 감지하기 위한 방법 및 장치 |
CN103246441B (zh) * | 2013-03-25 | 2016-02-10 | 东莞宇龙通信科技有限公司 | 终端设备的屏幕显示方法及终端设备 |
US9013452B2 (en) | 2013-03-25 | 2015-04-21 | Qeexo, Co. | Method and system for activating different interactive functions using different types of finger contacts |
US9612689B2 (en) * | 2015-02-02 | 2017-04-04 | Qeexo, Co. | Method and apparatus for classifying a touch event on a touchscreen as related to one of multiple function generating interaction layers and activating a function in the selected interaction layer |
JP2014211738A (ja) * | 2013-04-18 | 2014-11-13 | 株式会社デンソー | 車載機器の制御装置、車載機器 |
DE102013215742A1 (de) * | 2013-08-09 | 2015-02-12 | Ford Global Technologies, Llc | Verfahren sowie Bedienvorrichtung zum Bedienen eines elektronischen Gerätes über einen Touchscreen |
DE102014215049A1 (de) | 2013-08-09 | 2015-02-12 | Ford Global Technologies, Llc | Verfahren sowie Bedienvorrichtung zum Bedienen eines elektronischen Gerätes über einen Touchscreen |
KR20240033121A (ko) * | 2013-11-29 | 2024-03-12 | 가부시키가이샤 한도오따이 에네루기 켄큐쇼 | 데이터 처리 장치 및 이의 구동 방법 |
CN104679313A (zh) * | 2013-11-30 | 2015-06-03 | 深圳富泰宏精密工业有限公司 | 触控面板初始化系统、方法及触控装置 |
US9329715B2 (en) | 2014-09-11 | 2016-05-03 | Qeexo, Co. | Method and apparatus for differentiating touch screen users based on touch event analysis |
US11619983B2 (en) | 2014-09-15 | 2023-04-04 | Qeexo, Co. | Method and apparatus for resolving touch screen ambiguities |
US10606417B2 (en) | 2014-09-24 | 2020-03-31 | Qeexo, Co. | Method for improving accuracy of touch screen event analysis by use of spatiotemporal touch patterns |
US10282024B2 (en) | 2014-09-25 | 2019-05-07 | Qeexo, Co. | Classifying contacts or associations with a touch sensitive device |
CN105988568B (zh) * | 2015-02-12 | 2020-07-24 | 北京三星通信技术研究有限公司 | 获取笔记信息的方法和装置 |
JP6507918B2 (ja) * | 2015-08-04 | 2019-05-08 | アイシン精機株式会社 | 操作入力検知装置 |
US10642404B2 (en) | 2015-08-24 | 2020-05-05 | Qeexo, Co. | Touch sensitive device with multi-sensor stream synchronized data |
US20170123525A1 (en) * | 2015-10-29 | 2017-05-04 | Synaptics Incorporated | System and method for generating reliable electrical connections |
JP6447530B2 (ja) * | 2016-01-29 | 2019-01-09 | オムロン株式会社 | 信号処理装置、信号処理装置の制御方法、制御プログラム、および記録媒体 |
GB2551520B (en) * | 2016-06-20 | 2018-11-21 | Ge Aviat Systems Ltd | Correction of vibration-induced error for touch screen display in an aircraft |
EP3511801B1 (en) * | 2016-09-08 | 2022-06-01 | Sony Group Corporation | Output device, output control method, and program |
JP6801347B2 (ja) * | 2016-09-30 | 2020-12-16 | ブラザー工業株式会社 | 表示入力装置及び記憶媒体 |
CN107992200B (zh) * | 2017-12-21 | 2021-03-26 | 爱驰汽车有限公司 | 车载显示屏的画面补偿方法、装置及电子设备 |
US20190318711A1 (en) * | 2018-04-16 | 2019-10-17 | Bell Helicopter Textron Inc. | Electronically Damped Touch Screen Display |
US11009989B2 (en) | 2018-08-21 | 2021-05-18 | Qeexo, Co. | Recognizing and rejecting unintentional touch events associated with a touch sensitive device |
JP6731022B2 (ja) * | 2018-09-14 | 2020-07-29 | 本田技研工業株式会社 | 駐車支援装置、車両及び駐車支援方法 |
JP7155805B2 (ja) * | 2018-09-21 | 2022-10-19 | コニカミノルタ株式会社 | 入力装置および画像形成装置 |
US10942603B2 (en) | 2019-05-06 | 2021-03-09 | Qeexo, Co. | Managing activity states of an application processor in relation to touch or hover interactions with a touch sensitive device |
US11231815B2 (en) | 2019-06-28 | 2022-01-25 | Qeexo, Co. | Detecting object proximity using touch sensitive surface sensing and ultrasonic sensing |
KR102297202B1 (ko) * | 2019-07-23 | 2021-09-01 | 엘지전자 주식회사 | 차량용 표시 장치 |
CN110767144B (zh) * | 2019-10-12 | 2022-08-16 | Oppo广东移动通信有限公司 | 移动终端屏幕显示防抖方法、装置、移动终端及存储介质 |
US11592423B2 (en) | 2020-01-29 | 2023-02-28 | Qeexo, Co. | Adaptive ultrasonic sensing techniques and systems to mitigate interference |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002108564A (ja) * | 2000-10-03 | 2002-04-12 | Alpine Electronics Inc | タッチパネル入力装置 |
JP2006031499A (ja) * | 2004-07-20 | 2006-02-02 | Denso Corp | 情報入力表示装置 |
JP2006520024A (ja) * | 2002-11-29 | 2006-08-31 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | 接触エリアの移動させられた表現を用いたユーザインタフェース |
JP2007190947A (ja) * | 2006-01-17 | 2007-08-02 | Xanavi Informatics Corp | 車載情報端末 |
JP2008265544A (ja) * | 2007-04-20 | 2008-11-06 | Airesu Denshi Kogyo Kk | 車両搭載用のタッチパネル |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2835167B2 (ja) * | 1990-09-20 | 1998-12-14 | 株式会社東芝 | Crt表示装置 |
US5543591A (en) * | 1992-06-08 | 1996-08-06 | Synaptics, Incorporated | Object position detector with edge motion feature and gesture recognition |
JPH10171600A (ja) * | 1996-12-06 | 1998-06-26 | Brother Ind Ltd | 入力装置 |
US7256770B2 (en) * | 1998-09-14 | 2007-08-14 | Microsoft Corporation | Method for displaying information responsive to sensing a physical presence proximate to a computer input device |
DE10000218A1 (de) * | 2000-01-05 | 2001-07-12 | Bosch Gmbh Robert | Vorrichtung für manuelle Eingaben in einem Fahrzeug und Verfahren zur Verarbeitung von manuellen Eingaben in einem Fahrzeug |
US7176899B2 (en) * | 2002-01-31 | 2007-02-13 | Kabushiki Kaisha Tokai-Rika-Denki-Seisakusho | Display screen operation device |
DE102006037156A1 (de) | 2006-03-22 | 2007-09-27 | Volkswagen Ag | Interaktive Bedienvorrichtung und Verfahren zum Betreiben der interaktiven Bedienvorrichtung |
US20080055259A1 (en) * | 2006-08-31 | 2008-03-06 | Honeywell International, Inc. | Method for dynamically adapting button size on touch screens to compensate for hand tremor |
EP1980451A1 (en) * | 2007-04-10 | 2008-10-15 | IEE INTERNATIONAL ELECTRONICS & ENGINEERING S.A. | Automotive vehicle with system for detecting the proximity of an accupant |
US8681093B2 (en) * | 2008-02-11 | 2014-03-25 | Apple Inc. | Motion compensation for screens |
-
2009
- 2009-11-26 CN CN2009801490359A patent/CN102239068B/zh active Active
- 2009-11-26 WO PCT/JP2009/006393 patent/WO2010064389A1/ja active Application Filing
- 2009-11-26 DE DE112009002612.5T patent/DE112009002612B4/de active Active
- 2009-11-26 US US13/059,937 patent/US9069453B2/en active Active
- 2009-11-26 JP JP2010541214A patent/JP5052677B2/ja active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002108564A (ja) * | 2000-10-03 | 2002-04-12 | Alpine Electronics Inc | タッチパネル入力装置 |
JP2006520024A (ja) * | 2002-11-29 | 2006-08-31 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | 接触エリアの移動させられた表現を用いたユーザインタフェース |
JP2006031499A (ja) * | 2004-07-20 | 2006-02-02 | Denso Corp | 情報入力表示装置 |
JP2007190947A (ja) * | 2006-01-17 | 2007-08-02 | Xanavi Informatics Corp | 車載情報端末 |
JP2008265544A (ja) * | 2007-04-20 | 2008-11-06 | Airesu Denshi Kogyo Kk | 車両搭載用のタッチパネル |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011065589A (ja) * | 2009-09-18 | 2011-03-31 | Fujitsu Toshiba Mobile Communications Ltd | ユーザインタフェース装置 |
JP2013539113A (ja) * | 2010-08-24 | 2013-10-17 | クアルコム,インコーポレイテッド | 電子デバイスディスプレイの上方の空気中で物体を移動させることによって電子デバイスアプリケーションと相互作用するための方法および装置 |
JP2014502070A (ja) * | 2010-10-20 | 2014-01-23 | ヨタ デバイセズ アイピーアール リミテッド | 入力手段を有する携帯機器 |
US8817087B2 (en) * | 2010-11-01 | 2014-08-26 | Robert Bosch Gmbh | Robust video-based handwriting and gesture recognition for in-car applications |
US20120105613A1 (en) * | 2010-11-01 | 2012-05-03 | Robert Bosch Gmbh | Robust video-based handwriting and gesture recognition for in-car applications |
WO2012080094A1 (de) * | 2010-12-14 | 2012-06-21 | Preh Gmbh | Touchpad mit beschleunigungsentzerrung |
JP2012181703A (ja) * | 2011-03-01 | 2012-09-20 | Fujitsu Ten Ltd | 表示装置 |
US8947381B2 (en) | 2011-03-01 | 2015-02-03 | Fujitsu Ten Limited | Display device |
JP2013015461A (ja) * | 2011-07-05 | 2013-01-24 | Clarion Co Ltd | ナビゲーション装置とその入力受付方法 |
JP5166630B1 (ja) * | 2012-08-23 | 2013-03-21 | 春佳 西守 | コンピュータプログラム |
JP2015174648A (ja) * | 2014-03-18 | 2015-10-05 | 株式会社日本自動車部品総合研究所 | 操作入力装置、および操作入力方法 |
US10088923B2 (en) | 2014-03-18 | 2018-10-02 | Denso Corporation | Operation input device and operation input method |
JP2019008634A (ja) * | 2017-06-27 | 2019-01-17 | 三菱電機株式会社 | タッチパネル制御装置、車載機器、およびタッチパネル制御プログラム |
JP2019012490A (ja) * | 2017-07-03 | 2019-01-24 | 株式会社ミツトヨ | 端末装置およびプログラム |
JP6991754B2 (ja) | 2017-07-03 | 2022-01-13 | 株式会社ミツトヨ | 端末装置およびプログラム |
WO2021166801A1 (ja) * | 2020-02-20 | 2021-08-26 | ソニーグループ株式会社 | 情報処理装置、情報処理方法、及び、プログラム |
Also Published As
Publication number | Publication date |
---|---|
JPWO2010064389A1 (ja) | 2012-05-10 |
DE112009002612T5 (de) | 2012-08-02 |
CN102239068B (zh) | 2013-03-13 |
CN102239068A (zh) | 2011-11-09 |
US20110141066A1 (en) | 2011-06-16 |
DE112009002612B4 (de) | 2015-07-23 |
JP5052677B2 (ja) | 2012-10-17 |
US9069453B2 (en) | 2015-06-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5052677B2 (ja) | 表示入力装置 | |
JP5349493B2 (ja) | 表示入力装置および車載情報装置 | |
JP5355683B2 (ja) | 表示入力装置および車載情報機器 | |
JP5430782B2 (ja) | 表示入力装置および車載情報機器 | |
US8963849B2 (en) | Display input device | |
JP4555538B2 (ja) | 操作コマンド処理プログラムおよびナビゲーションシステム | |
US20100318573A1 (en) | Method and apparatus for navigation system for selecting icons and application area by hand drawing on map image | |
JP5933468B2 (ja) | 情報表示制御装置、情報表示装置および情報表示制御方法 | |
JP2013117770A (ja) | 画像表示装置およびナビゲーション装置 | |
JP2000029382A (ja) | 地図表示装置、地図表示方法および地図表示プログラムの記録媒体 | |
JP5423279B2 (ja) | ユーザインタフェース装置 | |
JP5984718B2 (ja) | 車載情報表示制御装置、車載情報表示装置および車載表示装置の情報表示制御方法 | |
JP5889230B2 (ja) | 情報表示制御装置、情報表示装置および情報表示制御方法 | |
JP5950851B2 (ja) | 情報表示制御装置、情報表示装置および情報表示制御方法 | |
JP2014191818A (ja) | 操作支援システム、操作支援方法及びコンピュータプログラム | |
JP2011191651A (ja) | 地図表示装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980149035.9 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09830160 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010541214 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13059937 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1120090026125 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 09830160 Country of ref document: EP Kind code of ref document: A1 |