CN109324746A - Gesture identification method for touch screen - Google Patents
- Publication number
- CN109324746A (application CN201810869603.3A)
- Authority
- CN
- China
- Prior art keywords
- gesture
- sliding trace
- gesture sliding
- identification method
- steering wheel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0414—Digitisers characterised by the transducing means using force sensing means to determine a position
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0487—Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses a gesture recognition method for a touch screen, comprising: detecting a first gesture sliding trajectory generated on the touch screen; detecting a second gesture sliding trajectory within a preset time window after the first gesture sliding trajectory is generated; determining the trajectory trend of the second gesture sliding trajectory; and, in response to the trajectory trend of the second gesture sliding trajectory indicating a return-stroke intention, removing the second gesture sliding trajectory from a to-be-parsed queue in which the first gesture sliding trajectory is stored. By identifying the trajectory trend of a second gesture sliding trajectory that follows the first within the preset time window, the invention avoids interference with gesture recognition from second trajectories whose trend indicates a return-stroke intention, thereby reducing the probability of return-stroke errors in gesture operations and improving the accuracy of gesture recognition.
Description
Technical field
The present invention relates to gesture recognition technologies, and in particular to a gesture recognition method for a touch screen, suitable for an automobile.
Background art
The center console is a common control unit in existing automobiles. With the continuous development of technology, more and more automobiles adopt a center console with a touch screen, so that a front-row occupant (the driver or a passenger in the front passenger seat) can perform gesture sliding operations on the touch screen to trigger corresponding operating instructions.

For the driver, after performing a gesture sliding operation on the console's touch screen, it is natural, out of concern for driving safety, to quickly return the operating hand to the steering wheel. If the hand contacts the touch screen again during this return stroke, an erroneous sliding operation that follows the normal gesture sliding operation may be produced on the touch screen; such an accidental sliding operation may be called a return-stroke error operation.

Because the trajectory of a return-stroke error operation can likewise be recognized by the processor, the processor may generate, from the return-stroke error operation, an operating instruction opposite in effect to the normal gesture sliding operation; alternatively, the processor may generate an erroneous operating instruction from the combination of the normal gesture trajectory and the return-stroke error trajectory.

Thus, return-stroke error operations impair the accuracy of gesture recognition.
Summary of the invention
In one embodiment, a gesture recognition method for a touch screen is provided, the gesture recognition method comprising:

detecting a first gesture sliding trajectory generated on the touch screen;

detecting a second gesture sliding trajectory within a preset time window after the first gesture sliding trajectory is generated;

determining the trajectory trend of the second gesture sliding trajectory;

in response to the trajectory trend of the second gesture sliding trajectory indicating a return-stroke intention, removing the second gesture sliding trajectory from a to-be-parsed queue in which the first gesture sliding trajectory is stored.
Optionally, the touch screen is arranged to the side of an automobile's steering wheel, and determining the trajectory trend of the second gesture sliding trajectory comprises: determining the vector orientation of the second gesture sliding trajectory relative to the first gesture sliding trajectory and to the steering wheel; and determining, based on the vector orientation, the directional property of the trajectory trend of the second gesture sliding trajectory, wherein if the trajectory trend of the second gesture sliding trajectory has the directional property of pointing from the end position of the first gesture sliding trajectory toward the steering wheel, it is determined that the trajectory trend of the second gesture sliding trajectory indicates a return-stroke intention.
Optionally, determining the vector orientation of the second gesture sliding trajectory relative to the first gesture sliding trajectory comprises: creating an association extension zone at the end position of the first gesture sliding trajectory; and determining the overlap relationship between the start position of the second gesture sliding trajectory and the association extension zone, wherein if the start position of the second gesture sliding trajectory falls within the association extension zone, the directional property determined based on the vector orientation indicates origination from the end position of the first gesture sliding trajectory.
Optionally, determining the vector orientation of the second gesture sliding trajectory relative to the steering wheel comprises: creating a return-stroke envelope zone radiating from the end position of the first gesture sliding trajectory, or from the association extension zone, toward the orientation of the steering wheel; and determining the overlap relationship between the second gesture sliding trajectory and the return-stroke envelope zone, wherein if the second gesture sliding trajectory falls within the return-stroke envelope zone, the directional property determined based on the vector orientation indicates pointing toward the steering wheel.
Optionally, the gesture recognition method further comprises: detecting the travel speed of the automobile; and adjusting the radiation angle of the return-stroke envelope zone according to the travel speed.

Optionally, the gesture recognition method characterizes the orientation of the steering wheel by the thumb-operation grip area of the steering wheel.

Optionally, the gesture recognition method further comprises: detecting the rotation angle of the steering wheel; and updating the orientation of the steering wheel according to the detected variation of the rotation angle.
Optionally, the screening of the second gesture sliding trajectory further includes comparing the press detection values and/or stroke detection values of the first gesture sliding trajectory and the second gesture sliding trajectory.
Another embodiment provides a non-transitory computer-readable storage medium storing instructions which, when executed by a processor, cause the processor to perform the steps in the gesture recognition method described above.
In yet another embodiment, an automobile is provided, including a touch screen installed at a center console and a steering wheel located to the side of the center console; the automobile further includes the non-transitory computer-readable storage medium described above and a processor electrically connected with the touch screen and the non-transitory computer-readable storage medium.
Based on the above embodiments, by identifying the trajectory trend of a second gesture sliding trajectory that follows the first gesture sliding trajectory within a preset time window, interference with gesture recognition from second gesture sliding trajectories whose trend indicates a return-stroke intention can be avoided, which reduces the probability of return-stroke errors in gesture operations and thus improves the accuracy of gesture recognition.
Brief description of the drawings
The following drawings are merely illustrative descriptions and explanations of the present invention and do not limit its scope.

Fig. 1 is a flow diagram of a gesture recognition method for a touch screen in one embodiment;

Fig. 2 is a schematic diagram of an example based on the gesture recognition method shown in Fig. 1;

Fig. 3 is a schematic diagram of another example based on the gesture recognition method shown in Fig. 1;

Fig. 4 is a schematic diagram of yet another example based on the gesture recognition method shown in Fig. 1;

Fig. 5 is a schematic diagram of the directional-property judgment mechanism in the example shown in Fig. 4;

Fig. 6 is a schematic diagram of the system framework of an automobile in one embodiment.
Detailed description of embodiments
To give a clearer understanding of the technical features, objects, and effects of the invention, specific embodiments of the present invention are now described with reference to the drawings, in which identical reference labels indicate identical parts.

Herein, "schematic" means "serving as an example, instance, or illustration"; no drawing or embodiment described herein as "schematic" should be construed as a preferred or more advantageous technical solution.

For conciseness, the drawings only schematically show the parts relevant to the invention and do not represent the actual structure of a product. In addition, to keep the drawings concise and easy to understand, where several components in a figure share the same structure or function, only one of them is schematically depicted or labeled.

Herein, "first", "second", and so on are used only to distinguish items from one another, and do not indicate degree of importance, order, or mutual prerequisites.
Fig. 1 is a flow diagram of a gesture recognition method for a touch screen in one embodiment. Referring to Fig. 1, in one embodiment, the gesture recognition method for a touch screen includes:

S101: detecting a first gesture sliding trajectory generated on the touch screen;

S102: detecting a second gesture sliding trajectory within a preset time window after the first gesture sliding trajectory is generated;

S103: determining the trajectory trend of the second gesture sliding trajectory;

S104: in response to the trajectory trend of the second gesture sliding trajectory indicating a return-stroke intention, removing the second gesture sliding trajectory from the to-be-parsed queue in which the first gesture sliding trajectory is stored.
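The S101 to S104 flow can be sketched in Python as follows. This is an illustrative reconstruction, not code from the patent; in particular, the simple geometric heuristic standing in for S103 (the second trajectory starts near the end of the first and trends toward the steering wheel side of the screen, assumed here to lie toward smaller x) and the 50-unit tolerance are assumptions, since the patent's actual trend-determination mechanism is detailed later with Figs. 4 and 5.

```python
from typing import List, Tuple

Point = Tuple[float, float]  # an (x, y) sample of a sliding trajectory

def indicates_return_stroke(first: List[Point], second: List[Point]) -> bool:
    # Simplified stand-in for S103: treat the second trajectory as a
    # return stroke when it starts near the end of the first one and
    # moves toward the steering wheel side of the screen (assumed here
    # to lie toward smaller x). The 50-unit tolerance is illustrative.
    start_near_end = (abs(second[0][0] - first[-1][0]) < 50
                      and abs(second[0][1] - first[-1][1]) < 50)
    moves_toward_wheel = second[-1][0] < second[0][0]
    return start_near_end and moves_toward_wheel

def screen_second_trace(first: List[Point], second: List[Point],
                        queue: List[List[Point]]) -> List[List[Point]]:
    queue.append(first)    # S101: first sliding trajectory detected, queued
    queue.append(second)   # S102: second trajectory inside the time window
    if indicates_return_stroke(first, second):   # S103: trajectory trend
        queue.remove(second)                     # S104: drop the stray trace
    return queue
```

With a first trajectory ending near the steering wheel side, a faint backward flick is dropped from the queue, while an ordinary second gesture elsewhere on the screen is kept for parsing.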
In the above flow, the first gesture sliding trajectory can be regarded as the trajectory of a normal operating gesture; correspondingly, the second gesture sliding trajectory may also be the trajectory of a normal operating gesture, but it may instead be the trajectory of an erroneous operating gesture produced incidentally during the operator's return stroke. Therefore, by identifying the trajectory trend of the second gesture sliding trajectory, second gesture sliding trajectories of uncertain nature can be screened, avoiding interference with gesture recognition from second trajectories whose trend indicates a return-stroke intention; this reduces the probability of return-stroke errors in gesture operations and thus improves the accuracy of gesture recognition.
Fig. 2 is a schematic diagram of an example based on the gesture recognition method shown in Fig. 1. As shown in Fig. 2, the example includes:

S201: detecting a first gesture sliding trajectory generated on the touch screen;

S202: storing the detected first gesture sliding trajectory in the to-be-parsed queue;

S203: detecting a second gesture sliding trajectory within a preset time window after the first gesture sliding trajectory is generated;

S204: storing the detected second gesture sliding trajectory in the to-be-parsed queue;

S205: determining whether the trajectory trend of the second gesture sliding trajectory has a return-stroke intention;

S206: in response to the trajectory trend of the second gesture sliding trajectory indicating a return-stroke intention, removing the second gesture sliding trajectory from the to-be-parsed queue and triggering the parsing of the first gesture sliding trajectory in the to-be-parsed queue;

S207: in response to the trajectory trend of the second gesture sliding trajectory indicating no return-stroke intention, triggering the parsing of the first gesture sliding trajectory and the second gesture sliding trajectory in the to-be-parsed queue.

In S207, the first and second gesture sliding trajectories may be parsed separately in sequence, or they may be parsed together as one combined gesture.
In actual use, a second gesture sliding trajectory whose trend has a return-stroke intention is likely a misoperation performed unconsciously by the operator (for example, the driver), and the press detection value and/or stroke detection value of such a misoperation is often much smaller than that of the first gesture sliding trajectory belonging to a normal gesture operation: for example, less than one half, one third, one fifth, or one tenth of the first trajectory's press detection value and/or stroke detection value, or even smaller. Accordingly, the screening of the second gesture sliding trajectory may further include a step of comparing the press detection values and/or stroke detection values of the first and second gesture sliding trajectories.
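The press/stroke comparison can be sketched as below. The one-half ratio is only one of the illustrative thresholds the description mentions, not a value the patent fixes:

```python
def likely_inadvertent(first_press: float, first_stroke: float,
                       second_press: float, second_stroke: float,
                       ratio: float = 0.5) -> bool:
    # An unconscious return-stroke contact tends to show far smaller
    # press and/or stroke values than the deliberate first gesture.
    # The 0.5 ratio is an illustrative choice; the text mentions 1/2,
    # 1/3, 1/5, 1/10 or even less as possibilities.
    return (second_press < ratio * first_press
            or second_stroke < ratio * first_stroke)
```

A trajectory flagged by this cheap comparison would then proceed to the directional screening; one that is not flagged can skip it and be parsed as a normal gesture.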
Fig. 3 is a schematic diagram of another example based on the gesture recognition method shown in Fig. 1. As shown in Fig. 3, the example includes:

S301: detecting a first gesture sliding trajectory generated on the touch screen;

S302: storing the detected first gesture sliding trajectory in the to-be-parsed queue;

S303: detecting a second gesture sliding trajectory within a preset time window after the first gesture sliding trajectory is generated;

S304: storing the detected second gesture sliding trajectory in the to-be-parsed queue;

S305: determining whether the press detection value and/or stroke detection value of the second gesture sliding trajectory is smaller than that of the first gesture sliding trajectory;

S306: in response to the comparison result that the press detection value and/or stroke detection value of the second gesture sliding trajectory is smaller than that of the first gesture sliding trajectory, determining whether the trajectory trend of the second gesture sliding trajectory has a return-stroke intention;

S307: in response to the trajectory trend of the second gesture sliding trajectory indicating a return-stroke intention, removing the second gesture sliding trajectory from the to-be-parsed queue and triggering the parsing of the first gesture sliding trajectory in the to-be-parsed queue;

S308: in response to the comparison result that the press detection value and/or stroke detection value of the second gesture sliding trajectory is not smaller than that of the first gesture sliding trajectory, or in response to the trajectory trend of the second gesture sliding trajectory indicating no return-stroke intention, triggering the parsing of the first and second gesture sliding trajectories in the to-be-parsed queue.

Through the comparison of press detection values and/or stroke detection values, the above example can skip subsequent processing when the second gesture sliding trajectory has a high probability of belonging to a normal operating gesture, and, when the second gesture sliding trajectory has a high probability of belonging to a misoperation gesture, can screen by the directivity of the second gesture sliding trajectory whether its trajectory trend has a return-stroke intention.
For example, considering that the operating hand of a driver habitually returns to a steering-wheel-holding posture after completing a normal gesture operation, a return-stroke intention manifests itself as a movement whose start position is the end position of the normal gesture and whose target is the orientation of the steering wheel.
Fig. 4 is a schematic diagram of yet another example based on the gesture recognition method shown in Fig. 1. As shown in Fig. 4, the example includes:

S401: detecting a first gesture sliding trajectory generated on the touch screen;

S402: storing the detected first gesture sliding trajectory in the to-be-parsed queue;

S403: detecting a second gesture sliding trajectory within a preset time window after the first gesture sliding trajectory is generated;

S404: storing the detected second gesture sliding trajectory in the to-be-parsed queue;

S405a: determining the vector orientation of the second gesture sliding trajectory relative to the first gesture sliding trajectory and to the steering wheel;

S405b: determining, based on the vector orientation, the directional property of the trajectory trend of the second gesture sliding trajectory, that is, determining based on the vector orientation whether the trajectory trend of the second gesture sliding trajectory has the directional property of pointing from the end position of the first gesture sliding trajectory toward the steering wheel;

S406: in response to the trajectory trend of the second gesture sliding trajectory having the directional property of pointing from the end position of the first gesture sliding trajectory toward the steering wheel, removing the second gesture sliding trajectory from the to-be-parsed queue and triggering the parsing of the first gesture sliding trajectory in the to-be-parsed queue;

S407: if the trajectory trend of the second gesture sliding trajectory does not have the directional property of pointing from the end position of the first gesture sliding trajectory toward the steering wheel, triggering the parsing of the first and second gesture sliding trajectories in the to-be-parsed queue.
Fig. 5 is a schematic diagram of the directional-property judgment mechanism in the example shown in Fig. 4. Referring to Fig. 5, after the touch screen 50 installed at the center console 51 successively detects the first gesture sliding trajectory 511 and the second gesture sliding trajectory 512, an association extension zone 510a can be created at the end position of the first gesture sliding trajectory 511, and the overlap relationship between the start position of the second gesture sliding trajectory 512 and the association extension zone 510a can be determined. If the start position of the second gesture sliding trajectory 512 falls within the association extension zone 510a, the trajectory trend of the second gesture sliding trajectory 512 determined from the vector orientation has the directional property of originating from the end position of the first gesture sliding trajectory 511. Thanks to the creation of the association extension zone 510a, not only can the case be covered in which the start position of the second gesture sliding trajectory 512 coincides with the end position of the first gesture sliding trajectory 511 (the two trajectories form a continuous gesture), but also the case in which the start position of the second gesture sliding trajectory 512 differs from the end position of the first gesture sliding trajectory 511 (the two trajectories form an interrupted gesture), for example when the operating hand briefly leaves the touch screen after completing the first gesture sliding trajectory 511 and then accidentally touches the touch screen again during the return stroke, forming the second gesture sliding trajectory 512.
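A minimal check of the overlap between the second trajectory's start position and the association extension zone 510a might look like the sketch below. The circular shape and the 30-unit radius are assumptions for illustration, since the patent does not fix the zone's geometry:

```python
from typing import Tuple

Point = Tuple[float, float]

def in_association_zone(first_end: Point, second_start: Point,
                        radius: float = 30.0) -> bool:
    # Association extension zone 510a, modeled here as a circle centered
    # on the end point of the first trajectory. Both the circular shape
    # and the 30-unit radius are illustrative assumptions.
    dx = second_start[0] - first_end[0]
    dy = second_start[1] - first_end[1]
    return dx * dx + dy * dy <= radius * radius
```

A zone larger than a single point covers both the continuous-gesture case (start coincides with the first trajectory's end) and the interrupted-gesture case (start merely nearby).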
Referring again to Fig. 5, after the touch screen 50 installed at the center console 51 successively detects the first gesture sliding trajectory 511 and the second gesture sliding trajectory 512, a return-stroke envelope zone 510b can also be created, radiating from the end position of the first gesture sliding trajectory 511, or from the association extension zone 510a, toward the orientation of the steering wheel 52, and the overlap relationship between the second gesture sliding trajectory 512 and the return-stroke envelope zone 510b can be determined. If the second gesture sliding trajectory 512 falls partially or completely within the return-stroke envelope zone 510b, the trajectory trend of the second gesture sliding trajectory 512 determined from the vector orientation has the directional property of pointing toward the steering wheel 52.

In Fig. 5, the orientation of the steering wheel 52 is characterized by its thumb-operation grip area 520 (for example, the region close to the key panel); the thumb-operation grip area 520 may be larger than the association extension zone 510a, so as to form a return-stroke envelope zone 510b radiating from the association extension zone 510a to the thumb-operation grip area 520 at a predetermined radiation angle α.
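One way to test whether a touch point falls inside the return-stroke envelope zone 510b is an angular-sector check like the sketch below. Modeling the envelope as a sector around the axis from the first trajectory's end toward the grip area, and the 25-degree half-angle, are illustrative assumptions:

```python
import math
from typing import Tuple

Point = Tuple[float, float]

def in_return_envelope(first_end: Point, point: Point, grip_area: Point,
                       half_angle_deg: float = 25.0) -> bool:
    # Envelope 510b modeled as a sector radiating from the end of the
    # first trajectory toward the thumb-operation grip area 520; the
    # sector shape and the 25-degree half-angle are assumptions.
    axis = math.atan2(grip_area[1] - first_end[1], grip_area[0] - first_end[0])
    ang = math.atan2(point[1] - first_end[1], point[0] - first_end[0])
    # Wrap the angular difference into [-pi, pi] before comparing.
    diff = abs((ang - axis + math.pi) % (2 * math.pi) - math.pi)
    return diff <= math.radians(half_angle_deg)
```

Applying this to each sample of the second trajectory yields the partial or complete overlap with zone 510b described above.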
The position of the thumb-operation grip area 520 may change dynamically; for example, the rotation angle of the steering wheel 52 (which has a rotational degree of freedom R) can be detected, and the orientation of the steering wheel 52 updated according to the detected variation of the rotation angle. Likewise, the radiation angle may change dynamically; for example, the travel speed of the automobile can be detected and the radiation angle of the return-stroke envelope zone 510b adjusted according to the travel speed.
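The speed-dependent adjustment could be as simple as the sketch below. The linear law and every constant in it are assumptions, since the patent only states that the radiation angle is adjusted according to travel speed:

```python
def radiation_angle(speed_kmh: float, base_deg: float = 20.0,
                    gain: float = 0.2, max_deg: float = 45.0) -> float:
    # Widen the envelope's radiation angle as speed rises (a hastier
    # return of the hand is harder to control), capped at max_deg.
    # All constants here are illustrative, not taken from the patent.
    return min(base_deg + gain * speed_kmh, max_deg)
```

A wider envelope at high speed trades a few false rejections of genuine gestures for better suppression of stray return-stroke contacts.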
Fig. 6 is a schematic diagram of the system framework of an automobile in one embodiment. As shown in Fig. 6, the automobile may include a touch screen 50 installed at the center console 51 and a steering wheel 52 located to the side of the center console 51; the automobile may further include a non-transitory computer-readable storage medium 61 and a processor 62 electrically connected with the touch screen 50 and the non-transitory computer-readable storage medium 61. The automobile may also include a sensor 63, arranged at the steering wheel 52, for detecting the rotation angle of the steering wheel 52; the processor 62 is also electrically connected with this sensor 63.

The non-transitory computer-readable storage medium 61 can store instructions 600 which, when executed by the processor 62, cause the processor 62 to perform the steps in the flows of Figs. 1 to 4 described above.

In addition, the processor 62 may be a VCU (vehicle control unit), an ECU (electronic control unit), or a processor independent of both.
The detailed descriptions above are merely specific explanations of feasible embodiments of the invention and are not intended to limit its protection scope; all equivalent embodiments or modifications made without departing from the technical spirit of the invention, such as combinations, divisions, or repetitions of features, shall fall within the protection scope of the present invention.
Claims (10)
1. A gesture recognition method for a touch screen, characterized in that the gesture recognition method comprises:

detecting a first gesture sliding trajectory generated on the touch screen;

detecting a second gesture sliding trajectory within a preset time window after the first gesture sliding trajectory is generated;

determining the trajectory trend of the second gesture sliding trajectory;

in response to the trajectory trend of the second gesture sliding trajectory indicating a return-stroke intention, removing the second gesture sliding trajectory from a to-be-parsed queue in which the first gesture sliding trajectory is stored.
2. The gesture recognition method according to claim 1, characterized in that the touch screen is arranged to the side of a steering wheel of an automobile, and determining the trajectory trend of the second gesture sliding trajectory comprises:

determining the vector orientation of the second gesture sliding trajectory relative to the first gesture sliding trajectory and to the steering wheel;

determining, based on the vector orientation, the directional property of the trajectory trend of the second gesture sliding trajectory, wherein if the trajectory trend of the second gesture sliding trajectory has the directional property of pointing from the end position of the first gesture sliding trajectory toward the steering wheel, it is determined that the trajectory trend of the second gesture sliding trajectory indicates a return-stroke intention.
3. The gesture recognition method according to claim 2, characterized in that determining the vector orientation of the second gesture sliding trajectory relative to the first gesture sliding trajectory comprises:

creating an association extension zone at the end position of the first gesture sliding trajectory;

determining the overlap relationship between the start position of the second gesture sliding trajectory and the association extension zone, wherein if the start position of the second gesture sliding trajectory falls within the association extension zone, the directional property determined based on the vector orientation indicates origination from the end position of the first gesture sliding trajectory.
4. The gesture recognition method according to claim 3, characterized in that determining the vector orientation of the second gesture sliding trajectory relative to the steering wheel comprises:

creating a return-stroke envelope zone radiating from the end position of the first gesture sliding trajectory, or from the association extension zone, toward the orientation of the steering wheel;

determining the overlap relationship between the second gesture sliding trajectory and the return-stroke envelope zone, wherein if the second gesture sliding trajectory falls within the return-stroke envelope zone, the directional property determined based on the vector orientation indicates pointing toward the steering wheel.
5. The gesture recognition method according to claim 4, characterized in that the gesture recognition method further comprises:

detecting the travel speed of the automobile;

adjusting the radiation angle of the return-stroke envelope zone according to the travel speed.
6. The gesture recognition method according to claim 4, characterized in that the gesture recognition method characterizes the orientation of the steering wheel by the thumb-operation grip area of the steering wheel.
7. The gesture recognition method according to claim 6, characterized in that the gesture recognition method further comprises:

detecting the rotation angle of the steering wheel;

updating the orientation of the steering wheel according to the detected variation of the rotation angle.
8. The gesture recognition method according to claim 1, characterized in that the screening of the second gesture sliding trajectory further comprises comparing the press detection values and/or stroke detection values of the first gesture sliding trajectory and the second gesture sliding trajectory.
9. A non-transitory computer-readable storage medium storing instructions, characterized in that the instructions, when executed by a processor, cause the processor to perform the steps in the gesture recognition method according to any one of claims 1 to 8.
10. An automobile, including a touch screen installed at a center console and a steering wheel located to the side of the center console, characterized in that the automobile further includes the non-transitory computer-readable storage medium according to claim 9 and a processor electrically connected with the touch screen and the non-transitory computer-readable storage medium.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810869603.3A CN109324746A (en) | 2018-08-02 | 2018-08-02 | Gesture identification method for touch screen |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109324746A true CN109324746A (en) | 2019-02-12 |
Family
ID=65263569
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810869603.3A Pending CN109324746A (en) | 2018-08-02 | 2018-08-02 | Gesture identification method for touch screen |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109324746A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11841991B2 (en) | 2020-07-31 | 2023-12-12 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Method for gesture control and related devices |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013016213A1 (en) * | 2011-07-22 | 2013-01-31 | American Megatrends, Inc. | Steering wheel input device having gesture recognition and angle compensation capabilities |
CN104309480A (en) * | 2014-09-28 | 2015-01-28 | 小米科技有限责任公司 | Operating function execution method and device |
CN105531663A (en) * | 2013-09-14 | 2016-04-27 | 戴姆勒股份公司 | Method for operating a gesture recognition system for a motor vehicle |
CN107340861A (en) * | 2017-06-26 | 2017-11-10 | 联想(北京)有限公司 | Gesture identification method and its equipment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9994233B2 (en) | Hands accelerating control system | |
US10025388B2 (en) | Touchless human machine interface | |
EP2786902B1 (en) | Vehicle operating device | |
US20170308075A1 (en) | Determination of continuous user interaction and intent through measurement of force variability | |
CN102622087B (en) | Mobile terminal and unlocking method thereof | |
JP6211506B2 (en) | Open / close detection device for vehicle opening / closing body | |
CN105683004A (en) | Systems and methods for controlling vehicle ignition using biometric data | |
CN108170264B (en) | Vehicle user input control system and method | |
CN105408853A (en) | Method and device for remote-controlling a function of a vehicle | |
CN102866803A (en) | Method and device for operating and controlling virtual center console of Blind-operation-supported automobile by gestures | |
KR102084032B1 (en) | User interface, means of transport and method for distinguishing a user | |
JP6233248B2 (en) | Gripping state determination device, gripping state determination method, input device, input acquisition method | |
WO2013101047A1 (en) | Systems, methods, and apparatus for invehicle fiducial mark tracking and interpretation | |
US10035539B2 (en) | Steering wheel control system | |
CN110588759B (en) | Vehicle steering control method and device | |
CN104360809B (en) | Control method, device and system | |
CN109324746A (en) | Gesture identification method for touch screen | |
EP4137380A1 (en) | Human-computer interaction method and apparatus, and electronic device and storage medium | |
CN111483406A (en) | Vehicle-mounted infotainment device, control method thereof and vehicle comprising same | |
JP2017097607A (en) | Image recognition device | |
JP6167932B2 (en) | Input device and input acquisition method | |
EP2963530A1 (en) | Operation detection device | |
US11535268B2 (en) | Vehicle and control method thereof | |
KR101976497B1 (en) | System and method for operating unit support of vehicle | |
CN112433619B (en) | Human-computer interaction method and system for automobile, electronic equipment and computer storage medium |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20190212 |