CN106030462A - User interface and method for switching from a first operating mode of a user interface to a 3D gesture mode - Google Patents

User interface and method for switching from a first operating mode of a user interface to a 3D gesture mode

Info

Publication number
CN106030462A
CN106030462A (application number CN201580009060.2A)
Authority
CN
China
Prior art keywords
user
gesture
user interface
hand
predefined
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201580009060.2A
Other languages
Chinese (zh)
Inventor
H. Wild
M.P. Czelnik
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Volkswagen AG
Original Assignee
Volkswagen AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Volkswagen AG filed Critical Volkswagen AG
Publication of CN106030462A publication Critical patent/CN106030462A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a user interface and a method for switching from a first operating mode of a user interface to a 3D gesture mode, in which an operation of the user interface can be conducted using a plurality of gestures, henceforth referred to as 3D gestures, which are carried out freely in space. The method has the following steps: - detecting a hand (4) of a user in a specified region by means of a sensor, - identifying a specified position of the hand (4) of the user and/or a specified movement of the hand (4) of the user, and - switching from the first operating mode to the 3D gesture mode.

Description

User interface and method for switching from a first operating mode of a user interface to a 3D gesture mode
Technical field
The present invention relates to a user interface and a method for switching from a first operating mode of a user interface to a 3D gesture mode, by means of which the user interface can be operated by a plurality of gestures carried out freely in space. In particular, the invention relates to avoiding erroneous operating steps by initializing the 3D gesture mode conveniently yet safely.
Background technology
Gesture operation in free space is increasingly popular. By means of such 3D gestures, users can communicate with a human-machine interface without bodily contact with an input device. Gestures are recognized, for example, by optical and/or infrared-based systems and converted into control commands. Various methods for starting 3D gesture operation and for the subsequent operation are known in the art.
DE102012000263A1 discloses a method and a device for operating functions in a vehicle using gestures carried out in three-dimensional space. If a hand or a finger of the user is detected in a valid detection region by a detection device arranged in the roof region of the vehicle for a first predetermined duration, gesture control is activated. For the user's learning process, it is proposed that optical and/or acoustic feedback enables blind operation by the user during gesture control.
DE102006037156A1 describes an interactive operating device by means of which gesture operation of graphical content shown on a display device is made possible. Depending on a determined operating intention, the graphical content shown on the display device is represented in a manner optimized for activating the function associated with it. For example, a button is displayed enlarged.
DE102009008041A1 discloses a method for operating a motor vehicle with a touchscreen, in which a proximity sensor is provided for recognizing gestures for executing or initializing functions of the motor vehicle. If a hand gesture is carried out in the vicinity of the touchscreen without touching it, the function of the motor vehicle is executed.
The methods known in the art do not yet fully exploit the technical possibilities for simplifying the initialization of gesture operation. It is therefore an object of the present invention to meet this need.
Summary of the invention
According to the invention, the above object is achieved by a method having the features of claim 1 and by a user interface having the features of claim 6. A user terminal device, a computer program product, a signal sequence and a means of transportation for achieving this object are also proposed. The method serves to switch from a first operating mode of a user interface (in which 3D gesture operation of a plurality of functions of the user interface, or of data-processing equipment connected to it by communication technology, is not possible) to a 3D gesture mode, in which the user interface can be operated by means of a plurality of gestures carried out freely in space. The first operating mode can be designed for operation, for example, by hardware buttons, rotary push controls, sliders and the like, and alternatively or additionally by a touch-sensitive surface of a display unit (e.g. a touchscreen). In the first operating mode, functions associated with the graphical content are not called up by 3D gestures. To start the 3D gesture mode, in a first step the hand of a user is detected in a predefined detection region of a sensor. The sensor can be designed, for example, as an optical sensor and/or an infrared sensor (e.g. an infrared LED strip). In principle, all sensors known in the art and proven for recognizing 3D gestures can be used. In a second step, a predefined posture of the user's hand and/or a predefined movement of the user's hand is recognized. The predefined posture and/or movement can in principle be chosen arbitrarily or in any suitable manner; what matters is only that the predefined posture or predefined movement is predefined for starting the 3D gesture mode. In contrast to the prior art, the mere presence of the hand in the predefined region is thus not sufficient, so that an accidental switch from the first operating mode to the 3D gesture mode is largely avoided.
The dependent claims show advantageous developments of the invention.
The method according to the invention advantageously includes starting a timer. The timer can be started, for example, in response to recognizing the predefined posture of the user's hand and/or in response to recognizing the predefined movement of the user's hand. In this development, the switch to the 3D gesture mode takes place only if the predefined posture and/or movement has not ended before the timer expires. In other words, when the timer expires it is checked whether the predefined posture/movement still persists, and only in response to both conditions is the interface switched from the first operating mode to the 3D gesture mode.
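The two-condition check described above (the predefined posture is recognized, and it still persists when the timer expires) can be sketched as follows. This is a minimal illustration, not the patent's implementation; the class and method names are assumptions.

```python
class GestureModeSwitch:
    """Sketch of the timer condition: the interface switches to 3D gesture
    mode only if the predefined posture is still held when the timer expires.
    Names and the hold duration are illustrative assumptions."""

    def __init__(self, hold_seconds=1.0):
        self.hold_seconds = hold_seconds
        self.timer_started_at = None
        self.mode = "first"  # the first operating mode

    def on_posture_recognized(self, now):
        # Start the timer in response to recognizing the predefined posture.
        if self.timer_started_at is None:
            self.timer_started_at = now

    def on_posture_lost(self):
        # Posture ended before expiry: stay in the first operating mode.
        self.timer_started_at = None

    def tick(self, now, posture_still_held):
        # Evaluate both conditions on each sensor sample.
        if self.timer_started_at is None:
            return self.mode
        if not posture_still_held:
            self.on_posture_lost()
        elif now - self.timer_started_at >= self.hold_seconds:
            self.mode = "3d_gesture"  # both conditions met: switch
        return self.mode
```

A posture that is dropped before the timer expires therefore never triggers the switch, which is exactly what distinguishes this scheme from mere presence detection.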
The predefined posture can include, for example, an open palm oriented in the direction of the sensor used for 3D gesture recognition and/or in the direction of the display unit of the user interface. The open palm is then essentially parallel to the surface of the display unit, or essentially perpendicular to the direction of the sensor. As a further condition of the predefined posture, it may be required that the fingers of the hand are spread apart in a predefined manner and/or extended. Alternatively or additionally, one or more other gestures can be predefined for starting or ending gesture operation. Further examples are a "thumbs-up" gesture or a grasping gesture (a closing of the hand into a fist, or a convergence of several or all fingertips, with the hand oriented toward the sensor used and/or toward the display unit of the user interface). Such gestures have proven to be reliably recognizable by the sensor and are therefore well suited for starting the 3D gesture mode.
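A predicate for the "open palm toward the sensor with extended, moderately spread fingers" posture might look like the following. The feature inputs and thresholds are hypothetical assumptions about what a gesture tracker could deliver; the patent does not prescribe them.

```python
def is_start_posture(palm_normal_z, finger_extensions, finger_spread_deg):
    """Check an 'open palm toward the sensor with extended, slightly spread
    fingers' posture. All feature names and thresholds are illustrative.

    palm_normal_z: component of the palm normal toward the sensor
                   (1.0 = palm faces the sensor directly)
    finger_extensions: per-finger extension ratio in [0, 1]
    finger_spread_deg: mean angle between adjacent fingers in degrees
    """
    facing_sensor = palm_normal_z > 0.8               # roughly parallel to the display
    all_extended = all(e > 0.7 for e in finger_extensions)
    moderately_spread = 5.0 <= finger_spread_deg <= 30.0
    return facing_sensor and all_extended and moderately_spread
```

A "thumbs-up" or grasping gesture would get an analogous predicate over the same feature set, so the set of start/end gestures stays configurable.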
An optical and/or acoustic notification that the 3D gesture mode is now active can be output to the user, optionally in response to recognizing the predefined posture or the predefined movement of the hand. Alternatively or additionally, this notification can take place in response to the switch to the 3D gesture mode. This prevents the user from unnecessarily holding the predefined posture or carrying out the movement for too long, even though 3D gesture operation is already possible. For example, the driver of a vehicle designed according to the invention can return his attention to the driving task more quickly, which improves traffic safety. As an optical notification, for example, a hand symbol can be shown in an edge region of the graphical content of the display unit, in particular as an animation. Alternatively or additionally, an acoustic signal, a click sound or a voice output can serve as the acoustic notification. The notification offers an unobtrusive way of informing the user about the successful start of the 3D gesture mode.
According to a second aspect of the invention, a user interface is proposed which includes a sensor for detecting gestures carried out freely in space by the hand of a user. The user interface further has a display unit for displaying freely selectable graphical content. This display unit can be designed, for example, as a matrix screen embedded in the dashboard of a means of transportation as an instrument cluster and/or as a central information display. An evaluation unit (which can include, for example, a programmable processor) is also provided for recognizing a plurality of gestures in the signals of the sensor. In the automotive context, such an evaluation unit is also known as an electronic control unit (ECU). The evaluation unit is set up to recognize a predefined posture of the user's hand and/or a predefined movement of the user's hand. For this purpose it can analyze the detected movement pattern and compare it with stored references (e.g. in a local database or one connected wirelessly). If the comparison is successful, i.e. the recognized 3D gesture is associated with the start of the 3D gesture mode, the evaluation unit switches the user interface according to the invention from the first operating mode (in which, although the individual gesture provided for starting the 3D gesture mode is recognized, the use of a plurality of 3D gestures in connection with different function calls is not possible) to the 3D gesture mode (in which this is possible). The features, feature combinations and the advantages resulting from them correspond so evidently to those of the method according to the invention that reference is made to the above statements to avoid repetition.
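The evaluation unit's comparison of a detected movement pattern against stored references can be sketched as a nearest-reference lookup. This is a deliberately minimal illustration: the reference trajectories, the distance metric and the acceptance threshold are all assumptions, and a real ECU might use dynamic time warping or a statistical model instead.

```python
import math

# Hypothetical reference store: gesture name -> normalized 2-D trajectory.
REFERENCES = {
    "swipe_right": [(0.0, 0.5), (0.5, 0.5), (1.0, 0.5)],
    "swipe_up":    [(0.5, 1.0), (0.5, 0.5), (0.5, 0.0)],
}

def match_gesture(trajectory, references=REFERENCES, max_distance=0.3):
    """Return the name of the closest stored reference, or None if no
    reference is close enough (comparison unsuccessful)."""
    def distance(a, b):
        # Mean point-wise Euclidean distance (assumes equal-length trajectories).
        return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

    best_name, best_d = None, float("inf")
    for name, ref in references.items():
        d = distance(trajectory, ref)
        if d < best_d:
            best_name, best_d = name, d
    return best_name if best_d <= max_distance else None
```

Whether the references live in a local database or one connected wirelessly only changes how the `references` mapping is populated; the matching step stays the same.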
The sensor can preferably be an optical sensor, arranged for example on the headliner of a means of transportation. Alternatively or additionally, a sensor operating in the infrared range can be provided to carry out or support gesture recognition.
An optical and/or acoustic output unit can preferably be provided for outputting the notification about the successful start of the 3D gesture mode. While an LED or a matrix display can serve as the optical output unit, a loudspeaker is preferably used for outputting the acoustic notification (e.g. voice output, acoustic signal, sound).
According to a third aspect of the invention, a computer program product (e.g. a data memory) is proposed, on which instructions are stored that enable a programmable processor of an evaluation unit of a user interface according to the second-mentioned aspect of the invention to carry out the steps of a method according to the first-mentioned aspect of the invention. The computer program product can be designed as a CD, DVD, Blu-ray disc, flash memory, hard disk, RAM/ROM, cache, etc.
According to a fourth aspect of the invention, a signal sequence representing instructions is proposed which enable a programmable processor of an evaluation unit of a user interface according to the second-mentioned aspect of the invention to carry out the steps of a method according to the first-mentioned aspect of the invention. In this way, the provision of the instructions is also placed under protection for the case in which the storage media required for this lie outside the scope of the appended claims.
According to a fifth aspect of the invention, a user terminal device is proposed, which can be designed, for example, as a portable electronic device. In particular, electronic wireless communication devices (e.g. smartphones, tablet PCs or notebooks/laptops) are suitable for being extended by a user interface according to the invention and thereby made operable in a user-friendly manner. Portable user terminal devices often already have the hardware generally required to support 3D gesture operation. For the features and advantages of the user terminal device, reference is otherwise made to the above statements.
According to a sixth aspect of the invention, a means of transportation is proposed, which can be designed, for example, as a car, van, truck, aircraft and/or watercraft. The means of transportation according to the invention includes a user interface as described in detail above in connection with the second-mentioned aspect of the invention. The display unit can be designed as a central screen fixedly embedded in the dashboard of the means of transportation and/or as an instrument cluster. The means of transportation is thus likewise set up to achieve the object of the invention or for use in a method according to the invention.
Accompanying drawing explanation
Embodiments of the invention are described in detail below with reference to the appended drawings. In the drawings:
Fig. 1 is a schematic overview of the components of an embodiment of a user interface according to the invention in an embodiment of a car designed according to the invention;
Fig. 2 is a schematic overview of the components of an embodiment of a user interface according to the invention in an embodiment of a user terminal device designed according to the invention;
Fig. 3 is an illustration of a first method step for explaining an embodiment of the method according to the invention;
Fig. 4 is a selection of gestures usable according to the invention;
Fig. 5 is an illustration for explaining a first method step according to a second embodiment of the method according to the invention; and
Fig. 6 is a flowchart illustrating the steps of an embodiment of the method according to the invention.
Detailed description of the invention
Fig. 1 shows a car 10 as the means of transportation, in whose dashboard a screen 3 is embedded as the display unit of a user interface according to the invention. A sensor 5 is arranged below the screen 3 and spans a detection region 9 in the space in front of the screen 3. A loudspeaker 11 is provided for outputting notifications of the user interface 1. Finally, a data memory 8 for providing references predefined for gesture recognition is also contained in the car 10. The aforementioned components are connected by information technology to an evaluation unit in the form of an electronic control unit 7. The electronic control unit 7 is further set up to display graphical content 2 on the screen 3.
Fig. 2 shows an overview of the components of an embodiment of a user interface according to the invention in a user terminal device 20 designed according to the invention in the form of a smartphone. The screen 3 of the smartphone forms the display unit, a microprocessor 7 the evaluation unit, a flash memory 8 the storage medium, and a loudspeaker 11 a signal generator of the user interface 1 according to the invention. In addition, an infrared LED strip 5 as the sensor is connected by information technology to the microprocessor 7. The latter is also coupled to an antenna 17 in order to communicate wirelessly with a radio infrastructure or other user terminal devices. A keyboard 6 of the smartphone 20 is partially cut away to allow a view of the components lying behind it.
Fig. 3 shows an operating step in which a user operates a menu 2 for selecting a radio station, displayed as graphical content on the screen 3 serving as the display unit, with his hand 4. To start 3D gesture operation, the user's hand 4 carries out a predefined gesture in the detection region of the user interface. In the example, this is an open hand pointing in the direction of the screen 3, with the five fingers of the hand 4 extended and moderately spread. The evaluation unit (not shown) is set up, in response thereto, to also convert further gestures of the hand 4 into commands for controlling the menu 2 on the screen 3. Optionally, it can additionally be provided that the illustrated posture of the hand 4 must be adopted or held for a fixed predefined time.
Fig. 4 shows possible hand postures and gestures for interacting with a user interface 1 according to the invention. In part a, the open palm of the hand 4 points in the direction of the screen 3. The fingers of the hand 4 are extended and (slightly) spread. Part b shows a pointing gesture of the hand 4, in which only the index finger is extended while the remaining fingers are clenched into a fist. Part c illustrates a tapping or click gesture, in which the index finger of the hand 4 is first extended and then tipped down in the direction of the screen 3.
Fig. 5 shows an operating step which can follow a period of successful 3D gesture operation after the registration process discussed in connection with Fig. 3. To end 3D gesture operation, and thus to protect against accidental gesture control, the user interface according to the invention provides that the open hand 4 with extended, (moderately) spread fingers pointing toward the screen 3 is guided out of the detection region of the sensor (not shown) along the arrow P. In other words, according to the invention a predefined posture of the user's hand 4 and/or a predefined movement of the user's hand 4 can also be provided for ending the 3D gesture mode. The user can thus decide independently whether the possibility of 3D gesture operation should remain available (in which case the hand 4 is guided out of the detection region of the user interface without the predefined posture/movement), or whether he wants to end the 3D gesture mode as described above.
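The termination logic of Fig. 5 can be sketched as a small decision function: only the deliberate exit motion ends the 3D gesture mode, while merely losing the hand from the detection region leaves the mode available. The flag names below are assumptions for illustration.

```python
def update_mode(mode, hand_in_region, exit_motion_recognized):
    """Sketch of the Fig. 5 termination logic: 3D gesture mode ends only
    when the user performs the predefined exit motion (guiding the open
    hand out of the detection region along arrow P); the hand simply
    leaving the region without that motion keeps the mode active."""
    if mode == "3d_gesture":
        if exit_motion_recognized:
            return "first"        # deliberate end of 3D gesture mode
        if not hand_in_region:
            return "3d_gesture"   # hand left without the exit motion: mode persists
    return mode
```

This separation is what lets the user decide independently between keeping 3D gesture operation available and explicitly ending it.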
Fig. 6 shows method steps of an embodiment of a method according to the invention for switching from a first operating mode of a user interface to a 3D gesture mode. In a first step 100, graphical content suitable for subsequent 3D gesture operation is displayed on the display unit. The user then guides his hand into the predefined detection region of a sensor of the user interface. In step 200, the user's hand is detected by the sensor, and in step 300 a predefined posture and/or movement of the user's hand is recognized. Particularly suitable here are gestures which the sensor can detect reliably but which nevertheless differ well from common postures/movements during ordinary user activity. In step 400, a timer is started, whose expiry constitutes a further condition for starting the 3D gesture mode. If the timer expires after the predefined posture/movement has ended (yes), the system remains in the first operating mode and the method returns to step 100. If the timer expires before the predefined posture/movement of the user's hand has ended, the system switches in step 500 from the first operating mode to the 3D gesture mode and, in step 600, outputs an optical and acoustic notification that the 3D gesture mode has been successfully activated.
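The Fig. 6 flow as a whole can be sketched as a single function over a stream of sensor samples. This is an illustrative condensation of steps 100-600, not the patent's implementation; the event representation and the notification string are assumptions.

```python
def run_activation(events, hold_time=1.0):
    """Sketch of the Fig. 6 flow: display content (100), detect the hand
    (200), recognize the predefined posture (300), start a timer (400);
    if the posture outlasts the timer, switch to 3D gesture mode (500)
    and emit a notification (600), otherwise fall back to step 100.
    `events` is a list of (timestamp, posture_held) samples."""
    notifications = []
    timer_start = None
    mode = "first"
    for t, posture_held in events:
        if mode == "3d_gesture":
            break
        if posture_held:
            if timer_start is None:
                timer_start = t                    # step 400: start timer
            elif t - timer_start >= hold_time:     # timer expired, posture persists
                mode = "3d_gesture"                # step 500: switch modes
                notifications.append("3D gesture mode active")  # step 600
        else:
            timer_start = None                     # posture ended: back to step 100
    return mode, notifications
```

Emitting the notification in the same branch as the mode switch mirrors the flowchart, where step 600 follows directly on step 500.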
While the aspects and advantageous embodiments of the invention have been described in detail with reference to the embodiments explained in connection with the appended drawings, modifications and combinations of features of the illustrated embodiments are possible for a person skilled in the art without departing from the scope of the invention, which is defined by the appended claims.
Reference numerals list
1 user interface
2 menus
3 screens
4 hand of the user
5 sensors
7 electronic-controlled installations
8 data memory
9 detection regions
10 automobiles
11 speakers
100-600 method step
P arrow

Claims (12)

1. A method for switching from a first operating mode of a user interface (1) to a 3D gesture mode, by means of which the user interface (1) can be operated by a plurality of gestures carried out freely in space, hereinafter referred to as 3D gestures, comprising the steps:
- detecting (200) a hand (4) of a user in a predefined region (9) by means of a sensor (5),
- recognizing (300) a predefined posture of the user's hand (4) and/or a predefined movement of the user's hand (4), and
- switching (500) from the first operating mode to the 3D gesture mode.
2. The method according to claim 1, further comprising the step:
- starting (400) a timer, and, on condition that the recognized predefined posture of the user's hand (4) and/or the predefined movement of the user's hand (4) has not ended before the timer expires,
- switching (500) from the first operating mode to the 3D gesture mode.
3. The method according to claim 2, wherein the timer is started in response to the recognition (300) of the predefined posture of the user's hand (4) and/or the recognition (300) of the predefined movement of the user's hand (4).
4. The method according to one of the preceding claims, wherein
- the predefined posture includes
- an open palm, in particular with spread fingers, oriented in the direction of a sensor (5) for gesture recognition and/or in the direction of a display unit (3),
- and/or a "thumbs-up" gesture, and/or wherein
- the predefined movement includes a grasping gesture.
5. The method according to one of the preceding claims, further comprising:
- outputting (600) an optical and/or acoustic notification that the 3D gesture mode is now active.
6. A user interface, comprising
- a sensor (5) for detecting (200) gestures carried out freely in space by a hand (4) of a user, hereinafter referred to as 3D gestures,
- a display unit (3) for displaying graphical content (2), and
- an evaluation unit (7) for recognizing a plurality of gestures in signals of the sensor (5),
wherein the evaluation unit (7) is set up to recognize (300) a predefined posture of the user's hand (4) and/or a predefined movement of the user's hand (4), and
the user interface (1) switches, in response thereto, from a first operating mode to a 3D gesture mode.
7. The user interface according to claim 6, wherein the sensor (5) is an optical sensor and/or a sensor operating in the infrared range.
8. The user interface according to claim 6 or 7, further comprising an output unit, in particular an optical and/or acoustic output unit, for outputting a notification about the activation or deactivation of the 3D gesture mode.
9. A user terminal device, in particular an electronic wireless communication device, comprising a user interface (1) according to one of claims 6 to 8.
10. A computer program product comprising instructions which, when executed on a programmable processor (8) of a user interface (1) according to one of claims 6 to 8, cause the user interface (1) to carry out the steps of a method according to one of claims 1 to 5.
11. A signal sequence representing instructions which, when executed on a programmable processor (8) of a user interface (1) according to one of claims 6 to 8, cause the user interface (1) to carry out the steps of a method according to one of claims 1 to 5.
12. A means of transportation comprising a user interface (1) according to one of claims 6 to 8, wherein the display unit (3) is designed as a central screen fixedly embedded in the dashboard of the means of transportation (10) and/or as an instrument cluster of the means of transportation (10).
CN201580009060.2A 2014-02-17 2015-02-06 User interface and method for switching from a first operating mode of a user interface to a 3D gesture mode Pending CN106030462A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE102014202833.7 2014-02-17
DE102014202833.7A DE102014202833A1 (en) 2014-02-17 2014-02-17 User interface and method for switching from a first user interface operating mode to a 3D gesture mode
PCT/EP2015/052549 WO2015121173A1 (en) 2014-02-17 2015-02-06 User interface and method for switching from a first operating mode of a user interface to a 3d gesture mode

Publications (1)

Publication Number Publication Date
CN106030462A true CN106030462A (en) 2016-10-12

Family

ID=52544458

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201580009060.2A Pending CN106030462A (en) 2014-02-17 2015-02-06 User interface and method for switching from a first operating mode of a user interface to a 3D gesture mode

Country Status (4)

Country Link
EP (1) EP3108332A1 (en)
CN (1) CN106030462A (en)
DE (1) DE102014202833A1 (en)
WO (1) WO2015121173A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102016011365A1 (en) * 2016-09-21 2018-03-22 Daimler Ag Method for controlling a motor vehicle module and motor vehicle module
DE102017201312A1 (en) 2017-01-27 2018-08-02 Audi Ag Method for controlling a temperature control element of a container holder

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1499344A (en) * 2002-10-25 2004-05-26 �����ɣ���ͳ���˾ Gesture switch
US20060136846A1 (en) * 2004-12-20 2006-06-22 Sung-Ho Im User interface apparatus using hand gesture recognition and method thereof
US20100295781A1 (en) * 2009-05-22 2010-11-25 Rachid Alameh Electronic Device with Sensing Assembly and Method for Interpreting Consecutive Gestures
CN102221891A (en) * 2011-07-13 2011-10-19 广州视源电子科技有限公司 Method and system for realizing optical image gesture recognition
DE202012005255U1 (en) * 2012-05-29 2012-06-26 Youse Gmbh Operating device with a gesture monitoring unit
CN102713794A (en) * 2009-11-24 2012-10-03 奈克斯特控股公司 Methods and apparatus for gesture recognition mode control
DE102012000263A1 (en) * 2012-01-10 2013-07-11 Daimler Ag A method and apparatus for operating functions in a vehicle using gestures executed in three-dimensional space and related computer program product

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3903968B2 (en) * 2003-07-30 2007-04-11 日産自動車株式会社 Non-contact information input device
US7834847B2 (en) * 2005-12-01 2010-11-16 Navisense Method and system for activating a touchless control
DE102006037156A1 (en) 2006-03-22 2007-09-27 Volkswagen Ag Interactive operating device and method for operating the interactive operating device
DE102009008041A1 (en) 2009-02-09 2010-08-12 Volkswagen Ag Method for operating a motor vehicle with a touchscreen

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1499344A (en) * 2002-10-25 2004-05-26 �����ɣ���ͳ���˾ Gesture switch
US20060136846A1 (en) * 2004-12-20 2006-06-22 Sung-Ho Im User interface apparatus using hand gesture recognition and method thereof
US20100295781A1 (en) * 2009-05-22 2010-11-25 Rachid Alameh Electronic Device with Sensing Assembly and Method for Interpreting Consecutive Gestures
CN102713794A (en) * 2009-11-24 2012-10-03 奈克斯特控股公司 Methods and apparatus for gesture recognition mode control
CN102221891A (en) * 2011-07-13 2011-10-19 广州视源电子科技有限公司 Method and system for realizing optical image gesture recognition
DE102012000263A1 (en) * 2012-01-10 2013-07-11 Daimler Ag A method and apparatus for operating functions in a vehicle using gestures executed in three-dimensional space and related computer program product
DE202012005255U1 (en) * 2012-05-29 2012-06-26 Youse Gmbh Operating device with a gesture monitoring unit

Also Published As

Publication number Publication date
WO2015121173A1 (en) 2015-08-20
DE102014202833A1 (en) 2015-08-20
EP3108332A1 (en) 2016-12-28

Similar Documents

Publication Publication Date Title
JP6851197B2 (en) Multidimensional trackpad
JP5267388B2 (en) Information processing apparatus, information processing method, and program
JP4391193B2 (en) Operating device
CN104039582B9 (en) Method and device for operating functions displayed on a display unit of a vehicle using gestures which are carried out in a three-dimensional space, and corresponding computer program product
CN103955393B (en) A kind of method and device starting application program
KR20150116037A (en) Mobile terminal and control method for the mobile terminal
US11433937B2 (en) Vehicle and steering unit
WO2013101058A1 (en) Systems, methods, and apparatus for controlling gesture initiation and termination
KR20140139241A (en) Method for processing input and an electronic device thereof
CN107111397B (en) For the method for the operating device of operation motor vehicle and operating device and motor vehicle in different operation modes
US11119576B2 (en) User interface and method for contactlessly operating a hardware operating element in a 3-D gesture mode
CN105511781A (en) Starting application program method, device and user device
US20130117705A1 (en) Electronic Device and Method of Controlling the Same
US9588584B2 (en) System and method for processing touch input
KR102091509B1 (en) Method for processing character input and apparatus for the same
US20170024119A1 (en) User interface and method for controlling a volume by means of a touch-sensitive display unit
JP2019169128A (en) Method for operating man-machine interface and man-machine interface
US20140258860A1 (en) System and method for providing feedback to three-touch stroke motion
CN106775223A (en) The control method and terminal device of suspension button
EP2846312A1 (en) Remote key for control of vehicles
CN106030462A (en) User interface and method for switching from a first operating mode of a user interface to a 3D gesture mode
JPWO2019021418A1 (en) Display control apparatus and display control method
CN105283829B (en) Method for operating a touch-sensitive operating system and touch-sensitive operating system
TWI662452B (en) Portable electronic device and unlocking method
EP2762345A2 (en) Instruction feedback system and method for a vehicle

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20161012

RJ01 Rejection of invention patent application after publication