CN107203320A - User interface control method based on multi-touch - Google Patents

User interface control method based on multi-touch

Info

Publication number
CN107203320A
Authority
CN
China
Prior art keywords
user
user interface
page
multi-touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610157017.7A
Other languages
Chinese (zh)
Inventor
刘琦
林园
熊雅琴
闻心泉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Continental Automotive Asia Pacific Beijing Co Ltd
Original Assignee
Continental Automotive Asia Pacific Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Continental Automotive Asia Pacific Beijing Co Ltd
Priority to CN201610157017.7A
Publication of CN107203320A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 11/00 Arrangements for holding or mounting articles, not otherwise provided for
    • B60R 11/02 Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • B60R 11/0264 Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof for control means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/044 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F 2203/04104 Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 Indexing scheme relating to G06F3/048
    • G06F 2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Abstract

A multi-touch-based user interface control method includes: querying the current page of the user interface; and, when a multi-touch operation by the user is detected and the current user interface is found not to belong to a safety-related user page, displaying a predefined application menu on the current page of the user interface. The user interface control method makes operation simpler and correspondingly also improves safety while driving.

Description

User interface control method based on multi-touch
Technical field
The present invention relates to the design of user interfaces, and more particularly to a multi-touch-based user interface control method.
Background art
During the use of an automobile, besides common vehicle-status information such as the drivable mileage, fuel consumption and tire pressure, the user also wishes to know comfort-related driving information such as the air-conditioning state. Moreover, after obtaining the corresponding information, the user may adjust it through certain controls to obtain a more comfortable experience.
Taking air-conditioning control as an example, the air-conditioning control modules of some current vehicles have no separate display component, so the air-conditioning state has to be displayed on the screen of the in-vehicle infotainment system. Considering that the vast majority of vehicles now come with an infotainment system equipped with a touch screen as standard, the air conditioning can also be controlled through the touch screen. Existing solutions mostly call up the air-conditioning menu by swiping down or up from the edge of the touch screen. Because the touch screen of an infotainment system can hardly be made flush with its edge, such swipe gestures are inconvenient. In addition, this approach requires the user to first place a hand at the edge of the touch screen before performing the swipe, which also distracts the user, and such distraction may lead to accidents while driving.
Summary of the invention
The problem addressed by the present invention is to provide a multi-touch-based user interface control method that offers a simpler and safer way to call up an application menu.
To solve the above problem, the multi-touch-based user interface control method of the present invention includes: querying the current page of the user interface; and, when a multi-touch operation by the user is detected and the current user interface is found not to belong to a safety-related user page, displaying a predefined application menu on the current page of the user interface.
Compared with the prior art, this scheme has the following advantages. Only multi-touch (i.e. a touch with multiple fingers) is needed as the trigger action to call up the predefined application menu, and no specific touch position is required. Compared with the prior-art limitation of having to start from an edge position, the threshold of the trigger action is lowered, so operation is simpler. Moreover, since no specific position is required, the user does not need to be distracted by watching the hand position in order to perform the operation, which correspondingly also improves safety while driving.
Brief description of the drawings
Fig. 1 is a schematic flow diagram of an embodiment of the multi-touch-based user interface control method of the present invention;
Fig. 2 is a schematic diagram of the implementation process of an embodiment of the multi-touch-based user interface control method of the present invention;
Fig. 3 is a schematic diagram of the user interface change in an embodiment of the multi-touch-based user interface control method of the present invention.
Detailed description of the embodiments
In the following description, numerous details are set forth so that those skilled in the art can understand the present invention more fully. However, it will be apparent to those skilled in the art that the present invention may be practiced without some of these details. It should also be understood that the invention is not limited to the specific embodiments introduced; on the contrary, any combination of the following features and elements may be considered to implement the invention, regardless of whether they relate to different embodiments. Therefore, the following aspects, features, embodiments and advantages are for illustrative purposes only and shall not be construed as elements or limitations of the claims unless explicitly stated in the claims.
Referring to Fig. 1, which shows the processing flow of one embodiment of the present invention, the method specifically includes:
Step S10: query the current page of the user interface;
Step S20: judge whether the current page belongs to a safety-related user page; if not, perform step S30; if so, perform step S50;
Step S30: judge whether a multi-touch action has been detected; if so, perform step S40; if not, perform step S60;
Step S40: display the predefined application menu on the current page of the user interface;
Step S50: perform the corresponding operation according to the original settings of the current user page;
Step S60: perform the corresponding operation according to the actual touch detection result combined with the original settings of the current user page.
The above-mentioned safety-related user page may be a user page related to reversing, for example the reversing-radar page or the reversing-camera page; or a user page related to an emergency call (e-call); or a user page related to roadside assistance (b-call). When the current page belongs to one of these user pages, calling up the application menu could have an adverse effect on safety, so the operation of calling up the application menu should be excluded in these cases.
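To make the branching in steps S10 to S60 concrete, the following is a minimal sketch of the decision logic in Python. The function and attribute names (query_current_page, finger_count, show_predefined_menu, handle_by_original_settings) are hypothetical and not prescribed by the patent; the sketch only illustrates the flow described above.

```python
# Minimal sketch of the decision flow in steps S10-S60.
# All names are hypothetical; the patent does not prescribe an API.

SAFETY_RELATED_PAGES = {
    "reversing_radar",      # reversing-radar page
    "reversing_camera",     # reversing-camera page
    "emergency_call",       # e-call page
    "roadside_assistance",  # b-call page
}

def handle_touch_event(ui, touch_event):
    # Step S10: query the current page of the user interface.
    page = ui.query_current_page()

    # Step S20: safety-related pages never trigger the menu.
    if page.name in SAFETY_RELATED_PAGES:
        # Step S50: keep the page's original behavior.
        return page.handle_by_original_settings(touch_event)

    # Step S30: has a multi-touch action been detected?
    if touch_event is not None and touch_event.finger_count >= 2:
        # Step S40: show the predefined application menu
        # (e.g. the air-conditioning status bar) on the current page.
        return ui.show_predefined_menu(page)

    # Step S60: fall back to the original settings, taking the actual
    # (single-point or absent) touch detection result into account.
    return page.handle_by_original_settings(touch_event)
```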
The above-described embodiment of the present invention provides a multi-touch-based user interface control scheme. Apart from the branch in which the display of the predefined application menu is triggered (namely when multi-touch is detected and the current page does not belong to a safety-related user page), the system operates according to its original settings both when the current page belongs to a safety-related user page and when the current page does not belong to a safety-related user page but no multi-touch is detected.
When the current page belongs to a safety-related user page, no matter whether the user has performed a multi-touch action, the reasonable behavior is still to maintain the normal display of the current page (as explained above). That is, the operation is performed according to the original settings of the current user page.
When the current page does not belong to a safety-related user page and no multi-touch is detected, the response usually depends on several different situations, such as whether the user has performed any touch action at all and whether the user has single-touched some position. Obviously, when the user performs no touch action, the handling is similar to that of the safety-related user page above: the normal display of the current page is maintained. When the user single-touches some position, the response is based on that position: if the user taps an icon on the current page, the original settings are followed to, for example, open an application or return to the previous menu; if the user taps a blank area of the current page, the original settings may be followed to give no response and keep the normal display of the current page.
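A corresponding sketch of step S60, again with hypothetical names, shows how a single-point touch (or the absence of a touch) could be dispatched according to the page's original settings: a tap on an icon activates it, while a tap on a blank area and the no-touch case simply keep the current display.

```python
# Hypothetical sketch of step S60: respond to a single-point touch
# (or to no touch at all) according to the page's original settings.
# Corresponds to page.handle_by_original_settings in the previous sketch.

def handle_by_original_settings(page, touch_event):
    if touch_event is None:
        # No touch action: keep the normal display of the current page.
        return

    icon = page.icon_at(touch_event.position)  # hypothetical hit test
    if icon is not None:
        # Tap on an icon: e.g. open the application or return to the
        # previous menu, as configured in the original settings.
        icon.activate()
    else:
        # Tap on a blank area: the original settings may simply ignore
        # it and keep the normal display.
        return
```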
Of course, the above only schematically illustrates controlling the user interface by touch actions. When touch is combined with physical knobs, voice control and the like, the corresponding settings can also be combined to adjust the implementations of step S50 and step S60. Thus, the present invention does not exclude the possibility of combining physical knobs, voice and other input modes. In addition, capacitive panels supporting multi-touch recognition are currently applied in various electronic devices, such as in-vehicle infotainment systems, tablet computers and smartphones. Therefore, the present invention does not limit the applicable electronic devices either.
Fig. 2 shows the implementation process of user interface control according to an embodiment of the present invention. Referring to Fig. 1 and Fig. 2, the touch-screen hardware is responsible for recognizing the operations that the user performs on its surface. The user interface presentation module is responsible for the image processing related to the page elements of the entire user interface, for example: the division of the user interface display regions, the realization of display contents and styles, the presentation of the menus of the infotainment functions, and so on. The touch operation arbitration module is responsible for combining the current page information provided by the user interface presentation module with the touch detection result of the touch-screen hardware, so as to arbitrate the touch operation.
After the user performs a multi-touch operation on the touch screen (such as a tap or swipe with multiple fingers), the touch-screen hardware sends the corresponding touch detection result to the touch operation arbitration module.
After obtaining the touch detection result, the touch operation arbitration module can confirm whether the result reflects a recognized multi-touch operation. In addition, by reading the current page information provided by the user interface presentation module, it confirms whether the current page belongs to a safety-related user page (for example, whether the current page is the reversing-camera page). As already described above, if the touch detection result reflects a recognized multi-touch operation and the current page does not belong to a safety-related user page, the touch operation arbitration module makes the decision to call up the predefined application menu, and this decision is sent to the user interface presentation module for the corresponding image processing.
After receiving this decision, the user interface presentation module displays the predefined application menu in the corresponding region of the touch screen according to the preset presentation mode.
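The cooperation between the three modules of Fig. 2 can likewise be sketched in code. This is an illustrative sketch only: the class and method names are assumptions and the bodies are placeholders, but the arbitrate() routine reflects the decision rule described above (a recognized multi-touch on a non-safety page triggers the menu, which the presentation module then renders in its preset region).

```python
# Illustrative sketch of the three cooperating modules described above.
# All class and method names are assumptions, not the patent's API.

class TouchScreenHardware:
    """Recognizes operations on the screen surface and reports them."""
    def read_detection_result(self):
        # Would return e.g. {"fingers": 3, "positions": [...]} or None.
        ...

class UiPresentationModule:
    """Owns page layout, display content/style and menu rendering."""
    def current_page_info(self):
        ...
    def show_predefined_menu(self, region="top",
                             items=("temperature", "mode", "fan_speed")):
        # Render the predefined application menu (e.g. the A/C status
        # bar) in the preset region of the touch screen.
        ...

class TouchArbitrationModule:
    """Combines page info and touch results to arbitrate the operation."""
    def __init__(self, hw, ui, safety_pages):
        self.hw, self.ui, self.safety_pages = hw, ui, safety_pages

    def arbitrate(self):
        result = self.hw.read_detection_result()
        page = self.ui.current_page_info()
        is_multi_touch = result is not None and result["fingers"] >= 2
        if is_multi_touch and page["name"] not in self.safety_pages:
            # Decision: call up the predefined application menu; the
            # presentation module performs the corresponding rendering.
            self.ui.show_predefined_menu()
```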
Fig. 3 shows a user interface change process according to one embodiment of the present invention. Suppose the user interface in this embodiment is the one presented on the touch screen of an in-vehicle infotainment system. As shown in Fig. 3, the current page displays the icons of four applications. After the user taps the touch screen of the infotainment system with three fingers at the same time, an air-conditioning status bar is displayed at the top of the current page. The items displayed on the air-conditioning status bar may include any one or more of: temperature, air-conditioning mode, fan speed, ventilated-seat heating state, etc. (three of them are illustrated in the figure). As can be seen from the above description, the air-conditioning status bar is exactly the predefined application menu, and its display position and display contents can also be determined by presets. Optionally, for some specific application examples, when the user further single-taps the screen in the region of the air-conditioning status bar, the current page switches to the air-conditioning settings page. Of course, as mentioned before, when the control of the user interface also supports knob operation, voice and the like, the corresponding implementation may be adjusted in combination with the actual settings. For example, when the air-conditioning status bar is already displayed, moving up or pressing the knob, or the user issuing a voice command about air-conditioning settings, can also switch the current page to the air-conditioning settings page.
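For the Fig. 3 example, the predefined application menu could be described by a small preset configuration, and the optional follow-up behavior (a single tap inside the status-bar region switching to the air-conditioning settings page) by a short handler. The configuration keys and helper names below are invented for illustration and are not part of the patent.

```python
# Hypothetical configuration for the Fig. 3 example: the predefined
# application menu is an A/C status bar shown at the top of the page.
PREDEFINED_MENU = {
    "trigger_fingers": 3,                           # three-finger tap calls it up
    "region": "top",                                # displayed at the top edge
    "items": ["temperature", "mode", "fan_speed"],  # three items shown in Fig. 3
}

def on_single_tap(ui, position):
    # Optional behavior: a single tap inside the status-bar region
    # switches the current page to the air-conditioning settings page.
    if ui.menu_visible and ui.menu_region_contains(position):
        ui.switch_to_page("ac_settings")
```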
Although the present invention is disclosed above by means of preferred embodiments, the present invention is not limited thereto. Any changes and modifications made by any person skilled in the art without departing from the spirit and scope of the present invention shall fall within the protection scope of the present invention; therefore, the protection scope of the present invention shall be defined by the claims.

Claims (10)

1. A multi-touch-based user interface control method, characterized by comprising: querying the current page of the user interface; and, when a multi-touch operation by the user is detected and the current user interface is found not to belong to a safety-related user page, displaying a predefined application menu on the current page of the user interface.
2. The multi-touch-based user interface control method according to claim 1, characterized in that the predefined application menu is displayed at an edge of the current page.
3. The multi-touch-based user interface control method according to claim 2, characterized in that the predefined application menu is displayed at the top of the current page.
4. The multi-touch-based user interface control method according to claim 1, characterized in that the multi-touch operation of the user comprises: the user touching the screen with two or more fingers at the same time.
5. The multi-touch-based user interface control method according to claim 1, characterized in that the application menu comprises an air-conditioning status bar.
6. The multi-touch-based user interface control method according to claim 5, characterized in that the display items of the air-conditioning status bar comprise any one or more of the following: temperature, fan speed, mode, ventilated-seat heating state.
7. The multi-touch-based user interface control method according to claim 1, characterized by further comprising: after the predefined application menu is displayed, displaying a related menu according to the user's touch operation on the application menu.
8. The multi-touch-based user interface control method according to claim 7, characterized in that the application menu is an air-conditioning status bar, and after the user single-taps the air-conditioning status bar, the current page is switched to an air-conditioning settings page.
9. The multi-touch-based user interface control method according to claim 1, characterized in that the safety-related user page comprises any one of the following: a user page related to reversing, a user page related to an emergency call, a user page related to roadside assistance.
10. The multi-touch-based user interface control method according to claim 9, characterized in that the user page related to reversing comprises: a reversing-radar page or a reversing-camera page.
CN201610157017.7A 2016-03-18 2016-03-18 User interface control method based on multi-touch Pending CN107203320A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610157017.7A CN107203320A (en) 2016-03-18 2016-03-18 User interface control method based on multi-touch

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610157017.7A CN107203320A (en) 2016-03-18 2016-03-18 User interface control method based on multi-touch

Publications (1)

Publication Number Publication Date
CN107203320A (en) 2017-09-26

Family

ID=59904479

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610157017.7A Pending CN107203320A (en) 2016-03-18 2016-03-18 User interface control method based on multiple point touching

Country Status (1)

Country Link
CN (1) CN107203320A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040021704A1 (en) * 2002-06-03 2004-02-05 Fuji Xerox Co. Ltd. Function control unit and method thereof
CN101180599A (en) * 2005-03-28 2008-05-14 松下电器产业株式会社 User interface system
CN101563666A (en) * 2006-12-22 2009-10-21 松下电器产业株式会社 User interface device
US20110210928A1 (en) * 2010-03-01 2011-09-01 Kouichi Matsuda Information processing apparatus, information processing method, and program
CN102485536A (en) * 2010-12-03 2012-06-06 上海博泰悦臻电子设备制造有限公司 Control method and device of vehicular system and vehicular system
JP5172485B2 (en) * 2008-06-10 2013-03-27 シャープ株式会社 Input device and control method of input device
CN104898877A (en) * 2014-03-06 2015-09-09 丰田自动车株式会社 Information processing apparatus

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040021704A1 (en) * 2002-06-03 2004-02-05 Fuji Xerox Co. Ltd. Function control unit and method thereof
CN101180599A (en) * 2005-03-28 2008-05-14 松下电器产业株式会社 User interface system
CN101563666A (en) * 2006-12-22 2009-10-21 松下电器产业株式会社 User interface device
US20090309848A1 (en) * 2006-12-22 2009-12-17 Tomohiro Terada User interface device
JP5172485B2 (en) * 2008-06-10 2013-03-27 シャープ株式会社 Input device and control method of input device
US20110210928A1 (en) * 2010-03-01 2011-09-01 Kouichi Matsuda Information processing apparatus, information processing method, and program
CN102485536A (en) * 2010-12-03 2012-06-06 上海博泰悦臻电子设备制造有限公司 Control method and device of vehicular system and vehicular system
CN104898877A (en) * 2014-03-06 2015-09-09 丰田自动车株式会社 Information processing apparatus

Similar Documents

Publication Publication Date Title
JP4522475B1 (en) Operation input device, control method, and program
EP2508964B1 (en) Touch operation determination device, and touch operation determination method and program
US20150309657A1 (en) Mobile terminal and control method thereof
US20120203544A1 (en) Correcting typing mistakes based on probabilities of intended contact for non-contacted keys
CN102902471B (en) Input interface switching method and input interface switching device
US20100238129A1 (en) Operation input device
EP3379812B1 (en) Apparatus and method for controlling operation of mobile terminal
EP2148267B1 (en) Mobile device having touch screen and method for setting virtual keypad thereof
CN103927119B (en) Switch to the method and system at account interface
WO2013029257A1 (en) Vehicle's interactive system
US20130298079A1 (en) Apparatus and method for unlocking an electronic device
CN104035706A (en) Display method and electronic device
CN104808943A (en) Input implementation method, input implementation device and portable terminal of virtual keyboard
JP2004355426A (en) Software for enhancing operability of touch panel and terminal
CN107179849B (en) Terminal, input control method thereof, and computer-readable storage medium
CN106681626A (en) Intelligent vehicular navigation operation method
CN111104038A (en) Application function processing method and electronic equipment
CN105283829B (en) Method for operating a touch-sensitive operating system and touch-sensitive operating system
CN108700990B (en) Screen locking method, terminal and screen locking device
EP3249878B1 (en) Systems and methods for directional sensing of objects on an electronic device
CN103135896A (en) Positioning method and electronic device
CN111104035B (en) Display interface control method, device, equipment and computer readable storage medium
CN104866218A (en) Control method of electronic touch equipment
CN107203320A (en) User interface control method based on multi-touch
WO2022199540A1 (en) Unread message identifier clearing method and apparatus, and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 200082 538 Dalian Road, Yangpu District, Shanghai

Applicant after: Continental Investment (China) Co., Ltd.

Address before: 200082 538 Dalian Road, Yangpu District, Shanghai

Applicant before: Continental Automotive Asia Pacific (Beijing) Co., Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20170926