CN107967091A - Human-computer interaction method and computing device for human-computer interaction - Google Patents

Human-computer interaction method and computing device for human-computer interaction

Info

Publication number
CN107967091A
CN107967091A (application CN201711345498.5A)
Authority
CN
China
Prior art keywords
display screen
touch
touch pad
computing device
human
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201711345498.5A
Other languages
Chinese (zh)
Other versions
CN107967091B (en)
Inventor
刘霞
张建伟
胡军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bright Wind Taiwan (shanghai) Mdt Infotech Ltd
Original Assignee
Bright Wind Taiwan (shanghai) Mdt Infotech Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bright Wind Taiwan (shanghai) Mdt Infotech Ltd
Priority to CN201711345498.5A priority Critical patent/CN107967091B/en
Publication of CN107967091A publication Critical patent/CN107967091A/en
Application granted granted Critical
Publication of CN107967091B publication Critical patent/CN107967091B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547Touch pads, in which fingers can move on a surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04809Textured surface identifying touch areas, e.g. overlay structure for a virtual keyboard
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/167Audio in a user interface, e.g. using voice commands for navigating, audio feedback

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The purpose of the present application is to provide a human-computer interaction method and a computing device for human-computer interaction. The method comprises: acquiring a user's touch operation on the touch pad; determining second position information of the touch operation on the display screen based on first position information of the touch operation on the touch pad and a coordinate mapping relation between the display screen and the touch pad; and executing the touch operation according to the second position information. Compared with the prior art, the application lets the user quickly locate a position on the display screen through the touch pad and then interact with the information displayed at that position, dispensing with the repeated slide-and-release operations on the touch pad. This effectively reduces the time consumed by this mode of human-computer interaction, improves its flexibility, and makes the interaction friendlier.

Description

Human-computer interaction method and computing device for human-computer interaction
Technical field
The present application relates to computer technology, and in particular to human-computer interaction for a computing device.
Background technology
In existing smart devices, such as smart glasses or smart helmets, the mainstream input mode is for the user to perform control operations on a touch pad. When the display screen of the smart device and the touch pad used for input are not within the same field of view, and information needs to be entered on the display screen interface, the user generally has to control the input method indirectly through the touch pad. For example, for aesthetic and ergonomic reasons, smart glasses typically place the display screen in front of the eyes for convenient viewing, while the touch pad is placed elsewhere on the frame, such as on its left or right side. For such smart devices, when the user operates the device, the display screen sits directly in front of the user's line of sight and the input method is shown on the display screen; the user then usually slides in four directions on the touch pad and clicks to control the input method. For example, to select a specific key in the input method via the touch pad, the user must repeatedly operate relative to the currently focused key, e.g. slide up, down, left or right, releasing the finger after each slide, and repeat this slide-and-release combination until the desired key is found. This input mode is hard to control and time-consuming; the user experience is inflexible and the interaction is not friendly enough.
Summary of the invention
The purpose of the present application is to provide a human-computer interaction method and a computing device for human-computer interaction.
According to one aspect of the present application, a human-computer interaction method for a computing device is provided, comprising:
acquiring a user's touch operation on the touch pad;
determining second position information of the touch operation on the display screen based on first position information of the touch operation on the touch pad and a coordinate mapping relation between the display screen and the touch pad;
executing the touch operation according to the second position information.
According to another aspect of the present application, a computing device for human-computer interaction is provided, comprising:
a touch operation acquisition means for acquiring a user's touch operation on the touch pad;
a determination means for determining second position information of the touch operation on the display screen based on first position information of the touch operation on the touch pad and a coordinate mapping relation between the display screen and the touch pad;
a touch operation execution means for executing the touch operation according to the second position information.
According to yet another aspect of the present application, a computing device for human-computer interaction is also provided, comprising:
a display screen and a touch pad that are separated from each other;
one or more processors;
a memory; and
one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the programs including instructions for performing the following operations:
acquiring a user's touch operation on the touch pad;
determining second position information of the touch operation on the display screen based on first position information of the touch operation on the touch pad and a coordinate mapping relation between the display screen and the touch pad;
executing the touch operation according to the second position information.
According to still another aspect of the present application, a computer-readable storage medium is also provided, on which a computer program is stored, the computer program being executable by a processor to perform the following operations:
acquiring a user's touch operation on the touch pad;
determining second position information of the touch operation on the display screen based on first position information of the touch operation on the touch pad and a coordinate mapping relation between the display screen and the touch pad;
executing the touch operation according to the second position information.
Compared with the prior art, after acquiring the user's touch operation on the touch pad, the present application determines the second position information of the touch operation on the display screen based on the first position information of that operation on the touch pad, combined with the coordinate mapping relation between the display screen and the touch pad, which are separated from each other in the computing device, and then executes the touch operation based on the second position information. Because a preset coordinate mapping relation exists between the display screen and the touch pad, every position on the display screen has a matching position on the touch pad. When the user's touch operation touches first position information on the touch pad, the execution of the touch operation acts, based on the pre-established coordinate mapping relation, on the second position information on the display screen that matches that first position information. For example, if interactive information awaiting selection is presented at the second position information, the user can select that information through the touch operation at the first position information. Therefore, based on the present application, the user can quickly locate a position on the display screen through the touch pad and then interact with the information displayed at that position, dispensing with the repeated slide-and-release operations on the touch pad. This effectively reduces the time consumed by this mode of human-computer interaction, improves the flexibility of human-computer interaction, and makes the interaction friendlier.
Brief description of the drawings
Other features, objects and advantages of the present application will become more apparent by reading the following detailed description of non-limiting embodiments made with reference to the accompanying drawings:
Fig. 1 shows a schematic diagram of a human-computer interaction method for a computing device according to one aspect of the present application;
Fig. 2 shows a schematic diagram of a computing device for human-computer interaction according to one aspect of the present application;
Fig. 3 shows an example, according to one embodiment of the present application, in which the shape of the effective touch area of the touch pad matches the shape of the human-computer interaction interface on the display screen;
Fig. 4 shows an example of the coordinate mapping relation between the human-computer interaction interface on the display screen and the touch pad according to one embodiment of the present application.
The same or similar reference numerals in the drawings denote the same or similar components.
Detailed description of the embodiments
The present application is described in further detail below with reference to the accompanying drawings.
In a typical configuration of the present application, the terminal, the device of the service network and the computing device each include one or more processors (CPUs), an input/output interface, a network interface and a memory.
The memory may include computer-readable media in the form of volatile memory, random access memory (RAM) and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassette, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
Fig. 1 shows a schematic diagram of a human-computer interaction method for a computing device 1 according to one aspect of the present application. The method includes step S11, step S12 and step S13. In step S11, the computing device 1 acquires the user's touch operation on the touch pad; then, in step S12, the computing device 1 determines second position information of the touch operation on the display screen based on first position information of the touch operation on the touch pad and the coordinate mapping relation between the display screen and the touch pad; then, in step S13, the computing device 1 executes the touch operation according to the second position information.
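As a non-authoritative illustration of steps S11 to S13, the following sketch maps a touch on the pad to the display screen with a simple proportional mapping. The event format, pad size and screen size are all assumptions made for illustration, not details taken from the patent.

```python
# Hypothetical sketch of steps S11-S13: acquire a touch event on the
# touch pad, map its pad coordinates to screen coordinates, then execute
# the operation at the mapped position. All names are illustrative.

def map_pad_to_screen(x0, y0, pad_size, screen_size):
    """Step S12: proportional coordinate mapping from touch pad to display screen."""
    pw, ph = pad_size
    sw, sh = screen_size
    return (x0 / pw * sw, y0 / ph * sh)

def handle_touch(event, pad_size, screen_size):
    # Step S11: first position information on the touch pad
    x0, y0 = event["pos"]
    # Step S12: second position information on the display screen
    x, y = map_pad_to_screen(x0, y0, pad_size, screen_size)
    # Step S13: the touch operation is executed at the mapped position
    return {"op": event["op"], "screen_pos": (x, y)}

result = handle_touch({"op": "select", "pos": (40, 15)}, (80, 30), (640, 480))
# maps (40, 15) on an 80x30 pad to (320.0, 240.0) on a 640x480 screen
```

A real implementation would replace the returned dictionary with a dispatch into the device's event system; the proportional mapping here is only one possible form of the coordinate mapping relation.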
Here, the computing device 1 may include, but is not limited to, various smart devices, e.g. smart appliances such as a smart television, or portable smart devices such as smart glasses or a smart helmet head-mounted display device. In one implementation, the smart glasses, smart helmet, etc. may also carry functions such as AR, VR or MR.
In one embodiment, the computing device includes smart glasses or a smart helmet. Further, in one embodiment, the computing device 1 includes smart glasses, and the touch pad is deployed on a temple of the frame of the smart glasses; for example, it may be deployed on one of the temples, or touch pads may be deployed on both temples as needed. The touch pad may also be deployed, based on the actual shape of the smart glasses, in another region separated from the corresponding display screen. In one embodiment, the computing device includes a smart helmet, and the touch pad may be deployed on the left, right or rear surface region of the smart helmet. In practice, the touch pad may be deployed on any surface region of the smart helmet that is separated from the display screen, for example any region of the helmet surface the user can touch, provided the region where the touch pad lies does not overlap the region where the display screen lies, such as the left, right or rear surface region of the helmet. As another example, if the display screen is deployed inside the smart helmet, e.g. directly in front of the user's field of view when the helmet is worn, the touch pad may also be deployed on the helmet surface behind the display screen.
In the present application, the computing device 1 includes a display screen and a touch pad that are separated from each other. In one implementation, this separation means the two are spatially apart rather than integrated or superimposed on each other. In one embodiment, when the computing device 1 is in use, the touch pad is outside the user's visual range; the touch pad and the display screen are then not in the same field of view of the user. For example, if the computing device 1 includes smart glasses and the touch pad is deployed on a temple of the frame, the touch pad is outside the user's visual range while the display screen, deployed on the lenses, is within it, so the user cannot see the touch pad and the display screen at the same time. As another example, if the computing device 1 has two opposite surfaces, with the display screen deployed on the front surface, within the user's field of view during use, and the touch pad deployed on the rear surface, i.e. the back of the device, then in use the touch pad is outside the user's visual range and not in the same field of view as the display screen. In another embodiment, when the computing device 1 is in use, the touch pad and the display screen are both within the user's visual range. For example, if the computing device 1 includes a smart television and the touch pad is deployed on the remote control of the smart television, the touch pad and the display screen of the smart television can be visible to the user at the same time.
In one implementation, the touch pad may be integrated into the computing device 1, e.g. the touch pad of the smart glasses is integrated on the frame; the touch pad may also be a relatively independent component of the computing device 1, e.g. the touch pad of the smart television may be an independent control device. In one implementation, the touch pad may support single-point touch or multi-point touch; that is, in the present application the user may touch the touch pad with one finger or several.
In the present application, the display screen may include a display screen carried by the computing device 1 itself, e.g. one or several display screens; it may also include a projection surface projected by a projection apparatus included in or connected to the computing device 1; or it may include a combination of the above. In one implementation, the size of the projection surface can be set by the projection apparatus corresponding to the computing device 1, such as an external projector or a built-in projection assembly: once the apparatus sets the resolution of the projection and the projection distance is determined, the size of the corresponding projection surface can be calculated.
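The remark above, that the projection surface size can be derived once the resolution and projection distance are set, can be sketched as follows. The fixed-throw-ratio model and all numbers are illustrative assumptions; the patent does not specify the projection optics.

```python
# Illustrative calculation: given a projector's throw ratio
# (distance / image width) and the aspect ratio implied by its set
# resolution, derive the projection surface size at a given distance.

def projection_size(distance_m, throw_ratio, res_w, res_h):
    width = distance_m / throw_ratio   # throw ratio = distance / width
    height = width * res_h / res_w     # aspect follows the set resolution
    return width, height

w, h = projection_size(distance_m=2.0, throw_ratio=1.25, res_w=1280, res_h=720)
# 2.0 m at throw ratio 1.25 gives a surface of about 1.6 m x 0.9 m
```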
Specifically, in step S11, the computing device 1 acquires the user's touch operation on the touch pad. In one implementation, the touch operation may include, but is not limited to: clicking, such as a single click or several clicks; pressing, such as presses of different pressure or different duration; moving; releasing by lifting the finger; and combinations of the above. Those skilled in the art should understand that these touch operations are only examples; other existing or future touch operations, if applicable to the present application, should also fall within its scope of protection and are incorporated herein by reference. The touch operation submitted by the user may be performed with a finger, or with another touch component such as a stylus.
In one implementation, the touch operation can be used to interact with the interactive information on the display screen, the mode of the touch operation matching the result of its subsequent execution. For example, executing the touch operation can implement any customized interactive operation on the corresponding interactive information on the display screen, such as selection, confirmation or deletion.
Then, in step S12, the computing device 1 determines second position information of the touch operation on the display screen based on first position information of the touch operation on the touch pad and the coordinate mapping relation between the display screen and the touch pad.
In one implementation, the coordinate mapping relation between the display screen and the touch pad may be a mapping between the display screen as a whole and the touch pad; since the display screen and the touch pad are fixed, once this coordinate mapping relation has been pre-established it remains relatively fixed. In another implementation, the coordinate mapping relation may instead be between a human-computer interaction interface on the display screen and the touch pad. Here, the display screen may include a human-computer interaction interface, i.e. the interactive information on the display screen is the human-computer interaction interface. In one implementation, the human-computer interaction interface includes an input method interface, which may include letter keys, number keys, symbol keys and other function keys; the user can input the corresponding information by operating the input method interface. The human-computer interaction interface may also include various other interactive interfaces, such as a search interface or a function selection interface, or a combination of the above. Those skilled in the art should understand that these human-computer interaction interfaces are only examples; other existing or future types, if applicable to the present application, should also fall within its scope of protection and are incorporated herein by reference. In this case, since the human-computer interaction interface on the display screen can change with the actual application, the coordinate mapping relation between the interface and the touch pad can also change flexibly.
Here, under the coordinate mapping relation, each position on the display screen has a matching position on the touch pad. For example, when the user's touch operation touches first position information (x0, y0) on the touch pad, the execution of the touch operation acts, based on the pre-established coordinate mapping relation, on the second position information (x, y) on the display screen that matches the first position information.
For example, the display screen shows a key row of an input method: 1234567890. If the user wants to find key "0", it suffices to touch the first position information on the touch pad that matches the second position information of key "0" on the display screen.
As another example, with the same key row 1234567890, suppose the user intends to move from the current key "0" to key "1"; or the user's intended target is key "8" but, due to a deviation, the touch operation landed on the first position information corresponding to "0". In these scenarios, the user can slide from the first position information corresponding to the second position information of key "0" to the first position information corresponding to the second position information of key "1" or key "8", thereby selecting key "1" or key "8". A touch operation performed under the coordinate mapping relation can thus be completed with one continuous slide of the finger, without the repeated slide-and-release operations of the prior art, effectively reducing the time consumed by the interaction.
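The key-row scenario above can be sketched as a simple lookup from a touch pad x-coordinate to the key at the matching screen position. The 100-unit pad width and the single-row layout are assumptions made for illustration.

```python
# Illustrative lookup for the "1234567890" key row: the pad x-coordinate
# is mapped proportionally onto the row, and a continuous slide simply
# changes which key the mapped position falls on.

KEYS = "1234567890"

def key_at_pad_x(x0, pad_width):
    """Return the key whose screen position matches pad x-coordinate x0."""
    index = min(int(x0 / pad_width * len(KEYS)), len(KEYS) - 1)
    return KEYS[index]

# one continuous slide from near the right edge toward the left
path = [key_at_pad_x(x, 100) for x in (95, 75, 5)]
# passes key "0", then key "8", then key "1"
```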
Then, in step S13, the computing device 1 performs the touch operation according to the second position information. Here, performing the touch operation is the response to the user's touch operation at the first position information. In one implementation, performing the touch operation may realize any customized interactive operation on the corresponding interactive information in the display screen, such as selection, confirmation, or deletion. For example, the touch operation may include a selection operation whose manner includes a sliding operation: when the user's finger slides past a first position information on the touch pad, the touch operation performed is the selection of the interactive information at the second position information in the display screen corresponding to that first position information. For instance, if the human-computer interaction interface in the display screen is an input method interface and the user's finger slides past the number key "1", the selection operation on the number key "1" is performed. As another example, the touch operation may include a confirmation operation. In one implementation, the manner of the touch operation includes a pressing operation; continuing the example above, confirming the selected number key "1" means determining that the selected number key "1" is to be input, so after the user performs a pressing operation at the first position information matching the second position information of the number key "1", the confirmation operation on the number key "1" is performed. In another implementation, the manner of the confirmation operation may include a release operation; continuing the example, after the user has selected the number key "1" by a sliding operation, the confirmation operation on the number key "1" is completed by lifting the finger at that first position information. In yet another implementation, the manner of the confirmation operation may further include automatic confirmation when the touch duration exceeds a predetermined length; continuing the example, after the user slides to the first position information corresponding to the number key "1" and dwells at that position beyond the predetermined duration, the confirmation operation is performed automatically.
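The select-by-slide and confirm-by-press/release/dwell behavior described above can be sketched as a small event handler. This is only an illustrative sketch: `map_to_screen`, `key_at`, and the 0.8-second dwell threshold are assumptions not specified by the text, which only requires that the dwell exceed "a predetermined duration".

```python
import time

DWELL_CONFIRM_S = 0.8  # assumed threshold; the text only requires "exceeds a predetermined duration"

class TouchHandler:
    """Selection via sliding; confirmation via press, release, or dwell."""

    def __init__(self, map_to_screen, key_at):
        # map_to_screen: hypothetical pad-to-screen coordinate mapping
        # key_at: hypothetical lookup of the key shown at a screen position
        self.map_to_screen = map_to_screen
        self.key_at = key_at
        self.selected = None    # currently selected key, if any
        self.entered_at = None  # when the finger entered the current key

    def on_slide(self, x0, y0, now=None):
        # Sliding past a pad position selects the key at the mapped screen position.
        key = self.key_at(*self.map_to_screen(x0, y0))
        if key != self.selected:
            self.selected = key
            self.entered_at = now if now is not None else time.monotonic()
        return self.selected

    def on_press(self):
        # Pressing confirms the currently selected key.
        return self.selected

    def on_release(self):
        # Lifting the finger likewise confirms the current selection.
        return self.selected

    def on_dwell(self, now):
        # Dwelling on one key beyond the threshold auto-confirms it.
        if self.selected is not None and now - self.entered_at >= DWELL_CONFIRM_S:
            return self.selected
        return None
```

For a row "1234567890" laid out one key per pad unit, sliding to pad position 0 selects "1", and any of press, release, or a sufficiently long dwell then confirms it.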
In one embodiment, the method further includes step S14 (not shown). In step S14, the computing device 1 may display the touch operation on the display screen according to the second position information. For example, if the touch operation includes a sliding operation, then as the finger slides past each first position information on the touch pad, the interactive information at each corresponding second position information on the display screen is displayed differently in turn; for instance, the keys passed over in an input method interface are highlighted one after another, or the shape, size, or color of the currently selected key is changed. This lets the user follow the progress of the touch operation, making the human-computer interaction friendlier.
Alternatively, the execution result information of the touch operation is displayed on the display screen. For example, the touch operation includes a confirmation operation on a key in the input method interface: in the English input state, if the letter "a" is confirmed for input by the touch operation, the selected letter "a" is displayed in the corresponding presentation region on the display screen, such as at the text cursor; in the Pinyin input state, if the letter "a" is confirmed for input, the corresponding region of the input method interface will present the Chinese character candidates that may follow the confirmed letter "a".
Here, those skilled in the art should understand that the above manners of displaying the touch operation on the display screen according to the second position information, or of displaying the execution result information of the touch operation on the display screen, are examples only. Other such manners, existing now or arising in the future, that are applicable to this application shall also fall within its scope of protection and are incorporated herein by reference.
In one implementation, the touch pad may support multi-point touch operations, in which case multiple simultaneous touch operations by the user on the touch pad can be obtained, such as the touch operations of several of the user's fingers. For example, when multiple fingers each move to their respective first position information, the corresponding second position information for each is determined, e.g. the corresponding keys in the input method interface are highlighted; the touch operation can then be performed according to each second position information in an expected confirmation order, e.g. the user releases the fingers at the first position information corresponding to each second position information one by one, realizing the corresponding confirmation operations in turn.
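The release-order confirmation just described can be sketched as follows: each finger locks one key, and lifting a finger confirms its key, so keys are confirmed in the order the fingers are released. The class and its names are illustrative assumptions, not part of the text.

```python
class MultiTouchConfirm:
    """Sketch: each finger locks one key; releasing a finger confirms its key,
    so keys are confirmed in finger-release order."""

    def __init__(self):
        self.locked = {}     # finger id -> locked key
        self.confirmed = []  # keys confirmed so far, in release order

    def touch(self, finger_id, key):
        # A finger resting on a pad position locks the mapped key.
        self.locked[finger_id] = key

    def release(self, finger_id):
        # Lifting a finger confirms (and returns) the key it had locked.
        key = self.locked.pop(finger_id, None)
        if key is not None:
            self.confirmed.append(key)
        return key

pad = MultiTouchConfirm()
pad.touch(finger_id=0, key="1")
pad.touch(finger_id=1, key="2")
print(pad.release(1), pad.release(0))  # 2 1
print(pad.confirmed)                   # ['2', '1']
```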
Here, after obtaining the user's touch operation on the touch pad, this application determines the second position information of the touch operation in the display screen based on the first position information in the touch pad, combined with the coordinate mapping relations between the display screen and the touch pad, which are separated in the computing device 1, and then performs the touch operation based on the second position information. Because preset coordinate mapping relations exist between the display screen and the touch pad, each position information on the display screen has a matching position information on the touch pad; when the user touches a first position information on the touch pad, the execution of the touch operation acts, via the pre-established coordinate mapping relations, on the second position information in the display screen that matches the first position information. If, for example, interactive information to be selected is presented at the second position information, the user's touch operation at the first position information realizes the selection of that interactive information. Therefore, based on this application, the user can quickly locate a position in the display screen through the touch pad and then perform interactive operations on the information displayed there, dispensing with repeated slide-and-release operations up and down the touch pad. This effectively reduces the time consumed by such human-computer interaction and improves its flexibility, making the interactive operations friendlier.
In another embodiment, the method further includes step S15 (not shown). In step S15, the computing device 1 may establish the coordinate mapping relations between the human-computer interaction interface in the display screen and the touch pad; then, in step S12, the computing device 1 may determine the second position information of the touch operation in the display screen based on the first position information of the touch operation in the touch pad and the coordinate mapping relations between the human-computer interaction interface and the touch pad.
Here, the display screen may contain a human-computer interaction interface, i.e., the interactive information in the display screen is the human-computer interaction interface. In one implementation, the human-computer interaction interface includes an input method interface, which may contain letter keys, number keys, symbol keys, other function keys, and so on; the user's operations on the input method interface realize the input of corresponding information. The human-computer interaction interface may also include various other interactive interfaces, such as a search interface or a function selection interface.
In this application, to spare the user the cumbersome slide-and-release operations on the touch pad, and to allow quick location of interactive information in the display screen through the touch pad, such as locating a key in the input method interface, thereby reducing time overhead, the human-computer interaction interface in the display screen can be scaled relative to the touch pad in proportion, forming the coordinate mapping relations. In one implementation, one-to-one coordinate mapping relations can be established between the region of the touch pad and the human-computer interaction interface according to the arrangement and positional relationships of the interactive interface units. Taking the input method interface as an example, the interactive interface units may include each letter key, number key, symbol key, other function key, etc. of the input method interface.
An example of establishing the coordinate mapping relations between the human-computer interaction interface in the display screen and the touch pad is illustrated in Fig. 4. Suppose the resolution of the touch pad is x1*y1 and the resolution of the display screen is x2*y2, and suppose the user's finger touches the first position information (x0, y0) on the touch pad, which falls on the second position information (x, y) in the input method interface of the display screen. The calculation formulas are:

x = x0 * x2 / x1

y = y0 * y2 / y1

Thus the coordinate mapping relations between the human-computer interaction interface in the display screen and the touch pad can be obtained.
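The proportional mapping above can be written as a one-line function. This is a minimal sketch under the stated assumptions (pad resolution x1*y1, screen resolution x2*y2); the function name is illustrative.

```python
def touchpad_to_screen(x0, y0, touchpad_res, screen_res):
    """Map a touch position (x0, y0) on the touch pad to the display screen.

    touchpad_res: (x1, y1) resolution of the touch pad
    screen_res:   (x2, y2) resolution of the display screen
    Uses the proportional mapping x = x0 * x2 / x1, y = y0 * y2 / y1.
    """
    x1, y1 = touchpad_res
    x2, y2 = screen_res
    return (x0 * x2 / x1, y0 * y2 / y1)

# A touch at the centre of a 400x300 pad maps to the centre of a 1920x1080 screen.
print(touchpad_to_screen(200, 150, (400, 300), (1920, 1080)))  # (960.0, 540.0)
```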
Here, those skilled in the art should understand that the above manner of establishing the coordinate mapping relations between the human-computer interaction interface in the display screen and the touch pad is an example only. Other such manners, existing now or arising in the future, that are applicable to this application shall also fall within its scope of protection and are incorporated herein by reference.
In one implementation, the establishment of the overall coordinate mapping relations between the display screen and the touch pad may refer to the method for establishing the coordinate mapping relations between the human-computer interaction interface in the display screen and the touch pad. In this case, since the display screen and the touch pad are fixed, the coordinate mapping relations are also relatively fixed.
Further, in one embodiment, in step S15, the computing device 1 may establish the coordinate mapping relations between the human-computer interaction interface in the display screen and the touch pad whenever the human-computer interaction interface in the display screen is updated. In practical applications, the coordinate mapping relations can be re-established during switches of the human-computer interaction interface. For example, the coordinate mapping relations are re-established after the user changes the mode of the input method interface; likewise, if other interactive information of the human-computer interaction interface is switched or changed, the coordinate mapping relations can also be re-established. This ensures the matching degree and accuracy of the human-computer interaction operations and brings a friendly user experience.
In one embodiment, in step S15, the computing device 1 establishes the coordinate mapping relations between the human-computer interaction interface in the display screen and an effective touch area in the touch pad. In practical applications, the coordinate mapping relations with the human-computer interaction interface in the display screen can be established based on part or all of the touch area of the touch pad. If the effective touch area is only part of the touch area, only touch operations the user performs within that region are effective; if the effective touch area includes the whole touch area, touch operations performed anywhere on the touch pad are effective.
Further, in one embodiment, the method further includes step S16 (not shown). In step S16, the computing device 1 provides corresponding operation prompt information when the first position information of the touch operation in the touch pad falls outside the effective touch area. In one implementation, the operation prompt information may be used to inform the user that the current touch area is invalid, or to guide the user into the effective touch area to perform an effective touch operation, e.g. prompting "Please tap within the effective interaction area" or "Please move the touch toward the xx direction". In one implementation, the operation prompt information may include, but is not limited to, a voice prompt or a text-and-picture prompt on the display screen.
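The effective-area check and directional guidance of step S16 can be sketched as follows. The rectangular (left, top, right, bottom) representation of the effective touch area and the prompt wording are illustrative assumptions; the text allows any region shape and any prompt form.

```python
def check_touch(x0, y0, effective_area):
    """Return (in_area, prompt) for a touch at (x0, y0).

    effective_area: assumed (left, top, right, bottom) rectangle on the pad.
    A touch outside the rectangle yields a prompt guiding the user back in.
    """
    left, top, right, bottom = effective_area
    if left <= x0 <= right and top <= y0 <= bottom:
        return True, None
    hint = []
    if x0 < left:
        hint.append("right")
    elif x0 > right:
        hint.append("left")
    if y0 < top:
        hint.append("down")
    elif y0 > bottom:
        hint.append("up")
    return False, "Please move the touch toward: " + ", ".join(hint)

print(check_touch(50, 50, (10, 10, 90, 90)))  # (True, None)
print(check_touch(5, 95, (10, 10, 90, 90)))
# (False, 'Please move the touch toward: right, up')
```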
In one embodiment, the shape of the effective touch area matches the shape of the human-computer interaction interface.
In practical applications, the shape of the human-computer interaction interface in the display screen, such as the layout shape of the input method interface, may vary, so the shape of the effective touch area can be set to match the shape of the human-computer interaction interface. Moreover, the human-computer interaction interface, such as the input method interface, may be presented at any position in the display screen. Fig. 3 shows several groups of examples, according to one embodiment of this application, in which the shape of the effective touch area of the touch pad matches the shape of the human-computer interaction interface of the display screen. The input method interface may occupy the whole display screen or only a partial region of it. In one implementation, the layout of the effective touch area in the touch pad can be adjusted according to the shape of the input method interface in the display screen. For example, if the shape of the input method interface is square, the shape of the effective touch area of the touch pad is preferably also square, so that the established mapping relations are more accurate and the touch precision is higher; as another example, if the shape of the input method interface is an ellipse, the effective area of the touch pad may still be a rectangle. In one implementation, the mapping relations can be established even if the shape of the effective touch area and the shape of the human-computer interaction interface are not exactly the same. In one implementation, when the user needs to use the input method, for example in scenarios such as entering a WiFi password, typing, or inputting a contact, the input method interface pops up and is displayed in the display screen; therefore, the shape of the effective touch area on the touch pad can be flexibly matched or adjusted based on the change of shape of the different input method interfaces under different application scenarios.
In one embodiment, the second position information includes the position information of one or more interactive interface units on the human-computer interaction interface in the display screen. In one implementation, the interactive interface units are the effective objects provided in the human-computer interaction interface for the user to perform interactive operations on. For example, in the input method interface, each key, such as a letter key, number key, symbol key, or other function key, may correspond to one interactive interface unit. Here, the second position information may correspond to the position information of only one interactive interface unit on the human-computer interaction interface in the display screen, or to the position information of multiple interactive interface units at once.
Further, in one embodiment, in step S13, the computing device 1 may perform the touch operation, according to the second position information, on the one or more interactive interface units corresponding to the second position information.
Specifically, in one implementation, if the second position information corresponds to only one interactive interface unit, performing the touch operation can directly realize any customized interactive operation on that interactive interface unit, such as selection, confirmation, or deletion. For example, the touch operation may include a selection operation whose manner includes a sliding operation: when the finger slides past a first position information on the touch pad, the touch operation performed is the selection of the interactive information at the second position information in the display screen corresponding to that first position information. For instance, if the human-computer interaction interface in the display screen is an input method interface and the user's finger slides past the number key "1", the selection operation on the number key "1" is performed.
In another implementation, the second position information includes the position information of multiple interactive interface units on the human-computer interaction interface in the display screen. In this case, in step S13, the computing device 1 performs the touch operation, according to the second position information, on the multiple interactive interface units corresponding to the second position information, and the method further includes step S17 (not shown): in step S17, the computing device 1 may obtain the user's selection operation on the touch pad and perform that selection operation among the multiple interactive interface units corresponding to the second position information, thereby realizing the selection of a target interactive interface unit. In one implementation, if the user expects to choose one or several target interactive interface units among the multiple interactive interface units corresponding to the second position information, then the result of the computing device 1 performing the touch operation on those units according to the second position information is a lock on the second position information; on the basis of that touch operation, a further touch control operation by the user on the touch pad, i.e., the selection operation, realizes the selection of the target interactive interface unit.
For example, when the second position information corresponds to the positions of two keys in the input method interface, both keys are highlighted, and a further up/down or left/right touch selects one of them; similarly, when the second position information corresponds to the adjacent positions of four keys in the input method interface, all four keys are highlighted, and a further up/down or left/right touch selects one of them. Those skilled in the art should understand that the above selection operations are examples only; other selection operations, existing now or arising in the future, that are applicable to this application shall also fall within its scope of protection and are incorporated herein by reference.
As another example, there may be multiple target interactive interface units among the multiple interactive interface units corresponding to the second position information, e.g. the second position information corresponds to multiple keys in the input method interface of which several are to be chosen in succession. In that case, the user's successive selection operations can be obtained in the order in which the target interactive interface units are expected to be selected. For instance, when the second position information corresponds to the positions of two keys in the input method interface and both keys are to be chosen in succession, both keys are highlighted; the touch operation at this point is in effect a lock on the two keys corresponding to the second position information. A further up/down or left/right touch then selects the first target key in the order. The selection operation may include, but is not limited to, a pressure operation or a release operation; completing one selection confirms one target key, after which the selection operation for the next target key can proceed. For example, if the selection operation includes a pressure operation, the user can, without lifting the finger, continue with further up/down or left/right touches to select the next target key in the order and perform the pressure operation again, completing the selection of that key; as another example, if the selection operation includes a release operation, then within a preset operation duration the user can continue with further up/down or left/right touches to select the next target key in the order and perform the release operation again, completing the selection of that key. Those skilled in the art should understand that the above selection operations are examples only; other selection operations, existing now or arising in the future, that are applicable to this application shall also fall within its scope of protection and are incorporated herein by reference.
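The directional disambiguation among several locked keys can be sketched as a lookup: each direction names the adjacent locked key it would select, and a sequence of further directional touches confirms target keys in order. The dictionary representation is an illustrative assumption.

```python
def select_targets(locked_keys, directions):
    """Sketch: locked_keys maps a direction ('up'/'down'/'left'/'right') to the
    locked key lying in that direction from the touch point; each further
    directional touch confirms one target key, in order."""
    return [locked_keys[d] for d in directions if d in locked_keys]

# Four adjacent keys are locked; a further "left" touch selects "4",
# and a "up" then "right" sequence confirms "2" followed by "6".
locked = {"up": "2", "down": "8", "left": "4", "right": "6"}
print(select_targets(locked, ["left"]))         # ['4']
print(select_targets(locked, ["up", "right"]))  # ['2', '6']
```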
Fig. 2 shows a schematic diagram of a computing device 1 for human-computer interaction according to one aspect of this application. The computing device 1 includes a touch operation acquisition device 21, a determining device 22, and a touch operation executive device 23. The touch operation acquisition device 21 obtains the user's touch operation on the touch pad; the determining device 22 determines the second position information of the touch operation in the display screen based on the first position information of the touch operation in the touch pad and the coordinate mapping relations between the display screen and the touch pad; and the touch operation executive device 23 performs the touch operation according to the second position information.
Here, the computing device 1 may include, but is not limited to, various smart devices, such as smart appliances, e.g. a smart TV, or portable smart devices, e.g. smart glasses or a smart helmet with a head-mounted display. In one implementation, the smart glasses, smart helmet, etc. may also carry functions such as AR, VR, or MR.
In one embodiment, the computing device includes smart glasses or a smart helmet. Further, in one embodiment, the computing device 1 includes smart glasses, and the touch pad is deployed on a temple of the frame of the smart glasses; for example, it may be deployed on one of the temples, or touch pads may be deployed on both temples as needed. Based on the actual shape of the smart glasses, the touch pad may also be deployed in other regions separated from the corresponding display screen. In one embodiment, the computing device includes a smart helmet, and the touch pad may be deployed on the left, right, or rear surface region of the smart helmet. In practical applications, the touch pad may be deployed on a surface region of the smart helmet separated from the display screen: for example, any region of the smart helmet surface that the user can touch and that does not overlap with the display screen region, such as the left, right, or rear surface region of the smart helmet. As another example, if the display screen is deployed inside the smart helmet, e.g. directly in front of the user's field of view when the helmet is worn, the touch pad may also be deployed on the smart helmet surface behind the display screen.
In this application, the computing device 1 includes a display screen and a touch pad that are separated from each other. In one implementation, the separation of the display screen and the touch pad may mean that they are spatially apart, neither integrated nor superimposed on each other. In one embodiment, when the computing device 1 is in use, the touch pad is outside the user's visual range; at this time, the touch pad and the display screen are not in the same field of view of the user. For example, if the computing device 1 includes smart glasses and the touch pad is deployed on a temple of the frame of the smart glasses, then the touch pad is outside the user's visual range, while the display screen is deployed on the lenses of the smart glasses, within the user's visual range, so the user cannot see the touch pad and the display screen at the same time. As another example, the computing device 1 may have two opposite surfaces, with a display screen deployed on part of the front surface; when the computing device 1 is in use, the display screen is within the user's field of view, while a corresponding touch pad is deployed on the rear surface, i.e., the back of the device, so that in use the touch pad is outside the user's visual range and not in the same field of view as the display screen. In another embodiment, when the computing device 1 is in use, the touch pad and the display screen are both within the user's visual range. For example, if the computing device 1 includes a smart TV and the touch pad is deployed on the remote control of the smart TV, the touch pad and the display screen of the smart TV can be visible to the user at the same time.
In one implementation, the touch pad may be integrated in the computing device 1, e.g. the touch pad of the smart glasses is integrated on the frame; the touch pad may also be a relatively independent component of the computing device 1, e.g. the touch pad of the smart TV may be an independent control device. In one implementation, the touch pad may include a touch pad supporting single-point touch or a touch pad supporting multi-point touch; that is, in this application, the user may touch the touch pad at a single point or at multiple points.
In this application, the display screen may include a display screen carried by the computing device 1 itself, for example one or more display screens; it may also include a projection plane projected by a projection device included in or associated with the computing device 1; the display screen may also include a combination of all the above types. In one implementation, the size information of the projection plane may be set by the projection device corresponding to the computing device 1, such as an external projector or an internal projection assembly: for example, the projection device sets the resolution of the projection, and after the projection distance is determined, the size information of the corresponding projection plane is calculated.
Specifically, the touch operation acquisition device 21 obtains the user's touch operation on the touch pad. In one implementation, the manner of the touch operation may include, but is not limited to: clicking, such as a single click or multiple clicks; pressing, such as presses of different pressure degrees or different durations; moving; lifting the finger to release; and combinations of the above. Here, those skilled in the art should understand that the above manners of touch operation are examples only; other manners of touch operation, existing now or arising in the future, that are applicable to this application shall also fall within its scope of protection and are incorporated herein by reference. The touch operation submitted by the user may be realized by the user's finger touching the touch pad, or by another touch control component, such as a stylus, touching the touch pad.
In one implementation, the touch operation may be used to realize an interactive operation on the interactive information in the display screen, with the manner of the touch operation matching the execution result of the subsequent touch operation. For example, performing a touch operation may realize any customized interactive operation on the corresponding interactive information in the display screen, such as selection, confirmation, or deletion.
The determining device 22 determines the second position information of the touch operation in the display screen based on the first position information of the touch operation in the touch pad and the coordinate mapping relations between the display screen and the touch pad.
In one implementation, the coordinate mapping relations between the display screen and the touch pad may include the overall coordinate mapping relations between the display screen and the touch pad; in this case, since the display screen and the touch pad are fixed, once the coordinate mapping relations between them have been pre-established, the coordinate mapping relations remain relatively fixed. In another implementation, the coordinate mapping relations between the display screen and the touch pad may also include the coordinate mapping relations between the human-computer interaction interface in the display screen and the touch pad. Here, the display screen may contain a human-computer interaction interface, i.e., the interactive information in the display screen is the human-computer interaction interface. In one implementation, the human-computer interaction interface includes an input method interface, which may contain letter keys, number keys, symbol keys, other function keys, and so on; the user's operations on the input method interface realize the input of corresponding information. The human-computer interaction interface may also include various other interactive interfaces, such as a search interface or a function selection interface, or a combination of the above interactive interfaces. Here, those skilled in the art should understand that the above human-computer interaction interfaces are examples only; other types of human-computer interaction interfaces, existing now or arising in the future, that are applicable to this application shall also fall within its scope of protection and are incorporated herein by reference. In this case, since the human-computer interaction interface in the display screen may change with the practical application, the coordinate mapping relations between the human-computer interaction interface in the display screen and the touch pad may also change flexibly.
Here, the coordinate mapping relations mean that each position information on the display screen has a matching corresponding position information on the touch pad. For example, when the user touches the first position information (x0, y0) on the touch pad, based on the pre-established coordinate mapping relations, the execution of the touch operation acts on the second position information (x, y) in the display screen that matches the first position information.
For example, the display screen shows a row of an input method: 1234567890. If the user wants to find the key "0", the user only needs to touch, on the touch pad, the first position information matching the second position information of the key "0" in the display screen.
As another example, still with the display screen showing the input method row 1234567890, suppose the user intends to go from the current key "0" to the key "1", or the user's intended target key is "8" but the touch operation deviated onto the first position information corresponding to "0". In such application scenarios, the user can move, by a sliding touch operation, from the first position information corresponding to the second position information of the key "0" to the first position information corresponding to the second position information of the key "1" or the key "8", thereby realizing the selection of the key "1" or the key "8". Here, the touch operation performed based on the coordinate mapping relations can be realized by continuously sliding the finger, without the prior-art slide-and-release operations on the touch pad, effectively reducing the time consumed by the interaction.
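Locating a key such as "0" directly amounts to inverting the proportional mapping: it tells where on the pad to touch in order to land on a given screen position. The function name and the example dimensions (a 1000x400 screen, a 100x40 pad, the row spanning the screen width) are illustrative assumptions.

```python
def screen_to_touchpad(x, y, touchpad_res, screen_res):
    """Inverse of the proportional mapping: the pad position whose mapped
    screen position is (x, y)."""
    x1, y1 = touchpad_res
    x2, y2 = screen_res
    return (x * x1 / x2, y * y1 / y2)

# Key "0" is the 10th key of the row 1234567890; with the row spanning a
# 1000-pixel-wide screen, its centre sits at x = 950. On a 100-wide pad the
# matching touch position is x0 = 95.
print(screen_to_touchpad(950, 20, (100, 40), (1000, 400)))  # (95.0, 2.0)
```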
The touch operation execution device 23 performs the touch operation according to the second position information. Here, performing the touch operation is the response to the user's touch operation at the first position information. In one implementation, performing the touch operation can be used to carry out any customized interactive operation on the corresponding interactive information on the display screen, such as selection, confirmation, or deletion. For example, it can be set that the touch operation includes a selection operation and that the manner of the touch operation includes a slide operation: when the finger currently slides past the first position information on the touch pad, the touch operation performed is the selection of the interactive information at the second position information on the display screen corresponding to the first position information. For instance, when the human-computer interaction interface on the display screen is an input method interface and the user's finger currently slides across the number key "1", the selection operation on the number key "1" is performed. As another example, it can also be set that the touch operation includes a confirmation operation. In one implementation, the manner of the touch operation includes a press operation; continuing the example above, confirming the selected number key "1" means determining that the selected number key "1" is to be input, so after the user performs a press operation at the first position information corresponding to the second position information matched with the number key "1", the confirmation operation on the number key "1" is performed. In another implementation, the manner of the confirmation operation can also be set to include a release operation; that is, continuing the example above, after the user has selected the number key "1" through the slide operation, the confirmation operation on the number key "1" is completed at the first position information by a release operation of lifting the finger. In yet another implementation, the manner of the confirmation operation can further be set such that, when the touch duration exceeds a predetermined duration, the confirmation operation is executed automatically; that is, continuing the example above, after the user slides to the first position information corresponding to the number key "1" and dwells at that position for longer than the predetermined duration, the confirmation operation is executed automatically.
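As an illustrative sketch (not part of the original disclosure), the slide-to-select plus press/release/dwell-to-confirm behavior described above could be modeled as follows; the event handler names and the `DWELL_SECONDS` threshold are assumptions chosen for the example:

```python
import time

DWELL_SECONDS = 0.8  # assumed dwell threshold for auto-confirmation


class TouchInterpreter:
    """Maps raw touch-pad events to select/confirm operations.

    Sliding onto a key selects it; a press, a release, or dwelling
    on one key longer than DWELL_SECONDS confirms the selection.
    """

    def __init__(self):
        self.selected_key = None
        self.enter_time = None
        self.confirmed = []

    def on_slide(self, key, now=None):
        # Sliding onto a new key selects it and restarts the dwell timer.
        now = time.monotonic() if now is None else now
        if key != self.selected_key:
            self.selected_key = key
            self.enter_time = now

    def on_press(self):
        # A press operation confirms the currently selected key.
        self._confirm()

    def on_release(self):
        # Lifting the finger also confirms the current selection.
        self._confirm()

    def on_tick(self, now=None):
        # Dwelling on one key past the threshold auto-confirms it.
        now = time.monotonic() if now is None else now
        if self.selected_key is not None and now - self.enter_time >= DWELL_SECONDS:
            self._confirm()

    def _confirm(self):
        if self.selected_key is not None:
            self.confirmed.append(self.selected_key)
            self.selected_key = None
            self.enter_time = None
```

In practice the press, release, and dwell paths would be alternative configurations rather than all active at once; they are combined here only to show each branch.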
In one embodiment, the computing device 1 further includes a display device (not shown), and the display device can display the touch operation on the display screen according to the second position information. For example, if the touch operation includes a slide operation, then as the finger slides past each piece of first position information on the touch pad, the interactive information at each corresponding piece of second position information on the display screen is displayed differently in turn, e.g., the keys the finger passes over in the input method interface are highlighted in turn, or the shape, size, color, etc. of the currently selected key is changed. This makes it convenient for the user to follow the progress of the touch operation, so that the human-computer interaction is friendlier.
Alternatively, the execution result information of the touch operation is displayed on the display screen. For example, the touch operation includes a confirmation operation on a corresponding key of the input method interface: in the English input state, if the input of the letter "a" is confirmed based on the touch operation, the selected letter "a" is displayed in the corresponding presentation region on the display screen, e.g., at the text cursor; as another example, in the Pinyin input state, if the input of the letter "a" is confirmed based on the touch operation, the Chinese character candidates that may follow the confirmed letter "a" are shown in the corresponding region of the input method interface.
Here, those skilled in the art should understand that the above manners of displaying the touch operation on the display screen according to the second position information, or of displaying the execution result information of the touch operation on the display screen, are only examples; other manners, existing now or arising in the future, of displaying the touch operation on the display screen according to the second position information, or of displaying the execution result information of the touch operation on the display screen, should also be included in the protection scope of the present application if applicable to it, and are incorporated herein by reference.
In one implementation, the touch pad can support multi-point touch operations; in this case, multiple touch operations of the user on the touch pad can be obtained at the same time, e.g., the touch operations of multiple fingers of the user. For example, when multiple fingers of the user move to corresponding pieces of first position information respectively, the second position information corresponding to each piece of first position information is determined, e.g., the key in the input method interface corresponding to each piece of second position information is highlighted; then the touch operation can be performed according to each piece of second position information in an expected confirmation order, e.g., the user releases, one after another, the fingers at the first position information corresponding to each piece of second position information, so as to carry out the corresponding confirmation operations in turn.
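A minimal sketch of this multi-touch behavior (an illustration, not the patented implementation): each finger highlights the key under it, and lifting fingers one by one confirms those keys in release order. Finger identifiers and key names are assumptions for the example:

```python
class MultiTouchTracker:
    """Tracks several fingers at once on the touch pad.

    Each finger id is associated with the key currently under it;
    releasing fingers in sequence confirms the keys in that order.
    """

    def __init__(self):
        self.active = {}      # finger id -> key currently under that finger
        self.confirmed = []   # keys confirmed, in release order

    def on_move(self, finger_id, key):
        # The finger slides onto (or stays on) a key: highlight/lock it.
        self.active[finger_id] = key

    def on_release(self, finger_id):
        # Lifting a finger confirms whatever key it was resting on.
        key = self.active.pop(finger_id, None)
        if key is not None:
            self.confirmed.append(key)
```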
Here, after obtaining the user's touch operation on the touch pad, the present application determines the second position information of the touch operation on the display screen based on the first position information on the touch pad, combined with the coordinate mapping relations between the display screen and the touch pad, which are separated from each other in the computing device 1, so as to perform the touch operation based on the second position information. Since preset coordinate mapping relations exist between the display screen and the touch pad, each piece of position information on the display screen has matching position information on the touch pad; when the user touches first position information on the touch pad, the execution of the touch operation is, based on the pre-existing coordinate mapping relations, applied to the second position information on the display screen matched with the first position information. If, for example, interactive information to be selected is presented at the second position information, the user's touch operation at the first position information can realize the selection of that interactive information. Therefore, based on the present application, the user can quickly locate position information on the display screen through the touch pad and then interact with the information displayed at that position, without the operations of repeatedly sliding back and forth over and releasing the touch pad, thereby effectively reducing such time-consuming manners of human-computer interaction and improving its flexibility, so that human-computer interaction operations are friendlier.
In another embodiment, the computing device 1 further includes an establishing device (not shown), and the establishing device can establish the coordinate mapping relations between the human-computer interaction interface on the display screen and the touch pad; then, in step S12, the computing device 1 can determine the second position information of the touch operation on the display screen based on the first position information of the touch operation on the touch pad and the coordinate mapping relations between the human-computer interaction interface and the touch pad.
Here, the display screen can include a human-computer interaction interface; that is, the interactive information on the display screen is a human-computer interaction interface. In one implementation, the human-computer interaction interface includes an input method interface, and the input method interface can include letter keys, number keys, symbol keys, other function keys, etc.; the input of corresponding information can be realized through the user's operations on the input method interface. In addition, the human-computer interaction interface can also include various other interactive interfaces such as a search interface and a function selection interface.
In the present application, in order to avoid the complex operations of the user sliding back and forth over and releasing the touch pad, and in order to quickly locate interactive information on the display screen through the touch pad, such as locating keys in the input method interface, thereby reducing time overhead, the human-computer interaction interface on the display screen can be scaled to the touch pad according to a relative proportion, forming the coordinate mapping relations. In one implementation, one-to-one coordinate mapping relations can be established between the region of the touch pad and the arrangement and positional relations of the interactive interface units in the human-computer interaction interface; taking the input method interface as an example, the interactive interface units can include the letter keys, number keys, symbol keys, other function keys, etc. in the input method interface.
Fig. 4 shows an example of establishing the coordinate mapping relations between the human-computer interaction interface on the display screen and the touch pad. Assume that the resolution of the touch pad is x1*y1 and the resolution of the display screen is x2*y2, and that the user's finger touches first position information (x0, y0) on the touch pad, which then falls on second position information (x, y) in the input method interface on the display screen. The calculation formulas are:

x = x0 * x2 / x1

y = y0 * y2 / y1

Thus, the coordinate mapping relations between the human-computer interaction interface on the display screen and the touch pad can be obtained.
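The proportional mapping above can be sketched in a few lines; the function name and the tuple parameters are illustrative, not from the patent text:

```python
def map_touch_to_display(x0, y0, pad_res, screen_res):
    """Proportionally maps a touch-pad coordinate (x0, y0) to a
    display-screen coordinate, following x = x0 * x2 / x1 and
    y = y0 * y2 / y1 above.

    pad_res is the touch-pad resolution (x1, y1);
    screen_res is the display-screen resolution (x2, y2).
    """
    x1, y1 = pad_res
    x2, y2 = screen_res
    return (x0 * x2 / x1, y0 * y2 / y1)


# Example: the center of a 1000x600 touch pad lands on the center
# of a 1920x1080 display screen.
print(map_touch_to_display(500, 300, (1000, 600), (1920, 1080)))  # (960.0, 540.0)
```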
Here, those skilled in the art should understand that the above manner of establishing the coordinate mapping relations between the human-computer interaction interface on the display screen and the touch pad is only an example; other manners, existing now or arising in the future, of establishing the coordinate mapping relations between the human-computer interaction interface on the display screen and the touch pad should also be included in the protection scope of the present application if applicable to it, and are incorporated herein by reference.
In one implementation, the establishment of the overall coordinate mapping relations between the display screen and the touch pad can refer to the above method of establishing the coordinate mapping relations between the human-computer interaction interface on the display screen and the touch pad; in this case, since the display screen and the touch pad are fixed, the coordinate mapping relations are also relatively fixed.
Further, in one embodiment, the establishing device can establish the coordinate mapping relations between the human-computer interaction interface on the display screen and the touch pad whenever the human-computer interaction interface on the display screen is updated. In practical applications, it can be arranged that the coordinate mapping relations are re-established when the human-computer interaction interface is switched. For example, after the user chooses to change the mode of the input method interface, the coordinate mapping relations are re-established; as another example, if other interactive information of the human-computer interaction interface is switched or changed, the coordinate mapping relations can also be re-established. This ensures the matching degree and accuracy of the human-computer interaction operations, bringing a friendly user experience.
In one embodiment, the establishing device can establish the coordinate mapping relations between the human-computer interaction interface on the display screen and an effective touch area in the touch pad. In practical applications, the coordinate mapping relations with the human-computer interaction interface on the display screen can be established based on part or all of the touch area in the touch pad. If the effective touch area is a partial touch area, then only the touch operations the user performs within that region are effective; if the effective touch area includes the whole touch area, the touch operations the user performs anywhere on the touch pad are all effective.
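As an illustrative sketch of the effective touch area check (the rectangular bounds and the prompt string are assumptions for the example, not mandated by the patent):

```python
class EffectiveArea:
    """A rectangular effective touch area inside the touch pad.

    Touches outside it are treated as invalid and answered with
    operation prompt information; touches inside it proceed to the
    coordinate mapping step.
    """

    def __init__(self, left, top, right, bottom):
        self.left, self.top = left, top
        self.right, self.bottom = right, bottom

    def contains(self, x, y):
        return self.left <= x <= self.right and self.top <= y <= self.bottom

    def handle_touch(self, x, y):
        # Return a prompt for out-of-area touches, None for valid ones.
        if not self.contains(x, y):
            return "Please touch within the effective interaction area"
        return None
```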
Further, in one embodiment, the computing device 1 further includes a prompting device (not shown), and the prompting device provides corresponding operation prompt information when the first position information of the touch operation on the touch pad is outside the effective touch area. In one implementation, the operation prompt information can be used to prompt the user that the current touch area is invalid, or to guide the user into the effective touch area to perform an effective touch operation, etc. For example, it can prompt "Please tap within the effective interaction area" or "Please move the touch operation toward the xx direction", etc. In one implementation, the operation prompt information can include, but is not limited to, manners such as a voice prompt or a picture-and-text prompt on the display screen.
In one embodiment, the shape of the effective touch area matches the shape of the human-computer interaction interface.
In practical applications, the shapes of human-computer interaction interfaces on the display screen, such as the layout shapes of input method interfaces, can vary; accordingly, the shape of the effective touch area can be set to match the shape of the human-computer interaction interface. Moreover, the human-computer interaction interface, such as the input method interface, can be presented at any position on the display screen. Fig. 3 shows several groups of examples, according to one embodiment of the present application, in which the shape of the effective touch area of the touch pad matches the shape of the human-computer interaction interface of the display screen. The input method interface can occupy the whole display screen, or be presented in a partial region of the display screen. In one implementation, the layout of the effective touch area in the touch pad can be adjusted according to the shape of the input method interface on the display screen; for example, if the shape of the input method interface is square, the shape of the effective touch area of the touch pad can preferably also be square, so that the established mapping relations are more accurate and the touch precision is higher; as another example, if the shape of the input method interface is elliptical, the effective area of the touch pad can also be a rectangle. In one implementation, even if the shape of the effective touch area is not exactly the same as the shape of the human-computer interaction interface, the mapping relations between the two can still be established based on a corresponding matching relation.
In one implementation, when the user needs to use the input method, for example in scenarios such as entering a WiFi password, typing, or entering a contact, the input method interface pops up and is displayed on the display screen; therefore, the shape of the effective touch area on the touch pad can be flexibly matched or adjusted based on the changes in the shape of the input method interface under different application scenarios.
In one embodiment, the second position information includes the position information of one or more interactive interface units in the human-computer interaction interface on the display screen. In one implementation, the interactive interface units are the effective objects provided in the human-computer interaction interface for the user to perform interactive operations on. For example, in the input method interface, each key, such as a letter key, number key, symbol key, or other function key, can correspond to one interactive interface unit. Here, the second position information can correspond to the position information of only one interactive interface unit in the human-computer interaction interface on the display screen, or to the position information of multiple interactive interface units at the same time.
Further, in one embodiment, the touch operation execution device 23 can, according to the second position information, perform the touch operation on the one or more interactive interface units corresponding to the second position information.
Specifically, in one implementation, if there is only one interactive interface unit corresponding to the second position information, performing the touch operation can directly realize any customized interactive operation on that interactive interface unit, such as selection, confirmation, or deletion. For example, it can be set that the touch operation includes a selection operation and that the manner of the touch operation includes a slide operation: when the finger currently slides past the first position information on the touch pad, the touch operation performed is the selection of the interactive information at the second position information on the display screen corresponding to the first position information; for instance, when the human-computer interaction interface on the display screen is an input method interface and the user's finger currently slides across the number key "1", the selection operation on the number key "1" is performed.
In another implementation, the second position information includes the position information of multiple interactive interface units in the human-computer interaction interface on the display screen; in this case, the touch operation execution device 23 can, according to the second position information, perform the touch operation on the multiple interactive interface units corresponding to the second position information. Furthermore, the computing device 1 further includes a selection operation acquisition device (not shown) and a selection operation execution device (not shown): the selection operation acquisition device can obtain the user's selection operation on the touch pad, and the selection operation execution device then performs the selection operation to realize the selection of a target interactive interface unit among the multiple interactive interface units corresponding to the second position information. In one implementation, if the user expects to choose one or several target interactive interface units among the multiple interactive interface units corresponding to the second position information, then the result of the computing device 1 performing the touch operation on the multiple interactive interface units according to the second position information is that the second position information has been locked. On the basis of the touch operation, the selection of the target interactive interface unit can then be realized through the user's further touch control operation on the touch pad, i.e., the selection operation.
For example, when the second position information corresponds to the positions of two keys in the input method interface, the two keys are highlighted at the same time, and one of them is then selected by touching again upward, downward, leftward, or rightward; similarly, when the second position information corresponds to the adjacent positions of four keys in the input method interface, the four keys are highlighted at the same time, and one of them is then selected by touching again upward, downward, leftward, or rightward. Those skilled in the art should understand that the above selection operations are only examples; other selection operations, existing now or arising in the future, should also be included in the protection scope of the present application if applicable to it, and are incorporated herein by reference.
As another example, if there are multiple target interactive interface units among the multiple interactive interface units corresponding to the second position information, e.g., the second position information corresponds to multiple keys in the input method interface, some of which need to be chosen consecutively, then the user's multiple consecutive selection operations can be obtained based on the order in which each target interactive interface unit is expected to be selected. For instance, when the second position information corresponds to the positions of two keys in the input method interface, and the two keys are keys the user expects to select consecutively, the two keys are highlighted at the same time; at this point, the touch operation is in effect the locking of the two keys corresponding to the second position information, and the earlier target key in the selection order is then selected by touching again upward, downward, leftward, or rightward. The selection operation can include, but is not limited to, a pressure operation or a release operation; after one selection is completed, i.e., one target key is confirmed, the selection operation of the next target key can be carried out. For example, if the selection operation includes a pressure operation, then without releasing the finger, the later target key in the selection order can be selected by continuing to touch upward, downward, leftward, or rightward, and the pressure operation is performed again to complete the selection operation of that later target key; as another example, if the selection operation includes a release operation, then within a preset operation duration, the later target key in the selection order can be selected by continuing to touch upward, downward, leftward, or rightward, and the release operation is performed again to complete the selection operation of that later target key. Those skilled in the art should understand that the above selection operations are only examples; other selection operations, existing now or arising in the future, should also be included in the protection scope of the present application if applicable to it, and are incorporated herein by reference.
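The disambiguation step described above, in which a second position that locks several adjacent keys is resolved by a follow-up directional touch, could be sketched as follows; the direction names and key layout are assumptions for illustration:

```python
def disambiguate(locked_keys, direction):
    """Resolves second position information that locked several
    adjacent keys at once.

    locked_keys maps a direction ("up"/"down"/"left"/"right") to the
    locked key lying in that direction; a follow-up directional touch
    picks one of them. Returns None for a direction with no key.
    """
    return locked_keys.get(direction)


# Example: the touch locked two horizontally adjacent keys;
# a rightward touch selects the key on the right.
locked = {"left": "1", "right": "2"}
print(disambiguate(locked, "right"))  # 2
```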
The present application also provides a computing device for human-computer interaction, including:
a display screen and a touch pad separated from each other;
one or more processors;
a memory; and
one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the programs including instructions for performing the following operations:
obtaining the user's touch operation on the touch pad;
determining, based on first position information of the touch operation on the touch pad and the coordinate mapping relations between the display screen and the touch pad, second position information of the touch operation on the display screen;
performing the touch operation according to the second position information.
Further, the programs of the device can also be used to perform the corresponding operations in the other related embodiments based on the above operations.
The present application also provides a computer-readable storage medium on which a computer program is stored, the computer program being executable by a processor to perform the following operations:
obtaining the user's touch operation on the touch pad;
determining, based on first position information of the touch operation on the touch pad and the coordinate mapping relations between the display screen and the touch pad, second position information of the touch operation on the display screen;
performing the touch operation according to the second position information.
Further, the computer program can also be executed by a processor to perform the corresponding operations in the other related embodiments based on the above operations.
Obviously, those skilled in the art can make various modifications and variations to the present application without departing from its spirit and scope. Thus, if these modifications and variations of the present application fall within the scope of the claims of the present application and their equivalent technologies, the present application is also intended to include them.
It should be noted that the present invention can be implemented in software and/or a combination of software and hardware; for example, it can be implemented using an application-specific integrated circuit (ASIC), a general-purpose computer, or any other similar hardware device. In one embodiment, the software program of the present invention can be executed by a processor to realize the steps or functions described above. Likewise, the software program of the present invention (including related data structures) can be stored in a computer-readable recording medium, e.g., a RAM memory, a magnetic or optical drive, a floppy disk, or similar devices. In addition, some steps or functions of the present invention can be implemented in hardware, for example, as a circuit that cooperates with a processor to perform each step or function.
In addition, part of the present invention can be applied as a computer program product, such as computer program instructions which, when executed by a computer, can invoke or provide the method and/or technical solution according to the present invention through the operation of that computer. The program instructions invoking the method of the present invention may be stored in a fixed or removable recording medium, and/or transmitted through broadcast or through a data stream in other signal-bearing media, and/or stored in a working memory of a computer device that runs according to the program instructions. Here, one embodiment of the present invention includes a device which includes a memory for storing computer program instructions and a processor for executing the program instructions, wherein, when the computer program instructions are executed by the processor, the device is triggered to run the method and/or technical solution based on the foregoing multiple embodiments of the present invention.
It is obvious to those skilled in the art that the present invention is not limited to the details of the above exemplary embodiments, and that the present invention can be realized in other specific forms without departing from its spirit or essential attributes. Therefore, the embodiments are to be regarded in every respect as illustrative and not restrictive, the scope of the present invention being defined by the appended claims rather than by the above description; it is therefore intended that all changes falling within the meaning and range of equivalency of the claims be included in the present invention. No reference sign in a claim should be construed as limiting the claim concerned. In addition, it is clear that the word "comprising" does not exclude other units or steps, and the singular does not exclude the plural. Multiple units or devices stated in a device claim can also be realized by one unit or device through software or hardware. Words such as "first" and "second" are used to denote names and do not denote any particular order.

Claims (32)

1. A human-computer interaction method for a computing device, the computing device including a display screen and a touch pad separated from each other, wherein the method includes:
obtaining the user's touch operation on the touch pad;
determining, based on first position information of the touch operation on the touch pad and the coordinate mapping relations between the display screen and the touch pad, second position information of the touch operation on the display screen;
performing the touch operation according to the second position information.
2. The method according to claim 1, wherein the method further includes:
displaying the touch operation on the display screen according to the second position information; alternatively,
displaying the execution result information of the touch operation on the display screen.
3. The method according to claim 1 or 2, wherein the method further includes:
establishing the coordinate mapping relations between the human-computer interaction interface on the display screen and the touch pad;
wherein the determining, based on first position information of the touch operation on the touch pad and the coordinate mapping relations between the display screen and the touch pad, second position information of the touch operation on the display screen includes:
determining, based on the first position information of the touch operation on the touch pad and the coordinate mapping relations between the human-computer interaction interface and the touch pad, the second position information of the touch operation on the display screen.
4. The method according to claim 3, wherein the establishing the coordinate mapping relations between the human-computer interaction interface on the display screen and the touch pad includes:
establishing, when the human-computer interaction interface on the display screen is updated, the coordinate mapping relations between the human-computer interaction interface on the display screen and the touch pad.
5. The method according to claim 3 or 4, wherein the establishing the coordinate mapping relations between the human-computer interaction interface on the display screen and the touch pad includes:
establishing the coordinate mapping relations between the human-computer interaction interface on the display screen and an effective touch area in the touch pad.
6. The method according to claim 5, wherein the shape of the effective touch area matches the shape of the human-computer interaction interface.
7. The method according to claim 5 or 6, wherein the method further includes:
providing corresponding operation prompt information when the first position information of the touch operation on the touch pad is outside the effective touch area.
8. The method according to claim 3, wherein the human-computer interaction interface includes at least any one of the following:
an input method interface;
a search interface;
a function selection interface.
9. The method according to claim 3, wherein the second position information includes the position information of one or more interactive interface units in the human-computer interaction interface on the display screen.
10. The method according to claim 9, wherein the performing the touch operation according to the second position information includes:
performing, according to the second position information, the touch operation on the one or more interactive interface units corresponding to the second position information.
11. The method according to claim 10, wherein the second position information includes the position information of multiple interactive interface units in the human-computer interaction interface on the display screen;
wherein the performing the touch operation according to the second position information includes:
performing, according to the second position information, the touch operation on the multiple interactive interface units corresponding to the second position information;
wherein the method further includes:
obtaining the user's selection operation on the touch pad;
performing the selection operation to realize the selection of a target interactive interface unit among the multiple interactive interface units corresponding to the second position information.
12. The method according to claim 1, wherein the computing device includes smart glasses or a smart helmet.
13. The method according to claim 12, wherein the computing device includes smart glasses, and the touch pad is deployed on a temple of the frame of the smart glasses.
14. The method according to claim 12, wherein the computing device includes a smart helmet, and the touch pad is deployed on the left, right, or rear surface region of the smart helmet.
15. The method according to claim 1, wherein, when the computing device is in use, the touch pad is outside the visual range of the user.
16. A computing device for human-computer interaction, the computing device including a display screen and a touch pad separated from each other, wherein the computing device includes:
a touch operation acquisition device for obtaining the user's touch operation on the touch pad;
a determining device for determining, based on first position information of the touch operation on the touch pad and the coordinate mapping relations between the display screen and the touch pad, second position information of the touch operation on the display screen;
a touch operation execution device for performing the touch operation according to the second position information.
17. The computing device according to claim 16, wherein the device further includes:
a display device for displaying the touch operation on the display screen according to the second position information, or alternatively, for displaying the execution result information of the touch operation on the display screen.
18. The computing device according to claim 16 or 17, wherein the computing device further includes:
an establishing device for establishing the coordinate mapping relations between the human-computer interaction interface on the display screen and the touch pad;
wherein the determining device is used for:
determining, based on the first position information of the touch operation on the touch pad and the coordinate mapping relations between the human-computer interaction interface and the touch pad, the second position information of the touch operation on the display screen.
19. computing device according to claim 18, wherein, the device of establishing is used for:
When human-computer interaction interface is updated in the display screen, the seat of human-computer interaction interface and touch pad in the display screen is established Mark mapping relations.
20. the computing device according to claim 18 or 19, wherein, the device of establishing is used for:
Establish human-computer interaction interface and the coordinate mapping relations of effective touch area in the touch pad in the display screen.
21. The computing device according to claim 20, wherein a shape of the effective touch area matches a shape of the human-computer interaction interface.
22. The computing device according to claim 20 or 21, further comprising:
a prompting device configured to provide corresponding operation prompt information when the first position information of the touch operation on the touch pad is outside the effective touch area.
23. The computing device according to claim 18, wherein the human-computer interaction interface comprises at least any one of the following:
an input method interface;
a search interface;
a function selection interface.
24. The computing device according to claim 18, wherein the second position information comprises position information of one or more interactive interface units on the human-computer interaction interface on the display screen.
25. The computing device according to claim 24, wherein the touch operation execution device is configured to:
perform, according to the second position information, the touch operation on the one or more interactive interface units corresponding to the second position information.
26. The computing device according to claim 25, wherein, in a case where the second position information comprises position information of a plurality of interactive interface units on the human-computer interaction interface on the display screen:
the touch operation execution device is configured to:
perform, according to the second position information, the touch operation on the plurality of interactive interface units corresponding to the second position information;
and the computing device further comprises:
a selection operation acquisition device configured to acquire a selection operation of the user on the touch pad; and
a selection operation execution device configured to perform the selection operation so as to select a target interactive interface unit from the plurality of interactive interface units corresponding to the second position information.
27. The computing device according to claim 16, wherein the computing device comprises smart glasses or a smart helmet.
28. The computing device according to claim 27, wherein the computing device comprises smart glasses, and the touch pad is deployed on a temple of a frame of the smart glasses.
29. The computing device according to claim 27, wherein the computing device comprises a smart helmet, and the touch pad is deployed on a left, right, or rear surface region of the smart helmet.
30. The computing device according to claim 16, wherein, when the computing device is in use, the touch pad is outside a visual range of a user.
31. A computing device for human-computer interaction, comprising:
a display screen and a touch pad that are separate from each other;
one or more processors;
a memory; and
one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs comprising instructions for performing the method according to any one of claims 1 to 15.
32. A computer-readable storage medium having a computer program stored thereon, wherein the computer program is executable by a processor to perform the method according to any one of claims 1 to 15.
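The coordinate mapping recited in claims 16, 18, and 20–22 (first position on the touch pad mapped to second position on the display screen, restricted to an effective touch area) can be sketched as a linear transform. The following is an illustrative sketch only; the function and variable names are hypothetical and do not appear in the patent:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

def map_touch_to_display(touch_x, touch_y, effective_area: Rect, interface_rect: Rect):
    """Map a first position on the touch pad to a second position on the
    display screen via a linear coordinate mapping relationship.
    Returns None when the touch falls outside the effective touch area,
    the case in which claim 22's operation prompt would be provided."""
    # Reject touches outside the effective touch area (claim 20).
    if not (effective_area.x <= touch_x <= effective_area.x + effective_area.w and
            effective_area.y <= touch_y <= effective_area.y + effective_area.h):
        return None  # caller supplies the operation prompt information
    # Normalize within the effective area, then scale into the interface.
    u = (touch_x - effective_area.x) / effective_area.w
    v = (touch_y - effective_area.y) / effective_area.h
    return (interface_rect.x + u * interface_rect.w,
            interface_rect.y + v * interface_rect.h)

# Example: a 60x20 area on a glasses temple mapped onto a 600x200 interface;
# the area's 3:1 aspect ratio matches the interface's shape (claim 21).
pad_area = Rect(0, 0, 60, 20)
ui_area = Rect(100, 50, 600, 200)
print(map_touch_to_display(30, 10, pad_area, ui_area))  # → (400.0, 150.0)
```

Because the user cannot see the touch pad (claims 15 and 30), the proportional mapping lets a touch at, say, the centre of the pad land at the centre of the on-screen interface without visual feedback from the pad itself.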
CN201711345498.5A 2017-12-14 2017-12-14 Human-computer interaction method and computing equipment for human-computer interaction Active CN107967091B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711345498.5A CN107967091B (en) 2017-12-14 2017-12-14 Human-computer interaction method and computing equipment for human-computer interaction


Publications (2)

Publication Number Publication Date
CN107967091A true CN107967091A (en) 2018-04-27
CN107967091B CN107967091B (en) 2022-12-06

Family

ID=61995326

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711345498.5A Active CN107967091B (en) 2017-12-14 2017-12-14 Human-computer interaction method and computing equipment for human-computer interaction

Country Status (1)

Country Link
CN (1) CN107967091B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110347469A (en) * 2019-07-12 2019-10-18 北大方正集团有限公司 Interaction processing method and device
CN112328160A (en) * 2020-10-26 2021-02-05 歌尔智能科技有限公司 Input control method of terminal equipment, terminal equipment and readable storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104793353A (en) * 2015-01-04 2015-07-22 北京君正集成电路股份有限公司 Intelligent glasses
CN105183175A (en) * 2015-10-23 2015-12-23 周维 Wearable communication equipment
CN106527916A (en) * 2016-09-22 2017-03-22 乐视控股(北京)有限公司 Operating method and device based on virtual reality equipment, and operating equipment



Also Published As

Publication number Publication date
CN107967091B (en) 2022-12-06

Similar Documents

Publication Publication Date Title
Biener et al. Breaking the screen: Interaction across touchscreen boundaries in virtual reality for mobile knowledge workers
EP2940571B1 (en) Mobile terminal and controlling method thereof
Hertel et al. A taxonomy of interaction techniques for immersive augmented reality based on an iterative literature review
US9733719B2 (en) Mobile terminal and method of controlling the same
CN102880290B Display control method, device and terminal
US9250730B2 (en) Target acquisition system for use in touch screen graphical interface
EP3001299B1 (en) Mobile terminal and controlling method thereof
KR20140026723A (en) Method for providing guide in portable device and portable device thereof
CN109697265A Page return method and device
KR102266196B1 (en) Apparatus and method for displaying images
WO2014201831A1 (en) Wearable smart glasses as well as device and method for controlling the same
CN103809751A (en) Information sharing method and device
KR102237659B1 (en) Method for input and apparatuses performing the same
KR20160033547A (en) Apparatus and method for styling a content
CN105843467A (en) Icon displaying method and device
CN103135852A (en) Touch screen device and data input method
CN107967091A (en) A kind of man-machine interaction method and the computing device for human-computer interaction
Brasier et al. AR-enhanced Widgets for Smartphone-centric Interaction
US20140368432A1 (en) Wearable smart glasses as well as device and method for controlling the same
US20170108923A1 (en) Historical representation in gaze tracking interface
US20150253980A1 (en) Information processing method and electronic device
CN106201320A (en) User interface control method and system
US8826192B1 (en) Graphical method of inputting parameter ranges
US20170108924A1 (en) Zoom effect in gaze tracking interface
EP2998833A1 (en) Electronic device and method of controlling display of screen thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP02 Change in the address of a patent holder

Address after: 201210 7th Floor, No. 1, Lane 5005, Shenjiang Road, China (Shanghai) Pilot Free Trade Zone, Pudong New Area, Shanghai

Patentee after: HISCENE INFORMATION TECHNOLOGY Co.,Ltd.

Address before: Room 1109, No. 570, Shengxia Road, China (Shanghai) Pilot Free Trade Zone, Pudong New Area, Shanghai, 201203

Patentee before: HISCENE INFORMATION TECHNOLOGY Co.,Ltd.