CN107967091B - Human-computer interaction method and computing equipment for human-computer interaction - Google Patents

Human-computer interaction method and computing equipment for human-computer interaction

Info

Publication number
CN107967091B
CN107967091B (application number CN201711345498.5A)
Authority
CN
China
Prior art keywords
position information
display screen
touch
human
computing device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711345498.5A
Other languages
Chinese (zh)
Other versions
CN107967091A (en)
Inventor
刘霞 (Liu Xia)
张建伟 (Zhang Jianwei)
胡军 (Hu Jun)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hiscene Information Technology Co Ltd
Original Assignee
Hiscene Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hiscene Information Technology Co Ltd
Priority to CN201711345498.5A
Publication of CN107967091A
Application granted
Publication of CN107967091B
Legal status: Active (current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0354 Pointing devices displaced or positioned by the user, with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F 3/03547 Touch pads, in which fingers can move on a surface
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on GUIs, based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F 3/0487 Interaction techniques based on GUIs, using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on GUIs, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
    • G06F 3/04886 Interaction techniques using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 3/16 Sound input; Sound output
    • G06F 3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 Indexing scheme relating to G06F3/048
    • G06F 2203/04809 Textured surface identifying touch areas, e.g. overlay structure for a virtual keyboard

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The application aims to provide a human-computer interaction method and a computing device for human-computer interaction: acquiring a touch operation of a user on the touch pad; determining second position information of the touch operation on the display screen based on first position information of the touch operation on the touch pad and a coordinate mapping relationship between the display screen and the touch pad; and executing the touch operation according to the second position information. Compared with the prior art, the touch pad can be used to rapidly locate position information on the display screen, so that interactive operation on the information displayed at that position is realized without repeatedly sliding over and releasing the touch pad in the up-down and left-right directions. This effectively reduces time-consuming human-computer interaction, improves the flexibility of interaction, and makes human-computer interaction friendlier.

Description

Human-computer interaction method and computing equipment for human-computer interaction
Technical Field
The present application relates to the field of computers, and more particularly to human-computer interaction techniques for computing devices.
Background
In existing smart devices such as smart glasses or smart helmets, the mainstream input method is for the user to provide input through control operations on a touch pad. Typically, the display screen of the smart device and the touch pad used for input are not in the same field of view, so when information must be entered through the display screen interface, the user controls the input method indirectly through the touch pad. For example, in smart glasses the display screen is generally placed in front of the eyes for easy viewing of information, while for aesthetics and operability the touch pad is placed elsewhere on the frame, such as on its left or right side. For such a device, the display screen sits directly in front of the user's line of sight during use and presents the input method, which is then typically controlled by sliding and clicking in four directions on the touch pad. To select a specific key in the input method through the touch pad, the user must repeatedly adjust the position of the currently focused key by sliding a finger up, down, left, or right on the touch pad, then releasing, and repeating this slide-and-release combination until the desired key is reached.
Disclosure of Invention
The application aims to provide a human-computer interaction method and a computing device for human-computer interaction.
According to one aspect of the application, a human-computer interaction method for a computing device is provided, comprising the following steps:
acquiring a touch operation of a user on the touch pad;
determining second position information of the touch operation on the display screen based on first position information of the touch operation on the touch pad and a coordinate mapping relationship between the display screen and the touch pad;
and executing the touch operation according to the second position information.
According to another aspect of the application, there is provided a computing device for human-computer interaction, comprising:
a touch operation acquiring device for acquiring a touch operation of a user on the touch pad;
a determining device for determining second position information of the touch operation on the display screen based on first position information of the touch operation on the touch pad and a coordinate mapping relationship between the display screen and the touch pad;
and a touch operation executing device for executing the touch operation according to the second position information.
According to another aspect of the present application, there is also provided a computing device for human-computer interaction, comprising:
a display screen and a touch pad that are separate from each other;
one or more processors;
a memory; and
one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the programs comprising instructions for:
acquiring a touch operation of a user on the touch pad;
determining second position information of the touch operation on the display screen based on first position information of the touch operation on the touch pad and a coordinate mapping relationship between the display screen and the touch pad;
and executing the touch operation according to the second position information.
According to another aspect of the present application, there is also provided a computer-readable storage medium having a computer program stored thereon, the computer program being executable by a processor to:
acquiring a touch operation of a user on the touch pad;
determining second position information of the touch operation on the display screen based on first position information of the touch operation on the touch pad and a coordinate mapping relationship between the display screen and the touch pad;
and executing the touch operation according to the second position information.
Compared with the prior art, after a touch operation of the user on the touch pad is acquired, second position information of the touch operation on the display screen is determined based on the first position information on the touch pad and the coordinate mapping relationship between the display screen and the touch pad, which are separate from each other in the computing device, so that the touch operation can be executed based on the second position information. Since a preset coordinate mapping relationship exists between the display screen and the touch pad, that is, each piece of position information on the display screen is matched with corresponding position information on the touch pad, when a user touches first position information on the touch pad, the touch operation is applied, based on this pre-existing mapping, to the second position information on the display screen matched with the first position information. For example, if interaction information to be selected is presented at the second position information, the user can select that information by a touch operation at the first position information. Therefore, based on the application, a user can quickly locate position information on the display screen through the touch pad and interact with the information displayed there, without sliding back and forth over and releasing the touch pad in the up-down and left-right directions. This effectively reduces time-consuming human-computer interaction, improves the flexibility of interaction, and makes human-computer interaction friendlier.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 shows a schematic diagram of a human-computer interaction method for a computing device, according to an aspect of the application;
FIG. 2 illustrates a device diagram of a computing device for human-computer interaction, in accordance with an aspect of the subject application;
FIG. 3 illustrates examples of the shape of the effective touch area of the touch pad matching the shape of the human-computer interaction interface in the display screen according to one embodiment of the present application;
FIG. 4 illustrates an example of the coordinate mapping relationship between the human-computer interaction interface in the display screen and the touch pad according to one embodiment of the application.
The same or similar reference numbers in the drawings identify the same or similar elements.
Detailed Description
The present application is described in further detail below with reference to the attached figures.
In a typical configuration of the present application, the terminal, the device serving the network, and the computing device include one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transitory media), such as modulated data signals and carrier waves.
FIG. 1 illustrates a schematic diagram of a method for human-computer interaction of a computing device 1, according to an aspect of the subject application. The method comprises step S11, step S12, and step S13. In step S11, the computing device 1 acquires a touch operation of a user on the touch pad; next, in step S12, the computing device 1 determines second position information of the touch operation on the display screen based on the first position information of the touch operation on the touch pad and the coordinate mapping relationship between the display screen and the touch pad; next, in step S13, the computing device 1 executes the touch operation according to the second position information.
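To make the flow concrete, the following is a minimal Python sketch of steps S11 through S13; all names (handle_touch, event, ui) are illustrative assumptions, not the patent's implementation:

```python
def handle_touch(event, pad_res, screen_res, ui):
    # S11: acquire the user's touch operation and its first position
    # information (x0, y0) in touch-pad coordinates
    x0, y0 = event.position
    # S12: determine the second position information (x, y) on the display
    # screen via the pre-established proportional coordinate mapping
    x = x0 * screen_res[0] / pad_res[0]
    y = y0 * screen_res[1] / pad_res[1]
    # S13: execute the touch operation at the mapped screen position
    ui.execute(event.kind, (x, y))
```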
Here, the computing device 1 may include, but is not limited to, various smart devices, such as smart appliances like smart televisions, as well as portable smart devices and head-mounted display devices such as smart glasses and smart helmets. In one implementation, smart glasses, smart helmets, and the like may also carry functions such as AR, VR, and MR.
In one embodiment, the computing device comprises smart glasses or a smart helmet. Further, in one embodiment, the computing device 1 comprises smart glasses, and the touch pad is disposed on a temple of the frame, for example on one of the temples, or on both temples as needed. The touch pad may also be disposed in other areas separate from the display screen, depending on the actual shape of the smart glasses. In one embodiment, the computing device comprises a smart helmet, and the touch pad may be disposed on the left, right, or rear surface area of the helmet. In practical applications, the touch pad may be disposed on any region of the helmet surface that the user can touch and that does not overlap with the region where the display screen is located, such as the left, right, or rear surface area. For another example, if the display screen is disposed inside the smart helmet, for instance directly in front of the user's field of view when the helmet is worn, the touch pad may be disposed on the helmet surface at the back of the display screen.
In the present application, the computing device 1 includes a display screen and a touch pad that are separate from each other. In one implementation, this separation means the two are spatially apart, neither integrated nor superimposed. In one embodiment, the touch pad is outside the user's visible range when the computing device 1 is in use; the touch pad and the display screen are then not in the same field of view. For example, if the computing device 1 comprises smart glasses whose touch pad is disposed on a temple of the frame, the touch pad is outside the user's visible range while the display screen, disposed on a lens, is within it, so the user cannot see both at the same time. For another example, if the computing device 1 has two opposite surfaces, front and back, with the display screen on the front surface, in the user's view during use, and the touch pad on the back surface of the device, then in the use state the touch pad is outside the user's visible range and not in the same field of view as the display screen. In another embodiment, the touch pad and the display screen are both within the user's visible range when the computing device 1 is in use. For example, if the computing device 1 comprises a smart TV, the touch pad is disposed on the remote control device of the smart TV, and the touch pad and the TV's display screen may be visible to the user at the same time.
In one implementation, the touch pad may be integrated in the computing device 1, such as the touch pad of smart glasses integrated on the frame; the touch pad may also be a relatively separate component of the computing device 1, for example the touch pad of a smart TV serving as an independent control device. In one implementation, the touch pad may support single touch or multi-touch; that is, in the present application, the user may touch the touch pad at a single point or at multiple points.
In the present application, the display screen may include the computing device 1's own display screen, for example one or more display screens; it may also include a projection surface produced by a projection apparatus corresponding to the computing device 1, whether an external projector or a built-in projection component; the display screen may also comprise a combination of the above types. In one implementation, the size information of the projection surface may be determined by the projection apparatus: for example, the apparatus sets a projection resolution, and once the projection distance is determined, the size information of the corresponding projection surface is calculated.
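As an illustration of the projection case, one plausible way to derive the size information from the projection distance is a throw-ratio model; the sketch below assumes throw ratio = distance / image width and a 16:9 aspect, neither of which is stated in the patent:

```python
def projection_surface_size(distance_m, throw_ratio, aspect=(16, 9)):
    # Width follows from the assumed throw-ratio model; height from the aspect.
    width = distance_m / throw_ratio
    height = width * aspect[1] / aspect[0]
    return width, height

# Example: a 1.2:1 throw ratio at 2 m gives roughly a 1.67 m x 0.94 m surface.
print(projection_surface_size(2.0, 1.2))
```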
Specifically, in step S11, the computing device 1 acquires a touch operation of the user on the touch pad. In one implementation, the manner of the touch operation may include, but is not limited to, clicking, such as a single click or multiple clicks; pressing, with different degrees of pressure or for different durations; moving; lifting the finger to release; and combinations of the above. It should be understood by those skilled in the art that the above touch operation manners are only examples; other touch operation manners, existing now or appearing in the future, are also included in the protection scope of the present application if applicable, and are incorporated herein by reference. Here, the touch operation submitted by the user may be realized by touching the touch pad with a finger, or with another touch component, such as a stylus.
In one implementation, the touch operation may be used to implement an interactive operation on the interaction information in the display screen, where the manner of the touch operation is matched with the execution result of the subsequent touch operation. For example, the touch operation may implement any customized interaction, such as selection, confirmation, or deletion, of the corresponding interaction information in the display screen.
Next, in step S12, the computing device 1 determines second position information of the touch operation on the display screen based on the first position information of the touch operation on the touch pad and the coordinate mapping relationship between the display screen and the touch pad.
In one implementation, the coordinate mapping relationship between the display screen and the touch pad may be a mapping between the entire display screen and the touch pad; since both are fixed, once this mapping is established in advance it remains relatively fixed. In another implementation, the coordinate mapping relationship may instead be between the human-computer interaction interface in the display screen and the touch pad. Here, the display screen may include a human-computer interaction interface, that is, the interaction information in the display screen is a human-computer interaction interface. In one implementation, the human-computer interaction interface includes an input method interface, which may contain letter keys, number keys, symbol keys, other function keys, and the like; the input of corresponding information is realized through the user's operations on this interface. The human-computer interaction interface may also include various other interfaces, such as a search interface or a function selection interface, or a combination of the above. It should be understood by those skilled in the art that these human-computer interaction interfaces are only examples; other types, existing now or appearing in the future, should be included in the scope of the present application if applicable, and are incorporated herein by reference. Since the human-computer interaction interface in the display screen can change across practical applications, the coordinate mapping relationship between it and the touch pad can also change flexibly.
Here, the coordinate mapping relationship matches each piece of position information on the display screen with corresponding position information on the touch pad. For example, when a user touches first position information (x0, y0) on the touch pad, the touch operation is executed based on the pre-existing coordinate mapping relationship; that is, the touch operation acts on the second position information (x, y) on the display screen matched with the first position information.
For example, suppose the input method on the display screen includes a row of keys: 1 2 3 4 5 6 7 8 9 0. If the user wants the key "0", they only need to touch the first position information on the touch pad that matches the second position information of the key "0" on the display screen.
As another example, with the same row of input method keys, suppose the user wants to move from the current key "0" to the key "1"; or suppose the user's intended target key is "8" but, owing to an initial deviation, the touch lands on the first position information corresponding to "0". In these scenarios the user can slide from the first position information corresponding to the second position information of the key "0" to the first position information corresponding to the second position information of the key "1" or the key "8", thereby selecting the key "1" or the key "8". Here, the touch operation executed on the basis of the coordinate mapping relationship is accomplished by one continuous finger slide, without the prior art's repeated slide-and-release operations on the touch pad, which effectively reduces interaction time.
Next, in step S13, the computing device 1 executes the touch operation according to the second position information. Here, the execution of the touch operation is the response to the user's touch at the first position information. In one implementation, the execution of the touch operation may implement any customized interaction, such as selection, confirmation, or deletion, of the corresponding interaction information in the display screen. For example, the touch operation may include a selection operation performed as a sliding operation: when the current slide passes through first position information on the touch pad, the executed touch operation selects the interaction information at the corresponding second position information on the display screen; for instance, when the human-computer interaction interface is an input method interface and the user's finger currently passes over the numeric key "1", a selection of the numeric key "1" is executed. For another example, the touch operation may include a confirmation operation. In one implementation, the confirmation is a pressing operation: continuing the example above, to confirm the selected numeric key "1", that is, to determine its input, the user presses on the first position information corresponding to the second position information matching the numeric key "1". In another implementation, the confirmation may be a release operation: after selecting the numeric key "1" by sliding, the user completes the confirmation by lifting the finger at the first position information. In yet another implementation, the confirmation may be performed automatically when the touch dwell exceeds a predetermined duration: after sliding to the first position information corresponding to the numeric key "1", the user simply stays at that position longer than the predetermined duration, and the confirmation is executed automatically.
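A hedged Python sketch of these selection and confirmation variants follows; the class, method names, and the 0.8 s dwell threshold are assumptions for illustration, not the patent's implementation:

```python
import time

DWELL_CONFIRM_S = 0.8  # assumed predetermined duration


class TouchExecutor:
    def __init__(self, keyboard):
        self.keyboard = keyboard  # model of the input method interface
        self.selected = None
        self.since = None

    def on_slide(self, screen_pos):
        key = self.keyboard.key_at(screen_pos)
        if key is not self.selected:
            self.selected, self.since = key, time.monotonic()
            self.keyboard.highlight(key)        # selection feedback (see S14)
        elif key and time.monotonic() - self.since > DWELL_CONFIRM_S:
            self.keyboard.confirm(key)          # auto-confirm on dwell
            self.since = float("inf")           # confirm at most once per dwell

    def on_press(self):
        if self.selected:
            self.keyboard.confirm(self.selected)  # confirm by pressing

    def on_release(self):
        if self.selected:
            self.keyboard.confirm(self.selected)  # confirm by lifting finger
```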
In one embodiment, the method further includes step S14 (not shown). In step S14, the computing device 1 may display the touch operation on the display screen according to the second position information. For example, if the touch operation includes a sliding operation, then as it slides through each piece of first position information on the touch pad, the interaction information at each corresponding second position information on the display screen is displayed distinctively in sequence, for example by highlighting each key the slide passes in the input method interface, or by changing the shape, size, or color of the currently selected key, so that the user can follow the progress of the touch operation and the interaction is friendlier.
Alternatively, the execution result information of the touch operation may be displayed on the display screen. For example, if the touch operation includes a confirmation of a key in the input method interface and the letter "a" is input in the English input state, the selected letter "a" is displayed in the corresponding presentation area on the display screen, such as at the text input cursor; in the pinyin input state, once the input of the letter "a" is confirmed, the corresponding area of the input method interface displays the candidate Chinese characters that may follow the letter "a".
Here, it should be understood by those skilled in the art that the above-mentioned manner of displaying the touch operation on the display screen according to the second position information or the manner of displaying the execution result information of the touch operation on the display screen is only an example, and other manners existing or appearing in the future for displaying the touch operation on the display screen according to the second position information or the manner of displaying the execution result information of the touch operation on the display screen should be included in the scope of the present application if applicable to the present application and are included by reference.
In one implementation, the touch pad may support multi-touch operation. In this case, multiple touch operations by the user may be acquired at the same time, for example the touch operations of several fingers. When multiple fingers move to their respective first position information, the second position information corresponding to each is determined, and, for example, the key in the input method interface at each second position information is highlighted; the touch operation may then be executed for each second position information in the expected order of confirmation, for example the user releases the fingers one by one on the first position information corresponding to each second position information to carry out the corresponding confirmations in sequence.
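A minimal multi-touch sketch under assumed names: every finger's first position is mapped to its second position, the matched keys are all highlighted, and each key is confirmed when its finger is released, in the order of release:

```python
def on_multi_touch(touches, pad_to_screen, keyboard):
    """touches: {finger_id: (x0, y0)} in touch-pad coordinates."""
    selected = {}
    for finger_id, (x0, y0) in touches.items():
        key = keyboard.key_at(pad_to_screen(x0, y0))
        keyboard.highlight(key)       # show every concurrently selected key
        selected[finger_id] = key
    return selected  # confirm selected[f] when finger f is released
```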
After a touch operation of the user on the touch pad is acquired, second position information of the touch operation on the display screen is determined based on the first position information on the touch pad, in combination with the coordinate mapping relationship between the display screen and the touch pad, which are separate from each other in the computing device 1, so that the touch operation can be executed based on the second position information. Since a preset coordinate mapping relationship exists between the display screen and the touch pad, that is, each piece of position information on the display screen is matched with corresponding position information on the touch pad, when a user touches first position information on the touch pad, the touch operation is applied, based on this pre-existing mapping, to the second position information on the display screen matched with the first position information; for example, if interaction information to be selected is presented at the second position information, the user's touch operation at the first position information selects it. Therefore, based on the application, the user can quickly locate position information on the display screen through the touch pad and interact with the information displayed there, without sliding back and forth over and releasing the touch pad in the up-down and left-right directions, which effectively reduces time-consuming human-computer interaction, improves the flexibility of interaction, and makes human-computer interaction friendlier.
In another embodiment, the method further includes step S15 (not shown). In step S15, the computing device 1 may establish a coordinate mapping relationship between the human-computer interaction interface in the display screen and the touch pad; then, in step S12, the computing device 1 may determine the second position information of the touch operation on the display screen based on the first position information of the touch operation on the touch pad and the coordinate mapping relationship between the human-computer interaction interface and the touch pad.
Here, the display screen may include a human-computer interaction interface, that is, the interaction information in the display screen is a human-computer interaction interface. In one implementation manner, the human-computer interaction interface includes an input method interface, the input method interface may include letter keys, number keys, symbol keys, other function keys, and the like, and the input of corresponding information may be implemented through the operation of the user on the input method interface. In addition, the human-computer interaction interface can also comprise various interaction interfaces such as a search interface, a function selection interface and the like.
In the present application, in order to avoid the user's complex back-and-forth sliding and releasing operations on the touch pad, and to quickly locate interaction information in the display screen through the touch pad, such as keys in an input method interface, thereby reducing time overhead, the human-computer interaction interface in the display screen and the touch pad may be scaled in relative proportion to form a coordinate mapping relationship. In one implementation, a one-to-one coordinate mapping may be established between the area of the touch pad and the arrangement order and positional relationship of the interaction interface units in the human-computer interaction interface; taking an input method interface as an example, the interaction interface units may include its letter keys, number keys, symbol keys, other function keys, and the like.
An example of establishing the coordinate mapping relationship between the human-computer interaction interface in the display screen and the touch pad is shown in FIG. 4. Suppose the resolution of the touch pad is x1 × y1 and the resolution of the display screen is x2 × y2, and suppose the user's finger touches first position information (x0, y0) on the touch pad, which falls on second position information (x, y) in the input method interface of the display screen. Since the interface and the touch pad are scaled in relative proportion, the positions satisfy

x0 / x1 = x / x2 and y0 / y1 = y / y2,

which gives

x = (x0 · x2) / x1 and y = (y0 · y2) / y1.

In this way, the coordinate mapping relationship between the human-computer interaction interface in the display screen and the touch pad is obtained.
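The proportional formulas above translate directly into code; a minimal sketch with illustrative names (not from the patent):

```python
def pad_to_screen(x0, y0, x1, y1, x2, y2):
    """Map first position information (x0, y0) on a touch pad of resolution
    x1 x y1 to second position information (x, y) on a display screen of
    resolution x2 x y2: x = x0*x2/x1 and y = y0*y2/y1."""
    return x0 * x2 / x1, y0 * y2 / y1

# Example: on a 300 x 200 touch pad mapped to a 1280 x 720 screen, a touch at
# the pad center (150, 100) lands at the screen center (640.0, 360.0).
print(pad_to_screen(150, 100, 300, 200, 1280, 720))
```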
It should be understood by those skilled in the art that the above-mentioned manner for establishing the coordinate mapping relationship between the human-computer interaction interface and the touch pad in the display screen is only an example, and other manners existing now or later to establish the coordinate mapping relationship between the human-computer interaction interface and the touch pad in the display screen, if applicable to the present application, should be included in the scope of protection of the present application and are included herein by reference.
In an implementation manner, the establishment of the coordinate mapping relationship between the whole display screen and the touch pad may also refer to an establishment method of the coordinate mapping relationship between a human-computer interaction interface in the display screen and the touch pad, and at this time, since the display screen and the touch pad are fixed, the coordinate mapping relationship also exists relatively fixedly.
Further, in one embodiment, in step S15, when the human-computer interaction interface in the display screen is updated, the computing device 1 may establish a coordinate mapping relationship between the human-computer interaction interface in the display screen and the touch panel. In practical application, the coordinate mapping relationship can be reestablished when the human-computer interaction interface is switched. For example, after the style of the input method interface is changed based on the selection of the user, the coordinate mapping relationship is reestablished, or for example, if other interaction information of the human-computer interaction interface is switched or changed, the coordinate mapping relationship may be reestablished. Therefore, the matching degree and accuracy of man-machine interaction operation can be guaranteed, and friendly user experience is brought.
In one embodiment, in step S15, the computing device 1 establishes a coordinate mapping relationship between the human-computer interaction interface in the display screen and the effective touch area of the touch pad. In practical applications, the coordinate mapping relationship with the human-computer interaction interface may be established based on part or all of the touch area of the touch pad. If the effective touch area is a partial touch area, only touch operations performed by the user within that area are effective; if the effective touch area comprises the entire touch area, touch operations performed anywhere on the touch pad are effective.
Further, in an embodiment, the method further includes step S16 (not shown). In step S16, when the first position information of the touch operation on the touch pad is outside the effective touch area, the computing device 1 provides corresponding operation prompt information. In one implementation, the operation prompt information may be used to tell the user that the current touch area is invalid, or to guide the user into the effective touch area to perform an effective touch operation. For example, a prompt such as "please click in the effective interaction area" or "please move the touch operation in the xx direction" may be given. In one implementation, the operation prompt information may include, but is not limited to, a voice prompt, a text prompt on the display screen, and the like.
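A hedged sketch of how the effective touch area and the operation prompt of step S16 might fit together; the rectangle representation and all names are assumptions:

```python
def locate(x0, y0, area, iface):
    """area / iface: (left, top, width, height) of the effective touch area
    on the pad and of the human-computer interaction interface on the screen."""
    al, at, aw, ah = area
    if not (al <= x0 < al + aw and at <= y0 < at + ah):
        prompt("Please operate within the effective interaction area")
        return None
    il, it, iw, ih = iface
    # Map relative to the effective area rather than the whole touch pad
    return il + (x0 - al) * iw / aw, it + (y0 - at) * ih / ah


def prompt(message):
    print(message)  # could equally be a voice prompt or on-screen text
```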
In one embodiment, the shape of the effective touch area matches the shape of the human-computer interaction interface.
In practical applications, the shape of the human-computer interaction interface in the display screen, such as the layout of the input method interface, may vary, and the shape of the matched effective touch area can be set based on the shape of that interface. Moreover, the human-computer interaction interface, such as an input method interface, can be presented at any position in the display screen. FIG. 3 shows several sets of examples, according to one embodiment of the present application, of the shape of the effective touch area of the touch pad matching the shape of the human-computer interaction interface in the display screen. The input method interface may occupy the whole display screen or only a partial area of it. In one implementation, the layout of the effective touch area on the touch pad may be adjusted according to the shape of the input method interface in the display screen. For example, if the input method interface is square, the effective touch area of the touch pad is preferably also square, so that the established mapping relationship is accurate and the touch accuracy is high; for another example, if the input method interface is an ellipse, the effective area of the touch pad may still be a rectangle. In one implementation, a mapping relationship between the effective touch area and the human-computer interaction interface can be established even if their shapes are not identical. In one implementation, when the user needs the input method, for example to enter a WiFi password, type text, or input a contact, the input method interface pops up on the display screen, so the shape of the effective touch area on the touch pad can be flexibly matched or adjusted as the shapes of the input method interfaces change across application scenarios.
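One way to realize this shape matching is to pick the largest aspect-ratio-preserving rectangle on the touch pad; a sketch under that assumption (the centering strategy is not specified by the patent):

```python
def fit_effective_area(pad_w, pad_h, iface_w, iface_h):
    # Largest rectangle on the pad with the interface's aspect ratio,
    # so the coordinate mapping is not distorted
    scale = min(pad_w / iface_w, pad_h / iface_h)
    w, h = iface_w * scale, iface_h * scale
    # Center the effective area on the touch pad
    return ((pad_w - w) / 2, (pad_h - h) / 2, w, h)

# Example: a square 100 x 100 pad and a 40 x 10 input-method bar yield a
# 100 x 25 effective area centered vertically: (0.0, 37.5, 100.0, 25.0).
print(fit_effective_area(100, 100, 40, 10))
```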
In one embodiment, the second position information comprises position information of one or more interactive interface units on the human-computer interactive interface in the display screen. In one implementation mode, the interactive interface unit is an interaction object which is provided for a user to perform interactive operation in the human-computer interaction interface. For example, in the input method interface, each key, such as an alphabetic key, a numeric key, a symbol key, other function keys, etc., may correspond to an interactive interface unit. Here, the second location information may correspond to location information of only one interactive interface unit on the human-computer interactive interface included in the display screen, or may correspond to location information of a plurality of interactive interface units at the same time.
Further, in an embodiment, in step S13, the computing device 1 may perform the touch operation on the one or more interactive interface units corresponding to the second position information according to the second position information.
Specifically, in an implementation manner, if only one interactive interface unit corresponding to the second location information is provided, the touch operation may be directly performed to implement any self-defined interactive operation, such as selection, confirmation, deletion, and the like, of the interactive interface unit. For example, it may be set that the touch operation includes a selection operation, the manner of the touch operation includes a sliding operation, and the current sliding passes through first position information on the touch panel, then the executed touch operation is to select interactive information on second position information in the display screen corresponding to the first position information, for example, when a human-computer interaction interface in the display screen is an input method interface, and a user's finger currently crosses a numeric key "1", then a selection operation on the numeric key "1" is executed.
In another implementation, the second position information comprises position information of a plurality of interactive interface units on the human-computer interaction interface in the display screen. In step S13, according to the second position information, the computing device 1 performs the touch operation on the plurality of interactive interface units corresponding to the second position information. The method further comprises step S17 (not shown): in step S17, the computing device 1 may acquire a selection operation of the user on the touch pad, and then execute the selection operation to select the target interactive interface unit among the plurality of interactive interface units corresponding to the second position information. In one implementation, if the user expects to select one or more target units among the several units corresponding to the second position information, the result of performing the touch operation on those units is that the second position information is locked; on that basis, the user can perform a further touch operation on the touch pad, namely the selection operation, to pick the target interactive interface unit.
For example, when the second position information corresponds to the positions of two keys in the input method interface, the two keys are highlighted simultaneously, and then one key is selected by touching up and down or left and right again; similarly, when the second position information corresponds to the adjacent positions of the four keys in the input method interface, the four keys are highlighted simultaneously, and then one key is selected by touching the four keys up, down, left and right again. It will be understood by those skilled in the art that the above selection operations are merely exemplary, and that other selection operations now or later developed, if applicable to the present application, are intended to be included within the scope of the present application and are hereby incorporated by reference.
For another example, there may be multiple target interactive interface units among the interactive interface units corresponding to the second position information; for instance, the second position information corresponds to several keys in the input method interface and some of them need to be selected in succession. In this case, multiple selection operations by the user may be acquired in succession, following the order in which the target units are expected to be selected. For example, when the second position information corresponds to the positions of two keys that the user expects to select one after another, both keys are highlighted simultaneously; the touch operation effectively locks the two keys corresponding to the second position information, and the first target key is then selected by touching again up and down or left and right. The selection operation may include, but is not limited to, a re-pressing operation or a release operation. After one selection is completed and one target key confirmed, the selection of the next target key can proceed: if the selection operation is a re-pressing operation, the user, without lifting the finger, touches again up and down or left and right to select the next target key and presses again to complete its selection; if the selection operation is a release operation, the user, within a preset effective operation time, touches again up and down or left and right to select the next target key and releases again to complete its selection. It will be appreciated by those skilled in the art that the above selection operations are merely exemplary, and that other selection operations now existing or appearing in the future, if applicable to the present application, are intended to be encompassed within the scope of the present application and are hereby incorporated by reference.
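A sketch of this multi-key case under an assumed data model (Key objects with center coordinates cx, cy and a distance helper): the keys covered by the mapped position are locked and highlighted together, and a follow-up directional touch picks the target key:

```python
def lock_candidates(screen_pos, keyboard, radius):
    """Return every key whose center lies within `radius` of the position."""
    keys = [k for k in keyboard.keys if k.distance_to(screen_pos) <= radius]
    for k in keys:
        keyboard.highlight(k)  # highlight the locked keys simultaneously
    return keys


def pick_by_direction(candidates, direction):
    """Select the locked key lying furthest toward the touched direction;
    screen y is assumed to grow downward."""
    dx, dy = {"left": (-1, 0), "right": (1, 0),
              "up": (0, -1), "down": (0, 1)}[direction]
    return max(candidates, key=lambda k: k.cx * dx + k.cy * dy)
```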
FIG. 2 illustrates a device diagram of a computing device 1 for human-computer interaction in accordance with an aspect of the subject application. The computing device 1 comprises a touch operation acquiring device 21, a determining device 22, and a touch operation executing device 23. The touch operation acquiring device 21 acquires a touch operation of a user on the touch pad; the determining device 22 determines second position information of the touch operation on the display screen based on the first position information of the touch operation on the touch pad and the coordinate mapping relationship between the display screen and the touch pad; and the touch operation executing device 23 executes the touch operation according to the second position information.
Here, the computing device 1 may include, but is not limited to, various smart devices, such as smart appliances like smart televisions, as well as portable smart devices and head-mounted display devices such as smart glasses and smart helmets. In one implementation, smart glasses, smart helmets, and the like may also carry functions such as AR, VR, and MR.
In one embodiment, the computing device comprises smart glasses or a smart helmet. Further, in one embodiment, the computing device 1 comprises smart glasses, and the touch pad is disposed on a temple of the frame, for example on one of the temples, or on both temples as needed. The touch pad may also be disposed in other areas separate from the display screen, depending on the actual shape of the smart glasses. In one embodiment, the computing device comprises a smart helmet, and the touch pad may be disposed on the left, right, or rear surface area of the helmet. In practical applications, the touch pad may be disposed on any region of the helmet surface that the user can touch and that does not overlap with the region where the display screen is located, such as the left, right, or rear surface area. For another example, if the display screen is disposed inside the smart helmet, for instance directly in front of the user's field of view when the helmet is worn, the touch pad may be disposed on the helmet surface at the back of the display screen.
In the present application, the computing device 1 includes a display screen and a touch pad that are separate from each other. In one implementation, this separation means the two are spatially apart, neither integrated nor superimposed. In one embodiment, the touch pad is outside the user's visible range when the computing device 1 is in use; the touch pad and the display screen are then not in the same field of view. For example, if the computing device 1 comprises smart glasses whose touch pad is disposed on a temple of the frame, the touch pad is outside the user's visible range while the display screen, disposed on a lens, is within it, so the user cannot see both at the same time. For another example, if the computing device 1 has two opposite surfaces, front and back, with the display screen on the front surface, in the user's view during use, and the touch pad on the back surface of the device, then in the use state the touch pad is outside the user's visible range and not in the same field of view as the display screen. In another embodiment, the touch pad and the display screen are both within the user's visible range when the computing device 1 is in use. For example, if the computing device 1 comprises a smart TV, the touch pad is disposed on the remote control device of the smart TV, and the touch pad and the TV's display screen may be visible to the user at the same time.
In one implementation, the touch pad may be integrated in the computing device 1, such as the touch pad of the smart glasses being integrated on the frame; the touch pad may also be a relatively separate component of the computing device 1, for example the touch pad of the smart television may be a separate control device. In one implementation, the touch pad may be a touch pad supporting single touch, or a touch pad supporting multi-touch. That is, in the present application, the user may touch the touch pad at a single point or at multiple points.
In the present application, the display screen may include a display screen of the computing device 1, for example, one or more display screens; it may also include a projection surface projected by the computing device 1 or by a projection apparatus included in the computing device 1; the display screen may also comprise a combination of the above types. In one implementation, the size information of the projection surface may be set by a projection apparatus corresponding to the computing device 1, such as an external projector or a built-in projection component; for example, the projection apparatus sets the resolution of the projection, and after the projection distance is determined, the size information of the corresponding projection surface is calculated.
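As a minimal sketch, that size calculation can be done under a simple geometric model; the projection angle, function name and parameters below are illustrative assumptions, not fixed by the application:

import math

def projection_size(res_w, res_h, throw_distance_m, fov_deg):
    # width follows from the horizontal projection angle and the distance
    width_m = 2.0 * throw_distance_m * math.tan(math.radians(fov_deg) / 2.0)
    # height preserves the aspect ratio of the set projection resolution
    height_m = width_m * res_h / res_w
    return width_m, height_m

# e.g. a 1280x720 projection component with a 40-degree horizontal angle at 2 m
print(projection_size(1280, 720, 2.0, 40.0))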
Specifically, the touch operation acquisition device 21 acquires a touch operation of the user in the touch panel. In one implementation, the touch operation manner may include, but is not limited to: clicking, such as a single click or multiple clicks; pressing, such as pressing with different degrees of pressure or for different durations; moving; lifting the finger to release; and combinations of the above. It should be understood by those skilled in the art that the above touch operation manners are only examples, and other existing or later-developed touch operation manners, if applicable to the present application, should be included in the protection scope of the present application and are hereby incorporated by reference. Here, the touch operation submitted by the user may be realized by the user touching the touch panel with a finger, or with another touch component such as a stylus.
In one implementation, the touch operation may be used to implement an interactive operation on the interaction information in the display screen, and the manner of the touch operation is matched with the execution result of that touch operation. For example, the touch operation may be performed to implement any customized interaction operation on the corresponding interaction information in the display screen, such as selection, confirmation or deletion.
The determining device 22 determines second position information of the touch operation in the display screen based on the first position information of the touch operation in the touch panel and the coordinate mapping relation between the display screen and the touch panel.
In one implementation, the coordinate mapping relationship between the display screen and the touch pad may include a coordinate mapping relationship between the entire display screen and the touch pad; in this case, since the display screen and the touch pad are fixed, once the coordinate mapping relationship between them is established in advance, it remains relatively fixed. In another implementation, the coordinate mapping relationship between the display screen and the touch pad may include a coordinate mapping relationship between a human-computer interaction interface in the display screen and the touch pad. Here, the display screen may include a human-computer interaction interface, that is, the interaction information in the display screen is a human-computer interaction interface. In one implementation, the human-computer interaction interface includes an input method interface; the input method interface may include letter keys, number keys, symbol keys, other function keys and the like, and the input of corresponding information may be implemented through the user's operation on the input method interface. In addition, the human-computer interaction interface may also comprise various interaction interfaces such as a search interface or a function selection interface, or a combination of the above interaction interfaces. It should be understood by those skilled in the art that the above human-computer interaction interfaces are only examples, and other existing or later-developed types of human-computer interaction interfaces, if applicable to the present application, should be included in the protection scope of the present application and are hereby incorporated by reference. In this case, since the human-computer interaction interface in the display screen may change with different practical applications, the coordinate mapping relationship between the human-computer interaction interface in the display screen and the touch pad may also change flexibly.
Here, the coordinate mapping relationship means that each piece of position information on the display screen is matched with corresponding position information on the touch pad. For example, when a user performs a touch operation on first position information (x0, y0) on the touch pad, the touch operation is executed based on the pre-existing coordinate mapping relationship, that is, the touch operation acts on the second position information (x, y) in the display screen that matches the first position information.
For example, suppose the display screen includes one row of input method keys: 1234567890. The user wants to find the key "0", and can do so by touching the first position information on the touch pad that matches the second position information of the key "0" on the display screen.
As another example, the display screen still includes the row of input method keys 1234567890, and the user wants to move from the current key "0" to the key "1"; or the user's intended target key is the key "8", but due to a prior deviation the touch operation landed on the first position information corresponding to "0". In these application scenarios, the user may move, through a sliding touch operation, from the first position information corresponding to the second position information of the key "0" to the first position information corresponding to the second position information of the key "1" or the key "8", so as to select the key "1" or the key "8". Here, the touch operation executed based on the coordinate mapping relationship may be implemented by continuously sliding the finger, without the back-and-forth sliding and releasing operations on the touch pad required in the prior art, thereby effectively reducing interaction time.
The touch operation executing device 23 executes the touch operation according to the second position information. Here, executing the touch operation is a response to the user's touch operation on the first position information. In one implementation, the touch operation may be performed to implement any customized interaction operation on the corresponding interaction information in the display screen, such as selection, confirmation or deletion. For example, the touch operation may be set to include a selection operation performed by sliding: when the current sliding operation passes through first position information on the touch pad, the touch operation is executed to select the interaction information at the second position information in the display screen corresponding to that first position information; for instance, when the human-computer interaction interface in the display screen is an input method interface and the user's finger currently slides over the numeric key "1", a selection operation on the numeric key "1" is executed. As another example, the touch operation may be set to include a confirmation operation. In one implementation, the confirmation operation includes a pressing operation: continuing the above example, to confirm the selected numeric key "1", that is, to determine to input it, the user performs a pressing operation on the first position information corresponding to the second position information matching the numeric key "1", whereupon the confirmation operation on the numeric key "1" is executed. In another implementation, the confirmation operation may include a release operation: continuing the above example, after the user selects the numeric key "1" through a sliding operation, the confirmation operation on the numeric key "1" is completed by lifting the finger from the first position information. In yet another implementation, the confirmation operation may be executed automatically when the touch duration exceeds a predetermined duration: continuing the above example, after the user slides to the first position information corresponding to the numeric key "1" and stays at that position longer than the predetermined duration, the confirmation operation is executed automatically.
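The following sketch illustrates how such behaviors might be dispatched in code — slide to select, press or release to confirm, dwell beyond a predetermined duration to auto-confirm. All names, callbacks and the dwell threshold are illustrative assumptions, not a definitive implementation of the application:

import time

DWELL_CONFIRM_SECONDS = 1.0  # assumed predetermined duration

class TouchDispatcher:
    def __init__(self, select_cb, confirm_cb):
        self.select_cb = select_cb    # called when a key is selected
        self.confirm_cb = confirm_cb  # called when a key is confirmed
        self.current_key = None
        self.enter_time = None

    def on_move(self, key):
        # sliding onto a new key selects it and starts the dwell timer
        if key != self.current_key:
            self.current_key = key
            self.enter_time = time.monotonic()
            self.select_cb(key)
        elif self.enter_time and time.monotonic() - self.enter_time > DWELL_CONFIRM_SECONDS:
            self.confirm_cb(key)      # staying long enough auto-confirms
            self.enter_time = None

    def on_press(self, key):
        self.confirm_cb(key)          # a (harder) press confirms

    def on_release(self):
        # lifting the finger confirms the currently selected key
        if self.current_key is not None:
            self.confirm_cb(self.current_key)
            self.current_key = None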
In one embodiment, the computing device 1 further includes a display device (not shown). The display device may display the touch operation on the display screen according to the second position information. For example, if the touch operation includes a sliding operation, then as the touch operation slides through each piece of first position information on the touch pad, the interaction information at each corresponding piece of second position information on the display screen is differentially displayed in sequence; for instance, the keys passed over in the input method interface are highlighted in turn, or the shape, size or color of the currently selected key is changed, so that the user can conveniently follow the progress of the touch operation, making the human-computer interaction friendlier.
Alternatively, the execution result information of the touch operation may be displayed on the display screen. For example, the touch operation includes a confirmation operation on a corresponding key in the input method interface: if the letter "a" is input based on the touch operation in the English input state, the selected letter "a" is displayed in the corresponding presentation area on the display screen, such as at the position of the text input cursor; as another example, in the pinyin input state, after the input letter "a" is confirmed based on the touch operation, the corresponding area of the input method interface displays the candidate Chinese characters that may correspond to the letter "a".
Here, it should be understood by those skilled in the art that the above manners of displaying the touch operation, or the execution result information of the touch operation, on the display screen according to the second position information are only examples; other existing or later-developed manners of doing so, if applicable to the present application, should be included in the protection scope of the present application and are hereby incorporated by reference.
In one implementation, the touch panel may support multi-touch operation; in this case, a plurality of touch operations of the user on the touch panel may be acquired at the same time, for example the touch operations of several of the user's fingers. For example, when a plurality of the user's fingers move to their corresponding first position information, the second position information corresponding to each first position information is determined, and for example the keys in the input method interface corresponding to each second position information are highlighted; the touch operation may then be executed for each second position information in the expected confirmation order, for example the user releases the fingers on the first position information corresponding to each second position information in sequence, so as to implement the corresponding confirmation operations in sequence.
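A minimal sketch of that multi-touch sequence, assuming hypothetical pointer identifiers and callbacks (none of these names come from the application):

class MultiTouchTracker:
    def __init__(self, highlight_cb, confirm_cb):
        self.highlight_cb = highlight_cb
        self.confirm_cb = confirm_cb
        self.active = {}                      # pointer id -> selected key

    def on_finger_move(self, pointer_id, key):
        # each finger selects (and highlights) the key it currently covers
        self.active[pointer_id] = key
        self.highlight_cb(key)

    def on_finger_up(self, pointer_id):
        # keys are confirmed in the order the fingers are released
        key = self.active.pop(pointer_id, None)
        if key is not None:
            self.confirm_cb(key)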
After the user's touch operation on the touch pad is acquired, the second position information of the touch operation in the display screen is determined based on the first position information on the touch pad combined with the coordinate mapping relationship between the mutually separate display screen and touch pad of the computing device 1, so that the touch operation can be executed based on the second position information. Here, since a preset coordinate mapping relationship exists between the display screen and the touch pad, that is, each piece of position information on the display screen is matched with corresponding position information on the touch pad, when the user performs a touch operation on first position information on the touch pad, the execution of the touch operation acts, based on the pre-existing coordinate mapping relationship, on the second position information in the display screen matched with that first position information; for example, if the interaction information to be selected is presented at the second position information, the user's touch operation on the first position information can realize the selection of that interaction information. Therefore, based on the present application, a user can quickly locate position information in the display screen through the touch pad and then perform interactive operations on the display information at that position, eliminating the time-consuming back-and-forth sliding and releasing operations on the touch pad, thereby improving the flexibility of human-computer interaction and making the interaction friendlier.
In another embodiment, the computing device 1 further comprises an establishing device (not shown); the establishing device may establish the coordinate mapping relationship between the human-computer interaction interface in the display screen and the touch pad. The determining device 22 may then determine the second position information of the touch operation in the display screen based on the first position information of the touch operation in the touch panel and the coordinate mapping relationship between the human-computer interaction interface and the touch panel.
Here, the display screen may include a human-computer interaction interface, that is, the interaction information in the display screen is a human-computer interaction interface. In one implementation manner, the human-computer interaction interface includes an input method interface, the input method interface may include letter keys, number keys, symbol keys, other function keys, and the like, and the input of corresponding information may be implemented through the operation of the user on the input method interface. In addition, the human-computer interaction interface can also comprise various interaction interfaces such as a search interface, a function selection interface and the like.
In the present application, in order to avoid the complex operations of sliding and releasing back and forth on the touch pad, and at the same time to quickly locate interaction information in the display screen through the touch pad, such as locating keys in an input method interface, and reduce time overhead, a coordinate mapping relationship may be formed by scaling the human-computer interaction interface in the display screen and the touch pad in relative proportion. In one implementation, a mutually corresponding coordinate mapping relationship may be established between the area of the touch pad and the arrangement order and position relationship of the interactive interface units in the human-computer interaction interface; taking an input method interface as an example, the interactive interface units may include the various letter keys, numeric keys, symbol keys and other function keys in the input method interface.
An example of establishing the coordinate mapping relationship between the human-computer interaction interface in the display screen and the touch pad is shown in FIG. 4. Suppose the resolution of the touch pad is x1 × y1 and the resolution of the display screen is x2 × y2, and that the user's finger touches the first position information (x0, y0) on the touch pad, which falls on the second position information (x, y) in the input method interface of the display screen. The calculation formula is as follows:
x0 / x1 = x / x2

from which

x = (x2 / x1) · x0

and

y0 / y1 = y / y2

from which

y = (y2 / y1) · y0
Therefore, the coordinate mapping relation between the man-machine interaction interface in the display screen and the touch pad can be obtained.
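A minimal sketch of this mapping in code (the function name and argument order are illustrative only; the arithmetic is the proportional relation derived above):

def map_touch_to_screen(x0, y0, x1, y1, x2, y2):
    # proportional mapping: x / x2 = x0 / x1 and y / y2 = y0 / y1
    x = x0 * x2 / x1
    y = y0 * y2 / y1
    return x, y

# e.g. a 300x200 touch pad mapped onto a 1280x720 interface
print(map_touch_to_screen(150, 100, 300, 200, 1280, 720))  # -> (640.0, 360.0)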
It should be understood by those skilled in the art that the above-mentioned manner for establishing the coordinate mapping relationship between the human-computer interaction interface and the touch pad in the display screen is only an example, and other manners existing now or later to establish the coordinate mapping relationship between the human-computer interaction interface and the touch pad in the display screen, if applicable to the present application, should be included in the scope of protection of the present application and are included herein by reference.
In one implementation, the coordinate mapping relationship between the entire display screen and the touch pad may also be established by the same method as that between the human-computer interaction interface in the display screen and the touch pad; in this case, since the display screen and the touch pad are fixed, the coordinate mapping relationship also remains relatively fixed.
Further, in one embodiment, the establishing device may establish the coordinate mapping relationship between the human-computer interaction interface in the display screen and the touch pad whenever the human-computer interaction interface in the display screen is updated. In practical applications, the coordinate mapping relationship may be re-established when the human-computer interaction interface is switched. For example, the coordinate mapping relationship is re-established after the style of the input method interface is changed based on the user's selection; likewise, if other interaction information of the human-computer interaction interface is switched or changed, the coordinate mapping relationship may also be re-established. In this way, the matching degree and accuracy of the human-computer interaction operation can be guaranteed, providing a friendly user experience.
In one embodiment, the establishing device may establish a coordinate mapping relationship between a human-computer interaction interface in the display screen and an effective touch area in the touch pad. In practical applications, the coordinate mapping relationship with the human-computer interaction interface in the display screen can be established based on part or all of the touch area of the touch pad. If the effective touch area is a partial touch area, only touch operations performed by the user within that area are effective; if the effective touch area comprises the entire touch area, the user's touch operations anywhere on the touch pad are effective.
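For example, a minimal sketch of an effective-area check, assuming the effective touch area is a rectangle given as (left, top, width, height) in touch pad coordinates (this representation is an assumption for illustration):

def in_effective_area(x0, y0, area):
    # the effective touch area is assumed rectangular: (left, top, width, height)
    left, top, width, height = area
    return left <= x0 < left + width and top <= y0 < top + height

def map_within_area(x0, y0, area, iface_w, iface_h):
    # remap touches relative to the effective area; outside it, return None
    # so the caller can provide operation prompt information instead
    if not in_effective_area(x0, y0, area):
        return None
    left, top, width, height = area
    return ((x0 - left) * iface_w / width, (y0 - top) * iface_h / height)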
Further, in one embodiment, the computing device 1 further includes a prompting device (not shown) that provides corresponding operation prompt information when the first position information of the touch operation on the touch panel is outside the effective touch area. In one implementation, the operation prompt information may be used to prompt the user that the current touch area is invalid, or to guide the user into the effective touch area to perform an effective touch operation. For example, a prompt such as "please click on the effective interaction area" or "please move the touch operation in the xx direction" may be provided. In one implementation, the operation prompt information may include, but is not limited to, a voice prompt, a text prompt in the display screen, and the like.
In one embodiment, the shape of the active touch area matches the shape of the human-computer interaction interface.
In practical applications, the shape of the human-computer interaction interface in the display screen, such as the layout shape of the input method interface, may vary, and the shape of the matching effective touch area may be set based on the shape of the human-computer interaction interface. Moreover, the human-computer interaction interface, such as an input method interface, can be presented at any position in the display screen. FIG. 3 shows several sets of examples of matching the shape of the effective touch area of a touch pad to the shape of the human-computer interaction interface of a display screen according to one embodiment of the present application. The input method interface may occupy the whole display screen, or a partial area presented in the display screen. In one implementation, the layout of the effective touch area in the touch pad may be adjusted according to the shape of the input method interface in the display screen. For example, if the shape of the input method interface is square, the shape of the effective touch area of the touch pad may preferably also be square, so that the established mapping relationship is accurate and the touch accuracy is high; as another example, if the shape of the input method interface is an ellipse, the effective area of the touch pad may still be a rectangle. In one implementation, even if the shape of the effective touch area is not identical to the shape of the human-computer interaction interface, a mapping relationship between them can be established based on a corresponding matching relationship. In one implementation, when the user needs to use an input method, such as in scenarios of entering a WiFi password, typing or inputting contacts, the input method interface pops up and is displayed on the display screen, so the shape of the effective touch area on the touch pad can be flexibly matched or adjusted based on the changing shapes of different input method interfaces in different application scenarios.
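One plausible fitting rule — an assumption for illustration, since the application only requires that the shapes be matched — is to inscribe in the touch pad the largest rectangle sharing the interface's aspect ratio:

def fit_effective_area(pad_w, pad_h, iface_w, iface_h):
    # inscribe the largest rectangle with the interface's aspect ratio
    aspect = iface_w / iface_h
    if pad_w / pad_h > aspect:        # pad is relatively wider: height-limited
        height = pad_h
        width = pad_h * aspect
    else:                             # pad is relatively taller: width-limited
        width = pad_w
        height = pad_w / aspect
    left = (pad_w - width) / 2        # center the effective area on the pad
    top = (pad_h - height) / 2
    return (left, top, width, height)

# e.g. fit a square input method interface onto a 300x100 temple touch pad
print(fit_effective_area(300, 100, 1, 1))  # -> (100.0, 0.0, 100.0, 100.0)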
In one embodiment, the second position information comprises position information of one or more interactive interface units on the human-computer interaction interface in the display screen. In one implementation, an interactive interface unit is an interaction object in the human-computer interaction interface that is provided for the user to operate on. For example, in the input method interface, each key, such as a letter key, a numeric key, a symbol key or another function key, may correspond to an interactive interface unit. Here, the second position information may correspond to the position information of only one interactive interface unit on the human-computer interaction interface in the display screen, or may correspond to the position information of several interactive interface units at the same time.
Further, in an embodiment, the touch operation executing device 23 may execute the touch operation on the one or more interactive interface units corresponding to the second position information according to the second position information.
Specifically, in one implementation, if there is only one interactive interface unit corresponding to the second position information, the touch operation may be directly executed to implement any customized interaction operation on that interactive interface unit, such as selection, confirmation or deletion. For example, the touch operation may be set to include a selection operation performed by sliding: when the current sliding operation passes through first position information on the touch pad, the touch operation is executed to select the interaction information at the second position information in the display screen corresponding to that first position information; for instance, when the human-computer interaction interface in the display screen is an input method interface and the user's finger currently slides over the numeric key "1", a selection operation on the numeric key "1" is executed.
In another implementation, the second position information comprises position information of a plurality of interactive interface units on the human-computer interaction interface in the display screen. The touch operation executing device 23 may execute the touch operation on the plurality of interactive interface units corresponding to the second position information according to the second position information. Here, the computing device 1 further comprises a selection operation acquisition device (not shown) that may acquire a selection operation of the user on the touch panel, and a selection operation executing device (not shown) that executes the selection operation to realize the selection of a target interactive interface unit among the plurality of interactive interface units corresponding to the second position information. In one implementation, if the user expects to select one or more target interactive interface units among the plurality of interactive interface units corresponding to the second position information, the result of the computing device 1 executing the touch operation on those interactive interface units is that the second position information is locked; on that basis, the user can further perform a touch operation, namely the selection operation, on the touch panel to select the target interactive interface unit.
For example, when the second position information corresponds to the positions of two keys in the input method interface, the two keys are highlighted simultaneously, and one key is then selected by a further up-down or left-right touch; similarly, when the second position information corresponds to the adjacent positions of four keys in the input method interface, the four keys are highlighted simultaneously, and one key is then selected by a further touch up, down, left or right. It will be understood by those skilled in the art that the above selection operations are merely exemplary, and that other existing or later-developed selection operations, if applicable to the present application, are intended to be included within the protection scope of the present application and are hereby incorporated by reference.
As another example, there may be multiple target interactive interface units among the plurality of interactive interface units corresponding to the second position information; for instance, the second position information corresponds to several keys in the input method interface, some of which need to be selected in succession. In this case, multiple selection operations of the user may be acquired in succession, based on the expected selection order of the target interactive interface units. For example, when the second position information corresponds to the positions of two keys in the input method interface and these are two keys that the user expects to select in succession, the two keys are highlighted simultaneously; the touch operation then in effect locks the two keys corresponding to the second position information, and the earlier target key is selected by a further up-down or left-right touch. The selection operation may include, but is not limited to, a re-pressing operation or a release operation; after one selection is completed and one target key confirmed, the selection of the next target key may proceed. For example, if the selection operation includes a re-pressing operation, the next target key may be selected by a further up-down or left-right touch without lifting the finger, and the re-pressing operation is performed again to complete the selection of that key. As another example, if the selection operation includes a release operation, the selection of the subsequent target keys may be completed by, within a preset effective operation time, again touching up, down, left or right to select the next target key and performing the release operation again. It will be appreciated by those skilled in the art that the above selection operations are merely exemplary, and that other existing or later-developed selection operations, if applicable to the present application, are intended to be encompassed within the protection scope of the present application and are hereby incorporated by reference.
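A minimal sketch of this disambiguation step, assuming the locked candidates are ordered row-major (up-left, up-right, down-left, down-right); the ordering, direction names and highlight stand-in are all illustrative assumptions:

def highlight(key):
    print(f"highlight {key}")             # stand-in for the differential display

def lock_candidates(keys_at_position):
    # all keys covered by the second position information are highlighted
    for key in keys_at_position:
        highlight(key)
    return list(keys_at_position)

def pick_target(candidates, direction):
    # a further directional touch resolves one key among the locked candidates
    order = ["up-left", "up-right", "down-left", "down-right"]
    return candidates[order.index(direction) % len(candidates)]

locked = lock_candidates(["u", "i", "j", "k"])   # four adjacent keys
print(pick_target(locked, "down-left"))          # -> "j"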
The present application further provides a computing device for human-computer interaction, comprising:
a display screen and a touch panel that are separated;
one or more processors;
a memory; and
one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the programs comprising instructions for:
acquiring touch operation of a user in the touch pad;
determining second position information of the touch operation in the display screen based on first position information of the touch operation in the touch pad and a coordinate mapping relation between the display screen and the touch pad;
and executing the touch operation according to the second position information.
Further, based on the above operations, the programs of the device may also be used to perform the corresponding operations of the other related embodiments.
The present application further provides a computer-readable storage medium having a computer program stored thereon, the computer program being executable by a processor to:
acquiring touch operation of a user in the touch pad;
determining second position information of the touch operation in the display screen based on first position information of the touch operation in the touch panel and a coordinate mapping relation between the display screen and the touch panel;
and executing the touch operation according to the second position information.
Further, the computer program may also be executed by the processor for corresponding operations in other related embodiments based on the operations described above.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.
It should be noted that the present invention may be implemented in software and/or in a combination of software and hardware, for example, as an Application Specific Integrated Circuit (ASIC), a general purpose computer or any other similar hardware device. In one embodiment, the software program of the present invention may be executed by a processor to implement the steps or functions described above. Also, the software programs (including associated data structures) of the present invention can be stored in a computer readable recording medium, such as RAM memory, magnetic or optical drive or diskette and the like. Further, some of the steps or functions of the present invention may be implemented in hardware, for example, as circuitry that cooperates with the processor to perform various steps or functions.
Furthermore, parts of the invention may be applied as a computer program product, e.g. computer program instructions, which, when executed by a computer, may invoke or provide the method and/or solution according to the invention by operation of the computer. Program instructions which invoke the methods of the present invention may be stored on a fixed or removable recording medium and/or transmitted via a data stream on a broadcast or other signal-bearing medium and/or stored within a working memory of a computer device operating in accordance with the program instructions. An embodiment according to the invention herein comprises an apparatus comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the apparatus to perform a method and/or solution according to embodiments of the invention as described above.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it will be obvious that the term "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or means recited in the apparatus claims may also be implemented by one unit or means in software or hardware. The terms first, second, etc. are used to denote names, but not to denote any particular order.

Claims (24)

1. A human-computer interaction method for a computing device comprising a separate display screen and touch pad, the touch pad not in the same field of view as the display screen, the display screen comprising a human-computer interaction interface, wherein the method comprises:
acquiring touch operation of a user in the touch pad, wherein the touch operation comprises sliding operation through a plurality of pieces of first position information on the touch pad;
establishing a coordinate mapping relation between a human-computer interaction interface in the display screen and an effective touch area in the touch pad, wherein the layout of the effective touch area is determined by the shape of the human-computer interaction interface, and the human-computer interaction interface is presented in a partial area in the display screen;
determining a plurality of second position information of the touch operation in the display screen based on a plurality of first position information of the touch operation in the touch pad and a coordinate mapping relation between the man-machine interaction interface in the display screen and the touch pad, wherein the coordinate mapping relation comprises that the position information in the display screen has corresponding position information matched with the position information on the touch pad, and each second position information comprises the position information of one or more interaction interface units on the man-machine interaction interface in the display screen;
and sequentially and distinctively displaying the interactive interface units on each second position information in the display screen according to the plurality of second position information, and executing the touch operation.
2. The method of claim 1, wherein the method further comprises:
displaying the touch operation on the display screen according to the second position information; or,
and displaying the execution result information of the touch operation on the display screen.
3. The method of claim 1, wherein the establishing a coordinate mapping relationship between a human-computer interaction interface in the display screen and an effective touch area in the touch pad comprises:
and when the human-computer interaction interface in the display screen is updated, establishing a coordinate mapping relation between the human-computer interaction interface in the display screen and the effective touch area in the touch pad.
4. The method of claim 1, wherein the method further comprises:
and when the first position information of the touch operation in the touch pad is out of the effective touch area, providing corresponding operation prompt information.
5. The method of claim 1, wherein the human-machine interface comprises at least any one of:
an input method interface;
a search interface;
a function selection interface.
6. The method of claim 1, wherein the performing the touch operation according to the second location information comprises:
and executing the touch operation on the one or more interactive interface units corresponding to the second position information according to the second position information.
7. The method of claim 6, wherein if the second position information comprises position information of a plurality of interactive interface units on a human-computer interactive interface in a display screen; wherein the performing the touch operation according to the second position information includes:
according to the second position information, the touch operation is executed on the plurality of interactive interface units corresponding to the second position information;
wherein the method further comprises:
acquiring selection operation of a user in the touch pad;
and executing the selection operation to realize the selection of the target interactive interface unit in the plurality of interactive interface units corresponding to the second position information.
8. The method of claim 1, wherein the computing device comprises smart glasses or a smart helmet.
9. The method of claim 8, wherein the computing device comprises smart glasses, the touch pad being disposed on a temple of a frame of the smart glasses.
10. The method of claim 8, wherein the computing device comprises a smart helmet, the touchpad disposed on a left, right, or rear side surface area of the smart helmet.
11. The method of claim 1, wherein the touchpad is outside of user-viewable range when the computing device is in use.
12. A computing device for human-computer interaction, the computing device comprising a separate display screen and touch pad, the touch pad not in a same field of view as the display screen, the display screen comprising a human-computer interaction interface, wherein the computing device comprises:
touch operation acquisition means for acquiring a touch operation of a user in the touch panel, wherein the touch operation includes a slide operation passing through a plurality of pieces of first position information on the touch panel;
the device comprises an establishing device and a touch panel, wherein the establishing device is used for establishing a coordinate mapping relation between a human-computer interaction interface in the display screen and an effective touch area in the touch panel, the layout of the effective touch area is determined by the shape of the human-computer interaction interface, and the human-computer interaction interface is displayed in a part of area in the display screen;
the determining device is used for determining a plurality of pieces of second position information of the touch operation in the display screen based on a plurality of pieces of first position information of the touch operation in the touch panel and a coordinate mapping relation between the man-machine interaction interface in the display screen and the touch panel, wherein the coordinate mapping relation comprises that the position information in the display screen has corresponding position information matched with the position information on the touch panel, and each piece of second position information comprises the position information of one or more interaction interface units on the man-machine interaction interface in the display screen;
and the touch operation executing device is used for sequentially and differently displaying the interactive interface units on each piece of second position information in the display screen according to the plurality of pieces of second position information and executing the touch operation.
13. The computing device of claim 12, wherein the device further comprises:
the display device is used for displaying the touch operation on the display screen according to the second position information; or displaying the execution result information of the touch operation on the display screen.
14. The computing device of claim 12, wherein the establishing means is to:
and when the human-computer interaction interface in the display screen is updated, establishing a coordinate mapping relation between the human-computer interaction interface in the display screen and the effective touch area in the touch pad.
15. The computing device of claim 12, wherein the computing device further comprises:
and the prompt device is used for providing corresponding operation prompt information when the first position information of the touch operation in the touch panel is outside the effective touch area.
16. The computing device of claim 12, wherein the human-machine interaction interface comprises at least any one of:
an input method interface;
a search interface;
a function selection interface.
17. The computing device of claim 12, wherein the touch operation performing means is to:
and executing the touch operation on the one or more interactive interface units corresponding to the second position information according to the second position information.
18. The computing device of claim 17, wherein if the second location information includes location information of a plurality of interactive interface units on a human-computer interactive interface in a display screen;
wherein the touch operation execution device is used for:
according to the second position information, the touch operation is executed on the plurality of interactive interface units corresponding to the second position information;
wherein the computing device further comprises:
the selection operation acquisition device is used for acquiring the selection operation of the user in the touch panel;
and the selection operation executing device is used for executing the selection operation to realize the selection of a target interactive interface unit in the plurality of interactive interface units corresponding to the second position information.
19. The computing device of claim 12, wherein the computing device comprises smart glasses or a smart helmet.
20. The computing device of claim 19, wherein the computing device comprises smart glasses, the touchpad disposed on a temple of a frame of the smart glasses.
21. The computing device of claim 19, wherein the computing device comprises a smart helmet, the touchpad disposed at a left, right, or rear side surface area of the smart helmet.
22. The computing device of claim 12, wherein the touchpad is outside of a user's field of view when the computing device is in use.
23. A computing device for human-computer interaction, comprising:
a display screen and a touch panel that are separated;
one or more processors;
a memory; and
one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the programs comprising instructions for performing the method of any of claims 1-11.
24. A computer-readable storage medium, on which a computer program is stored, which computer program can be executed by a processor for performing the method according to any of claims 1-11.
CN201711345498.5A 2017-12-14 2017-12-14 Human-computer interaction method and computing equipment for human-computer interaction Active CN107967091B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711345498.5A CN107967091B (en) 2017-12-14 2017-12-14 Human-computer interaction method and computing equipment for human-computer interaction

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711345498.5A CN107967091B (en) 2017-12-14 2017-12-14 Human-computer interaction method and computing equipment for human-computer interaction

Publications (2)

Publication Number Publication Date
CN107967091A CN107967091A (en) 2018-04-27
CN107967091B true CN107967091B (en) 2022-12-06

Family

ID=61995326

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711345498.5A Active CN107967091B (en) 2017-12-14 2017-12-14 Human-computer interaction method and computing equipment for human-computer interaction

Country Status (1)

Country Link
CN (1) CN107967091B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110347469A (en) * 2019-07-12 2019-10-18 北大方正集团有限公司 Interaction processing method and device
CN112328160A (en) * 2020-10-26 2021-02-05 歌尔智能科技有限公司 Input control method of terminal equipment, terminal equipment and readable storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104793353A (en) * 2015-01-04 2015-07-22 北京君正集成电路股份有限公司 Intelligent glasses
CN105183175A (en) * 2015-10-23 2015-12-23 周维 Wearable communication equipment
CN106527916A (en) * 2016-09-22 2017-03-22 乐视控股(北京)有限公司 Operating method and device based on virtual reality equipment, and operating equipment

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104793353A (en) * 2015-01-04 2015-07-22 北京君正集成电路股份有限公司 Intelligent glasses
CN105183175A (en) * 2015-10-23 2015-12-23 周维 Wearable communication equipment
CN106527916A (en) * 2016-09-22 2017-03-22 乐视控股(北京)有限公司 Operating method and device based on virtual reality equipment, and operating equipment

Also Published As

Publication number Publication date
CN107967091A (en) 2018-04-27

Similar Documents

Publication Publication Date Title
CN108132744B (en) Method and equipment for remotely controlling intelligent equipment
US9250789B2 (en) Information processing apparatus, information processing apparatus control method and storage medium
US11068149B2 (en) Indirect user interaction with desktop using touch-sensitive control surface
US9524097B2 (en) Touchscreen gestures for selecting a graphical object
JP2018097364A (en) Information processing apparatus, program, and information processing method
US20120299876A1 (en) Adaptable projection on occluding object in a projected user interface
US10387033B2 (en) Size reduction and utilization of software keyboards
KR102205283B1 (en) Electro device executing at least one application and method for controlling thereof
US20170047065A1 (en) Voice-controllable image display device and voice control method for image display device
CN102768607B (en) Method and device for realizing touch operation application program
JP2014527673A (en) Widget processing method and apparatus, and mobile terminal
WO2018010440A1 (en) Projection picture adjusting method and apparatus, and projection terminal
US11275501B2 (en) Creating tables using gestures
US20120179963A1 (en) Multi-touch electronic device, graphic display interface thereof and object selection method of multi-touch display
KR102237659B1 (en) Method for input and apparatuses performing the same
CN107967091B (en) Human-computer interaction method and computing equipment for human-computer interaction
TWI442305B (en) A operation method and a system of the multi-touch
US20140317549A1 (en) Method for Controlling Touchscreen by Using Virtual Trackball
JPWO2017022031A1 (en) Information terminal equipment
US20180173398A1 (en) Touch panel type information terminal device, information input processing method and program thereof
WO2013047023A1 (en) Display apparatus, display method, and program
WO2015090092A1 (en) Method and device for generating individualized input panel
WO2010020961A1 (en) Displaying an image
JP2015022675A (en) Electronic apparatus, interface control method, and program
JP6327834B2 (en) Operation display device, operation display method and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP02 Change in the address of a patent holder

Address after: 201210 7th Floor, No. 1, Lane 5005, Shenjiang Road, China (Shanghai) Pilot Free Trade Zone, Pudong New Area, Shanghai

Patentee after: HISCENE INFORMATION TECHNOLOGY Co.,Ltd.

Address before: Room 1109, No. 570, Shengxia Road, China (Shanghai) Pilot Free Trade Zone, Pudong New Area, Shanghai 201203

Patentee before: HISCENE INFORMATION TECHNOLOGY Co.,Ltd.