CN109782920A - Human-computer interaction method for extended reality and processing terminal - Google Patents
- Publication number
- CN109782920A (application CN201910094783.7A)
- Authority
- CN
- China
- Prior art keywords
- interactive controls
- control signal
- human-computer interaction
- extended reality
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Abstract
The present invention relates to a human-computer interaction method for extended reality and a processing terminal. The method comprises the following steps: step 1, identifying all interactive controls of a human-computer interaction interface; step 2, binding at least one interactive control to a corresponding control signal, establishing a one-to-one or one-to-many mapping between the interactive controls and the control signals, and operating the interactive controls through the control signals. Compared with existing human-computer interaction methods in extended reality, the present invention binds the interactive controls of the current human-computer interaction interface directly and automatically to control signals. The user does not need to move to a target control before operating it; instead, the system automatically binds each control signal to a corresponding trigger switch, and the control signal is issued through that trigger switch. No large physical space is occupied, and other people are neither disturbed nor affected during operation, which is very convenient.
Description
Technical field
The present invention relates to the technical field of human-computer interaction, and in particular to a human-computer interaction method for extended reality and a processing terminal.
Background art
XR (extended reality, which includes AR, VR, and MR), and especially AR and VR, has already been applied in all walks of real life. In these extended-reality applications, the common human-computer interaction control method is to move a cursor or manipulation focus to a target and then perform operations such as clicking, dragging, and scaling on the interactive control, thereby completing the human-computer interaction process. In this control mode, the user moves the cursor or manipulation focus onto the target control and then executes the corresponding action to complete the target operation. The user must actively move to the target control, which is not only cumbersome when manipulating the controls, but also usually requires a certain amount of physical activity space around the user before the various interactive operations can be performed. The eye-tracking technology used in extended reality imposes no requirement on activity space, but because the gaze trajectory of the human eye jumps frequently and the line of sight is dispersed, its precision is still inferior to keyboard and mouse input, so at present it serves only as an auxiliary means of operation. Therefore, a new human-computer interaction method for extended reality is needed, one that can realize the various operations of human-computer interaction without requiring dedicated activity space.
Summary of the invention
In view of the deficiencies of the prior art, a first object of the present invention is to provide a human-computer interaction method for extended reality that makes human-computer interaction convenient to operate and requires no activity space;
A second object of the present invention is to provide a processing terminal, which solves the problem of character input in extended reality.
The technical solution achieving the first object of the present invention is a human-computer interaction method for extended reality, comprising the following steps:
Step 1: identify all interactive controls of the human-computer interaction interface;
Step 2: bind at least one interactive control to a corresponding control signal, establish a one-to-one or one-to-many mapping between the interactive controls and the control signals, and operate the interactive controls through the control signals.
Further, the interactive controls are bound to the corresponding control signals according to a first predefined rule.
Further, the first predefined rule binds each interactive control according to its positional order in the human-computer interaction interface.
Further, the first predefined rule binds each interactive control according to the user's frequency of use of the interactive controls.
Further, the method includes: when a switch-binding signal is received, releasing the binding between the current control signals and interactive controls, and automatically rebinding the remaining interactive controls of the human-computer interaction interface to control signals.
Further, the control signal is input through a portable wearable device, a gesture recognition device, a speech recognition device, a handle, or a signal controller.
Further, the following step precedes step 1: identify, through an eye-tracking device or a brain-wave recognition device, the attention range of the current user's eyeball or brain waves on the human-computer interaction interface, and judge whether any interactive control lies within that attention range; if so, execute step 2; otherwise, execute step 1.
Further, the step of identifying, through the eye-tracking device or brain-wave recognition device, the attention range of the current user's eyeball or brain waves on the human-computer interaction interface is realized through the following steps: identify, through the eye-tracking device or brain-wave recognition device, the focal point of the current user's eyeball or brain waves on the human-computer interaction interface, and set an attention range on the human-computer interaction interface, the attention range being a virtual region formed by extension from the focal point as a reference point according to a second predefined rule.
Further, the control signal is input through an external trigger device.
The technical solution achieving the second object of the present invention is a processing terminal comprising:
a memory for storing program instructions; and
a processor for running the program instructions to execute the steps of the above human-computer interaction method for extended reality.
The beneficial effects of the invention are as follows: compared with existing human-computer interaction methods in extended reality, the present invention binds the interactive controls of the current human-computer interaction interface directly and automatically to control signals. The user no longer needs to move to a target control before operating it; instead, the system automatically binds each control signal to a corresponding trigger switch. The user simply presses the trigger switch to issue the specified control signal, and the corresponding control executes the corresponding action to complete the target operation. Control signals can therefore be issued from any position: the switches may be attached to clothing, placed in a pocket, or attached to a fingertip. No large gestures or movements are needed to operate an extended-reality application, no large physical space is occupied, and other people are neither disturbed nor affected, which is very convenient.
Detailed description of the invention
Fig. 1 is a schematic diagram of the binding between control signals and interactive controls according to the invention.
Specific embodiment
The present invention is described further below in conjunction with the drawings and specific embodiments:
As shown in Fig. 1, a human-computer interaction method for extended reality includes obtaining the interactive controls of a human-computer interaction interface. The interface is usually the operating interface of a browser web page, a piece of software, or an app. Each interactive control is bound to a corresponding control signal according to a first predefined rule; one interactive control may be bound to one control signal or to multiple corresponding control signals. The first predefined rule includes binding each interactive control by positional order, by the user's operating habits, or by the user's frequency of use of the control. A one-to-one or one-to-many mapping between interactive controls and control signals is thus established. The control signals may be issued by a portable wearable device, a gesture recognition device, a speech recognition device, a handle, or a signal controller, and the interactive controls are operated through the control signals, completing the human-computer interaction. The detailed process is as follows:
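The two steps above can be sketched as a simple binding table. The following Python sketch is illustrative only — the class name, signal identifiers, and callback shapes are assumptions, not part of the patent: it registers one-to-one or one-to-many bindings between control signals and interactive-control actions and dispatches an incoming signal to everything bound to it.

```python
class ControlBinder:
    """Minimal sketch of step 2: bind control signals to interactive controls."""

    def __init__(self):
        # One control signal may map to one or several interactive controls.
        self.bindings = {}

    def bind(self, signal_id, control_action):
        """Bind a control signal to an interactive control's action (one-to-many allowed)."""
        self.bindings.setdefault(signal_id, []).append(control_action)

    def unbind_all(self):
        """Release all current bindings, e.g. when a switch-binding signal arrives."""
        self.bindings.clear()

    def on_signal(self, signal_id):
        """Operate every interactive control bound to the received signal."""
        return [action() for action in self.bindings.get(signal_id, [])]


# Hypothetical usage: a "thumb press" signal bound to a send-message control.
binder = ControlBinder()
binder.bind("thumb_press", lambda: "message sent")
print(binder.on_signal("thumb_press"))  # ['message sent']
```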
First, all interactive controls on the human-computer interaction interface are identified, whether the controls belong to a browser web page, software, or an app. Interactive controls can usually be detected and identified automatically with the help of automated testing tools; this technique belongs to the prior art and is not described in detail here. For special controls that cannot be recognized directly, the developer of the application can arrange in advance to mark them with special identifiers, for example by defining control attribute values or by providing an inventory of operable control interfaces, so that the interactive controls can be identified. In summary, the interactive controls can all be identified.
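As a rough illustration of this identification step — the widget-tree structure and type names below are hypothetical, since a real implementation would query an automated-testing or accessibility API — interactive controls can be collected by walking the interface's control tree and keeping the nodes whose type marks them as operable:

```python
# Hypothetical widget tree node: (name, type, children).
INTERACTIVE_TYPES = {"button", "textbox", "slider", "link"}

def find_interactive_controls(widget):
    """Recursively collect the names of controls whose type marks them as interactive."""
    name, wtype, children = widget
    found = [name] if wtype in INTERACTIVE_TYPES else []
    for child in children:
        found.extend(find_interactive_controls(child))
    return found

ui = ("window", "container", [
    ("send", "button", []),
    ("panel", "container", [("message", "textbox", []), ("logo", "image", [])]),
])
print(find_interactive_controls(ui))  # ['send', 'message']
```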
After the interactive controls are identified, each is bound to a corresponding control signal. The control signal may be produced and controlled by a portable wearable device: a smart glove, a controller that can be worn on a body part such as an arm or a leg (for example a smart bracelet, smart watch, or smart ring), a portable controller that can, for instance, be placed in a pocket, or of course a combination of several such devices. Take binding the interactive controls of a social-networking application to a smart glove as an example; the smart glove may be, for instance, the Miiglove smart glove from a Jinan-based intelligent-technology company, used to realize the human-computer interaction. The smart glove is equipped with various sensors, including pressure sensors, on the fingertips of its five finger parts, so as to recognize the various operations of the user's hand. For example, the first interactive control is bound to the thumb of the smart glove to realize the send-key function: once the smart glove recognizes a thumb action, the first interactive control triggers the send function of the social application, for example sending the chat message entered by the user. Similarly, the second interactive control is bound to the index finger to realize input of text, emoticons, and other information; the third interactive control is bound to the middle finger to open and close the application; the fourth interactive control is bound to the ring finger to open and close the user avatar, showing or hiding the user's avatar information; and the fifth interactive control is bound to the little finger to open and close chat windows, so as to chat with a specific friend. By binding these five interactive controls through the smart glove, the main functions of the social application can be realized and social chatting needs met.

Of course, the number of interactive controls on an interface is often relatively large, while the carrier a smart glove offers for binding has only five finger parts, so one smart glove can bind only five interactive controls. In such cases it suffices to bind the main, commonly used controls; such bindings satisfy the application scenarios of some extended-reality uses, for example certain game scenes, where the number of interactive controls typically used is limited and entirely sufficient for the application. Alternatively, the number of bindable carriers can be increased: the remaining interactive controls can be bound to the finger parts of another smart glove, or an interactive control can be triggered jointly by two or more finger parts. If the interactive controls of a human-computer interaction interface really do greatly outnumber the bindable carriers, one control signal can be defined as a rebinding signal, which unbinds the control signals currently bound to interactive controls and rebinds the released control signals to the remaining interactive controls according to the first predefined rule. For example, a sliding-selection signal can serve as the rebinding signal: all interactive controls of the interface are divided into several groups in advance; when a sliding-selection signal is received, the currently bound group is automatically unbound and the control signals are bound to the next group of interactive controls, or the sliding-selection signal chooses which group is bound. In this way all interactive controls on the interface can be bound, ensuring that the controls on all interactive interfaces can be manipulated simultaneously or in time-sharing fashion.
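The five-finger binding in the example above amounts to a simple dispatch table. The sketch below is illustrative — the function names and return strings are assumptions, not part of the patent:

```python
# Each smart-glove finger triggers one interactive control of the social app.
def send_message():
    return "send"

def input_text():
    return "input"

def toggle_app():
    return "open/close app"

def toggle_avatar():
    return "show/hide avatar"

def toggle_chat_window():
    return "open/close chat window"

FINGER_BINDINGS = {
    "thumb": send_message,
    "index": input_text,
    "middle": toggle_app,
    "ring": toggle_avatar,
    "little": toggle_chat_window,
}

def on_finger_event(finger):
    """Dispatch a recognized finger action to its bound interactive control."""
    return FINGER_BINDINGS[finger]()

print(on_finger_event("thumb"))  # send
```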
In the above, each interactive control is bound to a smart glove on which pressure sensors are provided: the pressure sensors sense the user's various operations and thereby operate the interactive controls. Bending sensors may be used instead of pressure sensors, so that an interactive control is operated by bending a finger part; various other sensors, such as stretch sensors, can likewise be used, triggering a control signal through the corresponding movement and thus operating the interactive controls.
The bound interactive controls may include drag-operation controls. For example, a control signal A is predefined to bind the drag operation, and four further control signals C, D, E, and F are defined for movement in the up, down, left, and right directions respectively. While the trigger key of control signal A is pressed and held, inputting one of the four control signals C, D, E, and F moves the control one unit distance in the corresponding direction. A drag-operation control here is thus bound to two control signals: one is control signal A, and the other is any one of C, D, E, and F. If one of C, D, E, and F is input with a long press (for example 2 s), the control moves a certain distance in the corresponding direction. Similarly, a scroll-operation control of three-dimensional space can be bound: the scroll-operation control is bound to control signal W, and three control signals X, Y, and Z are defined for rotation about the X, Y, and Z axes of the bound control. While the trigger key of W is held down, clicking X, Y, or Z rotates the control one unit angle about the corresponding axis, and a long press (for example 2 s) of X, Y, or Z rotates it by a larger angle about that axis.
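The held-modifier drag scheme above (signal A held, plus one of C, D, E, F) can be sketched as a chord handler. The signal names and the 2 s threshold follow the example in the text; the unit distances are illustrative assumptions:

```python
UNIT = 1           # unit distance moved per short press
LONG_DISTANCE = 5  # assumed distance moved by a long press (>= 2 s)

# Direction signals: up, down, left, right.
DIRECTIONS = {"C": (0, 1), "D": (0, -1), "E": (-1, 0), "F": (1, 0)}

def drag(position, a_held, direction_signal, press_seconds):
    """Move the dragged control, but only while modifier signal A is held."""
    if not a_held or direction_signal not in DIRECTIONS:
        return position
    dx, dy = DIRECTIONS[direction_signal]
    step = LONG_DISTANCE if press_seconds >= 2.0 else UNIT
    return (position[0] + dx * step, position[1] + dy * step)

print(drag((0, 0), a_held=True, direction_signal="C", press_seconds=0.2))  # (0, 1)
print(drag((0, 0), a_held=True, direction_signal="F", press_seconds=2.5))  # (5, 0)
```

The three-dimensional scroll binding (W held, plus X, Y, or Z) follows the same pattern with rotation angles in place of distances.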
Above, the interactive controls are bound through a smart glove; the binding can also be realized through an eye-tracking device. An eye-tracking device can identify the attention range of the eyeball on the human-computer interaction interface, so that the interactive controls within that attention range are automatically bound to control signals. The attention range of the user's eyeball usually indicates the interactive control the user wants to operate next, because before performing a specific operation the user's gaze normally moves first to the region of that control or to its vicinity; in other words, the eyeball's attention range contains the control the user intends to operate next, so binding the controls within the attention range automatically binds exactly the controls needed for the next operation.

The detailed process is as follows. First, the user's current attention range on the human-computer interaction interface is identified, i.e., the recognition range of the eye-tracking device on the interface. This range can take the focal point of the eyeball on the interface as a reference point and extend it according to a second predefined rule to form a corresponding virtual region: for example, the second predefined rule may extend the focal point as the center of a circle, or as the geometric center of a regular geometric region such as a rectangle, rhombus, or triangle, or as the reference point of an irregular region such as a convex shape or an M shape. The length and width or diameter of the extended shape is preset: for example, the diameter of the circle is preset to 10 length units (for example with cm as the unit), the length and width of a regular geometric region are set to 12 x 10, and the maximum and minimum diameters of an irregular region are set to 14 and 12 respectively. Then the interactive controls of the interface that lie within the attention range are automatically bound to the corresponding control signals. If there is no interactive control within the current attention range, the controls on the interface can be bound to control signals according to the first predefined rule, for example by the positional order of the controls in the interface, or by each control's distance from the focal point, i.e., the control nearest the focal point is bound to a control signal first. When the user's eyeball moves to another region of the interface and produces a new attention range, the interactive controls of the new attention range are identified and bound again.

The control signal here may be produced by a portable wearable device such as a smart glove, a gesture recognition device, a speech recognition device, a handle, or a signal controller, and directly binds and operates the interactive controls. The icon or mark of a control signal on the display interface need not coincide with the position of the interactive control: for example, the interactive control may lie in the left region of the interface while the control signal (for instance the signal a smart glove produces for the control the eyeball dwells on) is marked in the right region; the mark of the control signal and the interactive control are then connected by a line, or shown in the same color, to indicate the binding relationship, so that the user intuitively knows which control signal is bound to which interactive control and can still operate the control through the control signal. The control signal may also be produced by an external trigger device, which can be any trigger switch, for example any number of contacts attached to clothing; a person's finger or any object clicking a contact triggers the corresponding control signal and realizes the human-computer interaction. Thus hardly any physical activity space is occupied, and other people are neither disturbed nor affected, which is very convenient.
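The circular attention range in this example (focal point as center, preset diameter of 10 units) reduces to a simple containment test, with the stated fallback of binding the control nearest the focal point when nothing lies inside. The sketch below is an illustrative assumption; control positions are taken as single points:

```python
import math

def controls_in_attention_range(focus, controls, diameter=10.0):
    """Return the controls inside the circular attention range around the focal point.

    If none is inside, fall back to the control nearest the focal point,
    as described in the text.
    """
    radius = diameter / 2.0
    inside = [name for name, pos in controls.items()
              if math.dist(focus, pos) <= radius]
    if inside:
        return inside
    nearest = min(controls, key=lambda name: math.dist(focus, controls[name]))
    return [nearest]

controls = {"send": (3.0, 4.0), "menu": (20.0, 0.0)}
print(controls_in_attention_range((0.0, 0.0), controls))  # ['send']
```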
When the interactive controls of the human-computer interaction interface are bound, each control can be bound in positional order: for example, the controls on the left side of the interface are bound to the left-hand smart glove and those on the right side to the right-hand smart glove; or the controls on the upper side are bound to the left-hand glove and those on the lower side to the right-hand glove; or the controls are bound one by one in their left-to-right positional order in the interface, or one by one in their top-to-bottom positional order. This helps the user form habitual memory and makes binding convenient, thereby facilitating the user's operation of the interactive controls.
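Binding by positional order, e.g. left to right, amounts to sorting the controls by coordinate and pairing them with the available signals in order. This is an illustrative sketch; the control names and coordinates are assumptions:

```python
def bind_left_to_right(controls, signals):
    """Pair controls with signals in left-to-right (x-coordinate) order."""
    ordered = sorted(controls, key=lambda c: c["x"])
    return {sig: ctl["name"] for sig, ctl in zip(signals, ordered)}

controls = [{"name": "save", "x": 30}, {"name": "open", "x": 10}, {"name": "close", "x": 20}]
print(bind_left_to_right(controls, ["thumb", "index", "middle"]))
# {'thumb': 'open', 'index': 'close', 'middle': 'save'}
```

Top-to-bottom order is the same idea with the y coordinate as the sort key.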
Each interactive control can also be bound according to the user's habits, including operating habits: for example, if the user habitually makes gestures with the right hand for various operations, each interactive control can be bound to right-hand gestures. Controls can likewise be bound according to the user's frequency of use, i.e., according to the number of times the user has used each interactive control over a certain period.
When there are multiple operating interfaces of browser web pages, software, or apps, or when the same human-computer interaction interface has multiple interactive controls, the interactive controls of the interface the user is currently operating are bound automatically, the bindings to the interactive controls of the previously operated interface are released, and the interactive controls of the current interface are bound anew. For example, suppose there are two operating interfaces: the human-computer interaction interface of a social application and that of a web browser. When the user switches from the web browser to the social application, the social application becomes the current interface; upon receiving the interface-switch signal, the control signals bound to the web browser's interactive controls are switched to bind the social application's interactive controls. Alternatively, a control signal bound to a current interactive control of the web browser can be switched to bind another interactive control of the web browser, making it convenient for the user to perform various operations after switching programs or switching interfaces.
When the user does not need to operate the currently bound interactive controls of the current interface and needs to operate its other interactive controls, a switch-binding signal is received: the interactive controls currently bound to the control signals are unbound, and the other remaining interactive controls are bound according to the first predefined rule. The switch-binding signal can be generated, for example, by a simultaneous click of both thumbs, or by another user action; when the user clicks with both thumbs at once, the switch-binding signal is received. The user can also define custom binding rules based on operating habits: for example, if the user often operates with right-hand gestures, the other interactive controls are bound to the control signals generated by the user's right-hand gestures; if the user habitually drags or scrolls, the other interactive controls are bound to the drag or scroll control signals. This is quick and convenient and improves the user experience.
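The switch-binding behaviour described above — unbind the currently bound controls, then bind the next set according to the first predefined rule — can be sketched by dividing the controls into groups and cycling through them. The class and variable names are illustrative assumptions:

```python
class GroupBinder:
    """Cycle a fixed set of control signals over groups of interactive controls."""

    def __init__(self, controls, group_size):
        # Divide all interactive controls of the interface into groups in advance.
        self.groups = [controls[i:i + group_size]
                       for i in range(0, len(controls), group_size)]
        self.current = 0

    def bound_controls(self):
        """The controls currently bound to the available control signals."""
        return self.groups[self.current]

    def on_switch_binding(self):
        """Switch-binding signal: unbind the current group and bind the next one."""
        self.current = (self.current + 1) % len(self.groups)
        return self.bound_controls()


controls = [f"control_{i}" for i in range(12)]
binder = GroupBinder(controls, group_size=5)
print(binder.bound_controls())     # the first group of five controls
print(binder.on_switch_binding())  # the next group of five controls
```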
Compared with existing human-computer interaction methods in extended reality, the present invention binds the interactive controls of the current human-computer interaction interface directly and automatically to control signals. The user does not need to move to a target control before operating it; instead, the system automatically binds each control signal to a corresponding trigger switch, and the user simply presses the trigger switch to issue the specified control signal, whereupon the corresponding control executes the corresponding action and completes the target operation. Control signals can therefore be issued from any position, with the switches attached to clothing, placed in a pocket, or attached to a fingertip. No large gestures or movements are needed to operate an extended-reality application, no large physical space is occupied, and other people are neither disturbed nor affected, which is very convenient.
The invention further relates to a processing terminal, a physical apparatus for realizing the above method, comprising:
a memory for storing program instructions; and
a processor for running the program instructions to execute the steps of the above human-computer interaction method for extended reality.
The embodiments disclosed in this specification merely illustrate individual aspects of the features of the present invention; the protection scope of the invention is not limited to these embodiments, and any other functionally equivalent embodiment falls within the protection scope of the invention. Those skilled in the art can make various other corresponding changes and variations according to the technical solutions and concepts described above, and all such changes and variations shall fall within the protection scope of the claims of the present invention.
Claims (10)
1. A human-computer interaction method for extended reality, characterized by comprising the following steps:
Step 1: identifying all interactive controls of a human-computer interaction interface;
Step 2: binding at least one interactive control to a corresponding control signal, establishing a one-to-one or one-to-many mapping between the interactive controls and the control signals, and operating the interactive controls through the control signals.
2. The human-computer interaction method for extended reality according to claim 1, characterized in that the interactive controls are bound to the corresponding control signals according to a first predefined rule.
3. The human-computer interaction method for extended reality according to claim 2, characterized in that the first predefined rule binds each interactive control according to its positional order in the human-computer interaction interface.
4. The human-computer interaction method for extended reality according to claim 2, characterized in that the first predefined rule binds each interactive control according to the user's frequency of use of the interactive controls.
5. The human-computer interaction method for extended reality according to claim 1, characterized by further comprising: when a switch-binding signal is received, releasing the binding between the current control signals and interactive controls, and automatically rebinding the remaining interactive controls of the human-computer interaction interface to control signals.
6. The human-computer interaction method for extended reality according to claim 1, characterized in that the control signal is input through a portable wearable device, a gesture recognition device, a speech recognition device, a handle, or a signal controller.
7. The human-computer interaction method for extended reality according to claim 1, characterized by further comprising, before step 1: identifying, through an eye-tracking device or a brain-wave recognition device, the attention range of the current user's eyeball or brain waves on the human-computer interaction interface, and judging whether any interactive control lies within the attention range; if so, executing step 2; otherwise, executing step 1.
8. The human-computer interaction method for extended reality according to claim 7, characterized in that the step of identifying, through the eye-tracking device or brain-wave recognition device, the attention range of the current user's eyeball or brain waves on the human-computer interaction interface is realized through the following steps: identifying, through the eye-tracking device or brain-wave recognition device, the focal point of the current user's eyeball or brain waves on the human-computer interaction interface, and setting an attention range on the human-computer interaction interface, the attention range being a virtual region formed by extension from the focal point as a reference point according to a second predefined rule.
9. The human-computer interaction method for extended reality according to claim 1, characterized in that the control signal is input through an external trigger device.
10. A processing terminal, comprising:
a memory for storing program instructions; and
a processor for running the program instructions to execute the steps of the human-computer interaction method for extended reality according to any one of claims 1-9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910094783.7A CN109782920A (en) | 2019-01-30 | 2019-01-30 | Human-computer interaction method for extended reality and processing terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109782920A true CN109782920A (en) | 2019-05-21 |
Family
ID=66504029
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910094783.7A Pending CN109782920A (en) | 2019-01-30 | 2019-01-30 | Human-computer interaction method for extended reality and processing terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109782920A (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120113223A1 (en) * | 2010-11-05 | 2012-05-10 | Microsoft Corporation | User Interaction in Augmented Reality |
CN103631364A (en) * | 2012-08-20 | 2014-03-12 | 联想(北京)有限公司 | Control method and electronic device |
CN104639966A (en) * | 2015-01-29 | 2015-05-20 | 小米科技有限责任公司 | Method and device for remote control |
CN105389003A (en) * | 2015-10-15 | 2016-03-09 | 广东欧珀移动通信有限公司 | Control method and apparatus for application in mobile terminal |
CN106774399A (en) * | 2016-12-26 | 2017-05-31 | 深圳市道通智能航空技术有限公司 | The control method of VR equipment, device and remote control |
CN107358953A (en) * | 2017-06-30 | 2017-11-17 | 努比亚技术有限公司 | Sound control method, mobile terminal and storage medium |
CN107957774A (en) * | 2016-10-18 | 2018-04-24 | 阿里巴巴集团控股有限公司 | Exchange method and device in virtual reality space environment |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112527112A (en) * | 2020-12-08 | 2021-03-19 | 中国空气动力研究与发展中心计算空气动力研究所 | Multi-channel immersive flow field visualization man-machine interaction method |
CN112527112B (en) * | 2020-12-08 | 2023-05-02 | 中国空气动力研究与发展中心计算空气动力研究所 | Visual man-machine interaction method for multichannel immersion type flow field |
CN112764834A (en) * | 2021-01-21 | 2021-05-07 | 乐聚(深圳)机器人技术有限公司 | Control action binding method, device, equipment and storage medium |
CN113655927A (en) * | 2021-08-24 | 2021-11-16 | 亮风台(上海)信息科技有限公司 | Interface interaction method and device |
CN113655927B (en) * | 2021-08-24 | 2024-04-26 | 亮风台(上海)信息科技有限公司 | Interface interaction method and device |
CN114997186A (en) * | 2021-09-02 | 2022-09-02 | 荣耀终端有限公司 | Control method of translation control and electronic equipment |
WO2023173726A1 (en) * | 2022-03-16 | 2023-09-21 | 北京字跳网络技术有限公司 | Interaction method and apparatus, and storage medium |
CN114443038A (en) * | 2022-04-08 | 2022-05-06 | 绿城科技产业服务集团有限公司 | Zero code configurable display system based on browser |
CN114443038B (en) * | 2022-04-08 | 2023-08-18 | 绿城科技产业服务集团有限公司 | Zero code configurable display system based on browser |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109782920A (en) | Human-computer interaction method for extended reality and processing terminal | |
US11782511B2 (en) | Tactile glove for human-computer interaction | |
CN106843739B (en) | Display control method for a mobile terminal, and mobile terminal | |
Lindeman et al. | Hand-held windows: towards effective 2D interaction in immersive virtual environments | |
US8866781B2 (en) | Contactless gesture-based control method and apparatus | |
CN109791468A (en) | User interface for both hands control | |
Song et al. | GaFinC: Gaze and Finger Control interface for 3D model manipulation in CAD application | |
Strohmeier et al. | zPatch: Hybrid resistive/capacitive etextile input | |
DE202017105307U1 (en) | Switching active objects in an augmented reality and / or virtual reality environment | |
CN104331154B (en) | Human-computer interaction method and system realizing contactless mouse control | |
CN103793057A (en) | Information processing method, device and equipment | |
KR101807655B1 (en) | Method, device, system and non-transitory computer-readable recording medium for providing user interface | |
CN103823548B (en) | Electronic equipment, wearable device, control system and method | |
US20160085296A1 (en) | Wearable input device | |
KR20130099570A (en) | System and method for implemeting 3-dimensional user interface | |
US20240077948A1 (en) | Gesture-based display interface control method and apparatus, device and storage medium | |
Radhakrishnan et al. | Finger-based multitouch interface for performing 3D CAD operations | |
CN108027648A (en) | Gesture input method for a wearable device, and wearable device | |
CN108027655A (en) | Information processing system, information processing equipment, control method and program | |
CN105224089A (en) | Gesture operation method and device, mobile terminal | |
Wolf et al. | Performance envelopes of in-air direct and smartwatch indirect control for head-mounted augmented reality | |
WO2021173839A1 (en) | Hand gesture input for wearable system | |
CN107450717A (en) | Information processing method and wearable device | |
CN104573459A (en) | Interacting method, interacting device and user equipment | |
TW202004433A (en) | Control instruction input method and apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20190521 |