CN114356069A - Interaction method and device, equipment and storage medium - Google Patents


Info

Publication number
CN114356069A
CN114356069A (Application CN202011091797.2A)
Authority
CN
China
Prior art keywords
acceleration
control end
control
instruction
distance
Prior art date
Legal status
Pending
Application number
CN202011091797.2A
Other languages
Chinese (zh)
Inventor
詹雪超
赵华球
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority application: CN202011091797.2A
Publication: CN114356069A
Legal status: Pending

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an interaction method, an interaction apparatus, an interaction device, and a computer-readable storage medium. The interaction method is used in an interaction device that communicates with a terminal device and includes a first control end and a second control end. The method comprises: obtaining a first acceleration of the first control end, a second acceleration of the second control end, and the distance between the first control end and the second control end; generating a control instruction according to the first acceleration, the second acceleration, and the distance; and sending the control instruction to the terminal device to control a display picture of the terminal device. In this way, the user can generate a variety of control instructions through the interaction device to interact with the terminal device; the operation is flexible and simple, and the interaction experience is enhanced.

Description

Interaction method and device, equipment and storage medium
Technical Field
The present application relates to computer technologies, and in particular, to an interaction method, an interaction apparatus, an interaction device, and a computer-readable storage medium.
Background
In the related art, the picture displayed by Augmented Reality (AR) or Virtual Reality (VR) technology is a virtual image projected by an AR or VR device. A user can interact with this virtual image using gestures or by operating a peripheral control device paired with the AR or VR device. However, when interacting through gestures, the recognition accuracy of the AR or VR device depends on the precision of the user's gestures, which places high demands on the user's operation; frequent gesture operation also easily causes fatigue, resulting in a poor user experience.
Disclosure of Invention
The application provides an interaction method, which is used for interaction equipment, wherein the interaction equipment is communicated with terminal equipment, the interaction equipment comprises a first control end and a second control end, and the interaction method comprises the following steps:
acquiring a first acceleration of the first control end, a second acceleration of the second control end and a distance between the first control end and the second control end;
generating a control instruction according to the first acceleration, the second acceleration and the distance between the first control end and the second control end; and
sending the control instruction to the terminal device to control a display picture of the terminal device.
The application also provides an interaction apparatus for use in an interaction device. The interaction device communicates with a terminal device and comprises a first control end and a second control end. The interaction apparatus comprises:
an acquisition module, configured to acquire a first acceleration of the first control end, a second acceleration of the second control end, and a distance between the first control end and the second control end;
a generating module, configured to generate a control instruction according to the first acceleration, the second acceleration, and the distance between the first control end and the second control end; and
a sending module, configured to send the control instruction to the terminal device to control a display picture of the terminal device.
The present application further provides an interactive device. The interactive device communicates with a terminal device and comprises a first control end, a second control end, a processor, and a communication module; the first control end comprises a magnetic module and a first acceleration sensor, and the second control end comprises a Hall element and a second acceleration sensor. The processor is configured to:
acquire a first acceleration of the first control end detected by the first acceleration sensor, a second acceleration of the second control end detected by the second acceleration sensor, and a distance between the first control end and the second control end obtained by the Hall element detecting the magnetic module; and
generate a control instruction according to the first acceleration, the second acceleration, and the distance between the first control end and the second control end.
the communication module is configured to:
send the control instruction to the terminal device to control a display picture of the terminal device.
The present application also provides an interactive device comprising one or more processors and a memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the programs comprising instructions for performing the interaction method. The interaction method comprises: acquiring a first acceleration of the first control end, a second acceleration of the second control end, and a distance between the first control end and the second control end; generating a control instruction according to the first acceleration, the second acceleration, and the distance; and sending the control instruction to the terminal device to control a display picture of the terminal device.
The present application further provides a non-transitory computer-readable storage medium containing a computer program which, when executed by one or more processors, causes the processors to perform the interaction method. The interaction method comprises: acquiring a first acceleration of the first control end, a second acceleration of the second control end, and a distance between the first control end and the second control end; generating a control instruction according to the first acceleration, the second acceleration, and the distance; and sending the control instruction to the terminal device to control a display picture of the terminal device.
In the interaction method, the interaction apparatus, the interaction device, and the computer-readable storage medium of the embodiments of the application, the distance between the first control end and the second control end, the first acceleration of the first control end, and the second acceleration of the second control end are acquired; according to the directions and magnitudes of the two accelerations and the distance, a variety of interaction instructions can be generated to interact with the terminal device, thereby controlling the display picture of the terminal device.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic flow chart diagram of an interaction method in some embodiments of the present application.
FIG. 2 is a block diagram of an interaction device according to some embodiments of the present application.
FIG. 3 is a block diagram of an interaction device according to some embodiments of the present application.
FIG. 4 is a block diagram of an interaction device according to some embodiments of the present application.
FIG. 5 is a schematic diagram of a connection between a processor and a computer-readable storage medium according to some embodiments of the present application.
FIG. 6 is a schematic diagram of a scenario of an interaction method according to some embodiments of the present application.
FIG. 7 is a flow chart diagram illustrating an interaction method according to some embodiments of the present application.
FIG. 8 is a schematic diagram of a scenario of an interaction method according to some embodiments of the present application.
FIG. 9 is a schematic diagram of a scenario of an interaction method according to some embodiments of the present application.
FIG. 10 is a flow chart illustrating an interaction method according to some embodiments of the present application.
FIG. 11 is a flow chart illustrating an interaction method according to some embodiments of the present application.
FIG. 12 is a flow chart illustrating an interaction method according to some embodiments of the present application.
FIG. 13 is a schematic diagram of a scenario of an interaction method according to some embodiments of the present application.
FIG. 14 is a flow chart illustrating an interaction method according to some embodiments of the present application.
FIG. 15 is a schematic diagram of a scenario of an interaction method according to some embodiments of the present application.
FIG. 16 is a flow chart illustrating an interaction method according to some embodiments of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative and are only for the purpose of explaining the present application and are not to be construed as limiting the present application.
In the description of the present application, it is to be understood that the terms "first", "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any number of technical features indicated. Thus, features defined as "first", "second", may explicitly or implicitly include one or more of the described features. In the description of the present application, "a plurality" means two or more unless specifically defined otherwise.
In the description of the present application, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly, e.g., as a fixed connection, a removable connection, or an integral connection; as a mechanical connection, an electrical connection, or mutual communication; and as a direct connection or an indirect connection through intervening media. The specific meanings of the above terms in the present application can be understood by those of ordinary skill in the art as the case may be.
The following disclosure provides many different embodiments or examples for implementing different features of the application. In order to simplify the disclosure of the present application, specific example components and arrangements are described below. Of course, they are merely examples and are not intended to limit the present application. Moreover, the present application may repeat reference numerals and/or letters in the various examples, such repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed.
Referring to fig. 1, an interaction method provided in an embodiment of the present application is used for an interaction device, where the interaction device communicates with a terminal device, the interaction device includes a first control end and a second control end, and the interaction method includes:
s12: acquiring a first acceleration of a first control end, a second acceleration of a second control end and a distance between the first control end and the second control end;
s14: generating a control instruction according to the first acceleration, the second acceleration and the distance between the first control end and the second control end; and
s16: and sending a control instruction to the terminal equipment to control the display screen of the terminal equipment.
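As an illustrative sketch only (not the patent's implementation), steps s12, s14, and s16 can be read as one acquire-generate-send cycle; the function names and the dictionary format below are assumptions:

```python
def interaction_step(read_a1, read_a2, read_distance, generate, send):
    """One cycle of the interaction method: s12 acquire, s14 generate, s16 send."""
    # s12: acquire both accelerations and the distance between the control ends
    a1, a2, distance = read_a1(), read_a2(), read_distance()
    # s14: generate a control instruction from the three measurements
    instruction = generate(a1, a2, distance)
    # s16: send the instruction to the terminal device to drive its display
    send(instruction)
    return instruction
```

In a real device, the three readers would wrap the two acceleration sensors and the Hall element, and `send` would wrap the communication module.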
Referring further to fig. 2, the present embodiment provides an interaction apparatus 10. The interaction apparatus 10 includes an acquisition module 12, a generation module 14, and a sending module 16.
Step S12 may be implemented by the acquisition module 12, step S14 by the generation module 14, and step S16 by the sending module 16.
That is, the acquisition module 12 may be configured to obtain a first acceleration of the first control end, a second acceleration of the second control end, and a distance between the first control end and the second control end.
The generating module 14 may be configured to generate a control command according to the first acceleration and the second acceleration and a distance between the first control end and the second control end.
The sending module 16 may be configured to send a control instruction to the terminal device to control a display screen of the terminal device.
Referring to fig. 3, an interactive device 1 is provided in an embodiment of the present application, where the interactive device 1 communicates with a terminal device. The interaction method of the present application may be performed by the interactive device 1. The interactive device 1 comprises a first control end 20, a second control end 30, a processor 40, and a communication module 50.
The processor may be configured to obtain a first acceleration of the first control end 20, a second acceleration of the second control end 30, and a distance between the first control end 20 and the second control end 30, and generate a control command according to the first acceleration, the second acceleration, and the distance between the first control end 20 and the second control end 30. The communication module 50 may be configured to send a control instruction to the terminal device to control a display screen of the terminal device.
Referring to fig. 4, the present application provides an interactive device 1 comprising one or more processors 40 and a memory 60; and one or more programs 62, wherein the one or more programs 62 are stored in the memory 60 and configured to be executed by the one or more processors 40, the programs 62 comprising instructions for performing the interaction method.
Referring to fig. 5, the present application provides a non-transitory computer-readable storage medium 70 containing a computer program 72 which, when executed by one or more processors 40, causes the processors 40 to perform the interaction method.
In the interaction method, the interaction apparatus 10, the interaction device 1, and the computer-readable storage medium 70 of these embodiments, the distance between the first control end 20 and the second control end 30, the first acceleration of the first control end 20, and the second acceleration of the second control end 30 are acquired; according to the directions and magnitudes of the two accelerations and the distance, a variety of interaction instructions can be generated to interact with the terminal device, thereby controlling the display picture of the terminal device. The user can thus interact with the terminal device through the interaction device 1 with little operating difficulty, making the operation simple.
In some implementations, the interaction device may be a mobile phone, a tablet computer, or a wearable electronic device such as a watch, a bracelet, or a ring. The terminal device may be a mobile phone, a computer, a smart wearable device (a smart watch, smart bracelet, smart helmet, smart glasses, etc.), a virtual reality device, an augmented reality device, or the like.
Referring to fig. 6, in this embodiment the interactive device 1 is exemplified as a pair of rings: the first control end 20 and the second control end 30 are both rings and can each be worn on a finger of the user. That is, the interaction method and the interaction apparatus 10 can be applied to, but are not limited to, rings. Such an interactive device is small in size and easy to carry.
In some embodiments, the interaction apparatus 10 may be part of the interaction device 1; in other words, the interaction device 1 includes the interaction apparatus 10.
In some embodiments, the interaction apparatus 10 may be discrete components assembled so as to have the aforementioned functions, an integrated circuit in a chip having the aforementioned functions, or computer software code segments which, when run on a computer, cause the computer to have the aforementioned functions.
As hardware, the interaction apparatus 10 may be a stand-alone or add-on peripheral component of a computer or computer system. The interaction apparatus 10 may also be integrated into a computer or computer system; for example, when the interaction apparatus 10 is part of the interaction device 1, it may be integrated into the processor 40.
In some embodiments where the interaction apparatus 10 is part of the interaction device 1 as software, code segments corresponding to the interaction apparatus 10 may be stored in the memory 60 and executed by the processor 40 to implement the aforementioned functions. Alternatively, the interaction apparatus 10 may comprise one or more of the programs 62 described above, or one or more of the programs 62 described above may comprise the interaction apparatus 10.
In some embodiments, the computer-readable storage medium 70 may be a storage medium built into the interaction device 1, and may be, for example, the memory 60.
In some embodiments, the communication module 50 may be a wireless module such as a Bluetooth module, a Wi-Fi module, or a LoRa module, enabling wireless communication with the terminal device.
In some embodiments, the display screen of the terminal device may be a screen displayed by a display screen of the terminal device, or may be a screen projected to a space or a plane by the terminal device.
It should be noted that a control instruction refers to content generated during the interaction between the user and the interactive device 1. Such content may include, but is not limited to, interactive actions such as moving, sliding, tapping, and pressing. Each interactive action may correspond to a control instruction; for example, a control instruction may be generated while the user presses the first control end 20 or the second control end 30, or by moving the first control end 20 or the second control end 30.
When the user performs an interactive action such as moving, sliding, tapping, pressing, etc. on the interactive device 1, the processor 40 may generate a corresponding control instruction according to the state of the interactive device 1. It is understood that, since the state of the interactive device 1 changes when the user moves, slides or presses the interactive device 1, the processor 40 may generate a corresponding control instruction according to the state change of the interactive device 1.
For example, when the user moves the first control end 20 to approach the second control end 30, the acceleration of the first control end 20 changes, and the distance between the first control end 20 and the second control end 30 also changes, so that the processor 40 can generate the control command according to the change of the acceleration and the change of the distance.
Referring to fig. 3, in detail, the first control end 20 includes a magnetic module 21 and a first acceleration sensor 22, and the second control end 30 includes a Hall element 31 and a second acceleration sensor 32. The first acceleration sensor 22 is configured to detect a first acceleration of the first control end 20; the second acceleration sensor 32 is configured to detect a second acceleration of the second control end 30; and the Hall element 31 is configured to detect magnetic induction.
The magnetic module 21 is an element capable of generating a magnetic field, and may be made of a hard magnetic material or a soft magnetic material. The hall element 31 is within the range of the magnetic field generated by the magnetic module 21, that is, the hall element 31 can sense magnetism generated by the magnetic module 21. The first acceleration sensor 22 and the second acceleration sensor 32 can detect acceleration in any direction.
It should be further noted that the processor 40 may be preset with a mapping function that maps the magnetic induction detected by the Hall element 31 to a corresponding distance; that is, there is a mapping relationship between the magnetic induction detected by the Hall element 31 and the distance between the Hall element 31 and the magnetic module 21. It can be understood that the magnetic induction detected by the Hall element 31 is negatively correlated with this distance: the larger the magnetic induction, the smaller the distance. Since the Hall element 31 is disposed at the second control end 30 and the magnetic module 21 at the first control end 20, the processor 40 can thereby determine the distance between the first control end 20 and the second control end 30.
As will be understood by those skilled in the art, a mapping function refers to a function for mapping one data into another data according to a certain mapping relationship, and the mapping result is the other data. For example, there is a function y and data a, data a can be mapped to data B by function y, then function y is a mapping function, and data B is a mapping result.
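As a hypothetical illustration of such a mapping function (the patent does not give one), assume the flux density B measured by the Hall element falls off with distance d as B = k/d³, the far-field behaviour of a magnetic dipole; the inverse mapping and the calibration constant k are assumptions:

```python
def distance_from_flux(b_measured, k=8.0):
    """Map Hall-element flux density to the distance between the control ends.

    Assumes the dipole far-field model B = k / d**3, so d = (k / B)**(1/3);
    k is a hypothetical calibration constant. Flux and distance are
    negatively correlated, as the description requires.
    """
    if b_measured <= 0:
        raise ValueError("flux density must be positive within sensing range")
    return (k / b_measured) ** (1.0 / 3.0)
```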
Further, the processor 40 may acquire the first acceleration and the second acceleration from the first acceleration sensor 22 and the second acceleration sensor 32, process them together with the distance between the first control end 20 and the second control end 30 to generate a corresponding control instruction, and then send the control instruction to the terminal device through the communication module 50 to control the display picture of the terminal device, finally realizing interaction with the terminal device.
Referring to fig. 7, in some embodiments, step S14 includes the steps of:
s141: determining the magnitudes and directions of the first acceleration and the second acceleration and the distance between the first control end and the second control end;
s142: when the first acceleration is larger than a first preset value and the direction is parallel to the first plane, and the second acceleration is smaller than or equal to the first preset value and the direction is parallel to the first plane, generating a first moving instruction; or
s143: when the first acceleration is smaller than or equal to the first preset value and its direction is parallel to the first plane, and the second acceleration is larger than the first preset value and its direction is parallel to the first plane, generating the first moving instruction.
Step S16 includes the steps of:
s161: sending the first moving instruction to the terminal device to control the target object in the display picture to move according to the direction of the first acceleration and the distance between the first control end and the second control end.
Referring further to fig. 2, in some embodiments, the generating module 14 includes a determining unit 141, a first generating unit 142, and a second generating unit 143. Step S141 may be implemented by the determining unit 141, step S142 by the first generating unit 142, and step S143 by the second generating unit 143. Step S161 may be implemented by the sending module 16.
Alternatively, the determination unit 141 is configured to determine the magnitude and direction of the first acceleration and the second acceleration and the magnitude of the distance.
The first generating unit 142 is configured to generate a first movement instruction when the first acceleration is greater than a first preset value and the direction is parallel to the first plane, and the second acceleration is less than or equal to the first preset value and the direction is parallel to the first plane.
The second generating unit 143 is configured to generate a first moving instruction when the first acceleration is smaller than or equal to a first preset value and the direction is parallel to the first plane, and the second acceleration is larger than the first preset value and the direction is parallel to the first plane.
The sending module 16 is configured to send the first moving instruction to the terminal device to control the target object in the display picture to move according to the direction of the first acceleration or the second acceleration and the distance between the first control end 20 and the second control end 30.
In some embodiments, the processor 40 is configured to determine the magnitudes and directions of the first acceleration and the second acceleration and the magnitude of the distance. The processor 40 is further configured to generate a first movement instruction when the first acceleration is greater than the first preset value with its direction parallel to the first plane and the second acceleration is less than or equal to the first preset value, or when the first acceleration is less than or equal to the first preset value and the second acceleration is greater than the first preset value with its direction parallel to the first plane. The communication module 50 may be further configured to send the first movement instruction to the terminal device to control the target object in the display picture to move according to the direction of the first acceleration and the distance between the first control end 20 and the second control end 30.
In the initial state, the first control end 20 and the second control end 30 lie on a first plane. The first plane may be a horizontal plane, a vertical plane, a plane parallel or perpendicular to the display picture, or the like; it is not specifically limited. For example, in some embodiments the first plane may be a horizontal plane. The first preset value is a value preset in the interactive device 1 for comparison with the first acceleration and the second acceleration; it is greater than zero, and its specific value is not limited. If the first acceleration or the second acceleration is greater than the first preset value, it is determined that the first control end 20 or the second control end 30, respectively, moves in a direction parallel to the first plane.
Please refer to figs. 6, 8, and 9. It should be further noted that the target object may be a movable cursor point used to select an area of the display picture. The relative position of the first control end 20 and the second control end 30 maps to the relative position of the target object and the center of the display picture; for example, the distance between the target object and the center of the display picture may be a fixed multiple of the distance between the first control end 20 and the second control end 30. If the distance between the two control ends is zero, the target object is at the center of the display picture; if it is greater than zero, the distance between the target object and the center is also greater than zero, in that same multiple relationship.
Further, after the terminal device receives the first moving instruction, the target object may be controlled to move according to the relative position relationship between the first control end 20 and the second control end 30, so that the relative position between the target object and the center of the display screen and the relative position between the first control end 20 and the second control end 30 form a mapping relationship.
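A minimal sketch of that mapping, with the screen centre coordinates, scale factor, and direction representation all assumed for illustration:

```python
def cursor_position(centre, distance, direction, scale=40.0):
    """Place the target object at a fixed multiple (`scale`) of the
    distance between the control ends, offset from the display centre
    along the unit vector `direction` in the first plane."""
    cx, cy = centre
    dx, dy = direction
    return (cx + scale * distance * dx, cy + scale * distance * dy)
```

A zero distance puts the target object exactly at the display centre, and the offset grows in the same multiple relationship as the distance, matching the description above.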
Specifically, the processor 40 obtains the first acceleration from the first acceleration sensor 22 and the second acceleration from the second acceleration sensor 32, determines their magnitudes and directions, and obtains the distance between the first control end 20 and the second control end 30. If the first acceleration is less than or equal to the first preset value while the second acceleration is greater than the first preset value with its direction parallel to the first plane, or the second acceleration is less than or equal to the first preset value while the first acceleration is greater than the first preset value with its direction parallel to the first plane, the processor 40 generates a first movement instruction according to the distance between the two control ends and the direction of the first or second acceleration, and sends it to the terminal device through the communication module 50.
It can be understood that when the first acceleration is greater than the first preset value, the direction of the first acceleration is parallel to the first plane, and the second acceleration is less than or equal to the first preset value, the first control end 20 is considered to be moving toward or away from the second control end 30, and the second control end 30 is considered to be stationary. When the first acceleration is less than or equal to the first preset value, the second acceleration is greater than the first preset value, and the direction of the second acceleration is parallel to the first plane, it can be considered that the first control end 20 is stationary and the second control end 30 is moving toward or away from the first control end 20.
Further, after the terminal device receives the first moving instruction, the target object may be controlled to move according to the relative position relationship between the first control end 20 and the second control end 30, so that the relative position between the target object and the center of the display screen corresponds to the relative position between the first control end 20 and the second control end 30.
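The condition above for generating the first movement instruction — exactly one control end accelerating parallel to the first plane while the other stays at or below the first preset value — can be sketched as follows. All names and parameters are illustrative assumptions, not taken from the patent:

```python
def should_generate_first_move(a1: float, a2: float,
                               a1_parallel: bool, a2_parallel: bool,
                               first_preset: float) -> bool:
    """True when exactly one control end moves parallel to the first plane
    while the other end is considered stationary (acceleration <= preset)."""
    end1_moves = a1 > first_preset and a1_parallel and a2 <= first_preset
    end2_moves = a2 > first_preset and a2_parallel and a1 <= first_preset
    return end1_moves or end2_moves
```

Note that if both ends exceed the first preset value, neither branch holds and no first movement instruction is generated; that situation falls under the other instructions described below.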
Referring to fig. 10, in some embodiments, step S16 further includes the steps of:
s162: if the first acceleration is greater than a second preset value and its direction is perpendicular to the first plane, and the second acceleration is less than or equal to the second preset value, generating a click command and sending the click command to the terminal device to control clicking the area where the target object is located;
s163: if the second acceleration is greater than the second preset value and its direction is perpendicular to the first plane, and the first acceleration is less than or equal to the second preset value, generating the click command and sending the click command to the terminal device to control clicking the area where the target object is located.
In some embodiments, step S162 and step S163 may be implemented by the generating module 14 and the sending module 16. In other words, the generating module 14 may be configured to generate a click command if the first acceleration is greater than the second preset value with its direction perpendicular to the first plane and the second acceleration is less than or equal to the second preset value, or if the second acceleration is greater than the second preset value with its direction perpendicular to the first plane and the first acceleration is less than or equal to the second preset value; the sending module 16 may be configured to send the click command to the terminal device to control clicking the area where the target object is located.
In some embodiments, the processor 40 is configured to generate a click command and send the click command to the terminal device to control an area where the click target object is located, if the first acceleration is greater than a second preset value, and the direction of the first acceleration is perpendicular to the first plane, and the second acceleration is less than or equal to the second preset value. The processor 40 may be further configured to generate a click command and send the click command to the terminal device to control an area where the click target object is located if the second acceleration is greater than the second preset value and the direction is perpendicular to the first plane, and the first acceleration is less than or equal to the second preset value.
In the present embodiment, the interaction device 1 may generate a click command when either the first control end 20 or the second control end 30 moves in a direction perpendicular to the first plane. If the terminal device receives the click command, it can control clicking the position of the target object in the display screen.
It can be understood that a control end whose acceleration does not exceed the second preset value is considered to be stationary. Therefore, if the processor 40 determines that the first acceleration is greater than the second preset value with its direction perpendicular to the first plane, and the second acceleration is less than or equal to the second preset value, the first control end 20 moves along the direction perpendicular to the first plane while the second control end 30 is stationary. If the processor 40 determines that the second acceleration is greater than the second preset value with its direction perpendicular to the first plane, and the first acceleration is less than or equal to the second preset value, the second control end 30 moves along the direction perpendicular to the first plane while the first control end 20 is stationary. In either case, the processor 40 may generate a click command and send it to the terminal device through the communication module 50 to control clicking the area where the target object is located.
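The click condition — a "tap" by one end perpendicular to the first plane while the other end is stationary — can be sketched in the same illustrative style (names and parameters assumed):

```python
def should_generate_click(a1: float, a2: float,
                          a1_perpendicular: bool, a2_perpendicular: bool,
                          second_preset: float) -> bool:
    """True when one control end taps perpendicular to the first plane while
    the other end stays at or below the second preset value (stationary)."""
    tap1 = a1 > second_preset and a1_perpendicular and a2 <= second_preset
    tap2 = a2 > second_preset and a2_perpendicular and a1 <= second_preset
    return tap1 or tap2
```

The symmetry mirrors steps S162 and S163: either end can perform the tap, and both ends moving together is instead treated as a zoom gesture in the next section.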
Referring to fig. 11, in some embodiments, step S14 further includes the steps of:
s144: generating a first zooming instruction when the distance between the first control end and the second control end is greater than zero, the first acceleration and the second acceleration are greater than a second preset value, and their directions are perpendicular to the first plane.
Step S16 further includes the steps of:
s164: sending the first zooming instruction to the terminal device to control the area where the target object is located to zoom according to the change in the distance between the first control end and the second control end.
Referring further to fig. 2, in some embodiments, step S144 may be implemented by the generating module 14, and step S164 may be implemented by the sending module 16.
In other words, the generating module 14 is configured to generate a first zooming instruction when the distance is greater than zero, the first acceleration and the second acceleration are both greater than the second preset value, and their directions are perpendicular to the first plane. The sending module 16 is configured to send the first zooming instruction to the terminal device to control the area where the target object is located to zoom according to the change in the distance between the first control end 20 and the second control end 30.
In some embodiments, the processor 40 may be configured to generate a first zooming instruction when the distance is greater than zero, the first acceleration and the second acceleration are greater than a second preset value, and the direction is perpendicular to the first plane, and the processor 40 may be further configured to send the first zooming instruction to the terminal device to control the area where the target object is located to zoom according to the change of the distance between the first control end 20 and the second control end 30.
The second preset value is a value preset by the interaction device 1 for comparison with the first acceleration and the second acceleration. The second preset value is greater than zero; its specific value is not limited, and it may be the same as or different from the first preset value. If the first acceleration or the second acceleration is greater than the second preset value, it is determined that the first control end 20 or the second control end 30 moves in the direction perpendicular to the first plane.
In the interaction device 1 and the terminal device of this embodiment, the first control end 20 and the second control end 30 moving simultaneously in the direction perpendicular to the first plane while the distance between them is greater than zero is defined as triggering a first zooming instruction. After receiving the first zooming instruction, the terminal device may control the area where the target object is located to zoom in or out according to the change in the distance between the first control end 20 and the second control end 30.
Specifically, if the processor 40 determines that the distance between the first control end 20 and the second control end 30, obtained by the hall element 31 detecting the magnetic module 21, is greater than zero, and that the first acceleration and the second acceleration are both greater than the second preset value in the direction perpendicular to the first plane, this indicates that the first control end 20 and the second control end 30 move simultaneously in the direction perpendicular to the first plane. The processor 40 then generates a first zooming instruction, and the communication module 50 sends it to the terminal device. After receiving the first zooming instruction, the terminal device can control the area where the target object is located to be enlarged or reduced according to the change in the distance between the first control end 20 and the second control end 30.
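The first-zoom condition above can be sketched as a predicate over the sensed values; as before, the function and parameter names are illustrative assumptions:

```python
def should_generate_first_zoom(distance: float, a1: float, a2: float,
                               both_perpendicular: bool,
                               second_preset: float) -> bool:
    """True when the two control ends are held apart (distance > 0) and both
    move perpendicular to the first plane above the second preset value."""
    return (distance > 0
            and a1 > second_preset
            and a2 > second_preset
            and both_perpendicular)
```

The zoom magnitude itself is not decided here; per the description, the terminal device scales the target area according to the subsequent change in the measured distance.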
Referring to fig. 12, in some embodiments, step S14 includes the steps of:
s145: generating a second movement instruction when the distance between the first control end and the second control end is zero, and the first acceleration and the second acceleration are equal, greater than the first preset value, and less than a third preset value.
Step S16 further includes the steps of:
s165: sending the second movement instruction to the terminal device to control the target object in the display screen to move according to the direction and the magnitude of the first acceleration.
Referring further to fig. 2, in some embodiments, step S145 may be implemented by the generation module 14. Step S165 may be implemented by the sending module 16.
Or, the generating module 14 may be configured to generate the second moving instruction when the distance between the first control end 20 and the second control end 30 is zero, and the first acceleration and the second acceleration are equal to each other and are greater than the first preset value and less than the third preset value.
The sending module 16 may be configured to send a second movement instruction to the terminal device to control the target object in the display screen to move according to the direction and the magnitude of the first acceleration.
In some embodiments, the processor 40 is configured to generate a second movement instruction when the distance between the first control end 20 and the second control end 30 is zero and the first acceleration and the second acceleration are equal, greater than the first preset value, and less than the third preset value, and the communication module 50 is configured to send the second movement instruction to the terminal device to control the target object in the display screen to move according to the direction and the magnitude of the first acceleration.
It should be noted that the third preset value is a preset value of the interaction device 1, and is used for comparing with the first acceleration and the second acceleration, and the third preset value is greater than the first preset value, and specifically, the value is not limited.
Referring to fig. 13, in the interaction device 1 and the terminal device of the present embodiment, the situation in which the distance between the first control end 20 and the second control end 30 is equal to zero, the two control ends move simultaneously, and their acceleration is between the first preset value and the third preset value is defined as a second movement instruction. After receiving the second movement instruction, the terminal device may control the target object in the display screen to move according to the direction and magnitude of the first acceleration or the second acceleration.
Specifically, if the processor 40 determines that the distance between the first control end 20 and the second control end 30 is equal to zero, that the first acceleration and the second acceleration are the same, and that both are greater than the first preset value and less than the third preset value in the direction parallel to the first plane, this indicates that the first control end 20 and the second control end 30 move simultaneously in the direction parallel to the first plane. The processor 40 then generates a second movement instruction, and the communication module 50 sends it to the terminal device. After receiving the second movement instruction, the terminal device can control the target object to move according to the direction and magnitude of the first acceleration.
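A minimal sketch of the second-movement condition, under the same assumed naming conventions as the earlier sketches:

```python
def should_generate_second_move(distance: float, a1: float, a2: float,
                                first_preset: float,
                                third_preset: float) -> bool:
    """True when the control ends are held together (distance == 0) and moved
    with equal, moderate acceleration: above the first preset value but below
    the third preset value."""
    return distance == 0 and a1 == a2 and first_preset < a1 < third_preset
```

The upper bound matters: the same held-together gesture with acceleration at or above the third preset value is instead interpreted as the sliding instruction described later.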
Referring to fig. 14, in some embodiments, step S14 further includes:
s146: if the distance between the first control end and the second control end is greater than zero, the first acceleration and the second acceleration are greater than a first preset value, and the directions are opposite, a second zooming instruction is generated;
step S16 further includes:
s166: sending the second zooming instruction to the terminal device to control the area where the target object is located to zoom according to the change in the distance between the first control end and the second control end.
In certain embodiments, step S146 may be implemented by the generation module 14. Step S166 may be implemented by the sending module 16.
In other words, the generating module 14 may be configured to generate a second zooming instruction if the distance between the first control end 20 and the second control end 30 is greater than zero and the first acceleration and the second acceleration are greater than the first preset value and opposite in direction, and the sending module 16 may be configured to send the second zooming instruction to the terminal device to control the area where the target object is located to zoom according to the change in the distance between the first control end 20 and the second control end 30.
In some embodiments, the processor 40 may be configured to generate a second zooming instruction if the distance is greater than zero and the first acceleration and the second acceleration are greater than the first preset value and opposite in direction, and the processor 40 may be further configured to send the second zooming instruction to the terminal device to control the area where the target object is located to zoom according to the change in the distance between the first control end 20 and the second control end 30.
Referring to fig. 15, in the interaction device 1 and the terminal device of the present embodiment, after the processor 40 generates the second movement instruction, if the first control end 20 and the second control end 30 move in opposite directions so that the distance between them becomes greater than zero, a second zooming instruction is generated instead. After receiving the second zooming instruction, the terminal device may control the area where the target object is located to zoom in or out according to the change in the distance between the first control end 20 and the second control end 30.
It should be noted that the moving directions of the first control end 20 and the second control end 30 may be parallel to the first plane or perpendicular to the first plane; the moving directions are not limited. For example, in the present embodiment, the moving directions of the first control end 20 and the second control end 30 are parallel to the first plane.
Specifically, if the processor 40 determines that the distance between the first control end 20 and the second control end 30, obtained by the hall element 31 detecting the magnetic module 21, is greater than zero, that the first acceleration and the second acceleration are both greater than the first preset value in the direction parallel to the first plane, and that the direction of the first acceleration is opposite to that of the second acceleration, this indicates that the first control end 20 and the second control end 30 move in opposite directions in a plane parallel to the first plane. The processor 40 then generates the second zooming instruction, and the communication module 50 sends it to the terminal device. After receiving the second zooming instruction, the terminal device can control the area where the target object is located to be enlarged or reduced according to the change in the distance between the first control end 20 and the second control end 30.
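The second-zoom condition — a "pinch" gesture in which the two ends accelerate apart or together — can be sketched in the same hedged style (all names assumed):

```python
def should_generate_second_zoom(distance: float, a1: float, a2: float,
                                opposite_directions: bool,
                                first_preset: float) -> bool:
    """True when the control ends are separated (distance > 0) and both
    accelerate above the first preset value in opposite directions."""
    return (distance > 0
            and a1 > first_preset
            and a2 > first_preset
            and opposite_directions)
```

This differs from the first zooming instruction in the plane of motion and threshold used: here the ends move in opposite directions (e.g. within the first plane) past the first preset value, rather than together perpendicular to it past the second preset value.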
Referring to fig. 16, in some embodiments, step S14 further includes the steps of:
s147: when the distance between the first control end and the second control end is zero and the first acceleration and the second acceleration are greater than or equal to a third preset value, generating a sliding instruction;
step S16 further includes the steps of:
s167: and sending a sliding instruction to the terminal equipment to control the display picture to slide according to the direction of the first acceleration.
In certain embodiments, step S147 may be implemented by the generating module 14, and step S167 may be implemented by the sending module 16. In other words, the generating module 14 may be configured to generate the sliding instruction when the distance between the first control end 20 and the second control end 30 is zero and the first acceleration and the second acceleration are greater than or equal to the third preset value.
The sending module 16 may be configured to send a sliding instruction to the terminal device to control the display to slide according to the direction of the first acceleration.
In some embodiments, the processor 40 is further configured to generate a sliding instruction when the distance between the first control end 20 and the second control end 30 is zero and the first acceleration and the second acceleration are greater than or equal to the third preset value, and the processor 40 is further configured to send the sliding instruction to the terminal device to control the display screen to slide according to the direction of the first acceleration.
In the interaction device 1 and the terminal device of this embodiment, the first control end 20 and the second control end 30 moving simultaneously with an acceleration greater than or equal to the third preset value, while the distance between them obtained by the hall element 31 detecting the magnetic module 21 is zero, is defined as triggering a sliding instruction. That is, if the processor 40 detects that the distance between the first control end and the second control end is zero and that the first acceleration and the second acceleration are greater than or equal to the third preset value, a sliding instruction is generated. After receiving the sliding instruction, the terminal device can control the display screen to slide according to the direction of the first acceleration or the second acceleration.
It should be noted that each time the terminal device receives a sliding instruction, the display screen slides in the same direction as the first acceleration or the second acceleration.
Specifically, if the processor 40 determines that the distance between the first control end 20 and the second control end 30, obtained by the hall element 31 detecting the magnetic module 21, is equal to zero, and that the first acceleration and the second acceleration are the same and greater than or equal to the third preset value in the direction parallel to the first plane, this indicates that the first control end 20 and the second control end 30 move simultaneously in the direction parallel to the first plane. The processor 40 then generates a sliding instruction, and the communication module 50 sends it to the terminal device. After receiving the sliding instruction, the terminal device can control the display screen to slide according to the direction of the first acceleration or the second acceleration.
In some embodiments, the sliding distance of the display screen may be proportional to the magnitude of the first acceleration or the second acceleration; that is, the greater the acceleration, the longer the sliding distance. For example, if the direction of the first acceleration or the second acceleration is leftward and its magnitude is 0.5 m/s², the display screen slides leftward by a distance of 5 cm.
In other embodiments, the sliding distance of the display screen may also be a preset distance, that is, the terminal device slides the preset distance every time the terminal device receives a sliding instruction.
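The sliding condition and the proportional-distance variant can be sketched together. The proportionality constant `k` is an assumption chosen so that the 0.5 m/s² → 5 cm example above is reproduced; all names are illustrative:

```python
def should_generate_slide(distance: float, a1: float, a2: float,
                          third_preset: float) -> bool:
    """True when the control ends are held together (distance == 0) and
    flicked hard: both accelerations at or above the third preset value."""
    return distance == 0 and a1 >= third_preset and a2 >= third_preset

def slide_distance_cm(acceleration: float, k: float = 10.0) -> float:
    """Proportional sliding distance. k = 10 cm per (m/s^2) is assumed so
    that 0.5 m/s^2 yields a 5 cm slide, matching the example above."""
    return k * acceleration
```

In the fixed-distance variant described above, `slide_distance_cm` would simply return a preset constant regardless of the acceleration magnitude.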
The above examples express only several embodiments of the present application, and their descriptions are specific and detailed, but they should not therefore be construed as limiting the scope of the present application. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (11)

1. An interaction method, characterized in that the interaction method is applied to an interaction device, the interaction device communicates with a terminal device and comprises a first control end and a second control end, and the interaction method comprises the following steps:
acquiring a first acceleration of the first control end, a second acceleration of the second control end and a distance between the first control end and the second control end;
generating a control instruction according to the first acceleration, the second acceleration and the distance between the first control end and the second control end; and
and sending the control instruction to the terminal equipment to control a display picture of the terminal equipment.
2. The interaction method according to claim 1, wherein the generating a control command according to the first acceleration and the second acceleration and the distance between the first control end and the second control end comprises:
judging the magnitude and direction of the first acceleration and the second acceleration and the distance between the first control end and the second control end;
when the first acceleration is larger than the first preset value and the direction is parallel to a first plane, and the second acceleration is smaller than or equal to the first preset value and the direction is parallel to the first plane, generating a first movement instruction; or
When the first acceleration is smaller than or equal to the first preset value and the direction is parallel to the first plane, and the second acceleration is larger than the first preset value and the direction is parallel to the first plane, generating the first movement instruction;
the sending the control instruction to the terminal device to control the display screen of the terminal device includes:
and sending the first moving instruction to the terminal equipment to control the target object in the display picture to move according to the direction of the second acceleration and the distance between the first control end and the second control end.
3. The interaction method according to claim 2, wherein after the sending of the first movement instruction to the terminal device to control the movement of the target object in the display screen, the interaction method further comprises:
if the first acceleration is greater than a second preset value and its direction is perpendicular to the first plane, and the second acceleration is less than or equal to the second preset value, generating a click command and sending the click command to the terminal device to control the click on the area where the target object is located; or
if the second acceleration is greater than the second preset value and its direction is perpendicular to the first plane, and the first acceleration is less than or equal to the second preset value, generating the click command and sending the click command to the terminal device to control the click on the area where the target object is located.
4. The interaction method of claim 3, wherein the generating a control command according to the first acceleration and the second acceleration and the distance between the first control end and the second control end further comprises:
generating a first zooming instruction when the distance between the first control end and the second control end is greater than zero, the first acceleration and the second acceleration are greater than a second preset value, and the directions of the first acceleration and the second acceleration are perpendicular to the first plane.
The sending the control instruction to the terminal device to control the display screen of the terminal device further comprises:
and sending the first zooming instruction to the terminal equipment to control the area where the target object is located to zoom according to the change of the distance between the first control end and the second control end.
5. The interaction method of claim 2, wherein the generating a control command according to the first acceleration and the second acceleration and the distance between the first control end and the second control end further comprises:
generating a second movement instruction when the distance between the first control end and the second control end is zero, and the first acceleration and the second acceleration are equal and larger than the first preset value and smaller than a third preset value;
the sending the control instruction to the terminal device to control the display screen of the terminal device further comprises:
and sending the second movement instruction to the terminal equipment to control the target object in the display picture to move according to the direction and the magnitude of the first acceleration.
6. The interaction method according to claim 5, wherein after sending the second movement instruction to the terminal device to control the target object in the display screen to move, the interaction method further comprises:
if the distance between the first control end and the second control end is greater than zero, and the first acceleration and the second acceleration are greater than the first preset value and opposite in direction, generating a second zooming instruction;
the sending the control instruction to the terminal device to control the display screen of the terminal device further comprises:
and sending the second zooming instruction to the terminal equipment to control the area where the target object is located to zoom according to the change of the distance between the first control end and the second control end.
7. The interaction method of claim 5, wherein the generating a control command according to the first acceleration and the second acceleration and the distance between the first control end and the second control end further comprises:
when the distance between the first control end and the second control end is zero and the first acceleration and the second acceleration are greater than or equal to the third preset value, generating a sliding instruction;
the sending the control instruction to the terminal device to control the display screen of the terminal device further comprises:
and sending the sliding instruction to the terminal equipment to control the display picture to slide according to the direction of the first acceleration.
8. An interaction apparatus, applied to an interaction device, wherein the interaction device is in communication with a terminal device and comprises a first control end and a second control end, and the interaction apparatus comprises:
an acquisition module, configured to acquire a first acceleration of the first control end, a second acceleration of the second control end, and a distance between the first control end and the second control end;
a generating module, configured to generate a control instruction according to the first acceleration, the second acceleration, and the distance between the first control end and the second control end; and
a sending module, configured to send the control instruction to the terminal device to control a display screen of the terminal device.
9. An interaction device, characterized in that the interaction device communicates with a terminal device and comprises a first control end, a second control end, a processor and a communication module, wherein the first control end comprises a magnetic module and a first acceleration sensor, and the second control end comprises a Hall element and a second acceleration sensor; the processor is configured to:
acquiring a first acceleration of the first control end detected by the first acceleration sensor, a second acceleration of the second control end detected by the second acceleration sensor, and a distance between the first control end and the second control end, which is obtained by the Hall element detecting the magnetic module;
generating a control instruction according to the first acceleration, the second acceleration and the distance between the first control end and the second control end;
the communication module is configured to:
and sending the control instruction to the terminal equipment to control a display picture of the terminal equipment.
10. An interaction device, comprising: one or more processors and a memory; and
one or more programs, wherein the one or more programs are stored in the memory and executed by the one or more processors, and the programs comprise instructions for performing the interaction method of any one of claims 1-7.
11. A non-transitory computer-readable storage medium containing a computer program, which when executed by one or more processors causes the processors to perform the interaction method of any one of claims 1-7.
CN202011091797.2A 2020-10-13 2020-10-13 Interaction method and device, equipment and storage medium Pending CN114356069A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011091797.2A CN114356069A (en) 2020-10-13 2020-10-13 Interaction method and device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011091797.2A CN114356069A (en) 2020-10-13 2020-10-13 Interaction method and device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114356069A true CN114356069A (en) 2022-04-15

Family

ID=81090079

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011091797.2A Pending CN114356069A (en) 2020-10-13 2020-10-13 Interaction method and device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114356069A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102486677A (en) * 2010-12-06 2012-06-06 Hongfujin Precision Industry (Shenzhen) Co., Ltd. Ring sensor interaction system
CN104765460A (en) * 2015-04-23 2015-07-08 Wang Xiaojun Intelligent ring and method for controlling intelligent terminal through intelligent ring via gestures
WO2015179262A1 (en) * 2014-05-23 2015-11-26 Microsoft Technology Licensing, Llc Finger tracking
CN105138117A (en) * 2015-07-28 2015-12-09 Baidu Online Network Technology (Beijing) Co., Ltd. Interaction system and method for wrist-mounted intelligent device
CN107678542A (en) * 2017-09-23 2018-02-09 Wuhan Yezhen Technology Co., Ltd. Ring-type wearable device and human-machine interaction method
CN110096131A (en) * 2018-01-29 2019-08-06 Huawei Technologies Co., Ltd. Haptic interaction method and apparatus, and haptic wearable device
CN110456907A (en) * 2019-07-24 2019-11-15 Guangdong Virtual Reality Technology Co., Ltd. Virtual screen control method and apparatus, terminal device, and storage medium
CN210605644U (en) * 2019-11-22 2020-05-22 Dongguan Xingming Plastic Mould Co., Ltd. Wearable keyboard
CN111383345A (en) * 2018-12-29 2020-07-07 Guangdong Virtual Reality Technology Co., Ltd. Virtual content display method and apparatus, terminal device, and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Wu Yadong et al.: "Human-Computer Interaction Technology and Applications" (《人机交互技术及应用》), 31 August 2020, China Machine Press, pages 161-163 *

Similar Documents

Publication Publication Date Title
CN102906671B (en) Gesture input device and gesture input method
EP2068235A2 (en) Input device, display device, input method, display method, and program
EP3147819A1 (en) Method and device for fingerprint image alignment
US20090002324A1 (en) Method, Apparatus and Computer Program Product for Providing a Scrolling Mechanism for Touch Screen Devices
KR20210042621A (en) Electronic device and method for controlling zoom rate of camera thereof
EP3276478A1 (en) Mobile terminal and method for determining scrolling speed
EP2558924B1 (en) Apparatus, method and computer program for user input using a camera
JP2004280836A (en) System for recognizing stroke of writing motion and recognition method thereof
US20180260032A1 (en) Input device, input method, and program
US20150009136A1 (en) Operation input device and input operation processing method
CN106527923B (en) Graph display method and device
CN112965773A (en) Method, apparatus, device and storage medium for information display
CN112987924A (en) Method, apparatus, device and storage medium for device interaction
CN114356069A (en) Interaction method and device, equipment and storage medium
KR20150008624A (en) Controlling method of electronic device and apparatus thereof
CN106909272A (en) Display control method and mobile terminal
US10474409B2 (en) Response control method and electronic device
CN104020933A (en) Menu displaying method and device
Maidi et al. Interactive media control using natural interaction-based Kinect
JP2010067090A (en) Information terminal device
US20200371681A1 (en) Method for zooming an image displayed on a touch-sensitive screen of a mobile terminal
CN107977071B (en) Operation method and device suitable for space system
CN105787971A (en) Information processing method and electronic equipment
KR101165388B1 (en) Method for controlling screen using different kind of input devices and terminal unit thereof
US11036287B2 (en) Electronic device, control method for electronic device, and non-transitory computer readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination