CN115657843A - Interaction method, intelligent ring, interaction system and computer readable storage medium - Google Patents

Interaction method, intelligent ring, interaction system and computer readable storage medium

Info

Publication number
CN115657843A
Authority
CN
China
Prior art keywords
key
touch pad
user
touch
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211186084.3A
Other languages
Chinese (zh)
Inventor
郭铁成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Peaper Information Technology Co ltd
Original Assignee
Shenzhen Peaper Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Peaper Information Technology Co ltd filed Critical Shenzhen Peaper Information Technology Co ltd
Priority to CN202211186084.3A priority Critical patent/CN115657843A/en
Publication of CN115657843A publication Critical patent/CN115657843A/en
Pending legal-status Critical Current

Abstract

Embodiments of the present application relate to the technical field of smart wearables and provide an interaction method, a smart ring, an interaction system, and a computer-readable storage medium. The interaction method is applied to a smart ring that comprises a touch pad, and comprises the following steps: after the smart ring establishes a communication connection with a VR/AR/MR device, the VR/AR/MR device displays an interactive interface for interactive operation with the touch pad; the smart ring receives an interactive operation input by a user on the touch pad; and the smart ring sends an operation instruction corresponding to the interactive operation to the VR/AR/MR device, so that the VR/AR/MR device executes the operation instruction in response to the interactive operation.

Description

Interaction method, intelligent ring, interaction system and computer readable storage medium
Technical Field
The present application relates to the technical field of smart wearables, and in particular to an interaction method, a smart ring, an interaction system, and a computer-readable storage medium.
Background
With the development of computer graphics technology, technologies such as Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) are gradually entering people's daily lives. When the user of an existing electronic device such as a VR/AR/MR helmet or VR/AR/MR glasses performs an interactive operation, one approach is for the user to operate a mechanism such as a button or a wheel on the device itself, and another is for the user to operate a mobile phone connected to the device. However, both approaches require the user's arm to drive the palm and fingers, and the user experience leaves room for improvement.
Disclosure of Invention
A first aspect of the present application provides an interaction method. The interaction method is applied to a smart ring comprising a touch pad, and comprises the following steps:
after the smart ring establishes a communication connection with a VR/AR/MR device, the VR/AR/MR device displays an interactive interface for interactive operation with the touch pad;
the smart ring receives an interactive operation input by a user on the touch pad; and
the smart ring sends an operation instruction corresponding to the interactive operation to the VR/AR/MR device, so that the VR/AR/MR device executes the operation instruction in response to the interactive operation.
The interaction method of the first aspect of the present application is applied to the smart ring. Specifically, the user wears the smart ring on one finger (for example, the index finger of the right hand) and inputs interactive operations on the touch pad with another finger of the same hand (for example, the thumb of the right hand), so that the VR/AR/MR device executes the corresponding operation instruction in response to the interactive operation. The user can therefore operate the smart ring with one hand, freeing the other hand that does not wear the smart ring as well as the elbow, arm, palm, and remaining fingers of the hand that does, which improves the interaction experience with the VR/AR/MR device. In addition, with this interaction method the user's eyes can stay focused on the image in the virtual screen currently displayed by the VR/AR/MR device: the device is controlled through the touch pad without the line of sight leaving the displayed image, which helps separate viewing from control when operating the VR/AR/MR device.
The second aspect of the present application provides a smart ring. The smart ring comprises a touch pad, a processor, and a memory. The touch pad is electrically connected to the processor, the processor is electrically connected to the memory, and the memory is configured to store a computer program; when the processor executes the computer program, the smart ring performs the interaction method provided by the first aspect of the present application.
A third aspect of the present application provides an interactive system. The interactive system comprises the smart ring provided according to the second aspect of the application and a VR/AR/MR device connected with the smart ring.
A fourth aspect of the present application provides a computer-readable storage medium. The computer-readable storage medium comprises a computer program which, when run on a computer, causes the computer to perform the interaction method provided by the first aspect of the present application.
For technical effects brought by any design of the second aspect to the fourth aspect of the present application, reference may be made to the technical effects brought by the above corresponding interaction method, and details are not repeated herein.
Drawings
Fig. 1 is a schematic structural diagram of an intelligent ring according to an embodiment of the present application.
Fig. 2 is a schematic diagram of an interactive interface of the smart ring shown in fig. 1.
Fig. 3 is a schematic diagram of another interactive interface of the smart ring shown in fig. 1.
Fig. 4A and 4B are schematic diagrams illustrating an interactive interface of the smart ring shown in fig. 1.
Fig. 5 is a schematic diagram of still another interactive interface of the smart ring shown in fig. 1.
Fig. 6 is a schematic diagram of still another interactive interface of the smart ring shown in fig. 1.
Fig. 7 is a schematic diagram of the dimensions of the various parts of the smart ring shown in fig. 1.
Description of the main element symbols:
Smart ring 10
Annular member 11
Inner circumferential surface 11a
Outer circumferential surface 11b
Hole H
Touch pad 12
Touch surface 121
Protrusion 122
First protrusion 1221
Second protrusion 1222
Virtual joystick area 131
Virtual keyboard area 132
Function key area 133
Detailed Description
It should be noted that in the description of the present application, "/" indicates "or" unless otherwise noted; for example, A/B may indicate A or B. In the description of the present application, unless otherwise specified, "and/or" merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only some embodiments of the present application, and not all embodiments.
Fig. 1 is a schematic structural diagram of a smart ring according to an embodiment of the present application. As shown in Fig. 1, the smart ring 10 includes an annular member 11 and a touch pad 12. The annular member 11 has an inner circumferential surface 11a and an opposite outer circumferential surface 11b. The inner circumferential surface 11a defines a hole H through which a user's finger passes; in other words, the inner circumferential surface 11a is the surface against which the user's finger fits when wearing the smart ring 10. The touch pad 12 is located on the side of the outer circumferential surface 11b.
The annular member 11 is made of, for example, metal or plastic. Alternatively, the annular member 11 may be made of an elastic plastic (e.g., silicone) to accommodate differences in the size of users' fingers and to fit the finger snugly so that the ring does not slip off. The smart ring 10 may be worn on any portion of any finger of the user.
In some embodiments, the smart ring 10 is worn between the first and second joints of the index finger, which is more ergonomic and makes it easier for the thumb of the hand wearing the smart ring 10 to perform touch operations on the touch pad 12. In other embodiments, the smart ring 10 may be worn at the base or tip of the user's index finger, or on another finger. The type of the touch pad 12 is not limited; it may be, for example, a capacitive touch pad or a resistive touch pad.
The smart ring 10 further includes a processor (not shown), a memory (not shown), a wireless communication module (not shown), a power supply module (not shown), and the like. The processor, the memory, the wireless communication module, and the power module may be housed in the annular member 11. The power module is electrically connected to the touch pad 12, the processor, the memory, the wireless communication module, and the other components to supply them with power. The touch pad 12 is electrically connected to the processor and is configured to detect a touch operation applied on or near it and pass the detected touch operation to the processor, which determines the type of touch event. The processor is electrically connected to the memory, which stores a computer program; the memory optionally includes one or more computer-readable storage media containing the computer program. The wireless communication module is electrically connected to the processor and is configured to communicatively couple to an external electronic device (e.g., a VR/AR/MR device). The wireless communication module supports one or more of Bluetooth, infrared, Wi-Fi hotspot networks, and third-generation (3G), fourth-generation (4G), and fifth-generation (5G) mobile communication networks. The VR/AR/MR device is, for example, VR glasses, AR glasses, MR glasses, a VR helmet, an AR helmet, or an MR helmet.
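The following minimal sketch (in Python, with illustrative class and method names that are not part of the patent) shows how the data flow described above — touch pad to processor to wireless communication module to VR/AR/MR device — might be organised.

```python
from dataclasses import dataclass
from typing import Protocol

@dataclass
class TouchEvent:
    x: float          # touch position on the pad, in pad coordinates
    y: float
    kind: str         # e.g. "tap", "double_tap", "long_press", "swipe", "drag"

class WirelessLink(Protocol):
    """Abstract stand-in for the wireless communication module."""
    def send(self, instruction: dict) -> None: ...

class SmartRingController:
    """Hypothetical processor-side logic: turn a detected touch event into an
    operation instruction and forward it to the VR/AR/MR device."""

    def __init__(self, link: WirelessLink):
        self.link = link

    def on_touch(self, event: TouchEvent) -> None:
        instruction = {"source": "touch_pad",
                       "kind": event.kind,
                       "position": (event.x, event.y)}
        # The VR/AR/MR device interprets the instruction and executes it
        # in the currently displayed virtual screen.
        self.link.send(instruction)
```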
The embodiment of the present application further provides an interaction method, which is applied to the above-mentioned smart ring 10. The interaction method comprises the following steps S1 to S3.
Step S1: after the smart ring establishes a communication connection with the VR/AR/MR device, the VR/AR/MR device displays an interactive interface for interactive operation with the touch pad.
In some embodiments, the wireless communication module of the smart ring comprises a Bluetooth communication module and the wireless communication module of the VR/AR/MR device also comprises a Bluetooth communication module, so the smart ring and the VR/AR/MR device can be paired via Bluetooth short-range communication. Alternatively, the wireless communication module of the smart ring comprises an infrared transmitting module for transmitting infrared signals, the wireless communication module of the VR/AR/MR device comprises an infrared receiving module for receiving and processing the infrared signals, and the smart ring is communicatively connected to the VR/AR/MR device via infrared signals. Still alternatively, the smart ring and the VR/AR/MR device may establish communication over a Wi-Fi hotspot network, a 3G, 4G, or 5G network, or the like.
Furthermore, the smart ring is not limited to being directly connected to the VR/AR/MR device via wireless communication. In other embodiments, the connection can be relayed through a terminal device such as a mobile phone using wireless communication or similar technologies, so that the smart ring is indirectly communicatively connected to the VR/AR/MR device. In addition, in step S1, after the smart ring is communicatively connected to the VR/AR/MR device, the VR/AR/MR device may further display information such as a successful-connection prompt for the touch pad and the connection status.
Different touch areas on the touch pad correspond one-to-one to different virtual coordinates in the virtual screen currently displayed by the VR/AR/MR device. A user operation on a given touch area of the touch pad can therefore trigger the VR/AR/MR device to perform the corresponding function at the corresponding virtual coordinate in the currently displayed virtual screen.
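As an illustration of this one-to-one correspondence, a touch position on the pad can be mapped proportionally into the virtual screen's coordinate space. The linear mapping below is an assumption made for illustration; the patent does not specify the mapping function.

```python
def pad_to_virtual(x_pad: float, y_pad: float,
                   pad_size: tuple[float, float],
                   screen_size: tuple[int, int]) -> tuple[int, int]:
    """Map a touch position on the pad to a virtual-screen coordinate
    (simple proportional mapping, assumed for illustration)."""
    pad_w, pad_h = pad_size
    scr_w, scr_h = screen_size
    x_virtual = round(x_pad / pad_w * (scr_w - 1))
    y_virtual = round(y_pad / pad_h * (scr_h - 1))
    return x_virtual, y_virtual

# Example: a touch at the centre of a 30 mm x 20 mm pad lands at the
# centre of a 1920 x 1080 virtual screen.
print(pad_to_virtual(15.0, 10.0, (30.0, 20.0), (1920, 1080)))  # (960, 540)
```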
In some embodiments, after the smart ring is communicatively connected to the VR/AR/MR device, the interactive interface generated by the VR/AR/MR device for performing an interactive operation with the touch pad includes a virtual joystick region.
When the user inputs an interactive operation (such as a click, slide, or drag operation) in the touch area of the touch pad corresponding to the virtual joystick area, the smart ring sends an instruction corresponding to that operation to the VR/AR/MR device to control the movement of the cursor in the virtual screen currently displayed by the VR/AR/MR device, including the cursor's direction and/or speed of movement. Within the touch area corresponding to the virtual joystick area, the angle and direction of the touched position relative to the center of the touch area determine the direction in which the joystick is dragged, and the distance of the touched position from the center determines the force and amplitude of the drag.
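The joystick behaviour described above — the angle of the touched position relative to the centre gives the drag direction, and its distance from the centre gives the drag force and amplitude — can be sketched as follows. The normalisation radius and dead-zone value are illustrative assumptions.

```python
import math

def joystick_state(x: float, y: float,
                   center: tuple[float, float],
                   radius: float,
                   dead_zone: float = 0.1) -> tuple[float, float]:
    """Return (angle in degrees, amplitude in 0..1) of the virtual joystick
    for a touch at (x, y) inside the joystick touch area."""
    dx, dy = x - center[0], y - center[1]
    distance = math.hypot(dx, dy)
    amplitude = min(distance / radius, 1.0)           # drag force / amplitude
    if amplitude < dead_zone:                         # small dead zone (assumed)
        return 0.0, 0.0
    angle = math.degrees(math.atan2(dy, dx)) % 360.0  # drag direction
    return angle, amplitude
```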
Specifically, for example, the user may control the moving direction and/or moving speed of a virtual object (e.g., a virtual game character) in a virtual scene presented in the virtual screen currently displayed by the VR/AR/MR device by dragging the virtual joystick within the virtual joystick area.
In other embodiments, the smart ring may be communicatively connected to electronic devices other than VR/AR/MR devices, such as an aircraft, a vehicle, or a mechanical mechanism. After receiving an interactive operation (e.g., a click, slide, or drag operation) input by the user on the touch pad, the smart ring sends an instruction corresponding to that operation to the other electronic device to control it, for example to control the flight direction of an aircraft, the driving direction of a vehicle, or the moving direction or speed of a mechanical mechanism.
In some embodiments, after the smart ring is communicatively connected to the VR/AR/MR device, the interactive interface displayed by the VR/AR/MR device for interactive operation with the touch pad further includes a virtual keyboard area. The virtual keyboard area is used for receiving text input by the user, and/or providing extended functions associated with an input method, and/or performing mathematical operations, and/or dialing telephone numbers. The text received in the virtual keyboard area may include, but is not limited to, Chinese characters, letters, numbers, punctuation marks, and special symbols. The extended input-method functions of the virtual keyboard area comprise a plurality of shortcut keys. When the virtual keyboard area provides a mathematical operation function, it includes, for example, the digits 0 to 9 and addition and subtraction operators. When the virtual keyboard area provides a telephone number dialing function, it includes the digits used for entering telephone numbers and symbols such as # and + used for dialing.
The following is a detailed description.
In some embodiments, the virtual keyboard area is configured to receive text entered by the user and includes a plurality of character keys. The character keys are used for receiving characters input by the user so that the characters are entered in the virtual screen currently displayed by the VR/AR/MR device. The characters include at least one of: letters, numbers, strokes, and symbols. Because the smart ring is small, the space available for the user's finger is limited, so letters, numbers, strokes, and symbols may be displayed on different interactive interfaces. For example, the touch area of the touch pad corresponding to the virtual keyboard area (e.g., the middle area of the touch surface) is divided into a plurality of touch regions, each of which corresponds to one character key and can receive the user's input. In different interactive interfaces, the character key at the same position on the touch pad may correspond to letters, numbers, strokes, or symbols: when the user needs to input letters, each character key corresponds to different letters; when the user needs to input numeric text, each character key corresponds to a different number; when the user needs to input strokes, each character key corresponds to a different stroke; and when the user needs to input symbols, each character key corresponds to different symbols.
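One way to realise this division is to treat the middle area of the touch pad as a grid and look up which character key a touch falls in. The 4 x 3 grid below matches the twelve-cell layout of Fig. 2; the helper function and its parameters are a hypothetical sketch.

```python
def key_index_at(x: float, y: float,
                 area_origin: tuple[float, float],
                 area_size: tuple[float, float],
                 rows: int = 4, cols: int = 3) -> int | None:
    """Return the 0-based, row-major index of the character key whose touch
    region contains (x, y), or None if the touch is outside the keyboard area."""
    ox, oy = area_origin
    w, h = area_size
    if not (ox <= x < ox + w and oy <= y < oy + h):
        return None
    col = int((x - ox) / (w / cols))
    row = int((y - oy) / (h / rows))
    return row * cols + col
```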
Specifically, the letters include the 26 English letters, so that Chinese or English can be entered using the smart ring and the VR/AR/MR device. To keep the smart ring and its touch pad small, the touch region corresponding to each character key cannot be too large; to keep each character key operable by the user's finger, that region cannot be too small either. In some embodiments, each character key therefore corresponds to at least two letters (preferably no more than four), and the user selects the specific character to input by clicking the region of the touch pad corresponding to the character key. For example, if a character key corresponds to N letters (2 ≤ N ≤ 4), the user selects one of the N letters with a single click on the region corresponding to that key, another with a double click, and so on: different letters on the same character key are selected by different numbers of clicks on the same region of the touch pad.
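The multi-click selection described above — one key, N letters, letter chosen by the number of consecutive clicks — can be sketched as follows. The key numbering and the wrap-around behaviour for extra clicks are assumptions; the letter grouping follows the ABC/DEF/... cells of Fig. 2.

```python
# Key indices 1..8 are illustrative; each key carries two to four letters.
MULTI_TAP_KEYS = {
    1: "ABC", 2: "DEF", 3: "GHI", 4: "JKL",
    5: "MNO", 6: "PQRS", 7: "TUV", 8: "WXYZ",
}

def letter_for_clicks(key_id: int, clicks: int) -> str:
    """Pick a letter on a multi-letter character key by click count
    (1 click -> first letter, 2 clicks -> second letter, ...)."""
    letters = MULTI_TAP_KEYS[key_id]
    # Wrap around if the user clicks more times than there are letters (assumed).
    return letters[(clicks - 1) % len(letters)]

print(letter_for_clicks(6, 2))  # double click on the PQRS key -> "Q"
```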
The numbers include the ten digits 0 to 9, so that numeric text can be entered using the smart ring and the VR/AR/MR device. Since there are fewer digits than English letters, each character key can correspond to a different digit. For example, ten character keys represent 0 to 9, and the user selects the digit to input by clicking the region of the touch pad corresponding to the character key.
The strokes include, for example, the dot, horizontal, vertical, left-falling, right-falling, rising, and turning strokes, so that Chinese characters can be entered with the smart ring and the VR/AR/MR device using a stroke input method. Since there are fewer strokes than English letters, each character key can correspond to a different stroke, and the user selects the stroke to input by clicking the region of the touch pad corresponding to the character key. The symbols include, for example, punctuation marks and special symbols. Understandably, each character key can correspond to one or more symbols, and the user selects different symbols on the same character key by clicking the corresponding region of the touch pad a different number of times. In addition, to facilitate input, common punctuation marks (e.g., the comma, period, and enumeration comma) may be presented in the same interactive interface as letters, numbers, or strokes.
In some embodiments, the virtual keyboard area also provides extended functions associated with the input method and includes a plurality of shortcut keys. A shortcut key receives a touch operation from the user to realize the preset function corresponding to that shortcut key in the virtual screen currently displayed by the VR/AR/MR device. Specifically, the shortcut keys include at least one of: a voice recognition shortcut key, a character input shortcut key, an emoji input shortcut key, a clear shortcut key, a voice message shortcut key, a special symbol shortcut key, a shooting shortcut key, a file transfer shortcut key, a full-width/half-width switching shortcut key, an input method switching shortcut key, an audio/video call shortcut key, a Chinese/English switching shortcut key, a number switching shortcut key, and a "more functions" shortcut key. Understandably, the touch area of the touch pad corresponding to the virtual keyboard area (e.g., the middle area of the touch surface) is divided into a plurality of regions, each of which corresponds to a shortcut key and can receive the user's input. In different interactive interfaces, the same position on the touch pad may correspond to a shortcut key or to the letters, numbers, strokes, or symbols described above. In some embodiments, a shortcut key receives a click or long-press operation on its corresponding region of the touch pad to realize the preset function corresponding to the shortcut key in the virtual screen currently displayed by the VR/AR/MR device. For example, when the user clicks, double-clicks, or long-presses the region of the touch pad corresponding to the Chinese/English switching shortcut key, Chinese/English switching is performed in the virtual screen currently displayed by the VR/AR/MR device. By analogy, clicks or long presses on the regions corresponding to the other shortcut keys can trigger voice recognition, character input, emoji input, clearing, voice messaging, special symbols, shooting, file transfer, full-width/half-width switching, input method switching, audio/video calls, number switching, calling up the "more functions" keyboard menu, and so on in the virtual screen currently displayed by the VR/AR/MR device.
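A shortcut key that reacts differently to a click, double click, or long press can be dispatched with a small lookup table. The key names, gesture names, and action strings below are illustrative assumptions; only the shortcut functions themselves come from the description above.

```python
def dispatch_shortcut(key_name: str, gesture: str) -> dict:
    """Build the operation instruction the smart ring would send for a
    shortcut-key gesture ('click', 'double_click' or 'long_press')."""
    actions = {
        ("cn_en_switch", "click"): "toggle_chinese_english",
        ("voice_message", "long_press"): "start_voice_message",
        ("clear", "click"): "clear_input",
        ("more_functions", "click"): "open_function_menu",
    }
    action = actions.get((key_name, gesture), "ignore")
    return {"key": key_name, "gesture": gesture, "action": action}

# Example: a single click on the Chinese/English switching shortcut key.
print(dispatch_shortcut("cn_en_switch", "click"))
```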
In some embodiments, the virtual keyboard area also provides a telephone number dialing function and includes a plurality of character keys for dialing. These character keys include the digits (0 to 9) used to enter a telephone number and symbols used for dialing, such as #, +, and the enter key. Understandably, the touch area of the touch pad corresponding to the virtual keyboard area (e.g., the middle area of the touch surface) is divided into a plurality of regions, each of which corresponds to one character key and can receive the user's input. In particular, a character key may correspond to a single digit or symbol; to a combination of one digit and one symbol; to several digits; or to several symbols. When a character key corresponds to a single digit or symbol, the user selects it with a single click on the corresponding region of the touch pad. When a character key corresponds to a combination of a digit and a symbol, to several digits, or to several symbols, the user selects among them by clicking the corresponding region of the touch pad a different number of times.
In some embodiments, after the smart ring is communicatively connected to the VR/AR/MR device, the interactive interface displayed by the VR/AR/MR device for interactive operation with the touch pad further includes a function key area. The function key area comprises function icons; when the user inputs a touch operation in the touch area of the touch pad corresponding to a function icon, the smart ring sends an instruction corresponding to that operation to the VR/AR/MR device to realize navigation interaction in the virtual screen currently displayed by the VR/AR/MR device. The function icons include at least one of: a screen float/pin key, a zoom-in/zoom-out screen key, a return key, a main menu key, a backspace key, a page scroll key, and a page turn key. The page scroll keys include an up/down page scroll key and a left/right page scroll key.
In some embodiments, the function key area corresponds to the edge region of the touch area of the touch pad. A function icon can receive a sliding operation from the user, and the function it performs is determined by the direction of the finger's slide. For example, when the user's finger slides across the region corresponding to the screen float/pin key from the edge of the touch pad toward its interior, the virtual screen currently displayed by the VR/AR/MR device is pinned (fixed); when the finger slides across that region from the interior toward the edge, the currently displayed virtual screen is set floating. Likewise, when the user's finger slides across the region corresponding to the zoom-in/zoom-out screen key from the edge toward the interior, the virtual screen currently displayed by the VR/AR/MR device is zoomed in; when the finger slides across that region from the interior toward the edge, the virtual screen is zoomed out.
It should be noted that the zoom-in and zoom-out operations described above require the virtual screen currently displayed by the VR/AR/MR device to be in the floating state rather than the fixed state. That is, the function corresponding to the zoom-in/zoom-out screen key can be triggered only when the currently displayed virtual screen is floating; when it is fixed, that function cannot be triggered. In addition, when the currently displayed virtual screen is floating, the user can also control its position by operating the touch area corresponding to the virtual joystick area on the touch pad.
In addition, when the user's finger slides across the region corresponding to the up/down page scroll key from the edge of the touch pad toward its interior, the page in the virtual screen currently displayed by the VR/AR/MR device scrolls up; when the finger slides across that region from the interior toward the edge, the page scrolls down. When the user's finger slides across the region corresponding to the left/right page scroll key from the edge toward the interior, the page scrolls right; when the finger slides across that region from the interior toward the edge, the page scrolls left. In this way, up/down and left/right scrolling of the page in the virtual screen currently displayed by the VR/AR/MR device is realized according to the direction of the finger's slide over the region of the touch pad corresponding to the function icon. By analogy, sliding operations over the regions corresponding to the return key, main menu key, backspace key, and page turn key realize the return, main menu, backspace, and page turn functions in the currently displayed virtual screen. The edge-slide trigger function takes priority over the key function.
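The inward and outward edge swipes described above can be classified from the start and end points of the slide. The sketch below only distinguishes "edge to interior" from "interior to edge"; the edge-margin width and millimetre units are assumed parameters.

```python
def classify_edge_swipe(start: tuple[float, float], end: tuple[float, float],
                        pad_size: tuple[float, float],
                        edge_margin: float = 3.0) -> str:
    """Classify a swipe as 'inward' (edge -> interior), 'outward'
    (interior -> edge) or 'none'. Coordinates and margin in millimetres (assumed)."""
    w, h = pad_size

    def near_edge(p: tuple[float, float]) -> bool:
        x, y = p
        return (x < edge_margin or x > w - edge_margin or
                y < edge_margin or y > h - edge_margin)

    if near_edge(start) and not near_edge(end):
        return "inward"    # e.g. pin the screen, zoom in, scroll up / right
    if not near_edge(start) and near_edge(end):
        return "outward"   # e.g. float the screen, zoom out, scroll down / left
    return "none"
```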
It should be noted that, after the smart ring is communicatively connected to the VR/AR/MR device, the interactive interface displayed by the VR/AR/MR device for interactive operation with the touch pad is not required to include the virtual joystick area, the virtual keyboard area, and the function key area at the same time. In other embodiments, the interactive interface may include only one of the virtual joystick area, the virtual keyboard area, and the function key area, or a combination of any two of them.
Step S2: the smart ring receives the interactive operation input by the user on the touch pad.
Step S3: the smart ring sends an operation instruction corresponding to the interactive operation to the VR/AR/MR device, so that the VR/AR/MR device executes the operation instruction in response to the interactive operation.
Referring to the description of step S1, the interactive operation input by the user on the touch pad includes at least one of the following: a click operation, a long-press operation, a drag operation, and a slide operation. The click operation includes a single-click operation and multi-click operations. It can be appreciated that the interactive operations are not limited to these, which improves the flexibility with which the user can operate the smart ring.
Specifically, if the interactive interface generated in step S1 includes the virtual joystick area, then in step S2 the smart ring may receive a click, slide, or drag operation input by the user and generate the corresponding operation instruction, and in step S3 the smart ring sends that operation instruction to the VR/AR/MR device to control the moving direction and/or moving speed of the cursor in the virtual screen currently displayed by the VR/AR/MR device.
If the interactive interface generated in step S1 includes the virtual keyboard area and the virtual keyboard area includes character keys, then in step S2 the smart ring may receive a click operation (e.g., a single, double, triple, or quadruple click) from the user on the region of the touch pad corresponding to a character key and generate an operation instruction for the character selected on that key, and in step S3 the smart ring sends that operation instruction to the VR/AR/MR device so that the selected character is entered in the virtual screen currently displayed by the VR/AR/MR device.
If the interactive interface generated in step S1 includes the virtual keyboard area and the virtual keyboard area includes shortcut keys, then in step S2 the smart ring may receive a click or long-press operation from the user on the region of the touch pad corresponding to a shortcut key, and in step S3 the smart ring sends the corresponding operation instruction to the VR/AR/MR device to realize the preset function of that shortcut key in the virtual screen currently displayed by the VR/AR/MR device.
If the interactive interface generated in step S1 includes the function key area, then in step S2 the smart ring may receive a sliding operation from the user on the region of the touch pad corresponding to a function icon, and in step S3 the smart ring sends the corresponding operation instruction to the VR/AR/MR device to realize navigation interaction in the virtual screen currently displayed by the VR/AR/MR device.
Specifically, in step S3, the smart ring may send the operation instruction directly to the VR/AR/MR device via wireless communication or a similar technology, or the operation instruction may be relayed through a terminal device such as a mobile phone, so that the smart ring sends the operation instruction to the VR/AR/MR device indirectly.
Different interactive interfaces displayed by the VR/AR/MR device for interacting with the touch pad are respectively described below with reference to fig. 2 to 6.
As shown in Fig. 2, the interactive interface displayed by the VR/AR/MR device includes a virtual joystick area 131, a virtual keyboard area 132, and a function key area 133. The function key area 133 surrounds the virtual joystick area 131 and the virtual keyboard area 132. The virtual keyboard area 132 includes twelve cells arranged in four rows and three columns, which in order represent punctuation marks (,), ABC, DEF, GHI, JKL, MNO, PQRS, TUV, WXYZ, cap/shift/menu, the space bar (space), and the enter key (←). ABC, DEF, GHI, JKL, MNO, PQRS, TUV, and WXYZ are character keys that receive the user's click instructions to enter letters in the virtual screen currently displayed by the VR/AR/MR device. The function key area 133 includes eight function icons surrounding the virtual keyboard area 132: a screen float/pin key (screen float/pinned), a zoom-in/zoom-out screen key (zoom in/out), a main menu key (main page), a left/right page scroll key (roll left/right), a backspace key (back space), a return key (return), an up/down page scroll key (roll up/down), and a page turn key (page up/down). The screen float/pin key and the zoom-in/zoom-out screen key are located above the virtual keyboard area 132, the up/down page scroll key and the page turn key are located below it, and the other four function icons are located on its left and right sides. The virtual joystick area 131 is located between the screen float/pin and zoom-in/zoom-out screen keys and the virtual keyboard area 132.
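The Fig. 2 layout can be captured in a small data structure that makes the correspondence between touch regions and keys explicit. The dictionary below simply lists the keys described above; the area identifiers are illustrative.

```python
# Illustrative summary of the Fig. 2 interactive interface (identifiers assumed).
FIG2_INTERFACE = {
    "virtual_joystick_area_131": ["joystick"],
    "virtual_keyboard_area_132": [        # twelve cells, four rows x three columns
        "punctuation (,)", "ABC", "DEF",
        "GHI",             "JKL", "MNO",
        "PQRS",            "TUV", "WXYZ",
        "cap/shift/menu",  "space", "enter",
    ],
    "function_key_area_133": [            # eight icons surrounding the keyboard
        "screen float/pinned", "zoom in/out",   # above the keyboard
        "main page", "roll left/right",         # left and right sides
        "back space", "return",                 # left and right sides
        "roll up/down", "page up/down",         # below the keyboard
    ],
}
```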
In the interactive interface shown in Fig. 2, the virtual joystick area 131 allows the user to control the movement of the cursor in the virtual screen currently displayed by the VR/AR/MR device by inputting interactive operations (e.g., click, slide, or drag operations) in the corresponding touch area of the touch pad, meeting the interactive requirement of positioning and moving the cursor with the VR/AR/MR device. With the twelve cells of the virtual keyboard area 132, each cell can correspond to different letters, characters, or functions, so the user can enter Chinese pinyin or English with the VR/AR/MR device by clicking the region of the touch pad corresponding to each cell. Each function icon in the function key area 133 can receive the user's inward or outward edge-crossing slide on its corresponding region of the touch pad to realize navigation interaction in the virtual screen currently displayed by the VR/AR/MR device. The smart ring 10 thus serves as an input device for the VR/AR/MR device and, through the user's finger operations on the touch pad 12, covers the input modes of many interactive tasks. Specifically, the hole H of the smart ring 10 receives one of the user's fingers (e.g., the index finger), and the touch pad 12 receives interactive operations from another finger of the same hand (e.g., the thumb), so the user can operate the smart ring 10 with one hand, freeing the other hand that does not wear the smart ring 10 as well as the elbow, arm, palm, and remaining fingers of the hand that does, thereby improving the interaction experience with the VR/AR/MR device. In addition, it should be noted that in the interactive interface shown in Fig. 2, the user can trigger different functions with different numbers of clicks on the touch region corresponding to the character key (cap/shift/menu) at the lower left. Specifically, a single click on that region triggers switching between upper and lower case English letters (cap); a double click triggers switching between the Chinese and English input methods (shift); and a triple click triggers switching to the function menu (menu), whose interactive interface is shown in Fig. 6.
The interactive interface shown in Fig. 3 differs from that shown in Fig. 2 in the virtual keyboard area. In the embodiment shown in Fig. 3, the virtual keyboard area 132 includes twelve character keys, three of which have functions similar to those in Fig. 2; of the remaining nine character keys, five correspond to different strokes, one represents a wildcard, one represents word segmentation, and two represent punctuation marks. With the interactive interface shown in Fig. 3, in addition to the interactive requirements of cursor positioning and movement and of navigation interaction with the VR/AR/MR device, the user can also enter Chinese characters with a stroke input method in the virtual screen currently displayed by the VR/AR/MR device.
The interactive interface shown in Fig. 4A differs from that shown in Fig. 2 in the virtual keyboard area 132. In the embodiment shown in Fig. 4A, the virtual keyboard area 132 includes twelve character keys. One has the same function as in Fig. 2 (the enter key); of the remaining keys, one represents the digit 0 and the decimal point (0 .), nine represent the digits 1 to 9, and one represents (+ −/keyboard menu). Specifically, the digits 1 to 9 are each represented by one character key, and the user selects the digit to input by clicking the corresponding touch region. On the (0 .) key, a single click on the corresponding touch region inputs the digit 0 and a double click inputs the decimal point. On the (+ −/keyboard menu) key, a single click, a double click, and a triple click on the corresponding touch region select the addition operation, the subtraction operation, and the extended operations (keyboard menu), respectively. When the extended-operations function is triggered, the interactive interface displayed by the VR/AR/MR device is as shown in Fig. 4B; the extended operations include, for example, trigonometric and logarithmic operations. Thus, with the interactive interfaces shown in Figs. 4A and 4B, in addition to the interactive requirements of cursor positioning and movement and of navigation interaction with the VR/AR/MR device, the user can also enter digits, perform elementary addition and subtraction, and access extended operations in the virtual screen currently displayed by the VR/AR/MR device.
The interactive interface shown in Fig. 5 is substantially the same as that shown in Fig. 4A: it also includes the nine digit keys 1 to 9 and an enter key. In the interactive interface shown in Fig. 5, however, the key at the lower left corner of the virtual keyboard area 132 is (*#/keyboard menu) and the character key at the bottom middle is (0 +). Specifically, when dialing a telephone number, the user selects a digit to input by clicking the touch region corresponding to one of the nine digit keys 1 to 9; inputs 0 with a single click on the touch region corresponding to the (0 +) key and + with a double click; and, on the (*#/keyboard menu) key, inputs * with a single click, # with a double click, and selects the function corresponding to the keyboard menu with a triple click. The keyboard-menu function is, for example, expanding the stored contact information of contacts. Thus, with the interactive interface shown in Fig. 5, in addition to the interactive requirements of cursor positioning and movement and of navigation interaction with the VR/AR/MR device, the user can also dial telephone numbers in the virtual screen currently displayed by the VR/AR/MR device.
The interactive interface shown in Fig. 6 differs from that shown in Fig. 2 in the virtual keyboard area 132. In the embodiment shown in Fig. 6, the virtual keyboard area 132 includes twelve character keys representing different shortcut keys, some of which represent shortcut keys for two functions. With the interactive interface shown in Fig. 6, in addition to key input, joystick control, and cursor positioning with the VR/AR/MR device, the user can realize the preset functions corresponding to the shortcut keys in the virtual screen currently displayed by the VR/AR/MR device.
It should be noted that, in the embodiments of the present application, when one of the multiple interactive interfaces is displayed, the others are hidden, and the user can switch the displayed interactive interface as needed. In addition, the correspondence between an interactive operation and the action it triggers may be preset by the user; for example, the user may preset that clicking a character key or shortcut key in the interactive interface triggers the corresponding function, or that the direction of a sliding operation on a function icon matches the direction in which the page slides, and so on.
In summary, in the embodiments of the present application, the user can use the smart ring 10 to meet complex interaction requirements with the VR/AR/MR device in more scenarios without the line of sight leaving the image displayed by the VR/AR/MR device, which improves the user experience.
In addition, referring to Figs. 1 and 7 together, to further enhance the interaction experience, the touch pad 12 includes a touch surface 121 and protrusions 122 protruding from the touch surface 121. The protrusions 122 include a plurality of bar-shaped first protrusions 1221 (indicated by thick lines in Fig. 7) and a generally oval second protrusion 1222. The touch surface 121 of the touch pad 12 is divided into a plurality of areas by the plurality of bar-shaped first protrusions 1221.
Specifically, in the embodiment shown in Figs. 1 and 7, the touch surface 121 of the touch pad 12 is octagonal and divided into seven areas distributed with axial symmetry: one hexagonal area, two rectangular areas, and four parallelogram areas. The hexagonal area adjoins one rectangular area and two parallelogram areas, and two opposite sides of each rectangular area each adjoin one parallelogram area. The oval second protrusion 1222 is located approximately at the middle of the hexagonal area.
In the above interactive interfaces, the virtual joystick area 131 corresponds to the hexagonal area; two character keys of the virtual keyboard area 132 correspond to one rectangular area or one parallelogram area, and likewise two shortcut keys of the virtual keyboard area 132 correspond to one rectangular area or one parallelogram area. With the first protrusions 1221, the user's fingertip can feel the boundary between the virtual joystick area 131 and the virtual keyboard area 132, the boundaries between character keys in the virtual keyboard area 132, and the boundaries between shortcut keys in the virtual keyboard area 132. The user's eyes can therefore stay focused on the image in the virtual screen currently displayed by the VR/AR/MR device, and the device can be controlled through the touch pad 12 without the line of sight leaving the displayed image, which helps separate viewing from control when operating the VR/AR/MR device. In other embodiments, the number of first protrusions 1221 may be increased so that each character key in the virtual keyboard area 132 corresponds to its own touch region; however, because the area of the touch pad on the smart ring is small, too many first protrusions may make touch operation uncomfortable for the user. It should also be noted that the oval second protrusion 1222 in the virtual joystick area 131 can serve as a positioning mark: when the user performs a drag operation from the position of the oval second protrusion 1222 in different directions, the cursor in the virtual screen currently displayed by the VR/AR/MR device moves in the corresponding direction. For example, if the user drags upward from the oval second protrusion 1222, the cursor in the virtual screen currently displayed by the VR/AR/MR device moves upward.
Specifically, the heights of the first protrusions 1221 and the second protrusion 1222 are, for example, 0.15 mm, although they are not limited thereto. The hexagonal area is an equilateral hexagon in which the two angles on each of two sides are right angles, with a side length W1 of 1 cm. The two rectangular areas are rectangles of equal size, whose long side W2 is 1.414 cm and whose short side W1 is 1 cm. The sides of each of the four parallelogram areas have lengths W1 and W2, i.e., 1 cm and 1.414 cm, respectively.
In other embodiments, the shape of the touch surface 121 of the touch pad 12 and the shapes, numbers, and layout of the first protrusions 1221 and the second protrusion 1222 are not limited to the above; for example, the touch surface 121 may be rectangular or elliptical. In some embodiments, each side of the touch pad 12 measures between 1 cm and 5 cm, so that the touch pad 12 is small enough to wear comfortably while still offering enough space for the user's finger to operate.
The embodiment of the present application further provides an interactive system, which includes the above-mentioned smart ring 10 and a VR/AR/MR device connected to the smart ring 10. The VR/AR/MR device is, for example, VR glasses, AR glasses, MR glasses, VR helmet, AR helmet, MR helmet, or the like.
Embodiments of the present application further provide a computer-readable storage medium, which includes a computer program and when the computer program runs on a computer, the computer is caused to execute the above interaction method. Understandably, the computer program, when executed on a processor of the smart ring 10, causes the smart ring 10 to perform the interaction method described above. Specifically, the computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a Compact Disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
Although the present application has been described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the spirit and scope of the present application.

Claims (11)

1. An interaction method applied to a smart ring, the smart ring comprising a touch pad, the interaction method comprising the following steps:
after the smart ring establishes a communication connection with a VR/AR/MR device, the VR/AR/MR device displays an interactive interface for interactive operation with the touch pad;
the smart ring receives an interactive operation input by a user on the touch pad; and
the smart ring sends an operation instruction corresponding to the interactive operation to the VR/AR/MR device, so that the VR/AR/MR device executes the operation instruction in response to the interactive operation.
2. The interaction method according to claim 1, wherein the interactive interface comprises a virtual joystick area, and the touch pad comprises a touch area corresponding to the virtual joystick area;
the smart ring receives an interactive operation of a user on the touch area of the touch pad corresponding to the virtual joystick area to control movement of a cursor in a virtual screen currently displayed by the VR/AR/MR device; and/or, when the virtual screen currently displayed by the VR/AR/MR device is in a floating state, the smart ring receives an interactive operation of the user on the touch area of the touch pad corresponding to the virtual joystick area to control a position of the virtual screen currently displayed by the VR/AR/MR device; and
the interactive operation comprises at least one of a click operation, a slide operation, and a drag operation.
3. The interaction method according to claim 1, wherein the interactive interface comprises a virtual keyboard area, and the touch pad comprises a touch area corresponding to the virtual keyboard area;
the smart ring receives an interactive operation of a user on the touch area of the touch pad corresponding to the virtual keyboard area to control the VR/AR/MR device to realize, in a currently displayed virtual screen, text input, mathematical operations, telephone number dialing, or a preset function corresponding to a shortcut key; and
the interactive operation comprises at least one of a click operation and a long-press operation.
4. The interaction method of claim 3, wherein the shortcut key comprises at least one of: a voice recognition shortcut key, a character input shortcut key, an emoji input shortcut key, a clear shortcut key, a voice message shortcut key, a special symbol shortcut key, a shooting shortcut key, a file transfer shortcut key, a full-width/half-width switching shortcut key, an input method switching shortcut key, an audio/video call shortcut key, a Chinese/English switching shortcut key, a number switching shortcut key, and a "more functions" shortcut key.
5. The interaction method according to any one of claims 1 to 4, wherein the interactive interface comprises a function key area, and the touch pad comprises a touch area corresponding to the function key area; and
the smart ring receives a sliding operation of a user in the touch area of the touch pad corresponding to the function key area to control the VR/AR/MR device to realize navigation interaction in a currently displayed virtual screen.
6. The interaction method according to claim 5, wherein the function key area comprises function icons, and the function icons comprise at least one of: a screen float/pin key, a zoom-in/zoom-out screen key, a return key, a main menu key, a backspace key, a page scroll key, and a page turn key; and
the smart ring receives a sliding operation of a user on the touch area of the touch pad corresponding to a function icon to control the VR/AR/MR device to realize the function corresponding to the function icon in a currently displayed virtual screen.
7. The interaction method according to claim 6, wherein the function corresponding to the zoom-in/zoom-out screen key can be triggered when the virtual screen currently displayed by the VR/AR/MR device is in a floating state, and cannot be triggered when the virtual screen currently displayed by the VR/AR/MR device is in a fixed state.
8. A smart ring comprising a touch pad, a processor and a memory, the touch pad being electrically connected to the processor, the processor being electrically connected to the memory, the memory being configured to store a computer program which, when executed by the processor, causes the smart ring to perform the interaction method of any one of claims 1 to 7.
9. The smart ring of claim 8 wherein the touch pad includes a touch surface and a protrusion protruding from the touch surface.
10. An interactive system, characterized by comprising the smart ring according to claim 8 or 9 and a VR/AR/MR device connected to the smart ring.
11. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises a computer program which, when run on a computer, causes the computer to perform the interaction method according to any one of claims 1 to 7.
CN202211186084.3A 2022-09-27 2022-09-27 Interaction method, intelligent ring, interaction system and computer readable storage medium Pending CN115657843A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211186084.3A CN115657843A (en) 2022-09-27 2022-09-27 Interaction method, intelligent ring, interaction system and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211186084.3A CN115657843A (en) 2022-09-27 2022-09-27 Interaction method, intelligent ring, interaction system and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN115657843A true CN115657843A (en) 2023-01-31

Family

ID=84985886

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211186084.3A Pending CN115657843A (en) 2022-09-27 2022-09-27 Interaction method, intelligent ring, interaction system and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN115657843A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination