WO2021227918A1 - Interaction method and augmented reality device - Google Patents

Interaction method and augmented reality device

Info

Publication number
WO2021227918A1
Authority
WO
WIPO (PCT)
Prior art keywords
augmented reality
action
reality device
information
target
Prior art date
Application number
PCT/CN2021/091864
Other languages
English (en)
Chinese (zh)
Inventor
张志灵
刘梦婷
Original Assignee
维沃移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司 filed Critical 维沃移动通信有限公司
Publication of WO2021227918A1 publication Critical patent/WO2021227918A1/fr

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013: Eye tracking input arrangements
    • G06F 3/014: Hand-worn input/output arrangements, e.g. data gloves
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures

Definitions

  • This application belongs to the field of communication technology, and specifically relates to an interaction method and an augmented reality device.
  • In the related art, an augmented reality (AR) device can capture the environment in which a video participant is located, thereby enabling face-to-face communication between multiple users. For example, when a user conducts a multi-person video conference through an AR device, each participant can view the environment in which the other participants are located.
  • However, AR devices can only display the collected video images of users and cannot interact with them, so the interactivity of AR devices is poor.
  • The purpose of the embodiments of the present application is to provide an interaction method and an augmented reality device, which can solve the problem of poor interactivity of AR devices.
  • In a first aspect, an embodiment of the present application provides an interaction method applied to a first augmented reality device. The method includes:
  • detecting the line-of-sight direction of a first user and a first action of the first user; and, when the line of sight of the first user is projected on the area where a target identifier is located and the first action meets a first preset condition, sending target information corresponding to the first action to a second augmented reality device;
  • wherein the target identifier is an identifier in the projection area of the first augmented reality device,
  • and the second augmented reality device is an electronic device associated with the target identifier.
  • In a second aspect, an embodiment of the present application provides an interaction method applied to a second augmented reality device. The method includes:
  • receiving target information sent by a first augmented reality device, where the target information corresponds to a first action detected by the first augmented reality device.
  • In a third aspect, an embodiment of the present application provides an interaction device applied to a first augmented reality device. The interaction device includes:
  • a detection module, used to detect the line-of-sight direction of the first user and the first action of the first user;
  • a first sending module, configured to send target information corresponding to the first action to the second augmented reality device when the line of sight of the first user is projected on the area where the target identifier is located and the first action of the first user meets the first preset condition;
  • wherein the target identifier is an identifier in the projection area of the first augmented reality device,
  • and the second augmented reality device is an electronic device associated with the target identifier.
  • In a fourth aspect, an embodiment of the present application provides an interaction device applied to a second augmented reality device. The interaction device includes:
  • a first receiving module, configured to receive target information sent by a first augmented reality device, where the target information corresponds to a first action detected by the first augmented reality device;
  • a third output module, configured to, when it is detected within a third preset time that a second action of the second user satisfies a second preset condition, output third prompt information and send response information corresponding to the target information to the first augmented reality device;
  • a fourth output module, configured to output fourth prompt information when no second action of the second user satisfying the second preset condition is detected within the third preset time.
  • In a fifth aspect, an embodiment of the present application provides an augmented reality device.
  • The augmented reality device includes a processor, a memory, and a program or instruction stored in the memory and executable on the processor.
  • When the program or instruction is executed by the processor, the steps of the method according to the first aspect or the second aspect are realized.
  • In a sixth aspect, an embodiment of the present application provides a readable storage medium that stores a program or instruction, and when the program or instruction is executed by a processor, the steps of the method according to the first aspect or the second aspect are realized.
  • In a seventh aspect, an embodiment of the present application provides a chip. The chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or instruction to implement the method according to the first aspect or the second aspect.
  • In an eighth aspect, an embodiment of the present application provides an interaction device, wherein the interaction device is configured to execute the method according to the first aspect or the second aspect.
  • In the embodiments of the present application, the line-of-sight direction of the first user and the first action of the first user are detected; when the line of sight of the first user is projected on the area where the target identifier is located and the first action of the first user
  • meets the first preset condition, the target information corresponding to the first action is sent to the second augmented reality device; wherein the target identifier is the identifier in the projection area of the first augmented reality device, and
  • the second augmented reality device is an electronic device associated with the target identifier.
  • In this way, the first augmented reality device can detect the user's actions, realizing the interaction between the augmented reality device and the user; it can participate in the interaction between users and improve the interaction effect.
  • FIG. 1 is the first flowchart of the interaction method provided by an embodiment of the present application.
  • FIG. 2 to FIG. 6 are images displayed by an augmented reality device provided by an embodiment of the present application.
  • FIG. 7 is the second flowchart of the interaction method provided by an embodiment of the present application.
  • FIG. 8 is a structural diagram of an interaction device applied to a first augmented reality device provided by an embodiment of the present application.
  • FIG. 9 is a structural diagram of an interaction device applied to a second augmented reality device provided by an embodiment of the present application.
  • FIG. 10 is a structural diagram of an augmented reality device provided by an embodiment of the present application.
  • The interaction method of the embodiments of the present application may be applied to augmented reality (AR) devices, which may specifically be electronic devices with augmented reality functions, head-mounted augmented reality devices, and the like; this is not limited here.
  • In a video call scenario, each user can view the scenes of the other users through an augmented reality device, and each user can also interact with the other users through the augmented reality device.
  • After collecting the first action, the first augmented reality device may determine whether the first preset condition is satisfied, for example, whether the first action matches a preset action or whether the intensity of the first action is greater than a preset value.
  • The first augmented reality device can also obtain the action parameters of the first action, including the direction of the action, the force generated, and the speed of the action, generate target information including the action parameters, and send it to the second augmented reality device.
  • the second augmented reality device may prompt the second user according to the received target information, so that the second user can make an interactive action, thereby realizing the interaction between the first user and the second user.
  • When the second augmented reality device receives the prompt information including the action parameters sent from the first augmented reality device, it can output a prompt sound according to the prompt information to simulate a real stereo effect; the volume of the prompt sound corresponds to the magnitude of the action parameters,
  • and the type of the prompt sound corresponds to the type of the action.
  • If the second augmented reality device detects that the second user makes a responsive action within the preset time after receiving the prompt, such as projecting the line of sight on the account identifier corresponding to the first augmented reality device or waving a hand,
  • the account identifier can display a green halo, which means that the second user is aware of this interactive action and the interaction is successful. If no response is made within the preset time, the prompt is deemed to have been missed or not heard; the account identifiers on the AR devices of both parties can display a red halo, or a further prompt can be output to prompt the second user to respond. If the second user still does not respond, a red halo may be displayed, indicating that the user status of the recipient is busy.
  • In this way, the first user and the second user can interact through the augmented reality devices, and the augmented reality device can prompt and participate in the interaction between users, improving the interaction effect.
  • the first augmented reality device may also send the file to other augmented reality devices according to the user's first action, thereby simulating the actual file sending effect.
  • the interaction method on the side of the first augmented reality device may include the following steps:
  • Step 101: Detect the line-of-sight direction of the first user and the first action of the first user.
  • When the first user uses the first augmented reality device, the device can recognize the user's line-of-sight direction by means of an eye tracker or a camera. In addition, it can recognize the first action of the first user by means of sensors such as a depth camera, an infrared camera, or an RGB (red, green, and blue) camera.
  • The above-mentioned first action may specifically include a gesture action, a head action, or a body action, such as shaking hands, shaking the head, or hugging.
  • Step 102: In a case where the line of sight of the first user is projected on the area where the target identifier is located, and the first action of the first user meets a first preset condition, send target information corresponding to the first action to the second augmented reality device;
  • wherein the target identifier is an identifier in the projection area of the first augmented reality device,
  • and the second augmented reality device is an electronic device associated with the target identifier.
  • the first augmented reality device can project the content that needs to be displayed to a specific area, which is convenient for the user to watch.
  • The above-mentioned projection area is where the content displayed by the first augmented reality device is presented. By projecting the content, the displayed content can be presented to the user, and the user using the first augmented reality device can view the content of the projection area.
  • the projection area of the first augmented reality device may include a target identifier.
  • the target identifier may be an account identifier associated with other electronic devices, and may also be a file identifier, an application program identifier, or other identifiers used to implement specific functions.
  • The above-mentioned first action meeting the first preset condition can be understood as: the first action matches a preset action, or the moving speed of the first action is within a preset speed range, or the force generated by the first action is within a preset force range, and so on.
  • The target information corresponding to the first action can be sent according to the action type and action parameters of the first action, and the target information can be used to indicate the type of information output by the second augmented reality device, including tactile information (such as vibration), display information (such as a yellow halo), sound information (such as a prompt sound), and so on.
  • For example, when the target identifier is an account identifier, the target information may include prompt information for indicating a handshake;
  • and when the target identifier is a file identifier, the target information may include the file corresponding to the file identifier.
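  • By way of illustration, the gaze-plus-action gate of Steps 101 and 102 and the construction of the target information can be sketched as follows. This is a minimal Python sketch; the helper names, message fields, and thresholds are illustrative assumptions, not part of the original disclosure:

```python
# Hedged sketch of Steps 101/102: all sensor helpers and fields are hypothetical.
from dataclasses import dataclass
from typing import Callable, Tuple

@dataclass
class FirstAction:
    kind: str                       # e.g. "handshake", "wave", "tap"
    direction: Tuple[float, float]  # movement direction
    speed: float                    # movement speed
    force: float                    # generated force / intensity

def gaze_in_region(gaze: Tuple[float, float],
                   region: Tuple[float, float, float, float]) -> bool:
    """True if the projected line of sight falls inside the identifier's area."""
    x, y = gaze
    x0, y0, x1, y1 = region
    return x0 <= x <= x1 and y0 <= y <= y1

def meets_first_preset_condition(action: FirstAction,
                                 preset_kinds=frozenset({"handshake", "wave", "tap"}),
                                 min_force: float = 0.5) -> bool:
    # The text allows either test: match a preset action, or exceed an intensity threshold.
    return action.kind in preset_kinds or action.force > min_force

def build_target_info(action: FirstAction) -> dict:
    # The output-type hint tells the receiver whether to prompt with
    # tactile, display, or sound information.
    output_type = "tactile" if action.kind == "handshake" else "sound"
    return {"action": action.kind,
            "params": {"direction": action.direction,
                       "speed": action.speed,
                       "force": action.force},
            "output_type": output_type}

def on_frame(gaze, action: FirstAction, identifier_region,
             send: Callable[[dict], None]) -> None:
    """Step 102 gate: gaze on the target identifier plus the first preset condition."""
    if gaze_in_region(gaze, identifier_region) and meets_first_preset_condition(action):
        send(build_target_info(action))  # to the device associated with the identifier
```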
  • In this way, the augmented reality device can participate in the interaction between users and convey the interaction information of the first user to the second user; the first user and the second user can interact through the augmented reality device, which can improve the interaction effect.
  • By acquiring the user's line-of-sight direction and behavior and transmitting them to the augmented reality device of the other party, the present application can provide an interactive communication mode close to a real scene.
  • Optionally, the target identifier is a second account identifier, and before the target information corresponding to the first action is sent to the second augmented reality device,
  • the method further includes:
  • when the line of sight of the first user is projected on the area where the second account identifier is located, displaying a prompt identifier in the area where the second account identifier is located.
  • the projection area of the first augmented reality device may include one or more account identifiers, and each account identifier may correspond to an electronic device.
  • the second account identifier may be an account identifier corresponding to the second augmented reality device.
  • A prompt identifier may be displayed in the area where the second account identifier is located, so that the user can confirm the account identifier of the user with whom interaction is intended.
  • The prompt identifier can be a flashing display, a color-enhanced display, or a focus indicator as shown in the right figure of FIG. 2.
  • When the first augmented reality device detects that the line of sight of the first user is projected on the area where the second account identifier is located and the duration is longer than a preset duration, the first augmented reality device can display the prompt identifier in the area where the second account identifier is located, thereby reducing account identification errors.
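  • The dwell-time check described above can be sketched as a small state holder; this is an illustrative Python sketch, and the 0.8-second preset duration is an assumed value:

```python
import time

class GazeDwellPrompt:
    """Show the prompt identifier only after the line of sight has stayed on
    the second account identifier longer than a preset duration, which
    reduces mis-selection of account identifiers."""

    def __init__(self, preset_duration: float = 0.8):
        self.preset_duration = preset_duration
        self._entered_at = None

    def update(self, gaze_on_identifier: bool, now: float = None) -> bool:
        """Call once per frame; returns True when the prompt should be shown."""
        now = time.monotonic() if now is None else now
        if not gaze_on_identifier:
            self._entered_at = None      # gaze left the area: reset the timer
            return False
        if self._entered_at is None:
            self._entered_at = now       # gaze just entered the area
        return (now - self._entered_at) >= self.preset_duration
```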
  • Optionally, the method further includes:
  • after the first augmented reality device sends the target information to the second augmented reality device, if the response information to the target information sent by the second augmented reality device is received within a first preset time, the first augmented reality device outputs first prompt information to indicate that the interaction succeeded, such as displaying a green halo on the target identifier; if the response information sent by the second augmented reality device is not received within the first preset time, second prompt information is output, such as displaying a red halo on the target identifier, to indicate that the interaction failed.
  • For example, the first user taps the target identifier through the first AR device, and the first AR device sends prompt information corresponding to the tap action to the second AR device.
  • The second AR device outputs prompt information according to the received information. If, within 5 seconds after the prompt information is output, the eye tracker recognizes that the second user is looking at the account identifier of the first user, the account identifier of the first user and the account identifier of the second user on both devices each display a green halo, which means that the second user is aware of this simulated tapping action and the interaction is successful.
  • If the second user does not look at the account identifier of the first user within 5 seconds, the prompt is deemed to have been missed, and the account identifier of the first user and the account identifier of the second user on the devices of both parties each display a red halo, indicating that the user status of the recipient is busy.
  • In this way, the first augmented reality device outputs different prompt information to inform the user of the success or failure of the interaction, which is convenient for the user to perform other processing or interaction according to the result of the interaction.
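  • One possible way to realize the first-preset-time window on the first device is a blocking wait with a timeout, as in the following hedged Python sketch; the queue, the 5-second window, and the halo callback are illustrative assumptions:

```python
import queue

def await_response(responses: "queue.Queue[dict]",
                   show_halo,
                   first_preset_time: float = 5.0):
    """Block until the peer's response arrives or the window expires."""
    try:
        response = responses.get(timeout=first_preset_time)
        show_halo("green")   # first prompt information: interaction succeeded
        return response
    except queue.Empty:
        show_halo("red")     # second prompt information: missed / recipient busy
        return None

# Example usage with a trivial halo callback:
# incoming = queue.Queue()
# await_response(incoming, lambda color: print("target identifier halo:", color))
```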
  • Optionally, the method further includes:
  • sending first information to the second augmented reality device,
  • where the first information is used to control the second wearable glove to output a tactile sensation.
  • In addition to the target information, the first augmented reality device may further prompt the second augmented reality device, that is, send the first information.
  • The second augmented reality device may control the second wearable glove to output a tactile sensation according to the first information, so as to prompt the second user to make a responsive action.
  • The second augmented reality device can also prompt the first augmented reality device in the above-mentioned manner, thereby controlling a first wearable glove connected to the first augmented reality device to output a tactile sensation.
  • In this way, the prompts can be made more intuitively and effectively, and the interaction effect can be improved.
  • Optionally, the sending of target information corresponding to the first action to the second augmented reality device includes: when the action parameter of the first action satisfies a first parameter condition, sending first target information, which instructs the second augmented reality device to output prompt information in a first manner; and when the action parameter satisfies a second parameter condition, sending second target information, which instructs output in a second manner.
  • Here, the first action may include actions such as clapping, shaking hands, waving, and hugging,
  • and the action parameters may include at least one of the following: the movement direction of the action, the movement speed, the strength of the action, and the like.
  • The first augmented reality device can obtain the movement direction and speed of the action through the depth camera, the RGB camera, and the like.
  • The strength of the action can be obtained through a smart glove or another sensing device connected to the first augmented reality device, or can be calculated from the movement parameters.
  • the first augmented reality device may send different information to the second augmented reality device according to the type and size of the action parameter, so as to instruct the second augmented reality device to output the prompt information in a different manner.
  • For example, when the action parameter of the first action satisfies the first parameter condition, such as a small intensity, the second augmented reality device can be instructed to prompt with a softer prompt sound; when the action parameter of the first action satisfies the second parameter condition, such as a greater intensity, the second augmented reality device can be instructed to prompt with a louder prompt sound. In both cases, the type and volume of the prompt sound can simulate the sound produced by the first action.
  • For another example, when the first action is an eye action, the second augmented reality device can be instructed to display an image of the first action; when the first action is a gesture action, the second augmented reality device can be instructed to control the wearable glove to output a tactile sensation.
  • the prompts are output in different ways, so that the second user can quickly obtain the interactive actions made by the first user, so as to respond quickly and improve the efficiency and effect of the interaction.
  • Optionally, the first augmented reality device also sends the above-mentioned action parameters to the second augmented reality device, so that the second augmented reality device outputs a corresponding prompt; for example, the prompt sound is loud for a heavy pat and soft for a light tap, which makes it convenient for the second user to understand the type of action, or other information conveyed by the action, according to the prompt.
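  • The selection between the first and second target information according to the action parameters might look like the following sketch; the action kinds, the force threshold, and the field names are assumptions for illustration:

```python
def select_target_info(action_kind: str, force: float,
                       force_threshold: float = 1.0) -> dict:
    """Pick first or second target information from the action parameters."""
    if action_kind == "blink":                      # eye action: display an image of it
        return {"output": "display", "image": action_kind}
    if action_kind in ("handshake", "pat", "tap"):  # gesture: tactile/sound prompt
        # First parameter condition (light) versus second parameter condition (heavy).
        volume = "low" if force <= force_threshold else "high"
        return {"output": "sound+tactile", "volume": volume, "simulates": action_kind}
    return {"output": "sound", "volume": "low", "simulates": action_kind}

# Example: a light tap yields a quiet prompt, a heavy pat a loud one.
# select_target_info("pat", force=0.4)  -> volume "low"
# select_target_info("pat", force=2.1)  -> volume "high"
```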
  • Optionally, the method further includes:
  • displaying a first image representing the first action in the projection area;
  • when response information to the target information sent by the second augmented reality device is received, where the response information is used to indicate a second action, displaying a third image in the projection area according to the response information, where
  • the action represented by the third image includes the first action and the second action;
  • and when the response information is not received, switching the first image in the projection area to a fourth image, where the action
  • indicated by the fourth image is the opposite of the first action.
  • the first action may include interactive actions such as hugging, shaking hands, blinking, blowing kisses, and taking files.
  • the first augmented reality device may display a first image representing the first action in the projection area, and the image may be a picture or a video image, so that the first user can obtain the interactive action he is making.
  • the second augmented reality device may send response information to the first augmented reality device, for example, the second user has made a second action in response.
  • The response information may include an image used to indicate the second action, or it may be an instruction message used to instruct the first device to display, in the projection area, its stored interactive image matching the first action.
  • the third image may be displayed according to the response information, and the third image may include images of the first action and the second action that match each other.
  • the first augmented reality device sends target information to the second augmented reality device.
  • the second augmented reality device displays the image as shown in FIG. 4 according to the target information.
  • the first augmented reality device and the second augmented reality device may simultaneously display the handshake image as shown in FIG. 5, indicating that the interaction is successful.
  • For another example, the first user sends target information including a file receiving request to the second augmented reality device through the first augmented reality device, and both the first augmented reality device and the second augmented reality device display an image of the file sending request.
  • When the second augmented reality device detects the nodding action of the second user (that is, the nodding action is the response information to the target information),
  • the first augmented reality device and the second augmented reality device can simultaneously display an image of the file being received, indicating agreement to receive the document.
  • If the response information is not received, the action represented by the first image can be withdrawn, that is, switched to the fourth image representing the withdrawal action.
  • For example, if the action represented by the first image is a handshake,
  • the action represented by the fourth image is an action of retracting the hand.
  • In this way, the first augmented reality device can display the interactive actions of the first user and the second user, respond correspondingly according to whether the second user responds, and further restore the interaction effect of a real scene.
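  • The image switching described in this embodiment behaves like a small state machine, which can be sketched as follows; the rendering callback and the image descriptions are illustrative assumptions:

```python
class ProjectionImages:
    """Drives the projected image: the first image while waiting, the third
    image when a matching second action arrives, the fourth (withdrawal)
    image when no response arrives in time."""

    def __init__(self, display):
        self.display = display  # callable that renders into the projection area

    def on_first_action(self, first_action: str) -> None:
        self.display(f"first image: {first_action}")               # e.g. hand reaching out

    def on_response(self, second_action: str) -> None:
        self.display(f"third image: combined with {second_action}")  # e.g. clasped hands

    def on_timeout(self) -> None:
        self.display("fourth image: opposite of the first action")  # e.g. hand retracting

# Example: ProjectionImages(print).on_first_action("handshake")
```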
  • Optionally, the first action is a gesture action,
  • the first augmented reality device is connected to a first wearable glove worn by the first user, and the method further includes:
  • when the response information to the target information sent by the second augmented reality device includes tactile information, controlling the first wearable glove to output the tactile sensation corresponding to the tactile information;
  • and when the first action is an action of taking a target item, detecting the distance between a first finger cuff and a second finger cuff of the first wearable glove, and when the distance is less than a preset distance,
  • controlling the first finger cuff to output a first contact force
  • and the second finger cuff to output a second contact force, wherein the preset distance is determined according to the size of the target item.
  • In this embodiment, the first augmented reality device is connected with the first wearable glove, so that information interaction can be realized.
  • The response information sent by the second augmented reality device includes tactile information, such as touch strength and touch position.
  • The first augmented reality device can control the first wearable glove to output the tactile sensation, and the strength and position of the tactile sensation correspond to the information carried in the tactile information sent by the second augmented reality device.
  • For example, the response information includes handshake action information and the strength of the handshake, and the first wearable glove outputs a tactile sensation to the first user according to that strength. In this way, the real scene in which the first user and the second user make gesture actions can be restored, and the interaction effect can be improved.
  • In addition, the first augmented reality device can determine, according to the response information, whether to control the first wearable glove to output a tactile sensation, which improves the flexibility of interactive reminders.
  • For the taking action, the first augmented reality device may combine detection of the user's first action with the user's line of sight. For example, when the line of sight is projected on the file identifier and the first action meets a preset condition, the action is determined to be an action of taking the target item. It is also possible to determine whether the target item is picked up according to parameters such as the action mode and posture. For example, as shown in FIG. 6, the eye tracker 1 can identify the direction of the user's eyeballs, and the wearable sensor and the image recognition system 2 can identify the user's gestures. Combining line-of-sight recognition with gesture recognition can improve the accuracy of recognizing the user's taking action.
  • When the distance between the first finger cuff and the second finger cuff is less than the preset distance, it can be considered that the first finger cuff and the second finger cuff have contacted the target item, and the first finger cuff and the second finger cuff can be controlled to output contact forces to simulate the user's true feeling when picking up an item.
  • Specifically, the wearable glove can detect the pressure exerted by the taking action on each finger cuff, that is, the pressure generated by the taking action,
  • while the target item is located between the first finger and the second finger.
  • The first augmented reality device can then output, to each finger, a force opposite in direction and equal in strength to the pressure detected at the corresponding finger cuff.
  • The aforementioned target item may be a virtual item displayed in the projection area.
  • In this way, the user's action of taking a virtual item is identified by combining the line-of-sight projection parameters with the gesture action, which can improve identification accuracy; and the tactile sensation can be output through the wearable glove, which displays the online interactive action more intuitively and achieves a more realistic effect.
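  • Combining the line-of-sight and gesture checks with the finger-cuff force output could be sketched as follows; the pinch gesture, the pressure values, and the callback are illustrative assumptions:

```python
def is_take_action(gaze_on_file_identifier: bool, gesture: str) -> bool:
    # Combining line-of-sight projection with the recognized gesture
    # improves recognition accuracy, as described above.
    return gaze_on_file_identifier and gesture == "pinch"

def update_grasp(cuff_distance: float, item_size: float,
                 cuff_pressures: dict, set_cuff_force) -> None:
    """When the two finger cuffs close to less than the preset distance
    (determined by the virtual item's size), each cuff outputs a contact
    force equal in strength and opposite in direction to the pressure
    measured at that cuff."""
    preset_distance = item_size  # determined according to the item's size
    if cuff_distance < preset_distance:
        for cuff, pressure in cuff_pressures.items():
            set_cuff_force(cuff, -pressure)  # opposite direction, equal strength

# Example:
# update_grasp(0.03, 0.05, {"first": 2.0, "second": 1.8},
#              lambda cuff, f: print(cuff, "cuff contact force:", f))
```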
  • It should be noted that the execution subject of the interaction method provided by the embodiments of the present application may be an interaction device corresponding to the first augmented reality device, or a control module in that interaction device for executing the interaction method.
  • In the embodiments of the present application, an interaction device executing the interaction method is taken as an example to illustrate the interaction method provided herein.
  • the interaction method on the second augmented reality device side may include the following steps:
  • Step 701: Receive target information sent by a first augmented reality device, where the target information corresponds to a first action detected by the first augmented reality device.
  • Step 702: When it is detected, within a third preset time, that a second action of the second user satisfies a second preset condition, output third prompt information, and send response information corresponding to the target information to the first augmented reality device.
  • the second action meeting the second preset condition can be understood as a match between the second action and the first action, or that the parameters of the second action meet a specific condition, and so on.
  • After receiving the target information, the second augmented reality device can recognize the second action of the second user by means of devices such as an eye tracker, a depth camera, an infrared camera, and an RGB camera.
  • the second augmented reality device may output third prompt information to prompt the second user that the interaction is successful.
  • the second augmented reality device receives prompt information for prompting a handshake request, and when the second user makes a handshake action within a predetermined time, the second augmented reality device outputs a prompt message indicating that the handshake is successful.
  • the second augmented reality device may also send response information of the target information to the first augmented reality device, so that the first augmented reality device obtains the interaction result.
  • the response information may include image information, or only prompt information used to indicate the second action, and may also include other information.
  • the first prompt information on the first augmented reality device is the same as the third prompt information on the second augmented reality device, and the second prompt information is the same as the fourth prompt information.
  • step 702 can also be replaced with step 703.
  • Step 703: If a second action of the second user meeting the second preset condition is not detected within the third preset time, output fourth prompt information.
  • the second augmented reality device may output fourth prompt information to prompt the interaction failure.
  • the second augmented reality device may also send response information of the target information to the first augmented reality device, so that the first augmented reality device obtains the interaction result.
  • the second augmented reality device can obtain the action of the second user to determine whether the interaction is successful, and can send the interaction result to the first augmented reality device, which can improve the interaction effect.
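  • The second-device flow of Steps 701 to 703 can be sketched as a single handler with a timed wait; this Python sketch assumes a queue that delivers only second actions which already satisfy the second preset condition:

```python
import queue

def handle_target_info(target_info: dict,
                       qualifying_actions: "queue.Queue[str]",
                       send_response,
                       output_prompt,
                       third_preset_time: float = 5.0) -> None:
    """Second-device side of Steps 701-703: prompt the second user, then wait
    for a qualifying second action within the third preset time."""
    output_prompt(target_info)  # e.g. a sound simulating the first action
    try:
        second_action = qualifying_actions.get(timeout=third_preset_time)
        output_prompt("third prompt information: interaction succeeded")
        send_response({"second_action": second_action})  # response information
    except queue.Empty:
        output_prompt("fourth prompt information: no qualifying action detected")
```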
  • Optionally, the second augmented reality device is connected to a second wearable glove worn by the second user, and before the sending of the response information corresponding to the target information to the first augmented reality device, the method further includes:
  • receiving first information sent by the first augmented reality device, and controlling the second wearable glove to output a tactile sensation according to the first information.
  • Optionally, the third prompt information can be output according to the gesture parameters; specifically, a sound simulating the first action can be output according to the type of the first action, so that the second user obtains a more realistic effect.
  • For example, if the first action is clapping hands, the second augmented reality device can output a stereo clapping sound, and the color of the account identifier can change with the sound; if the first action is a handshake, the second augmented reality device can use the smart glove to output a corresponding grip strength to the second user.
  • In this way, the second augmented reality device can restore the effect of the first action more realistically, thereby improving the interaction effect between the first user and the second user.
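  • Rendering the first action on the second device, as in the clapping and handshake examples above, might be sketched as follows; the sound and glove callbacks are illustrative assumptions:

```python
def render_first_action(action_kind: str, force: float,
                        play_sound, glove_grip) -> None:
    """A clap becomes a sound whose volume tracks the action's force; a
    handshake becomes a grip output through the connected smart glove."""
    if action_kind == "clap":
        play_sound("clap", volume=min(1.0, force))  # volume follows the action force
    elif action_kind == "handshake":
        glove_grip(strength=force)                  # corresponding grip via the glove
    else:
        play_sound("generic", volume=0.3)
```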
  • Optionally, when the target information includes first target information, the third prompt information is output in a first manner; when the target information includes second target information, the third prompt information is output in a second manner.
  • That is, the second augmented reality device may output the prompt information in different ways according to the type of information included in the target information.
  • The first target information and the second target information may be information for indicating the output mode of the prompt information, and the first manner and the second manner may be voice output, display-screen output, or vibration output, respectively. This can be understood in conjunction with the description on the first augmented reality device side, and will not be repeated here.
  • In this way, the prompt information is output in different ways, which is convenient for the user to obtain the information of the first action according to the prompt information.
  • Optionally, the method further includes: displaying a first target image representing the first action in the projection area of the second augmented reality device.
  • Optionally, the method further includes:
  • displaying a third target image in the projection area of the second augmented reality device, where the action represented by the third target image includes the first action and the second action.
  • The first target image may be the same as or different from the first image in the foregoing embodiment,
  • and the third target image may be the same as or different from the third image in the foregoing embodiment.
  • For example, the first image is a gesture of reaching toward the other party (FIG. 3),
  • and the first target image is a gesture of reaching toward oneself (FIG. 4).
  • the second augmented reality device may display the third target image in the projection area to indicate that the interaction is successful.
  • participating in user interaction through image presentation can improve the interaction effect, and can prompt the user to improve the interaction efficiency.
  • In this embodiment, the second augmented reality device can produce corresponding prompt content according to the responsive action made by the second user, to inform the user that the interaction is successful. It can simulate real interactive scenes more intuitively and improve the interaction effect.
  • It should be noted that the execution subject of the interaction method provided by the embodiments of the present application may be the interaction device corresponding to the second augmented reality device, or the control module in that interaction device for executing the interaction method.
  • In the embodiments of the present application, an interaction device executing the interaction method is taken as an example to illustrate the interaction method provided herein.
  • FIG. 8 is a structural diagram of an interaction device provided by an embodiment of the present application, which can be applied to a first augmented reality device.
  • the interaction device 800 includes:
  • the detection module 801 is configured to detect the line-of-sight direction of the first user and the first action of the first user;
  • the first sending module 802 is configured to send target information corresponding to the first action to the second augmented reality device when the line of sight of the first user is projected on the area where the target identifier is located and the first action of the first user meets the first preset condition;
  • wherein the target identifier is an identifier in the projection area of the first augmented reality device,
  • and the second augmented reality device is an electronic device associated with the target identifier.
  • the interaction device further includes:
  • the first output module is configured to output first prompt information when the response information sent by the second augmented reality device is received within the first preset time;
  • the second output module is configured to output second prompt information when the response information sent by the second augmented reality device is not received within the first preset time.
  • the interaction device further includes:
  • the second sending module is configured to send first information to the second augmented reality device,
  • where the first information is used to control the second wearable glove to output a tactile sensation.
  • the first sending module includes:
  • a sending submodule, configured to send first target information to the second augmented reality device when the action parameter satisfies the first parameter condition, where the first target information is used to instruct the second augmented reality device to output prompt information in the first manner; and to send second target information when the action parameter satisfies the second parameter condition, where the second target information instructs output in the second manner;
  • the interaction device further includes:
  • a first display module, configured to display a first image representing the first action in the projection area;
  • a second display module, configured to, when response information to the target information sent by the second augmented reality device is received, where the response information is used to indicate a second action, display a third image in
  • the projection area according to the response information, where the action represented by the third image includes the first action and the second action.
  • Optionally, the first action is a gesture action,
  • the first augmented reality device is connected to a first wearable glove worn by the first user,
  • and the interaction device further includes:
  • a first control module, configured to control the first wearable glove to output the tactile sensation corresponding to the tactile information when the response information to the target information sent by the second augmented reality device includes tactile information;
  • a second control module, configured to detect the distance between the first finger cuff and the second finger cuff of the first wearable glove when the first action is an action of picking up a target item, and, when the distance is less than a preset distance, control the first finger cuff to output a first contact force and the second finger cuff to output a second contact force, wherein the preset distance is determined according to the size of the target item.
  • the interaction apparatus 800 can implement the interaction method on the first augmented reality device side in the foregoing method embodiment and achieve the same beneficial effects. To avoid repetition, details are not described herein again.
  • FIG. 9 is a structural diagram of an interaction device provided by an embodiment of the present application.
  • the interaction device 900 can be applied to a second augmented reality device.
  • the interaction device 900 includes:
  • the first receiving module 901 is configured to receive target information sent by a first augmented reality device, where the target information corresponds to a first action detected by the first augmented reality device;
  • the third output module 902 is configured to, when it is detected within the third preset time that the second action of the second user meets the second preset condition, output third prompt information and send response information corresponding to the target information to the first augmented reality device;
  • the fourth output module 903 is configured to output fourth prompt information when the second user's second action that satisfies the second preset condition is not detected within the third preset time.
  • the second augmented reality device is connected to a second wearable glove worn by the second user, and the interaction device further includes:
  • the second receiving module is configured to receive the first information sent by the first augmented reality device;
  • the third control module is configured to control the second wearable glove to output a tactile sensation according to the first information.
  • the interaction device further includes:
  • a fifth output module configured to output the third prompt information in a first manner when the target information includes the first target information
  • the sixth output module is configured to output the third prompt information in a second manner when the target information includes the second target information.
  • the interaction device further includes:
  • a third display module configured to display a first target image used to represent the first action in the projection area of the second augmented reality device
  • the interaction device further includes:
  • the fourth display module is configured to display a third target image in the projection area of the second augmented reality device, and the action represented by the third target image includes the first action and the second action.
  • the interaction device 900 can implement the interaction method on the second augmented reality device side in the foregoing method embodiment and achieve the same beneficial effects. To avoid repetition, details are not described herein again.
  • the interaction device in the embodiments of the present application may be a device, or a component, integrated circuit, or chip in a terminal.
  • the device may be an electronic device.
  • The electronic device may be a mobile phone, a tablet computer, a notebook computer, a handheld computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), etc.; this is not specifically limited in the embodiments of this application.
  • the interaction device in the embodiment of the present application may be a device with an operating system.
  • The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
  • The interaction device provided by the embodiments of the present application can implement each process implemented by the interaction device in the method embodiments of FIG. 1 to FIG. 7; to avoid repetition, details are not repeated here.
  • In the embodiments of the present application, the first augmented reality device can detect the user's actions, so as to realize the interaction between the augmented reality device and the user; it can participate in the interaction between users and improve the interaction effect.
  • An embodiment of the present application further provides an augmented reality device, including a processor, a memory, and a program or instruction stored in the memory and executable on the processor. When the program or instruction is executed by the processor, the processes of the foregoing interaction method embodiments are realized, and the same technical effects can be achieved; to avoid repetition, details are not repeated here.
  • FIG. 10 is a schematic diagram of the hardware structure of an augmented reality device implementing an embodiment of the present application.
  • The augmented reality device 1000 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, and a processor 1010, among other components.
  • the augmented reality device 1000 may also include a power source (such as a battery) for supplying power to various components.
  • The power source may be logically connected to the processor 1010 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system.
  • The structure of the augmented reality device shown in FIG. 10 does not constitute a limitation on the augmented reality device.
  • The augmented reality device may include more or fewer components than shown in the figure, combine some components, or arrange the components differently, which will not be detailed here.
  • Among them, the processor 1010 is configured to detect the line-of-sight direction of the first user and the first action of the first user, and, in a case where the first user's line of sight is projected on the area where the target identifier is located and the first user's first action satisfies a first preset condition, send target information corresponding to the first action to the second augmented reality device;
  • the target identifier is an identifier in the projection area of the first augmented reality device
  • the second augmented reality device is an electronic device associated with the target identifier
  • In this way, the first augmented reality device can detect the user's actions, thereby realizing the interaction between the augmented reality device and the user; it can participate in the interaction between users and improve the interaction effect.
  • Optionally, the target identifier is a second account identifier,
  • and the processor 1010 is further configured to display a prompt identifier in the area where the second account identifier is located.
  • Optionally, the first action is a first gesture action,
  • the target identifier is a second account identifier,
  • and the processor 1010 is further configured to:
  • send first information to the second augmented reality device,
  • where the first information is used to control the second wearable glove to output a tactile sensation.
  • The processor 1010 executing the sending of target information corresponding to the first action to the second augmented reality device includes: sending first target information to the second augmented reality device when the action parameter of the first action satisfies the first parameter condition, and sending second target information when the action parameter satisfies the second parameter condition.
  • The processor 1010 is further configured to:
  • control the display unit 1006 to display the first image representing the first action in the projection area; and, after receiving the response information to the target information sent by the second augmented reality device, where the response information is used to indicate a second action, display a third image in the projection area according to the response information, where the action represented by the third image includes the first action and the second action.
  • Optionally, the first action is a gesture action,
  • the first augmented reality device is connected to a first wearable glove worn by the first user,
  • and the processor 1010 is further configured to:
  • when the response information to the target information sent by the second augmented reality device includes tactile information, control the first wearable glove to output the tactile sensation corresponding to the tactile information;
  • and, when the first action is an action of taking a target item, detect the distance between the first finger cuff and the second finger cuff of the first wearable glove, and, when the distance is less than a preset distance,
  • control the first finger cuff to output a first contact force
  • and the second finger cuff to output a second contact force, wherein the preset distance is determined according to the size of the target item.
  • In the case of the second augmented reality device, the processor 1010 is configured to receive target information sent by the first augmented reality device, where the target information corresponds to the first action detected by the first augmented reality device;
  • Optionally, the second augmented reality device is connected to a second wearable glove worn by the second user, and the processor 1010 is further configured to:
  • receive the first information sent by the first augmented reality device, and control the second wearable glove to output a tactile sensation according to the first information.
  • The processor 1010 is further configured to:
  • output the third prompt information in a first manner when the target information includes the first target information,
  • and output the third prompt information in a second manner when the target information includes the second target information.
  • processor 1010 is further configured to:
  • a third target image is displayed in the projection area of the second augmented reality device, and the action represented by the third target image includes the first action and the second action.
  • the second user can interact with the first user through the second augmented reality device, which can improve the interaction effect.
  • the second augmented reality device can make a corresponding prompt according to the received target information, which is convenient for the user to quickly obtain the content of the first action.
  • An embodiment of the present application further provides a readable storage medium. The readable storage medium stores a program or instruction, and when the program or instruction is executed by a processor, each process of the foregoing interaction method embodiments on the first augmented reality device side and the second augmented reality device side is realized, and the same technical effects can be achieved. To avoid repetition, details are not repeated here.
  • the processor is the processor in the augmented reality device described in the foregoing embodiment.
  • The readable storage medium includes a computer-readable storage medium, such as a computer read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
  • An embodiment of the present application further provides a chip.
  • The chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or instruction to implement each process of the foregoing interaction method embodiments on the first augmented reality device side
  • and the second augmented reality device side, with the same technical effects. To avoid repetition, details are not repeated here.
  • An embodiment of the present application further provides an interaction device configured to execute the processes of the foregoing interaction method embodiments on the first augmented reality device side and the second augmented reality device side, with the same technical effects; to avoid repetition, details are not repeated here.
  • It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip.
  • The technical solution of this application, in essence or in the part contributing to the existing technology, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions to make a terminal (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) execute the methods described in the various embodiments of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed are an interaction method and an augmented reality device, which belong to the field of communication technology and solve the problem of relatively poor interactivity of an AR device. An interaction method for the first augmented reality device side comprises: detecting a line-of-sight direction of a first user and a first action of the first user (101); when the line of sight of the first user is projected on a region in which a target identifier is located and the first action of the first user satisfies a first preset condition, sending target information corresponding to the first action to a second augmented reality device (102); the target identifier being an identifier in a projection region of the first augmented reality device, and the second augmented reality device being an electronic device associated with the target identifier.
PCT/CN2021/091864 2020-05-09 2021-05-06 Procédé d'interaction et dispositif de réalité augmentée WO2021227918A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010389104.1 2020-05-09
CN202010389104.1A CN111580661A (zh) 2020-05-09 2020-05-09 交互方法和增强现实设备

Publications (1)

Publication Number Publication Date
WO2021227918A1 true WO2021227918A1 (fr) 2021-11-18

Family

ID=72122887

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/091864 WO2021227918A1 (fr) 2020-05-09 2021-05-06 Procédé d'interaction et dispositif de réalité augmentée

Country Status (2)

Country Link
CN (1) CN111580661A (fr)
WO (1) WO2021227918A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114338897A (zh) * 2021-12-16 2022-04-12 杭州逗酷软件科技有限公司 对象的分享方法、装置、电子设备以及存储介质
CN114385004A (zh) * 2021-12-15 2022-04-22 北京五八信息技术有限公司 基于增强现实的交互方法、装置、电子设备及可读介质
CN114578966A (zh) * 2022-03-07 2022-06-03 北京百度网讯科技有限公司 交互方法、装置、头戴显示设备、电子设备及介质
CN115550886A (zh) * 2022-11-29 2022-12-30 蔚来汽车科技(安徽)有限公司 一种车载扩展现实设备控制方法、系统及车载交互系统

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111580661A (zh) * 2020-05-09 2020-08-25 维沃移动通信有限公司 交互方法和增强现实设备
US11516618B2 (en) 2020-10-09 2022-11-29 Lemon Inc. Social networking using augmented reality
CN114489331A (zh) * 2021-12-31 2022-05-13 上海米学人工智能信息科技有限公司 区别于按钮点击的隔空手势交互方法、装置、设备和介质
CN115191788B (zh) * 2022-07-14 2023-06-23 慕思健康睡眠股份有限公司 一种基于智能床垫的体感互动方法及相关产品

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108829247A (zh) * 2018-06-01 2018-11-16 北京市商汤科技开发有限公司 基于视线跟踪的交互方法及装置、计算机设备
CN108933723A (zh) * 2017-05-19 2018-12-04 腾讯科技(深圳)有限公司 消息展示方法、装置及终端
CN109937394A (zh) * 2016-10-04 2019-06-25 脸谱公司 用于虚拟空间中的用户交互的控件和界面
CN110298925A (zh) * 2019-07-04 2019-10-01 珠海金山网络游戏科技有限公司 一种增强现实图像处理方法、装置、计算设备及存储介质
CN110413109A (zh) * 2019-06-28 2019-11-05 广东虚拟现实科技有限公司 虚拟内容的生成方法、装置、系统、电子设备及存储介质
CN111580661A (zh) * 2020-05-09 2020-08-25 维沃移动通信有限公司 交互方法和增强现实设备

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10445935B2 (en) * 2017-05-26 2019-10-15 Microsoft Technology Licensing, Llc Using tracking to simulate direct tablet interaction in mixed reality
CN110716647A (zh) * 2019-10-17 2020-01-21 广州大西洲科技有限公司 一种增强现实交互方法、装置及系统

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109937394A (zh) * 2016-10-04 2019-06-25 脸谱公司 用于虚拟空间中的用户交互的控件和界面
CN108933723A (zh) * 2017-05-19 2018-12-04 腾讯科技(深圳)有限公司 消息展示方法、装置及终端
CN108829247A (zh) * 2018-06-01 2018-11-16 北京市商汤科技开发有限公司 基于视线跟踪的交互方法及装置、计算机设备
CN110413109A (zh) * 2019-06-28 2019-11-05 广东虚拟现实科技有限公司 虚拟内容的生成方法、装置、系统、电子设备及存储介质
CN110298925A (zh) * 2019-07-04 2019-10-01 珠海金山网络游戏科技有限公司 一种增强现实图像处理方法、装置、计算设备及存储介质
CN111580661A (zh) * 2020-05-09 2020-08-25 维沃移动通信有限公司 交互方法和增强现实设备

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114385004A (zh) * 2021-12-15 2022-04-22 北京五八信息技术有限公司 基于增强现实的交互方法、装置、电子设备及可读介质
CN114338897A (zh) * 2021-12-16 2022-04-12 杭州逗酷软件科技有限公司 对象的分享方法、装置、电子设备以及存储介质
CN114338897B (zh) * 2021-12-16 2024-01-16 杭州逗酷软件科技有限公司 对象的分享方法、装置、电子设备以及存储介质
CN114578966A (zh) * 2022-03-07 2022-06-03 北京百度网讯科技有限公司 交互方法、装置、头戴显示设备、电子设备及介质
CN114578966B (zh) * 2022-03-07 2024-02-06 北京百度网讯科技有限公司 交互方法、装置、头戴显示设备、电子设备及介质
CN115550886A (zh) * 2022-11-29 2022-12-30 蔚来汽车科技(安徽)有限公司 一种车载扩展现实设备控制方法、系统及车载交互系统
CN115550886B (zh) * 2022-11-29 2023-03-28 蔚来汽车科技(安徽)有限公司 一种车载扩展现实设备控制方法、系统及车载交互系统

Also Published As

Publication number Publication date
CN111580661A (zh) 2020-08-25

Similar Documents

Publication Publication Date Title
WO2021227918A1 (fr) Procédé d'interaction et dispositif de réalité augmentée
WO2015188614A1 (fr) Procédé et dispositif de mise en œuvre d'ordinateur et de téléphone mobile dans un monde virtuel, et lunettes les utilisant
US11703941B2 (en) Information processing system, information processing method, and program
CN104813642A (zh) 用于触发手势辨识模式以及经由非触摸手势的装置配对和共享的方法、设备和计算机可读媒体
WO2020078319A1 (fr) Procédé de manipulation basé sur le geste et dispositif de terminal
US10701316B1 (en) Gesture-triggered overlay elements for video conferencing
CN105338238B (zh) 一种拍照方法及电子设备
CN111970456B (zh) 拍摄控制方法、装置、设备及存储介质
WO2021227916A1 (fr) Procédé et appareil de génération d'image faciale, dispositif électronique et support de stockage lisible
WO2022188305A1 (fr) Procédé et appareil de présentation d'informations, dispositif électronique, support de stockage et programme informatique
CN108616712A (zh) 一种基于摄像头的界面操作方法、装置、设备及存储介质
CN112199016A (zh) 图像处理方法、装置、电子设备及计算机可读存储介质
TW200917805A (en) A method for interaction real information between mobile devices
CN107888965A (zh) 图像礼物展示方法及装置、终端、系统、存储介质
CN111601064A (zh) 信息交互方法和信息交互装置
US20220291752A1 (en) Distributed Application Platform Projected on a Secondary Display for Entertainment, Gaming and Learning with Intelligent Gesture Interactions and Complex Input Composition for Control
CN112287767A (zh) 交互控制方法、装置、存储介质以及电子设备
WO2023273372A1 (fr) Procédé et appareil de détermination d'objet de reconnaissance de geste
CN108537149B (zh) 图像处理方法、装置、存储介质及电子设备
Sarkar et al. Augmented reality-based virtual smartphone
CN114327197B (zh) 消息发送方法、装置、设备及介质
WO2022151687A1 (fr) Procédé et appareil de génération d'image photographique de groupe, dispositif, support de stockage, programme informatique et produit
JP2014194675A (ja) プログラム、通信装置
CN107968742B (zh) 影像显示方法、装置及计算机可读存储介质
WO2024041270A1 (fr) Procédé et appareil d'interaction dans une scène virtuelle, dispositif et support de stockage

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21804647

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21804647

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A 04.05.2023)