CN115981544A - Interaction method and device based on augmented reality, electronic equipment and storage medium - Google Patents

Interaction method and device based on augmented reality, electronic equipment and storage medium

Info

Publication number
CN115981544A
CN115981544A (application CN202310165805.0A)
Authority
CN
China
Prior art keywords
virtual
key
preset
user
virtual keyboard
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310165805.0A
Other languages
Chinese (zh)
Inventor
路晓创
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd filed Critical Beijing Zitiao Network Technology Co Ltd
Priority to CN202310165805.0A
Publication of CN115981544A
Legal status: Pending

Abstract

An augmented-reality-based interaction method and apparatus, an electronic device, and a storage medium are provided. A virtual reality space is displayed; in response to a first preset operation by a user, a virtual keyboard object is displayed in the virtual reality space, the virtual keyboard object comprising a plurality of first virtual keys arranged around a preset blank area; and in response to a second preset operation by the user, the selected first virtual key is determined. Because the virtual keyboard object includes a plurality of first virtual keys arranged around the preset blank area, the first virtual keys lie in every direction of the preset blank area, so that interaction between the user and the virtual keyboard object can be realized in a manner suited to augmented reality, improving information input efficiency in augmented reality scenes.

Description

Interaction method and device based on augmented reality, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to an interaction method and apparatus, an electronic device, and a storage medium based on augmented reality.
Background
Extended reality (XR) technology uses a computer to combine the real and the virtual, providing the user with a virtual reality space for human-computer interaction. In the virtual reality space, a user may engage in social interaction, entertainment, learning, work, telecommuting, creation of UGC (User Generated Content), and the like through a virtual reality device such as a Head Mounted Display (HMD). However, the virtual keyboards provided by related augmented reality applications adopt a conventional keyboard layout, which does not suit the interaction habits of augmented reality and reduces information input efficiency in augmented reality scenes.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
In a first aspect, according to one or more embodiments of the present disclosure, there is provided an augmented reality-based interaction method, including:
displaying a virtual reality space;
responding to a first preset operation of a user, displaying a virtual keyboard object in the virtual reality space, wherein the virtual keyboard object comprises a plurality of first virtual keys arranged around a preset blank area;
and responding to a second preset operation of the user, and determining the selected first virtual key.
In a second aspect, according to one or more embodiments of the present disclosure, there is provided an augmented reality-based interaction apparatus, including:
a space display unit for displaying a virtual reality space;
the keyboard display unit is used for responding to a first preset operation of a user and displaying a virtual keyboard object in the virtual reality space, wherein the virtual keyboard object comprises a plurality of first virtual keys which are arranged around a preset blank area;
and the key determining unit is used for responding to a second preset operation of the user and determining the selected first virtual key.
In a third aspect, according to one or more embodiments of the present disclosure, there is provided an electronic device, including: at least one memory and at least one processor; wherein the memory is configured to store program code, and the processor is configured to invoke the program code stored by the memory to cause the electronic device to perform an augmented reality based interaction method provided in accordance with one or more embodiments of the present disclosure.
In a fourth aspect, according to one or more embodiments of the present disclosure, there is provided a non-transitory computer storage medium storing program code which, when executed by a computer device, causes the computer device to perform an augmented reality based interaction method provided according to one or more embodiments of the present disclosure.
According to one or more embodiments of the present disclosure, the virtual keyboard object includes a plurality of first virtual keys arranged around the preset blank area, and the first virtual keys are located in each direction of the preset blank area, so that an interaction manner suitable for augmented reality can be adopted to realize interaction between a user and the virtual keyboard object, and information input efficiency in an augmented reality scene is improved.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale.
Fig. 1 is a flowchart of an interaction method based on augmented reality provided according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of an augmented reality device according to an embodiment of the present disclosure;
FIG. 3 is an alternative schematic view of a virtual field of view of an augmented reality device provided in accordance with an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a virtual keyboard object provided in accordance with an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a first virtual keyboard object and a second virtual keyboard object provided in accordance with another embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an interaction apparatus based on augmented reality according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and the embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the steps recited in the embodiments of the present disclosure may be performed in a different order and/or performed in parallel. Moreover, embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" means "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". The term "in response to" and related terms mean that one signal or event is affected to some extent, but not necessarily completely or directly, by another signal or event. If an event x occurs "in response to" an event y, x may respond directly or indirectly to y. For example, the occurrence of y may ultimately result in the occurrence of x, but other intermediate events and/or conditions may exist. In other cases, y may not necessarily result in the occurrence of x: x may occur even if y has not occurred. Furthermore, the term "in response to" may also mean "at least partially in response to".
The term "determining" broadly encompasses a wide variety of actions, which may include obtaining, calculating, computing, processing, deriving, investigating, looking up (e.g., looking up in a table, a database or another data structure), and ascertaining; receiving (e.g., receiving information) and accessing (e.g., accessing data in a memory); and resolving, selecting, choosing, establishing, and the like. Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence of the functions performed by the devices, modules or units.
It is noted that references to "a", "an", and "the" modifications in this disclosure are intended to be illustrative rather than limiting, and that those skilled in the art will recognize that "one or more" may be used unless the context clearly dictates otherwise.
For the purposes of this disclosure, the phrase "a and/or B" means (a), (B), or (a and B).
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The extended-reality-based interaction method provided by one or more embodiments of the present disclosure employs extended reality (XR) technology, which uses a computer to combine the real and the virtual and provides the user with a virtual reality space for human-computer interaction. In the virtual reality space, a user may engage in social interaction, entertainment, learning, work, telecommuting, creation of UGC (User Generated Content), and the like through an augmented reality device such as a Head Mounted Display (HMD).
Referring to fig. 2, a user may enter a virtual reality space through an augmented reality device, such as a head-mounted VR glasses, and control his or her Avatar (Avatar) to perform social interaction, entertainment, learning, telecommuting, etc. with avatars controlled by other users in the virtual reality space.
In the virtual reality space, the user can perform interactive operations through a controller, which may be, for example, a handle: the user performs operation control through the keys of the handle. Of course, in other embodiments, the target object in the augmented reality device may instead be controlled by gestures, voice, or a multi-modal control manner rather than a controller.
The augmented reality device described in the embodiments of the present disclosure may include, but is not limited to, the following types:
the computer-side augmented reality (PCVR) equipment utilizes a PC (personal computer) side to perform related calculation and data output of an augmented reality function, and the external computer-side augmented reality equipment utilizes data output by the PC side to achieve an augmented reality effect.
The mobile augmented reality device supports a mobile terminal (such as a smart phone) arranged in various modes (such as a head-mounted display provided with a special card slot), performs related calculation of augmented reality functions through connection with the mobile terminal in a wired or wireless mode, and outputs data to the mobile augmented reality device, for example, an APP of the mobile terminal is used for watching an augmented reality video.
The all-in-one machine augmented reality equipment is provided with a processor for performing relevant calculation of virtual functions, so that the all-in-one machine augmented reality equipment has independent augmented reality input and output functions, does not need to be connected with a PC (personal computer) end or a mobile terminal, and is high in use freedom.
Of course, the implementation form of the augmented reality device is not limited to this: it may be further miniaturized or enlarged as needed.
The extended reality device is provided with a sensor for detecting its posture (for example, a nine-axis sensor), which detects posture changes of the device in real time. If the device is worn by a user, when the posture of the user's head changes, the real-time posture of the head is transmitted to the processor, which calculates the gaze point of the user's line of sight in the virtual environment. Based on the gaze point, the image within the user's gaze range (namely, the virtual field of view) in the three-dimensional model of the virtual environment is computed and displayed on the display screen, producing an immersive experience as if the user were in the real environment.
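By way of illustration only (this sketch is not part of the disclosure; the function name, axis conventions, and the yaw/pitch simplification of the nine-axis sensor data are all assumptions), the head-pose-to-gaze-direction step described above could look like:

```python
import math

def gaze_direction(yaw_deg, pitch_deg):
    """Convert head yaw/pitch (degrees) into a unit gaze vector.

    Illustrative convention: yaw 0 / pitch 0 looks straight ahead
    along +z; positive yaw turns right, positive pitch looks up.
    """
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.sin(yaw),   # x component
            math.sin(pitch),                   # y component
            math.cos(pitch) * math.cos(yaw))   # z component
```

The processor would intersect this direction with the virtual environment's three-dimensional model to obtain the gaze point.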
Fig. 3 illustrates an alternative schematic diagram of the virtual field of view of an augmented reality device provided by an embodiment of the present disclosure, in which a horizontal field angle and a vertical field angle describe the distribution range of the virtual field of view in the virtual environment: the vertical field angle BOC represents the distribution range in the vertical direction, and the horizontal field angle AOB represents the distribution range in the horizontal direction. Through the lenses, the human eye can always perceive the portion of the virtual environment that lies within the virtual field of view. It can be understood that the larger the field angles, the larger the virtual field of view, and the larger the area of the virtual environment the user can perceive. The field angle represents the angular range over which the lens senses the environment. For example, the field angle of the augmented reality device is the angular range the human eye has when perceiving the virtual environment through the device's lenses; as another example, for a mobile terminal equipped with a camera, the field angle of the camera is the angular range over which the camera senses the real environment when shooting.
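As a minimal illustrative sketch (not part of the disclosure; the function name and camera-space convention are assumptions), the horizontal/vertical field-angle test described for Fig. 3 can be expressed as a point-visibility check:

```python
import math

def in_virtual_fov(dx, dy, dz, h_fov_deg=90.0, v_fov_deg=90.0):
    """Return whether a direction (dx, dy, dz) from the eye lies inside
    the virtual field of view; the viewer looks along +z, the horizontal
    field angle (AOB) spans the x-z plane and the vertical field angle
    (BOC) spans the y-z plane."""
    if dz <= 0:                       # behind the viewer
        return False
    h_angle = math.degrees(math.atan2(abs(dx), dz))
    v_angle = math.degrees(math.atan2(abs(dy), dz))
    return h_angle <= h_fov_deg / 2 and v_angle <= v_fov_deg / 2
```

Enlarging `h_fov_deg`/`v_fov_deg` admits more of the virtual environment, matching the observation that a larger field angle yields a larger perceivable area.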
An augmented reality device such as an HMD integrates several cameras (e.g., a depth camera, an RGB camera, etc.), whose purpose is not limited to providing a pass-through view. The camera images, together with an integrated Inertial Measurement Unit (IMU), provide data that can be processed by computer vision methods to automatically analyze and understand the environment. Moreover, HMDs are designed to support not only passive but also active computer vision analysis. Passive computer vision methods analyze image information captured from the environment; these methods may be monoscopic (images from a single camera) or stereoscopic (images from two cameras), and include, but are not limited to, feature tracking, object recognition, and depth estimation. Active computer vision methods add information to the environment by projecting a pattern that is visible to a camera but not necessarily to the human visual system; such techniques include time-of-flight (ToF) cameras, laser scanning, and structured light, which simplify the stereo matching problem. Active computer vision is used to enable scene depth reconstruction.
Referring to fig. 1, fig. 1 shows a flowchart of an augmented reality-based interaction method 100 provided by an embodiment of the present disclosure, where the method 100 includes steps S120 to S160.
Step S120: a virtual reality space is displayed.
Step S140: in response to a first preset operation of a user, displaying a virtual keyboard object in the virtual reality space, wherein the virtual keyboard object comprises a plurality of first virtual keys arranged around a preset blank area.
Step S160: and responding to a second preset operation of the user, and determining the selected first virtual key.
The virtual reality space may be a simulation environment of a real world, a semi-simulation semi-fictional virtual scene, or a pure fictional virtual scene. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene, and the dimension of the virtual scene is not limited in the embodiments of the present application. For example, a virtual scene may include sky, land, ocean, etc., the land may include environmental elements such as deserts, cities, etc., and a user may control a virtual object to move in the virtual scene.
The first preset operation and the second preset operation include, but are not limited to, a somatosensory control operation, a gesture control operation, an eyeball shaking operation, a touch control operation, a voice control instruction, or an operation on an external control device. For example, a user may evoke a virtual keyboard object in virtual reality space by triggering a preset key on a controller of the augmented reality (e.g., a handle of a VR device), and select a first virtual key from the virtual keyboard object, such as by a handle ray.
In some embodiments, the first virtual keys may be arranged in a ring, with the preset blank area at the center of the ring. Referring to fig. 4, in the virtual reality space, a substantially annular virtual keyboard object 20 includes 8 first virtual keys 30, each corresponding to 3-4 letters.
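For illustration only (not part of the disclosure; the function name, radius, and the choice of placing key 0 at the top with indices increasing clockwise are assumptions), the ring layout around the preset blank area can be sketched as:

```python
import math

def ring_key_positions(num_keys=8, radius=1.0):
    """Positions of first virtual keys on a ring centered on the preset
    blank area (the origin); key 0 at 12 o'clock, proceeding clockwise."""
    positions = []
    for i in range(num_keys):
        angle = math.pi / 2 - 2 * math.pi * i / num_keys
        positions.append((round(radius * math.cos(angle), 6),
                          round(radius * math.sin(angle), 6)))
    return positions
```

With 8 keys, consecutive keys sit 45 degrees apart, so every key lies in a distinct direction from the blank center.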
An input box 40 is also displayed in the virtual reality space. The input box 40 displays the characters input by the user by triggering the first virtual keys, together with a plurality of candidate entries determined from those characters.
In some embodiments, the second preset operation includes an operation for controlling a virtual ray to select a first virtual key, and the method 100 further comprises: after the virtual keyboard object is displayed, setting the initial direction of the virtual ray to face the preset blank area. Illustratively, referring to FIG. 4, the virtual ray may initially point at the center of the ring. In this embodiment, because the first virtual keys are arranged around the preset blank area, they lie in every direction of that area; by setting the initial direction of the virtual ray to face the preset blank area, the user can select a first virtual key in any direction by moving the virtual ray around the preset blank area, which minimizes the distance between the ray's initial position and the key to be selected and thereby improves information input efficiency in the augmented reality scene.
In some embodiments, the second preset operation includes an operation of changing the pose of the controller, and step S160 includes: in response to the operation of changing the pose of the controller, determining the motion direction of the controller; and determining the selected first virtual key based on that motion direction. In this embodiment, because the first virtual keys are arranged around the preset blank area and thus lie in every direction of it, the user can select the first virtual key in a given direction simply by driving (for example, shaking, turning, or moving) the controller in that direction, without having to move a virtual ray (for example, a handle ray) to the position of the key, thereby improving information input efficiency in the extended reality scene.
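By way of illustration only (a sketch, not the claimed implementation; the function name and the convention of key 0 at the top with indices increasing clockwise are assumptions), mapping the controller's motion direction to a key on an 8-key ring amounts to assigning the direction angle to a 45-degree sector:

```python
import math

def key_from_direction(dx, dy, num_keys=8):
    """Map a 2-D controller motion direction (in the keyboard plane)
    to the index of the first virtual key lying in that direction;
    index 0 is straight up, indices increase clockwise."""
    angle = math.degrees(math.atan2(dx, dy))  # 0 deg = up, 90 deg = right
    if angle < 0:
        angle += 360.0
    sector = 360.0 / num_keys                 # 45 deg for 8 keys
    return int((angle + sector / 2) // sector) % num_keys
```

Each key thus owns the sector of directions centered on it, so any controller motion resolves to exactly one key.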
In some embodiments, step S160 includes: in response to the second preset operation, rotating the virtual keyboard object, or a key indication object used for indicating the currently selected key, and after the rotation stops, taking the first virtual key indicated by the key indication object as the selected first virtual key. In this embodiment, because the first virtual keys are arranged around the preset blank area and lie in every direction of it, the annular virtual keyboard or the key indication object can be rotated about the preset blank area as an axis, so that a first virtual key is selected through a rotary interaction, improving information input efficiency in the augmented reality scene.
According to one or more embodiments of the present disclosure, the virtual keyboard object includes a plurality of first virtual keys arranged around the preset blank area, and the first virtual keys are located in each direction of the preset blank area, so as to adapt to the space-based interaction characteristics of the augmented reality, and further, an interaction mode suitable for the augmented reality can be adopted to realize interaction between a user and the virtual keyboard object, thereby improving information input efficiency in an augmented reality scene.
In some embodiments, the characters corresponding to each first virtual key are the same as the characters corresponding to each key in the nine-grid (Jiugongge, T9-style) input method, and the number of first virtual keys is 8. In this embodiment, the mapping between the first virtual keys and characters inherits the nine-grid input method, which reduces the user's learning cost and improves information input efficiency in the extended reality scene.
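For illustration only (the concrete letter grouping below is the conventional T9-style assignment and is an assumption, as are the names `NINE_GRID` and `keys_for_word`; the disclosure only states that the key-to-character mapping follows the nine-grid input method), the 8-key mapping might be:

```python
# Conventional T9-style grouping: 8 keys, each carrying 3-4 letters.
NINE_GRID = ["abc", "def", "ghi", "jkl", "mno", "pqrs", "tuv", "wxyz"]

def keys_for_word(word):
    """Return the sequence of key indices a user would trigger to
    spell `word` under the illustrative grouping above."""
    lookup = {ch: i for i, group in enumerate(NINE_GRID) for ch in group}
    return [lookup[ch] for ch in word.lower()]
```

A candidate-entry engine would then match such key sequences against a dictionary to populate the input box's candidate list.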
In some embodiments, the virtual keyboard objects include a first virtual keyboard object and a second virtual keyboard object arranged in parallel; the first virtual keyboard object comprises a plurality of first virtual keys which are arranged around a first preset blank area, and the second virtual keyboard object comprises a plurality of first virtual keys which are arranged around a second preset blank area; the first virtual keyboard object is for control by a first handheld controller and the second virtual keyboard object is for control by a second handheld controller.
In this embodiment, by providing two virtual keyboard objects suited to two-handed operation, the user can control a different virtual keyboard object with each handle, so that the information input interaction better fits the augmented reality scene and information input efficiency in the augmented reality scene is improved.
Illustratively, referring to fig. 5, in the virtual reality space, the substantially annular first virtual keyboard object 21 includes 4 first virtual keys 31 arranged around its preset blank area and located respectively above, below, to the left of, and to the right of that area. Similarly, the substantially annular second virtual keyboard object 22 includes 4 first virtual keys 32 arranged around its preset blank area and located respectively above, below, to the left of, and to the right of that area. An input method switching interface 50 is also displayed in the virtual reality space, through which the user can select an input method.
In this example, the virtual keys in each virtual keyboard object are located at the upper, lower, left, and right sides of the corresponding preset blank area, so that when the second preset operation includes an operation of changing the pose of the controller, the user can select the only first virtual key located in the corresponding direction by driving (for example, shaking, flipping, moving) the controller upward, downward, leftward, or rightward, thereby further improving the efficiency and accuracy of selecting the virtual key, greatly reducing the situation of misoperation caused by inaccurate driving direction, and further improving the information input efficiency in the augmented reality scene.
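As an illustrative sketch of the four-direction selection just described (not part of the disclosure; the function name, the string labels, and the dead-zone threshold are assumptions), a per-hand selector might read:

```python
def key_from_axis(dx, dy, threshold=0.5):
    """Pick among 4 keys (up/down/left/right of the preset blank area)
    from a controller deflection; return None inside the dead zone so
    that small, unintended movements cause no misoperation."""
    if max(abs(dx), abs(dy)) < threshold:
        return None                      # dead zone: no key selected
    if abs(dy) >= abs(dx):
        return "up" if dy > 0 else "down"
    return "right" if dx > 0 else "left"
```

Because each direction maps to a unique key, an imprecise drive direction still resolves unambiguously once it clears the dead zone, which is what reduces misoperation compared with a finer-grained layout.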
In some embodiments, a candidate entry displayed in the input box may be selected in response to a first preset key provided on the controller being triggered; the selected candidate entry may be confirmed in response to a second preset key on the controller being triggered; and the candidate entries displayed in the input box may be switched (for example, by page turning) in response to a third preset key on the controller being triggered. Illustratively, the first preset key comprises a joystick (thumbstick).
Accordingly, referring to fig. 6, there is provided an information interacting device 600 according to an embodiment of the present disclosure, including:
a space display unit 601 for displaying a virtual reality space;
a keyboard display unit 602, configured to display, in response to a first preset operation by a user, a virtual keyboard object in the virtual reality space, where the virtual keyboard object includes a plurality of first virtual keys arranged around a preset blank area;
a key determining unit 603, configured to determine the selected first virtual key in response to a second preset operation by the user.
In some embodiments, the characters corresponding to each first virtual key are the same as the characters corresponding to each key in the nine-grid (Jiugongge, T9-style) input method; the number of first virtual keys is 8.
In some embodiments, the virtual keyboard objects include a first virtual keyboard object and a second virtual keyboard object arranged in parallel; the first virtual keyboard object comprises a plurality of first virtual keys arranged around a first preset blank area, and the second virtual keyboard object comprises a plurality of first virtual keys arranged around a second preset blank area; the first virtual keyboard object is for control by a first handheld controller and the second virtual keyboard object is for control by a second handheld controller.
In some embodiments, the second preset operation includes an operation for controlling a virtual ray to select a first virtual key; the method further comprises the following steps: after the virtual keyboard object is displayed, setting the initial direction of the virtual ray to face the preset blank area.
In some embodiments, the second preset operation includes an operation of changing the posture of the controller; the step of determining the selected first virtual key in response to a second preset operation of the user includes: in response to the operation of changing the pose of the controller, determining the motion direction of the controller; determining the selected first virtual key based on the movement direction.
In some embodiments, the determining the selected first virtual key in response to the second preset operation of the user includes: and responding to the second preset operation, rotating the virtual keyboard object or a key indication object used for indicating the currently selected key, and determining the selected first virtual key based on the first virtual key indicated by the key indication object after the rotation is stopped.
In some embodiments, the first virtual keys are arranged in a ring shape.
For the embodiments of the apparatus, since they correspond substantially to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described apparatus embodiments are merely illustrative, in that modules illustrated as separate modules may or may not be separate. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Accordingly, in accordance with one or more embodiments of the present disclosure, there is provided an electronic device including:
at least one memory and at least one processor;
wherein the memory is used for storing program code, and the processor is used for calling the program code stored in the memory to cause the electronic device to execute the augmented-reality-based interaction method provided according to one or more embodiments of the present disclosure.
Accordingly, according to one or more embodiments of the present disclosure, there is provided a non-transitory computer storage medium storing program code executable by a computer device to cause the computer device to perform an augmented reality based interaction method provided according to one or more embodiments of the present disclosure.
Referring now to fig. 7, a schematic diagram of an electronic device (e.g., a terminal device or server) 800 suitable for use in implementing embodiments of the present disclosure is shown. The terminal device in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle terminal (e.g., a car navigation terminal), and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. The electronic device shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 7, electronic device 800 may include a processing means (e.g., central processing unit, graphics processor, etc.) 801 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 802 or a program loaded from a storage means 808 into a Random Access Memory (RAM) 803. In the RAM 803, various programs and data necessary for the operation of the electronic device 800 are also stored. The processing device 801, the ROM 802, and the RAM 803 are connected to each other by a bus 804. An input/output (I/O) interface 805 is also connected to bus 804.
Generally, the following devices may be connected to the I/O interface 805: input devices 806 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 807 including, for example, a Liquid Crystal Display (LCD), speakers, vibrators, or the like; storage 808 including, for example, magnetic tape, hard disk, etc.; and a communication device 809. The communication means 809 may allow the electronic device 800 to communicate wirelessly or by wire with other devices to exchange data. While fig. 7 illustrates an electronic device 800 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, the processes described above with reference to the flow diagrams may be implemented as computer software programs, according to embodiments of the present disclosure. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication means 809, or installed from the storage means 808, or installed from the ROM 802. The computer program, when executed by the processing apparatus 801, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that the computer readable medium of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to perform the methods of the present disclosure as described above.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. In some cases, the name of a unit does not constitute a limitation on the unit itself.
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), application Specific Integrated Circuits (ASICs), application Specific Standard Products (ASSPs), systems on a chip (SOCs), complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure, there is provided an augmented reality-based interaction method, including: displaying a virtual reality space; responding to a first preset operation of a user, displaying a virtual keyboard object in the virtual reality space, wherein the virtual keyboard object comprises a plurality of first virtual keys arranged around a preset blank area; and responding to a second preset operation of the user, and determining the selected first virtual key.
According to one or more embodiments of the present disclosure, the characters corresponding to each first virtual key are the same as the characters corresponding to each key in a nine-grid (T9-style) input method; the number of the first virtual keys is 8.
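As an illustrative sketch of this embodiment (the disclosure does not fix a concrete character assignment, so the direction names and character groups below are assumptions mirroring the letter keys of a nine-grid, T9-style layout), the eight first virtual keys can be indexed by their direction around the central blank area:

```python
# Hypothetical nine-grid layout: 8 first virtual keys arranged around a
# preset central blank area. Direction names and character groups are
# illustrative assumptions, mirroring a T9-style input method.
NINE_GRID_KEYS = {
    "up-left":    "abc",
    "up":         "def",
    "up-right":   "ghi",
    "left":       "jkl",
    "right":      "mno",
    "down-left":  "pqrs",
    "down":       "tuv",
    "down-right": "wxyz",
}

# Matches the embodiment: 8 first virtual keys, covering all 26 letters.
assert len(NINE_GRID_KEYS) == 8
```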
According to one or more embodiments of the present disclosure, the virtual keyboard object includes a first virtual keyboard object and a second virtual keyboard object arranged in parallel; the first virtual keyboard object comprises a plurality of first virtual keys which are arranged around a first preset blank area, and the second virtual keyboard object comprises a plurality of first virtual keys which are arranged around a second preset blank area; the first virtual keyboard object is for control by a first handheld controller and the second virtual keyboard object is for control by a second handheld controller.
According to one or more embodiments of the present disclosure, the second preset operation includes an operation of controlling a virtual ray to select a first virtual key; the method further includes: after the virtual keyboard object is displayed, setting the initial direction of the virtual ray to point toward the preset blank area.
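A minimal sketch of re-aiming the virtual ray at the preset blank area once the keyboard is displayed, so the user starts at an equal distance from all keys. This is plain vector math only; the names `ray_origin` and `blank_center` are assumptions, not the API of any particular engine:

```python
import math

def aim_ray_at(ray_origin, blank_center):
    """Return a unit direction vector pointing the virtual ray from its
    origin (e.g., the handheld controller) toward the center of the
    preset blank area of the virtual keyboard object."""
    dx = blank_center[0] - ray_origin[0]
    dy = blank_center[1] - ray_origin[1]
    dz = blank_center[2] - ray_origin[2]
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    if length == 0.0:
        raise ValueError("ray origin coincides with the blank-area center")
    return (dx / length, dy / length, dz / length)
```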
According to one or more embodiments of the present disclosure, the second preset operation includes an operation of changing the pose of the controller; determining the selected first virtual key in response to the second preset operation of the user includes: in response to the operation of changing the pose of the controller, determining the motion direction of the controller; and determining the selected first virtual key based on the motion direction.
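One plausible way to map the controller's motion direction to one of eight keys arranged around the blank area is to quantize the 2-D motion angle into 45-degree sectors. This is a sketch only; the sector ordering and dead-zone threshold are assumptions not specified by the disclosure:

```python
import math

# Sector order is an illustrative assumption: index 0 starts at the
# positive x axis and indices increase counterclockwise.
KEYS = ["right", "up-right", "up", "up-left",
        "left", "down-left", "down", "down-right"]

def select_key(dx, dy, dead_zone=0.05):
    """Quantize a 2-D controller motion (dx, dy) into one of the 8 first
    virtual keys arranged around the preset blank area. Motions shorter
    than `dead_zone` remain in the blank area and select nothing."""
    if math.hypot(dx, dy) < dead_zone:
        return None  # still inside the blank area
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    sector = int((angle + 22.5) // 45) % 8  # 45-degree sectors centered on the 8 directions
    return KEYS[sector]
```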
According to one or more embodiments of the present disclosure, determining the selected first virtual key in response to a second preset operation of the user includes: in response to the second preset operation, rotating the virtual keyboard object, or rotating a key indicating object used for indicating the currently selected key; and determining the selected first virtual key based on the first virtual key indicated by the key indicating object after the rotation stops.
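For this rotation-based variant, the key indicated after the rotation stops can be found by snapping the final rotation angle to the nearest key position. A minimal sketch, assuming a ring of 8 evenly spaced keys and angles measured in degrees (the function and parameter names are illustrative):

```python
def indicated_key_index(rotation_deg, num_keys=8):
    """Snap the key indicating object's final rotation angle (degrees)
    to the index of the nearest first virtual key on a ring of
    `num_keys` evenly spaced keys (index 0 at 0 degrees, increasing
    with angle)."""
    step = 360.0 / num_keys  # angular spacing between adjacent keys
    return int(round((rotation_deg % 360.0) / step)) % num_keys
```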
According to one or more embodiments of the present disclosure, the first virtual keys are arranged in a ring shape.
According to one or more embodiments of the present disclosure, there is provided an augmented reality-based information interaction apparatus, including: a space display unit for displaying a virtual reality space; a keyboard display unit, configured to display a virtual keyboard object in the virtual reality space in response to a first preset operation by a user, where the virtual keyboard object includes a plurality of first virtual keys arranged around a preset blank area; and the key determining unit is used for responding to a second preset operation of the user and determining the selected first virtual key.
According to one or more embodiments of the present disclosure, there is provided an electronic device including: at least one memory and at least one processor; wherein the memory is configured to store program code and the processor is configured to invoke the program code stored by the memory to cause the electronic device to perform an augmented reality based interaction method provided in accordance with one or more embodiments of the present disclosure.
According to one or more embodiments of the present disclosure, there is provided a non-transitory computer storage medium storing program code which, when executed by a computer device, causes the computer device to perform an augmented reality based interaction method provided according to one or more embodiments of the present disclosure.
The foregoing description is merely illustrative of the preferred embodiments of the present disclosure and of the technical principles employed. Those skilled in the art will appreciate that the scope of the disclosure is not limited to technical solutions formed by the particular combinations of the features described above, and also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure, for example, technical solutions formed by substituting the above features with technical features having similar functions disclosed in (but not limited to) the present disclosure.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (10)

1. An interaction method based on augmented reality, comprising:
displaying a virtual reality space;
responding to a first preset operation of a user, displaying a virtual keyboard object in the virtual reality space, wherein the virtual keyboard object comprises a plurality of first virtual keys arranged around a preset blank area;
and responding to a second preset operation of the user, and determining the selected first virtual key.
2. The method of claim 1,
the characters corresponding to the first virtual keys are the same as the characters corresponding to the keys in a nine-grid (T9-style) input method; the number of the first virtual keys is 8.
3. The method of claim 1,
the virtual keyboard objects comprise a first virtual keyboard object and a second virtual keyboard object which are arranged in parallel; the first virtual keyboard object comprises a plurality of first virtual keys arranged around a first preset blank area, and the second virtual keyboard object comprises a plurality of first virtual keys arranged around a second preset blank area;
the first virtual keyboard object is for control by a first handheld controller and the second virtual keyboard object is for control by a second handheld controller.
4. The method according to claim 1, wherein the second preset operation comprises an operation for controlling a virtual ray to select a first virtual key;
the method further comprises: after the virtual keyboard object is displayed, setting the initial direction of the virtual ray to face the preset blank area.
5. The method according to claim 1, wherein the second preset operation includes an operation of changing a pose of a controller;
the step of determining the selected first virtual key in response to a second preset operation of the user includes: in response to the operation of changing the pose of the controller, determining the motion direction of the controller; and determining the selected first virtual key based on the motion direction.
6. The method of claim 1, wherein the determining the selected first virtual key in response to a second preset operation by the user comprises: and responding to the second preset operation, rotating the virtual keyboard object or rotating a key indicating object used for indicating the currently selected key, and determining the selected first virtual key based on the first virtual key indicated by the key indicating object after the rotation is stopped.
7. The method of claim 1, wherein the first virtual keys are arranged in a ring.
8. An information interaction device based on augmented reality, comprising:
a space display unit for displaying a virtual reality space;
the keyboard display unit is used for responding to a first preset operation of a user and displaying a virtual keyboard object in the virtual reality space, wherein the virtual keyboard object comprises a plurality of first virtual keys which are arranged around a preset blank area;
and the key determining unit is used for responding to a second preset operation of the user and determining the selected first virtual key.
9. An electronic device, comprising:
at least one memory and at least one processor;
wherein the memory is configured to store program code and the processor is configured to invoke the program code stored by the memory to cause the electronic device to perform the method of any of claims 1-7.
10. A non-transitory computer storage medium, characterized in that,
the non-transitory computer storage medium stores program code that, when executed by a computer device, causes the computer device to perform the method of any of claims 1-7.
CN202310165805.0A 2023-02-21 2023-02-21 Interaction method and device based on augmented reality, electronic equipment and storage medium Pending CN115981544A (en)

Publications (1)

Publication Number Publication Date
CN115981544A true CN115981544A (en) 2023-04-18


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination