WO2024060949A1 - Method, Apparatus, Device and Storage Medium for Augmented Reality - Google Patents

Method, Apparatus, Device and Storage Medium for Augmented Reality

Info

Publication number
WO2024060949A1
WO2024060949A1 PCT/CN2023/115804 CN2023115804W
Authority
WO
WIPO (PCT)
Prior art keywords
virtual objects
augmented reality
scene
distance
virtual
Prior art date
Application number
PCT/CN2023/115804
Other languages
English (en)
French (fr)
Inventor
高星
刘慧琳
Original Assignee
北京字跳网络技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京字跳网络技术有限公司
Publication of WO2024060949A1 publication Critical patent/WO2024060949A1/zh

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality

Definitions

  • Example embodiments of the present disclosure relate generally to the field of computers, and in particular to methods, apparatus, devices and computer-readable storage media for augmented reality.
  • Augmented Reality (AR) technology is a technology that integrates virtual information with the real world.
  • AR devices can superimpose virtual objects and images in the real world to present them in the AR scene.
  • the images appearing in the user's field of vision include both real-world images and virtual objects, allowing the user to see virtual objects and the real world at the same time.
  • Conventional AR devices are usually suitable for users to experience within a small range of activities, but the experience effect in a large range of activities (such as streets or scenic spots) is not ideal.
  • In a first aspect of the present disclosure, a method for augmented reality is provided, comprising: determining a current distance between an electronic device for presenting an augmented reality scene and each of a plurality of virtual objects to be presented in the augmented reality scene; selecting at least a portion of the plurality of virtual objects based on the current distance; and rendering the at least a portion of the plurality of virtual objects in the augmented reality scene.
  • In this way, at least a part of the multiple virtual objects can be dynamically loaded based on the current distance between each virtual object to be presented and the electronic device, improving the AR experience effect within a large range of activities.
  • In a second aspect of the present disclosure, an apparatus for augmented reality is provided, including: a distance determination module configured to determine a current distance between an electronic device used to present an augmented reality scene and each of a plurality of virtual objects to be presented in the augmented reality scene; an object selection module configured to select at least a portion of the plurality of virtual objects based on the current distance; and an object rendering module configured to render the at least a portion of the plurality of virtual objects in the augmented reality scene.
  • a distance determination module configured to determine a current distance between an electronic device used to present an augmented reality scene and each of a plurality of virtual objects to be presented in the augmented reality scene
  • an object selection module configured to select at least a portion of the plurality of virtual objects based on the current distance
  • an object rendering module configured to render the at least part of the plurality of virtual objects in the augmented reality scene.
  • In a third aspect of the present disclosure, an electronic device is provided, including at least one processing unit; and at least one memory coupled to the at least one processing unit and storing instructions for execution by the at least one processing unit.
  • the instructions when executed by at least one processing unit, cause the device to perform the method of the first aspect.
  • In a fourth aspect of the present disclosure, a computer-readable storage medium is provided.
  • a computer program is stored on the computer-readable storage medium, and the computer program can be executed by a processor to implement the method of the first aspect.
  • FIG. 1 illustrates a schematic diagram of an example environment in which embodiments of the present disclosure can be implemented
  • FIG. 2 illustrates a flowchart of a method for augmented reality according to an embodiment of the present disclosure
  • Figure 3 shows a schematic diagram of an AR scene according to some embodiments of the present disclosure
  • Figure 4 shows a schematic diagram of an AR scene according to some embodiments of the present disclosure
  • Figure 5 shows a schematic diagram of an AR scene according to some embodiments of the present disclosure
  • Figure 6 shows a schematic diagram of an AR scene according to some embodiments of the present disclosure
  • FIG. 7 illustrates a block diagram of an apparatus for augmented reality in accordance with some embodiments of the present disclosure.
  • FIG. 8 illustrates a block diagram of a device capable of implementing various embodiments of the present disclosure.
  • conventional AR devices are usually suitable for users to experience within a small range of activities, but the experience effect in a large range of activities (such as streets or scenic spots) is not ideal.
  • the AR device needs to load a large number of virtual objects (also known as AR special effects or AR elements) in the AR scene at various locations within the activity range, which not only consumes a lot of processing performance of the AR device, but also causes unclear visual effects when near and far virtual objects are visible at the same time.
  • Embodiments of the present disclosure propose a solution for selectively downloading and loading AR elements related to positions in an AR scene according to the position of an electronic device within a large activity range.
  • In this solution, for multiple virtual objects to be presented in the AR scene, at least a part of the multiple virtual objects can be dynamically loaded based on the current distance between each virtual object and the electronic device used to present the AR scene, thereby improving the AR experience effect within a large range of activities.
  • the solution proposed by the embodiments of the present disclosure can be applied to various AR experience scenarios, for example, it can be used to experience AR special effects in applications on electronic devices, or to experience AR content in AR glasses.
  • FIG. 1 shows a schematic diagram of an example environment 100 in which an embodiment of the present disclosure can be implemented.
  • an AR scene 150 is presented to a user 130 at or by a terminal device 110.
  • the AR scene 150 may be presented on a screen of the terminal device 110.
  • the AR scene 150 may include a real-world picture 154 and virtual objects 1531 and 1532 superimposed on the picture 154.
  • objects 1541 and 1542 are representations of real objects in the real world (buildings in this example) in the AR scene 150, such as images or other forms of representations of real objects.
  • objects 1541 and 1542 are also referred to herein as 3D objects.
  • the picture 154 may change as the position and/or perspective of the terminal device 110 changes, and accordingly, the 3D objects presented in the picture 154 may also change.
  • the virtual objects 1531 and 1532 are specifically AR objects, and these virtual objects are superimposed on the real-world picture 154.
  • virtual object 1531 can be used to represent plants that can be placed on a building
  • virtual object 1532 can be used to represent clouds floating in the sky.
  • AR scene 150 is exemplary only and is not intended to limit the scope of the present disclosure.
  • AR scene 150 may include more or fewer virtual objects overlaid on the picture 154, or may include other elements, such as user interface (UI) elements.
  • the terminal device 110 may be any type of mobile terminal, fixed terminal or portable terminal, including mobile phones, desktop computers, laptop computers, notebook computers, netbook computers, tablet computers, media computers, multimedia tablets, gaming devices, wearable devices, personal communications system (PCS) devices, personal navigation devices, personal digital assistants (PDAs), audio/video players, digital cameras/camcorders, pointing devices, television receivers, radio receivers, e-book devices, or any combination of the foregoing, including the accessories and peripherals of these devices or any combination thereof.
  • the terminal device 110 is also capable of supporting any type of user-directed interface (such as "wearable" circuitry, etc.).
  • the engine 120 is installed in the terminal device 110 .
  • the engine 120 is used to drive the presentation of the AR scene 150.
  • engine 120 may be an AR game engine, and accordingly, AR scene 150 may be an AR game scene.
  • the engine 120 may be part of a content sharing application capable of providing users 130 with services related to multimedia content consumption, including browsing, commenting, forwarding, authoring (e.g., filming and/or editing), publishing, etc.
  • the AR scene 150 may be an AR content creation scene.
  • the terminal device 110 may include any suitable structure and function to implement the presentation of virtual objects in an AR scene when the user performs an AR experience within a large range of activities.
  • Figure 2 illustrates a flowchart of a method 200 for augmented reality according to an embodiment of the present disclosure.
  • the method 200 may be implemented at the terminal device 110 .
  • method 200 will be described with reference to environment 100 of FIG. 1 .
  • the terminal device 110 determines a current distance between the electronic device used to present the AR scene and each of a plurality of virtual objects to be presented in the AR scene. As described above, the terminal device 110 is used to present the AR scene 150. Therefore, the terminal device 110 is used here as an electronic device for presenting an AR scene. When the user uses the terminal device 110 to perform an AR experience within a predetermined geographical area, the terminal device 110 can load corresponding virtual objects among multiple virtual objects to be presented in the AR scene 150 at different locations.
  • the predetermined geographical area may be a large activity range or a small activity range, and the embodiments of the present disclosure do not limit this.
  • Figure 3 shows a schematic diagram of an AR scene 150 according to some embodiments of the present disclosure.
  • a real-world picture 154 is presented in the AR scene 150.
  • the picture 154 contains 3D objects 1541 and 1542, which are buildings in this example.
  • the picture 154 may alternatively or additionally contain other types of 3D objects.
  • some of the plurality of virtual objects to be loaded in the AR scene 150, such as virtual objects 1531, 1532, and 1533, are shown in the form of dotted lines in the presented AR scene 150.
  • the virtual objects 1531, 1532, and 1533 have not yet been presented in the AR scene 150, but can be presented in the AR scene 150 when the terminal device 110 is in a specific position.
  • the virtual object 1531 may be, for example, a plant placed on the 3D object 1541
  • the virtual object 1532 may be, for example, clouds floating in the sky
  • the virtual object 1533 may be, for example, a vehicle placed on the 3D object 1542 .
  • the terminal device 110 can load corresponding virtual objects in the AR scene 150 at different locations.
  • the terminal device 110 may download from a server (such as a cloud platform or other types of servers) when performing an AR experience, or pre-store in the terminal device 110, a configuration file of multiple virtual objects to be presented in the AR scene 150.
  • the configuration file may store at least one of the following information of each virtual object: visible distance, visible time period, coordinates, size, direction, resource download path.
  • a configuration file in Json format may be used to store the above information.
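  • To make the structure concrete, such a Json configuration file might look like the following sketch; all field names and values here are illustrative assumptions, not taken from the patent itself:

```python
import json

# Hypothetical Json configuration for one virtual object, covering the
# fields listed above (visible distance, visible time period, coordinates,
# size, direction, resource download path). Names and values are illustrative.
CONFIG_JSON = """
[
  {
    "name": "plant_1531",
    "visible_distance": 50.0,
    "visible_period": ["08:00", "18:00"],
    "coordinates": [12.5, 3.0, -40.2],
    "size": [1.0, 1.0, 1.0],
    "direction": {"face_camera": true},
    "resource_path": "https://example.com/assets/plant_1531.zip"
  }
]
"""

# Parse the configuration file into a list of per-object entries.
configs = json.loads(CONFIG_JSON)
print(configs[0]["name"])  # plant_1531
```

  • A script on the device can then traverse these entries when deciding which objects to load.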
  • the visible distance means that the corresponding virtual object can be presented in the AR scene 150 if its distance from the terminal device 110 is less than the visible distance, but cannot be presented in the AR scene 150 if its distance is greater than the visible distance.
  • the distance between each of the virtual objects 1531, 1532 and 1533 and the terminal device 110 is greater than the corresponding visible distance, and therefore each is in an invisible state in the AR scene 150, that is, is not presented in the AR scene 150.
  • the visible time period indicates that the corresponding virtual object can be presented in the AR scene 150 within the visible time period, but cannot be presented in the AR scene 150 outside the visible time period.
  • for example, the visible time period of the virtual object 1531 can be set to 8 a.m. to 6 p.m.; then, when the terminal device 110 is used to perform an AR experience in the predetermined geographical area within the visible time period, the virtual object 1531 can be presented in the AR scene 150, but cannot be presented in the AR scene 150 outside the visible time period.
  • the coordinates represent the loading position of the corresponding virtual object in the AR scene 150 .
  • for example, for the virtual object 1531, the loading position represented by its coordinates is on the 3D object 1541 in the AR scene 150.
  • for the virtual object 1532, the loading position represented by its coordinates is in the sky in the AR scene 150.
  • for the virtual object 1533, the loading position represented by its coordinates is on the 3D object 1542 in the AR scene 150.
  • the size represents the loading size of the corresponding virtual object in the AR scene 150 .
  • for example, the virtual object 1531 is loaded on the 3D object 1541 with the loading size represented by its size.
  • the virtual object 1532 is loaded in the sky in the AR scene 150 with the loading size represented by its size.
  • the virtual object 1533 is loaded on the 3D object 1542 with the loading size represented by its size.
  • the direction represents the loading orientation of the corresponding virtual object in the AR scene 150 and is used to set whether the virtual object always faces the camera of the terminal device 110 .
  • for example, the virtual object 1531 is loaded on the 3D object 1541 with the loading orientation represented by its direction.
  • the virtual object 1532 is loaded in the sky in the AR scene 150 with the loading orientation represented by its direction.
  • the virtual object 1533 is loaded on the 3D object 1542 with the loading orientation represented by its direction.
  • the resource download path indicates the path when downloading the corresponding virtual object from the server.
  • the resource file of the virtual object such as 2D material or 3D material, can be downloaded from the server according to the resource download path.
  • a special effects editor can edit a dynamic resource package in an electronic device, where the dynamic resource package includes the configuration file as described above and the resource files of each virtual object.
  • the configuration file can be a configuration file in Json format, which can store at least one of the following information about each virtual object: object name, object type, coordinates, size, rotation angle, visible distance, visible time period, rendering blending mode, direction, resource download path.
  • the resource file can be 2D material or 3D material, such as pictures, sequence frames, 3D models and textures, etc.
  • a corresponding instance can be created at the corresponding coordinates of each virtual object.
  • the instance has a script with a customized user interface.
  • the editor can specify, in this user interface, the specific information of the instance, such as the information contained in the configuration file described above.
  • the edited dynamic resource package can be uploaded to a server, such as a cloud platform.
  • the server may deliver the configuration file and resource file included in the dynamic resource package according to the request of the terminal device 110 .
  • the terminal device 110 may capture images within the predetermined geographical area and determine the location of the terminal device 110 within the predetermined geographical area based on the captured images. For example, the terminal device 110 may download or pre-store a map and a feature point cloud model corresponding to the predetermined geographical area, where the feature point cloud model is used to model objects within the predetermined geographical area. When the terminal device 110 captures a picture within the predetermined geographical area, the captured picture may be compared with the objects represented by the feature point cloud model, thereby determining the location of the terminal device 110 within the predetermined geographical area. In some embodiments, the terminal device 110 may further determine its location within the predetermined geographical area based on Global Positioning System (GPS) information. In this way, the location of the terminal device 110 can be accurately determined.
  • the terminal device 110 may traverse the configuration file of each of the plurality of virtual objects to be loaded in the AR scene 150, and determine the current distance between each virtual object and the terminal device 110 based on the location of the terminal device 110 within the predetermined geographical area and the coordinates of each virtual object.
  • the terminal device 110 may determine the current distance between the terminal device 110 and each of the virtual objects 1531, 1532, and 1533, as well as the other virtual objects.
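  • A minimal sketch of this distance computation, assuming the device position and each object's configured coordinates are expressed in the same scene coordinate system:

```python
import math

def current_distance(device_pos, object_coords):
    """Euclidean distance between the device position and an object's
    configured coordinates, both given as (x, y, z) tuples."""
    return math.dist(device_pos, object_coords)

# Example: device at the origin, object at (3, 4, 0).
print(current_distance((0.0, 0.0, 0.0), (3.0, 4.0, 0.0)))  # 5.0
```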
  • the terminal device 110 selects at least a portion of the plurality of virtual objects based on a current distance between each virtual object and the terminal device 110 .
  • the terminal device 110 may adopt any suitable strategy to select at least a portion of the plurality of virtual objects for rendering in the AR scene 150.
  • the terminal device 110 compares the current distance between each virtual object and the terminal device 110 with the corresponding visible distance. If the current distance between a first part of the plurality of virtual objects and the terminal device 110 is less than the corresponding visible distance, the terminal device 110 selects the first part as the at least part of the plurality of virtual objects for rendering in the AR scene 150. For example, for the virtual objects 1531, 1532 and 1533 shown in Figure 3, if, as the terminal device 110 moves within the predetermined geographical area, the current distance between one or more of the virtual objects 1531, 1532 and 1533 and the terminal device 110 becomes less than the corresponding visible distance, the one or more virtual objects are selected for rendering in the AR scene 150.
  • the terminal device 110 further selects at least a portion for rendering in the AR scene 150 based on a visible time period of each of the plurality of virtual objects to be presented in the AR scene 150 . For example, if the current distance between the first part of the plurality of virtual objects and the terminal device 110 is less than the corresponding visible distance, the terminal device 110 may further determine whether each virtual object in the first part is within the visible time period. The terminal device 110 may select virtual objects in the first part within the visible time period for rendering in the AR scene 150 without selecting virtual objects in the first part outside the visible time period.
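  • The combined selection rule (visible distance plus visible time period) can be sketched as follows; the per-object dictionary layout is an assumption for illustration:

```python
import math
from datetime import time

def select_visible(objects, device_pos, now):
    """Select objects whose current distance is below their visible
    distance AND whose visible time period contains the current time."""
    selected = []
    for obj in objects:
        dist = math.dist(device_pos, obj["coordinates"])
        start, end = obj["visible_period"]
        if dist < obj["visible_distance"] and start <= now <= end:
            selected.append(obj)
    return selected

objects = [
    {"name": "plant", "coordinates": (0, 0, 10),
     "visible_distance": 20.0, "visible_period": (time(8), time(18))},
    {"name": "cloud", "coordinates": (0, 0, 100),
     "visible_distance": 20.0, "visible_period": (time(8), time(18))},
]
# At noon, only the nearby object is selected; the far one stays hidden.
print([o["name"] for o in select_visible(objects, (0, 0, 0), time(12))])
```

  • At 8 p.m., outside the visible time period, neither object would be selected even if both were within range.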
  • the terminal device 110 renders at least a portion of the selected virtual objects among the plurality of virtual objects in the AR scene 150 .
  • FIG. 4 shows a schematic diagram of an AR scene 150 according to some embodiments of the present disclosure.
  • the AR scene 150 shown in FIG. 4 is a scene presented on the terminal device 110 after the terminal device 110 changes its position from the position corresponding to the AR scene 150 in FIG. 3 within a predetermined geographical area.
  • when the terminal device 110 is at the current position corresponding to the AR scene in Figure 4, the current distance between each of the virtual objects 1531 and 1532 and the terminal device 110 is less than the corresponding visible distance while the current distance between the virtual object 1533 and the terminal device 110 is greater than the corresponding visible distance; alternatively, the current distances between the virtual objects 1531, 1532, and 1533 and the terminal device 110 are all less than the corresponding visible distances, but only the virtual objects 1531 and 1532 are within the corresponding visible time period.
  • the terminal device 110 can select the virtual objects 1531 and 1532 in the manner described above in connection with block 220, and render the virtual objects 1531 and 1532 in the AR scene 150 using the corresponding resource files (shown in solid lines in FIG. 4), without rendering the virtual object 1533 (shown as a dotted line in FIG. 4).
  • FIG. 5 shows a schematic diagram of an AR scene 150 according to some embodiments of the present disclosure.
  • the AR scene 150 shown in FIG. 5 is a scene presented on the terminal device 110 after the terminal device 110 changes its position from the position corresponding to the AR scene 150 in FIG. 3 or 4 within a predetermined geographical area.
  • when the terminal device 110 is at the current position corresponding to the AR scene in Figure 5, the current distances between the virtual objects 1531, 1532 and 1533 and the terminal device 110 are all less than the corresponding visible distances, and the virtual objects 1531, 1532 and 1533 are all within the corresponding visible time periods.
  • the terminal device 110 can select the virtual objects 1531, 1532, and 1533 according to the manner described above in connection with the block 220, and render the virtual objects 1531, 1532, and 1533 in the AR scene 150 using the corresponding resource files (in FIG. 5 shown as solid lines).
  • the terminal device 110 can automatically refresh at certain time intervals and dynamically load virtual objects in the AR scene 150 according to the current location of the terminal device 110 and the system time. For example, the terminal device 110 can use a script to sequentially read the information in the Json configuration file; if the distance between the coordinates of a virtual object and the terminal device 110 is less than the corresponding visible distance and the system time is within the corresponding visible time period, an instance of the corresponding virtual object is created in the AR scene 150, and the rendering blending mode and orientation used for rendering are set according to the values specified in the Json configuration file.
  • conversely, if an instance of a virtual object exists in the AR scene 150 and, after refreshing, the distance between the virtual object and the terminal device 110 is greater than the visible distance, the instance of the virtual object is deleted from the AR scene 150. If an instance of a certain virtual object exists in the AR scene 150 and the distance between the virtual object and the terminal device 110 is still less than the visible distance after refreshing, the instance of the virtual object remains unchanged in the AR scene 150.
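  • This refresh behavior (create newly visible instances, delete out-of-range instances, keep still-visible ones unchanged) can be sketched as a single refresh tick; the visible-time-period check is omitted for brevity, and a plain dictionary stands in for engine instances:

```python
import math

def refresh_scene(instances, objects, device_pos):
    """One refresh tick over all configured objects.
    `instances` maps object name -> a stand-in for an engine instance."""
    for obj in objects:
        name = obj["name"]
        in_range = math.dist(device_pos, obj["coordinates"]) < obj["visible_distance"]
        if in_range and name not in instances:
            instances[name] = {"rendered": True}   # newly visible: create
        elif not in_range and name in instances:
            del instances[name]                    # moved out of range: delete
        # still in range with an existing instance: leave unchanged
    return instances

objects = [{"name": "vehicle", "coordinates": (0, 0, 5), "visible_distance": 10.0}]
scene = refresh_scene({}, objects, (0, 0, 0))
print(sorted(scene))   # ['vehicle']
scene = refresh_scene(scene, objects, (0, 0, 50))
print(sorted(scene))   # []
```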
  • in this way, the electronic device dynamically loads at least part of the multiple virtual objects based on the current distance between each virtual object to be presented and the electronic device.
  • on the one hand, this can avoid unclear visual effects caused by near and far virtual objects being visible at the same time; on the other hand, it can avoid loading a large number of virtual objects that would affect the performance of the terminal device 110, thereby improving the AR experience effect within a large range of activities.
  • when rendering at least a portion of the selected virtual objects among the plurality of virtual objects in the AR scene 150, if the resource files of one or more virtual objects in the selected at least a portion have not been downloaded, the terminal device 110 sends, to the server, a request to download the resource files of the one or more virtual objects. Subsequently, the terminal device 110 receives the resource files of the one or more virtual objects from the server for rendering in the AR scene 150. Conversely, for virtual objects in the selected at least a portion whose resource files have already been downloaded, there is no need to download them again.
  • the terminal device 110 can save the newly downloaded resource file as an element, for example, in a table. In this way, the terminal device 110 can determine whether the resource files of each virtual object have been downloaded based on the elements stored in the table. If they have been downloaded, they will not be downloaded again. In this way, when the terminal device 110 starts to present the AR scene 150, it does not need to download the resource files of all virtual objects to be presented in the AR scene 150, but only needs to download the configuration files of the virtual objects. As the terminal device 110 changes position within a predetermined geographic area, the terminal device 110 can dynamically load the resource files of the virtual objects, saving the storage space of the terminal device 110 and improving the performance of the terminal device 110.
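  • The download-once behavior can be sketched with a small cache table; here `fetch` stands in for the actual server download and is an assumption for illustration:

```python
class ResourceCache:
    """Keep downloaded resource files in a table so that each resource
    download path is fetched from the server at most once."""

    def __init__(self, fetch):
        self._fetch = fetch   # callable that downloads a resource by path
        self._table = {}      # resource download path -> cached resource

    def get(self, path):
        if path not in self._table:       # not downloaded yet: fetch and store
            self._table[path] = self._fetch(path)
        return self._table[path]          # already downloaded: reuse

calls = []
def fake_fetch(path):
    calls.append(path)
    return b"asset-bytes"

cache = ResourceCache(fake_fetch)
cache.get("plant_1531.zip")
cache.get("plant_1531.zip")               # second call served from the table
print(len(calls))  # 1
```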
  • the current distance between some virtual objects and the terminal device 110 may change from less than the visible distance to greater than the visible distance.
  • the virtual object whose current distance changes from less than the visible distance to greater than the visible distance may be removed from the AR scene 150 .
  • the terminal device 110 may select, based on the current distance of the virtual objects rendered in the AR scene 150, at least a portion of the rendered virtual objects that should no longer be rendered, and remove the at least a portion that should no longer be rendered from the AR scene 150.
  • the terminal device 110 compares the current distance between each virtual object rendered in the AR scene 150 and the terminal device 110 with the corresponding visible distance, and in response to the current distance of a second part of the virtual objects rendered in the AR scene 150 being greater than the corresponding visible distance, selects the second part for removal from the AR scene 150.
  • Figure 6 shows a schematic diagram of an AR scene 150 according to some embodiments of the present disclosure.
  • the AR scene 150 shown in FIG. 6 is a scene presented on the terminal device 110 after the terminal device 110 changes its position from the position corresponding to the AR scene 150 in FIG. 5 within a predetermined geographical area.
  • when the terminal device 110 changes from the position corresponding to the AR scene in Figure 5 to the position corresponding to the AR scene in Figure 6, the current distances between the virtual objects 1531 and 1533 and the terminal device 110 are still less than the corresponding visible distances, while the current distance between the virtual object 1532 and the terminal device 110 changes from less than the visible distance to greater than the visible distance. Therefore, the terminal device 110 can remove the virtual object 1532 from the AR scene 150 (shown with a dotted line in FIG. 6).
  • Figure 7 shows a schematic structural block diagram of an apparatus 700 for augmented reality according to certain embodiments of the present disclosure.
  • the apparatus 700 may be implemented as or included in the terminal device 110 .
  • Each module/component in the device 700 may be implemented by hardware, software, firmware, or any combination thereof.
  • the apparatus 700 includes: a distance determination module 710 configured to determine a current distance between an electronic device for presenting an augmented reality scene and each of a plurality of virtual objects to be presented in the augmented reality scene; an object selection module 720 configured to select at least a portion of the plurality of virtual objects based on the current distance; and an object rendering module 730 configured to render the at least a portion of the plurality of virtual objects in the augmented reality scene.
  • a distance determination module 710 configured to determine a current distance between an electronic device for presenting an augmented reality scene and each of a plurality of virtual objects to be presented in the augmented reality scene
  • an object selection module 720 configured to select at least a portion of the plurality of virtual objects based on the current distance
  • an object rendering module 730 configured to render at least a portion of the plurality of virtual objects in the augmented reality scene.
  • the object rendering module 730 is further configured to: in response to determining that resource files of one or more virtual objects among the selected at least a portion of the plurality of virtual objects have not been downloaded, send, to the server, a request to download the resource files of the one or more virtual objects.
  • the object selection module 720 is further configured to: compare a current distance of each of the plurality of virtual objects with a corresponding visible distance; and in response to the current distance of a first portion of the plurality of virtual objects being less than the corresponding visible distance, select the first portion.
  • the apparatus 700 further includes: a second object selection module configured to select, based on the current distance, at least a portion of the virtual objects rendered in the augmented reality scene that should no longer be rendered; and an object removal module configured to remove the at least a portion that should no longer be rendered from the augmented reality scene.
  • the second object selection module is further configured to: compare the current distance of each virtual object rendered in the augmented reality scene with the corresponding visible distance; and in response to the current distance of a second part of the virtual objects rendered in the augmented reality scene being greater than the corresponding visible distance, select the second part.
  • object selection module 720 is further configured to select at least a portion for rendering based further on a visible time period of each of the plurality of virtual objects.
  • the object rendering module 730 is further configured to perform rendering based on a configuration file associated with the plurality of virtual objects, the configuration file describing at least one of the following information: visible time period, coordinates, size, direction, resource download path.
  • FIG. 8 shows a block diagram illustrating an electronic device 800 in which one or more embodiments of the present disclosure may be implemented. It should be understood that the electronic device 800 shown in FIG. 8 is merely exemplary and should not constitute any limitation on the functionality and scope of the embodiments described herein. The electronic device 800 shown in FIG. 8 can be used to implement the terminal device 110 of FIG. 1 .
  • electronic device 800 is in the form of a general purpose computing device.
  • Components of electronic device 800 may include, but are not limited to, one or more processors or processing units 810, memory 820, storage devices 830, one or more communication units 840, one or more input devices 850, and one or more output devices 860.
  • the processing unit 810 may be a real or virtual processor and can perform various processes according to a program stored in the memory 820 . In a multi-processor system, multiple processing units execute computer-executable instructions in parallel to improve the parallel processing capability of the electronic device 800 .
  • Electronic device 800 typically includes multiple computer storage media. Such media may be any available media accessible to electronic device 800, including but not limited to volatile and non-volatile media, and removable and non-removable media.
  • Memory 820 may be volatile memory (e.g., registers, cache, random access memory (RAM)), non-volatile memory (e.g., read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory), or some combination thereof.
  • Storage device 830 may be a removable or non-removable medium and may include a machine-readable medium, such as a flash drive, a magnetic disk, or any other medium that can be used to store information and/or data (e.g., training data for training) and that can be accessed within electronic device 800.
  • Electronic device 800 may further include additional removable/non-removable, volatile/non-volatile storage media.
  • a disk drive may be provided for reading from or writing to a removable, non-volatile magnetic disk (e.g., a "floppy disk"), and an optical disc drive may be provided for reading from or writing to a removable, non-volatile optical disc.
  • each drive may be connected to the bus (not shown) by one or more data media interfaces.
  • Memory 820 may include a computer program product 825 having one or more program modules configured to perform various methods or actions of various embodiments of the present disclosure.
  • the communication unit 840 enables communication with other computing devices over communication media. Additionally, the functionality of the components of electronic device 800 may be implemented as a single computing cluster or as multiple computing machines capable of communicating over communication connections. Accordingly, electronic device 800 may operate in a networked environment using logical connections to one or more other servers, networked personal computers (PCs), or another network node.
  • Input device 850 may be one or more input devices, such as a mouse, keyboard, trackball, etc.
  • Output device 860 may be one or more output devices, such as a display, speakers, printer, etc.
  • the electronic device 800 may also communicate, as needed through the communication unit 840, with one or more external devices (not shown) such as storage devices and display devices, with one or more devices that enable a user to interact with the electronic device 800, or with any device (e.g., a network card, a modem, etc.) that enables the electronic device 800 to communicate with one or more other computing devices. Such communication may be performed via an input/output (I/O) interface (not shown).
  • a computer-readable storage medium is provided with computer-executable instructions stored thereon, wherein the computer-executable instructions are executed by a processor to implement the method described above.
  • a computer program product is also provided.
  • the computer program product is tangibly stored on a non-transitory computer-readable medium and includes computer-executable instructions, which are executed by a processor to implement the method described above.
  • These computer-readable program instructions may be provided to a processing unit of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus, thereby producing a machine such that, when executed by the processing unit of the computer or other programmable data processing apparatus, the instructions produce an apparatus that implements the functions/actions specified in one or more blocks of the flowcharts and/or block diagrams.
  • These computer-readable program instructions may also be stored in a computer-readable storage medium. These instructions cause a computer, a programmable data processing apparatus, and/or other devices to operate in a specific manner, such that the computer-readable medium storing the instructions comprises an article of manufacture that includes instructions implementing aspects of the functions/actions specified in one or more blocks of the flowcharts and/or block diagrams.
  • Computer-readable program instructions may be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable data processing apparatus, or other device to produce a computer-implemented process, such that the instructions executed on the computer, other programmable data processing apparatus, or other device implement the functions/actions specified in one or more blocks of the flowcharts and/or block diagrams.
  • each block in the flowcharts or block diagrams may represent a module, a program segment, or a portion of instructions that contains one or more executable instructions for implementing the specified logical functions.
  • the functions noted in the blocks may occur out of the order noted in the figures. For example, two consecutive blocks may in fact be executed substantially in parallel, or they may sometimes be executed in the reverse order, depending on the functionality involved.
  • each block in the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or actions, or by a combination of dedicated hardware and computer instructions.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

According to embodiments of the present disclosure, a method, apparatus, device, and storage medium for augmented reality are provided. The method includes: determining a current distance between an electronic device for presenting an augmented reality scene and each of a plurality of virtual objects to be presented in the augmented reality scene; selecting at least a portion of the plurality of virtual objects based on the current distance; and rendering the at least a portion of the plurality of virtual objects in the augmented reality scene. In this way, at least a portion of the plurality of virtual objects can be dynamically loaded based on the current distance between the virtual objects to be presented and the electronic device, improving the AR experience over a large activity area.

Description

Method, apparatus, device, and storage medium for augmented reality
This application claims priority to Chinese invention patent application No. 202211151653.0, filed on September 21, 2022 and entitled "Method, apparatus, device, and storage medium for augmented reality". The entire content of that Chinese patent application is incorporated herein by reference.
Technical Field
Example embodiments of the present disclosure relate generally to the field of computers, and in particular to methods, apparatuses, devices, and computer-readable storage media for augmented reality.
Background
Augmented Reality (AR) technology is a technology that fuses virtual information with the real world. When applying AR technology, an AR apparatus can superimpose virtual objects onto real-world pictures and present them together in an AR scene. In this way, the image appearing in the user's field of view includes both the real-world picture and the virtual objects, so that the user can see the virtual objects and the real world at the same time. Conventional AR apparatuses are generally suited to experiences within a small activity area, while the experience within a large activity area (such as a street or a scenic spot) is less than ideal.
Summary
In a first aspect of the present disclosure, a method for augmented reality is provided, including: determining a current distance between an electronic device for presenting an augmented reality scene and each of a plurality of virtual objects to be presented in the augmented reality scene; selecting at least a portion of the plurality of virtual objects based on the current distance; and rendering the at least a portion of the plurality of virtual objects in the augmented reality scene. In this way, at least a portion of the plurality of virtual objects can be dynamically loaded based on the current distance between the virtual objects to be presented and the electronic device, improving the AR experience over a large activity area.
In a second aspect of the present disclosure, an apparatus for augmented reality is provided, including: a distance determination module configured to determine a current distance between an electronic device for presenting an augmented reality scene and each of a plurality of virtual objects to be presented in the augmented reality scene; an object selection module configured to select at least a portion of the plurality of virtual objects based on the current distance; and an object rendering module configured to render the at least a portion of the plurality of virtual objects in the augmented reality scene. In this way, at least a portion of the plurality of virtual objects can be dynamically loaded based on the current distance between the virtual objects to be presented and the electronic device, improving the AR experience over a large activity area.
In a third aspect of the present disclosure, an electronic device is provided. The device includes at least one processing unit; and at least one memory coupled to the at least one processing unit and storing instructions for execution by the at least one processing unit. The instructions, when executed by the at least one processing unit, cause the device to perform the method of the first aspect.
In a fourth aspect of the present disclosure, a computer-readable storage medium is provided. The computer-readable storage medium stores a computer program executable by a processor to implement the method of the first aspect.
It should be understood that the content described in this Summary is not intended to identify key or important features of embodiments of the present disclosure, nor is it intended to limit the scope of the present disclosure. Other features of the present disclosure will become readily understandable from the following description.
Brief Description of the Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent with reference to the following detailed description taken in conjunction with the accompanying drawings. In the drawings, the same or similar reference numerals denote the same or similar elements, in which:
FIG. 1 shows a schematic diagram of an example environment in which embodiments of the present disclosure can be implemented;
FIG. 2 shows a flowchart of a method for augmented reality according to embodiments of the present disclosure;
FIG. 3 shows a schematic diagram of an AR scene according to some embodiments of the present disclosure;
FIG. 4 shows a schematic diagram of an AR scene according to some embodiments of the present disclosure;
FIG. 5 shows a schematic diagram of an AR scene according to some embodiments of the present disclosure;
FIG. 6 shows a schematic diagram of an AR scene according to some embodiments of the present disclosure;
FIG. 7 shows a block diagram of an apparatus for augmented reality according to some embodiments of the present disclosure; and
FIG. 8 shows a block diagram of a device capable of implementing multiple embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although certain embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure can be implemented in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the present disclosure are for exemplary purposes only and are not intended to limit the scope of protection of the present disclosure.
In the description of embodiments of the present disclosure, the term "including" and similar terms should be understood as open-ended inclusion, i.e., "including but not limited to". The term "based on" should be understood as "at least partially based on". The term "one embodiment" or "the embodiment" should be understood as "at least one embodiment". The term "some embodiments" should be understood as "at least some embodiments". Other explicit and implicit definitions may also be included below.
As briefly mentioned above, conventional AR apparatuses are generally suited to experiences within a small activity area, while the experience within a large activity area (such as a street or a scenic spot) is less than ideal. For example, when a user has an AR experience within a large activity area, in order to ensure rich AR effects, the AR apparatus needs to load a large number of virtual objects (which may also be called AR effects or AR elements) into the AR scene at various positions within the activity area. This not only consumes a large amount of the AR apparatus's processing performance, but also blurs the visual hierarchy when nearby and distant virtual objects are visible at the same time. In addition, when a user has an AR experience within a large activity area, if all of the large amount of two-dimensional (2D) and three-dimensional (3D) virtual object material needed is preloaded in the AR apparatus's effects package, the effects package becomes too large, and material that is never used may be downloaded for users who never reach a specific location or never experience at a specific time, affecting the performance of the AR apparatus.
Embodiments of the present disclosure propose a solution that selectively downloads and loads AR elements related to positions in the AR scene according to the position of the electronic device within a large activity area. In this solution, for a plurality of virtual objects to be presented in the AR scene, at least a portion of the plurality of virtual objects can be dynamically loaded based on the current distance between each virtual object and the electronic device used to present the AR scene, thereby improving the AR experience over a large activity area. The solution proposed by embodiments of the present disclosure can be applied to various AR experience scenarios, for example, experiencing AR effects in an application on an electronic device, or experiencing AR content in AR glasses.
FIG. 1 shows a schematic diagram of an example environment 100 in which embodiments of the present disclosure can be implemented. In this example environment 100, an AR scene 150 is presented to a user 130 at or by a terminal device 110. The AR scene 150 may be presented on the screen of the terminal device 110. The AR scene 150 may include a real-world picture 154 and virtual objects 1531 and 1532 superimposed on the picture 154.
In the picture 154, objects 1541 and 1542 are representations in the AR scene 150 of real objects in the real world (buildings in this example), for example images of the real objects or other forms of representation. Merely for ease of discussion, objects 1541 and 1542 are also referred to herein as 3D objects. When the user has an AR experience within a large activity area, the picture 154 may change as the position and/or viewing angle of the terminal device 110 changes, and accordingly the 3D objects presented in the picture 154 also change.
In this example, virtual objects 1531 and 1532 are specifically AR objects superimposed on the real-world picture 154. For example, virtual object 1531 may represent a plant that can be placed on a building, and virtual object 1532 may represent a cloud floating in the sky.
It should be understood that the AR scene 150 is merely exemplary and is not intended to limit the scope of the present disclosure. The AR scene 150 may include more or fewer virtual objects superimposed on the picture 154, or may include other elements, such as user interface (UI) elements.
The terminal device 110 may be any type of mobile terminal, fixed terminal, or portable terminal, including a mobile phone, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, media computer, multimedia tablet, gaming device, wearable device, personal communication system (PCS) device, personal navigation device, personal digital assistant (PDA), audio/video player, digital camera/camcorder, positioning device, television receiver, radio broadcast receiver, e-book device, or any combination of the foregoing, including accessories and peripherals of these devices or any combination thereof. In some embodiments, the terminal device 110 can also support any type of user-oriented interface (such as "wearable" circuitry).
An engine 120 is installed in the terminal device 110. The engine 120 is used to drive the presentation of the AR scene 150. In some examples, the engine 120 may be an AR game engine, and accordingly the AR scene 150 may be an AR game scene. In some embodiments, the engine 120 may be part of a content sharing application that can provide the user 130 with services related to multimedia content consumption, including browsing, commenting on, forwarding, creating (e.g., shooting and/or editing), and publishing multimedia content. Accordingly, the AR scene 150 may be an AR content creation scene.
It should be understood that the structure and functionality of the environment 100 are described for exemplary purposes only, without implying any limitation on the scope of the present disclosure. The terminal device 110 may include any suitable structure and functionality to implement the presentation of virtual objects in the AR scene when the user has an AR experience within a large activity area.
FIG. 2 shows a flowchart of a method 200 for augmented reality according to embodiments of the present disclosure. The method 200 may be implemented at the terminal device 110. For ease of discussion, the method 200 will be described with reference to the environment 100 of FIG. 1.
At block 210, the terminal device 110 determines a current distance between the electronic device used to present the AR scene and each of a plurality of virtual objects to be presented in the AR scene. As described above, the terminal device 110 is used to present the AR scene 150. Therefore, the terminal device 110 here serves as the electronic device for presenting the AR scene. When the user has an AR experience within a predetermined geographic area using the terminal device 110, the terminal device 110 can load, at different positions, corresponding ones of the plurality of virtual objects to be presented in the AR scene 150. In embodiments according to the present disclosure, the predetermined geographic area may be a large activity area or a small activity area, which is not limited by embodiments of the present disclosure.
FIG. 3 shows a schematic diagram of the AR scene 150 according to some embodiments of the present disclosure. As shown in FIG. 3, when the terminal device 110 is at a certain position within the predetermined geographic area, a real-world picture 154 is presented in the AR scene 150, and the picture 154 contains 3D objects 1541 and 1542, which in this example are buildings. In other examples, the picture 154 may alternatively or additionally contain other types of 3D objects. In addition, for ease of understanding, a portion of the plurality of virtual objects to be loaded in the AR scene 150, for example virtual objects 1531, 1532, and 1533, is shown in dashed lines in the presented AR scene 150. In other words, virtual objects 1531, 1532, and 1533 have not yet been presented in the AR scene 150, but can be presented in the AR scene 150 when the terminal device 110 is at specific positions. Virtual object 1531 may be, for example, a plant placed on 3D object 1541, virtual object 1532 may be, for example, a cloud floating in the sky, and virtual object 1533 may be, for example, a vehicle placed on 3D object 1542.
As described above, when the user has an AR experience within the predetermined geographic area using the terminal device 110, the terminal device 110 can load corresponding virtual objects in the AR scene 150 at different positions. For the predetermined geographic area, the terminal device 110 may download from a server (e.g., a cloud platform or another type of server) during the AR experience, or pre-store in the terminal device 110, a configuration file for the plurality of virtual objects to be presented in the AR scene 150. In some embodiments, the configuration file may store at least one of the following items of information for each virtual object: visible distance, visible time period, coordinates, size, direction, resource download path. In some embodiments, a configuration file in Json format may be used to store the above information.
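As an illustrative sketch, an entry in such a Json configuration file might look like the following, parsed here with Python's standard `json` module. All field names and values are hypothetical assumptions for illustration; the disclosure only lists the kinds of information stored (visible distance, visible time period, coordinates, size, direction, resource download path), not an exact schema.

```python
import json

# Hypothetical Json configuration for a single virtual object. Field names
# are illustrative assumptions, not the disclosure's actual schema.
CONFIG_TEXT = """
{
  "objects": [
    {
      "name": "plant_1531",
      "coordinates": [12.5, 3.0, -40.2],
      "size": [1.0, 1.0, 1.0],
      "direction": {"face_camera": true},
      "visible_distance": 50.0,
      "visible_time_period": ["08:00", "18:00"],
      "resource_download_path": "https://example.com/assets/plant_1531.zip"
    }
  ]
}
"""

config = json.loads(CONFIG_TEXT)
obj = config["objects"][0]
print(obj["name"], obj["visible_distance"])  # plant_1531 50.0
```

Parsing the whole file once and then iterating over `config["objects"]` matches the per-object traversal described later in the text.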
The visible distance indicates that the corresponding virtual object can be presented in the AR scene 150 when its distance from the terminal device 110 is less than the visible distance, and cannot be presented in the AR scene 150 when the distance is greater than the visible distance. In the AR scene 150 shown in FIG. 3, the distance between the terminal device 110 and each of virtual objects 1531, 1532, and 1533 is greater than the corresponding visible distance, so they are in an invisible state in the AR scene 150, i.e., they are not presented in the AR scene 150.
The visible time period indicates that the corresponding virtual object can be presented in the AR scene 150 within that visible time period, and cannot be presented in the AR scene 150 outside that visible time period. For example, for virtual object 1531, its visible time period may be set to 8 a.m. to 6 p.m.; then, when the AR experience takes place with the terminal device 110 within the predetermined geographic area during that visible time period, virtual object 1531 can be presented in the AR scene 150, while outside that visible time period, virtual object 1531 cannot be presented in the AR scene 150.
The coordinates indicate the loading position of the corresponding virtual object in the AR scene 150. For example, for virtual object 1531 shown in FIG. 3, the loading position indicated by its coordinates is on 3D object 1541 in the AR scene 150. For virtual object 1532 shown in FIG. 3, the loading position indicated by its coordinates is in the sky in the AR scene 150. For virtual object 1533 shown in FIG. 3, the loading position indicated by its coordinates is on 3D object 1542 in the AR scene 150.
The size indicates the loading size of the corresponding virtual object in the AR scene 150. For example, virtual object 1531 shown in FIG. 3 can be loaded on 3D object 1541 at the loading size indicated by its size. Virtual object 1532 shown in FIG. 3 can be loaded in the sky in the AR scene 150 at the loading size indicated by its size. Virtual object 1533 shown in FIG. 3 can be loaded on 3D object 1542 at the loading size indicated by its size.
The direction indicates the loading orientation of the corresponding virtual object in the AR scene 150, and is used to set whether the virtual object always faces the camera of the terminal device 110. For example, virtual object 1531 shown in FIG. 3 can be loaded on 3D object 1541 in the loading orientation indicated by its direction. Virtual object 1532 shown in FIG. 3 can be loaded in the sky in the AR scene 150 in the loading orientation indicated by its direction. Virtual object 1533 shown in FIG. 3 can be loaded on 3D object 1542 in the loading orientation indicated by its direction.
The resource download path indicates the path used when downloading the corresponding virtual object from the server. When a virtual object needs to be downloaded, the virtual object's resource file, e.g., 2D material or 3D material, can be downloaded from the server according to the resource download path.
In some embodiments, for the predetermined geographic area, an effects editor can edit a dynamic resource package in an electronic device, the dynamic resource package containing the configuration file described above and the resource file of each virtual object. For example, the configuration file may be a configuration file in Json format, which may store at least one of the following items of information for each virtual object: object name, object type, coordinates, size, rotation angle, visible distance, visible time period, render blend mode, direction, resource download path. The resource file may be 2D material or 3D material, such as images, frame sequences, 3D models, and textures. When editing the dynamic resource package, the editor can place virtual objects at corresponding positions according to the map and model of the predetermined geographic area. For example, a corresponding instance can be created at the corresponding coordinates of each virtual object, the instance carrying a script with a custom user interface in which the editor can specify the specific information of the instance, such as the information contained in the configuration file described above. The edited dynamic resource package can then be uploaded to a server, such as a cloud platform. The server can deliver the configuration file and the resource files contained in the dynamic resource package upon request of the terminal device 110. By using a fixed, matching set of scripts and configuration files, large-scene AR props can be edited and managed in a unified manner.
In some embodiments, the terminal device 110 can capture images within the predetermined geographic area and determine its position within the predetermined geographic area based on the captured pictures. For example, the terminal device 110 can download or pre-store a map and a feature point cloud model corresponding to the predetermined geographic area, the feature point cloud model being used to model objects within the predetermined geographic area. When the terminal device 110 captures a picture within the predetermined geographic area, the captured picture can be compared against the objects represented by the feature point cloud model, thereby determining the position of the terminal device 110 within the predetermined geographic area. In some embodiments, the terminal device 110 can further determine its position within the predetermined geographic area based on Global Positioning System (GPS) information. In this way, the position of the terminal device 110 can be determined accurately.
Having determined the position of the terminal device 110, the terminal device 110 can traverse the configuration file of each of the plurality of virtual objects to be loaded in the AR scene 150, so as to determine the current distance between each virtual object and the terminal device 110 according to the position of the terminal device 110 within the predetermined geographic area and the coordinates of each virtual object. For example, the terminal device 110 can determine the current distance between the terminal device 110 and each of virtual objects 1531, 1532, and 1533 and the other virtual objects.
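The text does not commit to a particular distance metric; as a minimal sketch under that assumption, the current distance could be computed as the Euclidean distance between the device position and each object's configured coordinates:

```python
import math

def current_distance(device_pos, object_coords):
    # Euclidean distance between the device position and a virtual object's
    # configured coordinates (both 3-component sequences). The metric itself
    # is an assumption; the disclosure only requires a distance derived from
    # the device position and the object's coordinates.
    return math.sqrt(sum((d - o) ** 2 for d, o in zip(device_pos, object_coords)))

print(current_distance((0.0, 0.0, 0.0), (3.0, 4.0, 0.0)))  # 5.0
```

Traversing the configuration entries and calling this function per object yields the per-object current distances used in the selection step.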
At block 220, the terminal device 110 selects at least a portion of the plurality of virtual objects based on the current distance between each virtual object and the terminal device 110. The terminal device 110 may adopt any suitable strategy to select at least a portion of the plurality of virtual objects for rendering in the AR scene 150.
In some embodiments, the terminal device 110 compares the current distance between each virtual object and the terminal device 110 with the corresponding visible distance. If the current distance between a first portion of the plurality of virtual objects and the terminal device 110 is less than the corresponding visible distance, the terminal device 110 selects the first portion as the at least a portion of the plurality of virtual objects for rendering in the AR scene 150. For example, for virtual objects 1531, 1532, and 1533 shown in FIG. 3, if, as the terminal device 110 moves within the predetermined geographic area, the current distance between one or more of virtual objects 1531, 1532, and 1533 and the terminal device 110 becomes less than the corresponding visible distance, the one or more virtual objects are selected for rendering in the AR scene 150.
In some embodiments, the terminal device 110 further selects the at least a portion for rendering in the AR scene 150 based on the visible time period of each of the plurality of virtual objects to be presented in the AR scene 150. For example, if the current distance between a first portion of the plurality of virtual objects and the terminal device 110 is less than the corresponding visible distance, the terminal device 110 may further determine whether each virtual object in the first portion is within its visible time period. The terminal device 110 may select the virtual objects in the first portion that are within their visible time periods for rendering in the AR scene 150, while not selecting the virtual objects in the first portion that are outside their visible time periods.
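The two-stage selection just described (visible distance first, then visible time period) could be sketched as follows; all field and object names are hypothetical assumptions, and this is one way to express the rule, not the patented implementation:

```python
def select_visible(objects, distances, now_minutes):
    # objects: name -> config dict; distances: name -> current distance to
    # the device; now_minutes: current time as minutes since midnight.
    # Field names are illustrative assumptions.
    selected = []
    for name, cfg in objects.items():
        if distances[name] >= cfg["visible_distance"]:
            continue  # farther than the visible distance: not presented
        start, end = cfg["visible_time_period"]
        if not (start <= now_minutes <= end):
            continue  # outside the visible time period: not presented
        selected.append(name)
    return selected

objects = {
    "plant_1531": {"visible_distance": 50.0, "visible_time_period": (480, 1080)},
    "cloud_1532": {"visible_distance": 120.0, "visible_time_period": (0, 1439)},
}
distances = {"plant_1531": 30.0, "cloud_1532": 200.0}
print(select_visible(objects, distances, now_minutes=600))  # ['plant_1531']
```

At 10 a.m. (600 minutes) the plant is within 50 of the device and inside its 8:00-18:00 window, while the cloud is too far away, so only the plant is selected.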
At block 230, the terminal device 110 renders the selected at least a portion of the plurality of virtual objects in the AR scene 150.
FIG. 4 shows a schematic diagram of the AR scene 150 according to some embodiments of the present disclosure. The AR scene 150 shown in FIG. 4 is the scene presented on the terminal device 110 after the terminal device 110 has moved within the predetermined geographic area from the position corresponding to the AR scene 150 in FIG. 3. When the terminal device 110 is at the current position corresponding to the AR scene in FIG. 4, either the current distances between virtual objects 1531 and 1532 and the terminal device 110 are less than the corresponding visible distances while the current distance between virtual object 1533 and the terminal device 110 is greater than the corresponding visible distance, or the current distances between all of virtual objects 1531, 1532, and 1533 and the terminal device 110 are less than the corresponding visible distances, but virtual objects 1531 and 1532 are within their visible time periods while virtual object 1533 is outside its visible time period. Therefore, the terminal device 110 can select virtual objects 1531 and 1532 in the manner described above in connection with block 220, and render virtual objects 1531 and 1532 in the AR scene 150 using the corresponding resource files (shown in solid lines in FIG. 4), while not rendering virtual object 1533 (shown in dashed lines in FIG. 4).
FIG. 5 shows a schematic diagram of the AR scene 150 according to some embodiments of the present disclosure. The AR scene 150 shown in FIG. 5 is the scene presented on the terminal device 110 after the terminal device 110 has moved within the predetermined geographic area from the position corresponding to the AR scene 150 in FIG. 3 or FIG. 4. When the terminal device 110 is at the current position corresponding to the AR scene in FIG. 5, the current distances between all of virtual objects 1531, 1532, and 1533 and the terminal device 110 are less than the corresponding visible distances, and virtual objects 1531, 1532, and 1533 are all within their corresponding visible time periods. Therefore, the terminal device 110 can select virtual objects 1531, 1532, and 1533 in the manner described above in connection with block 220, and render virtual objects 1531, 1532, and 1533 in the AR scene 150 using the corresponding resource files (shown in solid lines in FIG. 5).
In some embodiments, when the user has an AR experience within the predetermined geographic area using the terminal device 110, the terminal device 110 can refresh automatically at certain time intervals and dynamically load virtual objects in the AR scene 150 according to the current position of the terminal device 110 and the system time. For example, the terminal device 110 can use a script to read the information in the Json configuration file item by item; if a virtual object's coordinates are at a distance from the terminal device 110 that is less than the corresponding visible distance and the system time is within the corresponding visible time period, an instance of the corresponding virtual object is created in the AR scene 150, and the render blend mode, orientation, and so on used for rendering are set according to the values specified in the Json configuration file. If an instance of a virtual object already exists in the AR scene 150 and the distance between that virtual object and the terminal device 110 has exceeded the visible distance, the instance of that virtual object is deleted from the AR scene 150. If an instance of a virtual object already exists in the AR scene 150 and, after the refresh, the distance between that virtual object and the terminal device 110 is still less than the visible distance, the instance of that virtual object remains unchanged in the AR scene 150.
In embodiments according to the present disclosure, dynamically loading at least a portion of the plurality of virtual objects based on the current distance between the virtual objects to be presented and the electronic device can, on the one hand, avoid the blurred visual hierarchy caused by nearby and distant virtual objects being visible at the same time, and, on the other hand, avoid the impact on the performance of the terminal device 110 caused by loading a large number of virtual objects, thereby improving the AR experience over a large activity area.
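One way to sketch a single refresh tick of this kind (create newly visible instances, delete out-of-range instances, leave the rest unchanged); all names and configuration fields here are assumptions for illustration, not the actual script:

```python
def refresh(active, objects, distances, now_minutes):
    # active: set of object names currently instantiated in the scene.
    # Returns (to_create, to_delete, to_keep); the visibility test mirrors
    # the distance and time-period rules. Names are illustrative assumptions.
    visible = set()
    for name, cfg in objects.items():
        start, end = cfg["visible_time_period"]
        if distances[name] < cfg["visible_distance"] and start <= now_minutes <= end:
            visible.add(name)
    to_create = visible - active   # newly visible: create an instance
    to_delete = active - visible   # beyond visible distance or time: delete
    to_keep = active & visible     # still visible: keep unchanged
    return to_create, to_delete, to_keep

objects = {
    "plant_1531": {"visible_distance": 50.0, "visible_time_period": (480, 1080)},
    "car_1533": {"visible_distance": 80.0, "visible_time_period": (480, 1080)},
}
created, deleted, kept = refresh(
    active={"car_1533"},
    objects=objects,
    distances={"plant_1531": 30.0, "car_1533": 200.0},
    now_minutes=600,
)
print(created, deleted, kept)  # {'plant_1531'} {'car_1533'} set()
```

Running such a tick on a timer reproduces the described behavior: instances appear as the device approaches, disappear as it moves away, and are otherwise left alone.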
In some embodiments, when rendering the selected at least a portion of the plurality of virtual objects in the AR scene 150, if the resource files of one or more virtual objects in the selected at least a portion have not yet been downloaded, the terminal device 110 sends the server a request to download the resource files of the one or more virtual objects. Subsequently, the terminal device 110 receives the resource files of the one or more virtual objects from the server for rendering in the AR scene 150. Conversely, virtual objects in the selected at least a portion whose resource files have already been downloaded do not need to be downloaded again.
The terminal device 110 can save a newly downloaded resource file as an element, for example stored in a table. In this way, the terminal device 110 can determine, based on the elements stored in the table, whether the resource file of each virtual object has already been downloaded; if it has already been downloaded, it will not be downloaded again. In this way, when the terminal device 110 starts presenting the AR scene 150, it does not need to download the resource files of all virtual objects to be presented in the AR scene 150, but only the configuration file of the virtual objects. As the terminal device 110 changes position within the predetermined geographic area, the terminal device 110 can dynamically load the resource files of the virtual objects, saving storage space on the terminal device 110 and improving its performance.
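A minimal sketch of that download table: resource files are fetched at most once, keyed by object name. The `fetch` callable stands in for the actual server request and is an assumption, not a real API; field names are likewise illustrative:

```python
def ensure_downloaded(names, objects, downloaded, fetch):
    # downloaded: name -> previously fetched resource (the "table" of saved
    # elements); fetch(url) is a stand-in for the server download request.
    for name in names:
        if name in downloaded:
            continue  # already downloaded once: do not download again
        downloaded[name] = fetch(objects[name]["resource_download_path"])
    return downloaded

calls = []
def fake_fetch(url):
    calls.append(url)
    return b"resource-bytes"

objects = {"plant_1531": {"resource_download_path": "https://example.com/a/plant_1531.zip"}}
table = {}
ensure_downloaded(["plant_1531"], objects, table, fake_fetch)
ensure_downloaded(["plant_1531"], objects, table, fake_fetch)  # no second download
print(len(calls))  # 1
```

The second call finds the object already in the table and skips the fetch, which is the behavior the text describes for avoiding repeated downloads.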
As the terminal device 110 changes position within the predetermined geographic area, the current distance between some virtual objects and the terminal device 110 may change from less than the visible distance to greater than the visible distance. In this case, virtual objects whose current distance has changed from less than the visible distance to greater than the visible distance can be removed from the AR scene 150. In other words, the terminal device 110 can select, based on the current distances of the virtual objects rendered in the AR scene 150, at least a portion of the rendered virtual objects that should no longer be rendered, and remove that at least a portion from the AR scene 150.
In some embodiments, the terminal device 110 compares the current distance between each virtual object rendered in the AR scene 150 and the terminal device 110 with the corresponding visible distance, and, in response to the current distance of a second portion of the virtual objects rendered in the AR scene 150 being greater than the corresponding visible distance, selects the second portion to be removed from the AR scene 150. FIG. 6 shows a schematic diagram of the AR scene 150 according to some embodiments of the present disclosure. The AR scene 150 shown in FIG. 6 is the scene presented on the terminal device 110 after the terminal device 110 has moved within the predetermined geographic area from the position corresponding to the AR scene 150 in FIG. 5. When the terminal device 110 moves from the position corresponding to the AR scene in FIG. 5 to the position corresponding to the AR scene in FIG. 6, the current distances between virtual objects 1531 and 1533 and the terminal device 110 are still less than the corresponding visible distances, while the current distance between virtual object 1532 and the terminal device 110 changes from less than the visible distance to greater than the visible distance. Therefore, the terminal device 110 can remove virtual object 1532 from the AR scene 150 (shown in dashed lines in FIG. 6).
FIG. 7 shows a schematic structural block diagram of an apparatus 700 for augmented reality according to certain embodiments of the present disclosure. The apparatus 700 may be implemented as, or included in, the terminal device 110. The modules/components in the apparatus 700 may be implemented by hardware, software, firmware, or any combination thereof.
As shown in the figure, the apparatus 700 includes: a distance determination module 710 configured to determine a current distance between an electronic device for presenting an augmented reality scene and each of a plurality of virtual objects to be presented in the augmented reality scene; an object selection module 720 configured to select at least a portion of the plurality of virtual objects based on the current distance; and an object rendering module 730 configured to render the at least a portion of the plurality of virtual objects in the augmented reality scene.
In some embodiments, the object rendering module 730 is further configured to: in response to determining that resource files of one or more virtual objects in the selected at least a portion of the plurality of virtual objects have not yet been downloaded, send a request to download the resource files of the one or more virtual objects; and receive the resource files of the one or more virtual objects for rendering.
In some embodiments, the object selection module 720 is further configured to: compare the current distance of each of the plurality of virtual objects with the corresponding visible distance; and in response to the current distance of a first portion of the plurality of virtual objects being less than the corresponding visible distance, select the first portion.
In some embodiments, the apparatus 700 further includes: a second object selection module configured to select, based on the current distance, at least a portion of the virtual objects rendered in the augmented reality scene that should no longer be rendered; and an object removal module configured to remove the at least a portion that should no longer be rendered from the augmented reality scene.
In some embodiments, the second object selection module is further configured to: compare the current distance of each virtual object rendered in the augmented reality scene with the corresponding visible distance; and in response to the current distance of a second portion of the virtual objects rendered in the augmented reality scene being greater than the corresponding visible distance, select the second portion.
In some embodiments, the object selection module 720 is further configured to select the at least a portion for rendering further based on a visible time period of each of the plurality of virtual objects.
In some embodiments, the object rendering module 730 is further configured to perform the rendering based on a configuration file associated with the plurality of virtual objects, the configuration file describing at least one of the following items of information: visible time period, coordinates, size, direction, resource download path.
FIG. 8 shows a block diagram of an electronic device 800 in which one or more embodiments of the present disclosure can be implemented. It should be understood that the electronic device 800 shown in FIG. 8 is merely exemplary and should not constitute any limitation on the functionality and scope of the embodiments described herein. The electronic device 800 shown in FIG. 8 can be used to implement the terminal device 110 of FIG. 1.
As shown in FIG. 8, the electronic device 800 is in the form of a general-purpose computing device. Components of the electronic device 800 may include, but are not limited to, one or more processors or processing units 810, a memory 820, a storage device 830, one or more communication units 840, one or more input devices 850, and one or more output devices 860. The processing unit 810 may be a real or virtual processor and can perform various processes according to a program stored in the memory 820. In a multiprocessor system, multiple processing units execute computer-executable instructions in parallel to improve the parallel processing capability of the electronic device 800.
The electronic device 800 typically includes multiple computer storage media. Such media may be any available media accessible to the electronic device 800, including but not limited to volatile and non-volatile media, and removable and non-removable media. The memory 820 may be volatile memory (e.g., registers, cache, random access memory (RAM)), non-volatile memory (e.g., read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory), or some combination thereof. The storage device 830 may be a removable or non-removable medium and may include a machine-readable medium, such as a flash drive, a magnetic disk, or any other medium that can be used to store information and/or data (e.g., training data for training) and that can be accessed within the electronic device 800.
The electronic device 800 may further include additional removable/non-removable, volatile/non-volatile storage media. Although not shown in FIG. 8, a disk drive for reading from or writing to a removable, non-volatile magnetic disk (e.g., a "floppy disk") and an optical disc drive for reading from or writing to a removable, non-volatile optical disc may be provided. In these cases, each drive may be connected to a bus (not shown) by one or more data media interfaces. The memory 820 may include a computer program product 825 having one or more program modules configured to perform the various methods or actions of the various embodiments of the present disclosure.
The communication unit 840 enables communication with other computing devices over communication media. Additionally, the functionality of the components of the electronic device 800 may be implemented as a single computing cluster or as multiple computing machines capable of communicating over communication connections. Accordingly, the electronic device 800 may operate in a networked environment using logical connections to one or more other servers, networked personal computers (PCs), or another network node.
The input device 850 may be one or more input devices, such as a mouse, a keyboard, or a trackball. The output device 860 may be one or more output devices, such as a display, speakers, or a printer. The electronic device 800 may also communicate, as needed through the communication unit 840, with one or more external devices (not shown) such as storage devices and display devices, with one or more devices that enable a user to interact with the electronic device 800, or with any device (e.g., a network card, a modem, etc.) that enables the electronic device 800 to communicate with one or more other computing devices. Such communication may be performed via an input/output (I/O) interface (not shown).
According to exemplary implementations of the present disclosure, a computer-readable storage medium is provided, on which computer-executable instructions are stored, the computer-executable instructions being executed by a processor to implement the method described above. According to exemplary implementations of the present disclosure, a computer program product is also provided, the computer program product being tangibly stored on a non-transitory computer-readable medium and including computer-executable instructions, which are executed by a processor to implement the method described above.
Various aspects of the present disclosure are described herein with reference to flowcharts and/or block diagrams of methods, apparatuses, devices, and computer program products implemented according to the present disclosure. It should be understood that each block of the flowcharts and/or block diagrams, and combinations of blocks in the flowcharts and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processing unit of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus, thereby producing a machine such that, when executed by the processing unit of the computer or other programmable data processing apparatus, the instructions produce an apparatus that implements the functions/actions specified in one or more blocks of the flowcharts and/or block diagrams.
These computer-readable program instructions may also be stored in a computer-readable storage medium; these instructions cause a computer, a programmable data processing apparatus, and/or other devices to operate in a specific manner, such that the computer-readable medium storing the instructions comprises an article of manufacture that includes instructions implementing aspects of the functions/actions specified in one or more blocks of the flowcharts and/or block diagrams.
The flowcharts and block diagrams in the drawings show the possible architectures, functions, and operations of systems, methods, and computer program products according to multiple implementations of the present disclosure. In this regard, each block in the flowcharts or block diagrams may represent a module, a program segment, or a portion of instructions that contains one or more executable instructions for implementing the specified logical functions. In some alternative implementations, the functions noted in the blocks may occur out of the order noted in the drawings. For example, two consecutive blocks may in fact be executed substantially in parallel, or they may sometimes be executed in the reverse order, depending on the functions involved. It should also be noted that each block in the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or actions, or by a combination of dedicated hardware and computer instructions.
The implementations of the present disclosure have been described above. The foregoing description is exemplary, not exhaustive, and is not limited to the disclosed implementations. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described implementations. The terms used herein were chosen to best explain the principles of the implementations, their practical applications, or improvements over technologies in the market, or to enable others of ordinary skill in the art to understand the implementations disclosed herein.

Claims (11)

  1. A method for augmented reality, comprising:
    determining a current distance between an electronic device for presenting an augmented reality scene and each of a plurality of virtual objects to be presented in the augmented reality scene;
    selecting at least a portion of the plurality of virtual objects based on the current distance; and
    rendering the at least a portion of the plurality of virtual objects in the augmented reality scene.
  2. The method of claim 1, wherein rendering the at least a portion of the plurality of virtual objects in the augmented reality scene comprises:
    in response to determining that resource files of one or more virtual objects in the selected at least a portion of the plurality of virtual objects have not yet been downloaded, sending a request to download the resource files of the one or more virtual objects; and
    receiving the resource files of the one or more virtual objects for rendering.
  3. The method of claim 1 or 2, wherein selecting at least a portion of the plurality of virtual objects comprises:
    comparing the current distance of each of the plurality of virtual objects with a corresponding visible distance; and
    in response to the current distance of a first portion of the plurality of virtual objects being less than the corresponding visible distance, selecting the first portion.
  4. The method of claim 1 or 2, further comprising:
    selecting, based on the current distance, at least a portion of the virtual objects rendered in the augmented reality scene that should no longer be rendered; and
    removing the at least a portion that should no longer be rendered from the augmented reality scene.
  5. The method of claim 4, wherein selecting, from the virtual objects rendered in the augmented reality scene, at least a portion that should no longer be rendered comprises:
    comparing the current distance of each virtual object rendered in the augmented reality scene with a corresponding visible distance; and
    in response to the current distance of a second portion of the virtual objects rendered in the augmented reality scene being greater than the corresponding visible distance, selecting the second portion.
  6. The method of claim 1, wherein selecting at least a portion of the plurality of virtual objects comprises:
    selecting the at least a portion for rendering further based on a visible time period of each of the plurality of virtual objects.
  7. The method of claim 1, wherein rendering the at least a portion of the plurality of virtual objects in the augmented reality scene comprises:
    performing the rendering based on a configuration file associated with the plurality of virtual objects, the configuration file describing at least one of the following items of information: visible time period, coordinates, size, direction, resource download path.
  8. An apparatus for augmented reality, comprising:
    a distance determination module configured to determine a current distance between an electronic device for presenting an augmented reality scene and each of a plurality of virtual objects to be presented in the augmented reality scene;
    an object selection module configured to select at least a portion of the plurality of virtual objects based on the current distance; and
    an object rendering module configured to render the at least a portion of the plurality of virtual objects in the augmented reality scene.
  9. The apparatus of claim 8, wherein the object selection module is further configured to:
    compare the current distance of each of the plurality of virtual objects with a corresponding visible distance; and
    in response to the current distance of a first portion of the plurality of virtual objects being less than the corresponding visible distance, select the first portion.
  10. An electronic device, comprising:
    at least one processing unit; and
    at least one memory coupled to the at least one processing unit and storing instructions for execution by the at least one processing unit, the instructions, when executed by the at least one processing unit, causing the device to perform the method of any one of claims 1 to 7.
  11. A computer-readable storage medium having a computer program stored thereon, which, when executed by a processor, implements the method of any one of claims 1 to 7.
PCT/CN2023/115804 2022-09-21 2023-08-30 Method, apparatus, device and storage medium for augmented reality WO2024060949A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211151653.0A CN115578540A (zh) 2022-09-21 2022-09-21 Method, apparatus, device and storage medium for augmented reality
CN202211151653.0 2022-09-21

Publications (1)

Publication Number Publication Date
WO2024060949A1 true WO2024060949A1 (zh) 2024-03-28

Family

ID=84580431

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/115804 2022-09-21 2023-08-30 Method, apparatus, device and storage medium for augmented reality WO2024060949A1 (zh)

Country Status (2)

Country Link
CN (1) CN115578540A (zh)
WO (1) WO2024060949A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115578540A (zh) 2022-09-21 2023-01-06 Beijing Zitiao Network Technology Co., Ltd. Method, apparatus, device and storage medium for augmented reality

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10380803B1 (en) * 2018-03-26 2019-08-13 Verizon Patent And Licensing Inc. Methods and systems for virtualizing a target object within a mixed reality presentation
CN111815781A (zh) * 2020-06-30 2020-10-23 Beijing Sensetime Technology Development Co., Ltd. Augmented reality data presentation method and apparatus, device, and computer storage medium
CN112148130A (zh) * 2020-10-23 2020-12-29 Shenzhen Sensetime Technology Co., Ltd. Information processing method and apparatus, electronic device, and storage medium
CN114949851A (zh) * 2022-04-25 2022-08-30 Shanghai Sensetime Intelligent Technology Co., Ltd. Augmented reality interaction method and apparatus, electronic device, and storage medium
CN115578540A (zh) * 2022-09-21 2023-01-06 Beijing Zitiao Network Technology Co., Ltd. Method, apparatus, device and storage medium for augmented reality


Also Published As

Publication number Publication date
CN115578540A (zh) 2023-01-06

Similar Documents

Publication Publication Date Title
US10311548B2 (en) Scaling render targets to a higher rendering resolution to display higher quality video frames
KR101952983B1 (ko) Method for tile-based rendering of content and system for rendering content
US10878224B2 (en) Method and system for image processing
WO2024060949A1 (zh) Method, apparatus, device and storage medium for augmented reality
WO2017024964A1 (zh) Method and apparatus for quickly previewing pictures associated with an item
CN111882634B (zh) Image rendering method, apparatus, device, and storage medium
CN112672185B (zh) Augmented-reality-based display method, apparatus, device, and storage medium
CN109448050B (zh) Method and terminal for determining the position of a target point
TW202004674A (zh) Method, apparatus, and device for displaying rich text on a 3D model
CN112337091A (zh) Human-computer interaction method, apparatus, and electronic device
CN110476160B (zh) Presenting images on a map using orientation
CN112419430A (zh) Animation playback method, apparatus, and computer device
CN117557701A (zh) Image rendering method and electronic device
CN115311397A (zh) Method, apparatus, device, and storage medium for image rendering
WO2022052729A1 (zh) Virtual card display method and apparatus, computer device, and storage medium
CN116740254A (zh) Image processing method and terminal
US10909769B1 (en) Mixed reality based 3D sketching device and method
CN114913277A (zh) Method, apparatus, device, and medium for stereoscopic interactive display of an object
KR102238036B1 (ko) Image processing method and system
TWM589834U (zh) Augmented reality integration system
CN115174993B (zh) Method, apparatus, device, and storage medium for video production
KR102533209B1 (ko) Method and system for generating dynamic extended reality (XR) content
KR102276789B1 (ko) Video editing method and apparatus
CN112169332B (zh) 2D map display method, apparatus, device, and storage medium
CN113542846B (zh) AR bullet comment display method and apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23867240

Country of ref document: EP

Kind code of ref document: A1