WO2021073269A1 - Augmented reality data presentation method, apparatus, device, storage medium, and program - Google Patents

Augmented reality data presentation method, apparatus, device, storage medium, and program

Info

Publication number
WO2021073269A1
WO2021073269A1 · PCT/CN2020/111890 · CN2020111890W
Authority
WO
WIPO (PCT)
Prior art keywords
virtual object
data
display
moving
augmented reality
Prior art date
Application number
PCT/CN2020/111890
Other languages
English (en)
French (fr)
Inventor
侯欣如 (Hou Xinru)
栾青 (Luan Qing)
Original Assignee
北京市商汤科技开发有限公司 (Beijing SenseTime Technology Development Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京市商汤科技开发有限公司 (Beijing SenseTime Technology Development Co., Ltd.)
Priority to JP2020572499A (published as JP2022505002A)
Priority to SG11202013054YA
Priority to KR1020207037362A (published as KR102417786B1)
Priority to US17/131,988 (published as US20210110617A1)
Publication of WO2021073269A1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Definitions

  • This application relates to the field of augmented reality technology, and in particular, but not exclusively, to an augmented reality data presentation method, apparatus, device, computer storage medium, and computer program.
  • Augmented Reality (AR) technology is a technology that integrates virtual information with the real world. Augmented reality technology applies computer-generated text, images, three-dimensional models, music, video, and other virtual information to the real world to enhance it. Optimizing the effect of the augmented reality scene presented by an AR device and improving its interaction with the user are becoming increasingly important.
  • The embodiments of the present application provide an augmented reality data presentation method, apparatus, device, computer storage medium, and computer program.
  • An embodiment of the present application provides a method for presenting augmented reality data, the method including:
  • acquiring position data of an augmented reality (AR) device;
  • in a case where it is detected that the AR device meets a preset display condition that triggers display of a virtual object, determining display data including the movement state of the virtual object based on the moving position of the virtual object and the position data of the AR device;
  • the augmented reality data including the display data is displayed through the AR device.
  • In this way, the display data including the movement state of the virtual object can be determined, and the augmented reality data containing the display data can be displayed through the AR device, so that the AR device presents a movement state of the virtual object that matches the position of the AR device. For example, an augmented reality effect of the virtual object moving toward the position of the AR device is presented, so that the display of the virtual object is more integrated into the real scene and the flexibility of augmented reality data presentation is improved.
  • the moving position of the virtual object includes at least one of the following positions:
  • the preset initial position of the virtual object, the preset end position of the virtual object, and the position of the virtual object in the current moving state.
  • In this way, the movement path of the virtual object is generated based on the position of the AR device and the moving position of the virtual object, and the movement state is displayed based on the movement path, which increases the diversity and flexibility of the virtual object's display state, improves the integration of the virtual object into the real scene, and also improves the display effect of the AR scene.
  • the preset display condition includes that the location data of the AR device is located within the target area.
  • the preset display condition includes that the location data of the AR device is located within the target area, and the attribute information of the user associated with the AR device meets the preset attribute condition.
  • In this way, by setting preset display conditions, AR devices are screened; for the AR devices that meet the preset display conditions, display data including the movement state of the virtual object is determined based on their location data, and the display data is displayed on the corresponding AR devices.
  • determining the display data including the moving state of the virtual object includes:
  • the method further includes:
  • the determining the display data including the moving state of the virtual object based on the moving position of the virtual object and the position data of the AR device includes:
  • the display data including the moving state of the virtual object is determined.
  • In this way, different virtual objects are matched to users with different attribute information, and different display data is generated for different AR devices based on the different virtual objects, thereby diversifying the display data and improving the flexibility of AR scene presentation.
  • In some embodiments, the attribute information includes at least one of the following: the user's age, the user's gender, the user's occupational attributes, and virtual object information preset by the user as being of interest.
  • In this way, the server or the AR device can match the virtual object displayed by the AR device to the user's attribute information, or judge whether the AR device matches the display data according to the user's attribute information, so that different AR devices display different display data, improving the display effect.
  • the embodiment of the present application also provides another augmented reality data presentation method, the method includes:
  • acquiring position data of n first augmented reality (AR) devices, where n is a positive integer;
  • in a case where it is detected that m first AR devices among the n first AR devices meet a preset display condition that triggers display of a virtual object, determining, based on the moving position of the virtual object and the position data of the m first AR devices, display data that matches the m first AR devices and includes the movement state of the virtual object, where m is a positive integer less than or equal to n;
  • displaying, through the m first AR devices, the augmented reality data including the display data matched with the m first AR devices.
  • In this way, display data including the movement state of the virtual object that matches multiple AR devices is generated, so that the multiple AR devices can present the movement state of the virtual object within the target area. The display data, and the augmented reality data including it, are displayed, which realizes flexible display of virtual objects on multiple AR devices, meets the needs of real scenes, and improves the effect of displaying augmented reality data.
  • In some embodiments, the display data includes the movement state displayed while the virtual object moves along a movement path; the waypoints of the movement path include the locations of the m first AR devices; or, the waypoints of the movement path include location points within a set distance of the locations of the m first AR devices.
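  • As an illustration of how a movement path could visit the m first AR devices as waypoints, the following Python sketch orders the device positions with a greedy nearest-neighbor heuristic; the application does not fix a planning algorithm, so the function name and heuristic are assumptions.

```python
import math

def plan_multi_device_path(start, device_positions, end):
    """Sketch: build a movement path whose waypoints include the
    positions of the m first AR devices (greedy nearest-neighbor
    ordering; the heuristic is an assumption, not mandated here)."""
    remaining = list(device_positions)
    path = [start]
    current = start
    while remaining:
        # Visit the closest not-yet-passed device position next.
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        remaining.remove(nxt)
        path.append(nxt)
        current = nxt
    path.append(end)
    return path

# Example: two first AR devices between a start and end position.
print(plan_multi_device_path((0.0, 0.0), [(5.0, 0.0), (2.0, 0.0)], (10.0, 0.0)))
```

For the variant where waypoints only need to lie within a set distance of each device, each device position could instead be replaced by a nearby point within that radius.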
  • the method further includes:
  • in a case where it is detected that the second AR device meets the preset display condition, updating the display data including the movement state of the virtual object according to the position of the virtual object in its current moving state, the position data of the second AR device, and the position data of those first AR devices among the m first AR devices that the virtual object has not yet passed before the movement path is updated.
  • In this way, the movement path of the virtual object is updated in real time, realizing real-time display of the AR scene and improving the display effect.
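  • A minimal sketch of this update step, assuming the same greedy ordering idea as before: when a second AR device newly satisfies the display condition, the remainder of the path is rebuilt from the virtual object's current position through the not-yet-passed first devices plus the new device (names and heuristic are illustrative, not from the application).

```python
import math

def update_movement_path(current_pos, unvisited_first_positions, second_pos, end):
    """Sketch: rebuild the remaining movement path after a second AR
    device meets the preset display condition. Inputs are the virtual
    object's current position, the first-device positions it has not
    yet passed, the second device's position, and the end position."""
    waypoints = list(unvisited_first_positions) + [second_pos]
    path = [current_pos]
    current = current_pos
    while waypoints:
        # Greedily visit the closest remaining waypoint.
        nxt = min(waypoints, key=lambda p: math.dist(current, p))
        waypoints.remove(nxt)
        path.append(nxt)
        current = nxt
    path.append(end)
    return path
```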
  • An embodiment of the present application also provides an augmented reality data presentation device, which includes:
  • the location data acquisition module is configured to acquire the location data of the augmented reality (AR) device;
  • the display data determination module is configured to, in a case where it is detected that the AR device meets the preset display condition that triggers display of the virtual object, determine, based on the moving position of the virtual object and the position data of the AR device, display data including the movement state of the virtual object;
  • the first display module is configured to display the augmented reality data including the display data through the AR device.
  • the moving position of the virtual object in the display data determining module includes at least one of the following positions:
  • the preset initial position of the virtual object, the preset end position of the virtual object, and the position of the virtual object in the current moving state.
  • the preset display condition in the display data determining module includes that the location data of the AR device is located within a target area.
  • In some embodiments, the preset display condition in the display data determining module includes that the location data of the AR device is located within the target area, and the attribute information of the user associated with the AR device meets a preset attribute condition.
  • the display data determining module uses the following steps to determine the display data including the movement state of the virtual object:
  • the device further includes:
  • the virtual object matching module is configured to determine a virtual object matching the attribute information based on the attribute information of the user associated with the AR device when it is detected that the AR device meets the preset display condition that triggers the display of the virtual object;
  • the display data determining module uses the following steps to determine the display data including the movement state of the virtual object, including:
  • the display data including the moving state of the virtual object is determined.
  • the attribute information includes at least one of the following: user age, user gender, user occupational attribute, and interested virtual object information preset by the user.
  • An embodiment of the present application also provides another augmented reality data presentation device, which includes:
  • the first obtaining module is configured to obtain position data of n first augmented reality AR devices, where n is a positive integer;
  • the first determining module is configured to, when it is detected that m first AR devices among the n first AR devices meet a preset display condition that triggers display of the virtual object, determine, based on the moving position of the virtual object and the position data of the m first AR devices, display data that matches the m first AR devices and includes the movement state of the virtual object, where m is a positive integer less than or equal to n;
  • the second display module is configured to display the augmented reality data of the display data matched with the m first AR devices through the m first AR devices.
  • In some embodiments, the display data in the first determining module includes the movement state displayed while the virtual object moves along the movement path; the waypoints of the movement path include the locations of the m first AR devices; or, the waypoints of the movement path include location points within a set distance of the locations of the m first AR devices.
  • the device further includes:
  • the second acquiring module is configured to acquire location data of the second AR device;
  • the second determining module is configured to, when it is detected that the second AR device meets the preset display condition, update the display data including the movement state of the virtual object according to the position of the virtual object in its current moving state, the position data of the second AR device, and the position data of those first AR devices among the m first AR devices that have not been passed before the movement path is updated.
  • An embodiment of the present application further provides an electronic device, including a processor, a memory, and a bus. The memory stores machine-readable instructions executable by the processor, and the processor and the memory communicate through the bus. When the machine-readable instructions are executed by the processor, the augmented reality data presentation method described in the first aspect or any one of its implementation manners is executed, or the augmented reality data presentation method described in the second aspect or any one of its implementation manners is executed.
  • An embodiment of the present application also provides a computer-readable storage medium storing a computer program. When the computer program is run by a processor, the augmented reality data presentation method described in the first aspect or any one of its embodiments is executed, or the augmented reality data presentation method described in the second aspect or any one of its embodiments is executed.
  • An embodiment of the present application also provides a computer program, including computer-readable code; when the computer-readable code is executed in an electronic device, a processor in the electronic device executes steps for realizing any of the foregoing augmented reality data presentation methods.
  • In this way, the display data including the movement state of the virtual object can be determined, and the augmented reality data containing the display data can be displayed through the AR device, so that the AR device presents a movement state of the virtual object that matches the position of the AR device, for example, an augmented reality effect of the virtual object moving to the position of the AR device, so that the display of the virtual object is more integrated into the real scene and the flexibility of augmented reality data presentation is improved.
  • FIG. 1 shows a schematic flowchart of an augmented reality data presentation method provided by an embodiment of the present application
  • FIG. 2A shows a schematic diagram of an initial movement path provided by an embodiment of the present application
  • FIG. 2B shows a schematic diagram of an updated moving path based on an initial moving path according to an embodiment of the present application
  • FIG. 3A shows a schematic diagram of an image in display data of an AR device provided by an embodiment of the present application
  • FIG. 3B shows a schematic diagram of an image in display data of another AR device provided by an embodiment of the present application.
  • FIG. 4 shows a schematic flowchart of another augmented reality data presentation method provided by an embodiment of the present application
  • FIG. 5A shows a schematic diagram of a movement path provided by an embodiment of the present application
  • FIG. 5B shows a schematic diagram of an updated movement path provided by an embodiment of the present application.
  • FIG. 6 shows a schematic structural diagram of an augmented reality data presentation device provided by an embodiment of the present application
  • FIG. 7 shows a schematic structural diagram of another augmented reality data presentation device provided by an embodiment of the present application.
  • FIG. 8 shows a schematic structural diagram of a first electronic device provided by an embodiment of the present application.
  • FIG. 9 shows a schematic structural diagram of a second electronic device provided by an embodiment of the present application.
  • An embodiment of the present application provides a method for presenting augmented reality data, in which display data of the movement state of a virtual object is determined from the moving position of the virtual object and the position data of an AR device, and augmented reality data containing the display data is displayed through the AR device, thereby realizing the display of augmented reality scenes by the AR device.
  • The augmented reality data presentation method can obtain the display data corresponding to the movement state of the virtual object to be presented on a single AR device based on that device's position data, or it can perform unified calculation on the position data of multiple AR devices in the real scene to determine the display data corresponding to the movement states of the virtual objects to be presented on the multiple AR devices. For example, in the case of multiple AR devices, the movement state of the virtual object presented on each AR device may be affected by the position data of the other AR devices.
  • the AR device is a smart device capable of supporting AR functions.
  • the AR device includes, but is not limited to, electronic devices capable of presenting augmented reality effects such as mobile phones, tablet computers, and AR glasses.
  • Referring to FIG. 1, which is a schematic flowchart of an augmented reality data presentation method provided by an embodiment of this application, the method can be applied to the above-mentioned AR device, or to a local or cloud server.
  • The augmented reality data presentation method shown in FIG. 1 includes the following steps:
  • S101: Acquire location data of an augmented reality (AR) device.
  • S102: In a case where it is detected that the AR device meets a preset display condition that triggers display of a virtual object, determine display data including the movement state of the virtual object based on the moving position of the virtual object and the position data of the AR device.
  • S103: Display the augmented reality data including the display data through the AR device.
  • S101 to S103 are respectively described below.
  • the location data of the AR device includes the location data of the AR device in the real scene.
  • The location data can be the three-dimensional coordinate data of the AR device in a preset reference coordinate system, or the latitude and longitude data corresponding to the AR device.
  • In some embodiments, methods for obtaining the location data of the AR device include, but are not limited to, the Global Positioning System (GPS) and other satellite positioning systems. It is also possible to obtain a current real scene image through the AR device and perform actual geographic positioning based on the real scene image, for example, by performing image recognition on the real scene image to determine the geographic location information of the real scene corresponding to the image. When performing image recognition, the recognition can be based on a pre-trained position prediction model, or can be performed by comparing the real scene image with pre-stored sample images.
  • the initial display data including the movement state of the virtual object may be determined based on the movement position of the virtual object and the preset initial movement path of the virtual object.
  • the initial movement path of the virtual object may not be changed, and thus the initial display data may not be changed.
  • In a case where it is detected that the AR device meets the preset display condition that triggers display of the virtual object, the movement path of the virtual object is regenerated, and based on the regenerated movement path, display data of the changed movement state is determined, presenting the updated movement state of the virtual object in the current AR scene.
  • Referring to FIG. 2A, a schematic diagram of an initial movement path is shown.
  • the initial movement path in FIG. 2A includes a preset initial position 21 of the virtual object and a preset end position 22 of the virtual object.
  • After the AR device meets the preset display condition, the movement path of the virtual object is regenerated; the regenerated movement path of the virtual object can take various forms, one of which is shown in FIG. 2B.
  • the movement path of the regenerated virtual object includes a preset initial position 21, a preset end position 22 of the virtual object, and a position 23 of the AR device.
  • the moving position of the virtual object includes at least one of the following positions: the preset initial position of the virtual object, the preset end position of the virtual object, and the position of the virtual object in the current moving state.
  • the preset initial position of the virtual object and the preset end position of the virtual object can be set according to actual conditions.
  • the position of the virtual object in the current moving state may be the real-time position of the virtual object under the three-dimensional scene model when the display data is determined.
  • the three-dimensional scene model is a model used to characterize the real scene, and the model can fully describe the appearance characteristics of the real scene.
  • The scale between the three-dimensional scene model and the real scene is 1:1. By designing the display special effects of the virtual object in the real scene based on the three-dimensional scene model, the display special effects of the virtual object can be better integrated with the real scene.
  • In implementation, a movement path matching the position data of the AR device and the moving position of the virtual object can be generated based on both, for example, a movement path from the virtual object's current real-time position or preset initial position, through the position of the AR device, to the virtual object's preset end position; the movement state of the virtual object along this path is then presented in the current AR scene. It can be seen that generating the moving path of the virtual object based on the position of the AR device, and displaying the movement state based on that path, increases the diversity and flexibility of the virtual object's display state, improves the integration of the virtual object into the real scene, and also improves the display effect of AR scenes.
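  • The single-device case can be sketched as follows: a path from the virtual object's initial (or current) position, through the AR device's position, to the preset end position, sampled into intermediate positions for animating the movement state. The function names, 2D coordinates, and linear interpolation are illustrative assumptions, not details of the application.

```python
def generate_movement_path(start, device_pos, end):
    """Sketch: a movement path that passes through the AR device's
    position between the virtual object's start and end positions."""
    return [start, device_pos, end]

def interpolate(path, steps_per_segment=10):
    """Sample intermediate 2D positions along each straight segment,
    giving per-frame positions for the virtual object's movement state."""
    points = []
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        for i in range(steps_per_segment):
            t = i / steps_per_segment
            points.append((x0 + (x1 - x0) * t, y0 + (y1 - y0) * t))
    points.append(path[-1])  # include the preset end position exactly
    return points
```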
  • the foregoing preset display condition may include that the location data of the AR device is located within the target area.
  • the target area range may be an area range that includes the virtual object, or may be an area range that does not include the virtual object.
  • the target area range may be a preset random area range, or may be a circular area centered on the position of the virtual object and a preset distance as a radius.
  • detecting that the AR device meets the preset display condition that triggers the display of the virtual object includes: determining whether the location data of the AR device is included in the preset area, and if so, the AR device meets the preset display condition.
  • In some embodiments, detecting that the AR device meets the preset display condition that triggers display of the virtual object may also include: determining whether the distance between the position of the AR device and the position of the virtual object is less than or equal to a preset distance, and if so, the AR device meets the preset display condition. This application does not limit the manner of judging whether the AR device meets the aforementioned preset display conditions.
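  • The two condition variants just described (location inside a target area, or distance to the virtual object within a preset radius) could be checked as in this Python sketch; the rectangular area and 2D coordinates are simplifying assumptions for illustration.

```python
import math

def meets_display_condition(device_pos, area_min, area_max,
                            virtual_pos=None, max_distance=None):
    """Sketch: True when the AR device triggers display of the virtual
    object, either because its location lies inside a rectangular target
    area or because it is within a preset distance of the virtual object."""
    x, y = device_pos
    # Variant (a): location data of the AR device is within the target area.
    if area_min[0] <= x <= area_max[0] and area_min[1] <= y <= area_max[1]:
        return True
    # Variant (b): distance to the virtual object is within a preset radius.
    if virtual_pos is not None and max_distance is not None:
        return math.dist(device_pos, virtual_pos) <= max_distance
    return False
```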
  • the preset display condition includes that the location data of the AR device is located within the target area and the attribute information of the user associated with the AR device meets the preset attribute condition.
  • the users associated with the AR device include, but are not limited to: the holder of the AR device, the user of the AR device, and users whose distance to the AR device is less than a set distance threshold.
  • In implementation, attribute information corresponding to the preset attribute condition is selected from the attribute information of the user associated with the AR device as first attribute information of the AR device, and based on the first attribute information of the AR device, it is determined whether the AR device meets the preset attribute condition.
  • the attribute information may include, but is not limited to, at least one of the following: user age, user gender, user occupation attribute, and user pre-set interested virtual object information.
  • determining the attribute information of a user includes but is not limited to the following implementation manners:
  • Method 1: determine the user's attribute information through the registration data corresponding to the AR device;
  • Method 2: determine the user's attribute information through the behavior data detected by the AR device;
  • Method 3: determine the user's attribute information through image recognition technology.
  • the registration data is data input by the user when the AR device is used.
  • the registration data may be basic information that the user fills in when using software with AR scene presentation.
  • the behavior data may be data stored on the AR device.
  • the behavior data may be the type of software browsed by the user, the duration of browsing the software, and so on.
  • the behavior data may also be data detected by the AR device in real time, for example, a trigger operation performed by the user on the AR device detected by the AR device, such as a gesture operation, a voice operation, and a key operation.
  • the attribute information includes, but is not limited to, the user's age, height, clothing, facial expression, etc.
  • the process of determining the attribute information of the user through the image recognition technology includes: obtaining the image data of the user, and obtaining the attribute information of the user from the image data of the user based on the image recognition technology.
  • The method of acquiring image data may be, for example, through a built-in camera of the AR device (such as a front camera), through a camera deployed in the real scene independently of the AR device, or by receiving the user's image data transmitted from other devices.
  • In this way, the server or the AR device can match the virtual object displayed by the AR device to the user's attribute information, or determine whether the AR device matches the display data according to the user's attribute information, so that different AR devices display different display data, improving the display effect.
  • For example, if the preset attribute condition is that the gender of the user is female, then when the location data of the AR device is within the target area and the first attribute information of the user associated with the AR device is "the gender of the user is female", the AR device satisfies the preset display condition; if the location data of the AR device is not within the target area, or the first attribute information of the user associated with the AR device is "the gender of the user is male", the AR device does not meet the preset display condition.
  • In this way, AR devices are screened through the preset display conditions; for AR devices that meet the preset display conditions, display data including the virtual object's movement state is determined based on their location data, and the display data is displayed on the corresponding AR devices, improving the flexibility of AR scene presentation.
  • determining the display data including the moving state of the virtual object includes:
  • the display data including the movement state of the virtual object is determined.
  • In implementation, the process of generating the movement path of the virtual object can be executed on the AR device or on the server. The AR device or the server generates the movement path of the virtual object from the moving position of the virtual object and the position data of the AR device by means of a preset path planning algorithm. The AR device obtains the movement path generated locally, or obtains the movement path generated by the server, and then merges the special effect data of the virtual object under the three-dimensional scene model matching the real scene with the obtained movement path, thereby determining the display data including the movement state of the virtual object.
  • the augmented reality data presentation method further includes matching corresponding virtual objects for the AR device according to user attribute information associated with the AR device.
  • the specific process is as follows:
  • the virtual object matching the attribute information is determined according to the attribute information of the user associated with the AR device.
  • In implementation, the virtual object corresponding to the AR device can be determined according to the attribute information of the user associated with the AR device: one or more pieces of attribute information are selected from the user's attribute information as second attribute information of the AR device, and a virtual object is matched to the AR device based on the second attribute information.
  • the second attribute information may be the same as or different from the first attribute information used when judging whether the AR device meets the preset attribute condition.
  • First, the case where the first attribute information is different from the second attribute information is exemplified. If the first attribute information is that the user's gender is female, the second attribute information can be the user's age, and different virtual objects can be matched to users of different ages. For example, for one age range the corresponding virtual object is a cartoon animal; if the user's age is 11 to 18, the corresponding virtual object is a game character; and if the user's age is 19 to 30, the corresponding virtual object is a celebrity, etc.
  • As an example where the first attribute information and the second attribute information are the same, take both to be the user's age. The first attribute information may be that the user's age is 20 or under (including 20). The second attribute information then determines the matched object: when the user's age is under 5, the corresponding virtual object is a cartoon animal; when the user's age is 5-10 (including 5 but not including 10), the corresponding virtual object is an animated character; and when the user's age is 10-20 (including both 10 and 20), the corresponding virtual object is a movie star.
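The age-based matching described above can be sketched as a simple lookup. This is an illustrative sketch only; the age ranges follow the example in the text, and the function name and category strings are hypothetical:

```python
def match_virtual_object(age: int) -> str:
    # Map the user's age (the second attribute information) to a
    # virtual object category, following the ranges in the example:
    # under 5 -> cartoon animal; 5-10 (incl. 5, excl. 10) -> animated
    # character; 10-20 (incl. both) -> movie star.
    if age < 5:
        return "cartoon_animal"
    if age < 10:
        return "animated_character"
    if age <= 20:
        return "movie_star"
    # Ages outside the example's ranges fall back to a default object.
    return "default_object"
```

A server-side matcher could consult such a table once the preset attribute condition has already been checked.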
  • In some embodiments, the movement state of the virtual object can also be customized for the user based on user needs. For example, if AR device 1 obtains indication information showing interest in celebrity 1, display data including the movement state of celebrity 1 is generated; if AR device 2 obtains indication information showing interest in celebrity 2, display data including the movement state of celebrity 2 is generated.
  • In the above description, after the virtual object matching the attribute information is determined, generating the display data including the moving state of the virtual object based on the moving position of the matched virtual object and the position data of the AR device includes: obtaining the moving position of the virtual object matching the attribute information; and determining, based on that moving position and the position data of the AR device, the display data including the moving state of the virtual object.
  • In this implementation, the moving positions of different virtual objects can be the same or different, and the moving position of a virtual object can be set according to actual needs. After the matched virtual object is determined based on the attribute information of the user associated with the AR device, the moving position of the virtual object matching the attribute information is first obtained; then, based on this moving position and the position data of the AR device, the moving path of the matched virtual object is generated; finally, the moving path and the special effect data of the matched virtual object under the three-dimensional scene model matching the real scene are used to determine the display data including the movement state of the matched virtual object.
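The two-stage flow above, first building a path through the device's location and then attaching per-position special effect data, might be sketched as follows. Positions are simplified to 2D tuples and the effect table stands in for the three-dimensional scene model's special effect data; all names are illustrative, not from the application:

```python
def generate_movement_path(start, device_pos, end):
    # Route the virtual object from its preset initial position,
    # through the AR device's location, to its preset end position.
    return [start, device_pos, end]

def build_display_data(path, effects_by_position):
    # Pair each waypoint with the special effect defined for it in
    # the scene model, falling back to a default effect elsewhere.
    return [{"position": p,
             "effect": effects_by_position.get(p, "default_effect")}
            for p in path]
```

The resulting list of position/effect pairs is one plausible shape for "display data including the movement state" before rendering.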
  • In the embodiments of the present application, the augmented reality data includes display data, such as an animation composed of multiple frames of images or a single frame of image, and may also include sound data, odor synthesis data, and the like. Illustratively, sound data is set at a preset frame image position in the display data to fuse the sound data with the display data; for example, when the virtual object shown in the display data moves to the position of the AR device, the AR device is made to play the preset sound data, which can be, for instance: "Hello, welcome to the fairy tale world." This presents both sound data and display data on the AR device and improves the effect of the AR device displaying augmented reality data.
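Setting sound data at a preset frame position, as described above, amounts to aligning clips with frame indices in the display data. A minimal sketch; the data layout is an assumption, not the application's actual format:

```python
def fuse_sound_with_frames(frames, sound_clips):
    # frames: ordered frame identifiers of the display data.
    # sound_clips: maps a frame index to the clip to play when that
    # frame is shown; frames without an entry carry no sound.
    return [{"frame": frame, "sound": sound_clips.get(i)}
            for i, frame in enumerate(frames)]
```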
  • FIG. 3A is a schematic diagram of an image in the display data of one AR device, and FIG. 3B is a schematic diagram of an image in the display data of another AR device. As shown in FIGS. 3A and 3B, the virtual object 33 has moved from position point 31 to position point 32, and the special effect data of the virtual object 33 at position point 31 is different from its special effect data at position point 32. When the virtual object 33 is at position point 32, sound data can be set so that the AR device plays the sound data: "Santa Claus, it's time to send a gift."
  • The augmented reality data presentation method provided by the embodiments of the present application generates, based on a single AR device, display data matching that AR device and including the movement state of the virtual object, where the virtual object in the display data is shown in a moving state based on the movement path; that is, the virtual object in the display data is dynamic. The AR device can thus display augmented reality data including this display data, which improves the effect of the AR device in displaying augmented reality data.
  • FIG. 4 is a schematic flowchart of another augmented reality data presentation method provided by an embodiment of this application. The method is applied to a server and to displaying augmented reality data on multiple AR devices. As shown in FIG. 4, the augmented reality data presentation method includes steps S401-S403, as follows:
  • S401: Acquire location data of n first augmented reality (AR) devices, where n is a positive integer.
  • In the embodiments of the present application, the location data of each of the n first AR devices is acquired. Illustratively, the location data of a first AR device can be obtained through GPS or a satellite positioning system. If, among the n first AR devices, there are a associated AR devices, where a is a positive integer less than or equal to n, the location data of any one of the associated AR devices can be obtained as the location data of each of the associated AR devices. The associated AR devices may be AR devices manually associated by users, or AR devices automatically associated by the server when they meet an association condition; illustratively, the association condition may be that the AR devices are connected to the same signal.
  • S402: When it is detected that m of the n first AR devices meet the preset display condition that triggers the display of the virtual object, the movement path of the virtual object is generated according to the moving position of the virtual object and the location data of the m first AR devices, and, based on the movement path and the special effect data of the virtual object under the three-dimensional scene model matching the real scene, the display data including the movement state of the virtual object corresponding to each of the m first AR devices is determined. The display data includes the movement state displayed while the virtual object moves along the movement path; the waypoints of the movement path include the locations of the m first AR devices, or points within a set distance from each of the m first AR devices' locations.
  • In specific implementation, the value of m can be updated as the number of AR devices that actually meet the above preset display condition changes. Specifically, when the location data of any of the first AR devices no longer meets the preset display condition, the waypoints of the movement path no longer include that AR device, and the value of m changes at that moment. Illustratively, if the initial value of m is 3, then when the location data of one of the three first AR devices no longer meets the preset display condition, that is, when the location data of one of the three first AR devices is outside the target area, the value of m changes; at that moment, the value of m is 2.
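The re-evaluation of m described above is essentially a filter over the first AR devices. A hedged sketch, with a rectangular target area standing in for the preset display condition; the area shape and data layout are assumptions:

```python
def qualifying_devices(devices, target_area):
    # Keep only the devices whose position still lies inside the
    # target area; len() of the result is the updated value of m.
    (xmin, ymin), (xmax, ymax) = target_area
    return [d for d in devices
            if xmin <= d["position"][0] <= xmax
            and ymin <= d["position"][1] <= ymax]
```

With three devices and one of them outside the area, the filter returns two devices, matching the m = 3 to m = 2 example above.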
  • Each remaining first AR device then matches display data including the movement state of the virtual object, where the display data includes the movement state displayed while the virtual object moves along the movement path, and the waypoints of the movement path include the locations of the two first AR devices.
  • In some embodiments, the augmented reality data presentation method further includes: acquiring the location data of a second AR device; and, when it is detected that the second AR device meets the preset display condition, updating the display data including the movement state of the virtual object according to the position of the virtual object in its current moving state, the location data of the second AR device, and the location data of those of the m first AR devices not yet passed before the movement path is updated.
  • In specific implementation, after determining, based on the moving position of the virtual object and the location data of the m first AR devices, the display data including the moving state of the virtual object matching the m first AR devices, the server acquires the location data of a second AR device when it detects one; if the second AR device meets the preset display condition, the display data including the movement state of the virtual object is updated, where the second AR device is an AR device other than the first AR devices.
  • Illustratively, Fig. 5A shows a schematic diagram of a moving path. As can be seen from Fig. 5A, the value of m is 2, and Fig. 5A includes the preset initial position 51 of the virtual object, the preset end position 52 of the virtual object, and the positions 53 of the two first AR devices. If at a certain moment a second AR device is detected, its location data is acquired; when the second AR device meets the preset display condition, the movement path is updated based on the position of the virtual object in its current moving state, the location data of the second AR device, and the positions of the first AR devices not yet passed.
  • The updated movement path corresponding to the movement path shown in FIG. 5A can take various forms; one of them is described as an example. The updated movement path is shown in FIG. 5B, which includes the position 54 of the virtual object in its current moving state, the position 55 of the second AR device, the preset end position 52 of the virtual object, and the position 53 of the first AR device not yet passed before the movement path was updated. Further, based on the updated movement path, the display data including the movement state of the virtual object is updated.
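The path update illustrated by FIGS. 5A and 5B can be sketched as rebuilding the route from the object's current position, through the newly detected device and the not-yet-passed first devices, to the preset end position. The nearest-first visiting order below is an assumption, since the application does not fix a particular path planning algorithm:

```python
def update_movement_path(current_pos, new_device_pos,
                         unvisited_positions, end_pos):
    # Waypoints still to visit: the second AR device plus the first
    # AR devices not passed before the update.
    pending = [new_device_pos, *unvisited_positions]
    path, pos = [current_pos], current_pos
    while pending:
        # Greedily visit the nearest remaining waypoint.
        nxt = min(pending, key=lambda p: (p[0] - pos[0]) ** 2
                                         + (p[1] - pos[1]) ** 2)
        pending.remove(nxt)
        path.append(nxt)
        pos = nxt
    path.append(end_pos)  # finish at the preset end position
    return path
```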
  • S403: Display, through the m first AR devices, the augmented reality data of the display data respectively matched with the m first AR devices.
  • In specific implementation, if the value of m is 2, the server generates, for each of the two first AR devices, matching display data including the movement state of the virtual object and sends the two sets of display data to the corresponding first AR devices, so that each first AR device displays the corresponding augmented reality data including the display data.
  • The augmented reality data presentation method generates, based on multiple AR devices, display data including the movement state of the virtual object matched with each of the multiple AR devices, so that all of the multiple AR devices in the target area can receive display data including the movement state of the virtual object and display the augmented reality data including that display data. This realizes flexible display of the virtual object on multiple AR devices, meets the requirements of the real scene, and improves the effect of displaying augmented reality data.
  • Those skilled in the art can understand that, in the above methods, the writing order of the steps does not imply a strict execution order or constitute any limitation on the implementation process; the specific execution order of the steps should be determined by their functions and possible internal logic.
  • Based on the same concept, an embodiment of the present application also provides an augmented reality data presentation device. Referring to FIG. 6, a schematic diagram of the architecture of the augmented reality data presentation device provided by this embodiment of the present application, the device includes a location data acquisition module 61, a display data determining module 62, and a first display module 63. Specifically:
  • the location data acquisition module 61 is configured to acquire the location data of the augmented reality AR device
  • the display data determining module 62 is configured to determine the display data including the moving state of the virtual object based on the moving position of the virtual object and the position data of the AR device when it is detected that the AR device meets the preset display condition that triggers the display of the virtual object ;
  • the first display module 63 is configured to display augmented reality data including display data through the AR device.
  • the moving position of the virtual object in the display data determining module 62 includes at least one of the following positions:
  • the preset initial position of the virtual object, the preset end position of the virtual object, and the position of the virtual object in the current moving state.
  • the preset display condition in the display data determining module 62 includes that the location data of the AR device is located within the target area.
  • the preset display condition in the display data determining module 62 includes that the location data of the AR device is located within the target area and the attribute information of the user associated with the AR device meets the preset attribute condition.
  • In some embodiments, the display data determining module 62 determines the display data including the movement state of the virtual object by: obtaining the movement path of the virtual object based on the moving position of the virtual object and the position data of the AR device; and generating the display data including the movement state of the virtual object using the movement path and the special effect data of the virtual object under the three-dimensional scene model matching the real scene.
  • the device further includes:
  • the virtual object matching module 64 is configured to determine a virtual object matching the attribute information based on the attribute information of the user associated with the AR device when it is detected that the AR device meets the preset display condition that triggers the display of the virtual object;
  • the display data determining module 62 determines the display data including the movement state of the virtual object by: obtaining the moving position of the virtual object matching the attribute information; and determining, based on the moving position of the virtual object matching the attribute information and the position data of the AR device, the display data including the moving state of the virtual object.
  • the attribute information includes at least one of the following: user age, user gender, user occupational attribute, and interested virtual object information preset by the user.
  • The embodiment of the present application also provides another augmented reality data presentation device. Referring to FIG. 7, a schematic diagram of the architecture of another augmented reality data presentation device provided by this embodiment of the present application, the device includes a first acquisition module 71, a first determination module 72, and a second display module 73. Specifically:
  • the first obtaining module 71 is configured to obtain position data of n first augmented reality AR devices, where n is a positive integer;
  • the first determination module 72 is configured to, when it is detected that m of the n first AR devices meet the preset display condition that triggers the display of the virtual object, determine, based on the moving position of the virtual object and the location data of the m first AR devices, the display data including the movement state of the virtual object respectively matching the m first AR devices, where m is a positive integer less than or equal to n;
  • the second display module 73 is configured to display the augmented reality data of the display data matched with the m first AR devices through the m first AR devices.
  • In some embodiments, the display data in the first determination module 72 includes the moving state displayed while the virtual object moves along the moving path; the waypoints of the moving path include the locations of the m first AR devices, or location points within a set distance from each of the m first AR devices' locations.
  • the device further includes:
  • the second obtaining module 74 is configured to obtain the location data of the second AR device;
  • the second determining module 75 is configured to, when it is detected that the second AR device meets the preset display condition, update the display data including the movement state of the virtual object according to the position of the virtual object in the current moving state, the location data of the second AR device, and the location data of those of the m first AR devices not passed before the movement path is updated.
  • In some embodiments, the functions or modules contained in the apparatus provided in the embodiments of the application can be used to execute the methods described in the above method embodiments; for specific implementation, reference may be made to the descriptions of the above method embodiments, which will not be repeated here.
  • Based on the same technical concept, the embodiment of the present application also provides a first electronic device. Referring to FIG. 8, the first electronic device 800 includes a first processor 801, a first memory 802, and a first bus 803. The first memory 802 is used to store execution instructions and includes a first internal memory 8021 and a first external memory 8022; the first internal memory 8021, also called internal memory, is used to temporarily store operation data in the first processor 801, and the first processor 801 exchanges data with the first external memory 8022 through the first internal memory 8021.
  • The first processor 801 may be at least one of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a controller, a microcontroller, and a microprocessor.
  • The first internal memory 8021 or the first external memory 8022 can be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random-access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disk.
  • When the first electronic device 800 runs, the first processor 801 and the first memory 802 communicate through the first bus 803, so that the first processor 801 executes the following instructions: acquire the location data of the augmented reality (AR) device; when it is detected that the AR device meets the preset display condition that triggers the display of the virtual object, determine, based on the moving position of the virtual object and the position data of the AR device, the display data including the moving state of the virtual object; and display, through the AR device, augmented reality data including the display data.
  • Based on the same technical concept, the embodiment of the present application also provides a second electronic device. Referring to FIG. 9, the second electronic device 900 includes a second processor 901, a second memory 902, and a second bus 903. The second memory 902 is used to store execution instructions and includes a second internal memory 9021 and a second external memory 9022; the second internal memory 9021, also called internal memory, is used to temporarily store operation data in the second processor 901, and the second processor 901 exchanges data with the second external memory 9022 through the second internal memory 9021.
  • the second processor 901 may be at least one of ASIC, DSP, DSPD, PLD, FPGA, controller, microcontroller, and microprocessor.
  • The second internal memory 9021 or the second external memory 9022 can be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as SRAM, EEPROM, EPROM, PROM, ROM, magnetic memory, flash memory, a magnetic disk, or an optical disk.
  • When the second electronic device 900 runs, the second processor 901 and the second memory 902 communicate through the second bus 903, so that the second processor 901 executes the following instructions: acquire the location data of n first augmented reality (AR) devices, where n is a positive integer; when it is detected that m of the n first AR devices meet the preset display condition that triggers the display of the virtual object, determine, based on the moving position of the virtual object and the location data of the m first AR devices, the display data including the movement state of the virtual object respectively matching the m first AR devices, where m is a positive integer less than or equal to n; and display, through the m first AR devices, the augmented reality data of the display data respectively matched with the m first AR devices.
  • In addition, the embodiment of the present application also provides a computer-readable storage medium on which a computer program is stored; when the computer program is run by a processor, it executes the augmented reality data presentation method described in the above method embodiments.
  • The embodiment of the present application also provides a computer program, which includes a computer-readable storage medium storing program code; the instructions included in the program code can be used to execute the augmented reality data presentation method described in the foregoing method embodiments. For details, refer to the foregoing method embodiments, which will not be repeated here.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or they may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the functional units in the various embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • If the function is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a non-volatile computer-readable storage medium executable by a processor. Based on this understanding, the technical solution of the present application, in essence, or the part that contributes to the existing technology, or a part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions used to cause a computer device (which may be a personal computer, a server, a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • The aforementioned storage media include: a USB flash drive, a removable hard disk, read-only memory (ROM), random access memory (RAM), a magnetic disk, an optical disk, and other media that can store program code.
  • The embodiment of the application proposes an augmented reality data presentation method, device, electronic device, storage medium, and program. The method includes: acquiring the location data of an augmented reality (AR) device; when it is detected that the AR device meets the preset display condition that triggers the display of the virtual object, determining, based on the moving position of the virtual object and the position data of the AR device, the display data including the moving state of the virtual object; and displaying, through the AR device, the augmented reality data containing the display data.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An augmented reality data presentation method, apparatus, electronic device, storage medium, and program. The method includes: acquiring location data of an augmented reality (AR) device (S101); when it is detected that the AR device meets a preset display condition that triggers the display of a virtual object, determining display data including the moving state of the virtual object based on the moving position of the virtual object and the location data of the AR device (S102); and displaying, through the AR device, augmented reality data containing the display data (S103).

Description

Augmented Reality Data Presentation Method, Apparatus, Device, Storage Medium, and Program
Cross-Reference to Related Applications
This application is based on and claims priority to Chinese Patent Application No. 201910979898.4, filed on October 15, 2019, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of augmented reality technology, and relates to, but is not limited to, an augmented reality data presentation method, apparatus, device, computer storage medium, and computer program.
Background
Augmented reality (AR) technology fuses virtual information with the real world. It simulates computer-generated virtual information such as text, images, three-dimensional models, music, and video and applies it to the real world, thereby augmenting the real world. Optimizing the effect of the augmented reality scenes presented by AR devices and improving interactivity with users are becoming increasingly important.
发明内容
本申请实施例提供了一种增强现实数据呈现方法、装置、设备、计算机存储介质和计算机程序。
本申请实施例提供了一种增强现实数据呈现方法,所述方法包括:
获取增强现实(Augmented Reality,AR)设备的位置数据;
在检测到所述AR设备满足触发虚拟对象展示的预设展示条件的情况下,基于虚拟对象的移动位置和所述AR设备的位置数据,确定包括所述虚拟对象的移动状态的展示数据;
通过所述AR设备展示包含所述展示数据的增强现实数据。
采用上述方法,可以确定包括虚拟对象的移动状态的展示数据,并通过AR设备展示包含展示数据的增强现实数据,从而通过该AR设备可以呈现出与AR设备的位置所匹配的虚拟对象移动状态,例如呈现出虚拟对象向AR设备位置处移动的增强现实效果,从而使得虚拟对象的展示更加融入现实场景,提高了增强现实数据呈现的灵活性。
In some embodiments of this application, the moving position of the virtual object includes at least one of the following positions:
the preset initial position of the virtual object, the preset end position of the virtual object, and the position of the virtual object in its current moving state.
In the above implementation, generating the movement path of the virtual object based on the AR device's location and the virtual object's moving position, and displaying the moving state based on this path, increases the diversity and flexibility of the virtual object's display states, improves how well the virtual object blends into the real scene, and improves the display effect of the AR scene.
In some embodiments of this application, the preset display condition includes that the location data of the AR device is within a target area.
In some embodiments of this application, the preset display condition includes that the location data of the AR device is within the target area and that the attribute information of the user associated with the AR device meets a preset attribute condition.
In the above implementation, preset display conditions are set to filter AR devices; for the location data of AR devices that meet the preset display condition, display data including the moving state of the virtual object is determined and displayed on the corresponding AR devices, improving the flexibility of AR scene presentation.
In some embodiments of this application, determining the display data including the moving state of the virtual object based on the moving position of the virtual object and the location data of the AR device includes:
obtaining the movement path of the virtual object based on the moving position of the virtual object and the location data of the AR device;
generating the display data including the moving state of the virtual object using the movement path and the special effect data of the virtual object under the three-dimensional scene model matching the real scene.
In some embodiments of this application, the method further includes:
when it is detected that the AR device meets the preset display condition that triggers the display of the virtual object, determining, according to the attribute information of the user associated with the AR device, a virtual object matching the attribute information;
determining the display data including the moving state of the virtual object based on the moving position of the virtual object and the location data of the AR device includes:
obtaining the moving position of the virtual object matching the attribute information;
determining, based on the moving position of the virtual object matching the attribute information and the location data of the AR device, the display data including the moving state of the virtual object.
In the above implementation, different virtual objects are matched to users with different attribute information, and different display data is generated for different AR devices based on the different virtual objects, diversifying the display data and improving the flexibility of AR scene presentation.
In some embodiments of this application, the attribute information includes at least one of the following: user age, user gender, user occupational attribute, and information on virtual objects of interest preset by the user.
In the above implementation, by setting the user's attribute information, the server or AR device can match a displayed virtual object to the AR device based on the user's attribute information, or judge based on the user's attribute information whether to match display data to the AR device, so that different AR devices display different display data, improving the display effect.
An embodiment of this application also provides another augmented reality data presentation method, the method including:
acquiring location data of n first augmented reality (AR) devices, n being a positive integer;
when it is detected that m of the n first AR devices meet a preset display condition that triggers the display of a virtual object, determining, based on the moving position of the virtual object and the location data of the m first AR devices, display data including the moving state of the virtual object respectively matching the m first AR devices, m being a positive integer less than or equal to n;
displaying, through the m first AR devices, augmented reality data of the display data respectively matched with the m first AR devices.
In the above implementation, based on multiple AR devices, display data including the moving state of the virtual object is generated to match each of the multiple AR devices, so that all of the multiple AR devices in the target area can receive display data including the moving state of the virtual object and display the augmented reality data including that display data. This realizes flexible display of the virtual object on multiple AR devices, meets the requirements of the real scene, and improves the effect of displaying augmented reality data.
In some embodiments of this application, the display data includes the moving state displayed while the virtual object moves along a movement path; the waypoints of the movement path include the locations of the m first AR devices, or location points within a set distance of each of the m first AR devices' locations.
In some embodiments of this application, the method further includes:
acquiring location data of a second AR device;
when it is detected that the second AR device meets the preset display condition, updating the display data including the moving state of the virtual object according to the position of the virtual object in its current moving state, the location data of the second AR device, and the location data of those of the m first AR devices not yet passed before the movement path is updated.
In the above implementation, when the location data of the second AR device meets the preset display condition, the movement path of the virtual object is updated in real time based on the second AR device's location data, realizing real-time display of the AR scene and improving the display effect.
For descriptions of the effects of the following apparatus, electronic device, and so on, refer to the description of the above method, which will not be repeated here.
An embodiment of this application also provides an augmented reality data presentation apparatus, the apparatus including:
a location data acquisition module, configured to acquire location data of an augmented reality (AR) device;
a display data determining module, configured to, when it is detected that the AR device meets a preset display condition that triggers the display of a virtual object, determine display data including the moving state of the virtual object based on the moving position of the virtual object and the location data of the AR device;
a first display module, configured to display, through the AR device, augmented reality data containing the display data.
In some embodiments of this application, in the display data determining module, the moving position of the virtual object includes at least one of the following positions:
the preset initial position of the virtual object, the preset end position of the virtual object, and the position of the virtual object in its current moving state.
In some embodiments of this application, in the display data determining module, the preset display condition includes that the location data of the AR device is within a target area.
In some embodiments of this application, in the display data determining module, the preset display condition includes that the location data of the AR device is within the target area and that the attribute information of the user associated with the AR device meets a preset attribute condition.
In some embodiments of this application, the display data determining module determines the display data including the moving state of the virtual object by:
obtaining the movement path of the virtual object based on the moving position of the virtual object and the location data of the AR device;
generating the display data including the moving state of the virtual object using the movement path and the special effect data of the virtual object under the three-dimensional scene model matching the real scene.
In some embodiments of this application, the apparatus further includes:
a virtual object matching module, configured to, when it is detected that the AR device meets the preset display condition that triggers the display of the virtual object, determine, according to the attribute information of the user associated with the AR device, a virtual object matching the attribute information;
the display data determining module determines the display data including the moving state of the virtual object by:
obtaining the moving position of the virtual object matching the attribute information;
determining, based on the moving position of the virtual object matching the attribute information and the location data of the AR device, the display data including the moving state of the virtual object.
In a possible implementation, the attribute information includes at least one of the following: user age, user gender, user occupational attribute, and information on virtual objects of interest preset by the user.
An embodiment of this application also provides another augmented reality data presentation apparatus, the apparatus including:
a first acquisition module, configured to acquire location data of n first augmented reality (AR) devices, n being a positive integer;
a first determination module, configured to, when it is detected that m of the n first AR devices meet a preset display condition that triggers the display of a virtual object, determine, based on the moving position of the virtual object and the location data of the m first AR devices, display data including the moving state of the virtual object respectively matching the m first AR devices, m being a positive integer less than or equal to n;
a second display module, configured to display, through the m first AR devices, augmented reality data of the display data respectively matched with the m first AR devices.
In some embodiments of this application, in the first determination module, the display data includes the moving state displayed while the virtual object moves along a movement path; the waypoints of the movement path include the locations of the m first AR devices, or location points within a set distance of each of the m first AR devices' locations.
In some embodiments of this application, the apparatus further includes:
a second acquisition module, configured to acquire location data of a second AR device;
a second determination module, configured to, when it is detected that the second AR device meets the preset display condition, update the display data including the moving state of the virtual object according to the position of the virtual object in its current moving state, the location data of the second AR device, and the location data of those of the m first AR devices not yet passed before the movement path is updated.
An embodiment of this application also provides an electronic device, including a processor, a memory, and a bus; the memory stores machine-readable instructions executable by the processor; when the electronic device runs, the processor and the memory communicate through the bus; and when the machine-readable instructions are executed by the processor, they perform the augmented reality data presentation method described in the above first aspect or any implementation thereof, or the augmented reality data presentation method described in the above second aspect or any implementation thereof.
An embodiment of this application also provides a computer-readable storage medium on which a computer program is stored; when the computer program is run by a processor, it performs the augmented reality data presentation method described in the above first aspect or any implementation thereof, or the augmented reality data presentation method described in the above second aspect or any implementation thereof.
An embodiment of this application also provides a computer program including computer-readable code; when the computer-readable code runs in an electronic device, a processor in the electronic device executes steps for implementing any one of the above augmented reality data presentation methods.
The embodiments of this application have the following beneficial effects:
Display data including the moving state of the virtual object can be determined, and augmented reality data containing the display data can be displayed through the AR device, so that the AR device can present a moving state of the virtual object that matches the AR device's location, for example an augmented reality effect of the virtual object moving toward the AR device's location, making the display of the virtual object blend better into the real scene and improving the flexibility of augmented reality data presentation.
To make the above objects, features, and advantages of this application clearer and easier to understand, preferred embodiments are described in detail below with reference to the accompanying drawings.
Brief Description of the Drawings
To explain the technical solutions of the embodiments of this application more clearly, the drawings needed in the embodiments are briefly introduced below. The drawings here are incorporated into and constitute a part of this specification; they illustrate embodiments consistent with this application and, together with the specification, serve to explain the technical solutions of this application. It should be understood that the following drawings show only certain embodiments of this application and therefore should not be regarded as limiting its scope; for those of ordinary skill in the art, other related drawings can be obtained from these drawings without creative effort.
FIG. 1 shows a schematic flowchart of an augmented reality data presentation method provided by an embodiment of this application;
FIG. 2A shows a schematic diagram of an initial movement path provided by an embodiment of this application;
FIG. 2B shows a schematic diagram of a movement path updated based on the initial movement path, provided by an embodiment of this application;
FIG. 3A shows a schematic diagram of an image in the display data of an AR device provided by an embodiment of this application;
FIG. 3B shows a schematic diagram of an image in the display data of another AR device provided by an embodiment of this application;
FIG. 4 shows a schematic flowchart of another augmented reality data presentation method provided by an embodiment of this application;
FIG. 5A shows a schematic diagram of a movement path provided by an embodiment of this application;
FIG. 5B shows a schematic diagram of an updated movement path provided by an embodiment of this application;
FIG. 6 shows a schematic architecture diagram of an augmented reality data presentation apparatus provided by an embodiment of this application;
FIG. 7 shows a schematic architecture diagram of another augmented reality data presentation apparatus provided by an embodiment of this application;
FIG. 8 shows a schematic structural diagram of a first electronic device provided by an embodiment of this application;
FIG. 9 shows a schematic structural diagram of a second electronic device provided by an embodiment of this application.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of this application clearer, the technical solutions in these embodiments are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of this application. The components of the embodiments of this application generally described and shown in the drawings here can be arranged and designed in various different configurations. Therefore, the following detailed description of the embodiments of this application provided in the drawings is not intended to limit the claimed scope of this application but merely represents selected embodiments of this application. All other embodiments obtained by those skilled in the art based on the embodiments of this application without creative effort fall within the protection scope of this application.
To improve the optimization effect of the presented augmented reality scene and the flexibility of presenting it, an embodiment of this application provides an augmented reality data presentation method that determines display data including the moving state of a virtual object from the moving position of the virtual object and the location data of an AR device, and displays augmented reality data containing the display data through the AR device, realizing the display of the augmented reality scene on the AR device.
To facilitate understanding of the embodiments of this application, the augmented reality data presentation method to which the embodiments relate is first introduced in detail.
The augmented reality data presentation method provided by the embodiments of this application may obtain, for the location data of a single AR device, the display data corresponding to the moving state of the virtual object to be presented on that AR device, or may perform unified computation on the location data of multiple AR devices in the real scene to determine the display data corresponding to the moving states of the virtual objects to be presented on each of the multiple AR devices. Illustratively, in the case of multiple AR devices, the moving state of the virtual object presented on each AR device may be a moving state influenced by the location data of the other AR devices.
In the embodiments of this application, an AR device is a smart device capable of supporting AR functions; illustratively, AR devices include but are not limited to electronic devices capable of presenting augmented reality effects, such as mobile phones, tablet computers, and AR glasses.
Referring to FIG. 1, a schematic flowchart of an augmented reality data presentation method provided by an embodiment of this application, the method may be applied to the above AR device or to a local or cloud server.
The augmented reality data presentation method shown in FIG. 1 includes the following steps:
S101: Acquire location data of an augmented reality (AR) device.
S102: When it is detected that the AR device meets a preset display condition that triggers the display of a virtual object, determine display data including the moving state of the virtual object based on the moving position of the virtual object and the location data of the AR device.
S103: Display, through the AR device, augmented reality data containing the display data.
Based on the above steps, display data including the moving state of the virtual object can be determined, and augmented reality data containing the display data can be displayed through the AR device, so that a moving state of the virtual object matching the AR device's location can be presented through the AR device, for example an augmented reality effect of the virtual object moving toward the AR device's location, making the display of the virtual object blend better into the real scene and improving the flexibility of augmented reality data presentation. S101 to S103 are described below in turn.
Regarding S101:
In the embodiments of this application, the location data of the AR device includes its location data in the real scene, which may be three-dimensional coordinate data of the AR device in a preset reference coordinate system or the longitude and latitude data corresponding to the AR device.
Illustratively, methods of obtaining the AR device's location data include but are not limited to the Global Positioning System (GPS) and satellite positioning systems. It is also possible to capture an image of the current real scene through the AR device and locate the actual geographical position based on that image, for example by performing image recognition on the real-scene image to determine the geographical position information of the corresponding real scene. The image recognition may be performed based on a pre-trained position prediction model or by comparing the real-scene image with pre-stored sample images.
Regarding S102:
In this implementation, initial display data including the moving state of the virtual object can be determined based on the moving position of the virtual object and a preset initial movement path of the virtual object. After the AR device's location data is acquired, if it is detected that the AR device does not meet the preset display condition that triggers the display of the virtual object, the initial movement path of the virtual object may be left unchanged, and thus the initial display data is not changed. If it is detected that the AR device meets the preset display condition, the movement path of the virtual object is regenerated based on the moving position of the virtual object and the location data of the AR device, the display data of the moving state is changed based on the regenerated movement path, and the updated moving state of the virtual object is presented in the current AR scene.
Illustratively, FIG. 2A is a schematic diagram of an initial movement path; the initial movement path in FIG. 2A includes the preset initial position 21 of the virtual object and the preset end position 22 of the virtual object. When it is detected that the AR device meets the preset display condition that triggers the display of the virtual object, the movement path of the virtual object is regenerated based on the AR device's location data. The regenerated movement path can take various forms; FIG. 2B shows one of them. As can be seen from FIG. 2B, the regenerated movement path includes the preset initial position 21, the preset end position 22 of the virtual object, and the position 23 of the AR device.
In some embodiments of this application, the moving position of the virtual object includes at least one of the following positions: the preset initial position of the virtual object, the preset end position of the virtual object, and the position of the virtual object in its current moving state.
In specific implementation, the preset initial position and the preset end position of the virtual object can be set according to the actual situation. The position of the virtual object in its current moving state may be the real-time position of the virtual object in the three-dimensional scene model at the time the display data is determined. Here, the three-dimensional scene model is a model representing the real scene that comprehensively describes its appearance characteristics, generally at a 1:1 scale with the real scene. Designing the display special effects of the virtual object in the real scene based on this three-dimensional scene model makes the special effects blend better into the real scene.
Once the above moving state is determined, a movement path matching the AR device's location data and the virtual object's moving position can be generated based on them, for example a movement path from the virtual object's current real-time position or preset initial position, through the AR device's location, to the virtual object's preset end position; the moving state of the virtual object along this path is then presented in the current AR scene. It can be seen that this manner of generating the virtual object's movement path based on the AR device's location, and displaying the moving state based on that path, increases the diversity and flexibility of the virtual object's display states and improves how well the virtual object blends into the real scene, that is, it improves the display effect of the AR scene.
In some embodiments of this application, the preset display condition may include that the location data of the AR device is within a target area. The target area may be an area that includes the virtual object or an area that does not. Illustratively, the target area may be a preset irregular area, or a circular area centered on the virtual object's position with a preset distance as its radius.
In this implementation, detecting that the AR device meets the preset display condition that triggers the display of the virtual object includes: judging whether the preset area contains the AR device's location data, and if so, the AR device meets the preset display condition. Alternatively, it includes: judging whether the distance between the AR device's position and the virtual object's position is less than or equal to a preset distance, and if so, the AR device meets the preset display condition. This application does not limit the manner of judging whether the AR device meets the preset display condition.
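Both judgment manners above reduce to simple geometric checks. A minimal sketch, assuming 2D coordinates and a rectangular region for the first manner (the application also allows irregular regions, and the function names are illustrative):

```python
import math

def in_target_area(device_pos, area):
    # Manner 1: does the preset area contain the device's location?
    (xmin, ymin), (xmax, ymax) = area
    return xmin <= device_pos[0] <= xmax and ymin <= device_pos[1] <= ymax

def within_trigger_radius(device_pos, obj_pos, radius):
    # Manner 2: is the device within the preset distance of the
    # virtual object's position?
    return math.dist(device_pos, obj_pos) <= radius
```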
In some embodiments of this application, the preset display condition includes that the location data of the AR device is within the target area and that the attribute information of the user associated with the AR device meets a preset attribute condition.
In this implementation, the users associated with an AR device include but are not limited to: the holder of the AR device, the user of the AR device, and users whose distance from the AR device is less than a set distance threshold.
In some embodiments of this application, attribute information corresponding to the preset attribute condition is selected from the attribute information of the user associated with the AR device as the first attribute information of the AR device, and whether the AR device meets the preset attribute condition is judged based on this first attribute information.
In some embodiments of this application, the attribute information may include but is not limited to at least one of the following: user age, user gender, user occupational attribute, and information on virtual objects of interest preset by the user.
本实施方式中,确定用户的属性信息包括但不限于以下几种实施方式:
方式一,通过AR设备对应的注册数据确定用户的属性信息;
方式二,通过AR设备检测到的行为数据确定用户的属性信息;
方式三,通过图像识别技术确定用户的属性信息。
示例性的,注册数据为AR设备使用时用户输入的数据,例如,AR设备为手机时,则注册数据可以为用户使用具备AR场景呈现的软件时填写的基本信息等。
示例性的,行为数据可以是AR设备上存储的数据,例如行为数据可以为用户浏览的软件的类型、浏览软件的时长等。或者,行为数据还可以是AR设备实时检测到的数据,例如为AR设备检测到的用户对AR设备执行的触发操作,如手势操作、语音操作、按键操作等。
示例性的,属性信息包括但不限于用户年龄、身高、穿着、表情等。一种可能的实施方式中,通过图像识别技术确定用户的属性信息的过程包括:获取用户的图像数据,基于图像识别技术从用户的图像数据中获取用户的属性信息。获取图像数据的方式例如为通过AR设备内置摄像头(如前置摄像头)获取,或者,通过现实场景中部署的独立于AR设备之外的摄像头来获取,或者,还可以通过其它设备传输给AR设备的用户图像数据的方式来获取。在具体实施时,通过设置用户的属性信息,使得服务器或者AR设备可以针对用户的属性信息为AR设备匹配展示的虚拟对象,或者针对用户的属性信息判断是否为该AR设备匹配展示数据,实现针对不同的AR设备展示不同的展示数据的效果,提高展示的效果。
For example, if the preset attribute condition is that the user's gender is female, then when the position data of the AR device falls within the target area and the first attribute information of the user associated with the AR device is "gender: female", the AR device satisfies the preset presentation condition; when the position data of the AR device does not fall within the target area, or the first attribute information of the associated user is "gender: male", the AR device does not satisfy the preset presentation condition. By setting the preset presentation condition, AR devices are filtered: presentation data including the moving state of the virtual object is determined for the position data of AR devices that satisfy the condition and presented on the corresponding devices, improving the flexibility of AR scene presentation.
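The combined condition (target area plus user attribute) from the example above can be sketched as follows. The rectangular region test, the attribute dictionary keys, and the sample values are illustrative assumptions standing in for whatever area shape and attribute source an implementation actually uses.

```python
# Hedged sketch of the combined preset presentation condition:
# position within the target area AND user attribute matches.
def in_target_area(device_xy, area):
    """area is an axis-aligned rectangle (xmin, ymin, xmax, ymax) here,
    standing in for any preset regular or irregular target area."""
    x, y = device_xy
    xmin, ymin, xmax, ymax = area
    return xmin <= x <= xmax and ymin <= y <= ymax

def meets_presentation_condition(device_xy, area, user_attrs, required_gender):
    return in_target_area(device_xy, area) and user_attrs.get("gender") == required_gender

attrs = {"gender": "female", "age": 25}
print(meets_presentation_condition((2, 2), (0, 0, 5, 5), attrs, "female"))  # True
print(meets_presentation_condition((9, 2), (0, 0, 5, 5), attrs, "female"))  # False: outside area
```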
In some embodiments of the present application, determining, based on the moving position of the virtual object and the position data of the AR device, the presentation data including the moving state of the virtual object includes:
acquiring a moving path of the virtual object based on the moving position of the virtual object and the position data of the AR device; and
determining the presentation data including the moving state of the virtual object using the moving path and special-effect data of the virtual object under a three-dimensional scene model matching the real scene.
In specific implementation, generating the moving path of the virtual object may be performed either on the AR device or on a server. The AR device or the server, using a configured path-planning algorithm, generates the moving path of the virtual object based on the moving position of the virtual object and the position data of the AR device. After obtaining the locally generated moving path, or the moving path generated by the server, the AR device fuses the special-effect data of the virtual object under the three-dimensional scene model matching the real scene with the obtained moving path to determine the presentation data including the moving state of the virtual object.
In some embodiments of the present application, the augmented reality data presentation method further includes matching a virtual object to the AR device according to the attribute information of the user associated with the AR device, as follows:
when it is detected that the AR device satisfies the preset presentation condition for triggering presentation of the virtual object, determining, according to the attribute information of the user associated with the AR device, a virtual object matching that attribute information.
In this implementation, when there are multiple virtual objects, the virtual object for the AR device may be determined according to the attribute information of the user associated with the AR device. One or more items of attribute information are selected from the user's attribute information as second attribute information of the AR device, and a virtual object is matched to the AR device based on this second attribute information. The second attribute information may be the same as or different from the first attribute information used to determine whether the AR device meets the preset attribute condition. As an example of the two being different: if the first attribute information is "gender: female", the second attribute information may be the user's age, with different virtual objects matched for different ages. Specifically, for users aged 3 to 10, the matched virtual object is a cartoon animal; for users aged 11 to 18, a game character; for users aged 19 to 30, a celebrity; and so on. As an example of the two being the same, taking both as the user's age: the first attribute information may be that the user is 20 years old or younger; for the second attribute information, when the user is under 5, the matched virtual object is a cartoon animal; when the user is between 5 (inclusive) and 10 (exclusive), an animated character; and when the user is between 10 and 20 (both inclusive), a film or television star.
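The age-bracket matching in the second example above can be sketched directly. The brackets and object names reproduce the example; the fallback for ages outside the exemplified brackets is an added assumption.

```python
# Hedged sketch of matching a virtual object by user age (second attribute
# information); brackets follow the example in the description above.
def match_virtual_object(age: int) -> str:
    if age < 5:
        return "cartoon animal"
    if 5 <= age < 10:
        return "animated character"
    if 10 <= age <= 20:
        return "film or television star"
    return "default object"  # assumption: ages outside the exemplified brackets

print(match_virtual_object(4))   # cartoon animal
print(match_virtual_object(7))   # animated character
print(match_virtual_object(15))  # film or television star
```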
By matching different virtual objects to users with different attribute information, and then generating different presentation data for different AR devices based on those virtual objects, the presentation data is diversified and the presentation effect of the AR devices is improved.
In some embodiments of the present application, the moving state of the virtual object may also be customized for the user based on the user's needs. For example, if AR device 1 obtains indication information expressing interest in celebrity 1, presentation data including the moving state of celebrity 1 is generated; if AR device 2 obtains indication information expressing interest in celebrity 2, presentation data including the moving state of celebrity 2 is generated.
In the above description, after the virtual object matching the attribute information is determined, generating the presentation data including the moving state of the virtual object based on the moving position of the matched virtual object and the position data of the AR device includes:
acquiring the moving position of the virtual object matching the attribute information; and
determining the presentation data including the moving state of the virtual object based on the moving position of the virtual object matching the attribute information and the position data of the AR device.
In this implementation, the moving positions of different virtual objects may be the same or different, and may be set according to actual needs. After the matching virtual object is determined based on the attribute information of the user associated with the AR device, the moving position of that virtual object is first acquired; a moving path of the matched virtual object is then generated based on the acquired moving position and the position data of the AR device; finally, the presentation data including the moving state of the matched virtual object is determined using that moving path together with the special-effect data of the matched virtual object under the three-dimensional scene model matching the real scene.
Regarding S103:
In the embodiments of the present application, the augmented reality data includes the presentation data, such as an animation composed of multiple image frames or a single image frame, and may further include sound data, scent-synthesis data, and the like. Exemplarily, sound data may be attached at preset frame positions in the presentation data to fuse the sound data with the presentation data; for example, when the virtual object shown in the presentation data moves to the position of the AR device, the AR device plays preset sound data, such as "Hello, welcome to the fairy-tale world." Presenting both sound data and presentation data on the AR device improves the effect with which the AR device presents the augmented reality data.
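The fusion of sound data at a preset frame position can be sketched as below. The per-frame dictionary structure and the single-trigger-index rule are illustrative assumptions; an implementation might instead key sounds to positions along the moving path.

```python
# Hedged sketch of attaching preset sound data at a chosen frame position
# in the presentation data, so the device plays it when that frame shows.
def attach_sound(frames, trigger_index, sound):
    annotated = []
    for i, frame in enumerate(frames):
        annotated.append({"image": frame,
                          "sound": sound if i == trigger_index else None})
    return annotated

frames = ["f0", "f1", "f2"]
out = attach_sound(frames, 2, "Hello, welcome to the fairy-tale world.")
print(out[2]["sound"])  # Hello, welcome to the fairy-tale world.
print(out[0]["sound"])  # None
```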
Exemplarily, FIG. 3A is a schematic diagram of an image in presentation data of one AR device, and FIG. 3B is a schematic diagram of an image in presentation data of another AR device. As can be seen from FIG. 3A and FIG. 3B, the virtual object 33 moves from position point 31 to position point 32, and its special-effect data at position point 31 differs from that at position point 32; meanwhile, sound data can be set so that when the virtual object moves to position point 32, the AR device plays the sound data "Santa Claus is handing out presents!"
In the augmented reality data presentation method provided by the embodiments of the present application, presentation data including the moving state of a virtual object is generated to match a single AR device, where the virtual object in the presentation data is in a moving state along a moving path, i.e., the virtual object in the presentation data is dynamic, so that the AR device can present augmented reality data containing this presentation data, improving the effect with which the AR device presents augmented reality data.
Referring to FIG. 4, which is a schematic flowchart of another augmented reality data presentation method provided by the embodiments of the present application, the method is applied to a server and to presenting augmented reality data on multiple AR devices. The method shown in FIG. 4 includes steps S401 to S403, as follows:
S401: Acquire position data of n first augmented reality (AR) devices, where n is a positive integer.
In the embodiments of the present application, the position data of each of the n first AR devices is acquired, for example through GPS or another satellite positioning system. If a of the n AR devices are associated AR devices, where a is a positive integer less than or equal to n, the position data of any one of the associated AR devices may be used as the position data of every associated AR device. The association may be established manually by a user associating the a AR devices, or automatically by the server associating AR devices that satisfy an association condition; for example, the association condition may be that the AR devices are connected to the same signal.
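The position sharing among associated devices in S401 can be sketched as follows. The device records, the id-based grouping, and the "first device in the group wins" rule are illustrative assumptions.

```python
# Hedged sketch of using any one associated device's position as the
# position of every device in the association group (step S401).
def propagate_position(devices, associated_ids):
    shared = next(d["pos"] for d in devices if d["id"] in associated_ids)
    for d in devices:
        if d["id"] in associated_ids:
            d["pos"] = shared
    return devices

devices = [{"id": 1, "pos": (0, 0)}, {"id": 2, "pos": (1, 1)}, {"id": 3, "pos": (9, 9)}]
propagate_position(devices, {1, 2})
print(devices[1]["pos"])  # (0, 0): device 2 now shares device 1's position
print(devices[2]["pos"])  # (9, 9): device 3 is unassociated and unchanged
```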
S402: When it is detected that m of the n first AR devices satisfy the preset presentation condition for triggering presentation of the virtual object, determine, based on the moving position of the virtual object and the position data of the m first AR devices, presentation data including the moving state of the virtual object matching each of the m first AR devices, where m is a positive integer less than or equal to n.
In the embodiments of the present application, a moving path of the virtual object is generated according to the moving position of the virtual object and the position data of the m first AR devices, and the presentation data including the moving state of the virtual object corresponding to each of the m first AR devices is determined based on that moving path and the special-effect data of the virtual object under the three-dimensional scene model matching the real scene.
In some embodiments of the present application, the presentation data includes the moving state presented while the virtual object moves along the moving path; the waypoints of the moving path include the positions of the m first AR devices, or include position points each within a set distance of the position of one of the m first AR devices.
In the embodiments of the present application, the value of m may be updated as the number of AR devices actually satisfying the preset presentation condition changes. Specifically, when the position data of any first AR device no longer satisfies the preset presentation condition, the waypoints of the moving path no longer include that AR device, and the value of m changes at that moment. As an example, if m is initially 3 and the position data of one of the 3 first AR devices ceases to satisfy the preset presentation condition, i.e., one of them moves outside the target area, then m changes to 2, and presentation data including the moving state of the virtual object matching each of the 2 remaining first AR devices is determined based on their position data and the moving position of the virtual object. This presentation data includes the moving state presented while the virtual object moves along the moving path, whose waypoints include the positions of the 2 first AR devices.
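The update of m and of the waypoints as devices leave the target area can be sketched as follows. The device records, the rectangular area test, and the simple "remaining devices then end position" routing are illustrative assumptions.

```python
# Hedged sketch of updating m: keep only devices still inside the target
# area, then route the virtual object through them to its preset end.
def rebuild_waypoints(devices, area, end_position):
    xmin, ymin, xmax, ymax = area
    qualifying = [d for d in devices
                  if xmin <= d["pos"][0] <= xmax and ymin <= d["pos"][1] <= ymax]
    m = len(qualifying)
    path = [d["pos"] for d in qualifying] + [end_position]
    return m, path

devices = [{"id": 1, "pos": (1, 1)}, {"id": 2, "pos": (2, 3)}, {"id": 3, "pos": (9, 9)}]
m, path = rebuild_waypoints(devices, (0, 0, 5, 5), (5, 0))
print(m)     # 2: device 3 has left the target area
print(path)  # [(1, 1), (2, 3), (5, 0)]
```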
In some embodiments of the present application, the augmented reality data presentation method further includes:
acquiring position data of a second AR device; and
when it is detected that the second AR device satisfies the preset presentation condition, updating the presentation data including the moving state of the virtual object according to the position of the virtual object in its current moving state, the position data of the second AR device, and the position data of those first AR devices among the m first AR devices not yet passed before the moving path is updated.
In this implementation, after the presentation data including the moving state of the virtual object matching each of the m first AR devices is determined based on the moving position of the virtual object and the position data of the m first AR devices, the server, upon detecting a second AR device, acquires the position data of that second AR device; if the second AR device satisfies the preset presentation condition, the presentation data including the moving state of the virtual object is updated. Here, the second AR device is an AR device other than the first AR devices.
As an example, FIG. 5A is a schematic diagram of a moving path; as can be seen from FIG. 5A, m is 2, and the figure includes a preset initial position 51 of the virtual object, a preset end position 52 of the virtual object, and positions 53 of the 2 first AR devices. If the position data of a second AR device is detected at some moment, it is acquired; when the second AR device satisfies the preset presentation condition, the moving path shown in FIG. 5A is updated based on the position of the virtual object in its current moving state, the position data of the second AR device, and the position data of the first AR devices among the 2 first AR devices not yet passed before the update. The updated moving path may take many forms; one of them is shown in FIG. 5B, which includes the position 54 of the virtual object in its current moving state, the position 55 of the second AR device, the preset end position 52 of the virtual object, and the position 53 of the first AR device not yet passed before the update. The presentation data including the moving state of the virtual object is then updated based on the updated moving path.
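The path update of FIG. 5A to FIG. 5B can be sketched as follows. The specific ordering (current position, then the second device, then the remaining waypoints) is one of the many possible updated paths the description allows; the coordinates are illustrative.

```python
# Hedged sketch of splicing a newly detected second AR device into the
# remaining moving path, starting from the object's current position.
def update_path(current_pos, remaining_path, second_device_pos):
    """remaining_path holds the not-yet-passed first-device positions
    followed by the preset end position."""
    return [current_pos, second_device_pos] + remaining_path

path = update_path((2, 2),            # position 54: current moving state
                   [(4, 1), (6, 0)],  # unpassed first device 53, end position 52
                   (3, 3))            # position 55: second AR device
print(path)  # [(2, 2), (3, 3), (4, 1), (6, 0)]
```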
S403: Present, through the m first AR devices, augmented reality data containing the presentation data matching each of the m first AR devices.
Exemplarily, continuing the example from S402, if m is 2, the server generates presentation data including the moving state of the virtual object matching each of the 2 first AR devices and sends the 2 sets of presentation data to the corresponding first AR devices, so that each first AR device presents the corresponding augmented reality data containing the presentation data.
In the augmented reality data presentation method provided by the present application, presentation data including the moving state of a virtual object is generated to match each of multiple AR devices, so that all AR devices within the target area can receive presentation data including the moving state of the virtual object and present augmented reality data containing it. This achieves flexible presentation of the virtual object on multiple AR devices in line with the requirements of the real scene, and improves the effect of presenting augmented reality data.
Those skilled in the art can understand that, in the above methods of the specific implementations, the order in which the steps are written does not imply a strict execution order or constitute any limitation on the implementation process; the specific execution order of the steps should be determined by their functions and possible internal logic.
Based on the same concept, the embodiments of the present application further provide an augmented reality data presentation apparatus. Referring to FIG. 6, which is a schematic architectural diagram of the augmented reality data presentation apparatus provided by the embodiments of the present application, the apparatus includes a position data acquisition module 61, a presentation data determination module 62, and a first presentation module 63. Specifically:
the position data acquisition module 61 is configured to acquire position data of an augmented reality (AR) device;
the presentation data determination module 62 is configured to, when it is detected that the AR device satisfies a preset presentation condition for triggering presentation of a virtual object, determine, based on a moving position of the virtual object and the position data of the AR device, presentation data including a moving state of the virtual object; and
the first presentation module 63 is configured to present, through the AR device, augmented reality data containing the presentation data.
In one possible implementation, in the presentation data determination module 62, the moving position of the virtual object includes at least one of:
a preset initial position of the virtual object, a preset end position of the virtual object, or a position of the virtual object in its current moving state.
In one possible implementation, in the presentation data determination module 62, the preset presentation condition includes the position data of the AR device falling within a target area.
In one possible implementation, in the presentation data determination module 62, the preset presentation condition includes the position data of the AR device falling within a target area and attribute information of a user associated with the AR device meeting a preset attribute condition.
In one possible implementation, the presentation data determination module 62 determines the presentation data including the moving state of the virtual object through the following steps:
acquiring a moving path of the virtual object based on the moving position of the virtual object and the position data of the AR device; and
generating the presentation data including the moving state of the virtual object using the moving path and special-effect data of the virtual object under a three-dimensional scene model matching the real scene.
In one possible implementation, the apparatus further includes:
a virtual object matching module 64, configured to, when it is detected that the AR device satisfies the preset presentation condition for triggering presentation of the virtual object, determine, according to the attribute information of the user associated with the AR device, a virtual object matching the attribute information;
where the presentation data determination module 62 determines the presentation data including the moving state of the virtual object through the following steps:
acquiring the moving position of the virtual object matching the attribute information; and
determining the presentation data including the moving state of the virtual object based on the moving position of the virtual object matching the attribute information and the position data of the AR device.
In one possible implementation, the attribute information includes at least one of: the user's age, the user's gender, the user's occupation, or virtual-object preferences set in advance by the user.
Based on the same concept, the embodiments of the present application further provide another augmented reality data presentation apparatus. Referring to FIG. 7, which is a schematic architectural diagram of the other augmented reality data presentation apparatus provided by the embodiments of the present application, the apparatus includes a first acquisition module 71, a first determination module 72, and a second presentation module 73. Specifically:
the first acquisition module 71 is configured to acquire position data of n first augmented reality (AR) devices, where n is a positive integer;
the first determination module 72 is configured to, when it is detected that m of the n first AR devices satisfy a preset presentation condition for triggering presentation of a virtual object, determine, based on a moving position of the virtual object and the position data of the m first AR devices, presentation data including a moving state of the virtual object matching each of the m first AR devices, where m is a positive integer less than or equal to n; and
the second presentation module 73 is configured to present, through the m first AR devices, augmented reality data containing the presentation data matching each of the m first AR devices.
In one possible implementation, the presentation data in the first determination module 72 includes the moving state presented while the virtual object moves along a moving path; the waypoints of the moving path include the positions of the m first AR devices, or include position points each within a set distance of the position of one of the m first AR devices.
In one possible implementation, the apparatus further includes:
a second acquisition module 74, configured to acquire position data of a second AR device; and
a second determination module 75, configured to, when it is detected that the second AR device satisfies the preset presentation condition, update the presentation data including the moving state of the virtual object according to the position of the virtual object in its current moving state, the position data of the second AR device, and the position data of those first AR devices among the m first AR devices not yet passed before the moving path is updated.
In some embodiments, the functions of, or the modules contained in, the apparatus provided by the embodiments of the present application may be used to perform the methods described in the method embodiments above; for their specific implementation, reference may be made to the description of the method embodiments above, which is not repeated here for brevity.
Based on the same technical concept, the embodiments of the present application further provide a first electronic device. Referring to FIG. 8, which is a schematic structural diagram of the first electronic device provided by the embodiments of the present application, the first electronic device 800 includes a first processor 801, a first memory 802, and a first bus 803. The first memory 802 is used to store execution instructions and includes a first internal memory 8021 and a first external memory 8022; the first internal memory 8021, also called internal storage, temporarily stores operation data of the first processor 801 and data exchanged with the first external memory 8022, such as a hard disk, and the first processor 801 exchanges data with the first external memory 8022 through the first internal memory 8021.
The first processor 801 may be at least one of: an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a controller, a microcontroller, or a microprocessor.
The first internal memory 8021 or the first external memory 8022 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random-access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disc.
When the first electronic device 800 runs, the first processor 801 communicates with the first memory 802 through the first bus 803, causing the first processor 801 to execute the following instructions:
acquiring position data of an augmented reality (AR) device;
when it is detected that the AR device satisfies a preset presentation condition for triggering presentation of a virtual object, determining, based on a moving position of the virtual object and the position data of the AR device, presentation data including a moving state of the virtual object; and
presenting, through the AR device, augmented reality data containing the presentation data.
For the specific processing performed by the first processor 801, reference may be made to the relevant descriptions in the method embodiments above or the corresponding apparatus embodiments, which are not elaborated here.
Based on the same technical concept, the embodiments of the present application further provide a second electronic device. Referring to FIG. 9, which is a schematic structural diagram of the electronic device provided by the embodiments of the present application, the second electronic device 900 includes a second processor 901, a second memory 902, and a second bus 903. The second memory 902 is used to store execution instructions and includes a second internal memory 9021 and a second external memory 9022; the second internal memory 9021, also called internal storage, temporarily stores operation data of the second processor 901 and data exchanged with the second external memory 9022, such as a hard disk, and the second processor 901 exchanges data with the second external memory 9022 through the second internal memory 9021.
The second processor 901 may be at least one of an ASIC, DSP, DSPD, PLD, FPGA, controller, microcontroller, or microprocessor.
The second internal memory 9021 or the second external memory 9022 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as SRAM, EEPROM, EPROM, PROM, ROM, magnetic memory, flash memory, magnetic disk, or optical disc.
When the second electronic device 900 runs, the second processor 901 communicates with the second memory 902 through the second bus 903, causing the second processor 901 to execute the following instructions:
acquiring position data of n first augmented reality (AR) devices, where n is a positive integer;
when it is detected that m of the n first AR devices satisfy a preset presentation condition for triggering presentation of a virtual object, determining, based on a moving position of the virtual object and the position data of the m first AR devices, presentation data including a moving state of the virtual object matching each of the m first AR devices, where m is a positive integer less than or equal to n; and
presenting, through the m first AR devices, augmented reality data containing the presentation data matching each of the m first AR devices.
For the specific processing performed by the second processor 901, reference may be made to the relevant descriptions in the method embodiments above or the corresponding apparatus embodiments, which are not elaborated here.
In addition, the embodiments of the present application further provide a computer-readable storage medium storing a computer program that, when run by a processor, performs the augmented reality data presentation method described in the method embodiments above.
The embodiments of the present application further provide a computer program, including a computer-readable storage medium storing program code, where instructions contained in the program code may be used to perform the augmented reality data presentation method described in the method embodiments above; for details, reference may be made to the method embodiments above, which are not repeated here.
Those skilled in the art can clearly understand that, for convenience and brevity of description, for the specific working processes of the systems and apparatuses described above, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here. In the several embodiments provided in the present application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division of the units is merely a logical functional division, and there may be other divisions in actual implementation; as another example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some communication interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a processor-executable non-volatile computer-readable storage medium. Based on this understanding, the technical solution of the present application, or the part contributing to the prior art, or part of the technical solution, may essentially be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above are merely specific implementations of the present application, but the protection scope of the present application is not limited thereto. Any variation or replacement readily conceivable by those skilled in the art within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Industrial Applicability
The embodiments of the present application provide an augmented reality data presentation method, apparatus, electronic device, storage medium, and program. The method includes: acquiring position data of an augmented reality (AR) device; when it is detected that the AR device satisfies a preset presentation condition for triggering presentation of a virtual object, determining, based on a moving position of the virtual object and the position data of the AR device, presentation data including a moving state of the virtual object; and presenting, through the AR device, augmented reality data containing the presentation data.

Claims (23)

  1. An augmented reality data presentation method, the method comprising:
    acquiring position data of an augmented reality (AR) device;
    when it is detected that the AR device satisfies a preset presentation condition for triggering presentation of a virtual object, determining, based on a moving position of the virtual object and the position data of the AR device, presentation data comprising a moving state of the virtual object; and
    presenting, through the AR device, augmented reality data containing the presentation data.
  2. The method according to claim 1, wherein the moving position of the virtual object comprises at least one of:
    a preset initial position of the virtual object, a preset end position of the virtual object, or a position of the virtual object in its current moving state.
  3. The method according to claim 1 or 2, wherein the preset presentation condition comprises the position data of the AR device falling within a target area.
  4. The method according to claim 1 or 2, wherein the preset presentation condition comprises the position data of the AR device falling within a target area and attribute information of a user associated with the AR device meeting a preset attribute condition.
  5. The method according to any one of claims 1 to 4, wherein determining, based on the moving position of the virtual object and the position data of the AR device, the presentation data comprising the moving state of the virtual object comprises:
    acquiring a moving path of the virtual object based on the moving position of the virtual object and the position data of the AR device; and
    generating the presentation data comprising the moving state of the virtual object using the moving path and special-effect data of the virtual object under a three-dimensional scene model matching the real scene.
  6. The method according to any one of claims 1 to 5, wherein the method further comprises:
    when it is detected that the AR device satisfies the preset presentation condition for triggering presentation of the virtual object, determining, according to the attribute information of the user associated with the AR device, a virtual object matching the attribute information;
    wherein determining, based on the moving position of the virtual object and the position data of the AR device, the presentation data comprising the moving state of the virtual object comprises:
    acquiring a moving position of the virtual object matching the attribute information; and
    determining the presentation data comprising the moving state of the virtual object based on the moving position of the virtual object matching the attribute information and the position data of the AR device.
  7. The method according to claim 4 or 6, wherein the attribute information comprises at least one of: the user's age, the user's gender, the user's occupation, or virtual-object preferences set in advance by the user.
  8. An augmented reality data presentation method, wherein the method comprises:
    acquiring position data of n first augmented reality (AR) devices, n being a positive integer;
    when it is detected that m of the n first AR devices satisfy a preset presentation condition for triggering presentation of a virtual object, determining, based on a moving position of the virtual object and the position data of the m first AR devices, presentation data comprising a moving state of the virtual object matching each of the m first AR devices, m being a positive integer less than or equal to n; and
    presenting, through the m first AR devices, augmented reality data containing the presentation data matching each of the m first AR devices.
  9. The method according to claim 8, wherein the presentation data comprises the moving state presented while the virtual object moves along a moving path; and waypoints of the moving path comprise positions of the m first AR devices, or comprise position points each within a set distance of the position of one of the m first AR devices.
  10. The method according to claim 9, wherein the method further comprises:
    acquiring position data of a second AR device; and
    when it is detected that the second AR device satisfies the preset presentation condition, updating the presentation data comprising the moving state of the virtual object according to the position of the virtual object in its current moving state, the position data of the second AR device, and the position data of those first AR devices among the m first AR devices not yet passed before the moving path is updated.
  11. An augmented reality data presentation apparatus, the apparatus comprising:
    a position data acquisition module, configured to acquire position data of an augmented reality (AR) device;
    a presentation data determination module, configured to, when it is detected that the AR device satisfies a preset presentation condition for triggering presentation of a virtual object, determine, based on a moving position of the virtual object and the position data of the AR device, presentation data comprising a moving state of the virtual object; and
    a first presentation module, configured to present, through the AR device, augmented reality data containing the presentation data.
  12. The apparatus according to claim 11, wherein, in the presentation data determination module, the moving position of the virtual object comprises at least one of:
    a preset initial position of the virtual object, a preset end position of the virtual object, or a position of the virtual object in its current moving state.
  13. The apparatus according to claim 11 or 12, wherein, in the presentation data determination module, the preset presentation condition comprises the position data of the AR device falling within a target area.
  14. The apparatus according to claim 11 or 12, wherein, in the presentation data determination module, the preset presentation condition comprises the position data of the AR device falling within a target area and attribute information of a user associated with the AR device meeting a preset attribute condition.
  15. The apparatus according to any one of claims 11 to 14, wherein the presentation data determination module determines the presentation data comprising the moving state of the virtual object through the following steps:
    acquiring a moving path of the virtual object based on the moving position of the virtual object and the position data of the AR device; and
    generating the presentation data comprising the moving state of the virtual object using the moving path and special-effect data of the virtual object under a three-dimensional scene model matching the real scene.
  16. The apparatus according to any one of claims 11 to 15, wherein the apparatus further comprises:
    a virtual object matching module, configured to, when it is detected that the AR device satisfies the preset presentation condition for triggering presentation of the virtual object, determine, according to the attribute information of the user associated with the AR device, a virtual object matching the attribute information;
    wherein the presentation data determination module determines the presentation data comprising the moving state of the virtual object through the following steps:
    acquiring a moving position of the virtual object matching the attribute information; and
    determining the presentation data comprising the moving state of the virtual object based on the moving position of the virtual object matching the attribute information and the position data of the AR device.
  17. The apparatus according to claim 14 or 16, wherein the attribute information comprises at least one of: the user's age, the user's gender, the user's occupation, or virtual-object preferences set in advance by the user.
  18. An augmented reality data presentation apparatus, wherein the apparatus comprises:
    a first acquisition module, configured to acquire position data of n first augmented reality (AR) devices, n being a positive integer;
    a first determination module, configured to, when it is detected that m of the n first AR devices satisfy a preset presentation condition for triggering presentation of a virtual object, determine, based on a moving position of the virtual object and the position data of the m first AR devices, presentation data comprising a moving state of the virtual object matching each of the m first AR devices, m being a positive integer less than or equal to n; and
    a second presentation module, configured to present, through the m first AR devices, augmented reality data containing the presentation data matching each of the m first AR devices.
  19. The apparatus according to claim 18, wherein the presentation data in the first determination module comprises the moving state presented while the virtual object moves along a moving path; and waypoints of the moving path comprise positions of the m first AR devices, or comprise position points each within a set distance of the position of one of the m first AR devices.
  20. The apparatus according to claim 19, wherein the apparatus further comprises:
    a second acquisition module, configured to acquire position data of a second AR device; and
    a second determination module, configured to, when it is detected that the second AR device satisfies the preset presentation condition, update the presentation data comprising the moving state of the virtual object according to the position of the virtual object in its current moving state, the position data of the second AR device, and the position data of those first AR devices among the m first AR devices not yet passed before the moving path is updated.
  21. An electronic device, comprising a processor, a memory, and a bus, the memory storing machine-readable instructions executable by the processor; when the electronic device runs, the processor communicates with the memory through the bus, and the machine-readable instructions, when executed by the processor, perform the augmented reality data presentation method according to any one of claims 1 to 7, or perform the augmented reality data presentation method according to any one of claims 8 to 10.
  22. A computer-readable storage medium storing a computer program that, when run by a processor, performs the steps of the augmented reality data presentation method according to any one of claims 1 to 7, or performs the augmented reality data presentation method according to any one of claims 8 to 10.
  23. A computer program comprising computer-readable code, wherein, when the computer-readable code runs in an electronic device, a processor in the electronic device performs the augmented reality data presentation method according to any one of claims 1 to 7 or the augmented reality data presentation method according to any one of claims 8 to 10.
PCT/CN2020/111890 2019-10-15 2020-08-27 Method and apparatus for presenting augmented reality data, device, storage medium and program WO2021073269A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2020572499A JP2022505002A (ja) 2019-10-15 2020-08-27 拡張現実データの表示方法、装置、機器、記憶媒体及びプログラム
SG11202013054YA SG11202013054YA (en) 2019-10-15 2020-08-27 Method and apparatus for presenting augmented reality data, device, storage medium and program
KR1020207037362A KR102417786B1 (ko) 2019-10-15 2020-08-27 증강 현실 데이터 제시 방법, 장치, 기기, 저장 매체 및 프로그램
US17/131,988 US20210110617A1 (en) 2019-10-15 2020-12-23 Method and apparatus for presenting augmented reality data, device, storage medium and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910979898.4A CN110764614B (zh) 2019-10-15 2019-10-15 增强现实数据呈现方法、装置、设备及存储介质
CN201910979898.4 2019-10-15

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/131,988 Continuation US20210110617A1 (en) 2019-10-15 2020-12-23 Method and apparatus for presenting augmented reality data, device, storage medium and program

Publications (1)

Publication Number Publication Date
WO2021073269A1 true WO2021073269A1 (zh) 2021-04-22

Family

ID=69331369

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/111890 WO2021073269A1 (zh) 2019-10-15 2020-08-27 增强现实数据呈现方法、装置、设备、存储介质和程序

Country Status (5)

Country Link
KR (1) KR102417786B1 (zh)
CN (1) CN110764614B (zh)
SG (1) SG11202013054YA (zh)
TW (1) TWI749795B (zh)
WO (1) WO2021073269A1 (zh)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110764614B (zh) * 2019-10-15 2021-10-08 北京市商汤科技开发有限公司 增强现实数据呈现方法、装置、设备及存储介质
KR20210148074A (ko) * 2020-05-26 2021-12-07 베이징 센스타임 테크놀로지 디벨롭먼트 컴퍼니 리미티드 Ar 시나리오 콘텐츠의 생성 방법, 전시 방법, 장치 및 저장 매체
CN111625103A (zh) * 2020-06-03 2020-09-04 浙江商汤科技开发有限公司 雕塑展示方法、装置、电子设备及存储介质
CN111640183A (zh) * 2020-06-04 2020-09-08 上海商汤智能科技有限公司 一种ar数据展示控制方法及装置
CN111639613B (zh) * 2020-06-04 2024-04-16 上海商汤智能科技有限公司 一种增强现实ar特效生成方法、装置及电子设备
CN111693063A (zh) * 2020-06-12 2020-09-22 浙江商汤科技开发有限公司 导航互动展示方法、装置、电子设备及存储介质
CN111815783A (zh) * 2020-06-30 2020-10-23 北京市商汤科技开发有限公司 虚拟场景的呈现方法及装置、电子设备及存储介质
CN112068703B (zh) 2020-09-07 2021-11-16 北京字节跳动网络技术有限公司 目标物体的控制方法、装置、电子设备及存储介质
CN112861725A (zh) * 2021-02-09 2021-05-28 深圳市慧鲤科技有限公司 一种导航提示方法、装置、电子设备及存储介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160077785A1 (en) * 2012-01-27 2016-03-17 Microsoft Technology Licensing, Llc Executable virtual objects associated with real objects
CN107767438A (zh) * 2016-08-16 2018-03-06 上海掌门科技有限公司 一种基于虚拟对象进行用户交互的方法与设备
CN108027657A (zh) * 2015-12-11 2018-05-11 谷歌有限责任公司 增强和/或虚拟现实环境中的场境敏感用户界面激活
CN108958475A (zh) * 2018-06-06 2018-12-07 阿里巴巴集团控股有限公司 虚拟对象控制方法、装置及设备
CN109656441A (zh) * 2018-12-21 2019-04-19 广州励丰文化科技股份有限公司 一种基于虚拟现实的导览方法及系统
CN109725782A (zh) * 2017-10-27 2019-05-07 腾讯科技(深圳)有限公司 一种实现虚拟现实的方法、装置及智能设备、存储介质
CN110764614A (zh) * 2019-10-15 2020-02-07 北京市商汤科技开发有限公司 增强现实数据呈现方法、装置、设备及存储介质

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104571532B (zh) * 2015-02-04 2018-01-30 网易有道信息技术(北京)有限公司 一种实现增强现实或虚拟现实的方法及装置
RU2601169C1 (ru) * 2015-06-11 2016-10-27 Виталий Витальевич Аверьянов Способ и устройство для взаимодействия с виртуальными объектами
KR102317247B1 (ko) * 2015-06-15 2021-10-26 한국전자통신연구원 영상정보를 이용한 증강현실 기반 손 인터랙션 장치 및 방법
TWI642002B (zh) * 2017-04-14 2018-11-21 李雨暹 適地性空間物件可視範圍管理方法與系統
US10105601B1 (en) * 2017-10-27 2018-10-23 Nicholas T. Hariton Systems and methods for rendering a virtual content object in an augmented reality environment
CN108805989B (zh) * 2018-06-28 2022-11-11 百度在线网络技术(北京)有限公司 场景穿越的方法、装置、存储介质和终端设备

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160077785A1 (en) * 2012-01-27 2016-03-17 Microsoft Technology Licensing, Llc Executable virtual objects associated with real objects
CN108027657A (zh) * 2015-12-11 2018-05-11 谷歌有限责任公司 增强和/或虚拟现实环境中的场境敏感用户界面激活
CN107767438A (zh) * 2016-08-16 2018-03-06 上海掌门科技有限公司 一种基于虚拟对象进行用户交互的方法与设备
CN109725782A (zh) * 2017-10-27 2019-05-07 腾讯科技(深圳)有限公司 一种实现虚拟现实的方法、装置及智能设备、存储介质
CN108958475A (zh) * 2018-06-06 2018-12-07 阿里巴巴集团控股有限公司 虚拟对象控制方法、装置及设备
CN109656441A (zh) * 2018-12-21 2019-04-19 广州励丰文化科技股份有限公司 一种基于虚拟现实的导览方法及系统
CN110764614A (zh) * 2019-10-15 2020-02-07 北京市商汤科技开发有限公司 增强现实数据呈现方法、装置、设备及存储介质

Also Published As

Publication number Publication date
SG11202013054YA (en) 2021-05-28
TW202117676A (zh) 2021-05-01
TWI749795B (zh) 2021-12-11
KR102417786B1 (ko) 2022-07-06
KR20210046590A (ko) 2021-04-28
CN110764614A (zh) 2020-02-07
CN110764614B (zh) 2021-10-08


Legal Events

- ENP — Entry into the national phase (Ref document number: 2020572499; Country of ref document: JP; Kind code of ref document: A)
- 121 — Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 20877034; Country of ref document: EP; Kind code of ref document: A1)
- NENP — Non-entry into the national phase (Ref country code: DE)
- 122 — Ep: PCT application non-entry in European phase (Ref document number: 20877034; Country of ref document: EP; Kind code of ref document: A1)