CN111580679A - Space capsule display method and device, electronic equipment and storage medium - Google Patents

Space capsule display method and device, electronic equipment and storage medium Download PDF

Info

Publication number
CN111580679A
Authority
CN
China
Prior art keywords
capsule
space capsule
scene image
virtual space
environment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010509190.5A
Other languages
Chinese (zh)
Inventor
孙红亮
揭志伟
王子彬
潘思霁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Shangtang Technology Development Co Ltd
Zhejiang Sensetime Technology Development Co Ltd
Original Assignee
Zhejiang Shangtang Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Shangtang Technology Development Co Ltd filed Critical Zhejiang Shangtang Technology Development Co Ltd
Priority to CN202010509190.5A priority Critical patent/CN111580679A/en
Publication of CN111580679A publication Critical patent/CN111580679A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Abstract

The disclosure provides a space capsule display method and device, an electronic device and a storage medium, wherein the space capsule display method comprises the following steps: acquiring a scene image, collected by an AR device, that includes an indoor environment; determining virtual space capsule environment information corresponding to the indoor environment based on the acquired scene image; and generating AR data of the space capsule based on the determined virtual space capsule environment information, and controlling the AR device to display an AR scene image of the space capsule according to the AR data. The virtual space environment constructed in this way enables the user of the AR device to have an immersive outer-space experience.

Description

Space capsule display method and device, electronic equipment and storage medium
Technical Field
The disclosure relates to the technical field of AR (augmented reality) equipment, in particular to a space capsule display method and device, electronic equipment and a storage medium.
Background
As living standards improve, more and more users seek out novel experiences, and outer space remains a place that people have long sought to explore but that is still shrouded in mystery.
At present, human exploration of outer space is still at a preliminary stage: visiting the real outer-space environment is extremely expensive, and many people dare not do so because of its potential dangers. The desire to experience the outer-space environment first-hand is therefore difficult to satisfy.
Disclosure of Invention
The embodiments of the present disclosure provide at least one space capsule display scheme in which AR scene images of a space capsule are displayed through an AR device. The virtual space environment constructed by this scheme enables a user to have an immersive outer-space experience, so that the outer-space environment can be experienced at a relatively low cost.
The scheme mainly comprises the following aspects:
in a first aspect, an embodiment of the present disclosure provides a method for displaying a space capsule, where the method includes:
acquiring a scene image including an indoor environment acquired by AR equipment;
determining virtual space capsule environment information corresponding to the indoor environment based on the acquired scene image;
generating AR data of the space capsule based on the determined environment information of the virtual space capsule, and controlling the AR device to display an AR scene image of the space capsule according to the AR data.
In one embodiment, the determining virtual space capsule environment information corresponding to the indoor environment based on the acquired scene image comprises:
extracting spatial structure information of the indoor environment from the acquired scene image;
calculating the matching degree between the space structure information and the plurality of kinds of preset virtual space capsule environment information, selecting target virtual space capsule environment information matched with the extracted space structure information from the plurality of kinds of preset virtual space capsule environment information based on the calculated matching degree, and taking the target virtual space capsule environment information as the virtual space capsule environment information corresponding to the indoor environment.
In one embodiment, the generating AR data for the capsule based on the determined virtual capsule environment information comprises:
identifying the placement position information of each real object in the indoor environment from the acquired scene image;
and generating the AR data of the space capsule based on the identified placing position information of each real object and the determined environment information of the virtual space capsule.
In one embodiment, the generating AR data of the capsule based on the identified placing position information of each real object and the determined virtual capsule environment information includes:
determining display position information of each virtual space capsule element in the AR scene image to be generated based on the identified placing position information of each real object;
and generating the AR data of the capsule based on the display position information of each virtual capsule element in the AR scene image to be generated, each virtual capsule element and the virtual capsule environment information.
In one embodiment, the determining, based on the identified placement position information of each real object, display position information of each virtual space capsule element in the AR scene image to be generated includes:
and aiming at each real object in the indoor environment, acquiring a virtual space capsule element matched with the real object, and determining the display position information of the virtual space capsule element matched with the real object in the AR scene image to be generated based on the identified placement position information of the real object.
In one embodiment, the obtaining, for each real object in the indoor environment, a virtual capsule element matching the real object includes:
identifying an object type of the real object based on a scene image of the indoor environment;
based on the identified object type, a virtual space capsule element matching the object type is determined.
In one embodiment, the determining a virtual space capsule element matching the object type based on the identified object type includes:
identifying an object size and a color of the real object based on a scene image of the indoor environment;
and adjusting the size and the color of the virtual space capsule element matched with the object type to be matched with the object size and the color of the real object based on the identified object size and color.
In a second aspect, an embodiment of the present disclosure further provides a space capsule display device, where the device includes:
the system comprises an acquisition module, a processing module and a display module, wherein the acquisition module is used for acquiring scene images including indoor environment collected by AR equipment;
the determining module is used for determining virtual space capsule environment information corresponding to the indoor environment based on the acquired scene image;
and the generating module is used for generating the AR data of the space capsule based on the determined environment information of the virtual space capsule and controlling the AR equipment to display the AR scene image of the space capsule according to the AR data.
In a third aspect, an embodiment of the present disclosure further provides an electronic device, including: a processor, a memory and a bus, the memory storing machine readable instructions executable by the processor, the processor and the memory communicating via the bus when the electronic device is running, the machine readable instructions when executed by the processor performing the steps of the space capsule displaying method according to the first aspect and any of its various embodiments.
In a fourth aspect, the disclosed embodiments also provide a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, performs the steps of the space capsule displaying method according to the first aspect and any of the various embodiments thereof.
By adopting the above space capsule display scheme, after the scene image including the indoor environment collected by the AR device is acquired, the virtual space capsule environment information corresponding to the indoor environment can be determined based on the scene image, and the AR data of the space capsule can then be generated according to that information. The AR device can thus display the AR scene image of the space capsule according to the AR data; the constructed virtual space environment enables the user to have an immersive outer-space experience, and the outer-space environment can be experienced at a relatively low cost.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required by the embodiments are briefly described below. The drawings, which are incorporated in and form a part of the specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain its technical solutions. It should be appreciated that the following drawings depict only certain embodiments of the disclosure and are therefore not to be considered limiting of its scope; those of ordinary skill in the art may derive additional related drawings from them without inventive effort.
Fig. 1 shows a flowchart of a space capsule display method provided in an embodiment of the present disclosure;
fig. 2(a) is a schematic application diagram of a space capsule display method provided by a first embodiment of the disclosure;
fig. 2(b) is a schematic application diagram of a space capsule display method provided by a first embodiment of the disclosure;
fig. 3 shows a schematic view of a space capsule display device provided in the second embodiment of the disclosure;
fig. 4 shows a schematic diagram of an electronic device provided in a third embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure clearer, the technical solutions are described completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only some of the embodiments of the present disclosure, not all of them. The components of the embodiments, as generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of configurations. Therefore, the following detailed description is not intended to limit the scope of the claimed disclosure but is merely representative of selected embodiments. All other embodiments obtained by those skilled in the art from the embodiments of the disclosure without creative effort shall fall within the protection scope of the disclosure.
Research shows that visiting the real outer-space environment is currently very expensive, and many people dare not visit outer space because of its potential dangers, so the desire to experience the outer-space environment first-hand is difficult to satisfy.
Based on this research, the present disclosure provides at least one space capsule display scheme in which AR scene images of a space capsule are displayed through an AR device; the virtual space environment constructed by the scheme enables a user to have an immersive outer-space experience, so that the outer-space environment can be experienced at low cost.
The drawbacks described above were identified by the inventors only after practice and careful study; therefore, the discovery of these problems, as well as the solutions proposed below, should be regarded as the inventors' contribution to the present disclosure.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
To facilitate understanding of the embodiments, a space capsule display method disclosed in the embodiments of the present disclosure is first described in detail. The execution subject of the method is generally an electronic device with a certain computing capability, for example a terminal device, a server, or another processing device. The terminal device may be a User Equipment (UE), a mobile device, a user terminal, a cellular phone, a cordless phone, a Personal Digital Assistant (PDA), a handheld device, a computing device, a vehicle-mounted device, a wearable device, or an Augmented Reality (AR) device such as AR glasses or an AR helmet. In some possible implementations, the space capsule display method may be implemented by a processor calling computer-readable instructions stored in a memory.
The space capsule display method provided by the embodiment of the disclosure is described below by taking the execution subject as a server.
Example one
Referring to fig. 1, a flowchart of a space capsule displaying method provided in the embodiment of the present disclosure is shown, the method includes steps S101 to S103, where:
s101, acquiring a scene image including an indoor environment, which is acquired by AR equipment;
s102, determining virtual space capsule environment information corresponding to an indoor environment based on the acquired scene image;
s103, generating the AR data of the space capsule based on the determined virtual space capsule environment information, and controlling the AR device to display the AR scene image of the space capsule according to the AR data.
Here, to facilitate understanding of the space capsule display method provided by the embodiments of the present disclosure, its application scenario is first described in detail. The method can be applied in any indoor environment, such as a home or a work office. When a user wearing an Augmented Reality (AR) device enters an indoor scene in which the method is deployed, an AR scene image of a space capsule can be presented to the user; that is, the user can watch the AR scene image of the space capsule through the AR device, and the constructed virtual space environment enables the user to have an immersive outer-space experience at a relatively low cost.
The AR scene image of the capsule may be generated based on AR data of the capsule, and the AR data may be derived from virtual capsule environment information. In the embodiment of the present disclosure, the virtual space capsule environment information may be determined based on a scene image including an indoor environment currently acquired by the AR device.
The virtual space capsule environment information can represent relevant information of the created virtual space environment, such as style information, attribute information, type information and the like.
In the embodiments of the present disclosure, the virtual space capsule environment information can be determined based on the spatial structure information of the indoor environment. The spatial structure information here may relate to the object contours of the indoor environment; for example, spatial structure information showing several desks placed together indicates, to a certain extent, a working office environment, and the virtual space capsule environment information corresponding to a working office environment can then be determined based on that spatial structure information.
In a specific application, the virtual space capsule environment information may be determined according to the following steps:
the method comprises the steps of firstly, extracting spatial structure information of an indoor environment from an acquired scene image;
calculating the matching degree between the space structure information and various preset virtual space capsule environment information;
and thirdly, selecting target virtual space capsule environment information matched with the extracted space structure information from a plurality of kinds of preset virtual space capsule environment information based on the calculated matching degree, and taking the target virtual space capsule environment information as virtual space capsule environment information corresponding to the indoor environment.
Here, in the space capsule display method provided by the embodiments of the present disclosure, spatial structure information of the indoor environment may first be extracted from the acquired scene image; then the matching degree between the extracted spatial structure information and each kind of preset virtual space capsule environment information may be calculated; finally, the preset virtual space capsule environment information with the highest matching degree may be selected as the virtual space capsule environment information corresponding to the current indoor environment.
To match the spatial structure information against the various kinds of virtual space capsule environment information, both may first be converted into vectors, and the matching degree may then be determined based on vector similarity.
For example, for a working office environment, the spatial structure information corresponding to the office has a high matching degree with a scientific-experiment space environment; in that case the scientific-experiment space environment can be used as the virtual space capsule environment information for the current working office environment, mainly because it suits a space environment that is relatively uniform as a whole.
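The vector-similarity matching described above can be sketched as follows; the preset environment names and feature values are invented for illustration, and a real system would derive them from the deployed presets.

```python
import math

# Hypothetical preset environment feature vectors (names and values assumed).
PRESETS = {
    "science-lab":  [0.9, 0.1, 0.3],
    "sleeping-bay": [0.1, 0.8, 0.2],
    "control-deck": [0.4, 0.3, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def match_environment(structure_vec):
    """Select the preset capsule environment with the highest matching degree."""
    return max(PRESETS, key=lambda name: cosine(structure_vec, PRESETS[name]))

# An office-like structure vector (assumed) matches the science-lab preset.
print(match_environment([0.85, 0.15, 0.25]))  # science-lab
```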
After the space capsule display method provided by the embodiment of the disclosure determines the virtual space capsule environment information, the space capsule AR data can be generated based on the virtual space capsule environment information. The generation process of the AR data specifically comprises the following steps:
step one, identifying the placement position information of each real object in the indoor environment from the acquired scene image;
and secondly, generating AR data of the space capsule based on the identified placing position information of each real object and the determined environment information of the virtual space capsule.
Here, in the embodiment of the present disclosure, first, the placing position information of each object in the indoor environment may be identified from the acquired scene image, and the AR data of the capsule may be generated based on the placing position information and the virtual capsule environment information.
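The placement-recognition step might look like the sketch below, where `detect_objects` stands in for a real object detector (the labels and bounding boxes are invented fixtures).

```python
def detect_objects(scene_image):
    # Stand-in for a trained detector; returns (label, (x, y, width, height))
    # bounding boxes in image pixels.
    return [("desk", (120, 300, 200, 90)), ("chair", (360, 340, 80, 110))]

def placement_info(scene_image):
    """Map each recognized real object to the centre of its bounding box."""
    placements = {}
    for label, (x, y, w, h) in detect_objects(scene_image):
        placements[label] = (x + w / 2, y + h / 2)
    return placements

print(placement_info(None))  # {'desk': (220.0, 345.0), 'chair': (400.0, 395.0)}
```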
In a specific application, based on the placement position information of each real object, the display position information of each virtual capsule element in the AR scene image to be generated is first determined, and then the AR data may be determined based on each virtual capsule element and the display position information thereof.
When determining the display position information of each virtual capsule element in the to-be-generated AR scene image, the embodiment of the present disclosure may first determine, for each real object in the indoor environment, a virtual capsule element matched with the real object, and then may convert, based on the coordinate system conversion relationship, the placement position information of the real object in the two-dimensional image coordinate system into the display position information of the corresponding virtual capsule element in the three-dimensional image coordinate system.
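One common way to realize such a coordinate-system conversion is pinhole-camera back-projection; the sketch below assumes camera intrinsics (fx, fy, cx, cy) and an object depth, none of which are specified in the disclosure.

```python
def image_to_display(u, v, depth, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Back-project a 2-D image point (u, v) at an assumed depth (metres)
    to a 3-D display position, using assumed pinhole-camera intrinsics."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# A desk centred at pixel (220, 345), assumed 2 m from the camera:
print(image_to_display(220, 345, depth=2.0))  # (-0.4, 0.42, 2.0)
```

The resulting 3-D position would serve as the display position of the matched virtual capsule element.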
When determining a virtual capsule element that matches a real object, the corresponding virtual capsule element may be selected based on the relevant information of the real object.
In the embodiment of the disclosure, the object type of the real object may be recognized based on the scene image of the indoor environment, and then the virtual space capsule element matched with the real object may be determined based on the object type.
In addition to the object type, attributes of the real object such as its size and color may be identified. In this way, the size and color of the virtual space capsule element matched with the object type can be adjusted based on the object's size and color; for example, the virtual capsule elements presented in the AR scene image may be scaled in proportion to the size of the corresponding real objects in the scene image.
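The type lookup and size/color adaptation can be sketched as follows; the element registry and the proportional-scaling rule are assumptions for illustration.

```python
# Hypothetical mapping from real-object types to virtual capsule elements.
ELEMENT_FOR_TYPE = {"desk": "console-panel", "chair": "pilot-seat"}

def adapt_element(obj_type, obj_size, obj_color, base_size=1.0):
    """Pick the capsule element for an object type and adapt it to the
    real object's size and color (proportional scaling, as described)."""
    element = ELEMENT_FOR_TYPE.get(obj_type)
    if element is None:
        return None  # no matching capsule element for this object type
    scale = obj_size / base_size
    return {"element": element, "scale": scale, "color": obj_color}

print(adapt_element("desk", obj_size=1.5, obj_color="grey"))
# {'element': 'console-panel', 'scale': 1.5, 'color': 'grey'}
```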
In order to facilitate understanding of the implementation process of the space capsule display method, as shown in fig. 2(a) -2 (b), the following description may be made in conjunction with a display effect diagram of an AR device.
As shown in fig. 2(a), after the scene image including the indoor environment collected by the AR device is acquired and it is determined based on the scene image that the AR device has entered the indoor environment, the virtual capsule environment information corresponding to the indoor environment may be determined; in addition, the virtual capsule elements matching the real objects in the indoor environment and the corresponding display position information may also be determined. The AR data of the capsule can then be generated, and the AR device displays the AR scene image of the capsule according to the AR data, as shown in fig. 2(b).
It will be understood by those skilled in the art that in the method of the present invention, the order of writing the steps does not imply a strict order of execution and any limitations on the implementation, and the specific order of execution of the steps should be determined by their function and possible inherent logic.
Based on the same inventive concept, the embodiment of the present disclosure further provides a capsule display apparatus corresponding to the capsule display method, and as the principle of the apparatus in the embodiment of the present disclosure for solving the problem is similar to the capsule display method described above in the embodiment of the present disclosure, the implementation of the apparatus may refer to the implementation of the method, and repeated details are not repeated.
Example two
Referring to fig. 3, a schematic diagram of the architecture of a space capsule display device provided in the embodiment of the present disclosure is shown. The device includes: an acquisition module 301, a determination module 302 and a generation module 303; wherein:
an obtaining module 301, configured to obtain a scene image including an indoor environment collected by an AR device;
a determining module 302, configured to determine, based on the acquired scene image, virtual space capsule environment information corresponding to an indoor environment;
the generating module 303 is configured to generate, based on the determined virtual capsule environment information, AR data of the capsule, and control the AR device to display an AR scene image of the capsule according to the AR data.
By adopting the above space capsule display device, after the scene image including the indoor environment collected by the AR device is acquired, the virtual space capsule environment information corresponding to the indoor environment can be determined based on the scene image, and the AR data of the space capsule can then be generated according to that information. The AR device can thus display the AR scene image of the space capsule according to the AR data; the constructed virtual space environment enables the user to have an immersive outer-space experience, and the outer-space environment can be experienced at a relatively low cost.
In one embodiment, the determining module 302 is configured to determine the virtual space capsule environment information corresponding to the indoor environment according to the following steps:
extracting spatial structure information of an indoor environment from the acquired scene image;
and calculating the matching degree between the space structure information and the multiple kinds of preset virtual space capsule environment information, selecting target virtual space capsule environment information matched with the extracted space structure information from the multiple kinds of preset virtual space capsule environment information based on the calculated matching degree, and taking the target virtual space capsule environment information as the virtual space capsule environment information corresponding to the indoor environment.
In one embodiment, the generating module 303 is configured to generate the AR data of the space capsule according to the following steps:
identifying the placement position information of each real object in the indoor environment from the acquired scene image;
and generating AR data of the space capsule based on the identified placing position information of each real object and the determined environment information of the virtual space capsule.
In one embodiment, the generating module 303 is configured to generate the AR data of the space capsule according to the following steps:
determining display position information of each virtual space capsule element in the AR scene image to be generated based on the identified placing position information of each real object;
and generating the AR data of the capsule based on the display position information of each virtual capsule element in the AR scene image to be generated, each virtual capsule element and the virtual capsule environment information.
In one embodiment, the generating module 303 is configured to determine the display position information of each virtual space capsule element in the image of the AR scene to be generated according to the following steps:
and aiming at each real object in the indoor environment, acquiring a virtual space capsule element matched with the real object, and determining the display position information of the virtual space capsule element matched with the real object in the AR scene image to be generated based on the identified placement position information of the real object.
In one embodiment, the generating module 303 is configured to obtain the virtual space capsule element matched with the real object according to the following steps:
identifying an object type of a real object based on a scene image of an indoor environment;
based on the identified object type, a virtual space capsule element matching the object type is determined.
In one embodiment, the generating module 303 is configured to determine the virtual space capsule element matching the object type according to the following steps:
identifying the object size and color of a real object based on a scene image of an indoor environment;
and adjusting the size and the color of the virtual space capsule element matched with the object type to be matched with the object size and the color of the real object based on the identified object size and color.
The description of the processing flow of each module in the device and the interaction flow between the modules may refer to the related description in the above method embodiments, and will not be described in detail here.
Example three
An embodiment of the present disclosure further provides an electronic device, as shown in fig. 4, which is a schematic structural diagram of the electronic device provided in the embodiment of the present disclosure, and the electronic device includes: a processor 401, a memory 402, and a bus 403. The memory 402 stores machine-readable instructions executable by the processor 401, the processor 401 and the memory 402 communicating via the bus 403 when the electronic device is operating, the machine-readable instructions when executed by the processor 401 performing the following:
acquiring a scene image including an indoor environment acquired by AR equipment;
determining virtual space capsule environment information corresponding to an indoor environment based on the acquired scene image;
and generating AR data of the space capsule based on the determined virtual space capsule environment information, and controlling the AR device to display the AR scene image of the space capsule according to the AR data.
In one embodiment, the instructions executed by the processor 401 for determining the virtual space capsule environment information corresponding to the indoor environment based on the acquired scene image include:
extracting spatial structure information of an indoor environment from the acquired scene image;
and calculating the matching degree between the space structure information and the multiple kinds of preset virtual space capsule environment information, selecting target virtual space capsule environment information matched with the extracted space structure information from the multiple kinds of preset virtual space capsule environment information based on the calculated matching degree, and taking the target virtual space capsule environment information as the virtual space capsule environment information corresponding to the indoor environment.
In one embodiment, the instructions executed by the processor 401 for generating the AR data of the space capsule based on the determined virtual space capsule environment information include:
identifying the placement position information of each real object in the indoor environment from the acquired scene image;
and generating the AR data of the space capsule based on the identified placement position information of each real object and the determined virtual space capsule environment information.
In one embodiment, the instructions executed by the processor 401 for generating the AR data of the space capsule based on the identified placement position information of each real object and the determined virtual space capsule environment information include:
determining display position information of each virtual space capsule element in the AR scene image to be generated based on the identified placement position information of each real object;
and generating the AR data of the space capsule based on the display position information of each virtual space capsule element in the AR scene image to be generated, each virtual space capsule element, and the virtual space capsule environment information.
In one embodiment, the instructions executed by the processor 401 for determining the display position information of each virtual space capsule element in the AR scene image to be generated based on the identified placement position information of each real object include:
for each real object in the indoor environment, acquiring a virtual space capsule element matching the real object, and determining display position information of the virtual space capsule element in the AR scene image to be generated based on the identified placement position information of the real object.
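The per-object step above can be sketched as pairing each detected real object with a capsule element and letting the element inherit the object's placement position. The object-type-to-element table and the detection record format below are invented for illustration; the disclosure does not specify them.

```python
# Hypothetical mapping from detected real-object types to virtual capsule elements.
ELEMENT_FOR_TYPE = {
    "sofa": "crew_seat",
    "table": "control_console",
    "lamp": "cabin_light",
}

def match_elements(detected_objects):
    """detected_objects: list of {"type": str, "position": (x, y, z)} records.

    Returns one placement per recognised object; unrecognised types are skipped.
    """
    placements = []
    for obj in detected_objects:
        element = ELEMENT_FOR_TYPE.get(obj["type"])
        if element is not None:
            # The virtual element is displayed where the real object stands.
            placements.append({"element": element, "position": obj["position"]})
    return placements

objects = [{"type": "sofa", "position": (1.0, 0.0, 2.0)},
           {"type": "table", "position": (0.5, 0.0, 1.0)}]
print(match_elements(objects))
```

This keeps the virtual scene spatially consistent with the room: a user walking around the real sofa walks around the virtual crew seat.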
In one embodiment, the instructions executed by the processor 401 for acquiring, for each real object in the indoor environment, a virtual space capsule element matching the real object include:
identifying an object type of the real object based on the scene image of the indoor environment;
and determining, based on the identified object type, a virtual space capsule element matching the object type.
In one embodiment, the instructions executed by the processor 401 for determining a virtual space capsule element matching the object type based on the identified object type include:
identifying an object size and a color of the real object based on the scene image of the indoor environment;
and adjusting, based on the identified object size and color, the size and color of the virtual space capsule element matching the object type so that they match those of the real object.
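The size and color adaptation step can be sketched as scaling the matched element toward the measured size of the real object and adopting its measured color. All numbers and the record layout are illustrative assumptions, not taken from the disclosure.

```python
def adjust_element(element, real_size, real_color, base_size=1.0):
    """Adapt a matched capsule element to a real object's measured size and color.

    real_size  -- measured characteristic size of the real object (metres, assumed)
    real_color -- measured dominant color, e.g. an (R, G, B) tuple
    base_size  -- the element's native size at scale 1.0
    """
    return {
        "element": element,
        "scale": real_size / base_size,   # uniform scale factor toward the real object
        "color": real_color,              # element tinted to the real object's color
    }

adjusted = adjust_element("crew_seat", real_size=1.8, real_color=(120, 90, 60))
print(adjusted["scale"])  # 1.8
```

Matching the virtual element's footprint and tint to the occluded real object is what lets the capsule overlay read as a re-skinned room rather than a floating model.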
The embodiments of the present disclosure further provide a computer-readable storage medium on which a computer program is stored. When the computer program is run by a processor, the steps of the space capsule display method described in the above method embodiments are performed. The storage medium may be a volatile or non-volatile computer-readable storage medium.
The computer program product of the space capsule display method provided in the embodiments of the present disclosure includes a computer-readable storage medium storing program code. The instructions included in the program code may be used to execute the steps of the space capsule display method described in the above method embodiments; for details, refer to the above method embodiments, which are not repeated here.
The embodiments of the present disclosure also provide a computer program which, when executed by a processor, implements any one of the methods of the foregoing embodiments. The computer program product may be embodied in hardware, software, or a combination thereof. In an alternative embodiment, the computer program product is embodied in a computer storage medium; in another alternative embodiment, the computer program product is embodied in a software product, such as a software development kit (SDK).
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and apparatus described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here. In the several embodiments provided in the present disclosure, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the division into units is only one kind of logical division, and other divisions are possible in actual implementation; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection between devices or units through communication interfaces, and may be electrical, mechanical, or in another form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing an electronic device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present disclosure. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that the above-mentioned embodiments are merely specific embodiments of the present disclosure, used to illustrate the technical solutions of the present disclosure rather than to limit them, and the protection scope of the present disclosure is not limited thereto. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person familiar with the art may, within the technical scope disclosed herein, modify or readily conceive of changes to the technical solutions described in the foregoing embodiments, or make equivalent substitutions for some of their technical features; such modifications, changes, or substitutions do not depart from the spirit and scope of the embodiments of the present disclosure and shall be covered by them. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (10)

1. A method of displaying a space capsule, the method comprising:
acquiring a scene image of an indoor environment captured by an AR device;
determining virtual space capsule environment information corresponding to the indoor environment based on the acquired scene image;
generating AR data of the space capsule based on the determined environment information of the virtual space capsule, and controlling the AR device to display an AR scene image of the space capsule according to the AR data.
2. The method of claim 1, wherein determining virtual space capsule environment information corresponding to the indoor environment based on the acquired scene image comprises:
extracting spatial structure information of the indoor environment from the acquired scene image;
calculating a matching degree between the spatial structure information and each of a plurality of kinds of preset virtual space capsule environment information; selecting, based on the calculated matching degrees, target virtual space capsule environment information matching the extracted spatial structure information from the plurality of kinds of preset virtual space capsule environment information; and taking the target virtual space capsule environment information as the virtual space capsule environment information corresponding to the indoor environment.
3. The method of claim 2, wherein generating the AR data of the space capsule based on the determined virtual space capsule environment information comprises:
identifying placement position information of each real object in the indoor environment from the acquired scene image;
and generating the AR data of the space capsule based on the identified placement position information of each real object and the determined virtual space capsule environment information.
4. The method of claim 3, wherein generating the AR data of the space capsule based on the identified placement position information of the respective real objects and the determined virtual space capsule environment information comprises:
determining display position information of each virtual space capsule element in the AR scene image to be generated based on the identified placement position information of each real object;
and generating the AR data of the space capsule based on the display position information of each virtual space capsule element in the AR scene image to be generated, each virtual space capsule element, and the virtual space capsule environment information.
5. The method of claim 4, wherein determining the display position information of each virtual space capsule element in the AR scene image to be generated based on the identified placement position information of each real object comprises:
for each real object in the indoor environment, acquiring a virtual space capsule element matching the real object, and determining display position information of the virtual space capsule element in the AR scene image to be generated based on the identified placement position information of the real object.
6. The method of claim 5, wherein, for each real object in the indoor environment, acquiring a virtual space capsule element matching the real object comprises:
identifying an object type of the real object based on the scene image of the indoor environment;
and determining, based on the identified object type, a virtual space capsule element matching the object type.
7. The method of claim 6, wherein determining the virtual space capsule element matching the object type based on the identified object type comprises:
identifying an object size and a color of the real object based on the scene image of the indoor environment;
and adjusting, based on the identified object size and color, the size and color of the virtual space capsule element matching the object type so that they match those of the real object.
8. A space capsule display apparatus, the apparatus comprising:
an acquisition module configured to acquire a scene image of an indoor environment captured by an AR device;
a determination module configured to determine virtual space capsule environment information corresponding to the indoor environment based on the acquired scene image;
and a generation module configured to generate AR data of the space capsule based on the determined virtual space capsule environment information and to control the AR device to display an AR scene image of the space capsule according to the AR data.
9. An electronic device, comprising: a processor, a memory, and a bus, the memory storing machine-readable instructions executable by the processor, wherein, when the electronic device runs, the processor and the memory communicate via the bus, and the machine-readable instructions, when executed by the processor, perform the steps of the space capsule display method according to any one of claims 1 to 7.
10. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when run by a processor, performs the steps of the space capsule display method according to any one of claims 1 to 7.
CN202010509190.5A 2020-06-07 2020-06-07 Space capsule display method and device, electronic equipment and storage medium Pending CN111580679A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010509190.5A CN111580679A (en) 2020-06-07 2020-06-07 Space capsule display method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN111580679A 2020-08-25

Family

ID=72111324

Country Status (1)

Country Link
CN (1) CN111580679A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106445106A (en) * 2016-08-29 2017-02-22 宇龙计算机通信科技(深圳)有限公司 Space virtual scene experience method and system
CN109903129A (en) * 2019-02-18 2019-06-18 北京三快在线科技有限公司 Augmented reality display methods and device, electronic equipment, storage medium
CN110197532A (en) * 2019-06-05 2019-09-03 北京悉见科技有限公司 System, method, apparatus and the computer storage medium of augmented reality meeting-place arrangement
CN110286773A (en) * 2019-07-01 2019-09-27 腾讯科技(深圳)有限公司 Information providing method, device, equipment and storage medium based on augmented reality
CN110533719A (en) * 2019-04-23 2019-12-03 以见科技(上海)有限公司 Augmented reality localization method and device based on environmental visual Feature point recognition technology
CN111028358A (en) * 2018-10-09 2020-04-17 香港理工大学深圳研究院 Augmented reality display method and device for indoor environment and terminal equipment

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112153319A (en) * 2020-09-02 2020-12-29 芋头科技(杭州)有限公司 AR information display method and device based on video communication technology
US11521297B2 (en) 2020-09-02 2022-12-06 Sichuan Smart Kids Technology Co., Ltd. Method and device for presenting AR information based on video communication technology
CN112153319B (en) * 2020-09-02 2023-02-24 芋头科技(杭州)有限公司 AR information display method and device based on video communication technology
CN112287949A (en) * 2020-11-02 2021-01-29 杭州灵伴科技有限公司 AR information display method and AR display device based on multiple feature information

Similar Documents

Publication Publication Date Title
CN111638793A (en) Aircraft display method and device, electronic equipment and storage medium
CN111679742A (en) Interaction control method and device based on AR, electronic equipment and storage medium
CN110176197B (en) Holographic display method, system, storage medium and equipment
CN111651047A (en) Virtual object display method and device, electronic equipment and storage medium
CN111638797A (en) Display control method and device
JP2022545598A (en) Virtual object adjustment method, device, electronic device, computer storage medium and program
CN111651057A (en) Data display method and device, electronic equipment and storage medium
CN111580679A (en) Space capsule display method and device, electronic equipment and storage medium
CN112069341A (en) Background picture generation and search result display method, device, equipment and medium
CN111653175B (en) Virtual sand table display method and device
CN111651051A (en) Virtual sand table display method and device
CN106527695A (en) Method and device for information output
CN111652983A (en) Augmented reality AR special effect generation method, device and equipment
CN112905014A (en) Interaction method and device in AR scene, electronic equipment and storage medium
CN112882576A (en) AR interaction method and device, electronic equipment and storage medium
CN111640167A (en) AR group photo method, AR group photo device, computer equipment and storage medium
CN111667590A (en) Interactive group photo method and device, electronic equipment and storage medium
CN113127126B (en) Object display method and device
CN111651049B (en) Interaction method, device, computer equipment and storage medium
CN112288881B (en) Image display method and device, computer equipment and storage medium
CN111640195A (en) History scene reproduction method and device, electronic equipment and storage medium
CN112991555A (en) Data display method, device, equipment and storage medium
CN111639818A (en) Route planning method and device, computer equipment and storage medium
CN111639975A (en) Information pushing method and device
CN111599292A (en) Historical scene presenting method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination