CN116610396A - Method and device for shared shooting of native content and metaverse content, and electronic device


Info

Publication number
CN116610396A
CN116610396A (application CN202310464724.0A)
Authority
CN
China
Prior art keywords
meta
app
native
project
engineering
Prior art date
Legal status
Pending
Application number
CN202310464724.0A
Other languages
Chinese (zh)
Inventor
王小军 (Wang Xiaojun)
Current Assignee
Beijing Hetu United Innovation Technology Co., Ltd.
Original Assignee
Beijing Hetu United Innovation Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Beijing Hetu United Innovation Technology Co., Ltd.
Priority to CN202310464724.0A
Publication of CN116610396A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

Abstract

The application discloses a method, a device, an electronic device, a medium and a program product for shared shooting of native content and metaverse content. The method is applied to an APP of a terminal device, where the APP includes a native APP project and a metaverse project, the native APP project includes a native navigation controller component, a native window component and a native view controller component, and the metaverse window in the metaverse project is created by obtaining the native window of the native APP project. The method for shared shooting of native content and metaverse content includes: capturing the screen of the APP page at predetermined time intervals by calling a predetermined component in a system function file to obtain a plurality of screenshots, where one part of the screenshots are pages of the native APP project, another part are pages of the metaverse project, and a further part are transition pages shown when the native APP project and the metaverse project switch to each other. The embodiments of the application can realize shared shooting of the native content and the metaverse content in the APP.

Description

Method and device for shared shooting of native content and metaverse content, and electronic device
Technical Field
The application relates to the technical field of augmented reality and also to the technical field of video recording, and in particular to a method, a device, an electronic device, a computer-readable storage medium and a computer program product for shared shooting of native content and metaverse content.
Background
Hybrid APPs, in which a native APP integrates metaverse content or an H5 applet, are currently emerging. Such a hybrid APP usually contains both native content and metaverse content, and when using such an APP a user typically wishes to record the rich content presented on the phone screen.
However, existing recording means have multiple problems with such hybrid APP interfaces and cannot meet higher usage requirements. For example, some recording modes can only record a pure native APP interface, or only a pure metaverse content interface, and cannot record native content and metaverse content at the same time; in addition, some recording methods can record only the system sound of the metaverse content but not the environment sound. How to implement shared recording of the native content and the metaverse content in a hybrid APP is a problem to be solved.
In view of the above problems in the related art, no effective solution has yet been found. The above is merely background information related to the present application known to the inventors and does not constitute an admission of prior art.
Disclosure of Invention
In view of the above, embodiments of the present application provide a method, an apparatus, an electronic device, a computer-readable storage medium, and a computer program product for shared shooting of native content and metaverse content, which are used to solve at least one of the above technical problems.
An embodiment of the application provides a method for shared shooting of native content and metaverse content, applied to an APP of a terminal device, where the APP includes a native APP project and a metaverse project, the native APP project includes a native navigation controller component, a native window component and a native view controller component, and the metaverse window in the metaverse project is created by obtaining the native window of the native APP project. The method for shared shooting of native content and metaverse content includes: capturing the screen of the APP page at predetermined time intervals by calling a predetermined component in a system function file to obtain a plurality of screenshots, where one part of the screenshots are pages of the native APP project, another part are pages of the metaverse project, and a further part are transition pages shown when the native APP project and the metaverse project switch to each other; storing the plurality of screenshots into a cache; and synthesizing the plurality of screenshots to generate a shared video, where the shared video is used to display the native content of the native APP project and the metaverse content of the metaverse project. The native APP project and the metaverse project switch to each other as follows: determining that the current page of the APP is a native scene page of the native APP project; triggering an entry of the metaverse project in the current native scene page, and having the native view controller component in the native APP project load the native navigation controller component in a bottom-up present manner, where a root controller component for the metaverse scene is set in the native navigation controller component; the metaverse project obtains the native window of the native APP project and fills the native window with the content of the root controller component, so that the current page of the APP displays the metaverse scene of the metaverse project, completing the switch from the native APP project to the metaverse project; and, when needed, the root controller component of the metaverse project hides the metaverse scene, so that the APP returns from the metaverse scene page of the metaverse project to the native scene page of the native APP project.
Optionally, the metaverse window in the metaverse project is created by assigning the native window component of the native APP project to the metaverse project.
Optionally, the native APP project is a main project and the metaverse project is a sub-project.
Optionally, the native view controller component always controls the scene interface displayed at the uppermost layer in the native APP project.
Optionally, the predetermined component is startRecordingWithHandler.
Optionally, the metaverse project includes a dynamic link library, an instance variable of the dynamic link library and a communication protocol, where the dynamic link library is used to present the data of the metaverse project, the data including at least scene data and resource data; the instance variable is used to implement interaction and communication between the native APP project and the metaverse project; and the communication protocol is used by the metaverse project to obtain user information from the native APP project.
Optionally, the native APP project includes an AppDelegate+Unity file implementing the protocol, and all interactions by which the metaverse operates the native project are placed in the AppDelegate+Unity file implementing the protocol.
Optionally, the communication protocol includes the NativeCallsProtocol protocol.
Optionally, the user information in the native APP project includes at least one of: user account information and user avatar information.
An embodiment of the application provides an electronic device including a processor and a memory storing computer program instructions which, when executed by the processor, implement the steps of the method described above.
Embodiments of the present application provide a computer-readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the method described above.
Embodiments of the present application provide a computer program product including computer program instructions which, when executed by a processor, implement the steps of the method described above.
An embodiment of the application further provides a device for shared shooting of native content and metaverse content, applied to an APP of a terminal device, where the APP includes a native APP project and a metaverse project, the native APP project includes a native navigation controller component, a native window component and a native view controller component, and the metaverse window in the metaverse project is created by obtaining the native window of the native APP project. The device for shared shooting of native content and metaverse content includes: a screen capture processing module, configured to capture the screen of the APP page at predetermined time intervals by calling a predetermined component in a system function file to obtain a plurality of screenshots, where one part of the screenshots are pages of the native APP project, another part are pages of the metaverse project, and a further part are transition pages shown when the native APP project and the metaverse project switch to each other; a storage module, configured to store the plurality of screenshots into a cache; and a synthesis module, configured to synthesize the plurality of screenshots to generate a shared video, where the shared video is used to display the native content of the native APP project and the metaverse content of the metaverse project. The native APP project and the metaverse project switch to each other through the following modules: a determining module, configured to determine that the current page of the APP is a native scene page of the native APP project; a loading module, configured to trigger an entry of the metaverse project in the current native scene page and have the native view controller component load the native navigation controller component in a bottom-up present manner, where a root controller component for the metaverse scene is set in the native navigation controller component; a native window filling module, configured to obtain the native window of the native APP project and fill the native window with the content of the root controller component, so that the current page of the APP displays the metaverse scene of the metaverse project, completing the switch from the native APP project to the metaverse project; and a metaverse scene hiding module, configured to trigger, when needed, the root controller component of the metaverse project to hide the metaverse scene, so that the APP returns from the metaverse scene page of the metaverse project to the native scene page of the native APP project.
In the method provided by the embodiments of the application, by modifying the way the metaverse window is created in the metaverse project and by setting a root controller component for the metaverse scene in the native navigation controller component, the native window of the native APP project can be shared by the metaverse project and the native project. The native navigation controller component is loaded by the native view controller component in a bottom-up present manner, so that the metaverse content can be displayed in the native window of the native project and the metaverse content and the native content can switch to each other. As a result, when video is recorded, the native content and the metaverse content share the same window, and shared recording of the native content, the metaverse content, and the transition content between them can be realized.
Drawings
To illustrate the embodiments of the present application more clearly, the drawings used in the embodiments are briefly described below.
Fig. 1 is a schematic diagram of an AR system architecture based on a server and a terminal device according to an embodiment of the present application.
Fig. 2 is a schematic diagram of a virtual-real fusion image for AR navigation by using a mobile phone APP.
Fig. 3 is a flowchart of a method for shared shooting of native content and metaverse content according to an embodiment of the present application.
Fig. 4 is a schematic diagram of a native page according to an embodiment of the present application.
Fig. 5 to fig. 7 are schematic diagrams of system architecture relationships of a hybrid integrated APP according to an embodiment of the present application.
Fig. 8 is a structural block diagram of a device for shared shooting of native content and metaverse content according to an embodiment of the present application.
Fig. 9 is a schematic diagram of a terminal device or a server for implementing a method for shared shooting of native content and metaverse content according to an embodiment of the present application.
Fig. 10 is a software architecture diagram of an exemplary terminal device according to an embodiment of the present application.
Detailed Description
The principles and spirit of the present application will be described below with reference to several exemplary embodiments. It will be appreciated that such embodiments are provided to make the principles and spirit of the application clear and thorough, and to enable those skilled in the art to better understand and practice them. The exemplary embodiments provided herein are merely some, but not all, embodiments of the application. All other embodiments obtainable by one of ordinary skill in the art without undue burden from the embodiments herein are within the scope of the present application.
Those skilled in the art will appreciate that embodiments of the application may be implemented as a system, apparatus, device, method, computer-readable storage medium, or computer program product. Accordingly, the present application may be embodied in at least one of the following forms: complete hardware, complete software, or a combination of hardware and software. According to particular embodiments, the present application provides a method, an apparatus, an electronic device, a computer-readable storage medium, and a computer program product for shared shooting of native content and metaverse content.
In this document, terms such as first, second, third, etc. are used solely to distinguish one entity (or action) from another entity (or action) without necessarily requiring or implying any order or relationship between such entities (or actions).
The embodiments of the application can be applied to a server and a terminal device. Referring to fig. 1, a schematic diagram of an AR system architecture based on a server and terminal devices is shown. The AR system architecture comprises a server 10 and several terminal devices 20. In some examples, the terminal device 20 is an AR device, which may be a dedicated AR device such as a head-mounted display (HMD), smart gloves, smart apparel, or another smart wearable electronic device. In some examples, the terminal device 20 may be a general-purpose AR device, such as a mobile phone, a portable computer, a notebook computer, a tablet computer, a Virtual Reality (VR) device, a vehicle-mounted device, a navigation device, a game device, and the like.
Taking an AR helmet or AR glasses as an example, a head-mounted display, a machine vision system, a mobile computer and the like can be integrated into a wearable device. The device has a glasses-like display, is worn on the user's head during operation, and can transmit augmented reality information to the display or project it onto the user's eyeballs, enhancing the user's visual immersion. In some examples, the AR device also has a camera, which may be a wide-angle camera, a telephoto camera, or a structured-light camera (also referred to as a point-cloud depth camera, 3D structured-light camera, or depth camera). A structured-light camera is based on 3D vision technology and can acquire both the plane and the depth information of an object. It projects light with specific structural characteristics onto the photographed object through a near-infrared laser; an infrared camera then collects the reflected light, which is processed by a processor chip. The calculation principle is that the position and depth information of the object are computed from the changes the object causes in the light signal, and a 3D image is rendered. A conventional terminal device such as a mobile phone presents a two-dimensional image and cannot display the depth of different positions in the image, whereas shooting with a structured-light camera can acquire 3D image information, i.e., not only the color and similar information of different positions in the image but also their depth information, which can be used for AR ranging. Of course, a common terminal device can also acquire a 2D image with an optical camera, obtain the depth information of the 2D image with a deep-learning algorithm or the like, and finally present a 3D image as well.
In some examples, the terminal device 20 has AR-enabled software or an APP installed. The server 10 may be a management server or an application server of the software or APP. The server 10 may be a single server, a server cluster formed by a plurality of servers, or a cloud server. The terminal device 20 integrates modules with networking functions, such as a Wireless Fidelity (Wi-Fi) module, a Bluetooth module, and a 2G/3G/4G/5G communication module, so as to connect to the server 10 through a network.
Illustratively, a user may log into the user account through an APP installed in the cell phone, and the user may also log into the user account through software installed in the AR glasses.
Taking an APP with an AR navigation function as an example, the APP may have high-precision map navigation capability, environment understanding capability, virtual-real fusion rendering capability, and the like. The APP may report the current geographic location information to the server 10 through the terminal device 20, and the server 10 provides an AR navigation service for the user based on the real-time geographic location information. Taking a mobile phone as the terminal device 20 as an example, in response to the user starting the APP, the mobile phone may start the camera to collect images of the real environment, perform AR augmentation on the collected images through the system, blend or superimpose rendered AR effects (such as navigation route markers, road names, merchant information, and advertisements) into the real-environment image, and present the virtual-real fused image on the screen of the mobile phone.
Fig. 2 schematically shows a virtual-real fused image of AR navigation using a mobile phone APP, in which AR navigation arrows are superimposed on the real road surface and in the space, and electronic resources promoted by merchants float at designated positions in the space in the form of parachutes carrying gift boxes.
Referring to fig. 3, an embodiment of the present application provides a method for shared shooting of native content and metaverse content, applied to an APP of a terminal device, where the APP includes a native APP project and a metaverse project, the native APP project includes a native navigation controller component, a native window component and a native view controller component, and the metaverse window in the metaverse project is created by obtaining the native window of the native APP project. The method for shared shooting of native content and metaverse content includes the following steps:
S101: capture the screen of the APP page at predetermined time intervals by calling a predetermined component in a system function file to obtain a plurality of screenshots, where one part of the screenshots are pages of the native APP project, another part are pages of the metaverse project, and a further part are transition pages shown when the native APP project and the metaverse project switch to each other.
S102: store the plurality of screenshots into a cache.
S103: synthesize the plurality of screenshots to generate a shared video, where the shared video is used to display the native content of the native APP project and the metaverse content of the metaverse project.
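As an illustration only, a minimal Swift sketch of steps S101 to S103 is given below. It assumes a single shared window, and the class name SharedCaptureSession is illustrative rather than taken from the patent; the compositing of S103 is deferred to the AVAssetWriter sketch later in this description.

```swift
import UIKit

// Illustrative sketch of S101-S103 (not the patent's actual code).
final class SharedCaptureSession {
    private var frames: [UIImage] = []   // S102: the in-memory cache
    private var timer: Timer?

    // S101: snapshot the app's key window at a fixed interval (e.g. 30 fps).
    func start(interval: TimeInterval = 1.0 / 30.0) {
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { [weak self] _ in
            guard let self = self,
                  let window = UIApplication.shared.windows.first(where: { $0.isKeyWindow })
            else { return }
            let renderer = UIGraphicsImageRenderer(bounds: window.bounds)
            let image = renderer.image { _ in
                // Because the native and metaverse projects share one window, this
                // captures native pages, metaverse pages and transitions alike.
                window.drawHierarchy(in: window.bounds, afterScreenUpdates: false)
            }
            self.frames.append(image)
        }
    }

    // S103: hand the cached frames to a video writer (sketched later).
    func stop() -> [UIImage] {
        timer?.invalidate()
        timer = nil
        return frames
    }
}
```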
The native APP project and the metaverse project switch to each other through the following steps:
Step 1: determine that the current page of the APP is a native scene page of the native APP project;
Step 2: trigger an entry of the metaverse project in the current native scene page, and have the native view controller component in the native APP project load the native navigation controller component in a bottom-up present manner, where a root controller component for the metaverse scene is set in the native navigation controller component;
Step 3: the metaverse project obtains the native window of the native APP project and fills the native window with the content of the root controller component, so that the current page of the APP displays the metaverse scene of the metaverse project, completing the switch from the native APP project to the metaverse project;
Step 4: when needed, the root controller component of the metaverse project hides the metaverse scene, so that the APP returns from the metaverse scene page of the metaverse project to the native scene page of the native APP project.
In some embodiments of the present application, the native navigation controller component, the native window component, and the native view controller component refer respectively to the navigation controller component, window component, and view controller component set in the native project; the navigation controller component may be a NavigationController, the window component may be a Window, and the view controller component may be a ViewController.
According to the method provided by the embodiments of the application, by modifying the way the metaverse window is created in the metaverse project, the metaverse project and the native project can share the native window component of the native APP project when the native project switches into the metaverse project, so that the display of the metaverse content and the display of the native content share one native window component (window). Then, when the startRecordingWithHandler method in the system function file RPScreenRecorder is called, the metaverse content, the native content, and the transition pages shown when the native APP project and the metaverse project switch to each other can all be recorded at the same time.
In addition, when the metaverse scene is displayed, the root controller component for the metaverse scene is set in the native navigation controller component, and the native navigation controller component is loaded by the native view controller component in a bottom-up present manner. The content of the root controller component in the native navigation controller component can therefore be used to fill the native window, so that the current page of the APP displays the metaverse scene of the metaverse project.
The embodiments of the application can achieve a state in which there is only one window in the client APP, so video can be recorded at any position; that is, shared recording of the native content, the metaverse content, and the transition pages shown during switching can be realized. The recorded video has no problems such as memory leaks or missing environment sound, solving the previous inability to record metaverse content and native content together.
In some embodiments of the application, optionally, the metaverse window in the metaverse project is created by assigning the native window component of the native APP project to the metaverse project.
In some embodiments of the application, optionally, the native APP project is a main project and the metaverse project is a sub-project. In the application, the client APP may be a hybrid APP that includes a native project and a metaverse project, where the native project is the main project and the metaverse project is the sub-project. In this way, user registration is mainly completed in the native project, and the metaverse project, as the sub-project, can retrieve the user's registration information through certain communication protocols.
In some embodiments of the application, optionally, the native view controller component always controls the scene interface displayed at the uppermost layer in the native APP project.
Specifically, in the process of switching from the native project to the metaverse project, the scene interface currently displayed at the uppermost layer in the native project must first be found; this scene interface corresponds to the native view controller component. The native controller component controls the native window of the native project to display the scene interface of the native project, which is the page currently displayed by the APP. To switch the page to the scene interface of the metaverse project, the native view controller component controlling the uppermost scene interface is found, this native view controller component is used to load the native navigation controller component, and the metaverse scene is then filled into the native window, switching the currently displayed scene interface of the native project to that of the metaverse project.
In some embodiments of the application, optionally, the predetermined component is startRecordingWithHandler. In the application, when recording the shared video, the APP can start recording by calling the startRecordingWithHandler method in the system function file RPScreenRecorder.
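For illustration, the corresponding ReplayKit call in Swift might look as follows (startRecordingWithHandler: is the Objective-C selector of startRecording(handler:)); enabling the microphone is what allows environment sound to be captured along with system sound:

```swift
import ReplayKit

// Minimal sketch: start a shared recording with the system screen recorder.
func startSharedRecording() {
    let recorder = RPScreenRecorder.shared()
    recorder.isMicrophoneEnabled = true   // capture environment sound as well
    recorder.startRecording { error in    // ObjC: startRecordingWithHandler:
        if let error = error {
            print("Recording failed to start: \(error.localizedDescription)")
        }
    }
}
```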
In some embodiments of the present application, optionally, the metaverse project includes a dynamic link library, an instance variable of the dynamic link library, and a communication protocol, where the dynamic link library is used to present the data of the metaverse project, the data including at least scene data and resource data; the instance variable is used to implement interaction and communication between the native APP project and the metaverse project; and the communication protocol is used by the metaverse project to obtain user information from the native APP project.
To realize switching between the native project and the metaverse project in the APP, the metaverse project provided by the embodiments of the application includes a dynamic link library, an instance variable of the dynamic link library, and a communication protocol. The dynamic link library is the core code logic of the metaverse and is used to present all scene, resource, and other data of the metaverse. The instance variable of the dynamic link library represents the metaverse: the native project displays the metaverse scene by rendering the instance variable of the dynamic link library, and all communication and interaction between the native project and the metaverse project is realized through this instance variable.
Regarding the communication protocol created in the metaverse project, in some embodiments, optionally, the communication protocol is NativeCallsProtocol, which mainly implements the metaverse's operation of the native project, such as the metaverse presenting the user's login account information and avatar information; the user information in the native project is obtained through the NativeCallsProtocol protocol.
In addition, in some embodiments, optionally, the communication protocol may also be used by the metaverse to obtain the information required for metaverse scene initialization, including the unique number ID of the location of the metaverse item, the position coordinates of the metaverse scene landmark, and the like. When switching from the native project to the metaverse project, the metaverse project needs to learn which metaverse item it is to load, because many sub-metaverse items may be set in the metaverse project and different sub-metaverse items correspond to different metaverse data. Moreover, in a metaverse scene, displaying digital resources requires combining the user's current position coordinates with the corresponding metaverse scene landmark coordinates stored in advance in the metaverse database; therefore, when displaying the metaverse scene, the metaverse project needs to obtain, through the communication protocol, the unique number ID of the location of the target metaverse item and the landmark position coordinates of the metaverse scene.
In some embodiments of the present application, optionally, the native APP project includes an AppDelegate+Unity file implementing the protocol, and all interactions by which the metaverse operates the native project are placed in that AppDelegate+Unity file. Since the communication protocol is implemented in the native project, an AppDelegate+Unity file implementing the protocol needs to be set in the native project, so that all of the metaverse's interactions with the native project are concentrated in this file.
In some embodiments of the application, optionally, the communication protocol includes the NativeCallsProtocol protocol.
In some embodiments of the present application, optionally, the user information in the native APP project includes at least one of: user account information and user avatar information. The native project is the main project, and the information registered by the user can be stored in the native project; the metaverse can obtain the user account information and/or user avatar information in the native project through the communication protocol. Combined with the communication protocol described above, this can also be used to obtain the information required for metaverse scene initialization; in some embodiments, optionally, the user information further includes the metaverse item of the target site's central axis, the target central-axis item ID, and the target item coordinates.
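A hedged Swift sketch of such a communication protocol is shown below. The method names are assumptions for illustration; the patent only states that the protocol exposes user account/avatar information and scene-initialization data (item ID, landmark coordinates) to the metaverse side, and that the native project implements it in its AppDelegate+Unity file.

```swift
import Foundation

// Illustrative protocol; the method names are assumptions, not from the patent.
@objc protocol NativeCallsProtocol {
    // User information held by the native main project.
    func userAccountInfo() -> [String: String]
    func userAvatarURL() -> String
    // Scene-initialization data: unique item ID, landmark coordinates, etc.
    func sceneInitInfo(forItemID itemID: String) -> [String: Any]
}
```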
The foregoing describes various implementations of the embodiments of the present application and the technical advantages they can achieve. The processing of the embodiments is described in detail below using specific examples.
As an example, a hybrid APP in which a native APP integrates a Unity APP is described below; specifically, a tour navigation APP (which belongs to this hybrid class) is used as the example to describe an implementation of the shared recording function according to the embodiments of the present application.
The Unity APP in the hybrid APP refers to an APP implemented with Unity programming, most commonly a game APP. The reasons why a hybrid APP ordinarily cannot realize shared recording are briefly described below.
1. When NatCorder (a Unity recording plugin) is used to record video in the Unity APP, there is a memory leak of roughly 200 MB per recording, and the APP crashes after multiple recording operations; this is particularly the case on iOS systems.
2. When AVPro Movie Capture is used to record video in the Unity APP, environment sound cannot be recorded; only the sound played by Unity itself is captured.
3. When the native RPScreenRecorder component is used to record video, only a pure native APP or a pure Unity APP can be recorded, not both together. The reason is that RPScreenRecorder can only record the keyWindow of the main process; if the native project's APP embeds Unity, the two do not share a keyWindow, so the recording cannot transition from a native page to a Unity page.
For ease of understanding, this example assumes a hybrid APP that is a tour navigation APP tailored for a museum, including a native project and a metaverse project. Because it belongs to the hybrid class, the system also has the drawbacks listed above, and shared recording of the native content and the metaverse content cannot ordinarily be achieved.
Following the processing of the embodiments of the present application, the museum tour navigation APP needs special configuration. Specifically, first, the calling mode used when the native side calls up Unity is modified so that the native navigation controller directly presents the Unity root page rootViewController; second, the native keyWindow is shared into the UnityFramework (the way the window is obtained must also be modified, as described in detail below). When Unity starts, the shared keyWindow takes effect, so the APP always has exactly one window, and the shared video can be recorded at any position.
The specific settings and operational procedures that may be adopted in this example are described in detail below.
1. APP project architecture
Taking the museum tour navigation APP as an example, it is a hybrid APP in which a client-side native project integrates a metaverse; it includes a native project and a metaverse project, where the native project is the main project with the following architecture:
Window + NavigationController + ViewController.
2. Integration and interaction of the main project and the sub-project
2.1 Integration of the metaverse project
After the metaverse program is written, it can be compiled into an Xcode project by the compiler of the metaverse development platform Unity and then applied as a sub-project in the hybrid APP; that is, the metaverse project is applied as a sub-project of the museum tour navigation APP.
The above is the process of the Unity integration mode, in which the native project references the UnityFramework of the metaverse. UnityFramework is a dynamic link library used to present all the data of the metaverse, such as all its scenes and resources; this library is the core code logic of the metaverse and is generated after the metaverse program is compiled. The instance variable ufw of UnityFramework is defined in the native project; the native project displays the metaverse scene and the like by rendering ufw, and all communication and interaction between the native side and the metaverse is realized through ufw.
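As a hedged sketch (following the general pattern of Unity's "Unity as a Library" sample code; exact paths and APIs may differ per Unity version), the native project might obtain the ufw instance variable like this:

```swift
import UIKit

// Illustrative loader for the compiled UnityFramework dynamic link library.
func loadUnityFramework() -> UnityFramework? {
    let path = Bundle.main.bundlePath + "/Frameworks/UnityFramework.framework"
    guard let bundle = Bundle(path: path) else { return nil }
    if !bundle.isLoaded { bundle.load() }
    // The framework's principal class vends the singleton instance (ufw).
    guard let ufwClass = bundle.principalClass as? UnityFramework.Type else { return nil }
    return ufwClass.getInstance()
}

// let ufw = loadUnityFramework()
// All native <-> metaverse interaction then goes through ufw.
```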
For convenience of description below, the native project in the museum tour navigation APP of this example is sometimes referred to as "project A" or "native project A", and the metaverse project as "project B" or "metaverse project B".
2.2 Interaction between metaverse project B and native project A:
2.2.1: A protocol NativeCallsProtocol is created in metaverse project B. This protocol mainly implements the metaverse's operation of the native project, such as the metaverse presenting the user's login account information and avatar information; the user information in the native project is obtained through the NativeCallsProtocol protocol.
2.2.2: The NativeCallsProtocol protocol is implemented in native project A, and all interactions by which the metaverse operates the native project are placed in the AppDelegate+Unity file implementing the protocol.
2.3: the way the original project A calls up and hides the meta-space project B:
2.3.1: and finding a scene interface (corresponding to a native code of ViewController) displayed at the current uppermost layer in the native project A.
2.3.2: the method comprises the steps of creating a native navigation controller NavgationController bearing a meta-universe, and putting a root controller rootViewController of a meta-universe (ufw) scene into the native navigation controller NavgationController.
2.3.3: the meta-universe scene is displayed, and the native ViewController loads the native navigation controller NavgationController where the meta-universe is located in a push-up manner (presentViewController) from below.
2.3.4: returning from the meta-universe scenario to the native project a: the root controller rootViewController of the meta-universe (ufw) scene invokes the disvisViewControlleranimate method to hide the meta-universe information (ufw), also known as returning the native page, i.e., hiding the meta-universe engineering B.
3. Recording video
Fig. 4 shows a schematic diagram of a native page of an embodiment of the present application. Referring to fig. 4, the operation procedure for shared recording of native content and metaverse content in this example is as follows:
3.1: adding a record button 601 on the original engineering page 600;
3.2: clicking the record button 601 "start recording", popup the rights box for determining if the user allows recording, and giving two options: recording is started after the screen recording is performed and the screen recording is not allowed, and the option screen recording is clicked in the permission frame, and the recording button 601 is changed into stop recording;
clicking the metauniverse entry button 602 enters the metauniverse page. In addition, the original page is also provided with a plurality of sub-engineering icons of the meta-space engineering, namely, an icon 603 of the first sub-engineering, an icon 604 of the meta-second sub-engineering and an icon 605 of the third sub-engineering, after clicking an entry button 602 of the meta-space engineering, the meta-space engineering matches the sub-engineering of the meta-space engineering according to the coordinates of a user, and digital resources in the meta-space engineering are combined with a real scene and are displayed in the page of the meta-space engineering.
The native engineering and the meta engineering share a native window, which is a main window for displaying the native scene and meta scene information. The process of acquiring the native windows by the meta-space engineering comprises the following steps:
Finding a code file for creating a window in the meta-space engineering B; and modifying a window creation mode, obtaining a native window in the native engineering XX instead, and assigning the window to the meta-space engineering B.
Therefore, when the original content is switched into the meta-space content, the meta-space engineering acquires the original window of the original engineering to display the meta-space content, so that window sharing of the original engineering and the meta-space engineering in the hybrid integrated APP is realized, and the problem that the high-quality recording of the original content and the meta-space content cannot be realized in the hybrid integrated APP is solved.
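A hedged sketch of the modified window acquisition follows; in an actual build the change would live inside the window-creation code of the Unity-generated project, and this Swift function only illustrates the substitution:

```swift
import UIKit

// Instead of creating a second UIWindow for the metaverse, e.g.
//     window = UIWindow(frame: UIScreen.main.bounds)
// the metaverse project is handed the native project's existing key window.
func metaverseWindow() -> UIWindow? {
    // One shared window means RPScreenRecorder can record native pages,
    // metaverse pages, and the transitions between them.
    return UIApplication.shared.windows.first(where: { $0.isKeyWindow })
}
```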
3.3: the mobile phone screen is intercepted at fixed time intervals, the intercepting frequency can be 30 frames/s or 60 frames/s, and the higher the frequency is, the better the animation continuity is. Specifically, the appropriate frequency may be selected based on the performance of the handset. The intercepted screen photo is stored in a cache of the museum tour navigation APP;
3.4: after clicking the 'end recording' button, condensing all stored screen photos into a video, and storing the video into a mobile phone system album; the video comprises original page content, meta-cosmic page content and a transition page when the original APP engineering and the meta-cosmic engineering are mutually switched.
3.5: the recording is completed and can be saved to a local album or uploaded to a cloud service.
To help understand the architecture of the hybrid APP described in this example, figs. 5 to 7 schematically show partial system architecture relationship diagrams of the hybrid APP according to an embodiment of the present application. As shown in fig. 5, the hybrid APP includes project A 402 as the main project and project B 401 as the sub-project, together with the sub-files, icons, and other contents under each project.
The system architecture relationship illustrated in fig. 6 further shows the lower-level files under several sub-files compared to fig. 5. For example, the lower-level files of UnityFramework also include UnityFramework and Info.plist. In addition, a BonjourClientImpl file is provided under the Plugins folder for setting up the communication protocol NativeCallsProtocol. Fig. 7 shows the location of the AppDelegate+Unity file in the hybrid APP, which implements the metaverse project's operations on the native project.
Correspondingly, an embodiment of the present application further provides a device for shared shooting of native content and metaverse content. Referring to fig. 8, the device 100 for shared shooting of native content and metaverse content includes:
a screen capture processing module 101, configured to capture the screen of the APP page at predetermined time intervals by calling a predetermined component in a system function file to obtain a plurality of screenshots, where one part of the screenshots are pages of the native APP project, another part are pages of the metaverse project, and a further part are transition pages shown when the native APP project and the metaverse project switch to each other;
a storage module 102, configured to store the plurality of screenshots into a cache;
a synthesis module 103, configured to synthesize the plurality of screenshots to generate a shared video, where the shared video is used to display the native content of the native APP project and the metaverse content of the metaverse project. The native APP project and the metaverse project switch to each other through the following modules:
a determining module 104, configured to determine that the current page of the APP is a native scene page of the native APP project;
a loading module 105, configured to trigger an entry of the metaverse project in the current native scene page and have the native view controller component in the native APP project load the native navigation controller component in a bottom-up present manner, where a root controller component for the metaverse scene is set in the native navigation controller component;
a native window filling module 106, configured to obtain the native window of the native APP project and fill the native window with the content of the root controller component, so that the current page of the APP displays the metaverse scene of the metaverse project, completing the switch from the native APP project to the metaverse project;
a metaverse scene hiding module 107, configured to trigger, when needed, the root controller component of the metaverse project to hide the metaverse scene, so that the APP returns from the metaverse scene page of the metaverse project to the native scene page of the native APP project.
It should be noted that, for clarity, various embodiments of the present application are described as a series of actions or combinations of processes. Those skilled in the art will appreciate that the implementation is not limited by the order of the actions or processes described, and that certain steps in the embodiments of the application may be processed in other orders or concurrently.
Those skilled in the art will also appreciate that the embodiments described herein are preferred embodiments, and that the actions, steps, modules, or units involved are not necessarily required by every embodiment of the application. The description of each embodiment has its own emphasis; for parts not detailed in one embodiment, reference may be made to the related descriptions of other embodiments.
For ease of understanding, the general meanings of some related technical terms in the art are given below.
1. APP: a program running on a mobile phone, an iPad, smart glasses, a smart watch, and the like.
2. Native APP: taking iOS devices as an example, a program written entirely in the Objective-C or Swift language by an iOS developer is considered a native APP.
3. Unity APP: an APP implemented with Unity programming, most commonly a game APP.
4. Hybrid integrated APP: an APP in the form of a native program integrating Unity, a native program integrating an H5 applet, and the like.
5. Window: the container that carries and displays all APP content; the bottom-level container of the program. A window has a rootViewController attribute; all content and pages to be displayed must be placed on it.
6. NavigationController: the navigation controller, used to control page display and hiding.
7. ViewController: the page display controller; the page that the APP displays on the phone.
8. Xcode: the compiler for writing iOS APPs; it can generate a program installation package (e.g., in ipa format) that can be installed on a mobile device such as a phone.
9. Unity: the compiler for writing metaverse programs; the term may also refer to the metaverse program itself.
10. UnityFramework: a dynamic link library (a packaging and reference form of a program) generated after the metaverse program is compiled; it contains all the logic code of the metaverse program.
11. rootViewController: the root controller; the bottommost display controller. Its display effect is largely the same as that of a ViewController, except that the rootViewController is placed at the bottommost layer.
Fig. 9 is a schematic structural diagram of an electronic device 60 according to an embodiment of the present application. The electronic device 60 includes a processor 61, a memory 62, and a communication bus connecting the processor 61 and the memory 62. The memory 62 stores a computer program that can run on the processor 61, and when the processor 61 runs the computer program, the steps in the methods of the embodiments of the present application are executed. The electronic device 60 also includes a communication interface for receiving and transmitting data. The electronic device 60 may be the server in the embodiments of the present application, including a cloud server; it may also be the terminal device or the AR device in the embodiments. Where appropriate, an electronic device may also be referred to as a computing device.
In some embodiments, the processor 61 may be a central processing unit (CPU), a graphics processing unit (GPU), an application processor (AP), a modem processor, an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, a neural-network processing unit (NPU), or the like. The processor 61 may also be another general-purpose processor, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The general-purpose processor may be a microprocessor or any conventional processor. The NPU can rapidly process input information by drawing on the structure of biological neural networks and can continuously self-learn. Applications such as intelligent recognition, image recognition, face recognition, semantic recognition, voice recognition, and text understanding can be implemented on the electronic device 60 through the NPU.
In some embodiments, the memory 62 may be an internal storage unit of the electronic device 60, such as a hard disk or memory of the electronic device 60; the memory 62 may also be an external storage device of the electronic device 60, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the electronic device 60. The memory 62 may also include both an internal storage unit and an external storage device of the electronic device 60. The memory 62 may be used to store the operating system, application programs, a boot loader (BootLoader), data, and other programs, such as the program code of a computer program. The memory 62 includes, but is not limited to, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), or portable compact disc read-only memory (CD-ROM). The memory 62 is used to store program code executed by the electronic device 60 and transmitted data, and may also be used to temporarily store data that has been output or is to be output.
Those skilled in the art will appreciate that fig. 9 is merely an example of the electronic device 60 and does not limit it; the electronic device 60 may include more or fewer components than illustrated, combine certain components, or include different components, for example input/output devices, network access devices, and the like.
Fig. 10 is a schematic software structure of a terminal device according to an embodiment of the present application. Taking a mobile phone operating system as an Android system as an example, in some embodiments, the Android system is divided into four layers, which are respectively: an application layer, an application framework layer (FWK), a system layer, and a hardware abstraction layer, the layers communicating via a software interface.
First, the application layer may include a plurality of application packages, which may be various application apps such as call, camera, video, navigation, weather, instant messaging, education, etc., or may be application apps based on AR technology.
Second, the application framework layer FWK provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer may include some predefined functions, such as functions for receiving events sent by the application framework layer.
The application framework layer may include a window manager, a resource manager, a notification manager, and the like.
The window manager is used to manage window programs; it can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and so on. The content provider is used to store and retrieve data and make the data accessible to applications; the data may include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, and the like.
Among other things, the resource manager provides various resources to the application, such as localization strings, icons, pictures, layout files, video files, and so forth.
The notification manager enables applications to display notification information in the status bar; it can be used to convey notification-type messages and can disappear automatically after a short stay without user interaction. For example, the notification manager is used to indicate download completion, message alerts, and so on. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system top status bar, such as notifications of background-running applications, or notifications in the form of a dialog window on the screen. For example, a text message is prompted in the status bar, a prompt tone sounds, the electronic device vibrates, or an indicator light blinks.
In addition, the application framework layer may also include a view system, which includes visual controls such as controls for displaying text and controls for displaying pictures. The view system may be used to build applications. A display interface may be composed of one or more views; for example, the display interface of an SMS notification icon may include a text display view and a picture display view.
Third, the system layer may include a plurality of functional modules, such as a sensor service module, a physical state recognition module, and a three-dimensional graphics processing library (e.g., OpenGL ES).
The sensor service module is used for monitoring the sensor data uploaded by the various sensors of the hardware layer and determining the physical state of the mobile phone; the physical state recognition module is used for analyzing and recognizing the user's gestures, face, and the like; the three-dimensional graphics processing library is used for three-dimensional graphics drawing, image rendering, composition, layer processing, and the like.
In addition, the system layer may also include a surface manager and a media library. The surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications. The media library supports playback and recording of a variety of common audio and video formats, as well as still image files, and the like.
Finally, the hardware abstraction layer is a layer between hardware and software. It may include a display driver, a camera driver, a sensor driver, and the like, for driving the related hardware of the hardware layer, such as the display screen, camera, and sensors.
The embodiments of the present application also provide a computer-readable storage medium storing a computer program or instructions which, when executed, implement the steps of the methods described in the above embodiments.
Embodiments of the present application also provide a computer program product comprising a computer program or instructions which, when executed, implement the steps of the methods described in the above embodiments. The computer program product may be, for example, a software installation package.
Those skilled in the art will appreciate that the methods, steps, or functions of the related modules/units described in the embodiments of the present application may be implemented, in whole or in part, by software, hardware, firmware, or any combination thereof. When implemented in software, they may take the form of a computer program product or of computer program instructions executed by a processor. The computer program product comprises at least one computer program instruction, which may consist of corresponding software modules that may be stored in RAM, flash memory, ROM, EPROM, EEPROM, registers, a hard disk, a removable disk, a compact disc read-only memory (CD-ROM), or any other form of storage medium known in the art. The computer program instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire or wirelessly. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available media may be magnetic media (e.g., floppy disks, hard disks, magnetic tapes), optical media, or semiconductor media (e.g., SSDs).
For each of the apparatuses/products described in the above embodiments, the modules/units they contain may be software modules/units, hardware modules/units, or partly software and partly hardware modules/units. For example, for an apparatus/product applied to or integrated on a chip, each module/unit it contains may be implemented in hardware such as a circuit, or at least some modules/units may be implemented as a software program running on a processor integrated inside the chip, with the remaining modules/units implemented in hardware such as a circuit. Likewise, for an apparatus/product applied to or integrated in a terminal, each module/unit it contains may be implemented in hardware such as a circuit, or at least some modules/units may be implemented as a software program running on a processor integrated inside the terminal, with the remaining modules/units implemented in hardware such as a circuit.
The foregoing describes only specific embodiments of the present application. Those skilled in the art will clearly understand that, for convenience and brevity of description, the specific working processes of the systems, modules, and units described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here. It should be understood that the scope of the present application is not limited thereto; any equivalent modifications or substitutions that can easily be conceived by those skilled in the art within the technical scope disclosed by the present application shall be covered by the scope of the present application.

Claims (13)

1. A method for shared shooting of native content and meta-universe content, characterized in that the method is applied to an APP of a terminal device, the APP comprises a native APP project and a meta-universe project, the native APP project comprises a native navigation controller component, a native window component, and a native view controller component, and the creation of the meta-universe window in the meta-universe project comprises obtaining the native window in the native APP project;
the method for shared shooting of native content and meta-universe content comprises:
performing screen capturing on the page of the APP at intervals of a preset time period by calling a preset component in a system function file to obtain a plurality of screen-capture images, wherein one part of the plurality of screen-capture images are pages of the native APP project, another part are pages of the meta-universe project, and a further part are transition pages shown when the native APP project and the meta-universe project switch between each other;
storing the plurality of screen-capture images in a cache;
synthesizing the plurality of screen-capture images to generate a shared video, wherein the shared video is used for displaying the native content of the native APP project and the meta-universe content of the meta-universe project;
wherein the native APP project and the meta-universe project switch between each other as follows:
determining that the current page of the APP is a native scene page of the native APP project;
triggering an entry of the meta-universe project in the current native scene page, whereupon the native view controller component in the native APP project loads the native navigation controller component in a bottom-to-top presentation manner, a root controller component for the meta-universe scene being arranged in the native navigation controller component;
the meta-universe project obtains the native window of the native APP project and fills the native window with the content of the root controller component, so that the current page of the APP displays the meta-universe scene of the meta-universe project, completing the switch from the native APP project to the meta-universe project;
when needed, the root controller component of the meta-universe project hides the meta-universe scene, so that the APP returns from the meta-universe scene page of the meta-universe project to the native scene page of the native APP project.
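As an illustration of the capture-and-cache steps of claim 1, the following is a minimal Swift sketch assuming an iOS/UIKit runtime; the ScreenCaptureManager name, the 0.05-second default interval, and the in-memory array cache are illustrative assumptions rather than details from the claim.

import UIKit

// A minimal sketch, assuming an iOS/UIKit runtime. The class name, default
// interval, and array cache are illustrative assumptions; the claim only
// requires capturing the APP's page at a preset interval and caching the images.
final class ScreenCaptureManager {
    private var timer: Timer?
    private(set) var cachedFrames: [UIImage] = []   // the cache of claim 1

    // Starts capturing the shared window at a fixed interval.
    func start(interval: TimeInterval = 0.05) {
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { [weak self] _ in
            self?.captureFrame()
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }

    private func captureFrame() {
        // Because the native APP project and the meta-universe project share
        // one window, each frame shows whatever is currently on screen: a
        // native page, a meta-universe scene, or a transition between them.
        guard let window = UIApplication.shared.connectedScenes
            .compactMap({ ($0 as? UIWindowScene)?.keyWindow })   // iOS 15+
            .first else { return }

        let renderer = UIGraphicsImageRenderer(bounds: window.bounds)
        let image = renderer.image { _ in
            window.drawHierarchy(in: window.bounds, afterScreenUpdates: false)
        }
        cachedFrames.append(image)
    }
}

A synthesis step would then feed cachedFrames into something like AVAssetWriter to produce the shared video. Note that view-snapshot APIs such as drawHierarchy(in:afterScreenUpdates:) do not always capture GPU-rendered (e.g., Unity/Metal) layers, which may be why claim 5 names the system-level recorder instead.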
2. The method of claim 1, wherein the creation of the meta-universe window in the meta-universe project comprises assigning the native window component in the native APP project to the meta-universe project.
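A minimal sketch of the window assignment described in claim 2, assuming an iOS/UIKit runtime; the MetaverseProject type and the attach function are hypothetical stand-ins for the embedded engine's window handle (for example, the app controller window in a Unity-as-a-Library setup), not names from the patent.

import UIKit

// A minimal sketch of claim 2, under the assumptions stated above.
final class MetaverseProject {
    weak var window: UIWindow?   // shared with the native APP project
}

// Instead of creating a second window, the meta-universe project is handed
// the native one; its root controller's content is later filled into it.
func attach(_ metaverse: MetaverseProject, to nativeWindow: UIWindow) {
    metaverse.window = nativeWindow
}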
3. The method of claim 1, wherein the native APP project is a main project and the meta-universe project is a subproject.
4. The method of claim 1, wherein the native view controller component always controls the scene interface displayed at the top level in the native APP project.
5. The method of claim 1, wherein the preset component is startRecordingWithHandler.
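startRecordingWithHandler: is the Objective-C selector of ReplayKit's RPScreenRecorder.startRecording(handler:); a minimal Swift sketch of invoking it might look as follows, with error handling reduced to logging.

import ReplayKit

// A minimal sketch of invoking the preset component named in claim 5.
func startSystemRecording() {
    let recorder = RPScreenRecorder.shared()
    guard recorder.isAvailable else { return }
    recorder.startRecording { error in
        if let error = error {
            print("Recording failed to start: \(error.localizedDescription)")
        }
    }
}

func stopSystemRecording() {
    RPScreenRecorder.shared().stopRecording { previewController, error in
        // previewController (if non-nil) lets the user edit/share the video.
        _ = previewController
        if let error = error {
            print("Recording failed to stop: \(error.localizedDescription)")
        }
    }
}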
6. The method of claim 1, wherein
the meta-universe project comprises a dynamic link library, an instance variable of the dynamic link library, and a communication protocol, wherein
the dynamic link library is used for presenting the data of the meta-universe project, the data comprising at least scene data and resource data;
the instance variable is used for realizing interaction and communication between the native APP project and the meta-universe project;
the communication protocol is used by the meta-universe project to acquire user information from the native APP project.
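The three parts of claim 6 might be modeled as in the sketch below; all names are illustrative assumptions. In a Unity-as-a-Library setup, the dynamic link library would typically be the UnityFramework binary and the instance variable a reference to its loaded instance.

import UIKit

// Communication protocol of claim 6: lets the meta-universe project pull
// user information from the native APP project. Name and method are
// illustrative, not from the patent.
protocol MetaverseCommunicationProtocol: AnyObject {
    func fetchNativeUserInfo() -> [String: String]
}

// Host object on the native side holding the instance variable of the
// dynamic link library through which the two projects interact.
final class MetaverseHost {
    // Instance variable of the loaded dynamic link library (e.g., a
    // UnityFramework instance in a Unity-as-a-Library setup).
    var engine: AnyObject?

    // The native implementation of the communication protocol.
    weak var nativeDelegate: MetaverseCommunicationProtocol?

    // The dynamic link library itself presents the meta-universe project's
    // scene data and resource data; loading it is implementation-specific
    // (e.g., Bundle(path:) plus principalClass) and therefore omitted here.
}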
7. The method of claim 1, wherein the native APP project comprises an AppDelegate+Unity file that implements the protocol, and all interactions in which the meta-universe project operates on the native project take place in the AppDelegate+Unity file implementing the protocol.
8. The method of claim 6, wherein the communication protocol comprises a NativeCallsProtocol protocol.
9. The method of claim 7, wherein the user information in the native APP project comprises at least one of: user account information and user avatar information.
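A sketch of how claims 7-9 might fit together in Swift, with the native side implementing the communication protocol in an extension of AppDelegate (the Swift counterpart of an Objective-C AppDelegate+Unity category). The NativeCallsProtocol below is a hand-written stand-in modeled on the protocol of the same name in Unity-as-a-Library integrations; the method names and return values are illustrative assumptions.

import UIKit

// Stand-in for the NativeCallsProtocol of claim 8; in a real
// Unity-as-a-Library project this protocol comes from the generated
// UnityFramework headers.
@objc protocol NativeCallsProtocol {
    func userAccountInfo() -> String   // claim 9: user account information
    func userAvatarURL() -> String     // claim 9: user avatar information
}

final class AppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow?
}

// All calls the meta-universe project makes into the native project are kept
// in this one extension, mirroring the single AppDelegate+Unity file of claim 7.
extension AppDelegate: NativeCallsProtocol {
    func userAccountInfo() -> String {
        // Illustrative: a real app would read this from its session or keychain.
        return "user-12345"
    }

    func userAvatarURL() -> String {
        return "https://example.com/avatar.png"
    }
}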
10. An electronic device comprising a processor and a memory storing computer program instructions, wherein the processor, when executing the computer program instructions, implements the method of any one of claims 1-9.
11. A computer readable storage medium, characterized in that the computer readable storage medium has stored thereon computer program instructions which, when executed by a processor, implement the method according to any of claims 1-9.
12. A computer program product comprising computer program instructions which, when executed by a processor, implement the method of any of claims 1-9.
13. A device for shared shooting of native content and meta-universe content, characterized in that the device is applied to an APP of a terminal device, the APP comprises a native APP project and a meta-universe project, the native APP project comprises a native navigation controller component, a native window component, and a native view controller component, and the creation of the meta-universe window in the meta-universe project comprises obtaining the native window in the native APP project;
the device for shared shooting of native content and meta-universe content comprises:
a screen-capture processing module, used for performing screen capturing on the page of the APP at intervals of a preset time period by calling a preset component in a system function file to obtain a plurality of screen-capture images, wherein one part of the plurality of screen-capture images are pages of the native APP project, another part are pages of the meta-universe project, and a further part are transition pages shown when the native APP project and the meta-universe project switch between each other;
a storage module, used for storing the plurality of screen-capture images in a cache;
a synthesis module, used for synthesizing the plurality of screen-capture images to generate a shared video, wherein the shared video is used for displaying the native content of the native APP project and the meta-universe content of the meta-universe project;
wherein the native APP project and the meta-universe project switch between each other through the following modules:
a determining module, used for determining that the current page of the APP is a native scene page of the native APP project;
a loading module, used for triggering an entry of the meta-universe project in the current native scene page, whereupon the native view controller component in the native APP project loads the native navigation controller component in a bottom-to-top presentation manner, a root controller component for the meta-universe scene being arranged in the native navigation controller component;
a native window filling module, used for the meta-universe project to obtain the native window of the native APP project and fill the native window with the content of the root controller component, so that the current page of the APP displays the meta-universe scene of the meta-universe project, completing the switch from the native APP project to the meta-universe project;
a meta-universe scene hiding module, used for triggering the root controller component of the meta-universe project to hide the meta-universe scene when needed, so that the APP returns from the meta-universe scene page of the meta-universe project to the native scene page of the native APP project.
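The switching modules of claim 13 might take the following shape in Swift, assuming an iOS/UIKit runtime; all type and method names are illustrative assumptions, not names from the patent.

import UIKit

// Illustrative sketch of the switching flow of claim 13.
final class MetaverseRootViewController: UIViewController {
    // Root controller component: hosts the meta-universe scene's view
    // (for example, an engine-rendered view added as a subview here).
}

final class MetaverseSwitcher {
    private let rootController = MetaverseRootViewController()

    // Determining + loading modules: from the current native scene page,
    // wrap the meta-universe root controller in a navigation controller and
    // present it bottom-to-top over the native view controller. Because the
    // meta-universe project shares the native window (claim 2), presenting
    // the root controller fills that window with the meta-universe scene.
    func enterMetaverse(from nativeViewController: UIViewController) {
        let navigation = UINavigationController(rootViewController: rootController)
        navigation.modalPresentationStyle = .fullScreen   // bottom-to-top cover
        nativeViewController.present(navigation, animated: true)
    }

    // Scene hiding module: hides the meta-universe scene so the APP returns
    // to the native scene page of the native APP project.
    func exitMetaverse() {
        rootController.dismiss(animated: true)
    }
}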
CN202310464724.0A 2023-04-26 2023-04-26 Method and device for sharing shooting of native content and meta-universe content and electronic equipment Pending CN116610396A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310464724.0A CN116610396A (en) 2023-04-26 2023-04-26 Method and device for sharing shooting of native content and meta-universe content and electronic equipment


Publications (1)

Publication Number Publication Date
CN116610396A 2023-08-18

Family

ID=87680877

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310464724.0A Pending CN116610396A (en) 2023-04-26 2023-04-26 Method and device for sharing shooting of native content and meta-universe content and electronic equipment

Country Status (1)

Country Link
CN (1) CN116610396A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination