CN116503529A - Rendering, 3D picture control method, electronic device, and computer-readable storage medium

Publication number: CN116503529A
Application number: CN202310401235.0A
Authority: CN (China)
Prior art keywords: engine, interface, rendering, display interface, communication event
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 袁方, 杨中雷, 杨舟, 李军舰, 陆鹏, 王志成, 沈彬, 张路路
Current Assignee: Alibaba China Co Ltd
Original Assignee: Alibaba China Co Ltd
Application filed by Alibaba China Co Ltd

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/005: General purpose rendering architectures
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The present application provides a rendering method, a 3D picture control method, an electronic device, and a computer-readable storage medium, which can improve 3D display speed and effect. The rendering method comprises the following steps: obtaining model information in a 3D engine; determining rendering information according to the model information; calling a graphics application program interface according to the rendering information; and rendering the rendering information into a drawing canvas of a display interface by using the graphics application program interface and the 3D engine, wherein the drawing canvas of the display interface is obtained by encapsulating the drawing canvas of the 3D engine in an embedded view component of the display interface in the same layer.

Description

Rendering, 3D picture control method, electronic device, and computer-readable storage medium
Technical Field
The present disclosure relates to the field of computer vision, and in particular, to rendering and 3D picture control methods, an electronic device, and a computer-readable storage medium.
Background
With the development of computer technology, computer vision displays have become increasingly diverse, moving from the display of 2D (two-dimensional) pictures to the multi-angle display of 3D (three-dimensional) models. Meanwhile, with the development of mobile terminal technology, various applications, diversified application functions, and applets emerge in an endless stream, and terminals can realize various display functions through applications. However, the rise in display dimension is likely to cause problems such as slow rendering rates and stuttering.
Disclosure of Invention
Embodiments of the present application provide a rendering method, a 3D picture control method, an electronic device, and a computer-readable storage medium, which can improve 3D display speed and effect.
In a first aspect, an embodiment of the present application provides a rendering method, including: obtaining model information in a 3D engine; determining rendering information according to the model information; calling a graphic application program interface according to the rendering information; rendering the rendering information into a drawing canvas of the display interface by using a graphic application program interface and a 3D engine; the drawing canvas of the display interface is obtained by encapsulating the drawing canvas of the 3D engine in an embedded view component of the display interface in the same layer.
In a second aspect, an embodiment of the present application provides a 3D picture control method, including: receiving gesture operation information through an embedded view component of a display interface; generating a component-to-engine communication event according to the gesture operation information; and invoking a JS application program interface to pass the component-to-engine communication event to the 3D engine, so that the 3D engine can alter, according to the component-to-engine communication event, the model information used to generate the 3D picture in the embedded view component.
In a third aspect, an embodiment of the present application provides a 3D picture control method, including: receiving model operation information through an operable button of a display interface; generating a display-interface-to-engine communication event according to the model operation information; and invoking a JS application program interface to pass the display-interface-to-engine communication event to the 3D engine, so that the 3D engine can alter, according to the display-interface-to-engine communication event, the model information used to generate the 3D picture in the embedded view component.
In a fourth aspect, embodiments of the present application provide an electronic device including a memory, a processor, and a computer program stored on the memory, the processor implementing the method of any one of the above when the computer program is executed.
In a fifth aspect, embodiments of the present application provide a computer-readable storage medium having a computer program stored therein, the computer program, when executed by a processor, implementing a method according to any one of the above.
Compared with the prior art, the application has the following advantages:
when model information of a 3D model in the 3D engine is rendered, a drawing canvas is encapsulated in an embedded view component of the display interface, so that the rendering of the drawing canvas can be executed in the same layer as the rendering of the display interface, and the rendering speed is improved.
The foregoing is merely an overview of the technical solutions of the present application. In order that the technical means of the present application may be understood more clearly and implemented according to the content of this specification, and in order to make the above and other objects, features, and advantages of the present application more apparent, specific embodiments of the present application are set forth below.
Drawings
In the drawings, the same reference numerals refer to the same or similar parts or elements throughout the several views unless otherwise specified. The figures are not necessarily drawn to scale. It is appreciated that these drawings depict only some embodiments according to the application and are not to be considered limiting of its scope.
Fig. 1 is an application scene schematic diagram of a rendering method provided in an embodiment of the present application;
Fig. 2 is a schematic flow chart of a rendering method according to an embodiment of the present application;
Fig. 3 is a schematic diagram of a rendering method implementation architecture according to an example of the present application;
Fig. 4 is a schematic diagram of rendering logic according to an example of the present application;
Fig. 5 is a schematic diagram of a rendering apparatus according to an example of the present application; and
Fig. 6 is a block diagram of an electronic device used to implement an embodiment of the present application.
Detailed Description
Hereinafter, only certain exemplary embodiments are briefly described. As will be recognized by those of skill in the pertinent art, the described embodiments may be modified in various different ways without departing from the spirit or scope of the present application. Accordingly, the drawings and description are to be regarded as illustrative in nature, and not as restrictive.
In order to facilitate understanding of the technical solutions of the embodiments of the present application, the following describes related technologies of the embodiments of the present application. The following related technologies may be optionally combined with the technical solutions of the embodiments of the present application, which all belong to the protection scope of the embodiments of the present application.
Fig. 1 is an application scene schematic diagram of a rendering method according to an embodiment of the present application. In the embodiment of the present application, the rendering method may be applied to a user terminal 101. The user terminal 101 may be a mobile phone, a desktop computer, a notebook computer, a tablet computer, a palmtop computer, an intelligent wearable device, etc. The user terminal 101 runs an operating system 102, for example, a Windows system, a Linux system, an iOS system, an Android system, etc. Based on the operating system 102, the user terminal 101 may run an application (APP) having a 3D rendering function, namely APP 103. APP 103 can open a web page or another form of display interface through an applet or a network tool, render it according to the rendering method provided by the embodiment of the present application, and generate a 3D rendered picture in the web page or other display interface opened by APP 103. The APP in this embodiment may be an APP dedicated to generating 3D pictures, or an APP carrying an applet that generates 3D pictures.
The rendering method provided by the embodiment of the application, as shown in fig. 2, includes steps S201-S204.
In step S201, model information in the 3D engine is acquired.
In this embodiment, the 3D engine is a collection of algorithm implementations that abstracts real-world objects into representations such as polygons or curves, performs the related calculations in a computer, and outputs the final image. Typically, a 3D engine serves as an underlying tool supporting high-level graphics software development.
In the embodiment of the present application, the 3D engine may provide the function of rendering 3D information into a 2D image for display.
In the present embodiment, the model information may be information of models used to create a 2D picture, such as information of object models, character models, and animal models. For example, model information may represent the dimensions, shadow effects, orientations, visual effects, etc. of the room models, building models, and tree models of a game scene.
The model information may also include information stored in the 3D engine that represents model properties, parameters, materials, etc., which can be used to render the prototypes (e.g., humans, animals, plants) to which the models correspond into a 2D picture.
In a specific implementation, the model information may include information of multiple three-dimensional object models used to construct a three-dimensional picture. For example, if a three-dimensional concert scene is to be generated, model information of a concert venue model, a stage model, a seat model, performer models, and musical instrument models can be obtained. For another example, if the APP is to provide a three-dimensional shopping guide service, model information of a shopping guide model, commodity models, and a store interior model can be obtained. For another example, during 3D interaction, model information of the objects and characters involved in the interaction may be obtained.
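As a concrete illustration, the model information described above might be organized along the lines of the following minimal C++ sketch; every field name here is a hypothetical stand-in for illustration, not taken from the application:

    #include <array>
    #include <cstdint>
    #include <string>
    #include <vector>

    // Hypothetical container for the per-model data a 3D engine might keep:
    // geometry plus the transform, material, and lighting attributes that
    // are later turned into rendering information.
    struct ModelInfo {
        std::string name;                           // e.g. "stage", "performer"
        std::vector<std::array<float, 3>> vertices; // triangle-mesh positions
        std::vector<uint32_t> indices;              // triangle index list
        std::array<float, 3> position{0, 0, 0};     // placement in the scene
        std::array<float, 3> orientation{0, 0, 0};  // Euler angles, radians
        float scale = 1.0f;                         // uniform size factor
        std::string material;                       // material/texture key
        bool castsShadow = true;                    // shadow-effect flag
    };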
In step S202, rendering information is determined from the model information.
In this embodiment, determining rendering information according to the model information may include: performing three-dimensional-to-two-dimensional conversion on the three-dimensional information of the 3D model to obtain two-dimensional information; and determining the rendering information from the two-dimensional information.
In another implementation, the rendering information may refer to information for generating a 2D picture calculated according to 3D parameters of the 3D model, such as may include a pixel value of each pixel in the display picture or data for calculating a pixel value of each pixel in the display picture.
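The three-dimensional-to-two-dimensional conversion mentioned above is, in its simplest form, a perspective projection followed by a viewport mapping. The C++ sketch below shows one standard way to do this, collapsing the usual model/view/projection matrices into a single camera-space projection; it is an assumed illustration, not the application's own formula:

    #include <array>
    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };

    // Project a camera-space point to pixel coordinates: perspective divide
    // into normalized device coordinates, then map [-1, 1] to the viewport.
    std::array<float, 2> projectToScreen(Vec3 p, float fovY, float aspect,
                                         int width, int height) {
        float f = 1.0f / std::tan(fovY * 0.5f);   // focal length from FOV
        float ndcX = (f / aspect) * p.x / -p.z;   // perspective divide
        float ndcY = f * p.y / -p.z;
        return { (ndcX * 0.5f + 0.5f) * width,
                 (1.0f - (ndcY * 0.5f + 0.5f)) * height };  // y grows downward
    }

    int main() {
        Vec3 corner{1.0f, 1.0f, -5.0f};           // a model vertex in camera space
        auto px = projectToScreen(corner, 1.047f, 16.0f / 9.0f, 1920, 1080);
        std::printf("pixel: (%.1f, %.1f)\n", px[0], px[1]);
    }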
In step S203, the graphics application program interface is called according to the rendering information.
In the embodiment of the present application, the JS application program interface may include an API (Application Programming Interface) based on the JavaScript language. JavaScript (abbreviated as "JS") is a lightweight, interpreted or just-in-time compiled programming language with first-class functions. Although best known as the scripting language for developing Web pages, it is also used in many non-browser environments. JavaScript is a prototype-based, multi-paradigm dynamic scripting language supporting object-oriented, imperative, declarative, and functional programming styles.
Invoking the graphics application program interface in accordance with the rendering information may include invoking the graphics application program interface if the rendering information is received.
In the embodiment of the present application, the invoked graphics application program interface may be a graphics application program interface capable of raising the utilization of the GPU (Graphics Processing Unit) to a set level.
In step S204, the rendering information is rendered into the drawing canvas of the display interface using the graphics application program interface and the 3D engine; the drawing canvas of the display interface is obtained by encapsulating the drawing canvas (Render Surface View) of the 3D engine in an embedded view component of the display interface in the same layer.
In this embodiment, the display interface may be a Web page (Web). The drawing canvas of the display interface may be a drawing canvas of a webpage. Rendering the rendering information into a drawing canvas of the webpage, so that the rendering information corresponding to the 3D model in the 3D engine can be displayed in the webpage, and the rendering of the 3D model in the webpage is realized. The drawing canvas may be a canvas provided by a container for encapsulating an embedded view component of a web page. In addition to web pages, the display interface may also be an applet page, a program display interface, and the like.
In this embodiment, the same-layer encapsulation may include embedding a native view in the display interface container, through which the drawing canvas of the 3D engine is obtained.
In this embodiment, in the case where the display interface is a web page, the web page may be a plain-text file containing HTML tags, i.e., a "page" in the World Wide Web, in HyperText Markup Language format (an application of the Standard Generalized Markup Language, with file extension .html or .htm).
In general, the client of an iOS, Android, or other system is provided with native components, which may be part of the terminal system or supplied as third-party libraries: for example, the UITextField (User Interface Text Field) and UITextView (User Interface Text View) components in the iOS system, or the EditText and ListView components in the Android system. Web pages likewise provide components, such as the H5 components: Web components written in the HTML5 (HyperText Markup Language 5) language, such as the <input/> component and the <textarea> (text area) component. Compared with Web page components such as H5 components, a native component can not only provide functions that an H5 component cannot realize, but also make the user experience smoother; and because the round trips between client-side code and the Web View are reduced, communication cost drops. In short, native components are full-featured, fast, and cheap.
Thus, for some components of a web page, such as H5 components, being mapped to native components when rendered by the client results in a smoother user experience. However, for clients that typically use a Web View to load HTML, native components sit outside the Web View rendering flow: if the Web View is viewed as a separate layer, the native components live on another, higher layer. Such a hierarchy presents several problems. Among native components and web page components, the native component always has the highest level: no matter what Z-index (stacking level) the other components in the web page set, they cannot cover the native component. Part of the CSS (Cascading Style Sheets) styles cannot be applied to native components. Native components cannot be used inside scrollable H5 components such as scroll-view, because even if the developer inserts the native component as a child node of the scrollable DOM region, it neither scrolls along with the region nor is clipped by it: it is inserted at a hierarchy level directly outside the Web View, with no association to the DOM. To solve these problems, same-layer rendering emerged. Same-layer rendering is a technology that allows native components and Web View DOM elements to be rendered mixed together, ensuring that native components and DOM elements feel consistent, with little difference in rendering hierarchy, scrolling behavior, touch events, and the like. Its essence is that native components and H5 components can be displayed on the same level, so they can be stacked arbitrarily and the level restriction is removed; a native component can then be used like an H5 component, including setting component styles, and so on.
In the case that the display interface is other interfaces, the same-layer encapsulation in the embodiment of the application may set the view components of different layers to be the same layer, so as to implement the same-layer rendering of the view components of different layers.
In the embodiment of the present application, when the model information of the 3D model in the 3D engine is rendered, the drawing canvas is encapsulated in an embedded view component of the web page or other form of display interface, so that the rendering of the drawing canvas can be executed on the same layer as the rendering of the web page or other display interface, improving the rendering speed.
In one embodiment, invoking the graphics application program interface in accordance with the rendering information includes: generating a communication event between the 3D engine and the display interface according to the rendering information; the graphical application program interface is invoked based on a communication event between the 3D engine and the display interface.
The communication event may be a messaging event between the 3D engine and the display interface.
In this embodiment, the graphics application program interface is invoked based on a communication event between the 3D engine and the display interface, so that in the presence of rendering information, rendering can be achieved through the graphics application program interface.
In one embodiment, the 3D engine connects to the graphics application program interface using JS binding, and calling the graphics application program interface based on a communication event between the 3D engine and the display interface comprises: encapsulating the communication event between the 3D engine and the display interface through a bridging interface of the 3D engine, the bridging interface of the 3D engine being an interface connected in a JS bridging manner; and calling the graphics application program interface based on the encapsulated system-native communication event by using the drawing canvas of the 3D engine.
In this embodiment, the 3D engine may be a JS-based engine. The primary purpose of JS binding is to open up the capabilities of the underlying framework, also known as a JS bridge, to a JS-based engine and JS communications; the binding determines the speed of communication between JS and the framework layer. Through JS binding, the interfaces of the terminal's bottom layer can be called, including the advanced graphics interfaces at the bottom of the terminal system, thereby achieving better rendering.
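The application does not show its binding code, but the dispatch path it describes (Web page, JS bridge, C++ engine) can be sketched as follows. The JsBridge registry here is a hypothetical stand-in for a real JS engine's embedding API; a real implementation would register each handler with the JS engine itself:

    #include <functional>
    #include <iostream>
    #include <map>
    #include <string>
    #include <utility>

    // Schematic JS bridge: a registry mapping JS-visible names to native
    // callbacks, illustrating the path Web page -> bridge -> C++ engine.
    class JsBridge {
    public:
        using Handler = std::function<void(const std::string& payload)>;
        void bind(const std::string& name, Handler h) {
            handlers_[name] = std::move(h);
        }
        // Called when the JS side invokes the bound interface.
        void invokeFromJs(const std::string& name, const std::string& payload) {
            if (auto it = handlers_.find(name); it != handlers_.end())
                it->second(payload);
        }
    private:
        std::map<std::string, Handler> handlers_;
    };

    int main() {
        JsBridge bridge;
        // Wrap a display-interface-to-engine communication event as a native call.
        bridge.bind("engine.render", [](const std::string& payload) {
            std::cout << "engine received render event: " << payload << "\n";
        });
        bridge.invokeFromJs("engine.render", "{\"model\":\"stage\"}");
    }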
In one embodiment, the rendering method further comprises: embedding an embedded view component into the display interface view component of the display interface to obtain the embedded view component of the display interface; and encapsulating the embedded view component of the display interface and the drawing canvas of the 3D engine in the same layer using an encapsulation container.
In the case that the display interface is a Web page, the embodiment of the present application embeds an embedded View component (Embed View) into the web page view component (Web View) of the web page to obtain the embedded view component of the web page, and encapsulates the embedded view component of the web page and the drawing canvas of the 3D engine in the same layer using the encapsulation container.
In this embodiment, in the case where the display interface is a web page, the embedded view component may be implemented based on specifications of a browser for rendering the web page.
Encapsulating the embedded view component of the web page and the drawing canvas of the 3D engine in the same layer using the encapsulation container may include embedding the drawing canvas of the 3D engine in the embedded view component of the web page. The embedded view component of the web page is obtained by embedding the embedded view component into the view component of the web page. That is, the drawing canvas of the 3D engine may be embedded into the embedded view component after the embedded view component has been embedded into the web page view component.
In this embodiment, the drawing canvas of the embedded view component and the drawing canvas of the 3D engine are encapsulated in the same layer, so that same-layer rendering of the picture can be realized.
In one embodiment, rendering information into a drawing canvas of a display interface using a graphics application program interface and a 3D engine includes: acquiring the state of the 3D engine by using the packaging container; and under the condition that the state of the 3D engine is the running state, calling a graphical program interface by utilizing a view component of the 3D engine, and rendering the rendering information into a drawing canvas of the display interface.
In other embodiments, the state of the 3D engine may also include engine initialization, etc. After the 3D engine completes initialization, the rendering operation can begin.
In this embodiment, when the 3D engine is in an operating state, the graphics program interface is called to implement rendering operation, so as to implement 2D rendering of the 3D model in the display interface.
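A minimal sketch of this state gate, assuming a simple three-state engine lifecycle; the state names and class names are illustrative, not taken from the application:

    #include <iostream>

    enum class EngineState { Uninitialized, Initializing, Running };

    class Engine3D {
    public:
        EngineState state() const { return state_; }
        void initialize() { state_ = EngineState::Running; }
        void renderFrame() { std::cout << "rendering into drawing canvas\n"; }
    private:
        EngineState state_ = EngineState::Uninitialized;
    };

    // The packaging container queries the engine state and only triggers
    // rendering once the engine reports the running state.
    class PackagingContainer {
    public:
        explicit PackagingContainer(Engine3D& e) : engine_(e) {}
        void onRenderRequested() {
            if (engine_.state() == EngineState::Running) engine_.renderFrame();
            else std::cout << "engine not ready, render deferred\n";
        }
    private:
        Engine3D& engine_;
    };

    int main() {
        Engine3D engine;
        PackagingContainer container(engine);
        container.onRenderRequested();  // deferred: engine still uninitialized
        engine.initialize();
        container.onRenderRequested();  // now renders
    }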
In one embodiment, invoking the graphical program interface to render the rendering information into the drawing canvas of the display interface includes: calling the graphical program interface and drawing the drawing canvas of the display interface based on the communication event encapsulated between the 3D engine and the display interface, the encapsulated communication event being generated based on the rendering information.
In this embodiment, the rendering canvas may be rendered based on the communication event, so as to implement the same-layer rendering of the 3D information on the 2D display interface.
In one embodiment, the rendering method further comprises: acquiring an embedded drawing canvas in the packaging container; binding the embedded drawing canvas to the 3D engine to obtain the drawing canvas of the 3D engine.
In this embodiment, the embedded canvas is bound to the 3D engine to form the drawing canvas of the 3D engine, so that the drawing canvas of the 3D engine and the display interface canvas can be rendered in the same layer by the embedded mode.
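The binding step could look like the following sketch, with a hypothetical surface handle standing in for the platform canvas obtained from the embedded view inside the packaging container:

    #include <iostream>

    // Hypothetical native surface handle obtained from the embedded view
    // of the display interface (e.g. a platform window/surface pointer).
    struct SurfaceHandle { void* nativeWindow = nullptr; };

    class Engine3D {
    public:
        // Binding hands the embedded drawing canvas to the engine; after
        // this the engine draws directly into the same layer as the
        // display interface, with no intermediate copy.
        void bindCanvas(SurfaceHandle s) {
            surface_ = s;
            std::cout << "engine bound to embedded canvas\n";
        }
        bool hasCanvas() const { return surface_.nativeWindow != nullptr; }
    private:
        SurfaceHandle surface_;
    };

    int main() {
        int fakeWindow = 0;                  // stand-in for a real window
        SurfaceHandle embedded{&fakeWindow}; // canvas taken from the container
        Engine3D engine;
        engine.bindCanvas(embedded);
        std::cout << "canvas bound: " << std::boolalpha
                  << engine.hasCanvas() << "\n";
    }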
In one embodiment, the graphical application program interface comprises a Vulkan interface or a Metal interface.
In this embodiment, the Metal interface is an advanced graphics API of the iOS platform, and the Vulkan interface is an advanced graphics API of the Android platform.
In other embodiments, where the terminal runs another system, the graphics application program interface is the graphics interface included in that system that is capable of fully invoking GPU functions. Whether the GPU is fully invoked may be determined by setting data such as a corresponding utilization threshold.
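One common way to organize such per-platform graphics interfaces is a backend abstraction selected at startup. The C++ sketch below assumes this pattern; the real Vulkan/Metal initialization and draw calls are deliberately elided and only marked in comments, since they require a live device:

    #include <iostream>
    #include <memory>

    // Backend-selection sketch: the engine picks the platform's low-level
    // graphics API (Vulkan on Android, Metal on iOS) behind one interface,
    // so the rest of the engine code is platform-independent.
    struct GraphicsBackend {
        virtual ~GraphicsBackend() = default;
        virtual void draw() = 0;
    };
    struct VulkanBackend : GraphicsBackend {
        void draw() override { std::cout << "Vulkan draw\n"; }  // vkCmdDraw... would go here
    };
    struct MetalBackend : GraphicsBackend {
        void draw() override { std::cout << "Metal draw\n"; }   // MTLRenderCommandEncoder here
    };

    std::unique_ptr<GraphicsBackend> selectBackend(bool isAndroid) {
        if (isAndroid) return std::make_unique<VulkanBackend>();
        return std::make_unique<MetalBackend>();
    }

    int main() {
        auto backend = selectBackend(/*isAndroid=*/true);
        backend->draw();  // same engine code path on both platforms
    }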
In the embodiment of the present application, the 3D same-layer rendering scheme resolves the performance bottleneck of 3D rendering and the service-release inflexibility of game engines, ensures the efficiency and flexibility of 3D service development and release, and allows the underlying advanced graphics APIs of different platforms to be called for rendering.
When the display interface is a web page, conventional 3D development generally takes one of two approaches. One is pure-web 3D engine development based on the JS language, where the 3D picture rendering process is implemented in the browser; the other is 3D engine development using a game engine, building a stand-alone APP. The JS-based pure-web 3D engine scheme has the advantages of flexibility and cross-platform support: it allows very flexible development, debugging, and release, and is well suited to rapid iteration and propagation of functions. Its disadvantage is that current browser rendering capability is limited and cannot use advanced graphics APIs such as Metal and Vulkan to fully exploit the GPU. Meanwhile, this scheme is limited by the running performance of JS and WebGL: overall performance is poor, and stuttering occurs in scenarios with large volumes of data transfer. In addition, the memory that the APP allows the web browsing process to consume is also tightly limited.
The 3D engine development scheme using a game engine has the advantages of complete functionality and excellent 3D rendering performance and effect. This approach can use advanced APIs such as Metal and Vulkan to fully exploit the capabilities of the device GPU. Its disadvantages are that the cost for an APP, such as one developed by an e-commerce provider, to integrate the engine is relatively high, and the memory resources occupied by the APP grow greatly. On the other hand, APP-related service development mostly depends on publishing new APP versions, which means that implementing 3D development inside an APP greatly hinders rapid trial-and-error and fast iteration of the APP's 3D-rendering-related services.
To solve the pain points of these two approaches, the embodiment of the present application combines the capability of the browser's embedded view component with a transformation of the engine architecture, and proposes a same-layer rendering scheme for the 3D engine and display interfaces such as web pages, giving 3D development both a dynamic development mode and native rendering capability, thereby balancing engine rendering power with the flexibility of service iteration.
In one embodiment, the 3D engine is a C++ language-based engine. In this embodiment, C++ (C plus plus) is a high-level computer programming language produced by extending and upgrading the C language. C++ can be used for C-style procedural programming, object-based programming characterized by abstract data types, and object-oriented programming characterized by inheritance and polymorphism.
The embodiment of the present application also provides a 3D picture control method, including: receiving gesture operation information through an embedded view component of a display interface; generating a system-native communication event according to the gesture operation information; and using the JS application interface to pass the system-native communication event to the 3D engine, so that the 3D engine can alter, according to the system-native communication event, the model information used to generate the 3D picture in the embedded view component.
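The three steps of this control method (receive a gesture, wrap it as a communication event, hand it to the engine) can be sketched as follows. The event fields, the 0.01 sensitivity factor, and the jsApiDispatch stand-in for the JS application interface are all assumptions for illustration:

    #include <iostream>
    #include <string>

    struct GestureInfo { float dx, dy; };      // e.g. a drag delta from the view

    struct CommunicationEvent {
        std::string type;                      // "rotate", "scale", ...
        float dx, dy;
    };

    class Engine3D {
    public:
        // Alter model information; a re-render of the embedded 3D picture
        // would follow from the changed model state.
        void onComponentEvent(const CommunicationEvent& e) {
            yaw_   += e.dx * 0.01f;
            pitch_ += e.dy * 0.01f;
            std::cout << "model rotated to yaw=" << yaw_
                      << " pitch=" << pitch_ << "\n";
        }
    private:
        float yaw_ = 0, pitch_ = 0;
    };

    // Stand-in for invoking the JS application program interface.
    void jsApiDispatch(Engine3D& engine, const CommunicationEvent& e) {
        engine.onComponentEvent(e);
    }

    int main() {
        Engine3D engine;
        GestureInfo g{12.0f, -4.0f};           // gesture from the embedded view
        jsApiDispatch(engine, {"rotate", g.dx, g.dy});
    }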
In one embodiment, the 3D picture control method further comprises: altering the model information by the 3D engine; and changing the 3D picture in the embedded view component according to the altered model information.
The embodiment of the present application also provides a 3D picture control method, including: receiving model operation information through an operable button of a display interface; generating a system-native communication event according to the model operation information; and using the JS application interface to pass the system-native communication event to the 3D engine, so that the 3D engine can alter, according to the system-native communication event, the model information used to generate the 3D picture in the embedded view component.
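The button-driven variant follows the same path with a display-interface-to-engine event; a brief sketch, again with hypothetical names and a model switch as the example operation:

    #include <iostream>
    #include <string>

    // An operable button on the display interface emits a
    // display-interface-to-engine communication event.
    struct InterfaceEvent { std::string action; std::string modelId; };

    class Engine3D {
    public:
        void onInterfaceEvent(const InterfaceEvent& e) {
            if (e.action == "switch_model") {
                current_ = e.modelId;  // changed model info drives a re-render
                std::cout << "now showing model: " << current_ << "\n";
            }
        }
    private:
        std::string current_ = "default";
    };

    int main() {
        Engine3D engine;
        // The JS application interface would carry this from the button handler.
        engine.onInterfaceEvent({"switch_model", "sofa_02"});
    }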
In one example of the present application, referring to fig. 3, the architecture for implementing 3D rendering at a terminal includes a service container side (browser side) 301 and an execution engine side (3D engine side) 302. The service container side includes the Web page, the Web same-layer components, and the native implementation. The Web page comprises a graph anchor module and other modules. The Web same-layer components include an embedded View component (Embed View), an AceNNR View component (which may correspond to a view component of the Web page), and AceBridge (an Ace bridging component, i.e., the bridging layer of the 3D engine); Ace may represent a unified component naming. The native implementation part includes a canvas listener (Surface Listener) and the RenderSurfaceView (the drawing canvas of the 3D engine). The execution engine side 302 includes a JS engine and the AceNNR engine. The JS engine further comprises an event communication module and a gesture interaction module, and the AceNNR engine further comprises a 3D model rendering module, a resource management module, an event management module, and a gesture processing module.
Based on the example architecture shown in fig. 3, the 3D same-layer scheme is implemented based on a WindVane container (equivalent to the encapsulation container of the foregoing embodiments, and also the container encapsulating the Web same-layer components in fig. 3). The principle is to open up, based on a same-layer communication mechanism, the communication between the Web View (the web page view component, corresponding to the Web same-layer components in fig. 3) and the 3D Engine (corresponding to the execution engine side 302 in fig. 3), thereby realizing communication between the 3D engine and the Web and model rendering by the engine on the Web, and in turn the rendering of and interaction with the 3D model.
In this example, still referring to fig. 3, on the service container side 301, the embedded view component of the Web page, that is, the Web same-layer components shown in fig. 3 (which may include multiple components on the same layer), is obtained mainly by encapsulating an embedded View component through JS encapsulation, embedding that embedded View component in the Web View, and embedding the RenderSurfaceView component of the native implementation in the embedded View component.
AceBridge is a bridge between native components and web page components, and can be implemented based on the JS bridging principle. The aforementioned native components may include native components in the applet used to open a web page in the APP, such as camera (camera control) and video (video control) components, to which the web components are ultimately mapped during rendering.
The embedded view component of the web page, the drawing canvas of the 3D engine, and AceBridge all use the WindVane container; AceBridge is responsible for providing a native implementation interface to the WindVane container. After the WindVane container obtains the corresponding 3D engine state (such as engine initialized), rendering-related operations can be executed, and rendering is performed on the drawing canvas of the 3D engine. The AceNNR View is the core control of the same layer: it is responsible for bridging the native components of the 3D engine and the Web View, and for rendering the information received by a native component of the 3D engine in the WindVane window. At initialization it acquires WindVane's embedded surface view (canvas view component) and text view (text view component), binds the embedded surface view with the 3D engine, and provides the canvas to the 3D engine, so that the 3D engine has a canvas, yielding the RenderSurfaceView (the drawing canvas of the 3D engine). The RenderSurfaceView is the true implementer of same-layer rendering: it is responsible for providing the canvas to the 3D engine, managing the lifecycle of the 3D engine, and performing engine operations (such as engine warm-up, startup, and event distribution) by way of JNI calls (Java Native Interface). A JNI call is the communication path from the native layer to the underlying C++ of the terminal system.
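The JNI calls mentioned here would have native entry points of roughly the following shape on the C++ side. The Java package, class, and method names below are hypothetical, chosen only to show the JNI naming convention for a RenderSurfaceView-style control; they are not taken from the application:

    #include <jni.h>
    #include <cstdio>

    // Engine operations reached from Java via JNI (warm-up, startup,
    // event distribution). Bodies are trivial placeholders.
    namespace engine {
    void start() { std::puts("engine started"); }
    void dispatchTouch(float x, float y) { std::printf("touch at %.1f,%.1f\n", x, y); }
    }  // namespace engine

    extern "C" {

    // Matches a Java method: native void nativeStartEngine()
    // declared in com.example.render.RenderSurfaceView (hypothetical).
    JNIEXPORT void JNICALL
    Java_com_example_render_RenderSurfaceView_nativeStartEngine(JNIEnv*, jobject) {
        engine::start();
    }

    // Matches: native void nativeDispatchTouch(float x, float y)
    JNIEXPORT void JNICALL
    Java_com_example_render_RenderSurfaceView_nativeDispatchTouch(JNIEnv*, jobject,
                                                                  jfloat x, jfloat y) {
        engine::dispatchTouch(x, y);
    }

    }  // extern "C"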
A gesture operation on the RenderSurfaceView is transmitted to the engine and, after being processed by the gesture processing module and the event management module, is synchronized to the other components on the Web page by way of JS calls, realizing control linkage. For example, after a gesture rotation instruction is received, the gesture rotation event can be picked up by the canvas listener, synchronized to the AceNNR engine and the JS engine through the RenderSurfaceView, parsed down to the 2D layer through the gesture processing module, event management module, gesture interaction module, and event communication module, and then thrown to the 2D service container side to be handled by the graph anchor module or other modules. Similarly, an operation on the Web page is also transferred to the 3D engine by way of a JS call (browser-to-native communication), and the corresponding linkage operation is executed on the model. The resource management module and the 3D model rendering module may be used for rendering operation management and resource configuration management in the 3D rendering process.
Inside the engine, the AceNNR Engine and the JS engine are bound by way of JS binding, and the Web page can call engine interfaces by way of JS calls, realizing operations on the model such as anchor point control and model switching. Gesture-related operations are transmitted to the AceNNR Engine through Android touch events; the engine picks up and processes the gesture event, re-renders the model, and realizes interaction between the model and the anchor point by way of JS calls, such as rotation, zoom-in, zoom-out, and other rich interaction effects on the model.
For rendering the 3D model, the engine is the brush, responsible for rendering the model, and the container provided on the terminal is the canvas, responsible for drawing and displaying the model. The canvas differs across terminal systems: a terminal running the iOS system renders and draws on a Metal layer canvas, while a terminal running the Android system renders and draws on a Surface (canvas). Because of iOS's unified Metal-layer synchronization mechanism, model rendering on iOS is relatively smooth; on Android, however, because of problems with the VSYNC (vertical synchronization) mechanism, serious rendering problems such as stuttering and frame dropping appear once 3D models come into use. To address this problem on Android, the embodiment of the present application adopts hybrid rendering based on the embedded View (Embed View). Accordingly, in the embodiment of the present application, a Surface can be created for the embedded View through the UC Web View, and the view is drawn by the engine directly onto the Surface provided by the Embed View, which not only removes a layer of redundant drawing and memory allocation but also solves the problems of hierarchy interaction and stuttering. The overall drawing logic is shown in fig. 4: same-layer drawing is realized in the drawing window of the Web container by the 3D engine calling the graphics API. Here, the 3D engine corresponds to the AceNNR engine, and the drawing window corresponds to the UC web page view component, the embedded view component, and the drawing canvas of the 3D engine. The AceNNR engine binds the Vulkan graphics API of the Android system and calls it to draw in the drawing canvas, realizing same-layer rendering of the embedded View component embedded in the UC web page view component (UC Web View) and the drawing canvas.
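On Android, drawing directly onto the Surface exposed by the embedded view amounts to acquiring its native window and writing frames into it. The NDK sketch below assumes an ANativeWindow already obtained on the JNI side (via ANativeWindow_fromSurface); a CPU clear stands in for the engine's Vulkan draw, to show that there is no extra copy or intermediate layer:

    #include <android/native_window.h>
    #include <cstdint>
    #include <cstring>

    // Draw one frame directly into the Surface provided by the embedded
    // view. The real engine would hand this window to Vulkan instead.
    bool drawOntoEmbeddedSurface(ANativeWindow* window) {
        // Keep the buffer at the window's own size, in 32-bit RGBA.
        ANativeWindow_setBuffersGeometry(window, 0, 0, WINDOW_FORMAT_RGBA_8888);
        ANativeWindow_Buffer buffer;
        if (ANativeWindow_lock(window, &buffer, nullptr) != 0) return false;
        // Fill the frame; buffer.stride is in pixels, 4 bytes per RGBA pixel.
        for (int32_t y = 0; y < buffer.height; ++y) {
            std::memset(static_cast<uint8_t*>(buffer.bits) +
                            static_cast<size_t>(y) * buffer.stride * 4,
                        0xFF, static_cast<size_t>(buffer.width) * 4);
        }
        return ANativeWindow_unlockAndPost(window) == 0;  // present to the view
    }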
In the embodiment of the present application, the advanced graphics APIs of the different platforms, such as Metal and Vulkan, are used directly, fully exploiting the GPU's capability: approximately 1,000,000 triangles are supported with more than 500 draw calls per frame; multiple real-time light sources (3D rendering simulates real-life imaging principles), Light Probe Groups, and Reflection Probes combined with Planar Reflections/Screen Space Reflections provide real-time GI (Global Illumination), while lightmaps provide static GI; and complex 3D scenes that simultaneously include a particle system, multi-segment skeletal animation playback, and a variety of user-defined materials still run smoothly, an experience that 3D scenes of the same complexity in applets and games can hardly match. A triangle is the basic unit of a 3D model surface in the embodiment of the present application. A draw call is an operation in which the CPU (Central Processing Unit) calls the graphics programming interface to command the GPU to render.
In this example, the 3D engine may be a C++ language-based engine. Combining the same-layer rendering scheme, the JS binding scheme, C++, and the 3D engine yields a 3D same-layer rendering scheme in which the 3D logic developed in JS runs in the JS engine, 3D rendering calls the C++ engine, and the 2D interface runs in the browser, ensuring both the flexibility of service development and release and extremely high rendering performance.
Based on the hybrid rendering scheme using the embedded View, the embodiment of the present application solves the problems of unstable frame rate and frame loss caused by the refresh frequency of Android terminal devices being out of sync with the GPU rendering frequency.
The embodiment of the application further provides a rendering device, as shown in fig. 5, including: a model information obtaining module 501, configured to obtain model information in a 3D engine; a rendering information determining module 502, configured to determine rendering information according to the model information; a calling module 503, configured to call a graphics application program interface according to the rendering information; a rendering module 504 for rendering the rendering information into a drawing canvas of the display interface using the graphics application program interface and the 3D engine; the drawing canvas of the display interface is obtained by encapsulating the drawing canvas of the 3D engine in an embedded view component of the display interface in the same layer.
In one embodiment of the present application, the calling module includes: the communication event generating unit is used for generating a communication event between the 3D engine and the display interface according to the rendering information; and the calling interface unit is used for calling the graphical application program interface based on the communication event between the 3D engine and the display interface.
In one embodiment of the present application, the 3D engine connects to the graphics application program interface using JS bridging, and the call interface unit is further configured to: encapsulate the communication event between the 3D engine and the display interface through the bridging interface of the 3D engine, the bridging interface of the 3D engine being an interface connected in a JS bridging manner; and call the graphics application program interface based on the encapsulated system-native communication event by using the drawing canvas of the 3D engine.
In one embodiment of the present application, the rendering apparatus further includes: the embedded view component obtaining module is used for embedding the embedded view component into the display interface view component of the display interface to obtain the embedded view component of the display interface; and the packaging module is used for packaging the embedded view component of the display interface and the drawing canvas of the 3D engine in the same layer by using the packaging container.
In one embodiment of the present application, the rendering module further includes: a state acquisition unit for acquiring a state of the 3D engine using the packaging container; and the drawing unit is used for calling the graphical program interface by utilizing the view component of the 3D engine under the condition that the state of the 3D engine is the running state, drawing a drawing canvas of the display interface based on the communication event packaged between the 3D engine and the display interface, and the packaged communication event is generated based on the rendering information.
In one embodiment of the present application, the rendering apparatus further includes: the canvas acquisition module is used for acquiring the embedded drawing canvas in the packaging container; and the binding module is used for binding the embedded drawing canvas to the 3D engine to obtain the drawing canvas of the 3D engine.
In one embodiment of the present application, the graphical application program interface comprises a Vulkan interface or a Metal interface.
In one embodiment of the present application, the 3D engine is a c++ language based engine.
The embodiment of the present application also provides a 3D picture control device, including: a gesture operation information module, configured to receive gesture operation information through an embedded view component of a display interface; a first communication event module, configured to generate a component-to-engine communication event according to the gesture operation information; and a first calling module, configured to invoke the JS application program interface and pass the component-to-engine communication event to the 3D engine, so that the 3D engine can alter, according to the component-to-engine communication event, the model information used to generate the 3D picture in the embedded view component.
In one embodiment of the application, the 3D engine is bound to the JS engine, so that the JS application program interface of the 3D engine can be called in a JS call manner to transfer the communication event from the component to the engine to the 3D engine.
The embodiment of the present application also provides a 3D picture control device, including: a button operation information module, configured to receive model operation information through an operable button of a display interface; a second communication event module, configured to generate a display-interface-to-engine communication event according to the model operation information; and a second calling module, configured to invoke the JS application program interface and pass the display-interface-to-engine communication event to the 3D engine, so that the 3D engine can alter, according to the display-interface-to-engine communication event, the model information used to generate the 3D picture in the embedded view component.
In one embodiment of the application, the 3D engine is bound to the JS engine, so that the JS application program interface of the 3D engine can be called in a JS call manner to transfer the communication event from the display interface to the engine to the 3D engine.
Fig. 6 is a block diagram of an electronic device used to implement an embodiment of the present application. As shown in fig. 6, the electronic device includes: a memory 610 and a processor 620, the memory 610 storing a computer program executable on the processor 620. The processor 620, when executing the computer program, implements the methods of the above-described embodiments. The number of memory 610 and processors 620 may be one or more.
The electronic device further includes:
the communication interface 630 is used for communicating with external devices for data interactive transmission.
If the memory 610, the processor 620, and the communication interface 630 are implemented independently, the memory 610, the processor 620, and the communication interface 630 may be connected to each other and communicate with each other through a bus. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be classified as an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in fig. 6, but this does not mean there is only one bus or one type of bus.
Alternatively, in a specific implementation, if the memory 610, the processor 620, and the communication interface 630 are integrated on a chip, the memory 610, the processor 620, and the communication interface 630 may communicate with each other through internal interfaces.
The present embodiments provide a computer-readable storage medium storing a computer program that, when executed by a processor, implements the methods provided in the embodiments of the present application.
The embodiment of the application also provides a chip, which comprises a processor and is used for calling the instructions stored in the memory from the memory and running the instructions stored in the memory, so that the communication device provided with the chip executes the method provided by the embodiment of the application.
The embodiment of the application also provides a chip, which comprises: the input interface, the output interface, the processor and the memory are connected through an internal connection path, the processor is used for executing codes in the memory, and when the codes are executed, the processor is used for executing the method provided by the application embodiment.
It should be appreciated that the processor may be a central processing unit (Central Processing Unit, CPU), but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or any conventional processor or the like. It is noted that the processor may be a processor supporting an advanced reduced instruction set machine (Advanced RISC Machines, ARM) architecture.
Further alternatively, the memory may include read-only memory and random access memory. The memory may be volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may include Read-Only Memory (ROM), Programmable ROM (PROM), Erasable Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory, among others. Volatile memory can include Random Access Memory (RAM), which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, for example Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate Synchronous DRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and Direct Rambus RAM (DR RAM).
In the above embodiments, it may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions in accordance with the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. Computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present application, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
Any process or method described in flow charts or otherwise herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present application includes additional implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order, depending on the functions involved.
Logic and/or steps described in the flowcharts or otherwise described herein may, for example, be considered an ordered listing of executable instructions for implementing logical functions, and can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, a processor-containing system, or another system that can fetch the instructions from the instruction execution system, apparatus, or device and execute them.
It is to be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. All or part of the steps of the methods of the embodiments described above may be performed by a program that, when executed, comprises one or a combination of the steps of the method embodiments, instructs the associated hardware to perform the method.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing module, or each unit may exist alone physically, or two or more units may be integrated in one module. The integrated modules may be implemented in hardware or in software functional modules. The integrated modules described above, if implemented in the form of software functional modules and sold or used as a stand-alone product, may also be stored in a computer-readable storage medium. The storage medium may be a read-only memory, a magnetic or optical disk, or the like.
The foregoing is merely exemplary embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think of various changes or substitutions within the technical scope of the present application, which should be covered in the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (14)

1. A rendering method, comprising:
obtaining model information in a 3D engine;
determining rendering information according to the model information;
calling a graphic application program interface according to the rendering information;
rendering the rendering information into a drawing canvas of a display interface by using the graphic application program interface and the 3D engine; and the drawing canvas of the display interface is obtained by encapsulating the drawing canvas of the 3D engine in an embedded view component of the display interface in the same layer.
2. The method of claim 1, wherein invoking a graphics application program interface in accordance with the rendering information comprises:
generating a communication event between the 3D engine and the display interface according to the rendering information;
invoking a graphical application program interface based on a communication event between the 3D engine and the display interface.
3. The method of claim 2, wherein the 3D engine connects to the graphical application program interface using JS binding, and wherein invoking the graphical application program interface based on a communication event between the 3D engine and the display interface comprises:
packaging system native communication events between the 3D engine and the display interface through a bridging interface of the 3D engine; the bridging interface of the 3D engine is an interface for connection based on a JS bridging mode;
and calling the graphic application program interface based on the encapsulated system native communication event by using a drawing canvas of the 3D engine.
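A minimal sketch of the bridged invocation of claims 2-3, assuming a JS-bound engine; NativeEvent, EngineBridge, and invokeGraphicsApi are hypothetical names, and WebGL again stands in for the graphics interface:

    // Hypothetical: a system-native event is wrapped by the engine's JS bridge
    // and then drives a graphics call on the engine's drawing canvas.
    type NativeEvent = { type: string; payload?: unknown };
    type BridgedEvent = { source: "3d-engine"; event: NativeEvent };

    class EngineBridge {
      // Encapsulate a system-native engine/display-interface communication event.
      wrap(event: NativeEvent): BridgedEvent {
        return { source: "3d-engine", event };
      }
    }

    function invokeGraphicsApi(canvas: HTMLCanvasElement, bridged: BridgedEvent): void {
      const gl = canvas.getContext("webgl");  // WebGL stands in for Vulkan/Metal
      if (!gl) return;
      // ... translate bridged.event into concrete graphics API calls ...
    }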
4. The method according to claim 1, wherein the method further comprises:
embedding the embedded view component into a display interface view component of the display interface to obtain the embedded view component of the display interface;
and encapsulating, in the same layer, the embedded view component of the display interface and the drawing canvas of the 3D engine by using a packaging container.
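The packaging step of claim 4 might be sketched as follows; PackagingContainer and the appendChild-based same-layer placement are illustrative assumptions, not the application's actual mechanism:

    // Hypothetical: a packaging container wraps the embedded view component
    // together with the engine's drawing canvas in one native layer.
    class PackagingContainer {
      constructor(
        public embeddedView: HTMLElement,        // embedded view component
        public engineCanvas: HTMLCanvasElement,  // drawing canvas of the 3D engine
      ) {
        // Same-layer encapsulation: the canvas lives inside the embedded view
        // rather than in a separately stacked layer.
        this.embeddedView.appendChild(this.engineCanvas);
      }
    }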
5. The method of claim 4, wherein rendering the rendering information into a drawing canvas of a display interface using the graphic application program interface and the 3D engine comprises:
acquiring the state of the 3D engine by using the packaging container;
and, when the state of the 3D engine is a running state, invoking the graphic application program interface by using a view component of the 3D engine, and drawing the drawing canvas of the display interface based on an encapsulated communication event between the 3D engine and the display interface, wherein the encapsulated communication event is generated based on the rendering information.
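A hedged sketch of the state-gated drawing of claim 5, under the assumption that the packaging container can report the engine's state; all names are invented:

    // Hypothetical: drawing only proceeds while the engine is running.
    type EngineState = "initializing" | "running" | "stopped";

    interface StatefulContainer {
      engineState(): EngineState;
      engineCanvas: HTMLCanvasElement;
    }

    function drawOnEvent(container: StatefulContainer, encapsulatedEvent: unknown): void {
      if (container.engineState() !== "running") return;  // gate on engine state
      const gl = container.engineCanvas.getContext("webgl");
      if (!gl) return;
      // ... draw the display interface's canvas from the encapsulated event,
      //     which was generated from the rendering information ...
    }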
6. The method according to claim 1, wherein the method further comprises:
acquiring an embedded drawing canvas in the packaging container;
binding the embedded drawing canvas to the 3D engine to obtain the drawing canvas of the 3D engine.
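Claim 6's binding step could look like the following sketch; bindCanvas is an assumed engine method, not an API named by the application:

    // Hypothetical: fetch the drawing canvas embedded in the container and
    // hand it to the engine as its render target.
    interface Engine3D { bindCanvas(c: HTMLCanvasElement): void; }

    function bindEmbeddedCanvas(
      container: { engineCanvas: HTMLCanvasElement },
      engine: Engine3D,
    ): void {
      engine.bindCanvas(container.engineCanvas);  // now the engine's drawing canvas
    }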
7. The method of any of claims 1-6, wherein the graphic application program interface comprises a Vulkan interface or a Metal interface.
8. The method of any of claims 1-6, wherein the 3D engine is a C++ language-based engine.
9. A 3D picture control method, comprising:
receiving gesture operation information through an embedded view component of a display interface;
generating a communication event from the component to the engine according to the gesture operation information;
and invoking a JS application program interface to transfer the component-to-engine communication event to a 3D engine, so that the 3D engine can change, according to the component-to-engine communication event, the model information for generating a 3D picture in the embedded view component.
10. The method of claim 9, wherein the 3D engine is bound to a JS engine such that a JS application program interface of the 3D engine can be invoked by way of a JS call to pass the component-to-engine communication event to the 3D engine.
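For claims 9-10, a gesture-to-engine event flow might be sketched as below; postEvent and wireGestures are hypothetical names for the JS-bound entry points:

    // Hypothetical: a gesture on the embedded view component becomes a
    // component-to-engine communication event posted over an assumed JS
    // interface exposed by the (C++) engine through JS binding.
    type ComponentToEngineEvent = { kind: "rotate" | "pan" | "zoom"; dx: number; dy: number };

    interface EngineJsApi {
      postEvent(e: ComponentToEngineEvent): void;  // assumed JS-bound entry point
    }

    function wireGestures(view: HTMLElement, engine: EngineJsApi): void {
      view.addEventListener("pointermove", (ev: PointerEvent) => {
        // The engine then changes the model information that generates the 3D picture.
        engine.postEvent({ kind: "rotate", dx: ev.movementX, dy: ev.movementY });
      });
    }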
11. A 3D picture control method, comprising:
receiving model operation information through an operable button of a display interface;
generating a communication event from the display interface to the engine according to the model operation information;
and invoking a JS application program interface to transmit the display-interface-to-engine communication event to a 3D engine, so that the 3D engine can change, according to the display-interface-to-engine communication event, the model information for generating a 3D picture in the embedded view component.
12. The method of claim 11, wherein the 3D engine is bound to a JS engine such that a JS application program interface of the 3D engine can be invoked by way of a JS call to pass the display-interface-to-engine communication event to the 3D engine.
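Claims 11-12 mirror the gesture case with an operable button as the event source; again, postEvent and wireButton are invented for illustration:

    // Hypothetical: an operable button on the display interface yields a
    // display-interface-to-engine communication event.
    interface EngineEventApi { postEvent(e: { kind: string }): void; }

    function wireButton(button: HTMLButtonElement, engine: EngineEventApi): void {
      button.addEventListener("click", () => {
        engine.postEvent({ kind: "reset-model" });  // engine updates model info
      });
    }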
13. An electronic device comprising a memory, a processor and a computer program stored on the memory, the processor implementing the method of any one of claims 1-12 when the computer program is executed.
14. A computer readable storage medium having stored therein a computer program which, when executed by a processor, implements the method of any of claims 1-12.
CN202310401235.0A 2023-04-13 2023-04-13 Rendering, 3D picture control method, electronic device, and computer-readable storage medium Pending CN116503529A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310401235.0A CN116503529A (en) 2023-04-13 2023-04-13 Rendering, 3D picture control method, electronic device, and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310401235.0A CN116503529A (en) 2023-04-13 2023-04-13 Rendering, 3D picture control method, electronic device, and computer-readable storage medium

Publications (1)

Publication Number Publication Date
CN116503529A true CN116503529A (en) 2023-07-28

Family

ID=87315955

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310401235.0A Pending CN116503529A (en) 2023-04-13 2023-04-13 Rendering, 3D picture control method, electronic device, and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN116503529A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117093793A (en) * 2023-08-25 2023-11-21 江西格如灵科技股份有限公司 Webpage 3D scene two-dimensional display method and system

Similar Documents

Publication Publication Date Title
WO2022116759A1 (en) Image rendering method and apparatus, and computer device and storage medium
US10026147B1 (en) Graphics scenegraph rendering for web applications using native code modules
CN103713891B Method and apparatus for graphics rendering on a mobile device
US8675000B2 (en) Command buffers for web-based graphics rendering
US8797339B2 (en) Hardware-accelerated graphics for web applications using native code modules
JP5166552B2 (en) Multi-buffer support for off-screen surfaces in graphics processing systems
CN108876887B (en) Rendering method and device
CN108959392B (en) Method, device and equipment for displaying rich text on 3D model
MXPA06012368A (en) Integration of three dimensional scene hierarchy into two dimensional compositing system.
CN106991096B (en) Dynamic page rendering method and device
CN113411664B (en) Video processing method and device based on sub-application and computer equipment
US20190080017A1 (en) Method, system, and device that invokes a web engine
WO2022095526A1 (en) Graphics engine and graphics processing method applicable to player
WO2023197762A1 (en) Image rendering method and apparatus, electronic device, computer-readable storage medium, and computer program product
CN116503529A (en) Rendering, 3D picture control method, electronic device, and computer-readable storage medium
CN108364324B (en) Image data processing method and device and electronic terminal
CN114564630A Method, system, and medium for Web3D visualization of graph data
CN111414150B (en) Game engine rendering method and device, electronic equipment and computer storage medium
CN113744377A (en) Animation processing system, method, device, equipment and medium
CN113419806B (en) Image processing method, device, computer equipment and storage medium
CN115586893A (en) Cross-platform software development system and method
CN114821001B (en) AR-based interaction method and device and electronic equipment
CN117014689A (en) Bullet screen display method and device and electronic equipment
CN117708454A (en) Webpage content processing method, device, equipment, storage medium and program product
CN114444003A (en) Webpage content processing method, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination