CN112235634A - Object rendering method and device, electronic equipment and storage medium - Google Patents
- Publication number
- CN112235634A (application CN202011112234.7A)
- Authority
- CN
- China
- Prior art keywords
- rendering
- rendered
- orientation information
- visual
- control point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/44012—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving rendering scenes according to scene graphs, e.g. MPEG-4 scene graphs
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/8146—Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Graphics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application provides an object rendering method and apparatus, an electronic device, and a storage medium, relating to the field of Internet technologies. Visual orientation information of a virtual visual point is acquired, rendering orientation information of an object to be rendered is obtained based on the visual orientation information, and the object to be rendered is rendered according to the rendering orientation information. In this way, the object to be rendered is always displayed facing the visual point, no abnormal display content appears in the rendered picture, and the rendering effect is improved.
Description
Technical Field
The present application relates to the field of internet technologies, and in particular, to an object rendering method and apparatus, an electronic device, and a storage medium.
Background
In a scene such as live webcasting, special effects may be rendered into a live image; for example, rendering objects such as bubbles and balloons are added to the live image to enrich the live content and improve the live broadcast effect.
However, in the process of rendering the rendering object to the live broadcast image, when the view angle of the terminal device changes, some abnormal display contents may appear on the rendering object, resulting in a poor rendering effect.
Disclosure of Invention
An object rendering method, an object rendering device, an electronic device, and a storage medium are provided, which enable an object to be rendered to be displayed toward a visual point all the time without generating abnormal display content in a rendered image, thereby improving a rendering effect.
To achieve the above objective, the technical solutions adopted in the present application are as follows:
in a first aspect, the present application provides an object rendering method, including:
acquiring visual orientation information of a virtual visual point;
based on the visual orientation information, obtaining rendering orientation information of an object to be rendered; wherein the rendering orientation information is used for indicating orientation information of the object to be rendered when facing the virtual visual point;
and rendering the object to be rendered according to the rendering orientation information.
In a second aspect, the present application provides an object rendering apparatus, the apparatus comprising:
the processing module is used for acquiring visual orientation information of a virtual visual point;
the processing module is further used for obtaining rendering orientation information of the object to be rendered based on the visual orientation information; wherein the rendering orientation information is used for indicating orientation information of the object to be rendered when facing the virtual visual point;
and the rendering module is used for rendering the object to be rendered according to the rendering orientation information.
In a third aspect, the present application provides an electronic device comprising a memory for storing one or more programs; a processor; the one or more programs, when executed by the processor, implement the object rendering method described above.
In a fourth aspect, the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the object rendering method described above.
According to the object rendering method and apparatus, the electronic device, and the storage medium provided by the present application, visual orientation information of a virtual visual point is acquired, rendering orientation information of an object to be rendered is obtained based on the visual orientation information, and the object to be rendered is rendered according to the rendering orientation information. In this way, the object to be rendered is always displayed facing the visual point, no abnormal display content is generated in the rendered picture, and the rendering effect is improved.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
To explain the technical solutions of the present application more clearly, the drawings needed for the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting its scope; those skilled in the art can derive other related drawings from these drawings without inventive effort.
Fig. 1 shows an interactive scene schematic diagram of a live broadcast system provided by the present application.
Fig. 2 shows a schematic diagram of an object rendering effect.
Fig. 3 shows a schematic block diagram of an electronic device provided in the present application.
Fig. 4 shows an exemplary flowchart of an object rendering method provided in the present application.
Fig. 5 is a schematic diagram illustrating relative orientation information of an object to be rendered and a virtual visual point.
Fig. 6 shows an effect schematic diagram after rendering according to the object rendering method provided by the present application.
Fig. 7 shows an exemplary structural block diagram of an object rendering apparatus provided in the present application.
In the figure: 100-an electronic device; 101-a memory; 102-a processor; 103-a communication interface; 300-an object rendering device; 301-a processing module; 302-rendering module.
Detailed Description
To make the purpose, technical solutions and advantages of the present application clearer, the technical solutions in the present application will be clearly and completely described below with reference to the accompanying drawings in some embodiments of the present application, and it is obvious that the described embodiments are some, but not all embodiments of the present application. The components of the present application, as generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, as presented in the figures, is not intended to limit the scope of the claimed application, but merely represents selected embodiments of the application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
It is noted that, herein, relational terms such as first and second, and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
Referring to fig. 1, fig. 1 is a schematic view illustrating an interactive scene of a live broadcast system provided in the present application, which in some embodiments may be a live streaming platform such as an Internet live broadcast platform. The live broadcast system can comprise a server, a live broadcast initiating terminal, and a live broadcast receiving terminal; the server can communicate with the live broadcast receiving terminal and the live broadcast initiating terminal respectively and can provide live broadcast services for both. For example, the anchor may provide a live stream online in real time to viewers through the live broadcast initiating terminal and transmit the live stream to the server, and the live broadcast receiving terminal may pull the live stream from the server for online viewing or playback.
In some implementations, the live receiver and the live initiator may be used interchangeably. For example, an anchor at the live initiator may use the live initiator to provide live video services to viewers, or act as a viewer to watch live video provided by other anchors. For another example, a viewer at the live receiver may also use the live receiver to watch live video provided by an anchor they follow, or serve as an anchor to provide live video services to other viewers.
In some embodiments, the live receiver and the live initiator may include, but are not limited to, a mobile device, a tablet computer, a laptop computer, or any combination of two or more thereof. In some embodiments, the mobile device may include, but is not limited to, a wearable device, a smart mobile device, an augmented reality device, and the like, or any combination thereof. In some embodiments, the smart mobile device may include, but is not limited to, a smartphone, a Personal Digital Assistant (PDA), a gaming device, a navigation device, or a point of sale (POS) device, or the like, or any combination thereof.
In addition, in some possible embodiments, there may be zero, one, or more live receivers and live initiators accessing the server, only one of each being shown in fig. 1. The live broadcast receiving end and the live broadcast initiating end may be installed with Internet products for providing Internet live broadcast services; for example, the Internet products may be applications (APPs), Web pages, applets, and the like used on a computer or smartphone and related to Internet live broadcast services.
In some embodiments, the server may be a single physical server or a server group consisting of a plurality of physical servers for performing different data processing functions. The set of servers can be centralized or distributed (e.g., the servers can be a distributed system). In some possible embodiments, such as where the server employs a single physical server, the physical server may be assigned different logical server components based on different live service functions.
It will be appreciated that the live system shown in fig. 1 is only one possible example, and that in other possible embodiments of the present application, the live system may also include only some of the components shown in fig. 1 or may also include other components.
In a scene such as the webcast scene shown in fig. 1, rendering objects such as bubbles and balloons may be rendered into a live image based on technologies such as AR (Augmented Reality), using tools such as ARKit or ARCore, so as to enrich the live content of the live image and improve the live broadcast effect.
A live broadcast image generally shows a scene in 3D space. As shown in fig. 2, taking the special effect of rendering bubbles in a live broadcast image as an example, the bubbles can be rendered into the live image as 2D patches, which reduces the amount of data processed when rendering the bubbles and saves processing resources.
However, in some possible scenarios, the viewing angle of the terminal device may change while the terminal device displays the live broadcast image; for example, when the terminal device is a mobile phone operated by a user, the shooting angle of the mobile phone changes as the user moves it while holding it. In this process, the rendered 2D patch does not change with the viewing angle of the terminal device, so the rendered bubble may show abnormal display content such as sharp edges in the live broadcast image, resulting in a poor rendering effect.
Therefore, in view of these defects in the above rendering scheme, a possible implementation provided by the present application is as follows: visual orientation information of a virtual visual point is acquired, rendering orientation information of the object to be rendered is obtained based on the visual orientation information, and the object to be rendered is rendered according to the rendering orientation information. In this way, the object to be rendered is always displayed facing the visual point, no abnormal display content is generated in the rendered picture, and the rendering effect is improved.
Referring to fig. 3, fig. 3 shows a schematic block diagram of an electronic device 100 provided in the present application, and in some embodiments, the electronic device 100 may include a memory 101, a processor 102, and a communication interface 103, and the memory 101, the processor 102, and the communication interface 103 are electrically connected to each other directly or indirectly to implement data transmission or interaction. For example, the components may be electrically connected to each other via one or more communication buses or signal lines.
The memory 101 may be configured to store software programs and modules, such as program instructions/modules corresponding to the object rendering apparatus provided in the present application, and the processor 102 executes the software programs and modules stored in the memory 101 to execute various functional applications and data processing, thereby executing the steps of the object rendering method provided in the present application. The communication interface 103 may be used for communicating signaling or data with other node devices.
The memory 101 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), or the like.
The processor 102 may be an integrated circuit chip having signal processing capabilities. The processor 102 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
It will be appreciated that the configuration shown in fig. 3 is merely illustrative and that electronic device 100 may include more or fewer components than shown in fig. 3 or have a different configuration than shown in fig. 3. The components shown in fig. 3 may be implemented in hardware, software, or a combination thereof.
The following schematically illustrates an object rendering method provided by the present application, with the electronic device 100 shown in fig. 3 as an exemplary execution subject.
It can be understood that, in some possible scenarios of the present application, the electronic device 100 may serve as the live broadcast initiating terminal in fig. 1: the anchor on the live broadcast initiating side interacts with the live broadcast initiating terminal, the live broadcast initiating terminal executes the object rendering method provided by the present application and sends the rendered live broadcast code stream to the server, and the live broadcast receiving terminal can then pull the live broadcast code stream from the server and, after decoding, play the rendered live broadcast image.
Or, in some other possible scenarios in the present application, the electronic device 100 may also be used as the server in fig. 1, where the server executes the object rendering method provided in the present application by receiving a rendering request sent by a live broadcast initiating terminal or a live broadcast receiving terminal, so as to obtain a rendered live broadcast image; and sending the rendered live broadcast image to at least one of the live broadcast receiving end and the live broadcast initiating end, so that the rendered live broadcast image can be displayed by at least one of the live broadcast receiving end and the live broadcast initiating end.
Still alternatively, in some other possible scenarios in the present application, the electronic device 100 may also be used as a live broadcast receiving end in fig. 1, where the live broadcast receiving end pulls a live broadcast code stream from a server and decodes the live broadcast code stream to obtain a live broadcast image, and in a process of displaying the live broadcast image on a display interface of the live broadcast receiving end, the object rendering method provided in the present application may be executed by receiving instruction information of a user, so as to render the live broadcast image, and the rendered live broadcast image is displayed on the display interface of the live broadcast receiving end.
Referring to fig. 4, fig. 4 shows an exemplary flowchart of an object rendering method provided in the present application, and as a possible implementation, the object rendering method may include the following steps:
Step 201, acquiring visual orientation information of a virtual visual point.
Step 203, obtaining rendering orientation information of an object to be rendered based on the visual orientation information.
Step 205, rendering the object to be rendered according to the rendering orientation information.
In some embodiments, in the process of executing the object rendering method provided by the present application, the electronic device may use a tool such as ARKit or ARCore to create a virtual visual point in advance, for example, by creating a virtual 3D camera in a 3D space and using the position of the virtual 3D camera in the 3D space as the virtual visual point; the virtual visual point may be used to simulate the viewing angle of a user. Taking a mobile phone operated by a user as the above electronic device as an example, in some possible implementations the position coordinates of the mobile phone in the 3D space may be used as the position coordinates of the virtual visual point in the 3D space.
In a scene in which, for example, the position coordinates of the mobile phone in the 3D space are used as the position coordinates of the virtual visual point, the electronic device may obtain the visual orientation information of the virtual visual point based on the posture information of the mobile phone. For example, if the current posture of the mobile phone is an included angle of 30° with the ground, the visual orientation information of the virtual visual point may be an angle of 30° to the ground in the 3D space.
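For illustration only, the mapping from device posture to visual orientation information might look like the following minimal sketch (Python with NumPy; the function name, axis convention, and dictionary layout are assumptions, not taken from the application):

```python
import numpy as np

def visual_orientation_from_pitch(pitch_deg, position=(0.0, 0.0, 0.0)):
    """Map a device pitch angle (the included angle with the ground, in degrees)
    to a visual-orientation record for the virtual visual point.
    Assumed convention: X points along the horizontal viewing direction, Z points up."""
    pitch = np.radians(pitch_deg)
    # Viewing direction tilted up from the horizontal plane by `pitch`.
    view_dir = np.array([np.cos(pitch), 0.0, np.sin(pitch)])
    return {
        "position": np.asarray(position, dtype=float),
        "view_dir": view_dir / np.linalg.norm(view_dir),
    }

# Example from the text: a phone held at 30 degrees to the ground.
print(visual_orientation_from_pitch(30.0))
```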
In addition, the electronic device may also create a virtual particle emitter in the 3D space, which may serve as the emission source of the object to be rendered. For example, in the rendered bubble special-effect scene shown in fig. 1, the particle emitter may be used to generate the bubble effect, and each bubble may move along a preset trajectory after being emitted from the particle emitter.
It will be appreciated that, in a scene such as that described above, to reduce the amount of data processed in rendering the bubble effect, the bubble effect may be rendered in the form of a 2D patch; accordingly, when the electronic device creates the virtual particle emitter, a 2D particle emitter can be created, so that a bubble special effect in 2D form is generated.
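Purely as an illustrative sketch (Python; the class name, parameters, and straight-line trajectory are hypothetical and not specified by the application), such a 2D particle emitter could be modelled as follows:

```python
import numpy as np

class BubbleEmitter2D:
    """Hypothetical 2D particle emitter: spawns bubble particles at a fixed
    source position and moves them along a preset rising trajectory."""

    def __init__(self, source_xy=(0.0, 0.0), rise_speed=0.5):
        self.source = np.asarray(source_xy, dtype=float)
        self.rise_speed = rise_speed
        self.bubbles = []  # each entry is a 2D bubble position

    def emit(self):
        # Spawn a new bubble at the emission source.
        self.bubbles.append(self.source.copy())

    def step(self, dt):
        # Preset trajectory: straight upward drift; a real effect could add
        # horizontal wobble or per-bubble randomness.
        for bubble in self.bubbles:
            bubble[1] += self.rise_speed * dt

emitter = BubbleEmitter2D()
emitter.emit()
emitter.step(1.0 / 30.0)
print(emitter.bubbles)
```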
In some embodiments, after the electronic device generates a rendering object such as a bubble using, for example, the 2D particle emitter described above, rendering orientation information of the rendering object may be obtained based on the acquired visual orientation information; the rendering orientation information may be used to indicate the orientation of the rendering object when it faces the virtual visual point.
Next, the electronic device may render the object to be rendered according to the obtained rendering orientation information of the object to be rendered; for example, in the live broadcast scene illustrated in fig. 1, the object to be rendered is rendered into the live broadcast image according to the rendering orientation information, so that when the rendered live broadcast image is displayed, the rendering object is always displayed facing the visual point and abnormal display content such as the sharp edges shown in fig. 2 does not appear, thereby improving the rendering effect.
In some embodiments, for the virtual visual point created in advance, the electronic device may create a visual coordinate system based on the virtual visual point: the origin of the visual coordinate system may be the position coordinates of the virtual visual point in the 3D space; the positive X-axis may point along the viewing direction of the virtual visual point; the Y-axis may point in a direction perpendicular to the X-axis and parallel to the horizontal plane; and the Z-axis may be perpendicular to the plane formed by the X-axis and the Y-axis. In addition, the electronic device may calculate a view matrix, which may be used for coordinate transformation between the visual coordinate system of the virtual visual point and the world coordinate system of the 3D space.
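One conventional way to build such a view matrix is sketched below (Python with NumPy, using the axis convention just described; the actual construction used by the application is not specified): the visual-frame axes are stacked as rotation rows and the eye position is rotated and negated to form the translation.

```python
import numpy as np

def view_matrix(eye, view_dir, world_up=(0.0, 0.0, 1.0)):
    """4x4 view matrix for a virtual visual point at `eye` looking along
    `view_dir` (the X axis of the visual coordinate system). It maps
    world-space coordinates into the visual coordinate system."""
    x = np.asarray(view_dir, dtype=float)
    x = x / np.linalg.norm(x)
    y = np.cross(np.asarray(world_up, dtype=float), x)  # horizontal, perpendicular to X
    y = y / np.linalg.norm(y)
    z = np.cross(x, y)                                   # perpendicular to the X-Y plane
    rotation = np.stack([x, y, z])                       # rows are the visual-frame axes
    view = np.eye(4)
    view[:3, :3] = rotation
    view[:3, 3] = -rotation @ np.asarray(eye, dtype=float)
    return view

# Example: visual point 1.6 m above the ground, tilted 30 degrees upward.
pitch = np.radians(30.0)
print(view_matrix(eye=(0.0, 0.0, 1.6), view_dir=(np.cos(pitch), 0.0, np.sin(pitch))))
```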
Based on this, in executing step 203, since the viewpoint described by the view matrix and the model matrix occupy relative positions in the world coordinate system, the electronic device may calculate the visual model matrix corresponding to the visual orientation information by, for example, taking the inverse of the view matrix. The virtual visual point may be used to simulate the viewing angle of a user, and the parameters in the visual model matrix may be used to indicate the visual orientation information of the virtual visual point, such as its orientation and position in the 3D coordinate system.
Next, according to the obtained visual model matrix, the electronic device may calculate the rendering orientation information of the object to be rendered when it faces the virtual visual point.
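A minimal sketch of this step follows (Python with NumPy; it assumes a view matrix of the form built in the previous sketch, which is an assumption rather than a convention stated in the application): the inverse of the view matrix serves as the visual model matrix, and its rotation block is copied into the model matrix of the 2D patch so that the patch always faces the virtual visual point.

```python
import numpy as np

def billboard_model_matrix(view, patch_position):
    """Model matrix for a 2D patch that always faces the virtual visual point.

    `view` is a 4x4 view matrix; its inverse is the visual model matrix,
    whose rotation block encodes the orientation of the virtual visual point."""
    visual_model = np.linalg.inv(view)
    model = np.eye(4)
    model[:3, :3] = visual_model[:3, :3]          # reuse the camera orientation
    model[:3, 3] = np.asarray(patch_position, dtype=float)
    return model

# Example with an identity view matrix (visual point at the origin, axes aligned
# with the world): the patch keeps the world axes and is placed at its position.
print(billboard_model_matrix(np.eye(4), patch_position=(2.0, 0.5, 1.0)))
```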
It should be noted that, with reference to the example of fig. 5, in some possible scenarios the rendering orientation information of the object to be rendered may have the same parameters as the visual orientation information of the virtual visual point; for example, when the object to be rendered is located on the X-axis of the visual coordinate system in the 3D space, the included angle between the object to be rendered and the ground is equal to the included angle between the virtual visual point and the ground. Of course, in other scenes, when the object to be rendered is not located on the X-axis of the visual coordinate system, the included angle between the object to be rendered and the ground is not equal to the included angle between the virtual visual point and the ground.
In addition, in a scene in which bubbles are taken as the objects to be rendered, for example, the electronic device may render the bubble effect in the form of a 2D patch in order to reduce the amount of data processed in rendering. For the exemplary rendering object of a bubble, the display effect of the rendering object in the live image is a roughly circular area; based on this characteristic, the electronic device can determine a rendering control point in the bubble and use it to control the display area of the bubble in the live image. For example, the electronic device may select the center point of the bubble as the rendering control point according to a configuration parameter obtained in advance; alternatively, the electronic device may select a screen vertex of the bubble in the live image as the rendering control point.
Based on this, in some embodiments, in the process of executing step 205, the electronic device may first calculate the control point coordinates of the object to be rendered according to the rendering orientation information, where the control point coordinates may be used to indicate the coordinates of the rendering control point of the object to be rendered, such as the coordinates of the center point of the bubble or the coordinates of the vertex of the bubble.
Next, the electronic device may render a rendering display patch of the object to be rendered at the rendering control point based on the rendering orientation information; for example, as shown in fig. 6, the electronic device may render a 2D patch of the bubble at the rendering control point to present the bubble effect in the live image.
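As an illustration only (Python with NumPy; the vertex layout and the half-size parameter are assumptions), the four vertices of the rendering display patch can be expanded around the control point along the in-plane axes taken from the visual model matrix, so that the resulting quad faces the virtual visual point:

```python
import numpy as np

def patch_vertices(control_point, visual_model, half_size=0.1):
    """Square 2D patch (e.g. a bubble) centred on the rendering control point,
    expanded along the in-plane axes of the visual coordinate system (its Y
    and Z axes under the convention used above) so it faces the visual point."""
    right = visual_model[:3, 1]   # Y axis: horizontal, perpendicular to the view direction
    up = visual_model[:3, 2]      # Z axis: perpendicular to the X-Y plane
    centre = np.asarray(control_point, dtype=float)
    return np.array([
        centre - right * half_size - up * half_size,
        centre + right * half_size - up * half_size,
        centre + right * half_size + up * half_size,
        centre - right * half_size + up * half_size,
    ])

# Example: a patch of side 0.2 centred at (2, 0.5, 1) with an identity visual model matrix.
print(patch_vertices((2.0, 0.5, 1.0), np.eye(4)))
```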
In addition, in the live webcasting scene illustrated in fig. 1, for example, in order to enable the user to view the rendered bubbles normally, without seeing the abnormal display content illustrated in fig. 2, when watching the live broadcast from different viewing angles, the electronic device may further render the rendering display patch of the object to be rendered in at least one visual expansion direction corresponding to the virtual visual point, based on the rendering control point, in the process of executing the object rendering method provided by the present application.
That is to say, in some scenes, when executing the object rendering method provided by the present application, the electronic device may render a rendering display patch of the object to be rendered not only in the direction in which the rendering control point faces the virtual visual point, but also in at least one visual expansion direction corresponding to the virtual visual point. For example, based on the constructed visual coordinate system, the electronic device may render one rendering display patch along each of the positive and negative directions of the X-axis and the positive and negative directions of the Y-axis of the visual coordinate system. In this way, even if the user views the rendered object from a different perspective, abnormal display content such as the sharp edges illustrated in fig. 2 is not generated.
In addition, in some embodiments, when rendering the rendering display patch of the object to be rendered in at least one visual expansion direction corresponding to the virtual visual point, the electronic device may also perform rendering according to a preset spatial distance parameter. For example, in the live broadcast scene illustrated in fig. 1, the electronic device may scale the display interface parameters of the terminal device (for example, the width and the height of the display interface) by a preset ratio to form the spatial distance parameter, and then render the rendering display patch of the object to be rendered in at least one visual expansion direction corresponding to the virtual visual point according to the spatial distance parameter.
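A hedged sketch of this combination is given below (Python with NumPy; the scaling ratio, the function name, and the choice of offsetting whole patch copies are assumptions for illustration): the display-interface width and height are scaled to spatial distances, and additional patch positions are offset along the positive and negative X and Y axes of the visual coordinate system.

```python
import numpy as np

def expansion_offsets(control_point, visual_model, screen_w, screen_h, ratio=0.001):
    """Scale the display-interface size (in pixels) by a preset ratio to get
    spatial distances, then offset extra patch copies along the positive and
    negative X and Y axes of the visual coordinate system."""
    dx = screen_w * ratio          # spatial distance along the X expansion direction
    dy = screen_h * ratio          # spatial distance along the Y expansion direction
    x_axis = visual_model[:3, 0]   # X axis of the visual coordinate system (world space)
    y_axis = visual_model[:3, 1]   # Y axis of the visual coordinate system (world space)
    centre = np.asarray(control_point, dtype=float)
    return [centre + x_axis * dx, centre - x_axis * dx,
            centre + y_axis * dy, centre - y_axis * dy]

# Example with a 1080x1920 display interface and an identity visual model matrix.
print(expansion_offsets((2.0, 0.5, 1.0), np.eye(4), screen_w=1080, screen_h=1920))
```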
It can be understood that, in some embodiments, the preset spatial distance parameter may be calculated by the electronic device from a display interface parameter it obtains, received by the electronic device from another device, or entered by a user; it suffices that the electronic device obtains a spatial distance parameter before executing the object rendering method provided by the present application, so that it can perform the step of rendering the rendering display patch of the object to be rendered in at least one visual expansion direction corresponding to the virtual visual point.
Accordingly, referring to fig. 7 based on the same inventive concept as the object rendering method provided in the present application, fig. 7 illustrates an exemplary structural block diagram of an object rendering apparatus 300 provided in the present application, where the object rendering apparatus 300 may include a processing module 301 and a rendering module 302.
The processing module 301 is configured to obtain visual orientation information of the virtual visual point;
the processing module 301 is further configured to obtain rendering orientation information of the object to be rendered based on the visual orientation information; the rendering orientation information is used for indicating the orientation information of the object to be rendered when the object to be rendered faces the virtual visual point;
and the rendering module 302 is configured to render the object to be rendered according to the rendering orientation information.
Optionally, as a possible implementation manner, when rendering the object to be rendered according to the rendering orientation information, the rendering module 302 is specifically configured to:
calculating the control point coordinates of the object to be rendered according to the rendering orientation information; the control point coordinates are used for indicating the coordinates of the rendering control points of the object to be rendered;
and rendering the rendering display patch of the object to be rendered at the rendering control point based on the rendering orientation information.
Optionally, as a possible implementation, the rendering module 302 is further configured to:
and rendering the rendering display patch of the object to be rendered in at least one visual expansion direction corresponding to the virtual visual point based on the rendering control point.
Optionally, as a possible implementation manner, when rendering the rendering display patch of the object to be rendered in at least one visual expansion direction corresponding to the virtual visual point, the rendering module 302 is specifically configured to:
rendering the rendering display patch of the object to be rendered in at least one visual expansion direction corresponding to the virtual visual point according to a preset spatial distance parameter.
Optionally, as a possible implementation manner, when obtaining rendering orientation information of an object to be rendered based on the visual orientation information, the processing module 301 is specifically configured to:
calculating a visual model matrix corresponding to the visual orientation information;
and calculating, according to the visual model matrix, rendering orientation information of the object to be rendered when the object to be rendered faces the virtual visual point.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to some embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in some embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to perform all or part of the steps of the method according to some embodiments of the present application. And the aforementioned storage medium includes: u disk, removable hard disk, read only memory, random access memory, magnetic or optical disk, etc. for storing program codes.
The above description is only a few examples of the present application and is not intended to limit the present application, and those skilled in the art will appreciate that various modifications and variations can be made in the present application. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.
It will be evident to those skilled in the art that the present application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.
Claims (10)
1. A method of object rendering, the method comprising:
acquiring visual orientation information of a virtual visual point;
based on the visual orientation information, obtaining rendering orientation information of an object to be rendered; wherein the rendering orientation information is used for indicating orientation information of the object to be rendered when facing the virtual visual point;
and rendering the object to be rendered according to the rendering orientation information.
2. The method of claim 1, wherein the rendering the object to be rendered according to the rendering orientation information comprises:
calculating the control point coordinates of the object to be rendered according to the rendering orientation information; the control point coordinates are used for indicating the coordinates of the rendering control point of the object to be rendered;
rendering a rendering display patch of the object to be rendered at the rendering control point based on the rendering orientation information.
3. The method of claim 2, wherein the method further comprises:
rendering a rendering display patch of the object to be rendered in at least one visual expansion direction corresponding to the virtual visual point based on the rendering control point.
4. The method of claim 3, wherein rendering the rendering display patch of the object to be rendered in the at least one visual expansion direction corresponding to the virtual visual point comprises:
rendering the rendering display patch of the object to be rendered in at least one visual expansion direction corresponding to the virtual visual point according to a preset spatial distance parameter.
5. The method of claim 1, wherein the deriving rendering orientation information of an object to be rendered based on the visual orientation information comprises:
calculating a visual model matrix corresponding to the visual orientation information;
and calculating, according to the visual model matrix, rendering orientation information of the object to be rendered when the object to be rendered faces the virtual visual point.
6. An object rendering apparatus, characterized in that the apparatus comprises:
the processing module is used for acquiring visual orientation information of a virtual visual point;
the processing module is further used for obtaining rendering orientation information of the object to be rendered based on the visual orientation information; wherein the rendering orientation information is used for indicating orientation information of the object to be rendered when facing the virtual visual point;
and the rendering module is used for rendering the object to be rendered according to the rendering orientation information.
7. The apparatus of claim 6, wherein the rendering module, when rendering the object to be rendered according to the rendering orientation information, is specifically configured to:
calculating the control point coordinates of the object to be rendered according to the rendering orientation information; the control point coordinates are used for indicating the coordinates of the rendering control point of the object to be rendered;
rendering a rendering display patch of the object to be rendered at the rendering control point based on the rendering orientation information.
8. The apparatus of claim 7, wherein the rendering module is further configured to:
rendering a rendering display patch of the object to be rendered in at least one visual expansion direction corresponding to the virtual visual point based on the rendering control point.
9. An electronic device, comprising:
a memory for storing one or more programs;
a processor;
the one or more programs, when executed by the processor, implement the method of any of claims 1-5.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011112234.7A CN112235634A (en) | 2020-10-16 | 2020-10-16 | Object rendering method and device, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112235634A true CN112235634A (en) | 2021-01-15 |
Family
ID=74118875
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011112234.7A Pending CN112235634A (en) | 2020-10-16 | 2020-10-16 | Object rendering method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112235634A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105825544A (en) * | 2015-11-25 | 2016-08-03 | 维沃移动通信有限公司 | Image processing method and mobile terminal |
CN108053464A (en) * | 2017-12-05 | 2018-05-18 | 北京像素软件科技股份有限公司 | Particle effect processing method and processing device |
CN109448099A (en) * | 2018-09-21 | 2019-03-08 | 腾讯科技(深圳)有限公司 | Rendering method, device, storage medium and the electronic device of picture |
WO2020037923A1 (en) * | 2018-08-24 | 2020-02-27 | 北京微播视界科技有限公司 | Image synthesis method and apparatus |
CN111583373A (en) * | 2020-05-11 | 2020-08-25 | 上海米哈游天命科技有限公司 | Model rendering method, device, equipment and storage medium |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | PB01 | Publication |
 | SE01 | Entry into force of request for substantive examination |
 | RJ01 | Rejection of invention patent application after publication | Application publication date: 20210115