CN108154548B - Image rendering method and device - Google Patents

Image rendering method and device

Info

Publication number
CN108154548B
Authority
CN
China
Prior art keywords
target
angle range
visual angle
dimensional projection
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711278635.8A
Other languages
Chinese (zh)
Other versions
CN108154548A (en)
Inventor
武云龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Pixel Software Technology Co Ltd
Original Assignee
Beijing Pixel Software Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Pixel Software Technology Co Ltd filed Critical Beijing Pixel Software Technology Co Ltd
Priority to CN201711278635.8A priority Critical patent/CN108154548B/en
Publication of CN108154548A publication Critical patent/CN108154548A/en
Application granted granted Critical
Publication of CN108154548B publication Critical patent/CN108154548B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/10Geometric effects
    • G06T15/20Perspective computation
    • G06T15/205Image-based rendering

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Generation (AREA)

Abstract

The invention provides an image rendering method and device, and relates to the technical field of image processing. In the scheme, for each target object, the preset observation visual angle range in which a target viewpoint lies relative to the target object is obtained and taken as the target visual angle range; a two-dimensional projection image corresponding to the target visual angle range is selected from a plurality of pre-stored two-dimensional projection images and used as the target map; and rendering is performed according to the target map. Because the electronic device pre-stores the two-dimensional projection images corresponding to the target object in different preset observation visual angle ranges and then selects the corresponding image according to the target visual angle range for rendering, the rendered image has a sense of depth layering.

Description

Image rendering method and device
Technical Field
The invention relates to the technical field of image processing, in particular to an image rendering method and device.
Background
With the rapid development of rendering technology, the requirements on rendering quality keep increasing. In a virtual scene, there are various solutions for simulating the effect of real trees. For example, in the prior art, trees can be simulated with speedtree three-dimensional trees, or directly with billboards or X-shaped patches. The speedtree solution displays a three-dimensional tree at a short distance and a two-dimensional tree at a long distance, while a billboard or X-shaped patch can be used when the camera looks at the tree horizontally. However, using a large number of speedtree models consumes a large amount of memory and is inefficient at run time, and a billboard or X-shaped tree lacks a sense of layering at long distances and cannot display the tree when the camera renders it from directly above.
Disclosure of Invention
In order to overcome the above defects in the prior art, the present invention provides an image rendering method and an image rendering device that obtain the corresponding two-dimensional projection image according to the observation visual angle range of the target object, so as to reduce the consumption of system resources and improve the rendering effect of the target object, thereby solving the above problems.
In order to achieve the above object, the technical solutions provided by the preferred embodiments of the present invention are as follows:
the invention provides an image rendering method, which is applied to electronic equipment, wherein two-dimensional projection images corresponding to target objects in different preset observation visual angle ranges are prestored in the electronic equipment; the method comprises the following steps:
for each target object, obtaining a preset observation visual angle range in which a target viewpoint is located relative to the target object, and taking the preset observation visual angle range as a target visual angle range;
selecting a two-dimensional projection image corresponding to the target visual angle range from a plurality of pre-stored two-dimensional projection images, and using the two-dimensional projection image as a target map;
and rendering according to the target map.
Optionally, the step of obtaining a preset viewing angle range in which the target viewpoint is located relative to the target object and taking the preset viewing angle range as the target viewing angle range includes:
acquiring a distance value L between the target viewpoint and a reference plane and a distance value M between the target viewpoint and the target object, and calculating, according to the distance value L and the distance value M, an included angle β between the reference plane and the line connecting the target viewpoint and the target object;
and identifying the observation visual angle range corresponding to the included angle β, and taking that observation visual angle range as the target visual angle range.
Optionally, the step of calculating an included angle β between a connection line between the target viewpoint and the target object and the reference plane according to the distance value L and the distance value M includes:
calculating the value of the included angle β according to the formula β = arcsin(L / M).
Optionally, there are a plurality of target objects, and the step of rendering according to the target map includes:
for each target map, calculating a visible area of the target map relative to the target viewpoint according to the coordinate data of the target map;
rendering a visible region of the target map.
Optionally, the step of selecting a two-dimensional projection image corresponding to the target viewing angle range from a plurality of pre-stored two-dimensional projection images and using the two-dimensional projection image as the target map includes:
and selecting the target map from a plurality of preset view angle ranges according to the target view angle range, wherein the preset observation view angle range corresponding to the target map is the same as the target view angle range.
Optionally, before the step of obtaining the preset viewing angle range in which the target viewpoint is located relative to the target object, the method includes:
and acquiring two-dimensional projection images of the three-dimensional image in each preset observation visual angle range according to the three-dimensional image of the target object, and storing each two-dimensional projection image in the electronic equipment, wherein one two-dimensional projection image is acquired in one preset observation visual angle range.
The invention also provides an image rendering device, which is applied to electronic equipment, wherein two-dimensional projection images corresponding to target objects in different preset observation visual angle ranges are stored in the electronic equipment in advance; the image rendering apparatus includes:
the acquisition unit is used for acquiring a preset observation visual angle range of a target viewpoint relative to the target object and taking the preset observation visual angle range as a target visual angle range for each target object;
the selecting unit is used for selecting a two-dimensional projection image corresponding to the target visual angle range from a plurality of pre-stored two-dimensional projection images and using the two-dimensional projection image as a target map;
and the rendering unit is used for rendering according to the target map.
Optionally, the number of the target objects is multiple, and the rendering unit is further configured to:
for each target map, calculating a visible area of the target map relative to the target viewpoint according to the coordinate data of the target map;
rendering a visible region of the target map.
Optionally, the selecting unit is further configured to: select, according to the target view angle range, the target map from the plurality of pre-stored two-dimensional projection images, wherein the preset observation view angle range corresponding to the target map is the same as the target view angle range.
Optionally, the image rendering apparatus further includes an image acquisition storage unit; before the obtaining unit obtains the preset observation visual angle range in which the target viewpoint is located relative to the target object, the image acquisition storage unit is configured to:
and acquiring two-dimensional projection images of the three-dimensional image in each preset observation visual angle range according to the three-dimensional image of the target object, and storing each two-dimensional projection image in the electronic equipment, wherein one two-dimensional projection image is acquired in one preset observation visual angle range.
Compared with the prior art, the image rendering method and device provided by the invention at least have the following beneficial effects: in the scheme, for each target object, the preset observation visual angle range in which a target viewpoint lies relative to the target object is obtained and taken as the target visual angle range; a two-dimensional projection image corresponding to the target visual angle range is selected from a plurality of pre-stored two-dimensional projection images and used as the target map; and rendering is performed according to the target map. Because the electronic device pre-stores the two-dimensional projection images corresponding to the target object in different preset observation visual angle ranges and then selects the corresponding image according to the target visual angle range for rendering, the rendered image has a sense of depth layering.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings show only some embodiments of the invention and are therefore not to be considered limiting of its scope; those skilled in the art can derive other related drawings from them without inventive effort.
Fig. 1 is a block diagram of an electronic device according to a preferred embodiment of the invention.
Fig. 2 is a flowchart illustrating an image rendering method according to a preferred embodiment of the invention.
FIG. 3 is a schematic diagram of a two-dimensional projection image provided in accordance with a preferred embodiment of the present invention.
FIG. 4 is a schematic diagram of a top-view rendered image according to a preferred embodiment of the present invention.
Fig. 5 is a second flowchart illustrating an image rendering method according to a preferred embodiment of the invention.
Fig. 6 is a block diagram of an image rendering apparatus according to a preferred embodiment of the present invention.
Reference numerals: 10 - electronic device; 11 - processing unit; 12 - storage unit; 100 - image rendering device; 110 - obtaining unit; 120 - selecting unit; 130 - rendering unit; 140 - image acquisition storage unit.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. It is to be understood that the described embodiments are merely a few embodiments of the invention, and not all embodiments. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Furthermore, the terms "first," "second," and the like are used merely to distinguish one description from another, and are not to be construed as indicating or implying relative importance.
In the prior art, trees are typically simulated using speedtree three-dimensional trees, or directly using billboards or X-shaped patches. The speedtree solution displays a three-dimensional tree at a short distance and a two-dimensional tree at a long distance, while a billboard or X-shaped patch can be used when the camera looks at the tree horizontally. Using a large number of speedtree models generates a large amount of memory consumption and occupies considerable computing resources, so the rendering efficiency at run time is low. A billboard or X-shaped tree lacks a sense of layering at a long distance, and the tree cannot be displayed when the camera renders it vertically from above.
How to provide an image rendering method that reduces the consumption of system resources, improves rendering efficiency and improves the sense of layering of the rendered picture is a problem to be solved by those skilled in the art. Understandably, the consumption of system resources includes memory usage and processor usage. In view of the above problems, the inventors have conducted extensive research and provide the following embodiments to solve them. The embodiments of the present invention are described in detail below with reference to the accompanying drawings; the embodiments and their features can be combined with each other without conflict.
Fig. 1 is a block diagram of an electronic device 10 according to a preferred embodiment of the invention. The electronic device 10 provided by the present invention may be used to perform the image rendering method described below. The electronic device 10 may be, but is not limited to, a smart phone, a Personal Computer (PC), a tablet PC, a Personal Digital Assistant (PDA), a Mobile Internet Device (MID), and the like.
In this embodiment, the electronic device 10 includes a processing unit 11, a storage unit 12, and an image rendering apparatus 100, and the processing unit 11, the storage unit 12, and the image rendering apparatus 100 are electrically connected directly or indirectly to implement data transmission or interaction. For example, the components may be electrically connected to each other via one or more communication buses or signal lines.
The processing unit 11 may be a processor. For example, the processor may be a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or execute the methods, steps and logic blocks disclosed in the embodiments of the present invention.
The storage unit 12 may be, but is not limited to, a random access memory, a read-only memory, a programmable read-only memory, an erasable programmable read-only memory, an electrically erasable programmable read-only memory, and the like. In this embodiment, the storage unit 12 may be configured to store the two-dimensional projection images of the target object corresponding to the different preset observation visual angle ranges. Of course, the storage unit 12 may also be used to store a program, which the processing unit 11 executes after receiving an execution instruction.
Further, the image rendering apparatus 100 includes at least one software function module that may be stored in the storage unit 12 in the form of software or firmware, or solidified in the operating system (OS) of the electronic device 10. The processing unit 11 is configured to execute executable modules stored in the storage unit 12, such as the software function modules and computer programs included in the image rendering device 100.
It can be understood that the configuration shown in Fig. 1 is only schematic, and the electronic device 10 may include more or fewer components than shown in Fig. 1. The components shown in Fig. 1 may be implemented in hardware, software, or a combination thereof.
Fig. 2 is a flowchart illustrating an image rendering method according to a preferred embodiment of the invention. The image rendering method provided by the present invention can be applied to the electronic device 10, which executes the method to reduce the consumption of system resources, improve the rendering efficiency of images, and give the rendered picture a sense of layering. Optionally, the method may be used to render vegetation in a game or animation scene.
Each step of the image rendering method shown in Fig. 2 is described in detail below. In this embodiment, the image rendering method may include the following steps:
step S210, for each target object, obtaining a preset observation angle range in which the target viewpoint is located relative to the target object, and taking the preset observation angle range as a target angle range.
In this embodiment, the target object may be, but is not limited to, a vegetation model (e.g., a three-dimensional tree), a prop model (e.g., a three-dimensional knife, gun or sword), a three-dimensional character model, and the like. Understandably, the target viewpoint is the position from which the camera captures the target object, and the camera is a virtual camera that captures pictures of the models to form an image or a video. With each target object as the reference point, the target visual angle ranges formed between the target objects and the target viewpoint may be different or the same.
Fig. 3 is a schematic diagram of a two-dimensional projection image according to a preferred embodiment of the present invention. In this embodiment, optionally, step S210 may be: acquiring a distance value L between the target viewpoint and a reference plane and a distance value M between the target viewpoint and the target object, and calculating, according to the distance value L and the distance value M, an included angle β between the reference plane and the line connecting the target viewpoint and the target object; and identifying the observation visual angle range corresponding to the included angle β, and taking that observation visual angle range as the target visual angle range.
It can be understood that, in Fig. 3, A is the target viewpoint, B is the target object or a preset point on it (for example, the centre point of a billboard may be used as point B), and C is the projection of the target viewpoint onto the reference plane, so that A, B and C form a right triangle ABC with the right angle at C. The distance value L between the target viewpoint and the reference plane is the length of side AC; the distance value M between the target viewpoint and the target object is the length of side AB; and the included angle β is the value of ∠ABC.
Optionally, the value of the included angle β can be calculated according to the formula β = arcsin(L / M), and the observation visual angle range corresponding to the included angle β is then identified from the obtained value.
It should be noted that, in other embodiments, the included angle β may be calculated, or the observation visual angle range corresponding to it determined, in a manner different from the above. The specific manner is not limited, as long as the observation visual angle range corresponding to the included angle β can be determined.
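For illustration only, a minimal sketch of this angle calculation is given below in Python; the function and parameter names are assumptions rather than part of the patent, and the distances are assumed to be expressed in the same unit with the viewpoint located above the reference plane.

    import math

    def included_angle_beta(distance_l, distance_m):
        # distance_l: distance L from the target viewpoint A to the reference plane (side AC)
        # distance_m: distance M from the target viewpoint A to the target object B (side AB)
        # In the right triangle ABC of Fig. 3 (right angle at C), sin(beta) = AC / AB = L / M.
        if distance_m <= 0 or not (0 <= distance_l <= distance_m):
            raise ValueError("expected 0 <= L <= M and M > 0")
        return math.degrees(math.asin(distance_l / distance_m))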
Step S220 is to select a two-dimensional projection image corresponding to the target view angle range from a plurality of pre-stored two-dimensional projection images, and use the two-dimensional projection image as a target map.
Understandably, for each target object, the electronic device 10 stores in advance a plurality of two-dimensional projection images, each associated with a different preset observation visual angle range. Step S220 may be: selecting, according to the target visual angle range, the target map from the plurality of pre-stored two-dimensional projection images, wherein the preset observation visual angle range corresponding to the target map is the same as the target visual angle range.
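A minimal sketch of this selection step follows (Python). The number and width of the preset observation visual angle ranges are not fixed by the patent, so three equal bands are assumed here purely for illustration, and all names are hypothetical; the angle beta could come from a calculation such as the included_angle_beta sketch above.

    # Assumed preset observation visual angle bands, in degrees.
    PRESET_RANGES = [(0.0, 30.0), (30.0, 60.0), (60.0, 90.0)]

    def select_target_map(beta, stored_projections):
        # stored_projections: dict mapping each preset range (lo, hi) to its pre-stored
        # two-dimensional projection image of the target object.
        for lo, hi in PRESET_RANGES:
            if lo <= beta < hi or (hi == 90.0 and beta == 90.0):
                return stored_projections[(lo, hi)]
        raise LookupError("angle %.1f lies outside every preset visual angle range" % beta)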
And step S230, rendering according to the target map.
Understandably, the target map is rendered based on the texture, grey values, RGB values and the like of the pre-stored two-dimensional projection image corresponding to the target map, so as to obtain the rendered picture.
In this embodiment, if there are a plurality of target objects, step S230 may be: for each target map, calculating a visible area of the target map relative to the target viewpoint according to the coordinate data of the target map; rendering a visible region of the target map.
Understandably, if a plurality of target objects need to be rendered, there may be occluded areas between the target objects, and only the unoccluded area (i.e., the visible area) needs to be rendered. By rendering the part of each target map located in its visible area, a simulated scene picture at the target viewpoint can be obtained. The occluded area can be understood as follows: assuming the target viewpoint is used as a light source, the shadow that one target map casts onto other target maps is the occluded area.
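A greatly simplified sketch of this kind of visibility handling is shown below (Python, all names hypothetical). It only skips a target map when a nearer map completely covers its screen-space rectangle; the visible-region computation described above is more general and would also clip partially occluded maps.

    from dataclasses import dataclass

    @dataclass
    class Rect:
        left: float
        top: float
        right: float
        bottom: float

        def contains(self, other):
            # True when this rectangle fully covers the other rectangle.
            return (self.left <= other.left and self.right >= other.right and
                    self.top <= other.top and self.bottom >= other.bottom)

    def render_target_maps(maps, draw):
        # maps: list of (depth, screen_rect, image) tuples, depth measured from the target viewpoint
        # draw: callback that rasterises one image into one screen rectangle
        drawn = []
        for depth, rect, image in sorted(maps, key=lambda m: m[0]):  # nearest first
            if any(front.contains(rect) for front in drawn):
                continue                      # fully occluded by a nearer map, skip rendering
            draw(image, rect)                 # at least partly visible, render it
            drawn.append(rect)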
In the prior art, the electronic device 10 generally only stores a map of the target object seen from the front; viewed from above, such a map degenerates into a line segment rather than a real top view of the target object, which results in a large difference between the displayed picture and the actual one. Fig. 4 is a schematic diagram of a top-view rendered image according to a preferred embodiment of the invention. In this embodiment, with the above method, a target map for the top viewing angle can be obtained, and this target map approaches the actual appearance of the target object at the current viewing angle.
Fig. 5 is a second flowchart illustrating an image rendering method according to a preferred embodiment of the invention. In this embodiment, before step S210, the method may further include step S240.
Step S240, obtaining two-dimensional projection images of the three-dimensional image within each preset viewing angle range according to the three-dimensional image of the target object, and storing each two-dimensional projection image in the electronic device 10, where one two-dimensional projection image is obtained within one preset viewing angle range.
Referring to Fig. 3 again, in the present embodiment, the target object is taken as the reference point and the space around it is divided into a plurality of observation visual angle regions; the number of divisions can be set according to the actual situation. In order to further reduce memory occupation, the target viewpoint corresponds to only one two-dimensional projection image within the same observation visual angle region. Such a region can be understood as the spatial region swept when the sector corresponding to the observation visual angle region is rotated about the perpendicular to the reference plane whose foot is the target object B. That is, if the target viewpoint is located anywhere in that spatial region, only the one two-dimensional projection image corresponding to that region is used; the image may be obtained from any viewpoint in the region according to the actual situation, which is not particularly limited here.
In other embodiments, even within the same spatial region, a plurality of two-dimensional projection images may be arranged according to the direction of the target viewpoint relative to the target object; for example, one two-dimensional projection image may be provided for each of the front, rear, left and right of the target object.
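As a sketch of how the pre-stored images might be produced offline (Python, all names hypothetical, under the assumption that each preset observation visual angle range is an elevation band and one representative viewpoint per band suffices):

    def bake_projections(model, preset_ranges, snapshot):
        # model: the three-dimensional image (mesh) of the target object
        # preset_ranges: list of (low, high) elevation bands in degrees
        # snapshot: assumed offline renderer callback taking (model, elevation_deg)
        #           and returning one two-dimensional projection image
        projections = {}
        for lo, hi in preset_ranges:
            representative_elevation = (lo + hi) / 2.0   # one viewpoint per band
            projections[(lo, hi)] = snapshot(model, representative_elevation)
        return projections   # stored on the electronic device, keyed by preset range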
Based on this design, rendering two-dimensional images reduces the occupation of system memory resources; in addition, because corresponding two-dimensional projection images are provided for the different observation visual angle ranges, the rendered image has a sense of layering when a plurality of target objects at different distances are rendered.
Fig. 6 is a block diagram of an image rendering apparatus 100 according to a preferred embodiment of the present invention. The image rendering apparatus 100 may be applied to the electronic device 10 described above for executing an image rendering method. The electronic device 10 pre-stores two-dimensional projection images corresponding to target objects in different preset observation visual angle ranges; the image rendering apparatus 100 may include an obtaining unit 110, a selecting unit 120, and a rendering unit 130.
The obtaining unit 110 is configured to obtain, for each target object, the preset observation visual angle range in which the target viewpoint is located relative to the target object, and take it as the target visual angle range. Specifically, the obtaining unit 110 may be configured to perform step S210 shown in Fig. 2, and reference may be made to the detailed description of step S210.
And the selecting unit 120 is configured to select a two-dimensional projection image corresponding to the target view angle range from a plurality of pre-stored two-dimensional projection images, and use the two-dimensional projection image as a target map.
Further, the selecting unit 120 is configured to select, according to the target view angle range, the target map from the plurality of pre-stored two-dimensional projection images, where the preset observation view angle range corresponding to the target map is the same as the target view angle range.
Specifically, the selecting unit 120 may be configured to execute step S220 shown in Fig. 2, and reference may be made to the detailed description of step S220.
The rendering unit 130 is configured to render according to the target map. When there are a plurality of target objects, the rendering unit 130 is further configured to: for each target map, calculate the visible area of the target map relative to the target viewpoint according to the coordinate data of the target map, and render the visible area of the target map. Specifically, the rendering unit 130 may be configured to execute step S230 shown in Fig. 2, and reference may be made to the detailed description of step S230.
The image rendering apparatus 100 further includes an image acquisition storage unit 140. Before the obtaining unit 110 performs step S210, the image acquisition storage unit 140 is configured to: obtain, according to the three-dimensional image of the target object, two-dimensional projection images of the three-dimensional image within each preset observation visual angle range, and store each two-dimensional projection image in the electronic device 10, wherein one two-dimensional projection image is obtained within one preset observation visual angle range. Specifically, the image acquisition storage unit 140 may be configured to execute step S240 shown in Fig. 5, and reference may be made to the detailed description of step S240.
From the above description of the embodiments, it is clear to those skilled in the art that the present invention can be implemented by hardware, or by software plus a necessary general hardware platform. Based on such understanding, the technical solution of the present invention can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (for example a CD-ROM, a USB disk or a removable hard disk) and includes several instructions that cause a computer device (which may be a personal computer, a server, a network device, or the like) to execute the method described in the embodiments of the present invention.
In summary, the present invention provides an image rendering method and device. For each target object, the preset observation visual angle range in which a target viewpoint lies relative to the target object is obtained and taken as the target visual angle range; a two-dimensional projection image corresponding to the target visual angle range is selected from a plurality of pre-stored two-dimensional projection images and used as the target map; and rendering is performed according to the target map. Because the electronic device pre-stores the two-dimensional projection images corresponding to the target object in different preset observation visual angle ranges and then selects the corresponding image according to the target visual angle range for rendering, the rendered image has a sense of depth layering.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (6)

1. The image rendering method is applied to electronic equipment, and is characterized in that two-dimensional projection images corresponding to target objects in different preset observation visual angle ranges are stored in the electronic equipment in advance; the method comprises the following steps:
aiming at each target object, obtaining a preset observation visual angle range of a target viewpoint relative to the target object, and taking the preset observation visual angle range as a target visual angle range, wherein the target object comprises any one of a three-dimensional vegetation model, a three-dimensional prop model and a three-dimensional character model, and the target viewpoint is the position of a virtual camera for shooting the target object;
selecting a two-dimensional projection image corresponding to the target visual angle range from a plurality of pre-stored two-dimensional projection images, and using the two-dimensional projection image as a target map;
rendering according to the target map;
when a plurality of target objects are provided, the step of rendering according to the target map includes:
for each target map, calculating a visible area of the target map relative to the target viewpoint according to the coordinate data of the target map;
rendering a visible area of the target map;
the step of obtaining a preset observation visual angle range of the target viewpoint relative to the target object and using the preset observation visual angle range as the target visual angle range comprises the following steps:
acquiring a distance value L between the target viewpoint and a reference surface and a distance value M between the target viewpoint and a target object, and calculating an included angle beta between a connecting line of the target viewpoint and the target object and the reference surface according to the distance value L and the distance value M;
identifying an observation visual angle range corresponding to the included angle beta, and taking the observation visual angle range as the target visual angle range;
the step of calculating an included angle β between a connecting line of the target viewpoint and the target object and the reference plane according to the distance value L and the distance value M includes:
calculating the value of the included angle β according to the formula β = arcsin(L / M).
2. The method according to claim 1, wherein the step of selecting a two-dimensional projection image corresponding to the target view angle range from a plurality of pre-stored two-dimensional projection images and using the two-dimensional projection image as the target map comprises:
and selecting the target map from a plurality of preset view angle ranges according to the target view angle range, wherein the preset observation view angle range corresponding to the target map is the same as the target view angle range.
3. The method according to claim 1, wherein the step of obtaining the preset viewing angle range of the target viewpoint relative to the target object is preceded by the step of:
and acquiring two-dimensional projection images of the three-dimensional image in each preset observation visual angle range according to the three-dimensional image of the target object, and storing each two-dimensional projection image in the electronic equipment, wherein one two-dimensional projection image is acquired in one preset observation visual angle range.
4. An image rendering device is applied to electronic equipment, and is characterized in that two-dimensional projection images corresponding to target objects in different preset observation visual angle ranges are stored in advance in the electronic equipment; the image rendering apparatus includes:
the obtaining unit, configured to obtain, for each target object, a preset observation visual angle range in which a target viewpoint is located relative to the target object, and take the preset observation visual angle range as a target visual angle range, wherein the target object comprises any one of a three-dimensional vegetation model, a three-dimensional prop model and a three-dimensional character model, and the target viewpoint is the position of a virtual camera shooting the target object;
the selecting unit is used for selecting a two-dimensional projection image corresponding to the target visual angle range from a plurality of pre-stored two-dimensional projection images and using the two-dimensional projection image as a target map;
the rendering unit is used for rendering according to the target map;
wherein, when the target object is multiple, the rendering unit is further configured to:
for each target map, calculating a visible area of the target map relative to the target viewpoint according to the coordinate data of the target map;
rendering a visible area of the target map;
the obtaining unit is further configured to:
acquiring a distance value L between the target viewpoint and a reference surface and a distance value M between the target viewpoint and a target object, and calculating an included angle beta between a connecting line of the target viewpoint and the target object and the reference surface according to the distance value L and the distance value M;
identifying an observation visual angle range corresponding to the included angle beta, and taking the observation visual angle range as the target visual angle range;
the obtaining unit is further configured to:
calculate the value of the included angle β according to the formula β = arcsin(L / M).
5. The image rendering apparatus according to claim 4, wherein the selecting unit is further configured to:
and selecting the target map from a plurality of preset view angle ranges according to the target view angle range, wherein the preset observation view angle range corresponding to the target map is the same as the target view angle range.
6. The image rendering apparatus according to claim 4, further comprising an image acquisition storage unit configured to, before the obtaining unit obtains the preset observation angle of view range in which the target viewpoint is located with respect to the target object:
and acquiring two-dimensional projection images of the three-dimensional image in each preset observation visual angle range according to the three-dimensional image of the target object, and storing each two-dimensional projection image in the electronic equipment, wherein one two-dimensional projection image is acquired in one preset observation visual angle range.
CN201711278635.8A 2017-12-06 2017-12-06 Image rendering method and device Active CN108154548B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711278635.8A CN108154548B (en) 2017-12-06 2017-12-06 Image rendering method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711278635.8A CN108154548B (en) 2017-12-06 2017-12-06 Image rendering method and device

Publications (2)

Publication Number Publication Date
CN108154548A CN108154548A (en) 2018-06-12
CN108154548B true CN108154548B (en) 2022-02-22

Family

ID=62466540

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711278635.8A Active CN108154548B (en) 2017-12-06 2017-12-06 Image rendering method and device

Country Status (1)

Country Link
CN (1) CN108154548B (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109377542B (en) * 2018-09-28 2023-07-18 国网辽宁省电力有限公司锦州供电公司 Three-dimensional model rendering method and device and electronic equipment
CN110148452A (en) * 2019-05-07 2019-08-20 东软医疗系统股份有限公司 A kind of image rendering method and device
CN110264393B (en) * 2019-05-15 2023-06-23 联想(上海)信息技术有限公司 Information processing method, terminal and storage medium
CN110517345A (en) * 2019-08-28 2019-11-29 网易(杭州)网络有限公司 A kind of method and device of threedimensional model rendering
CN110706319B (en) * 2019-10-15 2024-02-13 北京思维造物信息科技股份有限公司 Animation monitoring playing method, device, equipment and storage medium
CN113065999B (en) * 2019-12-16 2023-03-10 杭州海康威视数字技术股份有限公司 Vehicle-mounted panorama generation method and device, image processing equipment and storage medium
CN111275803B (en) * 2020-02-25 2023-06-02 北京百度网讯科技有限公司 3D model rendering method, device, equipment and storage medium
CN111369655B (en) * 2020-03-02 2023-06-30 网易(杭州)网络有限公司 Rendering method, rendering device and terminal equipment
CN111784804A (en) * 2020-06-01 2020-10-16 北京像素软件科技股份有限公司 Lightning simulation method and device in virtual scene, terminal and readable storage medium
CN111569418B (en) * 2020-06-10 2023-04-07 网易(杭州)网络有限公司 Rendering method, device and medium for content to be output and electronic equipment
CN112396683B (en) * 2020-11-30 2024-06-04 腾讯科技(深圳)有限公司 Shadow rendering method, device, equipment and storage medium for virtual scene
CN113140028A (en) * 2021-04-08 2021-07-20 广州三七互娱科技有限公司 Virtual object rendering method and device and electronic equipment
CN113096254B (en) * 2021-04-23 2023-09-22 北京百度网讯科技有限公司 Target rendering method and device, computer equipment and medium
CN113205582B (en) * 2021-06-03 2022-12-13 腾讯科技(深圳)有限公司 Method, device, equipment and medium for generating and using baking paste chart
CN113457161B (en) * 2021-07-16 2024-02-13 深圳市腾讯网络信息技术有限公司 Picture display method, information generation method, device, equipment and storage medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105913478A (en) * 2015-12-28 2016-08-31 乐视致新电子科技(天津)有限公司 360-degree panorama display method and display module, and mobile terminal

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8289320B2 (en) * 2007-10-22 2012-10-16 Samsung Electronics Co., Ltd. 3D graphic rendering apparatus and method

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105913478A (en) * 2015-12-28 2016-08-31 乐视致新电子科技(天津)有限公司 360-degree panorama display method and display module, and mobile terminal

Also Published As

Publication number Publication date
CN108154548A (en) 2018-06-12

Similar Documents

Publication Publication Date Title
CN108154548B (en) Image rendering method and device
US11842438B2 (en) Method and terminal device for determining occluded area of virtual object
EP3968270A1 (en) Image occlusion processing method, device, apparatus and computer storage medium
US20170186219A1 (en) Method for 360-degree panoramic display, display module and mobile terminal
CN112184873B (en) Fractal graph creation method, fractal graph creation device, electronic equipment and storage medium
US20240193846A1 (en) Scene rendering method, electronic device, and non-transitory readable storage medium
US20230230311A1 (en) Rendering Method and Apparatus, and Device
CN111739142A (en) Scene rendering method and device, electronic equipment and computer readable storage medium
CN112153303B (en) Visual data processing method and device, image processing equipment and storage medium
CN111754381A (en) Graphics rendering method, apparatus, and computer-readable storage medium
CN111583398B (en) Image display method, device, electronic equipment and computer readable storage medium
CN111161398A (en) Image generation method, device, equipment and storage medium
JP5916764B2 (en) Estimation method of concealment in virtual environment
CN115512025A (en) Method and device for detecting model rendering performance, electronic device and storage medium
US20240087219A1 (en) Method and apparatus for generating lighting image, device, and medium
CN113648655B (en) Virtual model rendering method and device, storage medium and electronic equipment
EP3343516A1 (en) Method and device for applying an effect of an augmented or mixed reality application
CN112973121B (en) Reflection effect generation method and device, storage medium and computer equipment
CN111260767B (en) Rendering method, rendering device, electronic device and readable storage medium in game
US20230260218A1 (en) Method and apparatus for presenting object annotation information, electronic device, and storage medium
US10754498B2 (en) Hybrid image rendering system
CN115965735B (en) Texture map generation method and device
CN116912387A (en) Texture map processing method and device, electronic equipment and storage medium
CN111589111B (en) Image processing method, device, equipment and storage medium
US20190005736A1 (en) Method and apparatus for calculating a 3d density map associated with a 3d scene

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant