CN111862342A - Texture processing method and device for augmented reality, electronic equipment and storage medium

Texture processing method and device for augmented reality, electronic equipment and storage medium

Info

Publication number
CN111862342A
Authority
CN
China
Prior art keywords
texture
model
target object
augmented reality
image
Prior art date
Legal status
Pending
Application number
CN202010688589.4A
Other languages
Chinese (zh)
Inventor
张璟聪
李云玖
陈志立
罗琳捷
刘晶
杨骁
王国晖
杨建朝
Current Assignee
Beijing ByteDance Network Technology Co Ltd
ByteDance Inc
Original Assignee
Beijing ByteDance Network Technology Co Ltd
ByteDance Inc
Priority date
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd, ByteDance Inc
Priority to CN202010688589.4A
Publication of CN111862342A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/04 Texture mapping

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a texture processing method and device for augmented reality, an electronic device, and a storage medium. Applied to a shader, the method comprises: performing noise texture sampling according to an image of a target object to obtain a texture image of the surface of the target object; when an augmented reality model of the target object is displayed, determining texture maps for the surfaces of model units according to the texture image, wherein the augmented reality model comprises a plurality of model units; and outputting the model units according to the texture maps. Because a texture image can be captured from the surface of the target object and applied to the surfaces of the model units, a texture map can be configured for each model unit. Since the augmented reality model is composed of these model units, texturing at the granularity of model units completes texture loading on the outer surface of the augmented reality model quickly, without computing a pixel value for every pixel separately, which reduces computational cost and improves real-time performance.

Description

Texture processing method and device for augmented reality, electronic equipment and storage medium
Technical Field
Embodiments of the present invention relate to virtual reality technologies, and in particular, to an augmented reality texture processing method and apparatus, an electronic device, and a storage medium.
Background
Augmented Reality (AR) is a technology that superimposes virtual information on real information. First, a computer system processes the real information and generates matching virtual information containing virtual objects, sounds, text, and the like; the virtual information is then superimposed onto the human-computer interface that displays the real information, thereby enhancing the user's perception of the real world.
Currently, when an augmented reality model is output, there is often a need to load the surface texture of the photographed subject onto the surface of the model; for example, the texture of a building's exterior wall is loaded onto the surface of the building's augmented reality model. In this process, a pixel value must be calculated for every pixel of every augmented reality frame to complete the texture loading, which occupies a large amount of computing resources, yields poor real-time performance, and incurs high computational cost.
Disclosure of Invention
The invention provides a texture processing method and device for augmented reality, an electronic device, and a storage medium, which improve the real-time performance of texture loading for an augmented reality model and reduce computational cost.
In a first aspect, an embodiment of the present invention provides a method for processing a texture of an augmented reality, including:
carrying out noise texture sampling according to the image of the target object to obtain a texture image of the surface of the target object;
when the augmented reality model of the target object is displayed, determining texture maps of the surfaces of the model units according to the texture images, wherein the augmented reality model comprises a plurality of model units;
and outputting the model unit according to the texture map.
In a second aspect, an embodiment of the present invention further provides an augmented reality texture processing apparatus, including:
the sampling module is used for sampling noise textures according to the image of the target object to obtain a texture image of the surface of the target object;
the texture determining module is used for determining a texture map of the surface of the model unit according to the texture image when the augmented reality model of the target object is displayed, wherein the augmented reality model comprises a plurality of model units;
and the output module is used for outputting the model unit according to the texture map.
In a third aspect, an embodiment of the present invention further provides an electronic device, where the electronic device includes:
one or more processors;
a storage device for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors implement the augmented reality texture processing method as shown in the embodiment of the present disclosure.
In a fourth aspect, the present invention further provides a storage medium containing computer-executable instructions which, when executed by a computer processor, perform the augmented reality texture processing method shown in the embodiments of the present disclosure.
The texture processing scheme for augmented reality disclosed by the embodiments of the disclosure can perform noise texture sampling according to the image of the target object to obtain a texture image of the surface of the target object; when the augmented reality model of the target object is displayed, determine texture maps for the surfaces of the model units according to the texture image, wherein the augmented reality model comprises a plurality of model units; and output the model units according to the texture maps. In the prior art, a pixel value must be calculated for every pixel, so real-time performance is poor and computational cost is high. By contrast, this scheme captures a texture image from the surface of the target object, applies it to the surfaces of the model units, and configures a texture map for each model unit. Because the augmented reality model is composed of model units, texture mapping at the model-unit level completes texture loading on the outer surface of the augmented reality model quickly, without calculating a pixel value for each pixel separately, thereby reducing computational cost and improving real-time performance.
Drawings
Fig. 1 is a flowchart of a texture processing method for augmented reality according to a first embodiment of the present invention;
fig. 2 is a schematic structural diagram of an augmented reality texture processing apparatus according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of an electronic device in a third embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Augmented Reality (AR) is a technology that superimposes virtual information on real information. For example, the virtual information may be a pre-constructed augmented reality model of a landmark building, and the real information may be the landmark building captured by the camera of the electronic device. The texture of the building's outer surface may change with time or with the external environment (such as lighting or weather), so a fixed texture on the outer surface of the augmented reality model cannot fully match the actual information. It is therefore desirable to load the current texture of the landmark building into the augmented reality model. In this context, texture is also referred to as noise. Noise is commonly used to create visual effects, for example Simplex noise, Perlin noise, and the like. Currently, when noise (that is, texture) is computed in real time, a noise value corresponding to each pixel is calculated in a Fragment Shader. However, this process is computationally intensive, and on some low-end mobile phones a high frame rate cannot be guaranteed. To solve this technical problem, an embodiment of the present application provides a texture processing scheme for augmented reality, described as follows.
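By way of context only (this example is not part of the original disclosure), the per-pixel workload described above can be sketched in Python; the value-noise variant, the grid size, and all names below are illustrative assumptions:

    import numpy as np

    def value_noise(width, height, grid=16, seed=0):
        """Per-pixel value noise: each output pixel bilinearly interpolates
        four random lattice values -- the kind of per-fragment computation
        a Fragment Shader would repeat for every pixel of every frame."""
        rng = np.random.default_rng(seed)
        lattice = rng.random((height // grid + 2, width // grid + 2))
        ys, xs = np.mgrid[0:height, 0:width]
        gx, gy = xs / grid, ys / grid
        x0, y0 = gx.astype(int), gy.astype(int)
        tx, ty = gx - x0, gy - y0
        fx = tx * tx * (3 - 2 * tx)  # smoothstep fade curves
        fy = ty * ty * (3 - 2 * ty)
        v00, v10 = lattice[y0, x0], lattice[y0, x0 + 1]
        v01, v11 = lattice[y0 + 1, x0], lattice[y0 + 1, x0 + 1]
        top = v00 + fx * (v10 - v00)
        bottom = v01 + fx * (v11 - v01)
        return top + fy * (bottom - top)  # one noise value per pixel

    # A single 1080p frame already requires about two million
    # interpolations, repeated every frame -- the per-pixel cost the
    # scheme below avoids.
    noise = value_noise(1920, 1080)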
Example one
Fig. 1 is a flowchart of a texture processing method for augmented reality according to an embodiment of the present invention. The embodiment is applicable to loading the texture of a photographed object into an augmented reality model in real time. The method may be executed by an electronic device that displays the augmented reality model, such as a smart phone or a tablet computer, and specifically includes the following steps:
and 110, performing noise texture sampling according to the image of the target object to obtain a texture image of the surface of the target object.
The target object is the subject photographed by the camera of the electronic device. The embodiment of the application applies to augmented reality scenes. The user can start the augmented reality function in a preset application, which may be the photographing application of the electronic device or another application with an augmented reality function. The camera of the electronic device acquires the current image, and the electronic device displays it in a preview page. If a corresponding augmented reality model exists for the photographed object in the current image, the augmented reality model is mapped onto the image of that object. The photographed object may be a landmark building, or another entity, such as a vehicle, for which an augmented reality model was previously created.
After the user starts the augmented reality function in the preset application, it is determined whether the currently photographed object is a target object with a corresponding augmented reality model. Noise texture sampling may be performed on the whole image of the target object or on a local image of the target object.
In one implementation, an appearance image of the target object is first acquired; noise texture sampling is then performed on the appearance image according to a preset dividing unit to obtain sampling results; finally, the sampling results are rendered and spliced to obtain the texture image of the surface of the target object.
The appearance image is an image containing the surface of the target object. When the target object is a building and the user is close to it, the appearance image may be a partial image of the building; as the distance between the user and the building increases, more of the target is captured, until the appearance image represents the overall outline of the target object. The texture of a building's outer surface follows certain patterns; glass, concrete walls, and the like have fixed textures. Therefore, noise texture sampling can be performed on the appearance image according to a preset dividing unit. The preset dividing unit may be a fixed size, for example a square whose side length is a preset number of pixels. A patch is cropped from the appearance image using the preset dividing unit to obtain a sampling result whose image size equals the preset dividing unit; the sampling position may be any position in the appearance image. After the sampling results are obtained, they are spliced into a texture-image sequence frame that serves as the texture map, the sequence frame being formed by splicing a plurality of sampling results. Sampling the appearance image with a preset dividing unit allows the texture image to be determined quickly from the appearance image, increasing the generation speed of the texture image.
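As a minimal sketch (illustrative only, not the disclosure's implementation), the sampling-and-splicing step could look as follows in Python, assuming the appearance image is a numpy array at least `unit` pixels on each side; the function name, patch size, and patch count are hypothetical:

    import numpy as np

    def sample_texture_patches(appearance, unit=64, num_patches=4, seed=0):
        """Crop square patches of side `unit` (the preset dividing unit)
        at random positions in the appearance image, then splice the
        sampling results into one texture-image sequence frame."""
        rng = np.random.default_rng(seed)
        h, w = appearance.shape[:2]
        patches = []
        for _ in range(num_patches):
            y = int(rng.integers(0, h - unit + 1))
            x = int(rng.integers(0, w - unit + 1))
            patches.append(appearance[y:y + unit, x:x + unit])
        # splicing: concatenate the sampling results side by side
        return np.concatenate(patches, axis=1)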
Further, the texture of the outer surface of the target object may not be completely uniform; for example, the outer surface may be composed of glass and concrete. In this case, this step may be performed as follows: acquiring texture distribution characteristics from the image of the target object; dividing the image of the target object according to the texture distribution characteristics to obtain a plurality of texture regions, wherein each texture region has uniform texture characteristics; and performing noise texture sampling within each texture region to obtain the texture image of the surface of the target object.
The texture distribution characteristics refer to the texture characteristics of each of several regions of the target object, such as a glass texture or a concrete texture. After the image of the target object is divided according to the texture distribution characteristics, a plurality of texture regions is obtained, and the image in each texture region corresponds to one texture characteristic. In a subsequent step, each model unit in the augmented reality model can be mapped to a texture region, and the texture image of that region is used as the texture map of the model unit. This produces corresponding texture images for target objects with multiple texture characteristics, improving the accuracy of the texture noise while also improving the generation efficiency of the texture image.
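The division into texture regions is not pinned to one algorithm in the disclosure; one plausible sketch (our assumption, using a block mean/standard-deviation descriptor and plain k-means) is:

    import numpy as np

    def split_texture_regions(gray, block=32, k=2, iters=10, seed=0):
        """Assign each block of a grayscale image of the target object
        to one of k texture regions by clustering a simple texture
        descriptor (block mean and standard deviation)."""
        h, w = gray.shape
        by, bx = h // block, w // block
        feats = np.empty((by * bx, 2))
        for i in range(by):
            for j in range(bx):
                blk = gray[i * block:(i + 1) * block,
                           j * block:(j + 1) * block]
                feats[i * bx + j] = (blk.mean(), blk.std())
        rng = np.random.default_rng(seed)
        centers = feats[rng.choice(len(feats), k, replace=False)]
        for _ in range(iters):  # plain k-means on the descriptors
            d = ((feats[:, None, :] - centers[None]) ** 2).sum(-1)
            labels = d.argmin(axis=1)
            for c in range(k):
                if np.any(labels == c):
                    centers[c] = feats[labels == c].mean(axis=0)
        return labels.reshape(by, bx)  # one region label per block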
Step 120: when the augmented reality model of the target object is displayed, determine texture maps for the surfaces of the model units according to the texture image, wherein the augmented reality model comprises a plurality of model units.
After the camera of the electronic device captures the target object, the electronic device may prompt the user to display the augmented reality model of the target object. When the user confirms, the electronic device overlays the augmented reality model on the image of the target object in the preview picture, completing the loading of the augmented reality model of the target object.
In the technical scheme provided by this embodiment, the model unit is the processing object for which the texture map is set, and the texture map is determined from a sample of the camera image of the target object, so the texture map on the surface of the model unit is consistent with the current texture of the target object, and the current texture of the target object is loaded into the map on the surface of the model unit. The number of model units is far smaller than the number of pixels (for example, a facade covered by a 20-by-30 grid of cube units has 600 units, whereas a 1080p frame has about two million pixels), so the current texture of the target object can be applied to the augmented reality model quickly.
For example, determining the corresponding texture map of the model unit surface according to the texture image can be implemented by the following steps:
Step 121: determine a map region according to the vertex coordinates of the plurality of vertices of the model unit located on the outer surface of the augmented reality model.
The model unit is a unit constituting the augmented reality model. Each model unit is configured with a plurality of vertices whose coordinates are three-dimensional. Illustratively, the target object is a building and the model units are cube units. A cube unit consists of eight vertices, and the position of the cube unit in the augmented reality model can be determined from the vertex coordinates. For a model unit located on the surface of the augmented reality model, the position of its corresponding points in the texture image can be determined according to the coordinates of the four vertices of the model unit that lie on the outer surface of the augmented reality model.
Illustratively, the three-dimensional coordinates of the vertices of the model unit are obtained; the vertices located on the outer surface of the augmented reality model are determined according to the three-dimensional coordinates; the three-dimensional coordinates of those vertices are mapped to two-dimensional coordinates (UV coordinates); and the map region in the texture image is determined according to the two-dimensional coordinates.
The texture image is a two-dimensional image, while the vertex coordinates of the model unit are three-dimensional. Projecting the three-dimensional coordinates onto the outer surface on which the model unit lies yields two-dimensional coordinates, and the four vertices then determine the map region in this two-dimensional coordinate system.
Step 122: crop the texture map of the model unit surface from the texture image of the target object surface according to the map region.
After the map region in the texture image is determined using the vertex coordinates of the model unit, the texture corresponding to the map region is used as the texture map of the model unit's outer surface.
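Steps 121 and 122 might be sketched as follows, under the simplifying assumption (ours, not the disclosure's) that the outer face is axis-aligned, so the 3D-to-2D (UV) mapping reduces to dropping one coordinate and normalizing; all names are hypothetical:

    import numpy as np

    def face_to_map_region(face_vertices, tex_w, tex_h, model_bounds):
        """Map the four outer-surface vertices of a model unit to UV
        coordinates and return the rectangular map region in texels.
        Assumes a facade of constant z, so (x, y, z) -> (u, v) is an
        orthographic projection; model_bounds = ((x_min, x_max),
        (y_min, y_max)) is the extent of the model's outer surface."""
        (x_min, x_max), (y_min, y_max) = model_bounds
        uv = np.array([[(x - x_min) / (x_max - x_min),
                        (y - y_min) / (y_max - y_min)]
                       for x, y, _z in face_vertices])
        u0, v0 = uv.min(axis=0)  # bounding rectangle of the projection
        u1, v1 = uv.max(axis=0)
        return (int(u0 * tex_w), int(v0 * tex_h),
                int(u1 * tex_w), int(v1 * tex_h))

    def crop_texture_map(texture_image, region):
        """Step 122: crop the texture map for one model unit from the
        texture image of the target object's surface."""
        x0, y0, x1, y1 = region
        return texture_image[y0:y1, x0:x1]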
Step 130: output the model unit according to the texture map.
The model units located on the outer surface of the augmented reality model may be processed in parallel by the shader, and texture maps are loaded onto the surfaces of those model units.
Further, the augmented reality model may trigger model changes after being loaded onto the target object, such as a model unit flipping in a certain direction or jumping. If texture maps were loaded only on the faces that initially lie on the outer surface, then when a model unit flips or jumps, the newly exposed face would show the initial, untextured picture on the outer surface of the augmented reality model, giving a poor visual effect. To solve this problem, the texture map is used as the map of a first surface of the model unit, the first surface being the face that corresponds to the outer surface of the augmented reality model; in addition, texture maps may also be loaded on second surfaces, a second surface being any face other than the first surface, and when the model unit rotates or jumps, the texture map serves as the map of the second surface. Loading texture maps on multiple surfaces of the model unit keeps the texture of the target object's outer surface stable when a model unit is triggered to flip or jump.
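A minimal sketch of this multi-surface loading (illustrative; the class and method names are hypothetical, and a flip is modeled as merely renaming which face is outermost):

    class CubeUnit:
        """Model unit that carries its texture map on the first surface
        and on all second surfaces, so that a flip or jump never
        exposes an untextured face."""
        FACES = ('front', 'back', 'left', 'right', 'top', 'bottom')

        def __init__(self, texture_map):
            self.outer_face = 'front'  # first surface on the outer shell
            # load the same map on the second surfaces as well
            self.face_maps = {face: texture_map for face in self.FACES}

        def flip_to(self, face):
            assert face in self.FACES
            self.outer_face = face       # newly exposed face ...
            return self.face_maps[face]  # ... already has its map loaded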
The texture processing method for augmented reality disclosed by this embodiment can perform noise texture sampling according to the image of the target object to obtain a texture image of the surface of the target object; when the augmented reality model of the target object is displayed, determine texture maps for the surfaces of the model units according to the texture image, wherein the augmented reality model comprises a plurality of model units; and output the model units according to the texture maps. In the prior art, a pixel value must be calculated for every pixel, so real-time performance is poor and computational cost is high. By contrast, this method captures a texture image from the surface of the target object, applies it to the surfaces of the model units, and configures a texture map for each model unit. Because the augmented reality model is composed of model units, texture mapping at the model-unit level completes texture loading on the outer surface of the augmented reality model quickly, without calculating a pixel value for each pixel separately, thereby reducing computational cost and improving real-time performance.
Example two
Fig. 2 is a schematic structural diagram of an augmented reality texture processing apparatus according to a second embodiment of the present disclosure. The embodiment is applicable to loading the texture of a photographed object into an augmented reality model in real time. The apparatus may be deployed on an electronic device that displays the augmented reality model, such as a smart phone or a tablet computer, and includes: a sampling module 210, a texture determination module 220, and an output module 230, wherein:
The sampling module 210 is configured to perform noise texture sampling according to an image of a target object to obtain a texture image of the surface of the target object;
a texture determining module 220, configured to determine a texture map of a surface of a model unit according to a texture image when an augmented reality model of a target object is displayed, where the augmented reality model includes a plurality of model units;
and an output module 230, configured to output the model unit according to the texture map.
Further, the sampling module 210 is configured to:
acquiring an appearance image of a target object;
noise texture sampling is carried out on the appearance image according to a preset dividing unit, and a sampling result is obtained;
and performing rendering and splicing according to the sampling result to obtain a texture image of the surface of the target object.
Further, the texture determining module 220 is configured to:
determining a mapping region according to the vertex coordinates of a plurality of vertexes of the model unit on the outer surface of the augmented reality model;
and intercepting the texture mapping of the surface of the model unit from the texture image of the surface of the target object according to the mapping area.
Further, the texture determining module 220 is configured to:
acquiring three-dimensional coordinates of a vertex of the model unit;
determining a plurality of vertexes located on the outer surface of the augmented reality model according to the three-dimensional coordinates;
mapping the three-dimensional coordinates of the plurality of vertices to two-dimensional coordinates;
and determining a mapping area in the texture image according to the two-dimensional coordinates.
Further, the sampling module 210 is configured to:
acquiring texture distribution characteristics in an image of a target object;
dividing the image of the target object according to the texture distribution characteristics to obtain a plurality of texture areas, wherein each texture area has the same texture characteristics;
and carrying out noise texture sampling in the texture area to obtain a texture image of the surface of the target object.
Further, the output module 230 is configured to:
taking the texture map as a map of a first surface of the model unit, wherein the first surface is a surface corresponding to the outer surface of the augmented reality model;
when the model element rotates or jumps, the texture map is used as a map of the second surface of the model element.
Further, the target object is a building, and the model unit is a cubic unit.
In the augmented reality texture processing apparatus disclosed in this embodiment, the sampling module 210 performs noise texture sampling according to the image of the target object to obtain a texture image of the surface of the target object; when the augmented reality model of the target object is displayed, the texture determining module 220 determines texture maps for the surfaces of the model units according to the texture image, wherein the augmented reality model comprises a plurality of model units; and the output module 230 outputs the model units according to the texture maps. In the prior art, a pixel value must be calculated for every pixel, so real-time performance is poor and computational cost is high. By contrast, this apparatus captures a texture image from the surface of the target object, applies it to the surfaces of the model units, and configures a texture map for each model unit. Because the augmented reality model is composed of model units, texture mapping at the model-unit level completes texture loading on the outer surface of the augmented reality model quickly, without calculating a pixel value for each pixel separately, thereby reducing computational cost and improving real-time performance.
The augmented reality texture processing device provided by the embodiment of the invention can execute the augmented reality texture processing method provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
Example three
Referring now to fig. 3, shown is a schematic diagram of an electronic device 800 suitable for use in implementing a third embodiment of the present disclosure. The terminal device in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle terminal (e.g., a car navigation terminal), and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. The electronic device shown in fig. 3 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 3, the electronic device 800 may include a processing means (e.g., central processing unit, graphics processor, etc.) 801 that may perform various appropriate actions and processes in accordance with a program stored in a Read-Only Memory (ROM) 802 or a program loaded from a storage means 808 into a Random Access Memory (RAM) 803. In the RAM 803, various programs and data necessary for the operation of the electronic device 800 are also stored. The processing apparatus 801, the ROM 802, and the RAM 803 are connected to each other by a bus 804. An input/output (I/O) interface 805 is also connected to bus 804.
Generally, the following devices may be connected to the I/O interface 805: input devices 806 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 807 including, for example, a Liquid Crystal Display (LCD), speakers, vibrators, and the like; storage 808 including, for example, magnetic tape, hard disk, etc.; and a communication device 809. The communication means 809 may allow the electronic device 800 to communicate wirelessly or by wire with other devices to exchange data. While fig. 3 illustrates an electronic device 800 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication means 809, or installed from the storage means 808, or installed from the ROM 802. The computer program, when executed by the processing apparatus 801, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquiring at least two internet protocol addresses; sending a node evaluation request comprising the at least two internet protocol addresses to node evaluation equipment, wherein the node evaluation equipment selects the internet protocol addresses from the at least two internet protocol addresses and returns the internet protocol addresses; receiving an internet protocol address returned by the node evaluation equipment; wherein the obtained internet protocol address indicates an edge node in the content distribution network.
Alternatively, the computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: receiving a node evaluation request comprising at least two internet protocol addresses; selecting an internet protocol address from the at least two internet protocol addresses; returning the selected internet protocol address; wherein the received internet protocol address indicates an edge node in the content distribution network.
Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, or C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. Where the name of a unit does not in some cases constitute a limitation of the unit itself, for example, the first retrieving unit may also be described as a "unit for retrieving at least two internet protocol addresses".
The foregoing description is merely an explanation of the preferred embodiments of the disclosure and the technical principles employed. Those skilled in the art will appreciate that the scope of the disclosure is not limited to the particular combination of features described above, and also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure, for example, technical solutions in which the above features are replaced with features having similar functions disclosed in this disclosure.

Claims (10)

1. A texture processing method for augmented reality, comprising:
carrying out noise texture sampling according to the image of the target object to obtain a texture image of the surface of the target object;
when an augmented reality model of a target object is displayed, determining texture maps of surfaces of model units according to the texture images, wherein the augmented reality model comprises a plurality of model units;
and outputting the model unit according to the texture map.
2. The method according to claim 1, wherein the performing noise texture sampling according to the image of the target object to obtain the texture image of the surface of the target object comprises:
acquiring an appearance image of a target object;
performing noise texture sampling on the appearance image according to a preset dividing unit to obtain a sampling result;
and performing rendering and splicing according to the sampling result to obtain the texture image of the surface of the target object.
3. The method of claim 2, wherein determining a texture map corresponding to the model element surface from the texture image comprises:
determining a mapping region according to the vertex coordinates of a plurality of vertexes of the model unit positioned on the external surface of the augmented reality model;
and intercepting the texture mapping of the surface of the model unit from the texture image of the surface of the target object according to the mapping area.
4. The method of claim 3, wherein determining the map region according to the vertex coordinates of the plurality of vertices of the model element located on the external surface of the augmented reality model comprises:
acquiring three-dimensional coordinates of a vertex of the model unit;
determining a plurality of vertexes located on the external surface of the augmented reality model according to the three-dimensional coordinates;
mapping the three-dimensional coordinates of the plurality of vertices to two-dimensional coordinates;
and determining a mapping area in the texture image according to the two-dimensional coordinates.
5. The method according to claim 4, wherein the performing noise texture sampling according to the image of the target object to obtain the texture image of the surface of the target object comprises:
acquiring texture distribution characteristics in an image of a target object;
dividing the image of the target object according to the texture distribution characteristics to obtain a plurality of texture areas, wherein each texture area has the same texture characteristics;
and carrying out noise texture sampling in the texture area to obtain a texture image of the surface of the target object.
6. The method of claim 5, wherein outputting the model unit according to the texture map comprises:
taking the texture map as a map of a first surface of the model unit, wherein the first surface is a surface corresponding to the outer surface of the augmented reality model;
and when the model unit rotates or jumps, using the texture map as a map of the second surface of the model unit.
7. The method according to any one of claims 1-6, wherein the target object is a building and the model elements are cube elements.
8. An augmented reality texture processing apparatus comprising:
the sampling module is used for sampling noise textures according to the image of the target object to obtain a texture image of the surface of the target object;
the texture determining module is used for determining a texture map of the surface of a model unit according to the texture image when an augmented reality model of a target object is displayed, wherein the augmented reality model comprises a plurality of model units;
and the output module is used for outputting the model unit according to the texture map.
9. An electronic device, characterized in that the electronic device comprises:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the augmented reality texture processing method as recited in any one of claims 1-7.
10. A storage medium containing computer executable instructions for performing the augmented reality texture processing method of any one of claims 1-7 when executed by a computer processor.
CN202010688589.4A 2020-07-16 2020-07-16 Texture processing method and device for augmented reality, electronic equipment and storage medium Pending CN111862342A (en)

Priority Applications (1)

Application Number: CN202010688589.4A
Priority Date: 2020-07-16
Filing Date: 2020-07-16
Title: Texture processing method and device for augmented reality, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number: CN202010688589.4A
Priority Date: 2020-07-16
Filing Date: 2020-07-16
Title: Texture processing method and device for augmented reality, electronic equipment and storage medium

Publications (1)

Publication Number: CN111862342A
Publication Date: 2020-10-30

Family

ID=72983198

Family Applications (1)

Application Number: CN202010688589.4A
Status: Pending
Title: Texture processing method and device for augmented reality, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111862342A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108876886A (en) * 2017-05-09 2018-11-23 腾讯科技(深圳)有限公司 Image processing method, device and computer equipment
CN108876931A (en) * 2017-05-12 2018-11-23 腾讯科技(深圳)有限公司 Three-dimension object color adjustment method, device, computer equipment and computer readable storage medium
CN110458932A (en) * 2018-05-07 2019-11-15 阿里巴巴集团控股有限公司 Image processing method, device, system, storage medium and image scanning apparatus
CN110363860A (en) * 2019-07-02 2019-10-22 北京字节跳动网络技术有限公司 3D model reconstruction method, device and electronic equipment
CN110782507A (en) * 2019-10-11 2020-02-11 创新工场(北京)企业管理股份有限公司 Texture mapping generation method and system based on face mesh model and electronic equipment

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117218266A (en) * 2023-10-26 2023-12-12 神力视界(深圳)文化科技有限公司 3D white-mode texture map generation method, device, equipment and medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination