CN111882642B - Texture filling method and device for three-dimensional model - Google Patents

Texture filling method and device for three-dimensional model

Info

Publication number
CN111882642B
Authority
CN
China
Prior art keywords
dimensional, frame, dimensional image, images, dimensional model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010741361.7A
Other languages
Chinese (zh)
Other versions
CN111882642A (en)
Inventor
谭皓 (Tan Hao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010741361.7A
Publication of CN111882642A
Application granted
Publication of CN111882642B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 15/04: Texture mapping

Abstract

The application discloses a texture filling method and device for a three-dimensional model, applied to an electronic device. The method comprises the following steps: acquiring a multi-frame two-dimensional image of an object to be modeled and a reference three-dimensional model, wherein the multi-frame two-dimensional image comprises an image of each surface of the object to be modeled, the reference three-dimensional model comprises N triangular patches, and N is a positive integer; selecting M frames of two-dimensional images from the multi-frame two-dimensional image, wherein M is a positive integer smaller than N; determining the mapping relation between the N triangular patches and the M-frame two-dimensional image; and performing texture filling on the reference three-dimensional model according to the mapping relation to obtain a target three-dimensional model of the object to be modeled. The embodiment of the application helps reduce the complexity of the texture filling algorithm and reduce power consumption.

Description

Texture filling method and device for three-dimensional model
Technical Field
The present application relates to the field of electronic technologies, and in particular, to a texture filling method and apparatus for a three-dimensional model.
Background
Texture mapping is the process of establishing a correspondence between a three-dimensional object surface and two-dimensional image-space pixel coordinates. However, many algorithms adopt complex calculations in pursuit of texture mapping quality and model realism, and such calculations can only be run on high-performance desktop-class machines.
At present, even for a simple small object captured in only tens of seconds, several hundred two-dimensional images are produced. In addition, to accurately represent the details of the object, the number of triangular patches on the surface of the three-dimensional object can reach hundreds of thousands, so a large amount of calculation is required for texture mapping, and the expected effect cannot be achieved on an embedded platform with limited computing capacity; that is, the calculation amount and the effect cannot be balanced.
Disclosure of Invention
The embodiment of the application provides a texture filling method and device for a three-dimensional model, which are used for reducing the complexity of a texture filling algorithm and reducing power consumption.
In a first aspect, an embodiment of the present application provides a texture filling method of a three-dimensional model, which is applied to an electronic device, and the method includes:
acquiring a multi-frame two-dimensional image of an object to be modeled and a reference three-dimensional model, wherein the multi-frame two-dimensional image comprises an image of each surface of the object to be modeled, the reference three-dimensional model comprises N triangular patches, and N is a positive integer;
Selecting M frames of two-dimensional images from the multi-frame two-dimensional images, wherein M is a positive integer smaller than N;
determining the mapping relation between the N triangular patches and the M-frame two-dimensional image;
and carrying out texture filling on the reference three-dimensional model according to the mapping relation to obtain a target three-dimensional model of the object to be modeled.
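The four claimed steps can be sketched as a short pipeline. This is an illustrative sketch only: the helper names (`select_frames`, `best_view`, `sample_color`) and the evenly-spaced selection rule are assumptions, not details disclosed by the claim.

```python
# Illustrative sketch of the claimed steps; helper names and the selection
# rule are assumptions, not part of the claim.

def select_frames(frames, m):
    """Toy selection: take m evenly spaced frames from the multi-frame capture."""
    step = max(1, len(frames) // m)
    return frames[::step][:m]

def fill_texture(frames, patches, m, best_view, sample_color):
    """frames: multi-frame 2D images; patches: the N triangular patches; m < N."""
    selected = select_frames(frames, m)              # select M frames
    mapping = {i: best_view(p, selected)             # patch -> frame mapping
               for i, p in enumerate(patches)}
    return {i: sample_color(patches[i], f)           # texture filling per patch
            for i, f in mapping.items()}
```

Because the mapping is computed against M selected frames rather than all captured frames, the per-patch search cost drops proportionally.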
In a second aspect, an embodiment of the present application provides a texture filling apparatus for a three-dimensional model, which is applied to an electronic device, where the texture filling apparatus for a three-dimensional model includes an obtaining unit, a selecting unit, a determining unit, and a filling unit, where:
the acquisition unit is used for acquiring a multi-frame two-dimensional image of an object to be modeled and a reference three-dimensional model, wherein the multi-frame two-dimensional image comprises an image of each surface of the object to be modeled, the reference three-dimensional model comprises N triangular patches, and N is a positive integer;
the selecting unit is used for selecting M frames of two-dimensional images from the multi-frame two-dimensional images, wherein M is a positive integer smaller than N;
the determining unit is used for determining the mapping relation between the N triangular patches and the M-frame two-dimensional image;
and the filling unit is used for carrying out texture filling on the reference three-dimensional model according to the mapping relation to obtain the target three-dimensional model of the object to be modeled.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, the programs including instructions for performing steps in any of the methods of the first aspect of the embodiments of the present application.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program for electronic data exchange, wherein the computer program causes a computer to perform part or all of the steps as described in any of the methods of the first aspect of the embodiments of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product, wherein the computer program product comprises a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps described in any of the methods of the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
It can be seen that, in the embodiment of the present application, the electronic device obtains a multi-frame two-dimensional image of an object to be modeled and a reference three-dimensional model, wherein the multi-frame two-dimensional image includes an image of each surface of the object to be modeled and the reference three-dimensional model includes N triangular patches; selects an M-frame two-dimensional image from the multi-frame two-dimensional image; determines the mapping relationship between the N triangular patches and the M-frame two-dimensional image; and then performs texture filling on the reference three-dimensional model according to the mapping relationship to obtain a target three-dimensional model of the object to be modeled. In this way, the electronic device first screens the multi-frame two-dimensional images and performs texture mapping according to the selected M frames rather than all of the two-dimensional images before texture filling, so the data volume of texture mapping is reduced, the complexity of the texture filling algorithm is reduced, the texture filling efficiency is improved, and power consumption is reduced.
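One plausible way to screen hundreds of captured frames down to M is a greedy coverage heuristic: repeatedly pick the frame that makes the most not-yet-covered triangular patches visible. This greedy set-cover sketch is an assumption for illustration; the summary above does not disclose the actual selection rule.

```python
# Hypothetical frame-screening heuristic (greedy set cover); not taken from
# the patent text.

def greedy_select(visibility, m):
    """visibility: {frame_id: set of patch indices visible in that frame}."""
    covered, chosen = set(), []
    for _ in range(m):
        best = max(visibility, key=lambda f: len(visibility[f] - covered))
        if not visibility[best] - covered:
            break                       # no remaining frame adds new patches
        chosen.append(best)
        covered |= visibility[best]
    return chosen, covered
```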
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings required in the embodiments or in the description of the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the application, and that a person skilled in the art may obtain other drawings from them without inventive effort.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 2 is a schematic software structure of an electronic device according to an embodiment of the present application;
FIG. 3A is a schematic flow chart of a texture filling method for a three-dimensional model according to an embodiment of the present application;
FIG. 3B is a schematic diagram of a triangular patch of a three-dimensional model surface according to an embodiment of the present application;
FIG. 4 is a flow chart of another method for texture filling of a three-dimensional model according to an embodiment of the present application;
FIG. 5 is a block diagram of a distributed functional unit of a texture filling apparatus for a three-dimensional model according to an embodiment of the present application;
fig. 6 is a block diagram of an integrated functional unit of a texture filling apparatus for a three-dimensional model according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
For a better understanding of aspects of embodiments of the present application, related terms and concepts that may be related to embodiments of the present application are described below.
1) The electronic device may be a portable electronic device that also contains other functions, such as personal digital assistant and/or music player functions, for example a cell phone, a tablet computer, or a wearable electronic device with wireless communication capability (e.g., a smart watch). Exemplary embodiments of portable electronic devices include, but are not limited to, portable electronic devices equipped with iOS, Android, Microsoft, or other operating systems. The portable electronic device may also be another portable electronic device, such as a laptop computer (Laptop). It should also be appreciated that in other embodiments, the electronic device described above may not be a portable electronic device but a desktop computer.
2) Texture mapping is the process of establishing a correspondence between a three-dimensional object surface and two-dimensional image-space pixel coordinates, wherein the three-dimensional object surface is represented by a number of triangular patches, each triangular patch comprises three vertices in the world coordinate system, and the two-dimensional image is a two-dimensional color image acquired by a color camera.
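In practice, the correspondence for each triangular patch is usually obtained by projecting its three world-coordinate vertices into a frame with a pinhole camera model. The sketch below assumes known intrinsics K and extrinsics (R, t); all values in the usage example are invented.

```python
# Pinhole projection of a world-coordinate vertex to pixel coordinates.
# K, R, t are assumed known from camera calibration and pose estimation.

def mat_vec(M, v):
    """3x3 matrix times 3-vector, with plain lists."""
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

def project(vertex_world, K, R, t):
    """Map one triangle vertex (world coordinates) to pixel coordinates (u, v)."""
    p_cam = [a + b for a, b in zip(mat_vec(R, vertex_world), t)]  # world -> camera
    u, v, w = mat_vec(K, p_cam)                                   # -> homogeneous pixels
    return u / w, v / w                                           # perspective divide
```

For example, with focal length 100 px, principal point (50, 50), and identity pose, the world point (0, 0, 2) projects to (50.0, 50.0).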
By way of example, fig. 1 shows a schematic diagram of an electronic device 100. Electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a compass 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identity module (subscriber identification module, SIM) card interface 195, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. The different processing units may be separate components or may be integrated in one or more processors. In some embodiments, the electronic device 100 may also include one or more processors 110. The controller can generate operation control signals according to the instruction operation codes and timing signals to complete the control of instruction fetching and instruction execution. In other embodiments, a memory may also be provided in the processor 110 for storing instructions and data. Illustratively, the memory in the processor 110 may be a cache. This memory may hold instructions or data that the processor 110 has just used or uses repeatedly. If the processor 110 needs the instructions or data again, it can call them directly from this memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency with which the electronic device 100 processes data or executes instructions.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include inter-integrated circuit (inter-integrated circuit, I2C) interfaces, inter-integrated circuit audio (inter-integrated circuit sound, I2S) interfaces, pulse code modulation (pulse code modulation, PCM) interfaces, universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interfaces, mobile industry processor interfaces (mobile industry processor interface, MIPI), general-purpose input/output (GPIO) interfaces, SIM card interfaces, and/or USB interfaces, among others. The USB interface 130 is an interface conforming to the USB standard, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. The USB interface 130 may also be used to connect headphones through which audio is played.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle times, battery health (leakage, impedance), and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a mini light-emitting diode (mini LED), a micro light-emitting diode (micro LED), a micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or more display screens 194.
The electronic device 100 may implement a photographing function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also perform algorithm optimization on noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature, etc. of the photographed scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or more cameras 193.
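The DSP's conversion from the digital image signal to a standard format is typically a fixed linear transform. As one common example (not necessarily the matrix this device uses), the BT.601 full-range YUV-to-RGB conversion for a single pixel is:

```python
# BT.601 full-range YUV -> RGB for one 8-bit pixel; an illustrative choice,
# the text does not specify the matrix actually used.

def yuv_to_rgb(y, u, v):
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    clamp = lambda x: max(0, min(255, round(x)))  # keep each channel in [0, 255]
    return clamp(r), clamp(g), clamp(b)
```

A neutral gray sample (128, 128, 128) maps to (128, 128, 128), as expected for a zero-chroma input.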
The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy, and the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network (neural-network, NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between human brain neurons, it processes input information rapidly and can also learn continuously by itself. Applications such as intelligent cognition of the electronic device 100, for example image recognition, face recognition, speech recognition, and text understanding, may be implemented through the NPU.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store one or more computer programs, including instructions. The processor 110 may cause the electronic device 100 to execute the methods provided in some embodiments of the present application, as well as various applications, data processing, and the like, by executing the above-described instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The program storage area can store an operating system; it may also store one or more applications (such as gallery, contacts, etc.). The data storage area may store data created during use of the electronic device 100 (e.g., photos, contacts, etc.). In addition, the internal memory 121 may include high-speed random access memory, and may also include nonvolatile memory, such as one or more disk storage units, flash memory units, universal flash storage (universal flash storage, UFS), and the like. In some embodiments, the processor 110 may cause the electronic device 100 to perform the methods provided in embodiments of the present application, as well as other applications and data processing, by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor 110.

The electronic device 100 may implement audio functions, such as music playing and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, the application processor, and the like.
The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The pressure sensor 180A is used to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are various types of pressure sensor 180A, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates with conductive material; when a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is smaller than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
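The intensity-dependent dispatch in the short-message example reduces to a single threshold test. The threshold value and the action names below are hypothetical:

```python
# Hypothetical threshold dispatch for the short-message icon example.
FIRST_PRESSURE_THRESHOLD = 0.5  # invented normalized pressure value

def on_sms_icon_touch(pressure):
    """Below the threshold: view the message; at or above it: compose a new one."""
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_sms"
    return "new_sms"
```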
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the X, Y, and Z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance the lens module needs to compensate according to the angle, and lets the lens counteract the shake of the electronic device 100 through reverse motion, thereby realizing anti-shake. The gyro sensor 180B may also be used for navigation and motion-sensing game scenarios.
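The anti-shake compensation above amounts to converting a measured shake angle into a lens-shift distance. A common first-order model is shift = focal_length * tan(angle); this formula is an assumption for illustration, not the device's disclosed algorithm.

```python
import math

# Assumed anti-shake model: lens shift needed to cancel a given shake angle.
def lens_compensation_mm(shake_angle_deg, focal_length_mm):
    return focal_length_mm * math.tan(math.radians(shake_angle_deg))
```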
The acceleration sensor 180E may detect the magnitude of the acceleration of the electronic device 100 in various directions (typically along three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. It may also be used to recognize the attitude of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
The ambient light sensor 180L is used to sense the ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. The ambient light sensor 180L may also cooperate with the proximity sensor 180G to detect whether the electronic device 100 is in a pocket, to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J in order to lower power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to prevent a low temperature from causing the electronic device 100 to shut down abnormally. In other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
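The three thermal thresholds described above can be written as a small policy table. All threshold values and action names below are invented for illustration:

```python
# Hypothetical thermal-management policy; thresholds and actions are invented.
HIGH_TEMP_C = 45
LOW_TEMP_C = 0
VERY_LOW_TEMP_C = -10

def thermal_policy(temp_c):
    actions = []
    if temp_c > HIGH_TEMP_C:
        actions.append("throttle_cpu")           # reduce nearby processor performance
    if temp_c < LOW_TEMP_C:
        actions.append("heat_battery")           # avoid low-temperature shutdown
    if temp_c < VERY_LOW_TEMP_C:
        actions.append("boost_battery_voltage")  # keep supply voltage up in deep cold
    return actions
```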
The touch sensor 180K, also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a different location than the display 194.
Fig. 2 is a software configuration block diagram of the electronic device 100 according to an embodiment of the present application. The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime (Android Runtime) and system libraries, and the kernel layer. The application layer may include a series of application packages.
As shown in fig. 2, the application package may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used to manage window programs. The window manager can acquire the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100, for example, the management of call states (including connected, hung up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages that automatically disappear after a short stay without requiring user interaction. For example, the notification manager is used to notify that a download is complete, to give message alerts, and the like. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, an indicator light blinks, and so on.
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core libraries of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life-cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media library (media library), three-dimensional graphics processing library (e.g., openGL ES), 2D graphics engine (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
Embodiments of the present application are described in detail below.
Referring to fig. 3A, fig. 3A is a flowchart of a texture filling method for a three-dimensional model according to an embodiment of the application, which is applied to an electronic device.
S301, the electronic device acquires a multi-frame two-dimensional image of an object to be molded and a reference three-dimensional model, wherein the multi-frame two-dimensional image comprises an image of each surface of the object to be molded, the reference three-dimensional model comprises N triangular patches, and N is a positive integer;
Each triangular patch comprises three vertices in the world coordinate system; as shown in Fig. 3B, the surface of the reference three-dimensional model of the object to be molded is formed by a large number of triangular patches.
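As an illustrative sketch (not part of the patent), the triangular-patch structure described above can be represented as follows; the class, field, and method names are assumptions:

```python
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class TriangularPatch:
    """One triangular patch of the reference three-dimensional model,
    with its three vertices expressed in the world coordinate system."""
    v0: Vec3
    v1: Vec3
    v2: Vec3

    def center(self) -> Vec3:
        """Centroid of the patch: the mean of its three vertices."""
        return tuple((self.v0[k] + self.v1[k] + self.v2[k]) / 3.0
                     for k in range(3))
```

A reference three-dimensional model is then simply a list of N such patches.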
The multi-frame two-dimensional image is a set of two-dimensional color images, which the electronic device may acquire through an image acquisition device (such as a color camera) from all directions around the object to be molded; for example, with the object to be molded as the center, the image acquisition device acquires the multi-frame two-dimensional image on a plurality of circumferences whose radii take a first set of values.
The reference three-dimensional model may be a model constructed according to the multi-frame two-dimensional image, a model constructed according to a multi-frame two-dimensional depth image acquired by a depth camera, or a model constructed according to the multi-frame two-dimensional image and the multi-frame two-dimensional depth image, which is not limited herein.
S302, the electronic equipment selects M frames of two-dimensional images from the multi-frame two-dimensional images, wherein M is a positive integer smaller than N;
There may be various specific implementations by which the electronic device selects the M frames of two-dimensional images from the multi-frame two-dimensional image. For example, the multi-frame two-dimensional image may be divided into multiple groups of two-dimensional images according to the angle of the image acquisition device when the two-dimensional images were collected, and at least one two-dimensional image with the highest resolution may be selected from each group, or at least one two-dimensional image photographed when the image acquisition device was closest to the object to be molded may be selected from each group, which is not limited herein.
S303, the electronic equipment determines the mapping relation between the N triangular patches and the M-frame two-dimensional image.
Wherein, each two-dimensional image may correspond to one or more triangular patches, which are not limited herein.
S304, the electronic equipment performs texture filling on the reference three-dimensional model according to the mapping relation to obtain a target three-dimensional model of the object to be modeled.
It can be seen that, in the embodiment of the present application, the electronic device obtains a multi-frame two-dimensional image of an object to be molded and a reference three-dimensional model, where the multi-frame two-dimensional image includes an image of each surface of the object to be molded and the reference three-dimensional model includes N triangular patches, selects M frames of two-dimensional images from the multi-frame two-dimensional image, determines a mapping relationship between the N triangular patches and the M frames of two-dimensional images, and then texture-fills the reference three-dimensional model according to the mapping relationship to obtain a target three-dimensional model of the object to be molded. The electronic device thus first screens the multi-frame two-dimensional image and performs texture mapping using the selected M frames rather than all of the two-dimensional images before performing texture filling, which reduces the data volume of the texture mapping, lowers the complexity of the texture filling algorithm, improves texture filling efficiency, and reduces power consumption.
In one possible example, the selecting M-frame two-dimensional image from the multi-frame two-dimensional image includes:
Grouping the multi-frame two-dimensional images according to preset parameters to obtain a plurality of groups of two-dimensional images;
and selecting at least one two-dimensional image from each group of two-dimensional images to obtain the M-frame two-dimensional images.
The preset parameter may be a shooting time, or may be a position parameter when a two-dimensional image is acquired, where the position parameter is a relative position parameter between the acquisition device and the object to be molded in a world coordinate system, and is not limited herein.
When the preset parameter is shooting time, the electronic device may determine a plurality of time periods, and divide the two-dimensional image shot in each time period into a group, for example, the electronic device determines every two seconds as a time period, and divides the two-dimensional image shot in each time period into a group; when the preset parameters are position parameters, the electronic device can determine a plurality of position areas, and divide the two-dimensional images acquired by the image acquisition device in each area into a group, which is not limited herein.
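As a minimal sketch of the time-based grouping described above (assuming frames carry capture timestamps; the data layout is illustrative, not from the patent):

```python
def group_frames_by_time(frames, window_s=2.0):
    """Group two-dimensional images whose capture timestamps fall within
    the same time period (2 s here, matching the example in the text).
    `frames` is a list of (timestamp_s, image_id) pairs."""
    if not frames:
        return []
    frames = sorted(frames)          # order by capture time
    t0 = frames[0][0]                # start of the first time period
    groups = {}
    for ts, image_id in frames:
        groups.setdefault(int((ts - t0) // window_s), []).append(image_id)
    return [groups[k] for k in sorted(groups)]
```

Grouping by position parameters would follow the same pattern, with the bucket key derived from the acquisition device's location region instead of the timestamp.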
In this example, when the electronic device selects the M frames of two-dimensional images from the multi-frame two-dimensional image, it groups the multi-frame two-dimensional image according to preset parameters so that the preset parameters corresponding to each group of two-dimensional images are close, and then eliminates redundant frames within each group, improving the rationality and effectiveness of two-dimensional image selection and avoiding any adverse effect on the implementation of texture mapping.
In this possible example, said selecting at least one two-dimensional image from each set of said two-dimensional images to obtain said M-frame two-dimensional image includes:
calculating the center point coordinates of the object to be molded according to a first formula;
determining a first distance from each two-dimensional image in the multi-frame two-dimensional images to the center point of the object to be molded;
and selecting the two-dimensional image with the minimum first distance from each group of the two-dimensional images to obtain the M-frame two-dimensional image.
In this possible example, the first formula is:

v_c = (1/N) · Σ_{i=1}^{N} (v_i0 + v_i1 + v_i2) / 3

and the first distance from the j-th frame of two-dimensional image to the center point of the object to be molded is:

d_j = ||v_j − v_c||

wherein v_c (x_c, y_c, z_c) is the coordinates of the center point of the object to be molded; v_i0 (x_i0, y_i0, z_i0), v_i1 (x_i1, y_i1, z_i1), and v_i2 (x_i2, y_i2, z_i2) are respectively the world coordinates of the three vertices of the i-th triangular patch, with i a positive integer less than or equal to N; and v_j (x_j, y_j, z_j) is the position coordinate of the optical center of the image acquisition device when acquiring the j-th frame of the multi-frame two-dimensional image, with j a positive integer.
The first distance from each frame of two-dimensional image to the center point of the object to be molded can be calculated by the distance from the optical center position of the image acquisition device to the center point of the object to be molded when the two-dimensional image is acquired.
That is, the mean coordinates of the center points of the N triangular patches are calculated as the coordinates of the center point of the object to be molded.
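The center-point computation and the first-distance selection (corresponding to the first formula and steps S403 to S405) can be sketched as follows; the data layout is an assumption for illustration:

```python
def model_center(patches):
    """v_c: mean of the centroids of the N triangular patches, each patch
    given as three world-coordinate vertex triples."""
    n = len(patches)
    center = [0.0, 0.0, 0.0]
    for v0, v1, v2 in patches:
        for k in range(3):
            center[k] += (v0[k] + v1[k] + v2[k]) / 3.0
    return [c / n for c in center]

def select_closest_frame(group, center):
    """Within one group, pick the frame whose optical-center position v_j
    has the smallest first distance ||v_j - v_c|| to the center point.
    `group` is a list of (optical_center_xyz, image_id) pairs."""
    def first_distance(item):
        v_j, _ = item
        return sum((v_j[k] - center[k]) ** 2 for k in range(3)) ** 0.5
    return min(group, key=first_distance)[1]
```

Applying `select_closest_frame` to each of the groups yields the M frames of two-dimensional images.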
In this example, the electronic device selects the M frames of two-dimensional images with the shortest first distances according to the first distance from each frame of two-dimensional image to the center point of the object to be molded, which helps guarantee the texture richness of the selected M frames and further improves the texture filling effect.
In one possible example, the determining the mapping relationship between the N triangular patches and the M-frame two-dimensional image includes:
q area blocks on the M-frame two-dimensional image are obtained according to the areas of the N triangular patches, which are respectively mapped on the M-frame two-dimensional image, wherein Q is a positive integer;
and determining that the mapping relation corresponding to the minimum value of the energy function is the mapping relation between the N triangular patches and the Q area blocks through iterative optimization of the energy function.
The electronic device determines at least one triangular patch corresponding to each frame of two-dimensional image according to the surface information of the object to be molded captured by the image acquisition device when that frame was acquired, and then determines the Q area blocks on the M frames of two-dimensional images from the projections of the three vertices of each triangular patch onto the corresponding two-dimensional image. For example, if the image acquisition device captures a first surface of the object to be molded when acquiring two-dimensional image A, the region of the first surface on the reference three-dimensional model determines at least one area block on image A through inverse mapping, with each triangular patch determining one area block.
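One way to obtain the area block of a patch is to project its three vertices into the image with a pinhole camera model; the following sketch assumes known intrinsics K and pose (R, t), which the patent does not specify:

```python
def project_patch(vertices, K, R, t):
    """Map the three world-space vertices of a triangular patch onto a
    two-dimensional image; the projected triangle delimits one area block.
    K and R are 3x3 nested lists, t is a length-3 list."""
    pixels = []
    for v in vertices:
        # camera coordinates: cam = R @ v + t
        cam = [sum(R[r][c] * v[c] for c in range(3)) + t[r] for r in range(3)]
        # perspective division with intrinsics: u = fx*x/z + cx, v = fy*y/z + cy
        u = K[0][0] * cam[0] / cam[2] + K[0][2]
        w = K[1][1] * cam[1] / cam[2] + K[1][2]
        pixels.append((u, w))
    return pixels
```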
The energy function may be a Markov random field (Markov Random Field, MRF) energy function; the system reaches an optimal state when the energy value of the energy function is minimal.
The iterative optimization process is one in which the electronic device changes the mapping relationship between the N triangular patches and the Q area blocks at each iteration so that the energy value of the energy function finally reaches its minimum; when the energy value is at its minimum, the iterative optimization ends.
In this example, the electronic device performs iterative optimization on the energy function to obtain a mapping relationship between the N triangular patches and the Q region blocks when the energy value of the energy function is minimum, and the mapping relationship is used as a mapping relationship for texture filling, which is favorable for guaranteeing the effect of the target three-dimensional model.
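A simple stand-in for the iterative optimization is iterated conditional modes (ICM), which re-labels one triangular patch at a time whenever doing so lowers the energy; the cost interfaces below are assumptions, and the patent does not fix a particular optimizer:

```python
def icm_optimize(num_patches, num_blocks, data_cost, neighbors, smooth_cost,
                 max_iters=50):
    """Greedy MRF energy minimization. `data_cost[a][p]` is E_data for
    patch a labeled with area block p; `neighbors[a]` lists the patches
    adjacent to a; `smooth_cost(p, q)` is E_smooth for adjacent labels."""
    # start from the per-patch data-term optimum
    labels = [min(range(num_blocks), key=lambda p: data_cost[a][p])
              for a in range(num_patches)]
    for _ in range(max_iters):
        changed = False
        for a in range(num_patches):
            def energy(p):
                return data_cost[a][p] + sum(smooth_cost(p, labels[b])
                                             for b in neighbors[a])
            best = min(range(num_blocks), key=energy)
            if best != labels[a]:
                labels[a] = best
                changed = True
        if not changed:      # no single relabeling lowers the energy: stop
            break
    return labels
```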
In one possible example, the energy function is:

E(l) = Σ_a E_data(F_a, l_p) + Σ_{(a,b)} E_smooth(l_p, l_q)

wherein the data item E_data is used to measure the richness of the texture information when area block l_p is mapped onto triangular patch F_a, and the smoothing item E_smooth is used to measure the color consistency at the texture image seam corresponding to two adjacent triangular patches F_a and F_b of the model surface. When area block l_p is mapped onto triangular patch F_a and area block l_q is mapped onto triangular patch F_b, if l_p and l_q are the same area block, E_smooth is a first value; if l_p and l_q are different area blocks, E_smooth is a second value, and the first value is smaller than the second value. Here p and q are each positive integers less than or equal to Q, and a and b are each positive integers less than or equal to N.
Wherein the first value may be 0 and the second value may be 1, which is not limited herein.
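The described smoothing item is a Potts-style cost, which can be written directly (the default first/second values of 0 and 1 follow the example above):

```python
def potts_smooth_cost(l_p, l_q, first_value=0.0, second_value=1.0):
    """E_smooth for two adjacent triangular patches: the first value when
    both map to the same area block, the larger second value otherwise."""
    return first_value if l_p == l_q else second_value
```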
In this possible example, the data item E_data uses an angle-term cost function, wherein θ_p is the included angle between the viewpoint line of sight of the image acquisition device when acquiring the two-dimensional image corresponding to area block l_p and the texture line of sight, n_a is the normal vector of triangular patch F_a, and α is a weight factor; the texture line of sight is the line between the optical center of the image acquisition device and the center of triangular patch F_a.
In this example, the electronic device evaluates the richness of the mapped texture information and the color consistency at the seams between adjacent triangular patches through an energy function composed of the data item and the smoothing item, and determines the data item through the angle term, so that the finally obtained mapping relationship selects suitable texture images under the optimal label view angle and observation view angle, reducing the texture-shift phenomenon caused by model errors and improving the richness and smoothness of the three-dimensional model's texture.
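As a hedged sketch of an angle-term data cost (the patent's exact formula is not reproduced in the text; the weighting below is one plausible reading, combining the viewpoint/texture-line-of-sight angle with the alignment between the texture line of sight and the patch normal):

```python
import math

def angle_between(u, w):
    """Angle in radians between two 3-D direction vectors."""
    dot = sum(a * b for a, b in zip(u, w))
    nu = math.sqrt(sum(a * a for a in u))
    nw = math.sqrt(sum(a * a for a in w))
    return math.acos(max(-1.0, min(1.0, dot / (nu * nw))))

def angle_data_cost(view_dir, texture_dir, normal, alpha=0.5):
    """Assumed E_data: small when the viewpoint line of sight agrees with
    the texture line of sight and the texture line of sight is aligned
    with the patch normal."""
    return alpha * angle_between(view_dir, texture_dir) + \
        (1.0 - alpha) * angle_between(texture_dir, normal)
```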
Referring to fig. 4, fig. 4 is a flowchart illustrating another texture filling method for a three-dimensional model according to an embodiment of the application, where the texture filling method for a three-dimensional model may be applied to an electronic device. As shown in the figure, the texture filling method of the three-dimensional model comprises the following operations:
S401, the electronic device acquires a multi-frame two-dimensional image of an object to be molded and a reference three-dimensional model, wherein the multi-frame two-dimensional image comprises an image of each surface of the object to be molded, the reference three-dimensional model comprises N triangular patches, and N is a positive integer.
S402, the electronic equipment groups the multi-frame two-dimensional images according to a preset angle to obtain a plurality of groups of two-dimensional images.
S403, the electronic equipment calculates the center point coordinates of the object to be molded according to a first formula.
S404, the electronic equipment determines a first distance from each two-dimensional image in the multi-frame two-dimensional images to the center point of the object to be molded.
And S405, the electronic equipment selects the two-dimensional image with the minimum first distance from each group of two-dimensional images to obtain M frames of two-dimensional images.
S406, the electronic device obtains Q area blocks on the M-frame two-dimensional image according to the areas of the N triangular patches, which are respectively mapped on the M-frame two-dimensional image, wherein Q is a positive integer.
S407, the electronic device determines that the mapping relation corresponding to the minimum value of the energy function is the mapping relation between the N triangular patches and the Q area blocks through iterative optimization of the energy function.
And S408, the electronic equipment performs texture filling on the reference three-dimensional model according to the mapping relation to obtain a target three-dimensional model of the object to be modeled.
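The flow S401 to S408 can be tied together in one orchestration sketch, with each step supplied as a callable (this decomposition is illustrative, not from the patent):

```python
def texture_fill_pipeline(frames, patches, group_fn, select_fn, map_fn, fill_fn):
    """Run the whole method: group the multi-frame two-dimensional image,
    keep one frame per group, build the patch-to-area-block mapping, then
    fill the reference model's textures from that mapping."""
    groups = group_fn(frames)                   # S402: group by preset angle
    selected = [select_fn(g) for g in groups]   # S403-S405: closest frame per group
    mapping = map_fn(patches, selected)         # S406-S407: area blocks + MRF optimization
    return fill_fn(patches, mapping)            # S408: texture filling
```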
It can be seen that, in the embodiment of the present application, the electronic device obtains a multi-frame two-dimensional image of an object to be molded and a reference three-dimensional model, where the multi-frame two-dimensional image includes an image of each surface of the object to be molded and the reference three-dimensional model includes N triangular patches, selects M frames of two-dimensional images from the multi-frame two-dimensional image, determines a mapping relationship between the N triangular patches and the M frames of two-dimensional images, and then texture-fills the reference three-dimensional model according to the mapping relationship to obtain a target three-dimensional model of the object to be molded. The electronic device thus first screens the multi-frame two-dimensional image and performs texture mapping using the selected M frames rather than all of the two-dimensional images before performing texture filling, which reduces the data volume of the texture mapping, lowers the complexity of the texture filling algorithm, improves texture filling efficiency, and reduces power consumption.
In addition, the electronic equipment selects M-frame two-dimensional images with the shortest first distance according to the first distance from each frame of two-dimensional image to the center point of the object to be molded, so that the texture richness of the selected M-frame two-dimensional images is guaranteed, and the texture filling effect is improved.
In addition, the electronic equipment obtains the mapping relation between the N triangular patches and the Q area blocks when the energy value of the energy function is minimum by carrying out iterative optimization on the energy function, and the mapping relation is used as the mapping relation of texture filling, so that the effect of the target three-dimensional model is guaranteed.
The embodiment of the application provides a texture filling device for a three-dimensional model, which can be an electronic device 100. Specifically, the texture filling device of the three-dimensional model is used for executing the steps of the texture filling method of the three-dimensional model. The texture filling device of the three-dimensional model provided by the embodiment of the application can comprise modules corresponding to the corresponding steps.
According to the embodiment of the application, the texture filling device of the three-dimensional model may be divided into functional modules according to the above method examples; for example, each functional module may correspond to one function, or two or more functions may be integrated in one processing module. The integrated modules may be implemented in hardware or as software functional modules. The division of modules in the embodiment of the application is schematic and is merely a division by logical function; other division manners may be adopted in actual implementation.
Fig. 5 shows a possible structural diagram of the texture filling apparatus of the three-dimensional model involved in the above-described embodiment in the case where the respective functional blocks are divided with the respective functions. As shown in fig. 5, the texture filling apparatus 500 of the three-dimensional model includes an acquisition unit 501, a selection unit 502, a determination unit 503, and a filling unit 504, wherein:
the acquiring unit 501 is configured to acquire a multi-frame two-dimensional image of an object to be molded and a reference three-dimensional model, where the multi-frame two-dimensional image includes an image of each surface of the object to be molded, and the reference three-dimensional model includes N triangular patches, where N is a positive integer;
the selecting unit 502 is configured to select M frames of two-dimensional images from the multiple frames of two-dimensional images, where M is a positive integer less than N;
the determining unit 503 is configured to determine a mapping relationship between the N triangular patches and the M-frame two-dimensional image;
the filling unit 504 is configured to perform texture filling on the reference three-dimensional model according to the mapping relationship, so as to obtain a target three-dimensional model of the object to be modeled.
All relevant contents of each step related to the above method embodiment may be cited to the functional description of the corresponding functional module, which is not described herein. Of course, the texture filling device of the three-dimensional model provided by the embodiment of the application includes, but is not limited to, the above modules, for example: the texture filling device of the three-dimensional model may further comprise a storage unit. The memory unit may be used for storing program code and data of the texture filling means of the three-dimensional model.
In the case of using an integrated unit, a schematic structural diagram of a texture filling device for a three-dimensional model according to an embodiment of the present application is shown in fig. 6. In fig. 6, the texture filling apparatus 600 of the three-dimensional model includes: a processing module 602 and a communication module 601. The processing module 602 is configured to control and manage the actions of the texture filling apparatus of the three-dimensional model, for example, performing the steps performed by the acquiring unit 501, the selecting unit 502, the determining unit 503, and the filling unit 504, and/or performing other processes of the techniques described herein. The communication module 601 is used to support interaction between the texture filling device of the three-dimensional model and other devices, or between internal modules of the texture filling device of the three-dimensional model. As shown in fig. 6, the texture filling apparatus of the three-dimensional model may further include a storage module 603, where the storage module 603 is configured to store the program code and data of the texture filling apparatus of the three-dimensional model, for example, the content stored in the storage unit described above.
The processing module 602 may be a processor or controller, such as a central processing unit (Central Processing Unit, CPU), a general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an ASIC, an FPGA or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute the various exemplary logical blocks, modules, and circuits described in connection with this disclosure. The processor may also be a combination that performs computing functions, for example, a combination comprising one or more microprocessors, or a combination of a DSP and a microprocessor. The communication module 601 may be a transceiver, a radio-frequency circuit, a communication interface, or the like. The storage module 603 may be a memory.
All relevant contents of each scenario related to the above method embodiment may be cited to the functional description of the corresponding functional module, which is not described herein. Both the texture filling apparatus 500 of the three-dimensional model and the texture filling apparatus 600 of the three-dimensional model may perform the texture filling method of the three-dimensional model shown in any one of fig. 3A to 4.
The present embodiment also provides a computer storage medium having stored therein computer instructions which, when executed on an electronic device, cause the electronic device to perform the above-described related method steps to implement the operating method in the above-described embodiments.
The present embodiment also provides a computer program product which, when run on a computer, causes the computer to perform the above-described related steps to implement the texture filling method of the three-dimensional model in the above-described embodiments.
In addition, embodiments of the present application also provide an apparatus, which may be embodied as a chip, component or module, which may include a processor and a memory coupled to each other; the memory is configured to store computer-executable instructions, and when the apparatus is running, the processor may execute the computer-executable instructions stored in the memory, so that the chip performs the texture filling method of the three-dimensional model in the above method embodiments.
The electronic device, the computer storage medium, the computer program product, or the chip provided in this embodiment are used to execute the corresponding methods provided above, so that the beneficial effects thereof can be referred to the beneficial effects in the corresponding methods provided above, and will not be described herein.
It will be appreciated by those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts shown as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application may be essentially or a part contributing to the prior art or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (may be a single-chip microcomputer, a chip or the like) or a processor (processor) to perform all or part of the steps of the methods of the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read Only Memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present application. Therefore, the protection scope of the application is subject to the protection scope of the claims.

Claims (8)

1. A texture filling method of a three-dimensional model, applied to an electronic device, the method comprising:
acquiring a multi-frame two-dimensional image of an object to be molded and a reference three-dimensional model, wherein the multi-frame two-dimensional image comprises images of each surface of the object to be molded, and the reference three-dimensional model comprises N triangular patches, and N is a positive integer;
selecting M frames of two-dimensional images from the multi-frame two-dimensional images, which comprises: grouping the multi-frame two-dimensional images according to a preset parameter to obtain a plurality of groups of two-dimensional images, and selecting at least one two-dimensional image from each group of two-dimensional images to obtain the M frames of two-dimensional images, wherein M is a positive integer smaller than N;
determining a mapping relationship between the N triangular patches and the M frames of two-dimensional images, which comprises: determining, according to the surface information of the object to be modeled captured when each frame of two-dimensional image is acquired, at least one triangular patch corresponding to each frame of two-dimensional image; obtaining Q area blocks on the M frames of two-dimensional images by projecting the three vertices of each triangular patch on the reference three-dimensional model onto the corresponding two-dimensional image, Q being a positive integer; and determining, through iterative optimization of an energy function, the mapping relationship corresponding to the minimum value of the energy function as the mapping relationship between the N triangular patches and the Q area blocks;
and performing texture filling on the reference three-dimensional model according to the mapping relationship to obtain a target three-dimensional model of the object to be modeled.
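The key-frame selection and vertex-projection steps of claim 1 can be sketched as follows; the fixed group size, the pinhole camera model, and all function names are illustrative assumptions rather than the patent's actual implementation.

```python
import numpy as np

def select_keyframes(num_frames, group_size):
    # Group frame indices by a preset parameter (a fixed group size is an
    # illustrative choice) and keep one representative per group, so that
    # M key frames are selected out of the full capture.
    groups = [list(range(i, min(i + group_size, num_frames)))
              for i in range(0, num_frames, group_size)]
    return [g[0] for g in groups]

def project_vertices(vertices, K, R, t):
    # Pinhole projection of the three vertices of a triangular patch into
    # a selected frame; the projected triangle bounds one "area block".
    pts = (K @ (R @ vertices.T + t.reshape(3, 1))).T
    return pts[:, :2] / pts[:, 2:3]

# Demo: 10 captured frames grouped in fours give M = 3 key frames.
keys = select_keyframes(10, 4)
tri = np.array([[0.0, 0.0, 5.0], [1.0, 0.0, 5.0], [0.0, 1.0, 5.0]])
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
uv = project_vertices(tri, K, np.eye(3), np.zeros(3))
```

Each projected triangle delimits the image region (area block) that is a candidate texture for the corresponding patch.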
2. The method of claim 1, wherein selecting at least one two-dimensional image from each group of two-dimensional images to obtain the M frames of two-dimensional images comprises:
calculating the coordinates of the center point of the object to be modeled according to a first formula;
determining a first distance from each frame of two-dimensional image in the multi-frame two-dimensional images to the center point of the object to be modeled;
and selecting, from each group of two-dimensional images, the two-dimensional image with the minimum first distance to obtain the M frames of two-dimensional images.
3. The method of claim 2, wherein the first formula is:
v_c = (1 / (3N)) · Σ_{i=1}^{N} (v_{i0} + v_{i1} + v_{i2})
and the first distance from the j-th frame of two-dimensional image to the center point of the object to be modeled is:
d_j = ‖v_j − v_c‖ = √((x_j − x_c)² + (y_j − y_c)² + (z_j − z_c)²)
wherein v_c(x_c, y_c, z_c) is the coordinate of the center point of the object to be modeled; v_{i0}(x_{i0}, y_{i0}, z_{i0}), v_{i1}(x_{i1}, y_{i1}, z_{i1}) and v_{i2}(x_{i2}, y_{i2}, z_{i2}) are respectively the world coordinates of the three vertices of the i-th triangular patch, i being a positive integer less than or equal to N; and v_j(x_j, y_j, z_j) is the position coordinate of the optical center of the image acquisition device when acquiring the j-th frame of image in the multi-frame two-dimensional images, j being a positive integer.
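A minimal numeric sketch of the center-point and first-distance computation of claim 3, assuming the center point is the mean of all patch vertices (the names below are illustrative):

```python
import numpy as np

def center_point(patch_vertices):
    # patch_vertices: (N, 3, 3) array holding the world coordinates of the
    # three vertices of each of the N triangular patches; the center point
    # is taken here as the mean of all 3N vertices.
    return patch_vertices.reshape(-1, 3).mean(axis=0)

def first_distance(optical_center, v_c):
    # Euclidean distance from the optical center of the image acquisition
    # device for frame j to the center point v_c.
    return float(np.linalg.norm(optical_center - v_c))

# Demo: two triangular patches forming a unit-height-free square.
patches = np.array([[[0.0, 0.0, 0.0], [2.0, 0.0, 0.0], [0.0, 2.0, 0.0]],
                    [[2.0, 0.0, 0.0], [0.0, 2.0, 0.0], [2.0, 2.0, 0.0]]])
v_c = center_point(patches)
d = first_distance(np.array([1.0, 1.0, 3.0]), v_c)
```

The frame whose optical center yields the smallest such distance is the representative chosen from each group.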
4. The method of claim 1, wherein the energy function is:
E = Σ_a E_data(F_a, l_p) + Σ_{(F_a, F_b)} E_smooth(l_p, l_q)
wherein the data item E_data is used for measuring the richness of texture information when the area block l_p is mapped onto the triangular patch F_a, and the smoothness item E_smooth is used for measuring the color consistency at the texture image seam between two adjacent triangular patches F_a and F_b on the model surface; when the area block l_p is mapped onto the triangular patch F_a and the area block l_q is mapped onto the triangular patch F_b, if l_p and l_q are the same area block, E_smooth takes a first value, and if l_p and l_q are different area blocks, E_smooth takes a second value, the first value being smaller than the second value; p and q are each a positive integer less than or equal to Q, and a and b are each a positive integer less than or equal to N.
5. The method according to claim 4, wherein the data item E_data uses an angle term cost function, wherein θ_p is the included angle between the viewpoint sight line of the image acquisition device when acquiring the two-dimensional image corresponding to the area block l_p and the texture sight line, n_a is the normal vector of the triangular patch F_a, and α is a weight factor; the texture sight line is the line between the optical center of the image acquisition device and the center of the triangular patch F_a.
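The labelling objective of claims 4-5 can be sketched as a data term plus a pairwise smoothness term; the concrete angle-based cost and the exhaustive minimisation below are illustrative assumptions (the patent only fixes that E_smooth takes a smaller value when adjacent patches share an area block):

```python
import itertools

def data_term(angle, alpha=0.7):
    # Illustrative angle-term cost: the larger the angle between the
    # viewpoint sight line and the texture sight line, the poorer the
    # texture information and the higher the cost.
    return alpha * angle

def smooth_term(label_p, label_q, first_value=0.0, second_value=1.0):
    # Same area block on adjacent patches -> first value; different
    # blocks -> second (larger) value, per claim 4.
    return first_value if label_p == label_q else second_value

def energy(labels, adjacency, angles):
    e = sum(data_term(angles[face][labels[face]]) for face in range(len(labels)))
    e += sum(smooth_term(labels[a], labels[b]) for a, b in adjacency)
    return e

# Two adjacent patches, two candidate area blocks each; pick the labelling
# with minimum energy by exhaustive search (a toy stand-in for the
# iterative optimisation of claim 1).
angles = [[0.1, 0.5], [0.2, 0.1]]
adjacency = [(0, 1)]
best = min(itertools.product([0, 1], repeat=2),
           key=lambda l: energy(list(l), adjacency, angles))
```

Here the same-block labelling wins: it pays only the two small data costs and no seam penalty.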
6. A texture filling apparatus for a three-dimensional model, applied to an electronic device, the apparatus comprising an acquisition unit, a selection unit, a determination unit and a filling unit, wherein:
the acquisition unit is configured to acquire multi-frame two-dimensional images of an object to be modeled and a reference three-dimensional model, wherein the multi-frame two-dimensional images comprise images of each surface of the object to be modeled, the reference three-dimensional model comprises N triangular patches, and N is a positive integer;
the selection unit is configured to select M frames of two-dimensional images from the multi-frame two-dimensional images, which comprises: grouping the multi-frame two-dimensional images according to a preset parameter to obtain a plurality of groups of two-dimensional images, and selecting at least one two-dimensional image from each group of two-dimensional images to obtain the M frames of two-dimensional images, wherein M is a positive integer smaller than N;
the determination unit is configured to determine a mapping relationship between the N triangular patches and the M frames of two-dimensional images, which comprises: determining, according to the surface information of the object to be modeled captured when each frame of two-dimensional image is acquired, at least one triangular patch corresponding to each frame of two-dimensional image; obtaining Q area blocks on the M frames of two-dimensional images by projecting the three vertices of each triangular patch on the reference three-dimensional model onto the corresponding two-dimensional image, Q being a positive integer; and determining, through iterative optimization of an energy function, the mapping relationship corresponding to the minimum value of the energy function as the mapping relationship between the N triangular patches and the Q area blocks;
and the filling unit is configured to perform texture filling on the reference three-dimensional model according to the mapping relationship to obtain a target three-dimensional model of the object to be modeled.
7. An electronic device comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing the steps in the method of any of claims 1-5.
8. A computer-readable storage medium, storing a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method according to any one of claims 1-5.
CN202010741361.7A 2020-07-28 2020-07-28 Texture filling method and device for three-dimensional model Active CN111882642B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010741361.7A CN111882642B (en) 2020-07-28 2020-07-28 Texture filling method and device for three-dimensional model

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010741361.7A CN111882642B (en) 2020-07-28 2020-07-28 Texture filling method and device for three-dimensional model

Publications (2)

Publication Number Publication Date
CN111882642A CN111882642A (en) 2020-11-03
CN111882642B true CN111882642B (en) 2023-11-21

Family

ID=73200326

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010741361.7A Active CN111882642B (en) 2020-07-28 2020-07-28 Texture filling method and device for three-dimensional model

Country Status (1)

Country Link
CN (1) CN111882642B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112634414B * 2020-12-24 2023-09-05 Beijing Baidu Netcom Science and Technology Co., Ltd. Map display method and device
CN113591300B * 2021-07-29 2024-03-15 Shenzhen Creality 3D Technology Co., Ltd. Method, device, computer equipment and storage medium for generating 3D printing file
CN115131419B * 2022-06-15 2023-05-30 Honor Device Co., Ltd. Image processing method for forming Tyndall luminous effect and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002032741A (en) * 2000-07-13 2002-01-31 Sony Corp System and method for three-dimensional image generation and program providing medium
CN104574501A * 2014-12-19 2015-04-29 Zhejiang University High-quality texture mapping method aiming at complicated three-dimensional scene
CN107484428A * 2015-03-25 2017-12-15 "Laboratory 24" Co., Ltd. Method for showing object
CN109191393A * 2018-08-16 2019-01-11 Guangdong Oppo Mobile Telecommunications Corp Ltd Face beautification method based on three-dimensional model
CN111145081A * 2019-12-16 2020-05-12 Foshan University Three-dimensional model view projection method and system based on space volume characteristics

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100254607A1 (en) * 2009-04-02 2010-10-07 Kamal Patel System and method for image mapping and integration


Also Published As

Publication number Publication date
CN111882642A (en) 2020-11-03

Similar Documents

Publication Publication Date Title
CN111553846B (en) Super-resolution processing method and device
CN111738122B (en) Image processing method and related device
CN111882642B (en) Texture filling method and device for three-dimensional model
CN111782879B (en) Model training method and device
CN113994317A (en) User interface layout method and electronic equipment
CN111555825B (en) Radio frequency resource allocation method and device
CN111400605A (en) Recommendation method and device based on eyeball tracking
CN110830645B (en) Operation method, electronic equipment and computer storage medium
CN111768352A (en) Image processing method and device
CN111767016B (en) Display processing method and device
CN111612723B (en) Image restoration method and device
CN111524528B (en) Voice awakening method and device for preventing recording detection
CN111381996B (en) Memory exception handling method and device
CN111581119B (en) Page recovery method and device
CN111836226B (en) Data transmission control method, device and storage medium
CN114384465A (en) Azimuth angle determination method and device
CN111768416B (en) Photo cropping method and device
CN111459271B (en) Gaze offset error determination method and device
CN114336998A (en) Charging control method, charging control device and storage medium
CN114510192B (en) Image processing method and related device
CN114596819B (en) Brightness adjusting method and related device
CN113311380B (en) Calibration method, device and storage medium
CN114630153B (en) Parameter transmission method and device for application processor and storage medium
CN113505451B (en) Method for determining narrowest width of waterproof sealing foam of upper cover and lower cover and related products
CN111768416A (en) Photo clipping method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant