CN110458902A - 3D illumination estimation method and electronic equipment - Google Patents


Info

Publication number: CN110458902A (application No. CN201910586485.XA)
Authority: CN (China)
Prior art keywords: scene; illumination; information; electronic equipment; ball
Legal status: Granted; currently Active (the legal status is an assumption and is not a legal conclusion; no legal analysis has been performed as to its accuracy)
Other languages: Chinese (zh)
Other versions: CN110458902B
Inventors: 王习之, 刘昆, 李阳, 杜成, 王强
Current and original assignee: Huawei Technologies Co Ltd (the listed assignee may be inaccurate; no legal analysis has been performed)
Application filed by Huawei Technologies Co Ltd
Events: publication of CN110458902A; application granted; publication of CN110458902B; anticipated expiration

Classifications

    • G: PHYSICS
        • G06: COMPUTING; CALCULATING OR COUNTING
            • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T 19/00: Manipulating 3D models or images for computer graphics
                    • G06T 19/006: Mixed reality
                • G06T 7/00: Image analysis
                    • G06T 7/90: Determination of colour characteristics
                • G06T 2207/00: Indexing scheme for image analysis or image enhancement
                    • G06T 2207/10: Image acquisition modality
                        • G06T 2207/10028: Range image; depth image; 3D point clouds
            • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
                • G06V 10/00: Arrangements for image or video recognition or understanding
                    • G06V 10/40: Extraction of image or video features
                        • G06V 10/56: Extraction of image or video features relating to colour
                • G06V 20/00: Scenes; Scene-specific elements
                    • G06V 20/20: Scenes; Scene-specific elements in augmented reality scenes

Abstract

Embodiments of the present application provide a three-dimensional (3D) illumination estimation method and an electronic device, relating to the field of augmented reality. Lighting effects closer to those of the real scene can be estimated from at least one set of spherical harmonic coefficients, so that virtual objects blend with the real scene more convincingly and the user experience is improved. The method comprises: obtaining image information of a first scene, where the image information of the first scene includes colour information of pixels in the first scene and depth information of pixels in the first scene; establishing a first simulated scene from the image information of the first scene, where the first simulated scene includes at least two light probes (illumination detection balls) and a point cloud; estimating the spherical harmonic coefficients of the light probes from the information of the point cloud included in the first simulated scene, where the information of the point cloud includes the position information and irradiance information of the points in the point cloud; and estimating the spherical harmonic coefficients at the position of the virtual object from the spherical harmonic coefficients of the light probes.

Description

3D illumination estimation method and electronic equipment
" this application claims in submission on March 26th, 2019 State Intellectual Property Office, application No. is 201910234517.X, The priority of the Chinese patent application of entitled " 3D illumination estimation method and electronic equipment ", entire contents pass through reference It is incorporated in the present application ".
Technical field
This application relates to the field of augmented reality, and in particular to a 3D illumination estimation method and an electronic device for multiple scene types.
Background
Augmented reality (AR) is a new technology that developed on the basis of virtual reality. It augments the user's perception of the real world with information supplied by a computer system, superimposing computer-generated virtual objects, scenes, or system prompt information onto the real scene to "enhance" reality, and thereby "seamlessly" integrates real-world information with virtual-world information. How to harmonize the rendering of virtual objects with the environment is therefore significant to the user experience of AR products, and rendering virtual objects using illumination estimation is an important component of "seamless" AR.
Existing manufacturers have put forward related solutions, for example ARCore, the software development kit for AR applications released by Google, and ARKit, the software development kit for AR applications released by Apple. Through such software, however, an electronic device can obtain only limited information, such as the colour compensation information and intensity information of the ambient light. The lighting information estimated from this information is a single average value (that is, the lighting information is the same in every direction), so the illumination of a virtual object rendered from it differs considerably from the illumination in the real environment: the augmented reality effect is unrealistic and the user experience is poor.
Summary of the invention
The embodiments of the present application provide a 3D illumination estimation method and an electronic device, which can estimate lighting effects closer to those of the real scene from at least one set of spherical harmonic coefficients, so that virtual objects blend with the real scene more convincingly and the user experience is improved.
In order to achieve the above objectives, the embodiments of the present application adopt the following technical solutions.
In a first aspect, an embodiment of the present application provides a 3D illumination estimation method, the method comprising: obtaining image information of a first scene, where the image information of the first scene includes colour information of pixels in the first scene and depth information of pixels in the first scene; establishing a first simulated scene from the image information of the first scene, where the first simulated scene includes at least two light probes (illumination detection balls) and a point cloud; estimating the spherical harmonic coefficients of the light probes from the information of the point cloud included in the first simulated scene, where the information of the point cloud includes the position information and irradiance information of the points in the point cloud; and estimating the spherical harmonic coefficients at the position of the virtual object from the spherical harmonic coefficients of the light probes.
In the technical solution provided by the first aspect above, the electronic device obtains the image information of the first scene, establishes the first simulated scene from that image information, estimates the spherical harmonic coefficients of the light probes from the information of the point cloud included in the first simulated scene, and then estimates the spherical harmonic coefficients at the position of the virtual object from the probes' spherical harmonic coefficients. Spherical harmonic coefficients can carry more information, such as the intensity and direction of the light. The electronic device can therefore estimate lighting effects closer to those of the real scene, so that virtual objects blend with the real scene more convincingly and the user experience is improved.
With reference to the first aspect, in a first possible implementation, for a first light probe, estimating the spherical harmonic coefficients of the first light probe from the information of the point cloud included in the first simulated scene comprises: obtaining a first irradiance of the first light probe from the information of the point cloud; and estimating the spherical harmonic coefficients of the first light probe from the first irradiance of the first light probe, where the first light probe is any light probe included in the first simulated scene. Based on this first possible implementation of the first aspect, the electronic device can obtain the irradiance of any light probe from the point-cloud information and estimate that probe's spherical harmonic coefficients from its irradiance, so that the electronic device can then estimate the spherical harmonic coefficients at the position of the virtual object from the probes' spherical harmonic coefficients.
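As a rough sketch of how a probe's spherical harmonic coefficients could be estimated once directional radiance samples around it are available, the following fragment fits a 9-term (band-2) real spherical-harmonic basis by least squares. The function names (`sh_basis`, `fit_probe_sh`) and the choice of three SH bands are illustrative assumptions; the patent fixes neither the order of the expansion nor the fitting method.

```python
import numpy as np

def sh_basis(d):
    """Real spherical-harmonic basis up to band 2 (9 terms) for a unit direction d."""
    x, y, z = d
    return np.array([
        0.282095,                                    # Y(0,0)
        0.488603 * y, 0.488603 * z, 0.488603 * x,    # band 1
        1.092548 * x * y, 1.092548 * y * z,          # band 2
        0.315392 * (3.0 * z * z - 1.0),
        1.092548 * x * z, 0.546274 * (x * x - y * y),
    ])

def fit_probe_sh(directions, radiance):
    """Least-squares fit of 9 SH coefficients to radiance sampled around a probe."""
    basis = np.stack([sh_basis(d) for d in directions])   # (N, 9) design matrix
    coeffs, *_ = np.linalg.lstsq(basis, radiance, rcond=None)
    return coeffs
```

With at least nine well-distributed sample directions the fit recovers the coefficients of any band-2 lighting environment exactly, so the probe's stored state reduces to nine numbers per colour channel.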
With reference to the first possible implementation of the first aspect, in a second possible implementation, obtaining the first irradiance of the first light probe from the information of the point cloud comprises: obtaining the first irradiance of the first light probe from the information of first visible points, where a first visible point is a point of the point cloud within the visible range of the first light probe. Based on this second possible implementation of the first aspect, the electronic device can obtain the first irradiance of the first light probe from the information of the first visible points, so that the electronic device can then estimate the first light probe's spherical harmonic coefficients from that first irradiance.
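A minimal sketch of gathering a probe's irradiance from the points visible to it might look as follows. The visibility test (a simple radius check) and the inverse-square attenuation are assumptions made purely for illustration; the patent does not specify how the visible range or each point's contribution is computed.

```python
import numpy as np

def probe_irradiance(probe_pos, point_pos, point_irr, max_dist=2.0):
    """Sum the irradiance contributions of point-cloud samples within the
    probe's (crudely approximated) visible range."""
    offsets = point_pos - probe_pos               # (N, 3) vectors probe -> point
    dist = np.linalg.norm(offsets, axis=1)
    visible = (dist > 1e-6) & (dist <= max_dist)  # radius check stands in for visibility
    falloff = 1.0 / np.maximum(dist[visible] ** 2, 1e-6)
    return float(np.sum(point_irr[visible] * falloff))
```

In a fuller implementation the radius check would be replaced by an occlusion-aware visibility query against the reconstructed scene geometry.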
With reference to the first aspect and its various possible implementations, in a third possible implementation, estimating the spherical harmonic coefficients at the position of the virtual object from the spherical harmonic coefficients of the light probes comprises: performing a weighted summation of the spherical harmonic coefficients of the light probes to obtain the spherical harmonic coefficients at the position of the virtual object. Based on this third possible implementation of the first aspect, the electronic device can weight and sum the probes' spherical harmonic coefficients to obtain the coefficients at the virtual object's position, so that the electronic device can then estimate the lighting effect of the virtual object in the real scene from those coefficients.
With reference to the third possible implementation of the first aspect, in a fourth possible implementation, the distance between each light probe used and the virtual object is less than or equal to a first distance. Based on this fourth possible implementation of the first aspect, the electronic device performs the weighted summation over the spherical harmonic coefficients of only those light probes whose distance to the virtual object does not exceed the first distance, and thereby obtains the spherical harmonic coefficients at the position of the virtual object.
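Under these third and fourth implementations, the weighted summation over nearby probes might be sketched as below, assuming (again only for illustration) inverse-distance weights normalized to sum to one; the patent leaves the weighting function open.

```python
import numpy as np

def blend_probe_sh(object_pos, probe_pos, probe_sh, first_distance=1.0):
    """Distance-weighted sum of the SH coefficients of the light probes that
    lie within `first_distance` of the virtual object."""
    dist = np.linalg.norm(probe_pos - object_pos, axis=1)   # (P,) distances
    near = dist <= first_distance
    if not np.any(near):
        raise ValueError("no light probe within the first distance of the object")
    weights = 1.0 / (dist[near] + 1e-6)                     # closer probes count more
    weights /= weights.sum()                                # normalize to a convex sum
    return weights @ probe_sh[near]                         # (9,) blended coefficients
```

Because SH lighting is linear in its coefficients, blending coefficients is equivalent to blending the lighting environments themselves, which is what makes this per-object interpolation cheap at render time.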
With reference to the first aspect and its various possible implementations, in a fifth possible implementation, the method further comprises: obtaining the position of the virtual object from a user input; or, presetting the position of the virtual object. Based on this fifth possible implementation of the first aspect, the electronic device can either obtain the virtual object's position from a user input or use a preset position.
In a second aspect, an embodiment of the present application provides a 3D illumination estimation method, the method comprising: obtaining image information of the first scene, where the image information of the first scene includes colour information of pixels in the first scene; obtaining a sky map corresponding to the first scene from the image information of the first scene, where the sky map corresponding to the first scene indicates the illumination distribution of the first scene; and estimating the spherical harmonic coefficients of the virtual object from the information of the sky map, where the information of the sky map includes the spherical harmonic coefficients of the sky map.
In the technical solution provided by the second aspect above, the electronic device obtains the image information of the first scene, obtains the corresponding sky map from that image information, and estimates the spherical harmonic coefficients of the virtual object from the information of the sky map. Spherical harmonic coefficients can carry more information, such as the intensity and direction of the light. The electronic device can therefore estimate lighting effects closer to those of the real scene, so that virtual objects blend with the real scene more convincingly and the user experience is improved.
With reference to the second aspect, in a first possible implementation, estimating the spherical harmonic coefficients of the virtual object from the information of the sky map comprises: using the spherical harmonic coefficients of the sky map as the spherical harmonic coefficients of the virtual object. Based on this first possible implementation of the second aspect, the electronic device can take the sky map's spherical harmonic coefficients as the spherical harmonic coefficients of the virtual object, so that the electronic device can then estimate the virtual object's lighting effect in the real scene from them.
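As a hedged illustration of computing a sky map's spherical harmonic coefficients, the sketch below assumes an equirectangular single-channel sky image and a band-2 expansion; neither the sky-map parameterization nor the expansion order is fixed by the patent. Each texel's contribution is weighted by its solid angle.

```python
import numpy as np

def sh_basis(d):
    """Real spherical-harmonic basis up to band 2 (9 terms) for a unit direction d."""
    x, y, z = d
    return np.array([
        0.282095,
        0.488603 * y, 0.488603 * z, 0.488603 * x,
        1.092548 * x * y, 1.092548 * y * z,
        0.315392 * (3.0 * z * z - 1.0),
        1.092548 * x * z, 0.546274 * (x * x - y * y),
    ])

def skymap_to_sh(sky):
    """Project an equirectangular (H, W) sky image onto 9 SH coefficients,
    weighting each texel by its solid angle sin(theta) * dtheta * dphi."""
    height, width = sky.shape
    d_theta, d_phi = np.pi / height, 2.0 * np.pi / width
    coeffs = np.zeros(9)
    for i in range(height):
        theta = (i + 0.5) * d_theta                  # polar angle of this row
        sin_t, cos_t = np.sin(theta), np.cos(theta)
        for j in range(width):
            phi = (j + 0.5) * d_phi                  # azimuth of this column
            direction = (sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t)
            coeffs += sky[i, j] * sh_basis(direction) * sin_t * d_theta * d_phi
    return coeffs
```

For example, a uniform sky of unit radiance yields only a DC term of roughly 0.282 times the full solid angle of the sphere, the higher-order coefficients vanishing by symmetry.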
With reference to the second aspect and its first possible implementation, in a second possible implementation, the method further comprises: obtaining the position of the virtual object from a user input; or, presetting the position of the virtual object. Based on this second possible implementation of the second aspect, the electronic device can either obtain the virtual object's position from a user input or use a preset position.
In a third aspect, an embodiment of the present application provides an electronic device having the function of implementing the method described in the first aspect above. The function may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the function.
In a fourth aspect, an embodiment of the present application provides an electronic device having the function of implementing the method described in the second aspect above. The function may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the function.
In a fifth aspect, an embodiment of the present application provides an electronic device comprising at least one processor and at least one memory, the at least one processor being coupled to the at least one memory. The at least one memory stores a computer program such that, when the computer program is executed by the at least one processor, the 3D illumination estimation method described in the first aspect and its various possible implementations is carried out.
In a sixth aspect, an embodiment of the present application provides an electronic device comprising at least one processor and at least one memory, the at least one processor being coupled to the at least one memory. The at least one memory stores a computer program such that, when the computer program is executed by the at least one processor, the 3D illumination estimation method described in the second aspect and its various possible implementations is carried out.
In a seventh aspect, the present application provides a system-on-chip that can be applied in an electronic device. The system-on-chip comprises at least one processor in which the program instructions involved are executed, so as to realise the functions of the method of the first aspect, in any of its designs, on the electronic device. Optionally, the system-on-chip may further comprise at least one memory storing the program instructions involved.
In an eighth aspect, the present application provides a system-on-chip that can be applied in an electronic device. The system-on-chip comprises at least one processor in which the program instructions involved are executed, so as to realise the functions of the method of the second aspect, in any of its designs, on the electronic device. Optionally, the system-on-chip may further comprise at least one memory storing the program instructions involved.
In a ninth aspect, an embodiment of the present application provides a computer-readable storage medium, for example a non-transitory computer-readable storage medium, on which a computer program is stored. When the computer program is run on a computer, it causes the computer to execute any possible method of the first aspect above. For example, the computer may be at least one memory node.
In a tenth aspect, an embodiment of the present application provides a computer-readable storage medium, for example a non-transitory computer-readable storage medium, on which a computer program is stored. When the computer program is run on a computer, it causes the computer to execute any possible method of the second aspect above. For example, the computer may be at least one memory node.
In an eleventh aspect, an embodiment of the present application provides a computer program product which, when run on a computer, causes any method provided by the first aspect to be performed. For example, the computer may be at least one memory node.
In a twelfth aspect, an embodiment of the present application provides a computer program product which, when run on a computer, causes any method provided by the second aspect to be performed. For example, the computer may be at least one memory node.
It should be understood that any of the electronic devices, systems-on-chip, computer storage media, or computer program products provided above are used to execute the corresponding methods presented above; the beneficial effects they can attain may therefore be found in the beneficial effects of the corresponding methods and are not repeated here.
Brief description of the drawings
Fig. 1 is a schematic diagram of the hardware structure of the electronic device provided by an embodiment of the present application;
Fig. 2(a) is a first schematic diagram of the principle of the 3D illumination estimation method provided by an embodiment of the present application;
Fig. 2(b) is a second schematic diagram of the principle of the 3D illumination estimation method provided by an embodiment of the present application;
Fig. 2(c) is a third schematic diagram of the principle of the 3D illumination estimation method provided by an embodiment of the present application;
Fig. 3(a) is a first schematic diagram of an example display interface provided by an embodiment of the present application;
Fig. 3(b) is a second schematic diagram of an example display interface provided by an embodiment of the present application;
Fig. 3(c) is a third schematic diagram of an example display interface provided by an embodiment of the present application;
Fig. 3(d) is a fourth schematic diagram of an example display interface provided by an embodiment of the present application;
Fig. 4(a) is a first schematic diagram of a scene type provided by an embodiment of the present application;
Fig. 4(b) is a second schematic diagram of a scene type provided by an embodiment of the present application;
Fig. 4(c) is a third schematic diagram of a scene type provided by an embodiment of the present application;
Fig. 5 is a first schematic flowchart of the 3D illumination estimation method provided by an embodiment of the present application;
Fig. 6(a) is a fifth schematic diagram of an example display interface provided by an embodiment of the present application;
Fig. 6(b) is a sixth schematic diagram of an example display interface provided by an embodiment of the present application;
Fig. 7 is a first schematic diagram of a light probe provided by an embodiment of the present application;
Fig. 8 is a second schematic diagram of a light probe provided by an embodiment of the present application;
Fig. 9 is a seventh schematic diagram of an example display interface provided by an embodiment of the present application;
Fig. 10 is a second schematic flowchart of the 3D illumination estimation method provided by an embodiment of the present application;
Fig. 11 is a schematic diagram of a sky map provided by an embodiment of the present application;
Fig. 12 is a third schematic flowchart of the 3D illumination estimation method provided by an embodiment of the present application;
Fig. 13 is a first schematic diagram of the structure of the electronic device provided by an embodiment of the present application;
Fig. 14 is a second schematic diagram of the structure of the electronic device provided by an embodiment of the present application.
Detailed description of embodiments
The embodiments of the present application are described below with reference to the accompanying drawings. The terms used in the description of the embodiments are intended only to explain specific embodiments of the application and are not intended to limit the application.
The electronic device involved in the embodiments of the present application is introduced first. The electronic device provided by the embodiments of the present application can be any appliance with augmented reality (AR) capability, for example a portable device (such as a mobile phone), a notebook computer, a personal computer (PC), a wearable electronic device (such as a smartwatch), a tablet computer, an AR device, and so on.
Referring to Fig. 1, Fig. 1 is a schematic structural diagram of an electronic device provided by an embodiment of the present application. The electronic device 100 in Fig. 1 may specifically include: a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a loudspeaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, etc. The sensor module 180 may include a depth sensor 180A, a pressure sensor 180B, a gyroscope sensor 180C, a barometric pressure sensor 180D, a magnetic sensor 180E, an acceleration sensor 180F, a distance sensor 180G, a proximity light sensor 180H, a fingerprint sensor 180J, a temperature sensor 180K, a touch sensor 180L, an ambient light sensor 180M, a bone conduction sensor 180N, etc.
It can be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, combine certain components, split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be independent devices, or may be integrated in one or more processors.
The controller can generate operation control signals according to instruction operation codes and timing signals, completing the control of instruction fetch and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. The memory can store instructions for realizing six modular functions: detection instructions, connection instructions, information management instructions, analysis instructions, data transmission instructions, and notification instructions, whose execution is controlled by the processor 110. In some embodiments, the memory in the processor 110 is a cache memory. This memory can hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to reuse the instructions or data, it can call them directly from this memory. Repeated accesses are thereby avoided and the waiting time of the processor 110 is reduced, which improves the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus, including one serial data line (SDA) and one serial clock line (SCL). In some embodiments, the processor 110 may include multiple groups of I2C buses. The processor 110 can be separately coupled to the touch sensor 180L, a charger, a flash, the camera 193, etc. through different I2C bus interfaces. For example, the processor 110 can be coupled to the touch sensor 180L through an I2C interface, so that the processor 110 communicates with the touch sensor 180L through the I2C bus interface to realise the touch function of the electronic device 100.
The I2S interface can be used for audio communication. In some embodiments, the processor 110 may include multiple groups of I2S buses. The processor 110 can be coupled to the audio module 170 through an I2S bus to realise communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface to realise the function of answering calls through a Bluetooth headset.
The PCM interface can also be used for audio communication, sampling, quantizing and coding analog signals. In some embodiments, the audio module 170 can be coupled to the wireless communication module 160 through a PCM bus interface. In some embodiments, the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface to realise the function of answering calls through a Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communication. The bus can be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, the UART interface is usually used to connect the processor 110 and the wireless communication module 160. For example, the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realise the Bluetooth function. In some embodiments, the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface to realise the function of playing music through a Bluetooth headset.
The MIPI interface can be used to connect the processor 110 with peripheral components such as the display screen 194 and the camera 193. The MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), etc. In some embodiments, the processor 110 communicates with the camera 193 through the CSI interface to realise the shooting function of the electronic device 100. The processor 110 communicates with the display screen 194 through the DSI interface to realise the display function of the electronic device 100.
The GPIO interface can be configured through software. The GPIO interface can be configured as a control signal, or as a data signal. In some embodiments, the GPIO interface can be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, etc. The GPIO interface can also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, etc. The USB interface 130 can be used to connect a charger to charge the electronic device 100, and can be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect earphones and play audio through the earphones. The interface can also be used to connect other electronic devices, such as AR devices.
It can be understood that the interface connection relationships between the modules illustrated in the embodiments of the present invention are only schematic illustrations and do not constitute a structural limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt interface connection modes different from those in the above embodiments, or a combination of multiple interface connection modes.
Charge management module 140 is used to receive charging input from charger.Wherein, charger can be wireless charger, It is also possible to wired charger.In the embodiment of some wired chargings, charge management module 140 can pass through usb 1 30 Receive the charging input of wired charger.In the embodiment of some wireless chargings, charge management module 140 can pass through electronics The Wireless charging coil of equipment 100 receives wireless charging input.While charge management module 140 is that battery 142 charges, may be used also To be power electronic equipment by power management module 141.
The power management module 141 is configured to connect the battery 142, the charge management module 140, and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor parameters such as battery capacity, battery cycle count, and battery health state (leakage, impedance). In some other embodiments, the power management module 141 may alternatively be disposed in the processor 110. In still other embodiments, the power management module 141 and the charge management module 140 may alternatively be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antenna 1 and the antenna 2 are configured to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 100 may be configured to cover one or more communication bands. Different antennas may also be multiplexed to improve antenna utilization. For example, the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, an antenna may be used in combination with a tuning switch.
The mobile communication module 150 may provide wireless communication solutions applied to the electronic device 100, including 2G/3G/4G/5G. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like. The mobile communication module 150 may receive an electromagnetic wave through the antenna 1, perform processing such as filtering and amplification on the received electromagnetic wave, and transmit it to the modem processor for demodulation. The mobile communication module 150 may also amplify a signal modulated by the modem processor, and convert it into an electromagnetic wave for radiation through the antenna 1. In some embodiments, at least some functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some functional modules of the mobile communication module 150 and at least some modules of the processor 110 may be disposed in the same device.
The modem processor may include a modulator and a demodulator. The modulator is configured to modulate a low-frequency baseband signal to be sent into a medium- or high-frequency signal. The demodulator is configured to demodulate a received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor. The application processor outputs a sound signal through an audio device (which is not limited to the speaker 170A, the receiver 170B, and the like), or displays an image or video through the display screen 194. In some embodiments, the modem processor may be an independent device. In other embodiments, the modem processor may be independent of the processor 110 and disposed in the same device as the mobile communication module 150 or another functional module.
The wireless communication module 160 may provide wireless communication solutions applied to the electronic device 100, including wireless local area networks (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives an electromagnetic wave through the antenna 2, performs frequency modulation and filtering processing on the electromagnetic wave signal, and sends the processed signal to the processor 110. The wireless communication module 160 may also receive a to-be-sent signal from the processor 110, perform frequency modulation and amplification on it, and convert it into an electromagnetic wave for radiation through the antenna 2.
In some embodiments, the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with a network and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, and the like. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The electronic device 100 implements a display function through the GPU, the display screen 194, the application processor, and the like. The GPU is a microprocessor for image processing, and connects the display screen 194 and the application processor. The GPU is configured to perform mathematical and geometric computation for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is configured to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light emitting diodes (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is configured to process data fed back by the camera 193. For example, during photographing, a shutter is opened, light is transmitted to a photosensitive element of the camera through a lens, the optical signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, to be converted into a visible image. The ISP may also perform algorithm optimization on the noise, brightness, and skin tone of the image. The ISP may also optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be disposed in the camera 193.
The camera 193 is configured to capture a still image or video. An object generates an optical image through a lens, which is projected onto a photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1. If the electronic device 100 includes N cameras, one of the N cameras is the main camera.
The digital signal processor is configured to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is configured to perform Fourier transform and the like on the frequency point energy.
The video codec is configured to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG) 1, MPEG-2, MPEG-3, and MPEG-4.
The NPU is a neural-network (NN) computation processor. By drawing on the structure of biological neural networks, for example the transfer mode between human brain neurons, it processes input information rapidly and can also continuously learn by itself. Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, for example image recognition, face recognition, speech recognition, and text understanding.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function, for example storing files such as music and videos in the external memory card.
The internal memory 121 may be used to store computer-executable program code, where the executable program code includes instructions. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data (such as audio data and a phone book) created during use of the electronic device 100. In addition, the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, for example at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS). The processor 110 performs the various functional applications and data processing of the electronic device 100 by running instructions stored in the internal memory 121 and/or instructions stored in a memory disposed in the processor.
The electronic device 100 may implement audio functions, such as music playing and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, the application processor, and the like.
The audio module 170 is configured to convert digital audio information into an analog audio signal for output, and is also configured to convert an analog audio input into a digital audio signal. The audio module 170 may also be configured to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "loudspeaker", is configured to convert an audio electrical signal into a sound signal. The electronic device 100 can play music or answer a hands-free call through the speaker 170A.
The receiver 170B, also referred to as an "earpiece", is configured to convert an audio electrical signal into a sound signal. When the electronic device 100 answers a call or a voice message, the receiver 170B can be placed close to the human ear to listen to the voice.
The microphone 170C, also referred to as a "mic" or "mike", is configured to convert a sound signal into an electrical signal. When making a call or sending a voice message, the user can make a sound with the mouth close to the microphone 170C to input the sound signal into the microphone 170C. At least one microphone 170C may be disposed in the electronic device 100. In other embodiments, two microphones 170C may be disposed in the electronic device 100 to implement a noise reduction function in addition to collecting sound signals. In still other embodiments, three, four, or more microphones 170C may be disposed in the electronic device 100 to collect sound signals, reduce noise, identify sound sources, implement a directional recording function, and the like.
The headset jack 170D is configured to connect a wired headset. The headset jack 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The depth sensor 180A is configured to obtain depth information of a scene. In some embodiments, the depth sensor may be disposed in the camera 193.
The pressure sensor 180B is configured to sense a pressure signal and convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180B may be disposed in the display screen 194. There are many types of pressure sensors 180B, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates with conductive material. When a force acts on the pressure sensor 180B, the capacitance between the electrodes changes, and the electronic device 100 determines the intensity of the pressure based on the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180B. The electronic device 100 may also calculate the touch position based on the detection signal of the pressure sensor 180B. In some embodiments, touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on a short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
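The pressure-to-instruction mapping described above can be sketched as a small dispatch function. This is an illustrative sketch only; the threshold value, icon name, and function names are hypothetical, not part of the patent.

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # hypothetical normalized pressure value

def dispatch_touch(icon: str, intensity: float) -> str:
    """Map a touch on the short message icon to an instruction by pressure."""
    if icon != "messages":
        return "open_" + icon
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_message"    # light press: view the short message
    return "compose_message"     # firm press: create a new short message

assert dispatch_touch("messages", 0.2) == "view_message"
assert dispatch_touch("messages", 0.8) == "compose_message"
```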
The gyro sensor 180C may be configured to determine the motion posture of the electronic device 100. In some embodiments, the angular velocities of the electronic device 100 around three axes (that is, the x, y, and z axes) may be determined through the gyro sensor 180C. The gyro sensor 180C may be used for image stabilization during shooting. For example, when the shutter is pressed, the gyro sensor 180C detects the shake angle of the electronic device 100, calculates the distance that the lens module needs to compensate based on the angle, and lets the lens counteract the shake of the electronic device 100 through reverse motion, thereby implementing image stabilization. The gyro sensor 180C may also be used in navigation and motion-sensing game scenarios.
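The compensation step can be illustrated with a simple geometric sketch: under a small-angle assumption, a shake angle θ shifts the image by roughly the focal length times tan(θ), so the lens moves the same distance in the opposite direction. The formula and values below are illustrative assumptions, not the patent's actual stabilization algorithm.

```python
import math

def lens_compensation_mm(focal_length_mm: float, shake_angle_deg: float) -> float:
    """Distance the lens module must shift to counteract a measured shake angle.
    Small shake angles are assumed; the negative sign encodes reverse motion."""
    shake_rad = math.radians(shake_angle_deg)
    return -focal_length_mm * math.tan(shake_rad)

# A 0.1-degree shake with a 4 mm lens needs roughly a 7-micrometer counter-shift.
shift = lens_compensation_mm(4.0, 0.1)
```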
The barometric pressure sensor 180D is configured to measure air pressure. In some embodiments, the electronic device 100 calculates altitude based on the air pressure value measured by the barometric pressure sensor 180D, to assist positioning and navigation.
The magnetic sensor 180E includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip leather case using the magnetic sensor 180E. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover based on the magnetic sensor 180E, and then set features such as automatic unlocking upon flip-open based on the detected opening/closing state of the leather case or flip cover.
The acceleration sensor 180F may detect the magnitude of the acceleration of the electronic device 100 in all directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It may also be used to identify the posture of the electronic device, and applied to landscape/portrait switching, pedometers, and the like.
The distance sensor 180G is configured to measure distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, in a shooting scene, the electronic device 100 may use the distance sensor 180G to measure distance, to implement fast focusing.
The proximity optical sensor 180H may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode, and uses the photodiode to detect infrared reflected light from a nearby object. When sufficient reflected light is detected, the electronic device 100 can determine that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 can determine that there is no object near the electronic device 100. The electronic device 100 can use the proximity optical sensor 180H to detect that the user holds the electronic device 100 close to the ear during a call, so as to automatically turn off the screen to save power. The proximity optical sensor 180H may also be used for automatic unlocking and screen locking in leather case mode and pocket mode.
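The "sufficient reflected light" decision above amounts to a threshold test combined with the call state. A minimal sketch follows; the threshold unit (ADC counts) and all names are illustrative assumptions.

```python
REFLECTANCE_THRESHOLD = 100  # hypothetical photodiode reading (ADC counts)

def object_nearby(reflected_light: int) -> bool:
    """Sufficient reflected IR light implies an object near the device."""
    return reflected_light >= REFLECTANCE_THRESHOLD

def screen_should_turn_off(in_call: bool, reflected_light: int) -> bool:
    """Turn the screen off during a call when the device is held to the ear."""
    return in_call and object_nearby(reflected_light)

assert screen_should_turn_off(True, 250)       # held to the ear: screen off
assert not screen_should_turn_off(True, 10)    # away from the ear: stay on
assert not screen_should_turn_off(False, 250)  # not in a call: stay on
```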
The fingerprint sensor 180J is configured to collect a fingerprint. The electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, application lock access, fingerprint photographing, fingerprint call answering, and the like.
The temperature sensor 180K is configured to detect temperature. In some embodiments, the electronic device 100 executes a temperature processing policy based on the temperature detected by the temperature sensor 180K. For example, when the temperature reported by the temperature sensor 180K exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180K, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown of the electronic device 100 caused by low temperature. In some other embodiments, when the temperature is lower than still another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
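The three temperature thresholds and their actions can be sketched as a simple policy function. The threshold values are invented for illustration; the patent does not specify them.

```python
def thermal_policy(temp_c: float) -> str:
    """Select a temperature-handling action; all thresholds are illustrative."""
    HOT_THRESHOLD = 45.0     # above this: reduce nearby-processor performance
    HEAT_THRESHOLD = 0.0     # below this: heat the battery
    BOOST_THRESHOLD = -10.0  # below this: boost the battery output voltage
    if temp_c > HOT_THRESHOLD:
        return "throttle_processor"
    if temp_c < BOOST_THRESHOLD:
        return "boost_battery_voltage"
    if tem_c_ok := temp_c < HEAT_THRESHOLD:
        return "heat_battery"
    return "normal"

assert thermal_policy(50.0) == "throttle_processor"
assert thermal_policy(-5.0) == "heat_battery"
assert thermal_policy(-20.0) == "boost_battery_voltage"
assert thermal_policy(25.0) == "normal"
```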
The touch sensor 180L is also referred to as a "touch device". The touch sensor 180L may be disposed in the display screen 194, and the touch sensor 180L and the display screen 194 form a touchscreen, also referred to as a "touch screen". The touch sensor 180L is configured to detect a touch operation acting on or near it. The touch sensor may transfer the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180L may also be disposed on the surface of the electronic device 100, at a position different from that of the display screen 194.
The ambient light sensor 180M is configured to perceive ambient light brightness. The electronic device 100 can adaptively adjust the brightness of the display screen 194 based on the perceived ambient brightness. The ambient light sensor 180M may also be used for automatic white balance adjustment during photographing. The ambient light sensor 180M may also cooperate with the proximity optical sensor 180H to detect whether the electronic device 100 is in a pocket, to prevent accidental touch.
The bone conduction sensor 180N can obtain a vibration signal. In some embodiments, the bone conduction sensor 180N can obtain the vibration signal of the vibrating bone of the human vocal part. The bone conduction sensor 180N may also contact the human pulse and receive a blood pressure beating signal. In some embodiments, the bone conduction sensor 180N may also be disposed in a headset, forming a bone conduction headset. The audio module 170 can parse out a voice signal based on the vibration signal of the vibrating bone of the vocal part obtained by the bone conduction sensor 180N, to implement a voice function. The application processor can parse heart rate information based on the blood pressure beating signal obtained by the bone conduction sensor 180N, to implement a heart rate detection function.
The keys 190 include a power key, volume keys, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 can receive key input and generate key signal input related to the user settings and function control of the electronic device 100.
The motor 191 can generate a vibration prompt. The motor 191 may be used for incoming call vibration prompts, and may also be used for touch vibration feedback. For example, touch operations acting on different applications (such as photographing and audio playing) may correspond to different vibration feedback effects. Touch operations acting on different areas of the display screen 194 may also correspond to different vibration feedback effects of the motor 191. Different application scenarios (such as time reminders, receiving messages, alarm clocks, and games) may also correspond to different vibration feedback effects. Touch vibration feedback effects may also be customized.
The indicator 192 may be an indicator light, and may be used to indicate the charging status and power change, and may also be used to indicate messages, missed calls, notifications, and the like.
The SIM card interface 195 is configured to connect a SIM card. A SIM card can be inserted into or pulled out of the SIM card interface 195 to achieve contact with and separation from the electronic device 100. The electronic device 100 can support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 can support a Nano SIM card, a Micro SIM card, a SIM card, and the like. Multiple cards can be inserted into the same SIM card interface 195 at the same time; the types of the multiple cards may be the same or different. The SIM card interface 195 can also be compatible with different types of SIM cards, and can also be compatible with an external memory card. The electronic device 100 interacts with a network through the SIM card, to implement functions such as calls and data communication. In some embodiments, the electronic device 100 uses an eSIM, that is, an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The specific principles of the 3D illumination estimation method provided in this application are described below with reference to Fig. 2(a), Fig. 2(b), and Fig. 2(c). The embodiments of this application provide a 3D illumination estimation method and an electronic device, which can be applied to the electronic device 100 shown in Fig. 1. AR application software is installed on the electronic device 100. The AR application software may be applied to multiple fields, for example the medical field, the military field, the game field, the communications field, and the education field. The AR application software can execute the 3D illumination estimation algorithm provided in the embodiments of this application. For example, when the user wants to use the AR function, the user can open the AR application on the electronic device 100 by pressing a physical key or tapping the AR application icon on the electronic device 100. The electronic device 100 can then determine the scene type of the current real scene (for example: an indoor scene, an outdoor scene, a face scene, or another scene), obtain different image information for different scene types, obtain one or more spherical harmonic coefficients using different illumination estimation algorithms, and estimate the lighting effect of a virtual object in the current real scene based on the one or more spherical harmonic coefficients. For an indoor scene, the electronic device 100 can also estimate the lighting effect at a designated position in the scene based on multiple spherical harmonic coefficients.
As shown in Fig. 2(a), the user can open the AR application on the electronic device 100 by tapping (for example, single-tapping) the application icon 101 on the electronic device 100.
As shown in Fig. 2(b), after entering the AR application, the electronic device 100 can determine the scene type of the current real scene, obtain the image information of the current real scene 102 through the camera 193, obtain one or more spherical harmonic coefficients using the illumination estimation algorithm corresponding to the scene type of the current real scene 102, and then estimate the spherical harmonic coefficients of the virtual object based on the one or more spherical harmonic coefficients.
As shown in Fig. 2(c), the electronic device 100 can render the virtual object based on the spherical harmonic coefficients of the virtual object, and display the rendered virtual object 103 on the display screen of the electronic device 100.
In conclusion electronic equipment 100 estimate the real scene in virtual objects lighting effect when, for different The available different image information of scene type, and the humorous system of one or more balls is obtained using different illumination estimation algorithms Number, wherein spherical harmonic coefficient may include more information content, such as: the direction etc. of light intensity.Therefore, by this programme, electronics is set Standby 100 are estimated that the lighting effect closer to real scene, so that virtual objects are more effectively merged with real scene, The user experience is improved.
The 3D illumination estimation method provided in the embodiments of this application is described below with reference to the accompanying drawings.
It should be understood that, in the embodiments of this application, the electronic device may perform some or all of the steps in the embodiments of this application. These steps or operations are merely examples, and the embodiments of this application may also perform other operations or variations of the operations. In addition, the steps may be performed in an order different from that presented in the embodiments of this application, and it may not be necessary to perform all the operations in the embodiments of this application.
First, after entering the AR application, the electronic device can automatically open the camera application, or open the camera application through a user operation, to determine the scene type of the current real scene. The electronic device can call the scene recognition function in the camera application to determine the scene type of the current real scene. The scene recognition function may be the artificial intelligence photography master (Master AI) in a Huawei mobile phone, or may be another type of application; the embodiments of this application do not limit this. The current real scene may be the scene observed by the user through the camera of the electronic device. The scene recognition function can recommend a photographing mode suitable for the current real scene according to predefined rules, based on the results of scene recognition and face detection. The electronic device can determine the scene type of the current real scene based on the photographing mode. For example, if the photographing mode is the indoor mode, the electronic device can determine that the scene type of the current real scene is an indoor scene; if the photographing mode is the outdoor mode, the electronic device can determine that the scene type of the current real scene is an outdoor scene; if the photographing mode is the face mode, the electronic device can determine that the scene type of the current real scene is a face scene.
An indoor scene may include the interior space environment of a building, a vehicle, or the like in which people or animals live or move; an outdoor scene may include the natural and artificial scenes outside building interiors; a face scene may include a scene in which the proportion of the current real scene occupied by a face is greater than or equal to a threshold (for example, 30% of the scene).
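The two-paragraph classification rule above (photographing mode plus a face-coverage threshold) can be sketched as a single function. The mode strings, threshold, and function names are illustrative placeholders for whatever the scene recognition function actually reports.

```python
FACE_RATIO_THRESHOLD = 0.30  # face scene when faces cover >= 30% of the frame

def classify_scene(photographing_mode: str, face_ratio: float) -> str:
    """Map the recommended photographing mode and detected face coverage
    to a scene type used to pick the illumination estimation algorithm."""
    if face_ratio >= FACE_RATIO_THRESHOLD or photographing_mode == "face":
        return "face"
    if photographing_mode == "indoor":
        return "indoor"
    if photographing_mode == "outdoor":
        return "outdoor"
    return "other"

assert classify_scene("indoor", 0.0) == "indoor"
assert classify_scene("outdoor", 0.35) == "face"
assert classify_scene("night", 0.0) == "other"
```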
Referring to Fig. 3(a), the user can enter the AR application by tapping (for example, single-tapping) the AR icon 301 in the operation interface 302 of the electronic device. After entering the AR application, the electronic device can automatically open the camera application and call the scene recognition function in the camera application to determine the scene type of the current real scene.
Referring to Fig. 3(b), the user can enter the AR application by tapping (for example, single-tapping) the AR icon 301 in the operation interface 302 of the electronic device. After entering the AR application, the user can open the camera application by tapping the "Yes" button in the information prompt box 303, and call the scene recognition function in the camera application to determine the scene type of the real scene.
Referring to Fig. 3(c), if the user taps the "No" button in the information prompt box 303, the electronic device will not open the camera application, but may enter the operation interface 304 of the AR application.
Referring to Fig. 3(d), the user can enter the AR application by tapping (for example, single-tapping) the AR icon 301 in the operation interface 302 of the electronic device. After entering the AR application, the user can open the camera application by tapping the AR function button 305 in the operation interface 304 of the AR application, and call the scene recognition function in the camera application to determine the scene type of the current real scene.
The 3D illumination estimation method is described below by taking the scene type of the current real scene being an indoor scene, an outdoor scene, and a face scene, respectively, as examples.
If the scene type of the real scene determined by the electronic device is an indoor scene (a scene as shown in Fig. 4(a)), the implementation process of the 3D illumination estimation method can refer to the embodiment corresponding to the method shown in Fig. 5 below.
If the scene type of the real scene determined by the electronic device is an outdoor scene (a scene as shown in Fig. 4(b)), the implementation process of the 3D illumination estimation method can refer to the embodiment corresponding to the method shown in Fig. 10 below.
If the scene type of the real scene determined by the electronic device is a face scene (a scene as shown in Fig. 4(c)), the implementation process of the 3D illumination estimation method can refer to the embodiment corresponding to the method shown in Fig. 12 below.
It should be noted that Fig. 4(a), Fig. 4(b), and Fig. 4(c) are merely examples of an indoor scene, an outdoor scene, and a face scene. Those skilled in the art should understand that indoor scenes, outdoor scenes, and face scenes may also be scenes including other content; the embodiments of this application impose no specific restriction.
As shown in Fig. 5, an embodiment of this application provides a 3D illumination estimation method applicable to AR technology. The 3D illumination estimation method includes the following steps:
Step 501: The electronic device obtains image information of a first scene.
The first scene may be the current real scene. If the scene type of the first scene is an indoor scene, the image information of the first scene may include color information of the pixels in the first scene and depth information of the pixels in the first scene.
The color information of the pixels in the first scene indicates the colors of those pixels; for example, it can be the red (R), green (G) and blue (B) color values of the pixels. The color information can be obtained through the main camera of the electronic device: the main camera captures at least one image of the first scene or a video of the first scene, and the electronic device obtains the color information of the pixels from the captured image(s) or video. If the main camera captures two or more images, the electronic device can obtain the color information of the pixels in each image and use the average over those images as the color information of the pixels in the first scene. If the main camera captures a 10-frame video of the first scene, the electronic device can obtain the color information of the pixels in each frame and use the average over the 10 frames as the color information of the pixels in the first scene.
The depth information of the pixels in the first scene indicates the positions of those pixels; for example, it can be the space coordinates of the pixels in the first scene. The depth information can be obtained through the depth sensor of the electronic device: the depth sensor captures at least one image of the first scene or a video of the first scene, and the electronic device obtains the depth information of the pixels from the captured image(s) or video. If the depth sensor captures two or more images, the electronic device can obtain the depth information of the pixels in each image and use the average over those images as the depth information of the pixels in the first scene. If the depth sensor captures a 10-frame video of the first scene, the electronic device can obtain the depth information of the pixels in each frame and use the average over the 10 frames as the depth information of the pixels in the first scene.
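The frame-averaging described above can be sketched as follows. This is only a minimal illustration; the H x W x 3 array layout and the use of NumPy are assumptions, not part of the embodiment:

```python
import numpy as np

def average_color_info(frames):
    """Average the per-pixel RGB color information over several captured
    frames, as in step 501 (H x W x 3 uint8 arrays are an assumed format)."""
    stack = np.stack([f.astype(np.float64) for f in frames], axis=0)
    return stack.mean(axis=0)

# Ten synthetic 4x4 "video" frames with constant pixel values 0..9:
frames = [np.full((4, 4, 3), i, dtype=np.uint8) for i in range(10)]
avg = average_color_info(frames)
print(avg[0, 0])  # mean of 0..9 per channel: [4.5 4.5 4.5]
```

The same averaging applies unchanged to per-pixel depth maps from the depth sensor.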
It should be noted that the image(s) or video of the first scene captured by the main camera of the electronic device and the image(s) or video of the first scene captured by the depth sensor of the electronic device share the same positional reference.
Illustratively, the depth sensor can be a time-of-flight (ToF) sensor. From the depth information of the pixels obtained by the ToF sensor, the electronic device can quickly establish the first simulated scene with fewer frames.
Optionally, when the electronic device determines that the scene type of the first scene is an indoor scene, it automatically captures at least one image of the first scene or a video of the first scene through the main camera and the depth sensor, and then obtains the image information of the first scene from the captured image(s) or video.
For example, referring to Fig. 6(a), the user can enter the camera application by clicking the AR function button 602 on AR application interface 601. When the electronic device determines that the scene type of the first scene is an indoor scene, it can automatically capture a video of the first scene through the main camera and the depth sensor, and then obtain the image information of the first scene from that video.
Optionally, when the electronic device determines that the scene type of the first scene is an indoor scene, the user manually captures at least one image of the first scene or a video of the first scene, and the electronic device then obtains the image information of the first scene from the captured image(s) or video.
For example, referring to Fig. 6(b), the user can click the button "OK" in information prompt box 603, then, in photo mode, capture a photo through the main camera and the depth sensor by clicking shooting button 604; the electronic device then obtains the image information of the first scene from the captured photo.
Step 502: The electronic device establishes a first simulated scene according to the image information of the first scene.
The first simulated scene can be a three-dimensional virtual scene established according to the image information of the first scene. The first simulated scene may include at least two illumination detection balls and a point cloud.
An illumination detection ball can be a virtual sphere used to indicate the spherical harmonic coefficients at the position of the ball in the first simulated scene.
Optionally, an illumination detection ball may also be called a light probe or given another name; the embodiments of this application do not limit this. The spherical harmonic coefficients of an illumination detection ball include the color information of the ball and the location information of the ball: the color information indicates the color of the ball, and the location information indicates the position of the ball in the first simulated scene.
It should be noted that the color information included in the spherical harmonic coefficients of an illumination detection ball may include the color value and the illumination direction of the ball. Those skilled in the art should understand that all color information included in spherical harmonic coefficients in the embodiments of this application includes a color value and an illumination direction; this is not repeated below.
Optionally, the number and positions of the illumination detection balls in the first simulated scene are preset.
The point cloud can be a set of points in the first simulated scene used to indicate the geometric shape, object attributes and color distribution of objects in the first simulated scene. Object attributes may include light source and non-light source. For example, by object attribute the point cloud can be divided into a light-source point cloud and a non-light-source point cloud: the light-source point cloud is the set of points recognized by a deep learning algorithm (for example, a convolutional neural network (CNN)) as belonging to light-source objects, and the non-light-source point cloud is the set of points recognized by the deep learning algorithm (for example, a CNN) as belonging to non-light-source objects. Light-source objects may include lamps, windows, strongly reflective surfaces, etc.
Optionally, the point cloud is the set of sampled points obtained by digitizing the image information of the first scene. The information of the point cloud may include the location information and irradiance information of each point in the cloud: the location information of a point indicates the position of the point in the first simulated scene, and the irradiance information of a point indicates the irradiance of the point.
It should be noted that the electronic device can obtain the information of the point cloud from the image information of the first scene.
Illustratively, Fig. 7 shows a simulated scene: the light probes can be the spheres shown in Fig. 7, and the point cloud can be the black dots shown in Fig. 7, which are a subset of the point cloud in the simulated scene.
It should be noted that the first simulated scene may include more or fewer illumination detection balls than shown in Fig. 7 and may include more point-cloud points than shown in Fig. 7; the embodiments of this application impose no specific restriction.
Step 503: The electronic device estimates the spherical harmonic coefficients of the illumination detection balls according to the information of the point cloud included in the first simulated scene.
Optionally, for a first illumination detection ball, where the first illumination detection ball is any illumination detection ball included in the first simulated scene, estimating the spherical harmonic coefficients of the first illumination detection ball according to the information of the point cloud may include step 1 and step 2:
Step 1: The electronic device obtains the first irradiance of the first illumination detection ball according to the information of the point cloud.
Optionally, for any point in the point cloud, the electronic device obtains, according to the location information and irradiance information of the point, the second irradiance of the point at the first illumination detection ball. The electronic device then obtains the first irradiance of the first illumination detection ball according to the second irradiances of the points in the cloud at the first illumination detection ball.
Example one: obtaining the first irradiance from the second irradiances may include: the electronic device computes a weighted sum of the second irradiances of the points in the cloud at the first illumination detection ball to obtain the first irradiance of the first illumination detection ball. For example, if the point cloud includes 6 points whose second irradiances at the position of the first illumination detection ball and whose weighting coefficients are as shown in Table 1, then the first irradiance = q1*L1 + q2*L2 + q3*L3 + q4*L4 + q5*L5 + q6*L6.
Table 1
Serial number | Second irradiance | Weighting coefficient
1 | L1 | q1
2 | L2 | q2
3 | L3 | q3
4 | L4 | q4
5 | L5 | q5
6 | L6 | q6
The weighting coefficient of each point can be determined according to the positions of the point and the first illumination detection ball in the first simulated scene. For example, the larger the straight-line distance between the point and the first illumination detection ball, the smaller the weighting coefficient of the point, and the smaller the distance, the larger the coefficient; likewise, the larger the angle between the point and the first illumination detection ball, the smaller the weighting coefficient, and the smaller the angle, the larger the coefficient.
If the position coordinates of a point in the point cloud in the first simulated scene are (x1, y1, z1), and the position coordinates of the first illumination detection ball in the first simulated scene are (x2, y2, z2), the straight-line distance between the point and the first illumination detection ball can be $d = \sqrt{(x_1 - x_2)^2 + (y_1 - y_2)^2 + (z_1 - z_2)^2}$; the angle $\theta$ between the point and the first illumination detection ball can likewise be computed from the two sets of position coordinates.
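The weighted sum of example one can be sketched as follows. The 1/(1 + d) weighting is a hypothetical choice: the embodiment only requires that a larger straight-line distance gives a smaller weighting coefficient, without fixing a formula:

```python
import math

def straight_line_distance(point, probe):
    """Euclidean distance between a cloud point and the probe position."""
    (x1, y1, z1), (x2, y2, z2) = point, probe
    return math.sqrt((x1 - x2)**2 + (y1 - y2)**2 + (z1 - z2)**2)

def first_irradiance(points, probe):
    """Example one: weighted sum of the second irradiances of all cloud
    points at the first illumination detection ball."""
    total = 0.0
    for position, second_irradiance in points:
        q = 1.0 / (1.0 + straight_line_distance(position, probe))
        total += q * second_irradiance
    return total

# Two points: one at the probe (weight 1.0), one at distance 1 (weight 0.5).
points = [((0.0, 0.0, 0.0), 2.0), ((1.0, 0.0, 0.0), 4.0)]
print(first_irradiance(points, (0.0, 0.0, 0.0)))  # 1.0*2 + 0.5*4 = 4.0
```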
Example two: obtaining the first irradiance from the second irradiances may include: the electronic device computes a weighted sum of the second irradiances of the first valid points at the first illumination detection ball to obtain the first irradiance of the first illumination detection ball, where a first valid point is a point in the cloud that lies within the visible range of the first illumination detection ball.
Optionally, the visible range of the first illumination detection ball is preset, or is determined by the electronic device according to the position of the first illumination detection ball in the first scene and the positions of the points in the cloud in the first scene.
As shown in Fig. 8, point 801 in the point cloud is within the visible range of illumination detection balls LP802 and LP803, but not within the visible range of illumination detection ball LP804.
Illustratively, suppose the point cloud includes 9 points, of which 6 are within the visible range of the first illumination detection ball. The second irradiances of the 9 points at the position of the first illumination detection ball and their weighting coefficients are as shown in Table 2, and the first irradiance = q1*L1 + q2*L2 + q3*L3 + q4*L4 + q5*L5 + q6*L6.
Table 2
It should be noted that if a point in the cloud is not within the visible range of the first illumination detection ball, its weighting coefficient relative to the first illumination detection ball is 0, and its second irradiance at the position of the first illumination detection ball can be 0 or a very small value; that is, this second irradiance can be ignored.
The weighting coefficient of each point can be determined according to the positions of the point and the first illumination detection ball in the first simulated scene. For example, the larger the straight-line distance between the point and the first illumination detection ball, the smaller the weighting coefficient of the point, and the smaller the distance, the larger the coefficient; likewise, the larger the angle between the point and the first illumination detection ball, the smaller the weighting coefficient, and the smaller the angle, the larger the coefficient.
For the straight-line distance and the angle between a point and the first illumination detection ball, refer to the corresponding description in example one; details are not repeated here.
Example three: obtaining the first irradiance from the second irradiances may include: the electronic device computes a weighted sum of the second irradiances of the second valid points at the first illumination detection ball to obtain the first irradiance of the first illumination detection ball, where a second valid point is a point in the cloud that lies within the visible range of the first illumination detection ball and whose second irradiance at the first illumination detection ball is greater than or equal to a preset threshold. For example, suppose the point cloud includes 9 points whose second irradiances at the position of the first illumination detection ball and whose weighting coefficients are as shown in Table 3; then the first irradiance = q1*L1 + q2*L2 + q3*L3 + q4*L4 + q5*L5.
Table 3
It should be noted that if the second irradiance of a point in the cloud at the first illumination detection ball is less than the preset threshold, the weighting coefficient of the point relative to the first illumination detection ball can be 0 or a very small value.
The weighting coefficient of each point can be determined according to the positions of the point and the first illumination detection ball in the first simulated scene. For example, the larger the straight-line distance between the point and the first illumination detection ball, the smaller the weighting coefficient of the point, and the smaller the distance, the larger the coefficient; likewise, the larger the angle between the point and the first illumination detection ball, the smaller the weighting coefficient, and the smaller the angle, the larger the coefficient.
For the straight-line distance and the angle between a point and the first illumination detection ball, refer to the corresponding description in example one; details are not repeated here.
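Example three can be sketched as follows. The visibility flags are assumed to be precomputed from the visible-range test, and the 1/(1 + d) weight is again a hypothetical choice satisfying the stated distance rule:

```python
import math

def first_irradiance_filtered(points, probe, threshold):
    """Example three: only 'second valid points' contribute, i.e. points in
    the probe's visible range whose second irradiance is >= the preset
    threshold; all other points get a weighting coefficient of 0."""
    total = 0.0
    for position, second_irradiance, visible in points:
        if not visible or second_irradiance < threshold:
            continue  # weighting coefficient treated as 0
        (x1, y1, z1), (x2, y2, z2) = position, probe
        d = math.sqrt((x1 - x2)**2 + (y1 - y2)**2 + (z1 - z2)**2)
        total += second_irradiance / (1.0 + d)
    return total

points = [((1.0, 0.0, 0.0), 4.0, True),   # kept: visible, above threshold
          ((0.0, 1.0, 0.0), 0.5, True),   # dropped: below threshold
          ((0.0, 0.0, 1.0), 4.0, False)]  # dropped: outside visible range
print(first_irradiance_filtered(points, (0.0, 0.0, 0.0), 1.0))  # 4/(1+1) = 2.0
```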
Step 2: The electronic device estimates the spherical harmonic coefficients of the first illumination detection ball according to the first irradiance of the first illumination detection ball.
Optionally, the spherical harmonic coefficients of the first illumination detection ball include the spherical harmonic coefficients of three color channels (R, G and B).
The spherical harmonic coefficients of the first illumination detection ball in one color channel can be 9-dimensional floating-point coefficients of orders 0 to 2 (also called L2 spherical harmonics). Therefore, the spherical harmonic coefficients of the first illumination detection ball can be 3*9-dimensional.
Optionally, the spherical harmonic coefficients of any one color channel of the first illumination detection ball can be expressed as $y = (y_0^0, y_1^{-1}, y_1^0, y_1^1, y_2^{-2}, y_2^{-1}, y_2^0, y_2^1, y_2^2)$, and the spherical harmonic coefficients of the three color channels of the first illumination detection ball can be expressed as $(y_R, y_G, y_B)$.
The following uses any one color channel of the first illumination detection ball as an example to describe how the electronic device estimates the spherical harmonic coefficients of that color channel according to the first irradiance.
If the spherical harmonic coefficients of the color channel are $y_l^m$ ($0 \le l \le 2$, $-l \le m \le l$), the electronic device can calculate each element of the spherical harmonic coefficients of the color channel according to the following formula:
$$y_l^m = \frac{4\pi}{N} \sum_{j=1}^{N} L(x_j)\, Y_l^m(x_j)$$
where $x_j$ is any direction on the first illumination detection ball, $L(x_j)$ denotes the third irradiance in direction $x_j$ on the first illumination detection ball, and $Y_l^m(x_j)$ denotes the spherical harmonic basis function in direction $x_j$ on the first illumination detection ball.
It should be noted that the electronic device can select N directions on the first illumination detection ball that cover the sphere of the ball, where N is a positive integer and $0 < j \le N$. The electronic device can obtain the third irradiance $L(x_j)$ in each direction according to the first irradiance of the first illumination detection ball, and can obtain $Y_l^m(x_j)$ according to the following formula:
$$Y_l^m(\theta, \varphi) = \sqrt{\frac{2l+1}{4\pi}\,\frac{(l-m)!}{(l+m)!}}\; P_l^m(\cos\theta)\, e^{im\varphi}$$
where $(\theta, \varphi)$ are the spherical coordinates of $x_j$ on the spherical surface of the first illumination detection ball, $Y_l^m$ is the spherical harmonic of degree $l$ and order $m$, $0 \le m \le l$, $i$ is the imaginary unit, $P_l^m$ is the associated Legendre polynomial, $P_l^m(x) = (-1)^m (1 - x^2)^{m/2}\,\frac{d^m}{dx^m} P_l(x)$, and $P_l(x)$ is the Legendre polynomial of degree $l$, $P_l(x) = \frac{1}{2^l l!}\,\frac{d^l}{dx^l}(x^2 - 1)^l$.
Optionally, the electronic device calculates, according to the above formulas, the spherical harmonic coefficients of the first illumination detection ball in the three color channels, that is, the spherical harmonic coefficients of the first illumination detection ball.
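The projection onto the 9 L2 coefficients can be sketched as follows. Note two assumptions not taken from the embodiment: the block uses the real-valued spherical harmonic basis in Cartesian form (a common practical variant of the complex-valued definition above), and it samples the N directions uniformly at random on the sphere:

```python
import math
import random

# Real-valued L2 spherical harmonic basis in Cartesian form, ordered
# [Y_0^0, Y_1^-1, Y_1^0, Y_1^1, Y_2^-2, Y_2^-1, Y_2^0, Y_2^1, Y_2^2].
SH_BASIS = [
    lambda x, y, z: 0.282095,
    lambda x, y, z: 0.488603 * y,
    lambda x, y, z: 0.488603 * z,
    lambda x, y, z: 0.488603 * x,
    lambda x, y, z: 1.092548 * x * y,
    lambda x, y, z: 1.092548 * y * z,
    lambda x, y, z: 0.315392 * (3.0 * z * z - 1.0),
    lambda x, y, z: 1.092548 * x * z,
    lambda x, y, z: 0.546274 * (x * x - y * y),
]

def project_sh(third_irradiance, n=20000, seed=0):
    """Monte Carlo projection y_l^m = (4*pi/N) * sum_j L(x_j) * Y_l^m(x_j)
    over N directions x_j sampled uniformly on the probe sphere."""
    rng = random.Random(seed)
    coeffs = [0.0] * 9
    for _ in range(n):
        z = rng.uniform(-1.0, 1.0)           # uniform sphere sampling
        phi = rng.uniform(0.0, 2.0 * math.pi)
        r = math.sqrt(max(0.0, 1.0 - z * z))
        x, y = r * math.cos(phi), r * math.sin(phi)
        value = third_irradiance(x, y, z)
        for k, basis in enumerate(SH_BASIS):
            coeffs[k] += value * basis(x, y, z)
    return [c * 4.0 * math.pi / n for c in coeffs]

# A constant irradiance of 1 projects onto y_0^0 = 4*pi*0.282095 ≈ 3.54,
# with the l = 1 and l = 2 bands near zero.
coeffs = project_sh(lambda x, y, z: 1.0)
print(round(coeffs[0], 2))  # 3.54
```

Running the projection once per color channel yields the 3*9-dimensional coefficients described above.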
Optionally, the electronic device estimates the spherical harmonic coefficients of each illumination detection ball according to step 1 and step 2 above.
Step 504: The electronic device estimates the spherical harmonic coefficients at the position of the virtual object according to the spherical harmonic coefficients of the illumination detection balls.
The virtual object can be a visualized object, produced from electronic data, that the user wants the AR application to display on the screen of the electronic device. For example, the virtual object can be a picture, a 3D model, etc.
Optionally, the electronic device obtains the position of the virtual object according to user input; alternatively, the position of the virtual object is preset by the electronic device. The position of the virtual object indicates the space coordinates of the virtual object in the first simulated scene.
For example, after obtaining the image information of the first scene, the electronic device can prompt the user through prompt box 901 in Fig. 9 to input the position of the virtual object. The user can input the position by clicking (for example, tapping) the tea table 902 in the image of the first scene, and the electronic device can place the rendered virtual object at the position input by the user.
In one example, estimating the spherical harmonic coefficients of the virtual object according to the spherical harmonic coefficients of the illumination detection balls may include: the electronic device computes a weighted sum of the spherical harmonic coefficients of the illumination detection balls to obtain the spherical harmonic coefficients of the virtual object.
Illustratively, if the first simulated scene includes 6 illumination detection balls whose spherical harmonic coefficients and weighting coefficients are as shown in Table 4, the spherical harmonic coefficients of the virtual object are 0*y1 + 0.3*y2 + 0.1*y3 + 0.2*y4 + 0.4*y5 + 0*y6, where the 6 weighting coefficients can be determined according to the distances between the 6 illumination detection balls and the virtual object.
Table 4
Serial number | Spherical harmonic coefficients of each illumination detection ball | Weighting coefficient
1 | y1 | 0
2 | y2 | 0.3
3 | y3 | 0.1
4 | y4 | 0.2
5 | y5 | 0.4
6 | y6 | 0
In another example, estimating the spherical harmonic coefficients of the virtual object according to the spherical harmonic coefficients of the illumination detection balls may include: the electronic device computes a weighted sum of the spherical harmonic coefficients of only those illumination detection balls whose distance to the virtual object is less than or equal to a first distance, and thereby obtains the spherical harmonic coefficients of the virtual object.
For example, if the first distance is 0.5 m and the first simulated scene includes 6 illumination detection balls whose distances to the virtual object, spherical harmonic coefficients and weighting coefficients are as shown in Table 5, then the spherical harmonic coefficients of the virtual object = 0.3*y2 + 0.1*y3 + 0.2*y4 + 0.4*y5, where the weighting coefficients can be determined according to the distances between the 6 illumination detection balls and the virtual object.
Table 5
Serial number | Spherical harmonic coefficients of each illumination detection ball | Weighting coefficient | Distance to the virtual object
1 | y1 | 0 | 1 m
2 | y2 | 0.3 | 0.2 m
3 | y3 | 0.1 | 0.4 m
4 | y4 | 0.2 | 0.3 m
5 | y5 | 0.4 | 0.1 m
6 | y6 | 0 | 2 m
It should be noted that, in practical applications, the electronic device can estimate, according to step 1 and step 2, only the spherical harmonic coefficients of the illumination detection balls whose distance to the virtual object is less than or equal to the first distance, and then estimate the spherical harmonic coefficients of the virtual object from those coefficients, without estimating the spherical harmonic coefficients of the illumination detection balls whose distance to the virtual object is greater than the first distance.
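The distance-gated weighted sum over the probe coefficients can be sketched as follows, reproducing the numbers of Table 5. The one-dimensional placeholder coefficients y1..y6 = 1..6 are a hypothetical encoding for brevity; the real coefficients are 3*9-dimensional:

```python
def blend_probe_sh(probes, first_distance=0.5):
    """probes: list of (sh_coeffs, weighting_coefficient, distance) triples.
    Only probes whose distance to the virtual object is <= first_distance
    contribute; the rest effectively get weight 0, as in Table 5."""
    dim = len(probes[0][0])
    result = [0.0] * dim
    for sh, weight, distance in probes:
        if distance > first_distance:
            continue
        for k in range(dim):
            result[k] += weight * sh[k]
    return result

# Table 5 with one-dimensional placeholder coefficients y1..y6 = 1..6:
probes = [([1.0], 0.0, 1.0), ([2.0], 0.3, 0.2), ([3.0], 0.1, 0.4),
          ([4.0], 0.2, 0.3), ([5.0], 0.4, 0.1), ([6.0], 0.0, 2.0)]
print(round(blend_probe_sh(probes)[0], 1))  # 0.3*2 + 0.1*3 + 0.2*4 + 0.4*5 = 3.7
```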
Optionally, the electronic device can render the virtual object according to the spherical harmonic coefficients of the virtual object.
After determining that the scene type of the first scene is an indoor scene, the electronic device obtains the image information of the first scene through the main camera and the depth sensor, establishes, according to the image information of the first scene, a first simulated scene that includes at least two illumination detection balls and a point cloud, estimates the spherical harmonic coefficients of the illumination detection balls according to the information of the point cloud included in the first simulated scene, and finally estimates the spherical harmonic coefficients of the virtual object according to the spherical harmonic coefficients of the illumination detection balls. Because the spherical harmonic coefficients of the virtual object are estimated according to the spherical harmonic coefficients of the light-source point cloud and the non-light-source point cloud, and the spherical harmonic coefficients include the illumination direction, the electronic device can estimate a lighting effect closer to that of the real scene according to the spherical harmonic coefficients of the illumination detection balls, so that the virtual object blends with the real scene more effectively, improving the user experience.
Fig. 5 above describes illumination estimation for an indoor scene. Since scenes such as outdoor scenes and face scenes also occur in practical applications, the method shown in Figure 10 below can be used when an outdoor scene is detected.
As shown in Figure 10, an embodiment of this application provides a 3D illumination estimation method that includes the following steps:
Step 1001: The electronic device obtains image information of a first scene.
The first scene can be the current real scene. If the scene type of the first scene is an outdoor scene, the image information of the first scene may include the color information of the pixels in the first scene.
The color information of the pixels in the first scene indicates the colors of those pixels; for example, it can be the RGB color values of the pixels. The color information can be obtained through one or more cameras of the electronic device (for example, a wide-angle camera, a telephoto camera, etc.): the camera(s) capture at least one image of the first scene or a video of the first scene, and the electronic device obtains the color information of the pixels from the captured image(s) or video. If the camera(s) capture two or more images, the electronic device can obtain the color information of the pixels in each image and use the average over those images as the color information of the pixels in the first scene. If the camera(s) capture a 10-frame video of the first scene, the electronic device can obtain the color information of the pixels in each frame and use the average over the 10 frames as the color information of the pixels in the first scene.
Illustratively, when the electronic device determines that the scene type of the first scene is an outdoor scene, it can automatically capture at least one image of the first scene or a video of the first scene through the camera(s), and then obtain the image information of the first scene from the captured image(s) or video.
Illustratively, when the electronic device determines that the scene type of the first scene is an outdoor scene, the user can manually capture at least one image of the first scene or a video of the first scene, and the electronic device then obtains the image information of the first scene from the captured image(s) or video.
Step 1002: The electronic device obtains a sky map corresponding to the first scene according to the image information of the first scene.
The sky map corresponding to the first scene indicates the illumination distribution of the first scene. For example, the first row in Figure 11 shows sky maps corresponding to different illumination distributions, and the second row shows pictures of virtual objects rendered with the sky maps in the first row.
Optionally, the electronic device obtains the sky map corresponding to the first scene by inputting the image information of the first scene into a first convolutional neural network (CNN).
Illustratively, the image information of the first scene can pass through multiple layers of functions in the first CNN, such as convolutional layers, pooling layers, rectified linear unit (ReLU) layers, batch normalization layers and a loss function layer, to obtain the sky map corresponding to the image information of the first scene.
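The layer operations named above can be illustrated with a toy NumPy pipeline on a single-channel image. This is only a sketch of what a convolutional layer, a ReLU layer and a pooling layer each compute; it is not the first CNN of the embodiment, whose architecture and weights are not disclosed here:

```python
import numpy as np

def conv2d(img, kernel):
    """Valid-mode 2D convolution (cross-correlation) of one channel."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Rectified linear unit layer."""
    return np.maximum(x, 0.0)

def max_pool(x, k=2):
    """k x k max pooling (input cropped to a multiple of k)."""
    H, W = x.shape
    return x[:H // k * k, :W // k * k].reshape(H // k, k, W // k, k).max(axis=(1, 3))

img = np.arange(36, dtype=float).reshape(6, 6)
feat = max_pool(relu(conv2d(img, np.array([[1.0, -1.0]]))))
print(feat.shape)  # (3, 2)
```

On this monotone ramp image the horizontal-gradient kernel produces only negative responses, so ReLU zeroes the feature map before pooling.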
Step 1003: The electronic device estimates the spherical harmonic coefficients of the virtual object according to the information of the sky map.
The information of the sky map may include the spherical harmonic coefficients of the sky map.
The virtual object can be a visualized object, produced from electronic data, that the user wants the AR application to display on the screen of the electronic device. For example, the virtual object can be a picture, a 3D model, etc.
Optionally, the electronic device obtains the position of the virtual object according to user input; alternatively, the position of the virtual object is preset by the electronic device. The position of the virtual object indicates the space coordinates of the virtual object in the first simulated scene.
Optionally, the electronic device obtains the spherical harmonic coefficients of the sky map according to the information of the sky map and uses them as the spherical harmonic coefficients of the virtual object, and then renders the virtual object according to the spherical harmonic coefficients of the virtual object. For example: the electronic device obtains the principal illumination direction, compensation color temperature and illumination intensity of the virtual object according to the spherical harmonic coefficients of the first scene, and renders the virtual object according to the principal illumination direction, the compensation color temperature and the illumination intensity.
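One common heuristic for reading a principal illumination direction off spherical harmonic coefficients is to use the l = 1 band; this is an assumption for illustration, not the embodiment's disclosed procedure, and it presumes real-valued coefficients in the ordering shown:

```python
import math

def principal_light_direction(sh):
    """Approximate dominant illumination direction from the l = 1 band of
    real spherical harmonic coefficients ordered
    [Y_0^0, Y_1^-1, Y_1^0, Y_1^1, ...]: the l = 1 basis functions vary
    linearly with (y, z, x), so (Y_1^1, Y_1^-1, Y_1^0) maps to (x, y, z)."""
    x, y, z = sh[3], sh[1], sh[2]
    norm = math.sqrt(x * x + y * y + z * z)
    if norm == 0.0:
        return (0.0, 0.0, 1.0)  # arbitrary fallback for purely ambient light
    return (x / norm, y / norm, z / norm)

# Light mostly from above: only the Y_1^0 (z-linear) coefficient is non-zero.
print(principal_light_direction([3.5, 0.0, 1.2, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]))
# (0.0, 0.0, 1.0)
```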
After determining that the scene type of the first scene is an outdoor scene, the electronic device captures the image information of the first scene through one or more cameras, obtains the sky map corresponding to the first scene according to the image information of the first scene, and then estimates the spherical harmonic coefficients of the virtual object according to the information of the sky map. Because the spherical harmonic coefficients of the virtual object include the illumination direction, the electronic device can estimate a lighting effect closer to that of the real scene, so that the virtual object blends with the real scene more effectively, improving the user experience.
Figure 5 and Figure 10 above illustrate illumination estimation for indoor scenes and outdoor scenes respectively. However, since scenes containing faces also occur in practice, the method shown in Figure 12 below can be used when a face scene is detected.
As shown in Figure 12, an embodiment of the present application provides a 3D illumination estimation method. The 3D illumination estimation method includes the following steps:
Step 1201: The electronic device obtains the image information of a first scene.
The first scene may be the current real scene. If the scene type of the first scene is a face scene, the image information of the first scene may include the color information of the pixels in the first scene.
It should be noted that, if the scene type of the first scene is a face scene, for a detailed introduction of the color information of the pixels in the first scene, refer to the description in step 1001.
Illustratively, when determining that the scene type of the first scene is a face scene, the electronic device may automatically shoot at least one image of the first scene or a video of the first scene through the camera of the electronic device, and then obtain the image information of the first scene according to the at least one image of the first scene or the video of the first scene.
Illustratively, when determining that the scene type of the first scene is a face scene, the electronic device may have the user manually shoot at least one image of the first scene or a video of the first scene, and then obtain the image information of the first scene according to the at least one image of the first scene or the video of the first scene.
Step 1202: The electronic device estimates the spherical harmonic coefficients of the virtual object according to the image information of the first scene.
Optionally, the electronic device obtains the spherical harmonic coefficients of the first scene according to the image information of the first scene, and the spherical harmonic coefficients of the first scene are the spherical harmonic coefficients of the virtual object. The electronic device may render the virtual object according to the spherical harmonic coefficients of the virtual object. For example, the electronic device obtains the principal illumination direction, the compensation color temperature, and the illumination intensity of the virtual object according to the spherical harmonic coefficients of the first scene, and renders the virtual object according to the principal illumination direction, the compensation color temperature, and the illumination intensity.
Optionally, the electronic device inputs the image information of the first scene into the second CNN to obtain the spherical harmonic coefficients of the first scene.
Illustratively, the image information of the first scene may pass through the multiple layers of the second CNN, such as a convolutional layer, a pooling layer, a rectified linear unit (ReLU) layer, a batch normalization layer, and a loss function layer, after which the spherical harmonic coefficients of the first scene can be obtained.
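The layer types just listed can be illustrated with a toy NumPy forward pass. This is not the second CNN of the embodiment: the architecture, kernel sizes, random weights, and output size (nine SH coefficients per RGB channel) are assumptions made purely to show how the layers compose; a real network would be trained against ground-truth coefficients via the loss function layer.

```python
import numpy as np

def conv2d(x, w):
    """Valid 2-D convolution; x: (H, W, Cin), w: (k, k, Cin, Cout)."""
    k = w.shape[0]
    H, W = x.shape[0] - k + 1, x.shape[1] - k + 1
    out = np.empty((H, W, w.shape[3]))
    for i in range(H):
        for j in range(W):
            out[i, j] = np.tensordot(x[i:i + k, j:j + k], w, axes=3)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def max_pool(x, s=2):
    """Non-overlapping s x s max pooling over the spatial axes."""
    H, W, C = x.shape
    return x[:H // s * s, :W // s * s].reshape(H // s, s, W // s, s, C).max(axis=(1, 3))

def batch_norm(x, eps=1e-5):
    """Per-channel normalization over the spatial axes (inference-style)."""
    mu = x.mean(axis=(0, 1), keepdims=True)
    var = x.var(axis=(0, 1), keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def predict_sh(img, w_conv, w_fc):
    """Toy forward pass: conv -> ReLU -> batch norm -> pool -> fully connected,
    mapping an image to 9 spherical harmonic coefficients per RGB channel."""
    h = max_pool(batch_norm(relu(conv2d(img, w_conv))))
    return (h.reshape(-1) @ w_fc).reshape(3, 9)

rng = np.random.default_rng(0)
img = rng.random((16, 16, 3))                  # 16x16 RGB input
w_conv = 0.1 * rng.normal(size=(3, 3, 3, 4))   # 3x3 kernels, 3 -> 4 channels
w_fc = 0.01 * rng.normal(size=(7 * 7 * 4, 27)) # pooled 7x7x4 map -> 27 outputs
sh = predict_sh(img, w_conv, w_fc)             # shape (3, 9)
```

The fully connected output reshapes to one row of nine coefficients per color channel, matching the second-order SH representation used elsewhere in the document.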
Optionally, the spherical harmonic coefficients of the first scene are the spherical harmonic coefficients of a face in the first scene.
After determining that the scene type of the first scene is a face scene, the electronic device captures the image information of the first scene through one or more cameras of the electronic device and estimates the spherical harmonic coefficients of the virtual object according to the image information of the first scene. Because the spherical harmonic coefficients of the virtual object encode the illumination direction, the electronic device can estimate a lighting effect closer to that of the real scene, so that the virtual object blends more convincingly with the real scene and the user experience is improved.
It can be understood that, in order to realize the above functions, the above electronic device includes the hardware structures and/or software modules corresponding to each function. Those skilled in the art should readily appreciate that, in combination with the exemplary units and algorithm steps described in the embodiments disclosed herein, the embodiments of the present application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the specific application and design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each specific application, but such implementation should not be considered to be beyond the scope of the embodiments of the present application.
The embodiments of the present application may divide the above electronic device into functional modules according to the above method examples. For example, each function may be divided into a corresponding functional module, or two or more functions may be integrated into one processing module. The above integrated module may be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division of modules in the embodiments of the present application is schematic and is merely a division by logical function; there may be other division manners in actual implementation.
Illustratively, in the case where each function is divided into a corresponding functional module, Figure 13 shows a schematic structural diagram of an electronic device 130. The electronic device 130 includes: an acquisition module 1301, an establishment module 1302, and an estimation module 1303.
The acquisition module 1301 is configured to obtain the image information of the first scene, where the image information of the first scene includes the color information of the pixels in the first scene and the depth information of the pixels in the first scene.
The establishment module 1302 is configured to establish the first simulated scene according to the image information of the first scene, where the first simulated scene includes at least two illumination detection balls (light probes) and a point cloud.
The estimation module 1303 is configured to estimate the spherical harmonic coefficients of the illumination detection balls according to the information of the point cloud included in the first simulated scene, where the information of the point cloud includes the position information and the irradiance information of the points in the point cloud.
The estimation module 1303 is further configured to estimate the spherical harmonic coefficients of the position of the virtual object according to the spherical harmonic coefficients of the illumination detection balls.
Optionally, the estimation module 1303 is specifically configured to obtain a first irradiance of a first illumination detection ball according to the information of the point cloud, and is further specifically configured to estimate the spherical harmonic coefficients of the first illumination detection ball according to the first irradiance of the first illumination detection ball, where the first illumination detection ball is any illumination detection ball included in the first simulated scene.
Optionally, the acquisition module 1301 is specifically configured to obtain the first irradiance of the first illumination detection ball according to the information of first visible points, where the first visible points are the points in the point cloud within the visible range of the first illumination detection ball.
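As a rough illustration of how a probe's irradiance might be accumulated from the visible points of the point cloud, consider the sketch below (Python/NumPy). The plain distance cutoff standing in for the visible range, the inverse-square weighting, and the normalization are simplifying assumptions for illustration, not the embodiment's actual visibility or integration method.

```python
import numpy as np

def probe_irradiance(probe_pos, points, colors, max_dist=5.0):
    """Accumulate a normalized RGB irradiance estimate for one illumination
    detection ball from the point-cloud samples inside its visible range.

    A distance cutoff stands in for true visibility here (a real system
    would also account for occlusion); each visible point contributes its
    color with inverse-square distance weighting."""
    points = np.asarray(points, dtype=float)      # (N, 3) point positions
    colors = np.asarray(colors, dtype=float)      # (N, 3) RGB radiance
    d = np.linalg.norm(points - probe_pos, axis=1)
    visible = (d > 1e-6) & (d <= max_dist)
    if not visible.any():
        return np.zeros(3)
    w = 1.0 / np.square(d[visible])               # inverse-square falloff
    return (w[:, None] * colors[visible]).sum(axis=0) / w.sum()

# One red point in range, one green point beyond the cutoff: only the
# red point contributes to the probe's irradiance.
irr = probe_irradiance(np.zeros(3),
                       [[1.0, 0.0, 0.0], [10.0, 0.0, 0.0]],
                       [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
```

In the example, the out-of-range point is discarded, so the estimate reduces to the single visible point's color.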
Optionally, the estimation module 1303 is further specifically configured to perform a weighted summation of the spherical harmonic coefficients of the illumination detection balls to obtain the spherical harmonic coefficients of the position of the virtual object.
Optionally, the distance between each illumination detection ball used in the summation and the virtual object is less than or equal to a first distance.
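The weighted summation over nearby illumination detection balls can be sketched as follows (Python/NumPy). The embodiment only specifies a weighted summation restricted to probes within the first distance; the inverse-distance weights and the fallback to the single nearest probe are illustrative assumptions, not part of the claimed method.

```python
import numpy as np

def blend_probe_sh(obj_pos, probe_positions, probe_sh, first_distance=2.0):
    """Weighted summation of the spherical harmonic coefficients of the
    illumination detection balls near a virtual object.

    Only probes within `first_distance` of the object contribute; weights
    fall off with inverse distance and are normalized to sum to one."""
    pos = np.asarray(probe_positions, dtype=float)   # (P, 3) probe centers
    sh = np.asarray(probe_sh, dtype=float)           # (P, 3, 9) SH per probe
    d = np.linalg.norm(pos - obj_pos, axis=1)
    near = d <= first_distance
    if not near.any():
        near = d == d.min()                          # fall back to nearest probe
    w = 1.0 / np.maximum(d[near], 1e-6)              # inverse-distance weights
    w /= w.sum()
    return np.tensordot(w, sh[near], axes=1)         # (3, 9) blended SH

# Two equidistant probes: the blend is the plain average of their SH sets.
blended = blend_probe_sh(np.zeros(3),
                         [[1.0, 0.0, 0.0], [-1.0, 0.0, 0.0]],
                         [np.ones((3, 9)), 3.0 * np.ones((3, 9))])
```

With equal distances the normalized weights are both 0.5, so every blended coefficient is the midpoint of the two probes' values.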
Optionally, the acquisition module 1301 is further configured to obtain the position of the virtual object according to a user input; alternatively, the position of the virtual object is preset.
For all related content of each operation involved in the above method embodiments, reference may be made to the functional descriptions of the corresponding functional modules; details are not described herein again.
In this embodiment, the electronic device 130 is presented in the form of functional modules obtained by dividing each function in an integrated manner. Here, a "module" may refer to an application-specific integrated circuit (ASIC), a circuit, a processor and memory that execute one or more software or firmware programs, an integrated logic circuit, and/or other devices that can provide the above functions. In a simple embodiment, those skilled in the art can conceive that the electronic device 130 may take the form shown in Figure 1.
For example, the processor 110 in Figure 1 may invoke the computer-executable instructions stored in the internal memory 121, so that the electronic device 130 performs the 3D illumination estimation method in the above method embodiments.
Illustratively, the functions/implementation processes of the acquisition module 1301, the establishment module 1302, and the estimation module 1303 in Figure 13 may be implemented by the processor 110 in Figure 1 invoking the computer-executable instructions stored in the internal memory 121.
Since the electronic device 130 provided in this embodiment can perform the above 3D illumination estimation method, for the technical effects that can be obtained, reference may be made to the above method embodiments; details are not described herein again.
Illustratively, in the case where each function is divided into a corresponding functional module, Figure 14 shows a schematic structural diagram of an electronic device 140. The electronic device 140 includes: an acquisition module 1401 and an estimation module 1402.
The acquisition module 1401 is configured to obtain the image information of the first scene, where the image information of the first scene includes the color information of the pixels in the first scene.
The acquisition module 1401 is further configured to obtain the sky map corresponding to the first scene according to the image information of the first scene, where the sky map corresponding to the first scene indicates the illumination distribution of the first scene.
The estimation module 1402 is configured to estimate the spherical harmonic coefficients of the virtual object according to the information of the sky map, where the information of the sky map includes the spherical harmonic coefficients of the sky map.
Optionally, the estimation module 1402 is specifically configured to use the spherical harmonic coefficients of the sky map as the spherical harmonic coefficients of the virtual object.
Optionally, the acquisition module 1401 is further configured to obtain the position of the virtual object according to a user input; alternatively, the position of the virtual object is preset.
For all related content of each operation involved in the above method embodiments, reference may be made to the functional descriptions of the corresponding functional modules; details are not described herein again.
In this embodiment, the electronic device 140 is presented in the form of functional modules obtained by dividing each function in an integrated manner. Here, a "module" may refer to an application-specific integrated circuit (ASIC), a circuit, a processor and memory that execute one or more software or firmware programs, an integrated logic circuit, and/or other devices that can provide the above functions. In a simple embodiment, those skilled in the art can conceive that the electronic device 140 may take the form shown in Figure 1.
For example, the processor 110 in Figure 1 may invoke the computer-executable instructions stored in the internal memory 121, so that the electronic device 140 performs the 3D illumination estimation method in the above method embodiments.
Illustratively, the functions/implementation processes of the acquisition module 1401 and the estimation module 1402 in Figure 14 may be implemented by the processor 110 in Figure 1 invoking the computer-executable instructions stored in the internal memory 121.
Since the electronic device 140 provided in this embodiment can perform the above 3D illumination estimation method, for the technical effects that can be obtained, reference may be made to the above method embodiments; details are not described herein again.
Through the above description of the embodiments, it will be apparent to those skilled in the art that, for convenience and brevity of description, only the division of the above functional modules is taken as an example. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed device and method may be implemented in other manners. For example, the device embodiments described above are merely illustrative: the division of the modules or units is merely a division by logical function, and there may be other division manners in actual implementation; for example, multiple units or components may be combined or integrated into another device, or some features may be ignored or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separated, and the components displayed as units may be one physical unit or multiple physical units; that is, they may be located in one place or distributed in multiple different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The above integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium. Based on this understanding, the technical solutions of the embodiments of the present application essentially, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disc.
The above are merely specific embodiments of the present application, but the protection scope of the present application is not limited thereto. Any change or replacement within the technical scope disclosed in the present application shall be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (22)

1. A three-dimensional (3D) illumination estimation method, characterized in that the method comprises:
obtaining image information of a first scene, wherein the image information of the first scene comprises color information of pixels in the first scene and depth information of the pixels in the first scene;
establishing a first simulated scene according to the image information of the first scene, wherein the first simulated scene comprises at least two illumination detection balls and a point cloud;
estimating spherical harmonic coefficients of the illumination detection balls according to information of the point cloud included in the first simulated scene, wherein the information of the point cloud comprises position information and irradiance information of points in the point cloud; and
estimating spherical harmonic coefficients of a position of a virtual object according to the spherical harmonic coefficients of the illumination detection balls.
2. The 3D illumination estimation method according to claim 1, characterized in that, for a first illumination detection ball, estimating the spherical harmonic coefficients of the first illumination detection ball according to the information of the point cloud included in the first simulated scene comprises:
obtaining a first irradiance of the first illumination detection ball according to the information of the point cloud; and
estimating the spherical harmonic coefficients of the first illumination detection ball according to the first irradiance of the first illumination detection ball,
wherein the first illumination detection ball is any illumination detection ball included in the first simulated scene.
3. The 3D illumination estimation method according to claim 2, characterized in that obtaining the first irradiance of the first illumination detection ball according to the information of the point cloud comprises: obtaining the first irradiance of the first illumination detection ball according to information of first visible points, wherein the first visible points are points in the point cloud within a visible range of the first illumination detection ball.
4. The 3D illumination estimation method according to any one of claims 1-3, characterized in that estimating the spherical harmonic coefficients of the position of the virtual object according to the spherical harmonic coefficients of the illumination detection balls comprises:
performing a weighted summation of the spherical harmonic coefficients of the illumination detection balls to obtain the spherical harmonic coefficients of the position of the virtual object.
5. The 3D illumination estimation method according to claim 4, characterized in that a distance between an illumination detection ball and the virtual object is less than or equal to a first distance.
6. The 3D illumination estimation method according to any one of claims 1-5, characterized in that the method further comprises: obtaining the position of the virtual object according to a user input; or presetting the position of the virtual object.
7. A three-dimensional (3D) illumination estimation method, characterized in that the method comprises:
obtaining image information of a first scene, wherein the image information of the first scene comprises color information of pixels in the first scene;
obtaining a sky map corresponding to the first scene according to the image information of the first scene, wherein the sky map corresponding to the first scene indicates an illumination distribution of the first scene; and
estimating spherical harmonic coefficients of a virtual object according to information of the sky map, wherein the information of the sky map comprises spherical harmonic coefficients of the sky map.
8. The 3D illumination estimation method according to claim 7, characterized in that estimating the spherical harmonic coefficients of the virtual object according to the information of the sky map comprises:
using the spherical harmonic coefficients of the sky map as the spherical harmonic coefficients of the virtual object.
9. The 3D illumination estimation method according to claim 7 or 8, characterized in that the method further comprises: obtaining a position of the virtual object according to a user input; or presetting the position of the virtual object.
10. An electronic device, characterized in that the electronic device comprises: an acquisition module, an establishment module, and an estimation module;
the acquisition module is configured to obtain image information of a first scene, wherein the image information of the first scene comprises color information of pixels in the first scene and depth information of the pixels in the first scene;
the establishment module is configured to establish a first simulated scene according to the image information of the first scene, wherein the first simulated scene comprises at least two illumination detection balls and a point cloud;
the estimation module is configured to estimate spherical harmonic coefficients of the illumination detection balls according to information of the point cloud included in the first simulated scene, wherein the information of the point cloud comprises position information and irradiance information of points in the point cloud; and
the estimation module is further configured to estimate spherical harmonic coefficients of a position of a virtual object according to the spherical harmonic coefficients of the illumination detection balls.
11. The electronic device according to claim 10, characterized in that:
the estimation module is specifically configured to obtain a first irradiance of a first illumination detection ball according to the information of the point cloud; and
the estimation module is further specifically configured to estimate spherical harmonic coefficients of the first illumination detection ball according to the first irradiance of the first illumination detection ball,
wherein the first illumination detection ball is any illumination detection ball included in the first simulated scene.
12. The electronic device according to claim 11, characterized in that the acquisition module is specifically configured to obtain the first irradiance of the first illumination detection ball according to information of first visible points, wherein the first visible points are points in the point cloud within a visible range of the first illumination detection ball.
13. The electronic device according to any one of claims 10-12, characterized in that the estimation module is further specifically configured to perform a weighted summation of the spherical harmonic coefficients of the illumination detection balls to obtain the spherical harmonic coefficients of the position of the virtual object.
14. The electronic device according to claim 13, characterized in that a distance between an illumination detection ball and the virtual object is less than or equal to a first distance.
15. The electronic device according to any one of claims 10-14, characterized in that the acquisition module is further configured to obtain the position of the virtual object according to a user input, or the position of the virtual object is preset.
16. An electronic device, characterized in that the electronic device comprises: an acquisition module and an estimation module;
the acquisition module is configured to obtain image information of a first scene, wherein the image information of the first scene comprises color information of pixels in the first scene;
the acquisition module is further configured to obtain a sky map corresponding to the first scene according to the image information of the first scene, wherein the sky map corresponding to the first scene indicates an illumination distribution of the first scene; and
the estimation module is configured to estimate spherical harmonic coefficients of a virtual object according to information of the sky map, wherein the information of the sky map comprises spherical harmonic coefficients of the sky map.
17. The electronic device according to claim 16, characterized in that the estimation module is specifically configured to use the spherical harmonic coefficients of the sky map as the spherical harmonic coefficients of the virtual object.
18. The electronic device according to claim 16 or 17, characterized in that the acquisition module is further configured to obtain a position of the virtual object according to a user input, or the position of the virtual object is preset.
19. A circuit system, applied to an electronic device, characterized in that the circuit system comprises:
at least one processor and program instructions that, when executed by the at least one processor, cause the method according to any one of claims 1-6 to be performed.
20. A circuit system, applied to an electronic device, characterized in that the circuit system comprises:
at least one processor and program instructions that, when executed by the at least one processor, cause the method according to any one of claims 7-9 to be performed.
21. A computer storage medium, characterized in that the computer-readable storage medium stores program instructions that, when run, perform the method according to any one of claims 1-6.
22. A computer storage medium, characterized in that the computer-readable storage medium stores program instructions that, when run, perform the method according to any one of claims 7-9.
CN201910586485.XA 2019-03-26 2019-07-01 3D illumination estimation method and electronic equipment Active CN110458902B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910234517X 2019-03-26
CN201910234517 2019-03-26

Publications (2)

Publication Number Publication Date
CN110458902A true CN110458902A (en) 2019-11-15
CN110458902B CN110458902B (en) 2022-04-05

Family

ID=68481912

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910586485.XA Active CN110458902B (en) 2019-03-26 2019-07-01 3D illumination estimation method and electronic equipment

Country Status (1)

Country Link
CN (1) CN110458902B (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021103137A1 (en) * 2019-11-28 2021-06-03 浙江大学 Indoor scene illumination estimation model, method and device, and storage medium and rendering method
WO2021151380A1 (en) * 2020-01-30 2021-08-05 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method for rendering virtual object based on illumination estimation, method for training neural network, and related products
CN113298588A (en) * 2020-06-19 2021-08-24 阿里巴巴集团控股有限公司 Method and device for providing object information and electronic equipment
CN113537194A (en) * 2021-07-15 2021-10-22 Oppo广东移动通信有限公司 Illumination estimation method, illumination estimation device, storage medium, and electronic apparatus
CN114125310A (en) * 2022-01-26 2022-03-01 荣耀终端有限公司 Photographing method, terminal device and cloud server
CN114979457A (en) * 2021-02-26 2022-08-30 华为技术有限公司 Image processing method and related device
CN115375827A (en) * 2022-07-21 2022-11-22 荣耀终端有限公司 Illumination estimation method and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110182352A1 (en) * 2005-03-31 2011-07-28 Pace Charles P Feature-Based Video Compression
CN104268923A (en) * 2014-09-04 2015-01-07 无锡梵天信息技术股份有限公司 Illumination method based on picture level images
US20150279113A1 (en) * 2014-03-25 2015-10-01 Metaio Gmbh Method and system for representing a virtual object in a view of a real environment
CN108235053A (en) * 2016-12-19 2018-06-29 中国电信股份有限公司 Interactive rendering intent, equipment and system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110182352A1 (en) * 2005-03-31 2011-07-28 Pace Charles P Feature-Based Video Compression
US20150279113A1 (en) * 2014-03-25 2015-10-01 Metaio Gmbh Method and system for representing a virtual object in a view of a real environment
CN106133796A (en) * 2014-03-25 2016-11-16 Metaio有限公司 For representing the method and system of virtual objects in the view of true environment
CN104268923A (en) * 2014-09-04 2015-01-07 无锡梵天信息技术股份有限公司 Illumination method based on picture level images
CN108235053A (en) * 2016-12-19 2018-06-29 中国电信股份有限公司 Interactive rendering intent, equipment and system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
He Feifei: "Research on Key Technologies of Virtual-Real Fusion for Image-Based Outdoor Scenes", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021103137A1 (en) * 2019-11-28 2021-06-03 浙江大学 Indoor scene illumination estimation model, method and device, and storage medium and rendering method
WO2021151380A1 (en) * 2020-01-30 2021-08-05 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method for rendering virtual object based on illumination estimation, method for training neural network, and related products
CN113298588A (en) * 2020-06-19 2021-08-24 阿里巴巴集团控股有限公司 Method and device for providing object information and electronic equipment
CN114979457A (en) * 2021-02-26 2022-08-30 华为技术有限公司 Image processing method and related device
CN114979457B (en) * 2021-02-26 2023-04-07 华为技术有限公司 Image processing method and related device
CN113537194A (en) * 2021-07-15 2021-10-22 Oppo广东移动通信有限公司 Illumination estimation method, illumination estimation device, storage medium, and electronic apparatus
CN114125310A (en) * 2022-01-26 2022-03-01 荣耀终端有限公司 Photographing method, terminal device and cloud server
CN115375827A (en) * 2022-07-21 2022-11-22 荣耀终端有限公司 Illumination estimation method and electronic equipment
CN115375827B (en) * 2022-07-21 2023-09-15 荣耀终端有限公司 Illumination estimation method and electronic equipment

Also Published As

Publication number Publication date
CN110458902B (en) 2022-04-05

Similar Documents

Publication Publication Date Title
CN110458902A (en) 3D illumination estimation method and electronic equipment
CN110445978A (en) A kind of image pickup method and equipment
CN109766043A (en) The operating method and electronic equipment of electronic equipment
CN110495819B (en) Robot control method, robot, terminal, server and control system
CN109544618A (en) A kind of method and electronic equipment obtaining depth information
CN110506416A (en) A kind of method and terminal of terminal switching camera
CN110012154A (en) A kind of control method and electronic equipment of the electronic equipment with Folding screen
CN110119684A (en) Image-recognizing method and electronic equipment
CN110035141A (en) A kind of image pickup method and equipment
CN110347269A (en) A kind of sky mouse mode implementation method and relevant device
CN109793498A (en) A kind of skin detecting method and electronic equipment
CN109274828A (en) A kind of method, control method and electronic equipment generating screenshot
CN110401767A (en) Information processing method and equipment
CN110337020A (en) A kind of control method and relevant apparatus showing equipment
WO2021169515A1 (en) Method for data exchange between devices, and related device
CN112312366A (en) Method, electronic equipment and system for realizing functions through NFC (near field communication) tag
CN110248037A (en) A kind of identity document scan method and device
CN114610193A (en) Content sharing method, electronic device, and storage medium
CN114727220A (en) Equipment searching method and electronic equipment
CN114880251A (en) Access method and access device of storage unit and terminal equipment
CN110138999A (en) A kind of papers-scanning method and device for mobile terminal
WO2022214004A1 (en) Target user determination method, electronic device and computer-readable storage medium
EP4224485A1 (en) Adaptive action evaluation method, electronic device, and storage medium
CN113572957B (en) Shooting focusing method and related equipment
CN115393676A (en) Gesture control optimization method and device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant