CN108881739B - Image generation method, device, terminal and storage medium - Google Patents

Image generation method, device, terminal and storage medium Download PDF

Info

Publication number
CN108881739B
CN108881739B
Authority
CN
China
Prior art keywords
matrix
image
original image
brightness
brightness component
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810790667.4A
Other languages
Chinese (zh)
Other versions
CN108881739A (en)
Inventor
陈志博
石楷弘
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Tencent Cloud Computing Beijing Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201810790667.4A priority Critical patent/CN108881739B/en
Publication of CN108881739A publication Critical patent/CN108881739A/en
Application granted granted Critical
Publication of CN108881739B publication Critical patent/CN108881739B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/741Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

The embodiment of the invention discloses an image generation method, an image generation device, a terminal and a storage medium, belonging to the field of computer technology. The method comprises: acquiring an original image matrix according to optical signals collected by a plurality of photosensitive elements in an image sensor, the original image matrix comprising the image information output by the plurality of photosensitive elements; acquiring, from the original image matrix, a brightness component matrix representing the brightness components contained in that image information; acquiring, according to the brightness component matrix and a preset first convolution kernel, a second convolution kernel matched to the original image matrix; performing convolution processing on the original image matrix according to the second convolution kernel; and generating an image according to the processed image matrix. The signal detail components in the original image matrix are thereby enhanced, the problem of the image being locally too bright or too dark is effectively alleviated, and the quality of the generated image is improved.

Description

Image generation method, device, terminal and storage medium
Technical Field
The embodiment of the invention relates to the technical field of computers, in particular to an image generation method, an image generation device, a terminal and a storage medium.
Background
An image sensor is an element that converts optical signals into electrical signals. With the continuous progress of science and technology and the rapid development of multimedia technology, image sensors are widely used in various terminals: a terminal can collect optical signals through its image sensor and generate an image from the electrical signals converted from those optical signals.
In the related art, an image sensor has a working range within which the intensity of an optical signal can be measured normally. When the intensity of the optical signal in the environment is too high or too low and exceeds this range, the image generated by the image sensor contains a high-brightness blind area or a low-brightness blind area. To mitigate this, the image sensor sets parameters such as amplifier gain and sampling time according to the estimated intensity of the ambient optical signal; after the optical signal is converted into an electrical signal, the electrical signal is amplified by the amplifier, and the image is generated from the amplified signal. This is equivalent to enlarging the working range of the image sensor and avoids, as far as possible, high-brightness or low-brightness blind areas in the image.
However, this scheme can only avoid high-brightness or low-brightness blind areas to a certain extent; the image may still be locally too bright or too dark, resulting in poor image quality.
Disclosure of Invention
The embodiment of the invention provides an image generation method, an image generation device, a terminal and a storage medium, which can solve the problems in the related art. The technical scheme is as follows:
in one aspect, an image generation method is provided, and the method includes:
acquiring an original image matrix according to optical signals collected by a plurality of photosensitive elements in an image sensor, wherein the original image matrix comprises image information output by the plurality of photosensitive elements;
acquiring a brightness component matrix according to the original image matrix, wherein the brightness component matrix is used for representing brightness components contained in the image information output by the photosensitive elements;
acquiring a second convolution kernel matched with the original image matrix according to the brightness component matrix and a preset first convolution kernel;
and performing convolution processing on the original image matrix according to the second convolution kernel, and generating an image according to the processed image matrix.
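The four steps above can be sketched as a small numpy pipeline. This is an illustrative sketch, not the patent's exact algorithm: the BT.601 luma weights, the sigmoid mapping constants, and the reduction of the scaling matrix to a single factor are all assumptions.

```python
import numpy as np

def _convolve2d(img, k):
    # naive 2-D convolution with edge padding (same-size output)
    kh, kw = k.shape
    p = np.pad(img, ((kh // 2, kh // 2), (kw // 2, kw // 2)), mode="edge")
    out = np.empty_like(img, dtype=np.float64)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(p[i:i + kh, j:j + kw] * k)
    return out

def generate_image(raw, first_kernel, gain=4.0):
    raw = raw.astype(np.float64)                       # step 1: original image matrix
    luma = (0.299 * raw[..., 0] + 0.587 * raw[..., 1]
            + 0.114 * raw[..., 2])                     # step 2: luminance component matrix
    scale = 1.0 / (1.0 + np.exp(-gain * (luma / 255.0 - 0.5)))
    second_kernel = first_kernel * scale.mean()        # step 3: second kernel matched to image
    out = np.stack([_convolve2d(raw[..., c], second_kernel)
                    for c in range(raw.shape[-1])], axis=-1)  # step 4: convolve, generate
    return np.clip(out, 0, 255).astype(np.uint8)
```

A usage example would pass the raw sensor matrix and a preset first kernel, e.g. `generate_image(raw, np.zeros((3, 3)) + np.eye(3) / 3)`; the sigmoid factor then brightens or damps the response according to the scene's average luminance.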
In another aspect, there is provided an image generating apparatus, the apparatus comprising:
an original acquisition module, configured to acquire an original image matrix according to optical signals collected by a plurality of photosensitive elements in an image sensor, wherein the original image matrix comprises the image information output by the plurality of photosensitive elements;
the brightness acquisition module is used for acquiring a brightness component matrix according to the original image matrix, wherein the brightness component matrix is used for representing brightness components contained in the image information output by the photosensitive elements;
a convolution kernel obtaining module, configured to obtain a second convolution kernel matched with the original image matrix according to the luminance component matrix and a preset first convolution kernel;
and the image generation module is used for performing convolution processing on the original image matrix according to the second convolution kernel and generating an image according to the processed image matrix.
In another aspect, a terminal for generating an image is provided. The terminal comprises a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to perform the operations of the image generation method.
In yet another aspect, a computer-readable storage medium is provided, storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to perform the operations of the image generation method.
The technical scheme provided by the embodiment of the invention has the beneficial effects that at least:
according to the method, the device, the terminal and the storage medium provided by the embodiment of the invention, the original image matrix is obtained, the brightness component matrix is obtained according to the original image matrix, the second convolution kernel matched with the original image matrix is obtained according to the brightness component matrix and the preset first convolution kernel, the original image matrix is subjected to convolution processing based on the second convolution kernel, and the image is generated according to the processed image matrix, so that the detail components in the image can be enhanced, the problem of local over-brightness or over-darkness in the image is effectively solved, and the image quality is improved.
Drawings
In order to illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention; those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a flow chart of an image generation method provided by an embodiment of the invention;
fig. 2 is a schematic diagram of a sigmoid mapping function according to an embodiment of the present invention;
FIG. 3 is a schematic flow chart of the operation provided by the embodiment of the invention;
fig. 4 is a schematic structural diagram of an image generating apparatus according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the embodiments of the present invention will be described in further detail with reference to the accompanying drawings.
In the related art, when the optical signal intensity in the environment of the image sensor is too high or too low and exceeds the range within which the sensor can measure normally, the image sensor amplifies the electrical signal by setting the gain of an amplifier, thereby expanding its working range and avoiding high-brightness or low-brightness blind areas in the image. However, this scheme can only avoid such blind areas to a certain extent; the image may still be locally too bright or too dark, resulting in poor image quality.
The embodiment of the present invention provides an image generation method in which a second convolution kernel matched with the original image matrix is obtained according to the brightness component matrix of the original image matrix and a preset first convolution kernel, and the original image matrix is convolved with the second convolution kernel, so that the detail components of the image information in the original image matrix are enhanced, the problem of local over-brightness or over-darkness is alleviated, and the image quality is improved.
The embodiment of the invention can be applied to any scene for generating the image. For example, when the terminal needs to take a picture, a clearer picture can be generated by adopting the method provided by the embodiment of the invention. Or, when the terminal needs to shoot a video, the method provided by the embodiment of the invention can be adopted to generate a plurality of frames of images, and finally the video composed of the plurality of frames of images is obtained.
Fig. 1 is a flowchart of an image generation method according to an embodiment of the present invention. The execution subject of the embodiment of the present invention is a terminal, and referring to fig. 1, the method includes:
101. the terminal acquires an original image matrix according to optical signals collected by a plurality of photosensitive elements in the image sensor.
The terminal can be any of various types of devices, such as a mobile phone, a tablet computer or a camera, equipped with an image sensor and a display screen: an image can be generated through the image sensor and displayed on the display screen. The image sensor comprises a plurality of photosensitive elements, each of which senses an optical signal in the external environment and converts it into a corresponding electrical signal. A corresponding image can be acquired from the electrical signals output by the plurality of photosensitive elements: the image comprises a plurality of pixel units respectively corresponding to the photosensitive elements, and the electrical signal output by each photosensitive element is the image information of the corresponding pixel unit in the image.
In the embodiment of the present invention, after the plurality of photosensitive elements collect the optical signals and output the electrical signals, the terminal acquires the original image matrix according to the collected optical signals. The original image matrix comprises the image information output by the plurality of photosensitive elements, and an image can subsequently be generated from it.
102. The terminal acquires a luminance component matrix from the original image matrix, the luminance component matrix representing the luminance components contained in the image information output by the plurality of photosensitive elements, and then performs steps 103 and 105.
In the embodiment of the present invention, the terminal does not directly generate the image after acquiring the original image matrix; instead, it processes the original image matrix and then generates the image from the processed matrix, so that the detail components in the image are enhanced and local over-brightness or over-darkness is avoided.
For this purpose, the terminal extracts the luminance component contained in each item of image information in the original image matrix to form the luminance component matrix.
In one possible implementation, each item of image information in the original image matrix contains three color channels, red, green and blue, and its color is obtained by superposing them; that is, the three channels together constitute color information in the RGB (red/green/blue) space. To extract the brightness component, each item of image information is converted from the RGB space to the YUV space, a color encoding that describes an image with one luminance component and two color-difference components. The luminance component and color-difference components of each item of image information are thus obtained, and the luminance component matrix is formed from the luminance components of the image information in the original image matrix.
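As a concrete illustration of this conversion, the sketch below uses the common BT.601 luma weights; the patent does not commit to a specific RGB-to-YUV matrix, so the exact coefficients here are an assumption.

```python
import numpy as np

def rgb_to_yuv(rgb):
    """Split an H x W x 3 RGB image matrix into a luminance component
    matrix Y and two color-difference matrices U, V (BT.601 weights)."""
    rgb = rgb.astype(np.float64)
    y = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    u = 0.492 * (rgb[..., 2] - y)  # blue-difference chroma
    v = 0.877 * (rgb[..., 0] - y)  # red-difference chroma
    return y, u, v
```

For a neutral gray pixel the two color-difference components vanish and Y equals the gray level, which matches the idea that the luminance component matrix captures brightness alone.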
In one possible implementation, the luminance component matrix formed directly from the luminance components of the image information in the original image matrix is taken as a first luminance component matrix, which contains the luminance components of the image information output by the plurality of photosensitive elements. The terminal then convolves the first luminance component matrix with a preset third convolution kernel to obtain a second luminance component matrix.
The third convolution kernel is convolved with each local region of the first luminance component matrix, processing the matrix locally. This has a certain suppressing effect on noise present in the first luminance component matrix and alleviates the adverse influence of noise interference on it.
For example, the third convolution kernel may be a 3x3 all-ones matrix:

    [1 1 1]
    [1 1 1]
    [1 1 1]
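Applied to the first luminance component matrix, this all-ones kernel sums each 3x3 neighbourhood. Normalising by 9 (an assumption here, since the patent shows the un-normalised matrix) turns it into a local mean filter that smooths pixel noise:

```python
import numpy as np

third_kernel = np.ones((3, 3)) / 9.0  # 3x3 all-ones kernel, normalised to a mean filter

def smooth_luma(luma_1):
    """Convolve the first luminance matrix with the third kernel
    (edge padding) to get the noise-suppressed second luminance matrix."""
    p = np.pad(luma_1, 1, mode="edge")
    luma_2 = np.empty_like(luma_1, dtype=np.float64)
    for i in range(luma_1.shape[0]):
        for j in range(luma_1.shape[1]):
            luma_2[i, j] = np.sum(p[i:i + 3, j:j + 3] * third_kernel)
    return luma_2
```

A flat region passes through unchanged, while an isolated noise spike is spread over its neighbourhood, which is the suppression effect described above.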
103. The terminal acquires a second convolution kernel matched with the original image matrix according to the brightness component matrix and the preset first convolution kernel.
In the embodiment of the present invention, the terminal presets a mapping function and a first convolution kernel. It calls the preset mapping function to perform a mapping operation on the acquired luminance component matrix, obtaining a scaling matrix corresponding to the luminance component matrix, and then scales the first convolution kernel according to the scaling matrix to obtain the second convolution kernel. Since the second convolution kernel is obtained by scaling the first convolution kernel and is determined from the luminance component matrix, it is effectively matched to the original image matrix. The preset mapping function can be set according to actual needs.
In one possible implementation, scaling the first convolution kernel according to the scaling matrix may include adding the scaling matrix to the first convolution kernel, multiplying the scaling matrix with the first convolution kernel, or performing other operations.
For example, the preset mapping function may be a sigmoid function. As shown in fig. 2, the straight line in the figure represents the original image data; after mapping by the sigmoid function, the curve in the figure is obtained.
In another possible implementation, the preset mapping function may process only the high-brightness part and leave the low-brightness part unchanged, so as to suppress only local over-brightness in the image.
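A minimal sketch of step 103, assuming the sigmoid mapping and the multiplicative variant of the scaling: the per-element scaling matrix is reduced to its mean before scaling the kernel, which is one simple way to match a single second kernel to the whole image (the patent leaves this detail open, so the gain constant and the reduction are assumptions).

```python
import numpy as np

def second_kernel_for(luma, first_kernel, gain=8.0):
    norm = luma / 255.0                                   # luminance in [0, 1]
    scaling = 1.0 / (1.0 + np.exp(-gain * (norm - 0.5)))  # sigmoid mapping -> scaling matrix
    return first_kernel * scaling.mean()                  # multiplicative scaling of the first kernel
```

At mid-gray the sigmoid sits at its midpoint, so the first kernel is scaled by one half; brighter images push the factor toward 1 and darker images toward 0.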
104. The terminal performs convolution processing on the original image matrix according to the second convolution kernel, and generates an image according to the processed image matrix.
The terminal convolves the original image matrix with the second convolution kernel to obtain the processed image matrix, and generates an image from it. This enhances the detail components in the image and avoids, as far as possible, local over-brightness or over-darkness.
In addition, depending on the chosen convolution kernel, the convolution can sharpen or blur the image to a certain degree. The resolution of the output image can also be adjusted by changing the step length (stride) of the convolution kernel, so the method can be applied in scenarios where the image resolution needs to be adjusted.
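The remark about the step length can be made concrete with a strided convolution sketch: increasing the stride shrinks the output, which is how the kernel's step length controls the resolution of the generated image. The function and its padding-free "valid" output geometry are illustrative choices, not fixed by the patent.

```python
import numpy as np

def conv2d_stride(img, k, stride=1):
    """Valid (no padding) 2-D convolution; larger stride -> smaller output."""
    kh, kw = k.shape
    H = (img.shape[0] - kh) // stride + 1
    W = (img.shape[1] - kw) // stride + 1
    out = np.empty((H, W))
    for i in range(H):
        for j in range(W):
            y, x = i * stride, j * stride
            out[i, j] = np.sum(img[y:y + kh, x:x + kw] * k)
    return out
```

With an 8x8 input and a 3x3 kernel, stride 1 yields a 6x6 output while stride 2 yields 3x3, roughly halving the resolution.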
The image sensor can provide an interface for the terminal. When the image sensor obtains the processed image matrix, it outputs the matrix through this interface; the terminal acquires the processed image matrix through the interface and displays the corresponding image on the display screen. For example, fig. 3 shows a schematic flow chart of the operation of the embodiment of the present invention.
It should be noted that after the image is generated, the terminal may further process it. For example, the generated image can be used as a sample image to train a neural network model; because the image quality is good, the performance of the model can be improved. The generated image can also be used for face recognition, where the better image quality improves recognition accuracy. Or, continuous shooting can be performed using the method provided by the embodiment of the present invention to generate multiple images, which then form a video with clear pictures.
105. The terminal acquires an adjusted exposure time according to the brightness component matrix, and performs, using the adjusted exposure time, the step of acquiring an original image matrix according to the optical signals collected by the plurality of photosensitive elements in the image sensor.
In step 101, the photosensitive elements control the duration of collecting the optical signal according to the set exposure time, and the luminance component matrix represents the luminance of the optical signal in the external environment. Therefore, after acquiring the luminance component matrix, the terminal can adjust the exposure time according to it, so as to set an exposure time better matched to the external environment. The purpose is to keep the exposure time neither too large nor too small, so that the sampled image is relatively balanced.
Therefore, the terminal obtains an average brightness component from the plurality of brightness components in the brightness component matrix, obtains from it an exposure time inversely related to the average brightness component, and uses the obtained exposure time to control, in subsequent acquisition, the duration for which the plurality of photosensitive elements in the image sensor collect the optical signals in the environment.
The image information collected by a photosensitive element has a limited range: it cannot be less than 0 and cannot exceed a maximum threshold, and once it exceeds the maximum threshold an overflow occurs. Therefore, the embodiment of the present invention controls the collection duration of the photosensitive elements according to the obtained exposure time, so that the range of the image information is controlled: each item of image information in the original image matrix stays close to 0 without going below it, its maximum value does not exceed the maximum threshold, and overflow is avoided.
To obtain the average luminance component, the terminal may first average the elements of each column in the luminance component matrix to obtain a column vector of column means, and then average the elements of that vector.
In one possible implementation, the terminal may obtain the exposure time from the average luminance component using the following formula: t = a·x² + b·x + c, where t denotes the exposure time, x denotes the average luminance component, a, b, and c are coefficients, and a < 0.
In the related art, when the brightness of the external environment where the image sensor is located changes rapidly, one local area may be too bright while another local area around it is too dark. The output image then suffers from local over-brightness or over-darkness, and an overflow phenomenon occurs, so that the details of data beyond the normal range are submerged and cannot be reflected in the image.
According to the method provided by the embodiment of the present invention, the terminal obtains the original image matrix, obtains the brightness component matrix from it, obtains the second convolution kernel matched with the original image matrix according to the brightness component matrix and the preset first convolution kernel, convolves the original image matrix with the second convolution kernel, and generates an image according to the processed image matrix. The signal detail components in the original image matrix are thereby enhanced, the problem of local over-brightness or over-darkness is effectively alleviated, the quality of the generated image is improved, and further processing of the image is facilitated.
Fig. 4 is a schematic structural diagram of an image generating apparatus according to an embodiment of the present invention. Referring to fig. 4, the apparatus includes:
an original acquisition module 401, configured to perform the step of acquiring an original image matrix according to optical signals collected by a plurality of photosensitive elements in the image sensor in the foregoing embodiment;
a luminance obtaining module 402, configured to perform the step of obtaining a luminance component matrix according to the original image matrix in the foregoing embodiment;
a convolution kernel obtaining module 403, configured to perform the step of obtaining a second convolution kernel matched with the original image matrix according to the luminance component matrix and a preset first convolution kernel in the foregoing embodiment;
the image generating module 404 is configured to perform the steps of performing convolution processing on the original image matrix according to the second convolution kernel and generating an image according to the processed image matrix in the foregoing embodiment.
Optionally, the brightness obtaining module 402 includes:
a first luminance obtaining unit configured to perform the step of obtaining a first luminance component matrix from the original image matrix in the above embodiment;
and a second luminance obtaining unit, configured to perform convolution processing on the first luminance component matrix according to a preset third convolution kernel in the foregoing embodiment to obtain a second luminance component matrix.
Optionally, the brightness obtaining module 402 includes:
the space conversion unit is used for executing the step of converting each image information in the original image matrix from an RGB space to a YUV space to obtain a brightness component contained in each image information in the original image matrix;
a constructing unit configured to execute the step of constructing the luminance component matrix based on the luminance component contained in each image information in the original image matrix in the above-described embodiment.
Optionally, the convolution kernel obtaining module 403 includes:
a scaling matrix obtaining unit, configured to execute the step of calling a preset mapping function in the above embodiment, and processing the luminance component matrix to obtain a scaling matrix corresponding to the luminance component matrix;
and the convolution kernel obtaining unit is configured to perform a step of performing scaling processing on the first convolution kernel according to the scaling matrix in the above embodiment to obtain a second convolution kernel.
Optionally, the apparatus further comprises:
an average luminance component obtaining module, configured to perform the step of obtaining an average luminance component according to the plurality of luminance components in the luminance component matrix in the foregoing embodiment;
and the exposure time acquisition module is used for executing the step of acquiring the exposure time in inverse proportion to the average brightness component according to the average brightness component in the embodiment.
It should be noted that: the image generating apparatus provided in the above embodiment is only illustrated by dividing the functional modules when generating an image, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the internal structure of the terminal may be divided into different functional modules to complete all or part of the functions described above. In addition, the image generation apparatus and the image generation method provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments in detail and are not described herein again.
Fig. 5 shows a block diagram of a terminal 500 according to an exemplary embodiment of the present invention. The terminal 500 may be a portable mobile terminal such as a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a notebook computer, a desktop computer, a head-mounted device, or any other intelligent terminal. The terminal 500 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, and the like.
In general, the terminal 500 includes: a processor 501 and a memory 502.
The processor 501 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 501 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 501 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 501 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, processor 501 may also include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
Memory 502 may include one or more computer-readable storage media, which may be non-transitory. Memory 502 may also include high-speed random access memory as well as non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in memory 502 is used to store at least one instruction to be executed by the processor 501 to implement the image generation method provided by the method embodiments herein.
In some embodiments, the terminal 500 may further optionally include: a peripheral interface 503 and at least one peripheral. The processor 501, memory 502 and peripheral interface 503 may be connected by a bus or signal lines. Each peripheral may be connected to the peripheral interface 503 by a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 504, touch screen display 505, camera 506, audio circuitry 507, positioning components 508, and power supply 509.
The peripheral interface 503 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 501 and the memory 502. In some embodiments, the processor 501, memory 502, and peripheral interface 503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 501, the memory 502, and the peripheral interface 503 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 504 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 504 communicates with communication networks and other communication devices via electromagnetic signals, converting an electrical signal into an electromagnetic signal for transmission, or converting a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 504 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 504 may communicate with other terminals via at least one wireless communication protocol, including but not limited to: metropolitan area networks, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 504 may further include NFC (Near Field Communication) related circuits, which is not limited in this application.
The display screen 505 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 505 is a touch display screen, the display screen 505 also has the ability to capture touch signals on or over its surface. The touch signal may be input to the processor 501 as a control signal for processing. At this point, the display screen 505 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 505, disposed on the front panel of the terminal 500; in other embodiments, there may be at least two display screens 505, respectively disposed on different surfaces of the terminal 500 or in a folded design; in still other embodiments, the display screen 505 may be a flexible display disposed on a curved or folded surface of the terminal 500. The display screen 505 may even be arranged in a non-rectangular irregular figure, i.e. a shaped screen. The display screen 505 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or similar materials.
The camera assembly 506 is used to capture images or video. Optionally, camera assembly 506 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 506 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
Audio circuitry 507 may include a microphone and a speaker. The microphone is used for collecting sound waves of the user and the environment, converting them into electrical signals, and inputting the electrical signals to the processor 501 for processing, or inputting them to the radio frequency circuit 504 to realize voice communication. For stereo sound collection or noise reduction, a plurality of microphones may be provided at different portions of the terminal 500. The microphone may also be an array microphone or an omni-directional pickup microphone. The speaker is used to convert electrical signals from the processor 501 or the radio frequency circuit 504 into sound waves. The speaker can be a traditional membrane speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, it can be used for purposes such as converting an electrical signal into a sound wave audible to humans, or converting an electrical signal into a sound wave inaudible to humans to measure a distance. In some embodiments, audio circuitry 507 may also include a headphone jack.
The positioning component 508 is used to locate the current geographic position of the terminal 500 for navigation or LBS (Location Based Service). The positioning component 508 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 509 is used to power the various components in the terminal 500. The power supply 509 may be an alternating current or direct current power source, or a disposable or rechargeable battery. When the power supply 509 includes a rechargeable battery, the rechargeable battery may support wired or wireless charging, and may also support fast-charge technology.
In some embodiments, terminal 500 also includes one or more sensors 510. The one or more sensors 510 include, but are not limited to: acceleration sensor 511, gyro sensor 512, pressure sensor 513, fingerprint sensor 514, optical sensor 515, and proximity sensor 516.
The acceleration sensor 511 may detect the magnitude of acceleration on three coordinate axes of the coordinate system established with the terminal 500. For example, the acceleration sensor 511 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 501 may control the touch screen 505 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 511. The acceleration sensor 511 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 512 may detect a body direction and a rotation angle of the terminal 500, and the gyro sensor 512 may cooperate with the acceleration sensor 511 to acquire a 3D motion of the user on the terminal 500. The processor 501 may implement the following functions according to the data collected by the gyro sensor 512: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
The pressure sensor 513 may be disposed on a side bezel of the terminal 500 and/or an underlying layer of the touch display screen 505. When the pressure sensor 513 is disposed on the side frame of the terminal 500, a user's holding signal of the terminal 500 may be detected, and the processor 501 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 513. When the pressure sensor 513 is disposed at the lower layer of the touch display screen 505, the processor 501 controls the operability control on the UI interface according to the pressure operation of the user on the touch display screen 505. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 514 is used to collect the user's fingerprint, and the processor 501 identifies the user according to the fingerprint collected by the fingerprint sensor 514, or the fingerprint sensor 514 itself identifies the user according to the collected fingerprint. Upon recognizing the user's identity as trusted, the processor 501 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, paying, changing settings, and the like. The fingerprint sensor 514 may be provided on the front, back, or side of the terminal 500. When a physical button or a vendor logo is provided on the terminal 500, the fingerprint sensor 514 may be integrated with the physical button or the vendor logo.
The optical sensor 515 is used to collect the ambient light intensity. In one embodiment, the processor 501 may control the display brightness of the touch display screen 505 based on the ambient light intensity collected by the optical sensor 515: when the ambient light intensity is high, the display brightness of the touch display screen 505 is turned up; when the ambient light intensity is low, it is turned down. In another embodiment, the processor 501 may also dynamically adjust the shooting parameters of the camera assembly 506 based on the ambient light intensity collected by the optical sensor 515.
The proximity sensor 516, also referred to as a distance sensor, is typically disposed on the front panel of the terminal 500 and is used to collect the distance between the user and the front surface of the terminal 500. In one embodiment, when the proximity sensor 516 detects that this distance gradually decreases, the processor 501 controls the touch display screen 505 to switch from the screen-on state to the screen-off state; when the proximity sensor 516 detects that the distance gradually increases, the processor 501 controls the touch display screen 505 to switch from the screen-off state to the screen-on state.
Those skilled in the art will appreciate that the configuration shown in fig. 5 is not intended to be limiting of terminal 500 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
An embodiment of the present invention further provides a terminal for generating an image, where the terminal includes a processor and a memory, the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the instruction, the program, the code set, or the instruction set is loaded and executed by the processor to implement the image generation method of the foregoing embodiment.
An embodiment of the present invention further provides a computer-readable storage medium, in which at least one instruction, at least one program, a code set, or a set of instructions is stored, and the instruction, the program, the code set, or the set of instructions is loaded and executed by a processor to implement the image generation method of the above-described embodiment.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only a preferred embodiment of the present invention, and should not be taken as limiting the invention, and any modifications, equivalents, improvements, etc. made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. An image generation method, characterized in that the method comprises:
acquiring an original image matrix according to optical signals collected by a plurality of photosensitive elements in an image sensor, wherein the original image matrix comprises image information output by the plurality of photosensitive elements;
acquiring a brightness component matrix according to the original image matrix, wherein the brightness component matrix is used for representing brightness components contained in the image information output by the photosensitive elements;
calling a preset mapping function, and processing the brightness component matrix to obtain a scaling matrix corresponding to the brightness component matrix;
according to the scaling matrix, scaling a preset first convolution kernel to obtain a second convolution kernel matched with the original image matrix;
and performing convolution processing on the original image matrix according to the second convolution kernel, and generating an image according to the processed image matrix.
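The steps of claim 1 can be sketched as follows. This is an illustrative reading of the claim, not the patentee's implementation: the grayscale simplification (using the normalized image itself as the luminance component matrix), the example mapping function, and all function names are assumptions.

```python
import numpy as np

def generate_image(raw, first_kernel, mapping):
    """Illustrative sketch of claim 1 (grayscale simplification; names assumed).

    raw          : H x W original image matrix (stand-in for sensor output)
    first_kernel : k x k preset first convolution kernel
    mapping      : preset mapping function; maps the luminance component
                   matrix to a scaling matrix of the same shape
    """
    # Step 2: luminance component matrix (for a grayscale stand-in,
    # the normalized image itself)
    luma = raw.astype(np.float64) / 255.0

    # Step 3: apply the preset mapping function -> scaling matrix
    scale = mapping(luma)

    # Steps 4-5: scale the first kernel per pixel to obtain the second
    # kernel, then convolve the original image with it
    k = first_kernel.shape[0]
    pad = k // 2
    padded = np.pad(raw.astype(np.float64), pad, mode="edge")
    out = np.empty(raw.shape, dtype=np.float64)
    for i in range(raw.shape[0]):
        for j in range(raw.shape[1]):
            second_kernel = first_kernel * scale[i, j]
            window = padded[i:i + k, j:j + k]
            out[i, j] = np.sum(window * second_kernel)
    return np.clip(np.rint(out), 0, 255).astype(np.uint8)
```

Because the second kernel differs per pixel, the convolution here is written as an explicit loop rather than a single fixed-kernel filter; a production version would vectorize this, but the loop form mirrors the claim's per-pixel scaling most directly.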
2. The method of claim 1, wherein obtaining a luminance component matrix from the original image matrix comprises:
acquiring a first brightness component matrix according to the original image matrix, wherein the first brightness component matrix comprises brightness components contained in image information output by the photosensitive elements;
and performing convolution processing on the first brightness component matrix according to a preset third convolution kernel to obtain a second brightness component matrix.
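Claim 2's smoothing step might look as follows, under the assumption that the preset third convolution kernel is a 3x3 box filter; the patent does not fix a specific kernel, and the function name is illustrative.

```python
import numpy as np

def second_luminance_matrix(first_luma, third_kernel=None):
    """Smooth the first luminance component matrix with a preset kernel."""
    # The "third" kernel is preset; a 3x3 box filter is an assumed example,
    # used here to suppress per-pixel luminance noise before mapping.
    if third_kernel is None:
        third_kernel = np.ones((3, 3)) / 9.0
    first = np.asarray(first_luma, dtype=np.float64)
    k = third_kernel.shape[0]
    pad = k // 2
    padded = np.pad(first, pad, mode="edge")  # replicate borders
    out = np.empty_like(first)
    for i in range(first.shape[0]):
        for j in range(first.shape[1]):
            out[i, j] = np.sum(padded[i:i + k, j:j + k] * third_kernel)
    return out
```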
3. The method of claim 1, wherein obtaining a luminance component matrix from the original image matrix comprises:
converting each image information in the original image matrix from an RGB space to a YUV space to obtain a brightness component contained in each image information in the original image matrix;
and forming the brightness component matrix according to the brightness component contained in each image information in the original image matrix.
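Claim 3's space conversion can be sketched with the BT.601 luma weights; the claim names only the RGB-to-YUV conversion, so the specific coefficients (and the function name) are assumptions.

```python
import numpy as np

def luminance_component_matrix(rgb):
    """Extract the Y (luminance) component of each pixel of an RGB image.

    rgb : H x W x 3 array in RGB channel order.
    """
    # BT.601 luma: Y = 0.299 R + 0.587 G + 0.114 B (assumed coefficients)
    rgb = np.asarray(rgb, dtype=np.float64)
    return 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
```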
4. The method of claim 1, further comprising:
obtaining an average brightness component according to a plurality of brightness components in the brightness component matrix;
and acquiring exposure time in inverse proportion to the average brightness component according to the average brightness component, wherein the photosensitive elements are used for controlling the time length of collecting optical signals according to the exposure time.
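Claim 4's exposure control can be sketched as below. The patent specifies only that the exposure time is inversely proportional to the average luminance component; the proportionality constant, its units, and the zero-guard are assumptions.

```python
import numpy as np

def exposure_time(luminance_matrix, k=8.0):
    """Return an exposure time inversely proportional to average luminance."""
    # Average of the luminance components in the matrix
    avg = float(np.mean(luminance_matrix))
    # Inverse proportion t = k / avg; k is an assumed constant (e.g. the
    # exposure time in milliseconds at unit average luminance). The guard
    # avoids division by zero for a fully dark frame.
    return k / max(avg, 1e-6)
```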
5. An image generation apparatus, characterized in that the apparatus comprises:
the system comprises an original acquisition module, a processing module and a processing module, wherein the original acquisition module is used for acquiring an original image matrix according to optical signals acquired by a plurality of photosensitive elements in an image sensor, and the original image matrix comprises image information output by the photosensitive elements;
the brightness acquisition module is used for acquiring a brightness component matrix according to the original image matrix, wherein the brightness component matrix is used for representing brightness components contained in the image information output by the photosensitive elements;
a convolution kernel obtaining module, configured to obtain a second convolution kernel matched with the original image matrix according to the luminance component matrix and a preset first convolution kernel;
the image generation module is used for carrying out convolution processing on the original image matrix according to the second convolution kernel and generating an image according to the processed image matrix;
the convolution kernel obtaining module comprises:
the scaling matrix obtaining unit is used for calling a preset mapping function and processing the brightness component matrix to obtain a scaling matrix corresponding to the brightness component matrix;
and the convolution kernel acquisition unit is used for carrying out scaling processing on the first convolution kernel according to the scaling matrix to obtain the second convolution kernel.
6. The apparatus of claim 5, wherein the brightness obtaining module comprises:
a first luminance obtaining unit configured to obtain a first luminance component matrix from the original image matrix, the first luminance component matrix including luminance components included in image information output by the plurality of photosensitive elements;
and the second brightness acquisition unit is used for performing convolution processing on the first brightness component matrix according to a preset third convolution kernel to obtain a second brightness component matrix.
7. The apparatus of claim 5, wherein the brightness obtaining module comprises:
the space conversion unit is used for converting each image information in the original image matrix from an RGB space to a YUV space to obtain a brightness component contained in each image information in the original image matrix;
and the forming unit is used for forming the brightness component matrix according to the brightness component contained in each image information in the original image matrix.
8. The apparatus of claim 5, further comprising:
an average brightness component obtaining module, configured to obtain an average brightness component according to the multiple brightness components in the brightness component matrix;
and the exposure time acquisition module is used for acquiring the exposure time in inverse proportion to the average brightness component according to the average brightness component, and the photosensitive elements are used for controlling the duration of collecting the optical signals according to the exposure time.
9. A terminal for generating an image, the terminal comprising a processor and a memory, the memory having stored therein at least one instruction, at least one program, set of codes, or set of instructions, the instruction, the program, the set of codes, or the set of instructions being loaded and executed by the processor to carry out operations performed in the image generation method according to any one of claims 1 to 4.
10. A computer-readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to carry out the operations performed in the image generation method according to any one of claims 1 to 4.
CN201810790667.4A 2018-07-18 2018-07-18 Image generation method, device, terminal and storage medium Active CN108881739B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810790667.4A CN108881739B (en) 2018-07-18 2018-07-18 Image generation method, device, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN108881739A CN108881739A (en) 2018-11-23
CN108881739B (en) 2020-01-10

Family

ID=64302921

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810790667.4A Active CN108881739B (en) 2018-07-18 2018-07-18 Image generation method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN108881739B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111583104B (en) * 2020-05-14 2022-08-19 展讯通信(上海)有限公司 Light spot blurring method and device, storage medium and computer equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105070247A (en) * 2015-07-29 2015-11-18 信利(惠州)智能显示有限公司 Method for determining nonuniform-brightness pixel points in display screen
CN106709890A (en) * 2017-01-18 2017-05-24 聚龙智瞳科技有限公司 Method and device for processing low-light video image
CN108259997A (en) * 2018-04-02 2018-07-06 腾讯科技(深圳)有限公司 Image correlation process method and device, intelligent terminal, server, storage medium
CN108268931A (en) * 2016-12-30 2018-07-10 华为技术有限公司 The methods, devices and systems of data processing

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8737687B2 (en) * 2011-12-30 2014-05-27 Honeywell International Inc. System and method for tracking a subject using raw images and tracking errors


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230707

Address after: 518057 Tencent Building, No. 1 High-tech Zone, Nanshan District, Shenzhen City, Guangdong Province, 35 floors

Patentee after: TENCENT TECHNOLOGY (SHENZHEN) Co.,Ltd.

Patentee after: TENCENT CLOUD COMPUTING (BEIJING) Co.,Ltd.

Address before: 518057 Tencent Building, No. 1 High-tech Zone, Nanshan District, Shenzhen City, Guangdong Province, 35 floors

Patentee before: TENCENT TECHNOLOGY (SHENZHEN) Co.,Ltd.