CN111669611B - Image processing method, device, terminal and storage medium - Google Patents

Info

Publication number
CN111669611B
CN111669611B (application number CN202010564642.XA)
Authority
CN
China
Prior art keywords
image
image data
difference
beauty
original image
Prior art date
Legal status
Active
Application number
CN202010564642.XA
Other languages
Chinese (zh)
Other versions
CN111669611A (en)
Inventor
梁衍鹏
吴文艺
Current Assignee
Guangzhou Fanxing Huyu IT Co Ltd
Original Assignee
Guangzhou Fanxing Huyu IT Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Fanxing Huyu IT Co Ltd filed Critical Guangzhou Fanxing Huyu IT Co Ltd
Priority to CN202010564642.XA priority Critical patent/CN111669611B/en
Publication of CN111669611A publication Critical patent/CN111669611A/en
Application granted granted Critical
Publication of CN111669611B publication Critical patent/CN111669611B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/44012Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving rendering scenes according to scene graphs, e.g. MPEG-4 scene graphs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application

Abstract

The embodiment of the application provides an image processing method, an image processing device, a terminal and a storage medium. The method comprises the following steps: acquiring image data of an original image; performing beauty processing on the original image to obtain image data of a beauty image corresponding to the original image; performing difference judgment on the image data of the original image and the image data of the beauty image to obtain a first difference degree; if the first difference degree is greater than a first difference degree threshold, sending the image data of the beauty image to a rendering process; performing difference judgment on the image data received by the rendering process and the image data of the beauty image to obtain a second difference degree; and if the second difference degree is smaller than a second difference degree threshold, rendering and displaying the beauty image based on the image data received by the rendering process. The embodiment of the application provides a dual detection mechanism, which reduces the probability that the original image is exposed to the audience due to a beautifying failure or a failure in transmitting the image data.

Description

Image processing method, device, terminal and storage medium
Technical Field
The embodiment of the application relates to the technical field of internet, in particular to an image processing method, an image processing device, a terminal and a storage medium.
Background
Currently, live broadcast applications are often provided with a beauty function, which anchor users use while broadcasting live to present more attractive images to audience users.
In the related art, an anchor user selects a beauty type to be used, such as adding a filter, thinning the face, or enlarging the eyes, from a beauty type list displayed on an upper layer of the live broadcast interface. The anchor terminal performs beauty processing on the captured image according to the selected beauty type, then transmits the image data obtained after the beauty processing to a rendering process, and the rendering process renders and displays the beauty image based on that image data.
In the related art, when the beauty processing fails, or a transmission error occurs while transmitting the image data obtained after the beauty processing to the rendering process, the anchor terminal displays the captured original image.
Disclosure of Invention
Embodiments of the present application provide an image processing method, an image processing apparatus, a terminal, and a storage medium, which reduce the probability of exposing an original image to viewers due to a failure in beautifying or a failure in transmitting image data. The technical scheme is as follows:
in one aspect, an embodiment of the present application provides an image processing method, where the method includes:
acquiring image data of an original image;
performing beauty treatment on the original image to obtain image data of a beauty image corresponding to the original image;
performing difference judgment on the image data of the original image and the image data of the beauty image to obtain a first difference degree between the original image and the beauty image;
if the first difference degree is larger than a first difference degree threshold value, sending image data of the beautifying image to a rendering process;
performing difference judgment on the image data received by the rendering process and the image data of the beauty image to obtain a second difference degree between the image corresponding to the image data received by the rendering process and the beauty image;
and if the second difference is smaller than a second difference threshold value, rendering and displaying the beautifying image based on the image data received by the rendering process, wherein the second difference threshold value is smaller than the first difference threshold value.
In another aspect, an embodiment of the present application provides an image processing apparatus, including:
the data acquisition module is used for acquiring image data of an original image;
the beautifying processing module is used for carrying out beautifying processing on the original image to obtain image data of a beautifying image corresponding to the original image;
the first judgment module is used for carrying out difference judgment on the image data of the original image and the image data of the beauty image to obtain a first difference degree between the original image and the beauty image;
a data sending module, configured to send image data of the beauty image to a rendering process if the first difference is greater than a first difference threshold;
the second judgment module is used for performing difference judgment on the image data received by the rendering process and the image data of the beauty image to obtain a second difference degree between the image corresponding to the image data received by the rendering process and the beauty image;
and the rendering and displaying module is used for rendering and displaying the beautifying image based on the image data received by the rendering process if the second difference is smaller than a second difference threshold, wherein the second difference threshold is smaller than the first difference threshold.
In yet another aspect, embodiments of the present application provide a terminal including a processor, a memory and a flexible display, where the memory stores a computer program, and the computer program is loaded and executed by the processor to implement the image processing method according to an aspect.
In still another aspect, the present application provides a computer-readable storage medium, in which a computer program is stored, and the computer program is loaded and executed by a processor to implement the image processing method according to the aspect.
In yet another aspect, an embodiment of the present application provides a computer program product, where the computer program product includes computer instructions stored in a computer-readable storage medium, and a processor of a computer device reads the computer instructions from the computer-readable storage medium, and executes the computer instructions, so that the computer device performs the image processing method provided in the foregoing aspect or various alternative implementations of the aspect.
The technical scheme provided by the embodiment of the application can bring the beneficial effects of at least comprising:
the method comprises the steps of carrying out first difference detection after finishing beauty treatment, judging whether beauty treatment succeeds or not by detecting the difference degree between a beauty image and an original image, then carrying out second difference detection after a rendering process receives image data, and judging whether errors occur in the process of transmitting the image data of the beauty image to the rendering process by detecting the difference degree between the image data received by the rendering process and the image data of the beauty image.
Drawings
FIG. 1 is a schematic illustration of an implementation environment shown in an exemplary embodiment of the present application;
FIG. 2 is a flow chart of an image processing method shown in an exemplary embodiment of the present application;
FIG. 3 is a flow chart of an image processing method shown in another exemplary embodiment of the present application;
FIG. 4 is a flow chart of an image processing method shown in another exemplary embodiment of the present application;
FIG. 5 is a block diagram of an image processing apparatus shown in an exemplary embodiment of the present application;
fig. 6 is a block diagram of a terminal according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Referring to fig. 1, a schematic diagram of an implementation environment provided by an embodiment of the present application is shown, the implementation environment including an anchor terminal 110, a server 120, and a viewer terminal 130.
Anchor terminal 110 is a terminal used by an anchor user. The anchor terminal 110 is a smartphone, tablet, Personal Computer (PC), smart wearable device, or the like.
The anchor terminal 110 has an image capturing function and an image processing function. Optionally, anchor terminal 110 implements the image capture function described above via a camera assembly or a video-type application. For example, the anchor terminal captures a portrait of the anchor user through a camera assembly, as well as the environment in which the anchor user is located. As another example, the anchor terminal collects content in the user interface of anchor terminal 110 through a screen-recording type application. Optionally, the anchor terminal 110 implements a beautifying process on the original image through an image processing function, where the beautifying process includes a combination of one or more of the following: adding a filter, adding a sticker, and performing skin grinding, face beautifying, face thinning, eye enlarging, whitening and other treatment on a portrait area in an original image.
Anchor terminal 110 also has the capability to interact with server 120 for data. Illustratively, after the anchor terminal 110 captures and processes the image, the processed data is pushed to the server 120.
Optionally, the anchor terminal 110 is installed with a specific application program, by which the above-described image capturing function, image processing function, and function of data interaction with the server 120 are implemented. Optionally, the specific application is an application having a live function.
The server 120 may be one server, a server cluster formed by a plurality of servers, or a cloud computing service center. Optionally, the server 120 is a backend server corresponding to the specific application installed in the anchor terminal 110.
The viewer terminal 130 is a terminal used by a viewer user. The spectator terminal 130 is a smartphone, tablet, personal computer, smart wearable device, or the like. The viewer terminal 130 has an image processing function, a function of data interaction with the server 120, and a rendering display function.
A communication connection is established between anchor terminal 110 and server 120 through a wired or wireless network. A communication connection is established between the server 120 and the viewer terminal 130 through a wired or wireless network.
The wireless or wired networks described above use standard communication techniques and/or protocols. The network is typically the internet, but may be any other network, including but not limited to a Local Area Network (LAN), a Metropolitan Area Network (MAN), a Wide Area Network (WAN), a mobile, wireline or wireless network, a private network, a virtual private network, or any combination of these. In some embodiments, data exchanged over the network is represented using techniques and/or formats including Hypertext Markup Language (HTML), Extensible Markup Language (XML), and the like. All or some of the links may also be encrypted using conventional encryption techniques such as Secure Sockets Layer (SSL), Transport Layer Security (TLS), Virtual Private Network (VPN), or Internet Protocol Security (IPsec). In other embodiments, custom and/or dedicated data communication techniques may also be used in place of, or in addition to, the data communication techniques described above.
Referring to fig. 2, a flowchart of an image processing method provided by an embodiment of the present application is shown, where the method is applied to a terminal (e.g., the anchor terminal 110 or the viewer terminal 130) in the embodiment of fig. 1, and the method includes:
step 201, acquiring image data of an original image.
And the image data of the original image is used for rendering and displaying the original image by the terminal. When the terminal is a main broadcasting terminal, the terminal acquires image data of an original image through a camera assembly or a screen recording application program. When the terminal is a viewer terminal, it receives image data of an original image transmitted by the server.
Optionally, after acquiring the image data of the original image, the terminal stores the image data of the original image in the first cache region. The first cache region is set by a default of a terminal, or the first cache region is set by a user in a self-defined mode.
Step 202, performing a beauty treatment on the original image to obtain image data of a beauty image corresponding to the original image.
Cosmetic types include, but are not limited to: adding a filter, adding a sticker, and performing skin grinding, face beautifying, face thinning, eye enlarging, whitening and other treatment on a portrait area in an original image. Optionally, a beauty type list is displayed in a current display interface of the terminal, the anchor user selects a target beauty type in the beauty type list, the terminal obtains the target beauty type selected by the anchor user, and then, the original image is beautified according to the target beauty type.
Step 203, performing difference judgment on the image data of the original image and the image data of the beauty image to obtain a first difference degree between the original image and the beauty image.
The first difference degree is used for measuring the degree of difference between the original image and the beauty image. The greater the first difference degree between the beauty image and the original image, the greater the degree of difference between the two, and the more dissimilar they are. The smaller the first difference degree, the smaller the degree of difference, and the more similar they are. In the embodiment of the application, the first difference degree between the beauty image and the original image is obtained through difference judgment, and whether the beautifying succeeded is judged according to the first difference degree.
Optionally, the terminal performs difference judgment on the image data of the original image and the image data of the beauty image by using a preset difference identification algorithm to obtain a first difference between the original image and the beauty image.
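The patent does not fix a particular difference identification algorithm. As one minimal sketch, the difference degree can be computed as the fraction of positions at which the two buffers' byte values differ beyond a small tolerance; the byte-wise comparison and the tolerance value are assumptions, and a real implementation might instead use a perceptual or histogram-based metric.

```python
def difference_degree(data_a, data_b):
    """Fraction of positions whose byte values differ by more than a small
    tolerance. Buffers of different lengths are treated as maximally
    different (difference degree 1.0)."""
    if len(data_a) != len(data_b):
        return 1.0
    if not data_a:
        return 0.0
    tolerance = 2  # assumed allowance for codec/rounding noise
    differing = sum(1 for a, b in zip(data_a, data_b) if abs(a - b) > tolerance)
    return differing / len(data_a)
```

Under this sketch, a successful beautifying means `difference_degree(original_data, beauty_data)` exceeds the first difference threshold, while a successful transmission means the degree between the received data and the beauty data stays below the second threshold.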
In step 204, if the first difference is greater than the first difference threshold, the image data of the beauty image is sent to the rendering process.
The first difference threshold is set according to practical experience. Illustratively, the first difference threshold is set according to the number of beauty types applied to the original image during the beauty processing. Optionally, the first difference threshold is positively correlated with the number of beauty types. That is, the greater the number of beauty types, the greater the first difference threshold, and the fewer the number of beauty types, the smaller the first difference threshold. For example, if the terminal performs 2 kinds of beauty processing on the original image (e.g., skin smoothing and adding a filter), the first difference threshold is 30%; if the terminal performs 4 kinds of beauty processing (e.g., skin smoothing, adding a filter, enlarging the eyes and thinning the face), the first difference threshold is 50%. The rendering process is used for receiving the image data of the beauty image and rendering and displaying the corresponding image based on the received image data.
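The two example values above (30% for 2 beauty types, 50% for 4) are consistent with a simple linear rule; the linear form and its coefficients are an assumption, since the patent only requires the threshold to be positively correlated with the number of beauty types.

```python
def first_difference_threshold(num_beauty_types):
    """Monotonically increasing in the number of applied beauty types,
    passing through the two sample points given in the description:
    2 types -> 0.30, 4 types -> 0.50. Capped so it remains a valid ratio."""
    return min(0.10 + 0.10 * num_beauty_types, 0.90)
```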
If the first difference degree is greater than the first difference degree threshold, the beautifying succeeded, and the terminal sends the image data of the beauty image to the rendering process after judging that the beautifying succeeded. If the first difference degree is smaller than or equal to the first difference degree threshold, the beautifying failed; at this time, the terminal renders and displays a preset virtual image. The preset virtual image is set by default by the terminal, or is set by the user in a self-defined manner. In this way, the original image is not rendered and displayed even when the beautifying fails.
Optionally, if the first difference degree is greater than the first difference degree threshold, storing the image data of the beauty image in the second buffer area. The second cache region here is not the same as the first cache region mentioned above. On one hand, the image data of the beauty image is stored after the beauty is judged to be successful through the difference judgment mode, so that the image data of the beauty image is prevented from being stored under the condition of the beauty failure, and the cache occupation of the terminal is reduced. On the other hand, the image data of the original image is stored in a different location from the image data of the beauty image, and it is possible to reduce the occurrence probability of erroneously reading the image data of the original image when it is necessary to read the image data of the beauty image, and to reduce the probability of exposing the original image to the viewer.
Step 205, performing difference determination on the image data received by the rendering process and the image data of the beauty image to obtain a second difference between the image corresponding to the image data received by the rendering process and the beauty image.
In the embodiment of the application, after the rendering process completes data reception, the terminal performs difference judgment on the received image data and the image data of the beauty image to obtain the difference degree between the image corresponding to the image data received by the rendering process and the beauty image, and further determines whether the image data of the beauty image is wrong in the transmission process. The difference determination process may refer to the related explanation in step 203, which is not described herein.
In step 206, if the second difference is smaller than the second difference threshold, the image data received by the rendering process is rendered and displayed as a beauty image.
The second difference threshold is smaller than the first difference threshold. If the second difference degree is smaller than the second difference degree threshold, it indicates that no error occurred in the process of transmitting the image data of the beauty image to the rendering process. If the second difference degree is greater than or equal to the second difference degree threshold, it indicates that an error occurred in the process of transmitting the beauty image to the rendering process.
In the embodiment of the application, after the beauty processing is completed, a first difference detection is performed, and whether the beauty processing succeeded is judged by detecting the difference degree between the beauty image and the original image; then, after the rendering process receives the image data, a second difference detection is performed, and whether an error occurred in the process of transmitting the image data of the beauty image to the rendering process is judged by detecting the difference degree between the image data received by the rendering process and the image data of the beauty image. This dual detection reduces the probability that the original image is exposed to the viewer.
Optionally, if the second difference is greater than or equal to the second difference threshold, the preset virtual image is rendered and displayed. The preset virtual image is set by default by a terminal, or the virtual image is set by self-definition of a user. Optionally, if the second difference is greater than or equal to the second difference threshold, the step of sending the image data of the beauty image to the rendering process is restarted until the second difference is less than the second difference threshold.
Optionally, if the second difference degree is greater than or equal to the second difference degree threshold, performing graying processing on the image data stored in the second cache region. The graying processing is used for prohibiting rendering and displaying of the image data stored in the second buffer area.
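The patent does not specify how the graying processing is realized; one interpretation is a flag on the second cache region that forbids reading the stored beauty-image data for rendering. The class and method names below are hypothetical.

```python
class SecondCacheRegion:
    """Holds beauty-image data; gray_out() models the graying processing
    that prohibits rendering the cached data after a transmission error."""

    def __init__(self):
        self._data = None
        self._renderable = False

    def store(self, image_data):
        # Data is stored only after the first difference check has passed.
        self._data = image_data
        self._renderable = True

    def gray_out(self):
        # Graying processing: keep the bytes but forbid rendering them.
        self._renderable = False

    def read_for_render(self):
        # Returns None when rendering the cached data is prohibited.
        return self._data if self._renderable else None
```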
In other possible implementations, step 205 may alternatively be implemented as: and performing difference judgment on the image data received by the rendering process and the image data of the original image to obtain a third difference degree between the image corresponding to the image data received by the rendering process and the original image. If the third difference is greater than the third difference threshold, it is indicated that the image data of the original image is not similar to the image data received by the rendering process, no error occurs in the transmission process, and then the terminal renders and displays the beauty image based on the image data received by the rendering process. If the third difference is smaller than or equal to the third difference threshold, it indicates that the image data of the original image is similar to the image data received by the rendering process, and an error occurs in the transmission process, and at this time, the terminal does not perform the step of rendering and displaying the beauty image based on the image data received by the rendering process. The third difference threshold may be the same as or different from the first difference threshold.
Referring to fig. 3, a schematic diagram of image processing provided by one embodiment of the present application is shown. After the terminal collects an original image, a beauty image is obtained through beauty processing. First difference detection is then performed on the image data of the original image and the image data of the beauty image to judge whether the beautifying succeeded. If the beautifying succeeded, the image data of the beauty image is sent to the rendering process. After transmission is completed, second difference detection is performed on the image data received by the rendering process and the image data of the beauty image to judge whether an error occurred during transmission. If no error occurred, the beauty image is rendered and displayed based on the image data received by the rendering process; if an error occurred, a preset virtual image is rendered and displayed, and graying processing is performed on the stored image data of the beauty image.
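Putting the steps of fig. 3 together, the dual detection mechanism might be sketched as follows. Every callable and threshold is passed as a parameter because the patent leaves the concrete beautifying, transport, and rendering mechanisms implementation-defined, and the byte-wise difference metric used here is an assumption.

```python
def process_frame(original, beautify, send_to_renderer, render,
                  first_threshold, second_threshold, virtual_image):
    """Returns True if the beauty image reached the display, False if the
    preset virtual image had to be shown instead."""
    def diff(a, b):
        # Simple byte-wise difference ratio (assumed metric).
        if len(a) != len(b):
            return 1.0
        return sum(x != y for x, y in zip(a, b)) / max(len(a), 1)

    beauty = beautify(original)
    if diff(original, beauty) <= first_threshold:
        render(virtual_image)            # first check: beautifying failed
        return False
    received = send_to_renderer(beauty)  # data as seen by the rendering process
    if diff(received, beauty) >= second_threshold:
        render(virtual_image)            # second check: transmission error
        return False
    render(received)                     # beauty image reaches the viewer
    return True
```

Note that both failure branches fall back to the preset virtual image, so the original frame is never handed to `render` in this sketch.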
To sum up, according to the technical solution provided in the embodiment of the present application, a first difference detection is performed after the beauty processing is completed, and whether the beautifying succeeded is determined by detecting the difference degree between the beauty image and the original image; then a second difference detection is performed after the rendering process receives the image data, and whether an error occurred in the process of transmitting the image data of the beauty image to the rendering process is determined by detecting the difference degree between the image data received by the rendering process and the image data of the beauty image. This dual detection reduces the probability of exposing the original image to viewers due to a beautifying failure or a transmission failure.
In practical application, the anchor user may choose not to perform the beautifying processing on the original image collected by the anchor terminal. This case will be explained below with reference to fig. 4.
Step 401, acquiring image data of an original image.
Step 402, detecting whether the original image needs to be beautified.
Optionally, the terminal attempts to acquire a target beauty type selected by the user from the current display interface. If a target beauty type selected by the user is acquired, the original image needs to be beautified; if no target beauty type selected by the user is acquired, the original image does not need to be beautified.
Step 403, if the original image does not need to be beautified, adding the target mark to the image data of the original image.
The target mark is used for marking that the original image does not need to be beautified. Optionally, the terminal adds the target identification at a specific position of the image data of the original image. The specific position is set by the terminal by default, such as an edge of image data of the original image.
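As a sketch, the target identification could be a short marker appended at the edge of the image data, as the description suggests; the marker bytes and the helper names below are assumptions.

```python
TARGET_MARK = b"\x00RAW"  # assumed 4-byte marker meaning "no beautifying needed"

def add_target_mark(image_data):
    """Append the marker at the edge (end) of the original image data."""
    return image_data + TARGET_MARK

def has_target_mark(image_data):
    """Check for the marker so the rendering path can be chosen."""
    return image_data.endswith(TARGET_MARK)

def strip_target_mark(image_data):
    """Recover the raw image data before rendering."""
    return image_data[:-len(TARGET_MARK)] if has_target_mark(image_data) else image_data
```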
If the original image needs to be beautified, step 202 to step 206 are executed.
Step 404, sending the image data of the original image to the rendering process according to the target identifier.
And in the case that the target identification exists in the image data of the original image, sending the image data of the original image to the rendering process according to the target identification. Accordingly, the rendering process receives image data of the original image.
In step 405, the original image is rendered and displayed based on the image data of the original image by the rendering process.
In the case that the target identification exists in the image data of the original image, the rendering process renders and displays the original image. In the embodiment of the application, an original image that does not need beauty processing is marked by the target identification and thereby distinguished from an original image that does; the unbeautified original image follows a separate processing logic that does not require the dual difference detection. On one hand, this reduces the processing resource consumption of the terminal; on the other hand, it avoids the situation in which an original image that does not need beauty processing cannot be rendered and displayed.
To sum up, the technical solution provided by the embodiment of the application marks an original image that does not need beauty processing with the target identification, thereby distinguishing it from an original image that does. The unbeautified original image follows a separate processing logic that does not require the dual difference detection, which on one hand reduces the processing resource consumption of the terminal, and on the other hand avoids the situation in which such an original image cannot be rendered and displayed.
In the following, embodiments of the apparatus of the present application are described, and for portions of the embodiments of the apparatus not described in detail, reference may be made to technical details disclosed in the above-mentioned method embodiments.
Referring to fig. 5, a block diagram of an image processing apparatus according to an exemplary embodiment of the present application is shown. The image processing apparatus may be implemented as all or a part of the terminal by software, hardware, or a combination of both. The image processing apparatus includes:
a data obtaining module 501, configured to obtain image data of an original image.
A beauty processing module 502, configured to perform beauty processing on the original image to obtain image data of a beauty image corresponding to the original image.
A first determining module 503, configured to perform difference determination on the image data of the original image and the image data of the beauty image to obtain a first difference between the original image and the beauty image.
A data sending module 504, configured to send the image data of the beauty image to a rendering process if the first difference is greater than a first difference threshold.
A second determining module 505, configured to perform difference determination on the image data received by the rendering process and the image data of the beauty image, to obtain a second difference between the image corresponding to the image data received by the rendering process and the beauty image.
A rendering and displaying module 506, configured to render and display the beauty image based on the image data received by the rendering process if the second difference is smaller than a second difference threshold, where the second difference threshold is smaller than the first difference threshold.
To sum up, according to the technical solution provided in the embodiment of the present application, a first difference detection is performed after the beauty processing is completed, whether the beauty is successful is determined by detecting a difference between the beauty image and the original image, then a second difference detection is performed after the image data is received by the rendering process, and whether an error occurs in the process of transmitting the image data of the beauty image to the rendering process is determined by detecting a difference between the image data received by the rendering process and the image data of the beauty image.
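The patent leaves the exact difference-judgment formula used by the first and second determining modules open. Two simple candidate metrics over flat pixel arrays are sketched below purely as illustration; both function names and formulas are assumptions, not patent text.

```python
def mean_absolute_difference(a, b):
    """Average per-pixel absolute difference; 0 means identical images."""
    assert len(a) == len(b) and a
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def changed_pixel_ratio(a, b, tolerance=2):
    """Fraction of pixels that differ by more than `tolerance` gray levels."""
    assert len(a) == len(b) and a
    return sum(abs(x - y) > tolerance for x, y in zip(a, b)) / len(a)
```

Either metric yields a single difference degree that can be compared against the first or second difference threshold; a production system might instead use a structural metric such as SSIM.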
In an optional embodiment provided based on the embodiment shown in fig. 5, the apparatus further comprises: a first memory module and a second memory module (not shown in fig. 5).
And the first storage module is used for storing the image data of the original image in a first cache region.
And the second storage module is used for storing the image data of the beauty image in a second cache area if the first difference degree is greater than the first difference degree threshold value.
The first cache region and the second cache region are different.
Optionally, the apparatus further comprises: a graying processing module (not shown in fig. 5).
And the graying processing module is used for performing graying processing on the image data stored in the second cache region if the second difference degree is greater than or equal to the second difference degree threshold, wherein the graying processing is used for prohibiting rendering and displaying of the image data stored in the second cache region.
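The two-buffer scheme above (original image in a first cache region, beauty image in a distinct second region, and graying out the second region when the second check fails) might look like the following minimal sketch. The class, its attribute names, and the boolean `grayed` flag are illustrative assumptions.

```python
class FrameCache:
    """Holds one frame's data in two distinct cache regions."""

    def __init__(self):
        self.first = None    # first cache region: original image data
        self.second = None   # second cache region: beauty image data
        self.grayed = False  # True = second region must not be rendered

    def store_original(self, data):
        self.first = list(data)

    def store_beauty(self, data, first_difference, threshold_1):
        # Cache the beauty image only once the first check has passed.
        if first_difference > threshold_1:
            self.second = list(data)
            self.grayed = False

    def gray_out(self):
        # Second check failed: forbid rendering from the second region.
        self.grayed = True

    def renderable(self):
        """Return the beauty image data if it may be rendered, else None."""
        if self.second is not None and not self.grayed:
            return self.second
        return None
```

Keeping the regions separate means a failed second check can invalidate the beauty data without touching the original, which remains available in the first region.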
In an optional embodiment provided based on the embodiment shown in fig. 5, the rendering and displaying module 506 is configured to render and display a preset virtual image if the second difference is greater than or equal to the second difference threshold.
In an optional embodiment provided based on the embodiment described in fig. 5, the apparatus further comprises: a detection module, an identification adding module (not shown in fig. 5).
And the detection module is used for detecting whether the original image needs to be subjected to beautifying processing.
And the identifier adding module is used for adding a target identifier in the image data corresponding to the original image if the original image does not need to be subjected to the beautifying processing.
The data sending module 504 is configured to send the image data of the original image to the rendering process according to the target identifier.
The rendering and displaying module 506 is configured to render and display the original image based on the image data of the original image through the rendering process.
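The detection module and identifier adding module described above can be sketched as tagging the frame's data so the rendering process displays it directly, bypassing the second difference check. The dict wrapper and the `NO_BEAUTY_TAG` value are hypothetical; the patent does not specify how the target identifier is encoded into the image data.

```python
NO_BEAUTY_TAG = "no_beauty"  # hypothetical target identifier

def package_frame(pixels, needs_beauty):
    """Wrap image data, adding the target identifier when no beauty is needed."""
    frame = {"pixels": list(pixels)}
    if not needs_beauty:
        frame["tag"] = NO_BEAUTY_TAG  # mark: send straight to rendering
    return frame

def rendering_process(frame, second_check):
    """Render tagged frames directly; otherwise run the second difference check."""
    if frame.get("tag") == NO_BEAUTY_TAG:
        return frame["pixels"]  # original image: rendered without any check
    # Untagged frame: only render if the second difference check passes.
    return frame["pixels"] if second_check(frame["pixels"]) else None
```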
It should be noted that, when the apparatus provided in the foregoing embodiment implements the functions thereof, only the division of the functional modules is illustrated, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus and method embodiments provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments for details, which are not described herein again.
Fig. 6 shows a block diagram of a terminal 600 according to an exemplary embodiment of the present application. The terminal 600 may be: a smartphone, a tablet, an MP3 player, an MP4 player, a laptop, or a desktop computer. The terminal 600 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, etc.
In general, the terminal 600 includes: a processor 601 and a memory 602.
The processor 601 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 601 may be implemented in at least one hardware form of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 601 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, also called a Central Processing Unit (CPU), and the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 601 may be integrated with a Graphics Processing Unit (GPU), which is responsible for rendering and drawing the content required to be displayed on the display screen.
The memory 602 may include one or more computer-readable storage media, which may be non-transitory. The memory 602 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 602 is used to store a computer program for execution by the processor 601 to implement the image processing method provided by the method embodiments of the present application.
In some embodiments, the terminal 600 may further optionally include: a peripheral interface 603 and at least one peripheral. The processor 601, memory 602, and peripheral interface 603 may be connected by buses or signal lines. Various peripheral devices may be connected to the peripheral interface 603 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of a radio frequency circuit 604, a touch screen display 605, a camera 606, an audio circuit 607, a positioning component 608, and a power supply 609.
The peripheral interface 603 may be used to connect at least one Input/Output (I/O) related peripheral to the processor 601 and the memory 602. In some embodiments, the processor 601, memory 602, and peripheral interface 603 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 601, the memory 602, and the peripheral interface 603 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The Radio Frequency circuit 604 is used for receiving and transmitting Radio Frequency (RF) signals, also called electromagnetic signals. The radio frequency circuitry 604 communicates with communication networks and other communication devices via electromagnetic signals. The rf circuit 604 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 604 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuitry 604 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: the World Wide Web, metropolitan area networks, intranets, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or Wireless Fidelity (WiFi) networks. In some embodiments, the rf circuitry 604 may also include Near Field Communication (NFC) related circuitry, which is not limited in this application.
The display 605 is used to display a User Interface (UI). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 605 is a touch display screen, the display screen 605 also has the ability to capture touch signals on or over the surface of the display screen 605. The touch signal may be input to the processor 601 as a control signal for processing. At this point, the display 605 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display 605, disposed on the front panel of the terminal 600; in other embodiments, there may be at least two displays 605, respectively disposed on different surfaces of the terminal 600 or in a folded design; in still other embodiments, the display 605 may be a flexible display disposed on a curved surface or a folded surface of the terminal 600. Moreover, the display 605 may even be arranged in a non-rectangular irregular pattern, that is, an irregularly-shaped screen. The display 605 may be made of Liquid Crystal Display (LCD), Organic Light-Emitting Diode (OLED), or the like.
The camera assembly 606 is used to capture images or video. Optionally, camera assembly 606 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and a Virtual Reality (VR) shooting function or other fusion shooting functions. In some embodiments, camera assembly 606 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
Audio circuitry 607 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 601 for processing or to the radio frequency circuit 604 to realize voice communication. For the purpose of stereo sound collection or noise reduction, a plurality of microphones may be provided at different portions of the terminal 600. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 601 or the radio frequency circuit 604 into sound waves. The loudspeaker can be a traditional thin-film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, audio circuitry 607 may also include a headphone jack.
The positioning component 608 is used to locate the current geographic location of the terminal 600 for navigation or Location Based Service (LBS). The positioning component 608 can be a positioning component based on the Global Positioning System (GPS) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
Power supply 609 is used to provide power to the various components in terminal 600. The power supply 609 may be ac, dc, disposable or rechargeable. When the power supply 609 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the terminal 600 also includes one or more sensors 610. The one or more sensors 610 include, but are not limited to: acceleration sensor 611, gyro sensor 612, pressure sensor 613, fingerprint sensor 614, optical sensor 615, and proximity sensor 616.
The acceleration sensor 611 may detect the magnitude of acceleration in three coordinate axes of the coordinate system established with the terminal 600. For example, the acceleration sensor 611 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 601 may control the touch display 605 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 611. The acceleration sensor 611 may also be used to collect motion data of a game or a user.
The gyro sensor 612 may detect a body direction and a rotation angle of the terminal 600, and the gyro sensor 612 and the acceleration sensor 611 may cooperate to acquire a 3D motion of the user on the terminal 600. The processor 601 may implement the following functions according to the data collected by the gyro sensor 612: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensor 613 may be disposed on the side frame of terminal 600 and/or underneath touch display 605. When the pressure sensor 613 is disposed on the side frame of the terminal 600, a user's holding signal of the terminal 600 can be detected, and the processor 601 performs left-right hand recognition or a shortcut operation according to the holding signal collected by the pressure sensor 613. When the pressure sensor 613 is arranged at the lower layer of the touch display screen 605, the processor 601 controls an operability control on the UI interface according to the pressure operation of the user on the touch display screen 605. The operability control comprises at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 614 is used for collecting a fingerprint of a user, and the processor 601 identifies the identity of the user according to the fingerprint collected by the fingerprint sensor 614, or the fingerprint sensor 614 identifies the identity of the user according to the collected fingerprint. Upon identifying that the user's identity is a trusted identity, the processor 601 authorizes the user to perform relevant sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 614 may be disposed on the front, back, or side of the terminal 600. When a physical button or vendor Logo is provided on the terminal 600, the fingerprint sensor 614 may be integrated with the physical button or vendor Logo.
The optical sensor 615 is used to collect the ambient light intensity. In one embodiment, processor 601 may control the display brightness of touch display 605 based on the ambient light intensity collected by optical sensor 615. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 605 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 605 is turned down. In another embodiment, the processor 601 may also dynamically adjust the shooting parameters of the camera assembly 606 according to the ambient light intensity collected by the optical sensor 615.
A proximity sensor 616, also known as a distance sensor, is typically disposed on the front panel of the terminal 600. The proximity sensor 616 is used to collect the distance between the user and the front surface of the terminal 600. In one embodiment, when the proximity sensor 616 detects that the distance between the user and the front surface of the terminal 600 gradually decreases, the processor 601 controls the touch display 605 to switch from the bright screen state to the dark screen state; when the proximity sensor 616 detects that the distance between the user and the front surface of the terminal 600 gradually increases, the processor 601 controls the touch display 605 to switch from the dark screen state to the bright screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 6 is not intended to be limiting of terminal 600 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
In an exemplary embodiment, there is also provided a computer-readable storage medium having stored therein a computer program, which is loaded and executed by a processor of a terminal to implement the image processing method in the above-described method embodiments.
Alternatively, the computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, a computer program product is also provided, which includes computer instructions stored in a computer-readable storage medium, which are read by a processor of a computer device from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the image processing method provided in the foregoing one aspect or various alternative implementations of the one aspect.
It should be understood that reference to "a plurality" herein means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. As used herein, the terms "first," "second," and the like, do not denote any order, quantity, or importance, but rather are used to distinguish one element from another.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
The above description is only exemplary of the present application and should not be taken as limiting the present application, and any modifications, equivalents, improvements and the like that are made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. An image processing method, characterized in that the method comprises:
acquiring image data of an original image;
performing beauty treatment on the original image to obtain image data of a beauty image corresponding to the original image;
performing difference judgment on the image data of the original image and the image data of the beauty image to obtain a first difference degree between the original image and the beauty image;
if the first difference degree is larger than a first difference degree threshold value, sending image data of the beautifying image to a rendering process;
performing difference judgment on the image data received by the rendering process and the image data of the beauty image to obtain a second difference degree between the image corresponding to the image data received by the rendering process and the beauty image;
and if the second difference is smaller than a second difference threshold value, rendering and displaying the beautifying image based on the image data received by the rendering process, wherein the second difference threshold value is smaller than the first difference threshold value.
2. The method of claim 1, wherein after the acquiring the original image, further comprising:
storing image data of the original image in a first cache region;
if the first difference degree is larger than the first difference degree threshold value, storing the image data of the beauty image in a second cache region;
the first cache region and the second cache region are different.
3. The method of claim 2, further comprising:
and if the second difference degree is greater than or equal to the second difference degree threshold value, performing gray setting processing on the image data stored in the second cache region, wherein the gray setting processing is used for prohibiting rendering the image data stored in the second cache region.
4. The method according to any one of claims 1 to 3, further comprising:
and if the second difference degree is greater than or equal to the second difference degree threshold value, rendering and displaying a preset virtual image.
5. The method of any of claims 1 to 3, wherein after the acquiring the original image, further comprising:
detecting whether the original image needs to be beautified;
if the original image does not need to be subjected to the beautifying processing, adding a target identifier in the image data corresponding to the original image;
sending the image data of the original image to the rendering process according to the target identification;
rendering and displaying the original image based on the image data of the original image through the rendering process.
6. An image processing apparatus, characterized in that the apparatus comprises:
the data acquisition module is used for acquiring image data of an original image;
the beautifying processing module is used for carrying out beautifying processing on the original image to obtain image data of a beautifying image corresponding to the original image;
the first judgment module is used for carrying out difference judgment on the image data of the original image and the image data of the beauty image to obtain a first difference degree between the original image and the beauty image;
a data sending module, configured to send image data of the beauty image to a rendering process if the first difference is greater than a first difference threshold;
the second judgment module is used for performing difference judgment on the image data received by the rendering process and the image data of the beauty image to obtain a second difference degree between the image corresponding to the image data received by the rendering process and the beauty image;
and the rendering and displaying module is used for rendering and displaying the beautifying image based on the image data received by the rendering process if the second difference is smaller than a second difference threshold, wherein the second difference threshold is smaller than the first difference threshold.
7. The apparatus of claim 6, further comprising:
the first storage module is used for storing the image data of the original image in a first cache region;
a second storage module, configured to store image data of the beauty image in a second cache area if the first difference is greater than the first difference threshold;
the first cache region and the second cache region are different.
8. The apparatus of claim 7, further comprising:
and the graying processing module is used for performing graying processing on the image data stored in the second cache region if the second difference degree is greater than or equal to the second difference degree threshold, wherein the graying processing is used for prohibiting rendering and displaying of the image data stored in the second cache region.
9. A terminal, characterized in that it comprises a processor and a memory, said memory storing a computer program which is loaded and executed by said processor to implement the image processing method according to any one of claims 1 to 5.
10. A computer-readable storage medium, in which a computer program is stored, which is loaded and executed by a processor to implement the image processing method according to any one of claims 1 to 5.
CN202010564642.XA 2020-06-19 2020-06-19 Image processing method, device, terminal and storage medium Active CN111669611B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010564642.XA CN111669611B (en) 2020-06-19 2020-06-19 Image processing method, device, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010564642.XA CN111669611B (en) 2020-06-19 2020-06-19 Image processing method, device, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN111669611A CN111669611A (en) 2020-09-15
CN111669611B true CN111669611B (en) 2022-02-22

Family

ID=72389002

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010564642.XA Active CN111669611B (en) 2020-06-19 2020-06-19 Image processing method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN111669611B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112862712A (en) * 2021-02-01 2021-05-28 广州方图科技有限公司 Beautifying processing method, system, storage medium and terminal equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107172354A (en) * 2017-06-21 2017-09-15 深圳市万普拉斯科技有限公司 Method for processing video frequency, device, electronic equipment and storage medium
CN107808404A (en) * 2017-09-08 2018-03-16 广州视源电子科技股份有限公司 Image processing method, system, readable storage medium storing program for executing and dollying equipment
CN108492246A (en) * 2018-03-12 2018-09-04 维沃移动通信有限公司 A kind of image processing method, device and mobile terminal
CN110956106A (en) * 2019-11-20 2020-04-03 广州华多网络科技有限公司 Processing method, device, storage medium and equipment for live broadcast
CN110992327A (en) * 2019-11-27 2020-04-10 北京达佳互联信息技术有限公司 Lens contamination state detection method and device, terminal and storage medium
CN111163330A (en) * 2020-01-13 2020-05-15 广州虎牙科技有限公司 Live video rendering method, device, system, equipment and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10681391B2 (en) * 2016-07-13 2020-06-09 Oath Inc. Computerized system and method for automatic highlight detection from live streaming media and rendering within a specialized media player
US10523979B2 (en) * 2017-12-21 2019-12-31 Vyu Labs, Inc. Streaming live video
CN108765531A (en) * 2018-03-27 2018-11-06 广东欧珀移动通信有限公司 Image rendering method, device, storage medium and intelligent terminal

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107172354A (en) * 2017-06-21 2017-09-15 深圳市万普拉斯科技有限公司 Method for processing video frequency, device, electronic equipment and storage medium
CN107808404A (en) * 2017-09-08 2018-03-16 广州视源电子科技股份有限公司 Image processing method, system, readable storage medium storing program for executing and dollying equipment
CN108492246A (en) * 2018-03-12 2018-09-04 维沃移动通信有限公司 A kind of image processing method, device and mobile terminal
CN110956106A (en) * 2019-11-20 2020-04-03 广州华多网络科技有限公司 Processing method, device, storage medium and equipment for live broadcast
CN110992327A (en) * 2019-11-27 2020-04-10 北京达佳互联信息技术有限公司 Lens contamination state detection method and device, terminal and storage medium
CN111163330A (en) * 2020-01-13 2020-05-15 广州虎牙科技有限公司 Live video rendering method, device, system, equipment and storage medium

Also Published As

Publication number Publication date
CN111669611A (en) 2020-09-15

Similar Documents

Publication Publication Date Title
CN110502954B (en) Video analysis method and device
CN111083516B (en) Live broadcast processing method and device
CN111753784A (en) Video special effect processing method and device, terminal and storage medium
CN109302632B (en) Method, device, terminal and storage medium for acquiring live video picture
CN112667835A (en) Work processing method and device, electronic equipment and storage medium
CN110933452A (en) Method and device for displaying lovely face gift and storage medium
CN110839174A (en) Image processing method and device, computer equipment and storage medium
CN111083513B (en) Live broadcast picture processing method and device, terminal and computer readable storage medium
CN112565806A (en) Virtual gift presenting method, device, computer equipment and medium
CN111586279B (en) Method, device and equipment for determining shooting state and storage medium
CN109783176B (en) Page switching method and device
CN109660876B (en) Method and device for displaying list
CN110769120A (en) Method, device, equipment and storage medium for message reminding
CN111753606A (en) Intelligent model upgrading method and device
CN111669611B (en) Image processing method, device, terminal and storage medium
CN109819308B (en) Virtual resource acquisition method, device, terminal, server and storage medium
CN112419143A (en) Image processing method, special effect parameter setting method, device, equipment and medium
CN111327819A (en) Method, device, electronic equipment and medium for selecting image
CN110933454A (en) Method, device, equipment and storage medium for processing live broadcast budding gift
CN112015612B (en) Method and device for acquiring stuck information
CN109561215B (en) Method, device, terminal and storage medium for controlling beautifying function
CN110717365B (en) Method and device for obtaining picture
CN110471613B (en) Data storage method, data reading method, device and system
CN110443841B (en) Method, device and system for measuring ground depth
CN111723615A (en) Method and device for carrying out detection object matching judgment on detection object image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant