CN113284199A - Image gray area determination method, electronic device and server - Google Patents


Info

Publication number
CN113284199A
Authority
CN
China
Prior art keywords
image
gray scale
feature
network
server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110564405.8A
Other languages
Chinese (zh)
Inventor
徐立桐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aiku Software Technology Shanghai Co ltd
Original Assignee
Aiku Software Technology Shanghai Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aiku Software Technology Shanghai Co ltd filed Critical Aiku Software Technology Shanghai Co ltd
Priority to CN202110564405.8A
Publication of CN113284199A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/14Picture signal circuitry for video frequency region
    • H04N5/20Circuitry for controlling amplitude response
    • H04N5/202Gamma control
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3182Colour adjustment, e.g. white balance, shading or gamut

Abstract

The application discloses an image gray scale area determining method, electronic equipment and a server, and belongs to the field of image processing. The method comprises the following steps: acquiring a first image sent by electronic equipment, and performing feature recognition on the first image to obtain a feature of the first image; selecting a network image containing the feature from a first network image data set; and determining the gray scale area of the first image according to the gray scale area corresponding to the feature in the network image.

Description

Image gray area determination method, electronic device and server
Technical Field
The application belongs to the field of image processing, and particularly relates to an image gray scale area determination method, electronic equipment and a server.
Background
In the process of photographing, the electronic device may exhibit a phenomenon in which an object that is actually white no longer appears white after imaging. At this time, Auto White Balance (AWB) technology is required to adjust the colors, so as to ensure color consistency under different color temperatures.
However, the AWB technique in the prior art mainly determines the gray scale region in the original image according to the information of the pixel points in the original image, and then performs color adjustment according to the determined gray scale region. The accuracy of the color adjustment can therefore be ensured only when the gray scale region is selected accurately; limited by the computing power of the electronic device, a wrong gray scale region is easily selected at some color temperatures.
Therefore, how to accurately select the gray scale region of the image has become an urgent problem to be solved in the industry.
Disclosure of Invention
An object of the embodiments of the present application is to provide an image gray scale region determining method, an electronic device, and a server, which can solve the problem of how to accurately select the gray scale region of an image.
In a first aspect, an embodiment of the present application provides an image grayscale region determining method, where the method includes:
acquiring a first image sent by electronic equipment, and performing feature recognition on the first image to obtain a feature of the first image;
selecting a network image containing the feature from a first network image data set;
and determining the gray scale area of the first image according to the gray scale area corresponding to the feature in the network image.
In a second aspect, an embodiment of the present application provides an image grayscale region determining method, including:
sending the first image to a server;
receiving a gray area of a first image fed back by the server;
and the gray scale area of the first image is determined by the server according to the gray scale area corresponding to the feature in the network image after the feature of the first image is obtained.
In a third aspect, an embodiment of the present application provides a server, where the server includes:
the device comprises an acquisition unit, a processing unit and a processing unit, wherein the acquisition unit is used for acquiring a first image sent by electronic equipment and performing feature recognition on the first image to obtain a feature of the first image;
the selecting unit is used for selecting the network image containing the feature from the first network image data set;
and the determining unit is used for determining the gray scale area of the first image according to the gray scale area corresponding to the feature in the network image.
In a fourth aspect, an embodiment of the present application provides an electronic device, including:
a transmitting unit for transmitting the first image to a server;
the receiving unit is used for receiving the gray scale area of the first image fed back by the server;
and the gray scale area of the first image is determined by the server according to the gray scale area corresponding to the feature in the network image after the feature of the first image is obtained.
In a fifth aspect, the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the first aspect or the third aspect.
In a sixth aspect, the present application provides a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first or third aspect.
In a seventh aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect or the third aspect.
In the embodiment of the application, after the electronic device sends the first image to the server, the server searches the network for a network image having the same feature as the preview image. Since the color of each pixel point of the feature in the network image is known, the gray scale region corresponding to the feature in the network image can be determined from those pixel points with known colors; and since the first image also contains the feature, the gray scale region of the first image can be found correspondingly.
Drawings
Fig. 1 is a flowchart of an image gray scale region determination method according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a first image block provided in the embodiment of the present application;
fig. 3 is a schematic flowchart of a method for determining an image gray scale region according to an embodiment of the present application;
FIG. 4 is a schematic illustration of a preview interface described in embodiments of the present application;
fig. 5 is a schematic flow chart illustrating white balance adjustment performed in the embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of a server according to an embodiment of the present application;
fig. 8 is a second schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 9 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to be within the scope of the present disclosure.
The terms "first", "second" and the like in the description and in the claims of the present application are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances, so that the embodiments of the application may be practiced in sequences other than those illustrated or described herein. The terms "first", "second", and the like are generally used in a generic sense and do not limit the number of objects; for example, a first object can be one or more than one. In addition, "and/or" in the specification and claims means at least one of the connected objects, and the character "/" generally means that the preceding and succeeding related objects are in an "or" relationship.
An image gray scale region determining method, an image gray scale region determining device, and an electronic device provided in the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1 is a flowchart of an image gray scale region determining method according to an embodiment of the present application, and as shown in fig. 1, the method includes:
step 110, acquiring a first image sent by electronic equipment, and performing feature recognition on the first image to obtain a feature of the first image;
specifically, in this embodiment of the application, the server acquires a first image sent by the electronic device, where the first image may be a shooting preview image acquired by a camera of the electronic device at the current time, or an image displayed on a display screen of the electronic device at the current time.
The electronic device described in the present application may refer to a device having a communication transmission function, such as a smartphone, a tablet computer, or a camera.
In the present application, the electronic device and the server are connected through a wireless communication network, or may be connected through a wired network.
The feature described in this application refers to an image of a specific object in the first image, that is, an object having a certain feature, such as a billboard, a road sign, a landmark building, etc. in the image.
In the present application, the feature recognition of the first image may be specifically performed by a pre-trained feature recognition neural network model, which may detect a feature included in the first image. The implementation of the feature recognition in the present application may also be some other common algorithms, such as the YOLO V2 algorithm, etc., which is not limited in the present application.
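As a non-authoritative illustration of this recognition step, the following Python sketch uses a torchvision Faster R-CNN model as a stand-in for the pre-trained feature recognition network mentioned above; the choice of detector, the score threshold, and the function name are assumptions, not part of this disclosure.

```python
# Minimal sketch (assumption): detect candidate "features" (distinctive objects
# such as billboards or road signs) in the first image with an off-the-shelf
# detector. The recognition network used by the embodiments is not specified here.
import torch
from torchvision.models.detection import (fasterrcnn_resnet50_fpn,
                                           FasterRCNN_ResNet50_FPN_Weights)
from torchvision.transforms.functional import to_tensor
from PIL import Image

def detect_features(image_path, score_threshold=0.6):
    """Return (box, label, score) triples for candidate features of the first image."""
    model = fasterrcnn_resnet50_fpn(weights=FasterRCNN_ResNet50_FPN_Weights.DEFAULT)
    model.eval()
    img = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        pred = model([img])[0]          # dict with 'boxes', 'labels', 'scores'
    keep = pred["scores"] >= score_threshold
    return list(zip(pred["boxes"][keep].tolist(),
                    pred["labels"][keep].tolist(),
                    pred["scores"][keep].tolist()))
```

Each detection (a bounding box plus a class label) can then be treated as one candidate feature to be matched against the network images.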
Step 120, selecting a network image containing the feature from the first network image dataset;
specifically, the first network image data set described in this application is composed of network images having the same positioning information as the first image, and may be obtained by the server screening in the internet database according to the positioning information of the first image.
Since the positioning information of the network images in the first network image dataset is identical to that of the first image, the features in these network images are likely to be identical to the feature of the first image.
Therefore, the network images having the same feature as the first image can be selected from the network images of the first network image data set.
Step 130, determining the gray scale area of the first image according to the gray scale area corresponding to the feature in the network image.
Specifically, the grayscale region described in this application refers to a region appearing gray, that is, a region composed of gray points; the pixel points in such a region ideally satisfy R = G = B.
After the network image containing the feature is obtained, because the color of each pixel point in the network image is known, whether a gray area exists in the feature of the network image can be further analyzed, and if one feature has an area expressing gray color in a plurality of network images, the area expressing gray color in the feature is determined as a gray area.
In the application, since the same feature exists in both the network image and the first image, after the gray scale region corresponding to the feature in the network image is determined, the gray scale region corresponding to the same feature in the first image can be determined because the gray scale region in the feature in different images is not changed.
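A minimal numpy sketch of this idea is given below, under the assumptions that the feature has already been located in each network image as a crop and that "gray" is taken as R ≈ G ≈ B within a small tolerance; the tolerance and agreement threshold are illustrative values, not part of the disclosure.

```python
import numpy as np

def gray_mask(rgb, tol=10):
    """Pixels whose R, G and B channels agree within `tol` are treated as gray
    (the ideal case in the description is R = G = B)."""
    rgb = rgb.astype(np.int16)
    spread = rgb.max(axis=-1) - rgb.min(axis=-1)
    return spread <= tol

def feature_gray_region(network_crops, agreement=0.8):
    """network_crops: list of HxWx3 arrays, the same feature cropped (and resized
    to a common shape) from several network images with known colors.
    A pixel position is kept only if it appears gray in most of the crops."""
    masks = np.stack([gray_mask(c) for c in network_crops], axis=0)
    return masks.mean(axis=0) >= agreement   # boolean mask of the gray region
```

The returned mask can then be mapped back onto the same feature in the first image, since the gray region inside the feature is assumed not to change between images.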
After determining the gray scale region in the first image, the server may feed back the gray scale region to the electronic device, and when the electronic device calculates the AWB of the next frame, the electronic device may directly perform color adjustment according to the gray scale region fed back by the server.
In the embodiment of the application, after the electronic device sends the first image to the server, the server searches the network for a network image having the same feature as the preview image. Since the color of each pixel point of the feature in the network image is known, the gray scale region corresponding to the feature in the network image can be determined from those pixel points with known colors; and since the first image also contains the feature, the gray scale region of the first image can be found correspondingly.
More specifically, the server sends the grayscale region of the first image to the electronic device, and after receiving the grayscale region of the first image, the electronic device determines the red gain and the blue gain of the next frame of preview image according to the grayscale region when the first image is the photographed preview image, so as to perform white balance adjustment.
Optionally, the first image carries positioning information of the electronic device, and before the network image including the feature is selected from the first network image dataset, the method further includes:
screening a first network image data set in a network image database according to the positioning information of the electronic equipment, wherein the positioning information of each network image in the first network image data set is the same as the positioning information of the electronic equipment.
Specifically, the positioning information of the electronic device described in this application may refer to the positioning information obtained when the electronic device acquires the first image through the camera. The positioning may be realized by a positioning module in the electronic device, specifically a Beidou positioning module or a Global Positioning System (GPS) positioning module.
The internet image database described in the present application may be an image database corresponding to an existing internet search engine, or an internet image database constructed according to the scheme of the present application, where a large amount of network images are stored in the database, and the network images in the database all carry the positioning information.
The first network image data set is screened as follows: in the network image database, network images having the same positioning information as the first image are screened out, and the first network image data set is constructed from the screened network images; that is, the positioning information of each network image in the first network image data set is the same as the positioning information of the first image.
In the embodiment of the application, because the positioning information of the network images in the first network image data set is the same as the positioning information of the first image, the probability that a network image and the first image contain the same feature is higher, which reduces the calculation amount of the retrieval process and improves the calculation efficiency.
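For illustration only, the following sketch filters a hypothetical network image database by positioning information. The record layout, the distance threshold, and the haversine matching are assumptions; the description itself only requires "the same" positioning information.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS coordinates."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def build_first_dataset(database, first_image_location, radius_m=50.0):
    """database: iterable of records such as {'url': ..., 'lat': ..., 'lon': ...}
    (hypothetical layout). Returns the records whose positioning information
    matches that of the first image within the given radius."""
    lat0, lon0 = first_image_location
    return [rec for rec in database
            if haversine_m(lat0, lon0, rec["lat"], rec["lon"]) <= radius_m]
```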
Optionally, performing feature recognition on the first image to obtain a feature of the first image, including:
dividing the first image into a plurality of first sub-images;
and performing feature recognition on each first sub-image to obtain a feature of each first sub-image.
Specifically, after the server acquires the first image, in order to enhance the identification accuracy and improve the operation efficiency, the first image may be divided into N blocks to obtain N first sub-images, where the specific number of the blocks may be preset according to actual experience or requirements.
Fig. 2 is a schematic diagram of a first image block provided in the embodiment of the present application, and as shown in fig. 2, the first image is divided into four blocks, so as to obtain four first sub-images.
In the embodiment of the application, after the first image is partitioned, the network image corresponding to each first sub-image is determined by taking each partitioned first sub-image as a unit, and the identification accuracy can be effectively enhanced and the operation efficiency can be improved through the partition identification.
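A minimal sketch of the partitioning step is shown below, assuming the first image is a numpy array and that the N blocks form a regular grid; the grid layout is an assumption, since the description only requires N blocks.

```python
import numpy as np

def split_into_blocks(image, rows=2, cols=2):
    """Split an HxWxC image into rows*cols first sub-images.
    With rows=cols=2 this reproduces the four-block example of Fig. 2."""
    h, w = image.shape[:2]
    ys = np.linspace(0, h, rows + 1, dtype=int)
    xs = np.linspace(0, w, cols + 1, dtype=int)
    return [image[ys[i]:ys[i + 1], xs[j]:xs[j + 1]]
            for i in range(rows) for j in range(cols)]
```

Each returned block can then be passed to the feature recognition step independently.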
Optionally, after obtaining the feature of each first sub-image, the method further includes: determining a network image corresponding to each first sub-image in the first network image data set, wherein the network image comprises features of the first sub-image;
the determining of the gray scale region of the first image comprises:
and determining the gray scale area of each first sub-image according to the gray scale area corresponding to the feature in the network image so as to obtain the gray scale area of the first image.
Specifically, in the embodiment of the present application, network images that contain the same feature as a first sub-image are screened from the first network image data set. Since the feature corresponding to each first sub-image may be different, the network image corresponding to each first sub-image may also be different; therefore, the network image corresponding to each first sub-image can finally be determined.
More specifically, the gray scale area corresponding to the feature in the network image is determined in units of each first sub-image, so that the gray scale area of each first sub-image is determined.
And obtaining the gray scale area of the first image according to the gray scale area of the first sub-image.
In the embodiment of the application, the first image is partitioned into the plurality of first sub-images, and each first sub-image is taken as a unit, so that the gray scale area is determined, the gray scale area of the first image can be determined more accurately through more refined search, and the detection accuracy is improved.
Optionally, performing feature recognition on the first image to obtain a feature of the first image, including:
and under the condition that the first image comprises a preset gray scale area, carrying out feature recognition on the preset gray scale area to obtain a preset feature corresponding to the preset gray scale area.
Specifically, the preset grayscale region in the application is a grayscale region set by a user in advance through the electronic device, or the grayscale region may be screened out by the electronic device through self-analysis, and information of the preset grayscale region is sent to the server along with the first image.
Generally, the gray scale region set by the user has higher reliability, and at this time, in order to reduce the calculation amount and increase the calculation speed, the server may only verify the preset gray scale region set by the user, and does not determine the gray scale region again.
Therefore, when the first image includes the preset gray scale region, the server does not perform feature recognition on the whole first image, but performs feature recognition only on the preset gray scale region of the first image to obtain the preset feature of the preset gray scale region.
In the embodiment of the application, when the first image includes the preset gray scale region, feature recognition is performed only on the preset gray scale region, which effectively reduces the calculation amount for determining the gray scale region and improves the calculation speed.
Optionally, after obtaining the preset feature of the preset grayscale region, the method further includes:
selecting a network image containing the preset feature from the first network image data set;
the determining the gray scale region of the first image comprises:
and determining the gray scale area of the first image according to the gray scale area corresponding to the preset feature in the network image.
Specifically, the preset feature described in the present application refers to a feature obtained by performing feature recognition on a preset grayscale region.
And the server judges the gray scale area according to the preset feature to obtain a judgment result of the server, and the judgment result of the server is used as the gray scale area of the first image finally sent to the electronic equipment.
In the embodiment of the application, the preset gray scale region is verified through the server, so that the condition that the preset gray scale region is inaccurate can be avoided, and the accuracy of the selected gray scale region of the first image is ensured.
Fig. 3 is a schematic flow chart of a method for determining an image grayscale region according to an embodiment of the present application, as shown in fig. 3, including:
step 310, sending the first image to a server;
in the embodiment of the application, the electronic device sends a first image to the server, where the first image may be a shooting preview image acquired by a camera of the electronic device at the current time, or an image displayed on a display screen of the electronic device at the current time.
After receiving the first image sent by the electronic device, the server finds a network image with the same feature as the first image through the Internet image database, and then judges the gray level area in the feature through the known pixel point information of the network image, so as to determine the gray level area of the first image.
Step 320, receiving a gray scale area of the first image fed back by the server;
and the gray scale area of the first image is determined by the server according to the gray scale area corresponding to the feature in the network image after the feature of the first image is obtained.
Specifically, after receiving the grayscale region of the first image fed back by the server, the electronic device in the application may display a grayscale region mark in a preview interface of the electronic device, and the user can learn the finally determined grayscale region from the mark.
In the embodiment of the application, after the electronic device sends the first image to the server, the server searches the network for a network image having the same feature as the preview image. Since the color of each pixel point of the feature in the network image is known, the gray scale region corresponding to the feature in the network image can be determined from those pixel points with known colors; and since the first image also contains the feature, the gray scale region of the first image can be found correspondingly.
Optionally, before the sending the first image to the server, the method further includes:
receiving a first input of a user for a target area in a first image, wherein the first input is used for setting the target area as a preset gray area;
displaying a first image including a preset gray area in response to the first input;
the sending the first image to the server includes:
and sending the first image comprising the preset gray scale area to a server.
Specifically, the shooting preview interface in the embodiment of the present application is an interface for displaying a preview image shot by the electronic device, and may specifically be a shooting preview interface during shooting, or a shooting preview interface during video recording.
The target area described in the embodiment of the present application is a selected area in the first image displayed on the shooting preview interface by the user.
The first input in the present application is an input by which the user subjectively sets the preset grayscale region; it may specifically be an operation of the user clicking or long-pressing the target region.
Fig. 4 is a schematic view of a preview interface described in the embodiment of the present application, and as shown in fig. 4, a first image carrying a preset gray scale region 42 is displayed in a preview interface 41.
And after the first image carrying the preset gray scale area is displayed on the preview interface, the electronic equipment sends the first image comprising the preset gray scale area to the server.
When the server acquires the first image comprising the preset gray scale area, only the preset gray scale area set by the user is verified, and the gray scale area is not determined again.
In the embodiment of the application, letting the user select the preset gray scale area gives the user the option of participating in image adjustment; meanwhile, since the amount of data the server needs to process for feature recognition is greatly reduced, the processing speed of the server can be improved.
Optionally, after receiving the grayscale region of the first image fed back by the server, the method further includes:
and when the first image is a preview image, determining the red gain and the blue gain of a next frame of preview image according to the gray scale area of the first image, wherein the next frame of preview image is the next frame of image of the first image.
Specifically, in the embodiment of the present application, after determining the gray point region, the average R/G and B/G of the gray points may be further determined, the current color temperature may be estimated through the R/G and B/G on the color temperature curve, and then the red gain and the blue gain at the color temperature may be determined.
However, since the first image is continuously updated during image capturing, the red gain and the blue gain can be used for white balance adjustment of the next frame image, and thus the red gain and the blue gain of the next frame preview image are finally determined according to the gray scale region of the first image.
In the embodiment of the application, after the gray scale region of the image is accurately selected, the red gain and the blue gain of the image can be reasonably determined to be adjusted by white balance according to the gray scale region, so that the color consistency of the image under different color temperatures can be effectively ensured.
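The sketch below shows one conventional way to turn a gray region into white-balance gains. It is a simplification under stated assumptions: it skips the color-temperature-curve lookup described above and derives the red and blue gains directly from the average R/G and B/G of the gray pixels.

```python
import numpy as np

def awb_gains_from_gray_region(image, gray_mask):
    """image: HxWx3 RGB array of the first image; gray_mask: boolean mask of its
    gray region. Returns (red_gain, blue_gain) for the next preview frame."""
    gray_pixels = image[gray_mask].astype(np.float64)
    if gray_pixels.size == 0:
        return 1.0, 1.0          # no gray pixels found: keep neutral gains
    r, g, b = (gray_pixels[:, 0].mean(),
               gray_pixels[:, 1].mean(),
               gray_pixels[:, 2].mean())
    # For a truly gray patch R = G = B, so gains that pull the averages back to G
    # compensate the colour cast of the current illuminant.
    return g / max(r, 1e-6), g / max(b, 1e-6)
```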
Optionally, after receiving the grayscale region of the first image fed back by the server, the method further includes:
under the condition that the first image is a preview image and the gray scale area of the first image comprises the gray scale areas of a plurality of first sub-images, determining the red gain and the blue gain corresponding to each first sub-image according to the gray scale areas of the plurality of first sub-images;
and determining the red gain and the blue gain of a next frame of preview image based on the red gain and the blue gain corresponding to each first sub-image, wherein the next frame of preview image refers to the next frame of image of the first image.
Specifically, in the process of processing the first image, the server may divide the first image into a plurality of first sub-images and then determine the grayscale region of each first sub-image. Because the grayscale region is determined in units of each first sub-image, the finally determined grayscale region can be more accurate; and because all the first sub-images together form the first image, the grayscale region of the first image can also be obtained from the grayscale regions of the first sub-images.
Therefore, the grayscale region of the first image received by the electronic device may include the grayscale regions of a plurality of first sub-images. In this case, when calculating the red gain and the blue gain, the electronic device still calculates in units of the first sub-images to determine the red gain and the blue gain corresponding to each first sub-image; the red gain and the blue gain of the corresponding area of each first sub-image in the next frame preview image are then determined accordingly, and the gains are applied to the corresponding regions respectively to realize white balance adjustment.
In the embodiment of the application, the first sub-images are taken as units, the red gain and the blue gain corresponding to each first sub-image are firstly determined, then the red gain and the blue gain of the corresponding area of each first sub-image in each next frame of preview image are determined, and the white balance can be accurately adjusted through refined calculation.
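A sketch of this per-block variant follows, assuming the next preview frame is split with the same grid as the first image; the simple per-region application of gains below is an illustrative assumption, not the only way to realize the adjustment described above.

```python
import numpy as np

def apply_blockwise_awb(next_frame, block_gains, rows=2, cols=2):
    """next_frame: HxWx3 RGB array of the next preview frame.
    block_gains: list of (red_gain, blue_gain), one pair per first sub-image,
    ordered row by row. Each pair is applied to the matching region."""
    out = next_frame.astype(np.float64)
    h, w = out.shape[:2]
    ys = np.linspace(0, h, rows + 1, dtype=int)
    xs = np.linspace(0, w, cols + 1, dtype=int)
    for idx, (rg, bg) in enumerate(block_gains):
        i, j = divmod(idx, cols)
        region = out[ys[i]:ys[i + 1], xs[j]:xs[j + 1]]
        region[..., 0] *= rg   # red gain
        region[..., 2] *= bg   # blue gain
    return np.clip(out, 0, 255).astype(np.uint8)
```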
Fig. 5 is a schematic flow diagram of white balance adjustment performed in the embodiment of the present application. As shown in fig. 5, after the (N-1)-th frame of image data acquired by the pixel sensor of the electronic device is processed by the image processing module, the (N-1)-th frame preview image is obtained and sent to the server. After receiving the (N-1)-th frame preview image sent by the electronic device, the server screens a first network image data set having the same positioning information from the network image database according to the positioning information of the (N-1)-th frame preview image, and then screens, from the first network image data set, network images having the same features as the (N-1)-th frame preview image. Features having a gray area are then screened from the selected network images so as to determine the gray area of the (N-1)-th frame of image data, and this gray area is fed back to the image processing module of the terminal, which determines the red gain and the blue gain for the N-th frame of image data according to the gray area.
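To make the frame ordering of Fig. 5 concrete, the sketch below outlines the round trip in a simplified, synchronous form: the gains computed from one preview frame are applied to the next frame. All names are placeholders for the modules described above, not an API defined by this disclosure.

```python
def preview_loop(sensor_frames, server, image_processor):
    """sensor_frames: iterator of raw frames from the pixel sensor.
    server.gray_region(img): returns the gray region for a preview image
    (network-image lookup on the server side, as described above).
    image_processor.process(raw, gains): produces a preview image with the
    given white-balance gains applied.
    image_processor.gains_from(preview, gray_region): returns (red, blue) gains.
    All of these are hypothetical placeholders."""
    gains = (1.0, 1.0)                                   # neutral gains for the first frame
    for raw in sensor_frames:
        preview = image_processor.process(raw, gains)    # frame N uses gains from frame N-1
        gray_region = server.gray_region(preview)        # server works on frame N
        gains = image_processor.gains_from(preview, gray_region)  # used for frame N+1
    return gains
```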
It should be noted that, in the method for determining an image grayscale region provided in the embodiment of the present application, the execution subject may be an electronic device, or a control module in the electronic device for executing the method for determining the image grayscale region. The electronic device for determining the image gray scale area provided by the embodiment of the present application is described with a method for determining the image gray scale area performed by the electronic device as an example.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application, as shown in fig. 6, including: a transmitting unit 610 and a receiving unit 620; wherein, the sending unit 610 is configured to send the first image to the server; the receiving unit 620 is configured to receive a grayscale region of the first image fed back by the server; and the grayscale region of the first image is determined by the server according to the grayscale region corresponding to the feature in the network image after the feature of the first image is obtained.
Optionally, the apparatus further comprises: a receiving unit;
the receiving unit is used for receiving a first input of a user to a target area in the first image, and the first input is used for setting the target area as a preset gray area;
displaying a first image including a preset gray area in response to the first input;
the sending unit is specifically configured to:
and sending the first image comprising the preset gray scale area to a server.
Optionally, the apparatus further comprises a processing unit;
the processing unit is configured to determine a red gain and a blue gain of a next frame of preview image according to a gray scale region of the first image when the first image is a preview image, where the next frame of preview image is a next frame of image of the first image.
Optionally, the processing unit is configured to determine, when the first image is a preview image and a grayscale region of the first image includes grayscale regions of a plurality of first sub-images, a red gain and a blue gain corresponding to each of the first sub-images according to the grayscale regions of the plurality of first sub-images;
and determining the red gain and the blue gain of a next frame of preview image based on the red gain and the blue gain corresponding to each first sub-image, wherein the next frame of preview image refers to the next frame of image of the first image.
Fig. 7 is a schematic structural diagram of a server according to an embodiment of the present application, and as shown in fig. 7, the server includes: an obtaining unit 710, a selecting unit 720 and a determining unit 730; the acquiring unit 710 is configured to acquire a first image sent by an electronic device, perform feature recognition on the first image, and obtain a feature of the first image; the selecting unit 720 is configured to select a network image including the feature from the first network image dataset; the determining unit 730 is configured to determine a gray scale region of the first image according to a gray scale region corresponding to the feature in the network image.
Optionally, the server further comprises a screening unit;
the screening unit is used for screening a first network image data set in a network image database according to the positioning information of the electronic equipment, and the positioning information of each network image in the first network image data set is the same as the positioning information of the electronic equipment.
Optionally, the obtaining unit is specifically configured to:
dividing the first image into a plurality of first sub-images;
and performing feature recognition on each first sub-image to obtain a feature of each first sub-image.
Optionally, the selecting unit is specifically configured to: determining a network image corresponding to each first sub-image in the first network image data set, wherein the network image comprises features of the first sub-image;
the determining unit is specifically configured to determine a gray scale region of each first sub-image according to a gray scale region corresponding to the feature in the network image, so as to obtain a gray scale region of the first image.
Optionally, the obtaining unit is specifically configured to: and under the condition that the first image comprises a preset gray scale area, carrying out feature recognition on the preset gray scale area to obtain a preset feature corresponding to the preset gray scale area.
In the embodiment of the application, after the electronic device sends the first image to the server, the server searches the network for a network image having the same feature as the preview image. Since the color of each pixel point of the feature in the network image is known, the gray scale region corresponding to the feature in the network image can be determined from those pixel points with known colors; and since the first image also contains the feature, the gray scale region of the first image can be found correspondingly.
The electronic device in the embodiment of the present application may be an apparatus, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a Network Attached Storage (NAS), a personal computer (personal computer, PC), a Television (TV), a teller machine, a self-service machine, and the like, and the embodiments of the present application are not limited in particular.
The electronic device in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android operating system, an iOS operating system, or other possible operating systems, and the embodiments of the present application are not specifically limited.
The electronic device provided in the embodiment of the present application can implement each process implemented by the method embodiments of fig. 1 to fig. 5, and is not described here again to avoid repetition.
Optionally, fig. 8 is a second schematic structural diagram of an electronic device according to an embodiment of the present disclosure; as shown in fig. 8, an electronic device 800 is further provided in this embodiment of the present application, and includes a processor 801, a memory 802, and a program or an instruction stored in the memory 802 and executable on the processor 801, where the program or the instruction is executed by the processor 801 to implement each process of the foregoing method for determining a gray scale area of an image, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 9 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 900 includes, but is not limited to: a radio frequency unit 901, a network module 902, an audio output unit 903, an input unit 904, a sensor 905, a display unit 906, a user input unit 907, an interface unit 908, a memory 909, and a processor 910.
Those skilled in the art will appreciate that the electronic device 900 may further include a power source (e.g., a battery) for supplying power to various components, and the power source may be logically connected to the processor 910 through a power management system, so as to manage charging, discharging, and power consumption management functions through the power management system. The electronic device structure shown in fig. 9 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is not repeated here.
The radio frequency unit 901 is configured to send the first image to the server;
the network module 902 is configured to receive a grayscale region of the first image fed back by the server;
and the gray scale area of the first image is determined by the server according to the gray scale area corresponding to the feature in the network image after the feature of the first image is obtained.
Optionally, the user input unit 907 is configured to receive a first input of a user to a target region in the first image, where the first input is used to set the target region as a preset grayscale region;
a display unit 906 for displaying a first image including the preset gray scale region in response to the first input;
the sending the first image to the server includes:
the radio frequency unit 901 is configured to send the first image including the preset grayscale region to a server.
Optionally, the processor 910 is configured to, when the first image is a preview image, determine a red gain and a blue gain of a next frame of preview image according to a gray scale region of the first image, where the next frame of preview image is a next frame of image of the first image.
Optionally, the processor 910 is configured to, when the first image is a preview image and the grayscale region of the first image includes grayscale regions of a plurality of first sub-images, determine, according to the grayscale regions of the plurality of first sub-images, a red gain and a blue gain corresponding to each of the first sub-images;
and determining the red gain and the blue gain of a next frame of preview image based on the red gain and the blue gain corresponding to each first sub-image, wherein the next frame of preview image refers to the next frame of image of the first image.
In the embodiment of the application, after the electronic device sends the first image to the server, the server searches the network for a network image having the same feature as the preview image. Since the color of each pixel point of the feature in the network image is known, the gray scale region corresponding to the feature in the network image can be determined from those pixel points with known colors; and since the first image also contains the feature, the gray scale region of the first image can be found correspondingly.
It should be understood that, in the embodiment of the present application, the input unit 904 may include a Graphics Processing Unit (GPU) 9041 and a microphone 9042, and the graphics processing unit 9041 processes image data of a still picture or a video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 906 may include a display panel 9061, and the display panel 9061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 907 includes a touch panel 9071 and other input devices 9072. The touch panel 9071 is also referred to as a touch screen. The touch panel 9071 may include two parts, a touch detection device and a touch controller. Other input devices 9072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein. The memory 909 can be used to store software programs as well as various data, including but not limited to application programs and an operating system. The processor 910 may integrate an application processor, which primarily handles the operating system, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It is to be appreciated that the modem processor described above may not be integrated into the processor 910.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the above method for determining a gray level region of an image, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the embodiment of the image grayscale region determining method, and can achieve the same technical effect, and in order to avoid repetition, the description is omitted here.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (11)

1. An image gray scale area determining method is applied to a server and is characterized by comprising the following steps:
acquiring a first image sent by electronic equipment, and performing feature recognition on the first image to obtain a feature of the first image;
selecting a network image containing the feature from a first network image data set;
and determining the gray scale area of the first image according to the gray scale area corresponding to the feature in the network image.
2. The method of claim 1, wherein the first image carries positioning information of the electronic device, and wherein before the network image containing the feature is selected from the first network image data set, the method further comprises:
screening a first network image data set in a network image database according to the positioning information of the electronic equipment, wherein the positioning information of each network image in the first network image data set is the same as the positioning information of the electronic equipment.
3. The method for determining the gray scale region of the image according to claim 1, wherein the step of performing feature recognition on the first image to obtain the feature of the first image comprises:
dividing the first image into a plurality of first sub-images;
and performing feature recognition on each first sub-image to obtain a feature of each first sub-image.
4. The method according to claim 3, further comprising, after obtaining the feature of each of the first sub-images: determining a network image corresponding to each first sub-image in the first network image data set, wherein the network image comprises features of the first sub-image;
the determining the gray scale region of the first image comprises:
and determining the gray scale area of each first sub-image according to the gray scale area corresponding to the feature in the network image so as to obtain the gray scale area of the first image.
5. The method for determining the gray scale region of the image according to claim 1, wherein the performing the feature recognition on the first image to obtain the feature of the first image comprises:
and under the condition that the first image comprises a preset gray scale area, carrying out feature recognition on the preset gray scale area to obtain a preset feature corresponding to the preset gray scale area.
6. An image gray scale region determining method is applied to electronic equipment and is characterized by comprising the following steps:
sending the first image to a server;
receiving a gray area of a first image fed back by the server;
and the gray scale area of the first image is determined by the server according to the gray scale area corresponding to the feature in the network image after the feature of the first image is obtained.
7. The method of claim 6, wherein prior to sending the first image to the server, the method further comprises:
receiving a first input of a user to a target area in a first image, wherein the first input is used for setting the target area as a preset gray area;
displaying a first image including a preset gray area in response to the first input;
the sending the first image to the server includes:
and sending the first image comprising the preset gray scale area to a server.
8. The method for determining the gray scale region of the image according to claim 6, wherein after receiving the gray scale region of the first image fed back by the server, the method further comprises:
and when the first image is a preview image, determining the red gain and the blue gain of a next frame of preview image according to the gray scale area of the first image, wherein the next frame of preview image is the next frame of image of the first image.
9. The method for determining the gray scale region of the image according to claim 6, wherein after receiving the gray scale region of the first image fed back by the server, the method further comprises:
under the condition that the first image is a preview image and the gray scale area of the first image comprises the gray scale areas of a plurality of first sub-images, determining the red gain and the blue gain corresponding to each first sub-image according to the gray scale areas of the plurality of first sub-images;
and determining the red gain and the blue gain of a next frame of preview image based on the red gain and the blue gain corresponding to each first sub-image, wherein the next frame of preview image refers to the next frame of image of the first image.
10. A server, comprising:
the device comprises an acquisition unit, a processing unit and a processing unit, wherein the acquisition unit is used for acquiring a first image sent by electronic equipment and performing feature recognition on the first image to obtain a feature of the first image;
the selecting unit is used for selecting the network image containing the feature from the first network image data set;
and the determining unit is used for determining the gray scale area of the first image according to the gray scale area corresponding to the feature in the network image.
11. An electronic device, comprising:
a transmitting unit for transmitting the first image to a server;
the receiving unit is used for receiving the gray scale area of the first image fed back by the server;
and the gray scale area of the first image is determined by the server according to the gray scale area corresponding to the feature in the network image after the feature of the first image is obtained.
CN202110564405.8A 2021-05-24 2021-05-24 Image gray area determination method, electronic device and server Pending CN113284199A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110564405.8A CN113284199A (en) 2021-05-24 2021-05-24 Image gray area determination method, electronic device and server

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110564405.8A CN113284199A (en) 2021-05-24 2021-05-24 Image gray area determination method, electronic device and server

Publications (1)

Publication Number Publication Date
CN113284199A true CN113284199A (en) 2021-08-20

Family

ID=77280993

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110564405.8A Pending CN113284199A (en) 2021-05-24 2021-05-24 Image gray area determination method, electronic device and server

Country Status (1)

Country Link
CN (1) CN113284199A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150371370A1 (en) * 2014-06-19 2015-12-24 Microsoft Corporation Identifying gray regions for auto white balancing
CN104780353A (en) * 2015-03-26 2015-07-15 广东欧珀移动通信有限公司 Image processing method and device
CN107277479A (en) * 2017-07-10 2017-10-20 广东欧珀移动通信有限公司 White balancing treatment method and device
CN108024107A (en) * 2017-12-06 2018-05-11 广东欧珀移动通信有限公司 Image processing method, device, electronic equipment and computer-readable recording medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
金黄斌等: "基于灰度世界和白点检测的自动白平衡算法" [Automatic white balance algorithm based on gray world and white point detection], 电子器件 (Electronic Devices), vol. 34, no. 02, pages 226-231 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115018565A (en) * 2022-08-08 2022-09-06 长沙朗源电子科技有限公司 Advertisement media image identification method, system, equipment and readable storage medium

Similar Documents

Publication Publication Date Title
CN113132618B (en) Auxiliary photographing method and device, terminal equipment and storage medium
US9491366B2 (en) Electronic device and image composition method thereof
CN107613202B (en) Shooting method and mobile terminal
CN112162930B (en) Control identification method, related device, equipment and storage medium
CN108701439B (en) Image display optimization method and device
US20170032219A1 (en) Methods and devices for picture processing
US20040093432A1 (en) Method and system for conducting image processing from a mobile client device
US11514263B2 (en) Method and apparatus for processing image
WO2023046112A1 (en) Document image enhancement method and apparatus, and electronic device
CN111835982B (en) Image acquisition method, image acquisition device, electronic device, and storage medium
CN112037160A (en) Image processing method, device and equipment
CN113284199A (en) Image gray area determination method, electronic device and server
CN104866194A (en) Picture searching method and apparatus
CN115731442A (en) Image processing method, image processing device, computer equipment and storage medium
CN113473012A (en) Virtualization processing method and device and electronic equipment
CN106484215B (en) Method and device for managing desktop of mobile terminal
CN113794831A (en) Video shooting method and device, electronic equipment and medium
CN109040774B (en) Program information extraction method, terminal equipment, server and storage medium
CN113873081B (en) Method and device for sending associated image and electronic equipment
CN113162845B (en) Image sharing method and device
CN113691729B (en) Image processing method and device
US20210327004A1 (en) Information processing apparatus, information processing method, and system
CN111382752B (en) Labeling method and related device
CN111860342A (en) Image processing method, image processing device, storage medium and electronic equipment
CN115237312A (en) Screen capturing method, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination