CN111435276A - Image processing method and device and electronic equipment - Google Patents


Info

Publication number
CN111435276A
CN111435276A
Authority
CN
China
Prior art keywords
color
image
designated area
user
colors
Prior art date
Legal status
Pending
Application number
CN201910033558.2A
Other languages
Chinese (zh)
Inventor
童飞
Current Assignee
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN201910033558.2A priority Critical patent/CN111435276A/en
Publication of CN111435276A publication Critical patent/CN111435276A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on GUIs based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0484: Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845: Interaction techniques based on GUIs for the control of specific functions or operations for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the invention provides an image processing method, an image processing apparatus, and an electronic device. The method comprises the following steps: determining a designated area of an image; performing color analysis on the image within the designated area; and displaying the analyzed color categories in an operation interface so that a user can select at least one of the analyzed colors. With the scheme of this embodiment, colors can be extracted automatically through color analysis of the picture, making it convenient for the user to select an analyzed color directly and to process the target image based on the selected color.

Description

Image processing method and device and electronic equipment
Technical Field
The present application relates to the field of electronic technologies, and in particular, to an image processing method and apparatus, and an electronic device.
Background
In image processing, to keep the overall color of an image consistent, it is often necessary to pick an existing color from the image and use it to fill other areas. This operation is commonly known as the "eyedropper" (literally, "straw") in image processing. While easy to perform with a pointer on a PC, this operation is very inconvenient on a mobile terminal: because the contact area between a finger and the screen is large, precisely tapping a color within a small region of the image is difficult on a touch screen, so refined color selection cannot be performed.
Disclosure of Invention
The invention provides an image processing method, an image processing apparatus, and an electronic device that can automatically extract colors through color analysis of a picture, making it convenient for a user to select an analyzed color directly and then process a target image based on the selected color.
In order to achieve the above purpose, the embodiment of the invention adopts the following technical scheme:
in a first aspect, an image processing method is provided, including:
determining a designated area of the image;
and performing color analysis on the image of the designated area, and displaying the analyzed color categories in an operation interface so that a user can select at least one of the analyzed colors.
In a second aspect, there is provided a color pickup processing method including:
providing a selection operation item for determining a designated area in an image for a user in an image operation interface;
based on a specified area determined in an image by a user, carrying out color analysis on the image of the specified area, and displaying the analyzed color category in an operation interface;
in response to the user selecting a color on the image-color-analysis operation component, determining the color selected by the user from the designated area;
providing a selection operation item for determining a target element in an image for a user in an image operation interface;
and filling the image of the target element selected by the user with the selected color based on the color selected by the user.
In a third aspect, an image processing apparatus is provided, including:
the area determining module is used for determining a designated area of the image;
and the color analysis module is used for carrying out color analysis on the image of the designated area and displaying the analyzed color category in an operation interface so that a user can carry out selection operation on at least one analyzed color.
In a fourth aspect, there is provided a color pickup processing apparatus comprising:
the interface display processing module is used for providing a selection operation item for determining a designated area in the image for a user in the image operation interface;
the color analysis processing module is used for carrying out color analysis on the image of the specified area based on the specified area determined by the user in the image and displaying the analyzed color type in an operation interface;
the color selection determining module is used for determining the color selected by the user from the designated area in response to the user selecting a color on the image-color-analysis operation component;
the interface display processing module is also used for providing a selection operation item for determining a target element in the image for a user in the image operation interface;
and the filling operation processing module is used for filling the image of the target element selected by the user with the color the user selected.
In a fifth aspect, an electronic device is provided, comprising:
a memory for storing a program;
and a processor, coupled to the memory, for executing the program; when run, the program performs the image processing method provided by the invention.
In a sixth aspect, another electronic device is provided, including:
a memory for storing a program;
and a processor, coupled to the memory, for executing the program; when run, the program performs the color picking processing method provided by the invention.
The invention thus provides an image processing method, an image processing apparatus, and an electronic device.
The foregoing is only an overview of the technical solutions of the present application. To make the technical means of the present application clearer, so that it can be implemented according to this description, and to make the above and other objects, features, and advantages of the present application more readily understandable, a detailed description of the application is given below.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the application. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 is a schematic diagram of an image processing logic according to an embodiment of the present invention;
FIG. 2a is a flowchart illustrating a first image processing method according to an embodiment of the present invention;
FIG. 2b is a flowchart of a second image processing method according to an embodiment of the present invention;
FIG. 3 is a flowchart of a third image processing method according to an embodiment of the present invention;
FIG. 4 is a flowchart of a fourth image processing method according to an embodiment of the present invention;
FIG. 5 is a first block diagram of an image processing apparatus according to an embodiment of the present invention;
FIG. 6 is a second block diagram of an image processing apparatus according to an embodiment of the present invention;
FIG. 7 is a flowchart of a color picking processing method according to an embodiment of the present invention;
FIG. 8 is a block diagram of a color pick-up processing apparatus according to an embodiment of the present invention;
FIG. 9 is a first schematic structural diagram of an electronic device according to an embodiment of the present invention;
FIG. 10 is a second schematic structural diagram of an electronic device according to an embodiment of the invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Embodiments of the invention overcome a defect of the prior art: when color selection is performed through a touch screen on a mobile terminal such as a mobile phone or a tablet computer, the large contact area between a finger and the screen makes it inconvenient to tap precisely within a small region of the picture, so refined color selection is impossible.
As shown in fig. 1, the image processing scheme presented here can be applied to an image processing platform, which may be a business platform dedicated to image processing or a software client. The platform integrates a number of image processing functions; among them, the present image processing method implements the color-pick-and-fill ("eyedropper"/"straw") function, which mainly comprises two processing links. First, color analysis is performed on a designated area of the image portion containing the colors to be selected, to determine the color categories contained in that area; second, the image portion of the target element is filled with the analyzed color selected by the user.
Further, after the colors of the designated area are analyzed, the analyzed colors can be ranked so that the color most likely to be selected comes first. For example, when a touch operation from the user is received, the screen contact point or contact surface corresponding to it is identified, and the designated area is obtained from that contact point or surface, the color at the screen contact point may be ranked first, and the remaining colors ranked after it in descending order of their area ratio within the designated area, to facilitate color picking by the user.
The technical solution of the present application is further illustrated by the following examples.
Example one
Based on the above image processing concept, fig. 2a is a flowchart of a first image processing method according to an embodiment of the present invention, which can be applied to the image processing platform shown in fig. 1. As shown in fig. 2a, the image processing method includes the steps of:
s210, determining a designated area of the image.
The designated area of the image contains the image portion whose color is to be picked, and its area should be kept as small as possible.
S220, performing color analysis on the image of the designated area and displaying the analyzed color categories in the operation interface, so that the user can conveniently select at least one of the analyzed colors.
After the designated area of the image is determined, the colors contained in that area are analyzed to obtain color categories, and the obtained categories are displayed in the operation interface as text, images, or the like, for the user to select from.
For example, when performing color analysis on the designated area, RGB color-value analysis may be performed on the pixels within it to obtain the color categories contained in its image.
The color analysis method used in this embodiment is not limited to RGB color-value analysis; other color analysis methods may also be used to obtain the color categories contained in the designated area.
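As a minimal illustration of the RGB color-value analysis described above, the following sketch counts the distinct RGB colors inside a rectangular designated area; the function name and the list-of-tuples pixel representation are hypothetical, chosen only for this example.

```python
from collections import Counter

def analyze_region_colors(pixels, region):
    """Count the RGB colors inside a rectangular region of an image.

    pixels: 2-D list indexed as pixels[y][x], each entry an (R, G, B) tuple.
    region: (left, top, right, bottom), with right/bottom exclusive.
    Returns a Counter mapping each color to its pixel count.
    """
    left, top, right, bottom = region
    counts = Counter()
    for y in range(top, bottom):
        for x in range(left, right):
            counts[pixels[y][x]] += 1
    return counts
```

A real implementation would read pixel data from a decoded bitmap rather than a nested list; the counting logic stays the same.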
Further, as shown in fig. 2b, after step S220, the following steps may also be performed:
and S230, responding to the selection operation of the analyzed at least one color, and performing filling operation on the image of the target element with the selected color based on the selected color.
The user can select among the colors of the designated area obtained by the color analysis, choosing one or more of them as the operation object. The image portion of the target element is then filled based on the selected color.
The target elements to be filled may be, but are not limited to, characters (layers) and/or color blocks (pixel sets) in the image.
In addition, in actual operation, if the user selects more than one fill color, the selected colors can be combined when filling the target element: they may be synthesized into a single compromise color (for example, blue and yellow mixed to produce green), or combined in a certain arrangement to produce a new color pattern (such as color bands arranged at intervals).
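One simple way to form such a compromise color, assuming plain per-channel averaging in RGB space, could look like the sketch below; the function name is hypothetical. Note that the blue-plus-yellow-gives-green example in the text corresponds to pigment-style subtractive mixing, which channel averaging does not reproduce (it yields gray instead).

```python
def blend_colors(colors):
    """Average several (R, G, B) colors channel-by-channel
    into one compromise color."""
    n = len(colors)
    return tuple(sum(c[i] for c in colors) // n for i in range(3))
```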
In addition, the target element to be filled may be located in the same image as the designated area being picked from, or in a different image.
According to the image processing method provided by the invention, after the designated area of the image is determined, the colors of the image within it are analyzed and the analyzed color categories are displayed in the operation interface. The user can thus conveniently and directly select at least one analyzed color, and the image is processed based on the selection.
Furthermore, by responding to the selection of at least one analyzed color and filling the image of the target element with the selected color, the scheme makes it convenient for the user to directly apply an analyzed color to the target image element.
Example two
Fig. 3 is a flowchart of an image processing method according to an embodiment of the present invention; this embodiment supplements the scheme of the first embodiment. As shown in fig. 3, the image processing method includes the following steps:
s310, in response to the selection operation of the elements in the image, the target elements are determined.
For example, in the operation interface of an image processing app, the user may select a target element (such as, but not limited to, a character or a color block) from the image as the object to be filled with color.
S320, in response to the selection operation of the area in the image, determining the designated area.
For example, in the operation interface of the image processing app, the user may select one region of the designated image as the operation region, i.e., the designated area, from which a color will be picked. In practice, the designated area should include the image portion whose color the user wants to pick, and its area should be as small as possible.
For example, when the user runs the image processing method on a mobile terminal such as a mobile phone or a tablet computer, the designated area can be selected through the touch screen. Because the designated area in this scheme is not itself the image portion of the finally selected color, the fact that a finger's large contact area prevents precise tapping on a small region of the image does not affect the later color selection.
Two methods for determining the designated area on a touch screen are described below:
the first method comprises the steps of receiving touch operation on a touch screen displaying an image, identifying a screen contact point corresponding to the touch operation, and acquiring a designated area through the screen contact point.
For example, the touch point can be directly acquired by a touch point identification method carried by a mobile phone operating system, and then a certain area range including the touch point can be selected as the designated area based on the touch point.
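The step of expanding a contact point into a designated area could be sketched as follows, using a square region clipped to the image bounds; the function name and the choice of a square (rather than some other shape) are assumptions made only for illustration.

```python
def region_around_point(x, y, radius, width, height):
    """Return a square designated area centered on the touch point
    (x, y), clipped to an image of the given width and height.
    The result is (left, top, right, bottom), right/bottom exclusive.
    """
    left = max(0, x - radius)
    top = max(0, y - radius)
    right = min(width, x + radius + 1)
    bottom = min(height, y + radius + 1)
    return (left, top, right, bottom)
```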
In the second method, a touch operation on the touch screen displaying the image is received, the screen contact surface corresponding to the touch operation is identified, and the designated area is obtained from that contact surface.
For example: 1) in native development, the contact surface can be obtained directly from a terminal system that provides a touch-surface identification method, such as Android's MotionEvent.getSize(); 2) in H5 development, the contact surface can be derived from the contact point as a circular area 18 mm in diameter centered on it (most adults' index fingers average 16-20 mm in width, which converts to roughly 45-57 pixels). A region of a certain extent containing the contact surface can then be selected as the designated area.
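The 16-20 mm fingertip width quoted above maps to 45-57 pixels only at a particular screen density (roughly 72 pixels per inch, which appears to be the implied assumption); a generic conversion, with the function name chosen for this sketch, could be:

```python
def mm_to_pixels(mm, dpi=72):
    """Convert a physical length in millimetres to pixels at the
    given screen density (25.4 mm per inch)."""
    return round(mm * dpi / 25.4)
```

At 72 dpi this reproduces the quoted range (16 mm gives 45 pixels, 20 mm gives 57); a real app would query the device's reported density instead of assuming one.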
S330, color analysis is performed on the image of the designated area.
S340, sorting the analyzed colors, with the color corresponding to the screen contact point first and the remaining colors after it in descending order of their area ratio within the designated area, and displaying the sorted color categories in the operation interface.
After the colors of the designated area are analyzed, the analyzed colors can be sorted. For example, in a designated area determined by a touch operation on the touch screen, the color at the screen contact point is placed at the front of the queue, because it is most likely to be the color the user actually wants; the remaining colors are placed behind it, ordered by their area ratio within the designated area, to make them easy to tap. When displaying the analyzed colors, only a fixed number of the top-ranked colors may be shown, saving display area on the screen interface.
S350, in response to a selection operation on at least one of the analyzed colors, filling the image of the target element with the selected color.
This embodiment builds on the embodiment shown in fig. 2; further, the designated area is determined in response to a selection operation on a region of the image, so that the user can flexibly choose the designated area from which colors are to be picked.
Further, after color analysis of the designated area, the analyzed colors are sorted, with the color corresponding to the screen contact point first and the remaining colors after it in descending order of their area ratio within the designated area, and the sorted color categories are displayed in the operation interface to facilitate the user's color selection.
EXAMPLE III
From the viewpoint of the actual operation flow, this embodiment describes the execution of the image processing methods of the above embodiments, based on the processes shown in figs. 2a, 2b, and 3. Fig. 4 is a flowchart of a fourth image processing method according to an embodiment of the present invention, comprising the following processing steps:
step 1: the user selects a picture in the image processing APP and selects the image element (text) that is desired to be filled, and then clicks on the suction pipe icon
Step 2:
2.1: user clicks a designated area of a desired color from a picture
2.2: the color extraction tool acquires a screen contact point and a screen contact surface of a user.
The acquisition mode of the screen contact point is as follows: the contact point can be directly acquired by a method carried by a mobile phone operating system; the acquisition mode of the screen contact surface: 1) native development, namely, a contact surface can be directly obtained through a system self-carrying method, such as MotionEvent getSize (), of android; 2) h5, converting the contact surface from the contact point, and using the contact point as the center and 18mm as the diameter prototype area (most adults have index finger with average width of 16-20 mm, and the conversion is about 45-57 pixels);
2.3: the color-picking tool analyzes the colors of all pixels in the designated area determined from the contact point or contact surface and counts the pixels of each color; it then arranges and displays the analyzed colors according to a certain logic. A preferred method is to place the color of the screen contact point at the very front of the queue and the remaining colors behind it, sorted from high to low by their share of the pixel count, with any duplicate of the contact-point color removed. The number of colors displayed can be limited according to design requirements, for example to at most 10 colors.
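The ranking logic of step 2.3 (contact-point color first, the rest by descending pixel count, deduplicated against the contact-point color and capped) could be sketched as follows; the function name is hypothetical.

```python
def rank_colors(color_counts, contact_color, max_colors=10):
    """Order analyzed colors for display: the color under the screen
    contact point first, then the remaining colors by descending pixel
    count, with the contact-point color deduplicated and the result
    capped at max_colors entries."""
    rest = sorted(
        (c for c in color_counts if c != contact_color),
        key=lambda c: color_counts[c],
        reverse=True,
    )
    return ([contact_color] + rest)[:max_colors]
```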
Step 3: the user taps the desired fill-color block; the text is changed to the corresponding color, and at the same time the background area of the eyedropper (color-picking tool) is filled with that color.
Step 4: color picking is complete.
Example four
Fig. 5 is a first block diagram of an image processing apparatus according to an embodiment of the present invention. The apparatus can control and execute the method steps shown in fig. 2a and includes:
a region determining module 510 for determining a designated region of the image;
and the color analysis module 520 is configured to perform color analysis on the image of the designated area, and display the analyzed color category in the operation interface, so that a user performs a selection operation on at least one analyzed color.
Further, the color analysis module 520 may be specifically configured to perform RGB color value analysis on the pixel points in the designated area to obtain the color category included in the image in the designated area.
Further, as shown in fig. 6, the image processing apparatus may further include:
and a filling operation module 610, configured to fill the image of the target element with the selected color in response to a selection operation on at least one of the analyzed colors.
Further, the target elements may include words and/or color blocks.
Further, the area determining module 510 may be specifically configured to determine the designated area in response to a selection operation for a region in the image.
Further, the area determining module 510 may be specifically configured to receive a touch operation on the touch screen displaying the image, identify the screen contact point corresponding to the touch operation, and obtain the designated area from that contact point.
Alternatively, the area determining module 510 may be specifically configured to receive a touch operation on the touch screen displaying the image, identify the screen contact surface corresponding to the touch operation, and obtain the designated area from that contact surface.
Further, the color analysis module 520 may be specifically configured to sort the analyzed colors, with the color corresponding to the screen contact point first and the remaining colors after it in descending order of their area ratio within the designated area, and to display the sorted color categories in the operation interface.
Further, the image processing apparatus shown in fig. 6 may further include:
an element selection module 620, configured to determine a target element in response to a selection operation for an element in the image.
The image processing apparatus shown in fig. 6 may be used to perform the method steps as shown in fig. 2b, 3, 4.
According to the image processing apparatus, after the designated area of the image is determined, the colors of the image within it are analyzed and the analyzed color categories are displayed in the operation interface. The user can thus conveniently and directly select at least one analyzed color, and the image is processed based on the selection.
Furthermore, by responding to the selection of at least one analyzed color and filling the image of the target element with the selected color, the apparatus makes it convenient for the user to directly apply an analyzed color to the target image element.
Further, the designated area is determined in response to a selection operation on a region of the image, so the user can flexibly choose the designated area from which colors are to be picked.
Further, after color analysis of the designated area, the analyzed colors are sorted, with the color corresponding to the screen contact point first and the remaining colors after it in descending order of their area ratio within the designated area, and the sorted color categories are displayed in the operation interface to facilitate the user's color selection.
EXAMPLE five
Based on the above-mentioned image processing concept, fig. 7 is a flowchart of a color picking processing method provided by an embodiment of the present invention, and the method can be applied to the image processing platform shown in fig. 1. As shown in fig. 7, the color pickup processing method includes the steps of:
and S710, providing a selection operation item for determining the designated area in the image for the user in the image operation interface.
For example, the image operation interface may provide an operation item with which the user selects part of the image area; after the user taps the operation item, any designated area to be processed can be selected on the target image.
S720, based on the specified area determined by the user in the image, performing color analysis on the image of the specified area, and displaying the analyzed color category in the operation interface.
S730, in response to the user selecting a color on the image-color-analysis operation component, determining the color selected by the user from the designated area.
And S740, providing a selection operation item for determining the target element in the image for the user in the image operation interface.
And S750, performing filling operation on the image of the target element selected by the user in the selected color based on the color selected by the user.
For the specific implementation of this embodiment, refer to the method steps shown in fig. 2a to fig. 4; they are not repeated here.
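Steps S720 and S750 can be sketched as follows, assuming the image is a row-major list of RGB tuples. The function names, the rectangular region format `(x0, y0, x1, y1)`, and the boolean element mask are illustrative assumptions, not part of the disclosure.

```python
def resolve_area_colors(image, region):
    """S720: collect the distinct color categories inside the designated
    rectangular area (x0, y0)..(x1, y1), preserving first-seen order."""
    x0, y0, x1, y1 = region
    pixels = [image[y][x] for y in range(y0, y1) for x in range(x0, x1)]
    return list(dict.fromkeys(pixels))

def fill_element(image, element_mask, color):
    """S750: fill every pixel of the selected target element in place."""
    for y, row in enumerate(element_mask):
        for x, inside in enumerate(row):
            if inside:
                image[y][x] = color
    return image

img = [[(255, 0, 0), (0, 0, 255)],
       [(255, 0, 0), (255, 0, 0)]]
colors = resolve_area_colors(img, (0, 0, 2, 2))    # whole image as the area
# colors -> [(255, 0, 0), (0, 0, 255)]
mask = [[False, True], [False, False]]             # target element: one pixel
fill_element(img, mask, colors[0])                 # fill with the chosen color
# img[0][1] is now (255, 0, 0)
```

In practice the designated area would come from the user's touch operation (S710) and the mask from the element selection (S740); both are hard-coded here only to keep the sketch self-contained.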
Example Six
Fig. 8 is a structural diagram of a color picking processing apparatus according to an embodiment of the present invention. The apparatus can perform the method steps shown in fig. 7 and includes:
the interface display processing module 810, configured to provide the user, in the image operation interface, with a selection operation item for determining a designated area in the image;
a color analysis processing module 820, configured to perform color analysis on the image of the designated area based on the designated area determined by the user in the image, and to display the analyzed color categories in the operation interface;
a color selection determination module 830, configured to determine the color selected by the user in the designated area in response to the user selecting a color on the image color analysis operation component;
the interface display processing module 810, further configured to provide the user, in the image operation interface, with a selection operation item for determining a target element in the image;
a filling operation processing module 840, configured to fill the image of the target element selected by the user with the color selected by the user.
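The module split above can be sketched as a thin controller class; the class name, method names, and callback wiring below are illustrative assumptions rather than the structure mandated by fig. 8.

```python
class ColorPickingApparatus:
    """Maps the fig. 8 modules onto UI callbacks (illustrative only)."""

    def __init__(self, resolver, filler):
        self.resolver = resolver       # color analysis processing module 820
        self.filler = filler           # filling operation processing module 840
        self.selected_color = None     # state held by module 830

    def on_area_selected(self, image, region):
        # Module 820: analyze colors and return them for UI display.
        return self.resolver(image, region)

    def on_color_selected(self, color):
        # Module 830: record the color the user picked.
        self.selected_color = color

    def on_element_selected(self, image, element_mask):
        # Module 840: fill the chosen element with the recorded color.
        return self.filler(image, element_mask, self.selected_color)

app = ColorPickingApparatus(
    resolver=lambda img, region: [(255, 0, 0)],   # stubbed analysis
    filler=lambda img, mask, color: color,        # stubbed fill
)
app.on_color_selected((255, 0, 0))
```

The interface display processing module 810 would own the operation items for S710 and S740 and invoke these callbacks; it is omitted here because it is purely UI plumbing.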
Example Seven
The foregoing embodiment describes the overall architecture of an image processing apparatus; its functions can be implemented by an electronic device. Fig. 9 is a schematic structural diagram of the electronic device according to an embodiment of the present invention, which specifically includes: a memory 910 and a processor 920.
A memory 910 for storing programs.
In addition to the programs described above, the memory 910 may also be configured to store other various data to support operations on the electronic device. Examples of such data include instructions for any application or method operating on the electronic device, contact data, phonebook data, messages, pictures, videos, and so forth.
The memory 910 may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
A processor 920, coupled to the memory 910, for executing the program in the memory 910; when executed, the program performs the image processing method shown in fig. 2a to fig. 4.
The above specific processing operations have been described in detail in the foregoing embodiments, and are not described again here.
Further, as shown in fig. 9, the electronic device may further include: a communication component 930, a power component 940, an audio component 950, a display 960, and the like. Only some of the components are schematically shown in fig. 9; this does not mean that the electronic device includes only the components shown in fig. 9.
The communication component 930 is configured to facilitate wired or wireless communication between the electronic device and other devices. The electronic device may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 930 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 930 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
A power supply component 940 provides power to the various components of the electronic device. The power components 940 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for an electronic device.
The audio component 950 is configured to output and/or input audio signals. For example, the audio component 950 includes a Microphone (MIC) configured to receive external audio signals when the electronic device is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in the memory 910 or transmitted via the communication component 930. In some embodiments, audio component 950 also includes a speaker for outputting audio signals.
The display 960 includes a screen, which may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, it may be implemented as a touch screen to receive input signals from the user.
Example Eight
The foregoing embodiment describes the overall architecture of a color picking processing apparatus; its functions can be implemented by an electronic device. Fig. 10 is a schematic structural diagram of the electronic device according to an embodiment of the present invention, which specifically includes: a memory 101 and a processor 102.
A memory 101 for storing programs.
In addition to the above-described programs, the memory 101 may also be configured to store other various data to support operations on the electronic device. Examples of such data include instructions for any application or method operating on the electronic device, contact data, phonebook data, messages, pictures, videos, and so forth.
The memory 101 may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
A processor 102, coupled to the memory 101, for executing the program in the memory 101; when executed, the program performs the color picking processing method shown in fig. 7.
The above specific processing operations have been described in detail in the foregoing embodiments, and are not described again here.
Further, as shown in fig. 10, the electronic device may further include: a communication component 103, a power component 104, an audio component 105, a display 106, and other components. Only some of the components are schematically shown in fig. 10; this does not mean that the electronic device includes only the components shown in fig. 10.
The communication component 103 is configured to facilitate wired or wireless communication between the electronic device and other devices. The electronic device may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 103 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 103 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
The power supply component 104 provides power to various components of the electronic device. The power components 104 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for an electronic device.
The audio component 105 is configured to output and/or input audio signals. For example, the audio component 105 includes a Microphone (MIC) configured to receive external audio signals when the electronic device is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in the memory 101 or transmitted via the communication component 103. In some embodiments, audio component 105 also includes a speaker for outputting audio signals.
The display 106 includes a screen, which may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, it may be implemented as a touch screen to receive input signals from the user.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be implemented by hardware associated with program instructions. The program may be stored in a computer-readable storage medium; when executed, the program performs the steps of the method embodiments described above. The aforementioned storage medium includes various media that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disk.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced, without such modifications or substitutions departing from the scope of the technical solutions of the embodiments of the present application.

Claims (22)

1. An image processing method comprising:
determining a designated area of the image;
and performing color analysis on the image of the designated area, and displaying the analyzed color category in an operation interface so that a user can perform selection operation on at least one analyzed color.
2. The method of claim 1, wherein color resolving the image of the designated area comprises:
and analyzing the RGB color values of the pixel points of the designated area to obtain the color types contained in the image of the designated area.
3. The method of claim 1, wherein the method further comprises:
and in response to the selection operation of the analyzed at least one color, performing filling operation on the image of the target element with the selected color based on the selected color.
4. The method of claim 3, wherein the target elements comprise words and/or color blocks.
5. The method of claim 1, wherein the determining a designated area of an image comprises:
determining the designated area in response to a selection operation for an area in the image.
6. The method of claim 5, wherein the determining the designated region in response to the selection operation for the region in the image comprises:
and receiving touch operation on a touch screen displaying the image, identifying a screen contact point corresponding to the touch operation, and acquiring the designated area through the screen contact point.
7. The method of claim 5, wherein the determining the designated region in response to the selection operation for the region in the image comprises:
and receiving touch operation on a touch screen displaying the image, identifying a screen contact surface corresponding to the touch operation, and acquiring the designated area through the screen contact surface.
8. The method according to claim 6 or 7, wherein the color resolving the image of the designated area and displaying the resolved color category in an operation interface comprises:
sorting the analyzed colors, wherein the colors corresponding to the screen contact points are sorted at the front positions, the rest colors are sorted at the rear positions, and the colors are sorted from large to small according to the area ratio of the colors in the designated area;
and displaying the sorted color categories in an operation interface.
9. The method of claim 3, wherein the method further comprises:
in response to a selection operation for an element in the image, the target element is determined.
10. A color picking processing method comprising:
providing a selection operation item for determining a designated area in an image for a user in an image operation interface;
based on a specified area determined in an image by a user, carrying out color analysis on the image of the specified area, and displaying the analyzed color category in an operation interface;
responding to the operation of selecting colors on the image color analyzing operation component by the user, and determining the colors selected by the user in the designated area;
providing a selection operation item for determining a target element in an image for a user in an image operation interface;
and filling the image of the target element selected by the user with the selected color based on the color selected by the user.
11. An image processing apparatus comprising:
the area determining module is used for determining a designated area of the image;
and the color analysis module is used for carrying out color analysis on the image of the designated area and displaying the analyzed color category in an operation interface so that a user can carry out selection operation on at least one analyzed color.
12. The apparatus of claim 11, wherein the color analysis module is specifically configured to,
and analyzing the RGB color values of the pixel points of the designated area to obtain the color types contained in the image of the designated area.
13. The apparatus of claim 11, wherein the apparatus further comprises:
and the filling operation module is used for responding to the selection operation of the analyzed at least one color and performing filling operation on the image of the target element in the selected color based on the selected color.
14. The apparatus of claim 13, wherein the target elements comprise words and/or color blocks.
15. The apparatus of claim 11, wherein the area determining module is specifically configured to,
determining the designated area in response to a selection operation for an area in the image.
16. The apparatus of claim 15, wherein the area determining module is specifically configured to,
and receiving touch operation on a touch screen displaying the image, identifying a screen contact point corresponding to the touch operation, and acquiring the designated area through the screen contact point.
17. The apparatus of claim 15, wherein the area determining module is specifically configured to,
and receiving touch operation on a touch screen displaying the image, identifying a screen contact surface corresponding to the touch operation, and acquiring the designated area through the screen contact surface.
18. The apparatus according to claim 16 or 17, wherein the color analysis module is specifically configured to,
sorting the analyzed colors, wherein the colors corresponding to the screen contact points are sorted at the front positions, the rest colors are sorted at the rear positions, and the colors are sorted from large to small according to the area ratio of the colors in the designated area; and displaying the sorted color categories in an operation interface.
19. The apparatus of claim 13, wherein the apparatus further comprises:
an element selection module to determine the target element in response to a selection operation for an element in the image.
20. A color picking processing apparatus comprising:
the interface display processing module is used for providing a selection operation item for determining a designated area in the image for a user in the image operation interface;
the color analysis processing module is used for carrying out color analysis on the image of the specified area based on the specified area determined by the user in the image and displaying the analyzed color type in an operation interface;
the color selection determining module is used for responding to the operation of selecting colors on the image color analyzing operation component by the user and determining the colors selected by the user in the designated area;
the interface display processing module is also used for providing a selection operation item for determining a target element in the image for a user in the image operation interface;
and the filling operation processing module is used for performing filling operation on the image of the target element selected by the user in the selected color based on the color selected by the user.
21. An electronic device, comprising:
a memory for storing a program;
a processor coupled to the memory for executing the program, the program when executed performing the image processing method of any one of claims 1 to 9.
22. An electronic device, comprising:
a memory for storing a program;
a processor, coupled to the memory, for executing the program, which when executed performs the color picking processing method of claim 10.
CN201910033558.2A 2019-01-14 2019-01-14 Image processing method and device and electronic equipment Pending CN111435276A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910033558.2A CN111435276A (en) 2019-01-14 2019-01-14 Image processing method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN111435276A true CN111435276A (en) 2020-07-21

Family

ID=71579940


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104850432A (en) * 2015-04-29 2015-08-19 小米科技有限责任公司 Method and device for adjusting color
CN106775374A (en) * 2016-11-17 2017-05-31 广州视源电子科技股份有限公司 Color acquisition methods and device based on touch-screen
CN107193458A (en) * 2017-05-22 2017-09-22 广州视源电子科技股份有限公司 One kind takes color method, device, equipment and storage medium


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112506597A (en) * 2020-12-01 2021-03-16 珠海格力电器股份有限公司 Software interface color matching method and device, computer equipment and storage medium
CN112506597B (en) * 2020-12-01 2022-04-08 珠海格力电器股份有限公司 Software interface color matching method and device, computer equipment and storage medium
CN113658287A (en) * 2021-07-14 2021-11-16 支付宝(杭州)信息技术有限公司 User interface color matching processing method, device and equipment
CN113658287B (en) * 2021-07-14 2024-05-03 支付宝(杭州)信息技术有限公司 User interface color matching processing method, device and equipment


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination