CN110888579A - Image processing method and device - Google Patents

Image processing method and device Download PDF

Info

Publication number
CN110888579A
CN110888579A (application CN201811052343.7A)
Authority
CN
China
Prior art keywords
image
pixel
color
processed
replacement
Prior art date
Legal status
Pending
Application number
CN201811052343.7A
Other languages
Chinese (zh)
Inventor
郑微
Current Assignee
Beijing Microlive Vision Technology Co Ltd
Original Assignee
Beijing Microlive Vision Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Microlive Vision Technology Co Ltd
Priority to CN201811052343.7A
Publication of CN110888579A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06T3/04

Abstract

The embodiments of the disclosure disclose an image processing method and device. One embodiment of the method comprises: acquiring an image to be processed; acquiring position information of a user's operation on a screen; determining, according to the position information, at least two image areas in the image to be processed that are to be displayed on the screen; and processing the at least two image areas respectively to obtain at least two processed image areas. This implementation effectively reduces the amount of computation and the system overhead.

Description

Image processing method and device
Technical Field
The embodiment of the disclosure relates to the technical field of computers, in particular to an image processing method and device.
Background
A wide variety of effects can be achieved by adding different filters to an image or video, so for terminal devices such as smartphones the filter function is gradually becoming a standard feature. On this basis, scenes in which different filters are superimposed are becoming more common. At present, in such scenes, each of the different filters must process the entire image or video frame.
Disclosure of Invention
The embodiment of the disclosure provides an image processing method and device.
In a first aspect, an embodiment of the present disclosure provides an image processing method, including: acquiring an image to be processed; acquiring position information of a user's operation on a screen; determining, according to the position information, at least two image areas in the image to be processed that are to be displayed on the screen; and processing the at least two image areas respectively to obtain at least two processed image areas.
In some embodiments, the processing the at least two image regions respectively to obtain the processed at least two image regions may include: color replacement is performed for pixels in at least two image areas.
In some embodiments, color replacement is performed for pixels in at least two image regions, including: determining whether the target two-dimensional color map contains a replacement color corresponding to the pixel; in response to determining that the replacement color corresponding to the pixel is included in the target two-dimensional color map, transforming the color of the pixel to the replacement color corresponding to the pixel.
In some embodiments, performing color replacement for the pixels in the at least two image areas may further include: determining a replacement color corresponding to the pixel by interpolation in response to determining that the replacement color corresponding to the pixel is not included in the target two-dimensional color map; the color of the pixel is transformed into a replacement color corresponding to the pixel.
In some embodiments, the method further comprises: and displaying the processed at least two image areas on a screen.
In a second aspect, an embodiment of the present disclosure provides an image processing apparatus including: an image acquisition unit configured to acquire an image to be processed; a position information acquisition unit configured to acquire position information of an operation of a user on a screen; a determination unit configured to determine at least two image areas to be displayed in a screen in the image to be processed, based on the position information; and the processing unit is configured to process the at least two image areas to obtain the processed at least two image areas.
In some embodiments, the processing unit is further configured to: color replacement is performed for pixels in at least two image areas.
In some embodiments, the processing unit is further configured to: determining whether the target two-dimensional color map contains a replacement color corresponding to the pixel; in response to determining that the replacement color corresponding to the pixel is included in the target two-dimensional color map, transforming the color of the pixel to the replacement color corresponding to the pixel.
In some embodiments, the processing unit is further configured to: determining a replacement color corresponding to the pixel by interpolation in response to determining that the replacement color corresponding to the pixel is not included in the target two-dimensional color map; the color of the pixel is transformed into a replacement color corresponding to the pixel.
In some embodiments, the apparatus further comprises: a display unit configured to display the processed at least two image areas on a screen.
In a third aspect, an embodiment of the present disclosure provides a terminal device, including: one or more processors; a storage device having one or more programs stored thereon; when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the method as described in any implementation manner of the first aspect.
In a fourth aspect, an embodiment of the present disclosure provides a computer-readable medium, on which a computer program is stored, which when executed by a processor implements the method as described in any of the implementations of the first aspect.
According to the image processing method and device provided by the embodiments of the disclosure, at least two image areas in an image to be processed that are to be displayed on a screen are processed respectively according to position information of a user's operation on the screen. In this process, processing of the whole image to be processed is avoided, so that the amount of computation and the system overhead can be effectively reduced.
Drawings
Other features, objects and advantages of the disclosure will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which one embodiment of the present disclosure may be applied;
FIG. 2 is a flow diagram for one embodiment of an image processing method according to the present disclosure;
FIG. 3 is a schematic diagram of an exemplary application scenario of an image processing method according to the present disclosure;
FIG. 4 is a flow diagram of yet another embodiment of an image processing method according to the present disclosure;
FIG. 5 is a schematic block diagram of one embodiment of an image processing apparatus according to the present disclosure;
FIG. 6 is a schematic structural diagram of an electronic device suitable for use in implementing embodiments of the present disclosure.
Detailed Description
The present disclosure is described in further detail below with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant disclosure and are not limiting of the disclosure. It should be noted that, for the convenience of description, only the parts relevant to the related disclosure are shown in the drawings.
It should be noted that, in the present disclosure, the embodiments and features of the embodiments may be combined with each other without conflict. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 illustrates an exemplary system architecture 100 to which embodiments of the image processing method or apparatus of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages or the like. Various communication client applications, such as a photographing application, a video sharing application, an image processing application, and the like, may be installed on the terminal devices 101, 102, and 103.
The terminal devices 101, 102, and 103 may be hardware or software. When they are hardware, they may be various electronic devices having a display screen and supporting image processing and display, including but not limited to smartphones, tablet computers, laptop computers, and the like. When they are software, they may be installed in the electronic devices listed above, and may be implemented as multiple pieces of software or software modules, or as a single piece of software or software module. This is not specifically limited herein.
The server 105 may be a server providing various services, such as a background web server providing support for applications installed on the terminal devices 101, 102, 103.
It should be noted that the image processing method provided by the embodiments of the present disclosure is generally executed by the terminal devices 101, 102, 103. Accordingly, the image processing apparatus is generally provided in the terminal devices 101, 102, 103.
The server may be hardware or software. When the server is hardware, it may be implemented as a distributed server cluster formed by multiple servers, or may be implemented as a single server. When the server is software, it may be implemented as multiple pieces of software or software modules (e.g., to provide distributed services), or as a single piece of software or software module. And is not particularly limited herein.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of an image processing method according to the present disclosure is shown. The image processing method comprises the following steps:
step 201, acquiring an image to be processed.
In this embodiment, an execution body of the image processing method (for example, the terminal device shown in fig. 1) may acquire an image to be processed by shooting with a camera. Alternatively, the execution body may acquire the image to be processed from a server or another terminal device through a wired or wireless connection.
The image to be processed may be an arbitrary image. The specific determination of the image to be processed may be specified by a technician, or may be obtained by screening according to a certain condition.
It should be noted that the image to be processed may be one image or at least two images, and is not limited herein.
Step 202, acquiring position information of the operation of the user on the screen.
In this embodiment, the execution body may acquire position information of the user's operation on the screen. In practice, the user may perform an operation (e.g., a click or a slide) on the screen with a finger or a stylus to input a corresponding signal. The screen may be a liquid crystal display device that supports receiving signals input by a stylus, a finger, or the like; by way of example, it may be a capacitive screen, a resistive screen, or the like. During the user's operation, the contact point between the user's finger or stylus and the screen may change. For example, when the user performs a sliding operation, the contact point usually moves.
In order to accurately identify various operations of a user, various electronic devices currently can continuously determine positions of contact points between a finger or a stylus pen of the user and a screen in the screen during the operation of the user, and acquire position information of the positions. The position information may be information for characterizing a position of the contact point in the screen. For example, the position information may be coordinates of the contact points in a two-dimensional coordinate system of the screen. As another example, the position information may also be a distance between the contact point and the edge of the screen.
Step 203, determining at least two image areas to be displayed in the screen in the image to be processed according to the position information.
In this embodiment, the executing body may determine at least two image areas to be displayed on the screen in the image to be processed according to the position information obtained in step 202.
To adapt to various scenes, the execution body may, when displaying the image, display at least two image areas of the image in a specific manner instead of displaying the whole image. The at least two image areas may be the same image area or different image areas, and each of them may be a part or the whole of the image to be processed; this is not specifically limited herein. In practice, whether the at least two image areas show the same image region, and whether together they cover the whole of the image to be processed, may be preset by a technician, and is likewise not specifically limited herein.
On the basis, the execution body can determine at least two image areas to be displayed in the screen in the image to be processed according to the position information.
Take as an example presetting the at least two image areas to be two different areas of the same image. The execution body may divide the image to be processed into left and right image areas according to the coordinates of the contact point between the user's finger or stylus and the screen. As an example, when the image to be processed has the same size as the screen, the region from the left edge of the image to the coordinate point may be determined as one image area, and the region from the coordinate point to the right edge as the other, thereby obtaining two image areas of the image to be displayed on the screen. As another example, the execution body may divide the image to be processed into upper and lower areas according to the coordinates of the contact point. It is understood that three or more image areas may be determined in a similar manner, which is not described in detail herein.
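The left/right split described above can be sketched as follows. This is a minimal illustration and not part of the patent text; the NumPy array representation and the helper name are assumptions.

```python
import numpy as np

def split_left_right(image: np.ndarray, touch_x: int):
    """Split an H x W x C image into left and right regions at the
    x-coordinate of the contact point (cf. step 203)."""
    left = image[:, :touch_x]    # from the left edge to the contact point
    right = image[:, touch_x:]   # from the contact point to the right edge
    return left, right

# Example: a 4 x 6 image split at x = 2
image = np.zeros((4, 6, 3), dtype=np.uint8)
left, right = split_left_right(image, 2)
print(left.shape, right.shape)  # (4, 2, 3) (4, 4, 3)
```

An upper/lower split would index the first axis instead of the second.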
In a case where at least two image regions are preset to be regions in different images, as an example, the user may adjust a selection box on the screen to select an image region. In this case, the execution subject may determine at least two image areas by the coordinates of the user-determined selection box.
And 204, respectively processing the at least two image areas to obtain at least two processed image areas.
In this embodiment, the execution body may process the at least two image areas respectively, while leaving the other parts of the image to be processed untouched, so as to obtain at least two processed image areas. The execution body may perform various kinds of processing on each image area; as an example, it may adjust the resolution, size, and the like of the two image areas respectively.
In some optional implementations of this embodiment, the execution body may perform color replacement on the pixels in the at least two image areas. Color replacement can achieve effects including but not limited to: adjusting hue, contrast, and brightness, and adding various special effects (filters).
In this embodiment, the execution body may apply different processing to the at least two image areas. For example, for one of two image areas, the execution body may adjust the contrast, while for the other it may adjust the saturation.
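Applying different processing to two regions can be sketched as below. This is a hypothetical illustration: the patent mentions contrast and saturation, but for brevity this example varies only a contrast factor per region.

```python
import numpy as np

def adjust_contrast(region: np.ndarray, factor: float) -> np.ndarray:
    """Scale pixel values about mid-grey (128) to change contrast."""
    out = 128.0 + factor * (region.astype(np.float32) - 128.0)
    return np.clip(out, 0, 255).astype(np.uint8)

# Two regions of one image receive different processing.
left = np.full((2, 2, 3), 100, dtype=np.uint8)
right = np.full((2, 2, 3), 100, dtype=np.uint8)
left_out = adjust_contrast(left, 2.0)    # boost contrast in the left region
right_out = adjust_contrast(right, 0.5)  # reduce contrast in the right region
print(left_out[0, 0, 0], right_out[0, 0, 0])  # 72 114
```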
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of the image processing method according to the present embodiment. In the application scenario of fig. 3, the execution body of the image processing method is a smartphone 301. Take as an example dividing the image to be processed into left and right parts and adding a different filter to each. The smartphone 301 may first acquire the image to be processed 3011. The user adjusts the extent of the left and right regions by sliding or clicking on the screen. In this process, for example when the user clicks or slides to the center of the screen, the smartphone 301 may obtain the coordinates of the contact point between the user and the screen. From the obtained coordinates, two image regions 3012 and 3013 in the image to be processed 3011 are determined, as shown by the dashed-line boxes. The areas within the dashed-line boxes are then processed respectively: as an example, the image region 3012 is Gaussian-blurred and the image region 3013 is warped, resulting in two processed image regions 3012' and 3013'. Thereafter, if necessary, the two processed image regions 3012' and 3013' may be displayed as shown in the figure.
In this embodiment, the execution body may respectively process at least two image areas in the image to be processed that are to be displayed on the screen, according to position information of the user's operation on the screen. In this process, processing of the whole image to be processed is avoided, so that the amount of computation and the system overhead can be effectively reduced.
With further reference to FIG. 4, there is shown a flow 400 of yet another embodiment of an image processing method, the flow 400 of the image processing method comprising the steps of:
step 401, acquiring an image to be processed.
Step 402, acquiring position information of the user's operation on the screen.
Step 403, determining at least two image areas to be displayed in the screen in the image to be processed according to the position information.
In the present embodiment, the specific processing in steps 401 to 403 and its technical effects are similar to those of steps 201 to 203 in the embodiment corresponding to fig. 2, and are not repeated herein.
Step 404, for pixels in at least two image areas, performing the following color replacement steps:
step 4041 determines whether the target two-dimensional color map contains a replacement color corresponding to the pixel.
In this embodiment, the executing body of the image processing method may determine whether or not the target two-dimensional color map contains the replacement color corresponding to the pixel. The target two-dimensional color map may be any two-dimensional color map. The determination of the target two-dimensional color map can be specified by a technician, and can also be obtained by screening according to certain conditions. For example, the target two-dimensional color map may be a two-dimensional color map corresponding to a filter selected by a user.
Image processing often involves transforming the colors of pixels in an image. Taking an RGB image as an example, the RGB values of the replacement colors can be obtained by looking up a table: for each processing effect (filter), a corresponding color lookup table may be prepared in advance. The processed image can then be obtained by querying the table and replacing the original color of each pixel with the color found there. Here, RGB is an image color mode in which the three values represent the red, green, and blue channels, respectively. A color lookup table may store the RGB values of some or all of the replacement colors for one processing effect.
However, since the number of colors is large and each processing effect requires its own table, building and querying full color lookup tables is computationally expensive. In practice, a ColorLUT (Color Lookup Table) stored as a two-dimensional color map is therefore adopted, which also saves storage space. The RGB value of each pixel in the two-dimensional color map is the RGB value of a replacement color: for a given pixel in the image area, a pixel of the two-dimensional color map is located from the RGB value of the image pixel, and the color stored there is the replacement color. The specific implementation principle of ColorLUT is widely applied in the field of image processing and is not repeated herein.
In addition, since each channel of an RGB image has 256 possible values, a color lookup table containing a replacement color for every color would be very large. Therefore, in practice, a reduced version of the color lookup table is often employed.
In some optional implementations of this embodiment, the image size of the target two-dimensional color map is 512 × 512. In addition, the image size of the target two-dimensional color map may also use other values as needed. For example: 1024 × 1024, 2048 × 2048, and so on.
On this basis, the execution subject may determine whether the target two-dimensional color map includes the replacement color corresponding to the pixel by querying. If so, step 4043 may be performed directly.
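The query can be sketched as follows. The 8 × 8 grid of 64 × 64 tiles below is a common convention for 512 × 512 color maps (blue selects the tile, red and green index within it) and is an assumption here, not something the patent specifies.

```python
import numpy as np

def lut_lookup(lut: np.ndarray, r: int, g: int, b: int):
    """Look up the replacement color for an RGB pixel in a 512 x 512
    two-dimensional color map laid out as an 8 x 8 grid of 64 x 64 tiles."""
    rq, gq, bq = r // 4, g // 4, b // 4   # quantize 0..255 -> 0..63
    tile_row, tile_col = divmod(bq, 8)    # blue channel selects the tile
    x = tile_col * 64 + rq                # red indexes columns within the tile
    y = tile_row * 64 + gq                # green indexes rows within the tile
    return tuple(lut[y, x])

# An "identity" LUT maps each quantized color to itself.
identity = np.zeros((512, 512, 3), dtype=np.uint8)
for b in range(64):
    ty, tx = divmod(b, 8)
    g, r = np.mgrid[0:64, 0:64]
    identity[ty*64:(ty+1)*64, tx*64:(tx+1)*64, 0] = r * 4
    identity[ty*64:(ty+1)*64, tx*64:(tx+1)*64, 1] = g * 4
    identity[ty*64:(ty+1)*64, tx*64:(tx+1)*64, 2] = b * 4

print(lut_lookup(identity, 200, 100, 40))  # (200, 100, 40)
```

A real filter LUT would store the filtered colors instead of the identity mapping.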
Step 4042, in response to determining that the target two-dimensional color map does not contain the replacement color corresponding to the pixel, determining the replacement color corresponding to the pixel by interpolation.
In the present embodiment, in response to determining that the target two-dimensional color map does not contain the replacement color corresponding to the pixel, the execution body may determine the replacement color by various interpolation methods, including linear interpolation, bilinear interpolation, and the like.
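A minimal sketch of the interpolation step, assuming a reduced table that stores replacement colors only at every fourth channel level (the table entries below are hypothetical):

```python
def lerp(c0, c1, t):
    """Linearly interpolate between two stored replacement colors."""
    return tuple(round(a + (b - a) * t) for a, b in zip(c0, c1))

# A channel value of 130 falls between stored levels 128 and 132,
# giving an interpolation weight t = (130 - 128) / 4 = 0.5.
color_at_128 = (60, 80, 100)
color_at_132 = (64, 88, 104)
print(lerp(color_at_128, color_at_132, 0.5))  # (62, 84, 102)
```

Bilinear interpolation applies the same idea along two table axes at once.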
Step 4043, transforming the color of the pixel into the replacement color corresponding to the pixel.
In this embodiment, the execution subject may replace the original color of the pixel with a replacement color obtained by direct query or interpolation. Specifically, the RGB values of the original pixels may be replaced with the queried RGB values.
And step 405, displaying the processed at least two image areas on a screen.
In this embodiment, the execution body may display the processed at least two image areas on the screen.
In some optional implementations of this embodiment, the image processing method may further include the following steps: displaying a set of related information of two-dimensional color maps on the screen; and, in response to receiving the user's selection of related information from the set, taking the two-dimensional color map indicated by the selected related information as the target two-dimensional color map and performing the color replacement step.
In these implementations, each two-dimensional color map in the set may correspond to a processing effect (filter), and the correspondence may be set in advance. The related information of a two-dimensional color map may be various information related to it. As an example, it may be description information of the corresponding processing effect (e.g., old photo effect, black-and-white photo effect, etc.). As another example, it may be a preview image of the processing effect, which is not specifically limited herein.
On the basis of the above, the user can select the required processing effect from the set. According to the corresponding relationship between the processing effect and the two-dimensional color map, the execution main body may use the two-dimensional color map corresponding to the processing effect selected by the user as the target two-dimensional color map, and execute the color replacement step.
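The correspondence between the displayed related information and the target two-dimensional color map could be held in a simple mapping, as sketched below. The effect names and file names are hypothetical; the patent does not specify how the correspondence is stored.

```python
# Hypothetical correspondence between effect descriptions shown on
# screen and the file names of their two-dimensional color maps.
effect_to_lut = {
    "old photo": "lut_old_photo.png",
    "black and white": "lut_black_white.png",
}

def target_lut_for(selection: str) -> str:
    """Return the two-dimensional color map for the effect the user picked."""
    return effect_to_lut[selection]

print(target_lut_for("old photo"))  # lut_old_photo.png
```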
As can be seen from fig. 4, compared with the embodiment corresponding to fig. 2, in this embodiment the processing of the at least two image areas is implemented by querying the color lookup table, which increases the image processing speed. Further, by using a reduced version of the color lookup table together with interpolation, replacement colors not contained in the two-dimensional color map can still be obtained. This reduces the data volume of the two-dimensional color map and facilitates its storage and querying.
With further reference to fig. 5, as an implementation of the methods shown in the above figures, the present disclosure provides an embodiment of an image processing apparatus, which corresponds to the embodiment of the method shown in fig. 2, and which is particularly applicable in various electronic devices.
As shown in fig. 5, the image processing apparatus 500 of the present embodiment includes: an image acquisition unit 501, a position information acquisition unit 502, a determination unit 503, and a processing unit 504. Wherein the image acquisition unit 501 is configured to acquire an image to be processed. The position information acquisition unit 502 is configured to acquire position information of an operation of a user on a screen. The determination unit 503 is configured to determine at least two image areas to be displayed in the screen in the image to be processed, based on the position information. The processing unit 504 is configured to process the at least two image areas, resulting in processed at least two image areas.
In this embodiment, the specific processing of the image acquisition unit 501, the position information acquisition unit 502, the determination unit 503, and the processing unit 504 in the image processing apparatus 500, and the technical effects achieved, may refer to steps 201 to 204 of the embodiment corresponding to fig. 2, and are not described herein again.
In some optional implementations of this embodiment, the processing unit 504 may be further configured to: color replacement is performed for pixels in at least two image areas.
In some optional implementations of this embodiment, the processing unit 504 may be further configured to: determining whether the target two-dimensional color map contains a replacement color corresponding to the pixel; in response to determining that the replacement color corresponding to the pixel is included in the target two-dimensional color map, transforming the color of the pixel to the replacement color corresponding to the pixel.
In some optional implementations of this embodiment, the processing unit 504 may be further configured to: determining a replacement color corresponding to the pixel by interpolation in response to determining that the replacement color corresponding to the pixel is not included in the target two-dimensional color map; the color of the pixel is transformed into a replacement color corresponding to the pixel.
In some optional implementations of this embodiment, the apparatus 500 may further include: a display unit (not shown in the figure). Wherein the display unit is configured to display the processed at least two image areas on a screen.
In this embodiment, the processing unit 504 may respectively process at least two image areas in the image to be processed that are to be displayed on the screen, according to position information of the user's operation on the screen. In this process, processing of the whole image to be processed is avoided, so that the amount of computation and the system overhead can be effectively reduced.
Referring now to fig. 6, shown is a schematic diagram of an electronic device 600 (e.g., the terminal device in fig. 1) suitable for implementing embodiments of the present disclosure. The terminal device in the embodiments of the present disclosure may include, but is not limited to, mobile terminals such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), and a vehicle-mounted terminal (e.g., a car navigation terminal), and fixed terminals such as a digital TV and a desktop computer. The electronic device shown in fig. 6 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
As shown in fig. 6, the electronic device 600 may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 601 that may perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 602 or a program loaded from a storage device 608 into a random access memory (RAM) 603. The RAM 603 also stores various programs and data necessary for the operation of the electronic device 600. The processing device 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
Generally, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, a touch pad, a keyboard, a mouse, a camera, a microphone, an accelerometer, a gyroscope, etc.; output devices 607 including, for example, a liquid crystal display (LCD), a speaker, a vibrator, etc.; storage devices 608 including, for example, a magnetic tape, a hard disk, etc.; and a communication device 609. The communication device 609 may allow the electronic device 600 to communicate wirelessly or by wire with other devices to exchange data. While fig. 6 illustrates the electronic device 600 with various devices, it is to be understood that not all of the illustrated devices are required to be implemented or provided; more or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a computer readable medium, the computer program containing program code for performing the method illustrated in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication device 609, or installed from the storage device 608, or installed from the ROM 602. When executed by the processing device 601, the computer program performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that the computer readable medium in the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be included in the electronic device, or may exist alone without being incorporated into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquire an image to be processed; acquire position information of an operation of a user on a screen; determine, according to the position information, at least two image areas of the image to be processed to be displayed in the screen; and process the at least two image areas respectively to obtain at least two processed image areas.
Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. The name of the unit does not in some cases constitute a limitation of the unit itself, and for example, the image acquisition unit may also be described as a "unit that acquires an image to be processed".
The foregoing description is merely a description of the preferred embodiments of the present disclosure and of the principles of the technology employed. Those skilled in the art will appreciate that the scope of the disclosure is not limited to technical solutions formed by the particular combination of the above technical features, and also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the spirit of the disclosure, for example, a technical solution formed by replacing the above features with (but not limited to) technical features having similar functions disclosed in this disclosure.

Claims (12)

1. An image processing method comprising:
acquiring an image to be processed;
acquiring position information of operation of a user on a screen;
determining at least two image areas to be displayed in the screen in the image to be processed according to the position information;
and respectively processing the at least two image areas to obtain at least two processed image areas.
2. The method of claim 1, wherein the processing the at least two image areas respectively to obtain at least two processed image areas comprises:
performing color replacement on pixels in the at least two image areas.
3. The method of claim 2, wherein the performing color replacement on pixels in the at least two image areas comprises:
determining whether a target two-dimensional color map contains a replacement color corresponding to the pixel;
in response to determining that the target two-dimensional color map includes a replacement color corresponding to the pixel, transforming the color of the pixel to the replacement color corresponding to the pixel.
4. The method of claim 3, wherein the performing color replacement on pixels in the at least two image areas further comprises:
in response to determining that the target two-dimensional color map does not contain a replacement color corresponding to the pixel, determining a replacement color corresponding to the pixel by interpolation;
transforming the color of the pixel to a replacement color corresponding to the pixel.
5. The method according to any one of claims 1-4, wherein the method further comprises:
and displaying the at least two processed image areas on the screen.
6. An image processing apparatus comprising:
an image acquisition unit configured to acquire an image to be processed;
a position information acquisition unit configured to acquire position information of an operation of a user on a screen;
a determination unit configured to determine at least two image areas to be displayed in the screen in the image to be processed according to the position information;
and the processing unit is configured to process the at least two image areas to obtain at least two processed image areas.
7. The apparatus of claim 6, wherein the processing unit is further configured to:
perform color replacement on pixels in the at least two image areas.
8. The apparatus of claim 7, wherein the processing unit is further configured to:
determining whether a target two-dimensional color map contains a replacement color corresponding to the pixel;
in response to determining that the target two-dimensional color map includes a replacement color corresponding to the pixel, transforming the color of the pixel to the replacement color corresponding to the pixel.
9. The apparatus of claim 8, wherein the processing unit is further configured to:
in response to determining that the target two-dimensional color map does not contain a replacement color corresponding to the pixel, determining a replacement color corresponding to the pixel by interpolation;
transforming the color of the pixel to a replacement color corresponding to the pixel.
10. The apparatus of any of claims 6-9, wherein the apparatus further comprises:
a display unit configured to display the at least two processed image areas on the screen.
11. A terminal device, comprising:
one or more processors;
a storage device having one or more programs stored thereon,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-5.
12. A computer-readable medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the method of any one of claims 1-5.
CN201811052343.7A 2018-09-10 2018-09-10 Image processing method and device Pending CN110888579A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811052343.7A CN110888579A (en) 2018-09-10 2018-09-10 Image processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811052343.7A CN110888579A (en) 2018-09-10 2018-09-10 Image processing method and device

Publications (1)

Publication Number Publication Date
CN110888579A true CN110888579A (en) 2020-03-17

Family

ID=69745290

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811052343.7A Pending CN110888579A (en) 2018-09-10 2018-09-10 Image processing method and device

Country Status (1)

Country Link
CN (1) CN110888579A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102930559A (en) * 2012-10-23 2013-02-13 华为技术有限公司 Image processing method and device
CN106201242A (en) * 2016-06-27 2016-12-07 北京金山安全软件有限公司 Image processing method and device and electronic equipment
CN107657592A (en) * 2017-09-19 2018-02-02 珠海市君天电子科技有限公司 A kind of image processing method, processing unit and electronic equipment
CN107950017A (en) * 2016-06-15 2018-04-20 索尼公司 Image processing equipment, image processing method and picture pick-up device


Similar Documents

Publication Publication Date Title
CN110865862B (en) Page background setting method and device and electronic equipment
CN110211030B (en) Image generation method and device
CN110825286B (en) Image processing method and device and electronic equipment
CN111258519B (en) Screen split implementation method, device, terminal and medium
CN112767238A (en) Image processing method, image processing device, electronic equipment and storage medium
CN112434175A (en) Multimedia information display method and device, electronic equipment and computer readable medium
CN110852946A (en) Picture display method and device and electronic equipment
CN110647369B (en) Page dynamic display method and device, mobile terminal and storage medium
US20230360286A1 (en) Image processing method and apparatus, electronic device and storage medium
CN111461965B (en) Picture processing method and device, electronic equipment and computer readable medium
CN110097520B (en) Image processing method and device
CN110599394A (en) Method and device for processing pictures in online presentation, storage medium and equipment
CN112465940B (en) Image rendering method and device, electronic equipment and storage medium
CN110888579A (en) Image processing method and device
CN111738950B (en) Image processing method and device
CN110399802B (en) Method, apparatus, medium, and electronic device for processing eye brightness of face image
CN110636331B (en) Method and apparatus for processing video
CN110825993B (en) Picture display method and device and electronic equipment
CN114170341A (en) Image processing method, device, equipment and medium
CN110825480A (en) Picture display method and device, electronic equipment and computer readable storage medium
CN110807164A (en) Automatic image area adjusting method and device, electronic equipment and computer readable storage medium
CN114827482B (en) Image brightness adjusting method and device, electronic equipment and medium
US20220292733A1 (en) Method and apparatus for text effect processing
WO2023125500A1 (en) Image processing method and apparatus, electronic device and storage medium
CN117425086A (en) Multimedia data processing method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination