CN107808361B - Image processing method, mobile terminal and computer readable storage medium


Info

Publication number
CN107808361B
CN107808361B
Authority
CN
China
Prior art keywords
pixel value
pixel
image
image file
target area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711040054.0A
Other languages
Chinese (zh)
Other versions
CN107808361A (en)
Inventor
付一鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nubia Technology Co Ltd filed Critical Nubia Technology Co Ltd
Priority to CN201711040054.0A priority Critical patent/CN107808361B/en
Publication of CN107808361A publication Critical patent/CN107808361A/en
Application granted granted Critical
Publication of CN107808361B publication Critical patent/CN107808361B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T3/18

Abstract

The embodiment of the invention discloses an image processing method, a mobile terminal and a computer readable storage medium. The image processing method, applied to the mobile terminal, comprises: determining a target area, in which pixel rearrangement needs to be performed, of a first image generated based on a first image file; performing pixel rearrangement based on the first pixel values in the target area to generate third pixel values; performing pixel interpolation processing based on the second pixel values outside the target area to generate fourth pixel values; and combining the third pixel values and the fourth pixel values to generate a second image file. In this way, rearrangement is performed based only on the first pixel values, while the second pixel values undergo pixel interpolation, an operation that is simpler and faster than pixel rearrangement, so the generation rate of the second image corresponding to the second image file can be increased.

Description

Image processing method, mobile terminal and computer readable storage medium
Technical Field
The present invention relates to the field of information technologies, and in particular, to an image processing method, a mobile terminal, and a computer-readable storage medium.
Background
When an all-in-one camera captures an image, the image data acquired by the camera's photosensitive element must be stored as an image file in a specific format. The image file is generated from the pixel values of a plurality of pixels and ultimately outputs an image whose pixel count is smaller than the number of pixels actually sensed by the photosensitive element. In some scenarios, the image file needs to be converted into another image file, so that an image can be output whose pixel count matches the number of pixels actually sensed by the photosensitive element. Pixel rearrangement processing is required here to convert between image files with different pixel storage formats. Pixel rearrangement involves reordering the pixel values of pixel points, such as converting pixel coordinates and re-storing the pixel values; it is a relatively complex operation with a large computational load that consumes substantial computing resources. If many pixels are involved, problems arise such as long pixel-sorting times and a slow response rate in generating the target image file.
Disclosure of Invention
In view of the above, embodiments of the present invention are directed to an image processing method, a mobile terminal and a computer readable storage medium, which at least partially solve the above problems.
To achieve the above purpose, the technical solution of the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides an image processing method applied in a mobile terminal, including:
determining a target area, in which pixel rearrangement needs to be performed, of a first image generated based on a first image file, wherein a pixel value located in the target area is a first pixel value, and a pixel value located outside the target area is a second pixel value; the first image file is an image file stored in a first pixel storage format;
performing the pixel rearrangement based on the first pixel value and generating a third pixel value;
performing pixel interpolation processing based on the second pixel value and generating a fourth pixel value;
and generating a second image file by combining the third pixel value and the fourth pixel value, wherein the second image file is an image file stored in a second pixel storage format.
Based on the above solution, the determining a target region where a first image generated based on a first image file needs to perform pixel rearrangement includes:
identifying a graphical element in the first image;
and determining the area where the main graphic element meeting the preset condition is located as the target area, or determining the area where the graphic element of the preset type is located as the target area.
Based on the above scheme, the target region is a middle region of the first image.
Based on the above scheme, the performing pixel interpolation processing based on the second pixel value and generating a fourth pixel value includes:
generating an interpolated pixel value based on second pixel values of two adjacent pixels;
wherein the fourth pixel value comprises: a second pixel value and the interpolated pixel value.
Based on the scheme, the second image file is used for outputting a second image;
wherein the third pixel value is used for outputting a first region of the second image; the fourth pixel value is used for outputting a second area of the second image;
wherein the image resolution of the first region is equal to the image resolution of the second region.
In a second aspect, an embodiment of the present invention provides a mobile terminal, including:
a determining unit, configured to determine a target area in which pixel rearrangement needs to be performed on a first image generated based on a first image file, wherein the pixel value in the target area is a first pixel value and the pixel value outside the target area is a second pixel value; the first image file is an image file stored in a first pixel storage format;
a rearrangement unit configured to perform the pixel rearrangement based on the first pixel value and generate a third pixel value;
a pixel generation unit configured to perform pixel interpolation processing based on the second pixel value and generate a fourth pixel value;
and the file generation unit is used for generating a second image file by combining the third pixel value and the fourth pixel value, wherein the second image file is an image file stored in a second pixel storage format.
Based on the above scheme, the determining unit is specifically configured to identify the graphic element in the first image and determine that the region where the main graphic element meeting the preset condition is located is the target region.
Based on the above scheme, the target region is a middle region of the first image.
In a third aspect, an embodiment of the present invention provides a mobile terminal, including: a memory, a processor, and a computer program stored on the memory and executed by the processor;
the memory is used for storing information;
the processor is connected with the memory and is used for realizing the image processing method provided by one or more of the above technical schemes by executing the computer program.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium for storing a computer program; after being executed, the computer program can realize the image processing method provided by one or more of the technical solutions.
The embodiments of the invention provide an image processing method, a mobile terminal, and a computer readable storage medium. A target area in which pixel rearrangement needs to be performed is first determined; this target area is usually the core area on which the user's attention is focused. In this embodiment, pixel rearrangement is performed on the first pixel values in that core area to generate third pixel values; only the first pixel values are rearranged, while the fourth pixel values are obtained by pixel interpolation of the second pixel values outside the target area. Pixel interpolation is a simpler and less computationally intensive operation than pixel rearrangement. The method therefore simplifies the pixel processing of the target image that is finally displayed, increases the processing rate, reduces processing delay, and reduces the computing resources consumed by pixel processing, giving it the characteristics of small response delay and low resource consumption.
Drawings
Fig. 1 is a schematic diagram of a hardware structure of a mobile terminal implementing various embodiments of the present invention;
FIG. 2 is a diagram of a wireless communication system for the mobile terminal shown in FIG. 1;
fig. 3 is a schematic flowchart of an image processing method according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating a pixel interpolation process according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a target region and a non-target region of a first image according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a target region and a non-target region of another first image according to an embodiment of the present invention;
FIG. 7 is a diagram illustrating a second image according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of another mobile terminal according to an embodiment of the present invention;
fig. 10 is a schematic diagram of remosaic processing according to an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only to facilitate the explanation of the present invention and have no specific meaning in themselves. Thus, "module", "component", and "unit" may be used interchangeably.
The terminal may be implemented in various forms. For example, the terminal described in the present invention may include a mobile terminal such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a PDA (Personal Digital Assistant), a PMP (Portable Media Player), a navigation device, a wearable device, a smart band, a pedometer, and the like, and a fixed terminal such as a Digital TV, a desktop computer, and the like.
The following description will be given by way of example of a mobile terminal, and it will be understood by those skilled in the art that the construction according to the embodiment of the present invention can be applied to a fixed type terminal, in addition to elements particularly used for mobile purposes.
Referring to fig. 1, which is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present invention, the mobile terminal 100 may include: RF (Radio Frequency) unit 101, WiFi module 102, audio output unit 103, a/V (audio/video) input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 1 is not intended to be limiting of mobile terminals, which may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile terminal in detail with reference to fig. 1:
the radio frequency unit 101 may be configured to receive and transmit signals during information transmission and reception or during a call, and specifically, receive downlink information of a base station and then process the downlink information to the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000(Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division duplex Long Term Evolution), and TDD-LTE (Time Division duplex Long Term Evolution).
WiFi is a short-distance wireless transmission technology. Through the WiFi module 102, the mobile terminal can help the user receive and send e-mails, browse webpages, access streaming media, and the like, providing the user with wireless broadband internet access. Although fig. 1 shows the WiFi module 102, it is not an essential part of the mobile terminal and may be omitted as needed without changing the essence of the invention.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 may include a speaker, a buzzer, and the like.
The a/V input unit 104 is used to receive audio or video signals. The a/V input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or another computer-readable storage medium) or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 may receive sounds (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sounds into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or a backlight when the mobile terminal 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect a touch operation performed by a user on or near the touch panel 1071 (e.g., an operation performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory), and drive a corresponding connection device according to a predetermined program. The touch panel 1071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. In particular, other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like, and are not limited to these specific examples.
Further, the touch panel 1071 may cover the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although the touch panel 1071 and the display panel 1061 are shown in fig. 1 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal, and is not limited herein.
The interface unit 108 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and external devices.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the mobile terminal. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The mobile terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to various components, and preferably, the power supply 111 may be logically connected to the processor 110 via a power management system, so as to manage charging, discharging, and power consumption management functions via the power management system.
Although not shown in fig. 1, the mobile terminal 100 may further include a bluetooth module or the like, which is not described in detail herein.
In order to facilitate understanding of the embodiments of the present invention, a communication network system on which the mobile terminal of the present invention is based is described below.
Referring to fig. 2, fig. 2 is an architecture diagram of a communication network system according to an embodiment of the present invention. The communication network system is an LTE (Long Term Evolution) system, which includes a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an operator's IP service 204, which are communicatively connected in sequence.
Specifically, the UE201 may be the mobile terminal 100 described above, and is not described herein again.
The E-UTRAN202 includes eNodeB2021 and other eNodeBs 2022, among others. Among them, the eNodeB2021 may be connected with other eNodeB2022 through backhaul (e.g., X2 interface), the eNodeB2021 is connected to the EPC203, and the eNodeB2021 may provide the UE201 access to the EPC 203.
The EPC203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036, and the like. The MME2031 is a control node that handles signaling between the UE201 and the EPC203 and provides bearer and connection management. The HSS2032 provides registers for managing functions such as the home location register (not shown) and holds subscriber-specific information about service characteristics, data rates, etc. All user data may be sent through the SGW2034; the PGW2035 may provide IP address assignment and other functions for the UE201; and the PCRF2036 is the policy and charging control decision point for service data flows and IP bearer resources, which selects and provides available policy and charging control decisions for a policy and charging enforcement function (not shown).
The IP services 204 may include the internet, intranets, IMS (IP Multimedia Subsystem), or other IP services, among others.
Although the LTE system is described as an example, it should be understood by those skilled in the art that the present invention is not limited to the LTE system, but may also be applied to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems.
Based on the mobile terminal, the invention provides various embodiments of the method.
As shown in fig. 3, the present embodiment provides an image processing method applied in a mobile terminal, including:
step S110: determining a target area, in which pixel rearrangement needs to be performed, of a first image generated based on a first image file, wherein a pixel value located in the target area is a first pixel value, and a pixel value located outside the target area is a second pixel value; the first image file is an image file stored in a first pixel storage format;
step S120: performing the pixel rearrangement based on the first pixel value and generating a third pixel value;
step S130: performing pixel interpolation processing based on the second pixel value and generating a fourth pixel value;
step S140: and generating a second image file by combining the third pixel value and the fourth pixel value, wherein the second image file is an image file stored in a second pixel storage format.
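To make the flow of steps S110 to S140 concrete, the following is a minimal sketch in Python/NumPy. It assumes, for illustration only, that the first image file holds a full-resolution quad-Bayer raw mosaic (each colour fills a 2x2 cell) together with its 2x2-binned preview, that the target area is a 4-pixel-aligned box, that a nearest-pixel row/column swap stands in for the real remosaic kernel, and that plain pixel replication stands in for the interpolation of fig. 4; all function and variable names are illustrative, not the patent's.

```python
import numpy as np

def remosaic(raw: np.ndarray) -> np.ndarray:
    # Pixel rearrangement (step S120): quad-Bayer -> standard Bayer by
    # swapping the two middle rows and columns of every 4x4 tile.
    h, w = raw.shape
    tiles = raw.reshape(h // 4, 4, w // 4, 4)
    tiles = tiles[:, [0, 2, 1, 3]][:, :, :, [0, 2, 1, 3]]
    return tiles.reshape(h, w)

def convert(raw: np.ndarray, binned: np.ndarray, target) -> np.ndarray:
    # Step S110: the target area as a (y0, y1, x0, x1) box, 4-aligned.
    y0, y1, x0, x1 = target
    # Step S130: fourth pixel values by cheap interpolation everywhere
    # (2x replication of the binned preview; a weighted average of
    # neighbours, as in fig. 4, could be used instead).
    second = np.repeat(np.repeat(binned, 2, axis=0), 2, axis=1)
    # Step S120: third pixel values by rearrangement, target area only.
    second[y0:y1, x0:x1] = remosaic(raw[y0:y1, x0:x1])
    # Step S140: the combined array backs the second image file.
    return second

raw = np.arange(64, dtype=np.uint16).reshape(8, 8)             # toy mosaic
binned = raw.reshape(4, 2, 4, 2).mean(axis=(1, 3)).astype(np.uint16)
print(convert(raw, binned, (0, 4, 0, 4)).shape)                # (8, 8)
```

Only the first pixel values inside the target box undergo the costly rearrangement; everything else is produced by the much cheaper interpolation, which is exactly where the speed-up described in this disclosure comes from.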
The mobile terminal provided by this embodiment may be the mobile terminal shown in fig. 1. It may be a portable terminal carried by a user, such as a mobile phone, a tablet computer, or a wearable device; it may also be a vehicle-mounted terminal installed in various vehicles, or a device-mounted terminal carried by various toys or robots.
In this embodiment, the mobile terminal may include an all-in-one camera, which can be, for example, a four-in-one camera. Taking a four-in-one camera as an example, the camera may include a photosensitive element sensing N pixels, so that N pixel values are generated after light sensing. When a small-size image is acquired, in order to improve its image quality (for example, to improve pixel brightness or definition), the brightness values of 4 adjacent pixels are weighted and the weighted result is treated as the brightness value of one pixel. In this way, the output image file includes pixel values for only N/4 pixels; the image size is reduced while the brightness of each individual pixel is increased, improving image quality. A four-in-one camera may, for example, include 16 million pixels, i.e., N is 16,000,000; if a small-size image is output, a 4-million-pixel image is displayed.
The all-in-one camera may include: the M unifies the camera that has N pixel, N is for being not less than 2 integer, N is the biggest pixel number that this camera image acquisition can gather.
The image files output by the all-in-one camera can be: a first type of image file including N pixels, and a second type of image file including N/M pixels. In this embodiment, the first image file may be the first type of image file.
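As an illustration of the four-in-one merging described above, the following sketch averages each group of 2x2 adjacent pixels into one output pixel; equal weights are assumed here, real devices may use other weightings, and all names are illustrative.

```python
import numpy as np

def bin_four_in_one(raw: np.ndarray) -> np.ndarray:
    # Merge the brightness values of every 2x2 group of adjacent pixels
    # into one pixel value, so N sensed pixels yield N/4 output pixels.
    h, w = raw.shape
    groups = raw.reshape(h // 2, 2, w // 2, 2).astype(np.float64)
    return groups.mean(axis=(1, 3))

sensor = np.random.randint(0, 1024, size=(4000, 4000))  # ~16 million pixels
print(bin_four_in_one(sensor).shape)                    # (2000, 2000): ~4 MP
```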
In this embodiment, the first pixel value generally refers to the pixel values of all pixels in the target area of the first image of the first image file, i.e., the set of pixel values of all pixels within the target area. The second pixel value may be the set of pixel values of all pixels in the first image outside the target area. Similarly, in the embodiment of the present invention, the third pixel value and the fourth pixel value are also sets of pixel values of the corresponding pixels. Fig. 5 and 6 are schematic diagrams of a first image, which can be divided by step S110 into a target area and a non-target area outside the target area.
The pixel rearrangement operation is performed based on the first pixel value and a third pixel value is generated in step S120.
For example, pixels in a first image file are rearranged according to a second pixel storage format of a second image file, thereby realizing processing from the first image file stored in the first pixel storage format to the second image file.
Steps S120 and S130 need not be performed in a fixed order: step S120 may be executed before step S130, after step S130, or simultaneously with step S130.
In step S130, an interpolated pixel is generated based on the pixel interpolation process, so as to obtain the fourth pixel value. Optionally, the step S130 may include:
generating an interpolated pixel value based on second pixel values of two adjacent pixels;
wherein the fourth pixel value comprises: a second pixel value and the interpolated pixel value.
As shown in fig. 4, for two adjacent pixels outside the target area of the first image, the pixel value of pixel 1 is pixel value 1 and the pixel value of pixel 2 is pixel value 2. If a pixel value 3 is to be inserted between pixel 1 and pixel 2, the inserted pixel's value is determined based on pixel value 1 and pixel value 2. For example, the weighted average of pixel value 1 and pixel value 2 may serve as the pixel value of the inserted pixel 3; alternatively, the median of pixel value 1 and pixel value 2 may be used as the pixel value of pixel 3. In short, one pixel is inserted between two adjacent pixels, and its pixel value depends on the pixel values of the two pixels between which it is inserted. When performing the pixel insertion processing, insertion may be carried out multiple times, including on the basis of already-inserted pixels, to obtain the required number of fourth pixels and thus the required number of fourth pixel values.
In fig. 4, one square indicates one pixel, and the pixel value of that pixel is represented by the shade of the square's color. It is clear that the pixel value of the inserted pixel 3 lies between the pixel values of pixels 1 and 2, and the pixel coordinates of the interpolated pixel 3 likewise lie between the pixel coordinates of pixel 1 and pixel 2. A minimal sketch of this insertion follows.
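The sketch below assumes a one-dimensional row of pixel values and an equal-weight average for each inserted value (for two neighbours, the median gives the same result); the names are illustrative.

```python
import numpy as np

def interpolate_row(row: np.ndarray) -> np.ndarray:
    # Insert one pixel between every pair of adjacent pixels; each
    # inserted value is the weighted average of its two neighbours.
    mids = (row[:-1] + row[1:]) / 2.0
    out = np.empty(2 * row.size - 1)
    out[0::2] = row    # original second pixel values
    out[1::2] = mids   # interpolated pixel values (fourth pixel values)
    return out

print(interpolate_row(np.array([10.0, 30.0])))  # [10. 20. 30.]: pixel 3 = 20
```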
In this embodiment, a second image file may be generated by combining the third pixel value and the fourth pixel value. When the second image file is output, it is output directly based on the third pixel value and the fourth pixel value. The second image file presents a second image that includes more pixels than the first image.
In some embodiments, the third pixel value is for outputting a first region of the second image; the fourth pixel value is used for outputting a second area of the second image; wherein the image resolution of the first region is equal to the image resolution of the second region.
The pixel rearrangement can make the number of third pixel values greater than the number of first pixel values;
the pixel interpolation processing likewise makes the number of fourth pixel values greater than the number of second pixel values.
The image resolution of the first region, presented based on the third pixel values, is equal to that of the second region, presented based on the fourth pixel values; that is, the first region and the second region have the same number of pixels per unit area.
Therefore, the problem of a sharp transition in image definition caused by different image resolutions in the first and second regions can be avoided, improving the image quality of the second image.
FIG. 7 is a schematic diagram of a second image, the pattern in the first region being formed based on third pixel values; the pattern in the second region is formed based on the fourth pixel value.
There are many ways to determine the target area, and several alternatives are provided below:
the first alternative is as follows:
the step S110 may include:
step S111: identifying a graphical element in the first image;
step S112: determining the area where the main graphic element meeting the preset condition is located as the target area. As shown in fig. 5, the main graphic element is recognized, the area containing the main graphic element is set as the target area, and the area outside the target area is set as the non-target area.
The main graphic element meeting the preset condition may include at least one of the following:
a graphic element occupying a graphic area in the first image that is not smaller than a preset area; for example, when contour extraction or the like determines that the graphic area enclosed by the contour of an image element is not smaller than the preset area, the preset condition is considered satisfied;
a graphic element whose graphic area ratio in the first image is not less than a preset ratio; for example, the graphic area of the image element is first calculated through contour extraction or the like, the ratio of this graphic area to the entire image area of the first image is then calculated to obtain the graphic area ratio, and whether the preset condition is met is determined by comparison with the preset ratio;
alternatively, the graphic area of each image element in the first image is determined by means of contour extraction or the like, the image elements are sorted by graphic area, and one or more of the larger graphic elements are selected as the main graphic elements. A sketch of this contour-based selection is given below.
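The sketch assumes OpenCV (4.x) is available and an 8-bit grayscale input; the Otsu threshold, the 5% area-ratio default, and the bounding-box output are illustrative choices rather than the patent's method.

```python
import cv2
import numpy as np

def find_target_area(gray: np.ndarray, min_ratio: float = 0.05):
    # Binarise, extract contours, and keep elements whose graphic-area
    # ratio is not less than the preset ratio; return the bounding box
    # of the largest one as (y0, y1, x0, x1), or None if nothing fits.
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    image_area = float(gray.shape[0] * gray.shape[1])
    main = [c for c in contours
            if cv2.contourArea(c) / image_area >= min_ratio]
    if not main:
        return None
    x, y, w, h = cv2.boundingRect(max(main, key=cv2.contourArea))
    return (y, y + h, x, x + w)
```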
In other embodiments, the step S110 may further include:
step S111: identifying a graphical element in the first image;
step S113: and determining the area where the graphic element of the preset type is positioned as the target area.
In this embodiment, the type of each graphic element needs to be identified in step S111. For example, graphic elements may be classified by collection object into person graphic elements obtained by capturing images of people, scenery graphic elements obtained by capturing images of scenery, animal graphic elements obtained by capturing images of animals, and artifact graphic elements obtained by capturing images of man-made objects. The scenery may include natural scenes such as trees, flowers, grass, and blue sky. The artifacts may include buildings, vehicles, etc.
In this embodiment, the graphic elements of the predetermined type may be any graphic elements determined in advance, optionally person graphic elements and/or animal graphic elements.
Fig. 6 is a schematic diagram illustrating the division of the graphic areas by using the figure elements as the predetermined type of graphic elements, and it is apparent that the figure elements are included in the target area, and the figure elements are not included in the non-target area.
In general, there are many ways to determine the target area, and the method is not limited to any of the above.
In some embodiments, the target region is the middle region of the first image. The middle region of an image is usually the core region on which the user's gaze is focused, and in this embodiment the target region may simply be the middle region.
In some embodiments, the middle region may be determined based on a user operation, or determined automatically based on a built-in instruction. For example, when the target region is the middle region, it may be the image region, determined automatically by a built-in instruction, that extends outward over a preset area centered on the center point of the first image; a minimal sketch of this computation follows.
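A minimal sketch of the built-in-instruction case, assuming the preset area is expressed as a fraction of each image dimension (the 0.5 default is illustrative):

```python
def middle_region(height: int, width: int, frac: float = 0.5):
    # Box centred on the image centre covering `frac` of each dimension,
    # returned as (y0, y1, x0, x1).
    cy, cx = height // 2, width // 2
    hy, hx = int(height * frac) // 2, int(width * frac) // 2
    return (cy - hy, cy + hy, cx - hx, cx + hx)

print(middle_region(3000, 4000))  # (750, 2250, 1000, 3000)
```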
As shown in fig. 8, the present embodiment provides a mobile terminal, including:
a determining unit 310, configured to determine a target area where a first image generated based on a first image file needs to perform pixel rearrangement, where a pixel value located in the target area is a first pixel value, and a pixel value located outside the target area is a second pixel value; the first image file is an image file stored in a first pixel storage format;
a rearrangement unit 320 configured to perform the pixel rearrangement based on the first pixel value and generate a third pixel value;
a pixel generation unit 330 configured to perform pixel interpolation processing based on the second pixel value and generate a fourth pixel value;
the file generating unit 340 is configured to generate a second image file by combining the third pixel value and the fourth pixel value, where the second image file is an image file stored in a second pixel storage format.
The determining unit 310, the rearranging unit 320, the pixel generating unit 330, and the file generating unit 340 may all correspond to a processor. The processor may implement operations such as execution of pixel rearrangement, generation of the second image file, and the like through execution of a computer program or application software.
The processor may be a central processing unit, microprocessor, digital signal processor, application processor, programmable array or application specific integrated circuit, or the like.
Optionally, the determining unit 310 includes:
an identification module, which may correspond to an image processor or a general processor, etc., may be used to identify the graphical elements in the first image;
the determining module may correspond to the processor, and may be configured to determine that an area where the main graphical element meeting a preset condition is located is the target area, or determine that an area where a predetermined type of graphical element is located is the target area.
Optionally, the target region is a middle region of the first image.
In some embodiments, the pixel generation unit 330 is configured to generate an interpolated pixel value based on the second pixel values of two adjacent pixels; wherein the fourth pixel value comprises: a second pixel value and the interpolated pixel value.
Further, the second image file is used for outputting a second image; wherein the third pixel value is used for outputting a first region of the second image; the fourth pixel value is used for outputting a second area of the second image;
wherein the image resolution of the first region is equal to the image resolution of the second region.
As shown in fig. 9, the present embodiment provides a mobile terminal, including: a memory 410, a processor 420, and a computer program stored on the memory 410 and executed by the processor 420;
the memory 410 is used for storing information;
the processor 420 is connected to the memory 410 respectively, and is configured to implement the image processing method provided by one or more of the foregoing embodiments by executing the computer program.
The memory 410 may include various types of computer-readable storage media capable of storing information.
The processor 420 is connected to the memory 410, for example, connected to the touch screen and the memory 410 through an integrated circuit bus, and the like, and may be configured to implement the image processing method provided by one or more of the foregoing technical solutions by reading and executing a computer program and the like on the memory 410.
In an embodiment of the invention, the processor 420 may include: a central processing unit, a microprocessor, a digital signal processor, an application processor, a programmable array or an application specific integrated circuit, etc.
An embodiment of the present invention provides a computer-readable storage medium, in which a computer program is stored; after being executed, the computer program can realize the image processing method provided by one or more of the technical solutions.
The computer readable storage medium provided by the embodiment of the present invention may be various types of computer readable storage media, for example, various computer readable storage media such as a random access memory, a read only memory, a flash memory, or a DVD or a usb disk, and may be selected as a non-transitory computer readable storage medium.
The following provides a specific example based on any of the embodiments described above:
in this example, when shooting with a four-in-one camera, a 4-million-pixel image may be obtained. The 4-million-pixel image is in substance backed by 16 million pixels, but because of the 4-pixel merging process, directly outputting the image file generated by the acquisition can only yield a 4-million-pixel image.
If a 16-million-pixel image needs to be shot, the terminal must perform remosaic processing on the preview image captured by the camera, that is, rearrange the pixel points, to obtain the target image.
In fig. 10, one square may represent one pixel. Identically filled squares provide pixel parameters, e.g., luminance values, for the same pixel of the first image before remosaic processing; differently filled squares provide pixel parameters for different pixels of the first image before remosaic processing.
After remosaic processing is completed, the identically filled squares are broken up: the pixel values that jointly provided parameters for the same pixel are scattered and rearranged.
Fig. 10 shows the change in the arrangement of pixel values after remosaic processing. It is apparent that the pixel storage formats (i.e., pixel arrangements) before and after remosaic processing differ.
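To visualise this, the toy demonstration below applies the same row/column swap used in the sketch after step S140, under the assumption of a quad-Bayer layout; identical letters play the role of identically filled squares. It is an illustration, not a production remosaic kernel.

```python
import numpy as np

tile = np.array([['R', 'R', 'G', 'G'],
                 ['R', 'R', 'G', 'G'],
                 ['G', 'G', 'B', 'B'],
                 ['G', 'G', 'B', 'B']])   # before: 2x2 same-colour cells
print(tile[[0, 2, 1, 3], :][:, [0, 2, 1, 3]])
# [['R' 'G' 'R' 'G']
#  ['G' 'B' 'G' 'B']
#  ['R' 'G' 'R' 'G']
#  ['G' 'B' 'G' 'B']]  after: the cells are broken up into a Bayer pattern
```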
Since the effective information required by the terminal is often in the middle area of the image, remosaic processing of the whole image increases the processing time and reduces image processing efficiency. In this example, the terminal may perform remosaic processing only on the middle region of the image while performing pixel interpolation processing on the edge regions outside it, finally obtaining the processed image. This reduces the amount of data subjected to remosaic processing, so image processing efficiency can be greatly improved and image processing time reduced.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all the functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may be separately used as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that: all or part of the steps for implementing the method embodiments can be implemented by hardware related to program instructions, and the program can be stored in a computer readable storage medium, and when executed, the program performs the steps including the method embodiments; and the aforementioned computer-readable storage media comprise: a mobile storage device, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (7)

1. An image processing method applied to a mobile terminal, comprising:
determining a target area, in which pixel rearrangement needs to be performed, of a first image generated based on a first image file, wherein a pixel value located in the target area is a first pixel value, and a pixel value located outside the target area is a second pixel value; the first image file is an image file stored in a first pixel storage format;
performing the pixel rearrangement based on the first pixel value and generating a third pixel value;
performing pixel interpolation processing based on the second pixel value and generating a fourth pixel value, comprising: generating an interpolated pixel value based on second pixel values of two adjacent pixels, wherein the fourth pixel value comprises: a second pixel value and the interpolated pixel value;
combining the third pixel value and the fourth pixel value to generate a second image file, wherein the second image file is an image file stored in a second pixel storage format;
wherein the pixel rearrangement comprises: rearranging the pixels in the first image file according to a second pixel storage format of the second image file;
wherein the determining a target region where a first image generated based on a first image file needs to be subjected to pixel rearrangement comprises:
identifying a graphical element in the first image;
and determining the area where the main graphic element meeting the preset condition is located as the target area, or determining the area where the graphic element of the preset type is located as the target area.
2. The method of claim 1, wherein the target region is a middle region of the first image.
3. The method according to claim 1 or 2, wherein the second image file is used for outputting a second image;
wherein the third pixel value is used for outputting a first region of the second image; the fourth pixel value is used for outputting a second area of the second image;
wherein the image resolution of the first region is equal to the image resolution of the second region.
4. A mobile terminal, comprising:
a determining unit, configured to determine a target area in which pixel rearrangement needs to be performed on a first image generated based on a first image file, wherein the pixel value in the target area is a first pixel value and the pixel value outside the target area is a second pixel value; the first image file is an image file stored in a first pixel storage format;
a rearrangement unit configured to perform the pixel rearrangement based on the first pixel value and generate a third pixel value;
a pixel generation unit configured to perform pixel interpolation processing based on the second pixel value and generate a fourth pixel value, comprising: generating an interpolated pixel value based on second pixel values of two adjacent pixels, wherein the fourth pixel value comprises: a second pixel value and the interpolated pixel value;
the file generating unit is used for generating a second image file by combining the third pixel value and the fourth pixel value, wherein the second image file is an image file stored in a second pixel storage format;
wherein the pixel rearrangement comprises: rearranging the pixels in the first image file according to a second pixel storage format of the second image file;
the determining unit is specifically configured to determine that an area where a main graphic element meeting a preset condition is located is the target area, or determine that an area where a predetermined type of graphic element is located is the target area.
5. The mobile terminal of claim 4, wherein the target area is a middle area of the first image.
6. A mobile terminal, comprising: a memory, a processor, and a computer program stored on the memory and executed by the processor;
the memory is used for storing information;
the processor, connected to the memory, is configured to implement the image processing method provided in any one of claims 1 to 3 by executing the computer program.
7. A computer-readable storage medium, having one or more computer programs stored thereon, which when executed, enable the image processing method provided in any one of claims 1 to 3.
CN201711040054.0A 2017-10-30 2017-10-30 Image processing method, mobile terminal and computer readable storage medium Active CN107808361B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711040054.0A CN107808361B (en) 2017-10-30 2017-10-30 Image processing method, mobile terminal and computer readable storage medium


Publications (2)

Publication Number Publication Date
CN107808361A CN107808361A (en) 2018-03-16
CN107808361B (en) 2021-08-10

Family

ID=61582361

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711040054.0A Active CN107808361B (en) 2017-10-30 2017-10-30 Image processing method, mobile terminal and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN107808361B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101960487A (en) * 2008-03-03 2011-01-26 三菱电机株式会社 Image processing device and method, and image display device and method
US9401126B2 (en) * 2013-11-19 2016-07-26 Samsung Display Co., Ltd. Display driver for pentile-type pixels and display device including the same

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5471796B2 (en) * 2010-05-11 2014-04-16 コニカミノルタ株式会社 Threshold matrix generation method, threshold matrix generation apparatus, threshold matrix, quantization apparatus, and image forming apparatus
JP6019567B2 (en) * 2011-03-31 2016-11-02 ソニー株式会社 Image processing apparatus, image processing method, image processing program, and imaging apparatus
KR101795601B1 (en) * 2011-08-11 2017-11-08 삼성전자주식회사 Apparatus and method for processing image, and computer-readable storage medium
JP6074307B2 (en) * 2013-04-05 2017-02-01 凸版印刷株式会社 Color image processing device
CN105282558B (en) * 2014-07-18 2018-06-15 清华大学 Pixel prediction method, coding method, coding/decoding method and its device in frame
CN106506984B (en) * 2016-11-29 2019-05-14 Oppo广东移动通信有限公司 Image processing method and device, control method and device, imaging and electronic device
CN106604001B (en) * 2016-11-29 2018-06-29 广东欧珀移动通信有限公司 Image processing method, image processing apparatus, imaging device and electronic device
CN106713790B (en) * 2016-11-29 2019-05-10 Oppo广东移动通信有限公司 Control method, control device and electronic device
CN106507068B (en) * 2016-11-29 2018-05-04 广东欧珀移动通信有限公司 Image processing method and device, control method and device, imaging and electronic device
CN106507019B (en) * 2016-11-29 2019-05-10 Oppo广东移动通信有限公司 Control method, control device, electronic device


Also Published As

Publication number Publication date
CN107808361A (en) 2018-03-16

Similar Documents

Publication Publication Date Title
CN107731199B (en) Screen color temperature adjusting method, terminal and computer readable storage medium
CN109068052B (en) Video shooting method, mobile terminal and computer readable storage medium
CN107959795B (en) Information acquisition method, information acquisition equipment and computer readable storage medium
CN108038825B (en) Image processing method and mobile terminal
CN108280136B (en) Multimedia object preview method, equipment and computer readable storage medium
CN108200421B (en) White balance processing method, terminal and computer readable storage medium
CN107295270B (en) Image brightness value determination method and device, terminal and computer-readable storage medium
CN108459799B (en) Picture processing method, mobile terminal and computer readable storage medium
CN107979667B (en) Dual-screen display method, mobile terminal and computer-readable storage medium
CN108198150B (en) Method for eliminating image dead pixel, terminal and storage medium
CN110187808B (en) Dynamic wallpaper setting method and device and computer-readable storage medium
CN107743199B (en) Image processing method, mobile terminal and computer readable storage medium
CN107295262B (en) Image processing method, mobile terminal and computer storage medium
CN107817963B (en) Image display method, mobile terminal and computer readable storage medium
CN111381762A (en) Double-screen switching method and device and computer readable storage medium
CN112135045A (en) Video processing method, mobile terminal and computer storage medium
CN112423102A (en) Small window screen projection control method and device and computer readable storage medium
CN112423211A (en) Multi-audio transmission control method, equipment and computer readable storage medium
CN108305218B (en) Panoramic image processing method, terminal and computer readable storage medium
CN107743204B (en) Exposure processing method, terminal, and computer-readable storage medium
CN112532838B (en) Image processing method, mobile terminal and computer storage medium
CN107808361B (en) Image processing method, mobile terminal and computer readable storage medium
CN112333326B (en) Screen projection display control method and device and computer readable storage medium
CN108196924B (en) Brightness adjusting method, terminal and computer readable storage medium
CN113902918A (en) Game image binarization processing method, equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant