CN111221999A - Picture processing method and device, mobile terminal and storage medium

Info

Publication number
CN111221999A
Authority
CN
China
Prior art keywords
picture
mask
determining
features
common feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010018762.XA
Other languages
Chinese (zh)
Inventor
胡杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010018762.XA priority Critical patent/CN111221999A/en
Publication of CN111221999A publication Critical patent/CN111221999A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/54Browsing; Visualisation therefor

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application provides a picture processing method, a picture processing device, a mobile terminal and a storage medium, wherein the method comprises the following steps: in the process of browsing an atlas, determining a currently browsed target atlas and acquiring common features in the target atlas; determining a region containing the common features in a first picture, and determining a first mask corresponding to the region containing the common features in the first picture, wherein the first picture is any one picture contained in the target atlas; and marking the common features in the first picture with the first mask. The embodiment of the application can improve the display effect of the pictures in the atlas.

Description

Picture processing method and device, mobile terminal and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method and an apparatus for processing an image, a mobile terminal, and a storage medium.
Background
Currently, in an album of a mobile terminal (e.g., a mobile phone, a tablet computer, etc.), stored pictures are classified according to features (e.g., according to the persons contained in the pictures) and grouped into different atlases. All pictures in an atlas contain the same feature (e.g., the avatar of a particular person). When a user browses a certain atlas, if the pictures in the atlas contain many features (for example, photos taken in a gym or a shopping mall, where many other people appear in the background), the user often cannot quickly find the common feature of the atlas in the pictures.
Disclosure of Invention
The embodiment of the application provides a picture processing method, a picture processing device, a mobile terminal and a storage medium, which can be used for distinguishing and displaying common features of pictures in a picture set and improving the display effect of the pictures in the picture set.
A first aspect of an embodiment of the present application provides an image processing method, including:
in the process of browsing an atlas, determining a currently browsed target atlas and acquiring common features in the target atlas;
determining a region containing the common feature in a first picture, and determining a first mask corresponding to the region containing the common feature in the first picture, wherein the first picture is any one picture contained in the target atlas;
marking the common features in the first picture with the first mask.
A second aspect of the embodiments of the present application provides an image processing apparatus, including:
the first determining unit is used for determining a currently browsed target atlas in the process of browsing the atlas;
an acquisition unit configured to acquire a common feature in the target map set;
a second determining unit, configured to determine a region of a first picture that includes the common feature, and determine a first mask corresponding to the region of the first picture that includes the common feature, where the first picture is any one picture included in the target atlas;
an image processing unit for marking the common features in the first picture with the first mask.
A third aspect of an embodiment of the present application provides a mobile terminal, including a processor and a memory, where the memory is used to store a computer program, and the computer program includes program instructions, and the processor is configured to call the program instructions to execute the step instructions in the first aspect of the embodiment of the present application.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program makes a computer perform part or all of the steps as described in the first aspect of embodiments of the present application.
A fifth aspect of embodiments of the present application provides a computer program product, wherein the computer program product comprises a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps as described in the first aspect of embodiments of the present application. The computer program product may be a software installation package.
In the embodiment of the application, in the browsing process of the atlas, a currently browsed target atlas is determined, and common features in the target atlas are obtained; determining a region containing the common feature in a first picture, and determining a first mask corresponding to the region containing the common feature in the first picture, wherein the first picture is any one picture contained in the target atlas; marking the common features in the first picture with the first mask. According to the embodiment of the application, in the browsing process of the atlas, the common features in the pictures in the currently browsed target atlas are marked by the mask, and the common features of the pictures in the atlas can be displayed in a distinguishing manner, so that the display effect of the pictures in the atlas is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic flowchart of a picture processing method according to an embodiment of the present application;
fig. 2 is a schematic diagram of an effect of displaying an atlas image provided in an embodiment of the present application;
fig. 3 is a schematic flowchart of another image processing method according to an embodiment of the present application;
fig. 4 is a schematic flowchart of another image processing method according to an embodiment of the present application;
FIG. 5a is a schematic illustration of common features and other features in a first picture provided by an embodiment of the present application;
FIG. 5b is a schematic illustration of common features and other features in another first picture provided by an embodiment of the present application;
fig. 6 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of a mobile terminal according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The mobile terminal according to the embodiments of the present application may include various handheld devices, vehicle-mounted devices, wearable devices, computing devices or other processing devices connected to a wireless modem, as well as various forms of User Equipment (UE), Mobile Stations (MS), terminal devices, and so on. For convenience of description, the above-mentioned devices are collectively referred to as a mobile terminal.
Referring to fig. 1, fig. 1 is a schematic flow chart illustrating a picture processing method according to an embodiment of the present disclosure. As shown in fig. 1, the picture processing method may include the following steps.
101, in the process of browsing the atlas, the mobile terminal determines the currently browsed target atlas and acquires the common features in the target atlas.
In the embodiment of the application, in an album of a mobile terminal (for example, a mobile phone, a tablet computer, etc.), stored pictures are classified according to common features, and pictures containing the same common feature are classified into the same atlas. All pictures in an atlas contain the same common feature. Different atlases may contain the same picture; for example, if the common feature of a first atlas is a first feature and the common feature of a second atlas is a second feature, and a picture contains both the first feature and the second feature, the picture may be included in the first atlas or in the second atlas.
The common features may be a portrait, a landscape, a building, a logo, etc. For example, the common feature may be an avatar containing a particular person.
And 102, the mobile terminal determines an area containing the common features in the first picture, and determines a first mask corresponding to the area containing the common features in the first picture, wherein the first picture is any picture contained in the target atlas.
In the embodiment of the application, a mask converts different gray values into different transparencies and applies them to the layer where the mask is located, so that the transparency of different parts of the layer changes accordingly. The mask may process the area it covers, or process the areas other than the area it covers. The effects of the mask may include adjusting transparency, brightness, gray scale, and the like.
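As an illustrative aside (not part of the claimed method), the mapping from gray values to layer transparency described above could be sketched as follows in Python; the helper name gray_mask_to_alpha and the use of the Pillow and NumPy libraries are assumptions of this example.

```python
import numpy as np
from PIL import Image

def gray_mask_to_alpha(layer: Image.Image, mask_gray: np.ndarray) -> Image.Image:
    """Apply a grayscale mask as the transparency of the layer it covers.

    A gray value of 0 makes the corresponding layer pixel fully transparent,
    255 keeps it fully opaque, and intermediate values are partially
    transparent, so different parts of the layer change accordingly.
    Assumes mask_gray has the same height and width as the layer.
    """
    rgba = layer.convert("RGBA")
    r, g, b, _ = rgba.split()
    # Use the mask's gray channel directly as the layer's alpha channel.
    alpha = Image.fromarray(mask_gray.astype(np.uint8), mode="L")
    return Image.merge("RGBA", (r, g, b, alpha))
```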
The shape of the region including the common feature may be a rectangle, a circle, an ellipse, a polygon, or any other shape, which is not limited in the embodiments of the present application.
The first picture is any one picture contained in the target atlas. Other pictures in the target atlas can be processed in the same mask-marking manner as the first picture, and details are not repeated here.
Optionally, in step 102, the mobile terminal determines the first mask corresponding to the area containing the common feature in the first picture, and specifically includes the following steps:
(11) the mobile terminal determines the size of an area containing the common features in the first picture, and determines the size of the first mask according to the size of the area containing the common features;
(12) the mobile terminal determines the average brightness of the first picture, and determines the brightness of the first mask according to the average brightness of the first picture.
In an embodiment of the present application, the region size of the common feature is the size of the region occupied by the common feature in the first picture, and may include the area of that region. The size of the first mask may include the area of the first mask and the shape of the first mask.
The boundary of the region occupied by the common feature in the first picture can be determined by an edge detection algorithm, and the area of that region can then be calculated by dividing the region into a plurality of rectangular regions and summing their areas.
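A minimal sketch of this rectangle-decomposition idea is given below, assuming the common-feature region has already been turned into a boolean array (for example by filling the boundary returned by an edge detector); the function name region_area_by_rectangles is hypothetical.

```python
import numpy as np

def region_area_by_rectangles(region_mask: np.ndarray) -> int:
    """Approximate the area of the region marked True in region_mask.

    Each image row intersecting the region is treated as a 1-pixel-tall
    rectangle spanning from the leftmost to the rightmost region pixel on
    that row; the rectangle areas are summed to give the region area.
    """
    area = 0
    for row in region_mask:
        cols = np.flatnonzero(row)
        if cols.size:
            area += int(cols[-1] - cols[0] + 1)  # width x height (= 1)
    return area
```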
The area of the first mask may be equal to the area of the region of the common feature in the first picture, and the shape of the first mask may be the same as the shape of that region, ensuring that the region covered by the first mask is exactly the region occupied by the common feature in the first picture.
Alternatively, the area of the first mask may be greater than or equal to the area of the region of the common feature in the first picture, and the shape of the first mask may be a fixed shape such as a rectangle, a circle or an ellipse.
After the mobile terminal determines the average brightness of the first picture, the average brightness of the first picture may be used as the brightness of the first mask. This avoids a poor mask effect caused by too large a difference between the brightness of the first mask and the average brightness of the first picture.
Optionally, after determining the average brightness of the first picture, the mobile terminal may increase the average brightness by a certain amount and use the result as the brightness of the first mask; for example, the brightness of the first mask may be set to twice or three times the average brightness of the first picture. Because the brightness of the first mask then differs greatly from the average brightness of the first picture, the position of the common feature can be quickly determined in each picture of the target atlas, which facilitates the user in quickly locating the common feature.
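The size and brightness choices in steps (11) and (12) could look roughly like the sketch below; Pillow's ImageStat is used for the average brightness, and the parameter brightness_factor (1 for a matching mask, 2 or 3 for a strongly contrasting mask) is an assumption of the example rather than a value fixed by the embodiment.

```python
from PIL import Image, ImageStat

def choose_first_mask(picture: Image.Image,
                      region_box: tuple[int, int, int, int],
                      brightness_factor: float = 1.0):
    """Determine the first mask's size and brightness for the first picture.

    The mask is sized to the bounding box of the common-feature region, and
    its brightness is the picture's average luminance, optionally scaled so
    that the mask stands out from the rest of the picture.
    """
    left, top, right, bottom = region_box
    mask_size = (right - left, bottom - top)
    # Average brightness of the picture's luminance channel.
    avg_brightness = ImageStat.Stat(picture.convert("L")).mean[0]
    mask_brightness = min(255.0, avg_brightness * brightness_factor)
    return mask_size, mask_brightness
```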
103, the mobile terminal marks the common features in the first picture with a first mask.
In the embodiment of the present application, the mobile terminal marks the common features in the first picture with the first mask, specifically: the mobile terminal covers the region of the common features in the first picture with the first mask, and then processes the area covered by the first mask or processes the areas other than the area covered by the first mask.
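One possible realization of "processing the areas other than the area covered by the first mask" is to dim everything outside the common-feature region, as in the hypothetical Pillow-based sketch below; the dim_factor value is illustrative.

```python
from PIL import Image, ImageEnhance

def mark_common_feature(picture: Image.Image,
                        region_box: tuple[int, int, int, int],
                        dim_factor: float = 0.4) -> Image.Image:
    """Mark the common feature by processing the area outside the mask.

    The region covered by the first mask keeps its original pixels, while
    the rest of the picture is darkened so the common feature stands out.
    """
    marked = ImageEnhance.Brightness(picture).enhance(dim_factor)
    # Restore the original pixels inside the common-feature region.
    marked.paste(picture.crop(region_box), region_box)
    return marked
```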
The following specifically explains the display effect of the pictures in the album by taking a mobile phone album interface as an example with reference to fig. 2. Referring to fig. 2, fig. 2 is a schematic diagram illustrating an effect of displaying an image in an album according to an embodiment of the present disclosure. As shown in fig. 2, when a user enters the album directory of the mobile phone album, the album can be seen to be classified into a "people" atlas, a "buildings" atlas, a "natural scenery" atlas and a "marks" atlas according to categories. If the user wants to browse the "people" atlas, the user can click on the icon of the "people" atlas to enter the "people" atlas directory of the mobile phone album. Under the "people" atlas, the "people" atlas is divided into a "first person" atlas, a "second person" atlas, a "third person" atlas and a "fourth person" atlas according to the different persons. For example, the "first person" atlas may contain pictures of the avatar of person A, the "second person" atlas may contain pictures of the avatar of person B, the "third person" atlas may contain pictures of the avatar of person C, and the "fourth person" atlas may contain pictures of the avatar of person D. Person A, person B, person C and person D may be different persons included in the album. If the user wants to browse the "first person" atlas (the "first person" atlas may correspond to the target atlas), the icon of the "first person" atlas may be clicked to enter the "first person" atlas directory of the mobile phone album. The "first person" atlas may include picture 1, picture 2, picture 3, and picture 4 (the first picture may be any one of picture 1, picture 2, picture 3, and picture 4), where picture 1, picture 2, picture 3, and picture 4 all include an avatar of the first person: the avatar of the first person included in picture 1 is marked by mask 1, the avatar of the first person included in picture 2 is marked by mask 2, the avatar of the first person included in picture 3 is marked by mask 3, and the avatar of the first person included in picture 4 is marked by mask 4. Note that the sizes of mask 1, mask 2, mask 3, and mask 4 are different because the area occupied by the avatar of the first person differs among pictures 1, 2, 3, and 4.
In the embodiment of the application, in the browsing process of the atlas, the common features in the pictures in the currently browsed target atlas are marked by using the mask, and the common features of the pictures in the atlas can be displayed in a distinguishing manner, so that the display effect of the pictures in the atlas is improved.
Referring to fig. 3, fig. 3 is a schematic flowchart illustrating another picture processing method according to an embodiment of the present disclosure. As shown in fig. 3, the picture processing method may include the following steps.
301, in the process of browsing the atlas, the mobile terminal determines the currently browsed target atlas and obtains the common features in the target atlas.
And 302, the mobile terminal determines an area containing the common features in the first picture, and determines a first mask corresponding to the area containing the common features in the first picture, wherein the first picture is any picture contained in the target atlas.
The specific implementation of step 301 and step 302 may refer to step 101 and step 102 shown in fig. 1, and is not described here again.
303, the mobile terminal detects whether the first picture is in a preview mode or a reduced display mode. If yes, go to step 304, otherwise, the mobile terminal does not mark the first picture by using the mask.
In the embodiment of the application, in the preview mode or the reduced display mode, the displayed area of the first picture is small and a user cannot easily find the common features in the first picture. Marking with the first mask allows the position of the common features to be determined quickly from the first picture, which facilitates the user in quickly locating the common features.
Step 303 needs to be performed before step 304, but it may be performed before or after step 302, and before or after step 301; this is not limited in the embodiments of the present application.
304, the mobile terminal marks the common features in the first picture with the first mask.
The specific implementation of step 304 may refer to step 103 shown in fig. 1, and is not described herein again.
Optionally, after step 304 is executed, the following steps may also be executed:
and if the first picture is in a full-screen display mode or an enlarged display mode, the mobile terminal cancels the mark of the first mask.
In the embodiment of the application, when the first picture is in the full-screen display mode or the enlarged display mode, the user's intention is to see the details of the first picture clearly, and the marking effect of the first mask can prevent the user from seeing those details, which degrades the user's picture browsing experience. According to the method and the device, the mark of the first mask can be cancelled when the first picture is in the full-screen display mode or the enlarged display mode, so that the user's picture browsing experience is improved.
Optionally, if the first picture is switched from the full-screen display mode or the enlarged display mode to the preview mode or the reduced display mode, the common features in the first picture may be marked again with the first mask.
In the embodiment of the application, the first picture in the target atlas is not always in the mask-marked state. In the preview mode or the reduced display mode, the first picture is marked with the first mask, so that the position of the common feature can be quickly determined from the first picture and the user can quickly locate it. When the first picture is in the full-screen display mode or the enlarged display mode, the mark of the first mask is cancelled, so that the user's picture browsing experience is improved.
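The display-mode-dependent behaviour of steps 303–304 and the optional cancellation can be summarised by a small decision helper, sketched below; the DisplayMode enumeration and the mark/unmark callback names are assumptions of this example, not the API of any particular platform.

```python
from enum import Enum, auto

class DisplayMode(Enum):
    PREVIEW = auto()
    REDUCED = auto()
    FULL_SCREEN = auto()
    ENLARGED = auto()

def should_mark_with_mask(mode: DisplayMode) -> bool:
    """The first mask is applied only in preview or reduced display mode."""
    return mode in (DisplayMode.PREVIEW, DisplayMode.REDUCED)

def on_display_mode_changed(picture, mode: DisplayMode, mark, unmark):
    """Apply or cancel the first mask whenever the display mode switches."""
    if should_mark_with_mask(mode):
        return mark(picture)    # preview / reduced: (re-)apply the first mask
    return unmark(picture)      # full-screen / enlarged: cancel the mark
```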
Referring to fig. 4, fig. 4 is a schematic flowchart illustrating another picture processing method according to an embodiment of the present disclosure. As shown in fig. 4, the picture processing method may include the following steps.
401, in the process of browsing the atlas, the mobile terminal determines the currently browsed target atlas and obtains the common features in the target atlas.
402, the mobile terminal determines an area containing the common features in the first picture, wherein the first picture is any one picture contained in the target atlas.
The specific implementation of step 401 and step 402 may refer to step 101 and step 102 shown in fig. 1, and is not described here again.
And 403, the mobile terminal determines the common feature area occupied by the common feature in the first picture, and determines whether the first picture contains other features of the same category as the common feature.
In the embodiment of the present application, the common feature area of the common feature refers to the area of the region occupied by the common feature in the first picture. Other features refer to features of the same category as the common feature. For example, if the common feature is the avatar of the first person, the other features may be the avatars of persons other than the first person. Because the other features are of the same category as the common feature, when the first picture is in the preview mode or the reduced display mode, the user easily confuses the common feature with the other features, and it is difficult to quickly recognize the common feature in the first picture.
And 404, in a case that the ratio of the common feature area to the area of the first picture is smaller than a first threshold and the first picture contains other features of the same category as the common feature, the mobile terminal determines the first picture as a mask enabled picture, and determines a first mask corresponding to the region containing the common feature in the first picture.
In the embodiment of the application, a mask enabled picture refers to a picture that needs to be marked with a mask. Specifically, taking the first picture as an example, the common features in the first picture may be marked with the first mask when the first picture is in the preview mode or the reduced display mode, and the mark of the first mask in the first picture may be cancelled when the first picture is in the full-screen display mode or the enlarged display mode.
When the ratio of the common feature area to the area of the first picture is smaller than the first threshold and the first picture contains other features of the same category as the common feature, the common feature in the first picture is easily overlooked and easily confused with the other features when a user views the first picture. In this case, the first picture is determined as a mask enabled picture, the first mask corresponding to the region containing the common feature in the first picture is determined, and the common feature in the first picture is marked with the first mask. In other words, the mask is used for marking only when the common feature in the first picture is easily confused. Since not all pictures have their common features marked with a mask, the number of mask-marked pictures is reduced and the efficiency of mask marking is improved.
The first threshold may be preset and stored in a memory (e.g., a non-volatile memory) of the mobile terminal. For example, the first threshold may be less than 20%, and specifically, the first threshold may be set to any one of 10%, 8%, or 5%.
Optionally, the mobile terminal determines that the first picture is a mask closed picture when the first picture does not include other features that are the same as the common feature category.
In the embodiment of the application, when the first picture does not contain other features of the same category as the common feature, the common feature in the first picture is not interfered with by other features when a user views the first picture and is relatively easy for the user to see. Therefore, in this case, regardless of whether the ratio of the common feature area to the area of the first picture is greater than the first threshold, the first picture does not need to be mask-marked, which reduces the number of mask-marked pictures in the target atlas and improves the efficiency of mask marking.
Optionally, in a case that the first picture contains other features of the same category as the common feature, the mobile terminal determines the other-feature area of each of the other features;
and if a target other feature whose area is larger than the first threshold exists among the other features, the mobile terminal determines that the first picture is a mask enabled picture, and the step in step 404 of determining, by the mobile terminal, the first mask corresponding to the region containing the common feature in the first picture is executed.
In the embodiment of the application, when the first picture contains other features of the same category as the common feature and a target other feature whose area is larger than the first threshold exists among the other features, the other features easily cause visual interference with the common feature, so the first picture needs to be mask-marked, which improves the user's picture browsing experience.
The boundary of each other feature can be determined by an edge detection algorithm, and the area occupied by that other feature in the first picture can then be calculated by dividing the region it occupies into a plurality of rectangular regions and summing their areas.
Optionally, when no target other feature whose area is larger than the first threshold exists among the other features, the mobile terminal determines that the first picture is a mask closed picture and does not mark the first picture with a mask.
In the embodiment of the application, in a case that the first picture contains other features of the same category as the common feature, if the areas of all the other features in the first picture are smaller than the first threshold, the other features in the first picture are considered not to cause visual interference with the common feature. The first picture then does not need to be mask-marked, which reduces the number of mask-marked pictures in the target atlas and improves the efficiency of mask marking.
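Combining the conditions of steps 403–404 with the optional refinements above, the mask enabled / mask closed decision could be sketched as follows; treating the first threshold as a ratio of feature area to picture area is an interpretation made for this example, and the function name is hypothetical.

```python
def is_mask_enabled_picture(common_area: float,
                            other_areas: list[float],
                            picture_area: float,
                            first_threshold: float = 0.10) -> bool:
    """Decide whether the first picture is a mask enabled picture.

    Mask enabled only when (a) the common feature occupies less than the
    first threshold of the picture and (b) at least one other feature of the
    same category exceeds that threshold; otherwise it is a mask closed
    picture and no mask marking is performed.
    """
    if not other_areas:
        return False  # no other features of the same category: mask closed
    if common_area / picture_area >= first_threshold:
        return False  # the common feature is already prominent enough
    return any(area / picture_area > first_threshold for area in other_areas)
```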
405, the mobile terminal marks the common features in the first picture with the first mask.
The specific implementation of step 405 may refer to step 103 shown in fig. 1, and is not described herein again.
In the embodiment of the application, when the proportion of the common feature area in the first picture is smaller than the first threshold and the first picture contains other features of the same category as the common feature, the first picture may be marked with the first mask, and a user may quickly determine the position of the common feature from the first picture, which facilitates the user in quickly locating the common feature. When the first picture does not contain other features of the same category as the common feature, the mobile terminal determines that the first picture is a mask closed picture, which reduces the number of mask-marked pictures in the target atlas and improves the efficiency of mask marking.
Referring to fig. 5a, fig. 5a is a schematic diagram of common features and other features in a first picture according to an embodiment of the present disclosure. As shown in fig. 5a, the first picture includes a common feature, other feature A, other feature B, and other feature C. Since the ratio of the common feature area to the area of the first picture is smaller than the first threshold and the first picture contains other features of the same category as the common feature, the mobile terminal determines the first picture as a mask enabled picture, determines the first mask corresponding to the region containing the common feature in the first picture, and marks the common feature in the first picture with the first mask.
Referring to fig. 5b, fig. 5b is a schematic diagram of common features and other features in another first picture according to an embodiment of the present disclosure. As shown in fig. 5b, the first picture includes a common feature, other feature A, other feature B, and other feature C. Since the ratio of the common feature area to the area of the first picture is greater than the first threshold and the areas of other feature A, other feature B, and other feature C are all smaller than the first threshold, the mobile terminal determines that the first picture is a mask closed picture and does not mark the common feature of the first picture with a mask.
The above description has introduced the solution of the embodiment of the present application mainly from the perspective of the method-side implementation process. It is understood that, in order to implement the above-described functions, the mobile terminal includes corresponding hardware structures and/or software modules for performing the respective functions. Those of skill in the art will readily appreciate that the various illustrative units and algorithm steps described in connection with the embodiments provided herein can be implemented by hardware or by a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the mobile terminal may be divided into the functional units according to the method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
In accordance with the above, referring to fig. 6, fig. 6 is a schematic structural diagram of a picture processing apparatus according to an embodiment of the present application, where the picture processing apparatus 600 may include a first determining unit 601, an obtaining unit 602, a second determining unit 603, and an image processing unit 604, where:
the first determining unit 601 is configured to determine a currently browsed target atlas in an atlas browsing process;
the obtaining unit 602 is configured to obtain a common feature in the target map set;
the second determining unit 603 is configured to determine a region of a first picture including the common feature, and determine a first mask corresponding to the region of the first picture including the common feature, where the first picture is any one picture included in the target atlas;
the image processing unit 604 is configured to mark the common feature in the first picture with the first mask.
Optionally, the second determining unit 603 determines the first mask corresponding to the region containing the common feature in the first picture, specifically: determining the size of an area containing the common features in the first picture, and determining the size of a first mask according to the size of the area containing the common features; and determining the average brightness of the first picture, and determining the brightness of the first mask according to the average brightness of the first picture.
Optionally, the picture processing apparatus 600 may include a detection unit 605;
the detecting unit 605 is configured to detect whether the first picture is in a preview mode or a zoom-out display mode before the image processing unit 604 marks the common feature in the first picture with the first mask;
the image processing unit 604 is further configured to mark the common feature in the first picture with the first mask if the first picture is in a preview mode or a reduced display mode.
Optionally, the image processing unit 604 is further configured to, after marking the common feature in the first picture with the first mask, cancel the marking of the first mask when the first picture is in a full-screen display mode or an enlarged display mode.
Optionally, the second determining unit 603 is further configured to determine, after determining a region in a first picture that includes the common feature, a common feature area in the first picture that includes the common feature, and determine whether the first picture includes another feature that is the same as the common feature category; determining the first picture as a mask enabled picture, and determining a first mask corresponding to a region of the first picture containing the common feature, if the proportion of the common feature area to the area of the first picture is less than a first threshold and the first picture contains other features of the same category as the common feature.
Optionally, the second determining unit 603 is further configured to determine that the first picture is a mask closed picture when the first picture does not include other features that are the same as the common feature categories.
Optionally, the second determining unit 603 is further configured to determine, when the first picture includes other features that are the same as the common feature category, the other-feature area of each of the other features;
the second determining unit 603 is further configured to determine that the first picture is a mask enabled picture if a target other feature whose area is larger than the first threshold exists among the other features, and determine the first mask corresponding to the region containing the common feature in the first picture.
The second determining unit 603 is further configured to determine that the first picture is a mask closed picture when no target other feature whose area is larger than the first threshold exists among the other features.
The first determining unit 601, the acquiring unit 602, the second determining unit 603, the image processing unit 604 and the detecting unit 605 may be processors of a mobile terminal.
In the embodiment of the application, in the browsing process of the atlas, the common features in the pictures in the currently browsed target atlas are marked by using the mask, and the common features of the pictures in the atlas can be displayed in a distinguishing manner, so that the display effect of the pictures in the atlas is improved.
Referring to fig. 7, fig. 7 is a schematic structural diagram of a mobile terminal according to an embodiment of the present disclosure. As shown in fig. 7, the mobile terminal 700 includes a processor 701 and a memory 702, and the processor 701 and the memory 702 may be connected to each other through a communication bus 703. The communication bus 703 may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus 703 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in fig. 7, but this is not intended to represent only one bus or one type of bus. The memory 702 is used for storing a computer program comprising program instructions, and the processor 701 is configured to invoke the program instructions, the program comprising instructions for performing the method shown in fig. 1, fig. 3 or fig. 4.
The processor 701 may be a general purpose Central Processing Unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of programs according to the above schemes.
The memory 702 may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a random access memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be self-contained and coupled to the processor via a bus. The memory may also be integral to the processor.
The mobile terminal 700 may also include a camera 704 and a display 705. The cameras 704 may include front-facing cameras, rear-facing cameras, and the like. The display 705 may include a liquid crystal display, an LED display, an OLED display, or other touch display.
In addition, the mobile terminal 700 may also include general-purpose components such as a communication interface, an antenna, and the like, which will not be described in detail herein.
In the embodiment of the application, in the browsing process of the atlas, the common features in the pictures in the currently browsed target atlas are marked by using the mask, and the common features of the pictures in the atlas can be displayed in a distinguishing manner, so that the display effect of the pictures in the atlas is improved.
Embodiments of the present application also provide a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, and the computer program enables a computer to execute part or all of the steps of any one of the image processing methods as described in the above method embodiments.
Embodiments of the present application further provide a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program causes a computer to execute some or all of the steps of any one of the image processing methods described in the above method embodiments.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative; the division of the units is only a division of logical functions, and there may be other divisions in actual implementation, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software program module.
The integrated units, if implemented in the form of software program modules and sold or used as stand-alone products, may be stored in a computer readable memory. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a memory and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash disk, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, which may include: flash memory disks, read-only memory, random access memory, magnetic or optical disks, and the like.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (10)

1. An image processing method, comprising:
in the process of browsing an atlas, determining a currently browsed target atlas and acquiring common features in the target atlas;
determining a region containing the common feature in a first picture, and determining a first mask corresponding to the region containing the common feature in the first picture, wherein the first picture is any one picture contained in the target atlas;
marking the common features in the first picture with the first mask.
2. The method of claim 1, wherein the determining a first mask corresponding to an area of the first picture containing the common feature comprises:
determining the size of an area containing the common features in the first picture, and determining the size of a first mask according to the size of the area containing the common features;
and determining the average brightness of the first picture, and determining the brightness of the first mask according to the average brightness of the first picture.
3. The method of claim 1 or 2, wherein prior to said marking said common feature in said first picture with said first mask, said method further comprises:
detecting whether the first picture is in a preview mode or a reduced display mode;
if yes, the step of marking the common features in the first picture by the first mask is executed.
4. The method of claim 3, wherein after the marking the common feature in the first picture with the first mask, the method further comprises:
and if the first picture is in a full-screen display mode or an enlarged display mode, canceling the mark of the first mask.
5. The method according to any of claims 1 to 4, wherein after determining the region of the first picture containing the common feature, the method further comprises:
determining a common feature area in the first picture containing the common feature, and determining whether the first picture contains other features which are the same as the common feature category;
determining the first picture as a mask enabled picture if the proportion of the common feature area to the area of the first picture is less than a first threshold and the first picture contains other features that are the same as the common feature category, and performing the step of determining the first mask corresponding to the region containing the common feature in the first picture.
6. The method of claim 5, further comprising:
determining that the first picture is a mask closed picture if the first picture does not include other features that are the same as the common feature class.
7. The method of claim 5, further comprising:
determining an other-feature area of each of the other features that are the same as the common feature category, if the first picture contains other features that are the same as the common feature category;
and when a target other feature whose area is larger than the first threshold exists among the other features, determining that the first picture is a mask enabled picture, and executing the step of determining the first mask corresponding to the region containing the common feature in the first picture.
8. A picture processing apparatus, comprising:
the first determining unit is used for determining a currently browsed target atlas in the process of browsing the atlas;
an acquisition unit configured to acquire a common feature in the target map set;
a second determining unit, configured to determine a region of a first picture that includes the common feature, and determine a first mask corresponding to the region of the first picture that includes the common feature, where the first picture is any one picture included in the target atlas;
an image processing unit for marking the common features in the first picture with the first mask.
9. A mobile terminal comprising a processor and a memory, the memory for storing a computer program comprising program instructions, the processor being configured to invoke the program instructions to perform the method of any of claims 1 to 7.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program comprising program instructions which, when executed by a processor, cause the processor to carry out the method according to any one of claims 1 to 7.
CN202010018762.XA 2020-01-08 2020-01-08 Picture processing method and device, mobile terminal and storage medium Pending CN111221999A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010018762.XA CN111221999A (en) 2020-01-08 2020-01-08 Picture processing method and device, mobile terminal and storage medium

Publications (1)

Publication Number Publication Date
CN111221999A true CN111221999A (en) 2020-06-02

Family

ID=70829381

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010018762.XA Pending CN111221999A (en) 2020-01-08 2020-01-08 Picture processing method and device, mobile terminal and storage medium

Country Status (1)

Country Link
CN (1) CN111221999A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030020762A1 (en) * 2001-07-27 2003-01-30 Budrys Audrius J. Multi-component iconic representation of file characteristics
US7970240B1 (en) * 2001-12-17 2011-06-28 Google Inc. Method and apparatus for archiving and visualizing digital images
CN108038431A (en) * 2017-11-30 2018-05-15 广东欧珀移动通信有限公司 Image processing method, device, computer equipment and computer-readable recording medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination