CN117992628A - Image display control method and device and electronic equipment


Info

Publication number: CN117992628A
Application number: CN202311429184.9A
Authority: CN (China)
Prior art keywords: image, input, images, image feature, sub
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 陈虎
Current Assignee: Vivo Mobile Communication Co Ltd
Original Assignee: Vivo Mobile Communication Co Ltd
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202311429184.9A
Publication of CN117992628A

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an image display control method, an image display control apparatus and an electronic device, and belongs to the technical field of image processing. The method comprises the following steps: displaying a program interface of an album program, wherein the program interface comprises at least one folder icon, the at least one folder icon is obtained by classifying images according to first-type image features indicated by first image feature labels, and album folders corresponding to different folder icons each comprise at least one image with different first-type image features; receiving a first input of a user, wherein the first input is used for selecting a folder icon to be reclassified from the at least one folder icon and selecting N image feature labels, the album folder corresponding to the folder icon selected by the first input comprises at least two images, and N is a positive integer; and in response to the first input, displaying images obtained by classifying the at least two images according to the image features indicated by the N image feature labels.

Description

Image display control method and device and electronic equipment
Technical Field
The application belongs to the technical field of image processing, and in particular relates to an image display control method, an image display control apparatus and an electronic device.
Background
With the popularization of electronic devices, electronic devices are used more and more widely; for example, people use electronic devices to capture images and store the images in an album for subsequent viewing. However, as the number of images in the album increases, searching for a particular image becomes cumbersome, and the efficiency with which users find images is low.
Disclosure of Invention
The embodiments of the application aim to provide an image display control method, an image display control apparatus and an electronic device, which can classify and display images more accurately through a rich image classification manner, help a user find a desired image more quickly, allow the desired image to be quickly screened out from at least two images, and thereby improve image searching efficiency.
In a first aspect, an embodiment of the present application provides an image display control method, including:
Displaying a program interface of an album program, wherein the program interface comprises at least one folder icon, the at least one folder icon is obtained by classifying images according to first-type image features indicated by first image feature labels, and album folders corresponding to different folder icons comprise at least one image with different first-type image features;
Receiving a first input of a user, wherein the first input is used for selecting a folder icon to be reclassified from the at least one folder icon and selecting N image feature labels, an album folder corresponding to the folder icon selected by the first input comprises at least two images, and N is a positive integer;
And in response to the first input, displaying images obtained by classifying the at least two images according to the image features indicated by the N image feature labels.
In a second aspect, an embodiment of the present application provides an image classification display control apparatus, including:
The display module is used for displaying a program interface of the album program, the program interface comprises at least one folder icon, the at least one folder icon is obtained by classifying images according to first type image features indicated by the first image feature labels, and at least one image with different first type image features is included in the album folders corresponding to different folder icons;
The receiving module is used for receiving a first input of a user, wherein the first input is used for selecting a folder icon to be reclassified from the at least one folder icon and selecting N image feature labels, an album folder corresponding to the folder icon selected by the first input comprises at least two images, and N is a positive integer;
the display module is further used for responding to the first input, and displaying images obtained by classifying the images of the at least two images according to the image features indicated by the N image feature labels.
In a third aspect, an embodiment of the present application provides an electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the method as described in the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which when executed by a processor perform the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and where the processor is configured to execute a program or instructions to implement a method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a program product stored in a storage medium, the program product being executed by at least one processor to implement the method according to the first aspect.
In the embodiments of the application, a program interface of an album program is displayed, the program interface comprises at least one folder icon, the at least one folder icon is obtained by classifying images according to the first-type image features indicated by the first image feature labels, and album folders corresponding to different folder icons comprise at least one image with different first-type image features; a first input of a user is received, wherein the first input is used for selecting a folder icon to be reclassified from the at least one folder icon and selecting N image feature labels, the album folder corresponding to the folder icon selected by the first input comprises at least two images, and N is a positive integer; and in response to the first input, images obtained by classifying the at least two images according to the image features indicated by the N image feature labels are displayed. In this way, by selecting one folder icon to be reclassified from the at least one folder icon and selecting N image feature labels, the at least two images in the album folder of the selected folder icon can be reclassified and the classified images can be displayed. Since the classification dimensions of the image features indicated by the N image feature labels are taken into account, images can be classified and displayed more accurately through a rich image classification manner, so that the user can find the desired image more quickly and quickly screen the required image out of the at least two images, thereby improving image searching efficiency.
Drawings
FIG. 1 is a flow chart of an image display control method provided by some embodiments of the present application;
FIG. 2A is a schematic diagram of an electronic device display screen interface provided by some embodiments of the application;
FIG. 2B is a schematic diagram of an electronic device display screen interface provided by some embodiments of the application;
FIG. 2C is a schematic diagram of an electronic device display screen interface provided by some embodiments of the application;
FIG. 2D is a schematic diagram of an electronic device display screen interface provided by some embodiments of the application;
FIG. 2E is a schematic illustration of an electronic device display screen interface provided by some embodiments of the application;
FIG. 2F is a schematic diagram of an electronic device display screen interface provided by some embodiments of the application;
FIG. 2G is a schematic diagram of an electronic device display screen interface provided by some embodiments of the application;
FIG. 2H is a schematic illustration of an electronic device display screen interface provided by some embodiments of the application;
FIG. 2I is a schematic diagram of an electronic device display screen interface provided by some embodiments of the application;
FIG. 3 is a block diagram of an image classification display control apparatus provided by some embodiments of the present application;
FIG. 4 is a block diagram of an electronic device provided by some embodiments of the application;
Fig. 5 is a block diagram of an electronic device provided by some embodiments of the application.
Detailed Description
The technical solutions of the embodiments of the present application will be clearly described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which are obtained by a person skilled in the art based on the embodiments of the present application, fall within the scope of protection of the present application.
The terms "first", "second" and the like in the description and the claims are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the terms so used may be interchanged where appropriate, so that the embodiments of the present application can be implemented in orders other than those illustrated or described herein. Objects identified by "first", "second", etc. are generally of one type, and the number of objects is not limited; for example, the first object may be one or more objects. Furthermore, in the description and the claims, "and/or" means at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
An image feature tag may be understood as a feature tag of an image, for example, a photographing time, a photographing place, and the like of the image.
It should be noted that, in the image display control method provided by the embodiments of the present application, the execution body may be an electronic device such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer or a vehicle-mounted electronic device. In some embodiments of the present application, the image display control method provided by the embodiments of the application is described with an electronic device as the execution body.
The image display control method provided by the embodiments of the application can be applied to a scenario in which a user of an electronic device searches for an image in an album application installed on the electronic device. In one specific application scenario, the album application contains 1000 images captured by the user with the camera of the electronic device; the user wants to find, among the 1000 images, one image shot at location B, but does not remember when that image was taken. The image display control method provided by some embodiments of the present application is described in detail below with reference to the accompanying drawings, through specific embodiments and in combination with the above application scenario.
Fig. 1 is a flowchart of an image display control method according to some embodiments of the present application, where the image display control method is applied to an electronic device, and includes the following steps:
Step 101, displaying a program interface of an album program, wherein the program interface comprises at least one folder icon, the at least one folder icon is obtained by classifying images according to first type image features indicated by first image feature labels, and at least one image with different first type image features is included in an album folder corresponding to different folder icons. The first type of image features in the embodiment of the present application include common image features, such as a shooting location, shooting time, or a shooting object. The second type of image features in the embodiment of the application are different from common image features, and specifically may include at least one of the following: weather, season, topography, temperature, mood, color, event.
The program interface comprises at least one folder icon, the images in the folder corresponding to each folder icon are obtained by classifying images according to a first image feature label, and the images in the album folders corresponding to different folder icons are classified according to different image feature labels. For example, as shown in fig. 2A, the program interface includes three folder icons. Folder icon 1 is obtained by classifying the 1000 images in the album program by shooting location; the folder corresponding to folder icon 1 includes 700 images, and folder icon 1 may be a thumbnail of any one of the 700 images. The 700 images all carry the image feature of shooting location. The shooting locations of the 700 images may all be the same, for example, all shot at location B; or the shooting locations may differ, for example, 300 images shot at location B, 300 images shot at location C and 100 images shot at location A. When there are multiple shooting locations, the 700 images can be displayed in groups, where images in the same group have the same shooting location and images in different groups have different shooting locations.
Folder icon 2 is obtained by classifying the 1000 images in the album program by shooting time; the folder corresponding to folder icon 2 includes 800 images, and folder icon 2 may be a thumbnail of any one of the 800 images. The 800 images all carry the image feature of shooting time. The shooting times of the 800 images may all be the same, for example, all shot in January 2023; or the shooting times may differ, for example, 300 of the 800 images shot in January 2023, 300 shot in February 2023 and 200 shot in March 2023. When the 800 images have multiple shooting times, they may be grouped by day, where images shot on the same day belong to the same group and images in different groups have different shooting times.
Folder icon 3 is obtained by classifying the 1000 images in the album program by weather; the folder corresponding to folder icon 3 includes 600 images, and folder icon 3 may be a thumbnail of any one of the 600 images. The 600 images all carry the image feature of weather. The shooting weather of the 600 images may all be the same, for example, all cloudy; or the shooting weather may differ, for example, including cloudy, sunny and rainy days. When the 600 images are displayed, they are displayed in groups according to shooting weather: the cloudy-day images form one group, the sunny-day images form one group, and the rainy-day images form one group.
For another example, as shown in fig. 2B, the program interface includes three folder icons. Folder icon 4 is obtained by classifying the 1000 images in the album program with the label "shooting weather is sunny"; the folder corresponding to folder icon 4 includes 200 images, folder icon 4 may be a thumbnail of any one of the 200 images, and the 200 images all carry the sunny-day image feature. Folder icon 5 is obtained by classifying the 1000 images in the album program with the label "shooting weather is rainy"; the folder corresponding to folder icon 5 includes 300 images, folder icon 5 may be a thumbnail of any one of the 300 images, and the 300 images all carry the rainy-day image feature. Folder icon 6 is obtained by classifying the 1000 images in the album program with the label "shooting weather is cloudy"; the folder corresponding to folder icon 6 includes 100 images, folder icon 6 may be a thumbnail of any one of the 100 images, and the 100 images all carry the cloudy-day image feature.
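For readers implementing this kind of first-level grouping, the following is a minimal Python sketch of how images carrying a given first-type feature (for example, shooting location) could be collected into album folders, one per feature value, with the folder icon taken as the thumbnail of any image in the folder. The names (`Image`, `group_into_folders`) and the in-memory feature dictionary are illustrative assumptions, not part of the patent.

```python
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class Image:
    name: str
    # Feature values recorded for this image, e.g. {"location": "B", "weather": "sunny"}.
    features: dict = field(default_factory=dict)

def group_into_folders(images, first_tag):
    """Group images that carry the feature named by `first_tag` into one album folder,
    sub-grouped by the concrete feature value (e.g. shooting location B, C, A)."""
    folder = defaultdict(list)
    for img in images:
        if first_tag in img.features:
            folder[img.features[first_tag]].append(img)
    return folder

# Example: classify by shooting location; the folder icon could be the thumbnail
# of any image in the folder (here simply the first image's name).
images = [
    Image("IMG_001", {"location": "B", "weather": "sunny"}),
    Image("IMG_002", {"location": "C"}),
    Image("IMG_003", {"location": "B", "weather": "rainy"}),
]
location_folder = group_into_folders(images, "location")
for place, group in location_folder.items():
    print(place, [img.name for img in group], "icon:", group[0].name)
```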
Step 102, receiving a first input of a user, where the first input is used to select a folder icon to be reclassified from the at least one folder icon and select N image feature labels, and an album folder corresponding to the folder icon selected by the first input includes at least two images, where N is a positive integer.
In some embodiments of the application, the first input may be a first operation. Illustratively, the first input includes, but is not limited to: a touch input on the folder icons performed by the user through a touch device such as a finger or a stylus, a voice instruction input by the user, a specific gesture input by the user, or another feasible input, which may be determined according to actual use requirements and is not limited in the embodiments of the application. The specific gesture in the embodiments of the application may be any one of a single-tap gesture, a sliding gesture, a dragging gesture, a pressure-recognition gesture, a long-press gesture, an area-change gesture, a double-press gesture and a double-tap gesture; the tap input in the embodiments of the application may be a single tap, a double tap or any number of taps, and may also be a long-press input or a short-press input.
For example, the first input includes two sub-inputs, the first sub-input for selecting one folder icon and the second sub-input for selecting N image feature labels.
And step 103, displaying images obtained by performing image classification on the at least two images according to the image features indicated by the N image feature labels in response to the first input.
In this step, the images in the selected folder are classified according to the image features indicated by the N image feature tags, and the classified images are displayed.
In this embodiment, by selecting one folder icon to be reclassified from the at least one folder icon and selecting N image feature labels, the at least two images in the album folder of the folder icon selected by the first input can be classified and the classified images can be displayed. Since the classification dimensions of the image features indicated by the N image feature labels are taken into account, images can be classified and displayed more accurately through a rich image classification manner, so that the user can find the desired image more quickly and conveniently screen the image to be searched out of the at least two images, thereby improving image searching efficiency.
In some embodiments of the application, N=1, and the N image feature labels include a second image feature label; the second image feature tag indicates a second type of image feature, the second type of image feature is associated with at least two second type of sub-tags, and the second type of sub-tag indicates one sub-classification feature of the second type of image feature;
the step of responding to the first input, displaying an image obtained by classifying the at least two images according to the image characteristics indicated by the N image characteristic labels, comprises the following steps:
Responding to the first input, and carrying out image classification on the at least two images according to the sub-classification features indicated by the at least two second class sub-labels to obtain at least two groups of images, wherein one group of images has one sub-classification feature indicated by the second class sub-label;
And displaying the at least two groups of images and the at least two second class sub-labels in a partition mode.
In the embodiment of the present application, N=1, that is, the images in the selected folder are classified according to the image feature indicated by a single image feature label. For example, the second-type image feature may be weather, and the at least two second-type sub-labels may be sunny, rainy, cloudy and overcast.
At least two images are classified according to each second-type sub-label to obtain a group of images corresponding to each second-type sub-label. When each group of images is displayed, the groups can be displayed in partitions, for example, arranged in sequence from top to bottom or from left to right, and the corresponding second-type sub-label is displayed in the display area of each group. As shown in fig. 2C, the second-type sub-labels include sunny day 11, rainy day 12, cloudy day 13 and overcast day 14; image 21 and image 22 are the group of images corresponding to sunny day 11; image 23 and image 24 are the group of images corresponding to rainy day 12; image 25 and image 26 are the group of images corresponding to cloudy day 13; and image 27, image 28 and image 29 are the group of images corresponding to overcast day 14.
In the embodiments of the application, the at least two images are classified according to each of the plurality of second-type sub-labels, so that multiple groups of images displayed in groups together with their corresponding second-type sub-labels are obtained, which makes it convenient for the user to view the images and improves image searching efficiency.
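A possible realization of this N = 1 grouping, sketched in Python under the assumption that each image exposes its recorded second-type feature value as a plain dictionary entry; the function and field names are hypothetical.

```python
def partition_by_sublabels(images, feature, sublabels):
    """Split the images of the selected folder into one group per second-type
    sub-label (e.g. weather: sunny / rainy / cloudy / overcast)."""
    groups = {sub: [] for sub in sublabels}
    for img in images:
        value = img.get(feature)
        if value in groups:
            groups[value].append(img["name"])
    return groups

folder_images = [
    {"name": "image 21", "weather": "sunny"},
    {"name": "image 23", "weather": "rainy"},
    {"name": "image 25", "weather": "cloudy"},
    {"name": "image 27", "weather": "overcast"},
]
for sublabel, names in partition_by_sublabels(
        folder_images, "weather", ["sunny", "rainy", "cloudy", "overcast"]).items():
    # Each group is displayed in its own partition, headed by its sub-label.
    print(sublabel, names)
```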
In some embodiments of the application, the first image feature tag is a shooting location, the first input comprises a first press input on one of the at least one folder icon and a selection input to select a second image feature tag, the selection input comprising a slide input to slide from a press location of the press input to the second image feature tag and a second press input on the second image feature tag;
In the process of receiving the first input, the method further comprises:
Displaying at least two image feature labels on the folder icon selected by the first input under the condition that the pressing time length of the first pressing input is larger than a first time length threshold value;
receiving a sliding input from a user sliding from a pressing position of the pressing input to the second image feature label and a second pressing input on the second image feature label;
The displaying of the image obtained by classifying the at least two images according to the image features indicated by the N image feature labels comprises the following steps:
Displaying a map interface under the condition that the pressing time length of the second pressing input is larger than a second time length threshold, wherein at least two map areas associated with the position of one place on the map interface comprise at least two thumbnails;
The location is a shooting location of an image in an album folder indicated by the folder icon selected by the first input; and the at least two thumbnails are obtained by classifying all images in the album folder according to the second-class image features indicated by the second-class image feature tags, and one thumbnail corresponds to a group of images with the sub-classification features indicated by the second-class sub-tags.
In the foregoing, the first duration threshold and the second duration threshold may be set according to actual situations, and the two thresholds may be the same or different, which is not limited herein. For example, the first duration threshold may take a value in the range [0.5 s, 3 s], that is, any value between 0.5 seconds and 3 seconds, such as 0.5 seconds, 0.6 seconds, 1 second, 2 seconds, 2.5 seconds or 3 seconds; the second duration threshold may likewise take a value of 0.5 seconds to 3 seconds.
In the program interface shown in fig. 2D, folder icon 31, folder icon 32 and folder icon 33 are displayed, which are obtained by classifying the images in the album according to shooting location. The images in the folder corresponding to folder icon 31 are grouped by shooting location B, and folder icon 31 may be a thumbnail of any one of the grouped images. The user presses folder icon 31 for 2 seconds; after the press, as shown in fig. 2E, four image feature labels indicating second-type image features are displayed on the folder icon for location B, namely: a "topography" label 41, a "season" label 42, a "temperature" label 43 and a "weather" label 44. The "topography" label 41 indicates the image feature of the topography of the shooting location; the "season" label 42 indicates the image feature of the season at the time of shooting; the "temperature" label 43 indicates the image feature of the temperature at the time of shooting; and the "weather" label 44 indicates the image feature of the weather at the time of shooting.
After the at least two image feature labels are displayed, the user continues to slide from the press position to the second image feature label and presses it for longer than the second duration threshold, and the map interface is displayed. If the second image feature label is the season label, the group images corresponding to the four second-type sub-labels spring, summer, autumn and winter are displayed in the area associated with the position of location B on the map interface. If a group contains multiple images, a thumbnail of any one image in the group may be displayed when that group is displayed. The user can tap the thumbnail of any group to view all the images included in that group, which makes it convenient for the user to search for images and improves searching efficiency.
In this embodiment, a group of images corresponding to each second type of sub-label is displayed on the map interface, so that a user can intuitively view the image of the shooting location, and interestingness is increased.
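The press-duration logic described above could be approximated as follows; this is only a rough sketch that compares how long the finger stayed down against the two duration thresholds, with hypothetical stage names and a fixed 2-second threshold chosen from the allowed range.

```python
import time

FIRST_DURATION_THRESHOLD = 2.0   # seconds; any value in roughly [0.5 s, 3 s]
SECOND_DURATION_THRESHOLD = 2.0

def handle_first_input(press_started_at, released_at, stage):
    """Very rough event-handling sketch: a long press on a folder icon reveals the
    second-type feature labels; a long press on a feature label switches to the map view."""
    held = released_at - press_started_at
    if stage == "folder_icon" and held > FIRST_DURATION_THRESHOLD:
        return "show_feature_labels"   # e.g. topography / season / temperature / weather
    if stage == "feature_label" and held > SECOND_DURATION_THRESHOLD:
        return "show_map_interface"    # grouped thumbnails near the shooting location
    return "ignore"

t0 = time.monotonic()
# ... the user keeps the finger pressed on the folder icon for 2.5 seconds ...
t1 = t0 + 2.5
print(handle_first_input(t0, t1, "folder_icon"))  # -> show_feature_labels
```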
In some embodiments of the application, after the displaying the map interface, the method further comprises:
Receiving a second input from the user;
And in response to the second input, updating each thumbnail according to an image updating interval, so that each map area associated with the position of the place sequentially displays images in each group of images.
In some embodiments of the present application, the second input is used to control the electronic device to display a plurality of images related to a specific location in a carousel manner, and the second input may be a second operation. Illustratively, the second input includes, but is not limited to: a touch input performed by the user through a touch device such as a finger or a stylus, a voice instruction input by the user, a specific gesture input by the user, or another feasible input, which may be determined according to actual use requirements and is not limited in the embodiments of the application. The specific gesture in the embodiments of the application may be any one of a single-tap gesture, a sliding gesture, a dragging gesture, a pressure-recognition gesture, a long-press gesture, an area-change gesture, a double-press gesture and a double-tap gesture; the tap input in the embodiments of the application may be a single tap, a double tap or any number of taps, and may also be a long-press input or a short-press input.
For example, the second input may be a long press on the position of location B on the map interface. When the second input is received, each thumbnail is updated according to the image update interval; the update interval may range from 0.5 seconds to 1 second, which is not limited herein. Updating each thumbnail means that the currently displayed thumbnail is replaced in turn by thumbnails of the other images in the corresponding group; that is, the thumbnails corresponding to spring, summer, autumn and winter are updated at the same time, and the next image of the corresponding season is played in turn, so that a carousel effect is achieved. Through the second input, the user can quickly browse multiple images of different seasons shot at a certain location.
The update interval may be related to the pressing force of the finger performing the second input: as the pressing force increases, the update interval becomes shorter and the carousel speed of the image thumbnails for each season becomes faster; as the pressing force decreases, the update interval becomes longer and the carousel slows down. For example, when the pressing force is 0.3 newton (0.3 N), the update interval is 1 second; when the pressing force is 0.8 newton, the update interval is 0.5 seconds.
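The relation between pressing force and update interval is only specified by the two example points (0.3 N maps to 1 second, 0.8 N to 0.5 seconds); the sketch below assumes a simple linear interpolation between them, which is one possible choice rather than the patent's prescribed formula.

```python
def carousel_interval(press_force_newton, f_min=0.3, f_max=0.8,
                      slow_interval=1.0, fast_interval=0.5):
    """Map the pressing force of the second input to a thumbnail update interval:
    harder press -> shorter interval -> faster carousel (example points:
    0.3 N -> 1.0 s, 0.8 N -> 0.5 s), linearly interpolated in between."""
    force = min(max(press_force_newton, f_min), f_max)
    ratio = (force - f_min) / (f_max - f_min)
    return slow_interval + ratio * (fast_interval - slow_interval)

for force in (0.3, 0.55, 0.8):
    print(force, "N ->", round(carousel_interval(force), 2), "s")
```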
Furthermore, the process in which each thumbnail is replaced in turn by the other image thumbnails in the corresponding group can be recorded, so that a shareable video of the four-season images shot at that location is obtained.
In this way, the matching images at a certain position on the map interface can be viewed intuitively and controlled as a carousel, which increases the interest of viewing images; by pressing a certain position on the map interface, the user can quickly view the images shot at that position, which makes the interactive operation of image display control convenient for the user.
In some embodiments of the application, the receiving the second input includes:
and receiving touch input of a user on the position of the place on the map interface.
For example, the touch input may be an input for pressing the location of the B location on the map interface for a long time, so as to trigger the electronic device to update each thumbnail respectively according to the image update interval, so that the matching image at a certain location on the map interface can be intuitively viewed, and the interest is increased.
In some embodiments of the application, prior to said receiving the second input of the user, the method further comprises:
Receiving a third input from the user;
in response to the third input, establishing an association relationship between an album pendant and the place;
the receiving a second input from the user, comprising:
And receiving a touch input of the user on the album pendant.
In this embodiment, the third input is used to establish an association relationship between an album pendant and the place. For example, the association relationship may be recorded in the form of a mapping table: a correspondence between location B and an album pendant recorded in a preset mapping table means that an association relationship has been established between location B and that album pendant, and the third input may be an input of filling the place name and the album pendant name into the mapping table. After the association relationship is established, all operations on the album pendant are identical to the operations on the place on the map interface. If association relationships between album pendants and places have been established, the mapping table is also queried when the map interface is displayed and the album pendants recorded in the mapping table are displayed at the same time; the user can then trigger the electronic device to update each thumbnail according to the image update interval through a touch input, for example a long press on the album pendant, so that the matching images at a certain position on the map interface can be viewed intuitively, which increases the interest of viewing images.
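A minimal sketch of the mapping-table idea, assuming the table is an in-memory dictionary keyed by pendant name; the function names and the returned strings are placeholders for the real carousel-triggering logic.

```python
# A minimal stand-in for the preset mapping table between album pendants and places;
# in practice this would likely be persisted storage rather than an in-memory dict.
pendant_to_place = {}

def handle_third_input(pendant_name, place_name):
    """Third input: associate an album pendant with a place."""
    pendant_to_place[pendant_name] = place_name

def handle_pendant_touch(pendant_name):
    """Touching the pendant behaves like touching the associated place on the map
    interface, e.g. starting the thumbnail carousel for that place."""
    place = pendant_to_place.get(pendant_name)
    if place is None:
        return "no associated place"
    return f"start carousel for images shot at {place}"

handle_third_input("travel pendant", "B")
print(handle_pendant_touch("travel pendant"))
```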
In some embodiments of the present application, the album folder corresponding to the folder icon selected by the first input includes E images, where F images in the E images record the second type of image features in the image capturing process, and G images in the E images do not record the second type of image features in the image capturing process; e is an integer greater than 1, F is a positive integer, F < E, G=E-F;
The displaying of the image obtained by classifying the at least two images according to the image features indicated by the N image feature labels comprises the following steps:
and updating the display order of the F images and the G images in the folder so that the F images are arranged before the G images.
In the above, when an image is photographed, information of a photographing place, photographing time, weather, temperature, topography of the photographing place, and the like may be recorded, and the information may be stored as image characteristics in attribute information of the image. The user may select the image features to be recorded or may employ default image features. The image features corresponding to different images may be the same or different.
If the user selects E images in the folder, only F images in the E images record the second type image features, and G images do not record the second type image features, when the classified images are displayed, the display sequence of the F images is prior to the display sequence of the G images.
For example, if the second-type image feature is sunny weather, the folder selected by the user includes 20 images, 5 of which carry the sunny-day image feature and 15 of which do not, then the 5 images with the sunny-day image feature are displayed first, followed by the 15 images without it.
For example, when the images are displayed in a stack, the 5 images are located above the 15 images; when the images are tiled, the 5 images are tiled in turn from left to right according to the user's habit of viewing images, and after the last of the 5 images, the 15 images are tiled in turn.
The order of the 5 images may be random, may be ordered according to the shooting time, or may be ordered according to the data size of the images, which is not limited herein.
In this embodiment, arranging the F images before the G images makes it convenient for the user to view the F images first, which improves image searching efficiency.
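The reordering can be expressed as a stable sort on whether the second-type feature was recorded, as in the following sketch (the dictionary-based image representation is an assumption for illustration).

```python
def order_by_recorded_feature(images, feature):
    """Put the F images whose attribute information records the second-type
    feature before the G images that do not record it; within each part the
    existing order is kept (Python's sort is stable)."""
    return sorted(images, key=lambda img: feature not in img)

folder = [
    {"name": "IMG_01"},                      # no weather recorded
    {"name": "IMG_02", "weather": "sunny"},
    {"name": "IMG_03"},
    {"name": "IMG_04", "weather": "sunny"},
]
print([img["name"] for img in order_by_recorded_feature(folder, "weather")])
# -> ['IMG_02', 'IMG_04', 'IMG_01', 'IMG_03']
```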
In some embodiments of the application, N > 1; the N image feature labels comprise a third image feature label and a fourth image feature label; the third image feature tag indicates a third type of image feature, the third type of image feature is associated with at least two third type of sub-tags, and the third type of sub-tag indicates one sub-classification feature of the third type of image feature; the fourth image feature tag indicates a fourth type of image feature, the fourth type of image feature is associated with at least two fourth type of sub-tags, and the fourth type of sub-tag indicates one sub-classification feature of the fourth type of image feature;
the step of responding to the first input, displaying an image obtained by classifying the at least two images according to the image characteristics indicated by the N image characteristic labels, comprises the following steps:
Responding to the first input, and carrying out image classification on the at least two images according to the sub-classification features indicated by the at least two third-class sub-labels to obtain M groups of images, wherein one group of images has one sub-classification feature indicated by the third-class sub-label, and M is a positive integer;
According to the sub-classification features indicated by the at least two fourth-class sub-labels, carrying out image classification on the at least two images to obtain K groups of images, wherein one group of images has one sub-classification feature indicated by the fourth-class sub-label, and K is a positive integer;
And displaying the M groups of images, the at least two third type sub-labels, the K groups of images and the at least two fourth type sub-labels in a partitioning manner in sequence according to the selection sequence of the third image feature labels and the fourth image feature labels.
In this embodiment, the user may select the third image tag and the fourth image tag simultaneously through the first input, in which case the first input includes a first pressing input on one of the at least one folder icon, a first selection input selecting the third image feature tag, and a second selection input selecting the fourth image feature tag in sequence;
The first selection input includes a sliding input that slides from a pressed position of the pressing input to the third image feature label and a third pressing input on the third image feature label;
the second selection input includes a sliding input that slides from a pressed position of the third pressing input to the fourth image feature label and a fourth pressing input on the fourth image feature label.
In the program interface shown in fig. 2D, folder icon 31, folder icon 32 and folder icon 33 are displayed, which are obtained by classifying 700 images in the album by shooting location. All of the 700 images carry the image feature of shooting location, but there are multiple shooting locations, namely location B, location C and location A. The images in the folder corresponding to folder icon 31 are obtained by grouping the 700 images whose shooting location is B and include 100 images, whose image features include: the shooting location is B. The images in the folder corresponding to folder icon 32 are obtained by grouping the 700 images whose shooting location is C and include 300 images, whose image features include: the shooting location is C. The images in the folder corresponding to folder icon 33 are obtained by grouping the 700 images whose shooting location is A and include 300 images, whose image features include: the shooting location is A. The user presses folder icon 31 for 2 seconds; folder icon 31 may be a thumbnail of any one of the grouped images. After the press, as shown in fig. 2E, four image feature labels indicating second-type image features are displayed on the folder icon for location B, namely: a "topography" label 41, a "season" label 42, a "temperature" label 43 and a "weather" label 44.
After displaying at least two image feature labels, the user continues to slide from the pressed position to the third image feature label, presses on the third image feature label, continues to slide to the fourth image feature label, and presses on the fourth image feature label.
The images in the folder are classified by using, in turn, the sub-labels corresponding to each selected image feature label, according to the order in which the image feature labels were selected with the first input. For example, the third image feature label is the season label, whose sub-labels are spring, summer, autumn and winter; the fourth image feature label is the weather label, whose sub-labels are sunny, rainy, snowy and typhoon. If the selected folder is the location-B folder, the 100 images are first classified with the season sub-labels spring, summer, autumn and winter to obtain four groups of images, and are then classified with the weather sub-labels sunny, rainy and snowy to obtain three groups of images.
When the group images are displayed, the M groups of images, the at least two third-class sub-labels, the K groups of images and the at least two fourth-class sub-labels can be sequentially displayed in a partitioning mode according to the selection sequence of the third image feature labels and the fourth image feature labels. For example, the group image corresponding to the third image feature tag and the group image corresponding to the fourth image feature tag are displayed sequentially from top to bottom, or the group image corresponding to the third image feature tag and the group image corresponding to the fourth image feature tag are displayed sequentially from left to right, and the corresponding sub-tag is displayed at the display area of each group image.
In this embodiment, the third image feature tag and the fourth image feature tag can be selected simultaneously through the first input, and the plurality of grouping images and the corresponding sub-tags are sequentially displayed, so that a user can conveniently check the grouping images, and the image searching efficiency is improved.
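A sketch of the sequential, selection-ordered partitioning for N > 1, again assuming dictionary-based images and hypothetical helper names; one partitioned section is produced per selected feature label, in the order the labels were chosen.

```python
def partition(images, feature, sublabels):
    groups = {sub: [] for sub in sublabels}
    for img in images:
        if img.get(feature) in groups:
            groups[img.get(feature)].append(img["name"])
    # Drop empty groups so only sub-labels that actually match are shown.
    return {sub: names for sub, names in groups.items() if names}

def display_in_selection_order(images, selected_labels):
    """Build one partitioned section per selected feature label, in the order the
    labels were chosen with the first input (third label first, then fourth)."""
    sections = []
    for feature, sublabels in selected_labels:
        sections.append((feature, partition(images, feature, sublabels)))
    return sections

folder_images = [
    {"name": "IMG_10", "season": "spring", "weather": "sunny"},
    {"name": "IMG_11", "season": "autumn", "weather": "rainy"},
    {"name": "IMG_12", "season": "winter", "weather": "snowy"},
]
selected = [("season", ["spring", "summer", "autumn", "winter"]),
            ("weather", ["sunny", "rainy", "snowy", "typhoon"])]
for feature, groups in display_in_selection_order(folder_images, selected):
    print(feature, groups)
```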
In some embodiments of the application, the method further comprises:
Receiving a fourth input from the user;
Determining at least one image feature to be recorded during capturing of an image in response to the fourth input;
Recording characteristic information of at least one image characteristic of an image in the process of shooting the image by a camera, and storing the characteristic information of the at least one image characteristic in attribute information of the image, wherein the image characteristic comprises at least one of the following: weather, season, topography, temperature, mood, color, event.
In the process of capturing an image, basic information of the image is generally recorded in its attribute information, for example: the specific time of capture, including the day of the week, the shooting date and the shooting time; the image file name sequence number, which shows how many images have been taken since the last formatting; the model of the shooting device, including the brand name and the product name; information on the lens used, including the focal length range, the maximum aperture and the lens product model; image file information, including the number of pixels, the resolution and the file size; image capture information, including the sensitivity, the focal length used, the exposure compensation, the actual aperture and the shutter speed; and image position information, including but not limited to the country, the city and the specific location. The feature information of the image features selected by the user may be recorded in the image file information.
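The following sketch shows one way the feature information selected with the fourth input could be written alongside the usual capture attributes. Writing a JSON sidecar file is an assumption made only to keep the example self-contained; an actual device would store the information in the image's Exif/attribute data as described above.

```python
import json
from datetime import datetime

def record_capture_features(image_path, selected_features, feature_values):
    """Store the feature information chosen with the fourth input alongside basic
    capture attributes. A JSON sidecar file stands in for the image's real
    Exif/attribute information in this sketch."""
    attributes = {
        "capture_time": datetime.now().isoformat(timespec="seconds"),
        "recorded_features": {k: feature_values[k]
                              for k in selected_features if k in feature_values},
    }
    with open(image_path + ".attrs.json", "w", encoding="utf-8") as f:
        json.dump(attributes, f, ensure_ascii=False, indent=2)
    return attributes

print(record_capture_features(
    "IMG_100.jpg",
    selected_features=["weather", "season"],
    feature_values={"weather": "sunny, the sky is very blue", "season": "autumn"}))
```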
The user can select any one of the image features such as weather, season, topography, temperature, mood, color and event as the basis for grouping images. If the user selects at least two image features, a corresponding number of labels are displayed. Image features include, but are not limited to, weather, season, topography, temperature, mood, color, event and any image feature specified by the user, for example the style of an image: if an image has a film-like filter, its style is film; if an image has no filter, its style is realistic. The user can set the image features to be recorded during image capture through the image feature setting interface of the camera program. Taking a mobile phone as an example, the image feature setting interface in fig. 2F includes four image feature options, "weather", "temperature", "topography" and "season", as well as a "custom image feature" control. The user can select at least one of the four image feature options as the image features to be recorded during image capture, and can also customize the image features to be recorded through the "custom image feature" control.
For example, when a mobile phone user wants to take a selfie in Paris, France, the user selects weather as the image feature to be recorded on the image feature setting interface of the camera program. After this setting, the user takes a selfie through the camera of the mobile phone; during shooting, the mobile phone records description information describing the weather conditions at the shooting place, that is, the weather in Paris on that day, for example "sunny, the sun is strong, the sky is very blue", and stores this weather information in the Exif (Exchangeable image file) information of the image. The mobile phone generates a weather label, namely a second image feature label, according to the weather image feature selected by the user.
In another example, a mobile phone user takes a selfie and selects season as the image feature to be recorded on the image feature setting interface of the camera program. After this setting, the user takes a selfie through the mobile phone camera; during shooting, the mobile phone records description information describing the season at the shooting place, that is, the season of the shooting day, for example "autumn, falling leaves drifting", and stores this season information in the Exif information of the image. The mobile phone generates a season label, namely a second image feature label, according to the season image feature selected by the user.
In another example, a mobile phone user takes a landscape photograph and selects topography as the image feature to be recorded on the image feature setting interface of the camera program. After this setting, the user shoots a landscape through the camera of the mobile phone; during shooting, the mobile phone records description information describing the topography of the shooting place, for example "hilly, beautiful scenery", and stores this topography information in the Exif information of the image. The mobile phone generates a topography label, namely a second image feature label, according to the topography image feature selected by the user.
In another example, a mobile phone user takes a landscape photograph and selects temperature as the image feature to be recorded on the image feature setting interface of the camera program. After this setting, the user shoots a landscape through the camera of the mobile phone; during shooting, the mobile phone records description information describing the temperature at the shooting place, for example "temperature 25 degrees, comfortable", and stores this temperature information in the Exif information of the image. The mobile phone generates a temperature label, namely a second image feature label, according to the temperature image feature selected by the user.
In another example, a mobile phone user takes a selfie and selects color as the image feature to be recorded on the image feature setting interface of the camera program. After this setting, the user takes a selfie through the camera of the mobile phone; during shooting, the mobile phone records description information describing the color of the main subject in the environment, that is, the color of the shooting place, for example "purple, lavender field", and stores this color information in the Exif information of the image. The mobile phone generates a color label, namely a second image feature label, according to the color image feature selected by the user.
In another example, a mobile phone user takes a selfie and selects event as the image feature to be recorded on the image feature setting interface of the camera program. After this setting, the user takes a selfie through the camera of the mobile phone; during shooting, the mobile phone records description information describing the background event of the shot, for example "company activity, outing", and stores this event information in the Exif information of the image. The mobile phone generates an event label, namely a second image feature label, according to the event image feature selected by the user.
Although location and time are the dimensions most commonly used to classify images, as the number of images in the album program grows, a user may not remember the shooting time or location of an image but may still remember the weather, season, topography, temperature, mood, color or event at the time of shooting; classifying images by these second-type image features therefore helps the user find images.
The user may also customize the image features to be recorded. For example, during a trip the user wants to record his or her mood when shooting images, so the user wants to use the mood while travelling as an image feature to be recorded. The user taps the "custom image feature" control 201 in the image feature setting interface of fig. 2F, and the image feature customization interface shown in fig. 2G is displayed. The input box 202 in the customization interface is used for entering the name of the user-defined image feature, and the input box 203 is used for entering the input mode of the user-defined image feature, that is, the way in which the user will provide the feature during shooting; the input mode may include voice input, text input and the like. For example, if the user wants to use the mood when shooting an image as the image feature to be recorded and to provide the mood-related feature by text input, the user can enter "mood" in the input box 202 and "text" in the input box 203 of the customization interface. After the user finishes shooting a selfie, the mobile phone automatically pops up the shooting mood input box 204 shown in fig. 2H, in which the user can enter the mood at the time of shooting. Alternatively, the user may be prompted by a displayed message or by voice to enter a text description of the shooting mood in the shooting mood input box 204, for example a prompt "you can enter your mood at the time of shooting in this input box" may be displayed, or a voice with the same content may be played. Illustratively, in fig. 2H the user enters the mood at the time of shooting: "although the weather is good today, my mood is bad".
When the user enters "voice" in the input box 203 of the image feature customization interface shown in fig. 2G, after the user shoots a selfie the electronic device automatically displays a mood voice input control, as shown in fig. 2I. When the user taps the voice input control, the microphone of the mobile phone is turned on to start recording the user's voice, and the voice input interface can prompt the user to input his or her mood by voice. When the user taps the "start recording" control 205, the microphone of the electronic device starts to collect the user's voice information, an audio file containing the shooting mood input by voice is obtained, and the audio file is stored in association with the captured selfie. Specifically, the selfie and the audio file containing the voice-input shooting mood can be stored in the same album folder, and the mapping relationship between the image name of the selfie and the audio file can be recorded in a mapping table.
Because the user can customize the image features to be recorded, the flexibility of image feature setting is improved, and by customizing the image feature information to be recorded for an image, the user can screen out the desired images more quickly.
The fourth input in the embodiments of the application is used to select or determine at least one image feature to be recorded. The fourth input may be a fourth operation. Illustratively, the fourth input includes, but is not limited to: a touch input performed by the user through a touch device such as a finger or a stylus, a voice instruction input by the user, a specific gesture input by the user, or another feasible input, which may be determined according to actual use requirements and is not limited in the embodiments of the application. The specific gesture in the embodiments of the application may be any one of a single-tap gesture, a sliding gesture, a dragging gesture, a pressure-recognition gesture, a long-press gesture, an area-change gesture, a double-press gesture and a double-tap gesture; the tap input in the embodiments of the application may be a single tap, a double tap or any number of taps, and may also be a long-press input or a short-press input.
In this embodiment, at least one image feature may be selected through the fourth input, and in the process of capturing an image, the at least one image feature is recorded in attribute information of the image, and by manually selecting an image feature to be recorded in the image, flexibility in classifying by using the image feature in the following process may be improved.
In some embodiments of the present application, the determining at least one image feature to be recorded in the process of capturing an image includes:
In the case that the fourth input is the selection input of the user in at least two alternative image feature options, determining the image feature corresponding to at least one image feature option selected by the fourth input as the image feature to be recorded in the process of shooting the image;
In the case that the fourth input is an input of an image feature input by a user, the image feature input by the fourth input is determined as an image feature to be recorded in the process of capturing an image.
In the foregoing, when the user selects the image feature through the fourth input, the user may select from existing image feature options; for example, the selection interface includes image features such as weather, season, topography, temperature, mood, color and event, from which the user may select one or more image features as desired. At least one of weather, season, topography, temperature, mood, color and event may also be input by voice when the image is shot; the electronic device performs text recognition on the voice input and stores the result, and after the image is shot, the corresponding file is stored in the attribute information of the image.
The user may also customize the image features; for example, the user may customize a "sunny day" tag, which may be input by voice or text, and this is not limited here.
In this embodiment, in the case of determining the image feature to be recorded for the captured image through the fourth input, the image feature may be selected from at least two candidate image feature options, or the image feature may be customized, so that the flexibility of determining the image feature may be improved.
The image display control method provided by some embodiments of the present application is exemplified below.
The images in the album can be clustered by location to obtain a plurality of classified images of different locations, namely at least two images, and the at least two images can be further subdivided, for example based on labels such as weather and the four seasons. Current album images may contain information on the time and place at which they were shot, and this information can be extracted to classify the images.
Specific information such as the weather and air temperature corresponding to the shooting time can be requested from the weather software of the electronic device based on the shooting time of the image. Alternatively, when the image is shot, the weather, air temperature and the like of the current position are queried in real time, and this information is recorded in the attribute information of the image so that it is available for later classification.
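One way this could look in code is sketched below; the WeatherService interface and the attribute keys are assumptions introduced purely for illustration and are not an API defined by the embodiment.

// Sketch: query weather/temperature for the capture position and time, then record the
// results in the image's attribute information so later classification can reuse them.
interface WeatherService {
    // Returns a pair of (weather description, temperature in degrees Celsius) for the given place and time.
    fun query(latitude: Double, longitude: Double, timestampMillis: Long): Pair<String, Int>
}

fun recordCaptureConditions(
    attributes: MutableMap<String, String>,
    weatherService: WeatherService,
    latitude: Double,
    longitude: Double,
    timestampMillis: Long
) {
    val (weather, temperature) = weatherService.query(latitude, longitude, timestampMillis)
    attributes["weather"] = weather                      // e.g. "sunny"
    attributes["temperature"] = temperature.toString()   // e.g. "23"
    attributes["captureTime"] = timestampMillis.toString()
}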
When the user selects N image feature labels for the at least two images, the images in the at least two images are filtered and regrouped. For example, if the user selects weather, the different weather types are displayed as separate categories, and the corresponding images are displayed under each weather type.
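The filtering and regrouping step can be pictured with the short sketch below, which groups images by the value they recorded for one selected feature label (here assumed to live in the attribute map); AlbumImage and groupByFeature are hypothetical names, not part of the embodiment.

// Sketch: regroup images by one selected feature label, e.g. "weather".
data class AlbumImage(val name: String, val attributes: Map<String, String>)

fun groupByFeature(images: List<AlbumImage>, featureLabel: String): Map<String, List<AlbumImage>> =
    images
        .filter { featureLabel in it.attributes }          // only images that recorded this feature
        .groupBy { it.attributes.getValue(featureLabel) }  // "sunny" -> [...], "rainy" -> [...], ...

Each resulting group would then be displayed under its own sub-label, matching the partitioned display described above.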
In the above, if the user long-presses an image feature tag, the footprint mode is automatically entered. After entering the footprint mode, image classification is performed based on the image feature tag, for example the season tag: all images whose footprint is in the A mountain are classified into 4 types corresponding to the 4 seasons of spring, summer, autumn and winter, and all images under a given season can be jumped to and viewed by clicking that season alone.
When the user presses and holds the place name of a certain footprint, for example keeps a finger pressed at the A mountain position in the map, the cover images of the seasons spring, summer, autumn and winter can be updated at the same time, and the next images of the corresponding seasons are played in rotation, thereby achieving a carousel effect. When the pressing force of the user increases, the carousel speed of the images in each season also increases, and when the pressing force decreases, the carousel speed decreases.
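As a rough illustration of this pressure-to-speed relationship (the constants and the pressure range are assumptions, not values from the embodiment), the update interval of the carousel could be derived from the normalized touch pressure as follows:

// Sketch: a larger pressing force yields a shorter interval between carousel updates,
// i.e. a faster rotation; a lighter press slows the rotation down.
fun carouselIntervalMillis(pressure: Float): Long {
    val p = pressure.coerceIn(0.0f, 1.0f)  // normalized pressure reported by the touch screen
    val slowest = 2000L                    // interval at the lightest press (ms)
    val fastest = 300L                     // interval at the strongest press (ms)
    return (slowest - (slowest - fastest) * p).toLong()
}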
The screen may also be recorded while the carousel animation effect is being formed, to obtain a shareable video that continuously plays all the images under the footprint.
In addition, the footprint can be combined with an album hanging piece: the A mountain is displayed on the album hanging piece, and when the user directly presses the album hanging piece, the images under the footprint of the A mountain can be played in carousel. In this way, through a simple pressing operation, the images in the album program can be controlled to be grouped as needed, without clicking into the album program interface to view images one by one. Because the user can flexibly control the carousel speed of the displayed grouped images by controlling the pressing force, the user can conveniently control the update speed of the image display; the operation is convenient, entering the program interface can be avoided, and jumping back and forth between the program interface and the map interface is avoided, thereby improving the user operation efficiency of image display control and the image searching efficiency.
According to the embodiment, the place tag is fused with the weather information and the geographic information, so that further subdivision of classified images is realized, and the image searching efficiency can be improved.
Some embodiments of the present application provide an image display control method, the execution subject of which may be an image classification display control device. In the embodiments of the present application, an image classification display control device executing the image display control method is taken as an example to describe the image classification display control device provided by some embodiments of the present application.
As shown in fig. 3, an embodiment of the present application provides an image classification display control apparatus, and an image classification display control apparatus 300 includes:
The display module 301 is configured to display a program interface of an album program, where the program interface includes at least one folder icon, where the at least one folder icon is obtained by performing image classification according to a first type of image feature indicated by a first image feature tag, and at least one image having a different first type of image feature is included in an album folder corresponding to a different folder icon;
A receiving module 302, configured to receive a first input from a user, where the first input is used to select a folder icon to be reclassified from the at least one folder icon and select N image feature tags, and an album folder corresponding to the first input-selected folder icon includes at least two images, where N is a positive integer;
The display module 301 is further configured to display, in response to the first input, an image obtained by image classification of the at least two images according to the image features indicated by the N image feature labels.
In some embodiments of the present application, n=1, and the N image feature labels include a second image feature label; the second image feature tag indicates a second type of image feature, the second type of image feature is associated with at least two second type of sub-tags, and the second type of sub-tag indicates one sub-classification feature of the second type of image feature;
The device also comprises a processing module, wherein the processing module is used for responding to the first input and carrying out image classification on the at least two images according to the sub-classification features indicated by the at least two second class sub-labels to obtain at least two groups of images, one group of images having one sub-classification feature indicated by a second class sub-label;
The display module 301 is further configured to display the at least two groups of images and the at least two second class sub-labels in a partitioned manner.
In some embodiments of the present application, the first image feature tag is a shooting location, the first input includes a first press input on one of the at least one folder icon and a selection input to select a second image feature tag, the selection input includes a slide input to slide from a press location of the press input to the second image feature tag and a second press input on the second image feature tag;
The display module 301 is further configured to display at least two image feature labels on a folder icon selected by the first input when a pressing time period of the first pressing input is greater than a first time period threshold;
The receiving module 302 is further configured to receive a sliding input that a user slides from a pressing position of the pressing input to the second image feature tag and a second pressing input on the second image feature tag;
The display module 301 is further configured to display a map interface, where at least two map areas associated with a location where a place on the map interface is located include at least two thumbnails, when a pressing time period of the second pressing input is greater than a second time period threshold;
The location is a shooting location of an image in an album folder indicated by the folder icon selected by the first input; and the at least two thumbnails are obtained by classifying all images in the album folder according to the second-class image features indicated by the second-class image feature tags, and one thumbnail corresponds to a group of images with the sub-classification features indicated by the second-class sub-tags.
In some embodiments of the present application, the receiving module 302 is further configured to receive a second input from the user;
and the processing module is also used for respectively updating each thumbnail according to the second input and the image updating interval so as to enable each map area associated with the position of the place to sequentially display the images in each group of images.
In some embodiments of the present application, the receiving module 302 is further configured to receive a touch input from a user to a location on the map interface where the location is located.
In some embodiments of the present application, the receiving module 302 is further configured to receive a third input from the user;
the processing module is further used for responding to the third input and establishing an association relationship between the album hanging piece and the shooting place;
The receiving module 302 is further configured to receive a touch input from a user to the album hanging component.
In some embodiments of the present application, the album folder corresponding to the folder icon selected by the first input includes E images, where F images in the E images record the second type of image features in the image capturing process, and G images in the E images do not record the second type of image features in the image capturing process; e is an integer greater than 1, F is a positive integer, F < E, G=E-F;
The display module 301 is further configured to update the display order of the F images and the G images in the album folder, so that the F images are arranged before the G images.
In some embodiments of the application, N > 1; the N image feature labels comprise a third image feature label and a fourth image feature label; the third image feature tag indicates a third type of image feature, the third type of image feature is associated with at least two third type of sub-tags, and the third type of sub-tag indicates one sub-classification feature of the third type of image feature; the fourth image feature tag indicates a fourth type of image feature, the fourth type of image feature is associated with at least two fourth type of sub-tags, and the fourth type of sub-tag indicates one sub-classification feature of the fourth type of image feature;
The device further comprises a processing module, wherein the processing module is used for responding to the first input and carrying out image classification on the at least two images according to the sub-classification features indicated by the at least two third-class sub-labels to obtain M groups of images, one group of images having one sub-classification feature indicated by a third-class sub-label, and M is a positive integer;
According to the sub-classification features indicated by the at least two fourth-class sub-labels, carrying out image classification on the at least two images to obtain K groups of images, wherein one group of images has one sub-classification feature indicated by the fourth-class sub-label, and K is a positive integer;
The display module 301 is further configured to sequentially display the M groups of images, the at least two third types of sub-labels, the K groups of images, and the at least two fourth types of sub-labels in a partition manner according to a selection order of the third image feature labels and the fourth image feature labels.
In some embodiments of the present application, the receiving module 302 is further configured to receive a fourth input from a user;
The apparatus further comprises a processing module for determining at least one image feature to be recorded during capturing of an image in response to the fourth input; recording characteristic information of at least one image characteristic of an image in the process of shooting the image by a camera, and storing the characteristic information of the at least one image characteristic in attribute information of the image, wherein the image characteristic comprises at least one of the following: weather, season, topography, temperature, mood, color, event.
In some embodiments of the present application, the processing module is further configured to determine, when the fourth input is a selection input of at least two candidate image feature options by a user, an image feature corresponding to at least one image feature option selected by the fourth input as an image feature to be recorded in the process of capturing an image; in the case that the fourth input is an input of an image feature input by a user, the image feature input by the fourth input is determined as an image feature to be recorded in the process of capturing an image.
The image classification display control device 300 provided in some embodiments of the present application can implement each process implemented by the foregoing method embodiments, and in order to avoid repetition, a description is omitted herein.
The image classification display control device 300 in the embodiment of the present application may be an electronic device, or may be a component in an electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. The electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted electronic device, a mobile internet device (MID), an augmented reality (AR)/virtual reality (VR) device, a robot, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a personal digital assistant (PDA), etc., and may also be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, a self-service machine, etc., which are not particularly limited in the embodiments of the present application.
The image classification display control device 300 in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or other possible operating systems, which is not specifically limited in the embodiment of the present application.
Optionally, as shown in fig. 4, the embodiment of the present application further provides an electronic device 400, including a processor 401 and a memory 402, where the memory 402 stores a program or an instruction that can be executed on the processor 401, and the program or the instruction implements each step of the embodiment of the image display control method when executed by the processor 401, and the steps achieve the same technical effects, so that repetition is avoided, and no further description is given here.
The electronic device in the embodiment of the application includes the mobile electronic device and the non-mobile electronic device.
Fig. 5 is a hardware configuration diagram of an electronic device implementing an embodiment of the present application.
The electronic device 500 includes, but is not limited to: radio frequency unit 501, network module 502, audio output unit 503, input unit 504, sensor 505, display unit 506, user input unit 507, interface unit 508, memory 509, and processor 510.
Those skilled in the art will appreciate that the electronic device 500 may further include a power source (e.g., a battery) for powering the various components, and the power source may be logically coupled to the processor 510 via a power management system, so that functions such as charging management, discharging management and power consumption management are performed through the power management system. The electronic device structure shown in fig. 5 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than shown, combine certain components, or arrange the components differently, which is not described in detail here.
The display unit 506 is configured to display a program interface of the album program, where the program interface includes at least one folder icon, where the at least one folder icon is obtained by performing image classification according to a first type of image feature indicated by the first image feature tag, and at least one image having a different first type of image feature is included in an album folder corresponding to a different folder icon;
A user input unit 507, configured to receive a first input from a user, where the first input is used to select a folder icon to be reclassified from the at least one folder icon and select N image feature labels, and an album folder corresponding to the first input selected folder icon includes at least two images, where N is a positive integer;
The display unit 506 is further configured to display, in response to the first input, an image obtained by image classification of the at least two images according to the image features indicated by the N image feature labels.
In some embodiments of the application, n=1, the N image feature labels include a second image feature label; the second image feature tag indicates a second type of image feature, the second type of image feature is associated with at least two second type of sub-tags, and the second type of sub-tag indicates one sub-classification feature of the second type of image feature;
A processor 510, configured to respond to the first input, and perform image classification on the at least two images according to the sub-classification features indicated by the at least two second class sub-labels, so as to obtain at least two groups of images, where one group of images has one sub-classification feature indicated by the second class sub-label;
the display unit 506 is further configured to display the at least two groups of images and the at least two second class sub-labels in a partition manner.
In some embodiments of the application, the first image feature tag is a shooting location, the first input comprises a first press input on one of the at least one folder icon and a selection input to select a second image feature tag, the selection input comprising a slide input to slide from a press location of the press input to the second image feature tag and a second press input on the second image feature tag;
The display unit 506 is further configured to display at least two image feature labels on the folder icon selected by the first input if the pressing time of the first pressing input is longer than a first time threshold;
The user input unit 507 is further configured to receive a sliding input that a user slides from a pressing position of the pressing input to the second image feature tag and a second pressing input on the second image feature tag;
The display unit 506 is further configured to display a map interface, where at least two map areas associated with a location where a place on the map interface is located include at least two thumbnails, when a pressing time period of the second pressing input is greater than a second time period threshold;
The location is a shooting location of an image in an album folder indicated by the folder icon selected by the first input; and the at least two thumbnails are obtained by classifying all images in the album folder according to the second-class image features indicated by the second-class image feature tags, and one thumbnail corresponds to a group of images with the sub-classification features indicated by the second-class sub-tags.
In some embodiments of the present application, the user input unit 507 is further configured to receive a second input from a user;
The processor 510 is further configured to update each thumbnail at an image update interval in response to the second input, so that each map area associated with the location of the place sequentially displays images in each group of images.
In some embodiments of the present application, the user input unit 507 is further configured to receive a touch input from a user on the map interface where the location is located.
In some embodiments of the present application, the user input unit 507 is further configured to receive a third input from the user;
The processor 510 is further configured to establish an association relationship between the album hanging piece and the shooting location in response to the third input;
The user input unit 507 is further configured to receive a touch input from a user to the album hanging member.
In some embodiments of the present application, the album folder corresponding to the folder icon selected by the first input includes E images, where F images in the E images record the second type of image features in the image capturing process, and G images in the E images do not record the second type of image features in the image capturing process; e is an integer greater than 1, F is a positive integer, F < E, G=E-F;
the display unit 506 is further configured to update the display order of the F images and the G images in the album folder so that the F images are arranged before the G images.
In some embodiments of the application, N > 1; the N image feature labels comprise a third image feature label and a fourth image feature label; the third image feature tag indicates a third type of image feature, the third type of image feature is associated with at least two third type of sub-tags, and the third type of sub-tag indicates one sub-classification feature of the third type of image feature; the fourth image feature tag indicates a fourth type of image feature, the fourth type of image feature is associated with at least two fourth type of sub-tags, and the fourth type of sub-tag indicates one sub-classification feature of the fourth type of image feature;
The processor 510 is further configured to respond to the first input, and perform image classification on the at least two images according to the sub-classification features indicated by the at least two third-class sub-labels, so as to obtain M groups of images, where one group of images has one sub-classification feature indicated by the third-class sub-label, and M is a positive integer; according to the sub-classification features indicated by the at least two fourth-class sub-labels, carrying out image classification on the at least two images to obtain K groups of images, wherein one group of images has one sub-classification feature indicated by the fourth-class sub-label, and K is a positive integer;
The display unit 506 is further configured to sequentially display the M groups of images, the at least two third types of sub-labels, the K groups of images, and the at least two fourth types of sub-labels in a partition manner according to the selection order of the third image feature labels and the fourth image feature labels.
In some embodiments of the present application, the user input unit 507 is further configured to receive a fourth input from a user;
a processor 510 further configured to determine at least one image feature to be recorded during capturing of an image in response to the fourth input; recording characteristic information of at least one image characteristic of an image in the process of shooting the image by a camera, and storing the characteristic information of the at least one image characteristic in attribute information of the image, wherein the image characteristic comprises at least one of the following: weather, season, topography, temperature, mood, color, event.
In some embodiments of the present application, the processor 510 is further configured to determine, when the fourth input is a selection input of at least two candidate image feature options by a user, an image feature corresponding to at least one image feature option selected by the fourth input as an image feature to be recorded in the process of capturing an image;
In the case that the fourth input is an input of an image feature input by a user, the image feature input by the fourth input is determined as an image feature to be recorded in the process of capturing an image.
The electronic device provided in some embodiments of the present application can implement each process implemented by the foregoing method embodiments, and in order to avoid repetition, details are not repeated here.
It should be appreciated that in embodiments of the present application, the input unit 504 may include a graphics processor (Graphics Processing Unit, GPU) 5041 and a microphone 5042, with the graphics processor 5041 processing image data of still images or video obtained by an image capture device (e.g., a camera) in a video capture mode or an image capture mode. The display unit 506 may include a display panel 5061, and the display panel 5061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 507 includes at least one of a touch panel 5071 and other input devices 5072. Touch panel 5071, also referred to as a touch screen. Touch panel 5071 may include two parts, a touch detection device and a touch controller. Other input devices 5072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and so forth, which are not described in detail herein.
The memory 509 may be used to store software programs as well as various data. The memory 509 may mainly include a first storage area storing programs or instructions and a second storage area storing data, wherein the first storage area may store an operating system, and application programs or instructions (such as a sound playing function, an image playing function, etc.) required for at least one function, and the like. Further, the memory 509 may include volatile memory or nonvolatile memory, or the memory 509 may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synchlink DRAM (SLDRAM), or a direct rambus RAM (DRRAM). The memory 509 in embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
Processor 510 may include one or more processing units; optionally, the processor 510 integrates an application processor that primarily processes operations involving an operating system, user interface, application programs, etc., and a modem processor that primarily processes wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor described above may not be integrated into the processor 510.
The embodiment of the application also provides a readable storage medium, on which a program or an instruction is stored, which when executed by a processor, implements each process of the above embodiment of the image display control method, and can achieve the same technical effects, so that repetition is avoided, and no further description is given here.
Wherein the processor is a processor in the electronic device described in the above embodiment. The readable storage medium includes computer readable storage medium such as computer readable memory ROM, random access memory RAM, magnetic or optical disk, etc.
The embodiment of the application further provides a chip, which comprises a processor and a communication interface, wherein the communication interface is coupled with the processor, and the processor is used for running programs or instructions to realize the processes of the embodiment of the image display control method, and can achieve the same technical effects, so that repetition is avoided, and the description is omitted here.
It should be understood that the chips referred to in the embodiments of the present application may also be referred to as system-on-chip chips, chip systems, or system-on-chip chips, etc.
Embodiments of the present application provide a computer program product stored in a storage medium, which is executed by at least one processor to implement the respective processes of the above-described image display control method embodiments, and achieve the same technical effects, and are not described herein in detail for avoiding repetition.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed, but may also include performing the functions in a substantially simultaneous manner or in an opposite order depending on the functions involved, e.g., the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the method of the above embodiments may be implemented by means of software plus a necessary general hardware platform, and of course may also be implemented by hardware, although in many cases the former is the preferred implementation. Based on such an understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a computer software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, or optical disk) and comprising instructions for causing a terminal (which may be a mobile phone, a computer, a server, a network device, etc.) to perform the method according to the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive, and many forms may be made by those having ordinary skill in the art without departing from the spirit of the present application and the scope of the claims, which are to be protected by the present application.

Claims (21)

1. An image display control method, characterized by comprising:
A program interface for displaying an album program, wherein the program interface comprises at least one folder icon, the at least one folder icon is obtained by classifying images according to first type image features indicated by first image feature labels, and album folders corresponding to different folder icons comprise at least one image with different first type image features;
Receiving a first input of a user, wherein the first input is used for selecting a folder icon to be reclassified from the at least one folder icon and selecting N image feature labels, an album folder corresponding to the folder icon selected by the first input comprises at least two images, and N is a positive integer;
And responding to the first input, and displaying images obtained by carrying out image classification on the at least two images according to the image characteristics indicated by the N image characteristic labels.
2. The method of claim 1, wherein N = 1, the N image feature tags including a second image feature tag; the second image feature tag indicates a second type of image feature, the second type of image feature is associated with at least two second type of sub-tags, and the second type of sub-tag indicates one sub-classification feature of the second type of image feature;
the step of responding to the first input, displaying an image obtained by classifying the at least two images according to the image characteristics indicated by the N image characteristic labels, comprises the following steps:
Responding to the first input, and carrying out image classification on the at least two images according to the sub-classification features indicated by the at least two second class sub-labels to obtain at least two groups of images, wherein one group of images has one sub-classification feature indicated by the second class sub-label;
And displaying the at least two groups of images and the at least two second class sub-labels in a partition mode.
3. The method of claim 2, wherein the first image feature tag is a shooting location, the first input comprising a first press input on one of the at least one folder icon and a selection input to select a second image feature tag, the selection input comprising a slide input that slides from a press location of the press input to the second image feature tag and a second press input on the second image feature tag;
in receiving the first input, the method further comprises:
Displaying at least two image feature labels on the folder icon selected by the first input under the condition that the pressing time length of the first pressing input is larger than a first time length threshold value;
receiving a sliding input from a user sliding from a pressing position of the pressing input to the second image feature label and a second pressing input on the second image feature label;
The displaying of the image obtained by classifying the at least two images according to the image features indicated by the N image feature labels comprises the following steps:
Displaying a map interface under the condition that the pressing time length of the second pressing input is larger than a second time length threshold, wherein at least two map areas associated with the position of one place on the map interface comprise at least two thumbnails;
The location is a shooting location of an image in an album folder indicated by the folder icon selected by the first input; and the at least two thumbnails are obtained by classifying all images in the album folder according to the second-class image features indicated by the second-class image feature tags, and one thumbnail corresponds to a group of images with the sub-classification features indicated by the second-class sub-tags.
4. A method according to claim 3, wherein after the displaying of the map interface, the method further comprises:
Receiving a second input from the user;
And in response to the second input, updating each thumbnail according to an image updating interval, so that each map area associated with the position of the place sequentially displays images in each group of images.
5. The method of claim 4, wherein the receiving a second input comprises:
and receiving touch input of a user on the position of the place on the map interface.
6. The method of claim 4, wherein prior to said receiving the second input from the user, the method further comprises:
Receiving a third input from the user;
establishing an association relationship between album pendants and the shooting places in response to the third input;
the receiving a second input from the user, comprising:
And receiving touch input of a user to the album hanging piece.
7. The method of claim 2, wherein the album folder corresponding to the folder icon selected by the first input includes E images, F images of the E images having second type image features recorded during image capturing, and G images of the E images having no second type image features recorded during image capturing; e is an integer greater than 1, F is a positive integer, F < E, G=E-F;
The displaying of the image obtained by classifying the at least two images according to the image features indicated by the N image feature labels comprises the following steps:
and updating the display sequence of the F images and the G images in the album folder so that the F images are arranged before the G images.
8. The method of claim 1, wherein N > 1; the N image feature labels comprise a third image feature label and a fourth image feature label; the third image feature tag indicates a third type of image feature, the third type of image feature is associated with at least two third type of sub-tags, and the third type of sub-tag indicates one sub-classification feature of the third type of image feature; the fourth image feature tag indicates a fourth type of image feature, the fourth type of image feature is associated with at least two fourth type of sub-tags, and the fourth type of sub-tag indicates one sub-classification feature of the fourth type of image feature;
the step of responding to the first input, displaying an image obtained by classifying the at least two images according to the image characteristics indicated by the N image characteristic labels, comprises the following steps:
Responding to the first input, and carrying out image classification on the at least two images according to the sub-classification features indicated by the at least two third-class sub-labels to obtain M groups of images, wherein one group of images has one sub-classification feature indicated by the third-class sub-label, and M is a positive integer;
According to the sub-classification features indicated by the at least two fourth-class sub-labels, carrying out image classification on the at least two images to obtain K groups of images, wherein one group of images has one sub-classification feature indicated by the fourth-class sub-label, and K is a positive integer;
And displaying the M groups of images, the at least two third type sub-labels, the K groups of images and the at least two fourth type sub-labels in a partitioning manner in sequence according to the selection sequence of the third image feature labels and the fourth image feature labels.
9. The method according to claim 1, wherein the method further comprises:
Receiving a fourth input from the user;
Determining at least one image feature to be recorded during capturing of an image in response to the fourth input;
recording characteristic information of at least one image characteristic of an image in the process of shooting the image by a camera, and storing the characteristic information of the at least one image characteristic in attribute information of the image;
Wherein the image features include at least one of: weather, season, topography, temperature, mood, color, event.
10. The method of claim 9, wherein determining at least one image feature to be recorded in capturing an image comprises:
In the case that the fourth input is the selection input of the user in at least two alternative image feature options, determining the image feature corresponding to at least one image feature option selected by the fourth input as the image feature to be recorded in the process of shooting the image;
In the case that the fourth input is an input of an image feature input by a user, the image feature input by the fourth input is determined as an image feature to be recorded in the process of capturing an image.
11. An image classification display control device, comprising:
The display module is used for displaying a program interface of the album program, the program interface comprises at least one folder icon, the at least one folder icon is obtained by classifying images according to first type image features indicated by the first image feature labels, and at least one image with different first type image features is included in the album folders corresponding to different folder icons;
The receiving module is used for receiving a first input of a user, wherein the first input is used for selecting a folder icon to be reclassified from the at least one folder icon and selecting N image feature labels, an album folder corresponding to the folder icon selected by the first input comprises at least two images, and N is a positive integer;
the display module is further used for responding to the first input, and displaying images obtained by classifying the images of the at least two images according to the image features indicated by the N image feature labels.
12. The apparatus of claim 11, wherein N = 1, the N image feature tags including a second image feature tag; the second image feature tag indicates a second type of image feature, the second type of image feature is associated with at least two second type of sub-tags, and the second type of sub-tag indicates one sub-classification feature of the second type of image feature;
The device also comprises a processing module, wherein the processing module is used for responding to the first input and carrying out image classification on the at least two images according to the sub-classification features indicated by the at least two second class sub-labels to obtain at least two groups of images, one group of images having one sub-classification feature indicated by a second class sub-label;
The display module is further used for displaying the at least two groups of images and the at least two second class sub-labels in a partitioning mode.
13. The apparatus of claim 12, wherein the first image feature tag is a shooting location, the first input comprising a first press input on one of the at least one folder icon and a selection input to select a second image feature tag, the selection input comprising a slide input to slide from a press location of the press input to the second image feature tag and a second press input on the second image feature tag;
The display module is further configured to display at least two image feature labels on a folder icon selected by the first input when a pressing time period of the first pressing input is greater than a first time period threshold;
the receiving module is further used for receiving a sliding input of a user from a pressing position of the pressing input to the second image feature label and a second pressing input on the second image feature label;
The display module is further configured to display a map interface when the pressing time of the second pressing input is greater than a second time threshold, where at least two map areas associated with a location of a place on the map interface include at least two thumbnails;
The location is a shooting location of an image in an album folder indicated by the folder icon selected by the first input; and the at least two thumbnails are obtained by classifying all images in the album folder according to the second-class image features indicated by the second-class image feature tags, and one thumbnail corresponds to a group of images with the sub-classification features indicated by the second-class sub-tags.
14. The apparatus of claim 13, wherein the receiving module is further configured to receive a second input from a user;
and the processing module is also used for respectively updating each thumbnail according to the second input and the image updating interval so as to enable each map area associated with the position of the place to sequentially display the images in each group of images.
15. The apparatus of claim 14, wherein the receiving module is further configured to receive a touch input from a user to a location on the map interface where the location is located.
16. The apparatus of claim 14, wherein the receiving module is further configured to receive a third input from a user;
the processing module is further used for responding to the third input and establishing an association relationship between the album hanging piece and the shooting place;
the receiving module is further used for receiving touch input of a user to the album hanging piece.
17. The apparatus of claim 12, wherein the album folder corresponding to the folder icon selected by the first input includes E images, F of the E images having second type image features recorded during the image capturing process, and G of the E images having no second type image features recorded during the image capturing process; e is an integer greater than 1, F is a positive integer, F < E, G=E-F;
And the display module is also used for updating the display sequence of the F images and the G images in the album folder so as to enable the F images to be arranged before the G images.
18. The apparatus of claim 11, wherein N > 1; the N image feature labels comprise a third image feature label and a fourth image feature label; the third image feature tag indicates a third type of image feature, the third type of image feature is associated with at least two third type of sub-tags, and the third type of sub-tag indicates one sub-classification feature of the third type of image feature; the fourth image feature tag indicates a fourth type of image feature, the fourth type of image feature is associated with at least two fourth type of sub-tags, and the fourth type of sub-tag indicates one sub-classification feature of the fourth type of image feature;
The device further comprises a processing module, wherein the processing module is used for responding to the first input and carrying out image classification on the at least two images according to the sub-classification features indicated by the at least two third-class sub-labels to obtain M groups of images, one group of images having one sub-classification feature indicated by a third-class sub-label, and M is a positive integer; and for carrying out image classification on the at least two images according to the sub-classification features indicated by the at least two fourth-class sub-labels to obtain K groups of images, one group of images having one sub-classification feature indicated by a fourth-class sub-label, and K is a positive integer;
The display module is further configured to display the M groups of images, the at least two third types of sub-labels, the K groups of images, and the at least two fourth types of sub-labels in a partition manner sequentially according to a selection sequence of the third image feature labels and the fourth image feature labels.
19. The apparatus of claim 11, wherein the receiving module is further configured to receive a fourth input from a user;
The apparatus further comprises a processing module for determining at least one image feature to be recorded during capturing of an image in response to the fourth input; recording characteristic information of at least one image characteristic of an image in the process of shooting the image by a camera, and storing the characteristic information of the at least one image characteristic in attribute information of the image;
Wherein the image features include at least one of: weather, season, topography, temperature, mood, color, event.
20. The apparatus according to claim 19, wherein the processing module is further configured to determine, if the fourth input is a user selection input of at least two alternative image feature options, an image feature corresponding to at least one image feature option selected by the fourth input as an image feature to be recorded in capturing an image; in the case that the fourth input is an input of an image feature input by a user, the image feature input by the fourth input is determined as an image feature to be recorded in the process of capturing an image.
21. An electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the image display control method of any one of claims 1 to 10.
CN202311429184.9A 2023-10-30 2023-10-30 Image display control method and device and electronic equipment Pending CN117992628A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311429184.9A CN117992628A (en) 2023-10-30 2023-10-30 Image display control method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN117992628A true CN117992628A (en) 2024-05-07

Family

ID=90899739

Country Status (1)

Country Link
CN (1) CN117992628A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination