CN110245246B - Image display method and terminal equipment - Google Patents


Info

Publication number: CN110245246B
Application number: CN201910364497.8A
Authority: CN (China)
Prior art keywords: target, control, input, image, parameter
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Original language: Chinese (zh)
Other versions: CN110245246A (en)
Inventors: 唐义, 朱宗伟
Current and original assignee: Vivo Mobile Communication Co Ltd
Application filed by: Vivo Mobile Communication Co Ltd
Priority application: CN201910364497.8A
Related publication: PCT/CN2020/081249 (WO2020220873A1)
Application granted

Classifications

    • G06F16/53 — Physics; Computing; Electric digital data processing; Information retrieval and database structures for still image data; Querying
    • G06F16/55 — Physics; Computing; Electric digital data processing; Information retrieval and database structures for still image data; Clustering; Classification
    • G06F3/011 — Physics; Computing; Electric digital data processing; Input arrangements for interaction between user and computer; Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the invention discloses an image display method and terminal equipment, relates to the technical field of terminals, and aims to solve the problems that the process of screening images by the terminal equipment is complex and tedious and the man-machine interaction performance is poor in the prior art. The method comprises the following steps: responding to a received first input of a user to a first control, and displaying M second controls, wherein the first control is used for indicating a first type of image in the terminal equipment, and M is a positive integer; receiving a second input of the user to a target control in the M second controls; and responding to the second input, and displaying a target image matched with a target parameter in the first class of images, wherein the target parameter is a parameter corresponding to the second input. The scheme is particularly applied to the scene of image screening.

Description

Image display method and terminal equipment
Technical Field
The embodiment of the invention relates to the technical field of terminals, in particular to an image display method and terminal equipment.
Background
With the continuous development of terminal technology, the application of terminal equipment is more and more extensive. For example, in the prior art, instant social applications in terminal devices such as a mobile phone and a tablet have an image sharing function, and in the image sharing process, a user needs to trigger the terminal device to obtain an image which the user needs to share from a gallery of the terminal device.
Currently, in the prior art, a terminal device may classify images in a gallery according to image attribute parameters such as a time parameter, a location parameter, and an image type parameter (e.g., a person type, a landscape type, a food type, etc.). Therefore, in the process of screening the images, the user may first trigger the terminal device to screen the images (first screening results) belonging to a certain time period, a certain place or a certain type, and then trigger the terminal device to browse the images in the first screening results one by one to screen the images that the user needs to share.
However, as the storage capacity of the terminal device is increased, the number of images in the gallery is increased, and the number of images belonging to the first filtering result is also increased, so that it takes a lot of time to filter the pictures. Therefore, the process of screening images by the terminal equipment in the prior art is complex and tedious, and the human-computer interaction performance is poor.
Disclosure of Invention
The embodiment of the invention provides an image display method and terminal equipment, and aims to solve the problems that in the prior art, the process of screening images by the terminal equipment is complex and tedious, and the man-machine interaction performance is poor.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides an image display method, which is applied to a terminal device, and the method includes: responding to a received first input of a user to a first control, and displaying M second controls, wherein the first control is used for indicating a first type of image in the terminal equipment, and M is a positive integer; receiving a second input of the user to a target control in the M second controls; and responding to the second input, and displaying a target image matched with a target parameter in the first class of images, wherein the target parameter is a parameter corresponding to the second input.
In a second aspect, an embodiment of the present invention provides a terminal device, where the terminal device includes: a display module and a receiving module; the display module is used for responding to a received first input of a user to a first control and displaying M second controls, wherein the first control is used for indicating a first type of image in the terminal equipment, and M is a positive integer; the receiving module is used for receiving a second input of a user to a target control in the M second controls displayed by the display module; the display module is further configured to display a target image in the first type of image, which is matched with a target parameter, in response to the second input received by the receiving module, where the target parameter is a parameter corresponding to the second input.
In a third aspect, an embodiment of the present invention provides a terminal device, including a processor, a memory, and a computer program stored in the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the image display method according to the first aspect.
In a fourth aspect, the present invention provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the image display method as in the first aspect.
In the embodiment of the invention, the terminal device can display M second controls by responding to the received first input of the user to the first control, wherein the first control is used for indicating the first type of images in the terminal device, and M is a positive integer; receiving a second input of the user to a target control in the M second controls; and responding to the second input, and displaying a target image matched with a target parameter in the first class of images, wherein the target parameter is a parameter corresponding to the second input. Through the scheme, a user can trigger the terminal equipment to display the M second controls through first input of the first control (corresponding to the first type of image), and then the user triggers the terminal equipment to display a target image matched with target parameters corresponding to second input in the first type of image through second input of a target control in the M second controls. Therefore, the user can quickly screen out the images required by the user from the gallery by inputting the two controls. The problems that in the prior art, the process of screening images by the terminal equipment is complex and tedious, and the human-computer interaction performance is poor can be solved.
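The two-input screening flow summarized above (a first input selects a class of images; a second input filters that class by a target parameter) can be sketched as follows. This is a minimal illustration only: the dictionary-based image model and the attribute names (`location`, `people`) are assumptions for the example, not part of the claimed implementation.

```python
# Illustrative sketch of the claimed two-step screening flow.
# The image model and attribute names are hypothetical.

def first_input(gallery, first_control):
    """First input on the first control: select the first class of images
    indicated by that control (here, a (attribute, value) pair)."""
    key, value = first_control
    return [img for img in gallery if img.get(key) == value]

def second_input(first_class, target_parameter):
    """Second input on a target control: keep only the target images in the
    first class that match the target parameter corresponding to the input."""
    key, value = target_parameter
    return [img for img in first_class if img.get(key) == value]

gallery = [
    {"name": "a.jpg", "location": "school", "people": 5},
    {"name": "b.jpg", "location": "school", "people": 2},
    {"name": "c.jpg", "location": "home",   "people": 5},
]
first_class = first_input(gallery, ("location", "school"))
target_images = second_input(first_class, ("people", 5))
```

Two inputs thus narrow the whole gallery down to the target images without browsing the first screening result one by one, which is the human-computer interaction improvement the scheme claims.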
Drawings
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention;
Fig. 2 is a flowchart of an image display method according to an embodiment of the present invention;
Fig. 3 is a schematic view of an interface of an image display method according to an embodiment of the present invention;
Fig. 4 is a second schematic view of an interface of an image display method according to an embodiment of the present invention;
Fig. 5 is a third schematic view of an interface of an image display method according to an embodiment of the present invention;
Fig. 6 is a second flowchart of an image display method according to an embodiment of the present invention;
Fig. 7 is a third flowchart of an image display method according to an embodiment of the present invention;
Fig. 8 is a fourth flowchart of an image display method according to an embodiment of the present invention;
Fig. 9 is a fourth schematic view of an interface of an image display method according to an embodiment of the present invention;
Fig. 10 is a fifth schematic view of an interface of an image display method according to an embodiment of the present invention;
Fig. 11 is a sixth schematic view of an interface of an image display method according to an embodiment of the present invention;
Fig. 12 is a fifth flowchart of an image display method according to an embodiment of the present invention;
Fig. 13 is a seventh schematic view of an interface of an image display method according to an embodiment of the present invention;
Fig. 14 is an eighth schematic view of an interface of an image display method according to an embodiment of the present invention;
Fig. 15 is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
Fig. 16 is a hardware schematic diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The term "and/or" herein describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. The symbol "/" herein denotes an "or" relationship between the associated objects; for example, A/B denotes A or B.
The terms "first," "second," "third," and "fourth," etc. in the description and in the claims of the present invention are used for distinguishing between different objects and not for describing a particular order of the objects. For example, the first input, the second input, the third input, the fourth input, etc. are used to distinguish between different inputs, rather than to describe a particular order of inputs.
In the embodiments of the present invention, words such as "exemplary" or "for example" are used to mean serving as an example, illustration or description. Any embodiment or design described as "exemplary" or "for example" in an embodiment of the present invention is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the words "exemplary" or "for example" is intended to present related concepts in a concrete fashion.
In the description of the embodiments of the present invention, unless otherwise specified, "a plurality" means two or more, for example, a plurality of processing units means two or more processing units; plural elements means two or more elements, and the like.
The embodiment of the invention provides an image display method, wherein a terminal device can display M second controls by responding to a received first input of a user to a first control, and the first control is used for indicating a first type of image in the terminal device; receiving a second input of the user to a target control in the M second controls; and responding to the second input, and displaying a target image matched with a target parameter in the first class of images, wherein the target parameter is a parameter corresponding to the second input. Through the scheme, a user can trigger the terminal equipment to display the M second controls through first input of the first control (corresponding to the first type of image), and then the user triggers the terminal equipment to display a target image matched with target parameters corresponding to second input in the first type of image through second input of a target control in the M second controls. Therefore, the user can quickly screen out the images required by the user from the gallery by inputting the two controls. The problems that in the prior art, the process of screening images by the terminal equipment is complex and tedious, and the human-computer interaction performance is poor can be solved.
The following describes a software environment to which the image display method provided by the embodiment of the present invention is applied, by taking an android operating system as an example.
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention. In fig. 1, the architecture of the android operating system includes 4 layers, which are respectively: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third-party application programs) in an android operating system.
The application framework layer is a framework of the application, and a developer can develop some applications based on the application framework layer under the condition of complying with the development principle of the framework of the application.
The system runtime layer includes libraries (also called system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is an operating system layer of an android operating system and belongs to the bottommost layer of an android operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the android operating system based on the Linux kernel.
Taking an android operating system as an example, in the embodiment of the present invention, a developer may develop a software program for implementing the image display method provided in the embodiment of the present invention based on the system architecture of the android operating system shown in fig. 1, so that the image display method may operate based on the android operating system shown in fig. 1. Namely, the processor or the terminal can implement the image display method provided by the embodiment of the invention by running the software program in the android operating system.
The terminal device in the embodiment of the invention can be a mobile terminal device and can also be a non-mobile terminal device. The mobile terminal device may be a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), etc.; the non-mobile terminal device may be a Personal Computer (PC), a Television (TV), a teller machine, a self-service machine, or the like; the embodiments of the present invention are not particularly limited.
The execution main body of the image display method provided by the embodiment of the present invention may be the terminal device (including a mobile terminal device and a non-mobile terminal device), or may also be a functional module and/or a functional entity capable of implementing the method in the terminal device, which may be determined specifically according to actual use requirements, and the embodiment of the present invention is not limited. The following takes a terminal device as an example to exemplarily explain an image display method provided by the embodiment of the present invention.
Referring to fig. 2, an embodiment of the present invention provides an image display method applied to a terminal device, and the method may include steps 201 to 203 described below.
Step 201, responding to the received first input of the user to the first control, and displaying the M second controls by the terminal device.
The first control is used for indicating a first type of image in the terminal equipment, and M is a positive integer.
Optionally, the first control may be an input control (e.g., an input box control), in which case the first input may be the user entering a first parameter in the first control; or the first control may be a selection control, in which case the first input may be the user selecting a first parameter in the first control. The terminal device may then determine, according to the first parameter corresponding to the first input, a first class of images that matches the first parameter. The first parameter may be attribute information of the image.
Optionally, each of the at least one control is for indicating a type of image.
It is to be appreciated that each of the at least one control represents a different image type. The images represented by each control can be classified according to different attribute information of the images to obtain a class of images.
The attribute information of the image may be shooting time information of the image, shooting location information of the image, content attribute information of the image, and the like.
For example, if the images are classified according to their shooting time information, the images are divided into images shot in different time periods (the periods may be of equal or different lengths), for example, images shot in January 2018, February 2018, March 2018, April 2018, May 2018, June 2018, and so on. Then, as shown in (a) of fig. 3, the type of image indicated by each control is the images taken in one time period, and each control corresponds to one time period.
For example, if the images are classified according to their shooting location information, the images are divided into images shot at different locations, for example, images shot at home, at school, at the company, in the mountains, at the Wild Goose Pagoda, at an amusement park, and so on. Then, as shown in fig. 3 (b), the type of image indicated by each control is the images taken at one location, and each control corresponds to one location.
For example, if the images are classified according to the content attribute information (image content type) of the images, the images are classified into different types of images, for example, the images are classified into a person type, a landscape type, a document type, a gourmet type, and the like. Then, as shown in (a) of fig. 4, the type of image indicated by each control is an image with the same image content type, and each control corresponds to one image content type.
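The classification step described in the three examples above can be sketched as a simple grouping by one item of attribute information; each resulting group is one class of images, indicated by one control. The attribute names used here are illustrative assumptions:

```python
from collections import defaultdict

def classify(images, attribute):
    """Group gallery images by one item of attribute information
    (e.g. shooting month, shooting location, or image content type).
    Each resulting group corresponds to one control."""
    groups = defaultdict(list)
    for img in images:
        groups[img[attribute]].append(img)
    return dict(groups)

gallery = [
    {"name": "a.jpg", "location": "home",   "type": "person"},
    {"name": "b.jpg", "location": "school", "type": "landscape"},
    {"name": "c.jpg", "location": "home",   "type": "food"},
]
by_location = classify(gallery, "location")  # one control per location
by_type = classify(gallery, "type")          # one control per content type
```

The same helper serves all three classification schemes; only the attribute passed in changes.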
It should be noted that: in the embodiment of the present invention, the effect of the "cancel" option in each figure is: the user clicks the cancel option and can trigger the terminal device to return to the initial interface. The initial interface may be an original album interface, or an interface where the terminal device is triggered to return to the first control, which may be determined according to actual use requirements. Therefore, after the operation is finished, the user can exit by selecting the 'cancel' option, directly return to the initial interface and select again, and complex rollback actions are avoided.
The first input is an input by which the user selects the first control from the at least one control. The first input may be a click input on the first control, a slide input on the first control, or another feasible operation by the user on the first interface, which may be determined according to actual use requirements; the embodiment of the present invention is not limited thereto.
For example, the click input may be a single click input, a double click input, or any number of click inputs. The slide input may be a slide input in any direction, such as an upward slide input, a downward slide input, a leftward slide input, or a rightward slide input.
Each of the M second controls is configured to determine at least one parameter, the at least one parameter being configured to determine at least one image from the first type of image. And the at least one image is an image matched with the at least one parameter in the first type of image. Specifically, the at least one image is an image in the first class of images that meets the at least one parameter, or an image in the first class of images that does not meet the at least one parameter.
It can be understood that when M is equal to 1, M second controls are one control, and when M is greater than or equal to 2, M second controls are multiple controls, which is specifically set according to an actual use situation, and the embodiment of the present invention is not limited.
If the M second controls are one control, as shown in (b) in fig. 4, the one second control may be used to determine one parameter, or as shown in (a) in fig. 5, the one second control may be used to determine at least one parameter, which is determined according to actual usage requirements, and the embodiment of the present invention is not limited thereto.
If the M second controls include multiple controls, as shown in (b) of fig. 5, each of the M second controls may be used to determine one parameter, or may be used to determine at least one parameter, which is determined according to actual usage requirements, and the embodiment of the present invention is not limited thereto.
It is understood that an image in the first class of images that conforms to the at least one parameter is an image whose image content includes the content information indicated by the at least one parameter; an image in the first class that does not conform to the at least one parameter is an image whose image content does not include that content information. Illustratively, suppose the at least one parameter is: number of people in the image is 5, and proportion of men is 0.5-0.7. Then the at least one image is either the images in the first class whose number of people is 5 and whose proportion of men is 0.5-0.7 (images conforming to the at least one parameter), or the images in the first class other than those (images not conforming to the at least one parameter).
The user can trigger the terminal device to determine at least one parameter according to the target control through inputting the target control in the M second controls, and determine at least one image from the first type of images according to the at least one parameter.
It should be noted that: each parameter in the at least one parameter includes a parameter name and data, where the data corresponding to each parameter may be a numerical value or a numerical range, and is specifically set according to an actual use condition, and the embodiment of the present invention is not limited.
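Since a parameter's data may be either a single numerical value or a numerical range, and the screening may keep either the conforming or the non-conforming images, one possible sketch is the following (the attribute names and data shapes are assumptions for illustration):

```python
def matches(image, parameters):
    """True if the image conforms to every parameter; a parameter's data may
    be a single value or an inclusive (low, high) numerical range."""
    for name, data in parameters.items():
        value = image.get(name)
        if value is None:
            return False
        if isinstance(data, tuple):          # data is a numerical range
            low, high = data
            if not (low <= value <= high):
                return False
        elif value != data:                  # data is a single value
            return False
    return True

def screen(images, parameters, keep_matching=True):
    """Return conforming images, or the complement when keep_matching is False."""
    return [img for img in images if matches(img, parameters) == keep_matching]

first_class = [
    {"name": "a.jpg", "people": 5, "men_ratio": 0.6},
    {"name": "b.jpg", "people": 5, "men_ratio": 0.9},
    {"name": "c.jpg", "people": 3, "men_ratio": 0.6},
]
target_parameters = {"people": 5, "men_ratio": (0.5, 0.7)}
conforming = screen(first_class, target_parameters)
non_conforming = screen(first_class, target_parameters, keep_matching=False)
```

This mirrors the worked example above: number of people 5 with a men proportion of 0.5-0.7 selects only the first image when keeping conforming images, and the other two when keeping non-conforming ones.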
In the embodiment of the invention, the first-class images are further screened by the target controls in the M second controls, so that the images except the target images in the first-class images are not required to be displayed, and under the condition that a user is not clear of picture information, the user can obtain the images really wanted by the user without browsing a large number of pictures, thereby reducing the workload of the user and ensuring the safety of the user information.
Step 202, the terminal device receives a second input of the user to a target control of the M second controls.
The target control is at least one control of the M second controls. The second input is input by the user to determine a target parameter through input to the target control. The second input may be a click input, a slide input, a rotation input, and the like, which are specifically set according to actual use requirements, and the embodiment of the present invention is not limited.
And step 203, responding to the second input, and displaying a target image matched with the target parameter in the first type of image by the terminal equipment.
The target image is an image in the first class of images matched with the target parameter, specifically, the target image may be an image in the first class of images conforming to the target parameter, or an image in the first class of images not conforming to the target parameter, and the target parameter is a parameter corresponding to the second input.
Illustratively, as shown in fig. 5 (a), a second input is used to trigger the terminal device to acquire two parameters (target parameters) of the number of people and the proportion of children, and in response to the second input, the terminal device displays the target image determined according to the target parameters.
Therefore, the target image required by the user can be quickly obtained through the input of the target control, the target image is displayed, and the man-machine interaction performance can be improved.
Optionally, the first control is a control of at least one control, and for example, with reference to fig. 2 and as shown in fig. 6, before step 201, the image display method provided in the embodiment of the present invention may further include step 204 described below.
Step 204, the terminal device receives a first input of the first control of the at least one control from the user.
Each of the at least one control is to indicate a type of image.
For a detailed description, reference may be made to the related description in step 201, which is not repeated herein.
Therefore, the user interface comprises at least one control, the user can select the first control from the at least one control, the user interface is clear at a glance, the user can conveniently obtain the first type of images, and the man-machine interaction performance can be improved.
Optionally, with reference to fig. 6, as shown in fig. 7, before step 203, the image display method provided in the embodiment of the present invention may further include steps 205 to 206 described below.
Step 205, the terminal device determines the target parameter.
And step 206, the terminal equipment determines the target image from the first type of image according to the target parameter.
And the terminal equipment determines a target parameter corresponding to the second input according to the second input of the user, and determines the target image from the first type of image according to the target parameter. Reference may be made to the description of steps 202 to 203 in the above embodiments, which is not repeated herein.
Therefore, the terminal equipment can determine different target images according to different input of the user.
Optionally, the M second controls include a target graphic control, the target graphic control is a closed graphic, the target graphic control includes N regions, each of the N regions is used to indicate a type of parameter, a value of the type of parameter indicated by each region is a continuously changing value, and N is a positive integer; the second input is user input to the target graphical control. Then, in conjunction with fig. 7, as shown in fig. 8, this step 205 can be specifically realized by the step 205a described below.
Step 205a, the terminal device determines a target position and obtains the target parameter corresponding to the target position.
The target position is a position in the target graphical control corresponding to the second input.
The M second controls include a target graphical control, and the M second controls may also include other controls, such as a target input control, and the like.
The closed graphic may be a regular shape such as a circle, a triangle, a quadrangle, or a pentagon, or an irregular shape, which is determined according to actual use requirements; the embodiment of the present invention is not limited thereto. In the embodiment of the present invention, a target graphic control is taken as an example for description.
In the embodiment of the present invention, "continuously changing" means that the values of the class of parameters indicated by each region vary continuously from small to large; specifically, they vary according to a preset rule. For example, if the value of each parameter is a positive integer, the values of the class of parameters may be consecutive positive integers, or positive integers spaced at equal or unequal intervals.
Optionally, the target graphical control includes a region, as shown in (b) of fig. 4, and the target graphical control (the region) is used to indicate a type of parameter, and the value of the type of parameter indicated by the target graphical control is a continuously changing value.
Optionally, the target graphic control includes a plurality of regions, each of which is used to indicate a type of parameter, and the value of the type of parameter indicated by each region changes continuously. As shown in (a) of fig. 9, the target graphic control includes four areas A, B, C, and D, where area A indicates the number of people, area B indicates the proportion of men, area C indicates the proportion of children, and area D indicates the proportion of the elderly.
According to the second input, the terminal device determines a target position (at least one position) in the target control corresponding to the second input, acquires a target parameter corresponding to the target position (at least one parameter; each position corresponds to one parameter), and determines the target image (at least one image) from the first type of image according to the target parameter.
Illustratively, as shown in (a) of fig. 9, when the user taps position a (any position on ray a in area A), position b (any position on ray b in area B), and position c (any position on ray c in area C) on the target graphic control, the terminal device determines three target parameters from the three positions: a number of people A1, a proportion of men B1, and a proportion of children C1. The terminal device then determines as the target image either an image in the first type of image that matches the number of people A1, the proportion of men B1, and the proportion of children C1, or an image in the first type of image that does not match these parameters.
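The match-or-exclude selection described above can be sketched as a filter over attributes extracted from each image in advance; the attribute names, values, and file names here are illustrative assumptions, not part of the patent:

```python
# Hypothetical per-image attributes, e.g. produced by image analysis at indexing time.
first_type = [
    {"name": "img_01.jpg", "people": 3, "men": 0.33, "children": 0.33},
    {"name": "img_02.jpg", "people": 5, "men": 0.60, "children": 0.20},
]

def matches(image, target_params):
    """True if the image matches every target parameter."""
    return all(image.get(k) == v for k, v in target_params.items())

target_params = {"people": 3, "men": 0.33, "children": 0.33}
hits = [im["name"] for im in first_type if matches(im, target_params)]        # match mode
misses = [im["name"] for im in first_type if not matches(im, target_params)]  # exclude mode
print(hits, misses)   # ['img_01.jpg'] ['img_02.jpg']
```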
Because the M second controls include the target graphic control, the controls are more vivid and intuitive and easier to operate, which improves human-computer interaction performance. Moreover, because the value of the type of parameter indicated by each region changes continuously, the user can obtain dynamic parameters according to actual requirements and thereby obtain satisfactory images.
Optionally, the target graphic control may further include at least one pointer control, the second input is specifically an input to a target pointer control, the target pointer control is a control in the at least one pointer control, and the target position is specifically a position of the target pointer control corresponding to the second input.
Illustratively, as shown in (b) of fig. 9, the second input is an input in which the user moves the pointer of area A from position a0 to position a1, the pointer of area B from position b0 to position b1, and the pointer of area C from position c0 to position c1; from positions a1, b1, and c1, the terminal device determines three target parameters: a number of people A1, a proportion of men B1, and a proportion of children C1. The terminal device then determines as the target image either an image in the first type of image that matches the number of people A1, the proportion of men B1, and the proportion of children C1, or an image in the first type of image that does not match these parameters.
Adding the pointer control makes the design easier to operate and can improve human-computer interaction performance.
It should be noted that the embodiment of the present invention is described with the image content type corresponding to the first control being a person type and the first type of image being person images. By analogy, if the image content type corresponding to the first control is a landscape type and the first type of image is landscape images, as shown in (a) of fig. 10, the parameter type corresponding to each second control may include a sky proportion, a mountain proportion, a sea proportion, and a forest proportion; if the image content type corresponding to the first control is a document type and the first type of image is document images, as shown in (b) of fig. 10, the parameter type corresponding to each second control may include proportions of different typefaces, such as a Song-typeface (Songti) proportion, a cursive-script proportion, and a clerical-script proportion; if the image content type corresponding to the first control is a gourmet type and the first type of image is gourmet images, as shown in fig. 11, the parameter type corresponding to each second control may include proportions of different cuisines, such as a Hunan cuisine proportion and a Cantonese cuisine proportion. The type of the parameter may be determined according to actual use requirements; the embodiment of the present invention is not limited thereto.
It should be noted that if the data corresponding to each parameter is a numerical range, the user may determine a numerical range through one input in a region (each position in the region corresponds to one numerical range, so one input determines one range), or through two inputs in a region (each position in the region corresponds to one numerical value, one input determines one value, and two consecutive inputs determine one range). This may be set according to the actual use situation; the embodiment of the present invention is not limited thereto.
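The two range-selection modes in this note can be sketched briefly as follows; the step width and the sample values are assumptions for illustration:

```python
def range_from_inputs(values, step=10):
    """One input selects the step-wide range containing it; two consecutive
    inputs select the span between the two entered values."""
    if len(values) == 1:
        lo = (values[0] // step) * step
        return (lo, lo + step)
    a, b = values[:2]
    return (min(a, b), max(a, b))

print(range_from_inputs([37]))      # (30, 40): one input -> one range
print(range_from_inputs([55, 20]))  # (20, 55): two inputs -> one range
```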
Optionally, the M second controls include a target input control, and the second input is input of a parameter in the target input control. Then, in conjunction with fig. 7, as shown in fig. 12, this step 205 can be specifically realized by the step 205b described below.
Step 205b, the terminal device determines the parameter input in the target input control as the target parameter.
The M second controls include a target input control and may also include other controls, such as a target graphic control. With reference to the foregoing examples, in the embodiment of the present invention the M second controls may include only the target graphic control, only the target input control, both the target graphic control and the target input control, or other feasible controls. This may be determined according to actual use requirements; the embodiment of the present invention is not limited thereto.
It should be noted that: in the embodiment of the present invention, the number of target graphic controls in the M second controls and the number of target input controls in the M second controls are not limited, and may be specifically determined according to actual use requirements.
The user can input the target parameters in the target input control according to the self requirement.
For example, the target input control may be a control as shown in (a) of fig. 13, in which the user inputs target parameters (including parameter names and data) in an input box according to his or her own requirements; or it may be a control as shown in (b) of fig. 13, in which the user inputs the data corresponding to preset parameter names in an input box. This is determined according to actual use requirements; the embodiment of the present invention is not limited thereto.
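The two input modes of fig. 13 can be sketched as follows; the `name=value` syntax and the preset parameter names are assumptions introduced here for illustration:

```python
def parse_free_form(text):
    """Mode (a): the user types both the parameter name and its data, e.g. 'people=3'."""
    name, value = text.split("=", 1)
    return {name.strip(): float(value)}

def fill_presets(preset_names, entries):
    """Mode (b): the user only enters data for preset parameter names;
    empty input boxes are ignored."""
    return {n: float(v) for n, v in zip(preset_names, entries) if v != ""}

print(parse_free_form("people = 3"))                    # {'people': 3.0}
print(fill_presets(["men", "children"], ["0.6", ""]))   # {'men': 0.6}
```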
Therefore, the display mode of the control is increased, more choices are provided for a user, and the man-machine interaction performance can be improved.
Optionally, the M second controls may further include a target drawing board control. If the user has a relatively complete and clear idea of the content of the required image (hereinafter referred to as the first image), the user may draw part of the content of the first image in the target drawing board control; the terminal device then extracts a target parameter from the first image drawn by the user and determines the target image from the first type of image according to the target parameter. Alternatively, the terminal device may compare the first image drawn by the user with each image in the first type of image one by one to obtain the target image. The specific method can be determined according to actual use requirements; the embodiment of the present invention is not limited thereto.
Optionally, with reference to fig. 6, before step 204, the image display method provided in the embodiment of the present invention may further include steps 207 to 208 described below.
And step 207, the terminal equipment receives a third input of the user.
The third input may be an input by which the user starts a target function in the album application (an application of the image display function provided in the embodiment of the present invention, which may also be referred to as an image fast-search function or an image screening function); the third input may also be an input by which the user starts the album application from a non-album application (for example, when the user needs to share an image in a non-album application, the album is started); the third input may also be an input by which the user starts the album application on the desktop (the terminal device opens the album application from the desktop); or the third input may be another feasible input, determined according to actual use requirements, and the embodiment of the present invention is not limited thereto.
The description of the third input may refer to the above description related to the first input, and will not be repeated here.
And step 208, responding to the third input, and displaying the at least one control by the terminal equipment.
In this way, entries for starting the image display function provided by the embodiment of the present invention are added, so that the user can conveniently and quickly find the required target image, which can improve human-computer interaction performance.
It should be noted that: with reference to fig. 2, before step 203, the image display method according to the embodiment of the present invention may further include the following steps 209 to 210, where step 203 may be specifically realized by step 203a described below.
And step 209, responding to the second input, and displaying a third control by the terminal equipment.
And step 210, the terminal device receives a fourth input of the user to the third control.
The description of the fourth input may refer to the above description related to the second input, and will not be repeated here.
And step 203a, responding to the fourth input, and displaying the target image matched with the second parameter in the first type of image by the terminal equipment.
The second parameter is the target parameter and a parameter corresponding to the fourth input.
The terminal device acquires the target parameter corresponding to the second input and displays the third control. The terminal device then obtains a parameter corresponding to the fourth input (hereinafter referred to as the third parameter) and determines the target image according to the target parameter and the third parameter (together, the second parameter). The target image is an image in the first type of image that matches both the target parameter and the third parameter, or an image in the first type of image that does not match them.
Illustratively, the M second controls are controls shown in (b) of fig. 4, the user performs a second input on the second control, the terminal device obtains a target parameter (a number of people parameter) corresponding to the second input, and displays a third control shown in fig. 14, the user performs a fourth input on the third control, the terminal device obtains a third parameter (at least one of a man ratio parameter, a child ratio parameter, a middle-aged ratio parameter, and an aged ratio parameter) corresponding to the fourth input, and the terminal device determines a target image according to the target parameter and the third parameter, and displays the target image.
Or the terminal device acquires a target parameter corresponding to the second input, determines a second type of image from the first type of image according to the target parameter, and displays a fourth control, wherein the fourth control is used for determining a third parameter, and the third parameter is used for determining a target image from the second type of image. The second type of image is an image which accords with the target parameter in the first type of image, or an image which does not accord with the target parameter in the first type of image. The target image is an image which accords with the third parameter in the second type of image, or an image which does not accord with the third parameter in the second type of image.
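The hierarchical narrowing described above (first type of image → second type of image → target image) can be sketched as two successive filter passes; all attribute names, values, and file names here are illustrative assumptions:

```python
first_type = [
    {"name": "a.jpg", "people": 3, "men": 0.5},
    {"name": "b.jpg", "people": 3, "men": 0.2},
    {"name": "c.jpg", "people": 5, "men": 0.5},
]

def filter_images(images, params, invert=False):
    """Keep images matching all params; with invert=True, keep the rest."""
    ok = lambda im: all(im.get(k) == v for k, v in params.items())
    return [im for im in images if ok(im) != invert]

# Stage 1: the target parameter (from the second input) yields the second type of image.
second_type = filter_images(first_type, {"people": 3})
# Stage 2: the third parameter (from the fourth input) picks the target image.
target = filter_images(second_type, {"men": 0.5})
print([im["name"] for im in target])   # ['a.jpg']
```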
Therefore, the hierarchical sense of control design can be increased, and the human-computer interaction experience of a user can be better.
Each of the drawings in the embodiments of the present invention is illustrated as an independent example; when the embodiments of the present invention are specifically implemented, each drawing may also be implemented in combination with any other drawing that can be combined with it, and the embodiment of the present invention is not limited thereto. For example, in conjunction with fig. 7 or fig. 12, before step 204, the image display method provided by the embodiment of the present invention may also include steps 207 to 208 described above.
The embodiment of the invention provides an image display method, wherein a terminal device can display M second controls by responding to a received first input of a user to a first control, and the first control is used for indicating a first type of image in the terminal device; receiving a second input of the user to a target control in the M second controls; and responding to the second input, and displaying a target image matched with a target parameter in the first class of images, wherein the target parameter is a parameter corresponding to the second input. Through the scheme, a user can trigger the terminal equipment to display the M second controls through first input of the first control (corresponding to the first type of image), and then the user triggers the terminal equipment to display a target image matched with target parameters corresponding to second input in the first type of image through second input of a target control in the M second controls. Therefore, the user can quickly screen out the images required by the user from the gallery by inputting the two controls. The problems that in the prior art, the process of screening images by the terminal equipment is complex and tedious, and the human-computer interaction performance is poor can be solved.
As shown in fig. 15, an embodiment of the present invention provides a terminal device 120, where the terminal device 120 includes: a display module 121 and a receiving module 122; the display module 121 is configured to display, in response to a received first input to a first control by a user, M second controls, where the first control is used to indicate a first type of image in the terminal device, and M is a positive integer; the receiving module 122 is configured to receive a second input of a user to a target control in the M second controls displayed by the display module 121; the display module 121 is further configured to display, in response to the second input received by the receiving module 122, a target image in the first type of image, where the target image matches a target parameter, where the target parameter is a parameter corresponding to the second input.
It should be noted that: the terminal device 120 shown in fig. 15 may include only the display module 121 and the receiving module 122; a display module 121, a receiving module 122 and a determining module 123 may also be included (i.e., in the terminal device 120, the determining module 123 is an optional module). In order to clearly illustrate the structure of the terminal device 120, the determination module 123 is illustrated as a dashed box in fig. 15.
Optionally, the receiving module 122 is further configured to receive a first input of the first control from the at least one control by the user before the displaying module 121 displays the M second controls in response to the received first input of the first control by the user, where each of the at least one control is used to indicate a type of image.
Optionally, the terminal device 120 further includes: a determination module 123; the determining module 123 is configured to determine the target parameter before the display module 121 displays the target image matching the target parameter in the first type of image, and determine the target image from the first type of image according to the target parameter.
Optionally, the M second controls include a target graphic control, the target graphic control is a closed graphic, the target graphic control includes N regions, each of the N regions is used to indicate a type of parameter, a value of the type of parameter indicated by each region is a continuously changing value, and N is a positive integer; the second input is input to the target graphic control by a user; the determining module 123 is specifically configured to determine a target position, and obtain the target parameter corresponding to the target position, where the target position is a position in the target graphic control corresponding to the second input.
Optionally, the target graphic control further includes at least one pointer control, the second input is specifically an input to a target pointer control, the target pointer control is a control in the at least one pointer control, and the target position is specifically a position of the target pointer control corresponding to the second input.
Optionally, the M second controls include a target input control, and the second input is input of a parameter in the target input control; the determining module 123 is specifically configured to determine the parameter input in the target input control as the target parameter.
The terminal device provided in the embodiment of the present invention can implement each process shown in any one of fig. 2 to 14 in the above method embodiments, and details are not described here again to avoid repetition.
The embodiment of the invention provides terminal equipment, wherein the terminal equipment can display M second controls by responding to received first input of a user to a first control, and the first control is used for indicating a first type of image in the terminal equipment; receiving a second input of the user to a target control in the M second controls; and responding to the second input, and displaying a target image matched with a target parameter in the first class of images, wherein the target parameter is a parameter corresponding to the second input. Through the scheme, a user can trigger the terminal equipment to display the M second controls through first input of the first control (corresponding to the first type of image), and then the user triggers the terminal equipment to display a target image matched with target parameters corresponding to second input in the first type of image through second input of a target control in the M second controls. Therefore, the user can quickly screen out the images required by the user from the gallery by inputting the two controls. The problems that in the prior art, the process of screening images by the terminal equipment is complex and tedious, and the human-computer interaction performance is poor can be solved.
Fig. 16 is a schematic diagram of a hardware structure of a terminal device for implementing various embodiments of the present invention. As shown in fig. 16, the terminal device 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the terminal device configuration shown in fig. 16 does not constitute a limitation of the terminal device, and that the terminal device may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal device, a wearable device, a pedometer, and the like.
The display unit 106 is configured to display, in response to a received first input to a first control by a user, M second controls, where the first control is used to indicate a first type of image in the terminal device, and M is a positive integer; a user input unit 107, configured to receive a second input of a user to a target control in the M second controls; the display unit 106 is further configured to display, in response to the second input, a target image in the first type of image that matches a target parameter, where the target parameter is a parameter corresponding to the second input.
According to the terminal device provided by the embodiment of the invention, the terminal device can display the M second controls by responding to the received first input of the user to the first control, wherein the first control is used for indicating the first type of images in the terminal device; receiving a second input of the user to a target control in the M second controls; and responding to the second input, and displaying a target image matched with a target parameter in the first class of images, wherein the target parameter is a parameter corresponding to the second input. Through the scheme, a user can trigger the terminal equipment to display the M second controls through first input of the first control (corresponding to the first type of image), and then the user triggers the terminal equipment to display a target image matched with target parameters corresponding to second input in the first type of image through second input of a target control in the M second controls. Therefore, the user can quickly screen out the images required by the user from the gallery by inputting the two controls. The problems that in the prior art, the process of screening images by the terminal equipment is complex and tedious, and the human-computer interaction performance is poor can be solved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used for receiving and sending signals during a message transmission or call process, and specifically, after receiving downlink data from a base station, the downlink data is processed by the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The terminal device provides wireless broadband internet access to the user through the network module 102, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the terminal device 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042; the graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or another storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 can receive sound and process it into audio data. In the case of a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101 and then output.
The terminal device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal device posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near the touch panel 1071 using a finger, a stylus, or any suitable object or attachment). The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position of the user's touch, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, the other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 16, the touch panel 1071 and the display panel 1061 are two independent components to implement the input and output functions of the terminal device, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the terminal device, and is not limited herein.
The interface unit 108 is an interface for connecting an external device to the terminal apparatus 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal apparatus 100 or may be used to transmit data between the terminal apparatus 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the terminal device, connects various parts of the entire terminal device by using various interfaces and lines, and performs various functions of the terminal device and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the terminal device. Processor 110 may include one or more processing units; alternatively, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The terminal device 100 may further include a power supply 111 (such as a battery) for supplying power to each component, and optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the terminal device 100 includes some functional modules that are not shown, and are not described in detail here.
Optionally, an embodiment of the present invention further provides a terminal device, which may include the processor 110 shown in fig. 16, the memory 109, and a computer program stored in the memory 109 and capable of being executed on the processor 110, where the computer program, when executed by the processor 110, implements each process of the image display method shown in any one of fig. 2 to fig. 14 in the foregoing method embodiments, and can achieve the same technical effect, and details are not described here to avoid repetition.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program. When executed by a processor, the computer program implements each process of the image display method shown in any one of fig. 2 to fig. 14 in the foregoing method embodiments, and can achieve the same technical effects; to avoid repetition, details are not described here again. The computer-readable storage medium may be, for example, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disc.
It should be noted that, in this document, the terms "comprises," "comprising," and any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly can also be implemented by hardware; in many cases, however, the former is the better implementation. Based on such an understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.
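The software implementation mentioned above can be illustrated with a minimal sketch of the two-stage interaction that the claims describe: a first input on a category control reveals the second controls, and a second input supplies a target parameter used to pick the matching image from the first type of images. Everything here is a hypothetical assumption for illustration only (the class name, the `images_by_type` layout, and the `brightness` parameter key do not appear in the patent):

```python
class ImageDisplay:
    """Hypothetical sketch of the claimed two-stage selection flow."""

    def __init__(self, images_by_type):
        # images_by_type: e.g. {"landscape": [{"brightness": 0.3}, ...]}
        self.images_by_type = images_by_type
        self.first_type = None

    def on_first_input(self, control_type):
        """First input: record the first type of image the first control
        indicates, and return the M second controls to display."""
        self.first_type = control_type
        return ["graphic_control"]  # M = 1 second control in this sketch

    def on_second_input(self, target_parameter, key="brightness", tol=0.1):
        """Second input: determine the target parameter and pick, from the
        first type of images, the target image matched with it."""
        candidates = self.images_by_type.get(self.first_type, [])
        matches = [img for img in candidates
                   if abs(img.get(key, float("inf")) - target_parameter) <= tol]
        return matches[0] if matches else None
```

For example, after `on_first_input("landscape")`, a second input carrying the parameter `0.75` would select the image whose `brightness` lies within the tolerance of that value.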

Claims (8)

1. An image display method, applied to a terminal device, the method comprising:
in response to receiving a first input of a user to a first control, displaying M second controls, wherein the first control is used for indicating a first type of image in the terminal device matched with a first parameter corresponding to the first input, and M is a positive integer;
receiving a second input of a user to a target control among the M second controls; and
in response to the second input, displaying a target image, among the first type of images, matched with a target parameter, wherein the target parameter is a parameter corresponding to the second input;
wherein before the displaying a target image matched with the target parameter among the first type of images, the method further comprises:
determining the target parameter, and determining the target image from the first type of images according to the target parameter;
wherein the M second controls comprise a target graphic control, the target graphic control is a closed graphic, the target graphic control comprises N areas, each of the N areas is used for indicating a type of parameter, a value of the type of parameter indicated by each area varies continuously, and N is a positive integer;
the second input is an input of the user to the target graphic control; and
the determining the target parameter comprises:
determining a target position, and acquiring the target parameter corresponding to the target position, wherein the target position is a position, in the target graphic control, corresponding to the second input.
2. The method according to claim 1, wherein before the displaying M second controls in response to receiving the first input of the user to the first control, the method further comprises:
receiving the first input of the user to the first control among at least one control, wherein each of the at least one control is used for indicating a type of image.
3. The method according to claim 1, wherein the target graphic control further comprises at least one pointer control, the second input is specifically an input to a target pointer control, the target pointer control is one of the at least one pointer control, and the target position is specifically a position of the target pointer control corresponding to the second input.
4. A terminal device, comprising: a display module, a determining module, and a receiving module; wherein
the display module is configured to display M second controls in response to a received first input of a user to a first control, wherein the first control is used for indicating a first type of image in the terminal device matched with a first parameter corresponding to the first input, and M is a positive integer;
the receiving module is configured to receive a second input of a user to a target control among the M second controls displayed by the display module;
the display module is further configured to display, in response to the second input received by the receiving module, a target image, among the first type of images, matched with a target parameter, wherein the target parameter is a parameter corresponding to the second input;
the determining module is configured to determine the target parameter before the display module displays the target image matched with the target parameter among the first type of images, and to determine the target image from the first type of images according to the target parameter;
the M second controls comprise a target graphic control, the target graphic control is a closed graphic, the target graphic control comprises N areas, each of the N areas is used for indicating a type of parameter, a value of the type of parameter indicated by each area varies continuously, and N is a positive integer;
the second input is an input of the user to the target graphic control; and
the determining module is specifically configured to determine a target position and acquire the target parameter corresponding to the target position, wherein the target position is a position, in the target graphic control, corresponding to the second input.
5. The terminal device according to claim 4, wherein the receiving module is further configured to receive the first input of the user to the first control among at least one control before the display module displays the M second controls in response to the received first input, wherein each of the at least one control is used for indicating a type of image.
6. The terminal device according to claim 4, wherein the target graphic control further comprises at least one pointer control, the second input is specifically an input to a target pointer control, the target pointer control is one of the at least one pointer control, and the target position is specifically a position of the target pointer control corresponding to the second input.
7. A terminal device, comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the image display method according to any one of claims 1 to 3.
8. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the image display method according to any one of claims 1 to 3.
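The closed graphic control recited in claims 1 and 4 can be sketched as follows. This is only one illustrative assumption about the geometry (a circular control whose N areas are angular sectors); the names `Region` and `parameter_at` and the example parameter ranges are hypothetical, not from the patent. The angle of the touch position selects one of the N areas, and the offset within that area is interpolated to the area's continuously varying parameter value:

```python
import math
from dataclasses import dataclass

@dataclass
class Region:
    name: str          # the type of parameter this area indicates
    start_deg: float   # angular extent of the area on the closed control
    end_deg: float
    min_val: float     # continuously varying value range of the area
    max_val: float

def parameter_at(regions, x, y, cx=0.0, cy=0.0):
    """Map a touch position (x, y) inside a circular graphic control
    centred at (cx, cy) to (parameter_name, value): the angle selects
    one of the N areas, and the position within that area is linearly
    interpolated to the area's continuously varying value."""
    angle = math.degrees(math.atan2(y - cy, x - cx)) % 360.0
    for r in regions:
        if r.start_deg <= angle < r.end_deg:
            t = (angle - r.start_deg) / (r.end_deg - r.start_deg)
            return r.name, r.min_val + t * (r.max_val - r.min_val)
    raise ValueError("position falls outside every area")
```

With two areas, say `Region("size", 0, 180, 0, 10)` and `Region("date", 180, 360, 1, 31)`, a touch straight above the centre lands in the "size" area at the midpoint of its value range.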
CN201910364497.8A 2019-04-30 2019-04-30 Image display method and terminal equipment Active CN110245246B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910364497.8A CN110245246B (en) 2019-04-30 2019-04-30 Image display method and terminal equipment
PCT/CN2020/081249 WO2020220873A1 (en) 2019-04-30 2020-03-25 Image display method and terminal device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910364497.8A CN110245246B (en) 2019-04-30 2019-04-30 Image display method and terminal equipment

Publications (2)

Publication Number Publication Date
CN110245246A CN110245246A (en) 2019-09-17
CN110245246B true CN110245246B (en) 2021-11-16

Family

ID=67883623

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910364497.8A Active CN110245246B (en) 2019-04-30 2019-04-30 Image display method and terminal equipment

Country Status (2)

Country Link
CN (1) CN110245246B (en)
WO (1) WO2020220873A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110245246B (en) * 2019-04-30 2021-11-16 维沃移动通信有限公司 Image display method and terminal equipment
CN110889002A (en) * 2019-11-26 2020-03-17 维沃移动通信有限公司 Image display method and electronic equipment
CN111984809A (en) * 2020-08-20 2020-11-24 深圳集智数字科技有限公司 Image searching method and related device
CN112148185A (en) * 2020-09-17 2020-12-29 维沃移动通信(杭州)有限公司 Image display method and device
CN113179205B (en) * 2021-03-31 2023-04-18 维沃移动通信有限公司 Image sharing method and device and electronic equipment
CN113992789A (en) * 2021-10-29 2022-01-28 维沃移动通信有限公司 Image processing method and device

Citations (5)

Publication number Priority date Publication date Assignee Title
CN104239336A (en) * 2013-06-19 2014-12-24 华为技术有限公司 Image screening method, device and terminal
CN104284252A (en) * 2014-09-10 2015-01-14 康佳集团股份有限公司 Method for generating electronic photo album automatically
WO2015063551A1 (en) * 2013-11-01 2015-05-07 Sony Corporation Method and apparatus for filtering pictures
CN105893412A (en) * 2015-11-24 2016-08-24 乐视致新电子科技(天津)有限公司 Image sharing method and apparatus
WO2018219664A1 (en) * 2017-05-31 2018-12-06 Interdigital Vc Holdings, Inc. A method and a device for picture encoding and decoding

Family Cites Families (10)

Publication number Priority date Publication date Assignee Title
CN103155425B (en) * 2010-08-13 2015-07-29 Lg电子株式会社 mobile terminal, display device and control method thereof
KR101631966B1 (en) * 2014-06-19 2016-06-20 엘지전자 주식회사 Mobile terminal and method for controlling the same
CN104794189B (en) * 2015-04-16 2018-05-08 惠州Tcl移动通信有限公司 A kind of method for screening images and screening system
CN109947968A (en) * 2017-10-13 2019-06-28 南京唯实科技有限公司 A kind of big data method for screening images
CN108769374B (en) * 2018-04-25 2020-10-02 维沃移动通信有限公司 Image management method and mobile terminal
CN108776822B (en) * 2018-06-22 2020-04-24 腾讯科技(深圳)有限公司 Target area detection method, device, terminal and storage medium
CN109151442B (en) * 2018-09-30 2021-01-08 维沃移动通信有限公司 Image shooting method and terminal
CN109543744B (en) * 2018-11-19 2022-10-14 南京邮电大学 Multi-category deep learning image identification method based on Loongson group and application thereof
CN109635683A (en) * 2018-11-27 2019-04-16 维沃移动通信有限公司 Method for extracting content and terminal device in a kind of image
CN110245246B (en) * 2019-04-30 2021-11-16 维沃移动通信有限公司 Image display method and terminal equipment


Also Published As

Publication number Publication date
CN110245246A (en) 2019-09-17
WO2020220873A1 (en) 2020-11-05

Similar Documents

Publication Publication Date Title
CN110245246B (en) Image display method and terminal equipment
CN111459355B (en) Content sharing method and electronic equipment
CN110891144B (en) Image display method and electronic equipment
CN111596845B (en) Display control method and device and electronic equipment
CN111010332A (en) Group chat method and electronic equipment
CN110489029B (en) Icon display method and terminal equipment
CN109543099B (en) Content recommendation method and terminal equipment
CN110502163B (en) Terminal device control method and terminal device
CN111142747B (en) Group management method and electronic equipment
CN110489025B (en) Interface display method and terminal equipment
CN110752981B (en) Information control method and electronic equipment
CN110489045B (en) Object display method and terminal equipment
CN110908557B (en) Information display method and terminal equipment
CN109828731B (en) Searching method and terminal equipment
CN109358931B (en) Interface display method and terminal
CN110703972B (en) File control method and electronic equipment
CN108874906B (en) Information recommendation method and terminal
CN110069188B (en) Identification display method and terminal equipment
CN109901761B (en) Content display method and mobile terminal
CN111026350A (en) Display control method and electronic equipment
CN110908555A (en) Icon display method and electronic equipment
CN110647277A (en) Control method and terminal equipment
CN111352547A (en) Display method and electronic equipment
CN111090489A (en) Information control method and electronic equipment
CN111124231B (en) Picture generation method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant