WO2020220873A1 - Image display method and terminal device

Image display method and terminal device

Info

Publication number
WO2020220873A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
input
control
image
parameter
Prior art date
Application number
PCT/CN2020/081249
Other languages
English (en)
Chinese (zh)
Inventor
唐义
朱宗伟
Original Assignee
维沃移动通信有限公司
Priority date
Filing date
Publication date
Application filed by 维沃移动通信有限公司
Publication of WO2020220873A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50: Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F 16/53: Querying
    • G06F 16/55: Clustering; Classification
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Definitions

  • The embodiments of the present disclosure relate to the field of communication technology, and in particular to an image display method and a terminal device.
  • The terminal device in the related art can classify the images in the gallery according to image attribute parameters such as time parameters, location parameters, and image type parameters (such as the person type, scenery type, food type, and so on). Therefore, in the process of screening images, the user can first trigger the terminal device to filter out the images belonging to a certain time period, a certain place, or a certain type (the first screening result), and then trigger the terminal device to browse the images in the first screening result one by one to filter out the images that the user needs to share.
  • The embodiments of the present disclosure provide an image display method and a terminal device, so as to solve the problems in the related art that the process of screening images on a terminal device is complicated and cumbersome and that the human-computer interaction performance is poor.
  • In a first aspect, an embodiment of the present disclosure provides an image display method applied to a terminal device. The method includes: in response to a received first input from a user on a first control, displaying M second controls, where the first control is used to indicate a first type of image in the terminal device and M is a positive integer; receiving a second input from the user on a target control among the M second controls; and in response to the second input, displaying a target image in the first type of image that matches a target parameter, the target parameter being the parameter corresponding to the second input.
  • In a second aspect, an embodiment of the present disclosure provides a terminal device. The terminal device includes a display module and a receiving module. The display module is configured to display M second controls in response to a received first input from a user on a first control, where the first control is used to indicate a first type of image in the terminal device and M is a positive integer. The receiving module is configured to receive a second input from the user on a target control among the M second controls displayed by the display module. The display module is further configured to display, in response to the second input received by the receiving module, a target image in the first type of image that matches a target parameter, the target parameter being the parameter corresponding to the second input.
  • In a third aspect, an embodiment of the present disclosure provides a terminal device including a processor, a memory, and a computer program stored on the memory and executable on the processor. When the computer program is executed by the processor, the steps of the image display method in the first aspect are implemented.
  • In a fourth aspect, an embodiment of the present disclosure provides a computer-readable storage medium storing a computer program. When the computer program is executed by a processor, the steps of the image display method in the first aspect are implemented.
  • In the embodiments of the present disclosure, the terminal device may display M second controls in response to a received first input from a user on a first control, the first control being used to indicate a first type of image in the terminal device and M being a positive integer; receive a second input from the user on a target control among the M second controls; and, in response to the second input, display a target image in the first type of image that matches a target parameter, the target parameter being the parameter corresponding to the second input. In this way, the user can trigger the terminal device to display the M second controls through the first input on the first control (corresponding to the first type of image), and then trigger, through the second input on the target control among the M second controls, the terminal device to display the target image in the first type of image that matches the target parameter corresponding to the second input. Thus, by inputting on these two controls, the user can quickly filter out the images he or she needs from the gallery, which solves the problems in the related art that the process of screening images on a terminal device is complicated and cumbersome and that the human-computer interaction performance is poor.
  • FIG. 1 is a schematic structural diagram of a possible Android operating system provided by an embodiment of the present disclosure;
  • FIG. 3 is the first schematic diagram of the interface of the image display method provided by an embodiment of the disclosure;
  • FIG. 5 is the third schematic diagram of the interface of the image display method provided by an embodiment of the disclosure;
  • FIG. 6 is the second flowchart of the image display method provided by an embodiment of the disclosure;
  • FIG. 7 is the third flowchart of the image display method provided by an embodiment of the disclosure;
  • FIG. 8 is the fourth flowchart of the image display method provided by an embodiment of the disclosure;
  • FIG. 9 is the fourth schematic diagram of the interface of the image display method provided by an embodiment of the disclosure;
  • FIG. 10 is the fifth schematic diagram of the interface of the image display method provided by an embodiment of the disclosure;
  • FIG. 11 is the sixth schematic diagram of the interface of the image display method provided by an embodiment of the disclosure;
  • FIG. 12 is the fifth flowchart of the image display method provided by an embodiment of the disclosure;
  • FIG. 13 is the seventh schematic diagram of the interface of the image display method provided by an embodiment of the disclosure;
  • FIG. 15 is a schematic structural diagram of a terminal device provided by an embodiment of the disclosure;
  • FIG. 16 is a schematic diagram of the hardware of a terminal device provided by an embodiment of the disclosure.
  • The terms “first”, “second”, “third”, and “fourth” in the specification and claims of the present disclosure are used to distinguish different objects, rather than to describe a specific order of objects. For example, the first input, the second input, the third input, and the fourth input are used to distinguish different inputs, rather than to describe a specific order of inputs.
  • In the embodiments of the present disclosure, words such as “exemplary” or “for example” are used to indicate examples, illustrations, or descriptions. Any embodiment or design solution described as “exemplary” or “for example” in the embodiments of the present disclosure should not be construed as more preferable or advantageous than other embodiments or design solutions. To be precise, words such as “exemplary” or “for example” are used to present related concepts in a specific manner.
  • In the embodiments of the present disclosure, “multiple” refers to two or more than two; for example, multiple processing units refers to two or more processing units, multiple elements refers to two or more elements, and so on.
  • In the embodiments of the present disclosure, a terminal device can display M second controls in response to a received first input from a user on a first control, the first control being used to indicate a first type of image in the terminal device; receive a second input from the user on a target control among the M second controls; and, in response to the second input, display a target image in the first type of image that matches a target parameter, the target parameter being the parameter corresponding to the second input.
  • In this way, the user can trigger the terminal device to display the M second controls through the first input on the first control (corresponding to the first type of image), and then trigger, through the second input on the target control among the M second controls, the terminal device to display the target image in the first type of image that matches the target parameter corresponding to the second input.
  • the following uses the Android operating system as an example to introduce the software environment to which the image display method provided by the embodiments of the present disclosure is applied.
  • As shown in FIG. 1, it is a schematic structural diagram of a possible Android operating system provided by an embodiment of the present disclosure.
  • the architecture of the Android operating system includes 4 layers, namely: application layer, application framework layer, system runtime library layer, and kernel layer (specifically, it may be the Linux kernel layer).
  • the application layer includes various applications (including system applications and third-party applications) in the Android operating system.
  • the application framework layer is the framework of the application. Developers can develop some applications based on the application framework layer while complying with the development principles of the application framework.
  • the system runtime layer includes a library (also called a system library) and an Android operating system runtime environment.
  • the library mainly provides various resources needed by the Android operating system.
  • the Android operating system operating environment is used to provide a software environment for the Android operating system.
  • The kernel layer is the operating system layer of the Android operating system and belongs to the lowest layer of the Android operating system software hierarchy.
  • the kernel layer is based on the Linux kernel to provide core system services and hardware-related drivers for the Android operating system.
  • Taking the Android operating system as an example, in the embodiments of the present disclosure, developers can develop a software program that implements the image display method provided by the embodiments of the present disclosure based on the system architecture of the Android operating system shown in FIG. 1, so that the image display method can run on the Android operating system shown in FIG. 1. That is, the processor or the terminal device can implement the image display method provided by the embodiments of the present disclosure by running the software program in the Android operating system.
  • the terminal device in the embodiment of the present disclosure may be a mobile terminal device or a non-mobile terminal device.
  • For example, the mobile terminal device can be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, an in-vehicle terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), etc.; the non-mobile terminal device may be a personal computer (PC), a television (TV), a teller machine, or a self-service machine, etc. The embodiments of the present disclosure do not specifically limit this.
  • The execution subject of the image display method provided by the embodiments of the present disclosure may be the aforementioned terminal device (including mobile terminal devices and non-mobile terminal devices), or may be a functional module and/or functional entity in the terminal device that can implement the method; the details can be determined according to actual usage requirements, and the embodiments of the present disclosure do not limit this.
  • the following takes a terminal device as an example to illustrate the image display method provided by the embodiment of the present disclosure.
  • an embodiment of the present disclosure provides an image display method, which is applied to a terminal device.
  • the method may include the following steps 201 to 203.
  • Step 201: In response to the received first input of the user to the first control, the terminal device displays M second controls.
  • the first control is used to indicate the first type of image in the terminal device, and M is a positive integer.
  • If the first control is an input control (for example, an input box control), the first input can be an input of the user entering the first parameter in the first control; if the first control is a selection control, the first input can be an input of the user selecting the first control. The terminal device may then determine, according to the first parameter corresponding to the first input, the first type of image matching the first parameter from the images in the terminal device.
  • the first parameter may be attribute information of the image.
  • each of the at least one control is used to indicate a type of image.
  • each control in the at least one control represents a different image type.
  • the image represented by each control can be a class of images obtained by classifying images according to different attribute information of the images.
  • the attribute information of the image may be the shooting time information of the image, the shooting location information of the image, the content attribute information of the image, and the like.
  • If the images are classified according to the shooting time information, the images are divided into images taken in different time periods (the duration of each time period may be the same or different); for example, the images are divided into images taken in January 2018, images taken in February 2018, images taken in March 2018, images taken in April 2018, images taken in May 2018, images taken in June 2018, and so on.
  • the type of image indicated by each control is an image taken within a time period, and each control corresponds to a time period.
  • the images are divided into images shot at different locations, for example, the images are divided into images shot at home, images shot at school, images shot at the company, Images taken in Huashan Mountain, images taken in the Big Wild Goose Pagoda, images taken in Xingqing Park, etc.
  • the type of image indicated by each control is an image taken at a location, and each control corresponds to a location.
  • If the images are classified according to the content attribute information of the images (the image content type), the images are divided into different types of images; for example, the images are divided into person images, landscape images, document images, and food images. In this case, the type of image indicated by each control is the set of images of the same image content type, and each control corresponds to one image content type.
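As a sketch of the classification just described: grouping a gallery by one attribute yields, for each attribute value, the set of images that a control can indicate. The record fields (`month`, `location`, `content_type`) are hypothetical names chosen for illustration, not identifiers from the disclosure.

```python
from collections import defaultdict

def classify_images(images, attribute):
    """Group image records into first-type-image buckets by one attribute.

    `attribute` is the key used for classification, e.g. the shooting
    month, the shooting location, or the image content type.
    """
    buckets = defaultdict(list)
    for image in images:
        buckets[image[attribute]].append(image)
    return dict(buckets)

gallery = [
    {"id": 1, "month": "2018-01", "location": "home",    "content_type": "person"},
    {"id": 2, "month": "2018-01", "location": "Huashan", "content_type": "scenery"},
    {"id": 3, "month": "2018-02", "location": "home",    "content_type": "person"},
]

by_month = classify_images(gallery, "month")          # one control per time period
by_type = classify_images(gallery, "content_type")    # one control per content type
```

Each key of the resulting dict corresponds to one control, and its value is the class of images that control indicates.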
  • The function of the "Cancel" option in each figure is as follows: the user clicks the "Cancel" option to trigger the terminal device to return to the initial interface. The initial interface can be the original photo album interface, or the interface where the first control is located; this can be specifically determined according to actual usage requirements, which is not limited in the embodiments of the present disclosure. In this way, after the operation is over, the user can exit by selecting the "Cancel" option, directly return to the initial interface, and make a selection again, avoiding complicated rollback actions.
  • the first input is an input for the user to select the first control from the at least one control.
  • For example, the first input can be the user's click input on the first control, or the user's sliding input on the first control; the first input can also be another feasible operation of the user on the first interface. The details can be determined according to actual usage requirements, and the embodiments of the present disclosure do not limit this. The aforementioned click input may be a single-click input, a double-click input, or a click input of any number of clicks.
  • the aforementioned sliding input may be a sliding input in any direction, such as an upward sliding input, a downward sliding input, a left sliding input, or a right sliding input.
  • Each of the M second controls is used to determine at least one parameter, and the at least one parameter is used to determine at least one image from the first type of images.
  • the at least one image is an image matching the at least one parameter in the first type of image.
  • the at least one image is an image that meets the at least one parameter in the first type of image, or is an image that does not meet the at least one parameter in the first type of image.
  • When M is equal to 1, the M second controls are one control; when M is greater than or equal to 2, the M second controls are multiple controls. This is specifically set according to actual usage conditions and is not limited in the embodiments of the present disclosure.
  • When the M second controls are one control, as shown in (b) in FIG. 4, that one second control can be used to determine one parameter; or, as shown in (a) in FIG. 5, that one second control can be used to determine at least one parameter. This is specifically determined according to actual usage requirements, which is not limited in the embodiments of the present disclosure. In general, each of the M second controls can be used to determine one parameter or at least one parameter, as determined according to actual usage requirements; the embodiments of the present disclosure do not limit this.
  • an image in the first type of image that meets the at least one parameter is an image in which the image content in the first type of image includes the content information indicated by the at least one parameter.
  • the images in the first type of images that do not meet the at least one parameter are images in the first type of images whose image content does not include the content information indicated by the at least one parameter.
  • For example, suppose the at least one parameter is: the number of people in the image is 5, and the proportion of men is 0.5 to 0.7. Then the at least one image is either the set of images in the first type of image in which the number of people is 5 and the male proportion is 0.5 to 0.7 (that is, the images in the first type of image that meet the at least one parameter), or the set of images in the first type of image other than those in which the number of people is 5 and the male proportion is 0.5 to 0.7 (that is, the images in the first type of image that do not meet the at least one parameter).
  • the user can trigger the terminal device to determine at least one parameter according to the target control by inputting the target control in the M second controls, and determine at least one image from the first type of images according to the at least one parameter.
  • Each parameter in the at least one parameter includes a parameter name and data, where the data corresponding to each parameter can be a numerical value or a numerical range; this is specifically set according to actual usage conditions, and the embodiments of the present disclosure do not limit it.
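The matching rule above — a parameter is a name plus either a numerical value or a numerical range, and the at least one image either meets or does not meet all of the parameters — can be sketched as follows. The flat-dict image representation and the parameter names are illustrative assumptions, not details from the disclosure.

```python
def matches(image, params):
    """True if the image satisfies every parameter.

    Each parameter maps a name to either an exact value or a
    (low, high) range, mirroring the rule that the data of a
    parameter can be a numerical value or a numerical range.
    """
    for name, expected in params.items():
        value = image[name]
        if isinstance(expected, tuple):
            low, high = expected
            if not (low <= value <= high):
                return False
        elif value != expected:
            return False
    return True

def select(images, params, invert=False):
    """Images that meet the parameters, or (invert=True) those that do not."""
    return [img for img in images if matches(img, params) != invert]

first_type = [
    {"id": 1, "people": 5, "male_ratio": 0.6},
    {"id": 2, "people": 5, "male_ratio": 0.9},
    {"id": 3, "people": 3, "male_ratio": 0.5},
]
target = {"people": 5, "male_ratio": (0.5, 0.7)}
meeting = select(first_type, target)            # images that meet the parameters
not_meeting = select(first_type, target, invert=True)
```

With the example parameters (number of people is 5, male proportion 0.5 to 0.7), only image 1 is in the "meets" set; the inverse selection returns the rest.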
  • By further filtering the first type of images through the target controls among the M second controls, the images in the first type of images other than the target image need not be displayed. Thus, even when the user is not very clear about the picture information, the user does not need to browse a large number of pictures and can still obtain the images he or she really wants; therefore, the workload of the user can be reduced, and the security of the user information can be guaranteed.
  • Step 202: The terminal device receives a second input from the user to the target control among the M second controls.
  • the target control is at least one control among the M second controls.
  • the second input is an input for the user to determine the target parameter through the input of the target control.
  • the second input may be a click input, a sliding input, a rotation input, etc., which are specifically set according to actual use requirements, and the embodiment of the present disclosure does not limit it.
  • Step 203: In response to the second input, the terminal device displays a target image that matches the target parameter in the first type of image.
  • the target image is an image that matches the target parameter in the first type of image.
  • the target image may be an image that meets the target parameter in the first type of image, or an image that does not meet the target parameter in the first type of image.
  • the target parameter is a parameter corresponding to the second input.
  • For example, if the second input is used to trigger the terminal device to acquire two parameters (the target parameters), namely the number of people and the proportion of children, the terminal device displays the target image determined according to those target parameters.
  • the target image required by the user can be quickly obtained and displayed, and the human-computer interaction performance can be improved.
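Steps 201 to 203 amount to a two-stage filter: the first input selects the first type of image, and the second input supplies the target parameter used to pick the target images. A minimal sketch, with hypothetical field names and a simplified exact-match rule:

```python
def image_display_flow(gallery, first_control_type, target_params):
    """Two-stage filter mirroring steps 201-203 of the method."""
    # Step 201: the first input selects the first type of image,
    # indicated by the first control (here, by content type).
    first_type = [img for img in gallery
                  if img["content_type"] == first_control_type]
    # Steps 202-203: the second input yields the target parameter,
    # and the matching target images are the ones displayed.
    target_images = [
        img for img in first_type
        if all(img.get(name) == value for name, value in target_params.items())
    ]
    return target_images

gallery = [
    {"id": 1, "content_type": "person",  "people": 5},
    {"id": 2, "content_type": "person",  "people": 2},
    {"id": 3, "content_type": "scenery", "people": 0},
]
shown = image_display_flow(gallery, "person", {"people": 5})
```

Only two inputs are needed: one to pick the class, one to pick the parameter, after which the terminal device can display `shown` directly.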
  • the first control is a control in at least one control.
  • The image display method provided by the embodiment of the present disclosure may further include the following step 204.
  • Step 204: The terminal device receives a first input from the user to the first control of the at least one control.
  • Each of the at least one control is used to indicate a type of image.
  • For the specific description, reference may be made to the related description in step 201 above, which will not be repeated here.
  • In this way, the user interface includes at least one control, and the user can select the first control from the at least one control; the user interface is clear at a glance, it is convenient for the user to obtain the first type of image, and the human-computer interaction performance can be improved.
  • the image display method provided by the embodiment of the present disclosure may further include the following steps 205 to 206.
  • Step 205: The terminal device determines the target parameter.
  • Step 206: The terminal device determines the target image from the first type of image according to the target parameter.
  • the terminal device determines the target parameter corresponding to the second input according to the second input of the user, and determines the target image from the first-type image according to the target parameter.
  • In some embodiments, the M second controls include a target graphic control. The target graphic control is a closed graphic that includes N areas, each of the N areas is used to indicate a type of parameter, the value of the type of parameter indicated by each area is a continuously changing value, and N is a positive integer; the second input is the user's input on the target graphic control.
  • this step 205 can be specifically implemented by the following step 205a.
  • Step 205a: The terminal device determines the target position and obtains the target parameter corresponding to the target position.
  • the target position is a position in the target graphic control corresponding to the second input.
  • the M second controls include target graphic controls, and the M second controls may also include other controls, such as target input controls.
  • The closed graphic can be a regular shape such as a circle, a triangle, a quadrilateral, or a pentagon, or an irregular shape; this is specifically determined according to actual usage requirements and is not limited in the embodiments of the present disclosure. The following description takes the case where the target graphic control is a circular graphic control as an example.
  • Here, "continuously changing" means that the value of the type of parameter indicated by each area changes successively from small to large according to a preset rule. For example, if the values corresponding to a parameter are positive integers, the values of that type of parameter are consecutive positive integers with the same or different intervals; if the values corresponding to a parameter are decimals, the values of that type of parameter are consecutive decimals with the same or different intervals.
  • If the target graphic control includes one area, as shown in (b) in FIG. 4, the target graphic control (that one area) is used to indicate one type of parameter, and the value of the type of parameter indicated by the target graphic control is a continuously changing value.
  • the target graphic control includes a plurality of areas, each of the plurality of areas is used to indicate a type of parameter, and the value of the type of parameter indicated by each area is a continuously changing value.
  • For example, the target graphic control includes four areas A, B, C, and D. In these four areas, area A indicates the number of people, area B indicates the proportion of men, area C indicates the proportion of children, and area D indicates the proportion of elderly people.
  • The terminal device determines, according to the second input, the target position (at least one position) corresponding to the second input in the target control, obtains the target parameter corresponding to the target position (at least one parameter, one position corresponding to one parameter), and determines the target image (at least one image) from the first type of image according to the target parameter.
  • For example, if the second input corresponds to three positions, the terminal device determines three parameters (the target parameters) based on these three positions: the number of people is A1, the proportion of men is B1, and the proportion of children is C1. The terminal device then determines the images in the first type of image whose number of people is A1, male proportion is B1, and child proportion is C1 as the target images, or determines the images in the first type of image other than those whose number of people is A1, male proportion is B1, and child proportion is C1 as the target images.
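One plausible reading of the circular target graphic control is that each area occupies an angular sector and the target position within a sector maps linearly onto that area's continuously changing value range. The sector layout, the area names, and the value ranges below are illustrative assumptions, not details from the disclosure:

```python
def position_to_value(angle, sector_start, sector_end, lo, hi):
    """Map a pointer angle inside one area's sector onto the area's
    continuously changing parameter range [lo, hi] by linear interpolation."""
    fraction = (angle - sector_start) / (sector_end - sector_start)
    return lo + fraction * (hi - lo)

# Four areas A-D, each a 90-degree sector of the circular control:
# (sector start, sector end, minimum value, maximum value).
areas = {
    "people":      (0,   90,  0, 20),     # area A: number of people
    "male_ratio":  (90,  180, 0.0, 1.0),  # area B: proportion of men
    "child_ratio": (180, 270, 0.0, 1.0),  # area C: proportion of children
    "elder_ratio": (270, 360, 0.0, 1.0),  # area D: proportion of elderly people
}

def read_target_params(pointer_angles):
    """Second input: one pointer angle per touched area -> target parameters."""
    params = {}
    for name, angle in pointer_angles.items():
        start, end, lo, hi = areas[name]
        params[name] = position_to_value(angle, start, end, lo, hi)
    return params

# A second input touching areas A and B at their sector midpoints.
params = read_target_params({"people": 45, "male_ratio": 135})
```

Because the mapping is continuous, any position in a sector yields a valid value, matching the "continuously changing value" property of each area.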
  • the M second controls include target graphic controls, which can make the M second controls more vivid, more intuitive, easy to operate, and improve human-computer interaction performance. Moreover, since the value of a type of parameter indicated by each area is a continuously changing value, users can obtain dynamic parameters according to actual needs, thereby obtaining images that are satisfactory to users.
  • In some embodiments, the target graphic control may further include at least one pointer control; the second input is specifically an input on a target pointer control, the target pointer control being one of the at least one pointer control; and the target position is specifically the position of the target pointer control corresponding to the second input.
  • For example, suppose the second input is that the user moves the pointer of area A from position a0 to position a1, moves the pointer of area B from position b0 to position b1, and moves the pointer of area C from position c0 to position c1. The terminal device then determines three parameters (the target parameters) according to the positions a1, b1, and c1: the number of people is A1, the proportion of men is B1, and the proportion of children is C1.
  • The terminal device determines the images in the first type of image whose number of people is A1, male proportion is B1, and child proportion is C1 as the target images, or determines the images in the first type of image other than those whose number of people is A1, male proportion is B1, and child proportion is C1 as the target images.
  • adding pointer controls can make the design more maneuverable and improve the performance of human-computer interaction.
  • The above description takes the case where the image content type corresponding to the first control is the person type and the first type of image is person images as an example. If the image content type corresponding to the first control is the landscape type and the first type of image is landscape images, as shown in (a) in FIG. 10, the parameter types corresponding to each second control can include the sky proportion, mountain proportion, ocean proportion, and forest proportion. If the image content type corresponding to the first control is the document type and the first type of image is document images, as shown in (b) in FIG. 10, the parameter types corresponding to each second control can include the Song typeface proportion, cursive script proportion, Microsoft typeface proportion, and official script proportion. If the image content type corresponding to the first control is the food type and the first type of image is food images, as shown in FIG. 11, the parameter types corresponding to each second control can include the proportion of Hunan cuisine, the proportion of Cantonese cuisine, the proportion of Zhejiang cuisine, and the proportion of Sichuan cuisine.
  • In addition, the user can determine a numerical range through one input in an area (each position in an area corresponds to a numerical range, and one input determines one numerical range); this can be specifically set according to actual usage and is not limited in the embodiments of the present disclosure.
  • the M second controls include a target input control, and the second input is an input of inputting parameters in the target input control.
  • this step 205 can be specifically implemented by the following step 205b.
  • Step 205b: The terminal device determines the parameter input in the target input control as the target parameter.
  • the M second controls include target input controls, and the M second controls may also include other controls, such as target graphic controls.
  • the M second controls may include only the target graphic control, only the target input control, or both the target graphic control and the target input control; the M second controls may also include other feasible controls, which can be determined according to actual use requirements and are not limited in the embodiment of the present disclosure.
  • neither the number of target graphic controls nor the number of target input controls among the M second controls is limited; both can be determined according to actual usage requirements.
  • the target input control may be the control shown in (a) in Figure 13, in which case the user can enter the target parameter (including both the parameter name and its data) in the input box of the target input control according to their own needs; or it may be the control shown in (b) in Figure 13, in which case the user enters only the data corresponding to a preset parameter name in the input box. The specifics are determined according to actual use needs and are not limited in the embodiment of the present disclosure.
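The two input-box variants above (parameter name plus data, or data only against a preset name) can be sketched as a small parser. This is a hedged illustration only: the `name:value` input syntax and the function name are assumptions, not details from the disclosure.

```python
def parse_target_parameter(text, preset_name=None):
    """Return a (name, value) pair read from the input box contents."""
    text = text.strip()
    if ":" in text:
        # Variant (a): the user typed both a parameter name and its data.
        name, _, value = text.partition(":")
        return name.strip(), float(value)
    if preset_name is not None:
        # Variant (b): the control carries a preset parameter name and
        # the user typed only the data.
        return preset_name, float(text)
    raise ValueError("no parameter name available")
```

Either variant yields the same (name, value) target parameter, which the terminal device can then use to match images.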
  • the M second controls may also include a target drawing board control. If the user has a relatively complete and clear idea of the content of the desired image (hereinafter referred to as the first image), the user can draw part of the first image in the target drawing board control.
  • the terminal device then extracts the target parameter from the first image drawn by the user, and determines the target image from the first type of image according to the target parameter.
  • alternatively, the terminal device can compare the first image drawn by the user with each image in the first type of image one by one to obtain the target image. The details can be determined according to actual usage requirements, and are not limited in the embodiments of the present disclosure.
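The "compare one by one" alternative can be sketched as follows. Modeling images as flat grayscale pixel lists and using a normalized histogram intersection as the similarity measure are both simplifying assumptions made for illustration; the disclosure does not specify a comparison method.

```python
def histogram(pixels, bins=8):
    """Normalized grayscale histogram; pixel values assumed in [0, 256)."""
    counts = [0] * bins
    for p in pixels:
        counts[min(p * bins // 256, bins - 1)] += 1
    total = len(pixels) or 1
    return [c / total for c in counts]

def best_match(drawn, gallery):
    """Compare the drawn sketch against every gallery image one by one
    and return the index of the most similar image."""
    target = histogram(drawn)

    def similarity(img):
        # Histogram intersection: 1.0 for identical distributions.
        return sum(min(a, b) for a, b in zip(target, histogram(img)))

    return max(range(len(gallery)), key=lambda i: similarity(gallery[i]))
```

The image whose similarity score is highest would then be displayed as the target image.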
  • the image display method provided by the embodiment of the present disclosure may further include the following steps 207 to 208.
  • Step 207 The terminal device receives the third input of the user.
  • the third input can be an input by which the user enables the target function in the photo album application (i.e., the image display function provided by the embodiment of the present disclosure, which may also be called an image quick-search function or an image filtering function); the third input can also be an input by which the user starts the album application from within a non-album application (for example, starting the album when the user needs to share an image in a non-album application); the third input can also be an input by which the user starts the album application on the desktop (the terminal device opens the photo album application from the desktop); the third input can also be any other feasible input, which can be determined according to actual use requirements and is not limited in the embodiment of the present disclosure.
  • Step 208 In response to the third input, the terminal device displays the at least one control.
  • a method for enabling the image display function provided by the embodiment of the present disclosure can be added, which is convenient for the user to quickly find the target image required by the user, and can improve the human-computer interaction performance.
  • the image display method provided by the embodiment of the present disclosure may further include the following steps 209 to 210, and the above step 203 can be specifically implemented by the following step 203a.
  • Step 209 In response to the second input, the terminal device displays a third control.
  • Step 210 The terminal device receives a fourth input from the user to the third control.
  • Step 203a In response to the fourth input, the terminal device displays a target image matching the second parameter in the first type of image.
  • the second parameter is the target parameter and the parameter corresponding to the fourth input.
  • the terminal device obtains the target parameter corresponding to the second input, and displays the third control.
  • the terminal device obtains the parameter corresponding to the fourth input (hereinafter referred to as the third parameter), and the terminal device determines the target image according to the target parameter and the third parameter (ie, the second parameter).
  • the target image is an image that meets the target parameter and the third parameter in the first type of image, or is an image that does not meet the target parameter and the third parameter in the first type of image.
  • for example, the M second controls are the controls shown in (b) in FIG. 4; the user performs a second input on the second controls, and the terminal device obtains the target parameter (a number-of-people parameter) corresponding to the second input and displays the third control shown in Figure 14.
  • the user then performs a fourth input on the third control, and the terminal device obtains the third parameter corresponding to the fourth input (at least one of a male proportion parameter, a child proportion parameter, a middle-aged proportion parameter, and an elderly proportion parameter); the terminal device determines the target image according to the target parameter and the third parameter, and displays the target image.
  • alternatively, the terminal device obtains the target parameter corresponding to the second input, determines the second type of image from the first type of image according to the target parameter, and displays a fourth control; the fourth control is used to determine the third parameter, and the third parameter is used to determine the target image from the second type of image.
  • the second type of image is an image that meets the target parameter in the first type of image, or an image that does not meet the target parameter in the first type of image.
  • the target image is an image that meets the third parameter in the second type of image, or an image that does not meet the third parameter in the second type of image.
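The two-stage narrowing described above (the target parameter first selects the second type of image, and the third parameter then selects the target image from it) can be sketched as sequential filtering. Modeling images as dictionaries of parameter values, and the `negate` flag that covers the "does not meet the parameter" variant, are illustrative assumptions.

```python
def filter_images(images, name, value, negate=False):
    """Keep the images whose parameter `name` equals `value`;
    with negate=True, keep the images that do NOT meet the parameter."""
    matches = [img for img in images if img.get(name) == value]
    if negate:
        return [img for img in images if img not in matches]
    return matches

albums = [
    {"people": 2, "male_ratio": 0.5},
    {"people": 2, "male_ratio": 1.0},
    {"people": 5, "male_ratio": 0.4},
]
# Stage 1: the target parameter narrows the first type of image down to
# the second type of image.
second_type = filter_images(albums, "people", 2)
# Stage 2: the third parameter selects the target image from that subset.
target = filter_images(second_type, "male_ratio", 1.0)
```

Applying the two filters in sequence reproduces the "second type of image" intermediate result before the final target image is displayed.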
  • each of the drawings in the embodiments of the present disclosure is illustrated as an example with reference to a single embodiment. In specific implementation, each drawing may also be implemented in combination with any other drawing, which is not limited in the embodiments of the present disclosure.
  • the image display method provided by the embodiment of the present disclosure may also include the above-mentioned steps 207 to 208.
  • a terminal device can display M second controls in response to a received first input of a user to a first control, where the first control is used to indicate a first type of image; receive a second input of the user to a target control among the M second controls; and, in response to the second input, display a target image in the first type of image that matches a target parameter, the target parameter being the parameter corresponding to the second input.
  • the user can trigger the terminal device to display the M second controls through the first input to the first control (corresponding to the first type of image), and then trigger the second input to the target control among the M second controls,
  • so that the terminal device displays the target image in the first type of image that matches the target parameter corresponding to the second input.
  • in this way, the user can quickly filter out the needed image from the gallery through inputs to the two controls, which solves the problem in the related art that the process of screening images on terminal equipment is complicated and cumbersome and the human-computer interaction performance is poor.
  • an embodiment of the present disclosure provides a terminal device 120.
  • the terminal device 120 includes a display module 121 and a receiving module 122. The display module 121 is configured to display M second controls in response to a received first input of the user to a first control, where the first control is used to indicate a first type of image in the terminal device and M is a positive integer. The receiving module 122 is configured to receive a second input of the user to a target control among the M second controls displayed by the display module 121. The display module 121 is further configured to display, in response to the second input received by the receiving module 122, a target image in the first type of image that matches a target parameter, the target parameter being the parameter corresponding to the second input.
  • the terminal device 120 shown in FIG. 15 may include only the display module 121 and the receiving module 122; it may also include the display module 121, the receiving module 122, and the determining module 123 (that is, in the terminal device 120, the determining module 123 is an optional module).
  • the determination module 123 is illustrated in a dotted frame in FIG. 15.
  • the receiving module 122 is further configured to, before the display module 121 displays the M second controls in response to the received first input of the user to the first control, receive the user's first input to the first control among at least one control, where each control of the at least one control is used to indicate a type of image.
  • the terminal device 120 further includes: a determining module 123; the determining module 123 is configured to determine the target parameter before the display module 121 displays the target image matching the target parameter in the first type of image, and The target parameter determines the target image from the first type of image.
  • the M second controls include a target graphic control, the target graphic control is a closed graphic, the target graphic control includes N areas, and each area of the N areas is used to indicate a type of parameter.
  • the value of a type of parameter indicated by each area is a continuously changing value, and N is a positive integer;
  • the second input is the user's input to the target graphic control;
  • the determining module 123 is specifically configured to determine a target position and obtain the target parameter corresponding to the target position, where the target position is the position corresponding to the second input in the target graphic control.
  • the target graphic control further includes at least one pointer control;
  • the second input is specifically an input to a target pointer control, where the target pointer control is a control among the at least one pointer control;
  • the target position is specifically the position of the target pointer control corresponding to the second input.
  • the M second controls include a target input control, and the second input is an input of inputting a parameter in the target input control; the determining module 123 is specifically configured to determine the parameter input in the target input control Is the target parameter.
  • the terminal device provided by the embodiment of the present disclosure can implement each process shown in any one of FIG. 2 to FIG. 14 in the foregoing method embodiment. To avoid repetition, details are not described herein again.
  • the embodiment of the present disclosure provides a terminal device.
  • the terminal device can display M second controls in response to a received first input from a user to a first control, where the first control is used to indicate a first type of image in the terminal device;
  • receive a second input of the user to a target control among the M second controls; and, in response to the second input, display a target image in the first type of image that matches a target parameter, the target parameter being the parameter corresponding to the second input.
  • the user can trigger the terminal device to display the M second controls through the first input to the first control (corresponding to the first type of image), and then trigger the second input to the target control among the M second controls,
  • so that the terminal device displays the target image in the first type of image that matches the target parameter corresponding to the second input.
  • in this way, the user can quickly filter out the needed image from the gallery through inputs to the two controls, which solves the problem in the related art that the process of screening images on terminal equipment is complicated and cumbersome and the human-computer interaction performance is poor.
  • FIG. 16 is a schematic diagram of the hardware structure of a terminal device implementing various embodiments of the present disclosure.
  • the terminal device 100 includes but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, a power supply 111, and other components.
  • the structure of the terminal device shown in FIG. 16 does not constitute a limitation on the terminal device; the terminal device may include more or fewer components than shown in the figure, combine certain components, or use a different component arrangement.
  • terminal devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminal devices, wearable devices, and pedometers.
  • the display unit 106 is configured to display M second controls in response to the received first input of the user to the first control,
  • where the first control is used to indicate the first type of image in the terminal device, and M is a positive integer;
  • the user input unit 107 is used to receive the user's second input to the target control among the M second controls;
  • the display unit 106 is also used to display, in response to the second input, the target image in the first type of image that matches the target parameter,
  • where the target parameter is the parameter corresponding to the second input.
  • the terminal device can display M second controls in response to the received first input of the user to the first control, where the first control is used to indicate the first type of image in the terminal device; receive the user's second input to the target control among the M second controls; and, in response to the second input, display the target image in the first type of image that matches the target parameter, the target parameter being the parameter corresponding to the second input.
  • the user can trigger the terminal device to display the M second controls through the first input to the first control (corresponding to the first type of image), and then trigger the second input to the target control among the M second controls,
  • so that the terminal device displays the target image in the first type of image that matches the target parameter corresponding to the second input.
  • in this way, the user can quickly filter out the needed image from the gallery through inputs to the two controls, which solves the problem in the related art that the process of screening images on terminal equipment is complicated and cumbersome and the human-computer interaction performance is poor.
  • the radio frequency unit 101 can be used to receive and send signals in the process of sending and receiving information or during a call. Specifically, downlink data from the base station is received and then processed by the processor 110, and uplink data is sent to the base station.
  • the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 101 can also communicate with the network and other devices through a wireless communication system.
  • the terminal device provides users with wireless broadband Internet access through the network module 102, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 103 can convert the audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into audio signals and output them as sounds. Moreover, the audio output unit 103 may also provide audio output related to a specific function performed by the terminal device 100 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 104 is used to receive audio or video signals.
  • the input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042.
  • the graphics processor 1041 is configured to process image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frame can be displayed on the display unit 106.
  • the image frame processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or sent via the radio frequency unit 101 or the network module 102.
  • the microphone 1042 can receive sound, and can process such sound into audio data.
  • in the case of a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 and output.
  • the terminal device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light.
  • the proximity sensor can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear.
  • as a kind of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), and can detect the magnitude and direction of gravity when stationary; it can be used to identify the posture of the terminal device (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and for vibration-recognition-related functions (such as a pedometer or tapping). The sensor 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which will not be repeated here.
  • the display unit 106 is used to display information input by the user or information provided to the user.
  • the display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED), etc.
  • the user input unit 107 may be used to receive inputted numeric or character information, and generate key signal input related to user settings and function control of the terminal device.
  • the user input unit 107 includes a touch panel 1071 and other input devices 1072.
  • the touch panel 1071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 1071 may include two parts: a touch detection device and a touch controller.
  • the touch detection device detects the user's touch position and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and receives and executes commands sent by the processor 110.
  • the touch panel 1071 can be realized by various types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 107 may also include other input devices 1072.
  • other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 1071 can be overlaid on the display panel 1061.
  • when the touch panel 1071 detects a touch operation on or near it, the operation is transmitted to the processor 110 to determine the type of the touch event,
  • and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event.
  • in FIG. 16, the touch panel 1071 and the display panel 1061 are used as two independent components to implement the input and output functions of the terminal device, but in some embodiments, the touch panel 1071 and the display panel 1061 can be integrated
  • to implement the input and output functions of the terminal device, which is not specifically limited here.
  • the interface unit 108 is an interface for connecting an external device with the terminal device 100.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • the interface unit 108 can be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements in the terminal device 100 or can be used to connect to the terminal device 100 and external Transfer data between devices.
  • the memory 109 can be used to store software programs and various data.
  • the memory 109 may mainly include a program storage area and a data storage area.
  • the program storage area may store an operating system and application programs required by at least one function (such as a sound playback function and an image playback function); the data storage area may store data (such as audio data and a phone book) created according to the use of the mobile phone.
  • the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the processor 110 is the control center of the terminal device. It uses various interfaces and lines to connect the various parts of the entire terminal device, runs or executes the software programs and/or modules stored in the memory 109, and calls data stored in the memory 109 to perform the various functions of the terminal device and process data, thereby monitoring the terminal device as a whole.
  • the processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, and the like,
  • and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may alternatively not be integrated into the processor 110.
  • the terminal device 100 may also include a power source 111 (such as a battery) for supplying power to the various components.
  • optionally, the power source 111 may be logically connected to the processor 110 through a power management system, so as to implement functions such as charging management, discharging management, and power consumption management through the power management system.
  • in addition, the terminal device 100 includes some functional modules not shown, which will not be repeated here.
  • an embodiment of the present disclosure further provides a terminal device, which may include the processor 110 shown in FIG. 16, a memory 109, and a computer program stored in the memory 109 and runnable on the processor 110.
  • when the computer program is executed by the processor 110, each process of the image display method shown in any one of FIG. 2 to FIG. 14 in the foregoing method embodiment is implemented, and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • an embodiment of the present disclosure also provides a computer-readable storage medium on which a computer program is stored.
  • when the computer program is executed by a processor, each process of the image display method shown in any one of FIG. 2 to FIG. 14 in the foregoing method embodiment is implemented, and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • the computer-readable storage medium such as read-only memory (Read-Only Memory, ROM), random access memory (Random Access Memory, RAM), magnetic disk or optical disk, etc.
  • the technical solution of the present disclosure, in essence or in the part contributing to the related art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes several instructions to cause a terminal device (which can be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the method described in each embodiment of the present disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to an image display method and a terminal device, relating to the field of terminal technology and applied to an image-viewing scenario. The method comprises: in response to a received first input from a user on a first control, displaying M second controls (201), the first control being used to indicate a first type of image in the terminal device and M being a positive integer; receiving a second input from the user on a target control among the M second controls (202); and, in response to the second input, displaying a target image that matches a target parameter in the first type of image (203), the target parameter being a parameter corresponding to the second input.
PCT/CN2020/081249 2019-04-30 2020-03-25 Procédé d'affichage d'image et dispositif terminal WO2020220873A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910364497.8 2019-04-30
CN201910364497.8A CN110245246B (zh) 2019-04-30 2019-04-30 一种图像显示方法及终端设备

Publications (1)

Publication Number Publication Date
WO2020220873A1 true WO2020220873A1 (fr) 2020-11-05

Family

ID=67883623

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/081249 WO2020220873A1 (fr) 2019-04-30 2020-03-25 Procédé d'affichage d'image et dispositif terminal

Country Status (2)

Country Link
CN (1) CN110245246B (fr)
WO (1) WO2020220873A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113179205A (zh) * 2021-03-31 2021-07-27 维沃移动通信有限公司 图像分享方法、装置及电子设备
CN113992789A (zh) * 2021-10-29 2022-01-28 维沃移动通信有限公司 图像处理方法及装置

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110245246B (zh) * 2019-04-30 2021-11-16 维沃移动通信有限公司 一种图像显示方法及终端设备
CN110889002A (zh) * 2019-11-26 2020-03-17 维沃移动通信有限公司 图像显示方法及电子设备
CN111984809A (zh) * 2020-08-20 2020-11-24 深圳集智数字科技有限公司 一种图像查找方法和相关装置
CN112148185A (zh) * 2020-09-17 2020-12-29 维沃移动通信(杭州)有限公司 图像显示方法、装置

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104239336A (zh) * 2013-06-19 2014-12-24 华为技术有限公司 一种图像筛选方法、装置及终端
CN104794189A (zh) * 2015-04-16 2015-07-22 惠州Tcl移动通信有限公司 一种图像筛选方法及筛选系统
CN109947968A (zh) * 2017-10-13 2019-06-28 南京唯实科技有限公司 一种大数据图像筛选方法
CN110245246A (zh) * 2019-04-30 2019-09-17 维沃移动通信有限公司 一种图像显示方法及终端设备

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2605413B1 (fr) * 2010-08-13 2018-10-10 LG Electronics Inc. Terminal mobile, système comprenant le terminal mobile et un dispositif d'affichage et procédé de commande associé
CN104598483A (zh) * 2013-11-01 2015-05-06 索尼公司 图片过滤方法、装置以及电子设备
KR101631966B1 (ko) * 2014-06-19 2016-06-20 엘지전자 주식회사 이동 단말기 및 이의 제어방법
CN104284252A (zh) * 2014-09-10 2015-01-14 康佳集团股份有限公司 一种自动生成电子相册的方法
CN105893412A (zh) * 2015-11-24 2016-08-24 乐视致新电子科技(天津)有限公司 图像分享方法及装置
US11228757B2 (en) * 2017-05-31 2022-01-18 Interdigital Vc Holdings, Inc. Method and a device for picture encoding and decoding
CN108769374B (zh) * 2018-04-25 2020-10-02 维沃移动通信有限公司 一种图像管理方法及移动终端
CN108776822B (zh) * 2018-06-22 2020-04-24 腾讯科技(深圳)有限公司 目标区域检测方法、装置、终端及存储介质
CN109151442B (zh) * 2018-09-30 2021-01-08 维沃移动通信有限公司 一种图像拍摄方法及终端
CN109543744B (zh) * 2018-11-19 2022-10-14 南京邮电大学 一种基于龙芯派的多类别深度学习图像识别方法及其应用
CN109635683A (zh) * 2018-11-27 2019-04-16 维沃移动通信有限公司 一种图像中的内容提取方法及终端设备

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104239336A (zh) * 2013-06-19 2014-12-24 华为技术有限公司 一种图像筛选方法、装置及终端
CN104794189A (zh) * 2015-04-16 2015-07-22 惠州Tcl移动通信有限公司 一种图像筛选方法及筛选系统
CN109947968A (zh) * 2017-10-13 2019-06-28 南京唯实科技有限公司 一种大数据图像筛选方法
CN110245246A (zh) * 2019-04-30 2019-09-17 维沃移动通信有限公司 一种图像显示方法及终端设备

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113179205A (zh) * 2021-03-31 2021-07-27 维沃移动通信有限公司 图像分享方法、装置及电子设备
CN113179205B (zh) * 2021-03-31 2023-04-18 维沃移动通信有限公司 图像分享方法、装置及电子设备
CN113992789A (zh) * 2021-10-29 2022-01-28 维沃移动通信有限公司 图像处理方法及装置

Also Published As

Publication number Publication date
CN110245246B (zh) 2021-11-16
CN110245246A (zh) 2019-09-17

Similar Documents

Publication Publication Date Title
WO2021104365A1 (fr) Procédé de partage d'objets et dispositif électronique
WO2020220873A1 (fr) Procédé d'affichage d'image et dispositif terminal
WO2021104195A1 (fr) Procédé d'affichage d'images et dispositif électronique
WO2020063091A1 (fr) Procédé de traitement d'image et dispositif terminal
WO2021136159A1 (fr) Procédé de capture d'écran et dispositif électronique
WO2020258934A1 (fr) Procédé d'affichage d'interface et dispositif terminal
CN109471692B (zh) 一种显示控制方法及终端设备
KR102554191B1 (ko) 정보 처리 방법 및 단말
WO2021115278A1 (fr) Procédé de gestion de groupe et dispositif électronique
CN109828850B (zh) 一种信息显示方法及终端设备
WO2021012927A1 (fr) Procédé d'affichage d'icône et dispositif terminal
WO2021129536A1 (fr) Procédé de déplacement d'icône et dispositif électronique
WO2020151460A1 (fr) Procédé de traitement d'objet et dispositif terminal
WO2021004327A1 (fr) Procédé de définition d'autorisation d'application, et dispositif terminal
WO2021057290A1 (fr) Procédé de commande d'informations et dispositif électronique
WO2020181945A1 (fr) Procédé d'affichage d'identifiant et borne
WO2021057301A1 (fr) Procédé de commande de fichier et dispositif électronique
WO2020199783A1 (fr) Procédé d'affichage d'interface et dispositif terminal
WO2021169954A1 (fr) Procédé de recherche et dispositif électronique
WO2020215982A1 (fr) Procédé de gestion d'icône de bureau et dispositif terminal
WO2021017738A1 (fr) Procédé d'affichage d'interface et dispositif électronique
WO2020220893A1 (fr) Procédé de capture d'écran et terminal mobile
WO2020155980A1 (fr) Procédé de commande et dispositif terminal
WO2021164739A1 (fr) Procédé de commande pour dispositif électronique, et dispositif électronique
CN110908555A (zh) 一种图标显示方法及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20798959

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20798959

Country of ref document: EP

Kind code of ref document: A1