WO2020182035A1 - Image processing method and terminal device


Info

Publication number
WO2020182035A1
Authority
WO
WIPO (PCT)
Prior art keywords
folder
user
input
target
terminal device
Prior art date
Application number
PCT/CN2020/077751
Other languages
English (en)
Chinese (zh)
Inventor
戴苗苗
Original Assignee
维沃移动通信有限公司
Application filed by 维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Publication of WO2020182035A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces with means for local support of applications that increase the functionality
    • H04M1/7243 User interfaces with interactive means for internal management of messages
    • H04M1/72439 User interfaces for image or video messaging
    • H04M1/72448 User interfaces with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454 User interfaces adapting the device according to context-related or environment-related conditions
    • H04M1/72469 User interfaces for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • H04M1/72472 User interfaces wherein the items are sorted according to specific criteria, e.g. frequency of use
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/22 Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
    • H04M2250/52 Details of telephonic subscriber devices including functional features of a camera

Definitions

  • the embodiments of the present disclosure relate to the field of communication technologies, and in particular, to an image processing method and terminal equipment.
  • a user can classify a taken photo so that it can be quickly found later.
  • the user can operate the terminal device to trigger it to display the taken photo, and then manipulate the photo to save it to a folder of a certain type, thereby classifying the photo.
  • the user needs to first open the photo album application, find the photo in it, and then perform operations on the photo to trigger the terminal device to move the photo into a certain folder, thereby classifying the photo; the user's operations are therefore cumbersome and time-consuming.
  • the embodiments of the present disclosure provide an image processing method and a terminal device, which can solve the problem in the related art that a user's operations are cumbersome and time-consuming when classifying taken photos.
  • an image processing method may include: receiving a user's photographing input; in response to the photographing input, displaying a target object on a first interface, the target object including a first identifier and N first folders, where the first identifier is used to indicate the target image to be collected and each first folder corresponds to an image type; receiving a first input from the user for the target object; and, in response to the first input, saving the target image in a target folder, the target folder being one of the N first folders.
  • in a second aspect of the embodiments of the present disclosure, a terminal device includes a receiving unit, a display unit, and a saving unit.
  • the receiving unit is used to receive the user's photographing input.
  • the display unit is configured to display the target object on the first interface in response to the photographing input received by the receiving unit.
  • the target object includes a first identifier and N first folders.
  • the first identifier is used to indicate the captured target image.
  • Each first folder corresponds to an image type.
  • the receiving unit is further configured to receive the user's first input for the target object.
  • the saving unit is configured to save the target image in a target folder in response to the first input received by the receiving unit, and the target folder is one of the N first folders.
  • in a third aspect of the embodiments of the present disclosure, a terminal device includes a processor, a memory, and a computer program that is stored in the memory and can run on the processor; when the processor executes the computer program, the steps of the image processing method described in the first aspect are implemented.
  • a computer-readable storage medium stores a computer program.
  • when the computer program is executed by a processor, the steps of the image processing method as described in the first aspect are implemented.
  • after receiving the user's photographing input, the terminal device may display the target object on the first interface (the target object includes N first folders, each corresponding to an image type, and a first identifier for indicating the collected target image), and after the user performs the first input on the target object, the target image is saved in the target folder.
  • the terminal device can display the target object, so that the user can directly perform an operation (that is, the first input) on the target object to save the target image to a target folder corresponding to an image type; the user does not need to perform multiple operations on the terminal device to save the target image in the target folder, which can simplify the user's operations and save the user's time.
  • FIG. 1 is a schematic diagram of the architecture of an Android operating system provided by an embodiment of the disclosure
  • FIG. 2 is one of the schematic diagrams of an image processing method provided by an embodiment of the disclosure.
  • FIG. 3 is the first schematic diagram of an example of a mobile phone interface provided by an embodiment of the disclosure.
  • FIG. 4 is the second schematic diagram of an example of a mobile phone interface provided by an embodiment of the disclosure.
  • FIG. 5 is the second schematic diagram of an image processing method provided by an embodiment of the disclosure.
  • FIG. 6 is the third schematic diagram of an example of a mobile phone interface provided by an embodiment of the disclosure.
  • FIG. 7 is the fourth schematic diagram of an example of a mobile phone interface provided by an embodiment of the disclosure.
  • FIG. 8 is the fifth schematic diagram of an example of a mobile phone interface provided by an embodiment of the disclosure.
  • FIG. 9 is the third schematic diagram of an image processing method provided by an embodiment of the disclosure.
  • FIG. 10 is the sixth schematic diagram of an example of a mobile phone interface provided by an embodiment of the disclosure.
  • FIG. 11 is the seventh schematic diagram of an example of a mobile phone interface provided by an embodiment of the disclosure.
  • FIG. 12 is the eighth schematic diagram of an example of a mobile phone interface provided by an embodiment of the disclosure.
  • FIG. 13 is one of the schematic structural diagrams of a terminal device provided by an embodiment of the disclosure.
  • FIG. 14 is a second schematic structural diagram of a terminal device provided by an embodiment of the disclosure.
  • FIG. 15 is a schematic diagram of hardware of a terminal device provided by an embodiment of the disclosure.
  • the terms "first" and "second" in the description and claims of the embodiments of the present disclosure are used to distinguish different objects, rather than to describe a specific order of objects.
  • first input and the second input are used to distinguish different inputs, rather than to describe a specific order of input.
  • plural means two or more.
  • a plurality of elements refers to two elements or more than two elements.
  • words such as "exemplary" or "for example" are used as examples, illustrations, or explanations. Any embodiment or design solution described as "exemplary" or "for example" in the embodiments of the present disclosure should not be construed as being more preferable or advantageous than other embodiments or design solutions. Rather, words such as "exemplary" or "for example" are used to present related concepts in a specific manner.
  • Embodiments of the present disclosure provide an image processing method and terminal device. After receiving a user's photographing input, the terminal device can display a target object on a first interface (the target object includes N first folders, each corresponding to an image type, and a first identifier used to indicate the acquired target image), and after the user performs the first input on the target object, the target image is saved in the target folder.
  • the terminal device can display the target object, so that the user can directly perform an operation (that is, the first input) on the target object to save the target image to a target folder corresponding to an image type.
  • the user does not need to perform multiple operations on the terminal device to save the target image in the target folder, which can simplify the user's operations and save the user's time.
  • the image processing method and terminal device provided by the embodiments of the present disclosure can be applied to the process of classifying and storing collected images.
  • the terminal device in the embodiment of the present disclosure may be a terminal device with an operating system.
  • the operating system may be an Android operating system, an iOS operating system, or other possible operating systems, which are not specifically limited in the embodiment of the present disclosure.
  • the following uses the Android operating system as an example to introduce the software environment to which the image processing method provided in the embodiments of the present disclosure is applied.
  • FIG. 1 is a schematic structural diagram of a possible Android operating system provided by an embodiment of the present disclosure.
  • the architecture of the Android operating system includes 4 layers, namely: application layer, application framework layer, system runtime library layer, and kernel layer (specifically, it may be the Linux kernel layer).
  • the application layer includes various applications (including system applications and third-party applications) in the Android operating system.
  • the application framework layer is the framework of the application. Developers can develop some applications based on the application framework layer while complying with the development principles of the application framework.
  • the system runtime layer includes a library (also called a system library) and an Android operating system runtime environment.
  • the library mainly provides various resources needed by the Android operating system.
  • the Android operating system operating environment is used to provide a software environment for the Android operating system.
  • the kernel layer is the operating system layer of the Android operating system and belongs to the lowest level of the Android operating system software level.
  • the kernel layer is based on the Linux kernel to provide core system services and hardware-related drivers for the Android operating system.
  • developers can develop software programs that implement the image processing method provided by the embodiments of the present disclosure based on the system architecture of the Android operating system as shown in FIG. 1.
  • the image processing method can run based on the Android operating system as shown in FIG. 1. That is, the processor or the terminal device can implement the image processing method provided by the embodiments of the present disclosure by running the software program in the Android operating system.
  • FIG. 2 shows a flowchart of an image processing method provided by an embodiment of the present disclosure. The method can be applied to a terminal device with the Android operating system shown in FIG. 1. As shown in FIG. 2, the image processing method provided by the embodiment of the present disclosure may include the following steps 201 to 204.
  • Step 201 The terminal device receives the user's photo input.
  • the current interface of the terminal device is a photographing interface
  • the user can perform photographing input in the photographing interface to trigger the terminal device to collect a target image.
  • the aforementioned photographing input may be a user's pressing input on a physical key of the terminal device; or the aforementioned photographing input may be a user's pressing input on a photographing control of the terminal device.
  • Step 202 In response to the photographing input, the terminal device displays the target object on the first interface.
  • the aforementioned target object may include a first identifier and N first folders, and the first identifier is used to indicate the captured target image.
  • the terminal device may superimpose and display the first interface on the shooting interface, and the first interface includes the target object.
  • the above-mentioned first interface may include a first sub-interface and a second sub-interface
  • the terminal device may display the first sub-interface and the second sub-interface on a split screen
  • the first sub-interface includes the target image (or the N first folders)
  • the second sub-interface includes the N first folders (or the target image).
  • the above-mentioned N first folders are folders saved in the terminal device and used to store images, and each of the N first folders corresponds to one image type.
  • the image types corresponding to the N first folders may be image types obtained after classification according to the feature information of the images.
  • the image type may include a character type, a landscape type, a selfie type, a dynamic image type, a continuous shooting type, and the like.
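As a purely illustrative sketch of the classification described above (the feature keys such as `burst`, `animated`, `front_camera`, and `faces` are hypothetical, not from the patent), mapping an image's feature information to one of the named image types might look like:

```python
def classify_image(features: dict) -> str:
    """Map hypothetical feature information to one of the image types
    named in the embodiment: character, landscape, selfie, dynamic
    image, continuous shooting."""
    if features.get("burst"):
        return "continuous shooting"
    if features.get("animated"):
        return "dynamic image"
    if features.get("front_camera") and features.get("faces", 0) >= 1:
        return "selfie"
    if features.get("faces", 0) >= 1:
        return "character"
    return "landscape"
```

Each first folder would then correspond to one of these returned type strings, so an image can be routed to the folder matching its type.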
  • the terminal device is a mobile phone as an example for description.
  • the mobile phone displays a shooting interface 10.
  • as shown in (B) in FIG. 3, the mobile phone displays a first interface, which includes N first folders (for example, folder A to folder F) and a first identifier 11; or, as shown in (C) in FIG. 3, the mobile phone displays a first interface (which includes a first sub-interface 12 and a second sub-interface 13), where the first sub-interface 12 includes N first folders (for example, folder A to folder F) and the second sub-interface 13 includes a target image 14.
  • Step 203 The terminal device receives the user's first input for the target object.
  • the user may perform the first input for the target object to trigger the terminal device to save the target image in the target folder.
  • the above-mentioned first input may be an input of the user dragging the first identifier to the target folder.
  • the above-mentioned first input may include a first sub-input and a second sub-input.
  • the first sub-input is the user's long-press input on the first identifier, and the second sub-input is the user's input of dragging the first identifier to the target folder. It can be understood that after the user performs a long-press input on the first identifier, the terminal device can be triggered to set the target image to a movable state.
  • the above-mentioned first input may be a user's click operation on the first identifier and the target folder. Specifically, after the user performs a click operation on the first identifier, the user performs a click operation on the target folder within a preset time period.
  • the user can long-press the target image 14 to put the target image 14 in a movable state; then, as shown in (B) of FIG. 4, the user can drag the target image 14 to the target folder (for example, folder F).
  • Step 204 In response to the first input, the terminal device saves the target image in the target folder.
  • the aforementioned target folder is one of N first folders.
  • the aforementioned target folder is a folder corresponding to the image type of the target image among the N first folders.
  • after saving the target image in the target folder, the terminal device may update the first interface (which includes the target image and the N first folders) to the shooting interface of the terminal device.
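As an illustration only (the class, method, and folder names below are hypothetical, not from the patent), the flow of steps 201 to 204 can be modeled as follows: a photographing input yields a target object (image plus folders), and a first input on a folder saves the image there:

```python
class Terminal:
    """Minimal model of the photographing-and-save flow in steps 201-204."""

    def __init__(self, folder_names):
        # N first folders, one per image type; each holds saved images
        self.folders = {name: [] for name in folder_names}
        self.pending = None  # target image awaiting classification

    def on_photographing_input(self, image):
        # Step 202: display the target object (first identifier + N folders)
        self.pending = image
        return {"first_identifier": image, "first_folders": list(self.folders)}

    def on_first_input(self, target_folder):
        # Step 204: save the target image in the target folder
        self.folders[target_folder].append(self.pending)
        self.pending = None

term = Terminal(["A", "B", "C", "D", "E", "F"])
term.on_photographing_input("IMG_0001")
term.on_first_input("F")  # e.g. the user drags the image onto folder F
```

After the first input, the pending image is cleared, which corresponds to the terminal returning from the first interface to the shooting interface.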
  • Embodiments of the present disclosure provide an image processing method. After receiving a user's photographing input, a terminal device can display a target object on a first interface (the target object includes N first folders, each corresponding to an image type, and a first identifier used to indicate the acquired target image), and after the user performs the first input on the target object, the target image is saved in the target folder.
  • the terminal device can display the target object, so that the user can directly perform an operation (that is, the first input) on the target object to save the target image to a target folder corresponding to an image type.
  • the user does not need to perform multiple operations on the terminal device to save the target image in the target folder, which can simplify the user's operations and save the user's time.
  • the image processing method provided in the embodiment of the present disclosure may further include the following steps 301 to 304.
  • Step 301 The terminal device determines the image type of the target image.
  • the terminal device may obtain characteristic information of the target image, and determine the image type of the target image according to the characteristic information.
  • before performing the photographing input, the user can select the AI image classification control in the shooting interface of the terminal device to trigger the terminal device to start the AI image classification function.
  • the terminal device can obtain the feature information of the target image through the AI image classification function, and determine, according to the feature information, whether the N first folders in the terminal device include a folder corresponding to the image type of the target image.
  • an AI image recognition classification control 15 is displayed on the shooting interface 10 of the mobile phone, and the user clicks on the AI image recognition classification control 15 to trigger the mobile phone to start the AI image recognition classification function.
  • Step 302 The terminal device searches the N first folders for a folder corresponding to the image type of the target image.
  • the terminal device can search for folders for storing images of the target image type (that is, the image type of the target image) from the N first folders through the AI image classification function.
  • Step 303 If at least one second folder corresponding to the image type of the target image is found from the N first folders, the terminal device displays the first prompt message on the first interface.
  • the above-mentioned first prompt information is used to prompt the user of at least one second folder, and the at least one second folder includes the target folder.
  • the aforementioned at least one second folder may be used to store images of the target image type; that is, the images in the at least one second folder have feature information matching (i.e., partially or exactly identical to) that of the target image.
  • the terminal device may superimpose and display the first prompt information on the first interface.
  • the above-mentioned first prompt information includes at least one identifier (for example, icon and/or name) of the second folder.
  • the terminal device may display the identification of the at least one second folder (that is, the at least one second identification) in a preset manner (for example, a color display manner).
  • the terminal device may adjust the display order of the N first folders according to the image type of the target image to prompt the user of the at least one second folder.
  • the terminal device may preferentially display the identification of at least one second folder.
  • the above-mentioned first prompt information includes at least one second identifier, and each second identifier is used to indicate a second folder.
  • the above step 203 can be specifically implemented by the following step 203a.
  • Step 203a The terminal device receives a user's selection input of a second identifier in the first prompt information.
  • the above-mentioned second identifier is used to indicate the target folder.
  • the user can select one of the at least one second identifier to trigger the terminal device to save the target image in the folder indicated by that second identifier (i.e., the target folder).
  • the mobile phone may display the first prompt information 16 on the first interface. The first prompt information 16 includes at least one second identifier (for example, the identifier A of folder A and the identifier B of folder B). The user can select the identifier A to trigger the mobile phone to save the target image 14 to the folder A indicated by the identifier A.
  • after saving the target image in the target folder, the terminal device may update the first prompt information and the first interface (which includes the target image and the N first folders) to the shooting interface of the terminal device.
  • the user can perform an input (for example, a fourth input) on the first prompt information to trigger the terminal device to update the first prompt information to the first interface (which includes the target image and the N first folders), and the user can then perform the first input on the target image and the N first folders to trigger the terminal device to save the target image to the target folder.
  • Step 304 If at least one second folder is not found from the N first folders, the terminal device displays second prompt information on the first interface.
  • the above-mentioned second prompt information is used to prompt the user to create a folder.
  • the above-mentioned second prompt information is used to prompt the user to create a folder corresponding to the image type of the target image.
  • the above-mentioned second prompt information may be used to indicate that at least one second folder is not found.
  • the terminal device may superimpose and display the second prompt information on the first interface.
  • the mobile phone may display the second prompt information 17 on the first interface. The second prompt information 17 is used to prompt the user to create a folder.
  • the terminal device can search for at least one second folder corresponding to the image type of the target image from the N first folders, and when at least one second folder is found, the terminal device can display the first prompt information, so that the user can directly perform an operation (that is, the first input) on the first prompt information to save the target image in the target folder; the user does not need to perform multiple operations on the terminal device to save the target image in the target folder, which can simplify the user's operations and save the user's time.
  • if the terminal device cannot find at least one second folder corresponding to the image type of the target image among the N first folders, the terminal device can display the second prompt information to prompt the user to create a new folder, so that the user can trigger saving the target image to the newly created folder.
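The decision in steps 302 to 304 can be sketched as follows (a hypothetical model, not the patent's implementation): compare each first folder's image type against the target image's type, then choose which prompt to display:

```python
def build_prompt(folder_types: dict, target_type: str) -> dict:
    """folder_types maps a first-folder name to the image type it stores.
    Returns which prompt information to display (steps 303/304)."""
    # Step 302: search the N first folders for ones matching the target type
    second_folders = [name for name, t in folder_types.items() if t == target_type]
    if second_folders:
        # Step 303: first prompt information lists the matching second folders
        return {"prompt": "first", "second_folders": second_folders}
    # Step 304: second prompt information asks the user to create a folder
    return {"prompt": "second", "hint": "create a folder"}

folders = {"A": "character", "B": "character", "C": "landscape"}
```

For a "character" image this yields the first prompt with folders A and B; for a type with no matching folder it yields the second prompt.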
  • the image processing method provided in the embodiment of the present disclosure may further include the following steps 401 and 402.
  • Step 401 The terminal device receives the second input of the user.
  • the above-mentioned second input is used to trigger the terminal device to search for a folder corresponding to the image type of the target image from the N first folders.
  • the above-mentioned second input may be a sliding input (for example, a left sliding input or a right sliding input) of the user on the target image (the first identifier is the target image).
  • the above-mentioned second input can be used to trigger the terminal device to start the AI image classification function, so that the terminal device can obtain the feature information of the target image through the AI image classification function, and then determine, based on the feature information, whether there is a folder corresponding to the image type of the target image among the N first folders in the terminal device.
  • the user can perform a left-slide input or a right-slide input on the target image 14 to trigger the mobile phone to start the AI image classification function.
  • Step 402 In response to the second input, if the terminal device finds at least one second folder corresponding to the image type of the target image from the N first folders, the first prompt information is displayed on the first interface.
  • the above-mentioned first prompt information is used to prompt the user of at least one second folder, and the at least one second folder includes the target folder.
  • the terminal device responds to the second input, and if at least one second folder is not found among the N first folders, it displays the second prompt information on the first interface to prompt the user to create a folder.
  • for the description of step 402, reference may be made to the description of step 303 in the above embodiment, which will not be repeated here.
  • the user can perform a second input to the terminal device to trigger the terminal device to search for at least one second file corresponding to the image type of the target image from the N first folders Folder, and display prompt information (that is, the first prompt information or the second prompt information) according to the search result to facilitate user operations.
  • the image processing method provided in the embodiment of the present disclosure may further include the following steps 501 and 502.
  • Step 501 The terminal device receives the third input of the user.
  • the aforementioned third input is used to trigger the terminal device to establish a folder.
  • the aforementioned third input may be a sliding input of the user on the target image (for example, an upward sliding input).
  • Step 502 In response to the third input, the terminal device creates a third folder, and displays the third folder on the first interface.
  • the image type corresponding to the aforementioned third folder is the same as the image type of the target image.
  • the terminal device may create a folder according to the image type of the target image, and the image type corresponding to the folder is the same as the image type of the target image.
  • the terminal device may display third prompt information on the first interface, and the third prompt information is used to prompt the user for input about the third folder (for example, the user is prompted to input the name of the third folder, that is, the name of the image type corresponding to the third folder).
  • the user can slide up the target image 14; as shown in (B) in FIG. 11, the mobile phone can display the third prompt information 18 on the first interface, and the third prompt information 18 prompts the user to enter the name of the third folder; after the user enters the name of the third folder (for example, folder G), as shown in (C) in FIG. 11, the mobile phone displays the third folder (i.e., folder G) on the first interface.
  • the user may perform a third input to the terminal device to trigger the terminal device to establish a third folder, and display the third folder on the first interface.
  • the user may perform a deletion input on the target image (for example, a downward sliding input on the target image) to trigger the terminal device to delete the target image. It can be understood that after the terminal device deletes the target image, the terminal device displays the shooting interface.
  • the user can slide down the target image 14 to trigger the mobile phone to delete the target image 14, so that the mobile phone displays the shooting interface 10 shown in (A) in FIG. 3.
  • the terminal device may establish a third folder corresponding to the image type of the target image according to the third input of the user, so that the user can trigger the saving of the target image to the newly created folder.
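Steps 501 and 502 above can be sketched in code. The following is a minimal, hypothetical model (the class and method names are illustrative, not from the disclosure): a third input on the target image prompts the user for a folder name, and a third folder is created whose image type is taken directly from the target image.

```python
# Hypothetical sketch of steps 501/502. A "third input" (e.g. an upward
# slide on the target image) triggers creation of a third folder whose
# image type matches the target image; the folder is then shown on the
# first interface. All names here are illustrative.

class Folder:
    def __init__(self, name, image_type):
        self.name = name              # e.g. "folder G", entered by the user
        self.image_type = image_type  # same as the target image's type
        self.images = []

class TerminalDevice:
    def __init__(self):
        self.first_interface = []     # folders displayed on the first interface

    def on_third_input(self, target_image_type, name_from_prompt):
        # Step 502: create the third folder and display it on the first interface.
        folder = Folder(name_from_prompt, target_image_type)
        self.first_interface.append(folder)
        return folder

device = TerminalDevice()
folder_g = device.on_third_input("landscape", "folder G")
```

A folder created this way can then receive the target image in the same manner as any of the N first folders.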
  • the image processing method provided in the embodiment of the present disclosure may further include the following steps 601 and 602.
  • Step 601 The terminal device receives the third input of the user.
  • Step 602 In response to the third input, the terminal device creates a third folder, and displays the third folder on the first interface.
  • for step 601 and step 602, reference may be made to the description of step 501 and step 502 in the above embodiment, which will not be repeated here.
  • FIG. 13 shows a schematic diagram of a possible structure of a terminal device involved in an embodiment of the present disclosure.
  • the terminal device 130 may include: a receiving unit 131, a display unit 132, and a saving unit 133.
  • the receiving unit 131 is configured to receive a user's photo input.
  • the display unit 132 is configured to display the target object on the first interface in response to the photo input received by the receiving unit 131; the target object includes a first identifier and N first folders, the first identifier is used to indicate the captured target image, and each first folder corresponds to an image type.
  • the receiving unit 131 is further configured to receive the user's first input for the target object.
  • the saving unit 133 is configured to save the target image in a target folder in response to the first input received by the receiving unit 131, and the target folder is one of the N first folders.
  • the display unit 132 is further configured to, in response to the photo input received by the receiving unit 131, display the target object on the first interface, and, if at least one second folder corresponding to the image type of the target image is found among the N first folders, display first prompt information on the first interface. The first prompt information is used to prompt the user of the at least one second folder, and the at least one second folder includes the target folder.
  • the receiving unit 131 is further configured to receive a second input from the user after the display unit 132 responds to the photo input received by the receiving unit 131 and displays the target object on the first interface.
  • the display unit 132 is further configured to, in response to the second input received by the receiving unit 131, display the first prompt information on the first interface if at least one second folder corresponding to the image type of the target image is found among the N first folders; the first prompt information is used to prompt the user of the at least one second folder, and the at least one second folder includes the target folder.
  • the foregoing first prompt information may include at least one second identifier, and each second identifier is used to indicate a second folder.
  • the receiving unit 131 is specifically configured to receive a user's selection input of one second identifier in the first prompt information, where the selected second identifier indicates the target folder.
  • the display unit 132 is further configured to display second prompt information on the first interface if no second folder is found among the N first folders; the second prompt information is used to prompt the user to create a folder.
  • the receiving unit 131 is further configured to receive a third input of the user.
  • the terminal device 130 provided in the embodiment of the present disclosure may further include a creating unit 134.
  • the creating unit 134 is configured to create a third folder in response to the third input received by the receiving unit 131.
  • the display unit 132 is further configured to display the third folder created by the creating unit 134 on the first interface, and the image type corresponding to the third folder is the same as the image type of the target image.
  • the terminal device provided by the embodiment of the present disclosure can implement each process implemented by the terminal device in the foregoing method embodiment. To avoid repetition, the detailed description will not be repeated here.
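The cooperation of the units above can be summarized in a short sketch. This is a hypothetical model only (the function names and dictionary layout are illustrative): the "second folders" are the first folders whose image type matches the target image; if at least one exists, the user's selection picks the target folder and the target image is saved there, otherwise the user is prompted to create a folder.

```python
# Hypothetical sketch of the saving flow described above. "Second
# folders" are the first folders whose image type matches the target
# image; the user's selection input picks the target folder.

def find_second_folders(first_folders, image_type):
    # The first prompt information would list these matching folders.
    return [f for f in first_folders if f["image_type"] == image_type]

def save_target_image(first_folders, target_image):
    matches = find_second_folders(first_folders, target_image["type"])
    if not matches:
        # Second prompt information: ask the user to create a folder.
        return "prompt: create a folder"
    target_folder = matches[0]  # stands in for the user's selection input
    target_folder["images"].append(target_image)
    return target_folder["name"]

folders = [
    {"name": "folder A", "image_type": "portrait", "images": []},
    {"name": "folder B", "image_type": "landscape", "images": []},
]
result = save_target_image(folders, {"type": "landscape"})
```

The single selection input replaces the multiple operations (open gallery, browse, move image) that saving to a typed folder would otherwise require.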
  • the embodiment of the present disclosure provides a terminal device. After receiving a user's photo input, the terminal device can display a target object on a first interface (the target object includes N first folders, each corresponding to one image type, and a first identifier used to indicate the captured target image), and after the user performs a first input for the target object, the target image is saved in a target folder.
  • in this way, the terminal device can display the target object so that the user can directly operate on the target object (that is, perform the first input) to save the target image to the target folder corresponding to an image type. The user does not need to perform multiple operations on the terminal device to save the target image in the target folder, which simplifies the user's operation and saves the user's time.
  • Fig. 15 is a hardware schematic diagram of a terminal device for implementing various embodiments of the present disclosure.
  • the terminal device 100 includes but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, a power supply 111, and other components.
  • the terminal device structure shown in FIG. 15 does not constitute a limitation on the terminal device; the terminal device may include more or fewer components than those shown in FIG. 15, a combination of some components, or a different arrangement of components.
  • terminal devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminals, wearable devices, and pedometers.
  • the processor 110 is configured to control the user input unit 107 to receive the user's photo input, and to receive the user's first input for the target object.
  • the processor 110 is configured to, in response to the photo input received by the user input unit 107, control the display unit 106 to display a target object on a first interface.
  • the target object includes a first identifier and N first folders; the first identifier is used to indicate the captured target image, and each first folder corresponds to an image type.
  • the processor 110 is configured to save the target image in a target folder in response to the first input received by the user input unit 107, and the target folder is one of the N first folders.
  • the embodiment of the present disclosure provides a terminal device. After receiving a user's photo input, the terminal device can display a target object on a first interface (the target object includes N first folders, each corresponding to one image type, and a first identifier used to indicate the captured target image), and after the user performs a first input for the target object, the target image is saved in a target folder.
  • in this way, the terminal device can display the target object so that the user can directly operate on the target object (that is, perform the first input) to save the target image to the target folder corresponding to an image type. The user does not need to perform multiple operations on the terminal device to save the target image in the target folder, which simplifies the user's operation and saves the user's time.
  • the radio frequency unit 101 can be used for receiving and sending signals in the process of sending and receiving information or talking. Specifically, after receiving downlink data from a base station, it delivers the data to the processor 110 for processing; in addition, it sends uplink data to the base station.
  • the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 101 can also communicate with the network and other devices through a wireless communication system.
  • the terminal device provides users with wireless broadband Internet access through the network module 102, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 103 can convert the audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into audio signals and output them as sounds. Moreover, the audio output unit 103 may also provide audio output related to a specific function performed by the terminal device 100 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 104 is used to receive audio or video signals.
  • the input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042.
  • the graphics processor 1041 is configured to process image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frame can be displayed on the display unit 106.
  • the image frames processed by the graphics processor 1041 can be stored in the memory 109 (or other storage medium) or sent via the radio frequency unit 101 or the network module 102.
  • the microphone 1042 can receive sound, and can process such sound into audio data.
  • in a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 and output.
  • the terminal device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light.
  • the proximity sensor can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), can detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the terminal device (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and vibration-recognition-related functions (such as a pedometer and tapping); the sensor 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which will not be repeated here.
  • the display unit 106 is used to display information input by the user or information provided to the user.
  • the display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
  • the user input unit 107 may be used to receive inputted numeric or character information, and generate key signal input related to user settings and function control of the terminal device.
  • the user input unit 107 includes a touch panel 1071 and other input devices 1072.
  • the touch panel 1071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1071 with a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 1071 may include two parts: a touch detection device and a touch controller.
  • the touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends the coordinates to the processor 110, and receives and executes commands sent by the processor 110.
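The touch pipeline just described can be modeled in a few lines. This is purely illustrative (the signal format and command shape are assumptions, not from the disclosure): the detection device's raw signal is converted to contact coordinates, the coordinates go to the processor, and the processor's command comes back to the controller to be executed.

```python
# Illustrative model of the touch pipeline: detection device -> touch
# controller (raw signal to contact coordinates) -> processor -> command
# returned to the controller. The data shapes here are assumptions.

def touch_controller(raw_signal, processor_fn):
    coords = (raw_signal["x"], raw_signal["y"])  # convert signal to contact coordinates
    command = processor_fn(coords)               # send coordinates to the processor
    return command                               # controller executes the returned command

def processor_110(coords):
    # Decide the type of the touch event from the contact coordinates.
    return {"event": "tap", "at": coords}

cmd = touch_controller({"x": 10, "y": 20}, processor_110)
```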
  • the touch panel 1071 can be realized by various types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 107 may also include other input devices 1072.
  • other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 1071 can be overlaid on the display panel 1061.
  • after the touch panel 1071 detects a touch operation on or near it, the operation is transmitted to the processor 110 to determine the type of the touch event, and the processor 110 then provides corresponding visual output on the display panel 1061 according to the type of the touch event.
  • although in FIG. 15 the touch panel 1071 and the display panel 1061 are used as two independent components to realize the input and output functions of the terminal device, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to realize the input and output functions of the terminal device, which is not specifically limited here.
  • the interface unit 108 is an interface for connecting an external device with the terminal device 100.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like.
  • the interface unit 108 can be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements in the terminal device 100, or can be used to transfer data between the terminal device 100 and an external device.
  • the memory 109 can be used to store software programs and various data.
  • the memory 109 may mainly include a program storage area and a data storage area.
  • the program storage area may store an operating system and an application program required by at least one function (such as a sound playback function or an image playback function); the data storage area may store data (such as audio data and a phone book) created according to the use of the mobile phone.
  • the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the processor 110 is the control center of the terminal device. It connects the various parts of the entire terminal device through various interfaces and lines, and performs the various functions of the terminal device and processes data by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, thereby monitoring the terminal device as a whole.
  • the processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly processes the operating system, user interface, application programs, etc., and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may alternatively not be integrated into the processor 110.
  • the terminal device 100 may also include a power source 111 (such as a battery) for supplying power to various components. Optionally, the power source 111 may be logically connected to the processor 110 through a power management system, so as to implement functions such as managing charging, discharging, and power consumption through the power management system.
  • the terminal device 100 includes some functional modules not shown, which will not be repeated here.
  • an embodiment of the present disclosure further provides a terminal device, including the processor 110 shown in FIG. 15, the memory 109, and a computer program stored in the memory 109 and capable of running on the processor 110. When the computer program is executed by the processor 110, each process of the foregoing method embodiment is implemented and the same technical effect can be achieved. To avoid repetition, details are not repeated here.
  • the embodiments of the present disclosure also provide a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, each process of the foregoing method embodiment is implemented and the same technical effect can be achieved. To avoid repetition, details are not repeated here.
  • the computer-readable storage medium is, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
  • the technical solution of the present disclosure, in essence or the part contributing to the existing technology, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions to make a terminal (which can be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) execute the methods described in the various embodiments of the present disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Environmental & Geological Engineering (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present disclosure relate to an image processing method and a terminal device. The method comprises: receiving a photographing input from a user; in response to the photographing input, displaying a target object on a first interface, the target object containing a first identifier and N first folders, the first identifier being used to indicate a captured target image; receiving a first input from the user for the target object; and, in response to the first input, saving the target image in a target folder, the target folder being one of the N first folders. The embodiments of the present disclosure apply to a process of classifying and saving a captured image.
PCT/CN2020/077751 2019-03-12 2020-03-04 Image processing method and terminal device WO2020182035A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910183663.4A CN110049185A (zh) 2019-03-12 2019-03-12 Image processing method and terminal device
CN201910183663.4 2019-03-12

Publications (1)

Publication Number Publication Date
WO2020182035A1 true WO2020182035A1 (fr) 2020-09-17

Family

ID=67274750

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/077751 WO2020182035A1 (fr) 2020-03-04 2019-03-12 Image processing method and terminal device

Country Status (2)

Country Link
CN (1) CN110049185A (fr)
WO (1) WO2020182035A1 (fr)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110049185A (zh) * 2019-03-12 2019-07-23 维沃移动通信有限公司 图像处理方法及终端设备
CN111050076B (zh) * 2019-12-26 2021-08-27 维沃移动通信有限公司 拍摄处理方法及电子设备
CN111310096A (zh) * 2020-02-25 2020-06-19 维沃移动通信有限公司 内容保存方法、电子设备及计算机可读存储介质
CN111371999A (zh) * 2020-03-17 2020-07-03 Oppo广东移动通信有限公司 一种图像管理方法、装置、终端及存储介质
CN111597370B (zh) * 2020-04-22 2023-08-01 维沃移动通信有限公司 一种拍摄方法及电子设备
CN113360684A (zh) * 2021-05-25 2021-09-07 维沃移动通信(杭州)有限公司 图片管理方法、装置及电子设备

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006261916A (ja) * 2005-03-16 2006-09-28 Casio Comput Co Ltd Photographing apparatus
CN102984390A (zh) * 2012-12-06 2013-03-20 广州市易票联支付技术有限公司 Mobile phone photographing management method and photographing mobile phone
CN105224644A (zh) * 2015-09-28 2016-01-06 小米科技有限责任公司 Information classification method and device
CN106202210A (zh) * 2016-06-27 2016-12-07 依偎科技(南昌)有限公司 Photo classification method and device
WO2018062901A1 (fr) * 2016-09-29 2018-04-05 (주) 비미오 Method for designating and tagging an album of photographs stored in a touch-screen terminal, computer-readable recording medium, and terminal
CN110049185A (zh) * 2019-03-12 2019-07-23 维沃移动通信有限公司 Image processing method and terminal device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101635763A (zh) * 2008-07-23 2010-01-27 深圳富泰宏精密工业有限公司 Picture classification system and method
CN102917126A (zh) * 2012-10-11 2013-02-06 中兴通讯股份有限公司 Photo processing method and system
US10809875B2 (en) * 2015-08-03 2020-10-20 Lenovo (Beijing) Co., Ltd. Display control method and device, and electronic apparatus
CN109117037B (zh) * 2018-07-12 2021-03-23 维沃移动通信有限公司 Image processing method and terminal device


Also Published As

Publication number Publication date
CN110049185A (zh) 2019-07-23


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20769114

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20769114

Country of ref document: EP

Kind code of ref document: A1