CN109117037B - Image processing method and terminal equipment - Google Patents


Info

Publication number
CN109117037B
Authority
CN
China
Prior art keywords
image
target
identifier
input
application program
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810762649.5A
Other languages
Chinese (zh)
Other versions
CN109117037A (en)
Inventor
李月鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201810762649.5A
Publication of CN109117037A
Application granted; publication of CN109117037B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques using icons
    • G06F3/0484 Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G06F3/04845 Interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0486 Drag-and-drop
    • G06F3/0487 Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The embodiment of the invention discloses an image processing method and a terminal device, relates to the technical field of terminal devices, and can solve the problem that when a user stores an image into an application program of the terminal device or sends the image to another terminal device, the user's operation process is relatively complicated and time-consuming. The specific scheme is as follows: when a first interface of the terminal device includes an identifier of a target image and an identifier of a target application program, a first input performed by the user on the identifier of the target image is received; in response to the first input, the identifier of the target image is moved to a first position, the first position being the position of the identifier of the target application program on the first interface; and the target image is sent through the target application program, or stored into a storage area corresponding to an expression package in the target application program. The embodiment of the invention applies to the image processing process of a terminal device.

Description

Image processing method and terminal equipment
Technical Field
The embodiment of the invention relates to the technical field of terminal equipment, in particular to an image processing method and terminal equipment.
Background
Generally, when a user uses certain application programs (such as social software) on a terminal device, expression images can be sent to other terminal devices through these application programs to make communication between users more engaging.
In practical application, a user can send an expression image corresponding to the user's face image to another terminal device. To do so, the user may perform an input on the expression image (for example, click a "share" icon) to trigger the terminal device to display a plurality of application programs; the user then selects one of these application programs and triggers the terminal device to store the expression image into that application program, or to send the expression image to another terminal device through it.
However, in the above method, to store the expression image into an application program of the terminal device or send it to another terminal device, the user needs to perform multiple inputs, so the user's operation process is complicated and time-consuming.
Disclosure of Invention
The embodiment of the invention provides an image processing method and a terminal device, which can solve the problem that the user's operation process is complicated and time-consuming when the user stores an image into an application program of the terminal device or sends the image to another terminal device.
In order to solve the technical problem, the embodiment of the invention adopts the following technical scheme:
in a first aspect of the embodiments of the present invention, there is provided an image processing method, where the image processing method may include: under the condition that a first interface of the terminal equipment comprises an identification of a target image and an identification of a target application program, receiving a first input of the identification of the target image by a user; in response to a first input, moving the identifier of the target image to a first location, the first location being a location of the identifier of the target application on the first interface; and sending the target image through the target application program, or storing the target image into a storage area corresponding to the expression package in the target application program.
In a second aspect of the embodiments of the present invention, a terminal device is provided, where the terminal device may include: a receiving unit and a processing unit. The receiving unit is used for receiving a first input of the identification of the target image from a user under the condition that the first interface of the terminal device comprises the identification of the target image and the identification of the target application program. The processing unit is used for responding to the first input received by the receiving unit and moving the identification of the target image to a first position, wherein the first position is the position of the identification of the target application program on the first interface; and sending the target image through the target application program, or storing the target image into a storage area corresponding to the expression package in the target application program.
In a third aspect of the embodiments of the present invention, a terminal device is provided, where the terminal device includes a processor, a memory, and a computer program stored in the memory and being executable on the processor, and the computer program, when executed by the processor, implements the steps of the method for image processing according to the first aspect.
A fourth aspect of embodiments of the present invention provides a computer-readable storage medium on which a computer program is stored, the computer program, when executed by a processor, implementing the steps of the method of image processing according to the first aspect.
In the embodiment of the present invention, in a case that the first interface of the terminal device includes an identifier of the target image and an identifier of the target application program, the terminal device may receive a first input of the identifier of the target image by the user, move the identifier of the target image to a first position (where the first position is a position of the identifier of the target application program on the first interface), and then send the target image through the target application program or store the target image in a storage area corresponding to the expression package in the target application program. The user can directly perform first input on the identifier of the target image to trigger the terminal device to move the identifier of the target image to the first position, so that the terminal device can send the target image through the target application program or store the target image into the storage area corresponding to the expression package in the target application program, and the terminal device can be triggered to send the target image or store the target image into the storage area corresponding to the expression package in the target application program without performing multiple inputs by the user.
Drawings
Fig. 1 is a schematic structural diagram of an android operating system according to an embodiment of the present invention;
Fig. 2 is a first schematic diagram of an image processing method according to an embodiment of the present invention;
Fig. 3 is a first schematic diagram of an example of a mobile phone interface according to an embodiment of the present invention;
Fig. 4 is a second schematic diagram of an image processing method according to an embodiment of the present invention;
Fig. 5 is a second schematic diagram of an example of a mobile phone interface according to an embodiment of the present invention;
Fig. 6 is a third schematic diagram of an example of a mobile phone interface according to an embodiment of the present invention;
Fig. 7 is a third schematic diagram of an image processing method according to an embodiment of the present invention;
Fig. 8 is a fourth schematic diagram of an image processing method according to an embodiment of the present invention;
Fig. 9 is a fifth schematic diagram of an image processing method according to an embodiment of the present invention;
Fig. 10 is a first schematic structural diagram of a terminal device according to an embodiment of the present invention;
Fig. 11 is a second schematic structural diagram of a terminal device according to an embodiment of the present invention;
Fig. 12 is a third schematic structural diagram of a terminal device according to an embodiment of the present invention;
Fig. 13 is a hardware schematic diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first" and "second," and the like, in the description and in the claims of embodiments of the present invention are used for distinguishing between different objects and not for describing a particular order of the objects. For example, the first input and the second input, etc. are for distinguishing different inputs, rather than for describing a particular order of inputs. In the description of the embodiments of the present invention, the meaning of "a plurality" means two or more unless otherwise specified.
The term "and/or" herein describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, both A and B exist, or B exists alone. The symbol "/" herein denotes an "or" relationship between the associated objects; for example, "A/B" denotes A or B.
In the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate an example, illustration, or description. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention is not to be construed as preferred or more advantageous than other embodiments or designs. Rather, use of the words "exemplary" or "for example" is intended to present a related concept in a concrete fashion.
The embodiment of the invention provides an image processing method and terminal equipment, wherein under the condition that a first interface of the terminal equipment displays an identifier of a target image and an identifier of a target application program, the terminal equipment can receive first input of the identifier of the target image by a user, move the identifier of the target image to a first position (the first position is the position of the identifier of the target application program on the first interface), and then send the target image through the target application program or store the target image into a storage area corresponding to an expression package in the target application program. The user can directly perform first input on the identifier of the target image to trigger the terminal device to move the identifier of the target image to the first position, so that the terminal device can send the target image through the target application program or store the target image into the storage area corresponding to the expression package in the target application program, and the terminal device can be triggered to send the target image or store the target image into the storage area corresponding to the expression package in the target application program without performing multiple inputs by the user.
The image processing method and the terminal device provided by the embodiment of the invention can be applied to the process of processing the image by the terminal device. Specifically, the method can be applied to a process that the terminal device sends the target image or stores the target image in a storage area corresponding to the emoticon in the target application program.
The terminal device in the embodiment of the present invention may be a terminal device having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present invention are not limited in particular.
The following describes a software environment to which the image processing method provided by the embodiment of the present invention is applied, by taking an android operating system as an example.
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention. In fig. 1, the architecture of the android operating system includes 4 layers, which are respectively: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third-party application programs) in an android operating system.
The application framework layer is a framework of the application, and a developer can develop some applications based on the application framework layer under the condition of complying with the development principle of the framework of the application.
The system runtime layer includes libraries (also called system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is an operating system layer of an android operating system and belongs to the bottommost layer of an android operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the android operating system based on the Linux kernel.
Taking an android operating system as an example, in the embodiment of the present invention, a developer may develop a software program for implementing the image processing method provided in the embodiment of the present invention based on the system architecture of the android operating system shown in fig. 1, so that the image processing method may be run based on the android operating system shown in fig. 1. Namely, the processor or the terminal device can implement the image processing method provided by the embodiment of the invention by running the software program in the android operating system.
The following describes an image processing method and a terminal device according to an embodiment of the present invention in detail by using specific embodiments and application scenarios thereof with reference to the accompanying drawings.
At present, when a user stores an expression image into an application program of a terminal device or sends the expression image to another terminal device, the user needs to perform multiple inputs to trigger the terminal device to store or send the image; the user's operation process is therefore complicated and time-consuming.
In order to solve the above technical problem, an embodiment of the present invention provides an image processing method. Exemplarily, fig. 2 shows a flowchart of a method for image processing according to an embodiment of the present invention, which may be applied to a terminal device having an android operating system as shown in fig. 1. Wherein, although a logical order of the methods of image processing provided by embodiments of the present invention is shown in a method flow diagram, in some cases, various steps shown or described may be performed in an order different than here. For example, as shown in fig. 2, the method of image processing may include steps 201 and 202 described below.
Step 201, in the case that the first interface of the terminal device includes the identifier of the target image and the identifier of the target application program, the terminal device receives a first input of the identifier of the target image by a user.
In this embodiment of the present invention, the first input may be used to trigger the terminal device to move the identifier of the target image to a first location, where the first location is a location of the identifier of the target application program on the first interface.
Optionally, in this embodiment of the present invention, the terminal device may display the identifier of the target image and the identifier of the target application on the first interface, and the user may perform the first input on the identifier of the target image.
Optionally, in the embodiment of the present invention, the user may perform an input on a target control to trigger the terminal device to display the identifier of the target image and the identifier of the target application program on the first interface.
Optionally, in this embodiment of the present invention, the first input may be a drag input performed by the user on the identifier of the target image, or may be consecutive clicks performed by the user at two positions. This may be determined according to actual use requirements, and is not limited in the embodiment of the present invention.
Optionally, in the embodiment of the present invention, the user may drag the identifier of the target image to the identifier of the target application program in a manner of a straight line, a curve, an arc, or the like, so as to trigger the terminal device to move the identifier of the target image to a position of the identifier of the target application program on the first interface; or, the user may continuously click on the identifier of the target image and the identifier of the target application program to trigger the terminal device to move the identifier of the target image to the position of the identifier of the target application program on the first interface.
It should be noted that in the embodiment of the present invention, "continuously" may mean that there is no other operation between two operations, and a time interval between the two operations is smaller than a preset threshold.
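As a sketch, the "consecutive" criterion above (no intervening operation, and a time interval below a preset threshold) could be modeled as follows; the `Tap` structure and the threshold value are illustrative assumptions, not taken from the embodiment itself.

```python
from dataclasses import dataclass

@dataclass
class Tap:
    x: float  # screen coordinates of the click
    y: float
    t: float  # timestamp in seconds

# Hypothetical preset threshold; the embodiment does not specify a value.
CONSECUTIVE_THRESHOLD_S = 0.5

def is_consecutive(first: Tap, second: Tap, intervening_ops: int = 0) -> bool:
    """Two clicks count as 'consecutive' when no other operation occurs
    between them and their time interval is below the preset threshold."""
    return intervening_ops == 0 and (second.t - first.t) < CONSECUTIVE_THRESHOLD_S
```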
Optionally, in this embodiment of the present invention, the identifier of the target image may be an icon, a name, and the like of the target image, and the identifier of the target application may be an icon, a name, and the like of the target application. The specific method can be determined according to actual use requirements, and the embodiment of the invention is not limited herein.
Optionally, in this embodiment of the present invention, the identifier of the target image may be a thumbnail image of the target image.
Optionally, in the embodiment of the present invention, the target image may be an expression image.
Optionally, in this embodiment of the present invention, the first interface may be a shooting interface of a "camera" application, and may also be an interface of an "album" application.
For example, the terminal device is taken as a mobile phone for explanation. As shown in fig. 3 a, the first interface of the mobile phone 10 is a shooting interface 11 of the "camera" application, an identifier of the target image (e.g., an identifier of the image 1) and an identifier of the target application (e.g., an identifier of the application 1) are displayed on the shooting interface 11, and the user can drag the identifier of the image 1 to a position where the identifier of the application 1 is on the shooting interface 11 (indicated by a straight line with an arrow in fig. 3 a).
As another example, as shown in (B) in fig. 3, the first interface of the mobile phone 10 is an interface 12 of an "album" application; an identifier of a target image (e.g., an identifier of image 1) and an identifier of a target application (e.g., an identifier of application 1) are displayed on the interface 12, and the user may drag the identifier of image 1 to the position of the identifier of application 1 on the interface 12 (indicated by a straight line with an arrow in (B) in fig. 3).
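The drag examples above reduce to a simple hit test: the first input succeeds when the drag's end point falls within the bounds of the target application's identifier (the first position). A minimal sketch, where the rectangle representation of the identifier's bounds is an assumption:

```python
def drag_lands_on(app_rect, drop_point):
    """Return True when the drop point lies inside the app identifier's
    bounding rectangle, given as (left, top, right, bottom)."""
    left, top, right, bottom = app_rect
    x, y = drop_point
    return left <= x <= right and top <= y <= bottom
```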
Step 202, the terminal equipment responds to the first input and moves the identification of the target image to a first position; and sending the target image through the target application program, or storing the target image into a storage area corresponding to the expression package in the target application program.
In an embodiment of the present invention, the first location is a location of an identifier of the target application on the first interface.
In the embodiment of the invention, after receiving the first input of the user, the terminal device can firstly move the identifier of the target image to the first position; and then, the target image is sent to other terminal equipment through the target application program, or the target image is stored in a storage area corresponding to the expression package in the target application program.
It can be understood that, in the embodiment of the present invention, the storage area corresponding to the expression package is a storage area used for storing the expression package in the target application program.
It should be noted that, reference may be made to the following detailed description of the method embodiment for the terminal device to send the target image through the target application, which is not described herein again.
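The branch in step 202 (send the target image, or store it in the storage area corresponding to the expression package) can be sketched as a small dispatcher; the function name and the dict-based storage area are illustrative assumptions rather than the embodiment's actual implementation:

```python
def handle_drop(image: str, app: str, action: str, emoji_storage: dict) -> str:
    """After the image identifier reaches the first position (the app
    identifier's position), either send the target image through the
    target application or store it in the app's expression-package area."""
    if action == "send":
        return f"sent {image} via {app}"
    if action == "store":
        emoji_storage.setdefault(app, []).append(image)
        return f"stored {image} in expression package of {app}"
    raise ValueError(f"unsupported action: {action}")
```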
The embodiment of the invention provides an image processing method, wherein under the condition that a first interface of a terminal device comprises an identifier of a target image and an identifier of a target application program, the terminal device can receive a first input of the identifier of the target image by a user, move the identifier of the target image to a first position (the first position is the position of the identifier of the target application program on the first interface), and then send the target image through the target application program or store the target image into a storage area corresponding to an expression package in the target application program. The user can directly perform first input on the identifier of the target image to trigger the terminal device to move the identifier of the target image to the first position, so that the terminal device can send the target image through the target application program or store the target image into the storage area corresponding to the expression package in the target application program, and the terminal device can be triggered to send the target image or store the target image into the storage area corresponding to the expression package in the target application program without performing multiple inputs by the user.
Optionally, in the embodiment of the present invention, as shown in fig. 4 with reference to fig. 2, before step 201, the method for processing an image according to the embodiment of the present invention may further include step 301 and step 302 described below.
Step 301, the terminal device receives a second input of the target control from the user.
Optionally, in this embodiment of the present invention, the target control may include a selection box in a preset display area, and the shape of the selection box may be any possible shape such as a circle, a rectangle, a triangle, a diamond, or a polygon. The method can be determined according to actual use requirements, and the embodiment of the invention is not limited.
Optionally, in this embodiment of the present invention, the second input may be a long-press operation of the target control by the user.
Optionally, in this embodiment of the present invention, the user may also perform a second input on one folder, so as to trigger the terminal device to display the identifier of the at least one image and the identifier of the at least one application program on the first interface.
Illustratively, as shown in fig. 5 (a), a target control 13 is displayed on the shooting interface 11 of the mobile phone 10, and the user may make a second input to the target control 13; as shown in fig. 5 (B), the folder 1 is displayed on the interface 14 of the mobile phone 10, and the user can perform a second input to the folder 1.
Step 302, the terminal device responds to the second input and displays the identification of the at least one image and the identification of the at least one application program on the first interface.
In an embodiment of the present invention, the identifier of the at least one image includes an identifier of a target image, and the identifier of the at least one application includes an identifier of a target application.
Optionally, in this embodiment of the present invention, the terminal device may display the identifier of the at least one image on the first interface in a first preset manner, and display the identifier of the at least one application on the first interface in a second preset manner.
For example, the first preset manner may be displaying the identifier of the at least one image in a fan shape on the first interface; the second preset manner may be displaying the identifier of the at least one application program sequentially on the first interface.
Illustratively, in conjunction with (A) in fig. 5, after the user performs the second input on the target control 13, as shown in (A) in fig. 6, the mobile phone 10 may display the identifier of at least one image (such as the identifier of image 1, the identifier of image 2, and the identifier of image 3) in a fan shape on the shooting interface 11, and sequentially display the identifier of at least one application program (such as the identifier of application 1, the identifier of application 2, and the identifier of application 3) on the shooting interface 11. In conjunction with (B) in fig. 5, after the user performs the second input on folder 1, as shown in (B) in fig. 6, the mobile phone 10 may sequentially display the identifier of at least one image (e.g., the identifier of image 1, the identifier of image 2, and the identifier of image 3) on the interface 12, and sequentially display the identifier of at least one application program (e.g., the identifier of application 1, the identifier of application 2, and the identifier of application 3) on the interface 12.
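As a sketch of how such a fan-shaped arrangement of identifiers might be computed (the embodiment does not prescribe any particular geometry; the radius, angular range, and function name below are illustrative assumptions):

```python
import math

def fan_positions(anchor_x, anchor_y, count, radius=150.0,
                  start_deg=30.0, end_deg=150.0):
    """Place `count` identifier icons on an arc above the control's anchor.

    Hypothetical helper: the embodiment only says the identifiers are
    "displayed in a fan shape"; the arc parameters here are assumptions.
    """
    if count == 1:
        angles = [(start_deg + end_deg) / 2.0]
    else:
        step = (end_deg - start_deg) / (count - 1)
        angles = [start_deg + i * step for i in range(count)]
    positions = []
    for a in angles:
        rad = math.radians(a)
        # screen coordinates: y grows downward, so subtract the sine term
        positions.append((anchor_x + radius * math.cos(rad),
                          anchor_y - radius * math.sin(rad)))
    return positions
```

With three image identifiers and the control at the origin, the middle identifier lands directly above the control at the chosen radius.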
Optionally, in this embodiment of the present invention, as shown in fig. 7 in combination with fig. 4, the step 302 may be specifically implemented by a step 302a described below.
Step 302a, the terminal device responds to the second input, and displays the target image and the identifier of the at least one application program on the first interface under the condition that the input parameter corresponding to the second input meets the preset condition.
Optionally, in an embodiment of the present invention, the input parameter may include at least one of a strength value of the second input, a number of sub-inputs included in the second input, a time interval between two consecutive sub-inputs, and a position of two consecutive sub-inputs.
Optionally, in an embodiment of the present invention, the input parameter may be the strength value of the second input, and the preset condition may be that the strength value of the second input is greater than or equal to a first threshold; the input parameter may be the number of sub-inputs included in the second input, and the preset condition may be that the number of sub-inputs included in the second input is greater than or equal to a second threshold; the input parameter may be the time interval between two consecutive sub-inputs, and the preset condition may be that the time interval between two consecutive sub-inputs is less than or equal to a third threshold; or the input parameter may be the positions of two consecutive sub-inputs, and the preset condition may be that the positions of the two consecutive sub-inputs are the position of the identifier of the target image on the first interface and the position of the identifier of the target application program on the first interface, respectively.
Optionally, in this embodiment of the present invention, the sub-input included in the second input may be a click operation.
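The preset-condition checks described above can be sketched as follows. The threshold values and parameter names are illustrative assumptions, since the embodiment deliberately leaves the concrete thresholds open:

```python
# Illustrative thresholds; the embodiment does not fix concrete values.
STRENGTH_THRESHOLD = 5.0   # first threshold (strength value)
MIN_SUB_INPUTS = 2         # second threshold (number of sub-inputs)
MAX_INTERVAL_MS = 400      # third threshold (time between sub-inputs)

def meets_preset_condition(strength=None, sub_input_count=None,
                           interval_ms=None, sub_input_positions=None,
                           image_id_pos=None, app_id_pos=None):
    """Return True when any supplied input parameter satisfies its condition."""
    if strength is not None and strength >= STRENGTH_THRESHOLD:
        return True
    if sub_input_count is not None and sub_input_count >= MIN_SUB_INPUTS:
        return True
    if interval_ms is not None and interval_ms <= MAX_INTERVAL_MS:
        return True
    if (sub_input_positions is not None and image_id_pos is not None
            and app_id_pos is not None):
        # Two consecutive sub-inputs must land on the target image
        # identifier and the target application identifier, respectively.
        return list(sub_input_positions) == [image_id_pos, app_id_pos]
    return False
```

Only when one of the configured conditions holds would the terminal device proceed to display the identifiers on the first interface.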
Optionally, in the embodiment of the present invention, as shown in fig. 8, the method for processing an image according to the embodiment of the present invention may include step 201, step 202a, step 401, step 402, and step 202b described below.
Fig. 8 is obtained from fig. 2 as follows: step 202 in fig. 2, "the terminal device moves the identifier of the target image to the first position in response to the first input and transmits the target image through the target application program", is split so that "the terminal device moves the identifier of the target image to the first position in response to the first input" becomes step 202a in fig. 8; the following steps 401 and 402 are added after step 202a; and "the target image is transmitted through the target application program" in step 202 is replaced with the following step 202b.
Step 201, in the case that the first interface of the terminal device includes the identifier of the target image and the identifier of the target application program, the terminal device receives a first input of the identifier of the target image by a user.
Step 202a, the terminal device responds to the first input and moves the identification of the target image to the first position.
Step 401, the terminal device displays the target receiving object in the target application program on a second interface of the terminal device.
Optionally, in this embodiment of the present invention, the target receiving object may be a group or a contact in the target application.
Step 402, the terminal device receives a third input of the target receiving object from the user.
Optionally, in this embodiment of the present invention, the third input may be a selection input of the target receiving object by the user, for example, a click operation of the target receiving object by the user.
Step 202b, the terminal device responds to the third input and sends the target image to the target receiving object through the target application program.
Optionally, in the embodiment of the present invention, as shown in fig. 9 with reference to fig. 2, before step 201, the method for processing an image according to the embodiment of the present invention may further include step 501 and step 502 described below.
Step 501, the terminal device acquires a first image.
Optionally, in the embodiment of the present invention, the terminal device may acquire the first image through a camera of the terminal device, or may acquire the first image from an image stored in the terminal device.
Step 502, the terminal device performs image processing on the first image, and generates at least one second image by using a preset template.
In an embodiment of the present invention, the at least one second image includes a target image.
Optionally, in the embodiment of the present invention, the terminal device may perform image processing on the first image through 3D modeling to obtain a 3D image, convert the 3D image into a Q-style (cartoon-style) image according to a preset rule, and then generate the at least one second image by using a preset template.
Optionally, in the embodiment of the present invention, the preset templates may include templates of various expressions such as anger, crying, happiness, smiling, and the like; the terminal device may process the first image using a preset template to generate at least one second image (e.g., at least one emoticon).
Optionally, in an embodiment of the present invention, the at least one second image may be an image in the GIF format, or an image in another format.
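The processing chain of steps 501 and 502 — first image, 3D modeling, Q-style conversion, and per-template emoticon generation — can be sketched as below. Every function and template name is a hypothetical stand-in, since the embodiment does not disclose the modeling or conversion algorithms:

```python
from dataclasses import dataclass

@dataclass
class SecondImage:
    label: str
    fmt: str = "GIF"   # the second images may be, e.g., GIF images

# Hypothetical template names matching the expressions listed above.
PRESET_TEMPLATES = ("anger", "crying", "happiness", "smiling")

def build_3d_model(first_image_label):
    # Stand-in for the 3D-modeling step on the first image.
    return f"3d({first_image_label})"

def to_q_style(model_3d):
    # Stand-in for the preset rule converting the 3D image to a Q-style image.
    return f"q({model_3d})"

def generate_second_images(first_image_label, templates=PRESET_TEMPLATES):
    """first image -> 3D model -> Q-style image -> one emoticon per template."""
    q_image = to_q_style(build_3d_model(first_image_label))
    return [SecondImage(label=f"{q_image}+{t}") for t in templates]
```

Each preset template yields one second image, and any of them may serve as the target image in the later steps.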
Fig. 10 shows a schematic diagram of a possible structure of a terminal device involved in the embodiment of the present invention, and as shown in fig. 10, the terminal device 90 may include: a receiving unit 91 and a processing unit 92.
The receiving unit 91 is configured to receive a first input of an identifier of the target image from a user in a case that the first interface of the terminal device includes the identifier of the target image and the identifier of the target application. A processing unit 92, configured to, in response to the first input received by the receiving unit 91, move the identifier of the target image to a first location, where the identifier of the target application is located on the first interface; and sending the target image through the target application program, or storing the target image into a storage area corresponding to the expression package in the target application program.
In a possible implementation manner, the receiving unit 91 is further configured to receive a second input of the target control by the user before receiving the first input of the identifier of the target image by the user, in a case that the first interface of the terminal device includes the identifier of the target image and the identifier of the target application program. With reference to fig. 10, as shown in fig. 11, the terminal device 90 provided in the embodiment of the present invention may further include: a display unit 93. The display unit 93 is configured to display, on the first interface, the identifier of at least one image and the identifier of at least one application program in response to the second input received by the receiving unit 91, where the identifier of the at least one image includes the identifier of the target image, and the identifier of the at least one application program includes the identifier of the target application program.
In a possible implementation manner, the display unit 93 is specifically configured to display, on the first interface, an identifier of at least one image and an identifier of at least one application program when an input parameter corresponding to the second input meets a preset condition; the input parameter may include at least one of a strength value of the second input, a number of sub-inputs included in the second input, a time interval between two consecutive sub-inputs, and a position of two consecutive sub-inputs.
In a possible implementation manner, the terminal device 90 provided in the embodiment of the present invention may further include: a display unit 93. The display unit 93 is configured to display the target receiving object in the target application on the second interface of the terminal device after the processing unit 92 moves the identifier of the target image to the first position in response to the first input received by the receiving unit 91. The receiving unit 91 is further configured to receive a third input of the target receiving object from the user. The processing unit 92 is specifically configured to send the target image to the target receiving object through the target application in response to the third input received by the receiving unit 91.
In a possible implementation manner, with reference to fig. 10, as shown in fig. 12, the terminal device 90 provided in an embodiment of the present invention may further include: an acquisition unit 94. The acquisition unit 94 is configured to acquire the first image before the receiving unit 91 receives the first input of the identifier of the target image by the user. The processing unit 92 is further configured to perform image processing on the first image acquired by the acquisition unit 94, and generate at least one second image by using a preset template, where the at least one second image includes the target image.
The terminal device provided by the embodiment of the present invention can implement each process implemented by the terminal device in the above method embodiments, and for avoiding repetition, detailed description is not repeated here.
The embodiment of the present invention provides a terminal device. In a case that a first interface of the terminal device includes an identifier of a target image and an identifier of a target application program, the terminal device can receive a first input of the identifier of the target image from a user, move the identifier of the target image to a first position (the first position being the position of the identifier of the target application program on the first interface), and then send the target image through the target application program, or store the target image into a storage area corresponding to an expression package in the target application program. The user can directly perform the first input on the identifier of the target image to trigger the terminal device to move the identifier of the target image to the first position, so that the terminal device can send the target image through the target application program or store the target image into the storage area corresponding to the expression package in the target application program, without the user performing multiple inputs.
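A minimal sketch of the drop handling implied by this flow — testing whether the identifier of the target image was released at the first position (the bounds of the target application program's identifier) and then either sending or storing the image — is shown below. All names and the rectangle representation are illustrative assumptions, not the embodiment's actual implementation:

```python
def identifier_dropped_on_app(drop_x, drop_y, app_icon_rect):
    """True when the dragged image identifier is released inside the target
    application identifier's bounds (the "first position").

    app_icon_rect is (left, top, width, height); all names are illustrative.
    """
    left, top, width, height = app_icon_rect
    return left <= drop_x <= left + width and top <= drop_y <= top + height

def handle_drop(drop_x, drop_y, app_icon_rect, send, save_to_pack,
                save_instead_of_send=False):
    """After the identifier reaches the first position, either send the target
    image through the target application program or store it in the
    expression-package storage area, per the two alternatives above."""
    if not identifier_dropped_on_app(drop_x, drop_y, app_icon_rect):
        return "ignored"   # first input did not end at the first position
    return save_to_pack() if save_instead_of_send else send()
```

The `send` and `save_to_pack` callbacks stand in for the target application program's sending and expression-package storage operations.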
Fig. 13 is a hardware schematic diagram of a terminal device for implementing various embodiments of the present invention. As shown in fig. 13, the terminal device 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111.
It should be noted that, as those skilled in the art will appreciate, the terminal device structure shown in fig. 13 does not constitute a limitation on the terminal device; the terminal device may include more or fewer components than those shown in fig. 13, may combine some components, or may arrange the components differently. In the embodiment of the present invention, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal device, a wearable device, a pedometer, and the like.
The user input unit 107 may be configured to receive a first input of the identifier of the target image from the user in a case where the first interface of the terminal device includes the identifier of the target image and the identifier of the target application.
A processor 110, operable in response to a first input received by the user input unit 107, to move the identity of the target image to a first location, the first location being a location of the identity of the target application on the first interface; and sending the target image through the target application program, or storing the target image into a storage area corresponding to the expression package in the target application program.
The embodiment of the present invention provides a terminal device. In a case that a first interface of the terminal device includes an identifier of a target image and an identifier of a target application program, the terminal device can receive a first input of the identifier of the target image from a user, move the identifier of the target image to a first position, and then send the target image through the target application program, or store the target image into a storage area corresponding to an expression package in the target application program. The user can directly perform the first input on the identifier of the target image to trigger the terminal device to move the identifier of the target image to the first position, so that the terminal device can send the target image through the target application program or store the target image into the storage area corresponding to the expression package in the target application program, without the user performing multiple inputs.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used for receiving and sending signals during a message transmission or call process. Specifically, downlink data received from a base station is delivered to the processor 110 for processing, and uplink data is transmitted to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The terminal device provides wireless broadband internet access to the user through the network module 102, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the terminal device 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or another storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and process it into audio data. In a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101 for output.
The terminal device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal device posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near the touch panel 1071 using a finger, a stylus, or any suitable object or accessory). The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position of the user's touch, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, the other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 13, the touch panel 1071 and the display panel 1061 are two independent components to implement the input and output functions of the terminal device, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the terminal device, and is not limited herein.
The interface unit 108 is an interface for connecting an external device to the terminal apparatus 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal apparatus 100 or may be used to transmit data between the terminal apparatus 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like, and the data storage area may store data created according to the use of the mobile phone (such as audio data and a phonebook). Further, the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 110 is a control center of the terminal device, connects various parts of the entire terminal device by using various interfaces and lines, and performs various functions of the terminal device and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the terminal device. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The terminal device 100 may further include a power supply 111 (such as a battery) for supplying power to each component, and preferably, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the terminal device 100 includes some functional modules that are not shown, and are not described in detail here.
Preferably, an embodiment of the present invention further provides a terminal device, which includes the processor 110 shown in fig. 13, the memory 109, and a computer program stored in the memory 109 and capable of running on the processor 110. When executed by the processor 110, the computer program implements each process of the foregoing method embodiment and can achieve the same technical effect; to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements the processes of the method embodiments, and can achieve the same technical effects, and in order to avoid repetition, the details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a/an ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (9)

1. A method of image processing, the method comprising:
acquiring a first image;
performing image processing on the first image, and generating at least one second image by adopting a preset template, wherein the at least one second image comprises a target image;
the generating of the at least one second image by using the preset template includes:
performing image processing on the first image through 3D modeling to obtain a 3D image, converting the 3D image into a Q-style (cartoon-style) image according to a preset rule, and generating at least one second image by adopting a preset template;
under the condition that a first interface of the terminal equipment comprises the identification of the target image and the identification of the target application program, receiving first input of a user on the identification of the target image;
in response to the first input, moving the identification of the target image to a first location, the first location being a location of the identification of the target application on the first interface; sending the target image through the target application program, or storing the target image to a storage area corresponding to an expression package in the target application program;
in the case that the first interface of the terminal device includes the identifier of the target image and the identifier of the target application program, before receiving a first input of the identifier of the target image by a user, the method further includes:
receiving a second input of the target control by the user;
in response to the second input, displaying an identification of at least one image and an identification of at least one application on the first interface.
2. The method of claim 1,
the identifier of the at least one image comprises an identifier of the target image, and the identifier of the at least one application program comprises an identifier of the target application program.
3. The method of claim 2, wherein displaying an identification of at least one image and an identification of at least one application on the first interface comprises:
displaying the identifier of the at least one image and the identifier of the at least one application program on the first interface under the condition that the input parameter corresponding to the second input meets a preset condition;
wherein the input parameter comprises at least one of a strength value of the second input, a number of sub-inputs comprised by the second input, a time interval between two consecutive sub-inputs, and a position of two consecutive sub-inputs.
4. The method of claim 1, wherein after moving the identity of the target image to the first location in response to the first input, the method further comprises:
displaying a target receiving object in the target application program on a second interface of the terminal equipment;
receiving a third input of the target receiving object by the user;
the sending the target image by the target application includes:
in response to the third input, sending, by the target application, the target image to the target recipient object.
5. A terminal device, characterized in that the terminal device comprises: an acquisition unit, a receiving unit, and a processing unit;
the acquisition unit is used for acquiring a first image;
the processing unit is used for carrying out image processing on the first image acquired by the acquisition unit and generating at least one second image by adopting a preset template, wherein the at least one second image comprises a target image;
the processing unit is specifically configured to perform image processing on the first image through 3D modeling to obtain a 3D image, convert the 3D image into a Q-style (cartoon-style) image according to a preset rule, and generate at least one second image by using a preset template;
the receiving unit is used for receiving a first input of the identification of the target image by a user under the condition that a first interface of the terminal device comprises the identification of the target image and the identification of a target application program;
the processing unit is further configured to move the identifier of the target image to a first location in response to the first input received by the receiving unit, where the first location is a location of the identifier of the target application on the first interface; sending the target image through the target application program, or storing the target image to a storage area corresponding to an expression package in the target application program;
the receiving unit is further configured to receive a second input of the user to the target control before receiving a first input of the user to the identifier of the target image under the condition that the first interface of the terminal device includes the identifier of the target image and the identifier of the target application program;
the terminal device further includes: a display unit;
the display unit is used for responding to the second input received by the receiving unit and displaying the identification of at least one image and the identification of at least one application program on the first interface.
6. The terminal device of claim 5,
the identifier of the at least one image comprises an identifier of the target image, and the identifier of the at least one application program comprises an identifier of the target application program.
7. The terminal device according to claim 6, wherein the display unit is specifically configured to display the identifier of the at least one image and the identifier of the at least one application program on the first interface when an input parameter corresponding to the second input meets a preset condition;
wherein the input parameter comprises at least one of: a pressure value of the second input, the number of sub-inputs comprised in the second input, the time interval between two consecutive sub-inputs, and the positions of two consecutive sub-inputs.
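Claim 7's "preset condition" on the second input could be evaluated as follows. The thresholds, the function name, and the interpretation (hard press or a quick co-located double tap) are assumptions made for illustration; the patent leaves the concrete condition open.

```python
# Hypothetical check of claim 7's preset condition on the second input.
# Threshold values below are invented for illustration only.

FORCE_THRESHOLD = 5.0     # minimum pressure value that qualifies on its own
MAX_TAP_INTERVAL = 0.3    # max seconds between two consecutive sub-inputs
MAX_TAP_DISTANCE = 20     # max pixel offset between two consecutive sub-input positions

def meets_preset_condition(pressure, sub_inputs):
    """sub_inputs: list of (timestamp_s, (x, y)) tuples, one per sub-input."""
    if pressure >= FORCE_THRESHOLD:          # a hard press alone meets the condition
        return True
    if len(sub_inputs) < 2:                  # otherwise we need at least a double tap
        return False
    (t1, (x1, y1)), (t2, (x2, y2)) = sub_inputs[-2:]
    close_in_time = (t2 - t1) <= MAX_TAP_INTERVAL
    close_in_space = abs(x2 - x1) <= MAX_TAP_DISTANCE and abs(y2 - y1) <= MAX_TAP_DISTANCE
    return close_in_time and close_in_space  # double tap: quick and roughly co-located

print(meets_preset_condition(6.0, []))                  # hard press -> True
print(meets_preset_condition(1.0, [(0.00, (10, 10)),
                                   (0.25, (12, 11))]))  # quick double tap -> True
```

Gating the identifier panel on such a parameter distinguishes the share gesture from an ordinary tap, which is why the claim enumerates pressure, sub-input count, interval, and position as alternatives.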
8. The terminal device according to claim 5, wherein the terminal device further comprises: a display unit;
the display unit is configured to display a target receiving object in the target application program on a second interface of the terminal device after the processing unit, in response to the first input received by the receiving unit, moves the identifier of the target image to the first location;
the receiving unit is further configured to receive a user's third input on the target receiving object;
the processing unit is specifically configured to send the target image to the target receiving object through the target application program in response to the third input received by the receiving unit.
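Claim 8's second stage (after the drag, the application shows candidate receiving objects, and a third input picks the recipient) can be sketched as a simple selection step. The function and the contact list are hypothetical; the patent specifies behaviour, not code.

```python
# Hypothetical sketch of claim 8: the third input selects a target receiving
# object shown on the second interface, and the image is sent to it.

def send_via_third_input(receiving_objects, selected, image_id):
    """Return the delivery record produced by the user's third input."""
    if selected not in receiving_objects:
        # The third input must land on an object shown on the second interface.
        raise ValueError("receiving object not displayed on the second interface")
    return {"to": selected, "image": image_id}

record = send_via_third_input(["alice", "bob"], "bob", "img1")
print(record)   # {'to': 'bob', 'image': 'img1'}
```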
9. A terminal device, comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the image processing method according to any one of claims 1 to 4.
CN201810762649.5A 2018-07-12 2018-07-12 Image processing method and terminal equipment Active CN109117037B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810762649.5A CN109117037B (en) 2018-07-12 2018-07-12 Image processing method and terminal equipment

Publications (2)

Publication Number Publication Date
CN109117037A CN109117037A (en) 2019-01-01
CN109117037B true CN109117037B (en) 2021-03-23

Family

ID=64862663

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810762649.5A Active CN109117037B (en) 2018-07-12 2018-07-12 Image processing method and terminal equipment

Country Status (1)

Country Link
CN (1) CN109117037B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110049185A (en) * 2019-03-12 2019-07-23 Vivo Mobile Communication Co., Ltd. Image processing method and terminal device
CN111290684B (en) * 2019-12-09 2021-06-01 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image display method, image display device and terminal equipment
CN111796733B (en) * 2020-06-28 2022-05-17 Vivo Mobile Communication (Hangzhou) Co., Ltd. Image display method, image display device and electronic equipment

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE202014004549U1 (en) * 2013-06-09 2014-09-15 Apple Inc. Device and graphical user interface for sharing content from a particular application
CN104216619A (en) * 2014-09-12 2014-12-17 Lu Qiyuan Method and device for processing data and electronic equipment
CN105160017A (en) * 2015-09-25 2015-12-16 Shanghai Phicomm Data Communication Technology Co., Ltd. System and method for image sorting and viewing
CN105306835A (en) * 2015-10-16 2016-02-03 Guangzhou Jiubang Digital Technology Co., Ltd. Image processing system
CN105955569A (en) * 2016-04-25 2016-09-21 Le Holdings (Beijing) Co., Ltd. File sharing method and apparatus
CN106357878A (en) * 2015-07-13 2017-01-25 LG Electronics Inc. Mobile terminal and control method thereof
CN106489126A (en) * 2016-09-29 2017-03-08 Beijing Xiaomi Mobile Software Co., Ltd. Content sharing method and device
CN106681614A (en) * 2016-12-30 2017-05-17 Zhuhai Meizu Technology Co., Ltd. Information sharing method and device
CN106681623A (en) * 2016-10-26 2017-05-17 Vivo Mobile Communication Co., Ltd. Screenshot picture sharing method and mobile terminal
CN107678644A (en) * 2017-09-18 2018-02-09 Vivo Mobile Communication Co., Ltd. Image processing method and mobile terminal
CN108228031A (en) * 2018-01-24 2018-06-29 Vivo Mobile Communication Co., Ltd. Picture sharing method, image display method and mobile terminal


Similar Documents

Publication Publication Date Title
CN108255378B (en) Display control method and mobile terminal
CN110851051B (en) Object sharing method and electronic equipment
CN110062105B (en) Interface display method and terminal equipment
US11658932B2 (en) Message sending method and terminal device
CN110058836B (en) Audio signal output method and terminal equipment
CN109032486B (en) Display control method and terminal equipment
CN109828705B (en) Icon display method and terminal equipment
CN111142723B (en) Icon moving method and electronic equipment
CN109976611B (en) Terminal device control method and terminal device
CN110752981B (en) Information control method and electronic equipment
CN108681427B (en) Access right control method and terminal equipment
CN110007822B (en) Interface display method and terminal equipment
CN109828731B (en) Searching method and terminal equipment
CN109408072B (en) Application program deleting method and terminal equipment
CN110049187B (en) Display method and terminal equipment
CN111030917B (en) Message display method and electronic equipment
CN110225180B (en) Content input method and terminal equipment
CN109901761B (en) Content display method and mobile terminal
WO2021082772A1 (en) Screenshot method and electronic device
CN109117037B (en) Image processing method and terminal equipment
CN109992192B (en) Interface display method and terminal equipment
CN110012151B (en) Information display method and terminal equipment
CN109766156B (en) Session creation method and terminal equipment
CN108833791B (en) Shooting method and device
CN111090529A (en) Method for sharing information and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant