CN107592458B - Shooting method and mobile terminal - Google Patents

Shooting method and mobile terminal

Info

Publication number: CN107592458B (application CN201710842551.6A; earlier publication CN107592458A)
Authority: CN (China)
Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Inventors: 龚柳青, 付琳
Assignee: Vivo Mobile Communication Co Ltd (original and current assignee; listed assignees may be inaccurate)
Prior art keywords: gesture, detected, parameter, gestures, parameters
Application filed by Vivo Mobile Communication Co Ltd; priority to CN201710842551.6A; application granted
Abstract

An embodiment of the invention provides a shooting method and a mobile terminal, relating to the field of communication technology, and aims to solve the inconvenience a user faces when selecting an image processing effect during shooting. The shooting method comprises the following steps: acquiring a preview image displayed on a shooting preview interface in a shooting preview mode; performing gesture detection on the preview image; if a gesture with preset characteristics is detected, determining image processing parameters according to the detected gesture; and processing the image according to the image processing parameters. The shooting method in the embodiment of the invention is applied to a mobile terminal.

Description

Shooting method and mobile terminal
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to a shooting method and a mobile terminal.
Background
With the continuous update and development of communication technology, the shooting function of the mobile terminal is gradually improved.
For example, while shooting, the mobile terminal can process the preview image in the shooting preview interface, so that the user can shoot according to the processed effect to achieve the purpose of beautifying the image and the like.
When shooting, a user can select the desired effect to be achieved after image processing. Generally, the user can only do so through a touch operation on the display screen or a key operation on the mobile terminal, and this operation mode is inconvenient. For example, when the mobile terminal is mounted on a selfie stick, the user may be far away from the terminal, making the terminal difficult to operate and causing obvious inconvenience.
Disclosure of Invention
The embodiment of the invention provides a shooting method, which aims to solve the problem that selecting an image processing effect during shooting is inconvenient for the user.
In a first aspect, an embodiment of the present invention provides a shooting method applied to a mobile terminal, including: acquiring a preview image displayed on a shooting preview interface in a shooting preview mode; performing gesture detection on the preview image; if the gesture with the preset characteristics is detected, determining image processing parameters according to the detected gesture; and processing the image according to the image processing parameters.
In a second aspect, an embodiment of the present invention further provides a mobile terminal, including: the preview image acquisition module is used for acquiring a preview image displayed on a shooting preview interface in a shooting preview mode; the gesture detection module is used for performing gesture detection on the preview image; the parameter determining module is used for determining image processing parameters according to the detected gesture if the gesture with the preset characteristics is detected; and the image processing module is used for processing the image according to the image processing parameters.
In a third aspect, an embodiment of the present invention further provides a mobile terminal, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the shooting method according to the first aspect.
In a fourth aspect, the present invention further provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the steps of the shooting method according to the first aspect.
In the embodiment of the invention, when the mobile terminal shoots, it enters a shooting preview mode and can acquire the preview image displayed in the shooting preview interface. It can therefore detect whether a gesture exists in the acquired preview image. If a gesture with preset characteristics is detected, the photographed subject is giving an instruction to select a desired image processing effect: the image processing parameter corresponding to the gesture is determined from the detected gesture, and the mobile terminal processes the preview image according to the determined parameter to achieve the effect selected by the subject. Thus, in the embodiment of the present invention, gestures with different preset characteristics correspond to different image processing modes, each mode achieves a corresponding processing effect, and the photographed person can complete the selection through gestures alone, which is simple and convenient. For example, even if the mobile terminal is far away, the photographed person can control the image processing mode through gestures without operating the terminal itself, and different image processing effects can meet the needs of different photographed subjects.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive labor.
Fig. 1 is one of flowcharts of a photographing method of an embodiment of the present invention;
FIG. 2 is a second flowchart of a photographing method according to an embodiment of the invention;
FIG. 3 is a third flowchart of a photographing method according to an embodiment of the present invention;
FIG. 4 is a fourth flowchart of a photographing method according to an embodiment of the invention;
FIG. 5 is a fifth flowchart of a photographing method according to an embodiment of the present invention;
FIG. 6 is one of the block diagrams of a mobile terminal of an embodiment of the present invention;
fig. 7 is a second block diagram of a mobile terminal according to an embodiment of the present invention;
fig. 8 is a third block diagram of a mobile terminal according to an embodiment of the present invention;
fig. 9 is a fourth block diagram of a mobile terminal of an embodiment of the present invention;
fig. 10 is a fifth block diagram of a mobile terminal of an embodiment of the present invention;
fig. 11 is a sixth block diagram of a mobile terminal according to an embodiment of the present invention;
fig. 12 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, there is shown a flowchart of a photographing method according to an embodiment of the present invention, the photographing method being applied to a mobile terminal, including:
step 101: and acquiring a preview image displayed on a shooting preview interface in a shooting preview mode.
In this step, the mobile terminal may enter a shooting preview mode from an application program such as a camera, so that, in the shooting preview mode, the relevant data is acquired for the preview image displayed in the shooting preview interface.
Step 102: and performing gesture detection on the preview image.
In this step, after the preview image displayed on the shooting preview interface is acquired, whether a gesture exists in the preview image is detected. The gesture to be detected is a gesture with preset characteristics, which can be set in advance by the user or automatically by the mobile terminal. Different image processing modes give the image different effects, and the same image processing technique can process the image to different degrees to achieve different effects; gestures with preset characteristics can therefore be defined for the different processing effects. For example, if the algorithm of an image processing technique has five levels, each corresponding to a different processing degree, each level can be represented by its own gesture.
In this step, an image recognition algorithm may be employed to detect a gesture in the preview image.
Step 103: and if the gesture with the preset characteristics is detected, determining the image processing parameters according to the detected gesture.
In this step, an image processing parameter corresponding to the gesture may be determined according to the detected gesture. For example, an algorithm level corresponding to the gesture may be determined according to the detected gesture, so as to obtain a data parameter in the algorithm level, and the obtained data parameter may be determined as the image processing parameter.
In this step, a pattern matching algorithm may be employed to identify the image processing parameters represented by the detected gesture.
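As a minimal illustration of steps 102 and 103 (a sketch, not the patent's actual implementation), the conversion from a recognized gesture label to an algorithm level and its stored data parameters could look like the following; the gesture labels, level count, and parameter values are all hypothetical:

```python
# Hypothetical mapping from a detected gesture label to an algorithm
# level, and from each level to its stored image processing parameters.
GESTURE_TO_LEVEL = {"1": 1, "2": 2, "3": 3, "4": 4, "5": 5}

# Parameters per algorithm level (illustrative values only).
LEVEL_PARAMS = {
    1: {"strength": 0.2},
    2: {"strength": 0.4},
    3: {"strength": 0.6},
    4: {"strength": 0.8},
    5: {"strength": 1.0},
}

def params_for_gesture(gesture_label):
    """Return image processing parameters for a detected gesture,
    or None if the gesture has no preset meaning."""
    level = GESTURE_TO_LEVEL.get(gesture_label)
    if level is None:
        return None
    return LEVEL_PARAMS[level]
```

Only gestures with a preset meaning produce parameters; any other detected hand shape is ignored.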
Step 104: and processing the image according to the image processing parameters.
In the shooting preview mode, the image is processed based on the determined image processing parameters, and the processed image is used for previewing or shooting.
In this embodiment, the mobile terminal acquires the preview image displayed in the shooting preview interface in the shooting preview mode and determines whether it contains a gesture with preset features. Once such a gesture is detected, the corresponding image processing parameter is determined, and the mobile terminal performs image processing according to it. Thus, when the photographed person makes a gesture in the shooting preview interface according to the desired image processing effect, the mobile terminal recognizes the gesture, sets the image processing parameters accordingly, and completes the corresponding processing. The photographed person can therefore control the image processing mode through gestures without operating the mobile terminal, which is convenient, and different image processing effects can meet the needs of different photographed subjects.
On the basis of fig. 1, fig. 2 shows a flowchart of a shooting method according to another embodiment of the present invention, in which the image processing parameters include blurring parameters, and step 103 includes:
step 1031: and if the gesture with the preset characteristics is detected, detecting a shooting object in the preview image.
In this step, after the gesture of the preset feature is detected, the photographic subject in the preview image is first detected, so as to determine the blurring region according to the photographic subject.
Step 1032: based on the detected photographic subject, a background area where the photographic subject is located is identified.
In this step, in order to further determine the blurring region, a background region where the photographic subject is located may be identified from the photographic subject, thereby regarding the background region as the blurring region.
Step 1033: based on the detected gesture, blurring parameters corresponding to the gesture are identified.
After the blurring area is determined, the blurring parameter represented by the detected gesture is recognized and obtained.
Step 1034: based on the identified blurring parameter, an aperture parameter is set.
And setting the aperture parameters of the current blurring treatment according to the acquired blurring parameters.
In this embodiment, step 104 includes:
step 1041: blurring a background area where the object is located according to the aperture parameter.
After the setting of the aperture parameters is finished, blurring processing is directly carried out on the determined background area.
In this embodiment, the image processing technique may include blurring. Referring to table 1, the adjustable aperture range of the blurring process is f/1 to f/16. Preferably, the blurring process can be divided into several levels according to the different effects achieved by different aperture values, each level corresponding to a blurring algorithm. For example, suppose the aperture values f/1, f/2, f/4, f/8 and f/16 produce clearly different blurring effects. The algorithms corresponding to these five aperture values can be defined as five algorithm levels, each represented by one gesture: gesture "1" represents level one with aperture f/1; gesture "2" represents level two with aperture f/2; gesture "3" represents level three with aperture f/4; gesture "4" represents level four with aperture f/8; and gesture "5" represents level five with aperture f/16.

Gesture       Aperture parameter
Gesture "1"   f/1
Gesture "2"   f/2
Gesture "3"   f/4
Gesture "4"   f/8
Gesture "5"   f/16

TABLE 1
Referring to table 1, if a gesture listed in table 1 is detected, the blurring parameter corresponding to the gesture is identified; that is, the blurring algorithm level is identified from the gesture, and the blurring parameter is then obtained from that level. The aperture parameter in the blurring parameters is then set into the blurring algorithm, and the algorithm blurs the preview image according to the set aperture parameter.
As for the blurring process itself: when there is one photographic subject, the background region of that subject may be blurred; when there are multiple subjects, their common background region may be blurred.
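The blurring of steps 1031 through 1041 can be sketched as follows. This is a toy illustration under assumed details (a simulated f-number mapped to a box-blur radius, and a boolean mask standing in for subject detection); a real implementation would use segmentation or depth data and an optimized filter:

```python
# Illustrative only: map a simulated f-number to a blur radius
# (wider aperture -> stronger background blur), then box-blur
# only the pixels whose mask marks them as background.
F_TO_RADIUS = {1.0: 4, 2.0: 3, 4.0: 2, 8.0: 1, 16.0: 0}

def blur_background(image, background_mask, f_number):
    """image: 2D list of grayscale values; background_mask: 2D list of
    booleans (True = background). Returns a new, partially blurred image."""
    r = F_TO_RADIUS[f_number]
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    if r == 0:
        return out
    for y in range(h):
        for x in range(w):
            if not background_mask[y][x]:
                continue  # foreground subject stays sharp
            total, count = 0, 0
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += image[ny][nx]
                        count += 1
            out[y][x] = total // count
    return out
```

The foreground pixels are copied through untouched, which mirrors the patent's requirement that only the background region of the subject(s) is blurred.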
On the basis of fig. 2, fig. 3 shows a flowchart of a photographing method according to another embodiment of the present invention, in which step 1033 includes:
step 10331: based on the detected gestures, a number of detected gestures is determined.
In this step, the detected gestures must be counted, because when more than one gesture is detected the blurring parameters are determined from a calculation over all of them.
Step 10332: and if the number of the detected gestures is one, identifying blurring parameters corresponding to the gestures.
And when the number of the gestures is one, directly identifying blurring parameters corresponding to the gestures, and using the blurring parameters as a basis for setting aperture parameters to blur the background areas of one or more shooting objects according to the aperture parameters.
Step 10333: and if the number of the detected gestures is two, identifying the sum of blurring parameters corresponding to the two gestures respectively.
When the number of the gestures is two, the sum of the blurring parameters corresponding to the two gestures may be calculated after the blurring parameters corresponding to the two gestures are sequentially recognized. For example, after two gestures are recognized, the two gestures respectively represent different blurring levels, the sum of the two blurring levels can be calculated first, and then blurring parameters of the blurring levels are obtained according to the calculated blurring levels, so that the blurring parameters serve as bases for setting aperture parameters, and background areas of two or more photographic objects are blurred according to the aperture parameters.
For example: two photographic subjects, subject A and subject B, are detected; the gesture of subject A is gesture "1" and the gesture of subject B is gesture "2". The blurring levels represented by the two gestures are identified as level one and level two, and the identified results are added, giving a total blurring level of three. The blurring parameter of level three is then obtained, the corresponding aperture value is set to f/4, and the common background area of the two subjects is blurred accordingly.
Step 10334: and if the number of the detected gestures is more than two, identifying the average value of the blurring parameters corresponding to the plurality of gestures respectively.
When the number of the gestures is greater than two, the average value of the blurring parameters corresponding to the gestures can be calculated after the blurring parameters corresponding to the gestures are recognized. For example, after more than two gestures are sequentially recognized, the gestures respectively represent different blurring levels, an average value of the blurring levels can be calculated first, and then blurring parameters of the blurring levels are obtained according to the calculated blurring levels, so that the blurring parameters serve as bases for setting aperture parameters, and a common background area of more than two photographic objects is blurred according to the aperture parameters.
Therefore, in the embodiment, the purpose of controlling or adjusting the image processing is achieved by detecting the number of the gestures and by a certain algorithm.
It should be noted that, in the embodiment of the present invention, it is assumed that at most one gesture can be recognized per photographic subject, so that when two gestures are recognized there are at least two photographic subjects, and so on.
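The counting rule of steps 10331 through 10334 can be sketched as below. This is a sketch under the assumption that blurring levels are the integers 1 to 5 from Table 1 and that the combined result is clamped to that range (the clamping is an assumption, not stated in the patent):

```python
MAX_LEVEL = 5  # five blurring levels, as in Table 1

def combined_blur_level(levels):
    """Combine the blurring levels represented by the detected gestures:
    one gesture -> its level; two gestures -> their sum; more than two
    -> their (rounded) average. The result is clamped to the valid range."""
    if len(levels) == 1:
        level = levels[0]
    elif len(levels) == 2:
        level = levels[0] + levels[1]
    else:
        level = round(sum(levels) / len(levels))
    return max(1, min(MAX_LEVEL, level))
```

With the example above, gestures "1" and "2" combine to level three, matching the f/4 aperture in Table 1.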
On the basis of fig. 1, fig. 4 shows a flowchart of a shooting method according to another embodiment of the present invention, in which the detected gesture is one, the image processing parameters include a beauty parameter, and step 103 includes:
step 1031: and if the gesture with the preset characteristics is detected, detecting a shooting object in the preview image.
In this step, a photographic subject is first detected to determine a beauty target from the detected photographic subject.
Step 1035: and determining a shooting object corresponding to the gesture according to the detected gesture.
And according to the detected gesture, taking the shooting object corresponding to the gesture as a beautifying object.
Step 1036: and identifying the beauty parameters corresponding to the gestures based on the detected gestures, and setting the beauty parameters.
And after the beauty object is determined, recognizing the beauty parameters corresponding to the gestures, and setting the current beauty parameters according to the beauty parameters.
Step 104 comprises:
step 1042: and performing beauty treatment on the shooting object corresponding to the gesture according to the beauty parameters.
In the present embodiment, the image processing technique may include beauty processing. Similar to blurring, the beauty processing can be divided into several grades according to the effects it achieves, each grade represented by a different gesture; each grade corresponds to a different algorithm, and each algorithm to different beauty parameters. After the beauty grade is identified from the gesture, the beauty parameter for that grade is acquired, completing the conversion from gesture to data; the parameter is then set into the algorithm, and the beauty processing of the photographic subject is completed through calculation.
In this embodiment, one gesture is made by one user, who can actively make the gesture according to his or her own needs to achieve the desired beautifying effect. When there are several photographed persons and at least two of them want beauty processing, at least two of them can make gestures; this case is explained in detail in the next embodiment.
On the basis of fig. 1, fig. 5 shows a flowchart of a photographing method according to another embodiment of the present invention, in which at least two gestures are detected, the image processing parameters include a beauty parameter, and step 103 includes:
step 1031: and if the gesture with the preset characteristics is detected, detecting a shooting object in the preview image.
Step 1037: and determining the shooting object corresponding to each gesture according to the detected gestures.
Step 1038: and identifying the beauty parameters corresponding to each gesture respectively based on the detected gestures.
Step 104 comprises:
step 1043: and according to each group of beauty parameters, respectively carrying out beauty treatment on the shooting object corresponding to each gesture.
In this embodiment, unlike the previous one, different photographed persons may make different gestures according to their personal needs. When multiple gestures are detected, the beauty parameter represented by each gesture is recognized in turn, so that corresponding parameters are set for the different photographic subjects, and each group of parameters is set into its own algorithm instance to process its subject. Preferably, the subjects can be processed one after another, in which case the mobile terminal runs only one beauty algorithm at a time; alternatively, different algorithms can run simultaneously on different subjects.
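One plausible way to associate each detected gesture with its photographic subject (the patent does not specify the association rule; nearest-face-by-position is an assumption, and the parameter values are hypothetical) is:

```python
# Illustrative per-grade beauty parameters (hypothetical values).
BEAUTY_PARAMS = {"1": {"smooth": 0.2}, "2": {"smooth": 0.5}, "3": {"smooth": 0.8}}

def assign_beauty_params(subjects, gestures):
    """subjects: {name: x position of a detected face};
    gestures: list of (label, x position) for detected hand gestures.
    Each gesture is attributed to the nearest subject, building a
    per-subject beauty plan."""
    plan = {}
    for label, gx in gestures:
        nearest = min(subjects, key=lambda name: abs(subjects[name] - gx))
        plan[nearest] = BEAUTY_PARAMS[label]
    return plan
```

Each group of parameters in the returned plan can then be fed to its own beauty algorithm instance, sequentially or in parallel, as described above.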
Illustratively, in step 101, the camera may be controlled to capture each preview frame, and each captured frame is stored in a buffer; in step 102, gesture detection is then performed on the buffered frames. These steps can run in real time and in parallel, so that no gesture is missed.
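A sketch of this capture-and-detect pipeline (assumed structure; `capture_frame` and `detect_gestures` stand in for the camera driver and the gesture recognizer, which the patent does not specify):

```python
from collections import deque

def run_preview_loop(capture_frame, detect_gestures, max_frames):
    """Buffer each captured preview frame and scan every buffered
    frame for gestures, so none is skipped. Returns all detections."""
    buffer = deque(maxlen=8)   # small ring buffer of recent frames
    detections = []
    for _ in range(max_frames):
        frame = capture_frame()
        if frame is None:      # camera stopped
            break
        buffer.append(frame)
        detections.extend(detect_gestures(buffer[-1]))
    return detections
```

In a real terminal the capture and detection would run on separate threads sharing the buffer; the sequential loop here only shows the data flow.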
In summary, the shooting method of the embodiment of the present invention brings convenience to the user. For example: when a selfie stick is used and the mobile terminal is far from the photographed person, that person can still adjust the image processing mode through gestures alone, without being limited by distance. For another example: when the photographer controls the mobile terminal with one hand, the other hand need not carefully adjust the image processing mode; a simple gesture suffices. For another example: compared with adjusting the image processing mode by voice, gesture adjustment avoids the recognition difficulties caused by different languages and accents, and is therefore more universal.
It should be noted that the embodiments of the present invention are not limited to the blurring process and the beautifying process, and the methods provided in the embodiments of the present invention can be applied to various image processing techniques.
Referring to fig. 6, a block diagram of a mobile terminal of one embodiment of the present invention is shown. The mobile terminal shown in fig. 6 includes:
a preview image acquiring module 10, configured to acquire a preview image displayed on a shooting preview interface in a shooting preview mode;
a gesture detection module 20, configured to perform gesture detection on the preview image;
a parameter determining module 30, configured to determine an image processing parameter according to a detected gesture if the gesture with the preset feature is detected;
and the image processing module 40 is used for carrying out image processing according to the image processing parameters.
In this embodiment, the preview image acquiring module 10 acquires the preview image displayed in the shooting preview interface in the shooting preview mode; the gesture detection module 20 then determines whether the acquired preview image contains a gesture with preset features; once such a gesture is detected, the parameter determining module 30 determines the image processing parameter corresponding to it, and the image processing module 40 performs image processing according to that parameter. Thus, when the photographed person makes a gesture in the shooting preview interface according to the desired image processing effect, the mobile terminal recognizes the gesture, sets the image processing parameters accordingly, and completes the corresponding processing. The photographed person can therefore control the image processing mode through gestures without operating the mobile terminal, which is convenient, and different image processing effects can meet the needs of different photographed subjects.
The mobile terminal in this embodiment can implement each process implemented by the mobile terminal in the method embodiment of fig. 1, and is not described here again to avoid repetition.
On the basis of fig. 6, fig. 7 shows a block diagram of a mobile terminal according to another embodiment of the present invention, the image processing parameters include blurring parameters, and the parameter determining module 30 includes:
a first photographic subject detection unit 31 for detecting a photographic subject in the preview image;
a background region identifying unit 32 configured to identify a background region where the photographic subject is located, based on the detected photographic subject;
a blurring parameter identification unit 33, configured to identify a blurring parameter corresponding to the gesture based on the detected gesture;
an aperture parameter setting unit 34 for setting an aperture parameter based on the identified blurring parameter;
the image processing module 40 includes:
and a blurring processing unit 41, configured to perform blurring processing on a background area where the object is located according to the aperture parameter.
The mobile terminal in this embodiment can implement each process implemented by the mobile terminal in the method embodiment of fig. 2, and is not described here again to avoid repetition.
On the basis of fig. 7, fig. 8 shows a block diagram of a mobile terminal according to another embodiment of the present invention, and the blurring parameter identification unit 33 includes:
a gesture number determination subunit 331 configured to determine the number of detected gestures based on the detected gestures;
a first identifying subunit 332, configured to identify a blurring parameter corresponding to the gesture if the number of the detected gestures is one;
a second identifying subunit 333, configured to identify, if the number of the detected gestures is two, a sum of blurring parameters corresponding to the two gestures, respectively;
the third identifying subunit 334 is configured to identify an average value of blurring parameters corresponding to the plurality of gestures, if the number of the detected gestures is greater than two.
The mobile terminal in this embodiment can implement each process implemented by the mobile terminal in the method embodiment of fig. 3, and is not described here again to avoid repetition.
On the basis of fig. 6, fig. 9 shows a block diagram of a mobile terminal according to another embodiment of the present invention, where the detected gesture is one, the image processing parameter includes a beauty parameter, and the parameter determining module 30 includes:
a second photographic subject detection unit 35 for detecting a photographic subject in the preview image;
a first photographic subject determining unit 36, configured to determine, according to the detected gesture, a photographic subject corresponding to the gesture;
and the first beauty parameter identification unit 37 is configured to identify a beauty parameter corresponding to the gesture based on the detected gesture, and set the beauty parameter.
The image processing module 40 includes:
and the first beautifying processing unit 42 is used for performing beautifying processing on the shooting object corresponding to the gesture according to the beautifying parameters.
The mobile terminal in this embodiment can implement each process implemented by the mobile terminal in the method embodiment of fig. 4, and is not described here again to avoid repetition.
On the basis of fig. 6, fig. 10 shows a block diagram of a mobile terminal according to another embodiment of the present invention, the detected gesture is at least two, the image processing parameter includes a beauty parameter, and the parameter determining module 30 includes:
a third photographic subject detection unit 38 for detecting a photographic subject in the preview image;
a second photographic subject determination unit 39 configured to determine, according to the detected gestures, photographic subjects corresponding to each gesture, respectively;
a second beauty parameter identification unit 310, configured to identify, based on the detected gestures, beauty parameters corresponding to each gesture, respectively;
the image processing module 40 includes:
and a second beauty processing unit 43, configured to perform beauty processing on the shooting object corresponding to each gesture according to each group of beauty parameters.
The mobile terminal in this embodiment can implement each process implemented by the mobile terminal in the method embodiment of fig. 5, and is not described here again to avoid repetition.
Another embodiment of the present invention further provides a mobile terminal, which includes a processor, a memory, and a computer program stored in the memory and capable of running on the processor, where the computer program, when executed by the processor, implements each process of any one of the above embodiments of the shooting method, and can achieve the same technical effect, and therefore, in order to avoid repetition, details are not repeated here.
Another embodiment of the present invention further provides a computer-readable storage medium storing a computer program. When the computer program is executed by a processor, the processes of any one of the above shooting method embodiments are implemented, and the same technical effects can be achieved. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Fig. 11 is a block diagram of a mobile terminal according to another embodiment of the present invention. The mobile terminal 800 shown in fig. 11 includes: at least one processor 801, memory 802, at least one network interface 804, and other user interfaces 803. The various components in the mobile terminal 800 are coupled together by a bus system 805. It is understood that the bus system 805 is used to enable communications among these components. In addition to a data bus, the bus system 805 includes a power bus, a control bus, and a status signal bus. For clarity of illustration, however, the various buses are all labeled as bus system 805 in fig. 11.
The user interface 803 may include, among other things, a display, a keyboard, or a pointing device (e.g., a mouse, trackball, touch pad, or touch screen).
It will be appreciated that the memory 802 in embodiments of the invention may be volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a Read-only memory (ROM), a programmable Read-only memory (PROM), an erasable programmable Read-only memory (EPROM), an electrically erasable programmable Read-only memory (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), which functions as an external cache. By way of example, but not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct Rambus RAM (DRRAM). The memory 802 of the systems and methods described in connection with the embodiments of the invention is intended to comprise, without limitation, these and any other suitable types of memory.
In some embodiments, memory 802 stores the following elements, executable modules or data structures, or a subset thereof, or an expanded set thereof: an operating system 8021 and application programs 8022.
The operating system 8021 includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, and is used for implementing various basic services and processing hardware-based tasks. The application 8022 includes various applications, such as a media player (MediaPlayer), a Browser (Browser), and the like, for implementing various application services. A program implementing a method according to an embodiment of the present invention may be included in application program 8022.
In this embodiment of the present invention, the mobile terminal 800 further includes: a shooting control program stored on the memory 802 and executable on the processor 801, in particular, may be the shooting control program in the application 8022, which when executed by the processor 801, implements the steps of: acquiring a preview image displayed on a shooting preview interface in a shooting preview mode; performing gesture detection on the preview image; if the gesture with the preset characteristics is detected, determining image processing parameters according to the detected gesture; and processing the image according to the image processing parameters.
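The four steps the shooting control program performs can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation: the gesture names, the parameter table, and all function signatures are assumptions introduced for clarity.

```python
# Hypothetical gesture -> image-processing-parameter table ("preset features").
PRESET_GESTURES = {"v_sign": {"beauty_level": 3}, "ok_sign": {"blur_level": 2}}

def detect_gesture(preview_image):
    """Stub gesture detector; a real system would run a vision model on the frame."""
    return preview_image.get("gesture")  # e.g. "v_sign", or None if no gesture

def determine_parameters(gesture):
    """Step 3: map the detected gesture to image processing parameters."""
    return PRESET_GESTURES.get(gesture)

def process_preview(preview_image):
    # Step 1: acquire the preview image (here, passed in directly).
    # Step 2: perform gesture detection on it.
    gesture = detect_gesture(preview_image)
    if gesture in PRESET_GESTURES:  # gesture with preset features detected?
        params = determine_parameters(gesture)
        # Step 4: process the image according to the parameters (stubbed).
        return {"image": preview_image, "applied": params}
    return {"image": preview_image, "applied": None}

print(process_preview({"gesture": "v_sign"})["applied"])  # {'beauty_level': 3}
```

The key point of the flow is that no touch input is required: the parameter choice is driven entirely by what the detector finds in the preview frame.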
The methods disclosed in the embodiments of the present invention described above may be applied to the processor 801 or implemented by the processor 801. The processor 801 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or by instructions in the form of software in the processor 801. The processor 801 may be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the methods disclosed in connection with the embodiments of the present invention may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in a decoding processor. The software modules may reside in RAM, flash memory, ROM, PROM or EPROM, registers, or other computer-readable storage media known in the art. The computer-readable storage medium is located in the memory 802; the processor 801 reads the information in the memory 802 and completes the steps of the method in combination with its hardware. In particular, the computer-readable storage medium stores a computer program which, when executed by the processor 801, implements the steps of the shooting method embodiments described above.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or any combination thereof. For a hardware implementation, the processing units may be implemented within one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), general purpose processors, controllers, micro-controllers, microprocessors, other electronic units configured to perform the functions described herein, or a combination thereof.
For a software implementation, the techniques described in this disclosure may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described in this disclosure. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
Optionally, the computer program when executed by the processor 801 may further implement the steps of: detecting a photographic object in the preview image; identifying a background area where the photographic subject is located based on the detected photographic subject; based on the detected gesture, identifying a blurring parameter corresponding to the gesture; setting an aperture parameter based on the identified blurring parameter; and blurring a background area where the shooting object is located according to the aperture parameter.
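The blurring branch above can be sketched as follows — a hypothetical illustration, not the patent's implementation. The gesture-to-blur table, the aperture mapping, and the region representation are all assumptions.

```python
# Assumed gesture -> blurring-parameter table.
GESTURE_BLUR = {"one_finger": 1, "two_fingers": 2, "palm": 3}

def aperture_from_blur(blur_level):
    # Assumed mapping: stronger blur -> wider aperture (smaller f-number).
    return {1: "f/4.0", 2: "f/2.8", 3: "f/1.8"}[blur_level]

def blur_background(frame_regions, subject_region, gesture):
    """frame_regions: list of region labels; everything except the subject
    is treated as the background area where the subject is located."""
    blur = GESTURE_BLUR[gesture]               # blurring parameter from gesture
    aperture = aperture_from_blur(blur)        # aperture parameter from blur
    processed = []
    for region in frame_regions:
        if region == subject_region:
            processed.append(region)           # subject kept sharp
        else:
            processed.append(f"{region}@blur{blur}")  # background blurred
    return aperture, processed

ap, out = blur_background(["subject", "sky", "street"], "subject", "palm")
print(ap, out)  # f/1.8 ['subject', 'sky@blur3', 'street@blur3']
```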
Optionally, the computer program when executed by the processor 801 may further implement the steps of: determining a number of detected gestures based on the detected gestures; if the number of the detected gestures is one, identifying blurring parameters corresponding to the gestures; if the number of the detected gestures is two, identifying the sum of blurring parameters corresponding to the two gestures respectively; and if the number of the detected gestures is more than two, identifying the average value of blurring parameters corresponding to the plurality of gestures respectively.
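The count-dependent rule just described — use the single gesture's parameter, the sum for exactly two gestures, and the average for more than two — reduces to a few lines. A minimal sketch; the function name and numeric parameter values are assumptions:

```python
def combined_blur_parameter(gesture_params):
    """gesture_params: list of per-gesture blurring parameter values."""
    n = len(gesture_params)
    if n == 1:
        return gesture_params[0]                        # one gesture: its own value
    if n == 2:
        return gesture_params[0] + gesture_params[1]    # two gestures: their sum
    return sum(gesture_params) / n                      # more than two: the average

print(combined_blur_parameter([3]))        # 3
print(combined_blur_parameter([2, 3]))     # 5
print(combined_blur_parameter([1, 2, 3]))  # 2.0
```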
Alternatively, as another embodiment, the computer program may further implement the following steps when being executed by the processor 801: detecting a photographic object in the preview image; determining a shooting object corresponding to the gesture according to the detected gesture; based on the detected gesture, identifying a beauty parameter corresponding to the gesture, and setting the beauty parameter; and performing beauty treatment on the shooting object corresponding to the gesture according to the beauty parameters.
Alternatively, as another embodiment, the computer program may further implement the following steps when being executed by the processor 801: detecting a photographic object in the preview image; according to the detected gestures, determining a shooting object corresponding to each gesture respectively; based on the detected gestures, identifying beauty parameters corresponding to each gesture respectively; and according to each group of beauty parameters, respectively carrying out beauty treatment on the shooting object corresponding to each gesture.
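The multi-gesture beauty branch pairs each detected gesture with a photographic subject and applies that gesture's own group of beauty parameters to that subject. The sketch below is illustrative only: the beauty-parameter table, the nearest-subject heuristic, and all names are assumptions, since the patent does not specify how a gesture is matched to a subject.

```python
# Assumed gesture -> beauty-parameter-group table.
GESTURE_BEAUTY = {
    "v_sign": {"smooth": 2, "whiten": 1},
    "heart":  {"smooth": 4, "whiten": 3},
}

def nearest_subject(gesture_pos, subjects):
    """subjects: {name: (x, y)}; pair the gesture with the closest subject
    by squared Euclidean distance (an assumed matching heuristic)."""
    return min(subjects, key=lambda s: (subjects[s][0] - gesture_pos[0]) ** 2
                                     + (subjects[s][1] - gesture_pos[1]) ** 2)

def beautify_per_gesture(gestures, subjects):
    """gestures: {gesture_name: (x, y)} -> per-subject beauty-parameter plan."""
    plan = {}
    for name, pos in gestures.items():
        subject = nearest_subject(pos, subjects)
        plan[subject] = GESTURE_BEAUTY[name]  # each subject gets its gesture's group
    return plan

plan = beautify_per_gesture(
    {"v_sign": (10, 10), "heart": (90, 10)},
    {"alice": (12, 20), "bob": (88, 22)},
)
print(plan)  # {'alice': {'smooth': 2, 'whiten': 1}, 'bob': {'smooth': 4, 'whiten': 3}}
```

This captures the claim's intent that two people in the same frame can each request a different beauty effect with their own gesture.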
The mobile terminal 800 can implement each process implemented by the mobile terminal in the foregoing embodiments, and details are not repeated here to avoid repetition.
In this embodiment, the mobile terminal acquires the preview image displayed on the shooting preview interface in the shooting preview mode and determines whether the preview image contains a gesture with preset features. Once such a gesture is detected, the image processing parameters corresponding to it are determined from the detected gesture, and the mobile terminal performs image processing according to those parameters. Thus, when the photographed person makes a gesture in the shooting preview interface according to the desired image processing effect, the mobile terminal can recognize the gesture, set the image processing parameters accordingly, and complete the corresponding image processing. The photographed person can therefore control the image processing through gestures without operating the mobile terminal, which is convenient, and different image processing effects can meet the needs of different photographed subjects.
Fig. 12 is a schematic structural diagram of a mobile terminal according to another embodiment of the present invention. Specifically, the mobile terminal 900 in fig. 12 may be a mobile phone, a tablet computer, a Personal Digital Assistant (PDA), or a vehicle-mounted computer.
The mobile terminal 900 in fig. 12 includes a Radio Frequency (RF) circuit 910, a memory 920, an input unit 930, a display unit 940, a processor 960, an audio circuit 970, a WiFi (Wireless Fidelity) module 980, and a power supply 990.
The input unit 930 may be used, among other things, to receive numeric or character information input by a user and to generate signal inputs related to user settings and function control of the mobile terminal 900. Specifically, in the embodiment of the present invention, the input unit 930 may include a touch panel 931. The touch panel 931, also referred to as a touch screen, can collect touch operations performed by a user on or near it (for example, operations performed on the touch panel 931 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. Optionally, the touch panel 931 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends the coordinates to the processor 960, and can also receive and execute commands sent by the processor 960. In addition, the touch panel 931 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 931, the input unit 930 may also include other input devices 932, which may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys and switch keys), a trackball, a mouse, and a joystick.
Among other things, the display unit 940 may be used to display information input by the user or information provided to the user and various menu interfaces of the mobile terminal 900. The display unit 940 may include a display panel 941, and optionally, the display panel 941 may be configured in the form of an LCD or an organic light-emitting diode (OLED).
It should be noted that the touch panel 931 may overlay the display panel 941 to form a touch display screen. When the touch display screen detects a touch operation on or near it, the operation is transmitted to the processor 960 to determine the type of the touch event, and the processor 960 then provides a corresponding visual output on the touch display screen according to the type of the touch event.
The touch display screen includes an application interface display area and a common control display area. The arrangement of these two display areas is not limited and may be any arrangement that distinguishes them, such as a top-bottom or left-right arrangement. The application interface display area may be used to display the interface of an application; each interface may contain at least one interface element, such as an icon and/or a widget desktop control of the application. The application interface display area may also be an empty interface containing no content. The common control display area is used to display frequently used controls, such as setting buttons, interface numbers, scroll bars, and application icons like the phone book icon.
The processor 960 is the control center of the mobile terminal 900. It connects the various parts of the entire mobile phone using various interfaces and lines, and performs the various functions of the mobile terminal 900 and processes data by running or executing the software programs and/or modules stored in the first memory 921 and calling the data stored in the second memory 922, thereby monitoring the mobile terminal 900 as a whole. Optionally, the processor 960 may include one or more processing units.
In this embodiment of the present invention, the mobile terminal 900 further includes: a shooting control program stored in the memory 920 and executable on the processor 960, specifically, a shooting control program in an application program, which when executed by the processor 960, implements the steps of: acquiring a preview image displayed on a shooting preview interface in a shooting preview mode; performing gesture detection on the preview image; if the gesture with the preset characteristics is detected, determining image processing parameters according to the detected gesture; and processing the image according to the image processing parameters.
Optionally, the computer program when executed by the processor 960 may also implement the steps of: detecting a photographic object in the preview image; identifying a background area where the photographic subject is located based on the detected photographic subject; based on the detected gesture, identifying a blurring parameter corresponding to the gesture; setting an aperture parameter based on the identified blurring parameter; and blurring a background area where the shooting object is located according to the aperture parameter.
Optionally, the computer program when executed by the processor 960 may also implement the steps of: determining a number of detected gestures based on the detected gestures; if the number of the detected gestures is one, identifying blurring parameters corresponding to the gestures; if the number of the detected gestures is two, identifying the sum of blurring parameters corresponding to the two gestures respectively; and if the number of the detected gestures is more than two, identifying the average value of blurring parameters corresponding to the plurality of gestures respectively.
Alternatively, as another embodiment, the computer program when executed by the processor 960 may further implement the following steps: detecting a photographic object in the preview image; determining a shooting object corresponding to the gesture according to the detected gesture; based on the detected gesture, identifying a beauty parameter corresponding to the gesture, and setting the beauty parameter; and performing beauty treatment on the shooting object corresponding to the gesture according to the beauty parameters.
Alternatively, as another embodiment, the computer program when executed by the processor 960 may further implement the following steps: detecting a photographic object in the preview image; according to the detected gestures, determining a shooting object corresponding to each gesture respectively; based on the detected gestures, identifying beauty parameters corresponding to each gesture respectively; and according to each group of beauty parameters, respectively carrying out beauty treatment on the shooting object corresponding to each gesture.
In this embodiment, the mobile terminal acquires the preview image displayed on the shooting preview interface in the shooting preview mode and determines whether the preview image contains a gesture with preset features. Once such a gesture is detected, the image processing parameters corresponding to it are determined from the detected gesture, and the mobile terminal performs image processing according to those parameters. Thus, when the photographed person makes a gesture in the shooting preview interface according to the desired image processing effect, the mobile terminal can recognize the gesture, set the image processing parameters accordingly, and complete the corresponding image processing. The photographed person can therefore control the image processing through gestures without operating the mobile terminal, which is convenient, and different image processing effects can meet the needs of different photographed subjects.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a computer-readable storage medium, which includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned computer-readable storage media comprise: various media capable of storing program codes, such as a U disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A shooting method is applied to a mobile terminal and is characterized by comprising the following steps:
acquiring a preview image displayed on a shooting preview interface in a shooting preview mode;
performing gesture detection on the preview image;
if the gesture with the preset characteristics is detected, determining image processing parameters according to the detected gesture;
processing the image according to the image processing parameters;
the number of the detected gestures is at least two, and the image processing parameters comprise beauty parameters;
the step of determining image processing parameters based on the detected gesture comprises:
detecting a photographic object in the preview image;
according to the detected gestures, determining a shooting object corresponding to each gesture respectively;
based on the detected gestures, identifying beauty parameters corresponding to each gesture respectively;
the step of processing the image according to the image processing parameters includes:
and according to each group of beauty parameters, respectively carrying out beauty treatment on the shooting object corresponding to each gesture.
2. The photographing method according to claim 1, wherein the image processing parameters include blurring parameters, and the step of determining the image processing parameters according to the detected gesture includes:
detecting a photographic object in the preview image;
identifying a background area where the photographic subject is located based on the detected photographic subject;
based on the detected gesture, identifying a blurring parameter corresponding to the gesture;
setting an aperture parameter based on the identified blurring parameter;
the step of processing the image according to the image processing parameters includes:
and blurring a background area where the shooting object is located according to the aperture parameter.
3. The shooting method according to claim 2, wherein the step of identifying the blurring parameter corresponding to the gesture based on the detected gesture comprises:
determining a number of detected gestures based on the detected gestures;
if the number of the detected gestures is one, identifying blurring parameters corresponding to the gestures;
if the number of the detected gestures is two, identifying the sum of blurring parameters corresponding to the two gestures respectively;
and if the number of the detected gestures is more than two, identifying the average value of blurring parameters corresponding to the plurality of gestures respectively.
4. The photographing method according to claim 1, wherein the detected gesture is one, and the image processing parameter includes a beauty parameter;
the step of determining image processing parameters based on the detected gesture comprises:
detecting a photographic object in the preview image;
determining a shooting object corresponding to the gesture according to the detected gesture;
based on the detected gesture, identifying a beauty parameter corresponding to the gesture, and setting the beauty parameter;
the step of processing the image according to the image processing parameters includes:
and performing beauty treatment on the shooting object corresponding to the gesture according to the beauty parameters.
5. A mobile terminal, comprising:
the preview image acquisition module is used for acquiring a preview image displayed on a shooting preview interface in a shooting preview mode;
the gesture detection module is used for performing gesture detection on the preview image;
the parameter determining module is used for determining image processing parameters according to the detected gesture if the gesture with the preset characteristics is detected;
the image processing module is used for processing the image according to the image processing parameters;
the number of the detected gestures is at least two, and the image processing parameters comprise beauty parameters;
the parameter determination module comprises:
a third photographic subject detection unit configured to detect a photographic subject in the preview image;
the second shooting object determining unit is used for determining the shooting object corresponding to each gesture according to the detected gestures;
the second beautifying parameter identification unit is used for identifying a beautifying parameter corresponding to each gesture based on the detected gesture;
the image processing module includes:
and the second beautifying processing unit is used for respectively carrying out beautifying processing on the shooting object corresponding to each gesture according to each group of beautifying parameters.
6. The mobile terminal of claim 5, wherein the image processing parameters include blurring parameters, and wherein the parameter determination module comprises:
a first photographic subject detection unit configured to detect a photographic subject in the preview image;
a background region identification unit configured to identify a background region where the photographic subject is located, based on the detected photographic subject;
the blurring parameter identification unit is used for identifying blurring parameters corresponding to the gestures based on the detected gestures;
an aperture parameter setting unit for setting an aperture parameter based on the identified blurring parameter;
the image processing module includes:
and the blurring processing unit is used for blurring a background area where the shooting object is located according to the aperture parameter.
7. The mobile terminal according to claim 6, wherein the blurring parameter identification unit comprises:
a gesture number determination subunit configured to determine the number of detected gestures based on the detected gesture;
the first identification subunit is used for identifying the blurring parameter corresponding to the gesture if the number of the detected gestures is one;
the second identification subunit is used for identifying the sum of blurring parameters corresponding to the two gestures if the number of the detected gestures is two;
and the third identification subunit is used for identifying the average value of the blurring parameters corresponding to the plurality of gestures if the number of the detected gestures is more than two.
8. The mobile terminal of claim 5, wherein the detected gesture is one, and the image processing parameters comprise a beauty parameter;
the parameter determination module comprises:
a second photographic subject detection unit configured to detect a photographic subject in the preview image;
the first shooting object determining unit is used for determining a shooting object corresponding to the gesture according to the detected gesture;
the first beautifying parameter identification unit is used for identifying a beautifying parameter corresponding to the gesture based on the detected gesture and setting the beautifying parameter;
the image processing module includes:
and the first beautifying processing unit is used for carrying out beautifying processing on the shooting object corresponding to the gesture according to the beautifying parameters.
9. A mobile terminal, characterized in that it comprises a processor, a memory, a computer program stored on said memory and executable on said processor, said computer program, when executed by said processor, implementing the steps of the shooting method according to any one of claims 1 to 4.
10. A computer-readable storage medium, characterized in that a computer program is stored thereon, which computer program, when being executed by a processor, carries out the steps of the photographing method according to any one of claims 1 to 4.
CN201710842551.6A 2017-09-18 2017-09-18 Shooting method and mobile terminal Active CN107592458B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710842551.6A CN107592458B (en) 2017-09-18 2017-09-18 Shooting method and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710842551.6A CN107592458B (en) 2017-09-18 2017-09-18 Shooting method and mobile terminal

Publications (2)

Publication Number Publication Date
CN107592458A CN107592458A (en) 2018-01-16
CN107592458B true CN107592458B (en) 2020-02-14

Family

ID=61047256

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710842551.6A Active CN107592458B (en) 2017-09-18 2017-09-18 Shooting method and mobile terminal

Country Status (1)

Country Link
CN (1) CN107592458B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110049248B (en) * 2019-04-29 2021-09-07 努比亚技术有限公司 Shot object regulation and control method and device and computer readable storage medium
CN112822388B (en) * 2019-11-15 2022-07-22 北京小米移动软件有限公司 Shooting mode triggering method, device, equipment and storage medium
CN113825002B (en) * 2021-09-18 2023-06-06 海信视像科技股份有限公司 Display device and focal length control method
CN114785950A (en) * 2022-04-21 2022-07-22 北京达佳互联信息技术有限公司 Image processing method, image processing device, electronic apparatus, medium, and product

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103201710A (en) * 2010-11-10 2013-07-10 NEC Corporation Image processing system, image processing method, and storage medium storing image processing program
CN103237172A (en) * 2013-04-28 2013-08-07 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Method and device for time-lapse shooting
CN103413270A (en) * 2013-08-15 2013-11-27 Beijing Xiaomi Technology Co., Ltd. Method and device for image processing, and terminal device
CN105704390A (en) * 2016-04-20 2016-06-22 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Retouching photographing method and device, and mobile terminal
CN105744172A (en) * 2016-04-27 2016-07-06 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Photographing method and device, and mobile terminal
CN106210521A (en) * 2016-07-15 2016-12-07 Shenzhen Gionee Communication Equipment Co., Ltd. Photographing method and terminal
CN106937054A (en) * 2017-03-30 2017-07-07 Vivo Mobile Communication Co., Ltd. Photographing blurring method for a mobile terminal, and mobile terminal

Also Published As

Publication number Publication date
CN107592458A (en) 2018-01-16

Similar Documents

Publication Publication Date Title
EP3661187B1 (en) Photography method and mobile terminal
CN106406710B (en) Screen recording method and mobile terminal
CN106060406B (en) Photographing method and mobile terminal
CN107509030B (en) focusing method and mobile terminal
CN107124543B (en) Shooting method and mobile terminal
CN107566717B (en) Shooting method, mobile terminal and computer readable storage medium
CN106161967B (en) Backlight scene panoramic shooting method and mobile terminal
CN107613203B (en) Image processing method and mobile terminal
CN107592458B (en) Shooting method and mobile terminal
CN107678644B (en) Image processing method and mobile terminal
CN107172346B (en) Virtualization method and mobile terminal
CN111182205B (en) Photographing method, electronic device, and medium
CN107659722B (en) Image selection method and mobile terminal
CN107172347B (en) Photographing method and terminal
WO2019001152A1 (en) Photographing method and mobile terminal
CN107959789B (en) Image processing method and mobile terminal
CN107360375B (en) Shooting method and mobile terminal
CN106791437B (en) Panoramic image shooting method and mobile terminal
CN106454086B (en) Image processing method and mobile terminal
CN107665434B (en) Payment method and mobile terminal
CN107480500B (en) Face verification method and mobile terminal
CN106993091B (en) Image blurring method and mobile terminal
CN107483821B (en) Image processing method and mobile terminal
TW201604719A (en) Method and apparatus of controlling a smart device
EP3518522B1 (en) Image capturing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant