CN115334243A - Shooting method and device thereof - Google Patents

Shooting method and device thereof

Info

Publication number
CN115334243A
CN115334243A (application CN202211014395.1A)
Authority
CN
China
Prior art keywords
target
moving object
motion
target moving
pixel offset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211014395.1A
Other languages
Chinese (zh)
Inventor
陆小琪
刘心怡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202211014395.1A priority Critical patent/CN115334243A/en
Publication of CN115334243A publication Critical patent/CN115334243A/en
Pending legal-status Critical Current

Abstract

The application discloses a shooting method and a shooting device, and belongs to the technical field of camera shooting. The method comprises the following steps: under the condition that a shooting preview interface is displayed, receiving a first input of a user to a target moving object in the shooting preview interface; in response to the first input, determining motion parameters of the target moving object during its motion according to at least two frames of images of the target moving object acquired during the motion; and shooting the target moving object based on the motion parameters of the target moving object to obtain a target image.

Description

Shooting method and device thereof
Technical Field
The application belongs to the technical field of camera shooting, and particularly relates to a shooting method and a shooting device.
Background
Currently, a user can shoot an image with a panning effect (i.e., the shooting background is blurred while the shooting object is clear) through an electronic device. To do so, the user moves the electronic device so that the imaging of the moving object on the image sensor is relatively static, and thus obtains the image with the panning effect.
However, the user needs to keep the electronic device as stable as possible while moving it to shoot. If shaking occurs during the movement of the electronic device, the imaging effect of the image shot by the electronic device is poor; as a result, the imaging effect of a panning-effect image shot by the electronic device is poor.
Disclosure of Invention
The embodiment of the application aims to provide a shooting method and a shooting device, which can solve the problem of poor imaging effect of shooting a panning-effect image by electronic equipment.
In a first aspect, an embodiment of the present application provides a shooting method, where the shooting method includes: under the condition that the shooting preview interface is displayed, receiving a first input of a user to a target moving object in the shooting preview interface; in response to the first input, determining a motion parameter of the target moving object in the motion process according to at least two frames of images acquired by the target moving object in the motion process; and shooting the target moving object based on the moving parameters of the target moving object to obtain a target image.
In a second aspect, an embodiment of the present application provides a camera, including: the device comprises a receiving module, a determining module and a shooting module. The receiving module is used for receiving a first input of a user to a target moving object in the shooting preview interface under the condition that the shooting preview interface is displayed. And the determining module is used for responding to the first input and determining the motion parameters of the target moving object in the motion process according to at least two frames of images acquired by the target moving object in the motion process. And the shooting module is used for shooting the target moving object based on the moving parameters of the target moving object to obtain a target image.
In a third aspect, embodiments of the present application provide an electronic device, which includes a processor and a memory, where the memory stores a program or instructions executable on the processor, and the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product, stored on a storage medium, for execution by at least one processor to implement the method according to the first aspect.
In the embodiment of the application, under the condition that a shooting preview interface is displayed, a user can input a target moving object in the shooting preview interface, so that the electronic device can acquire at least two frames of images of the target moving object during its motion, determine the motion parameters of the target moving object according to the at least two frames of images, and then shoot the target moving object according to these motion parameters to obtain a target image. In the scheme, the electronic device identifies the motion parameters of the target moving object in the shooting preview interface, so that when the electronic device shoots the target moving object it can do so according to the motion parameters determined in the preview interface. In this way the projection of the target moving object on the image sensor is kept relatively static, shooting with a panning effect is realized, and the poor imaging effect caused by shaking when a user moves the electronic device is avoided.
Drawings
Fig. 1 is a flowchart of a shooting method provided in an embodiment of the present application;
fig. 2 is a schematic structural diagram of a shooting device provided in an embodiment of the present application;
fig. 3 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present disclosure;
fig. 4 is a second hardware structure schematic diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms first, second and the like in the description and in the claims of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that embodiments of the application may be practiced in sequences other than those illustrated or described herein, and that the terms "first," "second," and the like are generally used herein in a generic sense and do not limit the number of terms, e.g., the first term can be one or more than one. In addition, "and/or" in the specification and claims means at least one of connected objects, a character "/" generally means that a preceding and succeeding related objects are in an "or" relationship.
The shooting method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
The shooting method provided by the embodiment of the application can be applied to shooting images with panning effects.
At present, when a user shoots an image with a panning effect (that is, the shooting background is blurred while the shooting object is clear) through an electronic device, the user generally moves the electronic device so that the imaging of the moving object on the image sensor is relatively static, and thus obtains the image with the panning effect. Specifically, since panning requires a longer exposure time to achieve the background-blurring effect, the electronic device sets the shutter to a slow speed during shooting, and the user then moves the electronic device along with the moving object to shoot the image with the panning effect. This panning shooting method requires the user to keep the phone as stable as possible while moving the electronic device to shoot (for example, steadily following the moving object with the waist or hand). However, shaking occurs while the user moves the electronic device, which results in a poor imaging effect of the image shot by the electronic device.
In the embodiment of the application, under the condition that a shooting preview interface is displayed, a user can input a target moving object in the shooting preview interface, so that the electronic device can acquire at least two frames of images of the target moving object during its motion, determine the motion parameters of the target moving object according to the at least two frames of images, and then shoot the target moving object according to these motion parameters to obtain a target image. In the scheme, the electronic device identifies the motion parameters of the target moving object in the shooting preview interface, so that when the electronic device shoots the target moving object it can do so according to the motion parameters determined in the shooting preview interface. In this way the projection of the target moving object on the image sensor can be kept relatively static, shooting with the panning effect is realized, the poor imaging effect caused by shaking when a user moves the electronic device is avoided, and both the efficiency and the imaging effect of shooting a panning image with the electronic device are improved.
The execution main body of the shooting method provided by the embodiment of the application can be a shooting device, and the shooting device can be electronic equipment or a functional module in the electronic equipment. The technical solutions provided in the embodiments of the present application are described below by taking an electronic device as an example.
The embodiment of the application provides a shooting method, and fig. 1 shows a flowchart of the shooting method provided by the embodiment of the application. As shown in fig. 1, the photographing method provided by the embodiment of the present application may include steps 201 to 203 described below.
Step 201, in the case of displaying a shooting preview interface, the electronic device receives a first input of a user to a target moving object in the shooting preview interface.
In the embodiment of the application, a user can perform first input on a target moving object on a shooting preview interface, so that electronic equipment can determine the target moving object, the electronic equipment can acquire a moving track of the target moving object, and the electronic equipment can determine a moving parameter of the target moving object according to the moving track of the target moving object.
Optionally, in this embodiment of the present application, the target moving object may be a movable object, and for example, the target moving object may be a human or an animal.
Optionally, in this embodiment of the application, the motion parameters may be a motion speed of the target moving object and a motion direction of the target moving object.
Optionally, in this embodiment of the present application, the first input may be a click input, a long-press input, a sliding input, a preset track input, and the like of the target moving object by the user. The method can be determined according to actual use requirements, and the embodiment of the application is not limited.
Optionally, in this embodiment of the application, before the electronic device receives a first input of the user on the target moving object in the shooting preview interface, the electronic device may display a prompt message on the shooting preview interface to prompt the user to select the target moving object in the shooting preview interface, so as to obtain the motion parameter of the target moving object.
Optionally, in this embodiment of the present application, a user may input a panning shooting control in a shooting preview interface, so that the electronic device starts a panning shooting mode, and prompts the user to select a target moving object in the shooting preview interface, so as to obtain a motion parameter of the target moving object.
Step 202, the electronic device responds to the first input, and determines a motion parameter of the target moving object in the motion process according to at least two frames of images acquired by the target moving object in the motion process.
In the embodiment of the application, in response to the first input, the electronic device displays an interest frame (region of interest, ROI) containing the target moving object in the shooting preview interface, and tracks the target moving object through the interest frame to acquire at least two frames of images of the target moving object during its motion, so as to determine the motion parameters of the target moving object according to the at least two frames of images.
Specifically, after the electronic device responds to the first input and displays an interest box containing the target moving object in the shooting preview interface, that is, after the user clicks the target moving object in the shooting preview interface, the electronic device may match the most suitable target moving object according to the click position of the user in the shooting preview interface through an Artificial Intelligence (AI) recognition algorithm and track the target moving object.
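The patent does not specify the tracking algorithm; a minimal Python sketch, using plain template matching only as a stand-in for the AI recognition and tracking described above, could look as follows (the function name and the (x, y, w, h) box representation are assumptions):

```python
import cv2

def track_interest_box(prev_frame, cur_frame, prev_box):
    """Locate the interest frame of the target moving object in the current frame.

    prev_box is (x, y, w, h) in prev_frame pixel coordinates. Template matching
    is used here only as a stand-in for the AI tracking described in the text.
    """
    x, y, w, h = prev_box
    template = prev_frame[y:y + h, x:x + w]
    # Normalized cross-correlation between the template and the new frame.
    scores = cv2.matchTemplate(cur_frame, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, best_top_left = cv2.minMaxLoc(scores)
    return (best_top_left[0], best_top_left[1], w, h)
```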
Optionally, in this embodiment of the application, the electronic device may obtain a position offset and a feature pixel offset between at least two frames of images according to the at least two frames of images, so as to determine a motion parameter of the target moving object in the motion process according to the position offset and the feature pixel offset.
Optionally, in this embodiment of the present application, the motion parameters include a motion direction and a motion speed; the step 202 can be realized by the following steps 202a to 202 c.
Step 202a, the electronic equipment responds to a first input, and determines the position offset and the characteristic pixel offset of the target moving object according to at least two frames of images.
In the embodiment of the application, the electronic device can determine the position offset and the characteristic pixel offset of the target moving object according to the interest frame displayed in the shooting preview interface.
Alternatively, in this embodiment of the application, the step 202a may be specifically implemented by the following steps 301 and 302.
Step 301, the electronic device obtains at least two pieces of location information, and determines a location offset according to the at least two pieces of location information.
In this embodiment, each of the at least two pieces of position information is position information of the target moving object in one frame of image.
In the embodiment of the application, the electronic device may acquire the center position information of the target moving object in one frame of image to obtain at least two pieces of position information, and determine the position offset of the target moving object according to the at least two pieces of position information.
Specifically, the position information may be coordinate information of the target moving object in the image sensor.
Optionally, in this embodiment of the application, the electronic device may obtain center position information of an interest frame including the target moving object to obtain at least two pieces of position information, and determine the position offset of the target moving object according to the at least two pieces of position information.
For example, taking two frame images as an example, after the electronic device acquires the central position information of the frame of interest including the target moving object in each frame image of the two frame images, the electronic device may determine the position offset of the target moving object by using the following formula 1, where the specific formula is:
boxSpeed=Box2_center–Box1_center (1)
wherein Box1_center is the center position information of the interest frame in the first frame image, Box2_center is the center position information of the interest frame in the second frame image, and boxSpeed is the position offset of the target moving object.
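A minimal sketch of Formula 1, assuming each interest frame is given as (x, y, w, h) in pixel coordinates (the representation and function names are assumptions):

```python
import numpy as np

def box_center(box):
    """Center (x, y) of an interest frame given as (x, y, w, h)."""
    x, y, w, h = box
    return np.array([x + w / 2.0, y + h / 2.0])

def position_offset(box1, box2):
    """Formula 1: boxSpeed = Box2_center - Box1_center, as an (x, y) pixel vector."""
    return box_center(box2) - box_center(box1)
```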
Step 302, the electronic device performs image recognition on the at least two frames of images to obtain feature pixel points of the at least two frames of images, and performs feature comparison on the feature pixel points of the at least two frames of images to obtain feature pixel offset.
In the embodiment of the application, the electronic device may perform image feature identification on the at least two frames of images through an AI identification algorithm to obtain key feature points in the at least two frames of images, and then compare the key feature points in the at least two frames of images to obtain the feature pixel offset.
Optionally, in this embodiment of the application, the electronic device may identify, by an AI identification algorithm, images included in the interest frame in the at least two frames of images to obtain key feature points of the images included in the interest frame in the at least two frames of images, and then compare the key feature points of the images included in the interest frame in the at least two frames of images to obtain a feature pixel offset.
For example, taking two frames of images as an example, after the electronic device identifies key feature points of images contained in an interest frame in the two frames of images, the electronic device may determine a feature pixel offset according to the following formula 2, where the specific formula is:
featureSpeed=feature2–feature1 (2)
wherein feature1 is a key feature point of the image contained in the interest frame in the first frame image, feature2 is a key feature point of the image contained in the interest frame in the second frame image, and featureSpeed is the feature pixel offset between the key feature points of the images contained in the interest frames in the two frame images.
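The patent does not fix a particular feature extractor or matching rule; one possible reading of Formula 2, assuming matched key feature points are already available for both interest frames, is to take the mean displacement of the matched points:

```python
import numpy as np

def feature_pixel_offset(features1, features2):
    """Formula 2: featureSpeed = feature2 - feature1.

    features1 and features2 are (N, 2) arrays of matched key feature point
    coordinates from the interest frames of the first and second frame images;
    the offset is taken here as the mean displacement of the matched points.
    """
    f1 = np.asarray(features1, dtype=float)
    f2 = np.asarray(features2, dtype=float)
    return (f2 - f1).mean(axis=0)
```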
In the embodiment of the application, the electronic device can calculate the imaging position of the target moving object on the image sensor, so that the speed of the target moving object in the moving process can be obtained, and the electronic device can match the moving speed of the target moving object through the calculated speed, so that the panning effect is realized.
Step 202b, the electronic device determines the pixel offset of the target moving object in the first direction and the pixel offset in the second direction according to the position offset and the characteristic pixel offset.
In an embodiment of the present application, the first direction is perpendicular to the second direction.
In this embodiment of the application, after the electronic device obtains the position offset and the feature pixel offset of the at least two frames of images, the electronic device may determine the pixel offset between the at least two frames of images according to the position offset and the feature pixel offset.
For example, taking two frames of images as an example, after the electronic device determines the position offset and the characteristic pixel offset between the two frames of images, the electronic device may obtain the total pixel offset between the two frames of images according to the following formula 3, where the specific formula is:
totalSpeed=boxSpeed+featureSpeed (3)
wherein boxSpeed is the position offset of the target moving object between the two frames of images, featureSpeed is the feature pixel offset between the two frames of images, and totalSpeed is the total pixel offset of the target moving object between the two frames of images.
In the embodiment of the present application, after obtaining the pixel offset between the at least two frames of images, the electronic device may determine the pixel offsets of the target moving object in the horizontal direction and in the vertical direction according to the pixel coordinates of the target moving object in the first frame image and in the subsequent frame image (i.e., the next frame image of the first frame image in the at least two frames of images).
For example, taking two frame images as an example, assuming that the pixel point coordinates of the target moving object in the first frame image are (0, 0), and the pixel point coordinates of the target moving object in the second frame image are (1, 3), the pixel offset of the target moving object in the first frame image in the horizontal direction is 1, and the pixel offset in the vertical direction is 3.
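Combining Formulas 1-3, a sketch of the total pixel offset and its decomposition into horizontal and vertical components might read as follows (treating both offsets as (x, y) pixel vectors is an assumption):

```python
import numpy as np

def total_pixel_offset(box_speed, feature_speed):
    """Formula 3: totalSpeed = boxSpeed + featureSpeed, split into (X, Y) components."""
    total = np.asarray(box_speed, dtype=float) + np.asarray(feature_speed, dtype=float)
    total_speed_x, total_speed_y = float(total[0]), float(total[1])
    # In the text's example, the offsets are 1 pixel horizontally and 3 pixels vertically.
    return total_speed_x, total_speed_y
```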
Step 202c, determining the moving direction and the moving speed of the target moving object in the moving process according to the pixel offset in the first direction and the pixel offset in the second direction.
In this embodiment, the electronic device may determine a movement angle of the target moving object in the movement process according to the pixel offset of the target moving object in the first direction and the pixel offset of the target moving object in the second direction, so as to determine the movement direction of the target moving object in the movement process.
In this embodiment of the application, the electronic device may determine a moving distance of the target moving object in the moving process according to the pixel offset of the target moving object in the first direction and the pixel offset in the second direction, so as to determine a moving speed of the target moving object in the moving process.
In the embodiment of the application, the electronic device can determine the motion parameters of the camera according to the motion speed and the motion direction of the target motion object, and then the camera is controlled to match the motion speed and the motion direction of the target motion object, so that the panning effect is realized.
Alternatively, in this embodiment of the application, the step 202c may specifically include the following steps 401 and 402.
Step 401, the electronic device determines a target included angle value according to the pixel offset in the first direction and the pixel offset in the second direction, and determines a moving direction of the target moving object in the moving process according to the target included angle value.
In this embodiment of the application, the target included angle value is an included angle value between a horizontal direction and a moving direction of the target moving object in the moving process, and the horizontal direction is a first direction or a second direction.
In this embodiment of the application, after the electronic device obtains the pixel offset in the horizontal direction and the pixel offset in the vertical direction of the target moving object in the first frame image of the at least two frame images, the electronic device may determine the moving direction of the target moving object in the moving process according to the following formula 4, where the specific formula is:
θ=arctan(totalSpeedY/totalSpeedX) (4)
wherein totalSpeedY is the pixel offset of the target moving object in the vertical direction, totalSpeedX is the pixel offset of the target moving object in the horizontal direction, and theta is the angle value of the moving direction.
The horizontal direction is determined by a gyroscope or another component in the electronic device that is capable of detecting the horizontal direction.
In particular, the gyroscope may be a sensing gyroscope, such as a piezoelectric gyroscope, a micromechanical gyroscope, a fiber optic gyroscope, a laser gyroscope.
Optionally, in this embodiment of the application, if the first direction is a left direction, the second direction is a direction perpendicular to the left direction; if the first direction is a right direction, the second direction is a direction perpendicular to the right direction.
Specifically, in the case where the above-described first direction is the left horizontal direction, the second direction may be the upward or the downward vertical direction; likewise, in the case where the first direction is the right horizontal direction, the second direction may be the upward or the downward vertical direction.
The following explains, by means of the 4 quadrants, how the electronic device determines the included angle value of the moving direction of the target moving object from the pixel offset in the first direction and the pixel offset in the second direction.
For example, the electronic device may determine the included angle value of the moving direction of the target moving object in the first quadrant and the fourth quadrant through the above Formula 4;
the electronic device may determine an included angle value of the moving direction of the target moving object in the second quadrant according to the following formula 5, where the specific formula is:
θ=arctan(totalSpeedY/totalSpeedX)+180° (5)
wherein totalSpeedY is the pixel offset of the target moving object in the vertical direction, totalSpeedX is the pixel offset of the target moving object in the left direction, and theta is the angle value of the target moving object in the moving direction in the second quadrant.
The electronic device may determine an included angle value of the moving direction of the target moving object in the third quadrant through the following formula 6, where the specific formula is:
θ=arctan(totalSpeedY/totalSpeedX)–180° (6)
wherein totalSpeedY is the pixel offset of the target moving object in the lower vertical direction, totalSpeedX is the pixel offset of the target moving object in the left direction, and theta is the angle value of the target moving object in the moving direction in the third quadrant.
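A sketch of Formulas 4-6 with the quadrant handling described above (returning the angle in degrees is an assumption; the quadrant-corrected result coincides with math.atan2(totalSpeedY, totalSpeedX)):

```python
import math

def motion_direction_deg(total_speed_x, total_speed_y):
    """Formulas 4-6: included angle of the motion direction, in degrees.

    Quadrants 1 and 4 use Formula 4 directly; quadrant 2 adds 180 degrees
    (Formula 5) and quadrant 3 subtracts 180 degrees (Formula 6).
    """
    if total_speed_x == 0:
        # Purely vertical motion; this edge case is not covered by the formulas.
        return 90.0 if total_speed_y >= 0 else -90.0
    theta = math.degrees(math.atan(total_speed_y / total_speed_x))
    if total_speed_x < 0 and total_speed_y >= 0:   # second quadrant
        theta += 180.0
    elif total_speed_x < 0 and total_speed_y < 0:  # third quadrant
        theta -= 180.0
    return theta
```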
Step 402, the electronic device determines a target pixel offset according to the pixel offset in the first direction and the pixel offset in the second direction, and determines a moving speed of the target moving object in the moving process according to the target pixel offset and the target exposure time.
In the embodiment of the present application, the target exposure time is an image exposure time of a target moving object in a moving process.
In this embodiment, the electronic device may determine a moving distance of the target moving object in the moving process (hereinafter, referred to as a first distance) according to the pixel offset in the first direction and the pixel offset in the second direction, and then determine a moving speed of the target moving object in the moving process according to the first distance and the target exposure time.
Specifically, the electronic device may determine the movement distance of the target moving object in the movement process according to the following formula 7, where the specific formula is:
totalSpeed=sqrt(totalSpeedX²+totalSpeedY²) (7)
wherein totalSpeedY is the pixel offset of the target moving object in the vertical direction, totalSpeedX is the pixel offset of the target moving object in the horizontal direction, and totalSpeed is the moving distance of the target moving object in at least two frames of images.
Specifically, after obtaining the movement distance of the target moving object in the movement process, the electronic device may determine the movement speed of the target moving object according to the following formula 8, where the specific formula is:
V=totalSpeed/t (8)
wherein totalSpeed is the moving distance of the target moving object in at least two frames of images, t is the target exposure time, and V is the moving speed of the target moving object.
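A sketch of Formulas 7 and 8; expressing the speed in pixels per unit of the target exposure time is an assumption, since the text works purely in pixel offsets:

```python
import math

def motion_speed(total_speed_x, total_speed_y, exposure_time):
    """Formulas 7 and 8: movement distance in pixels, then speed = distance / t."""
    total_speed = math.sqrt(total_speed_x ** 2 + total_speed_y ** 2)  # Formula 7
    return total_speed / exposure_time                                # Formula 8
```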
In the embodiment of the application, the electronic device may determine the movement speed and the movement direction of the target moving object in the movement process by using the pixel offset in the first direction and the pixel offset in the second direction, and further, match the movement speed and the movement direction of the target moving object by controlling the camera, thereby implementing the panning effect.
Step 203, the electronic device shoots the target moving object based on the motion parameters of the target moving object to obtain a target image.
In the embodiment of the application, under the condition that the electronic equipment obtains the motion parameters of the target moving object and detects that the target moving object moves, the electronic equipment can automatically shoot the target moving object to obtain the target image.
Optionally, in this embodiment of the application, the electronic device may receive a shooting input of a user, so that the electronic device may shoot the target moving object to obtain the target image.
Optionally, in this embodiment of the application, the shooting input may be a click input, a slide input, or a preset trajectory input of a user on a shooting control in a shooting preview interface; or a user's combined input of physical keys (e.g., power key and volume key); or a voice input to the electronic device by the user. The method can be determined according to actual use conditions, and the embodiment of the application is not limited.
In the embodiment of the application, the electronic device can further match the movement speed and the movement direction of the target moving object by controlling the camera according to the movement parameters of the target moving object, so as to obtain the target image.
The embodiment of the application provides a shooting method, wherein under the condition that a shooting preview interface is displayed, a user can input a target moving object in the shooting preview interface, so that electronic equipment can acquire at least two frames of images of the target moving object in the moving process, the electronic equipment can determine the moving parameters of the target moving object in the moving process according to the at least two frames of images, and then the electronic equipment can shoot the target moving object according to the moving parameters of the target moving object in the moving process to obtain a target image. In the scheme, the electronic equipment can shoot the motion parameters of the target motion object in the motion process in the preview interface by identifying the motion parameters of the target motion object, so that when the electronic equipment shoots the target motion object, the electronic equipment can shoot the target motion object according to the motion parameters of the target motion object in the motion process determined in the preview interface, so that the projection of the target motion object on the image sensor is relatively static, shooting with a panning effect is realized, and poor imaging effect of an image shot by the electronic equipment due to shaking generated when a user moves the electronic equipment is avoided.
Alternatively, in this embodiment of the application, the step 203 may be specifically implemented by the step 203a described below.
Step 203a, the electronic device controls the camera of the electronic device to move based on the motion parameters of the target moving object, and shoots the target moving object during the movement of the camera to obtain a target image.
In the embodiment of the application, after the electronic device responds to the shooting input, the electronic device may set the initial position of the camera, the initial speed of the camera, and the acceleration of each time period in the exposure period according to the moving direction of the target moving object and the moving speed of the target moving object, so that the camera may shoot the target moving object according to the acceleration to obtain the target image.
Specifically, the camera may be a micro-pan-tilt camera.
In the embodiment of the application, after the electronic device obtains the motion parameters of the target moving object, the electronic device can control the camera to move, so that the projection position of the target moving object acquired by the electronic device on the image sensor is relatively static, and a target image is obtained.
Optionally, in this embodiment of the present application, the motion parameters include a motion speed and a motion direction; the "electronic device controls the camera motion of the electronic device based on the motion parameters of the target moving object" in step 203a may be specifically realized through steps 501 to 504 described below.
Step 501, the electronic device determines the motion acceleration of the target moving object according to the motion speed of the target moving object in the motion process.
In this embodiment of the application, the electronic device may determine, according to a motion speed between every two frames of images in the at least two frames of images, a first acceleration of the target moving object between every two frames of images through the following formula 9, so as to obtain a motion acceleration of the target moving object between the at least two frames of images, where the specific formula is:
S1=V1*t+1/2*a*t² (9)
wherein S1 is the moving distance of the target moving object between every two frames of images, V1 is the motion speed of the target moving object between every two frames of images, t is the target exposure time, and a is the first acceleration of the target moving object between every two frames of images.
In this embodiment of the application, after the electronic device obtains the motion acceleration between every two frames of images, the electronic device may obtain the motion acceleration of the target moving object in at least two frames of images according to the following formula 10, where the specific formula is:
a=(S2–S1)/t² (10)
wherein a is the motion acceleration in the at least two frames of images, S1 and S2 are movement distances of the target moving object, each between two frames of images, and t is the target exposure time.
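A sketch of Formulas 9 and 10; Formula 9 is rearranged here to solve for the first acceleration, and treating S1 and S2 as the movement distances of two successive frame pairs is an assumption:

```python
def first_acceleration(s1, v1, t):
    """Formula 9 rearranged: a = 2 * (S1 - V1 * t) / t**2."""
    return 2.0 * (s1 - v1 * t) / (t ** 2)

def motion_acceleration(s1, s2, t):
    """Formula 10: a = (S2 - S1) / t**2, from the distances of two frame pairs."""
    return (s2 - s1) / (t ** 2)
```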
Step 502, the electronic device determines the average motion speed of the target moving object according to the motion speed and the motion acceleration of the target moving object in the motion process.
In the embodiment of the application, the electronic device may determine the average motion speed of the target moving object in each two frames of images according to the motion speed, the motion acceleration and the target exposure time of the target moving object in the motion process in each two frames of images.
Specifically, after the electronic device obtains the motion acceleration and the motion velocity of the target moving object in each two frames of images, the electronic device may determine the average motion velocity of the target moving object in each two frames of images according to the following formula 11, where the specific formula is:
V’=(V1+V1+a*Δt)/2 (11)
wherein, V1 is the motion speed of the target moving object in each two frames of images, Δ t is the target exposure time, a is the motion acceleration of the target moving object in each two frames of images, and V' is the average motion speed of the target moving object in each two frames of images.
And step 503, the electronic device determines a rotation angle value of the camera within the target exposure time according to the average motion speed, the target exposure time and the intrinsic parameters of the camera.
Optionally, in this embodiment of the present application, the intrinsic parameters of the camera include an equivalent focal length of the camera and specification parameters of the image sensor.
Specifically, after obtaining the average motion speed, the target exposure time, and the intrinsic parameters of the camera, the electronic device may determine a rotation angle value of the camera within the target exposure time through the following formula 12, where the specific formula is:
θ_indeed=arctan((Δt*α*V’)/(2*1000*EFL)) (12)
wherein Δt is the target exposure time, α is the specification parameter of the image sensor, EFL is the equivalent focal length of the camera, V' is the average motion speed of the target moving object in each two frames of images, and θ_indeed is the rotation angle value of the camera within the target exposure time.
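A sketch of Formulas 11 and 12; the angle is returned in degrees, and the units of α (assumed pixel size in micrometres) and EFL (assumed millimetres) are assumptions suggested by the factor 1000 in Formula 12:

```python
import math

def camera_rotation_deg(v1, a, exposure_time, alpha, efl):
    """Formulas 11 and 12: rotation angle of the camera within the exposure time.

    v1: motion speed of the target moving object between two frames (pixels/s)
    a: motion acceleration of the target moving object (pixels/s^2)
    alpha: image sensor specification parameter (assumed: pixel size in micrometres)
    efl: equivalent focal length of the camera (assumed: millimetres)
    """
    v_avg = (v1 + v1 + a * exposure_time) / 2.0                        # Formula 11
    theta = math.atan((exposure_time * alpha * v_avg)
                      / (2.0 * 1000.0 * efl))                          # Formula 12
    return math.degrees(theta)
```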
And step 504, the electronic equipment controls the camera to rotate according to the rotation angle value and the movement direction of the target moving object.
In the embodiment of the application, the electronic equipment can calculate the actual rotating angle of the camera in each target exposure time, so that the target moving object can be uniformly accelerated and tracked to obtain the target image.
In the embodiment of the application, in the target exposure time, the electronic device can adjust the angle of the camera according to the moving direction of the target moving object and adjust the camera according to the average moving speed of the target moving object, so that the imaging position of the target shooting object on the image sensor is fixed, and thus, the target moving object and the camera are relatively static, the background moves, and an image with a panning effect is shot.
In the shooting method provided in the embodiment of the present application, the execution subject may be a shooting device, or an electronic device, or may also be a functional module or an entity in the electronic device. The embodiment of the present application takes an example in which a shooting device executes a shooting method, and the shooting device provided in the embodiment of the present application is described.
Fig. 2 shows a schematic diagram of a possible configuration of a camera according to an embodiment of the present application. As shown in fig. 2, the photographing device 70 may include: a receiving module 71, a determining module 72 and a photographing module 73.
The receiving module 71 is configured to receive a first input of a user to a target moving object in a shooting preview interface when the shooting preview interface is displayed. A determining module 72, configured to determine, in response to the first input received by the receiving module, a motion parameter of the target moving object during the motion process according to at least two frames of images acquired by the target moving object during the motion process. And the shooting module 73 is configured to shoot the target moving object based on the motion parameter of the target moving object to obtain a target image.
In a possible implementation manner, the motion parameters include a motion direction and a motion speed; the determining module 72 is specifically configured to determine a position offset and a feature pixel offset of the target moving object according to at least two frames of images; determining the pixel offset of the target moving object in a first direction and the pixel offset in a second direction according to the position offset and the characteristic pixel offset, wherein the first direction is vertical to the second direction; and determining the moving direction and the moving speed of the target moving object in the moving process according to the pixel offset in the first direction and the pixel offset in the second direction.
In a possible implementation manner, the determining module 72 is specifically configured to obtain at least two pieces of position information, and determine a position offset according to the at least two pieces of position information, where each piece of position information is position information of a target moving object in a frame of image; and carrying out image recognition on the at least two frames of images to obtain characteristic pixel points of the at least two frames of images, and carrying out characteristic comparison on the characteristic pixel points of the at least two frames of images to obtain characteristic pixel offset.
In a possible implementation manner, the determining module 72 is specifically configured to determine a target included angle value according to a pixel offset in a first direction and a pixel offset in a second direction, and determine a moving direction of the target moving object in the moving process according to the target included angle value, where the target included angle value is an included angle value between a horizontal direction and the moving direction of the target moving object in the moving process, and the horizontal direction is the first direction or the second direction; and determining a target pixel offset according to the pixel offset in the first direction and the pixel offset in the second direction, and determining the movement speed of the target moving object in the movement process according to the target pixel offset and the target exposure time, wherein the target exposure time is the image exposure time of the target moving object in the movement process.
In a possible implementation manner, the motion parameters include a motion speed and a motion direction; the shooting module 73 is specifically configured to determine a motion acceleration of the target moving object according to a motion speed of the target moving object in a motion process; determining the average motion speed of the target motion object according to the motion speed and the motion acceleration of the target motion object in the motion process; determining a rotation angle value of the camera within the target exposure time according to the average motion speed, the target exposure time and the intrinsic parameters of the camera; and controlling the camera to rotate according to the rotation angle value and the movement direction of the target moving object.
The embodiment of the application provides a shooting device, because the shooting device can shoot the motion parameters of a target motion object in a motion process in a preview interface by identifying the motion parameters of the target motion object, when electronic equipment shoots the target motion object, the shooting device can shoot the target motion object according to the motion parameters of the target motion object in the motion process determined in the shooting preview interface, so that the projection of the target motion object on an image sensor can be relatively static, the shooting with a panning effect is realized, and the poor imaging effect of an image shot by the electronic equipment caused by the shake generated when a user moves the electronic equipment is avoided.
The shooting device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in an electronic apparatus. The device can be a mobile electronic device or a non-mobile electronic device. The mobile electronic device may be, for example, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted electronic device, a Mobile Internet Device (MID), an Augmented Reality (AR)/Virtual Reality (VR) device, a robot, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and may also be a server, a Network Attached Storage (NAS), a personal computer (PC), a Television (TV), a teller machine, a self-service machine, and the like, which are not specifically limited in the embodiments of the present application.
The photographing apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android (Android) operating system, an IOS operating system, or other possible operating systems, which is not specifically limited in the embodiments of the present application.
The shooting device provided in the embodiment of the present application can implement each process implemented in the method embodiment of fig. 1, and is not described here again to avoid repetition.
Optionally, as shown in fig. 3, an electronic device 90 is further provided in this embodiment of the present application, and includes a processor 91 and a memory 92, where the memory 92 stores a program or an instruction that can be executed on the processor 91, and when the program or the instruction is executed by the processor 91, the steps of the foregoing shooting method embodiment are implemented, and the same technical effects can be achieved, and are not described again here to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 4 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 100 includes, but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, and a processor 110.
Those skilled in the art will appreciate that the electronic device 100 may further comprise a power source (e.g., a battery) for supplying power to various components, and the power source may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 4 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is omitted here.
The user input unit 107 is configured to receive a first input of a target moving object in the shooting preview interface from a user when the shooting preview interface is displayed. And the processor 110 is used for responding to the first input and determining the motion parameters of the target moving object in the motion process according to at least two frames of images acquired by the target moving object in the motion process. The processor 110 is further configured to shoot the target moving object based on the motion parameter of the target moving object, so as to obtain a target image.
The embodiment of the application provides electronic equipment, because the electronic equipment can shoot the motion parameters of a target motion object in a motion process in a preview interface by identifying the motion parameters of the target motion object, when the electronic equipment shoots the target motion object, the electronic equipment can shoot the target motion object according to the motion parameters of the target motion object in the motion process determined in the shooting preview interface, so that the projection of the target motion object on an image sensor can be relatively static, the shooting with a panning effect is realized, and the poor imaging effect of an image shot by the electronic equipment caused by the shaking generated when a user moves the electronic equipment is avoided.
Optionally, in this embodiment of the present application, the motion parameters include a motion direction and a motion speed; the processor 110 is specifically configured to determine a position offset and a feature pixel offset of the target moving object according to at least two frames of images; determining the pixel offset of the target moving object in a first direction and the pixel offset in a second direction according to the position offset and the characteristic pixel offset, wherein the first direction is vertical to the second direction; and determining the moving direction and the moving speed of the target moving object in the moving process according to the pixel offset in the first direction and the pixel offset in the second direction.
Optionally, in this embodiment of the application, the processor 110 is specifically configured to obtain at least two pieces of position information, and determine a position offset according to the at least two pieces of position information, where each piece of position information is position information of a target moving object in one frame of image; and performing image recognition on the at least two frames of images to obtain characteristic pixel points of the at least two frames of images, and performing characteristic comparison on the characteristic pixel points of the at least two frames of images to obtain characteristic pixel offset.
Optionally, in this embodiment of the application, the processor 110 is specifically configured to determine a target included angle value according to a pixel offset in a first direction and a pixel offset in a second direction, and determine a moving direction of the target moving object in the moving process according to the target included angle value, where the target included angle value is an included angle value between a horizontal direction and the moving direction of the target moving object in the moving process, and the horizontal direction is the first direction or the second direction; and determining a target pixel offset according to the pixel offset in the first direction and the pixel offset in the second direction, and determining the movement speed of the target moving object in the movement process according to the target pixel offset and the target exposure time, wherein the target exposure time is the image exposure time of the target moving object in the movement process.
Optionally, in this embodiment of the present application, the motion parameters include a motion speed and a motion direction; the processor 110 is specifically configured to determine a motion acceleration of the target moving object according to a motion speed of the target moving object in a motion process; determining the average motion speed of the target motion object according to the motion speed and the motion acceleration of the target motion object in the motion process; determining a rotation angle value of the camera within the target exposure time according to the average motion speed, the target exposure time and the intrinsic parameters of the camera; and controlling the camera to rotate according to the rotation angle value and the movement direction of the target moving object.
The electronic device provided by the embodiment of the application can realize each process realized by the method embodiment, and can achieve the same technical effect, and for avoiding repetition, the details are not repeated here.
The beneficial effects of the various implementation manners in this embodiment may specifically refer to the beneficial effects of the corresponding implementation manners in the above method embodiments, and are not described herein again to avoid repetition.
It should be understood that, in the embodiment of the present application, the input Unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042, and the Graphics Processing Unit 1041 processes image data of a still picture or a video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 107 includes at least one of a touch panel 1071 and other input devices 1072. The touch panel 1071 is also referred to as a touch screen. The touch panel 1071 may include two parts of a touch detection device and a touch controller. Other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a first storage area storing a program or an instruction and a second storage area storing data, wherein the first storage area may store an operating system, an application program or an instruction (such as a sound playing function, an image playing function, etc.) required for at least one function, and the like. Further, memory 109 may include volatile memory or non-volatile memory, or memory 109 may include both volatile and non-volatile memory. The non-volatile Memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash Memory. The volatile Memory may be a Random Access Memory (RAM), a Static Random Access Memory (Static RAM, SRAM), a Dynamic Random Access Memory (Dynamic RAM, DRAM), a Synchronous Dynamic Random Access Memory (Synchronous DRAM, SDRAM), a Double Data Rate Synchronous Dynamic Random Access Memory (Double Data Rate SDRAM, ddr SDRAM), an Enhanced Synchronous SDRAM (ESDRAM), a Synchronous Link DRAM (SLDRAM), and a Direct bus RAM (DRRAM). The memory 109 in the embodiments of the subject application includes, but is not limited to, these and any other suitable types of memory.
Processor 110 may include one or more processing units; optionally, the processor 110 integrates an application processor, which mainly handles operations related to the operating system, user interface, application programs, etc., and a modem processor, which mainly handles wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements the processes of the foregoing method embodiments, and can achieve the same technical effects, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a computer read only memory ROM, a random access memory RAM, a magnetic or optical disk, and the like.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the foregoing method embodiments, and can achieve the same technical effect, and in order to avoid repetition, the details are not repeated here.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a chip system, or a system-on-chip.
Embodiments of the present application provide a computer program product, where the program product is stored in a storage medium, and the program product is executed by at least one processor to implement the processes of the foregoing shooting method embodiments, and achieve the same technical effects, and in order to avoid repetition, details are not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by "comprises a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatuses of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, and may include performing the functions in a substantially simultaneous manner or in a reverse order depending on the functions involved; for example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to some examples may be combined in other examples.
Through the description of the foregoing embodiments, a person skilled in the art can clearly understand that the methods of the foregoing embodiments may be implemented by software plus a necessary general-purpose hardware platform, and certainly may also be implemented by hardware, but in many cases the former is the better implementation. Based on such an understanding, the technical solutions of the present application, or the part thereof contributing to the prior art, may be embodied in the form of a computer software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes instructions for enabling a terminal (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods described in the embodiments of the present application.
While the embodiments of the present application have been described with reference to the accompanying drawings, it is to be understood that the application is not limited to those precise embodiments, which are intended to be illustrative rather than restrictive, and that various changes and modifications may be made by one skilled in the art without departing from the scope of the appended claims.

Claims (10)

1. A method of photographing, the method comprising:
under the condition that a shooting preview interface is displayed, receiving a first input of a user to a target moving object in the shooting preview interface;
in response to the first input, determining a motion parameter of the target moving object in a motion process according to at least two frames of images acquired during the motion process of the target moving object;
and shooting the target moving object based on the motion parameters of the target moving object to obtain a target image.
2. The method of claim 1, wherein the motion parameters include a direction of motion and a speed of motion; the determining the motion parameters of the target moving object in the motion process according to at least two frames of images of the target moving object in the motion process comprises the following steps:
determining the position offset and the characteristic pixel offset of the target moving object according to the at least two frames of images;
determining the pixel offset of the target moving object in a first direction and the pixel offset in a second direction according to the position offset and the characteristic pixel offset, wherein the first direction is perpendicular to the second direction;
and determining the moving direction and the moving speed of the target moving object in the moving process according to the pixel offset in the first direction and the pixel offset in the second direction.
3. The method of claim 2, wherein determining the position offset and the characteristic pixel offset of the target moving object according to the at least two frames of images comprises:
acquiring at least two pieces of position information, and determining the position offset according to the at least two pieces of position information, wherein each piece of position information is the position information of the target moving object in one frame of image;
and performing image recognition on the at least two frames of images to obtain characteristic pixel points of the at least two frames of images, and performing characteristic comparison on the characteristic pixel points of the at least two frames of images to obtain the characteristic pixel offset.
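For illustration only, the following Python sketch shows one plausible reading of the offset computation described in claims 2 and 3 above. It is not part of the disclosure: the function names, the use of matched feature points, and the simple averaging of the two offset estimates are all assumptions made for this sketch.

```python
import numpy as np

def per_direction_pixel_offsets(pos_a, pos_b, feats_a, feats_b):
    """Estimate the object's pixel offset along the first and second directions.

    pos_a, pos_b     : (x, y) positions of the target moving object in two frames,
                       e.g. the centre of its bounding box (claim 3, position offset).
    feats_a, feats_b : (N, 2) arrays of matched characteristic pixel points of the
                       object in the two frames (claim 3, characteristic pixel offset).
    """
    # Position offset between the two frames.
    position_offset = np.asarray(pos_b, dtype=float) - np.asarray(pos_a, dtype=float)

    # Characteristic pixel offset: here taken as the mean displacement of the
    # matched feature points (an assumption; the claim only requires comparing them).
    characteristic_offset = np.mean(
        np.asarray(feats_b, dtype=float) - np.asarray(feats_a, dtype=float), axis=0
    )

    # Fuse the two estimates into a single offset along the first (x) direction and
    # the second (y) direction, which are perpendicular (claim 2). A plain average
    # of the two estimates is another assumption made for this sketch.
    dx, dy = 0.5 * (position_offset + characteristic_offset)
    return float(dx), float(dy)
```

In practice the fusion step could instead weight the two estimates by their reliability; the claims leave this choice open.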
4. The method according to claim 2, wherein the determining a moving direction and a moving speed of the target moving object during the moving process according to the pixel offset in the first direction and the pixel offset in the second direction comprises:
determining a target included angle value according to the pixel offset in the first direction and the pixel offset in the second direction, and determining the moving direction of the target moving object in the moving process according to the target included angle value, wherein the target included angle value is an included angle value between the horizontal direction and the moving direction of the target moving object in the moving process, and the horizontal direction is the first direction or the second direction;
and determining a target pixel offset according to the pixel offset in the first direction and the pixel offset in the second direction, and determining the movement speed of the target moving object in the movement process according to the target pixel offset and target exposure time, wherein the target exposure time is the image exposure time of the target moving object in the movement process.
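Purely as an illustrative reading of claim 4 above, and taking the first direction as the horizontal direction, the included angle and the motion speed could be computed as follows. The names and the specific formulas (atan2 for the angle, the Euclidean magnitude for the target pixel offset) are assumptions, not the claimed wording.

```python
import math

def direction_and_speed(dx, dy, exposure_time_s):
    """Derive the motion direction and motion speed from the per-direction offsets.

    dx, dy          : pixel offsets along the first (horizontal) and second directions.
    exposure_time_s : target exposure time over which the offsets were observed.
    """
    # Target included angle between the horizontal direction and the motion direction.
    included_angle_rad = math.atan2(dy, dx)

    # Target pixel offset combined from the two per-direction offsets
    # (using the Euclidean magnitude is an assumption).
    target_pixel_offset = math.hypot(dx, dy)

    # Motion speed on the image plane, in pixels per second.
    motion_speed_px_s = target_pixel_offset / exposure_time_s
    return included_angle_rad, motion_speed_px_s
```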
5. The method according to any one of claims 1 to 4, wherein the motion parameters include a speed of motion and a direction of motion; and the shooting the target moving object based on the motion parameters of the target moving object to obtain a target image comprises the following steps:
determining the motion acceleration of the target moving object according to the motion speed of the target moving object in the motion process;
determining the average motion speed of the target motion object according to the motion speed and the motion acceleration of the target motion object in the motion process;
determining a rotation angle value of the camera within the target exposure time according to the average motion speed, the target exposure time and the intrinsic parameters of the camera;
and controlling the camera to rotate according to the rotation angle value and the movement direction of the target moving object.
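As a sketch of the panning step in claim 5 above, the camera's rotation angle over the exposure could be approximated with a pinhole camera model, using the focal length taken from the camera's intrinsic parameters. The constant-acceleration average and the arctangent conversion are illustrative assumptions rather than the claimed method itself.

```python
import math

def camera_rotation_angle(motion_speed_px_s, motion_accel_px_s2,
                          exposure_time_s, focal_length_px):
    """Approximate the angle the camera should rotate during the exposure.

    motion_speed_px_s  : motion speed of the target moving object (pixels/second).
    motion_accel_px_s2 : motion acceleration estimated from successive speed samples.
    exposure_time_s    : target exposure time.
    focal_length_px    : focal length from the camera's intrinsic parameters, in pixels.
    """
    # Average motion speed over the exposure, assuming constant acceleration.
    average_speed_px_s = motion_speed_px_s + 0.5 * motion_accel_px_s2 * exposure_time_s

    # Expected displacement of the object on the sensor during the exposure.
    displacement_px = average_speed_px_s * exposure_time_s

    # Pinhole-model conversion from pixel displacement to a rotation angle.
    return math.atan(displacement_px / focal_length_px)
```

Rotating the camera by roughly this angle along the object's motion direction during the exposure keeps the object approximately stationary on the image sensor while the background smears, which is the panning effect the method aims to produce.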
6. A shooting apparatus, wherein the apparatus comprises: a receiving module, a determining module, and a shooting module;
the receiving module is used for receiving a first input of a user to a target moving object in the shooting preview interface under the condition that the shooting preview interface is displayed;
the determining module is configured to determine, in response to the first input received by the receiving module, a motion parameter of the target moving object in a motion process according to at least two frames of images acquired during the motion process of the target moving object;
the shooting module is configured to shoot the target moving object based on the motion parameters of the target moving object to obtain a target image.
7. The apparatus of claim 6, wherein the motion parameters include a direction of motion and a speed of motion; and the determining module is specifically configured to: determine a position offset and a characteristic pixel offset of the target moving object according to the at least two frames of images; determine the pixel offset of the target moving object in a first direction and the pixel offset in a second direction according to the position offset and the characteristic pixel offset, wherein the first direction is perpendicular to the second direction; and determine the moving direction and the moving speed of the target moving object in the moving process according to the pixel offset in the first direction and the pixel offset in the second direction.
8. The apparatus according to claim 7, wherein the determining module is specifically configured to obtain at least two pieces of position information, and determine the position offset according to the at least two pieces of position information, where each piece of position information is position information of the target moving object in one frame of image; and performing image recognition on the at least two frames of images to obtain characteristic pixel points of the at least two frames of images, and performing characteristic comparison on the characteristic pixel points of the at least two frames of images to obtain the characteristic pixel offset.
9. The apparatus according to claim 7, wherein the determining module is specifically configured to determine a target included angle value according to the pixel offset in the first direction and the pixel offset in the second direction, and determine a moving direction of the target moving object in the moving process according to the target included angle value, where the target included angle value is an included angle value between a horizontal direction and the moving direction of the target moving object in the moving process, and the horizontal direction is the first direction or the second direction; and determining a target pixel offset according to the pixel offset in the first direction and the pixel offset in the second direction, and determining the movement speed of the target moving object in the movement process according to the target pixel offset and target exposure time, wherein the target exposure time is the image exposure time of the target moving object in the movement process.
10. The apparatus of any one of claims 6 to 9, wherein the motion parameters comprise a speed of motion and a direction of motion; the shooting module is specifically used for determining the motion acceleration of the target moving object according to the motion speed of the target moving object in the motion process; determining the average motion speed of the target motion object according to the motion speed and the motion acceleration of the target motion object in the motion process; determining a rotation angle value of the camera within the target exposure time according to the average motion speed, the target exposure time and the intrinsic parameters of the camera; and controlling the camera to rotate according to the rotation angle value and the movement direction of the target moving object.
Priority Applications (1)

Application Number: CN202211014395.1A
Priority Date / Filing Date: 2022-08-23
Publication Number: CN115334243A (pending)
Title: Shooting method and device thereof


Publications (1)

Publication Number: CN115334243A
Publication Date: 2022-11-11

Family ID: 83926507



Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102986208A (en) * 2010-05-14 2013-03-20 株式会社理光 Imaging apparatus, image processing method, and recording medium for recording program thereon
CN107920211A (en) * 2017-12-28 2018-04-17 深圳市金立通信设备有限公司 A kind of photographic method, terminal and computer-readable recording medium
CN112585644A (en) * 2019-07-31 2021-03-30 核心光电有限公司 System and method for creating background blur in camera panning or movement
CN112822398A (en) * 2021-01-04 2021-05-18 维沃移动通信有限公司 Shooting method and device and electronic equipment
CN114125305A (en) * 2021-12-01 2022-03-01 西安维沃软件技术有限公司 Shooting method, device and equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination