CN115002338B - Shooting parameter control method and device - Google Patents


Info

Publication number: CN115002338B (earlier publication: CN115002338A)
Authority: CN (China)
Prior art keywords: shooting, zoom, adjustment, parameter, prompt
Legal status: Active (granted)
Application number: CN202111505369.4A
Other languages: Chinese (zh)
Inventors: 张雨, 周兵旺
Original and current assignee: Honor Device Co Ltd
Application filed by Honor Device Co Ltd
Related application: CN202310553504.5A (published as CN116634261A)


Abstract

The application provides a control method and a control device for shooting parameters. After an electronic device enters a video shooting mode, if a shooting parameter adjustment operation is detected, the shooting parameter adjustment range is prompted on the shooting interface. If an adjustment triggering operation is detected, a smooth adjustment curve is obtained based on the initial shooting parameter value and the target shooting parameter value, and the parameter to be adjusted is adjusted smoothly from the initial value to the target value along this curve. The initial shooting parameter value is the shooting parameter value before the current shooting parameter adjustment operation is detected, and the target shooting parameter value is the shooting parameter value after the adjustment triggering operation is detected. According to this scheme, after the adjustment triggering operation is detected, the initial shooting parameter is adjusted smoothly to the target shooting parameter, so a smooth adjustment process is recorded in the video and the overall effect of the video is improved. Moreover, the scheme lets the user intuitively see the shooting range corresponding to the current adjustment operation, improving user experience.

Description

Shooting parameter control method and device
Technical Field
The present disclosure relates to the field of imaging technologies, and in particular, to a method and an apparatus for controlling a shooting parameter.
Background
With the development of technology, electronic devices (e.g., digital cameras, or mobile phones and tablet computers with cameras) are increasingly used for shooting, and shooting with an electronic device has become a popular choice.
When using an electronic device to shoot, a user can adjust shooting parameters to achieve a desired shooting effect; for example, the user adjusts the shooting range through a zooming operation and adjusts the angle through a rotating operation. However, when the user adjusts the shooting parameters while the electronic device is recording a video, the shot picture changes abruptly. Because the whole adjustment process is recorded in the video, the abrupt change degrades the overall effect of the video.
Disclosure of Invention
In view of this, the present application provides a method and an apparatus for controlling shooting parameters, so as to solve the problem of abrupt shooting parameter variation. The disclosed technical scheme is as follows:
In a first aspect, the present application provides a method for controlling a shooting parameter, applied to an electronic device. The method includes: the electronic device enters a video shooting mode; when a shooting parameter adjustment event is received, adjustment range prompt information matched with the shooting parameter adjustment event is displayed on a shooting interface; when an adjustment triggering operation is detected, a target parameter value of a shooting parameter to be adjusted is acquired, where the shooting parameter to be adjusted is the parameter adjusted by the shooting parameter adjustment event; a smooth change curve corresponding to the shooting parameter to be adjusted is obtained based on an initial parameter value and the target parameter value, where the initial parameter value is the value of the shooting parameter to be adjusted when the shooting parameter adjustment event is detected, the abscissa of the smooth change curve is the shooting parameter adjustment time, and the ordinate is the shooting parameter value; and the shooting parameter to be adjusted is controlled to change smoothly from the initial parameter value to the target parameter value based on the smooth change curve. Therefore, after the adjustment triggering operation is detected, the scheme smoothly adjusts the value of the shooting parameter to be adjusted: a smooth change curve is obtained from the initial and target parameter values, and the shooting parameter is adjusted smoothly from the initial parameter value to the target parameter value along this curve, so that a smooth adjustment process, rather than an abrupt one, is recorded in the shot video, improving the overall effect of the video.
Moreover, the adjusting range prompt information matched with the shooting parameter adjusting operation can be displayed on the current shooting interface in real time, so that a user can intuitively see the shooting range corresponding to the operation, and the user experience is improved.
In a possible implementation manner of the first aspect, obtaining a smooth change curve corresponding to the shooting parameter to be adjusted based on the initial parameter value and the target parameter value includes: generating a smooth cubic Bezier curve according to parameter information of a preset cubic Bezier curve, taking the initial parameter value as the starting point and the target parameter value as the ending point, where the parameter information includes the relative position relationship between a first control point and the starting point and between a second control point and the ending point of the cubic Bezier curve. The scheme adopts an ease-in, ease-out smooth curve obtained from the cubic Bezier curve; that is, the adjustment speed near the beginning and ending adjustment times is lower than the adjustment speed in the middle stage, so abrupt changes at the start and end of the adjustment are avoided and the smooth change effect of the shooting parameter is further improved.
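The ease-in, ease-out behavior this implementation describes can be sketched as a cubic Bezier curve whose first control point shares the start value and whose second control point shares the end value, so the change rate is zero at both endpoints and highest in the middle. The following is a minimal sketch; the function names, step count, and exact control-point placement are illustrative assumptions, not the patent's actual implementation:

```python
def cubic_bezier_point(t, p0, p1, p2, p3):
    """Evaluate one coordinate of a cubic Bezier curve at parameter t in [0, 1]."""
    u = 1.0 - t
    return u**3 * p0 + 3 * u**2 * t * p1 + 3 * u * t**2 * p2 + t**3 * p3


def smooth_curve(v_start, v_end, duration_ms, steps=30):
    """Sample an ease-in/ease-out curve from v_start to v_end.

    Time control points sit at 1/3 and 2/3 of the duration (so time advances
    linearly), while the value control points sit on the start and end values,
    which makes the adjustment speed zero at both endpoints.
    """
    samples = []
    for i in range(steps + 1):
        t = i / steps
        time_ms = cubic_bezier_point(t, 0.0, duration_ms / 3, 2 * duration_ms / 3, duration_ms)
        value = cubic_bezier_point(t, v_start, v_start, v_end, v_end)
        samples.append((time_ms, value))
    return samples
```

For example, `smooth_curve(1.0, 5.0, 300)` yields a zoom ramp from 1x to 5x over 300 ms whose early and late steps are smaller than the middle ones, matching the "slow in, slow out" behavior described above.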
In another possible implementation manner of the first aspect, controlling the shooting parameter to be adjusted to change smoothly from the initial parameter value to the target parameter value based on the smooth change curve includes: determining the parameter value corresponding to the parameter to be adjusted at each adjustment time point based on the smooth change curve; generating an adjustment instruction corresponding to each adjustment time point based on the parameter value corresponding to that time point; and executing the adjustment instructions sequentially in the time order of the adjustment time points, adjusting the shooting parameter to be adjusted to the parameter value corresponding to each adjustment instruction. In this way, the adjustment instructions derived from the smooth change curve ultimately realize the smooth adjustment of the shooting parameter.
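The instruction-sequencing step described in this implementation can be sketched as sampling the curve at a fixed frame interval and emitting time-ordered instructions. The dictionary shape, the 33 ms interval, and the function name below are assumptions for illustration:

```python
def build_adjustment_instructions(curve, duration_ms, interval_ms=33):
    """Sample a smooth-change curve into per-frame adjustment instructions.

    `curve` is any callable mapping elapsed time in [0, duration_ms] to a
    parameter value (the patent derives it from a cubic Bezier curve; the
    demo below uses a linear ramp for clarity). Instructions are emitted
    in time order so an executor can apply them sequentially.
    """
    instructions = []
    t = 0
    while t < duration_ms:
        instructions.append({"at_ms": t, "value": curve(t)})
        t += interval_ms
    # Always end exactly on the target value.
    instructions.append({"at_ms": duration_ms, "value": curve(duration_ms)})
    return instructions
```

For example, `build_adjustment_instructions(lambda t: 1.0 + t / 100.0, 100)` emits instructions at 0, 33, 66, 99, and 100 ms, finishing exactly on the target value.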
In a further possible implementation manner of the first aspect, the shooting parameter adjustment event is generated upon detection of a shooting parameter adjustment operation; displaying the adjustment range prompt information matched with the shooting parameter adjustment event on the shooting interface includes: displaying prompt information on the shooting interface; when a change in the shooting parameter adjustment operation is detected, determining a real-time parameter value matched with the changed operation; and adjusting display parameters of the prompt information based on the real-time parameter value. The display parameters of the prompt information are matched with the shooting parameter adjustment operation and include the size or the direction of the prompt information. In this way, the prompt information displayed on the shooting interface changes along with the shooting parameter adjustment operation, so the user can intuitively see the adjusted effect corresponding to each adjustment operation, improving user experience.
In another possible implementation manner of the first aspect, the shooting parameter adjustment operation is a zooming operation, and the prompt information includes a first zoom prompt frame and a second zoom prompt frame displayed on the shooting interface; a shooting range matched with an initial zoom magnification is displayed in the first zoom prompt frame, where the initial zoom magnification is the zoom magnification when the zooming operation is detected; and a shooting range matched with the real-time zoom magnification corresponding to the zooming operation is displayed in the second zoom prompt frame. In this scheme, the shooting range corresponding to the initial zoom magnification is displayed in the first zoom prompt frame, and the shooting range corresponding to the real-time zoom magnification of the zooming operation is displayed in the second zoom prompt frame, so the user can intuitively see the shooting range corresponding to the real-time zooming operation as well as the transitional shooting range from the initial zoom magnification to the target zoom magnification, further improving user experience.
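For a zoom-in operation, the second zoom prompt frame can be computed from the ratio of the initial to the real-time magnification: the frame marks the sub-region of the current picture that the higher magnification will fill. A sketch assuming center-anchored digital zoom (the function and parameter names are illustrative, not the patent's interface):

```python
def zoom_prompt_rect(preview_w, preview_h, initial_zoom, realtime_zoom):
    """Return (x, y, w, h) of the prompt frame showing the shooting range
    at realtime_zoom, relative to a preview currently rendered at
    initial_zoom, with the zoom anchored at the picture center."""
    scale = initial_zoom / realtime_zoom
    w = preview_w * scale
    h = preview_h * scale
    x = (preview_w - w) / 2
    y = (preview_h - h) / 2
    return (x, y, w, h)
```

Doubling the magnification from 1x to 2x on a 1000x800 preview yields a centered 500x400 frame; as the user keeps zooming in, the second frame keeps shrinking while the first frame (the initial range) stays at full size.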
In a further possible implementation manner of the first aspect, the area between the first zoom prompt frame and the second zoom prompt frame is a semitransparent area whose display content is a transition range, corresponding to the change from the shooting range displayed by the first zoom prompt frame to the shooting range displayed by the second zoom prompt frame. In this way, the user can more intuitively see the transitional shooting range from the initial zoom magnification to the target zoom magnification, improving user experience.
In a further possible implementation manner of the first aspect, the shooting parameter adjustment operation is a magnification-increasing (zoom-in) operation; adjusting the display parameters of the prompt information based on the real-time parameter value includes: adjusting the size of the second zoom prompt frame based on the real-time zoom magnification corresponding to the zoom-in operation, so that a shooting range matched with the real-time zoom magnification is displayed in the second zoom prompt frame. In this way, the user can intuitively see the adjusted shooting range in the second zoom prompt frame and determine whether the current shooting range meets the shooting requirement, improving the user's intuitive perception during zoom magnification adjustment.
In another possible implementation manner of the first aspect, the shooting parameter adjustment operation is a magnification-reducing (zoom-out) operation; adjusting the display parameters of the prompt information based on the real-time parameter value includes: based on the real-time zoom magnification corresponding to the zoom-out operation, adjusting the size of the first zoom prompt frame so that the shooting range matched with the initial zoom magnification is still displayed in it, while the second zoom prompt frame equals the size of the shot picture and displays the shooting range matched with the real-time zoom magnification. In this scheme, the shooting range displayed in the second zoom prompt frame is adjusted according to the zoom-out operation, and the first zoom prompt frame is resized so that it always shows the range corresponding to the initial zoom magnification; the user can therefore intuitively see the transitional shooting range from the initial zoom magnification to the target zoom magnification, improving the user's intuitive perception during shooting.
In a further possible implementation manner of the first aspect, the shooting parameter adjustment operation is a rotation operation, and the prompt information includes a rotation direction prompt graphic displayed on the shooting interface; adjusting the display parameters of the prompt graphic based on the real-time parameter value includes: adjusting the indication direction of the rotation direction prompt graphic to be consistent with the rotation angle value corresponding to the rotation operation. In this way, by adjusting the indication direction of the rotation direction prompt graphic, the user can directly see whether the rotation direction corresponding to the rotation operation meets the shooting requirement, improving the user's shooting experience.
In a further possible implementation manner of the first aspect, the rotation direction prompting graphic includes displaying an indication arrow on the photographing interface, where the indication direction of the indication arrow corresponds to a rotation angle value corresponding to the rotation operation.
In a further possible implementation manner of the first aspect, the rotation direction prompting graphic includes an angle dial displayed on the photographing interface, and a rotation angle value corresponding to the rotation operation is displayed in the angle dial.
In a further possible implementation manner of the first aspect, the shooting parameter adjustment event is generated when the angle between the longitudinal symmetry axis of the electronic device and the true horizontal line of the shot picture is detected to be smaller than 90°; prompt information and option controls are displayed on the shooting interface, where the prompt information asks the user whether to rotate to the horizon angle, at which the true horizon of the shot picture is consistent with the horizontal direction on the shooting interface; the option controls include a confirm option control and a cancel option control, and the shooting parameter triggering operation is an operation of clicking the confirm option control. In this way, when the state of the electronic device is detected to satisfy the preset condition, the rotation prompt information is displayed automatically to ask the user whether to rotate to the horizon angle; the user does not need to initiate the rotation manually, which reduces user operations and further improves user experience.
In another possible implementation manner of the first aspect, the prompt information is hidden when the adjustment triggering operation is detected.
In a further possible implementation manner of the first aspect, the photographing parameter to be adjusted is a zoom magnification, and the adjustment triggering operation is that a finger for the zoom operation leaves a screen of the electronic device.
In a further possible implementation manner of the first aspect, the shooting parameter to be adjusted is a rotation angle, and the adjustment triggering operation is that at least two fingers for the rotation operation leave a screen of the electronic device.
In another possible implementation manner of the first aspect, the rotation angle is an angle obtained by rotating the photographed picture clockwise or counterclockwise with a straight line perpendicular to the screen as an axis.
In a second aspect, the present application further provides an electronic device, including a memory and a processor, where the memory stores program instructions that, when executed by the processor, cause the electronic device to execute the method for controlling the shooting parameter according to any one of the first aspect.
In a third aspect, the present application also provides a computer-readable storage medium having stored thereon program instructions that, when run on an electronic device, cause the electronic device to perform the method of controlling a shooting parameter according to any one of the first aspects.
In a fourth aspect, the present application also provides a computer program product having program instructions stored thereon, which, when run on an electronic device, cause the electronic device to perform the method of controlling a shooting parameter according to any of the first aspects.
It should be appreciated that the description of technical features, aspects, benefits or similar language in this application does not imply that all of the features and advantages may be realized with any single embodiment. Conversely, it should be understood that the description of features or advantages is intended to include, in at least one embodiment, the particular features, aspects, or advantages. Therefore, the description of technical features, technical solutions or advantageous effects in this specification does not necessarily refer to the same embodiment. Furthermore, the technical features, technical solutions and advantageous effects described in the present embodiment may also be combined in any appropriate manner. Those of skill in the art will appreciate that an embodiment may be implemented without one or more particular features, aspects, or benefits of a particular embodiment. In other embodiments, additional features and advantages may be recognized in certain embodiments that may not be present in all embodiments.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 2 is a software structure block diagram of an electronic device according to an embodiment of the present application;
Fig. 3A is a schematic diagram of a UI interface for increasing the zoom magnification according to an embodiment of the present application;
Fig. 3B is a schematic diagram of another UI interface for increasing the zoom magnification according to an embodiment of the present application;
Fig. 4 is a schematic diagram of a UI interface for reducing the zoom magnification according to an embodiment of the present application;
Fig. 5 is a schematic diagram of different angles according to an embodiment of the present application;
Fig. 6A is a schematic diagram of a UI interface for rotating the shooting angle to the left according to an embodiment of the present application;
Fig. 6B is a schematic diagram of another UI interface for rotating the shooting angle to the left according to an embodiment of the present application;
Fig. 7A is a schematic diagram of a UI interface for rotating the shooting angle to the right according to an embodiment of the present application;
Fig. 7B is a schematic diagram of another UI interface for rotating the shooting angle to the right according to an embodiment of the present application;
Fig. 7C is a schematic diagram of another UI interface for rotating the shooting angle according to an embodiment of the present application;
Fig. 8 is a flowchart of a control method of shooting parameters according to an embodiment of the present application;
Fig. 9A is a UI schematic diagram of a zoom amplitude bar and a rotation amplitude bar according to an embodiment of the present application;
Fig. 9B is a UI schematic diagram of another zoom amplitude bar and rotation amplitude bar according to an embodiment of the present application;
Fig. 10 is a schematic diagram of a smooth zoom curve corresponding to increasing the zoom magnification according to an embodiment of the present application;
Fig. 11 is a schematic diagram of a smooth zoom curve corresponding to reducing the zoom magnification according to an embodiment of the present application;
Fig. 12 is a schematic diagram of a smooth rotation curve corresponding to increasing the angle according to an embodiment of the present application;
Fig. 13 is a schematic diagram of a smooth rotation curve corresponding to reducing the angle according to an embodiment of the present application.
Detailed Description
The terms first, second, third and the like in the description and in the claims and drawings are used for distinguishing between different objects and not for limiting the specified sequence.
In the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as examples, illustrations, or descriptions. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
The inventors have found that: during zooming on an electronic device, it is difficult for a user to achieve smooth zooming. For example, after performing an operation of increasing the zoom magnification, the user may find the zoom magnification too large and then perform an operation of reducing it, and this whole back-and-forth zooming process is recorded in the video. As a result, the recorded zooming process is abrupt rather than smooth. Similarly, the problem of abrupt angle change exists in the process of adjusting the rotation angle, and the abrupt change affects the overall effect of the video.
In order to solve the technical problem, the inventor provides a control method and a device for shooting parameters, and if the electronic equipment is in a video shooting mode and shooting parameter adjustment operation is detected, a shooting parameter adjustment range is prompted on a shooting interface. If an adjustment triggering operation is detected, a smooth adjustment curve is obtained based on the initial shooting parameter value and the target shooting parameter value, and the parameter to be adjusted is smoothly adjusted from the initial shooting parameter value to the target shooting parameter value based on the smooth adjustment curve. The initial shooting parameter value refers to a shooting parameter value before the current shooting parameter adjustment operation is detected, and the target shooting parameter value is a shooting parameter value after the adjustment triggering operation is detected. Therefore, after the adjustment triggering operation is detected, the scheme smoothly adjusts the value of the shooting parameter to be adjusted, specifically, a smooth adjustment curve is obtained according to the target shooting parameter value and the initial shooting parameter value, and the shooting parameter is smoothly adjusted from the initial shooting parameter value to the target shooting parameter value according to the smooth adjustment curve, so that the process of smoothly adjusting the shooting parameter is recorded in the shot video, and the overall effect of the video is improved. In addition, the scheme can display the range matched with the shooting parameter adjustment operation on the current shooting interface in real time, so that a user can intuitively see the shooting range corresponding to the operation, and the user experience is improved.
Referring to fig. 1, a schematic structural diagram of an electronic device provided in an embodiment of the present application is shown, where the electronic device is used to execute a control method for shooting parameters provided in the present application.
The electronic device may be any device provided with a camera, such as a mobile phone, tablet computer, desktop or laptop computer, ultra-mobile personal computer (UMPC), handheld computer, netbook, personal digital assistant (PDA), or wearable electronic device. The specific form of the electronic device is not particularly limited in this application.
As shown in fig. 1, the electronic device may include a processor, a camera, a display screen, a touch sensor, an internal memory, an external memory interface, a USB interface, a charge management module, a power management module, a battery, and keys.
The processor is the nerve center and command center of the electronic device.
The processor may include one or more processing units, for example, the processor may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can be a neural center and a command center of the electronic device. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
The memory may be used to store computer-executable program code, which may include an operating system, application programs required for at least one function (such as a sound playing function or an image and video playing function), and the like. The executable program code includes instructions that, when executed by the processor, cause the electronic device to perform various functional applications and data processing. For example, in the present application, the processor executes the instructions stored in the memory to cause the electronic device to perform the control method of the shooting parameter provided herein.
The display screen is used to display images, videos, a series of graphical user interfaces (graphical user interface, GUI), etc. The electronic device may include 1 or N displays, N being a positive integer greater than 1. For example, in embodiments of the present application, the display screen may display controls for the camera application.
The electronic device can realize shooting functions through ISP, camera, video codec, GPU, display screen, application processor and the like.
The ISP is used for processing the data fed back by the camera. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In an exemplary embodiment, the ISP may be provided in the camera.
Cameras are used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In an exemplary embodiment, the electronic device may include 1 or N cameras, N being a positive integer greater than 1.
The touch sensor is also referred to as a "touch device". The touch sensor may be arranged on the display screen, and together they form a touch screen, also called a touchscreen. The touch sensor is used to detect a touch operation acting on or near it and may pass the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen. In other embodiments, the touch sensor may also be disposed on a surface of the electronic device at a location different from that of the display screen.
It is to be understood that the configuration illustrated in this embodiment does not constitute a specific limitation on the electronic apparatus. In other embodiments, the electronic device may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Further, an operating system is run on the above components, and an application program can be installed and run on the operating system.
The operating system of the electronic device may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the application, taking an Android system with a layered architecture as an example, a software structure of an electronic device is illustrated.
Fig. 2 is a software configuration block diagram of an electronic device according to an embodiment of the present application.
The layered architecture divides the software into several layers, each with a distinct role and division of labor. The layers communicate with each other through software interfaces. Taking the Android system as an example, in an exemplary embodiment the Android system is divided into five layers: an Application layer (APP), an Application Framework layer (Framework), a system library, a hardware abstraction layer (Hardware Abstraction Layer, HAL), and a system Kernel layer (Kernel).
The application layer may include a series of application packages, such as applications including cameras, gallery, calendar, phone, navigation, and the like.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes some predefined functions such as camera services, window manager, content provider, view system, phone manager, resource manager, notification manager, etc.
The camera HAL may connect the camera framework API to the underlying camera drivers and hardware.
In an exemplary embodiment, the camera HAL may include a photographing parameter processing module, a smooth zooming module, a smooth rotation module, and the like.
The shooting parameter processing module is used to receive a shooting parameter adjustment event, parse it to obtain the shooting parameter type corresponding to the adjustment operation (e.g., zoom magnification or rotation angle), and trigger the corresponding module to generate parameter adjustment instructions, such as zoom instructions or rotation instructions.
The smooth zoom module is used to generate zoom instructions that make the zoom magnification change smoothly.

The smooth rotation module is used to generate rotation instructions that make the angle rotate smoothly.
The kernel layer (kernel) is a layer between hardware and software, and at least includes a camera driver, a touch driver, and the like.
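The division of labor among the HAL modules described above can be sketched as a simple dispatcher. This is an illustrative sketch only; the class and method names are hypothetical and do not reflect the actual camera HAL interfaces.

```python
# Hypothetical sketch of the camera HAL modules described above: a shooting
# parameter processing module parses an adjustment event and triggers the
# module that generates the matching parameter adjustment instructions.
# All names here are illustrative, not the real HAL API.

class SmoothZoomModule:
    def handle(self, event):
        # Would emit a sequence of zoom instructions for a smooth change.
        return f"zoom instructions: {event['start']} -> {event['target']}"

class SmoothRotationModule:
    def handle(self, event):
        # Would emit rotation instructions for a smooth angle change.
        return f"rotation instructions: {event['start']} -> {event['target']}"

class ShootingParameterProcessor:
    def __init__(self):
        # Mapping from parsed parameter type to the module that handles it.
        self.modules = {
            "zoom": SmoothZoomModule(),
            "rotation": SmoothRotationModule(),
        }

    def on_adjustment_event(self, event):
        # Parse the event's parameter type and dispatch to the right module.
        return self.modules[event["kind"]].handle(event)

processor = ShootingParameterProcessor()
print(processor.on_adjustment_event({"kind": "zoom", "start": 1.0, "target": 2.0}))
```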
It should be noted that, although the embodiment of the present application is described taking the Android system as an example, the basic principle is equally applicable to electronic devices based on other operating systems.
The process of adjusting the shooting parameters is described below taking a mobile phone as the electronic device; the adjustment process for other types of electronic devices is similar.
In an exemplary embodiment, in a scene where the zoom magnification is adjusted, a first zoom prompt frame and a second zoom prompt frame may be displayed on the shooting interface. The first zoom prompt frame displays the shooting range corresponding to the initial zoom magnification. The second zoom prompt frame displays the shooting range corresponding to the real-time zoom operation, i.e., the shooting range displayed in the second zoom prompt frame changes along with the zoom operation.
A schematic UI interface of the electronic device when adjusting the zoom magnification will be described below with reference to fig. 3A, 3B, and 4, respectively.
Referring to fig. 3A, a schematic diagram of a UI interface with zoom magnification according to an embodiment of the present application is shown.
In a scene where the camera APP is in the video shooting mode, as shown in (1) of fig. 3A, when two fingers are detected touching the screen without moving, the shooting ranges displayed by the first zoom prompt frame 1 and the second zoom prompt frame 2 are the same. If the two fingers spread outward (i.e., a zoom-in operation), the size of the first zoom prompt frame 1 and the shooting range it displays remain unchanged, while the second zoom prompt frame 2 shrinks along with the spread of the two fingers, so that the shooting range displayed in it is reduced; that is, the second zoom prompt frame 2 displays the shooting range matching the zoom magnification after the two fingers spread.
Thereafter, if the user determines from the shooting range displayed in the second zoom prompt frame 2 that the currently displayed shooting range does not meet the shooting requirement, the user can continue adjusting by spreading or pinching the two fingers: if the two fingers continue to spread, the second zoom prompt frame 2 continues to shrink with the spreading motion; if the two fingers pinch together, the second zoom prompt frame 2 grows with the pinching motion.
In the zoom-in scene, considering that the shooting range displayed by the first zoom prompt frame is always unchanged, in other embodiments the first zoom prompt frame may be omitted from the shooting interface and only the second zoom prompt frame displayed, showing the real-time shooting range corresponding to the zoom operation.
Further, the area between the first zoom prompt frame 1 and the second zoom prompt frame 2 may be a translucent area indicating the transition from the shooting range shown in the first zoom prompt frame 1 to the shooting range shown in the second zoom prompt frame 2. The transparency of the translucent area can be set by the user or take a system default value, such as 80%.
When the user determines, by observing the shooting range in the second zoom prompt frame 2, that the current shooting range meets the requirement, lifting the two fingers off the UI interface (i.e., the zoom trigger operation) triggers the smooth zoom process: the zoom magnification is controlled to change smoothly from the initial zoom magnification to the target zoom magnification, and the shooting range changes smoothly from that shown in (1) of fig. 3A to that shown in (3) of fig. 3A. The video thus records the smooth change from the zoom magnification shown in (1) to that shown in (3), which improves the shooting effect of the video.
In another exemplary embodiment, as shown in fig. 3B, another UI interface for zooming in is provided, which differs from fig. 3A in the initial size of the second zoom prompt frame.

As shown in (1) of fig. 3B, when a two-finger zoom-in operation is detected, the second zoom prompt frame 2 displayed on the shooting interface is initially a very small box; as the two fingers spread, the second zoom prompt frame 2 grows with the spreading motion, e.g., to the size shown in (2) of fig. 3B.
In a scenario where the user performs the zoom-in operation on an area of interest, the display position of the second zoom prompt frame 2 may be determined from the contact positions of the user's zoom operation on the screen, for example the midpoint between the two contacts, or the position of either contact; the display position of the second zoom prompt frame is not limited in this application.
In other exemplary embodiments, the initial second zoom prompt frame 2 in fig. 3B may have other shapes, such as a small circle or even a single dot; the initial shape of the second zoom prompt frame is not limited in this application.

Likewise, the initial size of the second zoom prompt frame 2 is not particularly limited; for example, it may be a system default size, or the second zoom prompt frame 2 may be a square frame with the two contacts as diagonal vertices.
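One simple placement rule consistent with the options above can be sketched in Python; the midpoint rule below is only one of the choices the text permits:

```python
def prompt_frame_position(contact_a, contact_b):
    """Place the second zoom prompt frame at the midpoint of the two touch
    contacts. This is just one of the placements the text allows; using the
    position of either contact would be equally valid."""
    return ((contact_a[0] + contact_b[0]) / 2,
            (contact_a[1] + contact_b[1]) / 2)

# two fingers touching at (100, 200) and (300, 400), in screen pixels
print(prompt_frame_position((100, 200), (300, 400)))  # -> (200.0, 300.0)
```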
Similar to zooming in, the UI interface may display zoom prompt frames in a scene where the user reduces the zoom magnification.
As shown in (1) of fig. 4, when two fingers are detected touching the screen without moving, the shooting ranges displayed by the first zoom prompt frame 1 and the second zoom prompt frame 2 on the shooting interface are the same, namely the shooting range corresponding to the initial zoom magnification (the zoom magnification when the zoom operation is detected). As the two fingers pinch together on the screen, as shown in (2) of fig. 4, the shooting range displayed in the second zoom prompt frame 2 grows, while the size of the second zoom prompt frame 2 itself does not change and equals the size of the viewfinder of the shooting interface, i.e., the maximum range of the second zoom prompt frame. The first zoom prompt frame 1 shrinks, so that the shooting range displayed in the first zoom prompt frame 1 always remains the shooting range corresponding to the initial zoom magnification.
It can be seen that, in the zoom-out scene, the size of the second zoom prompt frame 2 is unchanged while the shooting range displayed in it grows as the zoom magnification decreases; the first zoom prompt frame 1 shrinks, and the shooting range displayed in it is always the one corresponding to the initial zoom magnification.
As shown in (2) of fig. 4, the area between the first zoom prompt frame 1 and the second zoom prompt frame 2 is a translucent area indicating the transition from the shooting range shown in the first zoom prompt frame 1 to the shooting range shown in the second zoom prompt frame 2.
When the user determines, by observing the shooting range in the second zoom prompt frame 2, that the requirement is met, lifting the two fingers off the screen triggers the smooth zoom process: the zoom magnification is controlled to change smoothly from the initial zoom magnification to the target zoom magnification, i.e., the shooting range changes smoothly from that shown in (1) of fig. 4 to that shown in (3) of fig. 4.
Fig. 3A, 3B and 4 are merely exemplary drawings, and the prompt corresponding to the zooming operation may be displayed in other manners, which are not limited in this application.
The process of rotating the angle is similar to adjusting the zoom magnification. The angle in this application refers to the angle obtained by rotating around an axis perpendicular to the screen plane (specifically, through the center of the screen plane). The adjustment angle in this application refers to the angle by which the shooting picture displayed on the screen is rotated through a rotation operation while the lens angle of the electronic device remains unchanged. For example, the rotation operation may include: rotating at least two fingers on the screen, adjusting a rotation amplitude bar on the UI interface, and the like.
In an exemplary embodiment, in a scene where the rotation angle of the shot picture is adjusted, an arrow indicating the rotation angle may be displayed on the shooting interface (as shown in fig. 6A and 7A). The direction indicated by the arrow changes following the rotation operation, i.e., the arrow always points in the direction of the angle corresponding to the current rotation operation; for example, if the rotation operation rotates the picture from 0° to −15°, the arrow direction gradually changes from the initial 0° direction to the −15° direction.
In an exemplary embodiment, as shown in fig. 5, when the electronic device is placed upright, the direction of its upward-facing longitudinal symmetry axis is 0° or 360°. A finger gesture rotating counterclockwise on the screen increases the displayed angle, which gradually changes from 0° toward 360°; conversely, a finger gesture rotating clockwise on the screen decreases the displayed angle, which gradually changes from 360° toward 0°.
In other exemplary embodiments, as shown in fig. 5, when the electronic device is placed forward, the longitudinal symmetry axis of the electronic device is a boundary line of positive and negative angles, the angle range corresponding to the left rotation is 0 ° to-180 °, and the angle range corresponding to the right rotation is 0 ° to 180 °.
The above angle rotation gestures, adjustment directions, and angle ranges can all be configured as needed, and are not limited in this application.
Referring to fig. 6A, a schematic diagram of a UI interface for rotating a shooting screen to the left according to an embodiment of the present application is shown.
In a scene where the camera APP is in the video shooting mode, the user can change the angle of the currently shot picture with a two-finger rotation operation. After the rotation operation is detected, an angle rotation prompt arrow is displayed on the shooting screen. Further, to make the prompt arrow easier to see, a translucent layer on which the arrow is drawn may first be displayed on the shooting interface, as shown in (1) of fig. 6A. The direction indicated by the prompt arrow always matches the rotation angle corresponding to the rotation operation, i.e., it changes following the rotation operation.
The direction of the arrow shown in (1) of fig. 6A indicates the direction corresponding to the initial angle of 0° (i.e., the angle of the shot picture when the rotation operation is detected).
As the user's two fingers rotate counterclockwise, e.g., by 90°, the direction indicated by the angle rotation prompt arrow follows the counterclockwise rotation, i.e., the arrow gradually changes from the direction shown in (1) of fig. 6A (e.g., 0°) to the direction shown in (2) of fig. 6A (e.g., 90°).
When the user determines, by observing the direction of the prompt arrow, that the current angle satisfies the requirement, lifting the two fingers off the UI interface (i.e., the rotation trigger operation) triggers the smooth angle adjustment process: the angle of the shot picture is controlled to change smoothly from the initial angle to the target angle, i.e., from the angle shown in (1) of fig. 6A to the angle shown in (3) of fig. 6A.
In another exemplary embodiment, a disc-shaped rotation prompt control may be displayed at the photographing interface, the angle indicated within the control changing in real time following the change in the rotation operation.
As shown in fig. 6B, another UI diagram when the photographed screen is rotated to the left is provided, which is different from fig. 6A in that: the manner for indicating the rotation angle is different, and the example of fig. 6A indicates the direction of the rotation angle with an arrow and the example of fig. 6B indicates the rotation angle with an angle dial.
When the rotation operation is detected, a circle is displayed on the shooting interface, with the angle value corresponding to the rotation operation shown inside. As shown in (1) of fig. 6B, "0" displayed in the circular angle dial indicates that the current angle is 0°; as the user's fingers rotate leftward, the value displayed in the circular angle dial changes in real time with the rotation operation. For example, as the rotation operation gradually moves from the 0° direction shown in (1) to the −90° direction shown in (2), the value displayed in the circular angle dial gradually changes from 0 to −90.
When the user's two fingers leave the screen, the angle of the camera's shooting picture is controlled to rotate smoothly from the 0° shown in (1) to the −90° shown in (3).
The rightward rotation process is similar to the leftward one. As shown in (1) of fig. 7A, the user rotates two fingers clockwise on the screen, and an angle rotation prompt arrow is displayed on the shooting interface. As the two fingers rotate clockwise by 90°, the direction of the prompt arrow gradually changes from the angle shown in (1) of fig. 7A (e.g., 0°) to the angle shown in (2) of fig. 7A (e.g., 90°).
When the user leaves the screen with two fingers, the angle of the photographing screen displayed on the control screen is smoothly rotated from the angle shown in fig. 7A (1) to the angle shown in fig. 7A (3).
Similar to the UI interface rotated leftward as shown in fig. 6B, when rotated rightward, a circular angle dial may be displayed on the photographing interface and an angle value corresponding to the rotation operation in real time may be displayed in the disc as shown in fig. 7B. The manner of indicating the circular angle dial when rotated rightward is the same as that of fig. 6B, and a detailed description thereof will be omitted.
Fig. 7A and 7B are merely exemplary drawings, and in other embodiments, the rotation direction prompt information corresponding to the rotation operation may be displayed in other manners, for example, the angle value corresponding to the rotation operation in real time is displayed directly on the photographing interface. The specific manner of the rotation prompt is not limited in this application.
In the examples shown in fig. 7A and 7B, the mobile phone is held upright while shooting video, i.e., its longitudinal symmetry axis is perpendicular to the horizontal plane. In another scenario the mobile phone is tilted, i.e., the angle between its longitudinal symmetry axis and the horizontal plane is smaller than 90°; as shown in (1) of fig. 7C, a certain angle (e.g., −15°) then exists between the phone's actual shooting view angle and the horizontal view angle of the shooting interface. If video is shot at the actual view angle, then when a user later watches the video (e.g., holding the phone upright), the picture displayed in the video will be at an angle to the horizontal plane, i.e., the shot object appears tilted.
In this scenario, the mobile phone may automatically prompt the user whether to rotate to the horizon view angle. As shown in (2) of fig. 7C, "whether to rotate to the horizon view angle" is displayed on the shooting interface together with "OK" and "Cancel" selection controls; if the user taps the "OK" control, the shooting angle is controlled to rotate smoothly from the angle shown in (1) to the horizon view angle.
The example in fig. 7C shows the mobile phone tilted to the left; the case where the phone is tilted to the right is similar and is not described again here.
In the embodiment shown in fig. 7C, when the mobile phone enters the video shooting mode and is detected to be tilted, the shooting interface automatically asks the user whether to rotate back to the horizon view angle. If the user chooses to do so, the mobile phone automatically controls the shot picture to rotate smoothly from the initial angle to the angle corresponding to the horizon view angle, without requiring manual rotation. Automatically prompting the user in this way reduces user operations and further improves user experience.
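The decision described above can be reduced to a small predicate. The 1° tolerance below is an assumed value, not taken from the source:

```python
def should_prompt_horizon(tilt_deg, tolerance_deg=1.0):
    """Return True when the phone is tilted enough from the horizon view
    angle that the 'rotate to horizon view angle?' prompt is worth showing.
    tilt_deg is the angle between the actual and horizontal view angles,
    e.g. -15 for the scene in fig. 7C; tolerance_deg is an assumption."""
    return abs(tilt_deg) > tolerance_deg

print(should_prompt_horizon(-15.0))  # tilted 15 degrees: show the prompt
print(should_prompt_horizon(0.5))    # essentially level: no prompt
```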
The following describes, with reference to fig. 8, a procedure of a shooting parameter control method applied to an electronic device, which may include the following steps:
s110, when the camera APP is in the video recording mode, if shooting parameter adjustment operation is detected, a shooting parameter adjustment event is generated and transmitted to a shooting parameter processing module.
Taking adjustment of shooting parameters by touch operation as an example: when the camera APP is in the video recording mode, after the touch screen detects a touch operation by the user, the touch operation is reported to the camera APP; after the camera APP determines that the touch operation is a shooting parameter adjustment operation, it generates a shooting parameter adjustment event and transmits it to the shooting parameter processing module of the camera HAL layer.
As another example, in the scenario shown in fig. 7C, the shooting parameter adjustment event is generated when the angle between the longitudinal symmetry axis of the mobile phone and the horizontal plane is detected to be smaller than 90°.
S120, the shooting parameter processing module responds to the shooting parameter adjustment event and determines shooting parameters to be adjusted corresponding to the shooting parameter adjustment operation.
In one embodiment, the shooting parameters may include zoom magnification and rotation angle.
In an exemplary embodiment, a mapping relationship between a shooting parameter adjustment operation and a shooting parameter is stored in the electronic device, and after receiving a shooting parameter adjustment event, the shooting parameter processing module may determine, according to the mapping relationship, a shooting parameter corresponding to the currently detected operation, that is, a shooting parameter to be adjusted.
For example, a two-finger spread operation corresponds to increasing the zoom magnification, and a two-finger pinch operation corresponds to decreasing it; a two-finger counterclockwise or clockwise rotation corresponds to the rotation angle.
In another exemplary embodiment, as shown in fig. 9A, a zoom magnification amplitude bar is provided on the right side of the photographing interface, and a rotation amplitude bar may be provided on the left side of the photographing interface. Corresponding adjustment controls are arranged in the zooming multiplying power amplitude bar and the rotation amplitude bar, and a user can adjust the adjustment controls in the amplitude bar so as to adjust corresponding shooting parameters.
As another example, in other exemplary embodiments, as shown in fig. 9B, the control for adjusting the rotation angle may also be an adjustment control provided on a rotation scale, which may be arranged below the shot picture. The specific location of the rotation scale is not limited in this application.
Fig. 9A and 9B are only examples of shooting parameter adjustment controls, and the form and position of the control for adjusting shooting parameters are not limited in the present application.
S130, if the shooting parameter to be adjusted is determined to be the zoom magnification, the shooting parameter processing module acquires the zoom magnification corresponding to the zoom operation.
After determining that the shooting parameter to be adjusted corresponding to the shooting parameter adjustment event is the zoom magnification, the shooting parameter processing module obtains the zoom magnification at the time the zoom operation was detected, i.e., the initial zoom magnification, and the zoom magnification corresponding to the current adjustment operation, i.e., the real-time zoom magnification.
In an exemplary embodiment, the zoom operation is a two-finger spread on the screen, i.e., an operation to increase the zoom magnification; based on the detected displacement of the zoom operation on the screen, the increased real-time zoom magnification corresponding to the zoom operation is determined. Similarly, when the zoom operation is detected to be a two-finger pinch on the screen, the reduced real-time zoom magnification after the zoom operation can be determined. The real-time zoom magnification is displayed in the zoom prompt frame.
The camera system of the electronic device is provided with a mapping relationship between the two-finger displacement on the screen and the zoom magnification change amplitude; based on this mapping and the initial zoom magnification, the real-time zoom magnification corresponding to the zoom operation can be calculated.
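A minimal sketch of such a mapping, assuming a linear relationship between finger displacement and magnification change; the ratio and clamping limits below are hypothetical values, and a real device would tune them per camera:

```python
def realtime_zoom(initial_zoom, spread_px,
                  ratio_per_px=0.005, min_zoom=1.0, max_zoom=10.0):
    """Map the change in two-finger distance (pixels; positive = spread,
    negative = pinch) to a real-time zoom magnification, clamped to the
    supported zoom range. The linear ratio here is an assumed value."""
    zoom = initial_zoom + spread_px * ratio_per_px
    return max(min_zoom, min(max_zoom, zoom))

print(realtime_zoom(1.0, 200))   # fingers spread 200 px from 1.0x
print(realtime_zoom(2.0, -100))  # fingers pinch 100 px from 2.0x
```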
In other embodiments, the process of determining the real-time zoom magnification corresponding to other types of zoom operation is similar to that of the two-finger zoom operation, and will not be described here.
S140, triggering the display screen to prompt the zoom range on the shooting interface.
As shown in fig. 3A, 3B, and 4, a first zoom-out frame 1, a second zoom-out frame 2, and a translucent layer are displayed on a photographing interface for photographing a video, wherein a photographing range corresponding to a zoom operation is displayed in the second zoom-out frame 2.
S150, if the shooting parameter processing module detects a zoom trigger event, the target zoom magnification is determined.
After receiving a touch event reported layer by layer from the touch screen, the camera APP transmits the event to the shooting parameter processing module if it determines that the event is a zoom trigger event.
In response to the zoom trigger event, the shooting parameter processing module determines that the real-time zoom magnification at the time of the zoom trigger operation is the target zoom magnification, and sends zoom information such as the zoom trigger event, the initial zoom magnification, and the target zoom magnification to the smooth zoom module.
If the zoom operation is a two-finger spread/pinch operation, the zoom trigger operation is the user's two fingers leaving the screen. When the two fingers are detected leaving the screen, the sliding displacement of the two fingers on the screen is determined, and the target zoom magnification is obtained from the sliding displacement and the initial zoom magnification.
If the zoom operation is to adjust the adjustment control on the zoom level bar, the zoom trigger operation is that the user's finger leaves the adjustment control, in which case the target zoom magnification is determined based on the final position of the adjustment control on the zoom level bar. Alternatively, in other possible implementations, the zoom trigger operation may be an operation that a user clicks a specific control or a shortcut key of the photographing interface, and the specific manner of the zoom trigger operation is not limited in this application.
S160, the smooth zoom module responds to the zoom trigger event and obtains a smooth zoom curve from the initial zoom magnification and the target zoom magnification.
After receiving the zoom trigger event, the smooth zoom module responds to the zoom trigger event and obtains a smooth zoom curve based on the initial zoom magnification and the target zoom magnification.
In an exemplary embodiment, after the shooting parameter processing module detects the zoom trigger event, the display screen may be triggered to hide the zoom prompt box.
In the zooming process, to obtain a smooth result, an ease-in/ease-out profile is adopted at the start and end of zooming, i.e., the zoom curve is S-shaped: the zoom speed near the start time and near the end time is smaller than the zoom speed in the middle period, as shown by the zoom magnification curves in fig. 10.
As shown in fig. 10, the vertical axis represents zoom magnification, and the horizontal axis represents time. T1 is a time corresponding to the initial zoom magnification, T2 is a time corresponding to the target zoom magnification, and the zoom duration Δt=t2-T1.
In an exemplary embodiment, ΔT may be a fixed duration, such as 1.2 s, which may be the duration corresponding to the best dynamic zoom effect as determined through a limited number of experiments. Alternatively, in other embodiments, ΔT may be determined from the difference between the initial and target zoom magnifications (i.e., the zoom amplitude): for example, a zoom speed (zoom amplitude per unit time) is determined through a limited number of experiments, the zoom duration is then derived from the current zoom amplitude, and the smooth zoom curve is determined from that duration.
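Both duration policies can be sketched together; the speed and clamping values below are illustrative assumptions, stand-ins for the experimentally determined values the text mentions:

```python
def zoom_duration(initial_zoom, target_zoom,
                  fixed_s=None, speed_per_s=1.0, min_s=0.4, max_s=2.0):
    """Return the zoom duration ΔT: either a fixed value (e.g. 1.2 s) or one
    derived from the zoom amplitude and an assumed zoom speed (amplitude
    change per second), clamped to an assumed sensible range."""
    if fixed_s is not None:
        return fixed_s
    amplitude = abs(target_zoom - initial_zoom)
    return max(min_s, min(max_s, amplitude / speed_per_s))

print(zoom_duration(1.0, 3.0, fixed_s=1.2))  # fixed-duration policy
print(zoom_duration(1.0, 2.0))               # amplitude-based policy
```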
In an exemplary embodiment, the smooth zoom curve is generated using a cubic Bezier curve, a mathematical curve widely used in two-dimensional graphics applications.
The path of the cubic Bezier curve is described by a function B(t) of a given starting point P0, control points P1 and P2, and ending point P3:
B(t) = P0·(1−t)³ + 3·P1·t·(1−t)² + 3·P2·t²·(1−t) + P3·t³, t ∈ [0, 1]
In the formula, t is the normalized time within the zoom duration ΔT from the initial zoom magnification to the target zoom magnification (elapsed time divided by ΔT, so t runs from 0 to 1); P0 is the position point corresponding to the initial zoom magnification, P3 is the position point corresponding to the target zoom magnification, and the Bezier curve can be adjusted through the two control points P1 and P2.
In this application, the general shape of the curve (e.g., S-shaped) and the corresponding information for P1 and P2, namely the position of P1 relative to the starting point and of P2 relative to the ending point, are determined through a limited number of tests and preset in the electronic device. Each time the shooting parameters are adjusted, the coordinates of the starting and ending points are set according to the actual situation, yielding a smooth curve matched to that situation.
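The curve evaluation can be sketched directly from the cubic Bezier formula above. The endpoint values below (1.0x to 3.0x) are hypothetical; placing each control value at its neighbouring endpoint flattens the curve at both ends, giving the S-shaped ease-in/ease-out profile the text describes:

```python
def bezier_zoom(t, p0, p1, p2, p3):
    """Cubic Bezier B(t) = P0(1-t)^3 + 3*P1*t*(1-t)^2 + 3*P2*t^2*(1-t)
    + P3*t^3, where t is the normalized time in [0, 1] over the zoom
    duration. P0/P3 are the initial/target magnifications; P1/P2 shape
    the curve."""
    u = 1.0 - t
    return (p0 * u**3 + 3 * p1 * t * u**2
            + 3 * p2 * t**2 * u + p3 * t**3)

# Hypothetical zoom from 1.0x to 3.0x; P1 = P0 and P2 = P3 makes the
# curve's slope zero at both ends (slow start, slow finish).
for i in range(5):
    print(round(bezier_zoom(i / 4, 1.0, 1.0, 3.0, 3.0), 4))
```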
With the zoom duration fixed, the curves corresponding to different zoom amplitudes differ. As shown in fig. 10, the zoom amplitude of curve 1 is smaller than that of curve 2 while the durations are the same, so the zoom speeds differ: the zoom speed of curve 2 is greater than that of curve 1.
Similarly, a smooth zoom curve can be obtained for the zoom-out scene: after the initial and target zoom magnifications corresponding to the zoom-out operation are determined, the whole curve is obtained from the Bezier function. Fig. 11 shows a smooth zoom curve corresponding to a zoom-out operation; curves matching different zoom-out operations are obtained by adjusting the curve according to their respective initial and target zoom magnifications.
S170, the smooth zoom module outputs zoom instructions based on the smooth zoom curve and transmits them to the camera module, which controls the zoom magnification to change smoothly from the initial zoom magnification to the target zoom magnification.
In this embodiment, optical zoom is taken as an example. The smooth zoom module determines the zoom magnification for each moment from the ordinate of each point on the smooth zoom curve, generates a zoom instruction for each moment carrying that magnification, and transmits the instructions to the camera module in time order, so that the camera module adjusts the lens position (and thus the focal length) according to each instruction in turn, finally achieving smooth zoom.
In other embodiments, the zoom may also be digital zoom. In that case, the zoom instructions generated by the smooth zoom module are transmitted to the image signal processor, which performs zoom processing on the received captured image data in response to each instruction, finally achieving smooth zoom. Whether the zoom is optical or digital, during the smooth zoom process the shooting data of the camera module are transmitted to the image signal processor for recording and to the display screen for display.
S180, if the shooting parameter to be adjusted is determined to be the rotation angle, the shooting parameter processing module acquires the angle corresponding to the rotation operation.
The rotation angle herein may be the angle shown in fig. 5, and will not be described here.
In some embodiments, the rotation operation may be a rotation of two fingers on the screen: for example, one finger stays still while the other rotates clockwise or counterclockwise around it, or the two fingers rotate in opposite directions, one sliding clockwise and the other counterclockwise. In this case, the real-time rotational displacement of the user's fingers is obtained, and the corresponding real-time angle is derived from the correspondence between rotational displacement and rotation angle.
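One common way to derive the real-time angle of such a two-finger gesture (an assumption for illustration; the patent only states that displacement maps to angle) is to track the change in orientation of the line joining the two touch points:

```python
import math

def finger_rotation_deg(p1_start, p2_start, p1_now, p2_now):
    """Estimate the rotation angle of a two-finger gesture as the change
    in orientation of the line joining the two touch points.

    Points are (x, y) pairs in screen coordinates; the result is in
    degrees, positive for counterclockwise rotation in a y-up frame.
    Works both when one finger is stationary and when both move."""
    a0 = math.atan2(p2_start[1] - p1_start[1], p2_start[0] - p1_start[0])
    a1 = math.atan2(p2_now[1] - p1_now[1], p2_now[0] - p1_now[0])
    delta = math.degrees(a1 - a0)
    # Normalise into (-180, 180] so a small gesture never reads as ~350 deg.
    return (delta + 180.0) % 360.0 - 180.0
```

For example, with one finger fixed at the origin and the other sweeping from the positive x-axis to the positive y-axis, the function reports a 90° counterclockwise rotation.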
For another example, an angle rotation amplitude bar with an adjustment control on it is arranged on the shooting interface, and the user can drag the adjustment control with a finger to adjust the angle. The position of the adjustment control on the amplitude bar is acquired in real time, and the real-time angle is determined from that position.
In the scene shown in fig. 7C, the angle between the actual horizon of the shot picture and the horizon direction of the shooting interface is taken directly as the rotation angle corresponding to the current rotation operation event; as shown in (1) of fig. 7C, this angle may be displayed on the shooting interface.
S190, the display screen is triggered to prompt the rotation direction on the shooting interface.
As shown in figs. 6A, 6B, 7A and 7B, a prompt graphic of the rotation angle may be displayed on the shooting interface. The graphic indicates the rotation angle of the current rotation operation, so the user intuitively sees the shooting range after the rotation and can judge whether the current rotation operation meets the requirement.
In other embodiments, the value of the rotation angle may also be displayed directly on the photographing interface.
S200, if the shooting parameter processing module detects a rotation trigger event, determining a target angle.
If the rotation operation is a two-finger rotation on the screen, the rotation trigger operation may be both fingers leaving the screen; if the rotation operation is dragging the adjustment control on the angle rotation amplitude bar, the rotation trigger operation may be the finger leaving the adjustment control. The application does not particularly limit the rotation trigger operation.
After the camera APP receives the touch event reported layer by layer from the touch screen, it transmits the event to the shooting parameter processing module if the event is determined to be a rotation trigger event.
In response to the rotation trigger event, the shooting parameter processing module takes the real-time rotation angle at the moment the rotation trigger operation is received as the target angle, and sends the rotation trigger event, the initial angle, the target angle and other angle rotation information to the smooth rotation module.
The process of determining the target angle is the same as the process of determining the real-time angle, and will not be described herein.
In the scenario shown in fig. 7C, when the user's click on the "OK" control is detected, a rotation trigger event is generated, and the target angle is 0°; that is, the horizon of the shot picture rotates from a state in which it forms an included angle with the horizon direction of the shooting interface to a state in which the two directions coincide, i.e. the picture is rotated back to the horizon angle.
S210, the smooth rotation module responds to the rotation trigger event, and obtains a smooth rotation curve according to the initial angle and the target angle.
In a scene where the user adjusts the angle with a rotation operation, the shooting angle of the picture when the rotation operation is detected is the initial angle. In the tilted-handset case shown in fig. 7C, the tilt angle of the handset is the initial angle (e.g., -15°), and the target angle is 0°; a smooth curve changing from -15° to 0° is obtained.
After the smooth rotation module receives the rotation trigger event, it obtains a smooth rotation curve based on the initial angle and the target angle. The process of obtaining the smooth rotation curve is the same as that of obtaining the smooth zoom curve and is not repeated here.
In addition, after the shooting parameter processing module detects a rotation trigger event, the angle rotation prompt information is hidden.
As shown in fig. 12, the horizontal axis represents time and the vertical axis represents angle. The curve represents a path changing smoothly from the initial angle to the target angle within Δt (Δt = T2 - T1), and the ordinate of each point on the curve is the rotation angle at that moment.
Similar to the smooth zoom curve, the smooth rotation curve also uses a cubic Bezier curve. As shown in fig. 12, an ease-in/ease-out profile is used at the start and end of the rotation; that is, the rotation speeds near the start time and near the end time are smaller than the rotation speed in the middle period.
As shown in fig. 13, the smooth rotation curve for a decreasing angle is similar to that for an increasing angle: an ease-in/ease-out profile is used at the start and end of the rotation, i.e. the rotation speeds near the start time and near the end time are smaller than the rotation speed in the middle period.
Δt represents the duration taken to change from the initial angle to the target angle. In some embodiments, Δt may be a fixed duration, such as 1.2 s, which may be the duration corresponding to the best dynamic rotation effect determined through a limited number of experiments. In other embodiments, Δt may be determined from the difference between the initial angle and the target angle (i.e., the rotation amplitude): a rotation speed, i.e. the rotation amplitude per unit time, is determined through a limited number of experiments, the rotation duration is then derived from the amplitude of the current rotation operation, and the smooth rotation curve is determined from that duration.
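The two ways of choosing Δt described above can be sketched together. The speed and clamping values below are illustrative assumptions; the patent only says such values come from a limited number of experiments.

```python
def rotation_duration(initial_deg, target_deg,
                      fixed=None, speed_deg_per_s=25.0,
                      min_s=0.3, max_s=2.0):
    """Choose the smooth-rotation duration Δt in seconds.

    Either a fixed duration (e.g. 1.2 s) is used, or the duration is
    derived from the rotation amplitude and an experimentally tuned
    speed. speed_deg_per_s, min_s and max_s are hypothetical values;
    clamping keeps tiny or huge gestures within a pleasant range."""
    if fixed is not None:
        return fixed
    amplitude = abs(target_deg - initial_deg)
    return max(min_s, min(max_s, amplitude / speed_deg_per_s))
```

For the fig. 7C example, a -15° to 0° correction at 25°/s would take 0.6 s, while the fixed-duration variant always takes the preset 1.2 s.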
When the rotation duration is fixed, the amplitude of the smooth rotation curve along the vertical axis is scaled according to the different rotation amplitudes, which yields the smooth rotation curves corresponding to different rotation operations.
S220, the smooth rotation module outputs rotation instructions based on the smooth rotation curve and transmits them to the image signal processor, and the image signal processor controls the shot picture to change smoothly from the initial angle to the target angle based on the rotation instructions. The smooth rotation module determines the angle at each moment from the ordinate of each point on the smooth rotation curve and generates a rotation instruction for each moment, each instruction carrying the corresponding angle. It transmits the instructions to the image signal processor in time order, and the image signal processor rotates the received shot picture data smoothly in response to each instruction until the rotation angle of the shot picture reaches the target angle. Meanwhile, the image signal processor records the shooting data of the whole smooth rotation process to generate a video and sends it to the display screen for display.
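Since the image signal processor applies one rotation per recorded frame, the smooth rotation curve has to be resampled at the recording frame rate. The sketch below does this with linear interpolation between curve samples, which is an implementation assumption, not stated in the patent.

```python
def frame_angles(curve, fps=30.0):
    """Resample a smooth rotation curve at the recording frame rate so
    the image signal processor knows how far to rotate each frame.

    `curve` is a list of (time_s, angle_deg) samples; linear
    interpolation between neighbouring samples is assumed."""
    if not curve:
        return []
    curve = sorted(curve)
    t_end = curve[-1][0]
    angles, t, i = [], 0.0, 0
    while t <= t_end + 1e-9:
        # Advance to the curve segment containing time t.
        while i + 1 < len(curve) and curve[i + 1][0] < t:
            i += 1
        (t0, a0), (t1, a1) = curve[i], curve[min(i + 1, len(curve) - 1)]
        if t1 == t0:
            angles.append(a1)
        else:
            w = (t - t0) / (t1 - t0)
            angles.append(a0 + w * (a1 - a0))
        t += 1.0 / fps
    return angles
```

For the fig. 7C correction (-15° to 0° over 1.2 s) this yields a monotone per-frame angle schedule starting at -15° and ending at 0°.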
The shot picture data are captured by the camera module and transmitted to the image signal processor.
In the shooting parameter control method provided in this embodiment, when the electronic device is in video shooting mode and a shooting parameter adjustment event is received, the parameter to be adjusted is determined first. If the parameter to be adjusted is the zoom magnification, the initial zoom magnification is determined; after a zoom trigger operation is detected, the target zoom magnification is determined, a smooth zoom curve is generated from the initial and target zoom magnifications, and the zoom magnification is controlled to change smoothly from the initial to the target magnification based on that curve. The same process applies to adjusting the rotation angle: after a rotation trigger operation is detected, a smooth rotation curve is generated from the initial and target rotation angles, and the rotation angle is controlled to change smoothly from the initial angle to the target angle based on that curve. This scheme thus achieves smooth change of shooting parameters during video shooting, avoids abrupt parameter changes, and improves the video shooting effect.
Further, after a shooting parameter adjustment operation is detected, a prompt box is displayed on the shooting interface. If the current operation is a zoom operation, the shooting range matched with the real-time zoom operation is displayed in the prompt box; similarly, when the rotation angle is adjusted, the shooting range matched with the real-time rotation operation is displayed, so the user intuitively sees the shooting range after the rotation. The user can therefore see the shooting range corresponding to the real-time adjustment operation at a glance and judge whether the adjustment meets the requirement.
From the foregoing description of the embodiments, it will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of functional modules is illustrated, and in practical application, the above-described functional allocation may be implemented by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to implement all or part of the functions described above. The specific working processes of the above-described systems, devices and units may refer to the corresponding processes in the foregoing method embodiments, which are not described herein.
In the several embodiments provided in this application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application, essentially, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to perform all or part of the steps of the method described in the respective embodiments. And the aforementioned storage medium includes: flash memory, removable hard disk, read-only memory, random access memory, magnetic or optical disk, and the like.
The foregoing is merely a specific embodiment of the present application, but the protection scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered in the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (16)

1. A control method of shooting parameters, characterized in that it is applied to an electronic device, the method comprising:
the electronic equipment enters a video shooting mode;
when a shooting parameter adjustment event is received, displaying adjustment range prompt information matched with the shooting parameter adjustment event on a shooting interface;
when an adjustment triggering operation is detected, acquiring a target parameter value of a shooting parameter to be adjusted, wherein the shooting parameter to be adjusted is an adjusted parameter of the shooting parameter adjustment event;
generating a smooth cubic Bezier curve by taking an initial parameter value as a starting point and the target parameter value as an ending point according to parameter information of a preset cubic Bezier curve, wherein the parameter information comprises a relative position relation between a first control point and the starting point of the cubic Bezier curve and a relative position relation between a second control point and the ending point; the initial parameter value is a parameter value corresponding to the shooting parameter to be adjusted when the shooting parameter adjustment event is detected, the abscissa of the smoothed cubic Bezier curve is shooting parameter adjustment time, and the ordinate is shooting parameter value;
Determining parameter values corresponding to the shooting parameters to be adjusted at each adjustment time point based on the smoothed cubic Bezier curve;
generating an adjustment instruction corresponding to the adjustment time point based on the parameter value corresponding to the adjustment time point;
and sequentially executing the adjustment instructions according to the time sequence of each adjustment time point, and adjusting the shooting parameters to be adjusted to parameter values corresponding to the adjustment instructions.
2. The method of claim 1, wherein the shooting parameter adjustment event is generated upon detection of a shooting parameter adjustment operation;
the displaying, on the photographing interface, adjustment range prompt information matched with the photographing parameter adjustment event includes:
displaying prompt information on the shooting interface;
when the change of the shooting parameter adjustment operation is detected, determining a real-time parameter value matched with the changed shooting parameter adjustment operation;
adjusting display parameters of the prompt information based on the real-time parameter values;
the display parameters of the prompt information are matched with the shooting parameter adjustment operation, and the display parameters of the prompt information comprise the size or the direction of the prompt information.
3. The method of claim 2, wherein the shooting parameter adjustment operation is a zoom operation, and the prompt message includes a first zoom prompt box and a second zoom prompt box displayed on the shooting interface;
displaying a shooting range matched with an initial zoom magnification in the first zoom prompt box, wherein the initial zoom magnification is a corresponding zoom magnification when the zoom operation is detected;
and displaying a shooting range matched with the real-time zoom magnification corresponding to the zoom operation in the second zoom prompt frame.
4. The method of claim 3, wherein the area between the first zoom prompt box and the second zoom prompt box is a semi-transparent area, the display content of the semi-transparent area being a transition range corresponding to the change from the shooting range displayed in the first zoom prompt box to the shooting range displayed in the second zoom prompt box.
5. The method according to claim 3 or 4, wherein the shooting parameter adjustment operation is a magnification zoom operation;
the adjusting the display parameter of the prompt message based on the real-time parameter value includes:
And adjusting the size of the second zoom prompt box based on the real-time zoom magnification corresponding to the zoom magnification operation, so that a shooting range matched with the real-time zoom magnification is displayed in the second zoom prompt box.
6. The method according to claim 3 or 4, wherein the shooting parameter adjustment operation is a zoom-out magnification operation;
the adjusting the display parameter of the prompt message based on the real-time parameter value includes:
based on the real-time zoom magnification corresponding to the zoom magnification reduction operation, adjusting the size of the first zoom prompt box so as to enable a shooting range matched with the initial zoom magnification to be displayed in the first zoom prompt box;
the size of the second zooming prompt frame is equal to that of the shooting picture, and the shooting range matched with the real-time zooming multiplying power is displayed in the second zooming prompt frame.
7. The method according to claim 2, wherein the shooting parameter adjustment operation is a rotation operation, and the prompt information includes a prompt graphic for displaying a rotation angle on the shooting interface;
the adjusting the display parameter of the prompt message based on the real-time parameter value includes:
And adjusting the indication direction of the prompt graph to be consistent with the rotation angle value corresponding to the rotation operation.
8. The method according to claim 7, wherein the prompt graphic includes an indication arrow displayed on a photographing interface, the indication direction of the indication arrow being consistent with a rotation angle value corresponding to the rotation operation.
9. The method of claim 7, wherein the hint graphic includes an angle dial displayed on a photographic interface within which a rotation angle value corresponding to the rotation operation is displayed.
10. The method of claim 1, wherein the shooting parameter adjustment event is generated when an angle between a longitudinal symmetry axis of the electronic device and a true horizontal line of a shot is detected to be less than 90 °;
the displaying, on the photographing interface, adjustment range prompt information matched with the photographing parameter adjustment event includes:
displaying rotation prompt information and option controls on the shooting interface, wherein the rotation prompt information is used for prompting a user whether to rotate to a horizon angle, and the real horizon of a shot picture is consistent with the horizon direction on the shooting interface under the horizon angle;
The option control comprises a determination option control and a cancel option control, and the adjustment triggering operation is an operation of clicking the determination option control.
11. The method according to any one of claims 2 to 4 and 7 to 9, further comprising: and hiding the prompt information when detecting the adjustment triggering operation.
12. The method of claim 1, wherein the photographing parameter to be adjusted is a zoom magnification, and the adjustment triggering operation is a finger for a zoom operation moving away from a screen of the electronic device.
13. The method of claim 1, wherein the photographing parameter to be adjusted is a rotation angle, and the adjustment triggering operation is at least two fingers for a rotation operation leaving a screen of the electronic device.
14. The method according to any one of claims 7 to 9, wherein the rotation angle is an angle obtained by rotating the photographed picture clockwise or counterclockwise with a straight line perpendicular to the screen as an axis.
15. An electronic device comprising a memory and a processor, the memory having stored therein a computer program which, when executed by the processor, causes the electronic device to perform the method of controlling a shooting parameter as claimed in any one of claims 1 to 14.
16. A computer-readable storage medium, on which a computer program is stored which, when run on an electronic device, causes the electronic device to execute the control method of the shooting parameters as claimed in any one of claims 1 to 14.
CN202111505369.4A 2021-12-10 2021-12-10 Shooting parameter control method and device Active CN115002338B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111505369.4A CN115002338B (en) 2021-12-10 2021-12-10 Shooting parameter control method and device
CN202310553504.5A CN116634261A (en) 2021-12-10 2021-12-10 Shooting parameter control method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111505369.4A CN115002338B (en) 2021-12-10 2021-12-10 Shooting parameter control method and device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202310553504.5A Division CN116634261A (en) 2021-12-10 2021-12-10 Shooting parameter control method and device

Publications (2)

Publication Number Publication Date
CN115002338A CN115002338A (en) 2022-09-02
CN115002338B true CN115002338B (en) 2023-06-02

Family

ID=83017963

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202111505369.4A Active CN115002338B (en) 2021-12-10 2021-12-10 Shooting parameter control method and device
CN202310553504.5A Pending CN116634261A (en) 2021-12-10 2021-12-10 Shooting parameter control method and device

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202310553504.5A Pending CN116634261A (en) 2021-12-10 2021-12-10 Shooting parameter control method and device

Country Status (1)

Country Link
CN (2) CN115002338B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116709018B (en) * 2022-10-14 2024-04-09 荣耀终端有限公司 Zoom bar segmentation method and electronic equipment

Citations (1)

Publication number Priority date Publication date Assignee Title
CN110572572A (en) * 2019-09-16 2019-12-13 Oppo广东移动通信有限公司 Shooting parameter selection method and device, mobile terminal and storage medium

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
JP6746390B2 (en) * 2016-06-14 2020-08-26 キヤノン株式会社 Image processing method and image processing apparatus
JP6800644B2 (en) * 2016-07-29 2020-12-16 キヤノン株式会社 Image processing device and image processing method
CN113747050B (en) * 2020-05-30 2023-04-18 华为技术有限公司 Shooting method and equipment
CN113438508B (en) * 2021-06-23 2022-07-01 北京蜂巢世纪科技有限公司 Video data processing method, device, equipment, medium and program product

Patent Citations (1)

Publication number Priority date Publication date Assignee Title
CN110572572A (en) * 2019-09-16 2019-12-13 Oppo广东移动通信有限公司 Shooting parameter selection method and device, mobile terminal and storage medium

Also Published As

Publication number Publication date
CN116634261A (en) 2023-08-22
CN115002338A (en) 2022-09-02

Similar Documents

Publication Publication Date Title
US11481096B2 (en) Gesture mapping for image filter input parameters
US20230154097A1 (en) Systems and methods for displaying representative images
US10645272B2 (en) Camera zoom level and image frame capture control
US10038844B2 (en) User interface for wide angle photography
EP3286915B1 (en) Digital viewfinder user interface for multiple cameras
KR101969424B1 (en) Photographing device for displaying image and methods thereof
JP4642868B2 (en) How to provide an area zoom function for a camera
US10628010B2 (en) Quick review of captured image data
US20200244871A1 (en) Camera zoom level and image frame capture control
CN107390990B (en) Image adjusting method and mobile terminal
US20120242852A1 (en) Gesture-Based Configuration of Image Processing Techniques
US20210084228A1 (en) Tracking shot method and device, and storage medium
CN107172347B (en) Photographing method and terminal
KR20170029404A (en) Method and apparatus for controlling video image and terminal
CN115002338B (en) Shooting parameter control method and device
CN114125268A (en) Focusing method and device
CN112214621A (en) Image viewing method and electronic equipment
GB2574780A (en) Electronic device and method for controlling same
WO2023225910A1 (en) Video display method and apparatus, terminal device, and computer storage medium
US20240080543A1 (en) User interfaces for camera management
WO2021218118A1 (en) Image processing method and apparatus
CN116744065A (en) Video playing method and device
JP2013157689A (en) Imaging system, control method/control program therefor, and imaging device
KR20150070084A (en) Method for controlling camera of terminal and terminal thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant