CN112887621B - Control method and electronic device - Google Patents

Control method and electronic device Download PDF

Info

Publication number
CN112887621B
CN112887621B
Authority
CN
China
Prior art keywords
angle
shadow
target object
input
light source
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110120862.8A
Other languages
Chinese (zh)
Other versions
CN112887621A (en)
Inventor
夏志洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202110120862.8A priority Critical patent/CN112887621B/en
Publication of CN112887621A publication Critical patent/CN112887621A/en
Application granted granted Critical
Publication of CN112887621B publication Critical patent/CN112887621B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H04N23/62 Control of parameters via user interfaces
    • H04N23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects

Abstract

The application discloses a control method and an electronic device, belonging to the field of electronic technology, and aims to solve the problem that, to make the person and the shadow in a photo achieve the expected effect, the user has to continuously adjust the shooting position, which makes the operation cumbersome. The control method includes: receiving a first input from a user when a target object and its shadow are identified in a target interface, the first input being used to adjust the included angle between two screens of the electronic device to a first angle; and, in response to the first input, adjusting the included angle between the target object and its shadow to a second angle, the second angle being associated with the first angle. The control method is applied to the electronic device.

Description

Control method and electronic device
Technical Field
The application belongs to the technical field of electronics, and particularly relates to a control method and electronic equipment.
Background
At present, camera configuration and image processing technology of electronic equipment are gradually mature, and users take pictures with the electronic equipment more and more frequently.
When shooting in sunlight, the photographed subject and its shadow both appear in the picture. The shadow is not merely a projection of the subject; its inversion and deformation can also be used to convey visual impact and emotion. Therefore, to achieve the desired shooting effect, the user has to keep adjusting the shooting angle during shooting so that the person and the shadow in the picture reach the expected effect.
Therefore, in the process of implementing the present application, the inventors found at least the following problem in the prior art: in order to make the person and the shadow in the photo achieve the expected effect, the user needs to continuously adjust the shooting position, which makes the operation cumbersome.
Disclosure of Invention
An object of the embodiments of the present application is to provide a control method that can solve the problem that, to make the person and the shadow in a photo achieve the desired effect, the user has to continuously adjust the shooting position, which makes the operation cumbersome.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides a control method, where the method includes: receiving a first input of a user under the condition that a target object and a shadow thereof are identified in a target interface; the first input is used for adjusting an included angle between two screens of the electronic equipment to a first angle; responding to the first input, and adjusting an included angle between the target object and a shadow thereof to a second angle; the second angle is associated with the first angle.
In a second aspect, an embodiment of the present application provides an electronic device, including: the first input receiving module is used for receiving a first input of a user under the condition that a target object and a shadow thereof are identified in a target interface; the first input is used for adjusting an included angle between two screens of the electronic equipment to a first angle; the first input response module is used for responding to the first input and adjusting the included angle between the target object and the shadow thereof to a second angle; the second angle is associated with the first angle.
In a third aspect, embodiments of the present application provide an electronic device, which includes a processor, a memory, and a program or instructions stored in the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium on which a program or instructions are stored, which when executed by a processor, implement the steps of the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In this way, in the embodiment of the present application, when the user shoots, if there is a light source, such as sunlight, in the shooting scene, the target object and its shadow in the shooting scene can be shot. Therefore, in the case where the target interface of the present embodiment is the shooting preview interface, the target object and the shadow thereof can be recognized. Further, the user can adjust the included angle between the two screens through a preset first input. Accordingly, on the basis of presetting the relevance of the included angle between the two screens and the included angle between the target object and the shadow thereof, when the user adjusts the included angle between the two screens to a first angle, correspondingly, the included angle between the target object and the shadow thereof is also adjusted to a second angle relevant to the first angle. Therefore, in the shooting process, the user can preview the display effect of the target object and the shadow thereof under different position relations by adjusting the included angle between the two screens, so that the user is prevented from continuously adjusting the shooting position, and the user operation is simplified.
Drawings
FIG. 1 is a flow chart of a control method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a target relative position relationship according to an embodiment of the present application;
FIG. 3 is a first schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 4 is a second schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 5 is a block diagram of an electronic device according to an embodiment of the present application;
FIG. 6 is a first schematic diagram of the hardware structure of an electronic device according to an embodiment of the present application;
FIG. 7 is a second schematic diagram of the hardware structure of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second" and the like in the description and claims of the present application are used to distinguish between similar objects, and are not necessarily used to describe a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances, so that the embodiments of the application can be practiced in sequences other than those illustrated or described herein. Moreover, the terms "first", "second" and the like are generally used generically and do not limit the number of objects; for example, a first object may be one object or more than one object. In addition, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the objects before and after it.
The following describes the control method provided by the embodiments of the present application in detail through specific embodiments and application scenarios thereof with reference to the accompanying drawings.
Fig. 1 shows a flowchart of a control method according to an embodiment of the present application, which is applied to an electronic device including two screens. The method in the embodiment comprises the following steps:
step S1: in the event that a target object and its shadow are identified in the target interface, a first input is received from a user.
The first input is used for adjusting an included angle between two screens of the electronic equipment to a first angle.
The first input includes touch input and contactless (air-gesture) input performed by the user on the screen, including but not limited to tap and slide inputs; the first input also includes the user's input on physical keys of the device, including but not limited to presses. Furthermore, the first input may consist of one or more inputs, and the multiple inputs may be continuous or intermittent.
By way of example, the two screens are a first screen and a second screen, respectively, where the first screen is the main screen and the second screen is the secondary screen. When the main screen displays the target interface, the user can fold the secondary screen to adjust the included angle between the two screens to the first angle.
Alternatively, in this step, the target interface is any one of a shooting preview interface, an image editing interface, and the like.
In one scenario, a user enters a shooting mode of the electronic device, a first screen displays a shooting preview interface, and if a target object and a shadow thereof are identified in the shooting preview interface, the user can perform a first input.
In another scenario, a user enters an image editing mode of the electronic device, the first screen displays an image editing interface, and if a target object and a shadow thereof are identified in the image editing interface, the user can perform a first input.
For example, when the target object and its shadow are identified in the target interface, a control for triggering shadow-angle adjustment is displayed; the user operates the control to enable the shadow-angle adjustment function, and then adjusts the included angle between the two screens through the first input.
Alternatively, when the target object and its shadow are identified in the target interface, the user first adjusts the included angle between the two screens to the first angle, and then triggers the control for adjusting the shadow angle.
Step S2: and responding to the first input, and adjusting the included angle between the target object and the shadow thereof to a second angle.
Wherein the second angle is associated with the first angle.
In this embodiment, the included angle between the target object and the shadow thereof is used to represent the relative position relationship between the target object and the shadow thereof, and the orientation relationship between the target object and the shadow thereof can be represented from the included angle between the target object and the shadow thereof.
Optionally, the angle between the two screens and the angle between the target object and its shadow are set in advance in a correlated manner.
For example, it can be set as: as the angle of the included angle between the two screens is increased, correspondingly, the angle of the included angle between the target object and the shadow thereof is increased; conversely, as the angle between the two screens decreases, the angle between the target object and its shadow correspondingly decreases.
Accordingly, when the angle between the two screens is a first angle, correspondingly, the angle between the target object and its shadow is a second angle associated with the first angle.
Referring to FIG. 2, the target object 1 and its shadow 2 can each be treated as a point, so their coordinates can be determined in the target interface. A horizontal line is then taken as the reference line 3, and the angle between the line connecting the target object 1 with its shadow 2 and the reference line 3 is taken as the included angle between the target object 1 and its shadow 2.
For example, the reference line 3 is set on one side of the target object 1, taking the target object 1 as the datum; referring to FIG. 2, the reference line 3 is located on the left side of the target object 1.
Specifically, in the case where the target object is a person, a reference line is set in the front vertical direction of the face of the person.
Further, the included angle between the two screens ranges from 0° to 180°, which is mapped to an included angle of 0° to 180° between the shadow and the target object; that is, the included angle between the two screens corresponds to the included angle between the shadow and the target object. Referring to FIG. 2, for example, when the included angle between the two screens is 120°, the included angle between the shadow 2 and the target object 1 is correspondingly 120°.
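As a minimal sketch of the geometry just described (the function names and the exact 1:1 linear mapping are assumptions of this example, not requirements of the embodiment), the included angle can be computed from the point coordinates of the target object and its shadow, and the second angle can be derived directly from the first angle between the two screens:

```python
import math

def included_angle(obj_xy, shadow_xy):
    """Angle between the object-shadow connecting line and the horizontal
    reference line, in degrees, folded into the 0-180 range used above."""
    dx = shadow_xy[0] - obj_xy[0]
    dy = shadow_xy[1] - obj_xy[1]
    return math.degrees(math.atan2(dy, dx)) % 180.0

def second_angle_from_fold(first_angle_deg):
    """Assumed 1:1 mapping: a fold angle of 0-180 degrees maps directly to a
    shadow angle of 0-180 degrees (e.g. a 120-degree fold gives 120 degrees)."""
    return max(0.0, min(180.0, first_angle_deg))
```

Any monotonic mapping would serve equally well here, as long as a larger fold angle corresponds to a larger included angle, matching the association described above.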
Optionally, the angle between the target object and its shadow is displayed in real time, which is helpful for the user to adjust.
In this way, in the embodiment of the present application, when the user shoots, if there is a light source, such as sunlight, in the shooting scene, the target object and its shadow in the shooting scene can be shot. Therefore, in the case where the target interface of the present embodiment is the shooting preview interface, the target object and the shadow thereof can be recognized. Further, the user can adjust the included angle between the two screens through a preset first input. Accordingly, on the basis of presetting the relevance of the included angle between the two screens and the included angle between the target object and the shadow thereof, when the user adjusts the included angle between the two screens to a first angle, correspondingly, the included angle between the target object and the shadow thereof is also adjusted to a second angle relevant to the first angle. Therefore, in the shooting process, the user can preview the display effect of the target object and the shadow thereof under different position relations by adjusting the included angle between the two screens, so that the user is prevented from continuously adjusting the shooting position, and the user operation is simplified.
In addition, the embodiment of the application can be applied to an image editing scene besides the shooting scene, so that the expected effect can be achieved at a later stage by adjusting the included angle of the target object and the shadow thereof in the photo. Therefore, based on the image editing method provided by the embodiment, the user does not need to adjust the shooting position in the shooting process, the shooting technical requirement on the user is low, and the user operation is simple; meanwhile, the operation of the user in the image editing scene is simpler, so that the editing operation of the user is simplified.
In a flow of a control method according to another embodiment of the present application, before receiving the first input from the user, the method further includes:
step A1: a first light source forming a shadow is identified in the target interface.
Optionally, the first light source comprises a light emitting object such as the sun, a light, etc.; the first light source also includes light rays, etc.
Step A2: and determining the relative position relation of the first light source, the target object and the shadow thereof.
Optionally, an image recognition technology is first used to recognize the three elements, i.e., the first light source, the target object and the shadow thereof, in the target interface, then position information (e.g., coordinate information) of each element in the target interface is respectively obtained, and angles of each included angle are calculated through the position information and a trigonometric function, so as to further determine a target relative position relationship between the three elements.
Referring to fig. 2, the relative target position relationship includes an angle between the first light source 4 and the target object 1, and an angle between the target object 1 and its shadow 2.
The included angle between the first light source 4 and the target object 1 is the angle between the line connecting the first light source 4 with the target object 1 and the reference line 3; the included angle between the target object 1 and its shadow 2 is the angle between the line connecting the target object 1 with its shadow 2 and the reference line 3.
Referring to FIG. 2, for example, the incident direction of the first light source 4 is first calculated from the included angle between the first light source 4 and the target object 1, and is then fine-tuned using the included angle between the target object 1 and its shadow 2; the target relative position relationship is then determined based on the corrected positional relationship of the three.
Alternatively, the incident direction of the first light source 4 is first calculated from the included angle between the target object 1 and its shadow 2, and the included angle between the first light source 4 and the target object 1 is then used for fine correction; the target relative position relationship is then determined based on the corrected positional relationship of the three.
Generally, following a natural law, the relative positional relationship of the targets indicates that the three are on a straight line.
Correspondingly, the relative position relationship of the target includes the relative position relationship of the first light source, the target object and the shadow thereof on a straight line.
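The following sketch illustrates one way the target relative position relationship could be computed from the recognized coordinates (the function names and the collinearity tolerance are assumptions of this example); each angle is measured between the corresponding connecting line and the horizontal reference line through the target object:

```python
import math

def line_angle(p_from, p_to):
    """Angle of the connecting line from p_from to p_to against the horizontal, in degrees."""
    return math.degrees(math.atan2(p_to[1] - p_from[1], p_to[0] - p_from[0]))

def target_relative_positions(light_xy, obj_xy, shadow_xy, tol_deg=5.0):
    """Return the light-object angle, the object-shadow angle, and whether the
    three elements lie approximately on one straight line, as the natural law
    described above expects."""
    light_angle = line_angle(obj_xy, light_xy)
    shadow_angle = line_angle(obj_xy, shadow_xy)
    # On a straight line, the two directions seen from the object differ by ~180 degrees.
    collinear = abs(((light_angle - shadow_angle) % 360.0) - 180.0) <= tol_deg
    return {"light_angle": light_angle, "shadow_angle": shadow_angle, "collinear": collinear}
```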
With this control method, the shadow effect can be adjusted in an image lit by a light source. To preserve a realistic shooting effect, the relative position relationship among the light source, the target object and its shadow is used as the basis for the adjustment. Therefore, in this embodiment, once the light source, the target object and its shadow are recognized, the target relative position relationship among the three is determined.
In the flow of the control method according to another embodiment of the present application, after step S2, the method further includes:
step B1: and determining the first position of the first light source according to the target relative position relation and the second angle.
Step B2: the first light source is adjusted to a first position.
In the natural law, the position of the light source affects the orientation of the shadow, and the position of the light source and the orientation of the shadow are opposite to each other based on the irradiated object. For example, the light source shines on the object from the left side, and the shadow is oriented to the right. Therefore, when the angle between the shadow and the target object changes, the orientation of the shadow also changes, and correspondingly, the position of the light source also changes according to the projection principle.
Therefore, in this embodiment, according to the natural law, after the included angle between the target object and the shadow thereof is adjusted to the second angle, the position of the first light source is adjusted based on the target relative position relationship.
In this embodiment, taking the target relative position relationship as a reference means, in practice: from the positional relationship of the recognized first light source, target object and shadow in the two-dimensional plane, the corresponding positional relationship in the three-dimensional world is calculated, so as to form a nonlinear adjustment formula that obeys the natural law relating the light source and the shadow.
Optionally, an adjustment formula is generated according to the relative position relationship of the target, so that the position of the light source is adjusted according to the adjustment formula.
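A sketch of what such an adjustment formula could look like under the straight-line assumption (keeping the previous light-object distance is a choice made for this example, not something the embodiment prescribes): after the shadow has been moved to the second angle, the first light source is placed on the opposite side of the target object, so that the three elements stay collinear.

```python
import math

def first_light_position(obj_xy, old_light_xy, second_angle_deg):
    """New position of the first light source after the shadow angle changed:
    opposite direction to the new shadow, at the previous light-object distance."""
    dist = math.hypot(old_light_xy[0] - obj_xy[0], old_light_xy[1] - obj_xy[1])
    direction = math.radians(second_angle_deg + 180.0)  # opposite the shadow direction
    return (obj_xy[0] + dist * math.cos(direction),
            obj_xy[1] + dist * math.sin(direction))
```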
In this embodiment, in the image irradiated by the light source, the angle of the included angle between the target object and the shadow thereof is adjusted based on the natural law, and the position of the first light source is adjusted at the same time, so that the image content in the image is more real. Therefore, the embodiment can ensure the real effect of the image while simplifying the user operation.
In the flow of the control method according to another embodiment of the present application, in step S2, after responding to the first input and before adjusting the included angle between the target object and its shadow to the second angle, the method further includes:
substep C1: the first light source is adjusted to a second position. Wherein the second position is associated with the first angle.
In this embodiment, another method for adjusting the angle between the target object and its shadow is provided.
In this embodiment, the user adjusts the angle of the included angle between the two screens through the first input, and correspondingly, the first light source in the target interface changes the position along with the change of the angle of the included angle between the two screens.
Optionally, the angle between the two screens is set in advance in association with the position of the first light source.
For example, as the angle between the two screens becomes larger, the position of the first light source changes to the left; conversely, as the angle between the two screens becomes smaller, the position of the first light source changes to the right.
For another example, when the included angle between the two screens is a first angle, the position of the first light source is correspondingly a second position.
Substep C2: and determining a second angle according to the relative position relation of the target and the second position.
In the natural law, the orientation of the shadow is influenced by the position of the light source, and the position of the light source and the orientation of the shadow are opposite. Therefore, when the position of the light source changes, the orientation of the shadow changes, and correspondingly, the angle between the shadow and the target object changes.
Therefore, in this embodiment, according to the natural law, after the position of the first light source is adjusted, the included angle between the target object and the shadow thereof is adjusted based on the relative position relationship of the target.
Optionally, an adjustment formula is generated according to the relative position relationship of the target, so that the included angle between the target object and the shadow thereof is adjusted according to the adjustment formula.
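A sketch of this alternative ordering (the 1:1 mapping from fold angle to light direction and the fixed radius are assumptions of this example): the first input moves the first light source along an arc around the target object, and the second angle is then read off as the direction opposite the light source.

```python
import math

def adjust_via_light_source(obj_xy, first_angle_deg, light_radius):
    """Map the fold angle to a light direction, place the first light source on
    an arc of radius light_radius around the target object (the second position),
    and derive the second angle as the opposite direction."""
    light_angle_deg = first_angle_deg  # assumed 1:1 mapping
    rad = math.radians(light_angle_deg)
    second_position = (obj_xy[0] + light_radius * math.cos(rad),
                       obj_xy[1] + light_radius * math.sin(rad))
    second_angle_deg = (light_angle_deg + 180.0) % 360.0
    return second_position, second_angle_deg
```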
In this embodiment, the user may further adjust the position of the first light source through the first input, so as to adjust an included angle between the target object and the shadow thereof in the target interface based on the change of the position of the first light source, thereby providing a further shortcut operation for the user to simplify the user operation. In addition, in the image irradiated by the light source, the position of the first light source is adjusted according to the natural law, and meanwhile, the included angle between the target object and the shadow thereof is correspondingly adjusted, so that the image content of the image is more real. Therefore, the embodiment can ensure the real effect of the image while simplifying the user operation.
In the flow of the control method according to another embodiment of the present application, after step S2, the method further includes:
step D1: and adjusting the shape of the shadow according to the second angle.
Wherein the shape includes at least a length and a contour.
On the one hand, in the natural law, the sun illumination is related to the length of the shadow, and the smaller the sun altitude angle is, the longer the shadow is, otherwise, the shorter the shadow is. Therefore, when the angle of the included angle between the target object and the shadow thereof changes, the solar altitude also changes according to the natural law, so that the length of the shadow is required to be adjusted while the angle of the included angle between the target object and the shadow thereof is adjusted.
On the other hand, along with the change of the included angle between the target object and the shadow thereof, the shadow can be distorted to a certain extent according to the viewport rule of the real world. Therefore, when the included angle between the target object and the shadow thereof is adjusted, the outline of the shadow is also required to be adjusted according to the natural law, so that the outline can show the distortion change.
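The length relation above is the usual projection formula: for an object of height h lit by a sun at altitude angle a, the shadow length is h / tan(a), so a lower sun casts a longer shadow. A short sketch (the object height and altitude values are illustrative assumptions):

```python
import math

def shadow_length(object_height, sun_altitude_deg):
    """Shadow length of an object of the given height under a sun at the
    given altitude angle: length = height / tan(altitude)."""
    return object_height / math.tan(math.radians(sun_altitude_deg))

# A lower sun altitude yields a longer shadow, e.g. for a 1.7 m tall person:
# shadow_length(1.7, 60) ~= 0.98 m,  shadow_length(1.7, 20) ~= 4.67 m
```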
Referring to FIG. 3 and FIG. 4, when the included angle between the target object (a person) and its shadow changes, the position of the first light source (the sun) changes, and the length and contour of the shadow change accordingly.
In the embodiment, the shape of the shadow is adjusted, so that the whole image scene conforms to the objective law of the real world, and therefore, the embodiment can ensure the real effect of the image while simplifying the operation of a user.
In the flow of the control method according to another embodiment of the present application, before step S1, the method further includes:
step E1: in the event that a target object is identified in the target interface, a second input is received from the user.
The second input includes touch input and contactless (air-gesture) input performed by the user on the screen, including but not limited to tap and slide inputs; the second input also includes the user's input on physical keys of the device, including but not limited to presses. Likewise, the second input may consist of one or more inputs, and the multiple inputs may be continuous or intermittent.
The second input is used for adding a second light source in the target interface, so that a light and shadow effect can be presented in the target interface.
In one application scenario, for example, the camera is started to shoot a portrait, and the shooting preview interface serves as the target interface; when a target object is identified in the shooting preview interface, the user is prompted to perform light-and-shadow simulation. Based on the prompt, the user can trigger the light-and-shadow simulation control through the second input, add the second light source to the target interface, and enable the light-and-shadow simulation function. Within this function, the implementation that changes the orientation of the shadow by using the included-angle characteristic of the folding screen is enabled.
Step E2: in response to a second input, a second light source is generated in the target interface.
The second light source is a simulated light source, which includes at least one of a light-emitting object and a light ray.
Optionally, the second light source is randomly positioned in the target interface.
Step E3: and generating a shadow corresponding to the target object in the target interface according to the first relative position relation between the second light source and the target object and the outline of the target object.
Optionally, the position of the shadow is determined according to a relative relationship between the second light source and the target object, so that the shadow, the second light source and the target object are located on the same straight line.
Further, after the position of the shadow is determined, the shadow of the target object is drawn according to the outline of the target object.
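One simple way the shadow could be drawn from the contour is sketched below (the flip-and-rotate approach and all names are assumptions of this example, not the drawing method prescribed by the embodiment): the object silhouette is rotated about its foot point so that it extends along the direction pointing away from the second light source, and optionally scaled to the desired length, which keeps the light source, the target object and the shadow on one straight line.

```python
import math

def draw_simulated_shadow(light_xy, foot_xy, obj_contour, length_scale=1.0):
    """Rotate the object contour about its foot point from its own 'up' direction
    to the direction pointing away from the simulated light source, scaling it by
    length_scale, and return the resulting shadow contour."""
    # Direction from the light source through the foot point (the shadow side).
    away = math.atan2(foot_xy[1] - light_xy[1], foot_xy[0] - light_xy[0])
    # Direction from the foot point towards the object body, estimated from the contour centroid.
    cx = sum(p[0] for p in obj_contour) / len(obj_contour)
    cy = sum(p[1] for p in obj_contour) / len(obj_contour)
    up = math.atan2(cy - foot_xy[1], cx - foot_xy[0])
    rot = away - up
    cos_r, sin_r = math.cos(rot), math.sin(rot)
    shadow = []
    for (x, y) in obj_contour:
        dx = (x - foot_xy[0]) * length_scale
        dy = (y - foot_xy[1]) * length_scale
        shadow.append((foot_xy[0] + dx * cos_r - dy * sin_r,
                       foot_xy[1] + dx * sin_r + dy * cos_r))
    return shadow
```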
Optionally, based on the simulated second light source, the target object and its shadow, the user may adjust the position of the three through the first input.
In this embodiment, in a shooting scene, a light source can be simulated and a shadow can be drawn, creating for a single picture the effect of a real-world scene lit by light. This provides a novel way of shooting images and improves the user's shooting experience.
In addition, the embodiment of the application can also be applied to an image editing scene to perform light and shadow simulation on images without light rays and shadows, so that stereoscopic impression and sense of reality are increased.
In the flow of the control method according to another embodiment of the present application, after step S2, the user may click the "take picture" button to complete the taking picture. Alternatively, the user may also click on the "save" button to complete the image editing.
The embodiments of the present application are preferably applied to shooting and editing scenes of portrait images, so that the desired shadow effect is presented in the portrait.
In summary, the embodiments of the present application provide a method for dynamically adjusting the shadow of a portrait based on a folding screen. It solves the problem that the user often has to move the camera position because the angle between the subject and its shadow cannot be adjusted while shooting, allows the user to use light and shadow flexibly, and improves the rate of usable shots taken in sunlight. On the one hand, when shooting in a sunlit scene, the included angle between the person and the person's shadow is dynamically adjusted through the folding angle of the folding screen, completing the panoramic composition of the person and the shadow at shooting time; on the other hand, when shooting in a scene with weak sunlight, a light source is simulated, the incident angle between the light source and the person is calculated, the included angle and the area between the shadow and the person are then calculated, and a shadow map is formed in the picture, increasing the stereoscopic impression and the sense of realism.
The control method of this embodiment not only makes the contour of the shadow clear, but also reshapes the shadow based on the adjustment of the included angle between the target object and its shadow, so as to meet different user requirements and achieve the desired effect.
It should be noted that, for the control method provided in the embodiments of the present application, the execution subject may be the electronic device, or a control module in the electronic device for executing the control method. In the embodiments of the present application, an electronic device executing the control method is taken as an example to describe the electronic device provided by the embodiments of the present application.
Fig. 5 shows a block diagram of an electronic device of another embodiment of the present application, including:
a first input receiving module 10, configured to receive a first input of a user when a target object and a shadow thereof are identified in a target interface; the first input is used for adjusting an included angle between two screens of the electronic equipment to a first angle;
a first input response module 20, configured to adjust an included angle between the target object and a shadow thereof to a second angle in response to the first input; the second angle is associated with the first angle.
In this way, in the embodiment of the present application, when the user shoots, if there is a light source, such as sunlight, in the shooting scene, the target object and its shadow in the shooting scene can be shot. Therefore, in the case where the target interface of the present embodiment is the shooting preview interface, the target object and the shadow thereof can be recognized. Further, the user can adjust the included angle between the two screens through a preset first input. Therefore, on the basis of presetting the relevance of the included angle between the two screens and the included angle between the target object and the shadow thereof, when the user adjusts the included angle between the two screens to a first angle, correspondingly, the included angle between the target object and the shadow thereof is also adjusted to a second angle relevant to the first angle. Therefore, in the shooting process, the user can preview the display effect of the target object and the shadow thereof under different position relationships by adjusting the included angle between the two screens, so that the user is prevented from continuously adjusting the shooting position, and the user operation is simplified.
Optionally, the apparatus further comprises:
the identification module is used for identifying a first light source forming a shadow in a target interface;
the first determining module is used for determining a target relative position relation among the first light source, the target object and the shadow thereof.
Optionally, the apparatus further comprises:
the second determining module is used for determining the first position of the first light source according to the target relative position relation and the second angle;
the first adjusting module is used for adjusting the first light source to a first position.
Optionally, the apparatus further comprises:
the second adjusting module is used for adjusting the first light source to a second position; the second position is associated with the first angle;
and the third determining module is used for determining a second angle according to the relative position relation of the target and the second position.
Optionally, the apparatus further comprises:
the third adjusting module is used for adjusting the shape of the shadow according to the second angle; the shape includes at least a length and a contour.
Optionally, the apparatus further comprises:
the second input receiving module is used for receiving second input of the user under the condition that the target object is identified in the target interface;
a second input response module to generate a second light source in the target interface in response to a second input;
and the generating module is used for generating a shadow corresponding to the target object in the target interface according to the first relative position relation between the second light source and the target object and the outline of the target object.
The electronic device in the embodiment of the present application may be an apparatus, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The electronic device in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, and the embodiments of the present application are not specifically limited in this respect.
The electronic device provided by the embodiment of the application can implement each process implemented by the method embodiment, and is not described herein again to avoid repetition.
Optionally, as shown in fig. 6, an electronic device 100 is further provided in this embodiment of the present application, and includes a processor 101, a memory 102, and a program or an instruction stored in the memory 102 and executable on the processor 101, where the program or the instruction is executed by the processor 101 to implement each process of any one of the above embodiments of the control method, and can achieve the same technical effect, and no further description is provided here to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 7 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 1000 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, a processor 1010, and the like.
Those skilled in the art will appreciate that the electronic device 1000 may further comprise a power source (e.g., a battery) for supplying power to various components, and the power source may be logically connected to the processor 1010 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 7 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is omitted here.
The user input unit 1007 is configured to receive a first input of a user when a target object and a shadow thereof are identified in a target interface; the first input is used for adjusting an included angle between two screens of the electronic device 1000 to a first angle; a processor 1010, configured to adjust an included angle between the target object and a shadow thereof to a second angle in response to the first input; the second angle is associated with the first angle.
In this way, in the embodiment of the present application, when the user shoots, if there is a light source, such as sunlight, in the shooting scene, the target object and its shadow in the shooting scene can be shot. Therefore, in the case where the target interface of the present embodiment is the shooting preview interface, the target object and the shadow thereof can be recognized. Further, the user can adjust the included angle between the two screens through a preset first input. Accordingly, on the basis of presetting the relevance of the included angle between the two screens and the included angle between the target object and the shadow thereof, when the user adjusts the included angle between the two screens to a first angle, correspondingly, the included angle between the target object and the shadow thereof is also adjusted to a second angle relevant to the first angle. Therefore, in the shooting process, the user can preview the display effect of the target object and the shadow thereof under different position relations by adjusting the included angle between the two screens, so that the user is prevented from continuously adjusting the shooting position, and the user operation is simplified.
Optionally, a processor 1010 further configured to identify a first light source forming the shadow in the target interface; and determining the relative position relation of the first light source, the target object and the shadow thereof.
Optionally, the processor 1010 is further configured to determine a first position of the first light source according to the target relative position relationship and the second angle; adjusting the first light source to the first position.
Optionally, the processor 1010 is further configured to adjust the first light source to a second position; the second position is associated with the first angle; and determining the second angle according to the target relative position relation and the second position.
Optionally, the processor 1010 is further configured to adjust a shape of the shadow according to the second angle; the shape includes at least a length and a profile.
Optionally, the user input unit 1007 is further configured to receive a second input of the user if the target object is identified in the target interface; a processor 1010 further configured to generate a second light source in the target interface in response to the second input; and generating a shadow corresponding to the target object in the target interface according to the first relative position relation between the second light source and the target object and the outline of the target object.
In summary, the embodiments of the present application provide a method for dynamically adjusting the shadow of a portrait based on a folding screen. It solves the problem that the user often has to move the camera position because the angle between the subject and its shadow cannot be adjusted while shooting, allows the user to use light and shadow flexibly, and improves the rate of usable shots taken in sunlight. On the one hand, when shooting in a sunlit scene, the included angle between the person and the person's shadow is dynamically adjusted through the folding angle of the folding screen, completing the panoramic composition of the person and the shadow at shooting time; on the other hand, when shooting in a scene with weak sunlight, a light source is simulated, the incident angle between the light source and the person is calculated, the included angle and the area between the shadow and the person are then calculated, and a shadow map is formed in the picture, increasing the stereoscopic impression and the sense of realism.
The control method of this embodiment not only makes the contour of the shadow clear, but also reshapes the shadow based on the adjustment of the included angle between the target object and its shadow, so as to meet different user requirements and achieve the desired effect.
It should be understood that in the embodiment of the present application, the input Unit 1004 may include a Graphics Processing Unit (GPU) 10041 and a microphone 10042, and the Graphics Processing Unit 10041 processes image data of still pictures or videos obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 1006 may include a display panel 10061, and the display panel 10061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 1007 includes a touch panel 10071 and other input devices 10072. The touch panel 10071 is also referred to as a touch screen. The touch panel 10071 may include two parts, a touch detection device and a touch controller. Other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein. The memory 1009 may be used to store software programs as well as various data, including but not limited to application programs and operating systems. Processor 1010 may integrate an application processor that handles primarily operating systems, user interfaces, applications, etc. and a modem processor that handles primarily wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 1010.
Embodiments of the present application further provide a readable storage medium, where a program or an instruction is stored, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of any one of the foregoing control method embodiments, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of any of the above control method embodiments, and can achieve the same technical effect, and is not described herein again to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatuses in the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions recited, e.g., the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A control method, characterized in that the method comprises:
receiving a first input of a user under the condition that a target object and a shadow thereof are identified in a target interface; the first input is used for adjusting an included angle between two screens of the electronic equipment to a first angle;
responding to the first input, and adjusting an included angle between the target object and a shadow thereof to a second angle; the second angle is associated with the first angle;
wherein the included angle between the two screens is associated with the included angle between the target object and the shadow thereof.
2. The method of claim 1, wherein prior to receiving the first input from the user, further comprising:
identifying a first light source forming the shadow in the target interface;
and determining the relative position relation of the first light source, the target object and the shadow thereof.
3. The method of claim 2, wherein after adjusting the angle between the target object and its shadow to a second angle, further comprising:
determining a first position of the first light source according to the target relative position relation and the second angle;
adjusting the first light source to the first position.
4. The method of claim 2, wherein after said responding to said first input and before said adjusting the angle between said target object and its shadow to a second angle, further comprises:
adjusting the first light source to a second position; the second position is associated with the first angle;
and determining the second angle according to the target relative position relation and the second position.
5. The method of claim 1, wherein after adjusting the angle between the target object and its shadow to a second angle, further comprising:
adjusting the shape of the shadow according to the second angle; the shape includes at least a length and a contour.
6. The method of claim 1, wherein before receiving the first input from the user in the case that the target object and the shadow thereof are identified in the target interface, the method further comprises:
receiving a second input of the user in the case that the target object is identified in the target interface;
generating a second light source in the target interface in response to the second input;
generating a shadow corresponding to the target object in the target interface according to a first relative position relation between the second light source and the target object and the outline of the target object.
7. An electronic device, characterized in that the device comprises:
the first input receiving module is used for receiving a first input of a user under the condition that a target object and a shadow thereof are identified in a target interface; the first input is used for adjusting an included angle between two screens of the electronic equipment to a first angle;
the first input response module is used for responding to the first input and adjusting the included angle between the target object and the shadow thereof to a second angle; the second angle is associated with the first angle;
wherein the included angle between the two screens is associated with the included angle between the target object and the shadow thereof.
8. The apparatus of claim 7, further comprising:
the identification module is used for identifying a first light source forming the shadow in the target interface;
and the first determining module is used for determining the relative position relation of the target among the first light source, the target object and the shadow thereof.
9. The apparatus of claim 8, further comprising:
the second determining module is used for determining the first position of the first light source according to the target relative position relation and the second angle;
the first adjusting module is used for adjusting the first light source to the first position.
10. An electronic device comprising a processor, a memory and a program or instructions stored on the memory and executable on the processor, the program or instructions, when executed by the processor, implementing the steps of the control method according to any one of claims 1 to 6.
CN202110120862.8A 2021-01-28 2021-01-28 Control method and electronic device Active CN112887621B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110120862.8A CN112887621B (en) 2021-01-28 2021-01-28 Control method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110120862.8A CN112887621B (en) 2021-01-28 2021-01-28 Control method and electronic device

Publications (2)

Publication Number Publication Date
CN112887621A CN112887621A (en) 2021-06-01
CN112887621B true CN112887621B (en) 2022-07-12

Family

ID=76053140

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110120862.8A Active CN112887621B (en) 2021-01-28 2021-01-28 Control method and electronic device

Country Status (1)

Country Link
CN (1) CN112887621B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103902804A (en) * 2012-12-27 2014-07-02 索尼电脑娱乐美国公司 Shadow type video game system and method based on previous game player

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100407120C (en) * 2005-02-03 2008-07-30 东芝松下显示技术有限公司 Display device including function to input information from screen by light
JP3926828B1 (en) * 2006-01-26 2007-06-06 株式会社コナミデジタルエンタテインメント GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
JP6495424B1 (en) * 2017-12-06 2019-04-03 東芝エレベータ株式会社 Image detection system
CN110177167A (en) * 2019-05-17 2019-08-27 珠海格力电器股份有限公司 Unlocking method and device
CN110647274A (en) * 2019-08-15 2020-01-03 华为技术有限公司 Interface display method and equipment

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103902804A (en) * 2012-12-27 2014-07-02 索尼电脑娱乐美国公司 Shadow type video game system and method based on previous game player

Also Published As

Publication number Publication date
CN112887621A (en) 2021-06-01

Similar Documents

Publication Publication Date Title
CN112135046B (en) Video shooting method, video shooting device and electronic equipment
CN113329172B (en) Shooting method and device and electronic equipment
CN112532882B (en) Image display method and device
CN112532881A (en) Image processing method and device and electronic equipment
CN112738403A (en) Photographing method, photographing apparatus, electronic device, and medium
CN112492201A (en) Photographing method and device and electronic equipment
CN112511743B (en) Video shooting method and device
JP2023526806A (en) Anti-shake method, anti-shake device and electronic device
CN112437232A (en) Shooting method, shooting device, electronic equipment and readable storage medium
CN112153281A (en) Image processing method and device
CN112887621B (en) Control method and electronic device
CN114025100B (en) Shooting method, shooting device, electronic equipment and readable storage medium
CN115529405A (en) Image display method and device of front camera
CN115499589A (en) Shooting method, shooting device, electronic equipment and medium
CN112702533B (en) Sight line correction method and sight line correction device
CN114241127A (en) Panoramic image generation method and device, electronic equipment and medium
CN113873168A (en) Shooting method, shooting device, electronic equipment and medium
CN114245017A (en) Shooting method and device and electronic equipment
CN113891002A (en) Shooting method and device
CN112561787A (en) Image processing method, image processing device, electronic equipment and storage medium
CN112702527A (en) Image shooting method and device and electronic equipment
CN112367468B (en) Image processing method and device and electronic equipment
CN113489901B (en) Shooting method and device thereof
CN117097982B (en) Target detection method and system
CN114143461B (en) Shooting method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant