CN111901529A - Target object shooting method and device - Google Patents

Target object shooting method and device

Info

Publication number
CN111901529A
Authority
CN
China
Prior art keywords
virtual
ray
shooting
light
virtual light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010828112.1A
Other languages
Chinese (zh)
Inventor
王园
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gree Electric Appliances Inc of Zhuhai
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai filed Critical Gree Electric Appliances Inc of Zhuhai
Priority to CN202010828112.1A
Publication of CN111901529A
Pending legal-status Critical Current

Links

Images

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/74 Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means

Abstract

The invention discloses a target object shooting method and device. The method comprises the following steps: in a shooting mode, acquiring virtual light control information, wherein the virtual light control information is used for inserting at least one virtual light into a current shooting picture and controlling the at least one virtual light to adjust the shooting environment of the current shooting picture; and shooting the target object based on the virtual light control information. The invention solves the technical problem in the prior art that shooting light cannot be freely controlled while a user shoots with a terminal.

Description

Target object shooting method and device
Technical Field
The invention relates to the technical field of shooting control, in particular to a target object shooting method and device.
Background
Light is essential to photography; as the saying goes, 'photography is the art of using light'. Light satisfies the basic illumination requirement for photographing an object, and different lighting has an obvious influence on the final photographic effect of the photographed subject. However, when a mobile phone is used for photographing in daily life, suitable light often cannot be captured, clear pictures cannot be taken in dim environments, and satisfactory photographic works frequently cannot be obtained.
In the prior art, a user can select among different illumination modes, such as natural light, studio light, contour light, and stage light, while shooting with a mobile phone, but the existing illumination modes are limited and offer poor controllability, so the user still cannot obtain satisfactory photographic works.
In addition, most existing light adjustment operations are realized only in picture editing after shooting; they merely apply secondary adjustment and modification to pictures that have already been taken, and the shooting light cannot be freely controlled during the shooting process to achieve an effect that satisfies the user.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiments of the invention provide a target object shooting method and device, which at least solve the technical problem in the prior art that shooting light cannot be freely controlled while a user shoots with a terminal.
According to an aspect of an embodiment of the present invention, there is provided a target object photographing method including: in a shooting mode, acquiring virtual light control information, wherein the virtual light control information is used for inserting at least one virtual light into a current shooting picture and controlling the at least one virtual light to adjust the shooting environment of the current shooting picture; and shooting the target object based on the virtual light control information.
Optionally, the obtaining the virtual light control information includes: and acquiring the category information of each virtual ray in the at least one virtual ray in response to a first touch operation acting on a first touch area, wherein the first touch area is used for providing a plurality of alternative virtual ray categories.
Optionally, the obtaining the virtual light control information includes: responding to a second touch operation acting on a second touch area, and acquiring an insertion starting position and an insertion ending position of each virtual ray in the at least one virtual ray; determining irradiation information of each virtual ray in the at least one virtual ray based on the insertion start position and the insertion end position, wherein the second touch area is located in a shooting area of the current shooting picture, and the irradiation information includes at least one of: irradiation direction, irradiation range.
Optionally, determining the irradiation information of each of the at least one virtual ray based on the insertion start position and the insertion end position includes: determining an irradiation direction of each of the at least one virtual ray by using a direction change of the insertion end position with respect to the insertion start position; and acquiring a connecting line between the insertion starting position and the insertion ending position, and determining the irradiation range of each virtual ray in the at least one virtual ray by using the length change of the connecting line.
Optionally, the obtaining the virtual light control information includes: and responding to a third touch operation acting on a third touch area to obtain the intensity information of each virtual ray in the at least one virtual ray, wherein the third touch area is used for providing an adjusting range of the intensity of the virtual ray.
Optionally, shooting the target object based on the virtual light control information includes: selecting the at least one virtual ray based on the category information of each virtual ray in the at least one virtual ray, and inserting the at least one virtual ray into the current shooting picture to obtain a first processing result; adjusting the irradiation direction and the irradiation range of part or all of the virtual rays in the first processing result based on the irradiation information of each virtual ray in the at least one virtual ray to obtain a second processing result; adjusting the light intensity of part or all of the virtual light rays in the second processing result based on the intensity information of each virtual light ray in the at least one virtual light ray to obtain a third processing result; and executing shooting operation based on the third processing result to obtain a target shooting picture.
According to another aspect of the embodiments of the present invention, there is also provided a target object photographing apparatus, including: an acquisition module, configured to acquire virtual light control information in a shooting mode, wherein the virtual light control information is used for inserting at least one virtual light into a current shooting picture and controlling the at least one virtual light to adjust the shooting environment of the current shooting picture; and a shooting module, configured to shoot the target object based on the virtual light control information.
Optionally, the obtaining module is configured to obtain category information of each of the at least one virtual ray in response to a first touch operation applied to a first touch area, where the first touch area is configured to provide multiple candidate virtual ray categories.
Optionally, the obtaining module is configured to obtain an insertion start position and an insertion end position of each virtual ray in the at least one virtual ray in response to a second touch operation applied to a second touch area; determining irradiation information of each virtual ray in the at least one virtual ray based on the insertion start position and the insertion end position, wherein the second touch area is located in a shooting area of the current shooting picture, and the irradiation information includes at least one of: irradiation direction, irradiation range.
Optionally, the obtaining module is configured to determine an irradiation direction of each of the at least one virtual ray according to a direction change of the insertion ending position relative to the insertion starting position; and acquiring a connecting line between the insertion starting position and the insertion ending position, and determining the irradiation range of each virtual ray in the at least one virtual ray by using the length change of the connecting line.
Optionally, the obtaining module is configured to obtain intensity information of each of the at least one virtual ray in response to a third touch operation applied to a third touch area, where the third touch area is configured to provide an adjustment range of the intensity of the virtual ray.
Optionally, the shooting module is configured to select the at least one virtual ray based on the category information of each virtual ray in the at least one virtual ray, and insert the at least one virtual ray into the current shooting picture to obtain a first processing result; adjusting the irradiation direction and the irradiation range of part or all of the virtual rays in the first processing result based on the irradiation information of each virtual ray in the at least one virtual ray to obtain a second processing result; adjusting the light intensity of part or all of the virtual light rays in the second processing result based on the intensity information of each virtual light ray in the at least one virtual light ray to obtain a third processing result; and executing shooting operation based on the third processing result to obtain a target shooting picture.
According to another aspect of the embodiments of the present invention, there is also provided a non-volatile storage medium having a computer program stored therein, wherein the computer program is configured to execute the target object photographing method described in any one of the above when the computer program runs.
According to another aspect of the embodiments of the present invention, there is also provided a processor for executing a program, wherein the program is configured to execute any one of the above target object shooting methods when executed.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device, including a memory and a processor, wherein the memory stores a computer program, and the processor is configured to execute the computer program to perform any one of the above target object shooting methods.
In the embodiments of the present invention, in a shooting mode, virtual light control information is obtained, where the virtual light control information is used to insert at least one virtual light into a current shooting picture and to control the at least one virtual light to adjust the shooting environment of the current shooting picture; the target object is then shot based on the virtual light control information. This achieves the purpose of freely controlling the shooting light while the user shoots with a terminal, makes adjusting the light more convenient, and improves the rate of usable shots, thereby realizing the technical effect of capturing the user's ideal photographic works and solving the technical problem in the prior art that shooting light cannot be freely controlled while a user shoots with a terminal.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
fig. 1 is a flowchart of a target object photographing method according to an embodiment of the present invention;
FIG. 2 is an interface diagram of an alternative current shooting picture according to an embodiment of the present invention;
FIG. 3 is an interface diagram of another alternative current shooting picture according to an embodiment of the present invention;
FIG. 4 is an interface diagram of yet another alternative current shooting picture according to an embodiment of the present invention;
FIG. 5 is a flow chart of an alternative target object capture method according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a target object photographing apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
According to an embodiment of the present invention, an embodiment of a target object photographing method is provided. It should be noted that the steps shown in the flowchart of the drawings may be executed in a computer system, for example as a set of computer-executable instructions, and that, although a logical order is shown in the flowchart, in some cases the steps shown or described may be executed in an order different from the one given here.
Fig. 1 is a flowchart of a target object photographing method according to an embodiment of the present invention; as shown in fig. 1, the method includes the following steps:
step S102, acquiring virtual light control information in a shooting mode;
optionally, the virtual light control information is used to insert at least one virtual light into a current captured image and control the at least one virtual light to adjust a capturing environment of the current captured image.
And step S104, shooting the target object based on the virtual light control information.
In the embodiments of the present invention, in a shooting mode, virtual light control information is obtained, where the virtual light control information is used to insert at least one virtual light into a current shooting picture and to control the at least one virtual light to adjust the shooting environment of the current shooting picture; the target object is then shot based on the virtual light control information. This achieves the purpose of freely controlling the shooting light while the user shoots with a terminal, makes adjusting the light more convenient, and improves the rate of usable shots, thereby realizing the technical effect of capturing the user's ideal photographic works and solving the technical problem in the prior art that shooting light cannot be freely controlled while a user shoots with a terminal.
It should be noted that the target object shooting method provided by the embodiments of the application can be applied to, but is not limited to, scenes in which a user uses a terminal device to shoot a target object; it helps the user freely control virtual light during shooting to address problems such as insufficient light, weak light layering, and poor lighting effects. Optionally, the terminal device may be a smartphone, an iPad, a smart wearable device, or the like.
Taking a smartphone as an example of the terminal device, a user can operate the shooting interface to insert at least one virtual light into the current shooting picture: the user clicks the "light" button in the shooting interface shown in fig. 2, at least one virtual light is inserted into the current shooting picture, and the virtual light is controlled to adjust the shooting environment of the current shooting picture, for example by adjusting the irradiation angle, irradiation direction, irradiation area, light length, light intensity, and so on. By freely adjusting the virtual light during shooting, a satisfactory illumination effect can be achieved and the user's ideal photographic work can be captured. A minimal sketch of the control information such an interface might gather follows.
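As an illustration only (the patent does not prescribe any implementation), the virtual light control information described above could be modeled roughly as follows in Kotlin. All class, field, and enum names here are assumptions made for this sketch rather than terms from the application.

```kotlin
// Hypothetical data model for the virtual light control information gathered in the
// shooting mode; the names and value ranges are illustrative, not taken from the patent.
enum class VirtualLightCategory { SUNLIGHT, NATURAL, HALATION, TWILIGHT, MIDNIGHT, EXTREME, DIM, RAINBOW }

// Simple 2D point in preview coordinates (defined here to keep the sketch self-contained).
data class PointF(val x: Float, val y: Float)

data class VirtualLight(
    val category: VirtualLightCategory,  // chosen via the first touch area
    val insertStart: PointF,             // insertion start position in the current shooting picture
    val insertEnd: PointF,               // insertion end position in the current shooting picture
    val intensity: Float                 // 0.0 (dimmest) .. 1.0 (brightest), from the third touch area
)

data class VirtualLightControlInfo(val lights: List<VirtualLight>)
```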
In an optional embodiment, the obtaining the virtual light control information includes:
step S202, in response to a first touch operation performed on a first touch area, obtaining category information of each virtual ray in the at least one virtual ray, where the first touch area is used to provide multiple candidate virtual ray categories.
Optionally, the first touch area may be the touch area in which a "light" button is displayed. Still taking a smartphone as the terminal device, as shown in fig. 2, the user performs the first touch operation by clicking the "light" button displayed in the shooting interface, and a server in the smartphone obtains the category information of each of the at least one virtual light in response to the first touch operation.
The first touch area is further configured to provide multiple candidate virtual light categories, for example, any one of the following virtual light categories shown in fig. 3: "sunlight", "natural light", "halation", "twilight", "midnight", "extreme light", "dim light", "rainbow", and the like.
It should be noted that different types of virtual light give the current shooting picture different, rich lighting effects; a user can select an effect that enriches the picture according to the user's own needs and the current shooting environment, and can choose to add different filters to different photos.
Through this embodiment of the application, while the user shoots with a smartphone, virtual lights with different attributes can be inserted into the current shooting picture so that it presents rich and varied illumination effects, satisfying the user's demand for different virtual lighting during shooting, achieving the user's ideal photographic effect, and producing a well-crafted photographic work.
In an optional embodiment, the obtaining the virtual light control information includes:
step S302, in response to a second touch operation applied to a second touch area, obtaining an insertion start position and an insertion end position of each virtual ray in the at least one virtual ray;
step S304, determining irradiation information of each of the at least one virtual ray based on the insertion start position and the insertion end position.
Wherein the second touch area is located in a shooting area of the current shooting picture, and the illumination information includes at least one of: irradiation direction, irradiation range.
Optionally, the second touch operation is used to determine the insertion start position and the insertion end position of each of the at least one virtual light. For example, any one or more virtual light categories may be clicked, according to the particular illumination requirement of the current shooting picture, to select virtual lights of different categories; the at least one virtual light is then inserted into the current shooting picture by clicking any insertion start position in the picture, and the illumination information of each of the at least one virtual light may be determined based on the insertion start position and the insertion end position, as in the sketch below.
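A hedged sketch of how the second touch operation might record these two positions is shown below, reusing the PointF type from the earlier sketch; the event types and class names are invented for illustration and do not correspond to any particular platform API.

```kotlin
// Hypothetical touch handling for the second touch area: the point where the finger goes
// down is taken as the insertion start position, and the point where it lifts is taken as
// the insertion end position.
sealed class TouchEvent {
    data class Down(val at: PointF) : TouchEvent()
    data class Up(val at: PointF) : TouchEvent()
}

class InsertionGestureRecorder {
    private var start: PointF? = null

    // Most recent (insertion start, insertion end) pair recorded from a down/up gesture.
    var lastInsertion: Pair<PointF, PointF>? = null
        private set

    fun onTouch(event: TouchEvent) {
        when (event) {
            is TouchEvent.Down -> start = event.at
            is TouchEvent.Up -> start?.let { lastInsertion = it to event.at }
        }
    }
}
```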
As an alternative embodiment, as shown in fig. 4, the above-mentioned virtual light has an irradiation direction and/or an irradiation range. For example, the irradiation direction of the "halo" light may point from the tail of its arrow to the tip of the arrow, and the irradiation range of the "halo" light may be the region that light covers; likewise, the irradiation direction of the "dim light" may point from the tail of its arrow to the tip of the arrow, and its irradiation range may be the region that light covers.
As another alternative embodiment, in the embodiment of the present application, virtual rays of a plurality of different virtual ray categories may be inserted into the current shooting picture at the same time.
In an optional embodiment, the determining the irradiation information of each of the at least one virtual ray based on the insertion start position and the insertion end position includes:
step S402, determining the irradiation direction of each virtual ray in the at least one virtual ray by using the direction change of the insertion ending position relative to the insertion starting position;
step S404, obtaining a connection line between the insertion start position and the insertion end position, and determining an irradiation range of each of the at least one virtual ray according to a length change of the connection line.
As shown in fig. 4, in the above alternative embodiment of this application, the irradiation directions of the "halo" light and the "dim light" may be determined from the change in direction of the insertion end position relative to the insertion start position; the line connecting the insertion start position and the insertion end position may then be obtained, and the irradiation ranges of the "halo" light and the "dim light" determined from the change in length of that line. A geometric sketch of this computation follows.
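A minimal geometric sketch of steps S402 and S404 is given below, again reusing the earlier PointF type: the irradiation direction is taken from the direction of the drag from the insertion start to the insertion end, and the irradiation range grows with the length of the connecting line. The scale factor is an assumption for illustration.

```kotlin
import kotlin.math.atan2
import kotlin.math.hypot

data class Illumination(val directionDegrees: Float, val rangeRadius: Float)

// Derive the irradiation direction and range from the insertion start and end positions.
fun illuminationFor(start: PointF, end: PointF, rangePerUnitLength: Float = 0.5f): Illumination {
    val dx = end.x - start.x
    val dy = end.y - start.y
    // Step S402: direction change of the insertion end position relative to the start position.
    val direction = Math.toDegrees(atan2(dy.toDouble(), dx.toDouble())).toFloat()
    // Step S404: the length of the connecting line determines how far the light reaches.
    val range = hypot(dx, dy) * rangePerUnitLength
    return Illumination(directionDegrees = direction, rangeRadius = range)
}
```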
Optionally, the irradiation direction refers to the direction from which the virtual light falls on the subject, and includes front light (light cast on the subject from the camera's side), side light (light cast from the side of the subject), backlight (light cast from behind the subject, which can outline the subject's contour), top light (light cast from above the subject), bottom light (light cast from below the subject, opposite to top light), and the like.
It should be noted that a user may select virtual lights with different attributes to insert into the current shooting picture and, according to the shooting requirements of the picture, stretch the arrow of a virtual light and control the arrow's direction and angle, thereby controlling the irradiation direction and irradiation range of the selected virtual light. The photographic light-compensation mode provided by this embodiment of the application is convenient to operate and helps the user freely control the virtual light, so that an ideal photographic effect is achieved.
While shooting with a mobile phone, the user can independently control the irradiation direction and irradiation range of each inserted virtual light. For example, the user can insert any virtual light at different positions in the current shooting picture and make it start at any point and end at any point in the picture. By controlling the irradiation angle and irradiation range of the virtual light according to the varying illumination demands of the current shooting picture, an ideal shooting effect is achieved, the problem that a user cannot capture suitable light or must shoot in dim light is alleviated, the user gains greater control, and various virtual lighting effects can be created that give the current shooting picture a stronger sense of layering.
In an optional embodiment, the obtaining the virtual light control information includes:
step S502, obtaining intensity information of each of the at least one virtual ray in response to a third touch operation applied to a third touch area, where the third touch area is used to provide an adjustment range of the intensity of the virtual ray.
Optionally, as shown in fig. 4, a third touch area of the current shooting picture provides an adjustment range for the virtual light intensity, for example through the sliding progress bar shown in fig. 4: when the user slides to the left, the virtual light is dimmed, and when the user slides to the right, it is brightened. The light intensity is the irradiation intensity of the virtual light and represents its brightness: the higher the intensity, the brighter the illuminated subject and the clearer the color, texture, and other details on its surface. The light intensity is proportional to the energy of the light source; the higher that energy, the higher the intensity of the virtual light and the brighter the illuminated subject. A sketch of mapping the slider to an intensity value follows.
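As a small sketch only, the progress bar in the third touch area could be mapped to an intensity value like this; the value range and linear mapping are assumptions, not requirements of the application.

```kotlin
// Map the third touch area's slider position to a light intensity: sliding left (lower
// progress) dims the virtual light, sliding right (higher progress) brightens it.
fun intensityFromSlider(progress: Int, max: Int = 100): Float {
    require(max > 0) { "slider maximum must be positive" }
    return progress.coerceIn(0, max).toFloat() / max  // 0.0 = dimmest, 1.0 = brightest
}
```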
During photography, light that is too strong causes overexposure, while light that is too weak leaves the picture dark; these are problems frequently encountered in daily photography. This embodiment of the application helps the user quickly adjust the intensity of the inserted virtual light, keeping the light and shade of the current shooting picture soft and satisfying the user's varying illumination demands.
In an alternative embodiment, as shown in fig. 5, shooting the target object based on the virtual light control information includes the following steps (an illustrative code sketch follows the steps):
step S602, selecting the at least one virtual ray based on the category information of each virtual ray in the at least one virtual ray, and inserting the at least one virtual ray into the current shooting picture to obtain a first processing result;
step S604, adjusting the irradiation direction and the irradiation range of some or all of the virtual rays in the first processing result based on the irradiation information of each of the at least one virtual ray, to obtain a second processing result;
step S606, adjusting the light intensity of some or all of the virtual light in the second processing result based on the intensity information of each of the at least one virtual light, to obtain a third processing result;
step S608, performing a shooting operation based on the third processing result to obtain a target shooting picture.
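The four processing results of steps S602 to S608 can be strung together as in the sketch below, which reuses the types assumed in the earlier sketches; the PreviewRenderer interface and its methods are hypothetical placeholders for whatever preview and capture pipeline the terminal actually uses.

```kotlin
// Illustrative pipeline for steps S602-S608; not an implementation prescribed by the patent.
interface PreviewRenderer {
    fun insertLight(category: VirtualLightCategory, at: PointF)
    fun applyIllumination(category: VirtualLightCategory, illumination: Illumination)
    fun applyIntensity(category: VirtualLightCategory, intensity: Float)
    fun capture(): ByteArray  // returns the target shooting picture
}

fun shootTarget(info: VirtualLightControlInfo, renderer: PreviewRenderer): ByteArray {
    for (light in info.lights) {
        // Step S602 (first processing result): insert the selected virtual light.
        renderer.insertLight(light.category, light.insertStart)
        // Step S604 (second processing result): adjust irradiation direction and range.
        renderer.applyIllumination(light.category, illuminationFor(light.insertStart, light.insertEnd))
        // Step S606 (third processing result): adjust the light intensity.
        renderer.applyIntensity(light.category, light.intensity)
    }
    // Step S608: perform the shooting operation on the adjusted preview.
    return renderer.capture()
}
```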
Because illumination has a great influence on shooting, this embodiment of the application relies on three characteristics of the virtual light to assist the user in shooting: light is changeable, and different light affects the final imaging effect of the shot picture. Applying light flexibly when shooting with a mobile phone mainly means that the user can insert multiple lights at any positions in the picture during shooting, freely control the irradiation position, irradiation angle, and irradiation coverage of the light, and preview the real-time illumination effect while shooting, which helps the user capture an ideal photo.
In this embodiment of the application, a user can insert any type of virtual light into the current shooting picture according to the user's needs, and the virtual light can be inserted at any position in the shooting area; different lighting effects are created in the shooting picture, and the irradiation range, irradiation direction, and intensity of the light are controlled to help the user complete the shot. Moreover, because every user has a different sense of beauty, the irradiation direction and irradiation area of the inserted light can be freely adjusted, the intensities of lights with different attributes can be regulated, and different parameters can be tuned according to the current environment and the effect the user wants to achieve, so that a finally satisfying result is obtained.
As an alternative embodiment, an example application scenario is as follows: the user is photographing a person at night; the ambient light is very dim and weak, so the person is not imaged clearly, but the user wants to create a stage-light effect so that the person appears sharper in the lens and looks lit by a spotlight. The user can select one or more virtual lights (stage-style lights) from the different virtual light categories, choose an insertion start position and an insertion end position for the virtual lights in the current shooting picture, set their irradiation direction and irradiation range according to where and at what angle the person is standing, and adjust the required irradiation intensity, so that the current shooting picture achieves a presentation that satisfies the user. A usage sketch of these choices, in terms of the types assumed earlier, is given below.
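Purely as a usage sketch built from the types assumed earlier, the night-portrait scenario could be expressed as follows; the EXTREME category stands in for a stage-light style effect, and the normalized coordinates and intensity are invented values.

```kotlin
// Hypothetical control information for the night-portrait scenario described above.
val nightPortrait = VirtualLightControlInfo(
    lights = listOf(
        VirtualLight(
            category = VirtualLightCategory.EXTREME,  // stand-in for a stage-light effect
            insertStart = PointF(0.45f, 0.15f),       // above the subject, in normalized preview coordinates
            insertEnd = PointF(0.50f, 0.80f),         // dragged down across the subject
            intensity = 0.8f                          // brightened via the third touch area
        )
    )
)
// shootTarget(nightPortrait, renderer) would then insert, adjust, and capture as sketched earlier.
```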
As another alternative embodiment, a further application scenario is as follows: a user shoots in the daytime with ample light, but the lighting in the current shooting picture lacks layering, and the user wants to create a richer lighting effect. The user can select a virtual light such as "halo" to insert at a suitable position in the current shooting picture and adjust its irradiation direction, irradiation range, light intensity, and so on. Further virtual lights can then be inserted into the picture; a special illumination effect on an object or a portrait can be created by choosing particular positions for the lights to illuminate; and inserting several different virtual lights gives the current shooting picture imaging effects on different layers, so that it achieves a presentation that satisfies the user.
Example 2
According to an embodiment of the present invention, there is further provided an apparatus embodiment for implementing the above target object shooting method. Fig. 6 is a schematic structural diagram of a target object shooting apparatus according to an embodiment of the present invention; as shown in fig. 6, the target object shooting apparatus includes an acquisition module 60 and a shooting module 62, wherein:
an obtaining module 60, configured to obtain virtual light control information in a shooting mode, where the virtual light control information is used to insert at least one virtual light in a current shooting picture and control the at least one virtual light to adjust a shooting environment of the current shooting picture; and a shooting module 62 for shooting the target object based on the virtual light control information.
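A minimal sketch, under the same assumptions as the earlier code, of how the acquisition module 60 and the shooting module 62 could be composed; the apparatus simply feeds the gathered control information into the shooting step, mirroring steps S102 to S104.

```kotlin
// Names assumed for illustration; the modules may equally be realized in software or hardware.
class TargetObjectShootingApparatus(
    private val acquisitionModule: () -> VirtualLightControlInfo,        // module 60
    private val shootingModule: (VirtualLightControlInfo) -> ByteArray   // module 62
) {
    fun shoot(): ByteArray = shootingModule(acquisitionModule())
}
```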
In an optional embodiment, the obtaining module is configured to obtain category information of each of the at least one virtual ray in response to a first touch operation applied to a first touch area, where the first touch area is used to provide multiple candidate virtual ray categories.
In an optional embodiment, the obtaining module is configured to obtain an insertion start position and an insertion end position of each of the at least one virtual ray in response to a second touch operation applied to a second touch area; determining irradiation information of each virtual ray in the at least one virtual ray based on the insertion start position and the insertion end position, wherein the second touch area is located in a shooting area of the current shooting picture, and the irradiation information includes at least one of: irradiation direction, irradiation range.
In an optional embodiment, the obtaining module is configured to determine an irradiation direction of each of the at least one virtual light ray by using a direction change of the insertion ending position relative to the insertion starting position; and acquiring a connecting line between the insertion starting position and the insertion ending position, and determining the irradiation range of each virtual ray in the at least one virtual ray by using the length change of the connecting line.
In an optional embodiment, the obtaining module is configured to obtain intensity information of each of the at least one virtual light in response to a third touch operation applied to a third touch area, where the third touch area is used to provide an adjustment range of the intensity of the virtual light.
In an optional embodiment, the shooting module is configured to select the at least one virtual ray based on the category information of each virtual ray in the at least one virtual ray, and insert the at least one virtual ray into the current shooting picture to obtain a first processing result; adjusting the irradiation direction and the irradiation range of part or all of the virtual rays in the first processing result based on the irradiation information of each virtual ray in the at least one virtual ray to obtain a second processing result; adjusting the light intensity of part or all of the virtual light rays in the second processing result based on the intensity information of each virtual light ray in the at least one virtual light ray to obtain a third processing result; and executing shooting operation based on the third processing result to obtain a target shooting picture.
It should be noted that the above modules may be implemented by software or hardware. In the latter case, for example, the modules may all be located in the same processor, or the modules may be distributed across different processors in any combination.
It should be noted here that the above acquisition module 60 and shooting module 62 correspond to steps S102 to S104 in embodiment 1; the examples and application scenarios realized by these modules are the same as those of the corresponding steps, but they are not limited to the disclosure of embodiment 1. It should also be noted that the modules described above may be implemented in a computer terminal as part of the apparatus.
It should be noted that, reference may be made to the relevant description in embodiment 1 for alternative or preferred embodiments of this embodiment, and details are not described here again.
The above target object photographing apparatus may further include a processor and a memory; the acquisition module 60, the shooting module 62, and the like are stored in the memory as program units, and the processor executes the program units stored in the memory to implement the corresponding functions.
The processor comprises one or more kernels, and a kernel calls the corresponding program unit from the memory. The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or nonvolatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM); the memory includes at least one memory chip.
According to an embodiment of the application, an embodiment of a nonvolatile storage medium is also provided. Optionally, in this embodiment, the nonvolatile storage medium includes a stored program, and when the program runs, the apparatus in which the nonvolatile storage medium is located is controlled to execute any one of the above target object shooting methods.
Optionally, in this embodiment, the nonvolatile storage medium may be located in any one of a group of computer terminals in a computer network, or in any one of a group of mobile terminals, and the nonvolatile storage medium includes a stored program.
Optionally, the apparatus in which the non-volatile storage medium is controlled to perform the following functions when the program is executed: in a shooting mode, acquiring virtual light control information, wherein the virtual light control information is used for inserting at least one virtual light into a current shooting picture and controlling the at least one virtual light to adjust the shooting environment of the current shooting picture; and shooting the target object based on the virtual light control information.
Optionally, the apparatus in which the non-volatile storage medium is controlled to perform the following functions when the program is executed: and acquiring the category information of each virtual ray in the at least one virtual ray in response to a first touch operation acting on a first touch area, wherein the first touch area is used for providing a plurality of alternative virtual ray categories.
Optionally, the apparatus in which the non-volatile storage medium is controlled to perform the following functions when the program is executed: responding to a second touch operation acting on a second touch area, and acquiring an insertion starting position and an insertion ending position of each virtual ray in the at least one virtual ray; determining irradiation information of each virtual ray in the at least one virtual ray based on the insertion start position and the insertion end position, wherein the second touch area is located in a shooting area of the current shooting picture, and the irradiation information includes at least one of: irradiation direction, irradiation range.
Optionally, the apparatus in which the non-volatile storage medium is controlled to perform the following functions when the program is executed: determining an irradiation direction of each of the at least one virtual ray by using a direction change of the insertion end position with respect to the insertion start position; and acquiring a connecting line between the insertion starting position and the insertion ending position, and determining the irradiation range of each virtual ray in the at least one virtual ray by using the length change of the connecting line.
Optionally, the apparatus in which the non-volatile storage medium is controlled to perform the following functions when the program is executed: and responding to a third touch operation acting on a third touch area to obtain the intensity information of each virtual ray in the at least one virtual ray, wherein the third touch area is used for providing an adjusting range of the intensity of the virtual ray.
Optionally, the apparatus in which the non-volatile storage medium is controlled to perform the following functions when the program is executed: selecting the at least one virtual ray based on the category information of each virtual ray in the at least one virtual ray, and inserting the at least one virtual ray into the current shooting picture to obtain a first processing result; adjusting the irradiation direction and the irradiation range of part or all of the virtual rays in the first processing result based on the irradiation information of each virtual ray in the at least one virtual ray to obtain a second processing result; adjusting the light intensity of part or all of the virtual light rays in the second processing result based on the intensity information of each virtual light ray in the at least one virtual light ray to obtain a third processing result; and executing shooting operation based on the third processing result to obtain a target shooting picture.
According to an embodiment of the application, an embodiment of a processor is also provided. Optionally, in this embodiment, the processor is configured to run a program, and the program executes any one of the above target object shooting methods when run.
An embodiment of the present application provides an electronic device, which includes a memory and a processor, where the memory stores a computer program, and the processor is configured to run the computer program to execute any one of the above target object shooting methods.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The foregoing storage medium includes various media capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that those skilled in the art can make various modifications and improvements without departing from the principle of the present invention; these modifications and improvements should also be regarded as falling within the protection scope of the present invention.

Claims (15)

1. A target object photographing method, comprising:
in a shooting mode, acquiring virtual light control information, wherein the virtual light control information is used for inserting at least one virtual light into a current shooting picture and controlling the at least one virtual light to adjust the shooting environment of the current shooting picture;
and shooting a target object based on the virtual light control information.
2. The method of claim 1, wherein obtaining the virtual light control information comprises:
and acquiring the category information of each virtual ray in the at least one virtual ray in response to a first touch operation acting on a first touch area, wherein the first touch area is used for providing a plurality of alternative virtual ray categories.
3. The method of claim 1, wherein obtaining the virtual light control information comprises:
responding to a second touch operation acting on a second touch area, and acquiring an insertion starting position and an insertion ending position of each virtual ray in the at least one virtual ray;
determining irradiation information of each virtual ray in the at least one virtual ray based on the insertion starting position and the insertion ending position, wherein the second touch area is located in a shooting area of the current shooting picture, and the irradiation information includes at least one of the following: irradiation direction, irradiation range.
4. The method of claim 3, wherein determining the illumination information for each of the at least one virtual ray based on the insertion start position and the insertion end position comprises:
determining the irradiation direction of each virtual ray in the at least one virtual ray by using the direction change of the insertion ending position relative to the insertion starting position;
and acquiring a connecting line between the insertion starting position and the insertion ending position, and determining the irradiation range of each virtual ray in the at least one virtual ray by using the length change of the connecting line.
5. The method of claim 1, wherein obtaining the virtual light control information comprises:
and responding to a third touch operation acting on a third touch area, and acquiring intensity information of each virtual ray in the at least one virtual ray, wherein the third touch area is used for providing an adjusting range of the intensity of the virtual ray.
6. The method of claim 1, wherein photographing the target object based on the virtual light control information comprises:
selecting at least one virtual light ray based on the category information of each virtual light ray in the at least one virtual light ray, and inserting the at least one virtual light ray into the current shooting picture to obtain a first processing result;
adjusting the irradiation direction and the irradiation range of part or all of the virtual rays in the first processing result based on the irradiation information of each virtual ray in the at least one virtual ray to obtain a second processing result;
adjusting the light ray intensity of part or all of the virtual light rays in the second processing result based on the intensity information of each virtual light ray in the at least one virtual light ray to obtain a third processing result;
and executing shooting operation based on the third processing result to obtain a target shooting picture.
7. A target object photographing apparatus, comprising:
an acquisition module, configured to acquire virtual light control information in a shooting mode, wherein the virtual light control information is used for inserting at least one virtual light into a current shooting picture and controlling the at least one virtual light to adjust the shooting environment of the current shooting picture;
and the shooting module is used for shooting the target object based on the virtual light control information.
8. The apparatus of claim 7, wherein the obtaining module is configured to obtain category information of each of the at least one virtual ray in response to a first touch operation applied to a first touch area, wherein the first touch area is configured to provide a plurality of candidate virtual ray categories.
9. The apparatus according to claim 7, wherein the obtaining module is configured to obtain an insertion start position and an insertion end position of each of the at least one virtual ray in response to a second touch operation applied to a second touch area; determining irradiation information of each virtual ray in the at least one virtual ray based on the insertion starting position and the insertion ending position, wherein the second touch area is located in a shooting area of the current shooting picture, and the irradiation information includes at least one of the following: irradiation direction, irradiation range.
10. The apparatus according to claim 9, wherein the obtaining module is configured to determine the irradiation direction of each of the at least one virtual light ray by using a direction change of the insertion ending position relative to the insertion starting position; and acquiring a connecting line between the insertion starting position and the insertion ending position, and determining the irradiation range of each virtual ray in the at least one virtual ray by using the length change of the connecting line.
11. The apparatus according to claim 7, wherein the obtaining module is configured to obtain intensity information of each of the at least one virtual light in response to a third touch operation applied to a third touch area, wherein the third touch area is configured to provide an adjustment range of the intensity of the virtual light.
12. The apparatus according to claim 7, wherein the shooting module is configured to select the at least one virtual ray based on the category information of each virtual ray in the at least one virtual ray, and insert the at least one virtual ray into the current shooting picture to obtain a first processing result; adjusting the irradiation direction and the irradiation range of part or all of the virtual rays in the first processing result based on the irradiation information of each virtual ray in the at least one virtual ray to obtain a second processing result; adjusting the light ray intensity of part or all of the virtual light rays in the second processing result based on the intensity information of each virtual light ray in the at least one virtual light ray to obtain a third processing result; and executing shooting operation based on the third processing result to obtain a target shooting picture.
13. A non-volatile storage medium, characterized in that a computer program is stored in the storage medium, wherein the computer program is arranged to execute the target object capturing method of any one of claims 1 to 6 when executed.
14. A processor for running a program, wherein the program is configured to execute the target object photographing method according to any one of claims 1 to 6 when running.
15. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and the processor is configured to execute the computer program to perform the target object photographing method according to any one of claims 1 to 6.
CN202010828112.1A 2020-08-17 2020-08-17 Target object shooting method and device Pending CN111901529A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010828112.1A CN111901529A (en) 2020-08-17 2020-08-17 Target object shooting method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010828112.1A CN111901529A (en) 2020-08-17 2020-08-17 Target object shooting method and device

Publications (1)

Publication Number Publication Date
CN111901529A true CN111901529A (en) 2020-11-06

Family

ID=73230406

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010828112.1A Pending CN111901529A (en) 2020-08-17 2020-08-17 Target object shooting method and device

Country Status (1)

Country Link
CN (1) CN111901529A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7787027B2 (en) * 1996-06-13 2010-08-31 Nikon Corporation Information input apparatus and method
US20180035037A1 (en) * 2012-11-08 2018-02-01 Sony Corporation Image processing apparatus and method, and program
CN104268928A (en) * 2014-08-29 2015-01-07 小米科技有限责任公司 Picture processing method and device
CN107197171A (en) * 2017-06-22 2017-09-22 西南大学 A kind of digital photographing processing method for adding intelligence software light source
CN111066026A (en) * 2017-09-09 2020-04-24 苹果公司 Techniques for providing virtual light adjustments to image data
CN109688341A (en) * 2018-12-27 2019-04-26 维沃移动通信有限公司 A kind of method for polishing and terminal device

Similar Documents

Publication Publication Date Title
Liba et al. Handheld mobile photography in very low light.
CN108933899B (en) Panorama shooting method, device, terminal and computer readable storage medium
JP4236433B2 (en) System and method for simulating fill flash in photography
CN102830573B (en) A kind of flash control method and device
CN105049726A (en) Mobile terminal shooting method and mobile terminal
WO2009139154A1 (en) Image pickup device and image pickup method
US20200404132A1 (en) Background replacement system and methods
CN106791380B (en) Method and device for shooting dynamic photos
KR20070006692A (en) Method and apparatus for optimizing capture device settings through depth information
CN103856718A (en) Photo synthesis method and photo synthesis device
US20180332239A1 (en) Background replacement utilizing infrared light and visible light
CN107181920B (en) Mobile terminal and method for switching flash lamp to take pictures
CN104092955A (en) Flash control method and device, as well as image acquisition method and equipment
WO2018140152A1 (en) Methods and apparatus for synchronizing camera flash and sensor blanking
CN106791451A (en) A kind of photographic method of intelligent terminal
CN106657798A (en) Photographing method for intelligent terminal
CN109618089A (en) Intelligentized shooting controller, Management Controller and image pickup method
CN106254790A (en) Take pictures processing method and processing device
CN108282622B (en) Photo shooting method and device
CN102629973B (en) Camera head and image capture method
CN102300038A (en) Image shooting device
CN110545365B (en) Mobile device and method for flashing light in cooperation with other nearby mobile devices
CN111901529A (en) Target object shooting method and device
CN105678624A (en) Evaluation information generation device and method, electronic device and server
CN114286004A (en) Focusing method, shooting device, electronic equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20201106)