CN115499581B - Shooting method, shooting device, terminal equipment and storage medium

Info

Publication number
CN115499581B
Authority
CN
China
Prior art keywords: shooting, shot object, color, determining, parameters
Prior art date
Legal status
Active
Application number
CN202210989573.6A
Other languages
Chinese (zh)
Other versions
CN115499581A (en)
Inventor
Name withheld at the inventor's request
Current Assignee
Beijing 58 Information Technology Co Ltd
Original Assignee
Beijing 58 Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing 58 Information Technology Co Ltd
Priority to CN202210989573.6A
Publication of CN115499581A
Application granted
Publication of CN115499581B
Status: Active
Anticipated expiration


Abstract

The application provides a shooting method, a shooting device, terminal equipment and a storage medium, and relates to the technical field of shooting. The method comprises the following steps: acquiring a shooting preview picture containing a shot object, and determining color information of the shot object from the shooting preview picture; determining shooting parameters corresponding to the color information; and shooting the shot object based on the shooting parameters to obtain a target image containing the shot object. When a photo is taken, the color information of the shot object is determined from the shooting preview picture, shooting parameters matched with the color of the shot object are determined according to that color information, and the shot object is shot with those parameters to obtain the target image. Corresponding shooting parameters can therefore be set automatically for shot objects of different colors, which improves the shooting effect for differently colored shot objects, avoids the trouble of manually adjusting camera settings after switching between shot objects of different colors, and improves shooting efficiency.

Description

Shooting method, shooting device, terminal equipment and storage medium
Technical Field
The present application relates to the field of shooting technologies, and in particular, to a shooting method, a shooting device, a terminal device, and a storage medium.
Background
When a user takes photos with a shooting device, different shooting parameters generally need to be set for different shot objects, so that a good shooting effect can be achieved for each of them.
In the related art, a plurality of shooting modes are generally built into a camera, and when taking a photo the user can select the shooting mode corresponding to the subject to be shot. For example, when a user photographs a person, the shooting mode of the camera needs to be set to a portrait mode; when a landscape is photographed, a landscape mode is needed; when a moving object is photographed, the shooting mode needs to be set to a sports mode; and so on.
However, shot objects differ not only in type but also in color. When facing shot objects of different colors, the user still needs to manually adjust the shooting parameters to meet the shooting requirements of the current shot object. The operation is cumbersome, and because the parameters are adjusted manually it is difficult to set them accurately, so the shooting effect is poor.
Disclosure of Invention
The embodiment of the application provides a shooting method, a shooting device, terminal equipment and a storage medium, which are used for solving the problems of complex shooting process and poor shooting effect in the prior art.
In order to solve the technical problems, the application is realized as follows:
In a first aspect, an embodiment of the present application provides a photographing method, the method comprising:
acquiring a shooting preview picture containing the shot object, and determining color information of the shot object from the shooting preview picture;
determining shooting parameters for the shot object according to the color information;
and shooting the shot object based on the shooting parameters to obtain a target image.
Optionally, the color information includes a color category, and the determining the color information of the subject from the shooting preview screen includes:
inputting the shooting preview picture into a color recognition model to obtain a color code output by the color recognition model;
and determining a color class of the subject based on the color code.
Optionally, the determining, according to the color information, a shooting parameter for the subject includes:
acquiring a corresponding relation of target parameters, wherein the corresponding relation of the target parameters comprises a corresponding relation of color information and shooting parameters;
and determining shooting parameters corresponding to the color information of the shot object according to the target parameter corresponding relation.
Optionally, the method further comprises:
determining an object type of the subject;
and determining the corresponding target parameter corresponding relation according to the object type.
Optionally, the method further comprises:
determining a current size of the subject in a shooting preview screen, and a target size set for the subject;
and adjusting the size of the shot object to the target size.
Optionally, the adjusting the size of the subject to the target size includes:
increasing a focal length of a camera in the case that the current size is smaller than the target size;
and reducing the focal length of the camera in the case that the current size is larger than the target size.
Optionally, the photographing the object based on the photographing parameter to obtain a target image includes:
acquiring a framing angle interval for the photographed object;
and controlling a camera to move around the shot object, and shooting the shot object at the view angle interval to obtain a plurality of target images containing the shot object, wherein the target images are shot based on the shooting parameters.
Optionally, the determining, according to the color information, a shooting parameter for the subject includes:
determining material information of the shot object;
and determining shooting parameters for the shot object based on the color information and the material information.
In a second aspect, an embodiment of the present application further provides a photographing apparatus, including:
an acquisition module configured to acquire a shooting preview screen including the subject, and determine color information of the subject from the shooting preview screen;
a parameter module for determining shooting parameters for the shot object according to the color information;
and the shooting module is used for shooting the shot object based on the shooting parameters to obtain a target image.
Optionally, the color information includes a color category, and the acquiring module includes:
the color coding sub-module is used for inputting the shooting preview picture into a color recognition model to obtain a color code output by the color recognition model;
and the color category sub-module is used for determining the color category of the shot object based on the color codes.
Optionally, the parameter module includes:
the corresponding relation sub-module is used for acquiring a corresponding relation of target parameters, wherein the corresponding relation of the target parameters comprises a corresponding relation of color information and shooting parameters;
And the first parameter sub-module is used for determining shooting parameters corresponding to the color information of the shot object according to the target parameter corresponding relation.
Optionally, the apparatus further comprises:
an object type module for determining an object type of the subject;
and the corresponding relation determining module is used for determining the corresponding target parameter correspondence according to the object type.
Optionally, the apparatus further comprises:
a size determination submodule for determining a current size of the subject in a shooting preview screen and a target size set for the subject;
and the size adjustment sub-module is used for adjusting the size of the shot object to the target size.
Optionally, the resizing submodule includes:
a focal length increasing sub-module for increasing a focal length of the camera if the current size is smaller than the target size;
and the focal length reducing submodule is used for reducing the focal length of the camera under the condition that the current size is larger than the target size.
Optionally, the shooting module includes:
an angle interval submodule for acquiring a view finding angle interval for the shot object;
and the shooting sub-module is used for controlling a camera to move around the shot object, shooting the shot object at the view angle interval, and obtaining a plurality of target images containing the shot object, wherein the target images are obtained by shooting based on the shooting parameters.
Optionally, the parameter module includes:
the material sub-module is used for determining the material information of the shot object;
and the second parameter submodule is used for determining shooting parameters aiming at the shot object based on the color information and the material information.
In a third aspect, an embodiment of the present application further provides a terminal device, including: a memory, a processor, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the shooting method as described in the foregoing first aspect.
In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored, where the computer program, when executed by a processor, implements the steps of the shooting method according to the foregoing first aspect.
In an embodiment of the present application, the method includes: acquiring a shooting preview picture containing a shot object, and determining color information of the shot object from the shooting preview picture; determining shooting parameters corresponding to the color information; and shooting the shot object based on the shooting parameters to obtain a target image containing the shot object. When a photo is taken, the color information of the shot object is determined from the shooting preview picture, shooting parameters matched with the color of the shot object are determined according to that color information, and the shot object is shot with the matched shooting parameters to obtain the target image.
The foregoing is only an overview of the technical solution of the present application. It may be implemented in accordance with the contents of the specification so that the technical means of the present application can be understood more clearly, and so that the above and other objects, features and advantages of the present application will be more readily apparent.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and that a person skilled in the art may obtain other drawings from these drawings without inventive effort.
Fig. 1 is a flowchart of steps of a photographing method according to an embodiment of the present application;
fig. 2 is a flowchart of steps of another photographing method according to an embodiment of the present application;
fig. 3 is a schematic diagram of a photographing apparatus according to an embodiment of the present application;
fig. 4 is a logic diagram for shooting control according to an embodiment of the present application;
fig. 5 is a flowchart illustrating steps of still another photographing method according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a photographing device according to an embodiment of the present application;
Fig. 7 is a schematic hardware structure of a terminal device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
Example 1
The photographing method provided by the embodiment of the present application is described in detail below. The method is applied to an electronic device.
Referring to fig. 1, a flowchart of steps of a photographing method according to an embodiment of the present application is shown.
Step 101, acquiring a shooting preview screen containing the shot object, and determining color information of the shot object from the shooting preview screen.
In the embodiment of the present application, the electronic device may be an electronic device without a camera, such as a personal computer (Personal Computer, PC) or a server, or an electronic device with a camera, such as a single lens reflex camera (Single Lens Reflex Camera, SLR camera), a mirrorless camera (Mirrorless Camera), a smart phone, or the like.
In the case of using an electronic device without a camera, the shooting system may be composed of the electronic device without a camera together with a separate camera or an electronic device with a camera. For example, the scheme of the embodiment of the present application may be applied to a personal computer, and the personal computer and an SLR camera are connected in a wireless or wired manner to form a shooting system together.
In the process of taking a photo, the camera function generally needs to be turned on first. An image within the framing range is then collected by the image sensor and a shooting preview picture is generated; the shooting preview picture is the picture collected by the image sensor. The user adjusts the camera parameters, the framing angle and the like according to the shooting preview picture, and after the adjustment, presses the shutter to shoot an image.
In the embodiment of the application, after the shooting preview picture is acquired, color identification of the shot object (such as a vehicle) in the shooting preview picture can be performed to determine the color information of the shot object. The color information may include the main color of the shot object, or may include the proportions of a plurality of colors; for example, the color information of the shot object may be red, or may be (20% red + 80% blue). It should be noted that a color may be represented by a color code given in a color standard; for example, color code FF0000 represents pure red and color code FFFFFF represents pure white. The embodiments of the present application do not particularly limit the color standard, and a technician can select an appropriate color standard according to actual needs.
Specifically, in one embodiment, since the shot object is usually placed in the center of the picture when taking a photo, that is, the shot object is usually in the middle area of the shooting preview picture, a preset area may be set directly in the center of the shooting preview picture in order to improve the efficiency of determining the color information and reduce resource consumption. The color of the image in the preset area is extracted to obtain the color information of the preset area, and the color information of the preset area is then used directly as the color information of the shot object. Further, the preset area can be adjusted according to the needs of the user: the user can adjust the size and position of the preset area in the shooting preview picture according to the size and position of the shot object in the shooting preview picture.
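As an illustration only (not part of the claimed method), the following Python sketch shows the preset-area approach described above: it assumes the preview frame is available as a Pillow image, takes a hypothetical central region covering half of each side, and uses the dominant color of that region as the color information of the shot object.

```python
# Minimal sketch (assumption, not the application's implementation): dominant
# color of a central preset area of the shooting preview picture, used as the
# color information of the shot object. The region ratio of 0.5 is an
# illustrative choice, not a value given in the application.
from collections import Counter
from PIL import Image

def dominant_color_of_center(preview: Image.Image, region_ratio: float = 0.5):
    w, h = preview.size
    rw, rh = int(w * region_ratio), int(h * region_ratio)
    left, top = (w - rw) // 2, (h - rh) // 2
    center = preview.convert("RGB").crop((left, top, left + rw, top + rh))
    center = center.resize((64, 64))      # downscale so counting stays cheap
    counts = Counter(center.getdata())    # pixel (R, G, B) -> occurrence count
    return counts.most_common(1)[0][0]    # most frequent color tuple

# Usage sketch:
# preview = Image.open("preview.jpg")
# r, g, b = dominant_color_of_center(preview)
# color_code = f"{r:02X}{g:02X}{b:02X}"   # e.g. "FF0000" for pure red
```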
In another embodiment, in order to improve accuracy of determining the color information, the object may be actively identified to determine a position of the object in the shooting preview screen and a shape of the object in the shooting preview screen, so as to accurately extract pixel information of an area of the object in the shooting preview screen, and determine the color information of the object according to the extracted pixel information.
For example, the object in the shooting preview screen may be identified by the object identification neural network model and an area in which the object is located is determined, so that color information of the shooting preview screen in the area is acquired as the color information of the object.
Step 102, determining shooting parameters for the shot object according to the color information.
In a real shooting scene, a photographer generally needs to set the shooting parameters of the camera according to the color of the shot object, so as to obtain a better shooting effect for shot objects of different colors. For example, in a scene of shooting a vehicle: white paint reflects more light, so shooting at a higher shutter speed may be required to avoid overexposure when shooting a white vehicle; black paint reflects less light, so shooting at a lower shutter speed may be required to avoid underexposure when shooting a black vehicle; red vehicle paint reflects light with a lower color temperature (a unit measuring the color components of light), which makes the ambient light appear warm, so a corresponding white balance value needs to be set to shoot an image with the correct color temperature. In order to obtain a good shooting effect for shot objects of any color, various shooting parameters can be determined according to the color of the shot object, where the shooting parameters may include shutter speed (exposure time), white balance, aperture size, sensitivity (ISO value) and the like.
The shutter speed represents how long the shutter stays open when a photo is taken. The longer the shutter is open, the more light participates in exposure; with other shooting parameters unchanged, the higher the brightness of the shot image, and conversely, the lower the brightness of the shot image.
The white balance setting can calibrate the deviation of the color temperature of the image, and setting the correct white balance when shooting can restore the color of the shot object well. If the white balance deviates, the shot image may shift away from the color of the scene itself, resulting in an image that looks too warm or too cold.
Camera imaging uses the principle of pinhole imaging: light is imaged on the image sensor through a hole in the lens, and the aperture is the device that controls the size of this hole. The larger the aperture, the larger the hole diameter and the more light participates in imaging per unit time; the smaller the aperture, the smaller the hole diameter and the less light participates in imaging per unit time.
The sensitivity indicates how sensitive the film (image sensor) used for imaging by the camera is to light. The higher the sensitivity, the more sensitive the image sensor is to light, and the higher the brightness of the imaging result for the same amount of incoming light.
In the embodiment of the application, a technician or a user can determine in advance, from experience or by calculation, the shooting parameters matched with each piece of color information, set the shooting parameters corresponding to each piece of color information, and establish a color-parameter correspondence between the color information and one or more shooting parameters. After the color information of the shot object is obtained, the color-parameter correspondence is queried according to the color information of the shot object to determine the required shooting parameters. The color information may include a color code, a color name, a color identification code, or the like.
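As an illustration of such a color-parameter correspondence, the following Python sketch stores one set of shooting parameters per color category and looks them up by the color information of the shot object; the parameter values are placeholders assumed for demonstration, not values given by the application.

```python
# Illustrative sketch of a color-parameter correspondence. All parameter
# values below are placeholders; a technician would set them from experience
# or calculation as described above.
from dataclasses import dataclass

@dataclass
class ShootingParams:
    shutter_speed: float   # exposure time in seconds
    white_balance: int     # color temperature in Kelvin
    aperture: float        # f-number
    iso: int               # sensitivity

COLOR_PARAM_TABLE = {
    "white": ShootingParams(1/500, 5500, 8.0, 100),  # bright paint: faster shutter
    "black": ShootingParams(1/125, 5500, 5.6, 200),  # dark paint: slower shutter
    "red":   ShootingParams(1/250, 6000, 6.3, 100),  # warmer light: adjusted white balance
}

def params_for_color(color_category: str) -> ShootingParams:
    # Fall back to a neutral default when the category is not in the table.
    return COLOR_PARAM_TABLE.get(color_category, ShootingParams(1/250, 5500, 6.3, 100))
```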
Further, color information of the object may be input into a preset shooting parameter determination model, so as to obtain shooting parameters output by the shooting parameter determination model. The shooting parameter determination model can be obtained based on neural network model training, and can also be a mathematical model constructed by technicians. It should be noted that, a technician may construct the above-mentioned shooting parameter determination model according to actual needs, and the form and the generation method of the shooting parameter determination model in the embodiment of the present application are not particularly limited.
And step 103, shooting the shot object based on the shooting parameters to obtain a target image.
In the embodiment of the application, after the shooting parameters are obtained, the image can be shot based on the shooting parameters. The user may also use this method to take a plurality of photos of the same shot object. In this case, since the color of the same shot object does not change, the shooting parameters only need to be determined by this method for the first shot, and the parameters determined for the first shot are reused in subsequent shots, which improves shooting efficiency and reduces the computing resources consumed in capturing the images.
For example, a user shoots a black car in a studio and needs to take a plurality of photos to synthesize a panoramic animation of the black car. Before taking the first photo, the user can determine the color information of the black car from the shooting preview picture, determine the corresponding shooting parameters according to that color information, and use those shooting parameters to take the photos at the different angles of the black car.
In summary, the photographing method provided by the application includes: acquiring a shooting preview picture containing a shot object, and determining color information of the shot object from the shooting preview picture; determining shooting parameters corresponding to the color information; and shooting the shot object based on the shooting parameters to obtain a target image containing the shot object. When a photo is taken, the color information of the shot object is determined from the shooting preview picture, shooting parameters matched with the color of the shot object are determined according to that color information, and the shot object is shot with the matched shooting parameters to obtain the target image.
Example two
The photographing method provided by the embodiment of the present application is described in detail below. The method is applied to an electronic device.
Referring to fig. 2, fig. 2 shows a flowchart of steps of another photographing method according to an embodiment of the present application, which may include the following steps:
step 201, acquiring a shooting preview screen containing the shot object, and determining color information of the shot object from the shooting preview screen.
In this step, reference is made to step 101, which is not repeated in the embodiment of the present application.
In addition, in the embodiment of the application, a color recognition model can be used to determine the color category of the shot object. A technician can train a neural network model on a training set formed by sample images and the color annotation information corresponding to the sample images, so as to obtain a color recognition model capable of determining a color code from an input image.
Optionally, step 201 may include:
and a sub-step 2011 of inputting the shooting preview picture into a color recognition model to obtain the color code output by the color recognition model.
In one embodiment, the color recognition model may include two parts: a shot object extraction sub-model and a color recognition sub-model. The shooting preview picture can be input directly into the color recognition model; the shot object extraction sub-model identifies the shot object and determines its boundary, the image portion within that boundary in the shooting preview picture is then input into the color recognition sub-model, and the color recognition sub-model performs color recognition on that image portion to determine the color code of the shot object in the shooting preview picture.
In another embodiment, the color recognition model may not recognize and extract the subject in the photographing preview screen. Before the shooting preview picture is input into the color recognition model, a preset area corresponding to the shot object is set in the shooting preview picture in an automatic or manual mode, then the shooting preview picture in the preset area is input into the color recognition model, and the color recognition model is used for carrying out color recognition on the shooting preview picture in the preset area so as to determine the color coding of the shot object in the shooting preview picture.
A substep 2012 of determining a color class of the subject based on the color coding.
Millions of color codes are typically contained in a color standard library, and a broad "red" may correspond to tens of thousands of different codes; for example, pink has code FFC0CB and deep red has code DC143C. In the embodiment of the application, out of cost-effectiveness considerations, it is not necessary to set corresponding shooting parameters for every color code; tens or hundreds of sets of shooting parameters may be enough to meet the shooting requirements. Therefore, color codes can be categorized, with multiple color codes corresponding to one color category, where the color categories may include red, green, blue, purple, and the like.
For example, the technician may set 14 color categories: black, white, silver, gray, red, blue, green, yellow, orange, brown, purple, gold, pink and champagne, and associate each color code with one of the 14 color categories, so that after a color code is obtained, the color category to which it corresponds can be determined.
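The application does not prescribe how each color code is associated with a category; one assumed illustration in Python is to pick the category whose reference RGB value is closest to the decoded code:

```python
# Sketch (assumption, not the application's method): map a hex color code to
# the nearest of a small set of named categories by RGB distance. The
# reference RGB values are illustrative.
CATEGORY_REFERENCE_RGB = {
    "black": (0, 0, 0), "white": (255, 255, 255), "silver": (192, 192, 192),
    "gray": (128, 128, 128), "red": (220, 20, 60), "blue": (0, 0, 255),
    "green": (0, 128, 0), "yellow": (255, 255, 0), "orange": (255, 165, 0),
    "brown": (139, 69, 19), "purple": (128, 0, 128), "gold": (255, 215, 0),
    "pink": (255, 192, 203), "champagne": (247, 231, 206),
}

def category_for_code(color_code: str) -> str:
    r, g, b = (int(color_code[i:i + 2], 16) for i in (0, 2, 4))
    return min(
        CATEGORY_REFERENCE_RGB,
        key=lambda name: sum((a - c) ** 2
                             for a, c in zip((r, g, b), CATEGORY_REFERENCE_RGB[name])),
    )

# category_for_code("FFC0CB") -> "pink"; category_for_code("DC143C") -> "red"
```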
In the embodiment of the application, the color recognition model outputs a color code, and the color code is then converted into a color category. This makes it convenient for a technician to adjust the color categories by adding, removing or modifying them: after the color categories are changed, the color recognition model does not need to be modified, and only the correspondence between the parameters and the color categories needs to be re-established, which improves maintainability. Alternatively, the color recognition model may be trained to output the color category directly.
Step 202, obtaining a corresponding relation of target parameters, wherein the corresponding relation of the target parameters comprises a corresponding relation of color information and shooting parameters.
In the embodiment of the application, a technician or a user can determine in advance, from experience or by calculation, the shooting parameters matched with each piece of color information, set the shooting parameters corresponding to each piece of color information, and establish a target parameter correspondence between the color information and one or more shooting parameters. After the color information of the shot object is obtained, the target parameter correspondence is queried according to the color information of the shot object to determine the required shooting parameters. Here the color information may include a color category.
Further, shot objects of different object types may require different shooting parameters even when their color information is the same, so that a better shooting effect can be obtained for different types of shot objects. The object type represents the kind of object, such as a vehicle, a person, a building, a plant, and so on. Therefore, the technician can set several sets of parameter correspondences for different object types, with each set of parameter correspondences corresponding to one object type. In this way, the corresponding target parameter correspondence can be determined according to the object type of the shot object.
Specifically, the following methods of the sub-steps 2021 to 2022 may be employed to determine the target parameter correspondence relationship corresponding to the subject.
Step 202 may include:
sub-step 2021, determining an object type of the subject.
In the embodiment of the application, the object type option can be provided for the user to select the object type corresponding to the shot object to be shot. The object type determination model may also be trained in advance, and the object in the shooting preview screen may be identified by the object type determination model, so as to determine the object type of the object. It should be noted that, the method for determining the object type of the object can be flexibly selected by a technician according to actual needs, and the embodiment of the present application is not limited in particular.
Sub-step 2022, determining the corresponding target parameter correspondence according to the object type.
In the embodiment of the application, a correspondence between object types and parameter correspondences may further be maintained. After the object type of the shot object is determined, the target parameter correspondence for that object type can be determined from this correspondence between object types and parameter correspondences.
Step 203, determining shooting parameters corresponding to the color information of the shot object according to the target parameter corresponding relation.
Because the target parameter correspondence includes the correspondence between color information and shooting parameters, after the target parameter correspondence for the shot object is obtained, the shooting parameters corresponding to the shot object can be determined according to the color information of the shot object. The shot object can then be photographed with these shooting parameters in the subsequent process to achieve a good shooting effect.
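As an assumed illustration of how several sets of parameter correspondences could be keyed by object type and then queried by the color information of the shot object (all entries are placeholders, not values from the application):

```python
# Sketch (assumption): one parameter correspondence per object type, queried
# first by the object type and then by the color information of the shot
# object. All parameter values are placeholders for demonstration.
PARAM_TABLES_BY_OBJECT_TYPE = {
    "vehicle": {
        "white": {"shutter_speed": 1/500, "white_balance": 5500, "aperture": 8.0, "iso": 100},
        "black": {"shutter_speed": 1/125, "white_balance": 5500, "aperture": 5.6, "iso": 200},
    },
    "person": {
        "default": {"shutter_speed": 1/200, "white_balance": 5200, "aperture": 2.8, "iso": 200},
    },
}

def shooting_params_for(object_type: str, color_category: str) -> dict:
    table = PARAM_TABLES_BY_OBJECT_TYPE[object_type]        # the target parameter correspondence
    # Fall back to a "default" entry (empty dict if none) when the color is unknown.
    return table.get(color_category) or table.get("default", {})
```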
Step 204, determining a current size of the subject in a shooting preview screen, and a target size set for the subject.
In multi-shot scenes, it is generally necessary to take photos of the same shot object at a plurality of angles and then composite them. Not only do the shooting parameters of the plurality of images need to be the same, but the sizes of the shot object in the plurality of images also need to be the same so that the images can be composited afterwards. However, to capture the plurality of images, the relative position between the shot object and the camera needs to be adjusted, which may change the distance between the shot object and the camera during the adjustment, thereby affecting the size of the shot object in the captured images.
Therefore, in order to further improve convenience in a plurality of shooting scenes, in the embodiment of the present application, the size of the object in the shooting preview screen may be automatically adjusted before each shooting of the image, so that the size of the object in the shooting preview screen is the same in the image obtained by each shooting.
Specifically, the subject in the shooting preview screen may be identified, and the current size of the subject in the shooting preview screen may be determined, and if the current size does not coincide with the target size, the size of the subject in the shooting preview screen may be adjusted to the target size. The size may be a width ratio, a height ratio, an area ratio, or the like of the subject in the photographing preview screen.
The target size may be set by the photographer before shooting; for example, the target size may be set to 1/2 of the width of the shooting preview picture, or to 1/3 of its height. Alternatively, the size of the shot object in the first image captured during one multi-shot process may be taken as the target size, so that the size of the shot object in each subsequently captured image of that process is the same as in the first image. For example, when a user needs to take a group of 10 vehicle photos for generating a panoramic animation of the vehicle, the first photo can be taken directly without adjusting the size of the shot object to a target size; the size of the shot object in the first image is then determined, say 1/2 of the width of the shooting preview picture, and when the subsequent 9 images are taken, 1/2 of the preview width is used as the target size of the shot object each time, so that the sizes of the shot object in the 10 captured images are the same.
Step 205, adjusting the size of the object to the target size.
In the embodiment of the present application, the camera for photographing the subject may be a camera including a zoom function (optical zoom and/or digital zoom), and the size of the subject in the photographing preview screen may be changed by adjusting the focal length of the camera.
The camera for shooting the shot object can also be a camera mounted on a slide rail or a mobile device. Such a camera can change the relative position between itself and the shot object through the slide rail or mobile device, thereby changing the distance between the camera and the shot object, and the size of the shot object in the shooting preview picture can be adjusted based on the principle that nearer objects appear larger and farther objects appear smaller.
Optionally, step 205 may include:
in a substep 2051, the camera focal length is increased if the current size is smaller than the target size.
In the embodiment of the application, for shooting equipment with an automatic zoom function, the shooting equipment can be controlled directly to zoom automatically, thereby adjusting the size of the shot object. For shooting equipment that has no automatic zoom function but only a manual zoom function, for example a single-lens reflex camera with a manual zoom ring, a zoom device with a servo motor (such as a Kaffy zoom device) can be mounted on the zoom ring of the camera; the servo motor is controlled to rotate and drives the zoom ring, thereby controlling the focal length of the camera and adjusting the size of the shot object in the shooting preview picture. The zoom device can establish wired or wireless communication with the electronic device, receive a zoom control instruction sent by the electronic device, and drive the motor connected to the zoom ring according to the zoom control instruction, thereby controlling the focal length of the camera.
If the current size of the shot object is smaller than the target size, the focal length of the camera needs to be increased, producing the effect of 'zooming in' on the shot object, and the size of the shot object in the shooting preview picture increases until it reaches the target size.
Substep 2052, reducing the camera focal length in case the current size is larger than the target size.
If the current size of the shot object is larger than the target size, the focal length of the camera needs to be reduced, producing the effect of 'pushing away' the shot object, and the size of the shot object in the shooting preview picture decreases until it reaches the target size.
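As an illustration of this adjustment logic, the following Python sketch repeatedly compares the current size with the target size and nudges the focal length in the corresponding direction; the two callables stand in for the preview-size measurement and the zoom control described in this embodiment and are assumptions, not interfaces defined by the application.

```python
# Sketch (assumption): iteratively adjust the camera focal length until the
# size of the shot object in the shooting preview picture reaches the target
# size. The callers supply the measurement and zoom-control callables.
from typing import Callable

def adjust_to_target_size(measure_size: Callable[[], float],
                          change_focal_length: Callable[[float], None],
                          target_size: float,
                          tolerance: float = 0.02,
                          step: float = 2.0,
                          max_steps: int = 50) -> None:
    for _ in range(max_steps):
        current = measure_size()               # e.g. width ratio of the shot object in the preview
        if abs(current - target_size) <= tolerance:
            return                             # close enough to the target size
        if current < target_size:
            change_focal_length(+step)         # "zoom in": increase the focal length
        else:
            change_focal_length(-step)         # "push away": decrease the focal length
```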
And step 206, shooting the shot object based on the shooting parameters to obtain a target image.
In this step, reference may be made to step 103, which is not described in detail in the embodiments of the present application.
In a sub-step 2061, a framing angle interval for the shot object is acquired.
The user can set the surround angle range to be shot and the number of images to be shot, and the framing angle interval between two adjacent images can be calculated from the surround angle range and the number of images. For example, if the surround angle range set by the user is 360 degrees and the number of images to be shot is 10, the shooting device can be controlled automatically to rotate around the shot object and shoot 10 images around it with 36 degrees as the framing angle interval.
Substep 2062, controlling a camera to move around the shot object, and shooting the shot object at the view angle interval to obtain a plurality of target images containing the shot object, wherein the target images are shot based on the shooting parameters.
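The following Python sketch illustrates the surround-shooting loop under the example above (a 360-degree surround range, 10 images, and a 36-degree framing angle interval); the two callables stand in for the turntable rotation and the parameterized capture described below and are assumptions, not interfaces defined by the application.

```python
# Sketch (assumption): shoot a plurality of target images around the shot
# object at a fixed framing angle interval. The two callables represent the
# turntable control and the capture performed with the determined parameters.
from typing import Any, Callable, List

def surround_shoot(rotate_turntable_to: Callable[[float], None],
                   capture_image: Callable[[], Any],
                   surround_range_deg: float = 360.0,
                   num_images: int = 10) -> List[Any]:
    interval = surround_range_deg / num_images   # e.g. 360 / 10 = 36 degrees
    target_images = []
    for i in range(num_images):
        rotate_turntable_to(i * interval)        # move the camera around the shot object
        target_images.append(capture_image())    # shot with the previously determined parameters
    return target_images
```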
In the embodiment of the application, the shooting apparatus may include a shooting device capable of moving autonomously, so that shooting the shot object from different angles is realized automatically; for example, the shooting device may be formed by a camera and an annular turntable. Referring to fig. 3, fig. 3 shows a schematic diagram of a shooting apparatus according to an embodiment of the present application. As shown in fig. 3, the shooting apparatus may include an electronic device 31, a camera 32 connected (wirelessly or by wire) to the electronic device, and a turntable 33 connected to the electronic device; the camera 32 is arranged on the turntable 33, and the camera 32 can shoot a shot object 34 at different framing angles as the turntable 33 rotates.
Referring to fig. 4, fig. 4 shows a shooting control logic diagram provided by the embodiment of the present application. As shown in fig. 4, the electronic device may be connected to a programmable logic controller (Programmable Logic Controller, PLC) in a wired or wireless manner. The electronic device sends a rotation control signal to the programmable logic controller through the Modbus protocol; after receiving the rotation control signal, the programmable logic controller sends a driving signal to the servo motor connected to it, controlling the servo motor to drive the turntable on which the camera is arranged to rotate, thereby adjusting the shooting angle of the camera with respect to the shot object. The electronic device can be connected to the zoom device in a wired or wireless manner and, by sending a zoom control signal to the zoom device, drive the zoom device to rotate the zoom ring of the camera, thereby adjusting the focal length of the camera. The electronic device may also connect to the camera by wired or wireless means to control various functions of the camera, such as acquiring the shooting preview picture from the camera, transmitting shooting parameters to the camera, and controlling the camera to shoot an image.
It should be noted that, in the process of photographing around the object, the photographing parameters may be set and the focal length of the camera may be adjusted by using the method in the foregoing steps. In addition, the technician may set up the photographing device capable of moving automatically in other manners, such as an unmanned vehicle carrying a camera, which is not particularly limited in the embodiment of the present application.
According to the embodiment of the application, the surrounding shooting of the shot object can be automatically realized, the shooting parameters are automatically set according to the information such as the color of the shot object in the shooting process, and meanwhile, the sizes of the shot objects in a plurality of images obtained by the surrounding shooting can be unified in an automatic zooming mode and the like, so that the shooting convenience and the shooting efficiency are improved.
In summary, another photographing method provided by the application includes: acquiring a shooting preview picture containing a shot object, and determining color information of the shot object from the shooting preview picture; determining shooting parameters corresponding to the color information; and shooting the shot object based on the shooting parameters to obtain a target image containing the shot object. When a photo is taken, the color information of the shot object is determined from the shooting preview picture, shooting parameters matched with the color of the shot object are determined according to that color information, and the shot object is shot with the matched shooting parameters to obtain the target image.
Example III
Referring to fig. 5, a flowchart illustrating steps of still another photographing method according to an embodiment of the present application may include the following steps:
step 301, acquiring a shooting preview screen containing the shot object, and determining color information of the shot object from the shooting preview screen.
In this step, reference is made to step 201, and the description of the embodiment of the present application is omitted.
Step 302, determining material information of the object.
In addition to the color of the shot object, the material of the shot object may also have a large influence on the shot image. For example, if the material of the shot object is a mirror or smooth surface material, the surface glossiness is high and the reflectivity is high; if the surface of the shot object is a frosted or matte material, the surface glossiness is low and the reflectivity is low. To obtain a good shooting effect, it is generally necessary to set different shooting parameters for shot objects of different materials.
In the embodiment of the application, the shooting parameters can be determined according to the material information of the shot object, so that good shooting effects can be obtained for the shot objects with different materials.
Specifically, the shooting preview picture is input into a preset material detection model to obtain the material information output by the material detection model. The material detection model can be obtained by training a neural network model, or can be a mathematical model constructed by a technician. It should be noted that a technician may construct the material detection model according to actual needs; the form and generation method of the material detection model are not specifically limited in the embodiment of the present application. The material information may include a glossy material, a matte material, and the like.
In addition, the reflectivity of the shot object may be detected by a laser sensor, so that the material information of the shot object is determined from the reflectivity. The laser sensor can emit a beam of preset intensity toward the shot object and receive the beam reflected by the shot object; the reflectivity of the shot object can then be determined from the ratio between the intensity of the reflected beam and the intensity of the emitted beam.
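As an assumed illustration of this reflectivity-based judgement (the threshold value is illustrative only, not a value given by the application):

```python
# Sketch (assumption): classify the material of the shot object from the
# ratio of reflected to emitted beam intensity. The threshold is illustrative.
def material_from_reflectivity(emitted_intensity: float,
                               reflected_intensity: float,
                               glossy_threshold: float = 0.5) -> str:
    reflectivity = reflected_intensity / emitted_intensity
    return "glossy" if reflectivity >= glossy_threshold else "matte"
```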
Step 303, determining shooting parameters for the shot object based on the color information and the material information.
In one embodiment, after obtaining the color information and the texture information of the object, the shooting parameters corresponding to the color information and the texture information of the object may be determined according to the correspondence between the color information, the texture information and the shooting parameters. The corresponding relationship of the three can be preset by a technician or a user according to actual needs, and the application does not limit the corresponding relationship of the three.
In another embodiment, after the color information and the material information of the shot object are obtained, the method in the above steps may be used to determine the shooting parameters corresponding to the shot object according to its color information, and those shooting parameters are then corrected according to the material information of the shot object. For example, if the material information of the shot object is a matte material, the exposure time can be prolonged, the sensitivity can be increased, or the aperture can be enlarged, so as to increase the exposure when capturing the image.
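As an assumed illustration of this correction step, with the shooting parameters held in a plain dictionary and the correction factors chosen only for demonstration:

```python
# Sketch (assumption): correct color-derived shooting parameters according to
# the material information. For a matte (low-reflectivity) material the
# exposure is increased; the factors below are placeholders.
def correct_for_material(params: dict, material: str) -> dict:
    corrected = dict(params)
    if material == "matte":
        corrected["shutter_speed"] = params["shutter_speed"] * 2       # longer exposure time
        corrected["iso"] = params["iso"] * 2                           # higher sensitivity
        corrected["aperture"] = max(params["aperture"] / 1.4, 1.4)     # larger aperture (smaller f-number)
    return corrected                                                   # other materials: keep color-based parameters

# Usage sketch:
# correct_for_material({"shutter_speed": 1/250, "iso": 100, "aperture": 5.6}, "matte")
```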
In the embodiment of the application, the shooting parameters for the shot object can be determined according to the color information of the shot object, and the shooting parameters for the shot object can be determined according to the color information and the material information of the shot object, so that the shooting effect and the shooting efficiency of shooting the shot objects with different colors and materials are further improved.
Step 304 of determining a current size of the subject in a photographing preview screen, and a target size set for the subject.
In this step, reference is made to step 204, and the description of the embodiment of the present application is omitted.
Step 305, adjusting the size of the object to the target size.
For this step, reference may be made to step 205; details are not repeated in the embodiment of the present application.
And step 306, shooting the shot object based on the shooting parameters to obtain a target image.
For this step, reference may be made to step 206; details are not repeated in the embodiment of the present application.
In summary, the shooting method provided by the present application further includes: acquiring a shooting preview picture containing a shot object, and determining color information of the shot object from the shooting preview picture; determining shooting parameters corresponding to the color information; and shooting the shot object based on the shooting parameters to obtain a target image containing the shot object. When a photo is taken, the color information of the shot object is determined from the shooting preview picture, shooting parameters matched with the color of the shot object are determined according to that color information, and the shot object is shot with the matched shooting parameters to obtain the target image.
Example IV
The embodiment of the application provides a shooting device.
Referring to fig. 6, a schematic structural diagram of a photographing device according to an embodiment of the present application is shown in fig. 6:
the shooting device of the embodiment of the application comprises: an acquisition module 501, a parameter module 502 and a shooting module 503.
The functions of the modules and the interaction relationship between the modules are described in detail below.
The acquisition module 501 is configured to acquire a shooting preview screen including a subject, and determine color information of the subject from the shooting preview screen.
A parameter module 502 for determining shooting parameters for a subject according to color information.
The shooting module 503 is configured to shoot the object based on the shooting parameters, and obtain a target image.
Optionally, the color information includes a color category, and the acquiring module includes:
and the color coding sub-module is used for inputting the shot preview picture into the color recognition model to obtain the color code output by the color recognition model.
A color class sub-module for determining a color class of the subject based on the color coding.
Optionally, the parameter module includes:
and the corresponding relation sub-module is used for acquiring a corresponding relation of the target parameters, wherein the corresponding relation of the target parameters comprises the corresponding relation of the color information and the shooting parameters.
And the first parameter sub-module is used for determining shooting parameters corresponding to the color information of the shot object according to the corresponding relation of the target parameters.
Optionally, the apparatus further comprises:
and the object type module is used for determining the object type of the shot object.
And the corresponding relation determining module is used for determining the corresponding target parameter correspondence according to the object type.
Optionally, the apparatus further comprises:
the size determination sub-module is used for determining the current size of the shot object in the shooting preview picture and the target size set for the shot object.
And the size adjustment sub-module is used for adjusting the size of the shot object to the target size.
Optionally, the resizing submodule includes:
and the focal length increasing submodule is used for increasing the focal length of the camera under the condition that the current size is smaller than the target size.
And the focal length reducing submodule is used for reducing the focal length of the camera under the condition that the current size is larger than the target size.
Optionally, the shooting module includes:
and the angle interval submodule is used for acquiring the framing angle interval of the shot object.
And the shooting sub-module is used for controlling the camera to move around the shot object, shooting the shot object at a view angle interval, and obtaining a plurality of target images containing the shot object, wherein the target images are obtained by shooting based on shooting parameters.
Optionally, the parameter module includes:
and the material proton module is used for determining the material information of the shot object.
And a second parameter sub-module for determining shooting parameters for the shot object based on the color information and the material information.
In summary, the photographing device provided by the application comprises: an acquisition module for acquiring a shooting preview picture containing a shot object, and determining color information of the shot object from the shooting preview picture; a parameter module for determining shooting parameters for the shot object according to the color information; and a shooting module for shooting the shot object based on the shooting parameters to obtain a target image. When a photo is taken, the color information of the shot object is determined from the shooting preview picture, shooting parameters matched with the color of the shot object are determined according to that color information, and the shot object is shot with the matched shooting parameters to obtain the target image.
Example five
Fig. 7 is a schematic hardware structure of a terminal device for implementing various embodiments of the present application.
The terminal device 600 includes, but is not limited to: radio frequency unit 601, network module 602, audio output unit 603, input unit 604, sensor 605, display unit 606, user input unit 607, interface unit 608, memory 609, processor 610, and power supply 611. It will be appreciated by those skilled in the art that the terminal device structure shown in fig. 7 does not constitute a limitation of the terminal device, and the terminal device may comprise more or less components than shown, or may combine certain components, or may have a different arrangement of components. In the embodiment of the application, the terminal equipment comprises, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer and the like.
It should be understood that, in the embodiment of the present application, the radio frequency unit 601 may be used to receive and send signals during information transmission or a call; specifically, it receives downlink data from a base station and passes it to the processor 610 for processing, and it sends uplink data to the base station. Typically, the radio frequency unit 601 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 601 may also communicate with networks and other devices through a wireless communication system.
The terminal device provides wireless broadband internet access to the user via the network module 602, such as helping the user to send and receive e-mail, browse web pages, access streaming media, etc.
The audio output unit 603 may convert audio data received by the radio frequency unit 601 or the network module 602 or stored in the memory 609 into an audio signal and output as sound. Also, the audio output unit 603 may also provide audio output (e.g., a call signal reception sound, a message reception sound, etc.) related to a specific function performed by the terminal device 600. The audio output unit 603 includes a speaker, a buzzer, a receiver, and the like.
The input unit 604 is used for receiving audio or video signals. The input unit 604 may include a graphics processor (Graphics Processing Unit, GPU) 6041 and a microphone 6042, the graphics processor 6041 processing image data of still pictures or video obtained by an image capturing apparatus (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 606. The image frames processed by the graphics processor 6041 may be stored in the memory 609 (or other storage medium) or transmitted via the radio frequency unit 601 or the network module 602. Microphone 6042 may receive sound and can process such sound into audio data. The processed audio data may be converted into a format output that can be transmitted to the mobile communication base station via the radio frequency unit 601 in the case of a telephone call mode.
The terminal device 600 further comprises at least one sensor 605, such as a light sensor, a motion sensor and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 6061 according to the brightness of ambient light, and the proximity sensor can turn off the display panel 6061 and/or the backlight when the terminal device 600 is moved to the ear. As a kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in all directions (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used for recognizing the posture of the terminal device (such as horizontal/vertical screen switching, related games, magnetometer posture calibration), vibration-recognition related functions (such as pedometer and tapping) and the like. The sensor 605 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which are not described herein.
The display unit 606 is used to display information input by a user or information provided to the user. The display unit 606 may include a display panel 6061, and the display panel 6061 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 607 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 607 includes a touch panel 6071 and other input devices 6072. The touch panel 6071, also referred to as a touch screen, may collect touch operations performed by a user on or near it (e.g., operations performed on or near the touch panel 6071 using a finger, a stylus, or any other suitable object or accessory). The touch panel 6071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position of the user's touch, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 610, and receives and executes commands sent from the processor 610. In addition, the touch panel 6071 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 6071, the user input unit 607 may include other input devices 6072, which may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys and switch keys), a trackball, a mouse, and a joystick, and are not described herein.
Further, the touch panel 6071 may be overlaid on the display panel 6061. When the touch panel 6071 detects a touch operation on or near it, the touch operation is transmitted to the processor 610 to determine the type of the touch event, and the processor 610 then provides a corresponding visual output on the display panel 6061 according to the type of the touch event. Although in fig. 7 the touch panel 6071 and the display panel 6061 are shown as two independent components to realize the input and output functions of the terminal device, in some embodiments the touch panel 6071 and the display panel 6061 may be integrated to realize the input and output functions of the terminal device, which is not limited herein.
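The touch-handling path just described (touch detection device, touch controller, processor, visual output) is in effect a small event pipeline. The following Python sketch is purely illustrative; the class names, event types, and responses are hypothetical stand-ins rather than a real device interface.

```python
from dataclasses import dataclass


@dataclass
class TouchEvent:
    x: int
    y: int
    kind: str  # e.g. "tap" or "long_press"


class TouchController:
    def to_coordinates(self, raw: dict) -> TouchEvent:
        # Convert the raw detection signal into touch-point coordinates.
        return TouchEvent(raw["x"], raw["y"], raw.get("kind", "tap"))


class Processor:
    def visual_output(self, event: TouchEvent) -> str:
        # Choose the visual response according to the type of touch event.
        if event.kind == "long_press":
            return f"open context menu at ({event.x}, {event.y})"
        return f"highlight control at ({event.x}, {event.y})"


controller, processor = TouchController(), Processor()
print(processor.visual_output(controller.to_coordinates({"x": 120, "y": 480})))
```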
The interface unit 608 is the interface through which external devices are connected to the terminal device 600. For example, the external devices may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 608 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal device 600, or may be used to transmit data between the terminal device 600 and an external device.
The memory 609 may be used to store software programs as well as various data. The memory 609 may mainly include a program storage area and a data storage area: the program storage area may store an operating system and application programs required for at least one function (such as a sound playing function or an image playing function), and the data storage area may store data created according to the use of the terminal device (such as audio data and a phonebook). In addition, the memory 609 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 610 is a control center of the terminal device, connects various parts of the entire terminal device using various interfaces and lines, and performs various functions of the terminal device and processes data by running or executing software programs and/or modules stored in the memory 609 and calling data stored in the memory 609, thereby performing overall monitoring of the terminal device. The processor 610 may include one or more processing units; preferably, the processor 610 may integrate an application processor that primarily handles operating systems, user interfaces, applications, etc., with a modem processor that primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 610.
The terminal device 600 may further include a power supply 611 (such as a battery) for supplying power to the respective components. Preferably, the power supply 611 may be logically connected to the processor 610 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system.
In addition, the terminal device 600 includes some functional modules, which are not shown, and will not be described herein.
Preferably, the embodiment of the present application further provides a terminal device, including: the processor 610, the memory 609, and a computer program stored in the memory 609 and capable of running on the processor 610. When executed by the processor 610, the computer program implements the respective processes of the above-described shooting method embodiment and achieves the same technical effects; to avoid repetition, a detailed description is omitted herein.
The embodiment of the application also provides a computer-readable storage medium, on which a computer program is stored; when executed by a processor, the computer program implements the processes of the shooting method embodiment and can achieve the same technical effects, and the description is not repeated here to avoid repetition. The computer-readable storage medium may be, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, and of course may also be implemented by hardware, but in many cases the former is the preferred implementation. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, or optical disk) and comprising instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods according to the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive. In light of the present application, those of ordinary skill in the art may make many further forms without departing from the spirit of the present application and the scope of the claims, all of which fall within the protection of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; e.g., the division of the units is merely a logical function division, and there may be other divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling or direct coupling or communication connection shown or discussed between the components may be an indirect coupling or communication connection via some interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium and comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, an optical disk, and the like.
The foregoing is merely illustrative of the present application and does not limit it; any variation or substitution that a person skilled in the art can readily conceive within the scope disclosed by the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (16)

1. A shooting method, the method comprising:
acquiring a shooting preview picture containing a shot object, and determining color information of the shot object from the shooting preview picture;
determining shooting parameters for the shot object according to the color information; wherein the shooting parameters comprise at least one of shutter speed, white balance, aperture size and photosensitivity;
shooting the shot object based on the shooting parameters to obtain a target image;
wherein the color information includes a color category, and the determining of the color information of the shot object from the shooting preview picture includes:
inputting the shooting preview picture into a color recognition model to obtain a color code of the shot object output by the color recognition model; wherein the color recognition model is used for recognizing the shot object from the shooting preview picture and extracting the color code of the shot object;
and determining the color category of the shot object based on the color code.
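As a purely illustrative sketch of the color-category determination recited in claim 1, the following Python fragment assumes the color recognition model returns an RGB color code such as "#1E90FF"; the reference palette and function names are hypothetical and do not form part of the claimed method.

```python
from math import dist

# Hypothetical reference palette mapping a color category to a representative RGB value.
REFERENCE_COLORS = {
    "black": (0, 0, 0),
    "white": (255, 255, 255),
    "red": (200, 30, 30),
    "green": (40, 160, 60),
    "blue": (30, 90, 220),
    "yellow": (230, 210, 40),
}


def hex_to_rgb(color_code: str) -> tuple:
    code = color_code.lstrip("#")
    return tuple(int(code[i:i + 2], 16) for i in (0, 2, 4))


def color_category(color_code: str) -> str:
    """Map a color code output by the recognition model to the nearest reference category."""
    rgb = hex_to_rgb(color_code)
    return min(REFERENCE_COLORS, key=lambda name: dist(rgb, REFERENCE_COLORS[name]))


print(color_category("#1E90FF"))  # -> "blue"
```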
2. The method according to claim 1, wherein the determining shooting parameters for the shot object according to the color information comprises:
acquiring a target parameter corresponding relation, wherein the target parameter corresponding relation comprises a corresponding relation between color information and shooting parameters;
and determining shooting parameters corresponding to the color information of the shot object according to the target parameter corresponding relation.
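The target parameter corresponding relation recited in claim 2 can be pictured as a lookup table from color category to shooting parameters. A minimal Python sketch follows; the parameter values are invented for illustration and are not prescribed by the patent.

```python
from dataclasses import dataclass


@dataclass
class ShootingParams:
    shutter_speed: float  # seconds
    white_balance_k: int  # color temperature in kelvin
    aperture: float       # f-number
    iso: int              # photosensitivity


# Hypothetical correspondence between color category and shooting parameters.
TARGET_PARAMETER_MAP = {
    "black": ShootingParams(1 / 60, 5200, 2.8, 400),
    "white": ShootingParams(1 / 250, 5500, 5.6, 100),
    "red": ShootingParams(1 / 125, 5000, 4.0, 200),
}


def params_for_color(color_category: str) -> ShootingParams:
    # Fall back to a neutral default when the color category is not in the table.
    return TARGET_PARAMETER_MAP.get(color_category, ShootingParams(1 / 125, 5500, 4.0, 200))


print(params_for_color("white"))
```

In practice such a table would be populated per device and, as claim 3 notes, may differ per object type.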
3. The method according to claim 1, wherein the method further comprises:
determining an object type of the shot object;
and determining the corresponding target parameter corresponding relation according to the object type.
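Claim 3 adds one level of indirection: the object type selects which color-to-parameter table applies. A minimal sketch with hypothetical object types and invented values:

```python
# Hypothetical per-object-type correspondence tables (object types and values invented).
CORRESPONDENCES_BY_OBJECT_TYPE = {
    "car": {
        "black": {"shutter": 1 / 60, "aperture": 2.8, "iso": 400},
        "white": {"shutter": 1 / 250, "aperture": 5.6, "iso": 100},
    },
    "clothing": {
        "black": {"shutter": 1 / 100, "aperture": 2.0, "iso": 800},
        "red": {"shutter": 1 / 125, "aperture": 4.0, "iso": 200},
    },
}


def correspondence_for(object_type: str) -> dict:
    # Unknown object types fall back to an empty table, to be handled by the caller.
    return CORRESPONDENCES_BY_OBJECT_TYPE.get(object_type, {})


print(correspondence_for("car")["black"])
```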
4. The method according to claim 1, wherein the method further comprises:
determining a current size of the shot object in the shooting preview picture, and a target size set for the shot object;
and adjusting the size of the shot object to the target size.
5. The method of claim 4, wherein the adjusting the size of the shot object to the target size comprises:
increasing a focal length of a camera in a case where the current size is smaller than the target size;
and reducing the focal length of the camera in a case where the current size is larger than the target size.
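Claims 4 and 5 adjust the focal length until the shot object reaches a preset size in the shooting preview picture. A minimal sketch, assuming the apparent size scales roughly linearly with focal length; sizes are fractions of the frame and all numbers are illustrative.

```python
def adjusted_focal_length(current_size: float, target_size: float,
                          focal_length_mm: float) -> float:
    """Return a longer focal length when the shot object is smaller than the target size
    (zoom in) and a shorter one when it is larger (zoom out)."""
    if current_size <= 0 or target_size <= 0:
        raise ValueError("sizes must be positive fractions of the frame")
    scale = target_size / current_size  # > 1 means zoom in, < 1 means zoom out
    return focal_length_mm * scale


print(adjusted_focal_length(current_size=0.2, target_size=0.4, focal_length_mm=26.0))  # 52.0
```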
6. The method according to claim 1, wherein the shooting the shot object based on the shooting parameters to obtain a target image comprises:
acquiring a view angle interval for the shot object; wherein the view angle interval represents a difference in view angle between two adjacent captured images of the shot object;
and controlling a camera to move around the shot object and shooting the shot object at the view angle interval to obtain a plurality of target images containing the shot object; wherein the target images are obtained by shooting based on the shooting parameters.
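Claim 6 captures the shot object from several viewpoints separated by a fixed view angle interval. A minimal sketch in which the camera-movement call is only a placeholder comment:

```python
def capture_around_subject(angle_interval_deg: float, shooting_params: dict) -> list:
    """Record one capture per view angle while the camera circles the shot object."""
    if not 0 < angle_interval_deg <= 360:
        raise ValueError("the view angle interval must be in (0, 360] degrees")
    images = []
    angle = 0.0
    while angle < 360.0:
        # move_camera_to(angle) and take_photo(shooting_params) would go here;
        # they are placeholders for real camera control.
        images.append({"view_angle_deg": angle, "params": shooting_params})
        angle += angle_interval_deg
    return images


shots = capture_around_subject(30.0, {"shutter": 1 / 125, "iso": 200})
print(len(shots))  # 12 target images, one every 30 degrees
```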
7. The method according to claim 1, wherein the determining shooting parameters for the shot object according to the color information comprises:
determining material information of the shot object;
and determining shooting parameters for the shot object based on the color information and the material information.
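Claim 7 combines color with material information when choosing shooting parameters; one plausible realisation is a table keyed by (color, material) pairs. The entries and the fallback policy below are illustrative assumptions.

```python
# Hypothetical table keyed by (color category, material); all entries are invented.
PARAMS_BY_COLOR_AND_MATERIAL = {
    ("black", "glossy"): {"iso": 100, "aperture": 5.6, "white_balance_k": 5200},
    ("black", "matte"): {"iso": 400, "aperture": 2.8, "white_balance_k": 5200},
    ("white", "fabric"): {"iso": 200, "aperture": 4.0, "white_balance_k": 5600},
}


def params_for(color: str, material: str) -> dict:
    # Fall back to a generic default if the exact (color, material) pair is not configured.
    default = {"iso": 200, "aperture": 4.0, "white_balance_k": 5500}
    return PARAMS_BY_COLOR_AND_MATERIAL.get((color, material), default)


print(params_for("black", "glossy"))
```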
8. A shooting apparatus, the apparatus comprising:
an acquisition module, configured to acquire a shooting preview picture containing a shot object and determine color information of the shot object from the shooting preview picture;
a parameter module for determining shooting parameters for the shot object according to the color information; wherein the shooting parameters comprise at least one of shutter speed, white balance, aperture size and photosensitivity;
and a shooting module for shooting the shot object based on the shooting parameters to obtain a target image;
wherein the color information includes a color category, and the determining of the color information of the shot object from the shooting preview picture includes:
inputting the shooting preview picture into a color recognition model to obtain a color code of the shot object output by the color recognition model; wherein the color recognition model is used for recognizing the shot object from the shooting preview picture and extracting the color code of the shot object;
and determining the color category of the shot object based on the color code.
9. The apparatus of claim 8, wherein the parameter module comprises:
a corresponding relation sub-module for acquiring a target parameter corresponding relation, wherein the target parameter corresponding relation comprises a corresponding relation between color information and shooting parameters;
and a first parameter sub-module for determining shooting parameters corresponding to the color information of the shot object according to the target parameter corresponding relation.
10. The apparatus of claim 8, wherein the apparatus further comprises:
an object type module for determining an object type of the shot object;
and a corresponding relation determining module for determining the corresponding target parameter corresponding relation according to the object type.
11. The apparatus of claim 8, wherein the apparatus further comprises:
a size determination sub-module for determining a current size of the shot object in the shooting preview picture and a target size set for the shot object;
and a size adjustment sub-module for adjusting the size of the shot object to the target size.
12. The apparatus of claim 11, wherein the size adjustment sub-module comprises:
a focal length increasing sub-module for increasing a focal length of the camera in a case where the current size is smaller than the target size;
and a focal length reducing sub-module for reducing the focal length of the camera in a case where the current size is larger than the target size.
13. The apparatus of claim 8, wherein the shooting module comprises:
an angle interval sub-module for acquiring a view angle interval for the shot object; wherein the view angle interval represents a difference in view angle between two adjacent captured images of the shot object;
and a shooting sub-module for controlling a camera to move around the shot object and shooting the shot object at the view angle interval to obtain a plurality of target images containing the shot object, wherein the target images are obtained by shooting based on the shooting parameters.
14. The apparatus of claim 8, wherein the parameter module comprises:
a material sub-module for determining material information of the shot object;
and a second parameter sub-module for determining shooting parameters for the shot object based on the color information and the material information.
15. A terminal device, comprising: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the shooting method according to any one of claims 1 to 7.
16. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the shooting method according to any one of claims 1 to 7.
CN202210989573.6A 2022-08-16 2022-08-16 Shooting method, shooting device, terminal equipment and storage medium Active CN115499581B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210989573.6A CN115499581B (en) 2022-08-16 2022-08-16 Shooting method, shooting device, terminal equipment and storage medium

Publications (2)

Publication Number Publication Date
CN115499581A CN115499581A (en) 2022-12-20
CN115499581B true CN115499581B (en) 2023-11-21

Family

ID=84465993

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210989573.6A Active CN115499581B (en) 2022-08-16 2022-08-16 Shooting method, shooting device, terminal equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115499581B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104333702A (en) * 2014-11-28 2015-02-04 广东欧珀移动通信有限公司 Method, device and terminal for automatic focusing
WO2017143933A1 (en) * 2016-02-26 2017-08-31 阿里巴巴集团控股有限公司 Method, device, mobile terminal, and camera for identifying photograph subject
CN109600550A (en) * 2018-12-18 2019-04-09 维沃移动通信有限公司 A kind of shooting reminding method and terminal device
CN111107270A (en) * 2019-12-31 2020-05-05 维沃移动通信有限公司 Shooting method and electronic equipment
WO2021051995A1 (en) * 2019-09-17 2021-03-25 维沃移动通信有限公司 Photographing method and terminal
EP3833001A1 (en) * 2019-12-05 2021-06-09 Beijing Xiaomi Mobile Software Co., Ltd. Image shooting method and device, terminal, and non-transitory storage medium for automatic shooting
CN113709368A (en) * 2021-08-26 2021-11-26 维沃移动通信有限公司 Image display method, device and equipment
CN113890989A (en) * 2017-10-14 2022-01-04 华为技术有限公司 Shooting method and electronic device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6200265B2 (en) * 2012-11-06 2017-09-20 キヤノン株式会社 Imaging apparatus, control method therefor, program, and storage medium

Also Published As

Publication number Publication date
CN115499581A (en) 2022-12-20

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant