CN112887619A - Shooting method and device and electronic equipment - Google Patents

Shooting method and device and electronic equipment

Info

Publication number
CN112887619A
CN112887619A (application number CN202110120543.7A)
Authority
CN
China
Prior art keywords
sub
shooting
image
information
celestial object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110120543.7A
Other languages
Chinese (zh)
Inventor
温涛 (Wen Tao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202110120543.7A priority Critical patent/CN112887619A/en
Publication of CN112887619A publication Critical patent/CN112887619A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95Computational photography systems, e.g. light-field imaging systems
    • H04N23/951Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera

Abstract

The embodiments of the application disclose a shooting method, a shooting apparatus, and an electronic device. The method includes: when a celestial object is included in the shooting preview interface, acquiring the framing range of the celestial object; displaying, on the shooting preview interface, guide information corresponding to the framing range of the celestial object, the guide information including the framing range of each sub-image and the number of sub-images to shoot; shooting the celestial object to obtain a plurality of sub-images when, based on the guide information, the picture in the shooting preview interface meets a preset condition; and displaying a target image including the celestial object, the target image being synthesized from the plurality of sub-images. The embodiments of the application can improve shooting efficiency.

Description

Shooting method and device and electronic equipment
Technical Field
The embodiment of the application relates to the field of information processing, in particular to a shooting method, a shooting device and electronic equipment.
Background
With the continuous development of the shooting functions of electronic devices, more and more users use them to shoot celestial objects (such as the Milky Way and auroras). Because a celestial object is large, the total shooting angle of view is also large, and shooting it is difficult. Most users lack rich astrophotography knowledge and professional photography skills, so the images of celestial objects they capture are of poor quality.
In the process of implementing the present application, the applicant finds that at least the following problems exist in the prior art:
in order to obtain a satisfactory image of the celestial object, the user needs to shoot for a plurality of times, and the operation is complicated.
Disclosure of Invention
The embodiments of the application provide a shooting method, a shooting apparatus, and an electronic device, aiming to solve the problem that a user needs to shoot multiple times, with complicated operations, to obtain a satisfactory image of a celestial object.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides a shooting method, which may include:
under the condition that the shooting preview interface comprises the celestial object, acquiring a view finding range of the celestial object;
displaying guide information corresponding to a view range of the celestial object on a shooting preview interface; the guide information includes a framing range of the sub-image and a photographing number of the sub-image;
shooting the celestial object to obtain a plurality of sub-images under the condition that a picture in a shooting preview interface meets a preset condition based on the guide information;
and displaying a target image comprising the celestial object, wherein the target image is synthesized according to a plurality of sub-images.

In a second aspect, an embodiment of the present application provides a shooting apparatus, which may include:
the determining module is used for acquiring a view finding range of the celestial object under the condition that the shooting preview interface comprises the celestial object;
the display module is used for displaying guide information corresponding to the view finding range of the celestial body object on the shooting preview interface; the guide information includes a framing range of the sub-image and a photographing number of the sub-image;
the shooting module is used for shooting the celestial object to obtain a plurality of sub-images under the condition that the picture in the shooting preview interface meets the preset condition based on the guide information;
and the display module is also used for displaying a target image comprising the celestial object, and the target image is synthesized according to the plurality of sub-images.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiments of the application, guidance information is displayed on the shooting preview interface when the shooting preview interface includes a celestial object, and the guidance information is determined according to the framing range of the celestial object; therefore, based on the guidance information, shooting is performed when the picture in the shooting preview interface meets the preset condition, and a plurality of sub-images that can be stitched together to synthesize the full appearance of the celestial object can be captured. Further, the guidance information includes the framing range of each sub-image and the number of sub-images to shoot, which facilitates shooting the sub-images according to their framing ranges. Finally, a target image of the celestial object synthesized from the plurality of sub-images is displayed. A high-quality target image including the celestial object can thus be shot easily, improving shooting efficiency.
Drawings
The present application may be better understood from the following description of specific embodiments of the application taken in conjunction with the accompanying drawings, in which like or similar reference numerals identify like or similar features.
Fig. 1 is a schematic view of an application scenario of a shooting method according to an embodiment of the present application;
fig. 2 is a flowchart of a shooting method according to an embodiment of the present disclosure;
fig. 3 is a schematic view of an interface for displaying prompt information according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of an interface for displaying a reference image according to an embodiment of the present disclosure;
fig. 5 is a schematic interface diagram for displaying guidance information according to an embodiment of the present application;
fig. 6 is a schematic diagram for displaying a synthesis target image according to an embodiment of the present application;
FIG. 7 is a schematic diagram of an interface for displaying a target image according to an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of a shooting device according to an embodiment of the present disclosure;
fig. 9 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present disclosure;
fig. 10 is a schematic hardware structure diagram of another electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms first, second and the like in the description and in the claims of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application are capable of operation in sequences other than those illustrated or described herein. In addition, "and/or" in the specification and claims means at least one of connected objects, a character "/" generally means that a preceding and succeeding related objects are in an "or" relationship.
The shooting method provided by the embodiment of the application can be applied to at least the following application scenarios, which are explained below.
As shown in fig. 1, taking the Milky Way arch as an example: because the celestial object is large, its shooting framing range is also large, and a user without rich astrophotography knowledge and professional photography skills can capture only part of the celestial object; not knowing the time and position at which the celestial object appears, the user may also miss the best shooting opportunity. If the user wants a satisfactory image of the celestial object, multiple shots are needed, and the operation is complicated.
To address these problems, embodiments of the present application provide a shooting method, an apparatus, an electronic device, and a storage medium, so as to solve the problems in the related art that a user needs to shoot multiple times to obtain a satisfactory image of a celestial object and that the operation is complicated.
The method provided by the embodiment of the application can be applied to any scene in which the celestial object is difficult to shoot besides the application scene.
According to the method provided by the embodiments of the application, guidance information is displayed on the shooting preview interface when the shooting preview interface includes a celestial object, and the celestial object is then shot, when the picture in the shooting preview interface meets the preset condition, to obtain a plurality of sub-images. Since the guidance information is determined according to the framing range of the celestial object, shooting based on the guidance information makes it possible to capture a plurality of sub-images that can be stitched together to synthesize the full view of the celestial object. Further, the guidance information includes the framing range of each sub-image and the number of sub-images to shoot, which facilitates shooting the sub-images. Finally, a target image of the celestial object synthesized from the plurality of sub-images is displayed. A high-quality target image including the celestial object can thus be shot easily, improving shooting efficiency.
Based on the application scenario, the following describes the shooting method provided in the embodiment of the present application in detail.
Fig. 2 is a flowchart of a shooting method according to an embodiment of the present disclosure.
As shown in fig. 2, the photographing method may include steps 210 to 240, and the method is applied to a photographing apparatus, and specifically as follows:
and step 210, acquiring a view range of the celestial object under the condition that the shooting preview interface comprises the celestial object.
And step 220, displaying guide information corresponding to the view range of the celestial object on the shooting preview interface, wherein the guide information comprises the view range of the sub-image and the shooting quantity of the sub-image.
And step 230, shooting the celestial object to obtain a plurality of sub-images under the condition that the picture in the shooting preview interface meets the preset condition based on the guide information.
And 240, displaying a target image comprising the celestial object, wherein the target image is synthesized according to a plurality of sub-images.
According to the shooting method, guidance information is displayed on the shooting preview interface when the shooting preview interface includes a celestial object, and the celestial object is then shot, when the picture in the shooting preview interface meets the preset condition, to obtain a plurality of sub-images. Since the guidance information is determined according to the framing range of the celestial object, shooting based on the guidance information makes it possible to capture a plurality of sub-images that can be stitched together to synthesize the full view of the celestial object. Further, the guidance information includes the framing range of each sub-image and the number of sub-images to shoot, which facilitates shooting the sub-images. Finally, a target image of the celestial object synthesized from the plurality of sub-images is displayed. A high-quality target image including the celestial object can thus be shot easily, improving shooting efficiency.
The contents of steps 210-240 are described below:
first, step 210 is involved.
In the case where a celestial object is included in the shooting preview interface, the framing range of the celestial object is determined. Taking the Milky Way arch as an example, its framing range can be 180 degrees.
In one possible embodiment, time information of the shooting environment is obtained and the position information of the shooting environment is located; the Bortle dark-sky class of the shooting environment is determined according to the time information and the position information; and prompt information is displayed, the prompt information being used for prompting the user whether the Bortle class meets the conditions for shooting the celestial object.
The Bortle dark-sky scale is a classification method used in astrophotography to measure night-sky brightness. Because light pollution is increasingly serious, observation sites that are dark enough lie far from cities and many amateur astronomers cannot conveniently reach them, so they cannot accumulate enough experience to judge sky brightness. Moreover, individuals differ in visual and observational ability, and the length of observation time also has an influence. The Bortle scale integrates these factors so that the observation standard is more uniform and accurate.
The Bortle class quantifies the observability of celestial bodies and the degree to which light pollution interferes with astronomical observation. The scale has 9 classes, ranging from the darkest sky seen on Earth (class 1) to the sky at the center of a bustling city (class 9). Generally, classes 1-4 are suitable for shooting celestial objects such as the Milky Way.
Illustratively, a first input indicating the time information and position information of the shooting environment (for example: Tibet, 2020-12-18, 9 p.m.) is received and responded to, and the Bortle dark-sky class of the shooting environment (for example: class 3) is determined according to the time information and the position information; prompt information is then displayed for prompting the user that the Bortle class of the shooting environment is suitable for shooting the celestial object, i.e., that it meets the conditions for shooting the celestial object.
As shown in fig. 3, the prompt information prompts the user that the current Bortle class is 4 and is suitable for shooting celestial objects.
Thus, the Bortle dark-sky class of the shooting environment is determined according to the time information and the position information, and prompt information is then displayed for prompting the user whether that class is suitable for shooting the celestial object. This not only helps the user plan the shoot reasonably, but also helps the user find a suitable shooting environment.
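The class-to-prompt mapping described above can be sketched as follows; the Bortle classes are as described in the text (1 = darkest, 9 = inner city, classes 1-4 suitable), while the function name and message wording are illustrative:

```python
def bortle_prompt(level: int) -> str:
    """Map a Bortle dark-sky class (1 = darkest sky, 9 = inner-city sky)
    to a prompt message; classes 1-4 are treated as suitable for
    shooting celestial objects, per the text."""
    if not 1 <= level <= 9:
        raise ValueError("Bortle class must be between 1 and 9")
    if level <= 4:
        return f"Current Bortle class is {level}; suitable for shooting celestial objects"
    return f"Current Bortle class is {level}; too bright for shooting celestial objects"
```

For the fig. 3 scenario, `bortle_prompt(4)` would produce the "suitable" prompt.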
In a possible embodiment, before step 220, the following steps may be further included:
acquiring astronomical data of the celestial object from a preset astronomical database; generating an augmented reality reference image according to astronomical data; and displaying the augmented reality reference image on a shooting preview interface.
The astronomical data of the celestial object can be checked in the astronomical website, or the astronomical data of the celestial object can be called through a three-party interface.
In addition, since the position and shape of most celestial objects (such as the Milky Way) are strongly correlated with time, a time slider can be adjusted to check the position and shape of the Milky Way at any time for the shooting position entered by the user. This makes it convenient for a user to make a shooting plan and determine the optimal shooting time. Taking the Milky Way arch as an example, the optimal shooting time can be recommended automatically according to the elevation angle of the galactic core; the Milky Way arch is best shot when the core's elevation angle is 10-40 degrees.
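The recommended-window check above is a simple range test; a minimal sketch, where the 10-40 degree window comes from the text and the function name is illustrative:

```python
def core_elevation_suitable(angle_deg: float) -> bool:
    """True when the galactic core's elevation angle falls in the
    10-40 degree window recommended for shooting the Milky Way arch."""
    return 10.0 <= angle_deg <= 40.0
```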
The augmented reality reference image displayed on the shooting preview interface is generated from the astronomical data of the celestial object acquired from the preset astronomical database, so the reference image characterizes the objective form of the celestial object. Displaying it on the shooting preview interface allows the composition position of the celestial object in the interface to be determined from the reference image, which helps the user quickly determine the composition and improves shooting efficiency.
After the step of displaying the augmented reality reference image on the shooting preview interface, the method may further include the following steps:
receiving a first input; in response to the first input, transparency of the reference image is adjusted.
As shown in fig. 4, the transparency of the reference image (i.e., the virtual Milky Way) can be adjusted with the slider in the shooting preview interface, that is, the degree of fusion between the virtual Milky Way and the real Milky Way is adjusted. When the slider is set to 100%, the reference image is fully opaque, i.e., a completely virtual Milky Way is displayed on the shooting preview interface; when it is set to 0%, the reference image is fully transparent, i.e., the shooting preview interface displays only the completely real Milky Way.
Therefore, the transparency of the reference image is adjusted by receiving and responding to the first input, so that the virtual celestial object and the celestial object in the real shooting environment can be simultaneously displayed in the shooting preview interface, the comparison by a user is facilitated, and the reference and shooting based on the reference image by the user can be assisted.
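The fusion of the virtual reference image with the real preview amounts to per-pixel alpha blending; a minimal sketch assuming 8-bit RGB pixels (the function name and pixel representation are illustrative, not from the patent):

```python
def blend_pixel(virtual_px, real_px, opacity):
    """Alpha-blend one virtual-reference pixel over the real preview pixel.
    opacity = 1.0 shows the fully virtual Milky Way, opacity = 0.0 the
    fully real one, matching the slider behaviour described above."""
    return tuple(round(opacity * v + (1.0 - opacity) * r)
                 for v, r in zip(virtual_px, real_px))
```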
The guidance information includes direction guidance information, and after the step of acquiring astronomical data of the celestial object from the preset astronomical database, the method may further include the following steps:
determining first orientation information of the celestial object according to the astronomical data; determining second orientation information of the camera according to the detected gyroscope information; and determining direction guide information according to the first direction information and the second direction information, wherein the direction guide information is used for guiding a user to adjust the position of the camera.
Determining the shooting environment of a celestial object requires azimuth information in addition to time information and position information. A gyroscope is an angular-motion detection device that uses the moment of momentum of a high-speed rotor to sense angular motion, relative to inertial space, about one or two axes orthogonal to the rotation axis. The second orientation information of the camera can therefore be determined from the detected gyroscope information.
Illustratively, the first orientation information of the celestial object is determined from the astronomical data to be southeast; the second orientation information of the camera is determined from the detected gyroscope information to be due south; then, to reach an angle suitable for shooting the celestial object, the user needs to hold the electronic device and turn it from due south toward the southeast, i.e., rotate it 45 degrees counterclockwise (this is the direction guide information used to guide the user in adjusting the azimuth of the camera).
The first orientation information of the celestial object is thus determined according to the astronomical data, and the direction guide information is determined from it together with the second orientation information of the camera obtained from the gyroscope, so that the user can be guided to adjust the position of the camera based on the direction guide information, which facilitates shooting and improves shooting efficiency.
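The direction guidance can be sketched as computing the signed shortest rotation between the camera's compass heading and the celestial object's azimuth. A minimal sketch in Python; the sign convention (positive = clockwise) and function name are assumptions:

```python
def rotation_guidance(camera_deg: float, target_deg: float) -> float:
    """Signed shortest rotation, in degrees, from the camera's compass
    heading to the celestial object's azimuth. Positive means clockwise,
    negative counter-clockwise; the result is always in (-180, 180]."""
    return (target_deg - camera_deg + 180.0) % 360.0 - 180.0
```

For the example above, with the camera due south (180°) and the object southeast (135°), `rotation_guidance(180, 135)` yields -45, i.e., a 45-degree counterclockwise rotation.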
Next, step 220 is involved.
Before step 220, the following steps may be further included:
determining the number of shots of the sub-images according to the framing range of the celestial object, the field angle of the camera, and the preset overlap rate between every two adjacent sub-images;
and determining the framing range of the sub-images according to the framing range of the celestial object, the preset overlap rate, and the number of shots.
First, the number of shots of the sub-images can be determined according to the framing range of the celestial object, the camera field angle, and the preset overlap rate between every two adjacent sub-images.
The field of view (FOV) is the angle, with the lens of the optical instrument at its vertex, subtended by the largest range over which the image of the measured target can pass through the lens. The field angle can be determined from the focal length of the camera. For example, when shooting in vertical orientation, an equivalent 26 mm camera has a corresponding field angle of about 80°.
Specifically, the number of sub-images to be shot can be determined, based on the following formula (1), according to the framing range of the celestial object, the camera field angle, and the preset overlap rate between every two adjacent sub-images.
F * (1 - C) * (N - 2) = G    (1)
where F is the camera field angle, C is the preset overlap rate, and G is the framing range; the unknown to be determined is the number of shots N of the sub-images.
Illustratively, taking a panoramic Milky Way arch whose celestial-object framing range is 240° as an example, shot vertically with an equivalent 26 mm main camera lens, the calculation is as follows:
the corresponding camera field angle FOV of the equivalent 26mm camera is approximately equal to 80 degrees, if the equivalent is 60 percent, the equivalent 26mm camera is obtained by substituting the formula (1): 80 (1-60%). N (N-2) ═ 240, the number of images N in the subimages was calculated to be 9.5, and N was actually taken to be 10 (rounding up was performed for non-integer results to ensure yield). That is, when the predetermined overlap ratio between two sub-images is 60%, the required number of shots is about 10.
The field angle FOV of an equivalent 26 mm camera is approximately 80°. If the overlap rate is 50%, substituting into formula (1) gives: 80 * (1 - 50%) * (N - 2) = 240, so the number of shots N of the sub-images is calculated to be 8. That is, when the preset overlap rate between every two adjacent sub-images is 50%, 8 shots are required.
For example, guide information including the number of shots can be displayed on the shooting preview interface, together with which of all the sub-images is currently being shot. As shown in fig. 5, the currently shot sub-image is the 2nd, and the total number of sub-images to shoot is 8. This conveniently prompts the user during the shooting process.
Then, the framing range of each sub-image can be determined according to the framing range of the celestial object, the preset overlap rate, and the number of shots. For example, when the framing range of the object is 180°, the preset overlap rate is 50%, and the number of shots is 8, the framing range X of each sub-image satisfies:
N * X - X * 50% * (N - 1) = 180°, which gives X = 40°.
Thus, the number of sub-images to shoot and the framing range of each sub-image can be determined according to the framing range of the celestial object, the camera field angle, and the preset overlap rate between every two adjacent sub-images. The user only needs to press the shutter following the guide information to obtain the sub-images, which greatly reduces the shooting difficulty and improves shooting efficiency.
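The two computations above can be sketched directly from formula (1) and the sub-image framing-range relation; the function names are illustrative, and the round-up behaviour follows the text:

```python
import math

def shot_count(fov_deg: float, overlap: float, total_deg: float) -> int:
    """Number of sub-images N from formula (1), F*(1-C)*(N-2) = G,
    rounding non-integer results up as the text prescribes."""
    return math.ceil(total_deg / (fov_deg * (1.0 - overlap)) + 2.0)

def sub_image_range(total_deg: float, overlap: float, n: int) -> float:
    """Framing range X of each sub-image from N*X - X*C*(N-1) = G."""
    return total_deg / (n - overlap * (n - 1))
```

This reproduces the worked examples: `shot_count(80, 0.6, 240)` gives 10, `shot_count(80, 0.5, 240)` gives 8, and `sub_image_range(180, 0.5, 8)` gives 40 degrees.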
Next, step 230 is involved.
And shooting the celestial object according to the guide information to obtain a plurality of sub-images.
The user can shoot the celestial object according to the guide information in the shooting preview interface, and the sequence number of the currently shot sub-image, the moving direction of the next sub-image and the view finding range of the sub-image are displayed in real time in the shooting preview interface in the shooting process.
As shown in fig. 6, a dotted line with a double arrow indicates a viewing range of the sub-image, and the arrow indicates a moving direction in which the next sub-image is captured, i.e., a direction in which the user moves the electronic device.
In a possible embodiment, the multiple sub-images include at least a first sub-image and a second sub-image captured adjacently, and the step 230 may specifically include the following steps:
under the condition that the first sub-image is obtained by shooting, determining the overlap rate between the first sub-image and the second sub-image corresponding to the preview picture; and shooting to obtain the second sub-image under the condition that the overlap rate meets a preset condition, the preset condition being determined according to the preset overlap rate.
To ensure the synthesis effect, the overlap rate between the captured first sub-image and the second sub-image corresponding to the preview picture can be determined, and the shot is taken when the overlap rate meets the preset condition, obtaining the second sub-image.
The preset condition is determined according to the preset overlap rate; for example, when the preset overlap rate is 50%, the preset condition may be that the overlap rate between the first sub-image and the second sub-image corresponding to the preview picture lies within the range of 40% to 60%.
Thus, the second sub-image is shot when the overlap rate between the first sub-image and the second sub-image corresponding to the preview picture meets the preset condition, which ensures sufficient overlap among the sub-images and improves the synthesis effect of the subsequently synthesized target image.
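The preset-condition check reduces to a tolerance window around the preset overlap rate; a minimal sketch, where the ±10% tolerance realizes the 40%-60% example from the text and the function name is illustrative:

```python
def overlap_meets_condition(measured: float, preset: float = 0.5,
                            tolerance: float = 0.1) -> bool:
    """True when the measured overlap between the captured first sub-image
    and the previewed second sub-image lies within preset +/- tolerance
    (the 40%-60% window from the text when the preset rate is 50%)."""
    return preset - tolerance <= measured <= preset + tolerance
```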
Finally, step 240 is addressed.
Before step 240, the following steps may also be included:
Panoramic synthesis, distortion correction, content recognition, automatic cropping, and exposure adjustment processing are carried out on the plurality of sub-images to obtain the target image.
Specifically, after the plurality of sub-images are shot, panoramic synthesis processing may be performed on the plurality of sub-images to obtain a first image, distortion correction processing may be performed on the first image to obtain a second image, content recognition and automatic cropping processing may be performed on the second image to obtain a third image, and exposure adjustment processing may be performed on the third image to obtain a target image.
Content recognition is introduced for the following reason: after the panoramic synthesis processing is performed on the plurality of sub-images, the distortion of the resulting first image is large, so distortion correction needs to be performed on the first image. Blank areas usually appear at the edges of the second image obtained after the distortion correction processing, so automatic cropping processing needs to be performed on the second image. Since the automatic cropping processing may lose part of the picture, incorporating content-recognition fill processing makes it possible to correct the distortion without losing part of the picture.
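The four-stage chain above can be sketched as function composition; the inner functions are hypothetical stand-ins that only record the processing order described in the text, not real image operations:

```python
def synthesize_target_image(sub_images):
    """Sketch of the post-processing chain: panoramic synthesis -> first
    image, distortion correction -> second image, content-recognition
    fill + automatic crop -> third image, exposure adjustment -> target."""
    trace = []

    def panoramic_synthesis(imgs):
        trace.append("panoramic_synthesis")
        return {"stitched_from": len(imgs)}   # stands in for the first image

    def distortion_correction(img):
        trace.append("distortion_correction")
        return img                            # stands in for the second image

    def recognize_fill_and_crop(img):
        # content recognition fills the blank edges so that cropping
        # does not lose part of the picture
        trace.append("recognition_and_crop")
        return img                            # stands in for the third image

    def exposure_adjustment(img):
        trace.append("exposure_adjustment")
        return img                            # stands in for the target image

    target = exposure_adjustment(
        recognize_fill_and_crop(
            distortion_correction(
                panoramic_synthesis(sub_images))))
    return target, trace
```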
Taking a Milky Way arch as an example of the celestial object, after the target image is synthesized it is displayed as shown in fig. 7. The target image of the Milky Way arch, synthesized from a plurality of sub-images each containing a local portion of the Milky Way, produces a striking visual impression. Through this embodiment, a high-quality target image including the Milky Way arch can be shot quickly and efficiently without the user needing rich astrophotography knowledge or professional panorama-stitching techniques, which improves shooting efficiency.
In summary, in the embodiment of the present application, when the shooting preview interface includes a celestial object, guide information is displayed on the shooting preview interface, and the celestial object is then shot when the picture in the shooting preview interface satisfies a preset condition, so as to obtain a plurality of sub-images. Since the guide information is determined according to the viewing range of the celestial object, shooting based on the guide information makes it possible to capture a plurality of sub-images that can be stitched into a complete view of the celestial object. Further, the guide information includes the viewing range of each sub-image and the number of sub-images to shoot, which facilitates shooting the sub-images. Finally, a target image of the celestial object synthesized from the plurality of sub-images is displayed. A high-quality target image including the celestial object can thus be shot easily, improving shooting efficiency.
It should be noted that, in the shooting method provided in the embodiment of the present application, the execution subject may be a shooting device, or a control module in the shooting device for executing the shooting method. In the embodiment of the present application, the shooting method provided in the embodiment of the present application is described by taking a shooting device executing the shooting method as an example.
In addition, based on the shooting method, an embodiment of the present application further provides a shooting device, which is specifically described in detail with reference to fig. 8.
Fig. 8 is a schematic structural diagram of a shooting device according to an embodiment of the present application.
As shown in fig. 8, the photographing apparatus 800 may include:
The determining module 810 is configured to acquire a viewing range of a celestial object when the celestial object is included in the shooting preview interface.
A display module 820, configured to display guidance information corresponding to a view range of the celestial object on the shooting preview interface; the guide information includes a viewing range of the sub-image and the number of shots of the sub-image.
The shooting module 830 is configured to shoot the celestial object to obtain a plurality of sub-images when the picture in the shooting preview interface meets a preset condition based on the guidance information.
The display module 820 is further configured to display a target image including the celestial object, where the target image is synthesized from the plurality of sub-images.
In a possible embodiment, the determining module 810 is further configured to determine the number of sub-images to shoot according to the viewing range of the celestial object, the field angle of the camera, and a preset coincidence rate between every two adjacent sub-images.
The determining module 810 is further configured to determine a viewing range of the sub-image according to the viewing range of the celestial object, a preset coincidence rate, and the number of shots.
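One plausible formulation of these two determinations (the text does not give an exact formula, so the advance-by-`fov * (1 - overlap)` model and the function below are assumptions):

```python
import math

def plan_sub_images(celestial_range_deg, fov_deg, overlap=0.5):
    """Derive the number of sub-images to shoot and each sub-image's
    viewing range (start, end) in degrees, given the celestial object's
    total viewing range, the camera's field angle, and the preset
    coincidence rate between adjacent sub-images."""
    step = fov_deg * (1.0 - overlap)   # net new coverage per extra frame
    if celestial_range_deg <= fov_deg:
        count = 1                      # one frame already covers the object
    else:
        count = 1 + math.ceil((celestial_range_deg - fov_deg) / step)
    frames = [(i * step, i * step + fov_deg) for i in range(count)]
    return count, frames
```

For a 180-degree Milky Way arch, a 60-degree field angle, and a 50% preset coincidence rate this yields 5 sub-images, the last spanning 120-180 degrees.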
In a possible embodiment, the multiple sub-images include at least a first sub-image and a second sub-image captured adjacently, and the determining module 810 is further configured to determine, when the first sub-image is captured, a coincidence rate of the first sub-image and the second sub-image corresponding to the preview screen.
The shooting module 830 is specifically configured to: and shooting under the condition that the coincidence rate meets a preset condition to obtain a second sub-image, wherein the preset condition is determined according to the preset coincidence rate.
In a possible embodiment, the photographing apparatus 800 may further include:
the first acquisition module is used for acquiring the time information of the shooting environment.
The positioning module is used for locating the position information of the shooting environment.
The determining module 810 is further configured to determine a Bortle dark-sky scale level of the shooting environment according to the time information and the position information.
The display module 820 is further configured to display prompt information, where the prompt information is used to prompt the user whether the Bortle dark-sky scale level satisfies the shooting condition for shooting the celestial object.
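A minimal sketch of the prompt logic, assuming the Bortle dark-sky scale (levels 1-9, 1 being an excellent dark site) and an illustrative threshold of level 4 as the worst sky still suitable for celestial shooting — the threshold is an assumption, not specified by the text:

```python
def darkness_prompt(bortle_level, max_suitable_level=4):
    """Return the prompt text for a given Bortle dark-sky scale level.
    Level 4 as the suitability cut-off is an illustrative assumption."""
    if not 1 <= bortle_level <= 9:
        raise ValueError("Bortle levels range from 1 to 9")
    if bortle_level <= max_suitable_level:
        return "Sky darkness satisfies the shooting condition."
    return "Sky too bright (Bortle level %d); a darker site is advised." % bortle_level
```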
In a possible embodiment, the photographing apparatus 800 may further include:
and the second acquisition module is used for acquiring astronomical data of the celestial object from a preset astronomical database.
And the generation module is used for generating an augmented reality reference image according to the astronomical data.
And the display module 820 is further configured to display the augmented reality reference image on the shooting preview interface.
In a possible embodiment, the photographing apparatus 800 may further include:
the receiving module is used for receiving a first input.
And the adjusting module is used for responding to the first input and adjusting the transparency of the reference image.
In a possible embodiment, the guiding information further includes direction guiding information, and the determining module 810 is further configured to determine the first orientation information of the celestial object according to astronomical data.
The determining module 810 is further configured to determine second orientation information of the camera according to the detected gyroscope information.
The determining module 810 is further configured to determine direction information according to the first orientation information and the second orientation information, where the direction information is used to guide a user to adjust a position of the camera.
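The orientation comparison above can be sketched as follows; representing both orientations as (azimuth, altitude) pairs in degrees and ignoring azimuth wrap-around are simplifying assumptions:

```python
def direction_guidance(celestial_orientation, camera_orientation, tol_deg=1.0):
    """Compare the celestial object's orientation (from astronomical data)
    with the camera's orientation (from gyroscope readings) and return
    hints that guide the user to adjust the camera position."""
    d_az = celestial_orientation[0] - camera_orientation[0]
    d_alt = celestial_orientation[1] - camera_orientation[1]
    hints = []
    if d_az > tol_deg:
        hints.append("pan right")
    elif d_az < -tol_deg:
        hints.append("pan left")
    if d_alt > tol_deg:
        hints.append("tilt up")
    elif d_alt < -tol_deg:
        hints.append("tilt down")
    return hints or ["on target"]
```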
In summary, the shooting device provided in the embodiment of the present application displays guide information on the shooting preview interface when the shooting preview interface includes a celestial object, and then shoots the celestial object to obtain a plurality of sub-images when the picture in the shooting preview interface satisfies a preset condition. Since the guide information is determined according to the viewing range of the celestial object, shooting based on the guide information makes it possible to capture a plurality of sub-images that can be stitched into a complete view of the celestial object. Further, the guide information includes the viewing range of each sub-image and the number of sub-images to shoot, which facilitates shooting the sub-images. Finally, a target image of the celestial object synthesized from the plurality of sub-images is displayed. A high-quality target image including the celestial object can thus be shot easily, improving shooting efficiency.
The photographing apparatus in the embodiment of the present application may be a stand-alone device, or may be a component, an integrated circuit, or a chip in a terminal. The device may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), and the non-mobile electronic device may be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, or a self-service machine; the embodiments of the present application are not specifically limited.
The photographing apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present application are not limited specifically.
The shooting device provided in the embodiment of the present application can implement each process implemented by the shooting device in the method embodiments of fig. 2 to 7, and is not described herein again to avoid repetition.
Optionally, as shown in fig. 9, an electronic device 900 is further provided in this embodiment of the present application, including a processor 901, a memory 902, and a program or an instruction stored in the memory 902 and executable on the processor 901, where the program or the instruction, when executed by the processor 901, implements each process of the above shooting method embodiment and can achieve the same technical effect; to avoid repetition, details are not repeated here.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
Fig. 10 is a schematic hardware structure diagram of another electronic device according to an embodiment of the present application.
The electronic device 1000 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, a processor 1010, and a power supply 1011. The input unit 1004 may include a graphics processor 10041 and a microphone 10042; the display unit 1006 may include a display panel 10061; the user input unit 1007 may include a touch panel 10071 and other input devices 10072; the memory 1009 may store application programs and an operating system.
Those skilled in the art will appreciate that the power supply 1011 (e.g., a battery) supplying power to the various components may be logically connected to the processor 1010 through a power management system, so as to implement functions such as managing charging, discharging, and power consumption. The electronic device structure shown in fig. 10 does not constitute a limitation of the electronic device, and the electronic device may include more or fewer components than those shown, combine certain components, or arrange the components differently; details are not repeated here.
And a processor 1010 configured to acquire a viewing range of the celestial object when the celestial object is included in the shooting preview interface.
A display unit 1006 for displaying guide information corresponding to a finder range of the celestial object on the shooting preview interface; the guide information includes a viewing range of the sub-image and the number of shots of the sub-image.
And the processor 1010 is configured to, based on the guidance information, shoot the celestial object when a picture in the shooting preview interface meets a preset condition, so as to obtain a plurality of sub-images.
The display unit 1006 is further configured to display a target image including the celestial object, where the target image is synthesized from a plurality of sub-images.
Optionally, the processor 1010 is further configured to determine the number of sub-images to shoot according to the viewing range of the celestial object, the field angle of the camera, and a preset coincidence rate between every two adjacent sub-images.
The processor 1010 is further configured to determine a viewing range of the sub-image according to the viewing range of the celestial object, a preset coincidence rate, and the number of shots.
Optionally, the multiple sub-images at least include a first sub-image and a second sub-image that are captured adjacently, and the processor 1010 is further configured to determine a coincidence rate of the first sub-image and the second sub-image corresponding to the preview screen when the first sub-image is captured.
The processor 1010 is specifically configured to: and shooting under the condition that the coincidence rate meets a preset condition to obtain a second sub-image, wherein the preset condition is determined according to the preset coincidence rate.
Optionally, the network module 1002 is configured to obtain time information of the shooting environment.
The network module 1002 is configured to locate position information of a shooting environment.
The processor 1010 is further configured to determine a Bortle dark-sky scale level of the shooting environment according to the time information and the position information.
The display unit 1006 is further configured to display prompt information, where the prompt information is used to prompt the user whether the Bortle dark-sky scale level satisfies the shooting condition for shooting the celestial object.
Optionally, the network module 1002 is configured to obtain astronomical data of the celestial object from a preset astronomical database.
A processor 1010 for generating an augmented reality reference image from the astronomical data.
And the display unit 1006 is further configured to display the augmented reality reference image on the shooting preview interface.
Optionally, a user input unit 1007 is used to receive a first input.
A processor 1010 for adjusting transparency of the reference image in response to the first input.
Optionally, the guiding information further includes direction guiding information, and the processor 1010 is further configured to determine first orientation information of the celestial object according to the astronomical data.
The processor 1010 is further configured to determine second orientation information of the camera according to the detected gyroscope information.
The processor 1010 is further configured to determine direction information according to the first orientation information and the second orientation information, where the direction information is used to guide a user to adjust a position of the camera.
In the embodiment of the present application, guide information is displayed on the shooting preview interface when the shooting preview interface includes a celestial object, and the celestial object is then shot when the picture in the shooting preview interface satisfies a preset condition, so as to obtain a plurality of sub-images. Since the guide information is determined according to the viewing range of the celestial object, shooting based on the guide information makes it possible to capture a plurality of sub-images that can be stitched into a complete view of the celestial object. Further, the guide information includes the viewing range of each sub-image and the number of sub-images to shoot, which facilitates shooting the sub-images. Finally, a target image of the celestial object synthesized from the plurality of sub-images is displayed. A high-quality target image including the celestial object can thus be shot easily, improving shooting efficiency.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the above shooting method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above shooting method embodiment, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order depending on the functions involved; e.g., the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (15)

1. A photographing method, characterized by comprising:
under the condition that a shooting preview interface comprises a celestial object, acquiring a view finding range of the celestial object;
displaying guide information corresponding to a viewing range of the celestial object on the shooting preview interface; the guide information includes a viewing range of the sub-image and a photographing number of the sub-image;
on the basis of the guide information, shooting the celestial object under the condition that a picture in the shooting preview interface meets a preset condition to obtain a plurality of sub-images;
and displaying a target image comprising the celestial object, wherein the target image is synthesized according to the plurality of sub-images.
2. The method of claim 1, wherein prior to said displaying the guide information on the capture preview interface, the method further comprises:
determining the photographing number of the sub-images according to the viewing range of the celestial object, the field angle of a camera, and a preset coincidence rate between every two adjacent sub-images;
and determining the viewing range of the sub-images according to the viewing range of the celestial object, the preset coincidence rate, and the photographing number.
3. The method according to claim 1, wherein the plurality of sub-images at least include a first sub-image and a second sub-image which are photographed adjacently, and the photographing of the celestial object based on the guidance information and in a case where a picture in the photographing preview interface satisfies a preset condition, to obtain the plurality of sub-images comprises:
under the condition that a first sub-image is obtained through shooting, determining the coincidence rate of the first sub-image and a second sub-image corresponding to a preview picture;
and shooting under the condition that the coincidence rate meets the preset condition to obtain the second sub-image, wherein the preset condition is determined according to the preset coincidence rate.
4. The method of claim 1, further comprising:
acquiring time information of a shooting environment and positioning position information of the shooting environment;
determining the Bortle dark-sky scale level of the shooting environment according to the time information and the position information;
and displaying prompt information, wherein the prompt information is used for prompting a user whether the Bortle dark-sky scale level meets the shooting condition for shooting the celestial object.
5. The method of claim 1, wherein prior to said displaying the guide information on the capture preview interface, the method further comprises:
acquiring astronomical data of the celestial object from a preset astronomical database;
generating an augmented reality reference image according to the astronomical data;
and displaying the augmented reality reference image on the shooting preview interface.
6. The method of claim 5, wherein after the displaying the augmented reality reference image in the capture preview interface, the method further comprises:
receiving a first input;
in response to the first input, adjusting a transparency of the reference image.
7. The method of claim 5, wherein the guide information further comprises direction guide information, and after the acquiring astronomical data of the celestial object from a preset astronomical database, the method further comprises:
determining first orientation information of the celestial object according to the astronomical data;
determining second orientation information of the camera according to the detected gyroscope information;
and determining direction information according to the first orientation information and the second orientation information, wherein the direction information is used for guiding a user to adjust the position of the camera.
8. A camera, comprising:
the shooting preview interface comprises a determination module, a display module and a display module, wherein the determination module is used for acquiring a view range of a celestial object under the condition that the shooting preview interface comprises the celestial object;
the display module is used for displaying guide information corresponding to the viewing range of the celestial object on the shooting preview interface; the guide information includes a viewing range of the sub-image and a photographing number of the sub-image;
the shooting module is used for shooting the celestial object to obtain a plurality of sub-images under the condition that the picture in the shooting preview interface meets the preset condition based on the guide information;
the display module is further configured to display a target image including the celestial object, and the target image is synthesized according to the plurality of sub-images.
9. The device according to claim 8, wherein the determining module is further configured to determine the photographing number of the sub-images according to the viewing range of the celestial object, the field angle of the camera, and a preset coincidence rate between every two adjacent sub-images;
the determining module is further configured to determine the viewing range of the sub-image according to the viewing range of the celestial object, the preset coincidence rate, and the shooting number.
10. The apparatus according to claim 8, wherein the plurality of sub-images include at least a first sub-image and a second sub-image captured adjacently, and the determining module is further configured to determine a coincidence rate of the first sub-image and the second sub-image corresponding to the preview screen when the first sub-image is captured;
the shooting module is specifically configured to: and shooting under the condition that the coincidence rate meets the preset condition to obtain the second sub-image, wherein the preset condition is determined according to the preset coincidence rate.
11. The apparatus of claim 8, further comprising:
the first acquisition module is used for acquiring time information of a shooting environment;
the positioning module is used for positioning the position information of the shooting environment;
the determining module is further configured to determine a Bortle dark-sky scale level of the shooting environment according to the time information and the position information;
the display module is further used for displaying prompt information, and the prompt information is used for prompting a user whether the Bortle dark-sky scale level meets the shooting condition for shooting the celestial object.
12. The apparatus of claim 8, further comprising:
the second acquisition module is used for acquiring astronomical data of the celestial object from a preset astronomical database;
the generating module is used for generating an augmented reality reference image according to the astronomical data;
the display module is further configured to display the augmented reality reference image on the shooting preview interface.
13. The apparatus of claim 12, further comprising:
a receiving module for receiving a first input;
an adjustment module to adjust a transparency of the reference image in response to the first input.
14. The apparatus of claim 12, wherein the guide information further comprises direction guide information, and the determining module is further configured to determine first orientation information of the celestial object according to the astronomical data;
the determining module is further configured to determine second orientation information of the camera according to the detected gyroscope information;
the determining module is further configured to determine direction information according to the first orientation information and the second orientation information, where the direction information is used to guide a user to adjust a position of the camera.
15. An electronic device comprising a processor, a memory and a program or instructions stored on the memory and executable on the processor, the program or instructions when executed by the processor implementing the steps of the photographing method according to any one of claims 1-7.
CN202110120543.7A 2021-01-28 2021-01-28 Shooting method and device and electronic equipment Pending CN112887619A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110120543.7A CN112887619A (en) 2021-01-28 2021-01-28 Shooting method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110120543.7A CN112887619A (en) 2021-01-28 2021-01-28 Shooting method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN112887619A true CN112887619A (en) 2021-06-01

Family

ID=76053181

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110120543.7A Pending CN112887619A (en) 2021-01-28 2021-01-28 Shooting method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN112887619A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113688820A (en) * 2021-08-25 2021-11-23 维沃移动通信有限公司 Stroboscopic stripe information identification method and device and electronic equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012131151A1 (en) * 2011-03-28 2012-10-04 Nokia Corporation Methods and apparatuses for generating a panoramic image
CN103685956A (en) * 2013-12-11 2014-03-26 宇龙计算机通信科技(深圳)有限公司 Panoramic photo shooting method and device
CN104994282A (en) * 2015-06-30 2015-10-21 广东欧珀移动通信有限公司 Large view angle camera control method and user terminal
US20160360082A1 (en) * 2014-02-25 2016-12-08 Sony Corporation Imaging apparatus and method, and program
WO2020008973A1 (en) * 2018-07-03 2020-01-09 富士フイルム株式会社 Image-capture plan presentation device and method
CN112087580A (en) * 2019-06-14 2020-12-15 Oppo广东移动通信有限公司 Image acquisition method and device, electronic equipment and computer readable storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012131151A1 (en) * 2011-03-28 2012-10-04 Nokia Corporation Methods and apparatuses for generating a panoramic image
CN103685956A (en) * 2013-12-11 2014-03-26 宇龙计算机通信科技(深圳)有限公司 Panoramic photo shooting method and device
US20160360082A1 (en) * 2014-02-25 2016-12-08 Sony Corporation Imaging apparatus and method, and program
CN104994282A (en) * 2015-06-30 2015-10-21 广东欧珀移动通信有限公司 Large view angle camera control method and user terminal
WO2020008973A1 (en) * 2018-07-03 2020-01-09 富士フイルム株式会社 Image-capture plan presentation device and method
CN112087580A (en) * 2019-06-14 2020-12-15 Oppo广东移动通信有限公司 Image acquisition method and device, electronic equipment and computer readable storage medium

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113688820A (en) * 2021-08-25 2021-11-23 Vivo Mobile Communication Co Ltd Stroboscopic stripe information identification method and device and electronic equipment

Similar Documents

Publication Publication Date Title
JP6263623B2 (en) Image generation method and dual lens apparatus
CN105594191B (en) Imaging device, image processing device, and image processing method
US20090309990A1 (en) Method, Apparatus, and Computer Program Product for Presenting Burst Images
CN112822412B (en) Exposure method, exposure device, electronic equipment and storage medium
CN112261294B (en) Shooting method and device and electronic equipment
CN111163265A (en) Image processing method, image processing device, mobile terminal and computer storage medium
WO2022161340A1 (en) Image display method and apparatus, and electronic device
CN112637515B (en) Shooting method and device and electronic equipment
US20220343520A1 (en) Image Processing Method and Image Processing Apparatus, and Electronic Device Using Same
CN113329172B (en) Shooting method and device and electronic equipment
CN105635568A (en) Image processing method in mobile terminal and mobile terminal
CN112333386A (en) Shooting method and device and electronic equipment
CN112511737A (en) Image processing method and device, electronic equipment and readable storage medium
CN109089045A (en) Image capture method and device based on multiple cameras, and terminal thereof
CN112887619A (en) Shooting method and device and electronic equipment
KR20230012622A (en) Anti-Shake Methods, Anti-Shake Devices and Electronic Devices
CN111654623B (en) Photographing method and device and electronic equipment
CN112261262B (en) Image calibration method and device, electronic equipment and readable storage medium
CN114241127A (en) Panoramic image generation method and device, electronic equipment and medium
CN113989387A (en) Camera shooting parameter adjusting method and device and electronic equipment
CN112653841A (en) Shooting method and device and electronic equipment
CN114339029B (en) Shooting method and device and electronic equipment
CN110166768B (en) Shooting method and device
CN117241131B (en) Image processing method and device
CN112887621B (en) Control method and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210601)