CN111385475A - Image acquisition method, photographing device, electronic equipment and readable storage medium - Google Patents

Info

Publication number
CN111385475A
Authority
CN
China
Prior art keywords
storage area
raw
preset
time
image
Prior art date
Legal status
Granted
Application number
CN202010165161.1A
Other languages
Chinese (zh)
Other versions
CN111385475B (en)
Inventor
赖泽民
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010165161.1A
Publication of CN111385475A
Application granted
Publication of CN111385475B
Legal status: Active
Anticipated expiration

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 - Camera processing pipelines; Components thereof
    • H04N 23/60 - Control of cameras or camera modules
    • H04N 23/64 - Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • H04N 23/68 - Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N 23/681 - Motion detection
    • H04N 23/6811 - Motion detection based on the image signal

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses an image acquisition method, a photographing device, electronic equipment and a computer-readable storage medium. The image acquisition method comprises the following steps: continuously acquiring RAW images and storing the acquired multiple frames of RAW images into a storage area, wherein the storage capacity of the storage area is set according to the reaction delay time of a user; and at the moment a photographing request instruction is received, selecting at least one frame of RAW image from the storage area and outputting it. The image acquisition method, photographing device, electronic equipment and computer-readable storage medium of the embodiments of the application continuously acquire images and store them in a preset storage area, and select at least one frame of image from the storage area for output, so that the output image is one acquired within the reaction delay time of the user. This compensates for the delay introduced by human reaction time when photographing, achieving a "what you see is what you get" photographing experience, and also avoids the photographing blur caused by shaking of the photographing device when the user issues a photographing request, thereby improving image quality.

Description

Image acquisition method, photographing device, electronic equipment and readable storage medium
Technical Field
The present disclosure relates to the field of imaging technologies, and in particular, to an image acquisition method, a photographing apparatus, an electronic device, and a computer-readable storage medium.
Background
During actual shooting, human reaction time elapses between the moment a photographer sees a desired picture and the moment the photographer presses the shutter key, so the image actually captured is often not the picture the photographer intended to capture.
Disclosure of Invention
The embodiment of the application provides an image acquisition method, a photographing device, electronic equipment and a computer-readable storage medium.
The image acquisition method of the embodiment of the application comprises the following steps: continuously acquiring RAW images and storing the acquired multiple frames of RAW images into a storage area, wherein the storage capacity of the storage area is set according to the reaction delay time of a user; and at the moment a photographing request instruction is received, selecting and outputting at least one frame of RAW image from the storage area, wherein all the RAW images stored in the storage area were acquired within the reaction delay time.

The photographing device of the embodiment of the application comprises a processor. The processor is configured to: continuously acquire RAW images and store the acquired multiple frames of RAW images into a storage area, wherein the storage capacity of the storage area is set according to the reaction delay time of a user; and at the moment a photographing request instruction is received, select and output at least one frame of RAW image from the storage area, wherein all the RAW images stored in the storage area were acquired within the reaction delay time.

The electronic equipment of the embodiment of the application comprises a photographing device and a housing. The photographing device is combined with the housing. The photographing device includes a processor. The processor is configured to: continuously acquire RAW images and store the acquired multiple frames of RAW images into a storage area, wherein the storage capacity of the storage area is set according to the reaction delay time of a user; and at the moment a photographing request instruction is received, select and output at least one frame of RAW image from the storage area, wherein all the RAW images stored in the storage area were acquired within the reaction delay time.

The computer-readable storage medium of the embodiment of the application stores a computer program that, when executed by a processor, implements: continuously acquiring RAW images and storing the acquired multiple frames of RAW images into a storage area, wherein the storage capacity of the storage area is set according to the reaction delay time of a user; and at the moment a photographing request instruction is received, selecting and outputting at least one frame of RAW image from the storage area, wherein all the RAW images stored in the storage area were acquired within the reaction delay time.
The image acquisition method, photographing device, electronic device and computer-readable storage medium of the embodiments of the application continuously acquire images and store them in a preset storage area, and select at least one frame of image from the storage area for output at the moment a photographing request instruction is received, so that the output image is one acquired within the reaction delay time of the user. On one hand, the delay introduced by human reaction time when photographing is compensated, achieving a "what you see is what you get" photographing experience; on the other hand, photographing blur caused by shaking of the photographing device when the user issues a photographing request is avoided, improving image quality.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic flow chart of an image acquisition method according to some embodiments of the present application;
FIG. 2 is a schematic diagram of a photographing apparatus according to some embodiments of the present application;
FIG. 3 is a schematic flow chart diagram of an image acquisition method according to some embodiments of the present application;
FIG. 4 is a schematic flow chart diagram of an image acquisition method according to some embodiments of the present application;
FIG. 5 is a schematic flow chart diagram of an image acquisition method according to some embodiments of the present application;
FIG. 6 is a schematic flow chart diagram of an image acquisition method according to some embodiments of the present application;
FIG. 7 is a schematic diagram of an image acquisition method according to certain embodiments of the present application;
FIG. 8 is a schematic flow chart diagram of an image acquisition method according to some embodiments of the present application;
FIG. 9 is a schematic diagram of an image acquisition method according to certain embodiments of the present application;
FIG. 10 is a schematic flow chart diagram of an image acquisition method according to some embodiments of the present application;
FIG. 11 is a schematic flow chart diagram of an image acquisition method according to some embodiments of the present application;
FIG. 12 is a schematic illustration of an image acquisition method according to certain embodiments of the present application;
FIG. 13 is a schematic flow chart diagram of an image acquisition method according to some embodiments of the present application;
FIG. 14 is a schematic illustration of an image acquisition method according to certain embodiments of the present application;
FIG. 15 is a schematic flow chart diagram of an image acquisition method according to some embodiments of the present application;
FIG. 16 is a schematic structural diagram of an electronic device according to some embodiments of the present application;
FIG. 17 is a schematic diagram of the interaction of a computer readable storage medium and a processor of certain embodiments of the present application;
FIG. 18 is a flow chart of an image acquisition method of certain embodiments of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below by referring to the drawings are exemplary only for the purpose of explaining the embodiments of the present application, and are not to be construed as limiting the embodiments of the present application.
Referring to fig. 1, the present application provides an image capturing method. The image acquisition method comprises the following steps:
02: continuously acquiring RAW images and storing the acquired multiple frames of RAW images into a storage area, wherein the storage capacity of the storage area is set according to the reaction delay time of a user; and
03: at the moment a photographing request instruction is received, selecting and outputting at least one frame of RAW image from the storage area, wherein all the RAW images stored in the storage area were acquired within the reaction delay time.
Referring to fig. 2, the present application further provides a photographing apparatus 100. The image acquisition method of the present application can be implemented by the photographing apparatus 100. The photographing apparatus 100 includes a processor 110. Both step 02 and step 03 can be implemented by the processor 110. That is, the processor 110 may be configured to continuously acquire RAW images and store the acquired multiple frames of RAW images in a storage area, the storage capacity of the storage area being set according to the reaction delay time of the user. The processor 110 may be further configured to select and output at least one frame of RAW image from the storage area at the moment a photographing request instruction is received, where all the RAW images stored in the storage area were acquired within the reaction delay time.
The storage area may be a buffer area, in which a plurality of frames of RAW images acquired by the image sensor 120 in the photographing apparatus 100 are stored when the photographing apparatus 100 is turned on. At the moment when the processor 110 receives the photographing request instruction, the processor 110 selects and outputs at least one frame of RAW image from the buffer area. Of course, in other embodiments, the storage area may also be a memory area, which is not limited herein.
The storage capacity refers to the maximum number of frames of RAW images that the storage area can store. For example, if the storage area can store at most 10 frames of RAW images, the storage capacity is 10 frames. The storage capacity may be set according to the reaction delay time of the user. The reaction delay time of the user refers to the period from the moment the user sees a desired frame to the moment the user issues a photographing request instruction to the photographing apparatus 100. The longer the reaction delay time, the larger the storage capacity; the shorter the reaction delay time, the smaller the storage capacity.
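As a rough illustration (not part of the patent text), the storage capacity can be thought of as the number of preview frames the sensor delivers during one reaction delay window. The frame rate, rounding rule, and numeric values below are assumptions made only for this sketch.

```python
import math

def storage_capacity(reaction_delay_s: float, frame_rate_fps: float) -> int:
    """Frames the storage area must hold to cover one reaction delay window.

    reaction_delay_s: the user's reaction delay time in seconds.
    frame_rate_fps: rate at which the image sensor delivers preview RAW frames.
    """
    # Round up so every frame captured inside the delay window still fits.
    return max(1, math.ceil(reaction_delay_s * frame_rate_fps))

# Hypothetical numbers: a 0.27 s reaction delay at 30 fps needs a 9-frame buffer.
print(storage_capacity(0.27, 30.0))  # -> 9
```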
The processor 110 selecting and outputting at least one frame of RAW image from the storage area covers two cases: (1) the processor 110 selects one frame of RAW image from the storage area and outputs it; (2) the processor 110 selects multiple frames of RAW images from the storage area and outputs them. If the processor 110 selects and outputs only one frame of RAW image from the storage area, the processor 110 may apply post-processing algorithms such as beautification, rotation, watermarking and blurring to the selected frame and then output and display the image. If the processor 110 selects and outputs multiple frames of RAW images from the storage area, in some embodiments the processor 110 may first fuse the selected frames to obtain one intermediate image of higher definition, apply the post-processing algorithms such as beautification, rotation, watermarking and blurring to that intermediate image, and then output and display it; in other embodiments, the processor 110 applies the post-processing algorithms to each frame of RAW image separately and then fuses the processed frames into one image for display.
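The two output paths described above can be sketched as follows; `fuse` and `post_process` stand in for the fusion and beautification/rotation/watermark/blurring steps and are hypothetical callables, not APIs named by the patent.

```python
from typing import Any, Callable, List

def output_image(frames: List[Any],
                 fuse: Callable[[List[Any]], Any],
                 post_process: Callable[[Any], Any],
                 fuse_first: bool = True) -> Any:
    """Produce the displayed image from the selected RAW frame(s)."""
    if len(frames) == 1:
        # Single selected frame: post-process it directly.
        return post_process(frames[0])
    if fuse_first:
        # Fuse into one sharper intermediate image, then post-process.
        return post_process(fuse(frames))
    # Or post-process every frame first, then fuse the processed frames.
    return fuse([post_process(f) for f in frames])
```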
In the related art, the photographing apparatus generally processes and provides to the user a RAW image acquired after the moment the photographing request instruction is received; because of the reaction delay time, the image provided to the user is not the image the user intended.
The image acquisition method and the photographing device 100 according to the embodiment of the present application continuously acquire the RAW image after the photographing device 100 is turned on, and store the RAW image in a preset storage area. When receiving the photographing request instruction, at least one frame of RAW image can be selected from the storage area to be output, so that the output RAW image is the RAW image acquired within the reaction delay time of the user. On one hand, the time delay generated when the human body takes a picture can be compensated, and the 'what you see is what you get' picture taking experience is achieved; on the other hand, shooting blurring caused by shaking of the shooting device 100 when a user sends a shooting request can be avoided, and image quality is improved.
Referring to fig. 18, in some embodiments, the image capturing method further includes:
01: setting the storage capacity of the storage area according to the reaction delay time of the user.
Referring to fig. 2, in some embodiments, step 01 may be implemented by the processor 110. That is, the processor 110 may also be configured to set the storage capacity of the storage area according to the reaction delay time of the user.
In one example, the reaction delay time may be a default reaction delay time set by the manufacturer when the photographing apparatus 100 leaves the factory; in this case, the storage capacity of the storage area set from the default reaction delay time is a fixed amount. Specifically, the manufacturer may obtain the average reaction delay time of the human body from a large amount of experimental data and use this average as the default reaction delay time. A default reaction delay time obtained from a large amount of experimental data is representative to some extent, and allows the image provided by the photographing apparatus 100 to be closer to the image the user intended. Because the reaction delay time is set directly by the manufacturer, the user does not need to set it, which simplifies operation. It should be noted that, when the reaction delay time is the default reaction delay time set by the manufacturer, step 01 is executed only the first time the user uses the photographing apparatus 100 to acquire an image and is not executed on subsequent uses.
In another example, the reaction delay time may be set according to the age of the user; in this case, the storage capacity of the storage area set from the reaction delay time is variable, and when the age of the user differs, the storage capacity set from the reaction delay time corresponding to that age changes accordingly. Specifically, the manufacturer may obtain the correspondence between age and reaction delay time in advance from a large amount of experimental data and store this correspondence in the photographing apparatus 100. For example, the average reaction delay time of a number of persons of the same age may be taken as the reaction delay time corresponding to that age; one reaction delay time may correspond to a single age or to an age group, without limitation. Among the reaction delay times corresponding to different ages, the reaction delay time of teenagers is the shortest, that of middle-aged people is the next shortest, followed by that of the elderly and that of children. When the user uses the photographing apparatus 100 for the first time, the photographing apparatus 100 first obtains the age of the user, for example by prompting the user to input the age, or by capturing a face image of the user and determining the age from the face image. The photographing apparatus 100 can then determine the reaction delay time of the user from the correspondence between age and reaction delay time. Setting the reaction delay time according to the age of the user makes the set reaction delay time closer to the reaction delay time of the user currently using the photographing apparatus 100, so the obtained image better matches the user's expectation, and the user does not need to set the reaction delay time manually, which improves the intelligence of the photographing apparatus 100. It should be noted that, when the reaction delay time is set according to the age of the user, step 01 may be performed only the first time the user uses the photographing apparatus 100 to acquire an image and not on subsequent uses; alternatively, step 01 may be performed every time the user uses the photographing apparatus 100 to acquire an image. For the case where step 01 is executed every time, it can be understood that the person using the photographing apparatus 100 may be of a different age each time an image is acquired; setting the reaction delay time according to the age of the current user each time the photographing apparatus 100 acquires an image makes the obtained image better suited to that user. In addition, the storage capacity of the storage area can be adjusted flexibly, avoiding the waste of a redundant storage area that would occur if, for example, the user actually needed only a storage area with a capacity of 8 frames but the processor 110 allocated a storage area with a capacity of 20 frames.
In another example, the reaction delay time may be set by the user; in this case, the storage capacity of the storage area set from the reaction delay time is variable, and when the reaction delay time set by the user differs, the storage capacity set from it changes accordingly. Specifically, the user can set the reaction delay time according to his or her own actual reaction delay time. For example, if the actual reaction delay time of the user is 0.2 s, the user can directly set the reaction delay time to 0.2 s. The reaction delay times of different individuals often differ, and the reaction delay time of the same individual at different stages of life may also differ. When the user sets the reaction delay time according to his or her actual reaction delay time, the obtained image better matches the user's expectation. It should be noted that, when the reaction delay time is set by the user, step 01 may be performed only the first time the user uses the photographing apparatus 100 to acquire an image and not on subsequent uses; alternatively, step 01 may be performed every time the user uses the photographing apparatus 100 to acquire an image. For the case where step 01 is executed every time, it can be understood that a different user, with a different reaction delay time, may use the photographing apparatus 100 each time; moreover, even if the same user uses the photographing apparatus 100 every time, that user's reaction delay time differs at different stages. Letting the user set the reaction delay time according to his or her current reaction delay time each time the photographing apparatus 100 acquires an image therefore makes the obtained image better match the user's expectation and improves the user experience. In addition, the storage capacity of the storage area can be adjusted flexibly, avoiding the waste of a redundant storage area that would occur if, for example, the user actually needed only a storage area with a capacity of 8 frames but the processor 110 allocated a storage area with a capacity of 20 frames.
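A minimal sketch of how these three sources of the reaction delay time might be combined. The priority order (explicit user setting, then age lookup, then factory default) and all numeric values are assumptions for illustration; the patent only states that the age-to-delay mapping comes from experimental data.

```python
from typing import Optional

# Hypothetical age-group table (seconds); illustrative values only.
AGE_TO_DELAY_S = [
    (0, 12, 0.35),    # children
    (13, 25, 0.20),   # teenagers: shortest
    (26, 55, 0.25),   # middle-aged
    (56, 120, 0.32),  # elderly
]

DEFAULT_DELAY_S = 0.25  # hypothetical factory default

def resolve_reaction_delay(user_delay_s: Optional[float] = None,
                           user_age: Optional[int] = None) -> float:
    """Pick the reaction delay time used to size the storage area."""
    if user_delay_s is not None:        # the user set it explicitly
        return user_delay_s
    if user_age is not None:            # look it up by age group
        for low, high, delay in AGE_TO_DELAY_S:
            if low <= user_age <= high:
                return delay
    return DEFAULT_DELAY_S              # fall back to the factory default
```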
Referring to fig. 1, 2 and 3 together, in some embodiments, step 02 includes:
021: when the photographing device 100 is turned on, continuously acquiring the RAW image, and judging whether the storage area has spare storage space each time a new frame of RAW image is acquired;
022: if the storage area has spare storage space, storing the new frame of RAW image into the storage area; and
023: if the storage area has no spare storage space, removing the RAW image stored in the storage area first from the storage area, and then storing the new frame of RAW image into the storage area.
Referring to fig. 2, in some embodiments, step 021, step 022 and step 023 can all be implemented by the processor 110. That is, the processor 110 may be further configured to continuously acquire the RAW image when the photographing apparatus 100 is turned on, and to determine whether the storage area has spare storage space each time a new frame of RAW image is acquired. The processor 110 may be further configured to store the new frame of RAW image in the storage area when the storage area has spare storage space. The processor 110 may be further configured to remove the RAW image stored in the storage area first from the storage area and then store the new frame of RAW image in the storage area when the storage area has no spare storage space.
Specifically, when the photographing apparatus 100 is turned on (for example, when the user opens an application program such as "camera"), the image sensor 120 in the photographing apparatus 100 continuously acquires RAW images and records the shooting time, i.e., the time stamp, of each frame of RAW image. Each time the image sensor 120 acquires a new frame of RAW image, it transmits that frame to the processor 110. The processor 110 determines whether the storage area has spare storage space, that is, whether the number of RAW images stored in the storage area at that moment is smaller than the storage capacity. If the storage area has spare storage space, that is, the number of RAW images stored in the storage area is smaller than the storage capacity, the newly acquired frame of RAW image is stored into the storage area. For example, if the storage capacity of the storage area is 8 frames, that is, the storage area can store at most 8 frames of RAW images, and only 7 frames of RAW images are stored in the storage area when the image sensor 120 acquires a new frame of RAW image, then the number of stored RAW images is smaller than the storage capacity, the storage area has spare storage space, and the processor 110 can store the newly acquired frame into the storage area. If the storage area has no spare storage space, that is, the number of RAW images stored in the storage area equals the storage capacity, the RAW image stored in the storage area first is removed from the storage area, and the newly acquired frame of RAW image is then stored into the storage area. For example, if the storage capacity of the storage area is 8 frames and 8 frames of RAW images are already stored in the storage area when the image sensor 120 acquires a new frame of RAW image, then the number of stored RAW images equals the storage capacity and there is no spare storage space; the processor 110 therefore removes the frame of RAW image that was stored in the storage area first and then stores the newly acquired frame into the storage area.
There are several ways for the processor 110 to identify the frame of RAW image that was stored in the storage area first.
In one example, the storage area is a storage structure of a queue, and multiple frames of RAW images are stored in the storage area in the form of a queue. Specifically, when the photographing apparatus 100 is turned on, the image sensor 120 continuously acquires a RAW image and transmits the frame RAW image to the processor 110 after each acquisition of the RAW image. The processor 110 sequentially receives RAW images continuously acquired by the image sensor 120 and sequentially stores the acquired multi-frame RAW images in a storage area in the order of reception. According to the characteristic of queue first-in first-out, the earlier received RAW image is closer to the head of the queue, and the later received RAW image is closer to the tail of the queue. When the processor 110 selects a RAW image of a frame stored in the storage area first from the storage area, it can directly determine the RAW image at the head of the queue as the RAW image of the frame stored in the storage area first.
In another example, in addition to the RAW image itself, the image sensor 120 may transmit parameters related to the RAW image, such as sensitivity and time stamp, to the processor 110. When the processor 110 identifies the frame of RAW image that was stored in the storage area first, it may determine, from the time stamps of the multiple frames of RAW images in the storage area, the RAW image with the earliest shooting time; that frame is the RAW image that needs to be removed.
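Steps 021-023 amount to a fixed-capacity first-in-first-out buffer. The sketch below assumes a simple `RawFrame` record carrying the time stamp (and, for the selection steps described further below, a jitter parameter); the field names and the use of a Python `deque` are illustrative choices, not taken from the patent.

```python
from collections import deque
from typing import Deque, NamedTuple

class RawFrame(NamedTuple):
    timestamp: float   # shooting time reported with the frame
    jitter: float      # jitter parameter (used by the later selection steps)
    data: bytes        # RAW pixel data

class FrameBuffer:
    """Fixed-capacity FIFO storage area for the most recent RAW frames."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.frames: Deque[RawFrame] = deque()

    def store(self, frame: RawFrame) -> None:
        if len(self.frames) >= self.capacity:   # no spare storage space:
            self.frames.popleft()               # remove the oldest frame first,
        self.frames.append(frame)               # then store the new frame
```

A `deque(maxlen=capacity)` would perform the same eviction implicitly; the explicit branch is kept only to mirror steps 022 and 023.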
Referring to fig. 1, fig. 2 and fig. 4 together, in some embodiments, step 03 includes:
031: acquiring the working mode of the photographing device 100;
032: when the photographing device 100 works in a first photographing mode, acquiring jitter parameters of a plurality of frames of RAW images in a storage area;
033: comparing the jitter parameter of the multi-frame RAW image in the storage area with a first preset jitter parameter;
034: if the quantity of the RAW images with the jitter parameters smaller than the first preset jitter parameters in the storage area is larger than the preset quantity, selecting and outputting at least one frame of RAW images from the RAW images with the jitter parameters smaller than the first preset jitter parameters; and
035: and if the quantity of the RAW images with the jitter parameters smaller than the first preset jitter parameters in the storage area is smaller than the preset quantity, selecting at least one frame of RAW image from all the RAW images in the storage area and outputting the RAW image.
Referring to fig. 2, in some embodiments, step 031, step 032, step 033, step 034 and step 035 can be implemented by the processor 110. That is, the processor 110 may be configured to obtain the operating mode of the photographing apparatus and, when the photographing apparatus operates in the first photographing mode, obtain the jitter parameters of the multiple frames of RAW images in the storage area. The processor 110 may be further configured to compare the jitter parameters of the multiple frames of RAW images in the storage area with a first preset jitter parameter. If the number of RAW images in the storage area whose jitter parameter is smaller than the first preset jitter parameter is greater than the preset number, the processor 110 selects and outputs at least one frame of RAW image from the RAW images whose jitter parameter is smaller than the first preset jitter parameter. If that number is smaller than the preset number, the processor 110 selects and outputs at least one frame of RAW image from all the RAW images in the storage area.
The photographing request instruction of the user may be generated by the user tapping a photographing key, or by the user pressing a photographing shortcut key (e.g., a volume key). Specifically, when the processor 110 receives the photographing request instruction, the processor 110 obtains the operating mode of the photographing apparatus 100 at that moment. If the photographing apparatus 100 operates in the first photographing mode, the processor 110 obtains the jitter parameters of the multiple frames of RAW images in the storage area and compares the jitter parameter of each frame with the first preset jitter parameter. If the jitter parameter of a RAW image is smaller than the first preset jitter parameter, it is determined that the photographing device 100 did not shake when that RAW image was captured, and the RAW image is a clear image without shake blur; if the jitter parameter of a RAW image is greater than the first preset jitter parameter, it is determined that the photographing device 100 shook when that RAW image was captured, and the RAW image is an unclear image with shake blur; if the jitter parameter of a RAW image equals the first preset jitter parameter, the frame may be treated either as a clear image captured without shake or as an unclear image captured with shake, which is not limited herein.
The processor 110 acquires the number of RAW images in the storage area whose shaking parameter is smaller than the first preset shaking parameter, and compares the number with a preset number. If the number of RAW images with jitter parameters smaller than the first preset jitter parameter in the storage area is greater than the preset number, the photographing device 100 may be considered to be stable before receiving a photographing request instruction from a user, and the processor 110 may select and output at least one frame of RAW image from the RAW images with jitter parameters smaller than the first preset jitter parameter; if the number of RAW images with jitter parameters smaller than the first preset jitter parameter in the storage area is smaller than the preset number, the photographing device 100 may be considered to be jittered before the photographing device 100 receives a photographing request instruction from a user, and the processor 110 may select and output at least one frame of RAW image from all RAW images in the storage area; if the number of RAW images with jitter parameters smaller than the first preset jitter parameter in the storage area is equal to the preset number, the photographing apparatus 100 may be considered to be stable before the photographing apparatus 100 receives a photographing request instruction from a user, and the processor 110 may select and output at least one frame of RAW image from the RAW images with jitter parameters smaller than the first preset jitter parameter, or, the photographing apparatus 100 may also be considered to be jittered before the photographing apparatus 100 receives the photographing request instruction from the user, and the processor 110 selects and outputs at least one frame of RAW image from all the RAW images in the storage area.
Note that the jitter parameter of a RAW image may be acquired by a sensor such as a gyroscope, and the processor 110 may read the jitter parameter from that sensor. When the jitter parameter acquired by the gyroscope (not shown) is larger, the photographing apparatus 100 shook more and the obtained RAW image is more blurred; when the jitter parameter is smaller, the photographing apparatus 100 was more stable and the obtained RAW image is sharper.
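Continuing the `FrameBuffer` sketch above, the branch of steps 032-035 can be expressed as choosing the candidate pool from which the later sub-steps pick frames. How the case "count equals the preset number" is handled is left open by the text, so the sketch arbitrarily folds it into the all-frames branch.

```python
from typing import List

def candidate_frames(buffer: "FrameBuffer",
                     first_threshold: float,
                     preset_number: int) -> List["RawFrame"]:
    """First photographing mode: choose the pool of frames to select from."""
    stable = [f for f in buffer.frames if f.jitter < first_threshold]
    if len(stable) > preset_number:
        # The device was steady before the request: pick among the stable frames.
        return stable
    # Too few stable frames (the device was shaking): pick among all frames.
    return list(buffer.frames)
```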
Referring to fig. 4 and 5, in some embodiments, step 034 includes:
0341: and selecting and outputting RAW images with jitter parameters smaller than second preset jitter parameters from RAW images with jitter parameters smaller than the first preset jitter parameters, wherein the second preset jitter parameters are smaller than or equal to the first preset jitter parameters.
Referring to fig. 2, in some embodiments, step 0341 may be implemented by processor 110. That is, the processor 110 may be configured to select and output a RAW image with a jitter parameter smaller than a second preset jitter parameter from the RAW images with jitter parameters smaller than a first preset jitter parameter, where the second preset jitter parameter is smaller than or equal to the first preset jitter parameter.
Specifically, when the photographing device 100 operates in the first photographing mode and the number of RAW images with jitter parameters smaller than the first preset jitter parameter in the storage area is greater than the preset number, the processor 110 compares the jitter parameters of the RAW images with jitter parameters smaller than the first preset jitter parameter in the storage area with the second preset jitter parameter, and selects the RAW images with jitter parameters smaller than or equal to the second preset jitter parameter for output. At this time, the selected RAW image may be one frame or a plurality of frames. When the selected RAW image is a frame, the frame RAW image is the image with the minimum jitter parameter in the storage area. And when the selected RAW image is a plurality of frames, the plurality of frames of RAW images are the images with smaller jitter parameters in the storage area. Therefore, the output RAW image is not only acquired before the photographing apparatus 100 receives the photographing request instruction, but also better meets the user's expectations, and the output RAW image also has higher definition and better imaging quality.
Referring to fig. 4 and 6, in some embodiments, step 034 further includes:
0342: and selecting and outputting one frame of RAW image with the farthest shooting time and the time of receiving the shooting request instruction from the RAW images with the shaking parameters smaller than the first preset shaking parameters.
Referring to fig. 2, in some embodiments, step 0342 may be implemented by processor 110. That is, the processor 110 may also be configured to: and selecting and outputting one frame of RAW image with the farthest shooting time and the time of receiving the shooting request instruction from the RAW images with the shaking parameters smaller than the first preset shaking parameters.
Specifically, when the photographing device 100 operates in the first photographing mode and the number of RAW images in the storage area whose jitter parameter is smaller than the first preset jitter parameter is greater than the preset number, the processor 110 obtains the shooting time of each such frame from its time stamp, and selects and outputs the frame of RAW image whose shooting time is farthest from the time the photographing request instruction was received, that is, the earliest-captured RAW image among those in the storage area whose jitter parameter is smaller than the first preset jitter parameter.
For example, as shown in fig. 2 and 7 (the solid arrow in fig. 7 indicates time), assume that when the photographing apparatus 100 receives a photographing request instruction, 10 frames of RAW images are stored in the storage area and are arranged in order of shooting time as RAW image 1, RAW image 2, RAW image 3, RAW image 4, RAW image 5, RAW image 6, RAW image 7, RAW image 8, RAW image 9 and RAW image 10. That is, the shooting time of RAW image 1 is the earliest and the shooting time of RAW image 10 is the latest. Assume that, among the 10 frames, the jitter parameters of 4 frames, namely RAW image 1, RAW image 2, RAW image 4 and RAW image 5, are all greater than the first preset jitter parameter (a jittered RAW image is indicated by a dashed-line frame), and the jitter parameters of the remaining 6 frames are all smaller than the first preset jitter parameter (a stable RAW image is indicated by a solid-line frame). The number of RAW images whose jitter parameter is smaller than the first preset jitter parameter, 6, is greater than the preset number, 5. The processor 110 therefore selects and outputs RAW image 3, the frame with the earliest shooting time among RAW images 3, 6, 7, 8, 9 and 10.
Because the selected RAW image is the one with the earliest shooting time among the RAW images in the storage area whose jitter parameter is smaller than the first preset jitter parameter, this approach compensates for the human reaction delay time as much as possible while ensuring image clarity, so the image provided to the user is closer to the image the user intended, achieving a "what you see is what you get" effect.
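Step 0342 reduces to taking the earliest time stamp among the stable frames; a one-line sketch using the `RawFrame` fields assumed earlier:

```python
def pick_earliest_stable(stable_frames):
    """Step 0342: the stable frame captured farthest before the request."""
    return min(stable_frames, key=lambda f: f.timestamp)
```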
Referring to fig. 4 and 8 together, in some embodiments, step 034 includes:
0343: and selecting and outputting multi-frame RAW images with the difference value between the shooting time and the time of receiving the shooting request instruction larger than the preset time difference from the RAW images with the jitter parameters smaller than the first preset jitter parameters.
Referring to fig. 2, in some embodiments, step 0343 may be implemented by processor 110. That is, the processor 110 may be further configured to select and output a multi-frame RAW image, in which a difference between the shooting time and the time when the shooting request instruction is received is greater than a preset time difference, from the RAW images with the jitter parameter less than the first preset jitter parameter.
Specifically, when the photographing device 100 operates in the first photographing mode and the number of RAW images in the storage area whose jitter parameter is smaller than the first preset jitter parameter is greater than the preset number, the processor 110 obtains the shooting time of each such frame from its time stamp, and selects and outputs the multiple frames of RAW images whose shooting time differs from the time the photographing request instruction was received by more than the preset time difference, that is, the earlier-captured frames among the RAW images in the storage area whose jitter parameter is smaller than the first preset jitter parameter.
For example, as shown in fig. 2 and 9 (the solid arrows in fig. 9 indicate time), assume that when the photographing apparatus 100 receives a photographing request instruction, 10 frames of images, RAW image 1 to RAW image 10, are stored in the storage area. The shooting time of RAW image 1 is the earliest and the shooting time of RAW image 10 is the latest. Assume that, among the 10 frames, the shooting times of RAW image 1 through RAW image 8 differ from the time the photographing request instruction was received by more than the preset time difference, while the shooting times of RAW image 9 and RAW image 10 differ from that time by less than the preset time difference. Among the 10 frames, the jitter parameters of 4 frames, namely RAW image 1, RAW image 2, RAW image 4 and RAW image 5, are all greater than the first preset jitter parameter (a jittered RAW image is indicated by a dashed-line frame), and the jitter parameters of the remaining 6 frames are all smaller than the first preset jitter parameter (a stable RAW image is indicated by a solid-line frame). Since the number of RAW images whose jitter parameter is smaller than the first preset jitter parameter, 6, is greater than the preset number, 5, the processor 110 selects, from RAW images 3, 6, 7, 8, 9 and 10, the frames whose shooting time differs from the time the photographing request instruction was received by more than the preset time difference, that is, it selects and outputs RAW images 3, 6, 7 and 8.
It should be noted that the preset time difference can be set by the user as required. Specifically, in some embodiments, the preset time difference may be set by the user according to the required definition: the higher the definition the user requires, the smaller the preset time difference. It can be understood that the smaller the preset time difference, the more RAW images satisfy the condition that their shooting time differs from the time the photographing request instruction was received by more than the preset time difference, and the clearer the target image obtained by fusing the selected multiple frames of RAW images.
The selected RAW images are the earlier-captured frames among the RAW images in the storage area whose jitter parameter is smaller than the first preset jitter parameter, and the multiple frames can be fused before output, so the human reaction delay time is compensated while the definition of the target image is improved.
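Step 0343, expressed with the same assumed fields: keep the stable frames whose time stamps precede the request by more than the preset time difference, so they can then be fused into the target image.

```python
def pick_early_stable_frames(stable_frames, request_time, preset_diff_s):
    """Step 0343: stable frames captured more than preset_diff_s before the request."""
    return [f for f in stable_frames
            if (request_time - f.timestamp) > preset_diff_s]
```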
Referring to fig. 4 and 10, in some embodiments, step 034 further includes:
0344: and selecting and outputting the RAW image with the jitter parameter smaller than a second preset jitter parameter which is smaller than or equal to the first preset jitter parameter from the RAW image with the jitter parameter smaller than the first preset jitter parameter, wherein the difference value between the shooting time and the time of receiving the shooting request instruction is larger than a preset time difference.
Referring to fig. 2, in some embodiments, step 0344 may be implemented by processor 110. That is, the processor 110 may be further configured to select and output a RAW image, which has a difference between the shooting time and the time when the shooting request instruction is received, which is greater than the preset time difference and has a shaking parameter smaller than a second preset shaking parameter, from the RAW image having the shaking parameter smaller than the first preset shaking parameter, where the second preset shaking parameter is smaller than or equal to the first preset shaking parameter.
Specifically, when the photographing device 100 operates in the first photographing mode and the number of RAW images with jitter parameters smaller than the first preset jitter parameter in the storage area is greater than the preset number, the processor 110 obtains the photographing time of each frame of RAW image according to the time stamp of each frame of RAW image with jitter parameters smaller than the first preset jitter parameter in the storage area, and selects a multi-frame RAW image with a difference value between the photographing time and the time when the photographing request instruction is received, which is greater than the preset time difference. The processor 110 obtains the jitter parameters of the multi-frame RAW images of which the difference between the shooting time and the time when the shooting request instruction is received is greater than the preset time difference, compares the multiple jitter parameters with a second preset jitter parameter, selects the RAW image of which the jitter parameter is less than or equal to the second preset jitter parameter, and outputs the RAW image. At this time, the selected RAW image may be one frame or a plurality of frames.
It should be noted that, the specific process of selecting the multiple frames of RAW images whose difference between the shooting time and the time when the shooting request instruction is received is greater than the preset time difference in step 0344 is the same as the specific process of selecting the multiple frames of RAW images whose difference between the shooting time and the time when the shooting request instruction is received is greater than the preset time difference in step 0343, and details thereof are not repeated herein. The specific process of selecting the RAW image with the jitter parameter less than or equal to the second preset jitter parameter in step 0344 is the same as the specific process of selecting the RAW image with the jitter parameter less than or equal to the second preset jitter parameter in step 0341, and is not described herein again. The processor 110 may select a RAW image with a jitter parameter smaller than a second preset jitter parameter from RAW images with a jitter parameter smaller than a first preset jitter parameter, and then select and output a multi-frame RAW image with a difference between a shooting time and a time when the shooting request instruction is received larger than a preset time difference; the processor 110 may also select a RAW image with a jitter parameter smaller than a second preset jitter parameter from RAW images with a jitter parameter smaller than a first preset jitter parameter, and then select and output a frame of RAW image with a shooting time farthest from the time of receiving the shooting request instruction from the RAW image with a jitter parameter smaller than the second preset jitter parameter, which is not limited herein.
Because the shooting time of the selected RAW image is far away from the time of receiving the shooting request instruction, and the definition of the RAW image is high, the target image provided for the user is closer to the image expected by the user, and the target image has high imaging quality.
Referring to fig. 4 and 11 together, in some embodiments, step 035 includes:
0351: and selecting and outputting one frame of RAW image with the shooting time closest to the time of receiving the shooting request instruction from all the RAW images in the storage area.
Referring to fig. 2, in some embodiments, step 0351 may be performed by processor 110. That is, the processor 110 may be further configured to select and output, from all the RAW images in the storage area, one frame of RAW image whose shooting time is closest to the time of receiving the shooting request instruction.
Specifically, when the photographing device 100 operates in the first photographing mode and the number of RAW images in the storage area whose jitter parameter is smaller than the first preset jitter parameter is smaller than the preset number, the processor 110 obtains the shooting time of each frame from the time stamps of all the RAW images in the storage area, and selects and outputs the frame of RAW image whose shooting time is closest to the time the photographing request instruction was received, that is, the most recently captured RAW image in the storage area.
Illustratively, as shown in fig. 2 and 12 (solid arrows in fig. 12 indicate time), assuming that the photographing apparatus 100 receives a photographing request instruction, the storage area stores therein 10 frames of RAW images, which are RAW image 1 to RAW image 10, respectively. The capturing time of the RAW image 1 is the earliest and the capturing time of the RAW image 10 is the latest. It is assumed that, in the 10 frames of RAW image, the jitter parameters of the 6 frames of RAW image, i.e., RAW image 1, RAW image 2, RAW image 4, RAW image 5, RAW image 7 and RAW image 9, are all greater than the first preset jitter parameter (the jittered RAW image is indicated by the dashed-line frame), and the jitter parameters of the remaining 4 frames of RAW image are all less than the first preset jitter parameter (the stable RAW image is indicated by the solid-line frame). Wherein the number 4 of RAW images with a dithering parameter smaller than the first preset dithering parameter is smaller than the preset number 5. Then, the processor 110 selects one RAW image with the latest shooting time from the 10 RAW images in the storage area, that is, selects and outputs the RAW image 10.
That the number of RAW images in the storage area whose jitter parameter is smaller than the first preset jitter parameter is smaller than the preset number means the photographing apparatus 100 was shaking before it received the photographing request instruction; it can be assumed that the user had not finished preparing to shoot before issuing the request, which is why the photographing apparatus 100 was shaking. In the short interval around the moment the user issues the photographing request instruction (for example, from the moment the frame of RAW image immediately before the request is acquired to the moment the frame immediately after the request is acquired), the user is ready to shoot. The RAW image whose shooting time is closest to the time the photographing request was received can therefore be selected from the storage area and output: on one hand, this guarantees the RAW image a certain degree of definition; on the other hand, it increases the photographing speed and improves the user experience.
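Step 0351 is the mirror image of step 0342: with too few stable frames, the latest frame in the storage area is output. A sketch using the same assumed fields:

```python
def pick_latest_frame(all_frames):
    """Step 0351: the buffered frame captured closest to the request time."""
    return max(all_frames, key=lambda f: f.timestamp)
```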
Referring to fig. 4 and 13 together, in some embodiments, step 035 includes:
0352: and selecting and outputting a plurality of frames of RAW images of which the difference value between the shooting time and the time of receiving the shooting request instruction is smaller than the preset time difference from all the RAW images in the storage area.
Referring to fig. 2, in some embodiments, step 0352 may be performed by processor 110. That is, the processor 110 may be further configured to select and output, from all RAW images in the storage area, a plurality of frames of RAW images whose difference between the shooting time and the time when the shooting request instruction is received is smaller than a preset time difference.
Specifically, when the photographing device 100 operates in the first photographing mode and the number of RAW images in the storage area whose jitter parameter is smaller than the first preset jitter parameter is smaller than the preset number, the processor 110 obtains the shooting time of each frame from the time stamps of all the RAW images in the storage area, and selects and outputs the multiple frames of RAW images whose shooting time differs from the time the photographing request instruction was received by less than the preset time difference. Because multiple frames of RAW images are selected and fused for output, the resulting target image can have higher definition than a target image obtained from a single frame of RAW image.
For example, as shown in fig. 2 and 14 (the solid arrows in fig. 14 indicate time), assume that when the photographing apparatus 100 receives a photographing request instruction, 10 frames of RAW images, RAW image 1 to RAW image 10, are stored in the storage area. The shooting time of RAW image 1 is the earliest and the shooting time of RAW image 10 is the latest. Assume that, among the 10 frames, the shooting times of RAW image 1 through RAW image 6 differ from the time the photographing request instruction was received by more than the preset time difference, while the shooting times of RAW image 7 through RAW image 10 differ from that time by less than the preset time difference. Among the 10 frames, the jitter parameters of 6 frames, namely RAW image 1, RAW image 2, RAW image 4, RAW image 5, RAW image 7 and RAW image 9, are all greater than the first preset jitter parameter (a jittered RAW image is indicated by a dashed-line frame), and the jitter parameters of the remaining 4 frames are all smaller than the first preset jitter parameter (a stable RAW image is indicated by a solid-line frame). The processor 110 then selects and outputs, from the 10 frames of RAW images in the storage area, the frames whose shooting time differs from the time the photographing request instruction was received by less than the preset time difference, namely RAW image 7, RAW image 8, RAW image 9 and RAW image 10.
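Step 0352 again filters by time stamp, this time keeping the frames captured within the preset time difference of the request so they can be fused; a sketch under the same assumptions:

```python
def pick_recent_frames(all_frames, request_time, preset_diff_s):
    """Step 0352: frames captured less than preset_diff_s before the request."""
    return [f for f in all_frames
            if (request_time - f.timestamp) < preset_diff_s]
```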
Referring to fig. 4 and fig. 15, in some embodiments, step 03 further includes:
step 036: when the photographing apparatus 100 operates in the second photographing mode, at least one frame of RAW image is selected from all RAW images in the storage area and output.
Referring to fig. 2, in some embodiments, step 036 may be performed by processor 110. That is, the processor 110 is further configured to select and output at least one frame of RAW image from all RAW images in the storage area when the photographing apparatus 100 operates in the second photographing mode.
Specifically, the implementation of step 036 is the same as that of step 035 and is not described in detail here. When the photographing device 100 operates in the second photographing mode, the processor 110 neither needs to acquire the jitter parameters of the multiple frames of RAW images in the storage area nor needs to compare the jitter parameters with the first preset jitter parameter, and can directly select the RAW image to be output, which simplifies the processing performed by the processor 110 and increases the photographing speed.
It should be noted that the first photographing mode is a snapshot mode. In the snapshot mode, when the user sees a scene in the preview, the user can use the photographing device 100 to snap a picture of the target scene at the preview moment, achieving a 'what you see is what you get' photographing effect. The second photographing mode may be a snapshot mode or a normal photographing mode, and generally means that the user can quickly obtain a photographed picture after sending the photographing request instruction: a picture obtained by snapshot generally corresponds to the scene at the moment the photographing request is sent, while a picture obtained by normal photographing generally corresponds to the scene at a moment after the photographing request is sent.
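The following is a hedged sketch of how the two modes could branch, condensing the behavior described above; the threshold names, the (timestamp, jitter) tuple layout, and the time-based fallback parameters are assumptions for illustration, and the patent leaves the exact selection criteria within each branch to the embodiments described earlier.

```python
# Hedged sketch of the first/second photographing mode branching described above.
# All parameter names and the tuple layout are illustrative assumptions.
def pick_frames(frames, request_time, mode,
                first_threshold, second_threshold,
                preset_count, max_time_diff):
    """frames: list of (timestamp, jitter) tuples held in the storage area."""
    if mode == "first":                      # snapshot mode: check jitter first
        stable = [f for f in frames if f[1] < first_threshold]
        if len(stable) > preset_count:
            # Enough stable frames: keep the steadiest ones (one possible
            # criterion, mirroring the second preset jitter parameter).
            return [f for f in stable if f[1] < second_threshold]
        # Not enough stable frames: fall back to frames captured near the request.
        return [f for f in frames if abs(f[0] - request_time) < max_time_diff]
    # Second mode: skip the jitter comparison and select directly by time.
    return [f for f in frames if abs(f[0] - request_time) < max_time_diff]
```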
Referring to fig. 2 and fig. 16, an electronic device 300 is also provided. The electronic device 300 may be a mobile phone, a tablet computer, a notebook computer, an intelligent wearable device (an intelligent watch, an intelligent bracelet, an intelligent helmet, intelligent glasses, etc.), a virtual reality device, or the like. In the present application, the electronic device 300 is described as a mobile phone, but the form of the electronic device 300 is not limited to a mobile phone. The electronic device 300 includes the photographing apparatus 100 of any of the above embodiments and a housing 200, and the photographing apparatus 100 is combined with the housing 200. The processor 110 may be installed in the photographing apparatus 100, or may be installed outside the photographing apparatus 100 but inside the electronic device 300. The gyroscope for detecting the jitter parameter may be installed in the photographing apparatus 100 or in the electronic device 300, which is not limited herein.
Referring to fig. 1 and 17, a computer-readable storage medium 400 is also provided. The computer-readable storage medium 400 has stored thereon a computer program. The steps of the image acquisition method of any of the above embodiments are implemented when the program is executed by the processor 110.
For example, when the program is executed by the processor 110, the steps of the following image acquisition method are implemented (a minimal code sketch of these two steps is given below):
02: continuously acquiring RAW images, and storing the acquired multi-frame RAW images into a storage area, wherein the storage capacity of the storage area is set according to the response delay time of a user; and
03: at the moment of receiving the photographing request instruction, at least one frame of RAW image is selected from the storage area and output, wherein all the RAW images stored in the storage area are obtained within the reaction delay time.
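The sketch below illustrates one way steps 02 and 03 could be wired together: a fixed-capacity buffer whose length covers the user's reaction delay. The 0.4 s delay, the 30 fps frame rate, and the function names are assumptions made only for illustration; the patent does not fix these values.

```python
# Minimal sketch of steps 02 and 03, assuming a 0.4 s reaction delay and a
# 30 fps RAW stream; both numbers are illustrative, not from the patent.
from collections import deque
import time

REACTION_DELAY_S = 0.4
FRAME_INTERVAL_S = 1.0 / 30.0
CAPACITY = int(REACTION_DELAY_S / FRAME_INTERVAL_S) + 1

storage_area = deque(maxlen=CAPACITY)    # oldest frames are discarded automatically

def on_new_raw_frame(raw_data):
    """Step 02: continuously store timestamped RAW frames in the storage area."""
    storage_area.append((time.monotonic(), raw_data))

def on_photo_request():
    """Step 03: at the request instant every buffered frame lies within the
    reaction delay window; output the one captured closest to the request."""
    request_time = time.monotonic()
    return min(storage_area, key=lambda frame: abs(frame[0] - request_time))
```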
The computer-readable storage medium 400 may be disposed in the photographing apparatus 100 or the electronic device 300, or may be disposed in a cloud server, and at this time, the photographing apparatus 100 or the electronic device 300 may communicate with the cloud server to obtain a corresponding computer program.
It will be appreciated that the computer program comprises computer program code. The computer program code may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable storage medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), a software distribution medium, and the like.
The processor 110 may be referred to as a driver board. The driver board may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or the like.
In the description herein, reference to the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example," or "some examples" means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, the various embodiments or examples described in this specification, as well as features of different embodiments or examples, can be combined by one skilled in the art without contradiction.
Any process or method description in a flow chart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present application also includes implementations in which functions are executed out of the order shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and not to be construed as limiting the present application, and that changes, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (14)

1. An image acquisition method, characterized in that the image acquisition method comprises:
continuously acquiring RAW images, and storing the acquired multi-frame RAW images into a storage area, wherein the storage capacity of the storage area is set according to the response delay time of a user; and
at the moment of receiving a photographing request instruction, selecting at least one frame of the RAW image from the storage area and outputting the RAW image, wherein all the RAW images stored in the storage area are obtained within the response delay time.
2. The image acquisition method according to claim 1, wherein the image acquisition method is used for a photographing apparatus, and the selecting and outputting at least one frame of the RAW image from the storage area at the time of receiving a photographing request instruction includes:
when the photographing device works in a first photographing mode, acquiring the shaking parameters of a plurality of frames of the RAW images in the storage area;
comparing the jitter parameter of the frames of the RAW images in the storage area with a first preset jitter parameter;
if the number of the RAW images with the jitter parameters smaller than the first preset jitter parameters in the storage area is larger than the preset number, selecting and outputting at least one frame of the RAW images from the RAW images with the jitter parameters smaller than the first preset jitter parameters; and
if the number of the RAW images of which the jitter parameters are smaller than the first preset jitter parameters in the storage area is smaller than the preset number, selecting at least one frame of the RAW images from all the RAW images in the storage area and outputting the RAW images.
3. The method according to claim 2, wherein the selecting and outputting at least one frame of the RAW image from the RAW images with the jitter parameter smaller than the first preset jitter parameter comprises:
selecting and outputting the RAW image with the jitter parameter smaller than a second preset jitter parameter from the RAW images with the jitter parameter smaller than the first preset jitter parameter, wherein the second preset jitter parameter is smaller than the first preset jitter parameter.
4. The image acquisition method according to claim 2, wherein each frame of the RAW image has a time stamp identifying a shooting time of the corresponding RAW image; selecting and outputting at least one frame of the RAW image from the RAW images with the jitter parameters smaller than the first preset jitter parameters, wherein the method comprises the following steps:
selecting and outputting a frame of RAW image with the shooting time furthest from the time of receiving the shooting request instruction from the RAW images with the jitter parameters smaller than the first preset jitter parameters; or
selecting and outputting a plurality of frames of the RAW images of which the difference value between the shooting time and the time of receiving the shooting request instruction is larger than a preset time difference from the RAW images of which the jitter parameters are smaller than the first preset jitter parameters.
5. The image acquisition method according to claim 1, wherein the selecting and outputting at least one frame of the RAW image from the storage area at the time of receiving the photographing request instruction further comprises:
when the photographing device works in a second photographing mode, selecting at least one frame of the RAW image from all the RAW images in the storage area and outputting the RAW image.
6. The image acquisition method according to claim 2 or 5, wherein each frame of the RAW image has a time stamp identifying a shooting time of the corresponding RAW image; the selecting and outputting at least one frame of the RAW image from all the RAW images in the storage area comprises:
selecting and outputting one frame of RAW image with the shooting time closest to the time of receiving the shooting request instruction from all the RAW images in the storage area; or
selecting and outputting a plurality of frames of the RAW images of which the difference value between the shooting time and the time of receiving the shooting request instruction is smaller than the preset time difference from all the RAW images in the storage area.
7. A photographing apparatus comprising a processor configured to:
continuously acquiring RAW images, and storing the acquired multi-frame RAW images into a storage area, wherein the storage capacity of the storage area is set according to the response delay time of a user; and
at the moment of receiving a photographing request instruction, selecting at least one frame of the RAW image from the storage area and outputting the RAW image, wherein all the RAW images stored in the storage area are obtained within the response delay time.
8. The imaging apparatus of claim 7, wherein the processor is further configured to:
when the photographing device works in a first photographing mode, acquiring the shaking parameters of a plurality of frames of the RAW images in the storage area;
comparing the jitter parameter of the frames of the RAW images in the storage area with a first preset jitter parameter;
if the number of the RAW images with the jitter parameters smaller than the first preset jitter parameters in the storage area is larger than the preset number, selecting and outputting at least one frame of the RAW images from the RAW images with the jitter parameters smaller than the first preset jitter parameters; and
if the number of the RAW images of which the jitter parameters are smaller than the first preset jitter parameters in the storage area is smaller than the preset number, selecting at least one frame of the RAW images from all the RAW images in the storage area and outputting the RAW images.
9. The imaging apparatus of claim 8, wherein the processor is further configured to:
if the number of the RAW images with the jitter parameters smaller than the first preset jitter parameters in the storage area is larger than the preset number, selecting and outputting the RAW images with the jitter parameters smaller than second preset jitter parameters from the RAW images with the jitter parameters smaller than the first preset jitter parameters, wherein the second preset jitter parameters are smaller than the first preset jitter parameters.
10. The photographing apparatus according to claim 8, wherein each frame of the RAW image has a time stamp identifying a photographing time of the corresponding RAW image; the processor is further configured to:
when the number of the RAW images with the jitter parameters smaller than the first preset jitter parameters in the storage area is larger than the preset number, selecting and outputting one frame of the RAW image with the shooting time furthest away from the time of receiving the shooting request instruction from the RAW images with the jitter parameters smaller than the first preset jitter parameters; or
when the number of the RAW images with the jitter parameters smaller than the first preset jitter parameters in the storage area is larger than the preset number, selecting and outputting, from the RAW images with the jitter parameters smaller than the first preset jitter parameters, a plurality of frames of the RAW images of which the difference value between the shooting time and the time of receiving the shooting request instruction is larger than the preset time difference.
11. The imaging apparatus of claim 7, wherein the processor is further configured to:
when the photographing device works in a second photographing mode, selecting at least one frame of the RAW image from all the RAW images in the storage area and outputting the RAW image.
12. The photographing apparatus according to claim 8 or 11, wherein each frame of the RAW image has a time stamp identifying a photographing time of the corresponding RAW image; the processor is further configured to:
selecting and outputting one frame of RAW image with the shooting time closest to the time of receiving the shooting request instruction from all the RAW images in the storage area; or
selecting and outputting a plurality of frames of the RAW images of which the difference value between the shooting time and the time of receiving the shooting request instruction is smaller than the preset time difference from all the RAW images in the storage area.
13. An electronic device, characterized in that the electronic device comprises:
the photographing apparatus of any one of claims 7 to 12; and
a housing, the photographing apparatus being combined with the housing.
14. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the image acquisition method of any one of claims 1 to 6.
CN202010165161.1A 2020-03-11 2020-03-11 Image acquisition method, photographing device, electronic equipment and readable storage medium Active CN111385475B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010165161.1A CN111385475B (en) 2020-03-11 2020-03-11 Image acquisition method, photographing device, electronic equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010165161.1A CN111385475B (en) 2020-03-11 2020-03-11 Image acquisition method, photographing device, electronic equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN111385475A true CN111385475A (en) 2020-07-07
CN111385475B CN111385475B (en) 2021-09-10

Family

ID=71221667

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010165161.1A Active CN111385475B (en) 2020-03-11 2020-03-11 Image acquisition method, photographing device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN111385475B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002271673A (en) * 2001-03-09 2002-09-20 Fuji Photo Film Co Ltd Electronic camera and static image recording method
CN1501695A (en) * 2002-11-18 2004-06-02 矽峰光电科技股份有限公司 Method for amending internal delay in digital camera imaging
CN103442169A (en) * 2013-07-29 2013-12-11 北京智谷睿拓技术服务有限公司 Method for having control over shooting function of image collection device and image collection device
CN105323484A (en) * 2015-10-29 2016-02-10 惠州Tcl移动通信有限公司 Rapidly photographing method and electronic device
CN107509034A (en) * 2017-09-22 2017-12-22 维沃移动通信有限公司 A kind of image pickup method and mobile terminal
CN109993722A (en) * 2019-04-09 2019-07-09 Oppo广东移动通信有限公司 Image processing method, device, storage medium and electronic equipment
CN109922322A (en) * 2019-04-10 2019-06-21 Oppo广东移动通信有限公司 Photographic method, image processor, camera arrangement and electronic equipment

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022204925A1 (en) * 2021-03-30 2022-10-06 华为技术有限公司 Image obtaining method and related equipment
CN113132637A (en) * 2021-04-19 2021-07-16 Oppo广东移动通信有限公司 Image processing method, image processing chip, application processing chip and electronic equipment
CN113132637B (en) * 2021-04-19 2023-04-07 Oppo广东移动通信有限公司 Image processing method, image processing chip, application processing chip and electronic equipment
CN113660426A (en) * 2021-09-08 2021-11-16 英望科技(山东)有限公司 Method and system for preventing photographing jitter of user and computer readable storage medium
CN113660426B (en) * 2021-09-08 2023-07-18 英望科技(山东)有限公司 Method and system for preventing user photographing shake and computer readable storage medium
CN115225826A (en) * 2022-06-30 2022-10-21 联想(北京)有限公司 Image shooting method, device, equipment and storage medium
CN117692753A (en) * 2023-08-25 2024-03-12 上海荣耀智慧科技开发有限公司 Photographing method and electronic equipment

Also Published As

Publication number Publication date
CN111385475B (en) 2021-09-10

Similar Documents

Publication Publication Date Title
CN111385475B (en) Image acquisition method, photographing device, electronic equipment and readable storage medium
CN109922322B (en) Photographing method, image processor, photographing device and electronic equipment
US8937667B2 (en) Image communication apparatus and imaging apparatus
JP6439837B2 (en) Electronic device and display method
CN107743191B (en) Terminal, anti-shake photographing method thereof and storage device
US10958820B2 (en) Intelligent interface for interchangeable sensors
RU2415513C1 (en) Image recording apparatus, image recording method, image processing apparatus, image processing method and programme
CN113382169B (en) Photographing method and electronic equipment
JP5558852B2 (en) Information processing apparatus, control method thereof, and program
CN112399087B (en) Image processing method, image processing apparatus, image capturing apparatus, electronic device, and storage medium
CN110300240B (en) Image processor, image processing method, photographing device and electronic equipment
US10187575B2 (en) Image acquisition apparatus, method of controlling image acquisition apparatus, computer-readable recording medium non-transitorily storing control program of image acquisition apparatus, and image acquisition system
CN107809590B (en) Photographing method and device
EP2632148B1 (en) Electronic device and image processing method thereof
US8605170B2 (en) Imaging device, method of processing captured image signal and computer program
CN105554372A (en) Photographing method and device
WO2015180683A1 (en) Mobile terminal, method and device for setting image pickup parameters, and computer storage medium
CN113329176A (en) Image processing method and related device applied to camera of intelligent terminal
CN113890998A (en) Image data processing method and device, electronic equipment and storage medium
CN110401800B (en) Image processing method, image processor, photographing device and electronic equipment
JP2019029954A (en) Image processing system and image processing method
JP5562101B2 (en) IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, AND PROGRAM
US10944899B2 (en) Image processing device and image processing method
CN115767262A (en) Photographing method and electronic equipment
US8994844B2 (en) Image processing apparatus that synthesizes acquired images to generate successive images, control method therefor, and image pickup apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant