WO2020029732A1 - Panoramic shooting method and apparatus, and imaging device - Google Patents

Panoramic shooting method and apparatus, and imaging device

Info

Publication number
WO2020029732A1
WO2020029732A1, PCT/CN2019/095092, CN2019095092W
Authority
WO
WIPO (PCT)
Prior art keywords
exposure
image
camera
frame
duration
Prior art date
Application number
PCT/CN2019/095092
Other languages
English (en)
French (fr)
Inventor
李小朋
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司
Publication of WO2020029732A1

Links

Images

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/698: Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70: Circuitry for compensating brightness variation in the scene
    • H04N 23/741: Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors

Definitions

  • the present application relates to the technical field of mobile terminals, and in particular, to a method, a device, and an imaging device for panoramic shooting.
  • This application is intended to solve at least one of the technical problems in the related technology.
  • this application proposes a panoramic shooting method, which uses dual cameras to obtain multiple frames of images with multiple exposure durations, combines the corresponding images of different exposure durations into each frame of target image, and then stitches the target images into a panoramic image, improving the shooting efficiency of panoramic shooting.
  • the present application proposes a panoramic photographing device.
  • the present application proposes an imaging device.
  • the present application proposes a computer device.
  • the present application proposes a computer-readable storage medium.
  • An embodiment of one aspect of the present application provides a panoramic shooting method, which is applied to an imaging device.
  • the imaging device includes a first camera and a second camera.
  • the method includes:
  • during the panoramic shooting process, controlling the first camera to capture multiple frames of the first exposure image with a first exposure duration, and controlling the second camera to capture multiple frames of the second exposure image with an exposure duration shorter than the first exposure duration;
  • performing pixel synthesis on the corresponding first exposure image and second exposure image to obtain each frame of target image; and
  • performing imaging-picture synthesis on each frame of target image to obtain a panoramic image.
  • An embodiment of still another aspect of the present application provides a panoramic photographing device, which is applied to an imaging device, where the imaging device includes a first camera and a second camera, and the device includes:
  • a control module configured to, during the panoramic shooting process, control the first camera to capture multiple frames of the first exposure image with a first exposure duration and control the second camera to capture multiple frames of the second exposure image with an exposure duration shorter than the first exposure duration;
  • a synthesis module configured to perform pixel synthesis on the corresponding first exposure image and second exposure image to obtain each frame of target image; and
  • a stitching module configured to perform imaging-picture synthesis on each frame of target image to obtain a panoramic image.
  • An embodiment of another aspect of the present application provides an imaging device.
  • the imaging device includes a first camera and a second camera.
  • the imaging device further includes a processor.
  • the processor is configured to:
  • during the panoramic shooting process, control the first camera to capture multiple frames of the first exposure image with a first exposure duration, and control the second camera to capture multiple frames of the second exposure image with an exposure duration shorter than the first exposure duration;
  • perform pixel synthesis on the corresponding first exposure image and second exposure image to obtain each frame of target image; and
  • perform imaging-picture synthesis on each frame of target image to obtain a panoramic image.
  • An embodiment of another aspect of the present application provides a computer device, including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the panoramic shooting method according to the foregoing aspect is implemented.
  • An embodiment of still another aspect of the present application provides a computer-readable storage medium having a computer program stored thereon, and when the processor executes the program, the panoramic shooting method according to the foregoing aspect is implemented.
  • during the panoramic shooting process, the first camera is controlled to capture multiple frames of the first exposure image with a first exposure duration, the second camera is controlled to capture multiple frames of the second exposure image with an exposure duration shorter than the first exposure duration, the corresponding first exposure image and second exposure image are pixel-synthesized to obtain each frame of target image, and imaging-picture synthesis is performed on each frame of target image to obtain a panoramic image.
  • in this way, images with different exposure durations are obtained through the dual cameras, the images corresponding to the different exposure durations are pixel-synthesized into each frame of target image, and the target images are combined into a panoramic image, which improves the shooting efficiency of panoramic shooting.
  • FIG. 1 is a schematic flowchart of a panoramic shooting method according to an embodiment of the present application
  • FIG. 2 is a schematic flowchart of another panoramic shooting method according to an embodiment of the present application.
  • FIG. 3 is a schematic diagram of an LTM mapping curve provided by an embodiment of the present application.
  • FIG. 4 is a schematic structural diagram of a panoramic photographing apparatus according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of the internal structure of the computer device 200 in one embodiment.
  • FIG. 6 is a schematic diagram of an image processing circuit 90 in one embodiment.
  • FIG. 1 is a schematic flowchart of a panoramic shooting method according to an embodiment of the present application.
  • the method includes the following steps:
  • step 101 during the panoramic shooting process, the first camera is controlled to capture multiple frames of the first exposure image with the first exposure duration, and the second camera is controlled to capture multiple frames of the second exposure image with the exposure duration shorter than the first exposure duration.
  • the first camera is a color camera
  • the second camera is a black-and-white camera
  • there is no primary or secondary distinction between the first camera and the second camera, and they are given the same weight when capturing images.
  • specifically, during the panoramic shooting process, the first camera is controlled to capture multiple frames of the first exposure image with the first exposure duration; during the process of the first camera capturing one frame of the first exposure image, the second camera is controlled to capture a corresponding frame of short-exposure second exposure image with a short exposure duration and a corresponding frame of medium-exposure second exposure image with a medium exposure duration.
  • the medium exposure duration is shorter than the first exposure duration, and the short exposure duration is shorter than the medium exposure duration.
  • it should be understood that the first camera captures the first exposure image with the first exposure duration, while the second camera captures the corresponding second exposure images with the short exposure duration and the medium exposure duration respectively, and the first camera should shoot in synchronization with the second camera.
  • synchronization, however, does not require the two cameras to start at the same time or to end at the same time; the first camera may start capturing images at the same moment as the second camera, or there may be a certain time difference between them, and this embodiment does not limit the execution timing.
  • in one scenario, if the first exposure duration is greater than the sum of the short exposure duration and the medium exposure duration, the second camera may start capturing images at the same time as the first camera, or it may delay its start, where the delay time is equal to the absolute value of the difference between the sum of the short and medium exposure durations and the first exposure duration.
  • in another scenario, if the first exposure duration is less than or equal to the sum of the short exposure duration and the medium exposure duration, the second camera and the first camera start capturing images at the same time, which reduces the shooting time and improves the shooting efficiency.
  • in addition, the second camera may capture the corresponding short-exposure second exposure image either before or after it captures the corresponding medium-exposure second exposure image, which is not limited in this embodiment; a small sketch of the timing rule follows.
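  • A minimal Python sketch (not part of the embodiment; the function name and the example durations are illustrative assumptions) of the start-offset rule described above:

```python
def second_camera_start_offset(t_first: float, t_short: float, t_medium: float) -> float:
    """Return an optional start delay (seconds) for the second camera.

    If the first exposure duration exceeds the combined short + medium
    exposure durations, the second camera may start together with the first
    camera or be delayed by |(t_short + t_medium) - t_first|; otherwise both
    cameras start at the same time to minimise total shooting time.
    """
    if t_first > t_short + t_medium:
        return abs((t_short + t_medium) - t_first)
    return 0.0


if __name__ == "__main__":
    # e.g. 1/15 s long exposure, 1/120 s short exposure, 1/60 s medium exposure
    print(second_camera_start_offset(1 / 15, 1 / 120, 1 / 60))  # ~0.0417 s
```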
  • Step 102 Perform pixel synthesis on the corresponding first exposure image and second exposure image to obtain each frame target image.
  • specifically, the corresponding pixels in the corresponding first exposure image and second exposure image are weighted and combined to obtain a corresponding frame of target image.
  • similarly, each frame of target image can be synthesized in this way.
  • step 103 imaging frames are synthesized for each frame of the target image to obtain a panoramic image.
  • specifically, based on the obtained target images, imaging-picture synthesis is performed on each frame of target image.
  • as a possible implementation, similar pixel portions in two adjacent frames are merged according to the order in which the frames were captured, so that the two adjacent frames of target image are stitched together, thereby achieving imaging-picture synthesis and obtaining a panoramic image; see the sketch below.
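  • The embodiment does not prescribe a particular merging algorithm beyond combining similar pixel portions of adjacent frames in shooting order; the Python sketch below shows one simple way such a merge could look, assuming the target frames are already aligned H×W×C arrays that overlap by a fixed number of columns, with a linear alpha blend as an illustrative choice rather than the claimed method.

```python
import numpy as np

def stitch_pair(left: np.ndarray, right: np.ndarray, overlap: int) -> np.ndarray:
    """Blend the overlapping columns of two adjacent, pre-aligned target frames."""
    alpha = np.linspace(1.0, 0.0, overlap)[None, :, None]   # fade from left frame to right frame
    blended = left[:, -overlap:] * alpha + right[:, :overlap] * (1.0 - alpha)
    return np.concatenate([left[:, :-overlap], blended, right[:, overlap:]], axis=1)

def stitch_panorama(frames: list[np.ndarray], overlap: int) -> np.ndarray:
    """Stitch target frames together in the order they were captured."""
    panorama = frames[0].astype(np.float32)
    for frame in frames[1:]:
        panorama = stitch_pair(panorama, frame.astype(np.float32), overlap)
    return panorama
```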
  • in the panoramic shooting method of this embodiment, during the panoramic shooting process, the first camera is controlled to capture multiple frames of the first exposure image with a first exposure duration, the second camera is controlled to capture multiple frames of the second exposure image with an exposure duration shorter than the first exposure duration, the corresponding first exposure image and second exposure image are pixel-synthesized to obtain each frame of target image, and imaging-picture synthesis is performed on each frame of target image to obtain a panoramic image.
  • in this embodiment, multiple frames of images with different exposure durations are obtained through the dual cameras, the images corresponding to the different exposure durations are pixel-synthesized into each frame of target image, and the target images are combined into a panoramic image, which improves the shooting efficiency of panoramic shooting; this differs from the prior art, in which a single camera is used to capture high-dynamic-range panoramic images: a single camera can only output one frame of image per unit time, so its shooting efficiency is low and it cannot adapt to panoramic shooting scenarios.
  • FIG. 2 is a schematic flowchart of another panoramic shooting method provided by an embodiment of the present application.
  • the method may include the following steps:
  • Step 201 During the panoramic shooting process, control the first camera to shoot multiple frames of the first exposure image for the first exposure duration.
  • the first camera is a color camera.
  • the first exposure duration used by the first camera is preset; during the panoramic shooting process, the first camera is controlled to first capture one frame of the first exposure image with the preset first exposure duration.
  • subsequent frames of the first exposure image can then be captured with the adjusted first exposure duration; the adjustment method is described in detail in the following steps.
  • Step 202: during the process of the first camera capturing one frame of the first exposure image, control the second camera to capture a corresponding frame of short-exposure second exposure image with a short exposure duration, and a corresponding frame of medium-exposure second exposure image with a medium exposure duration.
  • the second camera is a black and white camera.
  • specifically, the first camera and the second camera acquire images at the same time; during the process of the first camera capturing one frame of the first exposure image, the second camera captures a corresponding frame of short-exposure second exposure image and a corresponding frame of medium-exposure second exposure image with the preset short exposure duration and medium exposure duration, respectively.
  • a plurality of frames of short-exposure second exposure images are obtained by shooting according to the adjusted short-exposure duration.
  • the method for adjusting the short-exposure duration will be described in detail in the following steps.
  • it should be noted that the first exposure image captured by the first camera and the second exposure images captured by the second camera correspond to the same picture; they are simply acquired with different cameras and exposure durations.
  • Step 203 Adjust the first exposure duration according to a frame of the first exposure image.
  • in practical applications, under different ambient brightness, a first exposure image exposed with the preset first exposure duration may be underexposed; therefore, during shooting, the first exposure duration needs to be adjusted according to a captured frame of the first exposure image to obtain a corrected first exposure duration.
  • in the embodiment of the present application, a long exposure histogram is generated from the currently captured frame of the first exposure image, and an underexposed area is identified according to the long exposure histogram, where the underexposed area is the area of pixels whose first initial brightness in the long exposure histogram is less than a first reference brightness.
  • iterative control is performed so that, in the long exposure histogram of another frame of the first exposure image captured subsequently, the brightness of the area corresponding to the underexposed area is greater than or equal to the first reference brightness.
  • the adjustment exposure duration required for the first initial brightness to become the first reference brightness is recorded, and the adjusted first exposure duration is obtained from the first exposure duration used in the current shooting and the adjustment exposure duration.
  • specifically, the first exposure duration is adjusted according to the currently captured frame of the first exposure image as follows.
  • first, a long exposure histogram is generated from a plurality of pieces of original pixel information in the first exposure image; the long exposure histogram characterizes the brightness distribution of the original pixel information.
  • next, a plurality of pixels whose first initial brightness is less than the first reference brightness are determined from the long exposure histogram; these pixels correspond to the underexposed area of the current first exposure image.
  • to give the underexposed area sufficient exposure, that is, to make the brightness of the area corresponding to the underexposed area in the long exposure histogram of the next preview frame greater than or equal to the first reference brightness, an adjustment first exposure duration is determined first.
  • the pixels are then controlled to expose with the adjusted first exposure duration, and a plurality of pieces of adjusted pixel information are output.
  • an adjusted long exposure histogram is generated from the adjusted pixel information, and pixels whose third initial brightness is less than the first reference brightness are searched for in it; such pixels correspond to the underexposed area of the next preview frame.
  • if no such pixels are found, the next preview frame has no underexposed area; if such pixels are still found, the underexposed area still exists and the first exposure duration of the pixels continues to be adjusted, and this repeats until no underexposed area exists in the preview image, at which point the first exposure duration corresponding to the preview image without an underexposed area is the adjusted first exposure duration.
  • a first exposure image is acquired according to the modified first exposure duration.
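  • As an illustration of the iterative loop in step 203, the sketch below raises the first exposure duration until the long exposure histogram no longer contains pixels below the first reference brightness; the capture callable, the 8-bit luminance assumption, the reference value of 40, and the multiplicative update rule are all assumptions, since the embodiment does not say how the adjustment exposure duration is computed.

```python
import numpy as np

def adjust_long_exposure(capture, t_first, first_ref_brightness=40, max_iters=8):
    """Increase the first (long) exposure until no under-exposed area remains."""
    for _ in range(max_iters):
        frame = capture(t_first)                                  # one frame of the first exposure image (8-bit)
        hist, _ = np.histogram(frame, bins=256, range=(0, 256))   # long exposure histogram
        if hist[:first_ref_brightness].sum() == 0:                # no pixel below the first reference brightness
            return t_first                                        # -> no under-exposed area, keep this duration
        dark = frame[frame < first_ref_brightness]                # pixels forming the under-exposed area
        # lengthen the exposure so the dark region's mean brightness approaches the reference
        t_first *= first_ref_brightness / max(float(dark.mean()), 1.0)
    return t_first
```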
  • step 204 the short exposure duration is adjusted according to a short exposure second exposure image.
  • in practical applications, under different ambient brightness, a short-exposure second exposure image exposed with the preset short exposure duration may be overexposed.
  • therefore, during shooting, the short exposure duration needs to be adjusted according to a captured frame of the short-exposure second exposure image to obtain a corrected short exposure duration.
  • in the embodiment of the present application, a short exposure histogram is generated from the currently captured frame of the short-exposure second exposure image, and an overexposed area is identified according to the short exposure histogram, where the overexposed area is the area of pixels whose second initial brightness in the short exposure histogram is greater than a second reference brightness; iterative control is performed so that, in the short exposure histogram of another frame of the short-exposure second exposure image captured subsequently, the brightness of the area corresponding to the overexposed area is less than or equal to the second reference brightness; the adjustment short exposure duration required for the second initial brightness to become the second reference brightness is recorded, and the adjusted short exposure duration is obtained from the short exposure duration used in the current shooting and the adjustment short exposure duration.
  • a short exposure duration is adjusted according to a short exposure second exposure image, and a short exposure histogram is first generated according to a plurality of original pixel information respectively output by a plurality of pixels in the short exposure second exposure image.
  • the short exposure histogram characterizes the brightness distribution of multiple original pixel information.
  • a plurality of pixels whose second initial brightness corresponding to the original pixel information is greater than the second reference brightness are determined, and these pixels correspond to overexposed regions in the current preview image.
  • to reduce the exposure of the overexposed area in the current preview image, that is, to make the brightness of the area corresponding to the overexposed area in the short exposure histogram of the next preview frame less than or equal to the second reference brightness, an adjustment short exposure duration is determined first; the pixels are then controlled to expose with the adjusted short exposure duration, and a plurality of pieces of adjusted pixel information are output; an adjusted short exposure histogram is generated from the adjusted pixel information, and pixels whose fourth initial brightness is greater than the second reference brightness are searched for in it; such pixels correspond to the overexposed area of the next preview frame.
  • if no such pixels are found, the next preview frame has no overexposed area; otherwise, the short exposure duration of the pixels continues to be adjusted, and this repeats until no overexposed area exists in the preview image, at which point the short exposure duration corresponding to the preview image without an overexposed area is the adjusted short exposure duration.
  • a short-exposure second exposure image is acquired according to the modified short-exposure duration.
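  • The short exposure duration can be adjusted with the mirror image of the previous sketch, shrinking the exposure until the short exposure histogram shows no pixels above the second reference brightness; the same caveats about assumed thresholds and the assumed update rule apply.

```python
import numpy as np

def adjust_short_exposure(capture, t_short, second_ref_brightness=220, max_iters=8):
    """Shorten the short exposure until no over-exposed area remains."""
    for _ in range(max_iters):
        frame = capture(t_short)                                  # one frame of the short-exposure second exposure image
        hist, _ = np.histogram(frame, bins=256, range=(0, 256))   # short exposure histogram
        if hist[second_ref_brightness + 1:].sum() == 0:           # no pixel above the second reference brightness
            return t_short                                        # -> no over-exposed area, keep this duration
        bright = frame[frame > second_ref_brightness]             # pixels forming the over-exposed area
        t_short *= second_ref_brightness / float(bright.mean())   # shorten exposure toward the reference
    return t_short
```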
  • step 203 may be performed before step 204, or may be performed after step 204.
  • Step 205 Perform pixel synthesis on the corresponding first exposure image and second exposure image to obtain each frame target image.
  • in the embodiment of the present application, the corresponding pixels in the corresponding first exposure image and second exposure image are weighted and combined; specifically, the pixels in the images corresponding to the three different exposure durations are given different weights, the corresponding pixel in the image of each exposure duration is multiplied by its weight, and the three weighted pixels are added together as the synthesized pixel corresponding to that pixel position; all pixels are weighted and combined in this way to obtain the synthesized pixels and thus a corresponding frame of target image, and likewise each frame of target image can be obtained.
  • for example, let R_long be a pixel in one frame of the first exposure image, R_short the corresponding pixel in the corresponding frame of short-exposure second exposure image, and R_medium the corresponding pixel in the corresponding frame of medium-exposure second exposure image; the merged pixel information corresponding to one pixel in a frame is recorded as M, where M = R_long × a + R_short × b + R_medium × c, and a, b, and c are the weights corresponding to the pixel in the first exposure image, the corresponding pixel in the short-exposure second exposure image, and the corresponding pixel in the medium-exposure second exposure image, respectively; the merged frame of target image is obtained in this way, and likewise each frame of target image can be obtained.
  • it should be noted that the weights corresponding to the pixel in the first exposure image, the corresponding pixel in the medium-exposure second exposure image, and the corresponding pixel in the short-exposure second exposure image may be determined from experimental data, or from the brightness information of the current shooting scene and the like, which is not limited in this embodiment; a minimal numeric sketch follows.
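  • The weighted synthesis M = R_long × a + R_short × b + R_medium × c can be written in a few lines; in the sketch below the weights a, b, and c are placeholder values, since the embodiment leaves them to experimental data or to the brightness of the current scene.

```python
import numpy as np

def synthesize_target_frame(r_long: np.ndarray, r_short: np.ndarray, r_medium: np.ndarray,
                            a: float = 0.5, b: float = 0.2, c: float = 0.3) -> np.ndarray:
    """Merge aligned long-, short- and medium-exposure frames pixel by pixel."""
    # merged pixel information M for every pixel position (same weights applied everywhere here)
    return (r_long.astype(np.float32) * a
            + r_short.astype(np.float32) * b
            + r_medium.astype(np.float32) * c)
```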
  • Step 206 For each frame of the target image, perform gray-level compression on the merged pixel information in a frame of image.
  • since the gray level of each merged pixel computed from original pixels of three different exposure durations will vary, gray-level compression needs to be performed on each piece of merged pixel information after it is obtained; after compression, the target image can be obtained by interpolation over the multiple pieces of compressed merged pixel information. In this way, the dark parts of the target image have been compensated by the original pixel information output by the exposed pixels of the first camera, and the bright parts have been suppressed by the original pixel information output by the exposed pixels of the second camera; therefore, the target image has no overexposed or underexposed areas, and has a higher dynamic range and a better imaging effect.
  • specifically, for each frame of target image, gray-level compression is performed on each piece of merged pixel information in the frame.
  • as a possible implementation, Local Tone Mapping (LTM) may be used to compress each piece of merged pixel information in each frame; for example, the synthesized RAW data is compressed into a 10-bit output, so that each frame of target image is presented in a suitable brightness domain and can be adapted for display on the screen.
  • to express more detail in bright and dark areas, the LTM curve can be set as shown in FIG. 3, which is a schematic diagram of the LTM mapping curve provided by an embodiment of the present application; after the LTM mapping, the appearance characteristics of the original image are preserved as much as possible, and the target image has no overexposed or underexposed areas, a higher dynamic range, and a better imaging effect; a stand-in sketch of the compression step follows.
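  • The embodiment applies LTM with the curve of FIG. 3 to bring the merged RAW data into a 10-bit range, but the curve itself is not given; the sketch below therefore uses a simple global gamma-style mapping purely to illustrate the compression to 10 bits, not the LTM curve of the embodiment.

```python
import numpy as np

def compress_to_10bit(merged: np.ndarray, gamma: float = 0.6) -> np.ndarray:
    """Map merged pixel information into a 10-bit output range [0, 1023]."""
    normalized = merged / max(float(merged.max()), 1e-6)   # scale merged values to [0, 1]
    mapped = np.power(normalized, gamma)                   # lift shadows, roll off highlights
    return np.clip(mapped * 1023.0, 0, 1023).astype(np.uint16)
```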
  • Step 207 splicing the target images of each frame to obtain a panoramic image.
  • specifically, based on the obtained target images, imaging-picture synthesis is performed on each frame of target image.
  • as a possible implementation, similar pixel portions in two adjacent frames are merged according to the order in which the frames were captured, so that the two adjacent frames of target image are stitched together, thereby achieving imaging-picture synthesis and obtaining a panoramic image.
  • in the panoramic shooting method of this embodiment, during the panoramic shooting process, the first camera is controlled to capture multiple frames of the first exposure image with the first exposure duration, and the second camera is controlled to capture multiple frames of short-exposure second exposure image with the short exposure duration and multiple frames of medium-exposure second exposure image with the medium exposure duration.
  • the first exposure duration is adjusted according to one frame of the first exposure image so that the first exposure image acquired with the first exposure duration receives sufficient exposure, and the short exposure duration is adjusted according to one frame of short-exposure second exposure image so that the exposure of overexposed areas can be reduced; this achieves dynamic adjustment of the exposure durations and acquisition of image data with multiple exposure durations, which improves the image quality of panoramic shooting.
  • the corresponding first exposure image, medium-exposure second exposure image, and short-exposure second exposure image are pixel-synthesized to obtain each frame of target image, and gray-level compression is performed on each frame of target image, which improves the fidelity of the target image to the original image so that more detail can be expressed in bright and dark areas; the frames of target image are combined to obtain a panoramic image, which improves the shooting efficiency of panoramic shooting.
  • the present application also proposes a panoramic photographing apparatus, which is applied to an imaging device, and the imaging device includes a first camera and a second camera.
  • FIG. 4 is a schematic structural diagram of a panoramic photographing apparatus according to an embodiment of the present application.
  • the device includes a control module 41, a synthesis module 42, and a splicing module 43.
  • a control module 41 is configured to, during the panoramic shooting process, control the first camera to capture multiple frames of the first exposure image with the first exposure duration, and control the second camera to capture multiple frames of the second exposure image with an exposure duration shorter than the first exposure duration.
  • a combining module 42 is configured to perform pixel synthesis on the corresponding first exposure image and second exposure image to obtain a target image of each frame.
  • the stitching module 43 is configured to synthesize imaging frames of target images of each frame to obtain a panoramic image.
  • control module 41 includes a first control unit and a second control unit.
  • the first control unit is configured to control the first camera to capture multiple frames of the first exposure image with the first exposure duration; the first camera is a color camera.
  • the second control unit is configured to, during the process of the first camera capturing one frame of the first exposure image, control the second camera to capture a corresponding frame of short-exposure second exposure image with a short exposure duration and a corresponding frame of medium-exposure second exposure image with a medium exposure duration; the second camera is a black-and-white camera, where the medium exposure duration is shorter than the first exposure duration and the short exposure duration is shorter than the medium exposure duration.
  • control module 41 may further include a first adjustment unit and a second adjustment unit.
  • the first adjusting unit is configured to adjust a first exposure duration according to a first exposure image of one frame.
  • the second adjusting unit is configured to adjust the short exposure duration according to a frame of the short-exposure second exposure image.
  • the first adjustment unit is specifically configured to:
  • generate a long exposure histogram from the currently captured frame of the first exposure image;
  • identify an underexposed area according to the long exposure histogram, where the underexposed area is the area in which the first initial brightness in the long exposure histogram is less than the first reference brightness;
  • perform iterative control so that, in the long exposure histogram of another frame of the first exposure image captured subsequently, the brightness of the area corresponding to the underexposed area is greater than or equal to the first reference brightness;
  • record the adjustment exposure duration required for the first initial brightness to become the first reference brightness; and
  • obtain the adjusted first exposure duration according to the first exposure duration used for the current shooting and the adjustment exposure duration.
  • the second adjustment unit is specifically configured to:
  • generate a short exposure histogram from the currently captured frame of the short-exposure second exposure image;
  • identify an overexposed area according to the short exposure histogram, where the overexposed area is the area in which the second initial brightness in the short exposure histogram is greater than the second reference brightness;
  • perform iterative control so that, in the short exposure histogram of another frame of the short-exposure second exposure image captured subsequently, the brightness of the area corresponding to the overexposed area is less than or equal to the second reference brightness;
  • record the adjustment short exposure duration required for the second initial brightness to become the second reference brightness; and
  • obtain the adjusted short exposure duration according to the short exposure duration used for the current shooting and the adjustment short exposure duration.
  • the foregoing synthesizing module 42 is specifically configured to:
  • the corresponding pixels in the corresponding first exposure image and the second exposure image are weighted and combined to obtain a corresponding one-frame target image.
  • the apparatus further includes: a compression module.
  • a compression module is used to compress the gray level of each merged pixel information in a target image.
  • in the panoramic photographing apparatus of this embodiment, during the panoramic shooting process, the first camera is controlled to capture multiple frames of the first exposure image with the first exposure duration, and the second camera is controlled to capture multiple frames of short-exposure second exposure image with the short exposure duration and multiple frames of medium-exposure second exposure image with the medium exposure duration.
  • the first exposure duration is adjusted according to one frame of the first exposure image so that the first exposure image acquired with the first exposure duration receives sufficient exposure, and the short exposure duration is adjusted according to one frame of short-exposure second exposure image so that the exposure of overexposed areas can be reduced; this achieves dynamic adjustment of the exposure durations and acquisition of image data with multiple exposure durations, which improves the image quality of panoramic shooting.
  • the corresponding first exposure image, medium-exposure second exposure image, and short-exposure second exposure image are pixel-synthesized to obtain each frame of target image, and gray-level compression is performed on each frame of target image, which improves the fidelity of the target image to the original image so that more detail can be expressed in bright and dark areas; the frames of target image are combined to obtain a panoramic image, which improves the shooting efficiency of panoramic shooting.
  • the present application further provides an imaging device.
  • the imaging device includes a first camera and a second camera.
  • the imaging device further includes a processor.
  • the processor is configured to:
  • during the panoramic shooting process, control the first camera to capture multiple frames of the first exposure image with a first exposure duration, and control the second camera to capture multiple frames of the second exposure image with an exposure duration shorter than the first exposure duration; perform pixel synthesis on the corresponding first exposure image and second exposure image to obtain each frame of target image; and perform imaging-picture synthesis on each frame of target image to obtain a panoramic image; a short end-to-end sketch follows.
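  • For orientation only, the sketch below strings the earlier illustrative helpers into the processor's three steps; the capture callables, the frame count, and the overlap width are hypothetical and not taken from the embodiment.

```python
def shoot_panorama(capture_first, capture_second, t_first, t_short, t_medium,
                   num_frames=8, overlap=64):
    """Illustrative pipeline: capture, merge, compress, adjust exposures, stitch."""
    target_frames = []
    for _ in range(num_frames):
        r_long = capture_first(t_first)          # first camera, first (long) exposure duration
        r_short = capture_second(t_short)        # second camera, short exposure duration
        r_medium = capture_second(t_medium)      # second camera, medium exposure duration
        merged = synthesize_target_frame(r_long, r_short, r_medium)
        target_frames.append(compress_to_10bit(merged))
        # adjust the exposure durations for the next frame (helpers defined above)
        t_first = adjust_long_exposure(capture_first, t_first)
        t_short = adjust_short_exposure(capture_second, t_short)
    return stitch_panorama(target_frames, overlap)
```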
  • an embodiment of the present application further provides a computer device including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the panoramic shooting method described in the foregoing method embodiments is implemented.
  • FIG. 5 is a schematic diagram of the internal structure of the computer device 200 in one embodiment.
  • the computer device 200 includes a processor 60, a memory 50 (for example, a non-volatile storage medium), an internal memory 82, a display screen 83, and an input device 84 connected through a system bus 81.
  • the memory 50 of the computer device 200 stores an operating system and computer-readable instructions.
  • the computer-readable instructions can be executed by the processor 60 to implement the control method in the embodiment of the present application.
  • the processor 60 is used to provide computing and control capabilities to support the operation of the entire computer device 200.
  • the internal memory 82 of the computer device 200 provides an environment for the execution of the computer-readable instructions stored in the memory 50.
  • the display screen 83 of the computer device 200 may be a liquid crystal display or an electronic ink display, etc.
  • the input device 84 may be a touch layer covering the display screen 83, a button, trackball, or touchpad provided on the casing of the computer device 200, or an external keyboard, trackpad, or mouse.
  • the computer device 200 may be a mobile phone, a tablet computer, a notebook computer, a personal digital assistant, or a wearable device (such as a smart bracelet, a smart watch, a smart helmet, or smart glasses).
  • those skilled in the art can understand that the structure shown in FIG. 5 is only a schematic diagram of part of the structure related to the solution of the present application and does not constitute a limitation on the computer device 200 to which the solution is applied; a specific computer device 200 may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
  • the computer device 200 includes an image processing circuit 90.
  • the image processing circuit 90 may be implemented using hardware and/or software components, and includes various processing units that define an ISP (Image Signal Processing) pipeline.
  • FIG. 6 is a schematic diagram of an image processing circuit 90 in one embodiment. As shown in FIG. 6, for ease of description, only aspects of the image processing technology related to the embodiments of the present application are shown.
  • the image processing circuit 90 includes an ISP processor 91 (the ISP processor 91 may be the processor 60) and a control logic 92.
  • the image data captured by the camera 93 is first processed by the ISP processor 91.
  • the ISP processor 91 analyzes the image data to capture image statistical information that can be used to determine one or more control parameters of the camera 93.
  • the camera 93 may include one or more lenses 932 and an image sensor 934.
  • the image sensor 934 may include a color filter array (such as a Bayer filter). The image sensor 934 may obtain light intensity and wavelength information captured by each imaging pixel, and provide a set of raw image data that can be processed by the ISP processor 91.
  • the sensor 94 (such as a gyroscope) may provide acquired image-processing parameters (such as anti-shake parameters) to the ISP processor 91 based on the interface type of the sensor 94.
  • the sensor 94 interface may be a SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the foregoing interfaces.
  • the image sensor 934 may also send the original image data to the sensor 94.
  • the sensor 94 may provide the original image data to the ISP processor 91 based on the interface type of the sensor 94, or the sensor 94 stores the original image data into the image memory 95.
  • the ISP processor 91 processes the original image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 91 may perform one or more image processing operations on the original image data and collect statistical information about the image data. The image processing operations may be performed with the same or different bit depth accuracy.
  • the ISP processor 91 may also receive image data from the image memory 95.
  • the sensor 94 interface sends the original image data to the image memory 95, and the original image data in the image memory 95 is then provided to the ISP processor 91 for processing.
  • the image memory 95 may be the memory 50, a part of the memory 50, a storage device, or an independent dedicated memory within the electronic device, and may include a DMA (Direct Memory Access) feature.
  • upon receiving raw image data from the image sensor 934 interface, from the sensor 94 interface, or from the image memory 95, the ISP processor 91 may perform one or more image processing operations, such as temporal filtering.
  • the processed image data may be sent to the image memory 95 for further processing before being displayed.
  • the ISP processor 91 receives processing data from the image memory 95, and performs processing on the image data in the original domain and in the RGB and YCbCr color spaces.
  • the image data processed by the ISP processor 91 may be output to a display 97 (the display 97 may include a display screen 83) for viewing by a user and / or further processing by a graphics engine or a GPU (Graphics Processing Unit).
  • the output of the ISP processor 91 can also be sent to the image memory 95, and the display 97 can read image data from the image memory 95.
  • the image memory 95 may be configured to implement one or more frame buffers.
  • the output of the ISP processor 91 may be sent to an encoder / decoder 96 to encode / decode image data.
  • the encoded image data can be saved and decompressed before being displayed on the display 97 device.
  • the encoder / decoder 96 may be implemented by a CPU or a GPU or a coprocessor.
  • the statistical data determined by the ISP processor 91 may be sent to the control logic unit 92.
  • the statistical data may include image sensor 934 statistical information such as auto exposure, auto white balance, auto focus, flicker detection, black level compensation, and lens 932 shading correction.
  • the control logic 92 may include a processing element and / or a microcontroller that executes one or more routines (such as firmware).
  • the one or more routines may determine the control parameters of the camera 93 and the control parameters of the ISP processor 91 according to the received statistical data.
  • for example, the control parameters of the camera 93 may include sensor 94 control parameters (such as gain and integration time for exposure control, and anti-shake parameters), camera flash control parameters, lens 932 control parameters (such as focal length for focusing or zooming), or a combination of these parameters.
  • the ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (eg, during RGB processing), and lens 932 shading correction parameters.
  • for example, the following are the steps of implementing the control method by the processor 60 in FIG. 5 or by the image processing circuit 90 (specifically, the ISP processor 91) in FIG. 6: 01, during the panoramic shooting process, controlling the first camera to capture multiple frames of the first exposure image with a first exposure duration, and controlling the second camera to capture multiple frames of the second exposure image with an exposure duration shorter than the first exposure duration; 02, performing pixel synthesis on the corresponding first exposure image and second exposure image to obtain each frame of target image; and 03, performing imaging-picture synthesis on each frame of target image to obtain a panoramic image.
  • an embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored.
  • when the program is executed by a processor, the panoramic shooting method according to the foregoing method embodiments is implemented.
  • first and second are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Therefore, the features defined as “first” and “second” may explicitly or implicitly include at least one of the features. In the description of the present application, the meaning of "a plurality” is at least two, for example, two, three, etc., unless it is specifically and specifically defined otherwise.
  • Any process or method description in a flowchart or otherwise described herein can be understood as representing a module, fragment, or portion of code that includes one or more executable instructions for implementing steps of a custom logic function or process
  • the scope of the preferred embodiments of this application includes additional implementations in which the functions may be performed out of the order shown or discussed, including performing the functions in a substantially simultaneous manner or in the reverse order according to the functions involved. It is understood by those skilled in the art to which the embodiments of the present application pertain.
  • Logic and / or steps represented in a flowchart or otherwise described herein, for example, a sequenced list of executable instructions that may be considered to implement a logical function, may be embodied in any computer-readable medium, For use by, or in combination with, an instruction execution system, device, or device (such as a computer-based system, a system that includes a processor, or another system that can fetch and execute instructions from an instruction execution system, device, or device) Or equipment.
  • a "computer-readable medium” may be any device that can contain, store, communicate, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
  • computer-readable media include the following: electrical connections (electronic devices) with one or more wirings, portable computer disk cartridges (magnetic devices), random access memory (RAM), Read-only memory (ROM), erasable and editable read-only memory (EPROM or flash memory), fiber optic devices, and portable optical disk read-only memory (CDROM).
  • the computer-readable medium may even be paper or other suitable medium on which the program can be printed, because, for example, by optically scanning the paper or other medium, followed by editing, interpretation, or other suitable Processing to obtain the program electronically and then store it in computer memory.
  • each part of the application may be implemented by hardware, software, firmware, or a combination thereof.
  • multiple steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system.
  • if implemented in hardware, as in another embodiment, any one of the following technologies known in the art, or a combination thereof, may be used: a discrete logic circuit having logic gate circuits for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gate circuits, a programmable gate array (PGA), a field programmable gate array (FPGA), and the like.
  • a person of ordinary skill in the art can understand that all or part of the steps carried by the methods in the foregoing embodiments can be implemented by a program instructing related hardware.
  • the program can be stored in a computer-readable storage medium.
  • when the program is executed, one of or a combination of the steps of the method embodiments is performed.
  • each functional unit in each embodiment of the present application may be integrated into one processing module, or each unit may exist separately physically, or two or more units may be integrated into one module.
  • the above integrated modules may be implemented in the form of hardware or software functional modules. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
  • the aforementioned storage medium may be a read-only memory, a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present application provides a panoramic shooting method and apparatus, and an imaging device, relating to the technical field of mobile terminals. The method includes: during a panoramic shooting process, controlling a first camera to capture multiple frames of a first exposure image with a first exposure duration, and controlling a second camera to capture multiple frames of a second exposure image with an exposure duration shorter than the first exposure duration; performing pixel synthesis on the corresponding first exposure image and second exposure image to obtain each frame of target image; and performing imaging-picture synthesis on each frame of target image to obtain a panoramic image. In the present application, images with different exposure durations are obtained through dual cameras, the images corresponding to the different exposure durations are pixel-synthesized into each frame of target image, and the target images are combined into a panoramic image, which improves the shooting efficiency of panoramic shooting and solves the technical problem in the prior art that shooting efficiency is low when a single camera is used to capture high-dynamic-range panoramic images.

Description

Panoramic shooting method and apparatus, and imaging device
Cross-reference to related application
This application claims priority to Chinese Patent Application No. 201810886055.5, filed by OPPO广东移动通信有限公司 on August 6, 2018 and entitled "Panoramic shooting method and apparatus, and imaging device".
Technical field
This application relates to the technical field of mobile terminals, and in particular to a panoramic shooting method and apparatus, and an imaging device.
Background
With the development of mobile terminals and image processing technology, people's demand for photography keeps growing. When the dynamic range of the shooting scene varies greatly, for example when the light on the photographer's face is bright, the background behind it becomes severely overexposed and the captured image cannot present the details of the background, while simply lowering the brightness of the face still fails to present those details. Therefore, people need to capture high-dynamic-range panoramic images to present the current scene.
发明内容
本申请旨在至少在一定程度上解决相关技术中的技术问题之一。
为此,本申请提出一种全景拍摄方法,通过双摄像头获取多帧包含多种曝光时长的图像,并将对应的不同曝光时长的图像合成各帧目标图像,进而将目标图像拼接成全景图像,提高了全景拍摄的拍摄效率。
本申请提出一种全景拍摄装置。
本申请提出一种成像设备。
本申请提出一种计算机设备。
本申请提出一种计算机可读存储介质。
本申请一方面实施例提出了一种全景拍摄方法,应用于成像设备,成像设备包括第一摄像头和第二摄像头,方法包括:
在全景拍摄过程中,控制所述第一摄像头以第一曝光时长拍摄多帧第一曝光图像,控制所述第二摄像头以短于所述第一曝光时长的曝光时长拍摄多帧第二曝光图像;
将对应的所述第一曝光图像和所述第二曝光图像进行像素合成得到各帧目标图像;
对所述各帧目标图像进行成像画面合成,得到全景图像。
本申请又一方面实施例提出了一种全景拍摄装置,应用于成像设备,所述成像设备包括 第一摄像头和第二摄像头,所述装置包括:
控制模块,用于在全景拍摄过程中,控制所述第一摄像头以第一曝光时长拍摄多帧第一曝光图像,控制所述第二摄像头以短于所述第一曝光时长的曝光时长拍摄多帧第二曝光图像;
合成模块,用于将对应的所述第一曝光图像和所述第二曝光图像进行像素合成得到各帧目标图像;
拼接模块,用于对所述各帧目标图像进行成像画面合成,得到全景图像。
本申请又一方面实施例提出了一种成像设备,所述成像设备包括第一摄像头和第二摄像头,所述成像设备还包括处理器,所述处理器用于:
在全景拍摄过程中,控制所述第一摄像头以第一曝光时长拍摄多帧第一曝光图像,控制所述第二摄像头以短于所述第一曝光时长的曝光时长拍摄多帧第二曝光图像;
将对应的所述第一曝光图像和所述第二曝光图像进行像素合成得到各帧目标图像;
对所述各帧目标图像进行成像画面合成,得到全景图像。
本申请又一方面实施例提出了一种计算机设备,包括:存储器、处理器及存储在存储器上并可在处理器上运行的计算机程序,所述处理器执行所述程序时,实现如前述一方面所述的全景拍摄方法。
本申请又一方面实施例提出了一种计算机可读存储介质,其上存储有计算机程序,所述处理器执行所述程序时,实现如前述一方面所述的全景拍摄方法。
本申请实施例所提供的技术方案可以包含如下的有益效果:
在全景拍摄过程中,控制第一摄像头以第一曝光时长拍摄多帧第一曝光图像,控制第二摄像头以短于第一曝光时长的曝光时长拍摄多帧第二曝光图像,将对应的第一曝光图像和第二曝光图像进行像素合成得到各帧目标图像,对各帧目标图像进行成像画面合成,得到全景图像。本实施例中通过双摄像头获取不同曝光时长的图像,将对应不同曝光时长的图像进行像素合成得到各帧目标图像,并将目标图像进行成像画面合成得到全景图像,提高了全景拍摄的拍摄效率。
附图说明
本申请上述的和/或附加的方面和优点从下面结合附图对实施例的描述中将变得明显和容易理解,其中:
图1为本申请实施例所提供的一种全景拍摄方法的流程示意图;
图2为本申请实施例所提供的另一种全景拍摄方法的流程示意图;
图3为本申请实施例提供的LTM映射曲线的示意图;
图4为本申请实施例提供的一种全景拍摄装置的结构示意图;
图5为一个实施例中计算机设备200的内部结构示意图;以及
图6为一个实施例中图像处理电路90的示意图。
具体实施方式
下面详细描述本申请的实施例,所述实施例的示例在附图中示出,其中自始至终相同或类似的标号表示相同或类似的元件或具有相同或类似功能的元件。下面通过参考附图描述的实施例是示例性的,旨在用于解释本申请,而不能理解为对本申请的限制。
下面参考附图描述本申请实施例的全景拍摄方法、装置和成像设备。
图1为本申请实施例所提供的一种全景拍摄方法的流程示意图。
如图1所示,该方法包括以下步骤:
步骤101,在全景拍摄过程中,控制第一摄像头以第一曝光时长拍摄多帧第一曝光图像,控制第二摄像头以短于第一曝光时长的曝光时长拍摄多帧第二曝光图像。
其中,第一摄像头为彩色摄像头,第二摄像头为黑白摄像头,第一摄像头和第二摄像头无主副之分,拍摄图像时权重相同。
具体地,在全景拍摄过程中,控制第一摄像头以第一曝光时长拍摄多帧第一曝光图像,在第一摄像头拍摄一帧第一曝光图像的过程中,控制第二摄像头采用短曝光时长拍摄对应的一帧短曝光的第二曝光图像,以及采用中曝光时长拍摄对应的一帧中曝光的第二曝光图像。
其中,中曝光时长小于第一曝光时长,短曝光时长小于中曝光时长。
需要理解的是,第一摄像头采用第一曝光时长拍摄第一曝光图像,第二摄像头采用短曝光时长和中曝光时长分别拍摄对应的第二曝光图像,第一摄像头应当与第二摄像头同步进行图像拍摄。但同步并不是限定同时开始,或同时结束,具体第一摄像头可以与第二摄像头拍摄图像同时开始执行,也可以存在一定时间差,本实施例对执行时序不作限定。
在一种场景下,如果第一曝光时长大于短曝光时长与中曝光时长之和,则第二摄像头可以和第一摄像头在相同时间开始拍摄图像,也可以延迟开始,延迟时间等于短曝光时长和中曝光时长之和与第一曝光时长的差值的绝对值。
在另一种场景下,如果第一曝光时长小于或等于短曝光时长与中曝光时长之和,则第二摄像头与第一摄像头同时开始拍摄图像,以减少拍摄时间,提高拍摄效率。
另外需要说明的是,第二摄像头采用短曝光时长拍摄对应的第二曝光图像,可以早于第二摄像头采用中曝光时长拍摄对应的第二曝光图像,也可以晚于第二摄像头采用中曝光时长拍摄对应的第二曝光图像,本实施例中不作限定。
步骤102,将对应的第一曝光图像和第二曝光图像进行像素合成得到各帧目标图像。
具体地,对应的第一曝光图像和第二曝光图像中的相应像素进行加权合成,得到对应的一帧目标图像,同理,可以合成得到各帧目标图像。
步骤103,对各帧目标图像进行成像画面合成,得到全景图像。
具体地,根据得到的各帧目标图像,将各帧目标图像进行成像画面合成,作为一种可能的实现方式,按照各帧图像拍摄的顺序,将相邻两帧中相似的像素部分进行合并,使得相邻两帧目标图像拼接在一起,从而实现成像画面的合成,得到全景图像。
本实施例的全景拍摄方法中,在全景拍摄过程中,控制第一摄像头以第一曝光时长拍摄多帧第一曝光图像,控制第二摄像头以短于第一曝光时长的曝光时长拍摄多帧第二曝光图像,将对应的第一曝光图像和第二曝光图像进行像素合成得到各帧目标图像,对各帧目标图像进行成像画面合成,得到全景图像。本实施例中通过双摄像头获取不同曝光时长的多帧图像,将对应不同曝光时长的图像进行像素合成得到各帧目标图像,并将目标图像进行成像画面合成得到全景图像,提高了全景拍摄的拍摄效率,不同于现有技术中采用单一的摄像头来拍摄高动态全景图像,单一的摄像头单位时间只能输出一帧图像,拍摄效率较低,无法适应全景拍摄的场景。
上一实施例中,描述了在拍摄全景图像时,通过第一摄像头和第二摄像头分别获取不同曝光时长的图像,将对应不同曝光时长的图像进行像素合成得到各帧目标图像,并将目标图像进行成像画面合成得到全景图像,而在实际拍摄过程中,因拍摄场景和光线的变化,获取的图像可能存在曝光不足或曝光过度的问题,导致图像质量较低,因此,在拍摄过程中,可对获取的当前帧图像的曝光情况进行分析,进而调整下一帧图像的第一曝光时长和短曝光时长,以获取曝光时长更准确的图像,提高了全景拍摄的拍摄效率,因此,在上一实施例的基础上,本实施例提供了另一种全景拍摄方法,图2为本申请实施例所提供的另一种全景拍摄方法的流程示意图。
如图2所示,该方法可以包括以下步骤:
步骤201,在全景拍摄过程中,控制第一摄像头以第一曝光时长拍摄多帧第一曝光图像。
其中,第一摄像头为彩色摄像头。
具体地,第一摄像头中采用的第一曝光时长是预设的,在全景拍摄的过程中,控制第一摄像头先以预设的第一曝光时长拍摄一帧第一曝光图像。进而可根据调整后的第一曝光时长进行拍摄,获取多帧第一曝光图像,具体调整的方式下述步骤中会详细介绍。
步骤202,在第一摄像头拍摄一帧第一曝光图像的过程中,控制第二摄像头采用短曝光时长拍摄对应的一帧短曝光的第二曝光图像,以及以中曝光时长拍摄对应的一帧中曝光的第二曝光图像。
其中,第二摄像头为黑白摄像头。
具体地,第一摄像头和第二摄像头是同时采集图像的,在第一摄像头拍摄一帧第一曝光图像的过程中,第二摄像头以预设的短曝光时长和中曝光时长分别拍摄对应的一帧短曝光的第二曝光图像和中曝光的第二曝光图像。
进而,根据调整后的短曝光时长拍摄得到多帧短曝光的第二曝光图像,其中,对短曝光时长进行调整的方式下述步骤中会详细介绍。
需要说明的是第一摄像头拍摄的第一曝光图像和第二摄像头拍摄的第二曝光图像对应相同的图像,只是采用不同的摄像头和曝光时长获取。
步骤203,根据一帧第一曝光图像,对第一曝光时长进行调整。
实际应用中,在不同的环境亮度下,使用预先设定的第一曝光时长曝光的第一曝光图像可能出现曝光不足,如此,需要在拍摄过程中根据拍摄得到的一帧第一曝光图像,对第一曝光时长进行调整,从而得到修正的第一曝光时长。
本申请实施例中,根据当前拍摄的一帧第一曝光图像,生成长曝光直方图,根据长曝光直方图,识别欠曝光区域,其中,欠曝光区域,为长曝光直方图中第一初始亮度小于第一参考亮度的像素所在的区域,迭代控制使得后一次拍摄的另一帧第一曝光图像的长曝光直方图中,与欠曝区域对应的区域的亮度大于或等于第一参考亮度,记录第一初始亮度变为第一参考亮度所需的调整曝光时长,根据当前拍摄采用的第一曝光时长与调整曝光时长获得调整后的第一曝光时长。
具体来说,根据当前拍摄的一帧第一曝光图像,对第一曝光时长进行调整。首先,根据第一曝光图像中多个原始像素信息生成长曝光直方图。长曝光直方图表征多个原始像素信息的亮度分布。随后,从长曝光直方图中确定原始像素信息对应的第一初始亮度小于第一参考亮度的多个像素,这些像素对应当前第一曝光图像中的欠曝区域。为使得当前预览图像中的欠曝区域能够获得充足的曝光,即使得后一帧预览图像中长曝光直方图与欠曝区域对应的区域的亮度大于或等于第一参考亮度,会先确定一个调整第一曝光时长。随后,再控制像素以调整后的第一曝光时长曝光,并输出多个调整后的调整像素信息。根据多个调整像素信息生成调整后的长曝光直方图,并在长曝光直方图中寻找调整像素信息对应的第三初始亮度小于第一参考亮度的多个像素,这些像素对应后一帧预览图像中的欠曝光区域。若未找到第三初始亮度小于第一参考亮度的多个像素,即说明后一帧预览图像中不存在欠曝区域;若仍旧找到了第三初始亮度小于第一参考亮度的多个像素,则说明后一帧预览图像中存在欠曝区域,则继续调整像素的第一曝光时长。如此周而复始,直至预览图像中不存在欠曝区域为止,此时,不存在欠曝区域的预览图像对应的像素的第一曝光时长即为调整后的第一曝光时长。
进而,在全景拍摄过程中,根据修正的第一曝光时长获取第一曝光图像。
步骤204,根据一帧短曝光的第二曝光图像,对短曝光时长进行调整。
实际应用中,在不同的环境亮度下,使用预先设定的短曝光时长曝光的短曝光的第二曝光图像可能出现过曝,如此,需要在拍摄过程中根据拍摄得到的一帧短曝光的第二曝光图像,对短曝光时长进行调整,从而得到修正的短曝光时长。
本申请实施例中,根据当前拍摄的一帧短曝光的第二曝光图像,生成短曝光直方图,根据短曝光直方图,识别过曝区域,其中,过曝区域,为短曝光直方图中第二初始亮度大于第二参考亮度的像素所在的区域。迭代控制使得后一次拍摄的另一帧短曝光的第二曝光图像的短曝光直方图中,与过曝区域对应的区域的亮度小于或等于第二参考亮度,记录第二初始亮度变为第二参考亮度所需的调整短曝光时长,和根据当前拍摄采用的短曝光时长与调整短曝光时长获得调整后的短曝光时长。
具体地,根据一帧短曝光的第二曝光图像,对短曝光时长进行调整,首先根据短曝光的第二曝光图像中多个像素分别输出的多个原始像素信息生成短曝光直方图。短曝光直方图表征多个原始像素信息的亮度分布。随后,从短曝光直方图中确定原始像素信息对应的第二初始亮度大于第二参考亮度的多个像素,这些像素对应当前预览图像中的过曝区域。为使得当前预览图像中的过曝区域能够减小曝光,即使得后一帧预览图像中短曝光直方图与过曝区域对应的区域的亮度小于或等于第二参考亮度,会先确定一个调整短曝光时长。随后,再控制像素以调整后的短曝光时长曝光,并输出多个调整后的调整像素信息。根据多个调整像素信息生成调整后的短曝光直方图,并在短曝光直方图中寻找调整像素信息对应的第四初始亮度大于第二参考亮度的多个像素,这些像素对应后一帧预览图像中的过曝区域。若未找到第四初始亮度大于第二参考亮度的多个像素,即说明后一帧预览图像中不存在过曝区域;若仍旧找到了第四初始亮度大于第二参考亮度的多个像素,则说明后一帧预览图像中存在过曝区域,则继续调整像素的短曝光时长。如此周而复始,直至预览图像中不存在过曝区域为止,此时,不存在过曝区域的预览图像对应的像素的短曝光时长即为调整后短曝光时长。
进而,在全景拍摄过程中,根据修正的短曝光时长获取短曝光的第二曝光图像。
需要说明的是,步骤203可在步骤204之前执行,也可在步骤204之后执行。
步骤205,将对应的第一曝光图像和第二曝光图像进行像素合成得到各帧目标图像。
本申请实施例中,将对应的第一曝光图像和第二曝光图像中的相应像素进行加权合成,具体地,对三种不同曝光时长对应的图像中的像素分别赋予不同的权值,将各曝光时间对应的图像中的对应像素与权值相乘后,再将三种乘以权值后的像素相加作为对应一个像素的合成像素,进而,将所有的像素都进行加权合成得到对应的合成像素,从而得到对应的一帧目标图像,同理,可得到各帧目标图像。例如,R 为一帧第一曝光图像中的一个像素点,R 为对应一帧短曝光的第二曝光图像中对应像素点,R 为对应一帧中曝光的第二曝光图像中对应像素点,将一帧中对应一个像素的合并像素信息,记为M ,则合并像素信息M =R ×a+R ×b+R ×c,其中,a、b和c分别为第一曝光图像中的像素、中曝光的第二曝光图像中对应像素、短曝光的第二曝光图像中的对应像素各自对应的权值,从而得到合并后的一帧目标图像,同理,可以得到各帧目标图像。
需要说明的是,第一曝光图像中的像素、中曝光的第二曝光图像中对应像素、短曝光的第二曝光图像中的对应像素各自对应的权值,可以根据实验数据确定,也可以根据当前拍摄场景的亮度信息等确定,本实施例中不作限定。
步骤206,针对每一帧目标图像,对一帧图像中各合并像素信息进行灰度级别的压缩。
由于根据三种不同曝光时长的原始像素计算得到的每一个合并像素的灰度级别会产生变化,因此,在得到合并像素信息后需要对每一个合并像素信息做灰度级别的压缩。压缩完毕后,可以根据多个压缩完毕后得到的合并像素信息进行插值计算即可得到目标图像。如此,目标图像中暗部已经由第一摄像头中的曝光像素输出的原始像素信息进行补偿,亮部已经由第二摄像头中的曝光像素输出的原始像素信息进行压制,因此,目标图像不存在过曝区域及欠曝区域,具有较高的动态范围和较佳的成像效果。
具体地,针对每一帧目标图像,对一帧图像中各合并像素信息进行灰度级别的压缩,作为一种可能的实现方式,可采用Local Tone Mapping(LTM)将得到的每一帧图像中各合并像素信息进行压缩,例如,将合成后的RAW文件数据压缩成10bit输出,使得得到的每一帧目标图像在一个合适的亮度域中呈现,便于与显示器进行适配显示。为了能表现更多的亮处细节和暗处细节,可将LTM曲线设置成如图3所示的形式,图3为本申请实施例提供的LTM映射曲线的示意图,通过LTM映射后可以尽量保存原始图像的外观特征,目标图像不存在过曝区域及欠曝区域,具有较高的动态范围和较佳的成像效果。
步骤207,对各帧目标图像进行拼接,得到全景图像。
具体地,根据得到的各帧目标图像,将各帧目标图像进行成像画面合成,作为一种可能的实现方式,按照各帧图像拍摄的顺序,将相邻两帧中相似的像素部分进行合并,使得相邻两帧目标图像拼接在一起,从而实现成像画面的合成,得到全景图像。
本实施例的全景拍摄方法中,在全景拍摄过程中,控制第一摄像头以第一曝光时长拍摄多帧第一曝光图像,控制第二摄像头以短曝光时长拍摄多帧短曝光的第二曝光图像,以及控制第二摄像头以中曝光时长拍摄多帧中曝光的第二曝光图像,并根据一帧第一曝光图像对第一曝光时长进行调整,以使得根据第一曝光时长获取的第一曝光图像中可获取充足的曝光,并根据一帧短曝光的第二曝光图像,对短曝光时长进行调整,以使得根据短曝光时长可减少过曝区域的曝光量,实现了动态调整曝光时长,获取多种曝光时长的图像数据,提高了全景拍摄的图像质量。将对应的第一曝光图像、中曝光的第二曝光图像和短曝光的第二曝光图像进行像素合成得到各帧目标图像,并对各帧目标图像进行灰度级别的压缩,提高了目标图像 对原始图像的还原力,从而可表现更多的亮处细节和暗处细节,对各帧目标图像进行合并,得到全景图像,提高了全景拍摄的拍摄效率。
为了实现上述实施例,本申请还提出一种全景拍摄装置,该装置应用于成像设备,成像设备包括第一摄像头和第二摄像头。
图4为本申请实施例提供的一种全景拍摄装置的结构示意图。
如图4所示,该装置包括:控制模块41、合成模块42和拼接模块43。
控制模块41,用于在全景拍摄过程中,控制第一摄像头以第一曝光时长拍摄多帧第一曝光图像,控制第二摄像头以短于第一曝光时长的曝光时长拍摄多帧第二曝光图像。
合成模块42,用于将对应的第一曝光图像和第二曝光图像进行像素合成得到各帧目标图像。
拼接模块43,用于对各帧目标图像进行成像画面合成,得到全景图像。
进一步地,在本申请实施例的一种可能的实现方式中,上述控制模块41,包括第一控制单元和第二控制单元。
第一控制单元,用于控制第一摄像头以第一曝光时长拍摄多帧第一曝光图像;第一摄像头为彩色摄像头。
第二控制单元,在第一摄像头拍摄一帧第一曝光图像的过程中,控制第二摄像头采用短曝光时长拍摄对应的一帧短曝光的第二曝光图像,以及采用中曝光时长拍摄对应的一帧中曝光的第二曝光图像;第二摄像头为黑白摄像头,其中,中曝光时长小于第一曝光时长,短曝光时长小于中曝光时长。
在本申请实施例的一种可能的实现方式中,上述控制模块41,还可以包括:第一调整单元和第二调整单元。
第一调整单元,用于根据一帧第一曝光图像,对第一曝光时长进行调整。
第二调整单元,用于根据一帧短曝光的第二曝光图像,对短曝光时长进行调整。
在本申请实施例的一种可能的实现方式中,第一调整单元,具体用于:
根据当前拍摄的一帧第一曝光图像,生成长曝光直方图;
根据长曝光直方图,识别欠曝光区域;其中,欠曝光区域,为长曝光直方图中第一初始亮度小于第一参考亮度的像素所在的区域;
迭代控制使得后一次拍摄的另一帧第一曝光图像的长曝光直方图中,与欠曝区域对应的区域的亮度大于或等于第一参考亮度;
记录第一初始亮度变为第一参考亮度所需的调整曝光时长;和
根据当前拍摄采用的第一曝光时长与调整曝光时长获得调整后的第一曝光时长。
在本申请实施例的一种可能的实现方式中,第二调整单元,具体用于:
根据当前拍摄的一帧短曝光的第二曝光图像,生成短曝光直方图;
根据短曝光直方图,识别过曝区域;其中,过曝区域,为短曝光直方图中第二初始亮度大于第二参考亮度的像素所在的区域;
迭代控制使得后一次拍摄的另一帧短曝光的第二曝光图像的短曝光直方图中,与过曝区域对应的区域的亮度小于或等于第二参考亮度;
记录第二初始亮度变为第二参考亮度所需的调整短曝光时长;和
根据当前拍摄采用的短曝光时长与调整短曝光时长获得调整后的短曝光时长。
在本申请实施例的一种可能的实现方式中,上述合成模块42,具体用于:
对应的第一曝光图像和第二曝光图像中的相应像素进行加权合成,得到对应的一帧目标图像。
在本申请实施例的一种可能的实现方式中,该装置还包括:压缩模块。
压缩模块,用于对一帧目标图像中各合并像素信息进行灰度级别的压缩。
需要说明的是,前述对方法实施例的解释说明也适用于该实施例的装置,此处不再赘述。
本实施例的全景拍摄装置中,在全景拍摄过程中,控制第一摄像头以第一曝光时长拍摄多帧第一曝光图像,控制第二摄像头以短曝光时长拍摄多帧短曝光的第二曝光图像,以及控制第二摄像头以中曝光时长拍摄多帧中曝光的第二曝光图像,并根据一帧第一曝光图像对第一曝光时长进行调整,以使得根据第一曝光时长获取的第一曝光图像中可获取充足的曝光,并根据一帧短曝光的第二曝光图像,对短曝光时长进行调整,以使得根据短曝光时长可减少过曝区域的曝光量,实现了动态调整曝光时长,获取多种曝光时长的图像数据,提高了全景拍摄的图像质量。将对应的第一曝光图像、中曝光的第二曝光图像和短曝光的第二曝光图像进行像素合成得到各帧目标图像,并对各帧目标图像进行灰度级别的压缩,提高了目标图像对原始图像的还原力,从而可表现更多的亮处细节和暗处细节,对各帧目标图像进行合并,得到全景图像,提高了全景拍摄的拍摄效率。
为了实现上述实施例,本申请还提出一种成像设备,成像设备包括第一摄像头和第二摄像头,成像设备还包括处理器,处理器用于:
在全景拍摄过程中,控制第一摄像头以第一曝光时长拍摄多帧第一曝光图像,控制第二摄像头以短于第一曝光时长的曝光时长拍摄多帧第二曝光图像;
将对应的第一曝光图像和第二曝光图像进行像素合成得到各帧目标图像;
对各帧目标图像进行成像画面合成,得到全景图像。
为了实现上述实施例,本申请实施例还提出了一种计算机设备,包括:存储器、处理器及存储在存储器上并可在处理器上运行的计算机程序,所述处理器执行所述程序时,实现如前述方法实施例所述的全景拍摄方法。
图5为一个实施例中计算机设备200的内部结构示意图。该计算机设备200包括通过系统总线81连接的处理器60、存储器50(例如为非易失性存储介质)、内存储器82、显示屏83和输入装置84。其中,计算机设备200的存储器50存储有操作系统和计算机可读指令。该计算机可读指令可被处理器60执行,以实现本申请实施方式的控制方法。该处理器60用于提供计算和控制能力,支撑整个计算机设备200的运行。计算机设备200的内存储器50为存储器52中的计算机可读指令的运行提供环境。计算机设备200的显示屏83可以是液晶显示屏或者电子墨水显示屏等,输入装置84可以是显示屏83上覆盖的触摸层,也可以是计算机设备200外壳上设置的按键、轨迹球或触控板,也可以是外接的键盘、触控板或鼠标等。该计算机设备200可以是手机、平板电脑、笔记本电脑、个人数字助理或穿戴式设备(例如智能手环、智能手表、智能头盔、智能眼镜)等。本领域技术人员可以理解,图5中示出的结构,仅仅是与本申请方案相关的部分结构的示意图,并不构成对本申请方案所应用于其上的计算机设备200的限定,具体的计算机设备200可以包括比图中所示更多或更少的部件,或者组合某些部件,或者具有不同的部件布置。
请参阅图6,本申请实施例的计算机设备200中包括图像处理电路90,图像处理电路90可利用硬件和/或软件组件实现,包括定义ISP(Image Signal Processing,图像信号处理)管线的各种处理单元。图6为一个实施例中图像处理电路90的示意图。如图6所示,为便于说明,仅示出与本申请实施例相关的图像处理技术的各个方面。
如图6所示,图像处理电路90包括ISP处理器91(ISP处理器91可为处理器60)和控制逻辑器92。摄像头93捕捉的图像数据首先由ISP处理器91处理,ISP处理器91对图像数据进行分析以捕捉可用于确定摄像头93的一个或多个控制参数的图像统计信息。摄像头93可包括一个或多个透镜932和图像传感器934。图像传感器934可包括色彩滤镜阵列(如Bayer滤镜),图像传感器934可获取每个成像像素捕捉的光强度和波长信息,并提供可由ISP处理器91处理的一组原始图像数据。传感器94(如陀螺仪)可基于传感器94接口类型把采集的图像处理的参数(如防抖参数)提供给ISP处理器91。传感器94接口可以为SMIA(Standard Mobile Imaging Architecture,标准移动成像架构)接口、其它串行或并行照相机接口或上述接口的组合。
此外,图像传感器934也可将原始图像数据发送给传感器94,传感器94可基于传感器94接口类型把原始图像数据提供给ISP处理器91,或者传感器94将原始图像数据存储到图像存储器95中。
ISP处理器91按多种格式逐个像素地处理原始图像数据。例如,每个图像像素可具有8、10、12或14比特的位深度,ISP处理器91可对原始图像数据进行一个或多个图像处理操作、收集关于图像数据的统计信息。其中,图像处理操作可按相同或不同的位深度精度进行。
ISP处理器91还可从图像存储器95接收图像数据。例如,传感器94接口将原始图像数据发送给图像存储器95,图像存储器95中的原始图像数据再提供给ISP处理器91以供处理。图像存储器95可为存储器50、存储器50的一部分、存储设备、或电子设备内的独立的专用存储器,并可包括DMA(Direct Memory Access,直接直接存储器存取)特征。
当接收到来自图像传感器934接口或来自传感器94接口或来自图像存储器95的原始图像数据时,ISP处理器91可进行一个或多个图像处理操作,如时域滤波。处理后的图像数据可发送给图像存储器95,以便在被显示之前进行另外的处理。ISP处理器91从图像存储器95接收处理数据,并对处理数据进行原始域中以及RGB和YCbCr颜色空间中的图像数据处理。ISP处理器91处理后的图像数据可输出给显示器97(显示器97可包括显示屏83),以供用户观看和/或由图形引擎或GPU(Graphics Processing Unit,图形处理器)进一步处理。此外,ISP处理器91的输出还可发送给图像存储器95,且显示器97可从图像存储器95读取图像数据。在一个实施例中,图像存储器95可被配置为实现一个或多个帧缓冲器。此外,ISP处理器91的输出可发送给编码器/解码器96,以便编码/解码图像数据。编码的图像数据可被保存,并在显示于显示器97设备上之前解压缩。编码器/解码器96可由CPU或GPU或协处理器实现。
ISP处理器91确定的统计数据可发送给控制逻辑器92单元。例如,统计数据可包括自动曝光、自动白平衡、自动聚焦、闪烁检测、黑电平补偿、透镜932阴影校正等图像传感器934统计信息。控制逻辑器92可包括执行一个或多个例程(如固件)的处理元件和/或微控制器,一个或多个例程可根据接收的统计数据,确定摄像头93的控制参数及ISP处理器91的控制参数。例如,摄像头93的控制参数可包括传感器94控制参数(例如增益、曝光控制的积分时间、防抖参数等)、照相机闪光控制参数、透镜932控制参数(例如聚焦或变焦用焦距)、或这些参数的组合。ISP控制参数可包括用于自动白平衡和颜色调整(例如,在RGB处理期间)的增益水平和色彩校正矩阵,以及透镜932阴影校正参数。
例如,以下为运用图5中的处理器60或运用图6中的图像处理电路90(具体为ISP处理器91)实现控制方法的步骤:
01:在全景拍摄过程中,控制第一摄像头以第一曝光时长拍摄多帧第一曝光图像,控制第二摄像头以短于第一曝光时长的曝光时长拍摄多帧第二曝光图像;
02:将对应的第一曝光图像和第二曝光图像进行像素合成得到各帧目标图像;
03:对各帧目标图像进行成像画面合成,得到全景图像。
为了实现上述实施例,本申请实施例还提出了一种计算机可读存储介质,其上存储有计算机程序,所述处理器执行所述程序时,实现如前述方法实施例所述的全景拍摄方法。
在本说明书的描述中,参考术语“一个实施例”、“一些实施例”、“示例”、“具体 示例”、或“一些示例”等的描述意指结合该实施例或示例描述的具体特征、结构、材料或者特点包含于本申请的至少一个实施例或示例中。在本说明书中,对上述术语的示意性表述不必须针对的是相同的实施例或示例。而且,描述的具体特征、结构、材料或者特点可以在任一个或多个实施例或示例中以合适的方式结合。此外,在不相互矛盾的情况下,本领域的技术人员可以将本说明书中描述的不同实施例或示例以及不同实施例或示例的特征进行结合和组合。
此外,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括至少一个该特征。在本申请的描述中,“多个”的含义是至少两个,例如两个,三个等,除非另有明确具体的限定。
流程图中或在此以其他方式描述的任何过程或方法描述可以被理解为,表示包括一个或更多个用于实现定制逻辑功能或过程的步骤的可执行指令的代码的模块、片段或部分,并且本申请的优选实施方式的范围包括另外的实现,其中可以不按所示出或讨论的顺序,包括根据所涉及的功能按基本同时的方式或按相反的顺序,来执行功能,这应被本申请的实施例所属技术领域的技术人员所理解。
在流程图中表示或在此以其他方式描述的逻辑和/或步骤,例如,可以被认为是用于实现逻辑功能的可执行指令的定序列表,可以具体实现在任何计算机可读介质中,以供指令执行系统、装置或设备(如基于计算机的系统、包括处理器的系统或其他可以从指令执行系统、装置或设备取指令并执行指令的系统)使用,或结合这些指令执行系统、装置或设备而使用。就本说明书而言,"计算机可读介质"可以是任何可以包含、存储、通信、传播或传输程序以供指令执行系统、装置或设备或结合这些指令执行系统、装置或设备而使用的装置。计算机可读介质的更具体的示例(非穷尽性列表)包括以下:具有一个或多个布线的电连接部(电子装置),便携式计算机盘盒(磁装置),随机存取存储器(RAM),只读存储器(ROM),可擦除可编辑只读存储器(EPROM或闪速存储器),光纤装置,以及便携式光盘只读存储器(CDROM)。另外,计算机可读介质甚至可以是可在其上打印所述程序的纸或其他合适的介质,因为可以例如通过对纸或其他介质进行光学扫描,接着进行编辑、解译或必要时以其他合适方式进行处理来以电子方式获得所述程序,然后将其存储在计算机存储器中。
应当理解,本申请的各部分可以用硬件、软件、固件或它们的组合来实现。在上述实施方式中,多个步骤或方法可以用存储在存储器中且由合适的指令执行系统执行的软件或固件来实现。如,如果用硬件来实现和在另一实施方式中一样,可用本领域公知的下列技术中的任一项或他们的组合来实现:具有用于对数据信号实现逻辑功能的逻辑门电路的离散逻辑电路,具有合适的组合逻辑门电路的专用集成电路,可编程门阵列(PGA),现场可编程门阵 列(FPGA)等。
本技术领域的普通技术人员可以理解实现上述实施例方法携带的全部或部分步骤是可以通过程序来指令相关的硬件完成,所述的程序可以存储于一种计算机可读存储介质中,该程序在执行时,包括方法实施例的步骤之一或其组合。
此外,在本申请各个实施例中的各功能单元可以集成在一个处理模块中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。所述集成的模块如果以软件功能模块的形式实现并作为独立的产品销售或使用时,也可以存储在一个计算机可读取存储介质中。
上述提到的存储介质可以是只读存储器,磁盘或光盘等。尽管上面已经示出和描述了本申请的实施例,可以理解的是,上述实施例是示例性的,不能理解为对本申请的限制,本领域的普通技术人员在本申请的范围内可以对上述实施例进行变化、修改、替换和变型。

Claims (20)

  1. A panoramic shooting method, applied to an imaging device, wherein the imaging device comprises a first camera and a second camera, and the method comprises the following steps:
    during a panoramic shooting process, controlling the first camera to capture multiple frames of a first exposure image with a first exposure duration, and controlling the second camera to capture multiple frames of a second exposure image with an exposure duration shorter than the first exposure duration;
    performing pixel synthesis on the corresponding first exposure image and second exposure image to obtain each frame of target image; and
    performing imaging-picture synthesis on each frame of target image to obtain a panoramic image.
  2. The panoramic shooting method according to claim 1, wherein controlling the first camera to capture multiple frames of the first exposure image with the first exposure duration and controlling the second camera to capture multiple frames of the second exposure image with an exposure duration shorter than the first exposure duration comprises:
    controlling the first camera to capture the multiple frames of the first exposure image with the first exposure duration, the first camera being a color camera; and
    during the process of the first camera capturing one frame of the first exposure image, controlling the second camera to capture a corresponding frame of short-exposure second exposure image with a short exposure duration and a corresponding frame of medium-exposure second exposure image with a medium exposure duration, the second camera being a black-and-white camera;
    wherein the medium exposure duration is shorter than the first exposure duration, and the short exposure duration is shorter than the medium exposure duration.
  3. The panoramic shooting method according to claim 2, further comprising, after controlling the second camera to capture the corresponding frame of short-exposure second exposure image with the short exposure duration and the corresponding frame of medium-exposure second exposure image with the medium exposure duration:
    adjusting the first exposure duration according to the one frame of the first exposure image; and
    adjusting the short exposure duration according to the one frame of short-exposure second exposure image.
  4. The panoramic shooting method according to claim 3, wherein adjusting the first exposure duration according to the one frame of the first exposure image comprises:
    generating a long exposure histogram according to the currently captured frame of the first exposure image;
    identifying an underexposed area according to the long exposure histogram, wherein the underexposed area is the area of pixels whose first initial brightness in the long exposure histogram is less than a first reference brightness;
    performing iterative control so that, in the long exposure histogram of another frame of the first exposure image captured subsequently, the brightness of the area corresponding to the underexposed area is greater than or equal to the first reference brightness;
    recording an adjustment exposure duration required for the first initial brightness to become the first reference brightness; and
    obtaining an adjusted first exposure duration according to the first exposure duration used for the current shooting and the adjustment exposure duration.
  5. The panoramic shooting method according to claim 3, wherein adjusting the short exposure duration according to the one frame of short-exposure second exposure image comprises:
    generating a short exposure histogram according to the currently captured frame of short-exposure second exposure image;
    identifying an overexposed area according to the short exposure histogram, wherein the overexposed area is the area of pixels whose second initial brightness in the short exposure histogram is greater than a second reference brightness;
    performing iterative control so that, in the short exposure histogram of another frame of short-exposure second exposure image captured subsequently, the brightness of the area corresponding to the overexposed area is less than or equal to the second reference brightness;
    recording an adjustment short exposure duration required for the second initial brightness to become the second reference brightness; and
    obtaining an adjusted short exposure duration according to the short exposure duration used for the current shooting and the adjustment short exposure duration.
  6. The panoramic shooting method according to any one of claims 1 to 5, wherein performing pixel synthesis on the corresponding first exposure image and second exposure image to obtain each frame of target image comprises:
    performing weighted combination on the corresponding pixels in the corresponding first exposure image and second exposure image to obtain a corresponding frame of target image.
  7. The panoramic shooting method according to claim 6, further comprising, after obtaining the corresponding frame of target image:
    performing gray-level compression on each piece of merged pixel information in the frame of target image.
  8. The panoramic shooting method according to claim 6 or 7, wherein performing weighted combination on the corresponding pixels in the corresponding first exposure image and second exposure image to obtain the corresponding frame of target image comprises:
    performing a weighted calculation using the weights corresponding to the respective pixels in the corresponding first exposure image of the first exposure duration, the short-exposure second exposure image, and the medium-exposure second exposure image to obtain merged pixel information of the corresponding pixels; and
    combining the merged pixel information of the corresponding pixels to obtain the corresponding frame of target image.
  9. The panoramic shooting method according to claim 7, further comprising, after performing gray-level compression on each piece of merged pixel information in the frame of target image:
    performing interpolation calculation on the compressed merged pixel information in the frame of target image to obtain the frame of target image with a high dynamic range.
  10. A panoramic shooting device, applied to an imaging device, wherein the imaging device comprises a first camera and a second camera, and the device comprises:
    a control module, configured to, during panoramic shooting, control the first camera to capture multiple frames of a first exposure image at a first exposure duration and control the second camera to capture multiple frames of a second exposure image at an exposure duration shorter than the first exposure duration;
    a synthesis module, configured to perform pixel synthesis on the corresponding first exposure image and second exposure image to obtain each frame of a target image; and
    a stitching module, configured to perform imaging-picture synthesis on the frames of the target image to obtain a panoramic image.
  11. The panoramic shooting device according to claim 10, wherein the control module comprises:
    a first control unit, configured to control the first camera to capture the multiple frames of the first exposure image at the first exposure duration, the first camera being a color camera;
    a second control unit, configured to, while the first camera captures one frame of the first exposure image, control the second camera to capture a corresponding frame of a short-exposure second exposure image using a short exposure duration and a corresponding frame of a medium-exposure second exposure image using a medium exposure duration, the second camera being a black-and-white camera;
    wherein the medium exposure duration is shorter than the first exposure duration, and the short exposure duration is shorter than the medium exposure duration.
  12. The panoramic shooting device according to claim 11, wherein the control module further comprises:
    a first adjustment unit, configured to adjust the first exposure duration according to the one frame of the first exposure image; and
    a second adjustment unit, configured to adjust the short exposure duration according to the one frame of the short-exposure second exposure image.
  13. The panoramic shooting device according to claim 12, wherein the first adjustment unit is specifically configured to:
    generate a long-exposure histogram according to the currently captured frame of the first exposure image;
    identify an under-exposed region according to the long-exposure histogram, wherein the under-exposed region is the region of pixels in the long-exposure histogram whose first initial brightness is lower than a first reference brightness;
    perform iterative control so that, in the long-exposure histogram of another frame of the first exposure image captured subsequently, the brightness of the region corresponding to the under-exposed region is greater than or equal to the first reference brightness;
    record the exposure-duration adjustment required for the first initial brightness to reach the first reference brightness; and
    obtain an adjusted first exposure duration according to the first exposure duration used for the current shot and the exposure-duration adjustment.
  14. The panoramic shooting device according to claim 12, wherein the second adjustment unit is specifically configured to:
    generate a short-exposure histogram according to the currently captured frame of the short-exposure second exposure image;
    identify an over-exposed region according to the short-exposure histogram, wherein the over-exposed region is the region of pixels in the short-exposure histogram whose second initial brightness is higher than a second reference brightness;
    perform iterative control so that, in the short-exposure histogram of another frame of the short-exposure second exposure image captured subsequently, the brightness of the region corresponding to the over-exposed region is less than or equal to the second reference brightness;
    record the short-exposure-duration adjustment required for the second initial brightness to reach the second reference brightness; and
    obtain an adjusted short exposure duration according to the short exposure duration used for the current shot and the short-exposure-duration adjustment.
  15. The panoramic shooting device according to any one of claims 10 to 14, wherein the synthesis module is specifically configured to:
    perform weighted synthesis on corresponding pixels in the corresponding first exposure image and second exposure image to obtain a corresponding frame of the target image.
  16. The panoramic shooting device according to claim 15, further comprising:
    a compression module, configured to perform gray-level compression on each piece of merged pixel information in the frame of the target image.
  17. The panoramic shooting device according to claim 15 or 16, wherein the synthesis module is further specifically configured to:
    perform a weighted calculation with the weights corresponding to the corresponding pixels in the corresponding first exposure image of the first exposure duration, the short-exposure second exposure image, and the medium-exposure second exposure image, to obtain merged pixel information of the corresponding pixels; and
    synthesize the merged pixel information of the corresponding pixels to obtain the corresponding frame of the target image.
  18. An imaging device, wherein the imaging device comprises a first camera and a second camera, the imaging device further comprises a processor, and the processor is configured to:
    during panoramic shooting, control the first camera to capture multiple frames of a first exposure image at a first exposure duration, and control the second camera to capture multiple frames of a second exposure image at an exposure duration shorter than the first exposure duration;
    perform pixel synthesis on the corresponding first exposure image and second exposure image to obtain each frame of a target image; and
    perform imaging-picture synthesis on the frames of the target image to obtain a panoramic image.
  19. A computer device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein, when the processor executes the program, the panoramic shooting method according to any one of claims 1 to 9 is implemented.
  20. A computer-readable storage medium having a computer program stored thereon, wherein, when the program is executed by a processor, the panoramic shooting method according to any one of claims 1 to 9 is implemented.
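The claims above recite the algorithm in prose; the short sketches that follow make its main steps concrete. First, the overall pipeline of claims 1, 10, and 18: bracketed capture on two cameras, per-frame pixel fusion, and stitching of the fused frames into a panorama. This is a minimal sketch in Python with NumPy and OpenCV, not the application's implementation: `capture_long`, `capture_short`, and `capture_medium` are hypothetical stand-ins for whatever sensor API the imaging device exposes, and the equal fusion weights are illustrative only.

```python
import cv2
import numpy as np

def fuse_frame(long_img, short_img, medium_img):
    """Pixel-level fusion of one long, one short and one medium exposure
    into a single target frame. Equal weights are illustrative only."""
    imgs = []
    for img in (long_img, short_img, medium_img):
        if img.ndim == 2:                       # mono frame -> 3 channels
            img = cv2.cvtColor(img, cv2.COLOR_GRAY2BGR)
        imgs.append(img.astype(np.float32))
    fused = sum(img / 3.0 for img in imgs)
    return np.clip(fused, 0, 255).astype(np.uint8)

def shoot_panorama(capture_long, capture_short, capture_medium, n_frames=8):
    """Capture n_frames fused target frames while the device sweeps,
    then stitch them into a panorama."""
    targets = []
    for _ in range(n_frames):
        long_img = capture_long()      # first (color) camera, long exposure
        short_img = capture_short()    # second (mono) camera, short exposure
        medium_img = capture_medium()  # second (mono) camera, medium exposure
        targets.append(fuse_frame(long_img, short_img, medium_img))
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, panorama = stitcher.stitch(targets)
    if status != cv2.Stitcher_OK:
        raise RuntimeError("stitching failed with status %d" % status)
    return panorama
```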
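Claims 2 and 11 schedule the second (black-and-white) camera so that it captures one short-exposure and one medium-exposure frame within the window in which the first (color) camera integrates a single long-exposure frame. A minimal scheduling sketch, assuming the two cameras can be driven concurrently from separate threads and that `camera.capture(exposure_s)` is a hypothetical blocking call of the underlying sensor API:

```python
import threading

def capture_bracket(color_cam, mono_cam, t_long, t_medium, t_short):
    """One bracket: a long exposure on the color camera and, during that
    same window, a short and a medium exposure on the mono camera.
    Requires t_short < t_medium < t_long, as recited in the claims."""
    assert t_short < t_medium < t_long
    result = {}

    def long_job():
        result["long"] = color_cam.capture(t_long)

    def mono_job():
        result["short"] = mono_cam.capture(t_short)
        result["medium"] = mono_cam.capture(t_medium)

    threads = [threading.Thread(target=long_job),
               threading.Thread(target=mono_job)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return result["long"], result["short"], result["medium"]
```

The tuple returned here is what `fuse_frame` in the previous sketch would consume for one frame of the panorama.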
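Claims 4, 5, 13, and 14 adjust the long and short exposure durations from per-frame histograms: the long exposure is lengthened until the formerly under-exposed region reaches the first reference brightness, and the short exposure is shortened until the formerly over-exposed region falls to the second reference brightness. The sketch below shows one iteration of each loop; the reference brightnesses, the tolerance, and the multiplicative step are assumptions, since the claims do not fix concrete values:

```python
import numpy as np

def adjust_long_exposure(frame, t_long, ref_brightness=64,
                         tolerance=0.01, step=1.25):
    """One iteration of the long-exposure loop: build the long-exposure
    histogram, check whether an under-exposed region remains (pixels
    below the first reference brightness), and if so lengthen the
    exposure used for the next frame."""
    hist, _ = np.histogram(frame, bins=256, range=(0, 256))
    under_fraction = hist[:ref_brightness].sum() / hist.sum()
    if under_fraction > tolerance:      # under-exposed region still present
        return t_long * step            # expose the next frame longer
    return t_long                       # first reference brightness reached

def adjust_short_exposure(frame, t_short, ref_brightness=224,
                          tolerance=0.01, step=0.8):
    """Mirror of the above: shorten the short exposure while an
    over-exposed region (pixels above the second reference brightness)
    remains in the short-exposure histogram."""
    hist, _ = np.histogram(frame, bins=256, range=(0, 256))
    over_fraction = hist[ref_brightness + 1:].sum() / hist.sum()
    if over_fraction > tolerance:
        return t_short * step
    return t_short
```

Run once per captured bracket; the returned durations feed the next frame's capture calls, matching the iterative control recited in the claims.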
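Claims 6, 8, 15, and 17 only state that corresponding pixels of the long, short, and medium exposures are combined with per-pixel weights; they do not fix the weighting scheme. The Gaussian "well-exposedness" weighting below, borrowed from classical exposure fusion, is therefore an assumption used purely to make the step runnable (all three inputs must share the same shape):

```python
import numpy as np

def well_exposedness(img, sigma=0.2):
    """Per-pixel weight that peaks at mid-gray and falls off toward the
    clipped ends (a Gaussian weighting borrowed from exposure fusion)."""
    x = img.astype(np.float32) / 255.0
    return np.exp(-((x - 0.5) ** 2) / (2.0 * sigma ** 2))

def weighted_merge(long_img, short_img, medium_img, eps=1e-6):
    """Weight corresponding pixels of the three exposures, sum them and
    normalise; the result is the merged pixel information of the frame."""
    exposures = [e.astype(np.float32)
                 for e in (long_img, short_img, medium_img)]
    weights = np.stack([well_exposedness(e) for e in exposures])
    weights /= weights.sum(axis=0) + eps
    merged = sum(w * e for w, e in zip(weights, exposures))
    return merged        # high-bit-depth result, compressed in a later step
```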
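Claims 7, 9, and 16 compress the merged pixel information to a narrower gray-level range and then interpolate the compressed values to produce the high-dynamic-range target frame. The claims name neither the compression curve nor the interpolation; the logarithmic curve and bilinear resize below are placeholder choices, assumed only for illustration:

```python
import numpy as np
import cv2

def compress_gray_levels(merged, out_max=255.0):
    """Gray-level compression of the merged pixel information; the
    logarithmic curve is a placeholder for whatever compression the
    implementation actually applies."""
    merged = merged.astype(np.float32)
    return np.log1p(merged) / np.log1p(merged.max() + 1e-6) * out_max

def interpolate_hdr_frame(compressed, scale=2.0):
    """Interpolate the compressed merged pixels to produce the final
    high-dynamic-range target frame; bilinear resizing is a stand-in."""
    h, w = compressed.shape[:2]
    out = cv2.resize(compressed, (int(w * scale), int(h * scale)),
                     interpolation=cv2.INTER_LINEAR)
    return np.clip(out, 0.0, 255.0).astype(np.uint8)
```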
PCT/CN2019/095092 2018-08-06 2019-07-08 Panoramic shooting method and device, and imaging device WO2020029732A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810886055.5 2018-08-06
CN201810886055.5A CN109005342A (zh) 2018-08-06 2018-08-06 Panoramic shooting method and device, and imaging device

Publications (1)

Publication Number Publication Date
WO2020029732A1 (zh) 2020-02-13

Family

ID=64595800

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/095092 WO2020029732A1 (zh) 2018-08-06 2019-07-08 Panoramic shooting method and device, and imaging device

Country Status (2)

Country Link
CN (1) CN109005342A (zh)
WO (1) WO2020029732A1 (zh)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109005342A (zh) * 2018-08-06 2018-12-14 Oppo广东移动通信有限公司 Panoramic shooting method and device, and imaging device
CN110018167B (zh) * 2019-04-04 2021-10-29 武汉精立电子技术有限公司 Rapid detection method and system for appearance defects of curved screens
CN110581956A (zh) * 2019-08-26 2019-12-17 Oppo广东移动通信有限公司 Image processing method and device, storage medium, and electronic device
CN114143472B (zh) * 2019-09-02 2024-08-02 深圳市道通智能航空技术股份有限公司 Image exposure method and device, photographing device, and unmanned aerial vehicle
CN110661983B (zh) * 2019-11-12 2021-03-19 腾讯科技(深圳)有限公司 Image acquisition method, device, apparatus, and storage medium
CN111491204B (zh) * 2020-04-17 2022-07-12 Oppo广东移动通信有限公司 Video restoration method and device, electronic device, and computer-readable storage medium
CN112399092A (zh) * 2020-10-27 2021-02-23 维沃移动通信有限公司 Shooting method and device, and electronic device
CN113824873B (zh) * 2021-08-04 2022-11-15 荣耀终端有限公司 Image processing method and related electronic device
CN115706870B (zh) * 2021-08-12 2023-12-26 荣耀终端有限公司 Video processing method and device, electronic device, and storage medium
CN117082355B (zh) * 2023-09-19 2024-04-12 荣耀终端有限公司 Image processing method and electronic device
CN117278864B (zh) * 2023-11-15 2024-04-05 荣耀终端有限公司 Image shooting method, electronic device, and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130307922A1 (en) * 2012-05-17 2013-11-21 Hong-Long Chou Image pickup device and image synthesis method thereof
CN105827964A (zh) * 2016-03-24 2016-08-03 维沃移动通信有限公司 Image processing method and mobile terminal
CN106331513A (zh) * 2016-09-06 2017-01-11 深圳美立知科技有限公司 Method and system for acquiring high-quality skin images
CN107222680A (zh) * 2017-06-30 2017-09-29 维沃移动通信有限公司 Panoramic image shooting method and mobile terminal
CN108270977A (zh) * 2018-03-06 2018-07-10 广东欧珀移动通信有限公司 Control method and device, imaging device, computer device, and readable storage medium
CN108322669A (zh) * 2018-03-06 2018-07-24 广东欧珀移动通信有限公司 Image acquisition method and device, imaging apparatus, computer-readable storage medium, and computer device
CN109005342A (zh) * 2018-08-06 2018-12-14 Oppo广东移动通信有限公司 Panoramic shooting method and device, and imaging device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107197168A (zh) * 2017-06-01 2017-09-22 松下电器(中国)有限公司苏州系统网络研究开发分公司 Image acquisition method and image acquisition system applying the same

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114820404A (zh) * 2021-01-29 2022-07-29 北京字节跳动网络技术有限公司 Image processing method and device, electronic device, and medium
CN112911165A (zh) * 2021-03-02 2021-06-04 杭州海康慧影科技有限公司 Endoscope exposure method and device, and computer-readable storage medium
CN112911165B (zh) * 2021-03-02 2023-06-16 杭州海康慧影科技有限公司 Endoscope exposure method and device, and computer-readable storage medium
CN115514876A (zh) * 2021-06-23 2022-12-23 荣耀终端有限公司 Image fusion method, electronic device, storage medium, and computer program product
CN115514876B (zh) * 2021-06-23 2023-09-01 荣耀终端有限公司 Image fusion method, electronic device, storage medium, and computer program product
CN113660425A (zh) * 2021-08-19 2021-11-16 维沃移动通信(杭州)有限公司 Image processing method and device, electronic device, and readable storage medium
CN113660425B (zh) * 2021-08-19 2023-08-22 维沃移动通信(杭州)有限公司 Image processing method and device, electronic device, and readable storage medium
CN114143471A (zh) * 2021-11-24 2022-03-04 深圳传音控股股份有限公司 Image processing method and system, mobile terminal, and computer-readable storage medium
CN114143471B (zh) * 2021-11-24 2024-03-29 深圳传音控股股份有限公司 Image processing method and system, mobile terminal, and computer-readable storage medium
CN115578662A (zh) * 2022-11-23 2023-01-06 国网智能科技股份有限公司 UAV front-end image processing method, system, storage medium, and device

Also Published As

Publication number Publication date
CN109005342A (zh) 2018-12-14

Similar Documents

Publication Publication Date Title
WO2020029732A1 (zh) Panoramic shooting method and device, and imaging device
KR102376901B1 (ko) Imaging control method and imaging device
WO2020034737A1 (zh) Imaging control method and device, electronic device, and computer-readable storage medium
WO2020038069A1 (zh) Exposure control method and device, and electronic device
CN108683862B (zh) Imaging control method and device, electronic device, and computer-readable storage medium
WO2020057199A1 (zh) Imaging method and device, and electronic device
CN109040609B (zh) Exposure control method and device, electronic device, and computer-readable storage medium
US11044410B2 Imaging control method and apparatus, electronic device, and computer readable storage medium
CN108322669B (zh) Image acquisition method and device, imaging apparatus, and readable storage medium
CN110072052B (zh) Multi-frame-image-based image processing method and device, and electronic device
CN108632537B (zh) Control method and device, imaging device, computer device, and readable storage medium
WO2020034701A1 (zh) Imaging control method and device, electronic device, and readable storage medium
CN110248106B (zh) Image noise reduction method and device, electronic device, and storage medium
WO2020207261A1 (zh) Multi-frame-image-based image processing method and device, and electronic device
CN110213502B (zh) Image processing method and device, storage medium, and electronic device
WO2020029679A1 (zh) Control method and device, imaging device, electronic device, and readable storage medium
CN108683861A (zh) Shooting exposure control method and device, imaging device, and electronic device
CN110166705B (zh) High-dynamic-range (HDR) image generation method and device, electronic device, and computer-readable storage medium
CN108337449A (zh) Dual-camera-based high-dynamic-range image acquisition method, device, and apparatus
US11601600B2 Control method and electronic device
WO2020034702A1 (zh) Control method and device, electronic device, and computer-readable storage medium
CN108513062B (zh) Terminal control method and device, readable storage medium, and computer device
CN108881731B (zh) Panoramic shooting method and device, and imaging device
JP6231814B2 (ja) Exposure determination device, imaging device, control method, and program
JP6300514B2 (ja) Imaging device and method for controlling an imaging device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19846672

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19846672

Country of ref document: EP

Kind code of ref document: A1